.ini settings and what they really do

Post » Mon Apr 25, 2011 10:26 am

I wanted to create a thread debunking a lot of the alleged game 'tweaks' that have been floating around since the days of Oblivion, and to explain what they really do and why you shouldn't use them. Somebody has gone as far as to make an .ini "enhancer" file on the New Vegas Nexus site that supposedly creates a tricked-out .ini file, and it's just bad, bad, bad.

First of all, do not religiously follow any of the .ini tweaks you read here or on the tweakguides. At no point has a Bethesda employee ever officially explained what any of these settings actually does; anything you read is a guess.

The educated consensus after years of messing with Oblivion is that any performance increase is entirely placebo, and there is also a great chance that you will actually decrease performance. Some of these settings might be legacy and have no in-game effect at all (or are overridden at runtime).

Memory, background loading, and multi-threading tweaks:

Supposedly, increasing the following settings will allocate more RAM and improve the game's performance:

uInterior Cell Buffer=3
uExterior Cell Buffer=36
iPreloadSizeLimit=26214400

The truth is that iPreloadSizeLimit and the cell buffers are NOT EVEN RELATED TO EACH OTHER, despite what the tweakguide says. uExterior Cell Buffer= is actually tied to a setting that does affect performance: uGridsToLoad=.

This is what an anonymous artist who worked on Oblivion says about uGridsToLoad:

raise to increase the X/Y dimension of the number of high detail exterior cells loaded - should be an odd number, 3 or greater. This is quite a performance hit with anything above 5. If there are 25 cells loaded at a time with a value of 5, then there are 49 cells loaded with a value of 7 and so on. I would not recommend going above 7 or 9, but terrain looks nicer if your comp can take it

The uGridsToLoad variable affects the number of exterior world-space cells that are loaded. If, for example, you were to change it to 9, you would have to adjust uExterior Cell Buffer to 100 or the game would crash. The formula is uExterior Cell Buffer = (1 + uGridsToLoad)^2.
(I didn't come up with this, I found out about it here - http://www.yacoby.net/es/forum/25/7183221182807060.html )

uInterior Cell Buffer should stay at 3 and never go above 5, since that many interior cells are never even loaded to begin with. Setting these buffer values higher will only cause the game to reserve RAM it could otherwise put to use. If you do play with the uGridsToLoad setting, make sure it is always an odd number and adjust the exterior cell buffer using the formula above. If your system can handle it, you can actually increase the quality of objects in the distance, though there might be some visual oddities having to do with the LOD textures, and if you play with a high value and later lower it back down, you will not be able to load any games you saved with the higher value. High uGridsToLoad values will severely impact framerate and also cause longer loading times. The 4GB enabler (below) might help with high values.
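To make the formula concrete, here is a worked example (the arithmetic follows straight from the formula above, but verify against your own file before saving):

uGridsToLoad=7
uExterior Cell Buffer=64

because (1 + 7)^2 = 64. The shipped defaults follow the same rule: uGridsToLoad=5 with uExterior Cell Buffer=36, since (1 + 5)^2 = 36.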

iPreloadSizeLimit is not a cache, despite what the tweakguide says. According to the link above, you shouldn't deviate from the default value (smaller values are actually a good thing) unless you are experiencing stability issues due to the number of mods you have installed. So don't raise it unless you're running a lot of mods, and if you do run a lot of mods, also make sure to use the 4GB enabler. The vanilla game will never use more than 1GB of RAM (you can verify this in Task Manager if you don't believe it), so do not use the 4GB enabler either unless you are using a lot of mods (especially high-res texture mods). The 4GB enabler will actually decrease your game's overall performance, so only use it if you need it (e.g. if you have enough mods to actually eat up over 2GB of RAM).
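For what it's worth, the default iPreloadSizeLimit of 26214400 works out to exactly 25 MB (25 × 1024 × 1024 = 26214400 bytes). That byte math is mine, not anything official, but it does suggest the default wasn't picked at random.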

bUseHardDriveCache= - This setting is for the Xbox 360. Do not use it unless you want your game to stutter even more, because it would use your slower hard drive as a cache instead of speedy RAM.
Audio Cache settings - Changing any of these has been known to cause the game to crash.

Background Loading Settings - Leave these at their default values; do not switch any that default to 0 to 1. Contrary to what the tweakguide says, these will INCREASE loading-related stuttering if you turn them on, because instead of loading at the load screens the game will load while you're playing. There might actually be a performance benefit to turning some of these off, although I personally haven't tried it (changing bBackgroundCellLoads= to 0 might reduce loading-related stuttering; see the example below).
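If you want to try that experiment, this is the only line involved (untested on my end, as I said, so back up your .ini first):

bBackgroundCellLoads=0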

"Threading" Settings - Nobody knows for sure that any of these settings have something to do with multi-threading but that doesn't keep people from suggesting they all be turned on. They are off for a reason.. bUseMultiThreadedFaceGen will make NPC turn into odd colors for example and bMultiThreadAudio will cause the game to freeze when you exit to windows. Leave these all at their default values. Even the mods here recommend bUseThreadedAI be turned on but there is absolutely no proof that this has any sort of effect on the game other than placebo. No performance increases were noticed from altering these settings, leave them alone

iNumHWThreads=2 - Recommended for Fallout 3 to stop freezing. It stopped the freezing all right; instead of freezing, the game would CTD. This has been fixed in NV, so don't use this setting. Even when set to 2, the game will still use all available cores if it needs them, unless you manually set the affinity, which I do not recommend since you will get reduced FPS. Yes, the game primarily uses one core, but forcing it to use only one core (or two) will hurt your performance, and you can verify this with FRAPS. No matter what you set any of the threading settings to, the game will spawn around 40 threads anyway (you can verify this in Task Manager's performance monitor if you don't believe it).

Pixel Shader 3.0 support - By default the .ini says bAllow30Shaders=0. Changing this to 1 does not force shader 3.0 support, as the game uses 2.0 shaders exclusively (you can verify this by checking your rendererinfo.txt file for the line that says RenderPath : BSSM_SV_2_x, where x is A or B depending on whether you have an ATI or an nvidia card; see below). Supposedly, copying shaderpackage19 over whichever package your card uses and changing this setting to 1 will force 3.0 shaders. Don't do that: 3.0 shader support (like the 1.0 shader support used in 'oldblivion') is incomplete and inferior to 2.0, PLUS shaderpackage19 is for ATI cards only. Do not change this. I personally tried it in Fallout 3 and it cut my performance on Mothership Zeta from the 30s to the teens on my old computer.
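If you want to check your own render path: rendererinfo.txt is a plain text file (on my install it sits with the .ini files under My Games, but your location may differ), and the relevant line looks like this, with the last letter depending on your card:

RenderPath : BSSM_SV_2_A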

You CAN adjust settings in the .ini to extend the LOD settings past what the launcher allows, or to adjust grass/tree settings. These all have an impact on FPS but can make the game look a little better depending on your setup (example below). As far as tweaking to squeeze out more performance goes, though, the default .ini already does an excellent job by itself. Don't mess with it, and don't blindly follow every 'tweakguide' you read on the interwebz unless it is marked as official.
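For example, these are the kinds of fade-distance multipliers I mean (the names are as I remember them from FalloutPrefs.ini, so double-check your own file, and the values here are purely illustrative - every point you add costs FPS):

fLODFadeOutMultObjects=20
fLODFadeOutMultItems=20
fLODFadeOutMultActors=20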