» Thu May 17, 2012 11:38 pm
Surxenberg and C89c are both right. The truth is that unless someone ITT is one of the developers, nobody knows how the game is juggling system resources; everyone else is just guessing.
Whenever a new PC game comes out, you can count on a lot of whining in the hardocp forum, nvidia's forum, guru3d forum, AMD/ATI's forum, steam's forum, etc. over the same exact things:
"My GPU usage is only 33% so this game is CPU limited" (the PS3 and 360 don't exactly have state-of-the-art CPUs and the game runs on them. Contrary to popular belief, Cell is not the beast it was made out to be)
"Why is the executable not large address aware or 64-bit?" (because it doesn't matter for the vanilla game, only when using a ton of mods)
"Why does the game run so slow on my new laptop with Intel integrated graphics?" (if it runs at all)
"DX9? In 2011?" (a somewhat legit complaint, but most people probably can't see a difference)
"The game only uses 1 or 2 cores" (open up Task Manager: it's using all your cores, just not fully)
"The game stutters like crazy"
"I'm only getting 10fps on my i7-990x SLI 580 GTX 128GB RAM rig"
"I'm never buying a (insert publisher) product again"
You get the idea. It doesn't matter if it's Skyrim, COD MW3, BF3, or even friggin Portal 2. To be fair though, many people are playing Skyrim below 60fps, which seems a bit unusual considering it's a console port running an updated version of the FO3 engine. I'm playing on a laptop GPU, an nvidia GTX 460m, with a 1.7GHz (2.8GHz Turbo Boost) i7. I'm running 1920x1080, ultra, AA off, FXAA on, shadows on low, no modifications to the ini files, and default control panel settings. I'm averaging 35fps outdoors and 40-60fps indoors, with occasional dips below that. That's still better than the framerate 360/PS3 users are getting, with much better image quality.
If you're using FXAA in conjunction with AA, I recommend disabling AA. FXAA introduces a slight blur but smooths out alpha textures (transparencies) as a trade-off. I was using FXAA with 4xMSAA until I read that using both together is pointless. I was skeptical, but to my eyes FXAA alone looks the same as 4xMSAA. AA has more of a performance impact in Skyrim than in other games, even at 2x, while FXAA has very little. So try turning off AA and using FXAA by itself; running them together looks the same as if AA were disabled anyway, and you lose around 5fps for nothing.
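If you'd rather set this in the config file than in the launcher, something like this in SkyrimPrefs.ini should do it (I'm going from memory on the key names, so double-check against your own file before trusting me):

```ini
; [Display] section of SkyrimPrefs.ini -- key names from memory, verify in your own file
[Display]
bFXAAEnabled=1    ; FXAA on
iMultiSample=0    ; MSAA off (0 = disabled, 2/4/8 = sample count)
```

Same effect as unticking AA and ticking FXAA in the launcher, just without launching it.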
When the sun moves, the shadows re-render outdoors. Shadows are blocky period, so low doesn't bother me, but the low setting does make that re-rendering effect more pronounced. I can live with it.
Beware of messing around with the .ini files. Self-proclaimed tweaking experts have been recommending things like turning on transparency AA or water multisampling. Those two settings don't even work in Skyrim; they were probably left over from an earlier, pre-FXAA build. Either force transparency AA in your driver control panel (which looked better than FO3's dithered TAA anyway) or use FXAA. Otherwise, enjoy your placebo ini transparency and water multisampling. That's just one example; other recommended tweaks either don't work or hurt performance. You can improve LOD and mess with shadows, but in addition to potential fps hits it might change how the game was intended to render. Disabling vsync in the .ini will cause the game to speed up when there is no AI interaction on screen. Some people are confusing this speedup with the vsync mouse sensitivity issue; it's not the same thing. The game simply runs faster than intended when it goes above 60fps and no AI elements are on screen (FO:NV did this too). Also, if you use D3DOverrider, it probably isn't working: it doesn't chime when the game loads. It didn't work for NV either (but it did for Oblivion and FO3).
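For reference, these are the sort of entries the tweak guides keep pasting around. The names below are what I've seen quoted elsewhere, so treat them as unverified; the point is what to leave alone, not what to change:

```ini
; Settings floating around tweak guides -- names as quoted elsewhere, unverified by me
bTransparencyMultisampling=0   ; the "transparency AA" placebo -- reportedly does nothing
iWaterMultisamples=0           ; the "water multisampling" placebo -- reportedly does nothing
iPresentInterval=1             ; vsync; leave at 1 or the game speeds up past 60fps
```

If a guide tells you to flip any of these, test it yourself with fraps before believing the claimed gains.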
Experiment for yourself with fraps running; don't take some anon on the internet's word for it (not even mine). This is the internet. People lie. They lie about their rigs and exaggerate their performance (or lack of it). "I'm always getting 60fps with everything at max" is unlikely to be true all the time with this game, at least right now for most of us. Their framerate is probably dropping and they just can't tell, and they probably never ran fraps either. PC gamers are a strange bunch who don't always see eye to eye (e.g., recommending a high-res texture mod that destroys the art style with bad taste... or the furry mods) and often don't have a clue what they're talking about.
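If you do log a run with fraps (its benchmarking mode can write out a frametimes CSV), a few lines of Python will tell you whether an "always 60fps" claim holds up. The column layout here is my assumption about the log format (frame number, then cumulative time in milliseconds), so adjust it to whatever your file actually contains:

```python
# Sketch: summarize a fraps "frametimes" log. Assumes a two-column CSV
# (frame number, cumulative time in ms) -- check your own log's header first.
import csv

def fps_stats(path):
    times = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)                      # skip the header row
        for row in reader:
            times.append(float(row[1]))   # cumulative timestamp in ms
    # per-frame durations from consecutive timestamps
    deltas = [b - a for a, b in zip(times, times[1:])]
    avg_fps = 1000.0 * len(deltas) / (times[-1] - times[0])
    min_fps = 1000.0 / max(deltas)        # worst single frame
    return avg_fps, min_fps
```

The average can look fine while the minimum tells the real stuttering story, which is exactly the gap between what people claim and what they actually get.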
People said NV was CPU-limited because the fps sucked for the first couple of weeks, only to find out it was a DirectX bug that could be temporarily worked around with a hooking dll. I doubt Skyrim has a similar problem, but who knows? Maybe they can improve it. Just don't expect any improvements for flickering textures in the distance or the occasional loading/streaming hiccup; those have been part of the engine since Oblivion.