Dear Bethesda, I Only Want 1 Thing Patched

Post » Wed May 30, 2012 5:59 pm

This is rubbish. My framerate goes down to 32fps in some parts, mainly cities, and it certainly IS very visible. You must have bad senses not to see that; we don't need super senses, for goodness' sake! Your silly comment puts forward the idea that 32fps on a 4.4GHz Core i5-2500K is fine. Well, it is not!

I can tell a huge difference as well. When I'm running 50 to 60 FPS it's silky smooth like butter. But when I start dropping below 50 down into the 30's & 40's I get an extremely noticeable micro-stutter and lag. It's really, really annoying. And I also agree that newer systems get no benefit from their hardware. The primary reason I just rebuilt my rig was for Skyrim. But it appears that was a waste of money in terms of this game:

i7 Sandy Bridge
16 Gig DDR3 1600
Corsair SSD
2x GTX 470
User avatar
Johnny
 
Posts: 3390
Joined: Fri Jul 06, 2007 11:32 am

Post » Wed May 30, 2012 2:06 pm

I understand FPS lows are annoying, but it's how the engine is, and how it always will be.

For example, I cap my FPS at 35. I'm at 35 solid 90% of the time. I sometimes drop to 30 in cities, however I go to 20fps in one location in Markarth. That is really noticeable, yeah. But to whine about going from 60 to 40 I find pretty amazing to be honest. This isn't a first person shooter. This is an RPG designed to be played at around 30fps. You don't need constant 60fps, and to be honest you will never, ever get constant 60fps.
User avatar
Sheeva
 
Posts: 3353
Joined: Sat Nov 11, 2006 2:46 am

Post » Thu May 31, 2012 5:55 am

Errr...

Unless you are looking through someone else's eyes, no one is qualified to comment on another person's fps perception based on what they see with their own.

I have a great 18 month old gaming system (thank you Velocity Micro!) that is running Skyrim at 2560x1600 on a Dell 30" monitor, everything maxed, with fps capped at 60 (V-sync). I monitor fps with EVGA Precision continually, using my Logitech keyboard display, and it rarely drops below 60, and only on very rare, and very short periods (less than 2 seconds by my estimation), does fps drop into the 40's. I am not manipulating any CPU cores and I have no mods installed. I am running W7U, an i7 980X OC'd to just over 4.1GHz, 12 GB RAM, and twin EVGA GTX 580 OC's in SLi (cards were upgraded since the build) using the latest Nvidia Beta drivers.

I feel confident that, given the popularity of Skyrim for the PC, video graphics card chip manufacturers are scrambling to optimize their drivers so they can claim their cards run the program "the best", or at least better than at the original release date.

Have I experienced some minor bugs? Yes. Have they ruined Skyrim game play for me? No, not on my system. Rest assured that Bethesda is working to make Skyrim even better. I know of no major gaming company that has been able to produce a bug free game, even after release. Current PC systems offer far too many permutations for developers to be able to swat every bug for every system...but developers do the best they can. Their job depends upon it.
User avatar
Camden Unglesbee
 
Posts: 3467
Joined: Wed Aug 15, 2007 8:30 am

Post » Thu May 31, 2012 4:53 am

No, anything between 30 and 60 is almost impossible to see with the human eye, so as long as you're getting no less than 30 FPS you're doing just fine. They have plans for optimization and it will roll out after the holidays.
This is just plain wrong. Framerates around 25 to 30 FPS as captured by a video or movie camera (not that there's a lot of difference between those two these days) can look smooth because of how the images are made; minus any delay with operating the device's shutter and/or sensor, these images contain an average of what was going on during the 1/25 or 1/30 of a second they were capturing light. This works well with the laggy human brain and mostly looks correct as a result.

Games don't work like this; the image that appears on the screen is a perfect still from a single, discrete moment in time, and as such is badly lacking in the natural "motion blur" that lets 24 FPS movies look smooth. 30 FPS might get interpreted as motion by the visual cortex, but anything moving fast certainly isn't moving smoothly. To approximate real motion without postprocessing gimmicks the game would need to draw at far higher framerates than that.
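To put numbers on that (a back-of-the-envelope sketch with my own illustrative figures, not anything from the post): here's the size of the jump between consecutive discrete frames for an object panning across a 1920-pixel-wide screen in one second.

```python
# Rough illustration: displacement between consecutive discrete frames
# for an object crossing a 1920px screen in one second. The bigger the
# per-frame jump, the more the lack of motion blur shows.
def pixels_per_frame(speed_px_per_s, fps):
    """Distance the object moves between two rendered frames."""
    return speed_px_per_s / fps

for fps in (24, 30, 60, 120):
    step = pixels_per_frame(1920, fps)
    print(f"{fps:3d} fps -> {step:.0f} px jump per frame")
# 30 fps -> 64 px jumps; 60 fps -> 32 px jumps
```

A 64-pixel jump every frame with no blur between positions is exactly the juddery motion people report at 30 fps; film at 24 fps hides a comparable gap behind per-frame motion blur.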
User avatar
Marcus Jordan
 
Posts: 3474
Joined: Fri Jun 29, 2007 1:16 am

Post » Thu May 31, 2012 1:43 am

I think it's not realistic to ask Bethesda to make these changes at this point. They've had years to add support for more than 2 cores, a 64-bit executable, or a 4GB flag. They won't do it now.
User avatar
alicia hillier
 
Posts: 3387
Joined: Tue Feb 06, 2007 2:57 am

Post » Thu May 31, 2012 12:07 am

I think it's not realistic to ask Bethesda to make these changes at this point. They've had years to add support for more than 2 cores, a 64-bit executable, or a 4GB flag. They won't do it now.
This is sad, but true. So much for the "new" engine, lol. I said a long time ago that the "new" engine was just hype and it was the same old engine with some makeup. You can't polish a turd, and you can put makeup on a pig, but it's still a pig...
User avatar
Penny Courture
 
Posts: 3438
Joined: Sat Dec 23, 2006 11:59 pm

Post » Thu May 31, 2012 12:03 am

I find it seriously hard to consider this a new game engine; it is Gamebryo through and through. Admittedly, the dynamic shadows do make a difference, but the fact that they can be very low resolution (no matter what setting you use) and that they are apparently CPU-rendered means that... well, basically we NEED proper multithreading support! You should've used DX11 even if you weren't going to use the other features, as it was built for multithreading amongst other things. I mean, it's possible to go from DX9 to DX11; they did it for Crysis 2 and Shogun Total War. C'mon, don't be so lazy and join the 21st century!
User avatar
Mrs Pooh
 
Posts: 3340
Joined: Wed Oct 24, 2007 7:30 pm

Post » Wed May 30, 2012 11:25 pm

My game generally runs at 60 fps unless I'm in Whiterun, at certain areas in Skyrim or in specific caves, where it drops to about 30-40, and it is very noticeable. Don't [censored] me about what I can or cannot see. Even when fraps isn't on, I can tell you exactly when I get fps dips. How can that be possible if the human eye can't notice these changes?
User avatar
Jennifer May
 
Posts: 3376
Joined: Thu Aug 16, 2007 3:51 pm

Post » Thu May 31, 2012 2:54 am

My game generally runs at 60 fps unless I'm in Whiterun, at certain areas in Skyrim or in specific caves, where it drops to about 30-40, and it is very noticeable. Don't [censored] me about what I can or cannot see. Even when fraps isn't on, I can tell you exactly when I get fps dips. How can that be possible if the human eye can't notice these changes?

^This. It's all rather subjective, but it's real. Some people can notice it. Some cannot. I personally feel movement fluidity starts to get lost at around 40-45fps. Anything below just feels off. I go between BF3 and GoW3 and there's always this adjustment period to get used to the lower fps. Same goes for input lag. Some players are fine with 60-80ms in input lag on their display, but anything greater than 40ms feels off for me.

But that's all beside the point. This is an RPG and I'd be fine with 30fps with graphics/gameplay mods running. As it stands this engine cannot provide that. That's what matters. That's why the engine needs to be optimized. It's about the future of the modding community and any prospect of modding Skyrim to look like a decent PC title in 2011. I'd love to be able to have high-res textures for the entire world. It would be sweet to have a Graphics Extender that added Godrays and HBAO, etc. None of that will be feasible without engine optimization. With how much of a CPU bottleneck there is atm, even gameplay mods that use scripting will have their limits.
User avatar
Robert Bindley
 
Posts: 3474
Joined: Fri Aug 03, 2007 5:31 pm

Post » Wed May 30, 2012 9:52 pm

I hope they fix a lot more than one thing, and I hope they focus more on game balance and logic bugs, like enchant scaling, and soul gems that can't hold a charge, and various skills that don't work correctly at or above 100 (e.g. sneak), and broken/nonexistent follower level scaling. Although to be fair some people seem to consider the soul gem thing more of a feature than a bug, so that's debatable.

I suspect most of the people with more serious errors need to sort out their hardware. I have a very typical setup, but I've got a good PSU and I run everything at stock settings. I've had no more than two or three CTDs after many hours of play and a few more where it didn't crash but needed to be restarted. Granted the shadow renderer is disappointing but otherwise the game looks great and is performant.
User avatar
Inol Wakhid
 
Posts: 3403
Joined: Wed Jun 27, 2007 5:47 am

Post » Wed May 30, 2012 5:18 pm

I wish I had your FPS, mine goes below 20 in some towns (no matter what graphic settings I use)

Sorry, but that doesn't sound right.

The game runs butter smooth for me, better than some games from five years ago,
but as soon as I turn to face any town or small settlement the framerate plummets dramatically.
Yet this doesn't happen on my six-year-old Xbox 360.
Why???
User avatar
JD FROM HELL
 
Posts: 3473
Joined: Thu Aug 24, 2006 1:54 am

Post » Wed May 30, 2012 11:00 pm

It's the CPU bottleneck of this engine. It seems to only be able to utilize 2 cores. Windows is able to distribute the 2-core load across 4 cores, but you still end up with only 40-50% of a quad core being used at most. The reason I get better fps than you do is mostly because my 2500K is OC'd to 4.2GHz. If I took the time to OC to 4.7GHz I'd probably get 10fps more. The sad, no, the pathetic part about this bottleneck is that Oblivion suffers from the same thing. Right now, your only hope for decent fps everywhere is to OC your CPU, and that's not something most users know how to do, nor should do. If SB CPUs weren't so easy to OC I probably wouldn't have gotten around to OCing mine yet. 4.2GHz at stock voltages. Any higher and I'd have to do that whole dance of (increase vcore>post)*x+4hrsPrime95>12hrsPrime95... a ridiculously time-consuming process considering the gains in other applications will be minimal, but nonetheless the only option available to anyone wishing to further increase their fps in this game.

Prime95 is rather unreliable. Try 10 minutes of Linpack.
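As a rough sanity check on the overclocking claim above (purely illustrative: it assumes a scene that is 100% CPU-bound and that fps scales linearly with core clock, which real games only approximate, so treat it as an upper bound):

```python
# Naive upper-bound estimate of fps gain from a CPU overclock.
# Assumption (mine, not the poster's): fully CPU-bound scene, fps
# proportional to clock speed. Real-world gains are usually smaller.
def estimated_fps(base_fps, base_clock_ghz, new_clock_ghz):
    return base_fps * new_clock_ghz / base_clock_ghz

# e.g. a 40 fps city scene on a 2500K at 4.2 GHz, pushed to 4.7 GHz:
print(round(estimated_fps(40, 4.2, 4.7), 1))  # -> 44.8
```

So even under the most optimistic assumption, a 4.2 to 4.7 GHz bump buys roughly 5 fps on a 40 fps scene, not 10; memory latency and the engine's own serialization eat into even that.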
User avatar
Penny Courture
 
Posts: 3438
Joined: Sat Dec 23, 2006 11:59 pm

Post » Wed May 30, 2012 7:40 pm

The only thing I want patched is separating the game's physics from the framerate. I want to turn off Vsync without any side effects.

Erm....wouldn't physics need to be tied to video output framerate? I think Vsync is required by the game engine.
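It doesn't have to be tied to the render rate, actually. The standard decoupling trick is a fixed-timestep simulation loop: physics always advances in constant increments and the renderer runs as fast as it likes. A minimal sketch of the pattern (all names here are illustrative, not from any real engine):

```python
# Fixed-timestep simulation loop ("fix your timestep" pattern).
# Physics advances in constant DT steps regardless of render framerate,
# so turning vsync off cannot speed up or glitch the simulation.

DT = 1.0 / 60.0  # fixed physics step, independent of render rate

def run(frame_times, step):
    """frame_times: wall-clock duration of each rendered frame (seconds).
    step: callback that advances the simulation by exactly DT."""
    accumulator = 0.0
    steps = 0
    for frame_dt in frame_times:
        accumulator += frame_dt   # bank the real time that passed
        while accumulator >= DT:  # catch up in fixed increments
            step(DT)
            accumulator -= DT
            steps += 1
    return steps

# Rendering at ~30 fps for one second still yields ~60 physics steps:
print(run([1.0 / 30.0] * 30, lambda dt: None))  # -> 60
```

With a loop like this the simulation behaves identically whether the renderer outputs 30, 60, or 144 frames per second, which is exactly what the "separate physics from framerate" request amounts to.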
User avatar
Tanya Parra
 
Posts: 3435
Joined: Fri Jul 28, 2006 5:15 am

Post » Wed May 30, 2012 2:54 pm

I support this thread.
Me too.
User avatar
NeverStopThe
 
Posts: 3405
Joined: Tue Mar 27, 2007 11:25 pm

Post » Wed May 30, 2012 10:56 pm

Errr...90% of the stuff the OP asked for is done by this latest patch.....why was this Necroed and that not mentioned?
User avatar
Queen of Spades
 
Posts: 3383
Joined: Fri Dec 08, 2006 12:06 pm

Post » Thu May 31, 2012 5:50 am

1) anyone can tell the difference between 30 and 60 fps. anyone can tell the difference between 60 and 120 fps. get an fps limiter. set it to 30. look around in a circle with your mouse. alt tab out. set it to 60. look around in a circle with your mouse. alt tab out. set it to 120. look around in a circle with your mouse. the difference is not subtle, and if you say you can't tell you're being disingenuous. warning: that last part won't work without a 120 hz monitor.
2) v-sync is not required by the engine. you can turn it off without side effects, but you must use an fps limiter to avoid physics glitches. as high as 80 fps works fine.
3) patch 1.310 introduced the 4GB (LAA) support
4) patch 1.4 optimized the game to a slight degree. people have reported 5-20 fps increases. i've heard that the changes made were akin to those done in the mod skyboost, which "Boosts your Skyrim via replacing most bottleneck code parts in the executable at runtime."

i do share concern about the viability of playing a heavily modded skyrim on this engine, however. the fact remains that the improvements have been minor, and the game is still cpu-limited. i'd like someone who knows what they are talking about to comment on how dire the situation is.

p.s. with 580s in SLI and my i7-2600k at 4.2, i am still very much subject to framerates in the 40s, even 30s, on occasion, or if i crank up ugrids to 7, say. haven't experimented with mods.
User avatar
Kellymarie Heppell
 
Posts: 3456
Joined: Mon Jul 24, 2006 4:37 am

Post » Thu May 31, 2012 5:06 am

You know guys I'm seriously jealous of your amazing eyesight or other superhero senses that let you tell 40 fps from 20. I barely see any difference worth caring about between 20 and 60 o_o and that's what my Skyrim is allegedly constantly fluctuating between...
Well, the human eye cannot differentiate the difference with anything past 32 FPS. Just how I love hearing people say that they have some audio hardware that can detect and/or play audio that is in the 40Khz range...I always lol at stuff like this, for the simple fact that humans cannot hear over 12.5Khz. So that money they spent on that hardware is completely wasted. <---Audio Engineer.
User avatar
Rozlyn Robinson
 
Posts: 3528
Joined: Wed Jun 21, 2006 1:25 am

Post » Thu May 31, 2012 4:41 am

In order to 'optimize the engine', bethesda would have to hire some quality programmers.....which i don't think is going to happen anytime soon.
Sorry B, love your art dept, but your coders are all on skooma. I'd like to leave a somewhat constructive suggestion though... FIRE THEM.
User avatar
Amanda Leis
 
Posts: 3518
Joined: Sun Dec 24, 2006 1:57 am

Post » Thu May 31, 2012 12:57 am

Game runs at constant 60fps on ultra with medium shadows, 1080p, hd tex mods... maybe they tailored the game to my system, but it feels pretty optimised to me.
User avatar
BaNK.RoLL
 
Posts: 3451
Joined: Sun Nov 18, 2007 3:55 pm

Post » Wed May 30, 2012 8:06 pm

Well, the human eye cannot differentiate the difference with anything past 32 FPS. Just how I love hearing people say that they have some audio hardware that can detect and/or play audio that is in the 40Khz range...I always lol at stuff like this, for the simple fact that humans cannot hear over 12.5Khz. So that money they spent on that hardware is completely wasted. <---Audio Engineer.

fail.

a site explaining why the human eye can detect differences between, say, 30 and 60 fps: http://amo.net/NT/02-21-01FPS.html DO A TEST FOR YOURSELF AND SEE--it's that simple. i guarantee you that you can tell. you. can. tell. you can tell. stop talking and go look. stop being thought-noobs.
User avatar
Kirsty Collins
 
Posts: 3441
Joined: Tue Sep 19, 2006 11:54 pm

Post » Wed May 30, 2012 11:36 pm

I mean, it possible to make it DX9 to DX11, they did it for Crysis 2 and Shogun Total War. C'mon, don't be so lazy and join the 21st century!

DX9 is faster than DX10, which is faster than DX11... (Running the same commands.)

DX10 just adds more commands to DX9, and nothing special. (It is also only available on Vista and Win7, not xbox or XP.)
DX11 adds a lot of coolness, at the cost of near-death of the games... "Crysis 2" Average FPS is about 20FPS across the board, unless you turn off the effects, in which case, it is reduced to DX9, and still runs slower on DX11, than it does on an XP rig, with the same DX9-only settings. DX11 is a Win7-only option. (You now exclude X-Box, XP, and Vista.)

However...

If they actually used "tessellation", and used it correctly (thus removing all the LOD junk from models), the game would be a LOT faster, and easier to manage... if "tessellation" actually worked like it should, which it does not, still...

Other than that, DX11 offers nothing special. Great looking water, skies, and vines... But simple programming can/does surpass those "auto-features".

Shaders... Now there is something that they could use. Since they keep trying to code shaders, and do it horribly... Using DX11 would just be even slower, unless they actually use the DX11 shaders, correctly. Since they can't manage DX9 shaders, I fail to believe that they could manage the more complex DX11 shaders code. Obviously, someone needs to take a shaders-class.
User avatar
TASTY TRACY
 
Posts: 3282
Joined: Thu Jun 22, 2006 7:11 pm

Post » Wed May 30, 2012 1:50 pm

Well, the human eye cannot differentiate the difference with anything past 32 FPS. Just how I love hearing people say that they have some audio hardware that can detect and/or play audio that is in the 40Khz range...I always lol at stuff like this, for the simple fact that humans cannot hear over 12.5Khz. So that money they spent on that hardware is completely wasted. <---Audio Engineer.

hahahaha OMG. humans can hear up to 20kHz, not 12.5. human range is 20Hz - 20kHz. and an audio device that advertises 48kHz (or 44.1, 96, never 40) is referring to sample rate, which goes hand in hand with bit depth to define the resolution of a digital sample of an analog wave. not the same thing as audible frequency range. you have absolutely no idea what you are talking about.

sorry i know this is off topic but i had to jump in. esp at '<---audio engineer' omg i almost fell off my chair.
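and for anyone who wants the arithmetic behind that (a quick sketch of the Nyquist limit: a converter sampling at rate fs can only represent frequencies up to fs/2, which is the whole point of 44.1kHz covering the 20Hz-20kHz hearing range):

```python
# Nyquist limit: the highest frequency representable at a given
# sample rate is half that rate. This is why 44.1 kHz audio covers
# the full ~20 kHz range of human hearing with headroom to spare.
def nyquist_khz(sample_rate_khz):
    return sample_rate_khz / 2

for fs in (44.1, 48.0, 96.0):
    print(f"{fs} kHz sampling -> frequencies up to {nyquist_khz(fs)} kHz")
```

so a "96kHz" spec sheet is promising headroom in the sampling process, not claiming your ears (or the gear) reproduce a 48kHz tone.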
User avatar
Jade MacSpade
 
Posts: 3432
Joined: Thu Jul 20, 2006 9:53 pm

Post » Wed May 30, 2012 4:20 pm

In order to 'optimize the engine', bethesda would have to hire some quality programmers.....which i don't think is going to happen anytime soon.
Sorry B, love your art dept, but your coders are all on skooma. I'd like to leave a somewhat constructive suggetion though...FIRE THEM.
Ouch... LOL...

No, teach them... Since they are already up to their ears in the code, why bring someone else in to clean up, reinventing the wheel... Just get them training wheels!

In all fairness... Look at what they had to start with... The engine they damned themselves to, was not the best choice. (It wasn't even a good choice.) The code they added made it soooo much better, but now they are still haunted by the core elements, which they duct-taped onto the original duct-taped components.

The biggest issues are related to scripting-code execution, and data-base/cache/memory management. (Ignoring the shader-issues, since those can be adjusted or turned-off.) Failing scripts, non-acting scripts, data-corruption, failure to clean-up junk-data, poor cache system... To me, are the largest issues, which are NOT related to the original engine, which was mostly a rendering engine.
User avatar
jenny goodwin
 
Posts: 3461
Joined: Wed Sep 13, 2006 4:57 am

Post » Thu May 31, 2012 1:41 am

I think they need to release a Creation Engine SDK and let people like Arisu take a look at the engine source code. If Epic and Valve can do it, why not Bethesda?
User avatar
Ladymorphine
 
Posts: 3441
Joined: Wed Nov 08, 2006 2:22 pm

Post » Wed May 30, 2012 3:29 pm

I guess the performance seems to be rather hardware specific; I'm running the game with 4xMSAA, injected SMAA, highly tweaked "high" shadows, rest of the stuff maxed out, some draw distances like grass and specularity amped, uGrids at 7, with a few high resolution textures replacing the lowest res textures in the game, more detailed LOD meshes, and for most parts the game runs at a stable 60fps. The lowest I've seen since the beta patch is around 40fps, but that's extremely rare. It's not like my hardware's brand new either; the i920's still a capable CPU but I have it at stock clocks currently, and while still pretty good, the 5850 is behind the likes of the 570 or 6970.

Then again, my experience with Skyrim has been very positive: I've had zero crashes in almost 300hrs of gameplay across 2 characters, and the only bug I can think of is the z-fighting in the mountains west of Whiterun. That doesn't mean the engine's very well optimized, but when it works it's pretty good.
User avatar
Kathryn Medows
 
Posts: 3547
Joined: Sun Nov 19, 2006 12:10 pm
