You were trying to tell me higher FPS = better ...?

Post » Wed Dec 05, 2012 1:52 pm

Motion blur is sometimes overdone but its original purpose is in fact to simulate real motion blur, not give the viewer another "special effect" to gawk at.
If that's truly the intended purpose then to my eyes they've failed miserably. :shrug:

Which is exactly what they do in certain implementations of the effect. It's called triple buffering.
I'm actually well aware of how 3D graphics rendering works. AFAIK triple buffering, useful as it is, cannot predict the future. :tongue: It's also expensive. If you're having trouble getting more than 30 fps out of a game, triple buffering is not going to help things. At all. Triple buffering on top of running the algorithm to compare and merge/blur the frames every 1/30th of a second would be even more expensive. At some point you're going to run into a "diminishing returns" situation.
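To illustrate the kind of per-frame work I mean, here's a rough Python sketch of the dumbest possible frame-blend blur (purely illustrative, no real engine does it this literally): the blur can only look backwards at frames that already exist, and it's yet another pass over every pixel on top of the normal rendering.

    # Toy sketch only: smear the previous frame into the current one.
    # Real engines use velocity buffers and shaders, but the principle is
    # the same: the blur is built from frames that have already happened.

    def blend_frames(current, previous, blur_strength=0.5):
        """Blend the previous frame into the current one, pixel by pixel."""
        return [c * (1.0 - blur_strength) + p * blur_strength
                for c, p in zip(current, previous)]

    # Pretend a frame is just a flat list of brightness values.
    frame_t0 = [0.0, 0.0, 1.0, 0.0]   # object at position 2
    frame_t1 = [0.0, 0.0, 0.0, 1.0]   # object has moved to position 3

    print(blend_frames(frame_t1, frame_t0))   # [0.0, 0.0, 0.5, 0.5] -- a trail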

And even if they don't do that, the effect still has real benefits for most people, myself included. Though as I said before, if you don't find it convincing... well, I won't argue that.
That's great. It gives me a headache. :foodndrink:

My only point was that comparing film framerate and game framerate isn't really valid because of the many differences in the mediums. There are reasons that people say that games need to be rendered at 60+ fps that don't apply to recorded film by its nature.
User avatar
Shianne Donato
 
Posts: 3422
Joined: Sat Aug 11, 2007 5:55 am

Post » Wed Dec 05, 2012 3:13 pm

Motion blur is sometimes overdone but its original purpose is in fact to simulate real motion blur, not give the viewer another "special effect" to gawk at.
I always thought it was a special effect too. It doesn't make the game look any better when the framerate is low, it's just a fancy screen effect.
User avatar
Spencey!
 
Posts: 3221
Joined: Thu Aug 17, 2006 12:18 am

Post » Wed Dec 05, 2012 6:12 pm

When it comes to movies I worry more about script, story, dialogue, characters, setting, actual stunts over CGI, etc. than wtf the FPS is.

What is FPS?

Also I'm more worried about the gameplay and writing in a game before we worry about how crisp and clear it looks
User avatar
R.I.p MOmmy
 
Posts: 3463
Joined: Wed Sep 06, 2006 8:40 pm

Post » Wed Dec 05, 2012 7:20 am

What is FPS?
Frames (Pictures) Per Second.
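Or, put in terms of the time budget each frame gets (just simple arithmetic):

    # One second divided by the framerate = the time each frame gets.
    for fps in (24, 30, 48, 60):
        print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")   # 41.7, 33.3, 20.8, 16.7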

I'm actually well aware of how 3D graphics rendering works. AFAIK triple buffering, useful as it is, cannot predict the future. :tongue:
Of course it can't. That's why there's inherent video latency in the technology. The "future frame" is actually the current frame, the "current" frame is actually the previous frame, and so on. Not that that matters to anything but input, anyway.
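If it helps, here's a toy Python illustration of that lag (nothing to do with how a driver actually juggles buffers, just the queueing idea): by the time an image reaches the screen, the renderer has already moved on.

    # Toy queue only -- not real graphics code. The point is that whatever
    # is on screen is an older frame than the one being rendered right now.
    from collections import deque

    buffers = deque()
    for frame_number in range(1, 6):
        buffers.append(f"frame {frame_number}")   # renderer finishes a frame
        if len(buffers) == 3:                     # three buffers in flight
            shown = buffers.popleft()             # display scans out the oldest
            print(f"rendering frame {frame_number}, showing {shown}")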

It's also expensive.
For video memory, maybe a bit. It also causes a few annoying side effects for graphics programmers to deal with. But other than that, I wouldn't call it "expensive" at all.

That's great. It gives me a headache. :foodndrink:
Then turn it off.

My only point was that comparing film framerate and game framerate isn't really valid because of the many differences in the mediums. There are reasons that people say that games need to be rendered at 60+ fps that don't apply to recorded film by its nature.
And there are other reasons that do apply. Computer graphics and film technology are more interrelated than they have ever been before. It's not valid to say that you can't make any comparisons between them, especially in something so fundamental as framerate and the impression of motion.

I always thought it was a special effect too. It doesn't make the game look any better when the framerate is low, it's just a fancy screen effect.
It's subtle and it obviously has varying degrees of success for different people, but the point is that it's not useless and it is certainly more than a "fancy screen effect." This is true whether or not you actually appreciate it.
User avatar
Ann Church
 
Posts: 3450
Joined: Sat Jul 29, 2006 7:41 pm

Post » Wed Dec 05, 2012 11:22 am

Of course it can't. That's why there's inherent video latency in the technology. The "future frame" is actually the current frame, the "current" frame is actually the previous frame, and so on. Not that that matters to anything but input, anyway.
It does matter, apparently. When the view is moving quickly (like when controlling an FPS game with a mouse) motion blur causes a "streaking" effect due to that latency and the fact that the game can't actually predict what the next set of user input is going to look like. :shrug: I could see this effect being lessened quite a bit when the differences in the rendered scene from one frame to the next aren't as large, though.

For video memory, maybe a bit. It also causes a few annoying side effects for graphics programmers to deal with. But other than that, I wouldn't call it "expensive" at all.
Run a game on a rig that's on the fringe of the game's system requirements with triple buffering on and off and then tell me it's not expensive. It's not going to halve your framerate or anything like that, but it does cost you a good chunk of resources. Yes, it's more of a video memory (and memory bandwidth) hit than anything, but that matters too. If it can potentially make a noticeable difference in performance then I consider it "expensive." Of course, depending on the situation it can be less expensive than traditional "vsync" methods. That's not terribly surprising, though, since older methods used to achieve vertical sync with a display usually consisted of fairly arbitrarily dropping frames.

Don't get me wrong...triple buffering is awesome. If you're struggling to get framerates that seem smooth, however, it's probably not going to improve the situation, and neither is piling motion blur calculations on top of it.
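Just to put rough numbers on the memory side of it (back-of-the-envelope only, assuming plain 32-bit colour buffers at 1080p, no MSAA or HDR formats):

    # Back-of-the-envelope VRAM cost of the extra colour buffer.
    width, height, bytes_per_pixel = 1920, 1080, 4

    one_buffer_mb = width * height * bytes_per_pixel / (1024 * 1024)
    print(f"one colour buffer: {one_buffer_mb:.1f} MB")       # ~7.9 MB
    print(f"double buffering:  {2 * one_buffer_mb:.1f} MB")   # ~15.8 MB
    print(f"triple buffering:  {3 * one_buffer_mb:.1f} MB")   # ~23.7 MB

Not a huge number on paper, but on a card that's already at the edge of the game's requirements, that's VRAM and bandwidth you don't get back.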

Then turn it off.
I do. No need to get defensive...I'm not arguing against motion blur or telling you that you're wrong for liking it. I'm just saying that it doesn't compensate for lower framerates in the way that natural film motion blur does.

And there are other reasons that do apply. Computer graphics and film technology are more interrelated than they have ever been before. It's not valid to say that you can't make any comparisons between them, especially in something so fundamental as framerate and the impression of motion.
I disagree. Film has the advantage of not needing to be rendered in real-time. This makes a huge difference in considerations regarding framerate and impression of motion. :shrug: The OP is referring to the "conventional wisdom" that is often repeated around gaming forums and such that states that higher framerates are better. I was just pointing out that there are some legitimate reasons that a direct comparison of framerates in games and film is full of caveats.
User avatar
Zoe Ratcliffe
 
Posts: 3370
Joined: Mon Feb 19, 2007 12:45 am

Post » Wed Dec 05, 2012 7:36 pm

I've been playing CoD for years, so I doubt the Hobbit's measly 40-some FPS will faze me.
User avatar
[ becca ]
 
Posts: 3514
Joined: Wed Jun 21, 2006 12:59 pm

Post » Wed Dec 05, 2012 7:44 am

I've been playing CoD for years, so I doubt the Hobbit's measly 40-some FPS will faze me.

Again, FPS in a movie =/= FPS in a game. Filming at a higher rate makes the cinematography different - film records blur on each frame, there's film grain, etc... various things that are aspects of recording a video (which have also changed with the shift to digital cameras, of course - they react differently than an actual film camera would).

What people are saying is that a 48fps movie looks different than a 24fps movie. Not that the actual framerate gives them problems - but that the higher framerate makes the film seem unlike a "film". (You know, kind of like you can't record a TV show for HD the same way you could for SD - they need different lighting & makeup, or it looks bad. Or the way some people say that a HD movie watched on an HDTV can be "too clear" or "too sharp".)
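For anyone wondering about the "blur on each frame" part: film cameras conventionally use something like a 180-degree shutter, so each frame is exposed for roughly half the frame time, and anything that moves during that window gets smeared into the frame itself. Rough sketch with a made-up pan speed:

    # Illustration only: how much smear gets baked into a single film frame.
    fps = 24
    shutter_fraction = 0.5                    # the common 180-degree shutter
    exposure_time = shutter_fraction / fps    # seconds each frame is exposed

    object_speed_px_per_s = 2000              # made-up: something panning fast
    blur_length = object_speed_px_per_s * exposure_time

    print(f"exposure per frame: {exposure_time * 1000:.1f} ms")   # ~20.8 ms
    print(f"smear in the frame: {blur_length:.0f} px")            # ~42 px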
User avatar
Chris Guerin
 
Posts: 3395
Joined: Thu May 10, 2007 2:44 pm

Post » Wed Dec 05, 2012 7:12 am

I've been playing CoD for years, so I doubt the Hobbit's measly 40-some FPS will faze me.
Again, FPS in a movie =/= FPS in a game. Filming at a higher rate makes the cinematography different - film records blur on each frame, there's film grain, etc... various things that are aspects of recording a video (which have also changed with the shift to digital cameras, of course - they react differently than an actual film camera would).

What people are saying is that a 48fps movie looks different than a 24fps movie. Not that the actual framerate gives them problems - but that the higher framerate makes the film seem unlike a "film". (You know, kind of like you can't record a TV show for HD the same way you could for SD - they need different lighting & makeup, or it looks bad. Or the way some people say that a HD movie watched on an HDTV can be "too clear" or "too sharp".)
Exactly. To see an exaggerated version of this, watch some 24fps or 30fps content on a TV that supports 120Hz or higher refresh rates (these usually have some sort of frame interpolation post-processing built-in). Watch the content with the frame interpolation turned off, then again with it turned on (artificially boosts the framerate of the content). Bam, that episode of Mythbusters suddenly looks like it was filmed using a home camcorder from the '90s.
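(For the curious, the crudest possible version of what those interpolation chips do looks something like the sketch below. Real TVs use motion-compensated interpolation, which is far smarter than a straight blend, but the job is the same: invent in-between frames that were never filmed.)

    # Toy example only: fake a 48fps sequence out of 24fps frames by blending.
    def interpolate(frame_a, frame_b, t=0.5):
        """Make an in-between frame that the camera never captured."""
        return [a * (1.0 - t) + b * t for a, b in zip(frame_a, frame_b)]

    real_24fps = [[0.0, 1.0, 0.0],                             # two real frames
                  [0.0, 0.0, 1.0]]
    fake_48fps = [real_24fps[0],
                  interpolate(real_24fps[0], real_24fps[1]),   # invented frame
                  real_24fps[1]]

    for frame in fake_48fps:
        print(frame)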

Again, I haven't seen the 48fps version of The Hobbit, and I'm assuming that actually filming it in 48fps makes a difference (I'd also assume that there are differences in the editing and post-processing techniques), but in general people aren't used to seeing film at higher framerates. It doesn't look the same as higher framerates in video games, and it can genuinely change the feel of the content. Some people like it and some don't.
User avatar
Guy Pearce
 
Posts: 3499
Joined: Sun May 20, 2007 3:08 pm

Post » Wed Dec 05, 2012 4:47 pm

Motion blur is sometimes overdone but its original purpose is in fact to simulate real motion blur, not give the viewer another "special effect" to gawk at.
I disagree. For me it's no different from other game special effects that have gotten more and more common this console generation, like the film grain effect, which usually doesn't really look like real film grain either, especially not at higher resolutions.

For its original purpose it kinda sucks in games, mainly for two reasons:
1) It can't predict the future, which film can easily do.
2) The game framerate fluctuates and is not constant, never giving the eyes a chance to get used to the speed of the blur effect (see the rough numbers at the end of this post).

Even the Uncharted series, which arguably has the best game motion blur when you spin the camera around the main character, is pretty bad compared to real film motion blur. But no game creator would call their motion blur "horribly bad flashy fake motion blur" in the options; they just call it motion blur. Well, if they have the decency to allow the motion blur to be turned off, that is :P
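Rough numbers for point 2, just to show what I mean by the blur never settling down (toy figures, not from any real game):

    # Toy figures: a frame-time-scaled blur trail bounces around with the
    # framerate, so the amount of smear is never consistent.
    object_speed_px_per_s = 2000
    frame_times_s = [1/60, 1/34, 1/52, 1/28, 1/45]   # a jittery game

    for dt in frame_times_s:
        print(f"{1/dt:5.0f} fps -> blur trail of roughly {object_speed_px_per_s * dt:3.0f} px")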
User avatar
CHANONE
 
Posts: 3377
Joined: Fri Mar 30, 2007 10:04 am
