In-game FXAA vs nVIDIA FXAA

Post » Thu Jun 14, 2012 11:13 am

I've searched on Bing, Google, nVIDIA's site, and several other tech sites for an answer and found nothing. In the words of the infamous Irina Spalko, "I vant to know!" Is there a difference between the in-game FXAA setting and nVIDIA's FXAA setting? I tried to compare them myself by enabling FXAA in nVIDIA Inspector and taking a screenshot, then switching over to the in-game setting, but for some reason the screenshot doesn't capture nVIDIA Inspector's setting--in the screenshot it looks like 0x AA, even though it's clearly applied in-game. I'm guessing the driver applies its post-processing after the point where the screenshot is grabbed. The same thing happens if you turn the brightness down to 0: the screenshot looks like it's back at the default, but in-game it's still at 0. Very weird.
Sammygirl
 
Posts: 3378
Joined: Fri Jun 16, 2006 6:15 pm

Post » Thu Jun 14, 2012 12:42 pm

Not that I can tell.

Honestly, I prefer the FXAA implementation (I'm playing on PC, mind you) to the more standard MSAA offering. It makes the game world feel and look a lot softer and more natural (not sure exactly how to describe it).
Matt Bee
 
Posts: 3441
Joined: Tue Jul 10, 2007 5:32 am

Post » Thu Jun 14, 2012 3:07 pm

I was previously using 2x MSAA + 2x SGSSAA with a -0.500 LOD bias, which made the game look awesome. But a few patches ago I found a fix for Skyrim's AO disabling itself by changing the compatibility setting to Fallout 3's, which in turn caused my FPS to drop below 60 (I refuse to run at anything lower), forcing me to lower one of my other settings--AA being my biggest FPS [censored].
OJY
 
Posts: 3462
Joined: Wed May 30, 2007 3:11 pm
