bfloatpointrenertarget?

Post » Mon May 28, 2012 8:34 pm

Hmm, what does it actually do?? I tried changing it between 1 and 0 and nothing changed...
Jonathan Braz
 
Posts: 3459
Joined: Wed Aug 22, 2007 10:29 pm

Post » Tue May 29, 2012 7:01 am

Hmm, what does it actually do?? I tried changing it between 1 and 0 and nothing changed...

You surely mean bFloatPointRenderTarget. Altering it won't lead to any visible changes because it just determines which pixel format the engine uses internally for the render target that frames are drawn to before being displayed. If I remember correctly, ENB requires this to be set to 1 in order to work properly, but aside from that, it's not interesting.
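For reference, the setting lives in SkyrimPrefs.ini (under My Documents\My Games\Skyrim on a typical install; the exact path may differ) in the [Display] section. A sketch of what the relevant lines look like:

```ini
[Display]
; 1 = floating point render target (ENB presets typically require this)
; 0 = default integer render target (what the vanilla game uses)
bFloatPointRenderTarget=1
```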
Alex Vincent
 
Posts: 3514
Joined: Thu Jun 28, 2007 9:31 pm

Post » Mon May 28, 2012 10:12 pm

Aah yes, sorry for the typo, hehehe... Ooh, I see... so it doesn't really affect performance unless I use ENB... Okay then... thanks a lot, man :D
Eire Charlotta
 
Posts: 3394
Joined: Thu Nov 09, 2006 6:00 pm

Post » Mon May 28, 2012 7:18 pm

Yeah, all I know is that back in the day I once flipped it from 0 to 1, and my next play session with Skyrim looked like something from an acid trip until I managed to reach a door and load a new scene. And yeah, it was required to get the ENB engines working.
Rik Douglas
 
Posts: 3385
Joined: Sat Jul 07, 2007 1:40 pm

Post » Mon May 28, 2012 3:48 pm

ENB makes a modification with that setting enabled.

Otherwise, without ENB, it should be set to 0.

Enabling it switches to a higher-precision floating point render target instead of the dithered, much faster rendering mode we get without it. We'd see a significant reduction in performance; for some people it'll cause the frame rate to tank toward 0 in many cases.

It's BEST to leave this disabled. Even with ENB I ran 0, only enabling it to check. Some graphics cards may require it to be set to 1 with ENB.

Visual quality differences are basically zero otherwise. No point in enabling it unless you're having a bugged ENB issue.
Alex Blacke
 
Posts: 3460
Joined: Sun Feb 18, 2007 10:46 pm

Post » Mon May 28, 2012 8:22 pm

Orly now? I have it enabled for ENB, but perhaps it's time to play with that value again and see what happens if I go back to 0. Neat. Thanks, bud.
Catherine Harte
 
Posts: 3379
Joined: Sat Aug 26, 2006 12:58 pm

Post » Tue May 29, 2012 4:06 am

When I installed ENB, I noticed it wanted that changed. I loaded the game first without changing it and it worked fine (I'm using Nvidia). I did change it to 1 and loaded the game; it seemed the same, but I figured if it worked at the default setting, why mess with it? So I set it back.

I wonder if this is what's causing a lot of people to now be experiencing the drop from 50-60 FPS to 5-10?
Myles
 
Posts: 3341
Joined: Sun Oct 21, 2007 12:52 pm

Post » Mon May 28, 2012 7:59 pm

This parameter turns on HDR rendering in the game; if set to 0, the game draws to the old non-HDR surface. But the vanilla game does not use HDR at all, so you will not see any changes except some performance loss (noticeable only on budget video cards with 64-128 bit memory buses). If someone has a huge performance hit with this enabled, it's a driver bug and you have almost certainly forced antialiasing in the NVIDIA Control Panel or Catalyst Control Center, because all modern GPUs support antialiasing of 64-bit HDR textures. So if you are playing without any graphics mods that produce HDR output (brightness above 1.0; some mods not based on d3d9.dll do this too, for example by modifying the intensity of the sky), set this to 0. If you play with ENBSeries, set it to 1, because otherwise the HDR colors will be clamped and will look ugly and grayish.
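The clamping described above can be sketched in a few lines of plain Python (a toy illustration, nothing from the actual engine): an HDR intensity above 1.0 survives a round trip through a floating point target but gets flattened to white in the old 8-bit one.

```python
import struct

def store_ldr(value):
    """8-bit unorm target: clamp to [0, 1] and quantize to 256 steps."""
    clamped = max(0.0, min(1.0, value))
    return round(clamped * 255) / 255

def store_fp16(value):
    """FP16 target: round-trip through IEEE half precision.

    Values above 1.0 are preserved (half precision ranges up to ~65504),
    so later tone mapping still has the bright detail to work with.
    """
    return struct.unpack('e', struct.pack('e', value))[0]

sky_brightness = 3.5  # a hypothetical HDR intensity produced by a mod
print(store_ldr(sky_brightness))   # 1.0 -> bright detail lost
print(store_fp16(sky_brightness))  # 3.5 -> preserved
```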
Your Mum
 
Posts: 3434
Joined: Sun Jun 25, 2006 6:23 pm

Post » Tue May 29, 2012 5:16 am

It's safe to enable this on any modern video card, and it does seem to improve fidelity, but you probably won't notice.
Stace
 
Posts: 3455
Joined: Sun Jun 18, 2006 2:52 pm

Post » Tue May 29, 2012 4:07 am

I've explained this thoroughly in another thread... which I can't find...

While it is directly related to HDR, setting it to 0 DOESN'T disable HDR. As far as I can tell, HDR cannot be disabled in this game; it can be toned down further with mods, but otherwise not disabled.

From what I can work out of the setting and its differences, all it does is increase the range of the HDR, and of other shader-specific values that can apply to many things outside of HDR.

Typically most programmers use FP16: it's fast, it's easy, and it doesn't require too much. And let's face it, the game is Xbox 360/PS3 friendly, and those don't exactly sport massive GPUs, so they are trying to optimise. They might even be using FP12.

There are also combinations: you can mix different elements of FP16/24/32. Obviously, the higher up you go, the larger the range, usually the better the precision, and of course the bigger the impact on performance. It's rare to see anything over FP24 used in PC games, or anything higher than FP16 on the Xbox/PS3 systems, IF they use something higher than 12.

It's still inadvisable to enable it. The engine really doesn't make good use of the higher precision, and like I said before, it has a good chance of killing your FPS or making it outright TANK.

Here's a snip from another forum, if you're willing to take a quick gander at some detailed info:

Let me clarify some of this. There are two main components to a floating point number, the mantissa (precision) and the exponent (range). In some cases, one is more important than the other (i.e. you want more precision when sampling textures, you want more exponent when demonstrating high dynamic range).

FP16 = s10e5, which means the mantissa is 10 bits plus 1 implied bit = 11 bits (the "s" represents the sign bit). The exponent is 5 bits, which means your maximum exponent is 15.

Since your maximum exponent is 15, that means the largest difference you can have between your dimmest areas and your brightest areas is only a factor of 2^15 = 32768. (I am neglecting non-normalized floating point values as those are uncommon.) Since your mantissa is 11 bits (note that this is lower than the precision of FX12), you can sample 2048 different positions from a texture map, but note that you'd want about 4-5 bits of subpixel precision, so you end up only being able to accurately sample a 256x256 texture.

FP24 = s16e7 = 17 mantissa bits, 7 exponent bits. This means your largest exponent is 63.

With such a large exponent, you can achieve differences in brightness of over 9.2 * 10^18! With 17 bits of mantissa, you can sample 2048x2048 textures and still have 4-5 bits of subpixel precision left over. Very well balanced.

FP32 = s23e8 = 24 mantissa bits, 8 exponent bits. This means your largest exponent is 127.

Now your range goes up to 1.7 * 10^38. Pretty much overkill compared to FP24. With 24 bits of mantissa, you could sample a 524288x524288 texture with 4-5 bits of subpixel precision. Again, overkill.

This doesn't mean that FP32 is bad. What it means is that FP32 is not likely to offer noticeable improvements over FP24 on today's shaders. The jump from FP16 to FP24 is very large, because you are getting a large jump in precision and range. The difference from FP24 to FP32 is mostly in the precision... which was already sufficient to begin with.
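The figures quoted in that snippet follow directly from the bit layouts, and can be checked with a few lines of arithmetic (a quick sketch, not from the original thread):

```python
# Derive max exponent, dynamic range, and mantissa resolution from the
# bit layouts quoted above. Mantissa bits include the implied leading 1.
formats = {
    'FP16 (s10e5)': (11, 5),
    'FP24 (s16e7)': (17, 7),
    'FP32 (s23e8)': (24, 8),
}

results = {}
for name, (mantissa_bits, exponent_bits) in formats.items():
    max_exp = 2 ** (exponent_bits - 1) - 1     # largest exponent for normal values
    results[name] = {
        'max_exponent': max_exp,
        'dynamic_range': 2.0 ** max_exp,       # brightest/dimmest ratio
        'mantissa_steps': 2 ** mantissa_bits,  # distinct sample positions
    }
    print(name, results[name])
# FP16: max exponent 15, range 32768, 2048 steps
# FP24: max exponent 63, range ~9.2e18, 131072 steps
# FP32: max exponent 127, range ~1.7e38, ~1.7e7 steps
```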
brenden casey
 
Posts: 3400
Joined: Mon Sep 17, 2007 9:58 pm

