» Wed May 09, 2012 6:15 am
My understanding is that most computer games are essentially 3D-ready by default, because the GPU renders the scene in 3D space anyway. One merely needs a suitable driver to render that scene from two eye positions and deliver both views to the display device.
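As I understand it, such a stereo driver basically just re-renders each frame from two slightly separated camera positions. Roughly something like the sketch below (all names here are made up for illustration, not any real driver API):

[code]
// Sketch of what a stereo driver conceptually does: render the same 3D
// scene twice per frame, from two horizontally offset eye positions.
// All names (renderScene, Eye, Camera) are illustrative placeholders.
#include <cstdio>

enum class Eye { Left, Right };

struct Camera {
    float x, y, z;   // world-space position
    // (orientation, FOV, etc. omitted for brevity)
};

// Stub standing in for the game's normal render pass.
void renderScene(const Camera& cam, Eye eye) {
    std::printf("%s eye rendered from x = %+.4f\n",
                eye == Eye::Left ? "left " : "right", cam.x);
}

int main() {
    Camera mono{0.0f, 1.7f, 0.0f};        // the game's single camera
    const float eyeSeparation = 0.065f;   // ~6.5 cm, a typical interocular distance

    // The driver re-renders the frame from two cameras shifted half the
    // separation to either side; shutter glasses then alternate the two
    // views in sync with the display's refresh.
    Camera left  = mono; left.x  -= eyeSeparation / 2.0f;
    Camera right = mono; right.x += eyeSeparation / 2.0f;

    renderScene(left,  Eye::Left);
    renderScene(right, Eye::Right);
}
[/code]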
In fact, many years ago I tried out a number of games (including some of the early Tomb Raiders on Win98SE and Oblivion on Win2k) in 3D, using a set of cheapish LCD shutter goggles that connected to the analog video cable going to a CRT monitor and were supported by a 3D Stereo add-on to nVidia's GeForce video drivers.
My experience was that it usually worked more or less OK (apart from an occasional loss of synchronization), but the main problem was that any additional 2D elements, such as crosshairs, menus, or HUD overlays, were usually disturbingly misplaced, unless the game was specifically designed to give them a well-defined, appropriate depth in the viewing field. I would guess that is still the one thing developers specifically have to take into account if they want a game to be perfectly playable in 3D.
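The reason, as far as I can tell, is that a 2D element drawn at the same pixel position in both eye views sits exactly at screen depth, so it "floats" disconnected from the 3D scene behind it. The fix would be to give each element a per-eye horizontal (parallax) shift matching the depth it should appear at, e.g. the depth of whatever the crosshair points at. A rough sketch, with made-up names and values:

[code]
// Sketch of the HUD-depth fix: shift a crosshair horizontally in each
// eye's view by a parallax offset derived from the depth of the
// aimed-at object. Names, units, and values are illustrative only.
#include <cstdio>

// Screen parallax for an object at distance objDist, with the stereo
// convergence plane at screenDist and eye separation eyeSep (all in the
// same world units). Zero at the screen plane, approaching eyeSep as
// the object recedes toward infinity.
float parallax(float objDist, float screenDist, float eyeSep) {
    return eyeSep * (objDist - screenDist) / objDist;
}

int main() {
    const float eyeSep     = 0.065f;  // ~6.5 cm
    const float screenDist = 2.0f;    // convergence plane 2 m away

    // Suppose a depth-buffer read under the crosshair says the aimed-at
    // wall is 10 m away (hypothetical value for illustration).
    float p = parallax(10.0f, screenDist, eyeSep);

    // Each eye's crosshair is shifted half the parallax in opposite
    // directions; a game would convert this world-space offset to pixels.
    std::printf("left eye offset:  %+.4f m\n", -p / 2.0f);
    std::printf("right eye offset: %+.4f m\n", +p / 2.0f);
}
[/code]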
Still, I would be interested to hear how that might work differently today, if someone has experience with a modern 3D kit, especially with Bethesda games.