Hi Jimmy
It's not fair to blame AMD for this. How come every other game I've ever tried to play on this card with these drivers has worked just fine, but RAGE has well-known and well-documented issues with just about everyone using an AMD card? ID just did a lousy job optimizing this game for PC. They HAD to know it before release too. They just didn't care. They seriously gave us the shaft here, and considering that the game has been out for months now, I don't think they plan to rectify the situation either. It's completely inexcusable and I can't believe I spent my whole holiday downloading 21 gigs of this mess.
You don't know what is going on, so stop jumping to conclusions.
You clearly have little experience with these things (with PCs in general), so stop blaming ID; they have nothing to do with it. RAGE is one of the most bug-free and most optimized games ever made. If anything, even more effort went into the consoles, since it took more work to squeeze all the performance out of those terrible mainstream devices. The exception is the bug department: a few mistakes did slip through, for example the in-game menu mouse sensitivity, and a few other human errors so small and subtle that they have no impact on gameplay.
RAGE uses the OpenGL graphics API; nearly all other (high-profile) games use DirectX.
AMD/ATI just didn't put as much effort into their OpenGL driver as Nvidia did; that's it. That's why all the other games you played worked with the same drivers: it's not the same code, it's totally separate. You just had the same Catalyst release, but inside it there are different drivers for different APIs.
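If you want to see what the OpenGL side of your driver actually reports (as opposed to the Catalyst package number), you can simply query the GL strings. Here is a minimal sketch in C++, assuming GLFW is installed just to get a context; the hidden-window setup is only one convenient way to do that:

// query_gl_driver.cpp - print which OpenGL driver/version is actually in use.
// Build example (Linux): g++ query_gl_driver.cpp -lglfw -lGL -o query_gl_driver
#include <GLFW/glfw3.h>
#include <cstdio>

int main() {
    if (!glfwInit()) {
        std::fprintf(stderr, "GLFW init failed\n");
        return 1;
    }
    // Create a small hidden window purely to obtain an OpenGL context.
    glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);
    GLFWwindow* win = glfwCreateWindow(64, 64, "gl-info", nullptr, nullptr);
    if (!win) {
        glfwTerminate();
        return 1;
    }
    glfwMakeContextCurrent(win);

    // These strings come from the OpenGL component of the driver itself,
    // not from the marketing name of the package it shipped in.
    std::printf("GL_VENDOR:   %s\n", (const char*)glGetString(GL_VENDOR));
    std::printf("GL_RENDERER: %s\n", (const char*)glGetString(GL_RENDERER));
    std::printf("GL_VERSION:  %s\n", (const char*)glGetString(GL_VERSION));

    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}

The GL_VERSION string there is what the OpenGL part of the driver actually exposes to a game like RAGE, and that can lag well behind what the Direct3D part of the very same Catalyst release supports.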
Go tell your friends that you were wrong and that it's not ID's fault. If you didn't live in a cave you would have known what was happening... start reading the news and getting involved in forums, well before a game's release, at release, and for some time after it. If you don't have time for that, it's your problem and your loss.
The biggest stupidity is that the community that has followed RAGE for so long answers you, and you hit back with the same stupid attitude. There have been many news articles explaining the whole deal with RAGE. The only ones crying are the kid noobs who think PCs are supposed to work like consoles; pretty much, it's inexperienced people who keep spreading misinformation. I feel sorry for ID.
And to make matters worse, the RAGE-specific driver was supposed to be released on launch day; unfortunately, the package that went out contained a much older version of the OpenGL driver.
All over the web people keep calling the title buggy, but these people don't see the whole picture. It's not the game that's buggy, it never was; it's all about the drivers. All those years of GPU wars in the PC golden age, with the Quakes and Half-Lifes and Counter-Strikes, are all INVALID: it was never about the game's performance on the GPU, it was always about DRIVER performance on the GAME. Nvidia being ahead of ATI was largely a big made-up marketing thing: false reviews, developer bribery (the "The Way It's Meant to Be Played" stuff). ATI may have had weaker drivers sometimes, but most of the time the developer bribery was at play. Nvidia's practices were deceptive and unclean: through their Nvidia-branded games they provided better support for those titles in their driver, not to mention that Nvidia-chipset motherboards could also purposely hobble Radeon GPUs installed in people's systems.
The GPU war was never genuine. The FPS war between the vendors was not about the GPUs as hardware themselves; it was about drivers optimized for specific games. You always thought it was a GPU comparison, but it was never valid: so many factors were in play that it was simply a bunch of thrown-together numbers, completely and utterly invalid, not professional at all. In the end it's the result that matters, but nobody understood how that result came about, and people still speculated and discussed as if they knew what they were talking about.
If you want to compare GPUs you need identical drivers on both, which is nearly impossible across different architectures. The whole GPU war was there to create drama, a big buzz in the market, something interesting for consumers to fight over; in other words, it was
ENTERTAINMENT FOR THE MASSES. (The really devoted Nvidia fan kids were the biggest victims: they spent so much more cash on a GPU just because it got 10 FPS more in a few games. Also, Nvidia's drivers seemed to favor performance over stability, picture quality and shadows.)
It's not how you think it is, that a company simply releases a driver for its CARD. Yes it does, but that driver also has to support all the GAMES specifically, and without that the driver for a card is useless. That's the reality of the technology, and ID Software wants to change the PC GPU API overhead situation: they want to stop depending on others and do their own optimizations and driver-level work, just like you do on consoles. That's why consoles can run this stuff at all; if consoles worked like PCs, they would be scrap metal.
Just imagine how fast games would be on PCs if the GPU/chipset were open for developers to optimize their games against specifically, with full access to the hardware. This is the stuff nobody notices... Consoles don't rely on monthly ATI driver releases; it's the developers themselves who build the game around the hardware, and they don't have to rely on some middleman to "fix" a few things every month.
RAGE could have been many times (maybe 10x) faster on PC if they'd had full access to the hardware, comparing the same graphics in this case. But we don't need 1000 FPS, so all of that headroom could have gone into much richer graphics and art instead.
Just look at this thread and how some people have a terribly wrong view of the situation because they don't know the above: they speculate that the "focus" was on consoles and that "PC had weaker QA testing"... and others have to come in and explain that that's not the case, it's about the OpenGL driver itself.
It's just harder to develop games for a Windows PC. Ask yourself why Linux works better: it's all about FULL ACCESS to the hardware and NO RELIANCE on a closed API. If you want to fix something, you go in there and do it; you don't have to wait for the GPU vendor to release a new driver so you can fix or use a new feature that usually isn't even working 100%.
With this in mind... as you can see, the Xbox 360 uses a GPU similar to the ATI Radeon X1900/X1950: the Xenos (http://en.wikipedia.org/wiki/Xenos_(graphics_chip)), which is custom-made for the X360 and has a total of 337 million transistors.
The Wii U will use a custom-made GPU based on HD 4800 tech (RV770); custom means it will be even better, a tech update. That chip has a total of 956 or 959 million transistors, depending on which base chip they choose, since the RV790 update came in 2009. It's not clear, but it's quite likely they'll go with the newer, updated RV790 as the base chip, which is just a small update over the RV770. The articles were talking about "R770", but they probably meant R700, which is the proper series marker.
Remember the articles about "Wii U uses last-gen ATI GPU"? All of those articles talked it down, as if "last-gen" meant it couldn't be that good. All of them failed to weigh the effect of full hardware access on an architecture three generations newer; they just tossed the numbers around, guessing here and there.
Yes, the Wii U GPU is not based on current PC tech, but that's not the point. The point is the HUGE LEAP it represents over the Xenos chip technology; that's the MAIN point everyone missed.
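Just as a rough back-of-the-envelope comparison, using only the transistor counts quoted above and ignoring clocks, memory and architectural improvements (which all favor the newer chip even further):

956 million / 337 million ≈ 2.8x the transistor budget of Xenos, on a chip that is also generations newer (roughly R500-era versus R700-era).

So even before counting full hardware access and the newer feature set, the raw silicon budget alone is nearly triple.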
I really don't like this western anti-Nintendo conspiracy on the news front; you can feel it in many areas... stuff like the VGA awards and all. Nintendo titles offer great gameplay but get put down just because they aren't HD.
Stupid sites such as Engadget and other mainstream console news sites totally didn't get it and claimed that the Wii U would be "as fast as the PS3 and X360"... others claimed 50% faster; where did they get that 50%? It's not about "faster" if you talk like that (without actually comparing anything; they really don't know what they're talking about), it's about the graphics quality as well if you take it into account. As Carmack said, everything is about balance: sacrifice here, benefit there, the tradeoffs. I can make a game 1000 times faster if I use simple wireframe only. Many, many "gaming" journalists have a poor technical background.
The Wii U GPU could be about 4 times faster with the exact same graphics set (game-specific), for example running the exact same X360 game on the Wii U (with as close a port as possible)... Again, it won't need to be that fast; 60 FPS is enough. Many developers will OBVIOUSLY stick to a 60 FPS cap, but they will be able to use MUCH MORE graphics eye candy.
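To put a rough number on that tradeoff (using the roughly-4x figure above purely as an illustration): at a 60 FPS cap the frame budget is 1000 ms / 60 ≈ 16.7 ms per frame. If a scene that filled the whole 16.7 ms on Xenos-class hardware takes only about a quarter of that (~4.2 ms) on a GPU four times as fast, the remaining ~12.5 ms of every frame can be spent on extra effects, richer art and better lighting instead of pushing the FPS counter past the cap.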
Ubisoft on the Wii U:
He talks about the multi-core architecture and explains how the memory capacity brings performance enhancements to an already stonking HD game engine. He talks about graphical shaders, increased cache sizes, pre-calculated data and natural extensions of a dev-friendly API.
Look at where it mentions the API: we shouldn't take only the architecture into account, the raw power increase, but also the technological leap in all the other areas, such as the natural extensions of the new API, which is obviously a lot better than the one on the X360. All of this is a factor in how much better the Wii U will be. Even if MS and Sony release even beefier consoles than the Wii U later in 2013-14, Nintendo will still be okay in the HD sector, with no more excuses, and both of the others would have to sell at a higher price and maybe not be profitable at launch.