AMD Implementing Hardware Acceleration for Megatextures?

Post » Sun May 13, 2012 1:09 am

In a few weeks, AMD will launch the 7970, which is the first 28nm card from its Southern Islands family of GPUs.

From what I gather, the new architecture incorporates a form of hardware acceleration specifically for rendering "partially resident textures," such as the megatextures employed in RAGE.

http://www.anandtech.com/show/5261/amd-radeon-hd-7970-review/6

Apparently the card will be able to handle texture sizes up to 32TB.

http://images.anandtech.com/galleries/1601/PRT0_575px.jpg
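
To put that 32TB figure in perspective, and assuming the 64KB page granularity AMD has talked about for PRT actually applies here (I'm not certain it does): 32 TB / 64 KB per page ≈ 2^45 / 2^16 = 2^29, or roughly 537 million addressable pages, of which only the ones actually visible on screen would ever need to sit in video memory.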

Is this likely to result in a drastically different experience for games that make use of partially resident textures, or simply an improvement on frame rates?
Annika Marziniak
 
Posts: 3416
Joined: Wed Apr 18, 2007 6:22 am

Post » Sun May 13, 2012 12:05 am

Now that's interesting. Looks like ATI/AMD is going for a hardware solution to their driver problem with Megatextures. ;-)
What this will lead to in actual game performance remains to be seen, since I don't really know how much CPU load Carmack's Megatexture system generates.

Nevertheless, if ATI/AMD do it in hardware and not just through driver support, it's more likely to actually work.
Setal Vara
 
Posts: 3390
Joined: Thu Nov 16, 2006 1:24 pm

Post » Sun May 13, 2012 2:10 am

In order to invest in this level of hardware support for the feature, it seems they are gambling on more games being developed with the megatexture tech, no?
Lovingly
 
Posts: 3414
Joined: Fri Sep 15, 2006 6:36 am

Post » Sat May 12, 2012 8:52 pm

It seems like a safe enough gamble, though. From my understanding of its description, it should be better (if properly done, and that's a big "if") than the current model of using separate textures for everything. Then again, tessellation sounded really promising at first, but almost three video card generations later developers have yet to fully exploit its potential or implement it properly. Not sure how it was done in Battlefield 3, so that comment excludes it :tongue:

Even if it fails or doesn't catch on immediately, there's nothing forcing AMD to include the hardware implementation in future architectures. So it's not really a long-term commitment, just a one-year trial that could pay off in the short term.

Personally, I think they should wait another generation before doing this. The technology is still very new, and even if it catches on, the earliest I think we'll see large numbers of games using it is early 2013.
Lew.p
 
Posts: 3430
Joined: Thu Jun 07, 2007 5:31 pm

Post » Sat May 12, 2012 6:36 pm

If they are bringing the card to market now, surely they must have been planning this for years. Perhaps they were invited to preview some sort of early demo of id Tech 5?

On a personal note, I'm tempted to pick up one of these cards ahead of DOOM 4's release.
jenny goodwin
 
Posts: 3461
Joined: Wed Sep 13, 2006 4:57 am

Post » Sat May 12, 2012 4:01 pm



On a personal note, I'm tempted to pick up one of these cards ahead of DOOM 4's release.
I plan on buying one of the 7970s as well. My 6970 is really not cutting it in Eyefinity for some games.
Nana Samboy
 
Posts: 3424
Joined: Thu Sep 14, 2006 4:29 pm

Post » Sat May 12, 2012 7:44 pm

If they are bringing the card to market now, surely they must have been planning this for years. Perhaps they were invited to preview some sort of early demo of id Tech 5?
Very possible. I'm definitely curious as to how this tech is going to turn out.
MISS KEEP UR
 
Posts: 3384
Joined: Sat Aug 26, 2006 6:26 am

Post » Sat May 12, 2012 1:56 pm

It only seems to be for OpenGL apps though.

They should really consider making a DirectX version too.
Jarrett Willis
 
Posts: 3409
Joined: Thu Jul 19, 2007 6:01 pm

Post » Sat May 12, 2012 9:19 pm

It only seems to be for OpenGL apps though.


Won't all of the id Tech 5 games be OpenGL apps?
Lory Da Costa
 
Posts: 3463
Joined: Fri Dec 15, 2006 12:30 pm

Post » Sun May 13, 2012 4:48 am

Won't all of the id Tech 5 games be OpenGL apps?

Probably, but then it only benefits OpenGL devs, and there aren't many of those compared to DirectX devs.
maddison
 
Posts: 3498
Joined: Sat Mar 10, 2007 9:22 pm

Post » Sun May 13, 2012 5:34 am

Yeah, this is an interesting step, and it would've been in the pipeline for at least a year or two. They may well have had a few people working on it expecting it to take longer, only for it to turn out more straightforward than expected... and if you have a feature the competition doesn't, why not release it before they do, even if it's not actually going to be used for another card generation or more?

I'm curious to see how long it will be before NVIDIA announces similar support.

Probably, but then it only benefits OpenGL devs, and there aren't many of those compared to DirectX devs.
I'm happy to live with that :shifty:.
Jessica Raven
 
Posts: 3409
Joined: Thu Dec 21, 2006 4:33 am

Post » Sat May 12, 2012 6:44 pm

People keep claiming this is all future tech, but they completely miss the mobile applications and consoles. Rage looks and plays great on something as wimpy as an iPhone, and AMD is making a big push toward Trinity, which will leverage their graphics expertise for mobile applications. Within a few years they hope to combine this new GPU compute architecture with Bulldozer for cheap mobile devices and desktops with integrated graphics that can play games as demanding as Crysis. Eventually every snot-nosed kid will be dragging around a cheap Walmart tablet PC capable of running that kind of game, and India has already produced the first $50 tablet PC. Within ten years the number of PCs worldwide is expected to double, and within four years PC game sales are expected to surpass those of consoles.

Rage was designed much more with consoles in mind than PC enthusiasts, and that's just fine by me. Within three to four years I expect the technology to mature enough to become more competitive with games designed for the PC. Something like a third of the PCs in the Steam survey are still running XP on dual-core processors, and it's just not reasonable to expect the latest and greatest technology to always favor PC gaming enthusiasts right out of the gate anymore.
Clea Jamerson
 
Posts: 3376
Joined: Tue Jun 20, 2006 3:23 pm

Post » Sat May 12, 2012 10:54 pm

Rage looks and plays great on something as wimpy as an iPhone

Wait, do you mean that the iPhone game also uses megatexture technology? Wow, I had no idea.
^_^
 
Posts: 3394
Joined: Thu May 31, 2007 12:01 am

Post » Sat May 12, 2012 5:28 pm

Wait, do you mean that the iPhone game also uses megatexture technology? Wow, I had no idea.

Megatextures automate the entire process. The artist just draws whatever they want without worrying about the textures sizes and the developer just plays the game on any system to see how it looks. If it doesn't look good or runs too slow or whatever they tweak other aspects of the game and never have to worry about the textures. Even PC gamers don't have to adjust the textures and can just focus on getting a steady 60fps.
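
For what it's worth, here's a rough C++ sketch of the kind of bookkeeping that "automating the process" boils down to. This is just my own toy illustration (the page IDs, memory budget and names are made up, it has nothing to do with id's actual code): each frame the engine works out which pages of the giant texture the camera can see, streams in the ones that aren't resident yet, and evicts the least recently used ones when the budget is full.

    // Toy sketch of the page-residency bookkeeping behind a virtual texture.
    // Illustration only: page IDs, the budget and the "upload" are stand-ins.
    #include <cstdint>
    #include <cstdio>
    #include <unordered_map>
    #include <vector>

    struct PageId { std::uint32_t mip, x, y; };

    // Pack a page coordinate into one integer so it can key a hash map.
    static std::uint64_t key(const PageId& p) {
        return (std::uint64_t(p.mip) << 48) | (std::uint64_t(p.x) << 24) | p.y;
    }

    class PageCache {
    public:
        explicit PageCache(std::size_t budgetPages) : budget(budgetPages) {}

        // Called once per frame with the pages the renderer decided it needs.
        void update(const std::vector<PageId>& visible, std::uint64_t frame) {
            for (const PageId& p : visible) {
                auto it = resident.find(key(p));
                if (it == resident.end()) {
                    evictIfFull();
                    resident[key(p)] = frame;   // pretend we upload the tile here
                    std::printf("stream in mip %u page (%u,%u)\n", p.mip, p.x, p.y);
                } else {
                    it->second = frame;         // still in use, refresh its age
                }
            }
        }

    private:
        void evictIfFull() {
            if (resident.size() < budget) return;
            auto oldest = resident.begin();     // simple LRU: drop the stalest page
            for (auto it = resident.begin(); it != resident.end(); ++it)
                if (it->second < oldest->second) oldest = it;
            resident.erase(oldest);
        }

        std::size_t budget;
        std::unordered_map<std::uint64_t, std::uint64_t> resident; // key -> last frame used
    };

    int main() {
        PageCache cache(256);  // pretend video memory holds 256 tiles
        cache.update({{0, 10, 12}, {0, 11, 12}, {1, 5, 6}}, 1);
        cache.update({{0, 11, 12}, {1, 5, 6}, {1, 5, 7}}, 2);
    }

The point is that the artist and the player never see any of this: the cache just fills itself with whatever the camera needs, at whatever quality the hardware can keep up with.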
Shannon Marie Jones
 
Posts: 3391
Joined: Sun Nov 12, 2006 3:19 pm

Post » Sat May 12, 2012 9:33 pm

Is this likely to result in a drastically different experience for games that make use of partially resident textures, or simply an improvement on frame rates?

The alternative to GPU hardware acceleration for partially resident textures is doing the same paging work in software, which is what the engine already does. The idea is that the computer automatically lowers the texture quality if the frame rate drops, so the faster it can process the pages, the better they look. Eventually they hope to do the same for geometry, allowing almost any computer or phone to play the same game, but the more powerful the machine, the better looking the game can be.
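
Roughly what that feedback loop looks like, as a little C++ sketch of my own (the target, thresholds and step sizes are invented, this isn't how any particular engine actually tunes it): measure how long the last frame took and nudge a global texture detail bias up or down, so a faster card naturally settles at sharper pages.

    // Toy frame-time-driven texture quality scaling. Numbers are made up.
    #include <algorithm>
    #include <cstdio>

    struct QualityController {
        float mipBias = 0.0f;                 // 0 = full detail, higher = blurrier pages

        void onFrame(float frameMs) {
            const float targetMs = 16.6f;     // aiming for a steady 60 fps
            if (frameMs > targetMs * 1.05f)
                mipBias += 0.25f;             // dropping frames: back off quickly
            else if (frameMs < targetMs * 0.90f)
                mipBias -= 0.05f;             // headroom: sharpen slowly
            mipBias = std::min(std::max(mipBias, 0.0f), 4.0f);
        }
    };

    int main() {
        QualityController q;
        for (float ms : {15.0f, 22.0f, 25.0f, 18.0f, 14.0f}) {
            q.onFrame(ms);
            std::printf("frame %.1f ms -> mip bias %.2f\n", ms, q.mipBias);
        }
    }

So hardware PRT shouldn't change the experience so much as move where that controller settles: the same scene, just with more of it at full detail.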
Trevor Bostwick
 
Posts: 3393
Joined: Tue Sep 25, 2007 10:51 am

Post » Sun May 13, 2012 1:41 am

It only seems to be for OpenGL apps though.

They should really consider making a DirectX version too.

That's one of the advantages of using OpenGL. Want to add something to OpenGL version x.y? Just ship it as a vendor extension.
Want to add something to DX11? Wait and hope for Microsoft to add the feature in DX12.
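
To make that concrete: an OpenGL app can just ask the driver at runtime whether the vendor extension is there and pick the fast path if so. Quick C++ sketch (GLUT is only used here to get a GL context; GL_AMD_sparse_texture is the extension name AMD announced for PRT, the rest is a made-up skeleton, not production code):

    // Check at runtime whether the driver exposes AMD's sparse texture extension.
    #include <GL/glut.h>
    #include <cstdio>
    #include <cstring>

    int main(int argc, char** argv) {
        glutInit(&argc, argv);
        glutCreateWindow("ext check");   // creates and binds a GL context

        const char* exts = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
        const bool hasPRT = exts && std::strstr(exts, "GL_AMD_sparse_texture") != nullptr;

        std::printf("Renderer: %s\n",
                    reinterpret_cast<const char*>(glGetString(GL_RENDERER)));
        std::printf("Hardware PRT path available: %s\n", hasPRT ? "yes" : "no");
        // A real engine would load the extension's entry points here and otherwise
        // fall back to its software virtual texturing path.
        return 0;
    }

No waiting on a new DirectX revision, no new OS requirement: if the driver advertises it, you use it.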
carla
 
Posts: 3345
Joined: Wed Aug 23, 2006 8:36 am

Post » Sun May 13, 2012 5:15 am

In order to invest in this level of hardware support for the feature, it seems they are gambling on more games being developed with the megatexture tech, no?

That depends on how much space this technology takes up on the chip die and how much money is needed to develop the layout. Implementing Megatexture in hardware is more about changing the texture caching strategy than anything else, which means it won't completely change the chip layout or heavily inflate its size and production costs. I'd guess this is more of a marketing stunt by ATI to make up for their poor reputation with OpenGL drivers. Megatexture seems to run fine and fast as a software implementation, so why bother with hardware support?
Adam Kriner
 
Posts: 3448
Joined: Mon Aug 06, 2007 2:30 am

Post » Sun May 13, 2012 1:06 am

Megatexture seems to run fine and fast as a software implementation, so why bother with hardware support?

Would the only benefit to the hardware support be increased frame rates?
Claudia Cook
 
Posts: 3450
Joined: Mon Oct 30, 2006 10:22 am

Post » Sat May 12, 2012 3:06 pm

That depends on how much space this technology takes up on the chip die and how much money is needed to develop the layout. Implementing Megatexture in hardware is more about changing the texture caching strategy than anything else, which means it won't completely change the chip layout or heavily inflate its size and production costs. I'd guess this is more of a marketing stunt by ATI to make up for their poor reputation with OpenGL drivers. Megatexture seems to run fine and fast as a software implementation, so why bother with hardware support?

I'd say this is a really bad guess on your part.

The whole purpose of discrete graphics cards is to provide hardware acceleration; that's been their entire evolution. There's nothing they can do that a CPU can't also do, but they can do specific things faster and cheaper than a desktop full of CPUs. Their simplified processors are really just a way to build cheaper, faster, more specialized versions of CPU cores.

Partially resident textures are the cutting edge of modern computer graphics for movies as well as games. The same technology can be used to make better-looking streaming video as well as video games, and hardware acceleration is simply the cheapest and fastest way to go. It's all about putting the best-looking picture possible on the screen while maintaining a constant frame rate, and doing so as cheaply as possible. As usual, AMD is testing the technology on its discrete graphics cards first, but I have no doubt it will eventually find its way into their integrated graphics for portable devices.
Jack
 
Posts: 3483
Joined: Sat Oct 20, 2007 8:08 am

Post » Sat May 12, 2012 11:57 pm

Would the only benefit to the hardware support be increased frame rates?

From what I know about the concept of Megatextures: yes. But you never know what might evolve out of it.
Flash
 
Posts: 3541
Joined: Fri Oct 13, 2006 3:24 pm

Post » Sat May 12, 2012 2:06 pm

I'd say this is a really bad guess on your part.

The whole purpose of discrete graphics cards is to provide hardware acceleration; that's been their entire evolution. There's nothing they can do that a CPU can't also do, but they can do specific things faster and cheaper than a desktop full of CPUs. Their simplified processors are really just a way to build cheaper, faster, more specialized versions of CPU cores.

Partially resident textures are the cutting edge of modern computer graphics for movies as well as games. The same technology can be used to make better-looking streaming video as well as video games, and hardware acceleration is simply the cheapest and fastest way to go. It's all about putting the best-looking picture possible on the screen while maintaining a constant frame rate, and doing so as cheaply as possible. As usual, AMD is testing the technology on its discrete graphics cards first, but I have no doubt it will eventually find its way into their integrated graphics for portable devices.

First of all: AMD has admitted that the Megatexture concept has problems with their existing chip design.
Otherwise it wouldn't have taken so long to provide a working driver, unless their driver programmers are complete morons.
It's a logical step to tune their chip layout to work with this, and if you're doing something new, why not do it in a big way?
Might be a good argument to sell your product. ;-)
About the partially resident (tiled) textures:
We had a similar concept years ago with the PowerVR chips, which split the whole rendering process into tiles. It never worked out on the PC because nobody really understood how to use it or was willing to implement it there. Today PowerVR chips are found only in smartphones and handhelds.
As for your statement that GPUs can't do anything a CPU can't do better:
I know that lots of scientists use graphics chips with specialized drivers for number crunching. CPUs are more flexible in handling tasks, but despite their higher clock rates they lose tremendously in terms of speed. As you said, speed is a big factor.
Mélida Brunet
 
Posts: 3440
Joined: Thu Mar 29, 2007 2:45 am

Post » Sat May 12, 2012 2:48 pm

As for your statement that GPUs can't do anything a CPU can't do better:
I know that lots of scientists use graphics chips with specialized drivers for number crunching. CPUs are more flexible in handling tasks, but despite their higher clock rates they lose tremendously in terms of speed. As you said, speed is a big factor.
Graphics cards also have the advantage of a lot more silicon. A high-end GPU die is typically a good deal larger than a consumer CPU die, and nearly all of it is devoted to parallel execution units (not saying every card maxes that out, but the engineers do have a lot more room to play with).

So, yeah, there is a reason general-purpose computing on graphics processing units exists ;).
Nany Smith
 
Posts: 3419
Joined: Sat Mar 17, 2007 5:36 pm

Post » Sun May 13, 2012 1:49 am

First of all: AMD has admitted that the Megatexture concept has problems with their existing chip design.
Otherwise it wouldn't have taken so long to provide a working driver, unless their driver programmers are complete morons.
It's a logical step to tune their chip layout to work with this, and if you're doing something new, why not do it in a big way?
Might be a good argument to sell your product. ;-)
About the partially resident (tiled) textures:
We had a similar concept years ago with the PowerVR chips, which split the whole rendering process into tiles. It never worked out on the PC because nobody really understood how to use it or was willing to implement it there. Today PowerVR chips are found only in smartphones and handhelds.
As for your statement that GPUs can't do anything a CPU can't do better:
I know that lots of scientists use graphics chips with specialized drivers for number crunching. CPUs are more flexible in handling tasks, but despite their higher clock rates they lose tremendously in terms of speed. As you said, speed is a big factor.

Speed is only an issue because cost is an issue. It's all about getting the maximum bang for your buck: unless you have a serious need for more accuracy and flexibility, you don't throw a rack of CPUs at the problem, because the cost is simply prohibitive.

As for AMD's chip design, the idea that the marketing weasels somehow decide what goes on a chip is laughable. This is about the evolution of their existing architecture and long-term chip designs. If partially resident textures weren't compatible with their current designs, then they added hardware acceleration because they believe it's that important to future designs. It is the future of video games if they are ever to move beyond the limits of rasterization, something every major developer on the planet is already researching. Again, it's about providing the best-looking picture as fast and as cheaply as possible. Megatextures are merely the first step in that direction, and until somebody comes up with something better I expect to see every major GPU manufacturer come out with their own hardware acceleration for partially resident textures in the near future.
vicki kitterman
 
Posts: 3494
Joined: Mon Aug 07, 2006 11:58 am

Post » Sat May 12, 2012 2:55 pm

Megatexture seems to run fine and fast as a software implementation, so why bother with hardware support?

Because doing something in hardware is always way faster. You could even do tessellation in software, but it would kill the performance. Megatexture is able to run in software, but it could offer way more performance done in hardware. More performance means better image quality (on the same hardware), since you can use better/more geometry, lighting and textures.
K J S
 
Posts: 3326
Joined: Thu Apr 05, 2007 11:50 am

Post » Sun May 13, 2012 5:04 am

Because doing something in hardware is always way faster. You could even do tessellation in software, but it would kill the performance. Megatexture is able to run in software, but it could offer way more performance done in hardware. More performance means better image quality (on the same hardware), since you can use better/more geometry, lighting and textures.

It also just makes sense from a logistical point of view. Textures and geometry make up the overwhelming majority of the data that has to be processed continuously, and they can be automated in this fashion. If video games are ever going to keep pushing toward higher-definition assets, then at some point that texture handling has to be accelerated in hardware, and partially resident textures happen to be the most flexible system known for producing anything from the lowest to the highest quality imagery. Sooner or later some system like it must be adopted, and for AMD to include it in their newest-generation hardware is an indication that they have likely researched the subject thoroughly and found this to be the most promising path forward.
Emerald Dreams
 
Posts: 3376
Joined: Sun Jan 07, 2007 2:52 pm
