Crysis 2 being re-designed for GTX580?

Post » Fri Dec 31, 2010 11:28 am

source:

http://www.kitguru.net/components/graphic-cards/faith/crysis-2-being-re-designed-for-gtx580-expect-delays/
Neko Jenny
 
Posts: 3409
Joined: Thu Jun 22, 2006 4:29 am

Post » Fri Dec 31, 2010 12:03 pm

The plan:

Enter Crysis 2 and a $2m spend from nVidia’s marketing team.

The Result:

It’s likely that Crysis 2 will become a major benchmark for 2011.

Besides there are lots of reasons why a company would spend $2m sponsoring a game so close to launch. Whatever the reason, it’s hard to ignore the possibility that it is being re-styled for the GTX580 and GTX590 cards with an absolute ton of additional tessellation IQ. For those that can afford a GTX580 going into Xmas, we believe that your Crysis 2 experience when it launches (several months after March) will be as good as it gets. If they delay Crysis 2 one more time it will have taken longer to make the sequel than the original game! (Crysis took two years to develop)

(More delays haha) :) Sorry, guys

From: kitguru.com
Erich Lendermon
 
Posts: 3322
Joined: Sat Nov 03, 2007 4:20 pm

Post » Fri Dec 31, 2010 5:50 am

The plan:

Enter Crysis 2 and a $2m spend from nVidia’s marketing team.

The Result:

It’s likely that Crysis 2 will become a major benchmark for 2011.

Besides there are lots of reasons why a company would spend $2m sponsoring a game so close to launch. Whatever the reason, it’s hard to ignore the possibility that it is being re-styled for the GTX580 and GTX590 cards with an absolute ton of additional tessellation IQ. For those that can afford a GTX580 going into Xmas, we believe that your Crysis 2 experience when it launches (several months after March) will be as good as it gets. If they delay Crysis 2 one more time it will have taken longer to make the sequel than the original game! (Crysis took two years to develop)

(More delays haha) :) Sorry, guys

From: kitguru.com

:( more spending, less time. :)
matt white
 
Posts: 3444
Joined: Fri Jul 27, 2007 2:43 pm

Post » Fri Dec 31, 2010 8:57 am

/merged threads.

Certainly an interesting theory :)
Khamaji Taylor
 
Posts: 3437
Joined: Sun Jul 29, 2007 6:15 am

Post » Fri Dec 31, 2010 12:34 pm

Thanks Shin? I can call you that, right?
Kelvin Diaz
 
Posts: 3214
Joined: Mon May 14, 2007 5:16 pm

Post » Thu Dec 30, 2010 11:29 pm

But if this is true, there will be no worries about how the graphics will be for the PC.
Irmacuba
 
Posts: 3531
Joined: Sat Mar 31, 2007 2:54 am

Post » Fri Dec 31, 2010 10:50 am

But if this is true, there will be no worries about how the graphics will be for the PC.

That is the great "if", fellow gamer.
Emma Pennington
 
Posts: 3346
Joined: Tue Oct 17, 2006 8:41 am

Post » Fri Dec 31, 2010 12:47 pm

More info on this partnership will be released pretty soon, so keep an eye out for that. As for Crysis 2 release dates, this has no impact on the already announced March launch.
jennie xhx
 
Posts: 3429
Joined: Wed Jun 21, 2006 10:28 am

Post » Fri Dec 31, 2010 1:09 pm

More info on this partnership will be released pretty soon, so keep an eye out for that. As for Crysis 2 release dates, this has no impact on the already announced March launch.

That's good to hear. Thanks, Cry-Tom!
MatthewJontully
 
Posts: 3517
Joined: Thu Mar 08, 2007 9:33 am

Post » Fri Dec 31, 2010 11:23 am

So that means there is a partnership and it will be 'announced', so it's got to be something special, hopefully.
*saves money for a 580*

And nvidia is paying off lots of devs to make sure their cards are very well supported (sometimes better than AMD cards).
Ymani Hood
 
Posts: 3514
Joined: Fri Oct 26, 2007 3:22 am

Post » Thu Dec 30, 2010 10:39 pm

I hope Crysis 2 will be the example of a DX11 game, because I already have GTX 470s in SLI and they aren't fully used even with Metro, and that game doesn't use the DX11 features very well.
Francesca
 
Posts: 3485
Joined: Thu Jun 22, 2006 5:26 pm

Post » Fri Dec 31, 2010 7:46 am

Thanks Shin? I can call you that, right?

I doubt he minds; he says it himself. :)
Jennie Skeletons
 
Posts: 3452
Joined: Wed Jun 21, 2006 8:21 am

Post » Fri Dec 31, 2010 1:12 pm

As long as they don't alter the release date.
Darrell Fawcett
 
Posts: 3336
Joined: Tue May 22, 2007 12:16 am

Post » Thu Dec 30, 2010 9:46 pm

Crysis 2 could be DELAYED AGAIN. http://www.made2game.com/2010/11/06/could-crysis-2-release-date-be-pushed-back-by-nvidia/
/merged.

Please look for existing threads on the same topic before posting :)

Dunno where they're getting this idea that the date will be pushed back. The GTX580 is set to be released at the end of this month (November), and Crysis 2 is not due until the end of March next year, about five months away.
Jessica Phoenix
 
Posts: 3420
Joined: Sat Jun 24, 2006 8:49 am

Post » Fri Dec 31, 2010 2:37 am

Crysis 2 could be DELAYED AGAIN. http://www.made2game.com/2010/11/06/could-crysis-2-release-date-be-pushed-back-by-nvidia/
Sarah Knight
 
Posts: 3416
Joined: Mon Jun 19, 2006 5:02 am

Post » Fri Dec 31, 2010 10:33 am

Full dx11 support! Woo!

Nvidia getting involved! Boo!

On the bright side, they don't have as much influence over big companies like EA, so we won't be seeing a repeat of the Batman: Arkham Asylum fiasco.

Also, the gtx580 will be a worse card than the hd6970, since the gtx580 isn't a new card. It's the gtx480 as it was originally designed to be (512 shaders and the clocks they actually wanted), so it'll only run 15% faster than a gtx480 at best, whereas the hd6970 is a new design. This is a marketing ploy by nvidia to get some sales out of a dodo of a generation. They should cut their losses and get to work on Kepler (the gtx6 series); AMD have won this generation again.

In the end, the only way the gtx580 will run Crysis 2 better is if they specifically make one or two additions to the game which they know the AMD cards aren't as good at, and then go over the top with it. It won't be anything noticeable, either: something stupid like taking 32 tessellation samples instead of the 4-16 it should be. The visual effect is zero once the samples go below pixel level, but the performance hit is large.
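
That sub-pixel point is easy to sanity-check with some back-of-the-envelope arithmetic. The sketch below is purely illustrative Python (nothing to do with CryENGINE or the actual DirectX 11 tessellator, and the patch size and factors are made-up numbers): once the tessellation factor pushes triangle edges below one pixel, the triangle count keeps climbing but there is nothing left on screen to show for it.

```python
# Illustrative arithmetic only; not engine code. Assumes a square patch that
# covers edge_px pixels on screen and a uniform tessellation factor.

def tessellation_cost(edge_px: float, factor: int) -> dict:
    """Rough triangle size and count for one patch at a given tessellation factor."""
    tri_edge_px = edge_px / factor       # screen-space edge length of each sub-triangle
    triangles = 2 * factor * factor      # a factor-N quad patch splits into ~2*N^2 triangles
    return {
        "triangle_edge_px": round(tri_edge_px, 2),
        "triangles": triangles,
        "sub_pixel": tri_edge_px < 1.0,  # past this point, extra detail is invisible
    }

# A patch 16 pixels across: factor 16 already gives 1-pixel triangles.
for factor in (4, 16, 32):
    print(factor, tessellation_cost(edge_px=16, factor=factor))
# factor 16 -> 1.0 px triangles, 512 tris; factor 32 -> 0.5 px triangles, 2048 tris:
# four times the geometry work for detail smaller than a pixel.
```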

Anyway, nvidia did try it with Crysis 1, and AMD ended up taking the lead in Crysis with drivers!
Alister Scott
 
Posts: 3441
Joined: Sun Jul 29, 2007 2:56 am

Post » Fri Dec 31, 2010 12:44 pm

Full dx11 support! Woo!

Nvidia getting involved! Boo!

On the bright side, they don't have as much influence over big companies like EA, so we won't be seeing a repeat of the Batman: Arkham Asylum fiasco.

Also, the gtx580 will be a worse card than the hd6970, since the gtx580 isn't a new card. It's the gtx480 as it was originally designed to be (512 shaders and the clocks they actually wanted), so it'll only run 15% faster than a gtx480 at best, whereas the hd6970 is a new design. This is a marketing ploy by nvidia to get some sales out of a dodo of a generation. They should cut their losses and get to work on Kepler (the gtx6 series); AMD have won this generation again.

In the end, the only way the gtx580 will run Crysis 2 better is if they specifically make one or two additions to the game which they know the AMD cards aren't as good at, and then go over the top with it. It won't be anything noticeable, either: something stupid like taking 32 tessellation samples instead of the 4-16 it should be. The visual effect is zero once the samples go below pixel level, but the performance hit is large.

Anyway, nvidia did try it with Crysis 1, and AMD ended up taking the lead in Crysis with drivers!
That's just very discomforting.
Zualett
 
Posts: 3567
Joined: Mon Aug 20, 2007 6:36 pm

Post » Fri Dec 31, 2010 8:38 am

The 6970 isn't a new architecture. It's still the same architecture with more processing cores.

We won't know anything about the performance of either card though until we get official in-game benchmarks.

The hd6970 is actually quite modified from the old architecture and has fewer shaders. It's not Southern Islands (which was pushed back to the hd7xxx series after 32nm was dropped), but a halfway house between Evergreen and Southern Islands. It's expected to compete at least well with the gtx580. Currently we don't know much about this card's performance at all; the scarce few leaks seem to indicate around 20% faster than a gtx480. Personally I'd hope for more, considering the size of the die compared to the hd5870 and the larger shader count than the hd6870, but ah well.

Also, the gtx580 is, at most, 15-20% faster than the gtx480, fact. The architecture is identical bar the stripping down of the double-precision floating-point parts of the chip, so the only difference is the extra few CUDA cores and a roughly 10% core clock boost. Therefore we can logically expect a 20% performance boost at best from the card. It's easily the more boring of the two cards, because we know what it'll do before it even releases. It'll also cost £399, ouch.
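
For what it's worth, the arithmetic behind that claim is easy to write down. A minimal sketch in Python, using the published reference specs (GTX480: 480 CUDA cores at a 700 MHz core clock, 924 MHz GDDR5; GTX580: 512 cores at 772 MHz, 1002 MHz GDDR5) and assuming perfect scaling, which real games never reach:

```python
# Upper-bound scaling estimate from public reference specs; actual in-game
# gains sit below this because nothing scales perfectly.

gtx480 = {"cores": 480, "core_mhz": 700, "mem_mhz_eff": 3696}   # 924 MHz GDDR5 (x4)
gtx580 = {"cores": 512, "core_mhz": 772, "mem_mhz_eff": 4008}   # 1002 MHz GDDR5 (x4)

shader_scaling = (gtx580["cores"] * gtx580["core_mhz"]) / (gtx480["cores"] * gtx480["core_mhz"])
bandwidth_scaling = gtx580["mem_mhz_eff"] / gtx480["mem_mhz_eff"]

print(f"shader throughput: +{(shader_scaling - 1) * 100:.0f}%")    # ~ +18%
print(f"memory bandwidth:  +{(bandwidth_scaling - 1) * 100:.0f}%") # ~ +8%
# The best case lands between those two numbers, which is exactly the 10-20%
# range being argued over in this thread.
```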

EDIT: it's around 10-15% faster; it performs pretty much the same as the hd5970.

If both of these predictions turn out to be true (and it's likely), then we'll see some fantastic competition! Also, Nvidia have made some progress with their tessellation work too (in software); there's a great tessellation demo around at the moment with an alien :D

Finally, the only reason nvidia involvement would be bad is if they purposely try to nerf AMD performance, and at this point they have their backs pressed firmly against a wall in this market (an 80:20 AMD:nvidia ratio for dx11 cards), so I could see them trying it, as they've done it before several times. That's why I'm a little skeptical. I'd be far more comfortable if AMD had paid the $2m, because they've never blatantly nerfed nvidia cards in games they've supported.
Josh Trembly
 
Posts: 3381
Joined: Fri Nov 02, 2007 9:25 am

Post » Fri Dec 31, 2010 6:03 am

Which part?

I'm more annoyed at the gtx580 being such a (relatively) weak card, but this generation may end up an off generation anyway with both companies waiting for 28nm. I guess it's kind of like the generation after the first dx10 releases, where both companies seemed to have a break before they released the real 2nd gen dx10 cards.

In terms of nvidia interference, I can only see it amounting to 3D Vision being nvidia-only. Other than that, I can't see Crytek being so weak as to nerf a game for 80% of the dx11 player base (that's right, dx11 is AMD's playground) for $2m, especially when that player base will suddenly not buy the game and cost them several million.

The 6970 isn't a new architecture. It's still the same architecture with more processing cores.

We won't know anything about the performance of either card though until we get official in-game benchmarks.
Amie Mccubbing
 
Posts: 3497
Joined: Thu Aug 31, 2006 11:33 pm

Post » Fri Dec 31, 2010 4:30 am

Which part?

I'm more annoyed at the gtx580 being such a (relatively) weak card, but this generation may end up an off generation anyway with both companies waiting for 28nm. I guess it's kind of like the generation after the first dx10 releases, where both companies seemed to have a break before they released the real 2nd gen dx10 cards.

In terms of nvidia interference, I can only see it amounting to 3D Vision being nvidia-only. Other than that, I can't see Crytek being so weak as to nerf a game for 80% of the dx11 player base (that's right, dx11 is AMD's playground) for $2m, especially when that player base will suddenly not buy the game and cost them several million.
Liii BLATES
 
Posts: 3423
Joined: Tue Aug 22, 2006 10:41 am

Post » Fri Dec 31, 2010 11:45 am

I don't mind if Nvidia is involved in the process of developing a game.
Why would it be bad?
I think it's good for us gamers that hardware makers are involved. That way the software and hardware work together perfectly and the game performs better. So why the negativity about this?

Also, I don't think they will redesign the game entirely for a single card. A few modifications to support some features of the graphics cards, perhaps some driver tuning, but nothing more.
Breanna Van Dijk
 
Posts: 3384
Joined: Mon Mar 12, 2007 2:18 pm

Post » Fri Dec 31, 2010 10:39 am

Smart move, nVidia. Really smart move. For gamers with nVidia GPUs, this is good news. It's a win-win, really. Except for ATI fanboys. ;p
Kay O'Hara
 
Posts: 3366
Joined: Sun Jan 14, 2007 8:04 pm

Post » Fri Dec 31, 2010 8:59 am

Smart move, nVidia. Really smart move. For gamers with nVidia GPUs, this is good news. It's a win-win, really. Except for ATI fanboys. ;p

Oh well, it sounds like a good time for a new Nvidia GPU.
MARLON JOHNSON
 
Posts: 3377
Joined: Sun May 20, 2007 7:12 pm

Post » Fri Dec 31, 2010 6:15 am

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_580/1.html

It isn't just a souped-up 480: it has more texture units, more shader cores, a lower (or maybe equal) TDP, less noise, and FEWER TRANSISTORS (around 200 million fewer), and it still performs about 15-20% better. Seems like a good upgrade to me.

The 6970 is rumored to have fewer than 1600 stream processors, probably because of the change from the 5D to the 4D shader architecture; I've read something about 1536. It will be optimized, but I doubt it will beat the 580, because they'd need major improvements to compensate for the loss of cores. But we'll see.
sharon
 
Posts: 3449
Joined: Wed Nov 22, 2006 4:59 am

Post » Fri Dec 31, 2010 10:15 am

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_580/1.html

It isn't just a souped-up 480: it has more texture units, more shader cores, a lower (or maybe equal) TDP, less noise, and FEWER TRANSISTORS (around 200 million fewer), and it still performs about 15-20% better. Seems like a good upgrade to me.

The 6970 is rumored to have fewer than 1600 stream processors, probably because of the change from the 5D to the 4D shader architecture; I've read something about 1536. It will be optimized, but I doubt it will beat the 580, because they'd need major improvements to compensate for the loss of cores. But we'll see.

It IS a souped-up gtx480, *sigh*; you can go and read the white papers if you're disputing the point. Regardless of what you may think, that is the plain truth. The difference in power and noise is not down to a redesign; it's down to the transistor cut and the vapor chamber cooler, respectively.

The performance is 10-15% more, and the reason for the lower transistor count is that they stripped away CUDA-only parts of the chip (the double-precision floating-point hardware, if you're interested) to allow the extra shaders and texture units, which were always present in the gtx480 but disabled for heat and power reasons, to be enabled. Don't you think they'd get more performance if it were totally revamped? The gtx580 is exactly what the gtx480 was originally designed to be, with some CUDA stuff stripped away to allow for it. Note that this doesn't affect gaming or anything like Folding; the double-precision logic was basically just for scientific use, and Tesla covers that anyway.

The hd6970 will likely perform about the same as the gtx580 if you extrapolate from the hd6870 (which sits around hd5850 performance with roughly 300 fewer shaders than the hd5850, and the hd6970 will have about 30-40% more shaders than the hd6870, or thereabouts, depending on how accurate the leaks are). It's a depressing generation overall, but it's probably the first time in a while that the two companies' top GPUs are truly neck and neck (normally AMD is 10% slower but much cheaper).
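
Putting that extrapolation into numbers: a rough sketch in Python using the HD6870's 1120 stream processors and the rumoured 1536 figure quoted earlier in the thread. It deliberately ignores clock speeds, the 5D-to-4D shader change and memory bandwidth, so treat it as the same kind of guess the paragraph above is making, not a prediction.

```python
# Naive shader-count extrapolation; illustrative only. Ignores clocks, the
# VLIW5-to-VLIW4 change and bandwidth, and the 1536 figure is still a rumour.

hd6870_shaders = 1120            # Barts XT stream processors
hd6970_shaders_rumoured = 1536   # figure mentioned earlier in this thread

scale = hd6970_shaders_rumoured / hd6870_shaders
print(f"HD6970 has ~{(scale - 1) * 100:.0f}% more shaders than the HD6870")  # ~ +37%

# Stack that on "HD6870 is about HD5850 level" and add whatever clock bump the
# HD6970 ships with, and you land in the same band this post expects for the
# GTX580, which is the whole "neck and neck" argument.
```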

The thing is, we don't know what clock the hd6970's shaders are at. Considering it's bound to be higher than the hd6870's, if we see 1GHz shaders the hd6970 should beat the gtx580.

Both cards are a complete waste of money, though (£399 for a gtx580 and probably the same for the hd6970); the gtx570 and hd6950 will be far better buys if priced right.
Susan Elizabeth
 
Posts: 3420
Joined: Sat Oct 21, 2006 4:35 pm
