Right now, a smaller chip just means less power draw. The metric you were looking for is performance per watt, and nobody has any numbers on that yet.
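For what it's worth, performance per watt is just throughput divided by power draw. A minimal sketch, with made-up numbers since no real Fermi or 5870 efficiency figures are public yet:

```python
# Hypothetical cards and numbers, purely for illustration --
# none of these figures are real benchmark or TDP data.
cards = {
    "card_a": {"fps": 60.0, "watts": 188.0},
    "card_b": {"fps": 75.0, "watts": 250.0},
}

# Performance per watt: frames per second delivered per watt consumed.
ppw = {name: c["fps"] / c["watts"] for name, c in cards.items()}

for name, value in ppw.items():
    print(f"{name}: {value:.3f} FPS/W")
```

The point is that the faster card isn't automatically the more efficient one; the slower hypothetical card above wins on FPS per watt.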
The real question is, will it max out Crysis? Maybe that's just wishful thinking.
Well, the current 5870 from AMD will max it all out with AA and AF at 1080p, but it's a push to hold 30 FPS all the time; there are dips into the 20s. With the blur effect it's not as noticeable in that game, but I would still prefer to hold my monitor's refresh rate.
Maybe by 2012 we will accomplish that mighty feat.
When it comes to GPUs over the last 10 years, power never seemed to be an issue that companies looked at. This is apparent across the board, but I will agree ATI seems to be focusing on that aspect.
Overall, that is a beast of a card that would fit nicely in my case. Another thing to notice is that it supports Nvidia's current quad-SLI headers, which is more than enough to keep AMD's comments in check about Nvidia backing out of the high-end market... they seem to be very much in it still, and have a black knight in the image above to prove it.
It is too bad that we don't see a direct HDMI plug coming out of the back of that thing.
I should also note the posting on Facebook clearly stated:
GF100 (the first GeForce GPU based on the Fermi architecture) running the Unigine Heaven DX11 benchmark!
Why would they be calling out DX11? Did something change that we don't know about?
They're simply pointing out that they have a functional Fermi GPU core running DirectX 11, likely with the intent to generate excitement that launch may be close.
Actually, mas0n says his overclocked 5850 only dips below 40 FPS in a few scenes, and the rest run over 60.
I haven't collected any hard data, but while most scenes run 50+ FPS, there are still times when I dip into the mid 20s. This is with the Q6600 @ 3.8GHz, the 5850 at ~1GHz, and all settings maxed at 2048x1152. I am also limited to 32-bit because I bought through Steam.
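If anyone does want hard data instead of eyeballing it, the usual approach is to log per-frame times and derive average FPS plus a "worst case" figure from the slowest frames. A rough sketch with made-up sample numbers, not a real capture:

```python
# Made-up per-frame render times in milliseconds -- sample data only,
# not an actual Crysis capture.
frame_times_ms = [16.7, 17.1, 16.9, 33.4, 18.0, 41.2, 16.8, 17.0, 16.6, 35.0]

# Average FPS: invert the mean frame time (1000 ms per second).
avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

# "1% low" FPS: average over the slowest 1% of frames. With this tiny
# sample, that is just the single slowest frame.
worst = sorted(frame_times_ms, reverse=True)
n = max(1, len(worst) // 100)
low_fps = 1000.0 / (sum(worst[:n]) / n)

print(f"average: {avg_fps:.1f} FPS, 1% low: {low_fps:.1f} FPS")
```

The spread between the two numbers is exactly the "mostly fine, but it dips" behavior being described: a healthy average can hide frames that land well under 30 FPS.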
Even CrossFire 5870s cannot keep the game above 30 FPS at max settings at 1080p.
To clarify, I think you mean the low frame rate, where it occasionally dips, because I would say that's accurate.
My single 5870 does an admirable job at playing it, but there are those moments under heavy fire that it still dips below that magic 30 FPS threshold. The motion blur in those scenes disguises it a little, but a discerning gamer knows.
By the time we get a single card that can run it at 60 FPS flat out, nobody will care anymore.
Damn that thing sucks down the juice though. AMD clearly showing that they have a better handle on power management.
Essentially, it's all speculation until we get performance numbers from the Fermi geforce line, including my original comment.
Crysis is still a big mean bitch to tame.
I don't remember our conversation. I'm pretty sure I average in the 40s, maybe that's what I meant then.