fair is fair. where is the Doom vs Doom benchmark. but I kid...
one thing to remember is that the cost of the card is immaterial compared to the cost of the people running the hardware. if the artist is waiting for screen redraws, that's productivity lost. and if you don't chase maxed-out settings and instead focus on snappy response, you've taken a frustrating part of the job (waiting) and made it a non-issue.
the thing about working with creatives is keeping the bitch factor low. if that means buying them new hardware every nine months so they're happy, that's cheaper than hiring new people because the old stars quit. the new MacBook Pro may cost $2500, but that's cheap comparatively, especially given how the now-happy artist cranks after installing the new gear.
you do have to be careful about performance, though. the writer, for example, has different needs. he doesn't need FPS to be creative. in fact we don't even need hardware to keep that guy going. that $1600 buys lots of beer, steaks and cigars, which will keep that guy writing for months without complaining.
(moderator: sorry, I've never ever met a writer that didn't complain…)
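To put rough numbers on the waiting-costs-money point above: the hourly rate, minutes lost, and day counts below are invented assumptions for illustration only; the only figure taken from the thread is the $2500 MacBook Pro price.

```python
# Back-of-the-envelope: what waiting on redraws costs vs. the hardware.
# Every figure is an invented assumption except the $2500 from the comment.

hourly_rate = 75.0           # assumed fully loaded cost of the artist, $/hr
minutes_lost_per_day = 20    # assumed daily time lost to slow redraws
work_days_per_year = 230     # assumed working days per year

wasted_per_day = hourly_rate * minutes_lost_per_day / 60
wasted_per_year = wasted_per_day * work_days_per_year
print(f"Waiting costs about ${wasted_per_year:,.0f}/year under these assumptions")

card_cost = 2500.0           # the new-MacBook-Pro figure from the comment
payback_days = card_cost / wasted_per_day
print(f"The hardware pays for itself in about {payback_days:.0f} work days")
```

Under these made-up numbers the upgrade pays for itself in a few months, which is the commenter's point: the card is cheap next to the person using it.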
I have a question. What if you compare this $400 5870 to a workstation card at the same $400 price point? Studios might not have an issue with going for an $1800 card, but for someone like me who is just starting to freelance, $1800 is too big. So I would like to see the same test done with a card that has the same price point as the 5870, and probably one at the same price point as the 4890, the card I have right now.
That's a good question Anand, I'm not sure what the answer is, but I'd tend to go with the gaming card over a super-cheap workstation card; I think you'll get a lot more bang for your buck. For one thing you'll be getting a lot less on-board memory on the card and, at least with the Quadros I've used, a lot less reliability. I'm not the expert around here, though, and I don't have experience with AMD workstation cards.
What if you compare this $400 5870 to a workstation card at the same $400 price point?
That's a review I'd really like to see, since that's more like the choice I'd have to make if I decide to delve into more 3D work. I don't think a $2000 card is ever going to be in my budget.
You guys mean like a Quadro 3800 series workstation GPU?
Which happens to be en route to our test bench as we speak.
Oh, you'll have the numbers you hunger for.
Also, John Foster, you always have the right words for any occasion. Though I am a writer who is also a 3D artist, so I am one of the few who needs that extra oomph.
But I certainly know how to complain, and beer shuts me up, so yes, you nailed it.
A Quadro 3800 would be nice if it falls in the range. Another comparison would be an ATI 5850 and a Quadro 1700 or 1800. I think they fall in the same price range.
I have a nice gaming rig with a Radeon HD 4870. I was thinking about buying a workstation card because my video card causes SolidWorks to crash all the time. I don't really want to build a separate workstation PC. Can I use both cards at the same time in one rig? Or would I have to swap the cards each time I wanted to play a game or use SolidWorks?
Ecthelionv, you would have to swap the hardware in and out depending on what you want to do. As far as I know, there is no other method of doing it. I keep the desktop card installed most of the time, and swapping to the FirePro isn't too bad with a tool-less chassis.
What if you have two monitors? Would the card attached to each monitor behave as expected? Or is that a maybe/sometimes kind of thing?
Driver issue. Both use an ATI driver, and most likely that will cause major issues with the system not knowing which driver goes with which card. If you plan to do that, I would shoot for a Quadro, because then you can get drivers from both companies to cohabitate.
I turn the thing off and unplug it man! come on, what kind of barbarian do you think I am?!
Trust me, being unplugged does not mean you're not going to zap it with a little static electricity. I'm not too paranoid about it (though I have been through mandatory electrostatic discharge training), but rotating the cards in and out all the time has to increase your chances.
I wonder if I should try baking my three 7800GT cards that are all on the fritz. Can you kill your card this way?
I would say if they're already on the fritz, there's no harm in trying. You can't really use them when they're unstable anyway, so is it really going to matter if you DO kill them completely?
Thanks for this great article; it really shows the difference between the two levels of video cards.
I hope to see a future article pitting the top Nvidia gaming GPU against the QuadroFX line. And one pitting the QuadroFX line against the FirePro line.
I also hope to see benchmarks for some Adobe CS4 products and Autodesk Revit products, and video encoding/decoding.
I look forward to your future article next week, with videos of ATI reps & engineers, explaining the differences.
Again, thanks for doing what no one else has done!!! =D
I'm still waiting for a good DirectX (3ds Max) comparison of desktop vs workstation cards at similar price points, i.e. HD 4870 vs V3750. I've seen charts from each which suggest the V3750 is better, but I want some more proof before I consider switching. And softmodding the 4870 to a V8750 isn't as good as a real V8750, but would it be better than a V3750? (Obviously I've got a 4870 & Vista 64.)
WOW, this is a really good post. I just wanted to add my 2 cents.
First of all, content creation normally favors the OpenGL API to render shadows, textures and all the other mumbo jumbo. OpenGL used to be the premier toolkit back in the day. Any graphics card that implements OpenGL will work with Revit and AutoCAD and, by the sounds of it, SolidWorks too.
Now, a video game card can render OpenGL, but the workstation card has been engineered to carry out the job with much greater quality and performance. Create a model in Revit, turn shadows on, and change views quickly: the shadows will disappear and be quickly re-rendered on the game card, but the workstation card will always give you better speed and higher resolution, and the shadows will stay displayed.
-Can't go wrong with the HD 5770: OpenGL 3.0, low power consumption (25 W idle and 118 W at full load), plays Crysis (don't get carried away with the settings), and DirectX 11 for gaming (will be the standard for the next 2 years)
-Any Gigabyte mobo with the Ultra Durable 3 feature. Power efficient and less heat.
-Stuff in as much RAM as you can
-80+ Silver PSU. I recommend a 600 W unit from Corsair
-Go with any processor with an L3 cache. The differences between Intel and AMD nowadays are trivial, really. You will like either one; just be sure to get the L3 cache.
-If you decide not to game, the Quadro FX 580 (512 MB) will handle the load and help your renderings pop. The quality and performance will definitely be higher than a gaming card's.
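As a rough sanity check on that parts list, here's a quick power-budget sketch. The HD 5770's 118 W full-load figure comes from the comment; every other draw is an assumed estimate, not a measured number.

```python
# Rough PSU head-room check for the suggested build.
# Only the HD 5770 load figure comes from the comment; the rest are guesses.
draws_watts = {
    "HD 5770 (full load)": 118,   # quoted in the comment above
    "quad-core CPU": 95,          # assumed typical TDP of the era
    "motherboard + RAM": 50,      # assumed
    "drives + fans": 40,          # assumed
}

total_load = sum(draws_watts.values())
psu_rating = 600                  # the recommended Corsair 600 W unit

print(f"Estimated peak load: {total_load} W on a {psu_rating} W PSU")
print(f"Head room: {psu_rating - total_load} W")
```

Even with pessimistic estimates the recommended 600 W unit has plenty of margin, which is what you want for an 80+ Silver supply to run in its efficient band.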
Any graphics card that implements OpenGL will work with Revit and AutoCAD and, by the sounds of it, SolidWorks too.
As of version 2010, Revit uses Direct3D/DirectX 10 for its primary rendering. OpenGL, which was big up through Revit 2009, had its acceleration cut back in 2010 due to a plague of video card problems. DirectX 10 is the way forward in Revit.
--Can't go wrong with the HD 5770: OpenGL 3.0, low power consumption (25 W idle and 118 W at full load), plays Crysis (don't get carried away with the settings), and DirectX 11 for gaming (will be the standard for the next 2 years)
Thanks to different hardware and program-specific drivers, workstation cards have a big leg up over any gaming card; even the HD 5870 wouldn't perform as well as a lower-end workstation card for digital content creation or CAD work.
Keep gaming cards in the gaming world and workstation cards in the work world. It's not advisable to cross-use the cards (although it can be done, there is a big performance hit).
The SPECviewperf benchmark doesn't seem realistic. It's probably highly optimized for in the drivers, like ATI and NVIDIA also did with the 3DMark benchmark a few years ago.
I have a Quadro FX 570 and an ATI HD 2900 XT. When I run SPECviewperf, the Quadro FX 570 really smokes the HD 2900 XT. However, when I tested the two cards on my own 3ds Max projects, the HD 2900 XT always had higher fps. I tested a scene where the camera rotates around a mechanical device with moving parts: the Quadro FX 570 got to a max of 22 fps, and the HD 2900 XT almost double that, with 41 fps.
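Those two measurements are easy to put side by side. This trivial sketch just restates the commenter's own fps figures as a speedup and as frame times:

```python
# The commenter's measured viewport rates in the rotating-camera scene.
quadro_fx_570_fps = 22
hd_2900_xt_fps = 41

speedup = hd_2900_xt_fps / quadro_fx_570_fps
print(f"HD 2900 XT: {speedup:.2f}x the Quadro FX 570 in this scene")  # ~1.86x

# Frame times tell the same story in milliseconds per frame.
for name, fps in [("Quadro FX 570", quadro_fx_570_fps),
                  ("HD 2900 XT", hd_2900_xt_fps)]:
    print(f"{name}: {1000 / fps:.1f} ms per frame")
```

Per frame that's roughly 45 ms vs 24 ms, a difference an artist tumbling a viewport will absolutely feel.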
So when it comes to practical use, I prefer a cheaper game card and invest in a faster CPU to cut down rendering times. Graphics cards don't affect the rendering times.
The 3ds max viewport quality is more or less the same on both cards.
I used the nVidia 3ds Max performance driver for the Quadro FX 570 and the default DirectX 9 3ds Max driver for the HD 2900 XT.
I softmodded the HD 2900 XT to a FireGL V7600 too, but that only added a few fps, so not really worth the trouble. It did very well in SPECviewperf, though. Maybe a real V7600 is faster, but I don't know.
Cinebench is strange too. My HD 2900 XT scores 8500+ points in the OpenGL test, a lot higher than the cards tested here.
So when it comes to practical use, I prefer a cheaper game card and invest in a faster CPU to cut down rendering times. Graphics cards don't affect the rendering times.
Depends on the program. In some programs, rendering is written for graphics cards; in others, rendering is primarily CPU-based. You have to know which is the case for your program (sometimes it's hard to find accurate information on which is better).
FYI: the Quadro FX 570 is a really low-end card in the workstation line. With only 16 CUDA cores and 256 MB of memory, I'm not too surprised your HD 2900 XT is kicking its trash. The 2900 has 320 stream processing units and double the memory at 512 MB. The larger memory is your friend in this case.
Softmodding the 2900 doesn't really get you a FireGL V7600; you're only forcing the FireGL V7600 driver to run on the 2900, but the hardware and GPU are different on the two cards. The only thing you get out of softmodding is a glitchy graphics card that won't perform as well as a real FireGL V7600.
I personally set the entry level of workstation cards at the Quadro FX 1800 or FirePro V7750.
Oh yeah, I'd like to see that, but I'd still like to see something cheaper, too.
Check out the variation in the $400 - $500 range on Newegg.
Really hoping the oven works
/me looks over at Jackie's old dead 7900GT
And Chris, you totally jinxed me man. Just a few hours after I posted the witty retort... ZAP.
I ground myself extensively and everything, but sometimes the crap just needs to hit the fan.