Skipping town on NVIDIA
I've been using strictly NVIDIA products for my whole life (admittedly, only the last ten years of which were actually spent building PCs). I have never used an ATI card, and have watched my friends have substantial trouble with them.
However, I've been looking at some of the more recent tech updates, and I'm starting to feel like NVIDIA has been acting really childish and uncooperative lately - really late-90s-microsoft-ish. Now, I know the Icrontic community has been paying attention to the graphics giant a lot more than I have, and for a lot longer. Have they always been like this? Does ATI do the same sort of crap? Or am I just barking at ghosts here? I've been known to form half-assed opinions, so help me out a little here.
I'm considering boycotting NVIDIA from now on and trying out ATI instead - supporting companies that act rudely really leaves a bad taste in my mouth - but I'd like to know if I'M being fair about it or not.
Weigh in?
Comments
Present: ATI drivers are fine, while NVIDIA's have been a bit buggy. NVIDIA is dicking with the market in terms of physics, but it's about to be a non-issue (DX11). Both cards offer equivalent IQ, but NVIDIA's R&D fucked up and their product is at least 6 months late.
Historically speaking, it's a dead heat. My ultimate opinion on the issue is that ever since the Radeon 8500 vs. GeForce3 Ti 500 days, it has behooved you to check benchmarks and buy the card that's fastest in the applications you actually want to run.
Sometimes that has been NVIDIA, sometimes that has been ATI. Physics is fucking dumb, and nobody uses it well, so I've stopped caring about it until everyone is on the same page.
If you're into computational simulation, NVIDIA might be a better bet come January.
Forget politics. Just pick the fastest card you can afford right now. I've had both over the years, and both have served me just fine.
Sapphire Radeon 9800pro 256 (most expensive card I ever bought)
Geforce 6600
Radeon 3850XT
At the time, each of them served me admirably.
VooDoo II 12MB SLI
GeForce 2 GTS
Radeon 9700 Pro
GeForce 8800 GTS
The rest of my cards have been minor upgrades or emergency purchases, but those four were the times when I said to myself: "Wow, this was worth every damn penny."
There have been many interim cards I have said no such thing to.
You found a card for $100 that'll do you in the meantime? Holy crap, man, you're sitting on a gold mine!
MVC Gift card.
Certainly get the fastest card you can, no matter who the manufacturer is. If you're wanting to upgrade real soon, I recommend waiting for NVIDIA to drop the goods first so we know who wins this round.
And since it was mentioned, my top GPUs of my PC gaming lifetime:
ATI 9700 Pro (this thing was SWEET when it came out)
NVIDIA 8800 GTX (Current GPU, love it to death)
THIS. I loved that card; it served me well for four years. But after flashing it up to a 9800XT and overclocking it, it kind of burned up...
I'm also a big proponent of going with the best card for the money at the time of purchase regardless of brand.
From this perspective, these are fairly good points. AMD is trying to push open source standards; NVIDIA is trying to keep them as proprietary as possible. It's pretty clear which company is currently looking at the big picture, and which one is only looking in the mirror.
I was getting the feeling I was the only person who cared about useless things like company politics.
And yeah, I CAN'T STAND their new freaking naming scheme. But where do you go after you run out of numbers, right?
Was ATI ever a massive D-bag? And I'm not talking about the quality of their merchandise (there's really good information about that in the posts above), I'm talking about them acting uncooperative and unfriendly toward journalists, gamers, other companies, whatever. Thrax mentioned they used to manipulate drivers to mess with benchmarks, but anything else?
Leo, you are right about the 8800, 9800, and GTS250 relabeling. But the GTX260 is not a relabeled card; it really is a newer generation. Not that the distinction matters much, but it makes me feel better since I have a GTX260.
I honestly just don't give two shits about either company. I want to know what will let me game the best at the time of purchase. If they're close in performance, I go NVIDIA for CUDA and Folding reasons, and previously because NV's Linux support was superior. The only time a company will lose my business is for nerdy reasons like throttling my network connection or using DRM on my files.
Who cares what they say? Unless they're making GT300s by killing babies (and they're telling that to me), their hardware isn't affected by their words. Bickering between companies just isn't interesting to me.
Right, but the point is that while DirectX and D3D are proprietary, they don't restrict which cards can use them. Anyone can implement games, hardware, etc. that use the standard. Would I prefer it were open source? Sure, but Microsoft needs to make money, so I'll settle for closed source but an open standard. Meanwhile, NVIDIA makes this PhysX engine, makes it so only they can implement hardware that takes advantage of PhysX, then makes it so you can't use PhysX if there's a non-NVIDIA card in your system, even if you also have an NVIDIA card in there. That would be like Microsoft making DirectX or D3D stop working if you have Linux installed on a second partition.
Is it too late for OpenGL to get back into the game?
Seems to me that DX11's new compute shader technology is an answer to OpenCL, at least in the gaming field. There still needs to be a higher-level library that actually does the physics calculations, the way CryEngine sits on top of D3D.
With OpenCL and DirectCompute, it doesn't matter what library actually does the calculation, because the result is the same: Any DX10+ GPU will run it.
It's just like graphics: Thousands of game engines use DirectX. Some of them are better than others, but every video card can run all of them. Now it's the same for physics: There will soon be dozens of physics implementations. Some will be better than others, but every video card can run them.
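That vendor neutrality is the whole appeal of OpenCL and DirectCompute: the host program hands the kernel source to whatever driver is installed, which compiles it for that GPU at runtime. A trivial sketch of what a physics kernel might look like in OpenCL C (a hypothetical example for illustration, not taken from any shipping physics library):

```c
// OpenCL C kernel: one Euler integration step for a batch of particles.
// The same source compiles and runs on any GPU with an OpenCL driver,
// NVIDIA or ATI alike -- no vendor lock-in at the language level.
__kernel void integrate(__global float4 *pos,   // xyz = position
                        __global float4 *vel,   // xyz = velocity
                        const float4 gravity,   // e.g. (0, -9.8, 0, 0)
                        const float dt)         // timestep in seconds
{
    int i = get_global_id(0);   // one work-item per particle
    vel[i] += gravity * dt;     // accelerate
    pos[i] += vel[i] * dt;      // move
}
```

Whether a physics library generates OpenCL like this or a DirectCompute shader underneath, the result is the same: the calculation runs on whatever DX10+ GPU you happen to own.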