Nvidia 8800. I'm finally tired of this crap.

Your-Amish-Daddy The heart of Texas
edited November 2006 in Hardware
Ok, rant time. Sorry.

I just saw the Nvidia 8800 GTS and GTX. They require extra power. Two 6-pin PCI Express power connectors. Two. Two. Two. That's roughly 150 extra watts on top of what the slot already supplies. What the...? Why? Why does a video card need 150 feckin' watts of extra power? What's it gonna do, render things on spacetime calculations?! A hundred and fifty watts is enough for me to run four hard drives, and I know they would consume less power. This is terrible.
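
Rough back-of-envelope numbers, going by the PCI Express spec ceilings (75 W from the x16 slot, 75 W per 6-pin connector) rather than anything Nvidia actually publishes for these boards:

    # Hypothetical power budget for a card with two 6-pin PCIe connectors.
    # The 75 W figures are PCI Express spec ceilings, not measured draw.
    slot_power_w = 75         # what the x16 slot itself can deliver
    six_pin_power_w = 75      # per auxiliary 6-pin connector
    connectors = 2

    extra_power_w = connectors * six_pin_power_w
    board_budget_w = slot_power_w + extra_power_w
    print(extra_power_w)      # 150 W just from the auxiliary connectors
    print(board_budget_w)     # 225 W available to the card, worst case

    hdd_power_w = 10          # a typical 3.5" drive under load, roughly
    print(board_budget_w // hdd_power_w)   # ~22 hard drives' worth of power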

For the longest time I've stood by my policy: a powered video card is a sign that companies are too lazy to do things right. Now, I can understand if the card needs a few more watts than the slot can provide... Wait, no I can't. If we're putting output limits on slots with no jumper to change them, and motherboards are now starting to require 32+ power pins... I don't fucking think so. No. This is stupid. I remember the old 12-pin motherboards and the difficulty of overclocking, but there is no excuse for this.

We can make a laptop run on 18.2 volts, and those laptops aren't pushovers. Granted, the system is designed to consume less power, but even at full load, 18.2 volts is still 18.2 volts. Not 115+ volts through a transformer, which is basically what a power supply is. I'm tired of having my light bill go up every time technology "increases". I'm tired of having to buy a new power supply every year just to keep up. I'm tired of getting used to something and then having it change.

Intel was supposed to have already released the x32 slot. Back in June I remember hearing something about them being ready to release it. Where is it? It's nowhere, and I've stopped hearing about it. Maybe it was not only faster than the x16 slot, maybe it was designed to run these higher-end video cards, and companies like Thermaltake and Aspire didn't like it. I don't care why it's not out, I really don't. What I still wanna know is: why do these video cards need all this when they don't offer much that the one I have doesn't? The X1800 has no bells that my X1600 doesn't. And my X1600 isn't powered.

I'm tired of this crap. I bet the next power socket is going to be 40 pins. We're only 4 off. The Asus 650i requires 36. This is bullshit.

Comments

  • Zuntar North Carolina Icrontian
    edited November 2006
    Yea, I hear ya bro! I find it exhausting and nearly impossible to keep up with the changes in hardware over the past several years. It just gets worse and worse. It used to be you could upgrade components every once in a while, but now the system changes required to keep up are just... too much!!!
  • kryyst Ontario, Canada
    edited November 2006
    But cutting-edge hardware is always a year or two ahead of what software actually needs; that's always been the case.
  • Your-Amish-Daddy The heart of Texas
    edited November 2006
    That doesn't explain WHY these components consume so much power. My X1900 GT takes extra power off the 12 V rail just for the fan (I've seen the diagram), so I don't give a damn about powering it. That power keeps this thing cool. Very well, I might add, even though it's pulling heat right from my hard drives. What pisses me off is that there's no explanation anywhere. I think I'll be emailing Nvidia tonight.
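
    For what it's worth, a fan on the 12 V rail only amounts to a few watts. A rough sketch, assuming a typical 0.25 A video card fan (my assumption, not a number off the X1900 GT's diagram):

    # P = V * I for a 12 V fan rated around 0.25 A (assumed rating)
    fan_voltage_v = 12.0
    fan_current_a = 0.25
    print(fan_voltage_v * fan_current_a)   # ~3 W for the fan itself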
  • kryyst Ontario, Canada
    edited November 2006
    They need more power because they're bigger GPUs, simple as that.
  • Your-Amish-Daddy The heart of Texas
    edited November 2006
    That doesn't make sense.

    The 8800 is built on a 90 nm process. That's tiny. Itty bitty. Why does this itty fucking thing need 150 extra watts? I can't even see 90 nm, much less the whole die under that massive heatsink, which in most cases isn't efficient thanks to common case construction.

    I guess I'm just pissed that I have to have all those wires in my case that do nothing until I get one of these honking cards. I'm tired of how complex this stuff has gotten when it used to be so simple. I'm tired of having wires on top of wires on top of components on top of wires. The inside of my case two years ago was far simpler than this. All these advances in technology, and almost none in power efficiency. I've seen no press on a power supply that's 100% efficient, but we can have a video card that can render spacetime. NO, I don't think so. This is still a load of bullshit.
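
    To put a number on the efficiency gripe: whatever the parts actually use in DC, the wall pays for more, because no supply is 100% efficient. A rough sketch with assumed figures (the 75% efficiency and 300 W load are illustrative, not measurements of any particular PSU):

    # Wall draw = DC load / efficiency
    dc_load_w = 300.0      # assumed system draw under load
    efficiency = 0.75      # typical mid-2000s PSU, assumed
    wall_draw_w = dc_load_w / efficiency
    print(wall_draw_w)              # 400 W pulled from the outlet
    print(wall_draw_w - dc_load_w)  # 100 W lost straight to heat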
  • Zuntar North Carolina Icrontian
    edited November 2006
    I understand your frustration, but please keep your language to less offensive terms.
  • kryyst Ontario, Canada
    edited November 2006
    I'm not saying what they're doing is right, but it's the current nature of things. GPUs develop independently of PSUs and so on. Graphics card makers are about making money; efficiency never saves money in the short run, and that's all they're in it for with any card release. Yes it's stupid, yes it's backwards logic, but unfortunately it's the nature of the beast. The next-gen cards will draw even more juice as they start putting CPUs or dual-core GPUs onto their boards. More heat, bigger fans, and the power requirements will continue to rise.
  • Zuntar North Carolina Icrontian
    edited November 2006
    kryyst wrote:
    I'm not saying what they're doing is right, but it's the current nature of things. GPUs develop independently of PSUs and so on. Graphics card makers are about making money; efficiency never saves money in the short run, and that's all they're in it for with any card release. Yes it's stupid, yes it's backwards logic, but unfortunately it's the nature of the beast. The next-gen cards will draw even more juice as they start putting CPUs or dual-core GPUs onto their boards. More heat, bigger fans, and the power requirements will continue to rise.

    So true!!!