Future Graphics Cards May Consume More Than 150W Of Power

edited December 2004 in Science & Tech
PCI-SIG Raises the VPU power consumption bar to 150W in new standard.
While some engineers do their best to reduce the power consumption of chips for mobiles and handhelds, and others make great efforts to hold the power consumption of desktop chips within reasonable envelopes, the PCI Special Interest Group is prepping a forward-looking electrical standard for PCI Express x16 implementations that would allow forthcoming graphics cards to be fed an unprecedented amount of power.

“PCI-SIG announces the availability of the PCI Express x16 Graphics 150W-ATX Specification 1.0. This is another in a series of specifications that attest to the continued momentum in the adoption of PCI Express architecture as the general-purpose I/O interconnect of choice in computing and communications industries,” the PCI Special Interest Group said in a statement.

The new specification is aimed at high-end graphics applications that require increased power. It defines a standard power connector to meet the growing needs of power-hungry graphics engines. The specification is written for ATX chassis implementations. A companion specification for BTX chassis is currently in development in the PCI-SIG.
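
For a rough sense of what a 150W graphics budget means for a whole system, here is a back-of-envelope Python sketch. Every component figure in it is an illustrative assumption typical of a 2004-era high-end box, not a number taken from the PCI-SIG specification:

    # Back-of-envelope system power budget under the proposed 150W graphics ceiling.
    # All component figures are illustrative assumptions, not values from the spec.
    # (The 150W card figure itself would typically be split between the x16 slot
    # and the spec's auxiliary power connector.)
    components_watts = {
        "graphics card (new 150W ceiling)": 150,
        "CPU, e.g. a 3.4GHz Prescott (assumed)": 130,
        "motherboard, chipset, PWM (assumed)": 40,
        "drives, fans, peripherals (assumed)": 30,
    }

    total_dc_load = sum(components_watts.values())  # watts delivered by the PSU
    psu_efficiency = 0.70  # typical for the era (assumed)
    wall_draw = total_dc_load / psu_efficiency  # watts pulled from the outlet

    for part, watts in components_watts.items():
        print(f"{part}: {watts}W")
    print(f"Total DC load: {total_dc_load}W")
    print(f"Draw at the wall (~{psu_efficiency:.0%} efficient PSU): {wall_draw:.0f}W")

On those assumptions the box pulls roughly 500W at the wall, which is why the discussion below keeps circling back to power supplies and heat.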
Is the market going to allow for this? -KF

Source: X-Bit Labs

Comments

  • RWB Icrontian
    edited December 2004
    We're going to eventually need a Cold Fusion Reactor or something inside each PC just for our graphics cards :P
  • Medlock Miramar, Florida Member
    edited December 2004
    KingFish wrote:
    Is the market going to allow for this? -KF
    Do we have a choice? :confused:
  • edited December 2004
    Yes, consumers have a choice. If two major video card makers produce cards with similar benchmarks but very different power needs, the market will dictate quite well which one sells better. Also, if reviewers hammer the video card makers for their power draw, that will sway their direction too. The market has quite a bit of say in this; the card manufacturers can't afford not to listen to it.
  • Geeky1 University of the Pacific (Stockton, CA, USA)
    edited December 2004
    Increased power consumption is a fact of life. The faster you go and the more transistors you add, the more power you need. Dropping to a smaller manufacturing process (90 nanometers from 110, for example...) will help, but we're seeing manufacturers run into problems with their 90nm processes, so...

    Besides, who cares? So it draws 150W. So what? It just means that people will have to stop using crap power supplies. A good 550W PSU will handle a 150W graphics card in a loaded system without a problem.
  • edcentric near Milwaukee, Wisconsin Icrontian
    edited December 2004
    Think about other possibilities.
    If you could supply 150W through PCIe, then you could have multi-CPU boards with additional CPUs on PCIe cards.
    Or what else could you power on a card with 150W?
  • edited December 2004
    I'd hate to even think what that 150W will produce in terms of heat inside a computer case. Pair that with a Presshot and you have a heck of a heat machine.
  • Geeky1 University of the Pacific (Stockton, CA, USA)
    edited December 2004
    It's not that big of a deal. I've got a 3.4GHz Prescott in my laptop, for god's sake. All it means is that we're going to see more use of heatpipes, larger fans, and hybrid or solid copper heatsinks, even in OEM machines.
  • maf
    edited December 2004
    Heat is a big fucking deal, geeky (freak).
  • EyesOnly Sweden New
    edited December 2004
    And so is noise. I for one don't want a howling monster near me. Then again, some weirdos do. :shakehead :D
  • Geeky1 University of the Pacific (Stockton, CA, USA)
    edited December 2004
    Maf: It depends on what you're talking about... a system with a graphics card that can put out 150W of heat and a CPU (say a 3.4 Prescott) that puts out about 125-150W of heat would have a total heat output (including the power supply, chipset, PWM, etc.) of somewhere between 350 and 450W. That makes for a hell of a space heater, yes, but in terms of cooling it, it's hardly a big deal.

    EyesOnly: What would you say if I told you that my Xeon system (two 2.8 Xeons @ 3.2 [highest the board will go] and a 6800GT), which puts out somewhere around 300W of heat under full load (both CPUs & the GPU @ 100% usage), does so while keeping both processors and the GPU cooler than most people's single-CPU systems, AND does it with less noise? Oh, and it's air cooled.

    The dual Athlon system puts out a bit less heat (say 200-250W; the gfx card is only a 9800XT, so...). It's also air cooled, and it also runs cooler and quieter than most single-CPU systems.

    So there. :p
  • godzilla525 Western Pennsylvania Member
    edited December 2004
    Is it hot in here? Gee, must be this new computer....

    I think I'll go outside and cool off in the breeze coming off the ELECTRIC METER!!

    :shakehead
  • EyesOnly Sweden New
    edited December 2004
    I still think single CPU is the way to go. :D
    godzilla525 wrote:
    Is it hot in here? Gee, must be this new computer....

    I think I'll go outside and cool off in the breeze coming off the ELECTRIC METER!!

    :shakehead

    Ouch, I feel for you man, even though the only breeze I feel while outdoors isn't coming from my house.

    I so long for February, when I'm moving and will get a dedicated room for my computer in the basement. :thumbsup:

    It took almost a month after the first heaters in the house were turned on before I needed to turn on the one in my room. Go figure. :scratch: :shakehead