Nvidia G80

Sledgehammer70 California Icrontian
edited November 2006 in Hardware
Some interesting images have surfaced that give us a first look at Nvidia's new monster, known as the G80. Not much is known about this chip other than that it will outperform a single GX2. We don't know if there will be an air-cooled option, but we know for a fact that this one is water cooled and that it uses two PCIe power connectors... this beast had better be one heck of a graphics card if it is sporting these features.

* Unified Shader Architecture
* Supports FP16 HDR + MSAA
* Supports GDDR4 memory
* Close to 700M transistors (G71 - 278M / G70 - 302M)
* New AA mode : VCAA
* Core clock scalable up to 1.5GHz
* Shader Performance : 2x Pixel / 12x Vertex over G71 (48 pixel and 96 vertex *)
* 8 TCPs & 128 stream processors
* Much more efficient than traditional architecture
* 384-bit memory interface (256-bit+128-bit)
* 768MB memory size (512MB+256MB)
* Two models at launch : GeForce 8800GTX and GeForce 8800GT
* GeForce 8800GTX : 7 TCPs chip, 384-bit memory interface, hybrid water/fan cooler, water cooling for overclocking. US$649
* GeForce 8800GT : 6 TCPs chip, 320-bit (256-bit + 64-bit) memory interface, fan cooler. US$449-499
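
If those bus widths are right, a quick back-of-the-envelope calculation shows why the unusual 384-bit interface matters for bandwidth. Here is a minimal Python sketch; the 1.8 GT/s effective GDDR4 data rate is an assumption on my part, since the leak lists no memory clocks:

```python
# Theoretical memory bandwidth from bus width and data rate.
# ASSUMPTION: 1.8 GT/s effective data rate -- the leak gives no memory clocks.
def bandwidth_gb_per_s(bus_width_bits: int, data_rate_gt_per_s: float) -> float:
    # bytes per transfer (bus width / 8) times transfers per second
    return (bus_width_bits / 8) * data_rate_gt_per_s

print(bandwidth_gb_per_s(384, 1.8))  # rumored 8800GTX, 384-bit: 86.4 GB/s
print(bandwidth_gb_per_s(320, 1.8))  # rumored 8800GT, 320-bit:  72.0 GB/s
print(bandwidth_gb_per_s(256, 1.8))  # a conventional 256-bit bus: 57.6 GB/s
```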

[Images: leaked photos of the G80 board and its cooler]

Comments

  • edited September 2006
    No way I would buy something like that myself. We are just now getting away from the power-gulping electric-heater simulators that were called NetBurst, and it's about time that both Nvidia and ATI/AMD wake up and realize this fact too. I don't know what they will have to do to curb heat and power consumption, but they are at that point with the next-gen GPUs they are trying to finish up right now.
  • Sledgehammer70 California Icrontian
    edited September 2006
    To be honest, the G80 is made for the hard-core crowd, not your average Joe. The people who buy these cards do not care what the power requirements are; they just want the best performance at any cost.
  • tmh88 Pittsburgh / Athens, OH
    edited September 2006
    To be honest, the G80 is made for the hard-core crowd, not your average Joe. The people who buy these cards do not care what the power requirements are; they just want the best performance at any cost.


    That thing probably costs more than the high-end Quadro FXs... especially since it's watercooled.
  • Sledgehammer70 California Icrontian
    edited September 2006
    If they want to stay competitive, I am sure it will stay at $600 or less, as the GX2s sell in the $500s now and that is a dual-GPU card with 1GB of RAM :)
  • edited September 2006
    The GPU designers need to take a lesson from Conroe (Core 2) and start engineering cooler-running parts. I'm getting tired of running "heaters" in my computer case that cost too much money to run and too much money to cool my house down to compensate for the heat they generate. :grumble:
  • edited October 2006
    Dumb question I'm sure, but I take it this is a DX10 compliant part?

    If so, do they also plan to release it in late November?
  • Sledgehammer70 California Icrontian
    edited October 2006
    It has yet to be confirmed by Nvidia as a DX10 card, but I am almost positive it will be.
  • DanG I AM CANADIAN Icrontian
    edited October 2006
    I don't think that Nvidia would shoot themselves in the foot like that: bring out a brand-new card for Christmas/Vista and not have it be fully compliant with DX10 and SM4.
  • lemonlime Canada Member
    edited October 2006
    Looks like a very interesting architecture (especially the 256+128-bit memory interface). I'm very curious to see how this will perform. Is it just me, or does that card look like it won't fit in a lot of cases? It's getting close to full length, I think.

    I'm curious about the power consumption of this card. Obviously it will draw more power than the 7800 series, since it is 90nm and most definitely a larger core. GDDR4 should help, but the extra memory offsets those savings. With any luck it may not exceed the draw of an X1900.
  • Sledgehammer70 California Icrontian
    edited October 2006
    11" for the GTX and 9" for the GTS not to sure on how it performs, but it it requires 2 PCI power inputs I am sure it should shatter some records :)
  • Sledgehammer70 California Icrontian
    edited October 2006
    Agressor Prime found this info and commented on it... I pretty much agree with his statements :)
    * Unified Shader Architecture
    * Supports FP16 HDR + MSAA
    * Supports GDDR4 memory
    * Close to 700M transistors (G71 - 278M / G70 - 302M)
    * New AA mode : VCAA
    * Core clock scalable up to 1.5GHz
    * Shader Performance : 2x Pixel / 12x Vertex over G71 (48 pixel and 96 vertex *)
    * 8 TCPs & 128 stream processors
    * Much more efficient than traditional architecture
    * 384-bit memory interface (256-bit+128-bit)
    * 768MB memory size (512MB+256MB)
    * Two models at launch : GeForce 8800GTX and GeForce 8800GT
    * GeForce 8800GTX : 7 TCPs chip, 384-bit memory interface, hybrid water/fan cooler, water cooling for overclocking. US$649
    * GeForce 8800GT : 6 TCPs chip, 320-bit (256-bit + 64-bit) memory interface, fan cooler. US$449-499

    "Expect G80 to be out somewhere in mid November along with Kentsfield."

    The notes below are my own writing...

    So basically the GeForce 8800 GTX will have 144 unified pipelines clocked at 1.5GHz vs. ATI's 64 clocked at 1GHz. To be exact, just considering pipelines × clock, a single GeForce 8800 GTX will be 3.375x more powerful than a single R600. Now just imagine how much nVidia will own ATI when they release the GX2 version and enable G80 Quad SLI...

    *I am thinking I might be wrong in this statement, since the leak says performance, not pipelines. The shaders are unified, so there might not be more than 96 pipelines, but that does not agree with the 48 pixel figure. Still, there is more power than the R600.**

    **I thought about it again. The only thing that makes sense is 48 unified and 48 vertex. That way you get 96 vertex, or 48 pixel (with 48 vertex), when the two extremes are taken. Still, there is an advantage over the R600, provided the application's pixel-to-vertex workload doesn't exceed the 48p/16v ratio.

    ~~~ So basically the advantage the G80 has over the R600 (considering the pixel/vertex ratio = 1) is (96/64)*(1.5/1.0) = 2.25. So a single GeForce 8800 GTX can beat 2 R600s in Crossfire. Still awesome... ~~~
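
    For what it's worth, here is that arithmetic as a tiny Python sketch. It only multiplies pipeline count by clock, which ignores everything else about the two architectures, so treat it as rumor math, nothing more:

    ```python
    # Aggressor Prime's pipelines-x-clock comparison, nothing more.
    # Real-world performance depends on far more than this single ratio.
    def relative_throughput(pipes_a: int, ghz_a: float, pipes_b: int, ghz_b: float) -> float:
        return (pipes_a * ghz_a) / (pipes_b * ghz_b)

    # First estimate: 144 G80 pipes @ 1.5 GHz vs. a rumored 64 R600 pipes @ 1.0 GHz
    print(relative_throughput(144, 1.5, 64, 1.0))  # 3.375
    # Revised estimate: 96 usable pipes @ 1.5 GHz vs. the same R600 figures
    print(relative_throughput(96, 1.5, 64, 1.0))   # 2.25
    ```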
  • MJO Denmark New
    edited November 2006
    The cards are supposed to be quite long as well.
    I believe it is 26 cm for the GTS version and 28 cm for the GTX version.
    And I have just bought a new case for my new rig.
    It is an Antec P160W

    Furthermore, since I have lost my slide rule, I wrote Antec regarding the available space in that particular case.
    Consider me lucky, here is what the nice Antec supporter had to say:
    I have checked and the length is about 28.5cm.
    That's what I call a tight fit. :hair::eek:
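
    A trivial fit check with the numbers from this thread (the card lengths are the rumored figures above, so treat them as estimates):

    ```python
    # Card length vs. case clearance, using the figures quoted in this thread.
    CARD_LENGTH_CM = {"8800GTS": 26.0, "8800GTX": 28.0}  # rumored lengths
    CASE_CLEARANCE_CM = 28.5  # Antec P160W, per Antec support

    for card, length in CARD_LENGTH_CM.items():
        margin = CASE_CLEARANCE_CM - length
        verdict = f"{margin:.1f} cm to spare" if margin >= 0 else "won't fit"
        print(f"{card}: {verdict}")
    ```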
  • Sledgehammer70 California Icrontian
    edited November 2006
    Yeah, the G80 is no slouch in size; from what I have seen it fills most cases pretty well. I have even seen removable HD brackets taken out so one of these bad boys will fit. I have found that none of my cases will be able to hold two of these cards, well, at least the GTXs...

    I did recently buy a very old-school full-size case that I know will hold these bad boys, so when I am ready to sport some of these bad boys I can :) I had one of these cases back when I had a few of my Voodoos, which were much bigger than the 8800GTX. It's too bad you can't buy these cases retail anymore :(
  • lemonlime Canada Member
    edited November 2006
    ..I have even seen removable HD brackets taken out so one of these bad boys will fit.. I did recently buy a very old-school full-size case that I know will hold these bad boys, so when I am ready to sport some of these bad boys I can :) ..

    ;D these are real 'bad boys' aren't they? :D
  • Sledgehammer70 California Icrontian
    edited November 2006
    Very Bad!
  • mmonnin Centreville, VA
    edited November 2006
    Very good article on Anandtech about the architecture of the new G80 core. Brand new from the ground up. Seems like a winner to me with the changes they have made. Now they just need to do it on a smaller process to shrink the die size and power consumption. It does perform better than any GPU in performance per watt, so keep that in mind, but it is a MONSTER in size. Yields will be an issue until they get it on a new process, and then the price will start to drop.

    http://www.anandtech.com/video/showdoc.aspx?i=2870
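
    Performance per watt is just measured throughput divided by board power. A minimal sketch of that comparison; the FPS and wattage numbers below are hypothetical placeholders, not measured G80 figures:

    ```python
    # Perf-per-watt comparison. All FPS and wattage values are HYPOTHETICAL
    # placeholders for illustration, not measurements of any real card.
    cards = {
        "new_gpu": {"fps": 100.0, "watts": 145.0},
        "old_gpu": {"fps": 60.0,  "watts": 110.0},
    }
    for name, c in cards.items():
        print(f"{name}: {c['fps'] / c['watts']:.2f} FPS per watt")
    ```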