Dissecting Fermi, NVIDIA's next generation GPU

Thrax 🐌 Austin, TX Icrontian
edited October 2009 in Science & Tech

Comments

  • UPSLynx :KAPPA: Redwood City, CA Icrontian
    edited October 2009
    This thing is impressive, yet horrifying. It's totally consistent with the GPU-computing trends we saw NVIDIA pushing at SIGGRAPH.

    We're seeing a time when the line between the CPU and GPU is being blurred significantly. In theory, that's great news, with all the crazy calculations needed for AI, physics, and such.

    But at the same time, this could be a new approach that doesn't end well for gamers.

    GPU computing is awesome, especially in the professional fields. But I'm going to buy my GPU for gaming. That's all I want this thing to do. If the computing helps games, awesome. If the thing games exceptionally well AS WELL as assisting the CPU during non-gaming moments, I'm OK with that. But if the card turns in mediocre gaming performance while helping run MS Word better, then we're in trouble.
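
    For anyone who hasn't touched GPU computing: the model is thousands of lightweight threads each doing a tiny piece of the work in parallel. Here's a minimal CUDA sketch of the idea (purely illustrative; the kernel, names, and sizes are made up, not anything NVIDIA has shown for Fermi):

        #include <cstdio>
        #include <cuda_runtime.h>

        // Each GPU thread scales one array element, so a million elements
        // get a million lightweight threads instead of one CPU loop.
        __global__ void scale(float *data, float factor, int n) {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i < n) data[i] *= factor;  // one thread, one element
        }

        int main() {
            const int n = 1 << 20;                       // ~1M floats (arbitrary size)
            float *d = NULL;
            cudaMalloc((void **)&d, n * sizeof(float));  // allocate on the GPU
            cudaMemset(d, 0, n * sizeof(float));
            // Launch enough 256-thread blocks to cover all n elements.
            scale<<<(n + 255) / 256, 256>>>(d, 2.0f, n);
            cudaDeviceSynchronize();                     // wait for the kernel to finish
            cudaFree(d);
            printf("kernel finished\n");
            return 0;
        }

    A CPU walks those million elements a few at a time; the GPU spreads them across hundreds of stream processors, which is why highly parallel, non-graphics workloads can see such dramatic speedups.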
  • QCH Ancient Guru Chicago Area - USA Icrontian
    edited October 2009
    Did someone say Fermi? ;D
  • Cliff_Forster Icrontian
    edited October 2009
    I've been saying this for a while now: the GPU is the enthusiast part that is going to continue to evolve and impress us, while the CPU is starting to top out. All you can do is add more cores, and for a home user that is becoming fairly pointless. Now GPUs are showing their muscle. They can do more than just render images, and they are only now figuring out how they fit into the much larger picture. Think Intel dumped millions into research to develop its own GPUs just to compete in the PC gaming market? You know it didn't. Intel is doing it because it has to. The writing is on the wall: GPUs are where the biggest performance gains are possible, while the traditional x86 CPU, though still improving, is starting to top out. I think it's only a matter of time before GPUs are the guts of the system.
  • Butters CA Icrontian
    edited October 2009
    In the cloud.
  • Leonardo Wake up and smell the glaciers Eagle River, Alaska Icrontian
    edited October 2009
    "GPUs are where the biggest performance gains are possible, while the traditional x86 CPU, though still improving, is starting to top out."

    What you have expressed became palpable for me when I added GPU clients to my Folding@Home efforts. Seeing the amazing output the GPU clients turned in versus CPU processing, I started paying attention to GPU-as-processor developments.
  • Leonardo Wake up and smell the glaciers Eagle River, Alaska Icrontian
    edited October 2009
    I hope NVIDIA is successful with this technology. Five to ten years down the road, it would benefit us all if there were serious three-way competition: NVIDIA, AMD/ATI, and Intel all producing top-quality, high-performance CPU-GPU processing units, or whatever they might be called by then. The two-player general CPU market must evolve.

    The writing is definitely on the wall: 1) AMD purchased ATI, 2) Intel is getting serious about research and development for graphics processing, and 3) NVIDIA is developing Fermi.