Intel's answer to the GPU

Winga Mr South Africa Icrontian
edited February 2007 in Science & Tech
Could this spell the end of the GPU as we know it?

The Inquirer has posted an interesting article about Intel's future release of a new type of 'graphics card' that uses x86 'mini-cores' instead of GPUs.

They have code-named it Larrabee, and its initial release should see the light of day some time in 2009, sporting 16 cores. However, due to the architecture, the core count can be scaled up or down quite substantially.

What are these cores? Well, they're not GPUs. They're basically a line of x86 mini-cores with a staggeringly short pipeline. What you get in the end is a lot of threads driving a super-wide vector unit, with the control logic in x86. You use the same tools to program the GPU as you do the CPU, so it becomes very easy to use the GPU as an extension of the main CPU. Effectively, Intel is doing away with the traditional 3D pipeline of putting points in space, connecting them, painting the resultant triangles, and making it all run as fast as possible.
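To see what "the pipeline is just ordinary code" means in practice, here is a minimal software-rasterizer sketch. It is purely illustrative (the article describes no API, and the vertex coordinates and frame size are made up): it covers one triangle with edge-function tests, the same tests a Larrabee-style chip would evaluate many pixels at a time in its wide vector unit.

```python
def edge(a, b, px, py):
    """Signed area: positive when (px, py) lies to the left of edge a->b."""
    return (b[0] - a[0]) * (py - a[1]) - (b[1] - a[1]) * (px - a[0])

def rasterize_triangle(w, h, v0, v1, v2):
    """Return the set of pixels covered by one counter-clockwise triangle.

    On hardware like Larrabee, the three edge tests for a whole batch of
    pixels would run at once in the vector unit; this scalar loop just
    shows that the '3D pipeline' reduces to plain x86-programmable code.
    """
    covered = set()
    for y in range(h):
        for x in range(w):
            px, py = x + 0.5, y + 0.5  # sample at pixel centers
            if (edge(v0, v1, px, py) >= 0 and
                    edge(v1, v2, px, py) >= 0 and
                    edge(v2, v0, px, py) >= 0):
                covered.add((x, y))
    return covered

# A right triangle filling half of an 8x8 frame.
pixels = rasterize_triangle(8, 8, (0.0, 0.0), (8.0, 0.0), (0.0, 8.0))
print(len(pixels))  # 36 pixel centers fall inside the triangle
```

Because the whole thing is just loops and arithmetic on general-purpose cores, nothing stops you from swapping the inner test for a ray tracer or anything else, which is exactly the flexibility the article is pointing at.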

By connecting these cores with a very wide bi-directional ring bus, we can start thinking four digits of bit width rather than three, and Tbps of bandwidth rather than Gbps.
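A quick back-of-the-envelope check shows why a four-digit bit width lands you in Tbps territory. The article gives no clock speed, so the 1024-bit width and 2 GHz ring clock below are assumptions purely for the arithmetic:

```python
# Illustrative numbers only -- width and clock are assumed, not from the article.
ring_width_bits = 1024      # "four digits of bit width"
clock_hz = 2e9              # assumed 2 GHz ring clock
directions = 2              # bi-directional ring

bandwidth_bps = ring_width_bits * clock_hz * directions
print(bandwidth_bps / 1e12)  # ~4.1 Tbps
```

Even at half that clock, a kilobit-wide bi-directional ring is comfortably past the Gbps range.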
Although the current chip is 65 nm, by the time Larrabee hits production we could very well see it on 45 nm.

In any case, it seems the idea of the GPU as a separate chip will become a thing of the past. The first step would be a GPU on a CPU, much like AMD's Fusion. This should be transitional, though, as both sides begin to pull the functionality into the core itself. Thereafter, discrete GPUs will cease to exist.

Comments

  • GooD Quebec (CAN) Member
    edited February 2007
    I'm surprised that I'm the first one to reply to this thread.

    That's actually pretty good news. These days are pretty sad for us fellow GPU lovers. Video cards are only getting BIGGER, and they make your electricity bill reach new records that you won't be proud of.

    I'm not sure if I like the idea, actually, but at least they are trying something. I guess we will see if this idea comes to life in ~3 years. (Benchmark, benchmark, benchmark :D )
  • Winga Mr South Africa Icrontian
    edited February 2007
    Good... I'm also surprised this item has not garnered the amount of interest I thought it would.
    Maybe it's the source, or maybe the idea is too radical to absorb at this time. If you think about it, it could change the face of graphics as we know it.
    I think this is huge! At least when and if it happens, we can say we told you about it right here on SM :ninja:
  • Sledgehammer70 California Icrontian
    edited February 2007
    TBH I don't think high-performance GPUs will go away anytime soon. These first steps will eliminate the onboard options and then work their way up to the mid-range market. The tech just isn't there yet to remove the need for massively huge gaming systems with massively huge cards.

    ATI/AMD is about to win the award for the most power-hungry monster of all time... It looks interesting, but I'm thinking most users are going to need a new case or a really good water-cooling setup.

    It will be an interesting next 3 years in the GPU industry, with DX10 being a baby to the market and Intel stepping up its approach to the graphics segment.
  • Thrax 🐌 Austin, TX Icrontian
    edited February 2007
    The Inquirer is never wrong. Larrabee will take a breath of life, even if it's only one alpha-silicon card.
  • godzilla525 Western Pennsylvania Member
    edited February 2007
    I remember finding an ancient IBM network card that used Intel's 80188 running at ~8-10 MHz. It was probably used in a box with an 8088 (sans 8087) as the main CPU running at 4.77 MHz. :bigggrin:

    Old idea, new twist.