Intel's Larrabee delayed again

primesuspect Beepin n' Boopin Detroit, MI Icrontian
edited December 2009 in Science & Tech

Comments

  • Cliff_Forster Icrontian
    edited December 2009
    And AMD enjoys a higher margin on 5xxx series cards a little longer.
  • Snarkasm Madison, WI Icrontian
    edited December 2009
    Did anybody think Larrabee was even going to remotely compete with the Radeon 5000 series? Come on, Cliff.
  • lordbean Ontario, Canada
    edited December 2009
    Intel thought it was going to. They hoped it would rival Fermi, too.

    Personally, I didn't see it happening. A company with no discrete card experience suddenly jumping into the high end? Yeah right.
  • edited December 2009
    I'm not surprised either. This IS a complex project: Intel is trying something that has never been done before (a graphics card based on x86 cores), with both hardware and software to develop for it.

    I'm thinking that performance is not where they want it to be. Or, if you prefer, performance is not high enough for Larrabee to be competitive at a price that covers its costs.

    Either that, or their drivers suck as usual...
  • primesuspect Beepin n' Boopin Detroit, MI Icrontian
    edited December 2009
    They do have discrete card experience, Lordbean: the i740 debacle, even if it was a joke back in the day.
  • lordbean Ontario, Canada
    edited December 2009
    primesuspect wrote:
    They do have discrete card experience, Lordbean: the i740 debacle, even if it was a joke back in the day.

    Lemme rephrase... a company with no recent discrete card experience.
  • Cliff_Forster Icrontian
    edited December 2009
    Snarkasm wrote:
    Did anybody think Larrabee was even going to remotely compete with the Radeon 5000 series? Come on, Cliff.

    Not me, of course. Intel can't produce a decent graphics chip to solder onto their own chipset.
  • Thrax 🐌 Austin, TX Icrontian
    edited December 2009
    Cliff_Forster wrote:
    Not me, of course. Intel can't produce a decent graphics chip to solder onto their own chipset.

    Not that it matters, given their market share. ;) I don't think it's a matter of can vs. can't, I think it's a matter of can vs. need. Intel is the largest supplier of graphics chips in the world... Why lay out billions for a marginally better IGP?
  • Cliff_Forster Icrontian
    edited December 2009
    My point is that Larrabee has been a hot topic for the geek hype machine. I understand when NVIDIA says, "Here is Fermi, hold on to your shorts, it's going to be awesome," then misses delivery and everyone is still buzzing about how incredible it's going to be; NVIDIA has earned that. In Intel's case, Larrabee is not hype-worthy. Intel has never produced a graphics product that is even okay, much less any good, yet people talk about Larrabee as if it's going to be a game changer just because Intel may have talked about ray tracing or something in a demo. I never understood the hype surrounding Larrabee. There comes a time when you need to either "put up or shut up."
  • lordbean Ontario, Canada
    edited December 2009
    Intel is and will remain a major player in the graphics sector as long as their processors (and therefore chipsets) remain major players. Most office workstations have no need for extra dollars spent on discrete graphics, so by implementing an inexpensive, low-power IGP on their motherboards, Intel automatically sells its graphics technology to a very large segment of the computing industry.

    That being said, Larrabee did nothing to make my geek sense tingle. Ever since I heard the announcement, all I could think was, "Well, they've got their work cut out for them."