NVIDIA's chief scientist: Moore's Law is dead, CPUs inefficient

Comments

  • edited May 2010
    I think NVIDIA has a good point about leveraging the power of the GPU. Microsoft's DirectCompute exists precisely to make GPU compute accessible to application developers (a rough sketch of what that looks like is at the end of this comment). Where I disagree: it's not a question of whether to leverage the CPU or the GPU... The GPU is the more specialized component -- use it for everything it's suited to, and use the CPU for everything else, going multi-core as much as possible. (BTW, these are two reasons I think IE9 will be a hit for Microsoft -- it uses the GPU and multiple cores, plus compiled JavaScript, etc.)

    I definitely disagree that the average user just needs a netbook. For example, I think even average users want HD video on their PCs... and 1080p video on an Atom processor without a GPU is not a great experience.
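
    For anyone wondering what "using DirectCompute" looks like in code, here is a minimal, illustrative C++ sketch of dispatching a Direct3D 11 compute shader. It is a sketch only: the kernel, entry point, and thread counts are made up for the example, and buffer/UAV creation and error handling are omitted.

        // Illustrative only: compile a tiny HLSL compute kernel and run it on the GPU.
        #include <d3d11.h>
        #include <d3dcompiler.h>
        #include <cstring>
        #pragma comment(lib, "d3d11.lib")
        #pragma comment(lib, "d3dcompiler.lib")

        int main() {
            ID3D11Device* device = nullptr;
            ID3D11DeviceContext* ctx = nullptr;
            D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                              nullptr, 0, D3D11_SDK_VERSION, &device, nullptr, &ctx);

            // A trivial kernel: double every element of a structured buffer.
            const char* hlsl =
                "RWStructuredBuffer<float> data : register(u0);\n"
                "[numthreads(64, 1, 1)]\n"
                "void main(uint3 id : SV_DispatchThreadID) { data[id.x] *= 2.0f; }\n";

            ID3DBlob* bytecode = nullptr;
            D3DCompile(hlsl, std::strlen(hlsl), nullptr, nullptr, nullptr,
                       "main", "cs_5_0", 0, 0, &bytecode, nullptr);

            ID3D11ComputeShader* cs = nullptr;
            device->CreateComputeShader(bytecode->GetBufferPointer(),
                                        bytecode->GetBufferSize(), nullptr, &cs);

            ctx->CSSetShader(cs, nullptr, 0);
            // ctx->CSSetUnorderedAccessViews(0, 1, &uav, nullptr);  // bind the data buffer here
            ctx->Dispatch(1024 / 64, 1, 1);  // 16 groups x 64 threads = 1024 GPU threads
            return 0;
        }

    The point of the API is that those 1024 "threads" map onto the GPU's wide parallel hardware instead of a handful of CPU cores.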
  • Butters CA Icrontian
    edited May 2010
    I agree with Thrax. A flying train would be quite delicious.
  • Tim Southwest PA Icrontian
    edited May 2010
    I knew years ago that Moore's Law was going to end someday. Doubling the performance of a Pentium II at 366 MHz is one thing; doubling the performance of a new top-end i7 is another. CPUs will keep getting more powerful, just not doubling every 18 months.
  • shwaip bluffin' with my muffin Icrontian
    edited May 2010
    Moore's law isn't about performance.
  • ardichoke Icrontian
    edited May 2010
    Moore's law describes a long-term trend in the history of computing hardware, in which the number of transistors that can be placed inexpensively on an integrated circuit has doubled approximately every two years.[1] It is often incorrectly quoted as a doubling of transistors every 18 months, because David House, an Intel executive, gave that period for the increase in chip performance. The actual period was about 20 months.[2]

    You know... just to be precise.
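
    To put those doubling periods in perspective, here is a quick back-of-the-envelope sketch (plain compounding arithmetic, not measured transistor counts):

        #include <cmath>
        #include <cstdio>

        int main() {
            // Growth factor after `years` if the count doubles every `months` months.
            auto growth = [](double months, double years) {
                return std::pow(2.0, years * 12.0 / months);
            };
            std::printf("24-month doubling over 10 years: %.0fx\n", growth(24, 10)); // 32x
            std::printf("20-month doubling over 10 years: %.0fx\n", growth(20, 10)); // 64x
            std::printf("18-month doubling over 10 years: %.0fx\n", growth(18, 10)); // ~102x
            return 0;
        }

    Quoting 18 months instead of 24 roughly triples the implied gain over a decade, which is why the distinction is worth making.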
  • jpparker88 Lancaster, CA
    edited May 2010
    I was just studying this in my IT program book a couple of days ago -- trippy. I somewhat disagree with the comment that most users just need a netbook, but it's not far from the truth. Yes, people would like to have 1080p video, but how many of them have 1080p-capable monitors or televisions? Working for an internet company, I see people with many different uses for computers, and a fair lot of them just surf or check email. Nobody needs a quad-core uber machine for that.

    Technology will continue to soldier on, but I think the growth will be more in the handheld market, such as smartphones. I mean, my dad's Droid has more computing power and storage space than my first laptop. I think we will begin to see a paradigm shift from people having dedicated computers and phones to people using an all-in-one device for most of their needs and keeping a computer for the things their handheld won't do.
  • Bob
    edited September 2010
    I can understand why Nvidia is so nervous. Fast GPUs are limited by CPUs in gaming, which is, after all, a major market.

    For all practical purposes, real-time interactive 3D rendering is hitting a wall with current CPUs. It's difficult to imagine that situation improving.