
Intel scientists find wall for Moore's Law

edited December 2003 in Science & Tech
Moore's Law, as chip manufacturers generally refer to it today, is coming to an end, according to a recent research paper.

Granted, that end likely won't come for about two decades, but Intel researchers have recently published a paper theorizing that chipmakers will hit a wall when it comes to shrinking the size of transistors, one of the chief methods for making chips that are smaller, more powerful and cheaper than their predecessors.

Manufacturers will be able to produce chips on the 16-nanometer manufacturing process, expected by conservative estimates to arrive in 2018, and maybe one or two manufacturing processes after that, but that's it.

"This looks like a fundamental limit," said Paolo Gargini, director of technology strategy at Intel and an Intel fellow. The paper, titled "Limits to Binary Logic Switch Scaling--A Gedanken Model," was written by four authors and was published in the Proceedings of the IEEE (Institute of Electrical and Electronics Engineers) in November.

Although it's not unusual for researchers to theorize about the end of transistor scaling, it's an unusual statement for researchers from Intel, and it underscores the difficulties chip designers currently face. The size, energy consumption and performance requirements of today's computers are forcing semiconductor makers to completely rethink how they design their products and are prompting many to pool design with research and development.

Resolving these issues is a major goal for the entire industry. Under Moore's Law, chipmakers can double the number of transistors on a given chip every two years, an exponential growth pattern that has allowed computers to get both cheaper and more powerful at the same time.

Mostly, the trick has been accomplished through shrinking transistors. With shrinkage tapped out, manufacturers will have to find other methods to keep the cycle going.

Catch the full article over @ CNET News.com
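
For a rough sense of what that two-year doubling cadence implies, here is a minimal Python sketch. The 2003 starting count (~55 million transistors, roughly a desktop CPU of the day) is an assumed round figure for illustration, not a number from the article:

    # Moore's Law arithmetic: transistor count doubling every two years.
    def transistors(start_count, start_year, year, doubling_period=2.0):
        """Project a transistor count forward, doubling every doubling_period years."""
        return start_count * 2 ** ((year - start_year) / doubling_period)

    start = 55e6  # assumed 2003-era starting point, for illustration only
    for year in (2003, 2008, 2013, 2018):
        print(f"{year}: ~{transistors(start, 2003, year):,.0f} transistors")

Carried out to 2018, the article's conservative date for the 16nm process, that cadence works out to 2^7.5, or roughly 180 times the 2003 count, which is why losing the ability to keep shrinking transistors is such a big deal.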

Comments

  • csimon Acadiana Icrontian
    edited December 2003
    interesting read ...interesting to see what happens in a few years.
  • DogSoldier The heart of radical Amish country..
    edited December 2003
    That's around Judgment Day. SkyNet had already solved the problem of the quantum barrier on its own, but got tired of waiting for humans to figure it out. After weighing the pros and cons (takes SkyNet all of .00000005 seconds)... they trigger Armageddon.
  • Geeky1 University of the Pacific (Stockton, CA, USA)
    edited December 2003
    16 nanometers? Well, when you think about it, it's not really surprising. You have to hit a limit somewhere, and 16 nanometers is a hell of a small limit:

    16nm = ~0.000016mm, or ~6.299e-7 inches. That's a hell of a small transistor, you know that?

    To put this in perspective, a SINGLE INFLUENZA VIRUS is ~100nm in size. 16nm transistors would be more than six times smaller than the virus.
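
    A quick sanity check on those conversions, as a throwaway Python snippet (the constants are just standard unit factors; the ~100nm influenza figure is the one quoted above):

        # Sanity-check the 16nm conversions and the virus comparison.
        NM_PER_MM = 1e6        # 1 mm = 1,000,000 nm
        NM_PER_INCH = 25.4e6   # 1 inch = 25.4 mm = 25,400,000 nm

        transistor_nm = 16
        virus_nm = 100         # approximate influenza virus size, as quoted above

        print(transistor_nm / NM_PER_MM)    # 1.6e-05 mm
        print(transistor_nm / NM_PER_INCH)  # ~6.3e-07 inches
        print(virus_nm / transistor_nm)     # 6.25, i.e. more than six times smaller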
  • Leonardo Wake up and smell the glaciers Eagle River, Alaska Icrontian
    edited December 2003
    2016 - do you really think we'll still be using transistor-based CPUs by then anyway? Seems like there'd be a new technology by then. I'm really not concerned about this.
  • SimGuy Ottawa, Canada
    edited December 2003
    There may be new technology, but will it be affordable?

    Transistors have been in use for what... 30 years now? I bet back in the day, everyone thought we'd be using fully-functional computers small enough to fit inside of cells by the year 2000.

    Didn't the same people think that we'd all be using a flying car, there would be no war and food would be ordered from a replicator?

    Tea. Earl Grey. Hot.

    In all seriousness, they are going to need to find some other type of magical electronic gate technology that can switch fast enough to replace transistors at an affordable price. So far, nobody has found a suitable replacement for the silicon-based transistor that can be mass-produced.

    Where's Dr. Daystrom with the invention of Duotronic circuitry when ya need him? :)
  • a2jfreak Houston, TX Member
    edited December 2003
    SimGuy had this to say
    and food would be ordered from a replicator?

    Tea. Earl Grey. Hot.

    I'd settle for a "babe replicator" or as it's commonly called, a holodeck. :D
  • Templar You first.
    edited December 2003
    Technologies advancing the use of multiple processors could delay the need for smaller transistors long enough to find a new way to build processors.

    I think diamonds may play a large role in developing new processors. They may even become standard for a while.
  • Thrax 🐌 Austin, TX Icrontian
    edited December 2003
    Diamonds are an inevitability for gate logic. They can run hotter than hell, and they're atomically very, very stable.

    Manufacturers still haven't investigated GaAs fabrication to the fullest extent, which could prove an alternative to silicon. GaAs in photovoltaic applications needs only 1/20th of the thickness of crystalline silicon to reach peak energy transfer efficiency (READ: little leakage). I assume the same would apply to computers. But the fact that it's rarer than gold and is a byproduct of smelting aluminum and zinc makes it cost-prohibitive right now.

    And there are still several ideas in development that are set to take us well through 2015 as far as speed increases and low-leakage applications are concerned:

    Silicon on Insulator
    Strained Silicon
    Tri-gate transistors
    EUV lithography
    Quantum computing (Trite, I know)
    BBUL packaging
    FD-SOI
    EBP Lithography
    Immersion lithography

    And scientists, I'm sure, are looking into replacing nitrided silicon oxide as the main gate material for transistors. Carbon nanotubes are a possibility, and we've talked about diamonds.

    But beyond that, even CMOS (not the one on your motherboard) still has many years left in it, down to probably the 90 nanometer process, MAYBE the 65nm process as well. The 65nm process, I think, is where we'll start seeing things like immersion lithography as the middle road between current lithographic techniques and EUV, and the switch from silicon dioxide transistors to something else (diamonds?). FD-SOI could be big, and IBM's push on carbon nanotubes might heat up.

    Great time to be a geek. Future looks healthy too.
  • drasnor Starship Operator Hawthorne, CA Icrontian
    edited December 2003
    The other question of course is whether or not you're going to need that kind of power when it arrives. My primary desktop's CPU, motherboard, and RAM are about 4 years old now, and they weren't cutting edge when I got this machine either. It isn't slow, it just doesn't multitask particularly well between any sort of compression work and anything else. For the average Joe, it'll probably be fine for another few years.

    By 2016, I'll have gone through maybe three more primary desktop computers, assuming they last as long as my current and past setups have lasted. It took 4 years for my Mac to become noticeably slow (1996-2000), so it isn't like the average user needs to upgrade every year.

    The hardware growth seems to always vastly outstrip the software development. Take DX9 for example. I bought a DX9 card basically the same month DX9 came out. It's a year later now, and we're just now seeing DX9-compliant applications that aren't benchmarks. My only non-benchmark DX9 software is the XviD codec I linked myself.

    So yeah, I'm betting that your 16nm AMD Phoenix CPUs or 16nm Intel Pentium 7s are going to be plenty fast for a good deal of time.

    -drasnor :fold:
  • SimGuy Ottawa, Canada
    edited December 2003
    drasnor had this to say
    The hardware growth seems to always vastly outstrip the software development. Take DX9 for example. I bought a DX9 card basically the same month DX9 came out. It's a year later now, and we're just now seeing DX9-compliant applications that aren't benchmarks. My only non-benchmark DX9 software is the XviD codec I linked myself.

    Your DX9 investment will be worth it. DX9 and DirectX NeXt technologies will be around for the next 5 years. The new user interface of Longhorn will be 3D-accelerated through the use of DirectX 9.0-compliant hardware.

    'Bout bloody time :)
  • Thrax 🐌 Austin, TX Icrontian
    edited December 2003
    I know in the games I play I can never find ENOUGH power for all the settings I want to run them at.

    People always say that hardware is overdone for the applications, but I think gamers have it the other way around. They always NEED more to maximize the performance of their games at the best settings.

    I certainly have several games where I cannot even touch the highest settings, even though I'd like to.
  • Josh- Royal Oak, MI
    edited December 2003
    Guys, remember... this is INTEL telling us this information.

    INTEL!!
  • Thrax 🐌 Austin, TX Icrontian
    edited December 2003
    Their engineering team is a lot smarter than the leadership or marketing.

    I've suspected all along that the Pentium 4 was a leadership decision, not an engineer's decision.
  • Dexter Vancouver, BC Canada
    edited December 2003
    a2jfreak had this to say
    I'd settle for a "babe replicator" or as it's commonly called, a holodeck. :D

    Yeah, and when you are done in the holodeck, you just need a little squirt of this stuff to clean up afterwards...

    :D

    Dexter...
  • Templar You first.
    edited December 2003
    Thrax had this to say
    Great time to be a geek. Future looks healthy too.

    Too bad IT is oversaturated right now (and that doesn't look like it will ease up much). Now, it's like most grandmothers could run half a corporate network. I guess Windows being able to set up a 200-user domain by point and click doesn't help. I love IT, but it sucks when I look at the market and see people who don't have an eighth of the passion get jobs because ITT put in a good word.

    Sorry.. /rant :)
  • MachineGunKelly The STICKS, Illinois
    edited December 2003
    My friend Thrax had this to say
    I know in the games I play I can never find ENOUGH power for all the settings I want to run them at.
    And I can remember only wanting framerates high enough so my character wouldn't freeze up in Half-Life. Geez, if I crouched in a corner or got too close to the wall I was stuck there! It ran fine once I upgraded to a Voodoo 2. And it's now practically obsolete for any modern gaming. I can't find anything the boy has that will even begin to tax my 9500 modded 128MB card. At least until I see H/L 2. ;)
  • Thrax 🐌 Austin, TX Icrontian
    edited December 2003
    Guess you haven't played:

    EverQuest
    Planetside
    Need For Speed Underground
    Doom 3
    Half-Life 2
    Battlefield 1942
    Medal of Honor
    Call of Duty

    On max, or near-max settings. They can slow your system right into the ground.

    Then add in 4x FSAA and 8x aniso, which it's a crime not to run, and you're screwed.

    The power still isn't there yet.
  • Geeky1 University of the Pacific (Stockton, CA, USA)
    edited December 2003
    Hey Thrax, there must be something wrong with your computers... I've run NFS: Underground on my LAPTOP (R7500m 64MB) at the maximum detail settings and resolution and the frame rates were never low enough to notice; I don't know what the actual FPS was, tho.
  • Thrax 🐌 Austin, TX Icrontian
    edited December 2003
    No, there's nothing wrong with my computers.

    Again, the difference is noticeable when I'm in a tunnel with no other terrain being displayed (it's very, very silky smooth) and then outside, where the game is more sluggish. Not unplayably sluggish, not even annoyingly sluggish, just noticeably more so.

    And again, I'm talking 4x FSAA + 8x aniso + 1280x1024 or higher + max details.
  • Straight_Man Geeky, in my own way Naples, FL Icrontian
    edited December 2003
    Um, back in the day, computers used vacuum tubes. :D

    They THOUGHT we might be at GEN 4 in 2000, but no. Back in the day, Bill Gates, selling DOS out of his car, said he thought anyone would be crazy to need more than 32 MB of RAM, EVER. But my first days were before yours--not boasting or saying the new outlook is wrong, just a different perspective because of an earlier starting point. I am still comfortable with a known console. :D

    John.
  • a2jfreak Houston, TX Member
    edited December 2003
    32MB? I think it was more like 640KB.

    How times have changed. I have more than 640MB in my system and sometimes I wonder if it's enough. :D
  • Thrax 🐌 Austin, TX Icrontian
    edited December 2003
    He said 640k.
  • Geeky1 University of the Pacific (Stockton, CA, USA)
    edited December 2003
    Thrax... well the AA/Aniso explains it... I don't have either one enabled.
  • Josh- Royal Oak, MI
    edited December 2003
    Your laptop screen must have a hell of a refresh rate on it...
  • SimGuy Ottawa, Canada
    edited December 2003
    Probably 47Hz interlaced :D
  • Geeky1 University of the Pacific (Stockton, CA, USA)
    edited December 2003
    Uh... laptop screens don't have refresh rates...
  • Thrax 🐌 Austin, TX Icrontian
    edited December 2003
    LCDs don't have refresh rates, period.

    Just pixel response times.

    Refresh rates refer ONLY to how quickly the electron gun can paint the screen on a CRT. If it's an LCD, it's pixel response, or how quickly the liquid crystals between the polarized panes can shift.