Intel scientists find wall for Moore's Law
Moore's Law, as chip manufacturers generally refer to it today, is coming to an end, according to a recent research paper.
Granted, that end likely won't come for about two decades, but Intel researchers have recently published a paper theorizing that chipmakers will hit a wall when it comes to shrinking the size of transistors, one of the chief methods for making chips that are smaller, more powerful and cheaper than their predecessors.
Manufacturers will be able to produce chips on the 16-nanometer manufacturing process, expected by conservative estimates to arrive in 2018, and maybe one or two manufacturing processes after that, but that's it.
"This looks like a fundamental limit," said Paolo Gargini, director of technology strategy at Intel and an Intel fellow. The paper, titled "Limits to Binary Logic Switch Scaling--A Gedanken Model," was written by four authors and was published in the Proceedings of the IEEE (Institute of Electrical and Electronics Engineers) in November.
Although it's not unusual for researchers to theorize about the end of transistor scaling, it's an unusual statement for researchers from Intel, and it underscores the difficulties chip designers currently face. The size, energy consumption and performance requirements of today's computers are forcing semiconductor makers to completely rethink how they design their products and are prompting many to pool design with research and development.
Resolving these issues is a major goal for the entire industry. Under Moore's Law, chipmakers can double the number of transistors on a given chip every two years, an exponential growth pattern that has allowed computers to get both cheaper and more powerful at the same time.
Mostly, the trick has been accomplished through shrinking transistors. With shrinkage tapped out, manufacturers will have to find other methods to keep the cycle going.
Catch the full article over @ CNET News.com
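The two-year doubling the article mentions compounds quickly. Here's a minimal back-of-the-envelope sketch in Python (the 2004 baseline year and the 100-million-transistor starting count are assumptions for illustration, not figures from the article):

    # Moore's Law as a simple two-year doubling, starting from an assumed
    # (hypothetical) 100-million-transistor chip in 2004.
    def transistors(year, base_year=2004, base_count=100_000_000):
        doublings = (year - base_year) / 2
        return base_count * 2 ** doublings

    for year in (2004, 2010, 2018):
        print(year, f"{transistors(year):,.0f}")
    # 2004 -> 100,000,000
    # 2010 -> 800,000,000
    # 2018 -> 12,800,000,000

At that cadence, 2018 (the article's conservative date for the 16-nanometer process) works out to roughly 128 times the assumed 2004 count.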
Comments
16nm = ~0.000016mm, or ~6.299e-7 inches. That's a hell of a small transistor, you know that?
To put this in perspective, a SINGLE INFLUENZA VIRUS is ~100nm in size. 16nm transistors would be more than six times smaller than a virus.
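Those conversions check out; here's a quick sanity-check sketch in Python (the 100nm virus figure is the approximation from the comment above):

    # Length conversions for a 16nm transistor; all lengths in metres.
    nm = 1e-9
    transistor = 16 * nm
    virus = 100 * nm                 # approximate influenza virus diameter

    print(transistor / 1e-3)         # -> 1.6e-05 mm (i.e. ~0.000016 mm)
    print(transistor / 0.0254)       # -> ~6.3e-07 inches
    print(virus / transistor)        # -> 6.25, so roughly six times smaller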
Transistors have been in use for what... 30 years now? I bet back in the day, everyone thought we'd be using fully-functional computers small enough to fit inside of cells by the year 2000.
Didn't the same people think that we'd all be using a flying car, there would be no war and food would be ordered from a replicator?
Tea. Earl Grey. Hot.
In all seriousness, they are going to need to find some other type of magical electronic gate technology that can switch fast enough to replace transistors at an affordable price. So far, nobody has found a suitable replacement for the silicon-based transistor that can be mass-produced.
Where's Dr. Daystrom with the invention of Duotronic circuitry when ya need him?
I think diamonds may play a large role in developing new processors. They may even become standard for a while.
Manufacturers still haven't investigated GaAs fabrication to its fullest extent, which could prove an alternative to silicon. GaAs in photovoltaic applications needs only 1/20th of the thickness of crystalline silicon to reach peak energy transfer efficiency (READ: little leakage). I assume the same would apply to computers. But the fact that it's rarer than gold and is a byproduct of smelting aluminum and zinc makes it cost-prohibitive right now.
And there are still several ideas in development that are set to take us well through 2015 as far as speed increases and low-leakage applications are concerned:
Silicon on Insulator
Strained Silicon
Tri-gate transistors
EUV lithography
Quantum computing (Trite, I know)
BBUL packaging
FD-SOI
EBP Lithography
Immersion lithography
And scientists, I'm sure, are looking into replacing nitrided silicon oxide as the main gate material for transistors. Carbon nanotubes are a possibility, and we've talked about diamonds.
But beyond that, even CMOS (not the one on your motherboard) still has many years left in it, down to probably the 90 nanometer process, MAYBE the 65nm process as well. The 65nm process, I think, is where we'll start seeing things like immersion lithography as the middle road between current lithographic techniques and EUV, and the switch from silicon dioxide gate dielectrics to something else (diamonds?). FD-SOI could be big, and IBM's push on carbon nanotubes might heat up.
Great time to be a geek. Future looks healthy too.
By 2016, I'll have gone through maybe three more primary desktop computers, assuming they last as long as my current and past setups have lasted. It took 4 years for my Mac to become noticeably slow (1996-2000), so it isn't like the average user needs to upgrade every year.
The hardware growth seems to always vastly outstrip the software development. Take DX9 for example. I bought a DX9 card basically the same month DX9 came out. It's a year later now, and we're just now seeing DX9-compliant applications that aren't benchmarks. My only non-benchmark DX9 software is the XviD codec I linked myself.
So yeah, I'm betting that your 16nm AMD Phoenix CPUs or 16nm Intel Pentium 7s are going to be plenty fast for a good deal of time.
-drasnor
Your DX9 investment will be worth it. DX9 and DirectX NeXt technologies will be around for the next 5 years. The new user interface of Longhorn will be 3D accelerated through the use of DirectX 9.0 compliant hardware.
'Bout bloody time
People always say that hardware is overdone for the applications, but I think gamers have it the other way around. They always NEED more to maximize the performance of their games at the best settings.
I certainly have several games where I cannot even touch the highest settings, even though I'd like to.
INTEL!!
I've suspected all along that the Pentium 4 was a leadership decision, not an engineer's decision.
Yeah, and when you are done in the holodeck, you just need a little squirt of this stuff to clean up afterwards...
Dexter...
Too bad IT is oversaturated right now (and that doesn't look likely to change much). Now, it's like most grandmothers could run half a corporate network. I guess having Windows able to set up a 200-user domain by point and click doesn't help. I love IT, but it sucks when I look at the market and see people who don't have an eighth of the passion and get jobs because ITT put in a good word.
Sorry.. /rant
EverQuest
Planetside
Need For Speed Underground
Doom 3
Half-Life 2
Battlefield 1942
Medal of Honor
Call of Duty
On max, or near-max, settings they can slow your system right into the ground.
Then add in 4x FSAA and 8x aniso, which it's a crime not to run, and you're screwed.
The power still isn't there yet.
Again, the difference is noticeable when in a tunnel with no other terrain being displayed (It's very very silky smooth), and then outside where the game is more sluggish. Not unplayably sluggish, not even annoyingly sluggish, just noticeably more.
And again, I'm talking 4x FSAA + 8x aniso + 1280x1024 or higher + max details.
They THOUGHT we might be at GEN 4 in 2000, no. Back in the day, Bill Gates, selling DOS out of his car, said he thought anyone would be crazy to need more than 32 MB of RAM, EVER. But my first days were before yours -- not boasting or saying the new outlook is wrong, just a different perspective because of an earlier starting point. I am still comfortable with a known console.
John.
How times have changed. I have more than 640MB in my system and sometimes I wonder if it's enough.
Just pixel response times.
Refresh rates refer ONLY to how quickly the electron gun can repaint the screen on a CRT. If it's an LCD, it's pixel response, or how quickly the liquid crystal cells can change state.
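For a rough sense of how those two numbers interact, here's a minimal sketch in Python (the 25 ms pixel response is an assumed example figure, not a spec for any particular panel):

    # Frame interval implied by a refresh rate, versus an assumed LCD pixel response.
    def frame_interval_ms(refresh_hz):
        return 1000.0 / refresh_hz

    print(frame_interval_ms(60))     # ~16.7 ms between repaints at 60 Hz
    print(frame_interval_ms(85))     # ~11.8 ms at 85 Hz

    # If a panel's pixel response were, say, 25 ms, the pixels couldn't fully
    # settle between frames, which is why fast motion can smear on a slow LCD.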