Intel going 32nm in 2009

Thrax 🐌 Austin, TX Icrontian
edited February 2009 in Science & Tech

Comments

  • BuddyJ Dept. of Propaganda OKC Icrontian
    edited February 2009
    Harder better faster stronger.

    I like it.
  • edited February 2009
    The technology has improved a lot since the first generation of desktops.

    What can we do with the old stuff? Reject it, recycle it, etc.?

    Anyway, I like it.
  • Leonardo Wake up and smell the glaciers Eagle River, Alaska Icrontian
    edited February 2009
    Having watched the fast march of computer technology, you would think we'd have become accustomed to its high rate of advance. 32nm, already? That merits one of my ever so articulate comments - WOW! Really now, it wasn't that long ago that we were smugly chugging away with our high-speed, 180nm Athlon Thunderbirds. What was the original Pentium, 800nm? It wasn't that long ago, was it?
    What can we do with the old stuff? Reject it, recycle it, etc.?
    You sell it when it's aging, but before it's considered "ancient." When it's ancient, the owner (fortunately not you) then tries to unload it on Craigslist for two or three times its real value, gives up, and gives it to a relative, who then in turn tries to unload it....

    Eventually it becomes part of an earthen dam spanning 11 rivers* in rural Butkrakostan. Some old items, though, may have more specific uses. For example, the US Air Force is experimenting with dropping 21" CRT monitors out of C-130 aircraft. They've had some success demolishing stout mockups of concrete structures.

    * Actually, there is only one river in Butkrakostan. It has 11 different names, owing to the 11 different tribes in that area all making claims to the banks of the river.
  • Komete Member
    edited February 2009
    Interesting. Man, that is a small CPU cooler on that thing. I have to give it to Intel; they are really hammering them out. Also, it's interesting that they are moving away from triple channel and back to dual channel.
  • BuddyJ Dept. of Propaganda OKC Icrontian
    edited February 2009
    Guess they ran out of room to fit a third channel in there.
  • Thrax 🐌 Austin, TX Icrontian
    edited February 2009
    Dual channel benchmarks identically to tri, so I imagine they did it for reasons of cost.
  • Komete Member
    edited February 2009
    Thankfully, they didn't go the route of some really exotic memory like they did with Rambus on the Pentium 4s. Was that ever a disaster. I remember my stepfather wanting to buy some more Rambus memory for his aging PC, and I ended up building him an entirely new, faster system for nearly the same cost.
  • Thrax 🐌 Austin, TX Icrontian
    edited February 2009
    There's no compelling reason to.

    Random fact: When RDRAM was first available in retail, it was more expensive than crack cocaine ounce for ounce.
  • Tushon I'm scared, Coach Alexandria, VA Icrontian
    edited February 2009
    Nice report and awesome random fact!
  • DrLiam British Columbia
    edited February 2009
    Thrax wrote:
    Random fact: When RDRAM was first available in retail, it was more expensive than crack cocaine ounce for ounce.

    I personally wouldn't want to inject either into my computer. :P

    32nm sounds very exciting and I can't wait to see what kind of new advances this will encourage. Thanks for sharing!