ECS Sandy Bridge first look

mertesn I am Bobby Miller Yukon, OK Icrontian
edited January 2011 in Science & Tech

Comments

  • Bandrik Elkhart, IN Icrontian
    edited January 2011
    I can't say I like the name "sandy bridge", but it all seems to be generating quite some excitement among enthusiasts. I'm looking forward to what you can benchmark in the future with these puppies, Nick.
  • Thrax 🐌 Austin, TX Icrontian
    edited January 2011
    Just a codename. /shrug
  • mirage
    edited January 2011
    Bandrik wrote:
    generating quite some excitement among enthusiasts

    Many enthusiasts overclock, and I am not sure yet whether it will be excitement or frustration. Overclocking the Sandy Bridge CPUs seems to be very different due to the highly integrated architecture. These CPUs even have an internal bus (a ring bus) connecting the different components of the CPU to the cache memory. I must buy Phenom II X4's (or previous-generation Core i7 9xx), the last classic CPUs, before AMD also goes the same way :) I think Intel K and AMD Black series are eventually becoming the only way to overclock.

    By the way, there is a rumor that AMD is bringing back the FX series with the Bulldozer CPUs. Given that Sandy Bridge CPUs are just more integrated and a little better in performance, AMD might be catching up with the premium performance level. If this is true, very good news for AMD.
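The multiplier point above can be sketched with quick arithmetic. A minimal sketch, assuming an illustrative ~100 MHz base clock; the multiplier values are examples, not measurements:

```python
# Sandy Bridge ties its ~100 MHz base clock (BCLK) to other subsystems,
# so the unlocked multiplier on K-series parts is the main overclocking
# knob. Core frequency is simply base clock x multiplier.

def core_clock_mhz(bclk_mhz: float, multiplier: int) -> float:
    """Core frequency in MHz = base clock x multiplier."""
    return bclk_mhz * multiplier

stock = core_clock_mhz(100, 34)   # 34x stock multiplier -> 3400 MHz
oc = core_clock_mhz(100, 44)      # raise multiplier to 44x -> 4400 MHz
print(stock, oc)
```

Raising BCLK instead destabilizes the buses that share it, which is why the unlocked multiplier matters so much here.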
  • primesuspect Beepin n' Boopin Detroit, MI Icrontian
    edited January 2011
    How do you guys feel about the DRM stuff included in Sandy Bridge?

    http://www.theinquirer.net/inquirer/news/1934536/intels-sandy-bridge-sucks-hollywood-drm
  • Thrax 🐌 Austin, TX Icrontian
    edited January 2011
    Disappointed, but I suspect it won't last long. It's also totally irrelevant to anyone not using Warner Bros' streaming service.
  • shwaip bluffin' with my muffin Icrontian
    edited January 2011
    mirage wrote:
    Many enthusiasts overclock, and I am not sure yet whether it will be excitement or frustration. Overclocking the Sandy Bridge CPUs seems to be very different due to the highly integrated architecture. These CPUs even have an internal bus (a ring bus) connecting the different components of the CPU to the cache memory. I must buy Phenom II X4's (or previous-generation Core i7 9xx), the last classic CPUs, before AMD also goes the same way :) I think Intel K and AMD Black series are eventually becoming the only way to overclock.

    By the way, there is a rumor that AMD is bringing back the FX series with the Bulldozer CPUs. Given that Sandy Bridge CPUs are just more integrated and a little better in performance, AMD might be catching up with the premium performance level. If this is true, very good news for AMD.

    looks like the K cpus are overclocking quite well. (4.4 on stock cooler)
    http://www.anandtech.com/show/4083/the-sandy-bridge-review-intel-core-i5-2600k-i5-2500k-and-core-i3-2100-tested/3

    also I've seen a few 5ghz results.
  • Snarkasm Madison, WI Icrontian
    edited January 2011
    Don't care about the DRM stuff. Tempest in a teapot. Won't stop people from getting it elsewhere, just enables some to legally get it how they want it faster.

    As far as overclocking, yeah, we don't get to play with modifying the base clock (BCLK) anymore; it's all done with the multiplier and turbo modes. Makes it easier, though - if you want an enthusiast part, buy a K model. The price premiums over the non-K parts are trivial - $23 on a $300 processor.

    Aaaaand Intel's "just a little more integrated" circuit is up to 50% faster than the most recent round of Core 2s, which is ridiculous. This isn't even touching their enthusiast lineup, which will launch later in the year - that's where the hex core and the $1000 processors are coming in. I don't think Intel's relinquishing the performance crown any time soon. AMD will still solidly dominate the mid/low areas, though - the Core i3 series is pretty disappointing, and AMD matches up a lot better in performance and price there.

    One piece of info I'd think Cliff would be interested in - Cliff, Intel's Quick Sync tech is their new hardware-accelerated transcoding functionality, and, uh... it blows everything out of the water in speed. Crushes x86, CUDA, and AMD's GPU transcoding, usually by more than a factor of 2 in speed with comparable results (except for CUDA, which consistently ranked lowest in transcoding quality). Between Cliff's love of quick/easy transcoding and his hate for Intel, what will happen...? I vote implosion.
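The "factor of 2" Quick Sync claim above translates directly into wall-clock time. An illustrative sketch; the frame rates here are assumptions, not benchmark numbers:

```python
# Doubling sustained transcode throughput (frames per second) halves
# the wall-clock time for the same clip, whatever the absolute rates.

def transcode_seconds(frames: int, fps: float) -> float:
    """Wall-clock time to transcode a clip at a sustained frame rate."""
    return frames / fps

clip = 24 * 60 * 120                      # two-hour film at 24 fps
gpu_time = transcode_seconds(clip, 100)   # assumed GPU/CUDA path
qsv_time = transcode_seconds(clip, 200)   # assumed ~2x Quick Sync path
print(gpu_time / qsv_time)
```

So "more than a factor of 2 in speed" means a transcode that took half an hour finishes in under fifteen minutes, which is why it stands out for quick-and-easy transcoding.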
  • UPSLynx :KAPPA: Redwood City, CA Icrontian
    edited January 2011
    The hardware DRM layer sucks. I don't think, initially, it'll really make an impression on most people. Like Thrax said, if you're not using WB streaming service, you're not going to see much of it. Not many people pirate from streams anyways, so whatever.

    What concerns me is what such a hardware layer will lead to. They've opened a proverbial Pandora's box of DRM possibilities, not to mention the exploitation possibilities.

    The last thing I ever want is hard-wired DRM controlling everything. Throwing that layer in a processor could effectively lead to a world where nearly all media is controlled by it.

    That is all hypothetical, of course. It could remain limited to the streaming service and become something that nerds rage over with the launch, but no one ever has any real experiences with it.
  • primesuspect Beepin n' Boopin Detroit, MI Icrontian
    edited January 2011
    A world where all media is controlled by hardware? That's exactly what the RIAA and MPAA want.
  • Snarkasm Madison, WI Icrontian
    edited January 2011
    Yes, and that's not what we have. At least not yet. What we have is a special decoder - like what's in your cable box - that says "you can see this content once you've paid for it." Remember the scrambled porn channels from the '80s you used to see on your TV? Can you elaborate on precisely how this is different from that?

    It's not. When the day comes when the machine runs all of your media through a DRM pass to ensure it's legitimate, I'll not buy its product. When there's just a little backdoor that says "If you have this, I guarantee you'll be able to securely and legally see this particular type of media," I just won't give a damn yet.

    Then again, if people would price content more reasonably, as many have begun to do, piracy becomes less of a concern. What if Photoshop was $100?...
  • fatcat Mizzou Icrontian
    edited January 2011
    Snarkasm wrote:
    When the day comes when the machine runs all of your media through a DRM pass to ensure it's legitimate, I'll not buy its product.

    what I would like to know is, what does this new CPU do that my i7 950 doesn't?
  • Thrax 🐌 Austin, TX Icrontian
    edited January 2011
    It's 20-40% faster in everything.
  • fatcat Mizzou Icrontian
    edited January 2011
    Thrax wrote:
    It's 20-40% faster in everything.

    so I will get 20-40% faster FPS in games?

    so I will get 20-40% faster memory bandwidth?

    so I will get 20-40% faster hard drive throughput?


    or, will just my Folding@home and Adobe Premiere be 20-40% faster?

    playing Tim here ;)

    I think it's great the CPU is faster and more energy efficient, but I hardly ever utilized the full potential of my Q6600 with software.
  • Thrax 🐌 Austin, TX Icrontian
    edited January 2011
    Read Anand's review. The $300 i7-2600K is faster than the Core i7 980X in virtually every test.
  • fatcat Mizzou Icrontian
    edited January 2011
    2600K @ 3.4GHz vs 870 @ 2.93GHz

    Fallout 3, 6fps faster
    Left 4 Dead, 14fps faster
    Far Cry 2, 8fps faster
    Crysis, 7fps faster
    Dragon Age, 0.2fps slower
    Dawn of War 2, 13fps faster
    WoW, 48fps faster

    In Photoshop, 3D rendering, and file compression it owns face.

    Uses more power at idle, 74 watts vs 70 watts
    Uses less power under load, 128 watts vs 138 watts

    interesting...
  • primesuspect Beepin n' Boopin Detroit, MI Icrontian
    edited January 2011
    That's pretty much settled. My next desktop system basically has to be a Sandy Bridge.
  • fatcat Mizzou Icrontian
    edited January 2011
    primesuspect wrote:
    That's pretty much settled. My next desktop system basically has to be a Sandy Bridge.

    by then the Clay|Rock|Cement|Astroturf Bridge will be out.... :tongue:
  • edited January 2011
    shwaip wrote:
    looks like the K cpus are overclocking quite well. (4.4 on stock cooler)
    http://www.anandtech.com/show/4083/the-sandy-bridge-review-intel-core-i5-2600k-i5-2500k-and-core-i3-2100-tested/3

    also I've seen a few 5ghz results.

    It looks like Intel is ready to ramp up the clock speed when necessary.
  • edited January 2011
    fatcat wrote:
    what I would like to know is, what does this new CPU do that my i7 950 doesn't?

    Quick Sync is the only advantage I can see.
  • BuddyJ Dept. of Propaganda OKC Icrontian
    edited January 2011
    Poop on new sockets.

    Yo Intel, I'm real happy for ya and I'mma let you finish, but the Q6600 is the greatest CPU of all time... because a balls old chip still hangs. I remember when upgrading actually gave you a noticeable performance boost. These days it doesn't unless you're working in a very specialized area. It kills me that so little of today's software takes advantage of the power we have available to us with products like the Sandy Bridge chips. Right now, I have ZERO compelling reason to upgrade :(
  • Thrax 🐌 Austin, TX Icrontian
    edited January 2011
    Buddy J wrote:

    Yo Intel, I'm real happy for ya and I'mma let you finish, but the Q6600 is the greatest CPU of all time... because a balls old chip still hangs.

    /me looks at Q6600 vs 2600K in games, looks at quote, looks back at benchmarks, looks back at quote; follows with a confused expression.
  • BuddyJ Dept. of Propaganda OKC Icrontian
    edited January 2011
    What I'm saying is the Q6600 is still a VERY usable processor. It's now 4 years old (well, it will be later this week) and still keeping up with today's games and software. For day-to-day use, most people wouldn't notice the difference.

    CPUs are like crotchrockets these days. Who cares if your top speed is 160 or 180 or 202 mph (besides XS pen0r envy benchmarkers)? Their ability far surpasses what most people use them for.

    My lack of a compelling reason stems from having a Core i7 860. The real-world benefits would be minimal. Now, on the video editing system at my office... I'd be all over that. Too bad our video guy is an Apple fanboy. :(
  • edited January 2011
    Buddy J wrote:
    What I'm saying is the Q6600 is still a VERY usable processor. It's now 4 years old (well, it will be later this week) and still keeping up with today's games and software. For day-to-day use, most people wouldn't notice the difference.

    CPUs are like crotchrockets these days. Who cares if your top speed is 160 or 180 or 202 mph (besides XS pen0r envy benchmarkers)? Their ability far surpasses what most people use them for.

    My lack of a compelling reason stems from having a Core i7 860. The real-world benefits would be minimal. Now, on the video editing system at my office... I'd be all over that. Too bad our video guy is an Apple fanboy. :(

    I agree with what you say in general. But why focus on the Q6600 alone? Any $100 quad-core would be more than enough for any job today. There is about a 15% clock-for-clock performance increase with Sandy Bridge compared to the previous generation, but even Intel is not focusing on raw performance anymore. Their main focus seems to be more and more integration with each generation. Don't be surprised to see even wireless networking integrated into the CPU very soon. I don't think those chips will be called CPUs anymore, but anyway ...

    Let me add this: Low power and integration is the key for mobility, the inevitable trend.
  • BuddyJ Dept. of Propaganda OKC Icrontian
    edited January 2011
    Sure. The Q6600 is just an example...
  • Tushon I'm scared, Coach Alexandria, VA Icrontian
    edited January 2011
    Buddy J wrote:
    What I'm saying is the Q6600 is still a VERY usable processor. It's now 4 years old (well, it will be later this week) and still keeping up with today's games and software. For day-to-day use, most people wouldn't notice the difference.

    CPUs are like crotchrockets these days. Who cares if your top speed is 160 or 180 or 202 mph (besides XS pen0r envy benchmarkers)? Their ability far surpasses what most people use them for.

    My lack of a compelling reason stems from having a Core i7 860. The real-world benefits would be minimal. Now, on the video editing system at my office... I'd be all over that. Too bad our video guy is an Apple fanboy. :(
    This sounds more like a Tim post than fatcat's. The people who care are exactly those you cite, adjectives aside: benchmarkers, enthusiasts, people looking to get every drop of performance out of their system. Some people have systems vastly overpowered for what they use them for, I'm not doubting that. What you are saying is "what's the point in new tech" ... and that is just silly. Research (F@H, cryptography, mapping, etc etc etc) always* benefits from new tech, because faster processors, even in .5GHz increments, mean millions more cycles per year or w/e insane metrics they use.

    *always means nearly always ... I'm sure someone could pull up some weird and obscure example.
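Back-of-the-envelope math for the cycles claim above (a minimal sketch: per core, ignoring turbo and utilization):

```python
# An extra 0.5 GHz is 5e8 extra clock cycles every second. Over a year
# of sustained computation (think F@H clients running 24/7), that
# compounds into an enormous number of additional cycles per core.

SECONDS_PER_YEAR = 365 * 24 * 3600        # 31,536,000 seconds
extra_hz = 0.5e9                          # +0.5 GHz clock bump
extra_cycles = extra_hz * SECONDS_PER_YEAR
print(f"{extra_cycles:.3e}")              # on the order of 1.6e16
```

So "millions" undersells it: a half-gigahertz bump running around the clock is roughly sixteen quadrillion extra cycles per core per year.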
  • BuddyJ Dept. of Propaganda OKC Icrontian
    edited January 2011
    I'm not saying "What's the point in new tech?" I'm saying "Great. We've got new tech. Let's actually use it."
  • shwaip bluffin' with my muffin Icrontian
    edited January 2011
    As someone whose research is often CPU-speed bound, I welcome any innovation in CPU speed, especially when it comes in a more efficient processor (yay, cooler rooms).
  • Tushon I'm scared, Coach Alexandria, VA Icrontian
    edited January 2011
    Tushon wrote:
    Research (F@H, cryptography, mapping, etc etc etc) always* benefits from new tech, because faster processors, even in .5GHz increments, mean millions more cycles per year or w/e insane metrics they use.

    Oh and huge epenis is huge.
  • Thrax 🐌 Austin, TX Icrontian
    edited January 2011
    From what I see, there are plenty of applications and games that I use every day that would be positively impacted by Sandy Bridge. And by positively, I mean 30-40% better than what I have now.
  • edited January 2011
    @Shwaip
    The processors under discussion here are not for technical computing or research but for everyday computing, multimedia, entertainment, and communication. It is a waste of silicon to include an integrated GPU in each of the hundreds of CPUs in a server, not counting the Quick Sync part and all that. With increasing integration, the differences between server, desktop, and mobile processors are increasing, I guess. I believe Fermi-like GPGPUs are the future of technical computing anyway.