I can't say I like the name "sandy bridge", but it all seems to be generating quite some excitement among enthusiasts. I'm looking forward to what you can benchmark in the future with these puppies, Nick.
generating quite some excitement among enthusiasts
Many enthusiasts overclock, and I am not sure about the excitement or frustration yet. Overclocking the Sandy Bridge CPUs seems to be very different due to the highly integrated architecture. These CPUs even have an internal bus (a ring bus) connecting the different components of the CPU to the cache memory. I must buy Phenom II X4's (or previous-generation Core i7 9xx), the last classic CPUs, before AMD also goes the same way. I think the Intel K and AMD Black series are becoming the only way of overclocking eventually.
By the way, there is a rumor that AMD is bringing back the FX series with the Bulldozer CPUs. Given that Sandy Bridge CPUs are just more integrated and a little better in performance, AMD might be catching up at the premium performance level. If this is true, it's very good news for AMD.
Don't care about the DRM stuff. Tempest in a teapot. Won't stop people from getting it elsewhere, just enables some to legally get it how they want it faster.
As far as overclocking, yeah, we don't get to play with the bus frequency or BCLK anymore; it's all done with the multiplier and turbo modes. Makes it easier, though - if you want an enthusiast part, buy a K model. The price premium over the non-K parts is trivial - $23 on a $300 processor.
Aaaaand Intel's "just a little more integrated" circuit is up to 50% faster than the most recent round of Core 2s, which is ridiculous. This isn't even touching their enthusiast lineup, which will launch later in the year - that's where the hex core and the $1000 processors are coming in. I don't think Intel's relinquishing the performance crown any time soon. AMD will still solidly dominate the mid/low areas, though - the Core i3 series is pretty disappointing, and AMD matches up a lot better in performance and price there.
One piece of info I'd think Cliff would be interested in - Cliff, Intel's Quick Sync tech is their new hardware-accelerated transcoding functionality, and, uh... it blows everything out of the water in speed. Crushes x86, CUDA, and AMD's GPU transcoding, usually by more than a factor of 2 in speed with comparable results (except for CUDA, which consistently ranked lowest in transcoding quality). Between Cliff's love of quick/easy transcoding and his hate for Intel, what will happen...? I vote implosion.
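To put "more than a factor of 2" in concrete terms, here is a toy comparison; the encode times below are placeholders I made up for illustration, not benchmark numbers - only the rough "2x-plus" relationship comes from the reviews being discussed.

# Toy speedup comparison for a single transcode job. The timings are
# made-up placeholders; only the ">2x vs. software/GPU" relationship is
# taken from the reviews under discussion.

def speedup(slower_s: float, faster_s: float) -> float:
    """How many times faster the faster path finishes the same clip."""
    return slower_s / faster_s

encode_times_s = {            # placeholder timings for one clip, not measurements
    "x86 software": 240.0,
    "GPU (CUDA/Stream)": 150.0,
    "Quick Sync": 60.0,
}

qs = encode_times_s["Quick Sync"]
for path, secs in encode_times_s.items():
    if path != "Quick Sync":
        print(f"Quick Sync vs {path}: {speedup(secs, qs):.1f}x faster")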
The hardware DRM layer sucks. I don't think, initially, it'll really make an impression on most people. Like Thrax said, if you're not using WB streaming service, you're not going to see much of it. Not many people pirate from streams anyways, so whatever.
What concerns me is what such a hardware layer will lead to. They've opened a proverbial Pandora's box of DRM possibilities, not to mention the exploitation possibilities.
The last thing I ever want is hard-wired DRM controlling everything. Throwing that layer in a processor could effectively lead to a world where nearly all media is controlled by it.
That is all hypothetical, of course. It could remain limited to the streaming service and become something that nerds rage over at launch but that no one ever actually has any real experience with.
Yes, and that's not what we have. At least not yet. What we have is a special decoder - like what's in your cable box - that says "you can see this content once you've paid for it." Remember the scrambled porn channels from the '80s you used to see on your TV? Can you elaborate on how exactly this is different from that?
It's not. When the day comes when the machine runs all of your media through a DRM pass to ensure it's legitimate, I'll not buy its product. When there's just a little backdoor that says "If you have this, I guarantee you'll be able to securely and legally see this particular type of media," I just won't give a damn yet.
Then again, if people would price content more reasonably, as many have begun to do, piracy becomes less of a concern. What if Photoshop was $100?...
Yo Intel, I'm real happy for ya and I'mma let you finish, but the Q6600 is the greatest CPU of all time... because a balls old chip still hangs. I remember when upgrading actually gave you noticeable performance. These days it doesn't unless you're working in a very specialized area. It kills me that so little of today's software takes advantage of the power we have available to us with products like the Sandy Bridge chips. Right now, I have ZERO compelling reason to upgrade
What I'm saying is the Q6600 is still a VERY usable processor. It's now 4 years old (well, it will be later this week) and still keeping up with today's games and software. For day-to-day use, most people wouldn't notice the difference.
CPUs are like crotchrockets these days. Who cares if your top speed is 160 or 180 or 202 mph (besides XS pen0r envy benchmarkers)? Their ability far surpasses what most people use them for.
My lack of a compelling reason stems from having a Core i7 860. The real-world benefits would be minimal. Now, on the video editing system at my office... I'd be all over that. Too bad our video guy is an Apple fanboy.
I agree with what you say in general. But why focus on the Q6600 alone? Any $100 quad-core would be more than enough for any job today. There is about a 15% clock-for-clock performance increase with Sandy Bridge over the previous generation, but even Intel is not focusing on raw performance anymore. Their main focus seems to be more and more integration with the latest generations. Don't be surprised to see even wireless networking integrated into the CPU very soon. I don't think those chips will be called CPUs anymore, but anyway ...
Let me add this: low power and integration are the key to mobility, the inevitable trend.
That crotchrocket post sounds more like a Tim post than fatcat's. The people who care are exactly those you cite, adjectives aside: benchmarkers, enthusiasts, people looking to get every drop of performance out of their systems. Some people have systems vastly overpowered for what they use them for, I'm not doubting that. But what you're saying amounts to "what's the point in new tech?" ... and that is just silly. Research (F@H, cryptography, mapping, etc etc etc) always* benefits from new tech, because faster processors, even in 0.5GHz increments, mean millions more cycles per year, or whatever insane metric they use.
*always means nearly always ... I'm sure someone could pull up some weird and obscure example.
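For a sense of scale on that half-gigahertz bump, here is the back-of-the-envelope cycle count for one core running flat out all year - a rough sketch, not a performance model.

# Extra clock cycles from a +0.5 GHz bump on one core running flat out
# for a year. Ballpark only -- real throughput depends on IPC and the
# workload, not just raw clock.

extra_hz = 0.5e9                      # +0.5 GHz = 5e8 extra cycles per second
seconds_per_year = 365 * 24 * 3600    # ~3.15e7 seconds

extra_cycles_per_year = extra_hz * seconds_per_year
print(f"{extra_cycles_per_year:.2e} extra cycles per core per year")  # ~1.58e+16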
As someone whose research is often cpu-speed bound, I welcome any innovation in cpu speed, especially when it includes a more efficient processor (yay cooler rooms).
From what I see, there are plenty of applications and games that I use every day that would be positively impacted by Sandy Bridge. And by positively, I mean 30-40% better than what I have now.
@Shwaip
The processors in discussion here are not for technical computing or research but for everyday computing, multimedia, entertainment, and communication. It is a waste of silicon to include an integrated GPU in the hundreds of CPUs in a server, not to mention the Quick Sync part and all that. With increasing integration, the differences between server, desktop, and mobile processors are growing, I guess. I believe Fermi-like GPGPUs are the future of technical computing anyway.
http://www.theinquirer.net/inquirer/news/1934536/intels-sandy-bridge-sucks-hollywood-drm
Looks like the K CPUs are overclocking quite well (4.4GHz on the stock cooler).
http://www.anandtech.com/show/4083/the-sandy-bridge-review-intel-core-i5-2600k-i5-2500k-and-core-i3-2100-tested/3
Also, I've seen a few 5GHz results.
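Since BCLK is essentially locked on this platform, those results are basically pure multiplier math. A minimal sketch of that arithmetic, assuming the ~100 MHz base clock and a stock 34x multiplier for the 2600K (the 44x and 50x values just back out the 4.4GHz and 5GHz results above):

# Minimal sketch of Sandy Bridge multiplier-only overclocking math.
# Assumes the ~100 MHz base clock (BCLK); since BCLK barely moves on this
# platform, the unlocked multiplier on K parts does all the work.

BCLK_MHZ = 100.0  # effectively fixed on Sandy Bridge

def core_clock_ghz(multiplier: int, bclk_mhz: float = BCLK_MHZ) -> float:
    """Core frequency = BCLK x multiplier."""
    return bclk_mhz * multiplier / 1000.0

# Stock-ish 2600K multiplier vs. the results people are reporting:
for mult in (34, 44, 50):
    print(f"{mult}x -> {core_clock_ghz(mult):.1f} GHz")
# 34x -> 3.4 GHz, 44x -> 4.4 GHz (stock cooler), 50x -> 5.0 GHz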
what I would like to know is, what does this new CPU do that my i7 950 doesn't?
so I will get 20-40% faster FPS in games?
so I will get 20-40% faster memory bandwidth?
so I will get 20-40% faster hard drive throughput?
or, will just my folding@home and Adobe Premiere be 20-40% faster?
playing Tim here
I think it's great that the CPU is faster and more energy efficient, but I hardly ever utilized the full potential of my Q6600 with the software I run.
Fallout 3, 6fps faster
Left 4 Dead, 14fps faster
Far Cry 2, 8fps faster
Crysis, 7fps faster
Dragon Age, 0.2fps slower
Dawn of War 2, 13fps faster
WoW, 48fps faster
In Photoshop, 3D rendering, and file compression it owns face.
Uses more power at idle: 74 watts vs 70 watts
Uses less power under load: 128 watts vs 138 watts
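One way to read those power numbers: if the new chip finishes the same job noticeably faster while drawing less under load, the energy per job drops by more than the 10-watt gap suggests. A rough sketch, assuming a ~30% speedup as a round illustrative figure rather than a measured result:

# Back-of-the-envelope energy-per-job comparison using the load power
# figures above (138 W old vs 128 W new). The 30% speedup is an assumed
# round number for illustration only.

old_power_w, new_power_w = 138.0, 128.0
job_time_old_s = 100.0           # arbitrary fixed workload
speedup = 1.30                   # assumed ~30% faster
job_time_new_s = job_time_old_s / speedup

energy_old_j = old_power_w * job_time_old_s
energy_new_j = new_power_w * job_time_new_s
savings_pct = (1 - energy_new_j / energy_old_j) * 100
print(f"old: {energy_old_j:.0f} J, new: {energy_new_j:.0f} J ({savings_pct:.0f}% less per job)")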
interesting...
by then the Clay|Rock|Cement|Astroturf Bridge will be out....
It looks like Intel is ready to ramp up the clock speed when necessary.
Quick sync is the only advantage I can see.
Yo Intel, I'm real happy for ya and I'mma let you finish, but the Q6600 is the greatest CPU of all time... because a balls old chip still hangs. I remember when upgrading actually gave you noticeable performance. These days it doesn't unless you're working in a very specialized area. It kills me that so little of today's software takes advantage of the power we have available to us with products like the Sandy Bridge chips. Right now, I have ZERO compelling reason to upgrade
/me looks at Q6600 vs 2600K in games, looks at quote, looks back at benchmarks, looks back at quote; follows with a confused expression.
Oh and huge epenis is huge.