Hopefully, this current lull is just something like what happened with Intel for a few years. I'd like to see the Bulldozer come in and kick all kinds of butt like the Opteron did, but they need something really stellar to do so.
I also see AMD pushing a lot of their focus onto their ATi counterpart. Maybe they're working toward conditioning the market for an eventual exit from the x86 space. If AMD feels they can't continue to fight Intel, positioning ATi to solidly take the crown from nVidia seems like a viable alternative.
Who knows, though. It's really all speculation until the press releases come out.
For full disclosure, I would describe myself as a decade-long AMD/ATI fanboy. For the sake of argument here, let me limit it to the home user experience and leave the business platform argument for someone a little better versed in that topic than myself. My love for the AMD brand started in 2000 with my purchase of a Slot A Athlon-based system. That moment, in my mind, is when the real two-headed microprocessor design war began between Intel and AMD. (Intel had more or less held a home CPU monopoly for the preceding few years.)
AMD proved something with the Slot A Athlon. They proved it takes more than raw frequency to make a great-performing chip; the overall design, and how that chip is married to the system as a whole, is just as important. Through innovation in system bus and memory architecture, along with better frequency, they manufactured a product that was in each and every way technically superior to the Intel Pentium III of the time, and it took Intel a full manufacturing cycle and a boatload of marketing revenue for three blue men to stay in the game.
Why is this history lesson important? Because it's the exact point in time when real market competition spurred innovation and better pricing for consumers. Both sides played against each other, tapping their resources to build a better chip: faster, cooler, more performance per watt. Without AMD, I honestly think we would all still be paying $1000+ a chip for something that performs like a Pentium 4. It's hard to say for certain what the level of innovation would have been if not for AMD sticking it out as the market underdog, but one thing is for sure: Intel's margins would be much higher, and computers would probably be in far fewer homes, businesses and schools.
Since 2000 AMD has contributed much to the microprocessor market: 64-bit instructions, the first true multi-core design, unlocked multipliers for enthusiast overclocking, and integrated DDR2 memory controllers, just to name a few.
I build systems on the side for friends and customers. Nearly every single one will ask for an Intel-based system due to brand familiarity. When I educate them on the value of AMD's product offering, especially the value of platform computing with AMD's motherboard chipsets, CPUs and ATI graphics all combined in a tidy bundle that plays nice together, they always say yes to AMD, and they report they are thrilled with their systems as they use them.
The problem may be marketing to a degree, or just an uneducated consumer. Either way, AMD/ATI does have the tech, and a solid overall battle plan for bringing users chips at a value that is currently hard to believe. It's never been so inexpensive to build a powerful computer, and we have AMD to thank for that more than any other company in the tech landscape.
Let's look at some AMD/ATI branded products that the market should be very excited about.
Let's start with motherboards. The fact is, there is no better IGP for basic productivity computing or HTPC than the 780G or 790G, which use the ATI HD 3200 or HD 3300 as a way to deliver affordable HD graphics performance to the masses. The ATI IGP is so good you can actually run respectable 3D games on it. Sure, you still need a dedicated card to play the latest and greatest at the highest resolutions, but the 780G/790G chipsets are a huge technical leap forward for IGP.
Then there are inexpensive enthusiast-level CPUs with unlocked multipliers for overclocking. Currently you can buy the 7750 Black Edition dual core for $65, the triple-core Phenom II 720 BE for $145, and the crown jewel 940 BE for only $225. All of these CPUs offer enthusiast end users the kind of tweaking freedom they crave, and when coupled with AMD's in-house OverDrive software, the experience is even better. I am personally using the triple-core 720 BE in my home system, and I have to say, never in my wildest dreams would I have thought that real enthusiast-level performance could be so affordable.
Another strong point for AMD is the ATI brand. I won't argue for or against the price AMD paid for the brand, but what I will say is it positions them with what I believe is the best and most innovative producer of graphics products in the world. History lesson there: remember a little card called the 9700 Pro, released back in late 2002? Remember how it more or less doubled the performance of any competitive product offering from Nvidia at the time? What do you think got Nvidia spinning its wheels to produce better, more innovative products? The 9700 Pro is to the graphics market what the Slot A Athlon was to CPUs. Those product releases shaped the home computing market to create the innovation and pricing we all enjoy now.
That being said, the 48xx series of graphics products is currently the best value, dollar for dollar, on the market. With rebate you can buy a GDDR5-fueled 4870 for about $155 right now. That's right: thanks, AMD/ATI. You have to spend nearly $100 more just to run side by side with it in a comparable Nvidia offering, and frankly, even then the GTX 260 is not the clear performance winner. Not to mention, stripped-down but great-performing 4830s are showing up online for about $75 after rebate. Once again, thanks, AMD/ATI. To perform alongside the 4830 you more or less need a 9800GT at 25% more cash on average, and even then the 4830 is still going to win head to head, pixel for pixel, in most contests. Better tech, less cash; it's a no-brainer.
Also, soon to be released, 48xx graphics in performance notebooks!! What's not exciting about that!!
AMD/ATI have always taken the position of market underdog against two brands that have a great deal of marketed consumer loyalty. Just like consumers overspend on Apple products that lack essential features due to sheer marketing might and brand recognition, people do the same when they consider Intel and Nvidia. They are just the de facto brands that have perhaps marketed a little better. To be fair, yes, Intel gained a performance lead in the last year on the products in the higher pricing range, but if you compare them dollar for dollar, the value winner is still AMD.
What I ask my friends and fellow tech enthusiasts: isn't it worth rewarding a hard-working group of people for providing you with a competitive market that offers everyone a far better value? If the chip still more than suits your needs and costs far less than brand blue, you should spend your hard-earned money there. AMD/ATI has plenty up their sleeve, and the perception that they somehow make products vastly inferior to Intel is not a fair observation when you look at the entire picture. Core 2 Duo vs. the original Phenom? Okay, got us. Core i7 vs. Phenom II? Got us, sort of, but not dollar for dollar when you compare a new system build. A comparable Phenom II based system costs hundreds less and actually offers gamers a better platform, frame for frame, when coupled with Radeon discrete graphics.
So as users, are we saying the only thing we value is the short-term raw performance winner? If that were the case we would all run out and spend $1,300 on the unlocked i7, but that's obviously a really awful value, dollar for dollar, when compared to something like AMD's unlocked Phenom II CPUs. You could spend $529 on an Nvidia GTX 295 (if you can find one), or get a 4870X2, which is a far better value at about $120 less right now.
What I am saying is this: value should still have value, especially right now with the economy as soft as it is. Who do we feel is winning the value race? Who is offering tech consumers more for their hard-earned dollars? I don't see how any educated person could answer anything other than AMD for motherboards and CPUs, and ATI for graphics.
Join me, don't settle for being one of brand blue's lemmings, support a superior value dollar for dollar. Long live AMD.
First of all, thank you for commenting, Cliff. I just knew this op-ed would put a bee in your bonnet.
I have a few quibbles with some things you pointed out:
As I mentioned, ATI has really done a job on the GPU market. Yep, they've brought prices down and brought the performance profile of "budget" up, but your 4870X2 vs. GTX 295 comparison is just unfair. The GeForce GTX 285 is the X2's actual competitor, and it's a full $50 cheaper than the 4870X2 (Newegg). Not only does the 285 frequently provide better performance, its losses could be placed within the margin of error.
You also suggest that people should spring for the Phenom II just to reward a "hard-working group of people." There's no denying that AMD has toiled laboriously, but toil doesn't win my money. The best performance does. I cannot buy into AMD's message with the Phenom II because it struggles to compete against the Kentsfield, an Intel core that is two generations and two years old.
If we're going toe to toe on enthusiast performance, an unlocked multiplier doesn't mean anything. No sane overclocker is going to pony up for the $1000 i7 965 when the Core i7 920 is just $288, or only $60 more than the 940 BE. Did I mention that the chip can clock up to 4GHz without much effort? Add in the Intel X58 premium of about $60 (over comparable AMD 790 boards), and the price tag for a Core i7 system is only $120 more*.
What does that $120 get you? A socket with a future. Guaranteed support for a die shrink to a faster, colder chip. A more modern chipset. Superior system and memory bandwidth. Better FPU performance. ESPECIALLY better multimedia performance.
The point I'm trying to drive home is that the word best isn't all that subjective. If $120 is nothing to you, the Core i7 is a superior CPU and platform. If you're on a budget, then the Phenom II becomes the best. It all depends on your needs.
There's no denying that AMD is trying very, very hard to stay relevant. They're doing the best they can, but their best is not the best the market can do, and that's what the people who generate the buzz want to talk about. Few people want to spend all their time being excited over the guy who is only in the lead when people have to pinch pennies.
*NOTE: I did not include the price of an upgraded heatsink because both CPUs need improved cooling to achieve the clockspeeds that the Phenom's unlocked multiplier or the Core i7's bclock scaling can offer.
//EDIT: PS, I thought the Pentium 4's price system was horse crap. Thank you, AMD, for making the market reasonable.
History lesson there: remember a little card called the 9700 Pro, released back in late 2002? Remember how it more or less doubled the performance of any competitive product offering from Nvidia at the time? What do you think got Nvidia spinning its wheels to produce better, more innovative products? The 9700 Pro is to the graphics market what the Slot A Athlon was to CPUs. Those product releases shaped the home computing market to create the innovation and pricing we all enjoy now.
I remember the 9700 Pro. I remember RMA'ing my first-party All-in-Wonder three times in as many months. I remember waiting 5 years for working drivers for it on Linux. I remember switching back to my old GeForce 3 Ti500 because at least it could manage a playable framerate in Unreal 2004. ATI would have to be *giving* away cards to get me to try their product again.
I owned a dual Opteron 248 workstation with the original Sledgehammer chips. At the time, no one was talking about the possibility of dual-core processors and it seemed NetBurst would never go away. I built NF7-S 2.0s with Athlon XPs for friends and family. At the time, they were the best products on the market.
Today, all of my machines are Intel boxes. When it came time to replace my dual Opteron workstation I discovered I could build an Intel Q6600 machine for a little more than the cost of a replacement motherboard for the Opterons. With the Phenom launch bugs and frequent AMD socket changes I didn't have any real desire to buy into another AMD platform.
The choice was even easier when it came time to replace my dual Athlon MP HTPC/LAN gaming machine. None of the AMD offerings were workable with the thermal and power constraints of my HTPC case. The Core2 Duo E8400 w/ GeForce 9600GT combination I eventually settled on has an idle power footprint of 75W according to my UPS. Top that with AMD/ATI hardware.
What I ask my friends and fellow tech enthusiasts: isn't it worth rewarding a hard-working group of people for providing you with a competitive market that offers everyone a far better value? If the chip still more than suits your needs and costs far less than brand blue, you should spend your hard-earned money there. AMD/ATI has plenty up their sleeve, and the perception that they somehow make products vastly inferior to Intel is not a fair observation when you look at the entire picture...
So as users, are we saying the only thing we value is the short-term raw performance winner?
I want the best platform for my needs that my money can buy. Right now AMD isn't offering that. If I were interested in funding hardworking people providing me with a competitive market I'd still be a Mac user.
Rob, good write-up... I really hope AMD has something up its sleeve that can make it "hip" and be the buzz again.
Cliff, thanks for the well thought out and articulate response. I am still an AMD fanboy, but it becomes harder and harder to stay that way when prices are close enough to make PCs under $600.
In general I think the talk of AMD on Twitter has shifted from "AMD" to other keywords. The same is true with Intel, as you will most likely still get more results from Intel's sub-words as well. But the community and fanbois are there... I know it because I converse with the CPU side all the time but stay away from the GPU side.
Believe me, searching the sub-words is no better:
I have a daily column for the following keywords (as relevant to this discussion):
AMD, ATI, Phenom, Phenom II, Agena, Bulldozer, Shanghai, Intel, Gulftown, Westmere, Nehalem, Core i7, RV790, and a long list of NVIDIA GPU code names.
Believe me when I say that the buzz just isn't like it used to be. It's sort of dead. And it's sad.
But this points to another thing. The "Intel" keyword was alive and well. What's wrong with "AMD" if it's not putting up the volume? Is it a branding problem? Is it the "second place" phenomenon I have described?
I think we are downplaying the cost of an i7-based system a bit when compared to a Phenom II. I am going to take some time, throw some prices together, and see if I can draw a dollar-per-benchmark value comparison for us and share my findings.
On graphics history: to say the 9700 Pro did not force Nvidia to innovate would be simply wrong. Nvidia purchased 3dfx in hopes of creating a one-stop shop for all advanced graphics. What would you be paying now without ATI's interference, and how far would we have come since then? To be fair, ATI's customer service department left a lot to be desired back then, but the merger with AMD has improved them in every aspect: better drivers, better partnerships with board makers, better overall quality. ATI is a better brand because of AMD.
In closing, I leave you all with a simple question: why can you build a powerful machine today for far less than you could have imagined? I think it has more to do with competition than it does with technical advancement. The way I see it, AMD, by sticking out a tough battle as an industry underdog, has protected us from a big blue hardware monopoly that would cost us all in price, performance and innovation.
Like that i7 chip you can get for around $290? Don't forget to thank AMD in some part for that. Come on, you know Intel is not cutting its margin short because they like you.
Look at the software market: operating systems and office software. Everyone on earth knows that a revamped word processor, or an OS that is buggy as sin at each release, should not cost what it does from Microsoft, but that is the cost of a monopoly. Is that what you want for the hardware market too?
I will not argue against the innovation of a platform like the i7, but it takes a serious power user to see the benefit, and even then, I think it's a market condition to just want the brand that more people associate with. Can we agree that for 90+% of the users out there, a good dual core or a reasonably priced Phenom II will be more than enough performance to satisfy them, if not totally knock their socks off? Are we all better off with AMD in the market to protect us from a price monopoly? I think so.
Sure, I completely agree that today's pricing landscape can be directly traced back to the Athlon 64. I can also completely agree that even a lowly dual core is more than sufficient for most users. We can also agree that AMD's evaporation would kill the CPU market.
But all of those change the direction of the discussion. Nobody is saying that AMD hasn't moved mountains in the realms of price and quality... What I am saying is that for all the earth they've moved, they're still coming up short in the very important game of public discussion.
They can shovel inexpensive and/or bargain parts until they're blue in the face, but they're just not being talked about in the way Intel is. It's not just brand loyalty. People want to talk about Intel because they're in the lead, and everyone loves a winner. The enthusiast industry thrives on it.
The blog-loving enthusiast cares about the game benchmarks that the i7 trounces. What's going on in Crysis? World in Conflict? PREY? They don't care about the crusty channel benchmarks that show AMD's performance/watt, transactional or platform latency advantages.
I'm not saying they're tiptoeing back towards the utter irrelevance of their K6-III era, either. They're too big for that now. The Athlon XP, even though it was a big loser in the end, still cost 6-10x less than a Pentium 4. It found a home because so many people couldn't stomach an $800 price discrepancy. But what is $120? It's a pittance to go from #2 to #1. I wish $120 would take me from my Cobalt to a McLaren.
All I am saying is that AMD is fighting a big uphill battle when it comes to public perception. They need a solid win in the CPU market, and their GPU market isn't big enough to pick up the slack. People talking about you is a sure sign of your health, and AMD just looks sick.
I really like the discussion here, but part of me really believes AMD is sitting on the idea of CPU/GPU integration with what's going on. I really feel that they can make a huge dent in that market if they do what I HOPE they're doing: bring it all into a cost-effective package that the mainstream user (not the enthusiast) can purchase on the cheap.
I agree that AMD has some work to do to improve the overall market perception of the brand. I actually think the Intel cross-license legal threats give AMD a huge window of opportunity to gain press and expose Intel's plot to control a monopoly share of the processor market. In the end, AMD is going to show what a scrappy competitor they are and gain some significant market share. (I don't think for a second that AMD is going to lose x86 rights per the agreement.)
The enthusiast market is but a small part of the overall computer user landscape. AMD/ATI has been attempting to market better there, and their message is right; it's just not reaching. Define "enthusiast" in the computer world: that's normally just another word for "gamer", or perhaps "videophile". Where do you get the performance gain for gaming and video? We all know the secret sauce is in the GPU, and the Radeon 48xx line, whether you pair it with an Intel or AMD platform, will deliver similar frame rates in high-res, high-detail gaming. So how much does the CPU really matter to that user, except in raw marketing perception? And ultimately, once you're running the game beyond your monitor's refresh rate, what does it really matter? Radeon parts get you there, so why not build it on a reasonably priced AM2 platform, save some cash, run the same frame rate as a guy who spent hundreds more, and buy some extra games or a bigger monitor: something that provides actual value to the "enthusiast", aka gamer.
The GPU is the real power part now; the CPU is secondary in performance.
I'm an enthusiast, and my primary use for that power is in photo editing. The memory bandwidth, cache size, clock speed, and number of cores matter intensely to me in my work - together they cut an HDR render from 40 minutes to 5. The GPU is great, but it's not the end-all for damn sure.
If your argument here is to recommend AMD for people that won't know any better for their requirements, I won't mind agreeing with you. If, on the other hand, you're asking performance enthusiasts to pay insignificantly less for second-place parts because their maker made the current landscape consumer-friendly, well... to that I simply need to laugh. I'm afraid reasonable, intelligent consumers don't purchase based on past (still profit-driven) benevolence.
I will buy Intel until AMD equals or surpasses their performance or Intel's price becomes a premium I'm entirely unwilling to pay. It's pure, hard economics.
Sorry Cliff, I like that AMD helped bring down prices and spurred Intel into action, but I'm not going to buy an inferior product out of pity for an ailing company, or as an investment in the hope that someday we will get a fully competitive market with no monopolies. At this moment in time, you'd pretty much have to pay me to get on the Spider platform instead of riding the Intel wagon.
I have nothing against AMD and actually I would like very much for them to get back in the game, but I'm not going to do a bailout for them.
Also, that part about the GPU being the power part is nonsense even from a gaming perspective. I'm a power gamer, and let me tell you this: last year I had an Opteron 180 and an 8800GT. I was getting framerate dips in games like The Witcher, Jericho and even The Sims 2. I upgraded my mobo, RAM and CPU so I could use an E7200, but I kept my 8800GT. With the exact same card, the only real difference being the move from Opteron to Core 2 Duo, my hiccups went away and I am one happy gamer.
Sorry, but the CPU is not some secondary piece in the performance puzzle, it's equally important. In fact, I'd say that most rigs today are CPU-limited because GPUs have become so powerful. That makes the CPU ever more important in the equation.
I knew as soon as I said "enthusiast" was just a code word for gamer, someone would come along and say, "I'm a professional photo editor." I was ready for this...
So you can thank AMD for ushering in the 64-bit era so you can use all that RAM; that is the key to high performance in Photoshop and the GIMP.
Moving on to Mochan,
An Opteron 180 on a Socket 939 platform is not comparing current tech. Your point is perhaps valid if you take me 100% literally, but obviously I meant to cross-compare current designs. Take a Phenom II, any Phenom II for that matter (even the $115 710), take that i7 920, run the same 4870 graphics card, run the same timedemos, make sure to turn on the bells and whistles, and run at perhaps at least 1440x900 (probably less, even): you are going to hit the bottleneck at the card, and the framerate is going to be the same. Crossfire that 4870 and you have a whole different ball game on either platform. You're right to a point; I can't take a three-year-old CPU and memory architecture and not choke that monster card. But if we are going to compare fairly recent, comparable tech running on a similar memory platform, the GPU really is the glory part in a good gaming benchmark. Those low-res benchmarks they run against the CPU on hardware websites mean absolutely zero, zip, nothing. I hate that review sites still do that; it's an empty, meaningless benchmark. If you're playing at 800x600, low detail, just to test your frame rate, you're wasting time; it's not a valid benchmark. What people really want to know is how it will game with the details on and in an HD resolution. Real-world performance is what matters, and the Phenom II has plenty of it for far less cash.
If you have seen any of AMD's marketing on the topic, the question is: what will you do with the extra cash? Some new games, speakers, a bigger monitor; it's up to you, but any way you slice it, the combo of a Phenom II with any of those items offers a far better value.
This calls for an article where we bench a 940 BE against an i7 920, with a 4870X2 and a GTX 285, at stock settings and at maximum achievable overclocks. We can see who wins and give a $/FPS or $/whatever metric to show the REAL value.
...Now if only we could get test hardware from blue. Hmm.
Take a Phenom II, any Phenom II for that matter (even the $115 710), take that i7 920, run the same 4870 graphics card, run the same timedemos, make sure to turn on the bells and whistles, and run at perhaps at least 1440x900 (probably less, even): you are going to hit the bottleneck at the card, and the framerate is going to be the same.
Well duh, naturally, because you hit the GPU bottleneck. Of course the framerate would be the same. When you bench a CPU, you make sure there is no GPU bottleneck!
You have to bench a CPU at low res in order to avoid the GPU bottleneck, if high res is what is causing the bottleneck, because otherwise you are not getting the real picture in the Phenom vs. Intel benches. The fact is, 4 years later you may still have the same CPU and decide to upgrade your GPU; that is when the "hidden potential" of the CPU comes out. Assuming their system wasn't CPU-limited (which, again, is the case with the majority of rigs in the Core 2 Duo/Phenom era), people who bought Core 2 Duos are going to have more "hidden potential" than a guy who bought a Phenom. The fact is, the Core 2 Duo is just faster. Plain and simple.
And again, that is a moot point, because even with "just" an 8800GT, any Core 2 Duo or first-gen Phenom is typically the bottleneck. My friend has an 8800GT same as me; the difference in our rigs is our CPU. He has an E8400 and I have an E7200. Guess who's kicking ass in the benches? Again, CPU-limited. Put up my E7200 against a similarly priced Phenom, and take a guess who is more CPU-limited.
I am not too familiar with how the current Phenom II or i7 cores are doing paired with triple-SLI or CrossFire configurations, but again, the CPU is not to be underestimated in the gaming benchmark.
I do not know if the Phenom II is the better value, but even if it were, it's only the better value "now"... how about a few years down the road when GPUs are being upgraded? I bet the Phenom II isn't going to look like such a great value by then.
All I can say is that if all I have to wait is 20 seconds longer when I encode a movie (like once a month at best), or 20ms longer when I open Firefox, then yeah, I will gladly save $150 on a build. I can't afford Intel platforms. We don't all drive high-end cars, for the same reasons.
Intel has brand trust for the masses, and AMD has lost theirs.
Not all Intel rigs are monstrosities. If you can afford a Phenom, you can afford a Core 2 Duo. And the Core 2 Duo generally will overclock more and is faster.
I will heed the suggestion and get a proper account here shortly.
Mochan, the point I want to make is this: if the general term "enthusiast" does apply mostly to PC gamers and the benchmarks we obsess about, then what value does that low-res benchmark to test the CPU-limited frame rate really, honestly have? Are you going to bump down your details and resolution just so you can run a game well past your monitor's refresh rate? I doubt it; you're going to play it with the eye candy you paid for, with your nice graphics card. Sure, there are limits to what a lower-end CPU can feed. I would not buy a Sempron or Celeron to pair with a high-end 3D card, but once you get into a certain arena of the better-performing parts from each brand, the CPU has far less impact than the GPU in the real world.
What will perform better: an i7 920 paired with a current but less capable 4830, or a Phenom II 710 paired with a Radeon 4870? (I will do the math on Newegg shortly and show you.) Spend $65 more on the graphics but nearly $250 less for the total system when you factor in the more expensive CPU, chipset and DDR3, and you still get better results through the GPU. Or factor in the same GPU on each, save $300+ on the total build, run the same frame rate, and use that extra cash to buy something that actually provides you with some real-world value.
Benchmarks run at low res and low detail may give you a warm fuzzy feeling, but they don't provide any serious gamer with any actually usable real-world value.
Now, I respect what was said earlier about the GTX 285 vs. the 4870X2, but if you really want to illustrate this point, pick any high-end card, whatever brand, and keep it the same across platforms. Heck, given the meager benefits of the DDR3 currently available at its loose timings, I would take an older AM2+ board paired with dual-channel DDR2 and still run right next to an i7 paired with triple-channel DDR3, as long as the GPUs are consistent across platforms and we are talking about REAL performance benchmarks. You know, the benchmarks comparing the values in gameplay (no phony timedemos or CPU-specific benchmarks): take real gameplay at a reasonable HD resolution (I might suggest something between 720 and 1080; most people should at least have monitors for 1440x900 by now), run that puppy with the details cranked and some reasonable if not all-out image quality settings (I would say 2xAA and 8xAF are good enough at that resolution), and take the FRAPS average of a few runs per game in a few popular titles. Then let the i7 users swallow the bitter pill when they realize their money might have been better spent on more graphics power than on a new platform.
And once again, to those who want to argue over performance in Photoshop as an enthusiast application: my Phenom II will generally perform as well as your i7 if I just feed it some more delicious, really inexpensive, tightly timed DDR2 on a 64-bit OS. Run 8GB of DDR2 vs. your triple-channel 3GB of DDR3 and you will be better off, because the real Photoshop bottleneck is not raw CPU speed once you add more memory. Now, if you're the kind of guy who can buy the i7, overclock it, and slap 12GB of DDR3 on a board because you swim in cash, good for you. But if you're really balancing performance against price, I can assure you I can find an angle to get at the performance you crave, in whatever application, for less money on a current AMD platform.
I think the next step in the debate is to come up with agreed-on real-world performance benchmarks, and a way to compare them dollar for dollar. I believe that in a real-world, well-thought-out benchmark, when you compare dollar for dollar, AMD is going to beat Intel. Don't just test theoretical bandwidth, or run specifically designed timedemos or benchmark suites; I want to see some innovation in how real-world performance is measured, to provide users with actual meaningful results in how they will use the system. Suggestions?
"what value does that low res benchmark to test the CPU limited frame rate realy honestly have?"
Cliff, the point is that in order to test the performance of two processors, you have to make the processor the bottleneck in the system. The only way to do that is to use low graphics settings so that the graphics card is not the bottleneck.
True, it's not real-world and you won't play your games at a low resolution, but what if you buy a new graphics card 12-18 months down the line? All of a sudden, at the settings you were playing at before, the new graphics card is no longer the bottleneck; your CPU is.
The point of these tests is to say: OK, taking the graphics card out of the equation, which processor performs faster?
12-18 months down the line, with the poorer-performing AMD system you have to go out and buy a new CPU and system. With the Intel system, not so.
For the record, I've always bought AMD (after my first mistake with Cyrix), but as other people have said, until AMD can offer something better, I would buy Intel if I bought a new system right now.
It's fairly easy. Test four system configurations:
Phenom II X4 940 BE + 4870X2
Phenom II X4 940 BE + GTX 285
Core i7 920 + 4870X2
Core i7 920 + GTX 285
Run through a laundry list of hand-made synthetic, productivity, gaming and Windows benchmarks. At the end of each section, you can add up all the framerates or seconds (or whatever the section's test metric is) and divide the total cost of the platform by that sum. That will give you your value. If I have 6 game benchmarks with 500 total FPS on a $900 system (example), that's $1.80 per FPS. Maybe we're doing a productivity suite and the total time for all the tests is 1260 seconds; with the same $900 system, that's about $0.71 per second.
The critical point to all of this is that you must avoid canned benchmarks whenever possible. That means making your own timedemos, running your own FRAPS recordings, making your own productivity scripts when possible, etc.
And then at the end of it all, you repeat all the tests in an overclocked condition and see which one really gives you more value.
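If it helps, here is a minimal sketch of that cost-per-result arithmetic in Python. Every price and score below is a placeholder I made up to show the math, not a measured result:

    # Toy cost-per-result calculator for the proposed value comparison.
    # All platform costs and benchmark totals are placeholders, not real data.

    def dollars_per_unit(platform_cost, section_total):
        """Cost divided by the summed section metric (total FPS, total seconds, etc.)."""
        return platform_cost / section_total

    # Hypothetical full-platform costs (CPU + board + RAM + GPU).
    platform_cost = {
        "Phenom II 940 BE + 4870X2": 900,
        "Core i7 920 + 4870X2": 1020,
    }

    # Hypothetical summed FPS across six game benchmarks.
    game_fps_total = {
        "Phenom II 940 BE + 4870X2": 500,
        "Core i7 920 + 4870X2": 540,
    }

    for name in platform_cost:
        value = dollars_per_unit(platform_cost[name], game_fps_total[name])
        print(f"{name}: ${value:.2f} per FPS")

Run it and the cheaper platform only "wins" if its total framerate holds up, which is exactly the question we are trying to settle with real numbers.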
There is one review at Tom's Hardware that is not exactly but close to what you suggest. Core i7 is unquestionably better than Phenom II in every benchmark except Crysis, since that game is GPU-limited with any CPU. See the link below.
However, I cannot explain why, but I am badly itching to play with a Phenom II. Maybe I am missing my retired Opteron 165. Or maybe I have completely transferred my constantly changing fleet of PCs to the Intel platform lately and I need some diversity.
Cliff, I agree with you on AMD's role in bringing computer technology forward. But I do not have any sentimental attachment to either AMD or Intel. Those are both companies working for profit (not me), and what you have described is healthy competition in the free market. My only concern is Intel's monopoly not AMD's demise.
"The critical point to all of this is that you must avoid canned benchmarks whenever possible. That means making your own timedemos, running your own FRAPS recordings, making your own productivity scripts when possible, etc."
I agree with that statement, and it has to be with repeatable, meaningful, value-added tasks. Don't toss out the worst-case scenario of one really complex job, and don't give it something really simple to do either. If you're a photo guy/gal, or a video editing guy/gal, give me a common, repeatable job: something you do often enough that a time savings really does add value to the experience, not just a benchmark.
Same thing with gaming: I want benchmarks that reflect the way the majority will experience the games right now.
I respect what was said about the potential upgrade path, and the CPU becoming the potential bottleneck in a gaming system 18 months from now, but I could still argue against it in terms of the value presented. I want to know how users extract value here and now.
I'm a gamer, and I do some light standard-def video editing, nothing too heavy, oh, and music encoding too, and I convert a fair number of documents to .pdf, so I feel qualified to comment on those areas. But for our enthusiast friends who use Photoshop each day, or perhaps edit HD video a couple of times a week, what would be a valid, repeatable benchmark, something that adds real day-to-day, task-to-task value? Not the easiest job you ever did, not the toughest one either: something common, repeatable, something you do on a regular basis. That's going to be the best value-added benchmark to measure with.
Synthetics tell us nothing that's really tangible in today's computing environment. I want to see real-world results, and I want to see them in tasks that users are likely to repeat and get some real value from. Not a benchmark for a benchmark's sake; I think there is too much of that going on already, and its only purpose is to market product. I think the innovator among benchmark review sites is going to be the one that comes up with benchmarks that reflect real users' situations in the here and now, and applies a real cost-to-performance comparison against them. That's what review sites should really be doing.
Could be an opportunity for Icrontic: challenge the paradigm of the hardware review establishment. I think when that happens, AMD's market perception will improve.
A really great way to bench Photoshop is to take a very LARGE image (think 98 megapixel), and then make a Photoshop action that runs through a series of very intense transformations in sequence. Transformations like blurs and rotations require real FPU horsepower, and are a great test for a CPU's prowess. You chain these things together and then stopwatch the results. Run the action 3-5 times for an average score, and you have yourself a benchmark!
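I don't have an exact Photoshop action to hand you, but here is a rough Python/Pillow stand-in for the same idea: chain a few FPU-heavy transformations on a big image, stopwatch each pass, and average a few runs. The image size, filter radii and run count are arbitrary choices, not a prescribed workload:

    # Rough stand-in for a chained-transformations image benchmark.
    # Requires Pillow (pip install Pillow). Sizes and filters are arbitrary.
    import time
    from PIL import Image, ImageFilter

    def run_pass(img):
        """One pass of blur/rotate/sharpen work, the kind of grind a Photoshop action would do."""
        out = img.filter(ImageFilter.GaussianBlur(radius=8))
        out = out.rotate(37, expand=True)
        out = out.filter(ImageFilter.UnsharpMask(radius=4, percent=200))
        out = out.rotate(-37, expand=True)
        return out

    # Synthetic large image; swap in a real high-megapixel photo if you have one.
    img = Image.new("RGB", (6000, 4000), (128, 128, 128))

    times = []
    for i in range(3):  # 3-5 runs, then average
        start = time.perf_counter()
        run_pass(img)
        times.append(time.perf_counter() - start)
        print(f"Run {i + 1}: {times[-1]:.1f}s")

    print(f"Average: {sum(times) / len(times):.1f}s")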
Media encoding is pretty easy too. The x264 benchmark is a two-pass encode of a real HD video clip that's repeated 3 times and averaged. It's not synthetic at all. You could go one step further and set up your own frameserving to encode a DVD that you've ripped yourself (this is what I'll be doing when we test Windows 7 RC1 against all current Microsoft OSes/service packs).
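For the encoding side, the repeat-and-average harness is just a stopwatch around the encoder. Here is a hedged sketch; the x264 command line is an assumption for illustration, so substitute whatever encoder, flags and clip you actually use:

    # Time an external encode N times and report the average wall-clock time.
    # The command below is a placeholder invocation, not a required benchmark setup.
    import subprocess
    import time

    cmd = ["x264", "--preset", "slow", "-o", "out.264", "input.y4m"]  # assumed CLI
    runs = 3
    times = []

    for i in range(runs):
        start = time.perf_counter()
        subprocess.run(cmd, check=True, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
        times.append(time.perf_counter() - start)
        print(f"Run {i + 1}: {times[-1]:.1f}s")

    print(f"Average over {runs} runs: {sum(times) / len(times):.1f}s")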
I've been an AMD fan for a while, but not so blind that this article didn't strike a chord or three with me.
I think AMD's done a decent job of offering their customers (which, according to AMD, are the OEMs and ISVs, not the end user, sadly) reasonably long socket life, reasonable performance per dollar, and so on. I think it's pretty cool I can honestly purchase a Phenom II and put it into my older SB600-based mainboard and it will work (thank you, Gigabyte) after a BIOS update. Not just a theoretical upgrade, but an actual supported upgrade for a three-year-old motherboard no longer produced.
I'm disappointed with the fact that many of AMD's technologies don't really have an impact on my day-to-day performance, though. With the lack of I/O contention and all of the memory bandwidth available from using HyperTransport and having an on-die memory controller, my CPU's compute power is still the major bottleneck on my home system. My SATA performance isn't particularly great, because I cannot use AHCI and gain the benefits of NCQ for my file operations.
With a GTX 260 and 6GB of DDR2, my 3GHz dual-core is the part that keeps me from running my games with maximum settings at my modest resolution (1440x900). When I picked up the GTX to replace my Radeon, I originally planned on purchasing a Phenom II 720 as well. It's supported on my mainboard, and I already have the correct BIOS installed. I just couldn't bring myself to do it.

With AMD's continued issues with NCQ in their chipsets, I could see myself at some point replacing my AMD-based board with an nVidia-based unit. But, and here's the dark shame I hide from my girlfriend, if I'm going to replace the mainboard, I'm not sure I want to continue on with an AMD solution. Even though I personally despise Intel's "tick-tock" strategy (it seems to be nothing more than a marketing tool to force computing hardware into early retirement), there has been some good gained from it, and I'm beginning to worry about AMD's long-term viability as a competitive CPU manufacturer. The performance of the Nehalem platform is too good to ignore, and I'm beginning to feel a little constrained by my platform's lack of oomph. I don't doubt that the Phenom II would be a considerable upgrade over my K8. But if I need to go with a different motherboard so as to get a chipset that properly supports AHCI, do I really want to limit myself to a second-tier performer? I don't know. It's a hard time to be an AMD fan, that's for sure.
If you got ten thousand computer users in a room, how many of them would find any value in editing a 98-megapixel raw image file? Come on, that's not a real-world way to test actual usable performance.
As far as the Tom's post goes, I can find a few other benchmark sites that prove otherwise on the gaming benchmarks. I am going to come right out and say it: those numbers are cooked.
Comments
I also see AMD pushing a lot of their focus on their ATi counterpart. Maybe they're working toward conditioning the market for an eventual halt in x86 space. If AMD feels they can't continue to fight Intel, positioning ATi to solidly take the crown from nVidia seems like a viable alternative.
Who knows, though. It's really all speculation until the press releases come out.
For full disclosure, I would describe myself as a decade long AMD/ATI fanboy. For the sake of argument here, let me limit it to the home user experience, and leave the business platform argument for someone a little better versed in that topic than myself. My love for the AMD brand started in 2000 with my purchase of a slot A Athlon based system. That moment in time, in my mind is when the real two headed microprocessor design war began between brands Intel and AMD. (Intel more or less having a home CPU monopoly preceding few years)
AMD proved something with the slot A Athlon. They proved it took more than raw frequency to make a great performing chip. The overall design, how that chip is married to the system as a whole is just as important, and through innovation on system bus and memory architecture, along with better frequency they had manufactured a product that was in each and every way, technically superior to the Intel Pentium III at the time, and it took Intel a full manufacturing cycle, and a boat load of marketing revenue for three blue men to stay in the game.
Why is this history lesson important? Its because its the exact point in time that real market competition spurred innovation and better pricing for consumers. Both sides playing against each other, to tap thier resource to build a better chip, faster, cooler, more performance per watt. Without AMD, I honestly think we are all still paying $1000+ a chip for something that performs like a Pentium 4. Hard to say for certain what the level of innovation would have been like if not for AMD sticking it out as the market underdog, but one thing is for sure, Intel's margins would be much higher, and computers would probably be in far fewer homes, businesses and schools.
Since 2000 AMD has contributed much to the microprocessor market. 64 bit instructions, first true multi-core design, unlocked multipliers for enthusiast overclocking, Integrated DDR2 controllers just to name a few.
I build systems on the side for friends and customers. Nearly every single one will ask for an Intel based system due to brand familiarity. When I educate them on the value of AMD's product offering, especially the value of platform computing with AMD's motherboard chipsets, CPU's and ATI graphics all combined in a tidy bundle that plays nice together, they always say yes to AMD, and report they are thrilled with thier systems as they use them.
The problem may be marketing to a degree, or just a uneducated consumer, either way, AMD/ATI does have the tech, and a solid overall battle plan for bringing users chips at a value that is currently hard to believe. Its never been so inexpensive to build a powerful computer, and we have AMD to thank more than any other company in the tech landscape.
Lets look at some AMD/ATI branded products that the market should be very excited about.
Lets start with motherboards. Fact is, there is no better IGP for basic productivity computing or HTPC than the 780G or 790G that uses either the ATI HD3200 or HD3300 as way to deliver affordable HD graphics performance to the masses. The ATI IGP is so good you can actually run respectable 3D games on it. Sure, you still need a dedicated card to play the latest and greatest at the highest resolutions, but the 780/790G chipsets are a huge technical leap forward for IGP.
Inexpensive enthusiast level CPU's with unlocked multipliers for overclocking. Currently you can buy the 7750 Black edition dual core for $65, The triple core Phenom II 720 BE for $145, and the crown jewel 940 BE for only $225, All of these CPU's offer enthusiast end users the kind of tweaking freedom they crave when coupled with AMD's in house developed overdrive software the experience is even better. I am personally using the triple core 720 BE in my home system, and I have to say, never in my wildest dreams would I have ever thought that real enthusiast level performance be so affordable.
Another strong point for AMD is the ATI brand. I won't argue for or against the price of the brand, but what I will say is it positions them with who I believe is the best and most innovative producer of graphics products in the world. History lesson there, remember a little card called the 9700 pro released back in late 2002? Remember how it more or less doubled the performance of any competitive product offering from Nvidia at the time? What do you think got Nvidia spinning its wheels to produce better more innovative products? Its that moment, the 9700 pro is to the graphics market what the slot A Athlon was to CPU's. Those product releases shaped the home computing market to create the innovation and pricing we all enjoy now.
That being said, the 48xx series of graphics products is currently the best value dollar for dollar on the market. With rebate you can buy a GDDR5 fueled 4870 for about $155 right now, that's right, thanks AMD/ATI. You have to spend nearly $100 more just to run side to side with it in a comparable Nvidia offering, and frankly, even then the GTX260 is not the clear performance winner. Not to mention, striped down, but great performing 4830's are showing up on-line for about $75 after rebate, once again, thanks AMD/ATI. To perform along side the 4830 you more or less need a 9800GT at 25% more cash on average, and then the 4830 is still going to win head to head, pixel for pixel in most contests, so better tech, less cash, its a no brainier.
Also, soon to be released, 48xx graphics in performance notebooks!! What's not exciting about that!!
AMD/ATI, have always taken the position as the market underdog against two brands that have a great deal of marketed consumer loyalty. Just like consumers overspend on Apple products that lack essential features due to sheer marketing might and brand recognition, people do the same when they consider Intel and Nvidia. They are just the defacto brands that have perhaps marketed a little better, and to be fair, yes, Intel gained a performance lead in the last year on the products in the higher pricing range, but if you still compare them dollar for dollar, the value winner is AMD.
What I ask my friends and fellow tech enthusiasts, isn't it worth rewarding a hard working group of people for providing you with a competitive market that offers everyone a far better value? If the chip still more than suits your needs and costs far less than brand blue, you should spend your hard earned there. AMD/ATI has plenty up thier sleeve, the perception that they somehow make products that are vastly inferior to Intel is not a fair observation looking at the entire picture. Core2 Duo vs. the original Phenom, okay, got us, Core i7 vs. Phenom II, got us, sort of, but not dollar for dollar when you compare a new system build. A comparable Phenom II based system costs hundreds less and actually offers gamers a better platform frame for frame when coupled with Radeon discrete graphics.
So as users, are we saying the only thing we value is the short term raw performance winner? If that were the case we would all run out, spend $1,300 on the unlocked i7, but that's obviously a realy awful value dollar for dollar when compared to something like the AMD Phenom II unlocked CPU's. You could spend $529 on a Nvidia GTX295 (if you can find one), or get a 4870X2 which is a far better value at about $120 less right now.
What I am saying is this, value, should still have value, especially right now with the economy soft as it is. Who do we feel is winning the value race? Who is offering tech consumers more for thier hard earned dollars? I don't see how any educated person could answer anything other than AMD for motherboards, and CPU's, and ATI for graphics.
Join me, don't settle for being one of brand blue's lemmings, support a superior value dollar for dollar. Long live AMD.
I have a few quibbles with some things you pointed out:
As I mentioned, the ATI has really done a job on the GPU market. Yep, they've brought prices down and brought the performance profile of "budget" up, but your 4870X2 vs. 295 comparison is just unfair. The GeForce GTX 285 is the X2's actual competitor, and it's a full $50 cheaper than the 4870X2 (Newegg). Not only does the 285 frequently provide better performance, its losses could be placed within the margin of error.
You also suggest that people should spring for the Phenom II just to reward a "hard working group of people." There's no denying that AMD has toiled laboriously, but toil doesn't win my money. The best performance does. I cannot buy into AMD's message with the Phenom II because it struggles to compete against the Kentsfield, an Intel core that is two generations and two years old.
If we're going toe to toe on enthusiast performance, an unclocked multiplier doesn't mean anything. No sane overclocker is going to pony for the $1000 i7 965 when the Core i7 920 is just $288, or only $60 more than the 940 BE. Did I mention that the chip can clock up to 4GHz without much effort? Add in the Intel X58 premium of about $60 (over comparable AMD 790 boards), and the pricetag for a Core i7 system is only $120 more*.
What does that $120 get you? A socket with a future. Guaranteed support for a die shrink to a faster, colder chip. A more modern chipset. Superior system and memory bandwidth. Better FPU performance. ESPECIALLY better multimedia performance.
The point I'm trying to drive home is that the word best isn't all that subjective. If $120 is nothing to you, the Core i7 is a superior CPU and platform. If you're on a budget, then the Phenom II becomes the best. It all depends on your needs.
There's no denying that AMD is trying very, very hard to stay relevant. They're doing the best they can, but their best is not the best the market can do, and that's what the people who generate the buzz want to talk about. Few people wants to spend all their time being excited over the guy who is only in the lead when people have to pinch pennies.
*NOTE: I did not include the price of an upgraded heatsink because both CPUs need improved cooling to achieve the clockspeeds that the Phenom's unlocked multiplier or the Core i7's bclock scaling can offer.
//EDIT: PS, I thought the Pentium 4's price system was horse crap. Thank you, AMD, for making the market reasonable.
I have owned a dual Opteron 248 workstation with the original Sledgehammer chips. At the time, no one was talking about the possibility of dual-core processors and it seemed Netburst would never go away. I built NF7-S 2.0's with Athlon XP's for friends and family. At the time, they were the best products on the market.
Today, all of my machines are Intel boxes. When it came time to replace my dual Opteron workstation I discovered I could build an Intel Q6600 machine for a little more than the cost of a replacement motherboard for the Opterons. With the Phenom launch bugs and frequent AMD socket changes I didn't have any real desire to buy into another AMD platform.
The choice was even easier when it came time to replace my dual Athlon MP HTPC/LAN gaming machine. None of the AMD offerings were workable with the thermal and power constraints of my HTPC case. The Core2 Duo E8400 w/ GeForce 9600GT combination I eventually settled on has an idle power footprint of 75W according to my UPS. Top that with AMD/ATI hardware.
I want the best platform for my needs that my money can buy. Right now AMD isn't offering that. If I were interested in funding hardworking people providing me with a competitive market I'd still be a Mac user.
-drasnor
Cliff, thanks for the well thought out and articulate response. I am still an AMD fanboy but it becomes harder and hard to stay that way when prices are close enough to make PC's under $600.
In general I think the talk of AMD in twitter has shifted from "AMD" to other keywords. The same is true with Intel as you will most likely still get more results from intels sub words also. but the community and fanboi's are there... I know it as I converse witht he CPU side all the time but stay away from the GPU side
NV For Life
I have a daily column for the following keywords (as relevant to this discussion):
AMD, ATI, Phenom, Phenom II, Agena, Bulldozer, Shanghai, Intel, Gulftown, Westmere, Nehalem, Core i7, RV790, and a long list of NVIDIA GPU code names.
Believe me when I say that the buzz just isn't like it used to be. It's sort of dead. And it's sad.
But this points to another thing. The "Intel" was alive and well. What's wrong with "AMD" if it's not putting up the volume? Is it a branding problem? Is it the "second place" phenomenon I have described?
I think we are down playing the cost of an i7 based system a bit when compared to a Phenom II. I am going to take some time, throw some prices together, see if I can draw a dollar per benchmark value comparison for us and share my findings.
In graphics, history to say the 9700 pro did not force Nvidia to innovate would be simply wrong. Nvidia purchased 3DFX in hopes of creating a one stop shop for all advanced graphics. What would you be paying now without ATI's interference, how far would we have come since then? To be fair, ATI's customer service dept left alot to be desired back then, but the merger with AMD has improved them in every aspect, better drivers, better partnerships with board makers, better overall quality, ATI is a better brand because of AMD.
In closing I leave you all with a simple question? Why can you build a powerful machine for far less today than you could imagine? I think more of it has more to do with competition than it does with technical advancement. The way I see it, AMD by sticking out a tough battle as an industry underdog has protected us from a big blue hardware monopoly, that would cost us all in price, performance and innovation.
Like that i7 chip you can get for around $290? Don't forget to thank AMD in some part for that. Come on, you know Intel is not cutting its margin short because they like you.
Look the software market, operating systems and office software. Everyone on earth knows that a revamped word processor, or an OS that is buggy as sin at each release should not cost what it does from Microsoft, but that is the cost of a monopoly. Is that what you want for the hardware market too?
I will not argue against the innovation of a platform like the i7, but it takes a serious power user to see the benefit, and even then, I think its a market condition to just want the brand that more people associate with. Can we agree that for 90+% of the users out there, a good dual core, or reasonably priced Phenom II will be more than enough performance to satisfy them if not totally knock thier socks off? Are we are all better off with AMD in the market to protect us from a price monopoly? I think so.
But both all of those change the direction of the discussion. Nobody is saying that AMD hasn't moved mountains in the realms of price and quality... What I am saying is that for all the earth they've moved, they're still coming up short in the very important game of public discussion.
They can shovel inexpensive and/or bargain parts until they're blue in the face, but they're just not being talked about in the way Intel is. It's not just brand loyalty. People want to talk about Intel because they're in the lead, and everyone loves a winner. The enthusiast industry thrives on it.
The blog-loving enthusiast cares about the game benchmarks that the i7 trounces. What's going on in Crysis? World in Conflict? PREY? They don't care about the crusty channel benchmarks that show AMD's performance/watt, transactional or platform latency advantages.
I'm not saying they're tiptoeing back towards the utter irrelevance of their K6-III era, either. They're too big for that now. The Athlon XP, even though it was a big loser in the end, was still 6-10x less than a Pentium 4. It found a home because so many people couldn't stomach an $800 price discrepancy. But what is $120? It's a pittance to go from #2 to #1. I wish $120 would take me from my Cobalt to a McLaren.
All I am saying is that AMD is fighting a big uphill battle when it comes to public perception. They need a solid win in the CPU market, and their GPU market isn't big enough to pick up the slack. People talking about you is a sure sign of your health, and AMD just looks sick.
I agree that AMD has some work to do to improve the overall market perception of the brand. I actually think the Intel cross-license legal threats give AMD a huge window of opportunity to gain press and expose Intel's plot to control a monopoly share of the processor market. In the end, AMD is going to show what a scrappy competitor they are and gain some significant market share. (I don't think for a second that AMD is going to lose its x86 rights per the agreement.)
The enthusiast market is but a small part of the overall computer user landscape. AMD/ATI has been attempting to market better there, and their message is right; it's just not reaching. Define "enthusiast" in the computer world: that's normally just another word for "gamer", or perhaps "videophile". Where do you get the performance gain for gaming and video? We all know the secret sauce is in the GPU, and the Radeon 48xx line, whether you pair it with an Intel or AMD platform, will deliver similar frame rates in high-res, high-detail gaming. So how much does the CPU really matter to that user, except in raw marketing perception? And ultimately, once you're running the game beyond your monitor's refresh rate, what does it really matter? Radeon parts get you there, so why not build on a reasonably priced AM2 platform, save some cash, run the same frame rate as a guy who spent hundreds more, and buy some extra games or a bigger monitor, something that provides actual value to the "enthusiast", aka gamer?
The GPU is the real power part now, the CPU is secondary in performance.
If your argument here is to recommend AMD for people that won't know any better for their requirements, I won't mind agreeing with you. If, on the other hand, you're asking performance enthusiasts to pay insignificantly less for second-place parts because their maker made the current landscape consumer-friendly, well... to that I simply need to laugh. I'm afraid reasonable, intelligent consumers don't purchase based on past (still profit-driven) benevolence.
I will buy Intel until AMD equals or surpasses their performance or Intel's price becomes a premium I'm entirely unwilling to pay. It's pure, hard economics.
I have nothing against AMD and actually I would like very much for them to get back in the game, but I'm not going to do a bailout for them.
Sorry, but the CPU is not some secondary piece in the performance puzzle, it's equally important. In fact, I'd say that most rigs today are CPU-limited because GPUs have become so powerful. That makes the CPU ever more important in the equation.
I knew as soon as I said "enthusiast" was just a code word for "gamer", someone would come along and say, "I'm a professional photo editor." I was ready for this...
So you can thank AMD for ushering in the 64-bit era so you can use all that RAM, which is the key to high performance in Photoshop and the GIMP.
Moving on to Mochan,
An Opteron 180 on a Socket 939 platform is not comparing current tech. Your point would perhaps be valid if you took me 100% literally, but obviously I meant to compare current designs. Take a Phenom II, any Phenom II for that matter (even the $115 710), take that i7 920, run the same 4870 graphics card, run the same timedemos, make sure to turn on the bells and whistles, and run at perhaps at least 1440x900 (probably even less): you are going to hit the bottleneck at the card, and the frame rate is going to be the same. Crossfire that 4870 and you have a whole different ball game on either platform. You're right to a point; I can't take a three-year-old CPU and memory architecture and not choke that monster card. But if we are going to compare fairly recent, comparable tech running on a similar memory platform, the GPU really is the glory part in a good gaming benchmark. Those low-res benchmarks hardware websites run against the CPU mean absolutely zero, zip, nothing. I hate that review sites still do that; it's an empty, meaningless benchmark. If you're playing at 800x600 in low detail just to test your frame rate, you're wasting time; it's not a valid benchmark. What people really want to know is how it will game with the details on and at an HD resolution. Real-world performance is what matters, and the Phenom II has plenty of it for far less cash.
If you have seen any of AMD's marketing on the topic, the question is: what will you do with the extra cash? Some new games, speakers, a bigger monitor; it's up to you, but any way you slice it, the combo of a Phenom II with any of those items offers a far better value.
...Now if only we could get test hardware from blue. Hmm.
Well duh, naturally, because you hit the GPU bottleneck. Of course the framerate would be the same. When you bench a CPU, you make sure there is no GPU bottleneck!
You have to bench a CPU at low res in order to avoid the GPU bottleneck if high res is what is causing the bottleneck, because otherwise you are not getting the real picture in the Phenom vs. Intel benches. The fact is, four years later you may still have the same CPU and decide to upgrade your GPU; that is when the "hidden potential" of the CPU comes out. Assuming their system wasn't CPU-limited (which, again, is the case with the majority of rigs in the Core 2 Duo/Phenom era), people who bought Core 2 Duos are going to have more "hidden potential" than a guy who bought a Phenom. The fact is, the Core 2 Duo is just faster. Plain and simple.
And again, that is a moot point, because even with "just" an 8800GT, any Core 2 Duo or first-gen Phenom is typically the bottleneck. My friend has an 8800GT same as me; the difference in our rigs is our CPU. He has an E8400 and I have an E7200. Guess who's kicking ass in the benches? Again, CPU-limited. Put my E7200 up against a similarly priced Phenom, and take a guess who is more CPU-limited.
I am not too familiar with how the current Phenom II or i7 cores are doing paired with triple-SLI or CrossFire configurations, but again, the CPU is not to be underestimated in the gaming benchmark.
I do not know if the Phenom II is the better value, but even if it were, it's only the better value "now"... how about a few years down the road when GPUs are being upgraded? I bet the Phenom II isn't going to look like such a great value by then.
Intel has brand trust for the masses, and AMD has lost theirs.
Mochan, the point I want to make is this: if the general term "enthusiast" does apply mostly to PC gamers, and the benchmarks we obsess about, then what value does that low-res, CPU-limited frame rate benchmark really, honestly have? Are you going to bump down your details and resolution just so you can run a game well past your monitor's refresh rate? I doubt it; you're going to play it with the eye candy you paid for, with your nice graphics card. Sure, there are limits to what a lower-end CPU can feed; I would not buy a Sempron or Celeron to pair with a high-end 3D card, but once you get into a certain arena of the better-performing parts from each brand, the CPU has far less impact than the GPU in the real world.
What will perform better: an i7 920 paired with a current but less capable 4830, or a Phenom II 710 paired with a Radeon 4870? (I will do the math on Newegg shortly and show you.) By spending $65 more on the graphics, but nearly $250 less for the total system once you factor in the more expensive CPU, chipset, and DDR3, you still get better results through the GPU. Or put the same GPU in each, save $300+ on the total build, run the same frame rate, and use that extra cash to buy something that actually provides you with some real-world value.
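To make that kind of tally concrete, here is a minimal Python sketch of the comparison I mean. Every price below is a made-up placeholder for illustration, not the actual Newegg math I promised above.

# Build-cost comparison sketch; every price below is a hypothetical placeholder,
# not an actual Newegg quote.
i7_build = {
    "Core i7 920": 290,
    "X58 motherboard": 200,
    "6GB DDR3": 100,
    "Radeon 4830": 100,
}
phenom_build = {
    "Phenom II 710": 115,
    "AM2+ motherboard": 90,
    "4GB DDR2": 45,
    "Radeon 4870": 165,
}

def total(build):
    return sum(build.values())

print(f"i7 build total:     ${total(i7_build)}")       # $690 with these placeholders
print(f"Phenom build total: ${total(phenom_build)}")   # $415
print(f"Difference:         ${total(i7_build) - total(phenom_build)}")              # $275
print(f"Extra spent on GPU: ${phenom_build['Radeon 4870'] - i7_build['Radeon 4830']}")  # $65

Swap in real prices from whatever retailer you like; the point is simply to total both platforms and see where the money actually goes.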
Benchmarks run at low res and low detail may give you a warm, fuzzy feeling, but they don't provide any serious gamer with any actually usable real-world value.
Now, I respect what was said earlier about the whole GTX 285 vs. 4870X2 thing, but if you really want to illustrate this point, pick any high-end card, whatever brand, and keep it the same across platforms. Heck, given the meager benefits of the DDR3 currently available at its loose timings, I would take an older AM2+ board paired with dual-channel DDR2 and still run right next to an i7 paired with triple-channel DDR3, as long as the GPUs are consistent across platforms and we are talking about REAL performance benchmarks. You know, benchmarks comparing the values in gameplay (no phony timedemos or CPU-specific benchmarks): take real gameplay at a reasonable HD resolution (I might suggest something between 720p and 1080p; most people should at least have monitors for 1440x900 by now), crank all the details with reasonable if not all-out image quality settings (I would say 2xAA and 8xAF are good enough at that resolution), run it in real gameplay, and take the FRAPS average of a few runs per game in a few popular titles. Then let the i7 users swallow the bitter pill when they realize their money might have been better spent on more graphics power than on a new platform.
And once again, to those that want to argue over performance in Photoshop as an enthusiast application: my Phenom II will generally perform as well as your i7 if I just feed it some more deliciously inexpensive, tight-timed DDR2 on a 64-bit OS. Run 8GB of DDR2 versus your triple-channel 3GB of DDR3 and you will be better off, because the real Photoshop bottleneck is not raw CPU speed once you add more memory. Now, if you're the kind of guy that can buy the i7, overclock it, and slap 12GB of DDR3 on a board because you swim in cash, good for you. But if you're really balancing performance against price, I can assure you I can find an angle to get the performance you crave in whatever application for less money on a current AMD platform.
I think the next step in the debate is to come up with agreed-upon real-world performance benchmarks and a way to compare them dollar for dollar. I believe that with a real-world, well-thought-out benchmark, when you compare dollar for dollar, AMD is going to beat Intel. Don't just test theoretical bandwidth, or run specifically designed timedemos or benchmark suites; I want to see some innovation in how real-world performance is measured, so users get actually meaningful results for how they will use the system. Suggestions?
~Cyrix
Cliff, the point is that in order to test the performance of two processors, you have to make the processor the bottleneck in the system. The only way to do that is to use low graphics settings so that the graphics card is not the bottleneck.
True, it's not real-world and you won't play your games at a low resolution, but what if you buy a new graphics card 12-18 months down the line? All of a sudden, at the settings you were playing at before, the new graphics card is no longer the bottleneck; your CPU is.
The point of these tests is to ask: OK, taking the graphics card out of the equation, which processor performs faster?
12-18 months down the line, with the poorer-performing AMD system you have to go out and buy a new CPU and system. With the Intel, not so.
For the record I've always bought AMD (after my first mistake with Cyrix) but as other people have said until AMD can offer something better, at the moment I would buy Intel if I bought a new system.
~Cyrix
Phenom II X4 940 BE + 4870X2
Phenom II X4 940 BE + GTX 285
Core i7 920 + 4870X2
Core i7 920 + GTX 285
Run through a laundry list of hand-made synthetic, productivity, gaming and Windows benchmarks. At the end of each section, you can add up all the frame rates or seconds (or whatever the section's test metric is) and divide the total cost of the platform by that sum. That will give you your value. If I have 6 game benchmarks with 500 total FPS on a $900 system (example), that's $1.80 per FPS. Maybe we're doing a productivity suite and the total time for all the tests is 1260 seconds; again with the $900 system, that's about $0.71 per second.
The critical point to all of this is that you must avoid canned benchmarks whenever possible. That means making your own timedemos, running your own FRAPS recordings, making your own productivity scripts when possible, etc.
And then at the end of it all, you repeat all the tests in an overclocked condition and see which one really gives you more value.
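To make the dollars-per-result math above concrete, here's a minimal Python sketch. The $900 platform price and the per-test numbers are invented to match the worked example, not real measurements.

# Minimal sketch of the dollars-per-result value math, using made-up numbers.

def dollars_per_unit(platform_cost, section_results):
    """Divide the total platform cost by the summed results for a section.

    For frame rates this yields dollars per FPS; for elapsed seconds it
    yields dollars per second of measured work.
    """
    return platform_cost / sum(section_results)

PLATFORM_COST = 900.0  # hypothetical total system price

# Six game benchmarks totalling 500 FPS on the $900 example system.
gaming_fps = [95, 80, 110, 70, 85, 60]  # sums to 500
print(f"Gaming: ${dollars_per_unit(PLATFORM_COST, gaming_fps):.2f} per FPS")  # ~$1.80

# A productivity suite whose tests total 1260 seconds on the same system.
productivity_seconds = [300, 420, 240, 300]  # sums to 1260
print(f"Productivity: ${dollars_per_unit(PLATFORM_COST, productivity_seconds):.2f} per second")  # ~$0.71

Note that for time-based sections a faster system produces a higher dollars-per-second figure, so keep the metric direction in mind when you compare platforms.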
There is one review at Tom's Hardware that's not exactly, but close to, what you suggest. Core i7 is unquestionably better than Phenom II in every benchmark except Crysis, since that game is GPU-limited with any CPU. See the link below.
http://www.tomshardware.com/reviews/overclock-phenom-ii,2119.html
However, I cannot explain why, but I am badly itching to play with a Phenom II. Maybe I am missing my retired Opteron 165. Or maybe it's because I have completely moved my constantly changing number of PCs to the Intel platform lately and I need some diversity.
Cliff, I agree with you on AMD's role in bringing computer technology forward. But I do not have any sentimental attachment to either AMD or Intel. Those are both companies working for profit (not me), and what you have described is healthy competition in the free market. My only concern is Intel's monopoly not AMD's demise.
I agree with that statement, and it has to be with repeatable, meaningful, value-added tasks. Don't toss out the worst-case scenario of one really complex job, and don't give it something really simple to do either. If you're a photo guy/gal, or a video editing guy/gal, give me a common, repeatable job, something you do often enough that a time savings really does add value to the experience, not just a benchmark.
Same thing with gaming: I want benchmarks that reflect the way the majority will experience the games right now.
I respect what was said about the potential upgrade path, and the CPU becoming the potential bottleneck in a gaming system 18 months from now, but I could still argue against it in terms of the value presented. I want to know how users extract value here and now.
I'm a gamer, and I do some light standard-def video editing, nothing too heavy, plus music encoding, and I convert a fair number of documents to PDF, so I feel qualified to comment on those areas. But for our enthusiast friends who use Photoshop each day, or perhaps edit HD video a couple of times a week, what would be a valid, repeatable benchmark, something that adds real day-to-day, task-to-task value? Not the easiest job you ever did, not the toughest one either; something common, repeatable, something you do on a regular basis. That's going to be the best value-added benchmark to measure with.
Synthetics tell us nothing that's really tangible in today's computing environment. I want to see real-world results, and I want to see them in tasks that users are likely to repeat and get some real value from. Not a benchmark for a benchmark's sake; I think there is too much of that going on already, and its only purpose is to market product. I think the innovator among benchmark review sites is going to be the one that comes up with benchmarks that reflect real users' situations in the here and now, and applies a real cost-to-performance comparison against them. That's what review sites should really be doing.
Could be an opportunity for Icrontic: challenge the paradigm of the hardware review establishment. I think when that happens, AMD's market perception will improve.
Media encoding is pretty easy too. The x264 benchmark is a two-pass encoding of a real HD video clip that's repeated 3 times and averaged. It's not synthetic at all. You could go one step further and set up your own frameserving to encode a DVD that you've ripped yourself (this is what I'll be doing when we test Windows 7 RC1 against all current Microsoft OSes/service packs).
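For what it's worth, here's a rough Python sketch of how such a repeated, averaged two-pass encode could be scripted yourself. The clip name, bitrate, and the use of ffmpeg's libx264 two-pass mode are my own assumptions for illustration; the benchmark mentioned above uses its own clip and scripting.

# Rough sketch: time a two-pass H.264 encode several times and report the average.
# The input clip, bitrate, and ffmpeg/libx264 flags are assumptions for illustration.
import os
import subprocess
import time

INPUT = "sample_hd_clip.mp4"  # hypothetical source clip, ripped/captured yourself
RUNS = 3

def two_pass_encode():
    common = ["ffmpeg", "-y", "-i", INPUT, "-c:v", "libx264", "-b:v", "4000k"]
    # Pass 1: analysis only, video output discarded.
    subprocess.run(common + ["-pass", "1", "-an", "-f", "null", os.devnull], check=True)
    # Pass 2: the real encode, using the stats file written by pass 1.
    subprocess.run(common + ["-pass", "2", "output.mp4"], check=True)

times = []
for _ in range(RUNS):
    start = time.perf_counter()
    two_pass_encode()
    times.append(time.perf_counter() - start)

print(f"Average encode time over {RUNS} runs: {sum(times) / len(times):.1f} s")

Run the same script on each platform with the same source material and you get a real-world, repeatable encode result you can fold into the cost-per-second comparison above.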
@Mirage:
Thanks for the link!
I think AMD's done a decent job of offering their customers (which, according to AMD, are the OEMs and ISVs, not the end user, sadly) reasonably long socket life, reasonable performance per dollar, and so on. I think it's pretty cool that I can honestly purchase a Phenom II and put it into my older SB600-based mainboard and it will work (thank you, Gigabyte) after a BIOS update. Not just a theoretical upgrade, but an actual supported upgrade for a three-year-old motherboard that's no longer produced.
I'm disappointed with the fact that many of AMD's technologies don't really have an impact on my day-to-day performance, though. With the lack of I/O contention and all of the memory bandwidth available from using HyperTransport and having an on-die memory controller, my CPU's compute power is still the major bottleneck on my home system. My SATA performance isn't particularly great, because I cannot use AHCI and gain the benefits of NCQ for my file operations.
With a GTX 260 and 6GB of DDR2, my 3GHz dual-core is the part that keeps me from running my games with maximum settings at my modest resolution (14x9). When I picked up the GTX to replace my Radeon, I originally planned on purchasing a Phenom II 720 as well. It's supported on my mainboard, and I already have the correct BIOS installed. I just couldn't bring myself to do it. With AMD's continued issues with NCQ in their chipsets, I could see myself at some point replacing my AMD-based board with an nVidia-based unit. But, and here's the dark shame I hide from my girlfriend, if I'm going to replace the mainboard, I'm not sure I want to continue on with an AMD solution. Even though I personally despise Intel's "tick-tock" strategy (it seems to be nothing more than a marketing tool to force computing hardware into early retirement), there has been some good gained from it, and I'm beginning to worry about AMD's long-term viability as a competitive CPU manufacturer. The performance of the Nehalem platform is too good to ignore, and I'm beginning to feel a little constrained by my platform's lack of oomph. I don't doubt that the Phenom II would be a considerable upgrade over my K8. But if I need to go with a different motherboard so as to get a chipset that properly supports AHCI, do I really want to limit myself to a second-tier performer? I don't know. It's a hard time to be an AMD fan, that's for sure.
If you got ten thousand computer users in a room, how many of them would find any value in editing a 98-megapixel raw image file? Come on, that's not a real-world way to test actually usable performance.
As far as the Tom's post goes, I can find a few other benchmark sites that prove otherwise on the gaming benchmarks. I am going to come right out and say it: those numbers are cooked.
This is probably more accurate.
http://forums.techpowerup.com/showthread.php?t=79635