Surprise, surprise! The i5 750 is the new value king of gaming, often outperforming the i7 920 and sometimes even the 950. Now my friend is talking about selling her computer because apparently her i7 920 isn't good enough.
Those numbers are fishy. The only explanation is that they ran them at settings chosen to remove the GPU bottleneck, which is not a "real world" benchmark if you ask me.
There are several other online benchmarks that indicate this. In Crysis Warhead, at the now "in between" resolution of 1680x1050 on Enthusiast settings with a 4870 X2, it does not matter whether you have two cores or four, or whether you scale the CPU frequency from 2.5 to 3.5 GHz per core: they all come in no more than a frame or so apart, well within the margin of error. And get this, that was with AA and AF disabled. When you're running the higher detail settings, the CPU has very little impact as long as it's at least capable of running two threads at a respectable frequency.
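To see why, here's a rough sketch of the idea in Python (a toy model with made-up numbers, not anyone's actual benchmark data): frame time is set by whichever of the CPU or GPU is slower, so once the GPU is the bottleneck, cranking the CPU clock does almost nothing to FPS.

    # Toy bottleneck model: the slower of CPU and GPU sets the frame time.
    # All figures below are hypothetical, chosen only to illustrate the point.

    def fps(cpu_ms, gpu_ms):
        """Frames per second when the CPU and GPU stages overlap."""
        frame_ms = max(cpu_ms, gpu_ms)  # the slower stage paces each frame
        return 1000.0 / frame_ms

    gpu_ms = 25.0                       # GPU-bound at high detail: ~40 FPS ceiling

    for clock in (2.5, 3.0, 3.5):       # scale a hypothetical CPU from 2.5 to 3.5 GHz
        cpu_ms = 15.0 * (2.5 / clock)   # assume CPU frame work shrinks with clock
        print(f"{clock} GHz -> {fps(cpu_ms, gpu_ms):.1f} FPS")

    # Every clock speed prints 40.0 FPS: the GPU ceiling hides the CPU entirely,
    # which is exactly the within-margin-of-error result those benchmarks show.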
Some will argue, but I think calling one CPU the better gaming chip than another while using old testing methodologies designed to remove the GPU bottleneck is misleading. Most gamers want to play with the details cranked; that's why we spend the cash on the best graphics tech we can afford.
What is the Left 4 Dead test really telling me? That every single CPU can run it past the monitor's refresh rate. So if Left 4 Dead is your game of choice, why waste your cash on a CPU upgrade? You would be doing nothing to improve your experience at that point; you would just have a benchmark to brag about.
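Put rough numbers on it (again a hedged sketch, with invented FPS figures): on a 60 Hz monitor you never see more than 60 frames a second, so every CPU that clears that bar delivers the same experience.

    # What a CPU upgrade buys when the game already outruns the monitor.
    # The rendered-FPS numbers are invented for illustration.

    REFRESH_HZ = 60                     # a typical LCD refresh rate

    def displayed_fps(rendered_fps, refresh_hz=REFRESH_HZ):
        """The monitor caps how many frames per second you can actually see."""
        return min(rendered_fps, refresh_hz)

    for cpu, rendered in [("budget dual core", 110), ("i5 750", 160), ("i7 920", 175)]:
        print(f"{cpu}: renders {rendered} FPS, you see {displayed_fps(rendered)} FPS")

    # All three land at 60 FPS on screen: identical experience, different bragging rights.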
If better gaming performance is what you really desire, your money is far better spent on the next wave of GPUs. Wait until you see what AMD unleashes on the 10th; now that is going to be a gamer upgrade.
Comments
Core i5 750: $209.99
Core i7 860: $299.99
Core i7 870: $579.99
-drasnor
How big is the price difference between the i5 750 and the i7 870 (and the 975 EE, while we're at it)?
Is the 975 still around a grand?
Preemptive strikes against weapons of mass discussion are always justified.