So what's going to happen?
http://www.dailytech.com/article.aspx?newsid=3471
I just read this article, and many of the comments, and I must say I like the idea of combining the CPU and GPU. Sure, it can easily be seen as an integrated graphics solution where you must now upgrade the CPU in order to upgrade the GPU, but I honestly think most people are getting this wrong.
By adding a GPU ELEMENT to the CPU, you're adding graphics processing ability. So let's start with a single-core computer using this technology, a basic cheap PC: you have a single-core processor with onboard ATi graphics. That's a computer like any out now with cheap integrated graphics; it's really a word processor, but possibly capable of some gaming.
Let's look at a low-end gaming rig: a single-core processor with built-in graphics like the one mentioned above, but with a dedicated graphics card as well. Now you're talking about a system in which the processor and dedicated graphics card are working together to achieve faster graphical performance.
Then, moving up to the higher end, you have multiple cores with GPUs working together, tied in with perhaps a graphics co-processor, and you have some insane gaming. Want to up the graphics for the latest games? Change the graphics co-processor the same way as before.
What do you guys think?
Comments
Yes, there's no telling what the future has in store CPU-wise, but now that Intel has worked out what's needed to gain the upper hand, I seriously doubt they'll be losing much ground on that front, especially with AMD spending time and R&D dollars on a divided front.
They had this photo of a motherboard with two sockets, one for the CPU and another for the GPU.
Do you think this is where we are headed?
I think you're getting the idea wrong, because I can see what you're saying and, yes, it wouldn't be very good. What I am saying is that graphics cards, being parallel by design, especially with all this SLI and CrossFire stuff going on, can be used in a different way.
Each core or processor can be a standalone graphics solution, but it can also be used in conjunction with a dedicated graphics card or socketed GPU, like a co-processor. Currently the CPU is mostly an offload for the GPU in games, or for AI and other things. The future of gaming can make it so that each piece is sharing the load. The more cores you have, the better the graphics can be; add in a dedicated card and it gets even better, or add more cores. Yeah, that gets excessive, but the point is that you wouldn't have to upgrade the CPU or add more cores or processors to get a boost; you'd change or add the dedicated card like we do now. The CPU in the future will simply have a GPU included in it.
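Just to make that load-sharing idea concrete, here's a minimal sketch of a split-frame style divide between whatever graphics resources are present. The device names and throughput figures are made up for illustration; nothing here is an actual API, it's just the weighting logic.

```python
# Minimal sketch: split one frame's scanlines across whatever graphics
# resources are present, proportional to each device's relative throughput.
# Device names and throughput figures are hypothetical, for illustration only.

def split_frame(frame_height: int, devices: dict[str, float]) -> dict[str, range]:
    """Assign each device a contiguous band of scanlines, weighted by throughput."""
    total = sum(devices.values())
    bands, start = {}, 0
    for i, (name, throughput) in enumerate(devices.items()):
        # Last device takes whatever rows remain so nothing is lost to rounding.
        rows = frame_height - start if i == len(devices) - 1 \
               else round(frame_height * throughput / total)
        bands[name] = range(start, start + rows)
        start += rows
    return bands

# One CPU-integrated GPU element plus one dedicated card; add another core
# or another card and the split just rebalances.
print(split_frame(1080, {"cpu_gpu_element": 1.0, "dedicated_card": 3.0}))
```

With those assumed numbers the on-die GPU element takes roughly a quarter of the frame and the dedicated card takes the rest; swap in a faster card and its share grows without touching the CPU.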
I think this is the road Intel is taking; they already seem to have the largest share of the graphics market with their low-end integrated graphics.
I agree that it would be a good move to put the video port back on the mobo and allow the user to seat a GPU behind it. It would conserve space and cost without really changing the way we purchase and assemble PCs all that much.
Sure, at first they'll limit that technology to certain products: laptops, handhelds, maybe low-end desktops. But eventually they'll migrate it into entry-level enthusiast chips to help further recoup the money they spent developing it. When they do, you'll either have to buy the uber high-end AMD chips for megabucks to be rid of that bloated crap, or go ATI. Mark my words.
Why can't it work to better suit the current methods? Think about this and pretend these CPU/GPU combinations come out... TODAY. I could buy one of these CPU/GPUs for my Socket 939 system, and the built-in graphics processing on the CPU would allow games to run just that much better through my NVIDIA 6600GT. A similar idea to SLI... but in combination with something like the 3DNow! technology.
OK, maybe not even God.
Or the CEO of AMD.
Maybe in 4 or 5 years, when 8-core CPUs are out, but right now there's just no way. Games are advancing to the point that it requires two high-end GPUs to play at uber-high resolutions, and the CPU is getting pegged at 100% (or awfully close) taking care of sundry things like physics, AI, and other assorted behind-the-scenes duties.
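To put rough numbers on "pegged at 100%", here's a toy frame-budget calculation. Every millisecond figure below is an assumption for illustration, not a benchmark of any real game.

```python
# Toy frame budget: how much of a 60 fps frame the CPU already spends on
# non-graphics work. All per-task times are assumed, illustrative values.

FRAME_BUDGET_MS = 1000.0 / 60.0   # ~16.7 ms per frame at 60 fps

cpu_tasks_ms = {
    "physics": 6.0,
    "AI": 4.0,
    "game logic / misc": 3.5,
    "driver + draw-call submission": 2.5,
}

used = sum(cpu_tasks_ms.values())
print(f"CPU busy: {used:.1f} ms of {FRAME_BUDGET_MS:.1f} ms "
      f"({100 * used / FRAME_BUDGET_MS:.0f}% of the frame)")
print(f"Headroom left for any on-CPU graphics work: {FRAME_BUDGET_MS - used:.1f} ms")
```

Under those assumed numbers there's well under a millisecond left per frame, which is the point: there's no spare CPU time to throw at rendering today.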
I still can't see it being popular though. People like having the choice to go with the flavor they desire, and a legacy setup (and face it, a GPU integrated into a CPU is as legacy as you can possibly get) doesn't lend itself well to allowing you the freedom to choose. When the CPU/GPU thing happens, the PC will cease to be a PC and become little more than an open-form-factor universal console.
There's a big honkin' difference between the FPU and the GPU. The FPU does what it does transparently behind the veil of the OS, whereas the GPU is directly responsible for how we use and interact with our PCs. A bad FPU will cause as much havoc as you'd think it would, but it's only doing simple floating-point calculations, which are pretty simple to get right. It's merely a link in a chain, whereas the GPU is doing floating-point calculations, rendering, etc., etc. It's not a link in the video chain but instead the base to which those links are attached.
Right now a GPU has a higher transistor count than a single-core CPU, and close to if not more than a dual-core. I think it will be some time before a CPU has the horsepower to eliminate the GPU without serious lag at higher resolutions, but that's beside the point. What I'm saying is that AMD will eventually be sporting ATI graphics by default. They're not talking about the CPU doing the graphics; they're talking about embedding the GPU on the same die as the CPU. That's a whole different thing.
When they do that, you can bet the prices will go up, especially considering the extra hardware being soldered onto a mobo to accommodate that type of setup. It's also going to take some time to get the kinks worked out of the system as far as speeds are concerned. The whole reason to embed the FPU was speed: the longer traces and interfaces associated with a separate FPU cause problems for clock speeds. This is also why cache was moved onto the die. The shorter the interconnects, the faster the transistors can switch.
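A rough back-of-envelope illustration of that interconnect point. The trace lengths, propagation speed, and clock rate below are assumed round numbers, not measured figures.

```python
# Back-of-envelope: why shorter interconnects matter for clock speed.
# All numbers below are illustrative assumptions, not measurements.

PROPAGATION_CM_PER_NS = 15.0   # rough signal speed on copper, ~half the speed of light

def one_way_delay_ns(trace_length_cm: float) -> float:
    """Time for a signal to travel the given trace length, in nanoseconds."""
    return trace_length_cm / PROPAGATION_CM_PER_NS

clock_ghz = 3.0
cycle_ns = 1.0 / clock_ghz   # ~0.33 ns per cycle at 3 GHz

for label, length_cm in [("off-package trace (assumed 5 cm)", 5.0),
                         ("on-die interconnect (assumed 0.5 cm)", 0.5)]:
    delay = one_way_delay_ns(length_cm)
    print(f"{label}: {delay:.3f} ns one way, "
          f"{delay / cycle_ns:.2f} clock cycles at {clock_ghz:g} GHz")
```

With those assumed lengths, the off-package trip alone eats a full clock cycle at 3 GHz while the on-die hop costs about a tenth of one, which is the same reason cache and the FPU moved on-die.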
I just see it as horribly convoluted. If they want to start building graphics into the chipset then fine, be my guest. I don't have to buy that chipset but when they're shoving a brand of GPU I refuse to buy down my throat (and that's what it looks like the future holds) then I won't buy their CPU.
This is not an interpretation; this is doing the math. If you spend $500 million developing a product, or even $100 million, you are not going to market that product only to a market that will require 5-10 years of sales just to break even. The product won't last that long.
This is a cutthroat business. They have to recoup the R&D expenses in 2-3 years and then make a profit off each product.
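Here's a toy version of that break-even math; the R&D outlay, per-unit margin, and sales-rate figures are all assumptions chosen only to show how the years stack up.

```python
# Rough break-even math on the R&D argument above.
# Every figure here is an assumption for illustration.

r_and_d_dollars     = 500_000_000   # assumed development cost
margin_per_unit     = 25            # assumed profit contributed by each chip sold
units_per_year      = 4_000_000     # assumed annual sales into the target market

break_even_units    = r_and_d_dollars / margin_per_unit
years_to_break_even = break_even_units / units_per_year

print(f"Units to break even: {break_even_units:,.0f}")
print(f"Years to break even at that sales rate: {years_to_break_even:.1f}")
```

With those assumed numbers it takes 20 million units and about five years just to get back to zero, which is exactly the kind of timeline the market won't support; aim the product at a bigger market or a fatter margin and the 2-3 year window becomes plausible.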