So what's going to happen?

RWB Icrontian
edited August 2006 in Hardware
http://www.dailytech.com/article.aspx?newsid=3471

I just read this article and many of the comments, and I must say I like the idea of combining the CPU and GPU. Sure, it can easily be seen as an integrated graphics solution where you must now upgrade the CPU in order to upgrade the GPU, but I honestly think most people are getting this wrong.

By adding a GPU ELEMENT to the CPU, you're adding in graphics processing ability. So let's start with a single-core computer using this technology, a basic cheap PC: you have a single-core processor with onboard ATi graphics. You have a computer like any out now with cheap integrated graphics; it's really a word processor, but possibly capable of some gaming.

Let's look at a low-end gaming rig: a single-core processor with built-in graphics like the one mentioned above, but also a dedicated graphics card. Now you're talking about a system in which the processor and the dedicated graphics card are working together to achieve faster graphical performance.

Then moving up to the higher end, you have multiple cores with GPUs working together, tied in with perhaps a graphics co-processor, and you have some insane gaming. Wanna up the graphics for the latest games? Change the graphics co-processor the same way as before.

What do you guys think?

Comments

  • edited July 2006
    I think it'd be a huge mistake. AMD would essentially become a legacy setup, and by alienating nVidia they're cutting their own throats. Personally I won't own an ATI GPU, and if it comes down to paying for a GPU I don't want by purchasing an AMD CPU, I'll avoid AMD. Besides that, AMD will just use the "bundled" GPU stuff to drive prices up even further. Add to that the fact that AMD is the weaker CPU at the moment, and an Intel SLI setup is looking awfully sexy.

    Yes, there's no telling what the future has in store CPU-wise, but now that Intel has worked out what's needed to gain the upper hand, I seriously doubt they'll be losing much ground on that front, especially with AMD spending time and R&D dollars on a divided front.
  • citrixmeta Montreal, Quebec Icrontian
    edited July 2006
    In one of the Maximum PC magazine issues, they were talking about all those upcoming changes.

    They had this photo of a motherboard with 2 sockets, 1 for the CPU and another for the GPU.

    You think this is where we're headed?
  • RWB Icrontian
    edited July 2006
    I would say so. I mean, it does get rid of plenty of bottlenecks, and NVidia would still have a place in the graphics world as a dedicated graphics system. And who says that NVidia and AMD couldn't still work together? It's in the best interest of NVidia to do so.
  • edited July 2006
    AMD isn't just talking about a socketed GPU as such; they're talking about one on the same substrate as the CPU. A socketed GPU would be a nice move forward, but integrated directly into the CPU would be a mess.
  • RWB Icrontian
    edited July 2006
    madmat wrote:
    AMD isn't just talking about a socketed GPU as such; they're talking about one on the same substrate as the CPU. A socketed GPU would be a nice move forward, but integrated directly into the CPU would be a mess.

    I think you're getting the idea wrong, because I can see what you're saying, and yes, that wouldn't be very good. What I am saying is that graphics cards, being parallel by design, especially with all this SLI and Crossfire stuff going on, can be used in a different way.

    Each core or processor can be a standalone graphics solution, but can also be used in conjunction with a dedicated graphics card or socketed GPU, like a co-processor. Currently the CPU is mostly an offload for the GPU in games, handling AI and other things. The future of gaming can make it so that each piece is sharing the load. The more cores you have, the better the graphics can be; add in a dedicated card and it gets even better, or add more cores. Yeah, that gets excessive, but the point is that you wouldn't have to upgrade the CPU or add in more cores or processors to get a boost; you'd change or add in the dedicated card like we do now. The CPU in the future will simply have a GPU included in it. There's a rough sketch of the split I mean at the end of this post.

    I think this is the road Intel is taking; they already seem to have the largest share in the graphics market with their low-end integrated graphics.
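
    To sketch what I mean by sharing the load (every number here is made up, since nobody outside AMD knows what these parts would actually do), you could split each frame's scanlines between the on-die GPU element and the dedicated card in proportion to their throughput:

    ```c
    /* Hypothetical load split between an on-die GPU element and a
     * discrete card. The GFLOPS figures are invented for illustration. */
    #include <stdio.h>

    int main(void) {
        const int frame_height = 1080;        /* scanlines per frame */
        const double ondie_gflops = 40.0;     /* assumed on-die throughput */
        const double discrete_gflops = 160.0; /* assumed discrete throughput */

        double total = ondie_gflops + discrete_gflops;
        int ondie_rows = (int)(frame_height * ondie_gflops / total);
        int discrete_rows = frame_height - ondie_rows;

        printf("on-die GPU element: rows 0..%d (%d rows)\n",
               ondie_rows - 1, ondie_rows);
        printf("discrete card: rows %d..%d (%d rows)\n",
               ondie_rows, frame_height - 1, discrete_rows);
        return 0;
    }
    ```

    Add more cores or a faster card and only the ratio changes; nothing has to be replaced wholesale.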
  • csimon Acadiana Icrontian
    edited July 2006
    I think it ties right in with the mobile solution. Both companies are highly focused on this at the moment ...for the near future. Also, quad cores are right around the corner, and I see this as the headroom for the GPU ...you'll have multiple cores of whatever it is you desire, whether it be 1 CPU and 3 GPUs, or 3 CPUs and 1 GPU, or 2 of each, etc. Also ...we have PPUs to contend with. I think this will be very big in the gaming arena.
  • CB Ƹ̵̡Ӝ̵̨̄Ʒ Der Millionendorf- Icrontian
    edited July 2006
    Having the second socket on the motherboard makes sense to me. That's basically what we do now anyway, except that the GPU is on a daughter-board instead of a ZIF interface like the CPU. You don't buy a mobo and think about whether you should buy a graphics card; you have to think about which graphics card to buy, since most mobos simply can't process graphics anymore. They don't even come with VGA ports most of the time.

    I agree that it would be a good move to put the video port back on the mobo and allow the user to seat a GPU behind it. It would conserve space and cost without really changing the way we purchase and assemble PCs all that much.
  • RWB Icrontian
    edited July 2006
    In this case the video is not on the board but on the CPU, and the output is all that's on the board, unless you want to add more horsepower to it :)
  • edited July 2006
    What I'm talking about is from this article, which clearly states that AMD intends to integrate a CPU and GPU onto the same die for certain applications. I predict it will only be a matter of time before they start pawning these CPU/GPUs off on the mainstream market due to the cost of fabbing them. I mean, think about it: it's not fiscally sound to spend umpteen million dollars to R&D a product, then more to tape it out and fab it, and sell it to a limited market. That's not what free enterprise is about. You develop a product, then leverage it and milk it for every cent you can.
    AMD President Dirk Meyer also confirmed that in addition to multi-processor platforms, stating "As we look towards ever finer manufacturing geometries we see the opportunity to integrate CPU and GPU cores together onto the same die to better serve the needs of some segments."

    Sure, at first they'll limit that technology to certain products: laptops, handhelds, maybe low-end desktops. But eventually they'll migrate it into entry-level enthusiast chips to help further recoup the monies they expended developing it. When they do, you'll either have to buy the uber high-end AMD chips for megabucks to be shed of that bloated crap, or go ATI. Mark my words.
  • WuGgaRoO Not in the shower Icrontian
    edited July 2006
    I don't like the whole integration on the same die. I mean, different people have different needs. It's like sewing pants onto your shoes: not only do you have the waist and the length to contend with, you also gotta get the shoe size right. Which is why this won't do well; much like everyone does not have the same size foot, not everyone wants the same GPU/CPU combo.
  • Leonardo Wake up and smell the glaciers Eagle River, Alaska Icrontian
    edited July 2006
    They had this photo of a motherboard with 2 sockets, 1 for the CPU and another for the GPU.

    You think this is where we're headed?
    No, I think we'll eventually see a processor that is a combined GPU and CPU, but called something else. You'll be able to purchase it in grades, just as we do now with video cards and CPUs. There will be entry-level stuff for corporate boxes, such as today's onboard Intel video chips, as well as the top-end "FX" and "EE" equivalents. I'm hoping that we won't be stuck with having to buy top-of-the-line graphics if we choose top-of-the-line "CPU" processing. EXPENSIVE! But maybe that two-socket CPU/GPU chip concept would allow more flexibility?
  • WuGgaRoO Not in the shower Icrontian
    edited July 2006
    Leonardo wrote:
    I'm hoping that we won't be stuck with having to buy top-of-the-line graphics if we choose top-of-the-line "CPU" processing.
    my pants-sewn-to-shoe theory proven
  • RWB Icrontian
    edited July 2006
    Maybe what I am saying is not being fully understood...?

    Why can't it work to better suit the current methods? Think about this, and pretend these CPU/GPU combinations come out... TODAY. I could buy one of these CPU/GPUs for my socket 939 system, and the built-in graphics processing on the CPU would allow games to run just that much better through my NVidia 6600GT. A similar idea to SLI... but in combination with something like the 3DNow! technology.
  • WuGgaRoO Not in the shower Icrontian
    edited July 2006
    The problem is, what if you want a high-end CPU but a lower-end GPU because you're not a gamer? So now you have to choose, and you won't be happy with your choice. Also, choices are always a good thing to have, and the amount of choices you have will be lowered because you won't have the ability to choose... or maybe they will have different grades. Also, upgrading will be annoying too. Let's say you didn't have enough money to buy a high-end whatever and you were planning on doing it later on... then you'd need to buy a whole new combo.
  • RWB Icrontian
    edited July 2006
    I really don't think you're getting my idea... but it doesn't matter, because it's only my idea, and only God and the CEO of AMD know what's going to happen.

    OK, maybe not even God.

    Or the CEO of AMD.
  • deicist Manchester, UK
    edited July 2006
    It's not just AMD; according to the Inquirer, Intel is also working on CPU/GPU integration. The thinking is that modern CPUs can do anything a GPU can do, so why not use those extra cores that we're going to have? The timeline I've seen rumoured is that by 2010 Intel expects to have GPU functions on its mainstream CPUs. It makes sense to me. It reduces bottlenecks and keeps more data 'in house' on the CPU rather than farming it off to another processor. I see it like the floating point processor: in the 386/486 days the floating point processor was a separate co-processor that you could buy as an add-on if you thought you needed it. Now it's integrated into the CPU and no-one really thinks about it anymore. I think in 5 years' time the same will be true for the GPU; it'll just be another part of the CPU, not even worthy of mention in its own right.
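
    Just to make "GPU functions on CPU cores" concrete, here's a toy sketch: a made-up per-pixel shading function run across a few CPU threads, one slice of the frame each. The sizes, thread count and shading rule are all invented for illustration; no real GPU or driver is involved.

    ```c
    /* Toy software "shader" spread across CPU threads. */
    #include <pthread.h>
    #include <stdint.h>
    #include <stdio.h>

    #define WIDTH   640
    #define HEIGHT  480
    #define THREADS 4

    static uint8_t framebuffer[HEIGHT][WIDTH];

    struct slice { int y0, y1; };  /* rows [y0, y1) handled by one thread */

    static void *shade_slice(void *arg) {
        struct slice *s = arg;
        for (int y = s->y0; y < s->y1; y++)
            for (int x = 0; x < WIDTH; x++)
                framebuffer[y][x] = (uint8_t)((x ^ y) & 0xFF); /* toy shading rule */
        return NULL;
    }

    int main(void) {
        pthread_t tid[THREADS];
        struct slice s[THREADS];
        for (int i = 0; i < THREADS; i++) {  /* carve the frame into slices */
            s[i].y0 = HEIGHT * i / THREADS;
            s[i].y1 = HEIGHT * (i + 1) / THREADS;
            pthread_create(&tid[i], NULL, shade_slice, &s[i]);
        }
        for (int i = 0; i < THREADS; i++)
            pthread_join(tid[i], NULL);
        printf("shaded %dx%d pixels on %d CPU threads\n", WIDTH, HEIGHT, THREADS);
        return 0;
    }
    ```

    Whether on-die GPU silicon would look to software like extra cores scheduled this way, or like something else entirely, is exactly the open question.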
  • WuGgaRoO Not in the shower Icrontian
    edited July 2006
    The really cool thing might be if the GPU/CPU integration works and you have the ability to throw in an extra GPU whenever you damn well please and have it SLI with the GPU/CPU combo... almost like an add-on card, much like the Voodoo2 was... hmmm, that's an idea now!! Have the basic GPU stuff dictated through the CPU/GPU, and then whatever you have left can be taken care of by the additional GPU, and none of this split-the-screen SLI bullcrap!!!! It's bullcrap!!! "Oh, lemme split the screen and add in another graphics card!" NO!! That's just wrong... I'm talking about FULL integration.
  • edited July 2006
    deicist wrote:
    It's not just AMD; according to the Inquirer, Intel is also working on CPU/GPU integration. The thinking is that modern CPUs can do anything a GPU can do, so why not use those extra cores that we're going to have? The timeline I've seen rumoured is that by 2010 Intel expects to have GPU functions on its mainstream CPUs. It makes sense to me. It reduces bottlenecks and keeps more data 'in house' on the CPU rather than farming it off to another processor. I see it like the floating point processor: in the 386/486 days the floating point processor was a separate co-processor that you could buy as an add-on if you thought you needed it. Now it's integrated into the CPU and no-one really thinks about it anymore. I think in 5 years' time the same will be true for the GPU; it'll just be another part of the CPU, not even worthy of mention in its own right.

    Maybe in 4 or 5 years, when 8-core CPUs are out, but right now there's just no way. Right now games are accelerating to the point that it's requiring two high-end GPUs to play at uber-high resolutions, and the CPU is getting pegged to 100% (or awfully close) by taking care of sundry things like physics, AI and other assorted behind-the-scenes duties.

    I still can't see it being popular, though. People like having the choice to go with the flavor they desire, and a legacy setup (and face it, a GPU integrated into a CPU is as legacy as you can possibly get) doesn't lend itself well to allowing you the freedom to choose. When the CPU/GPU thing happens, the PC will cease being a PC and become more of an open-form-factor universal console.
  • deicist Manchester, UK
    edited July 2006
    How is it different from FPU/CPU integration, though? I'm sure people complained when Intel integrated the floating point unit into the CPU, saying that it would add cost and they'd like the freedom to choose an FPU rather than having the one that was built into their processor... but now, no-one even thinks about the FPU; it's just a part of the CPU. I don't see how the GPU is any different.
  • edited July 2006
    When was the last time that you tried to tweak your FPU? When was the last time that a driver glitch with your FPU hosed a game or made DVD playback suffer?

    There's a big honkin' difference between the FPU and the GPU. The FPU does what it does transparently behind the veil of the OS, whereas the GPU is something that is directly responsible for how we use and interact with our PCs. A bad FPU will cause as much havoc as you'd think it would, but it's only doing simple floating point calculations, pretty simple to get right. It's merely a link in a chain, whereas the GPU is doing floating point calculations, rendering, etc., etc. It's not a link in the video chain but instead the base to which those links are attached.
  • deicist Manchester, UK
    edited July 2006
    That's kind of my point, though: at one time the FPU was something people worried about. There were different models available which had different performance criteria, and using a different FPU could make a difference to the speed of your machine. Now it's just a part of the CPU that no-one thinks about as a separate entity. That's where AMD and Intel want to get to with an integrated GPU... and I don't see any reason why they won't. Once they start producing CPUs that can handle graphics as well as a separate GPU can (and they will) for little to no extra cost, then the GPU will go the way of the FPU as a separate entity.
  • edited July 2006
    And when that happens, the PC as we know it will cease to exist. Period. When the GPU is a legacy part of the CPU, and as such no longer upgradable, then you'll have what I said before: a universal console.

    Right now the GPU is higher in transistor count than a single-core CPU, and close to if not more than a dual-core. I think it will be some time before a CPU has the horsepower to eliminate a GPU without serious lag at higher resolutions, but that's beside the point. What I'm saying is that AMD will eventually be sporting ATI graphics by default. They're not talking about the CPU doing the graphics; they're talking about embedding the GPU on the same die as the CPU. That's a whole different thing.

    When they do that, you can bet the prices will go up, especially considering the extra hardware being soldered onto a mobo to accommodate that type of setup. It's also going to take some time to get the kinks worked out of the system as far as speeds are concerned. The whole reason to embed the FPU was speed: the longer traces and interfaces associated with a separate FPU cause problems for clock speeds. This is also why cache was moved onto the die. The shorter the interconnects, the faster the transistors can switch.

    I just see it as horribly convoluted. If they want to start building graphics into the chipset, then fine, be my guest. I don't have to buy that chipset, but when they're shoving a brand of GPU I refuse to buy down my throat (and that's what it looks like the future holds), then I won't buy their CPU.
  • deicist Manchester, UK
    edited August 2006
    Well, we'll have to agree to disagree on this then, because I look at it in a completely different way. No-one outside of AMD/Intel knows with any certainty what the long-term plans are for graphics on the CPU, which leads to this difference in interpretation. Your view is that this will lead to ATI graphics on AMD chips. My view is that AMD are going to take the technology and skills they have from the ATI merger and use them to produce CPUs capable of doing GPU work. It won't be ATI graphics anymore; the graphics will just be another part of the CPU. As I said, though, no-one knows for sure, so it's pointless arguing about it :)
  • edited August 2006
    AMD has already stated that they WILL be integrating GPUs onto the CPU die; how in the world is this an interpretation? Oy. At first it'll be for certain products, and it will most likely stay that way until they get the performance ramped up to be competitive with discrete graphics solutions, and then it will get moved over into the mainstream.

    This is not an interpretation; this is doing the math. If you spend $500 million developing a product, or even $100 million, you are not going to market that product only to a segment that would require 5-10 years of sales just to break even. The product won't last that long.

    This is a cut-throat business. They have to recoup the R&D expenses in 2-3 years and then make a profit off each product.
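
    To put completely made-up numbers on that (nobody outside AMD knows the real figures; this is just the shape of the math):

    ```c
    /* Back-of-the-envelope break-even math. All inputs are invented. */
    #include <stdio.h>

    int main(void) {
        const double rnd_cost = 500e6;        /* assumed R&D spend, dollars */
        const double margin_per_unit = 25.0;  /* assumed profit per chip, dollars */
        const double years_to_recoup = 3.0;   /* target payback window */

        double units = rnd_cost / margin_per_unit;
        printf("units to recoup R&D: %.0f\n", units);
        printf("units per year over %.0f years: %.0f\n",
               years_to_recoup, units / years_to_recoup);
        return 0;
    }
    ```

    Twenty million units in three years is not a niche-market volume, and that's exactly why it can't stay limited to certain products.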