Nvidia acquires AGEIA!

UPSLynx :KAPPA: Redwood City, CA Icrontian
edited February 2008 in Hardware
Well now, this is quite the interesting situation. (link at bottom)

Nvidia has just bought Ageia, makers of the PhysX physics processing unit (PPU). I'm a PC hardware nut, and I LOVE physics processing in games. This was big news to me.

I saw Ageia demo their then-unreleased PhysX PPUs at the ACM Reflections/Projections conference in 2005. It was incredible stuff. The potential for new simulations and interactions in games was mind-blowing to me. The fact that they had malleable cloth folding and tearing in real time before our eyes sold me on the tech. I was ready to buy a PPU as soon as they released the hardware.
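
To give a flavor of what that demo was doing under the hood: real-time cloth is typically a set of point masses connected by springs, and "tearing" is just deleting a spring once it stretches past a threshold. Here's a minimal toy sketch of the idea in Python (my own illustration, not Ageia's code; every constant is arbitrary):

    import math

    # Toy mass-spring "cloth": a strand of particles pinned at one end.
    # Tearing = dropping any spring whose strain exceeds a threshold.
    REST, STIFF, TEAR_STRAIN, DT = 1.0, 50.0, 1.5, 0.016

    particles = [{"x": i * REST, "y": 0.0, "vx": 0.0, "vy": 0.0} for i in range(6)]
    springs = {(i, i + 1) for i in range(5)}       # links between neighbors

    def step():
        for (a, b) in list(springs):
            pa, pb = particles[a], particles[b]
            dx, dy = pb["x"] - pa["x"], pb["y"] - pa["y"]
            length = math.hypot(dx, dy)
            if (length - REST) / REST > TEAR_STRAIN:
                springs.discard((a, b))            # the cloth tears here
                continue
            f = STIFF * (length - REST)            # Hooke's law along the link
            fx, fy = f * dx / length, f * dy / length
            pa["vx"] += fx * DT; pa["vy"] += fy * DT
            pb["vx"] -= fx * DT; pb["vy"] -= fy * DT
        for i, p in enumerate(particles):
            if i == 0:
                continue                           # this particle is pinned
            p["vy"] -= 9.81 * DT                   # gravity
            p["x"] += p["vx"] * DT
            p["y"] += p["vy"] * DT

    for _ in range(100):
        step()
    print(len(springs), "springs still intact")

A real cloth sim does this over a full 2D grid with thousands of particles per frame, which is exactly the embarrassingly parallel workload a PPU was built to chew through.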

But I waited to see what the market would do. The next year, at the same conference, Ageia was back showing off the newest physics tech going into their next-gen PPUs. It was absolutely astounding. Sadly, the support just never materialized. Consumers didn't want to pay extra cash for another piece of hardware with so little software support, and developers didn't want to take a risk on hardware that consumers weren't buying.

Now Nvidia has bought the company and plans to push the technology forward in new implementations.

So what does this mean? Well, lots of stuff. Apparently Nvidia isn't going to kill off the current PPUs, but it's still up in the air whether they'll continue with dedicated PPUs or integrate the tech into their own GPUs. If they stick with dedicated PPUs, Nvidia should at least be able to muster far more support from both consumers and developers, which would push along the adoption that has struggled ever since PPUs were first introduced. If they decide to integrate the tech into their future GPUs, we'd inevitably see less powerful physics processing, but far more widespread support.

I'm willing to bet that they integrate physics processing into their GPUs. When I went to the ACM SIGGRAPH conference in San Diego last summer, Nvidia was there showing off their on-GPU solutions for physics processing. I talked with them for a bit and watched a few demos. While it was pretty cool stuff, it lacked the power that Ageia's next-gen PPUs would have had. Ageia said it themselves: a dedicated PPU can simply push more FLOPS at physics than an on-GPU solution. Despite this, Nvidia's on-card solution was still impressive, and they've been very enthusiastic about it. If they couple that system with Ageia's technology, we should get very capable graphics and physics processing that everyone can get on board with in the near future.
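
For a sense of why physics maps onto a GPU at all: the core workload is one small update applied identically to thousands of bodies. A vectorized sketch of that data-parallel pattern (NumPy here is only a stand-in for what a GPU kernel would do, and the counts and constants are made up):

    import numpy as np

    # Data-parallel particle update: each array op touches every particle
    # at once, which is the shape of work a GPU is built for.
    N, DT = 100_000, 1.0 / 60.0
    pos = (np.random.rand(N, 3) * 10.0).astype(np.float32)
    vel = np.zeros((N, 3), dtype=np.float32)
    gravity = np.array([0.0, -9.81, 0.0], dtype=np.float32)

    for _ in range(60):                    # one second of simulation
        vel += gravity * DT                # same instruction, every particle
        pos += vel * DT
        hit = pos[:, 1] < 0.0              # ground-plane collision mask
        pos[hit, 1] = 0.0
        vel[hit, 1] *= -0.5                # damped bounce

    print(pos[:3])

The tradeoff Ageia pointed out is that a GPU running this is also busy rendering, while a dedicated PPU gets to spend its entire budget on the simulation.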

But the real issue here lies with AMD and Intel. Intel recently bought Havok, and it's known that Intel would like to start making multi-core processors with cores dedicated to physics processing. I don't think such a solution is as good as Nvidia's. Physics processing is mostly a game-oriented technology, so it's a better fit for a high-end GPU (which is already geared toward gamers). If Intel pushes PPU-enabled procs, we'll probably just see more expensive "gamer"-focused procs.

AMD, however, is once again left in the dust. They've only just released a card that can best the year-old 8800 GTX/Ultra offerings from Nvidia. (Nvidia is set to release their GeForce 9 series soon, and I'm sure those cards will go far beyond AMD/ATI's recent parts.) AMD mentioned months ago that they intended to buy Ageia, but in the end they didn't have the money. Now they're the only major hardware developer without a physics solution to back them. If current trends are any guide, physics processing will play a HUGE part in upcoming gaming hardware. AMD will have to get on board, either with their own solution or by building off Nvidia's hardware (though I doubt Nvidia would open up development of their hardware to AMD).

Why is this a bad thing? Well, my worry is that it leads to multiple incompatible hardware platforms. Hopefully we don't end up with a separate solution from each of Nvidia, Intel, and eventually AMD. There are no open physics platforms left for developers; both major ones have been purchased. So my fear is that this leads to something like the "plays best on ATI" / "Nvidia: the way it's meant to be played" split, only on a much bigger scale. Perhaps we'll see games with more complex physics when played on Intel/Havok or Nvidia/Ageia hardware. Or perhaps we'll see complete incompatibilities: games only playable on one architecture. I just don't want to boot a game and see "Stuff blows up better with Nvidia!"

However the future turns out, this could be a very interesting shift in PC gaming hardware. I love physics in games, and I'd give anything to see more complex rigid- and soft-body calculations in them. I just hope the companies work together on this progression rather than against one another.


Here's the link to the story and press release:
http://www.pcper.com/article.php?aid=515


Sorry for the novel; I'm a typaholic and I love physics. I've studied the stuff extensively.

Comments

  • Thrax 🐌 Austin, TX Icrontian
    edited February 2008
    I firmly believe that quad core, not the GPU, killed all hope for discrete PPUs. The state of multi-threading, even in games that are fundamentally designed for it, often leaves a fully-programmable 2.2GHz+ CPU core floating off in space unused.

    This kind of raw horsepower has so far gone untapped, particularly in the market where quad cores exist (super-enthusiasts, the same people who were interested in PPUs). With the ability to have a completely load-free CPU core do physics calculations, who needs the PPU?
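
    A minimal sketch of that pattern, using nothing but the standard library: the physics step runs on its own thread (standing in for the otherwise-idle core) while the main thread stays free to render. Every name here is illustrative, not any real engine's API.

        import threading, queue

        def physics_step(state, dt):
            # Toy rigid-body integration, standing in for a real solver.
            for body in state:
                body["vy"] -= 9.81 * dt            # gravity
                body["x"] += body["vx"] * dt
                body["y"] += body["vy"] * dt
            return state

        def physics_worker(in_q, out_q, dt):
            # Runs on its own thread -- i.e. the otherwise-idle core.
            while True:
                state = in_q.get()
                if state is None:                  # shutdown signal
                    break
                out_q.put(physics_step(state, dt))

        in_q, out_q = queue.Queue(), queue.Queue()
        worker = threading.Thread(target=physics_worker, args=(in_q, out_q, 1.0 / 60.0))
        worker.start()

        state = [{"x": 0.0, "y": 10.0, "vx": 1.0, "vy": 0.0}]
        for frame in range(3):
            in_q.put(state)                        # hand the sim to the spare core
            # ...main thread would render the previous frame here...
            state = out_q.get()                    # collect the updated world
            print(frame, round(state[0]["y"], 3))

        in_q.put(None)                             # tell the worker to exit
        worker.join()

    (In CPython the GIL keeps pure-Python threads from truly running in parallel; a real engine would use native threads, but the handoff pattern is the same.)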
  • Winfrey waddafuh Missouri Icrontian
    edited February 2008
    It's also gonna depend on future software and the demands it places on hardware. Will it continue to push the envelope in all directions, thus necessitating new hardware solutions, or do we hit a wall at some point where such hardware advancements become unnecessary?

    This piece of news seems to support the former rather than the latter. I currently don't see much need for PPUs, but maybe nVidia has a better idea of future software and its needs than I do. Interesting nonetheless.
  • UPSLynx :KAPPA: Redwood City, CA Icrontian
    edited February 2008
    I think quad cores definitely played a large part in killing it, but dedicated cards were capable of processing a lot more physics calculations. Sadly, either the gaming industry didn't see a need for tearable cloth, or that tech is still ahead of its time. But it does raise an interesting point about convenience, which is another reason why Nvidia will probably integrate the tech into their GPUs in the future.

    Hopefully future games do call for more realistic and complex physics interactions. Now that we've gotten a taste of what physics can do, I'm hungry for much, much more.
  • Snarkasm Madison, WI Icrontian
    edited February 2008
    All I really hope for is a more affordable PhysX card, and if anybody can make it cost-efficient, it's probably nVidia. I know nobody really needs one right now, but they've held an interesting little part of my heart since they came out... though that little part of my heart has always carried less weight than my wallet does. Still, here's hoping.
  • Leonardo Wake up and smell the glaciers Eagle River, Alaska Icrontian
    edited February 2008
    Let's assume for the sake of discussion that the software coding models are already available and that discrete physics/GPU chips or chipsets are ready to tape out. It would not be a stretch to assume that Intel and Nvidia would want to refine the technology considerably further, not necessarily for performance, but to make it affordable to more than the top 10% of gamers. In other words, to find that classic intersection on the graph where moderately high demand meets moderate or moderately high prices.

    Top-end video cards are already as expensive as some laptops. Add to that another high-tech chip, or an even more expensive GPU with integrated physics, and the resulting cost might shrink the market so much that it would be unprofitable to manufacture.
  • NiGHTS San Diego Icrontian
    edited February 2008
    I could be very wrong about this, but weren't the original cards only like $150? Considering what some people drop on their dream machines, that price is practically chump change for the crowd it was originally marketed towards.
  • Thrax 🐌 Austin, TX Icrontian
    edited February 2008
    Sometimes it is not simply the price of an item, but the cost/benefit ratio it presents. While enthusiasts will spend a pretty penny to maximize their returns, what good is a penny spent if it HAS no returns?
  • Snarkasm Madison, WI Icrontian
    edited February 2008
    Bingo. $150 for some currently barely-measurable returns is outside the realm of necessity. Maybe if they had better, more obvious results, it'd be more reasonable, but as it stands, no real point. I can use that $150 for better RAM, mobo, PSU, whatever. But you get it down into the $50-75 range and I start thinking "hmm, why not?"
  • UPSLynx :KAPPA: Redwood City, CA Icrontian
    edited February 2008
    Very well stated. In my opinion, V1 of the Ageia cards was plenty affordable (the unreleased V2 cards, however, were going to be ballpark $350, they told us at last year's conference). But the yield on that investment right now would be better particle effects in GRAW2 and two extra UT3 maps with limited destroyable walls and a very sweet tornado. About the only real advantage to owning a gaming rig with a PhysX card is getting to play CellFactor. If you haven't seen footage, the amount of rigid- and dynamic-body physics interaction in that game is just incredible.

    But that's about it. Leonardo has a good point about pricing. If you tack this tech onto a high-end graphics card, you could easily be looking at a $700 GPU. Would high-end gamers buy it? More than likely. But it wouldn't penetrate the market the way this tech needs to. I'm ready to get into advanced physics, but I'd sooner have mass acceptance first, then advancements later.
  • NiGHTS San Diego Icrontian
    edited February 2008
    I agree that there's no benefit to it, just so we're all clear. I know it was a useless purchase. I'm just doing this for the sake of argument.

    In my time on the webernets, I've met more NEED-TO-BUY enthusiasts than I've met "is it worth it?" enthusiasts. What I was trying to get at is that there most certainly is a market for a $150 PPU. Whether or not that's the biggest market you could capture, however, is up in the air. Price skimming, I'm sure, was going on at the time of release; they probably could have kicked the price down some after realizing no one was buying it.