GPU Physics Card - Just A What Is & Why Question

jedihobbit Central Virginia, USA New
edited February 2011 in Hardware
I have been fooling around building and rebuilding computers for so long I’d forgotten the original reason for building one… gaming! Now that I intend to build the gaming machine and stop f#%ting around with building and OC’ing (yeah right!), I will be having vid-related questions.

Never really fiddled with GPUs except for bling cooling items, so I am really a noob in that area. One of the biggest things I’ve noticed is the idea of recycling an older unit as a “physics card.” So, as the title says: just what is a physics card, and why is it necessary?

I’m looking at running 2 x GTX 580s in SLI and would think they would be enough horsepower for a while. I also have three Acer 23” LED monitors that will eventually be used.

Comments

  • Thrax 🐌 Austin, TX Icrontian
    edited January 2011
    Nobody needs a physics card. The last 3 generations of GPUs are more than sufficient for physics calculations, and I could count the number of GPU physics-enabled titles on about two hands.
  • _k P-Town, Texas Icrontian
    edited January 2011
    I can count them on one hand, but that is only on my polydactyl hand.
  • litenku Maryland Member
    edited February 2011
    The reality is that you could count on about two fingers the number of games that actually use the accelerated physics stuff to more than a quick "Gee whiz, wow ... meh" gimmicky degree...

    That having been said, any NVidia card from the 8800 era or later should be just fine as a backup "physics" card. The dedicated physics card that was sold many years ago (curiously, before NVidia bought the company) ultimately was a waste of money.
  • Sledgehammer70 California Icrontian
    edited February 2011
    PhysX cards from Nvidia can come in handy when running very large screen resolutions. It is true that a small number of titles use the technology... probably the biggest title would be Batman: Arkham Asylum.

    Overall, if you have a decent GPU you really don't need a dedicated card. I know most people recycle their older NV cards to be used for PhysX, but if you don't play the games that support it, then it is not needed.
  • Cliff_Forster Icrontian
    edited February 2011
    PhysX cards from Nvidia can come in handy when running very large screen resolutions. It is true that a small number of titles use the technology... probably the biggest title would be Batman: Arkham Asylum.

    Overall, if you have a decent GPU you really don't need a dedicated card. I know most people recycle their older NV cards to be used for PhysX, but if you don't play the games that support it, then it is not needed.

    Even if you play games that support it, it's not needed.... ;D

    Sorry, could not resist.
  • Sledgehammer70 California Icrontian
    edited February 2011
    I said it can come in handy. Getting a solid 60 FPS or better at any rez is the target point for any gamer... sometimes a PhysX card can allow that number to be reached.
  • litenku Maryland Member
    edited February 2011
    PhysX cards from Nvidia can come in handy when running very large screen resolutions. It is true that a small number of titles use the technology... probably the biggest title would be Batman: Arkham Asylum.

    Overall, if you have a decent GPU you really don't need a dedicated card. I know most people recycle their older NV cards to be used for PhysX, but if you don't play the games that support it, then it is not needed.

    And that's the tradeoff - adding more PhysX (no i) effects into a game causes frame rates to drop, and ultimately, for all but a tiny fraction of the games that even support PhysX, it makes little to no difference in gameplay beyond watching something explode in interesting ways. After a while, though, you stop looking at the pretty physics and just start playing the game - which is when you realize that it is, at best, minor eye candy and, at worst, drops the frame rate to sub-60 fps (even with the preferred video card, an NVidia one). In general, using PhysX hurts the gaming experience, particularly if you don't have a multi-GPU setup.

    I think the technology has the classic "chicken and egg" problem. Adoption isn't high because it isn't (currently) enabling you to run an entirely new category of game. While PhysX is sometimes interesting as a showcase of what can be done, games aren't being designed around it. Note that this problem didn't occur with the advent of the 3D accelerator in the mid '90s, despite a similar situation ("brand new" tech + expensive add-on hardware), primarily because that tech provided substantial performance improvements on the "new class" of game (aka the 3D game). Perhaps we'll get to that level with more physics effects in games, but it currently doesn't provide the same grab-you-by-the-collar-and-slap-you-silly effect you got going from an unaccelerated 3D game to an accelerated one.

    Maybe one day it will, but not yet. Particularly given that when you do enable these clever physics effects, your framerate tends to drop, and drop dramatically. I wonder if we're still at the phase of development in physics that we were at before the 3dfx Voodoo came out.

    Just a musing, that's all.
  • Cliff_Forster Icrontian
    edited February 2011
    litenku,

    I think this is why it became critical that physics processing become part of the standard hardware load. Another add-on to manage is not what the market wants. Sound cards are a shrinking market, and even dedicated graphics have an uphill climb against some really legit onboard and APU options (if you're not a hardcore gamer). The direction of the market is less hardware doing more stuff.

    Everyone here knows I'm the resident AMD fanboy. I'm just nuts for AMD product, and I am especially loyal to the Radeon brand. That said, I'll give Nvidia some props for recognizing the value of implementing the PhysX middleware as part of their graphics product, basically doing away with the Ageia PPU cards.

    My gripe is that this has created a standards battle that really isn't good for anyone involved, and what it does is stifle innovation in game physics. They went so far as to cripple PhysX on the CPU, allowing only a single thread and not allowing Streaming SIMD Extensions (SSE) to support the processor in PhysX processing (I understand this is to be remedied in a future release?). In essence, it makes it impossible for a Radeon customer to play titles with PhysX enabled. One part of me says it's Nvidia's investment, and I understand that; another part of me wonders what stops AMD from doing something equally dastardly, like making Nvidia graphics entirely incompatible with their chipsets. You just don't do it, because it's horrible for the user.

    This is where we are now: a few competing physics middlewares, some open, some not, just a big convoluted mess. What is right, though - and once again, some reasonable props to Nvidia here for not trying to create demand for more add-on boards - is that they killed the PPU, and that was a very good thing.

    Now what I want is for developers to agree on a standard, and on a way to implement it across everyone's hardware. I want this because I know that physics engines are going to add an immense amount of gameplay possibility; they have barely scratched the surface. Look, realistic geometry, more pixels, resolution - it's only going to take you so far. So what if it's photo-real if the environment does not react in a realistic way? That's what I want: I want to go into a game, touch and move everything, and have it react realistically in real time. It will happen.

    Look, one of my favorite examples of all time: in 2004, Half Life 2 and Doom 3 came out around the same time. Both games looked amazing for their time, but Half Life 2 won on innovation by offering players the most interactive game world they had ever seen. I want the next phase - I want to see everything in the game happen realistically, in real time - and I believe it's possible. That interaction may offer gamers more than photo-realistic graphics.
  • Thrax 🐌 Austin, TX Icrontian
    edited February 2011
    We have enough CPU horsepower to do physics.
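    To put a rough number on that claim, here is a minimal, single-threaded sketch (not from any poster in this thread; all names and figures are illustrative) of the kind of simple particle physics most games actually need, timed for one 60 fps frame:

    ```python
    # Illustrative sketch: naive semi-implicit Euler integration for point
    # masses under gravity, with a crude floor bounce. Pure Python, one
    # thread - the sort of workload a modern CPU handles without a GPU.
    import time

    GRAVITY = -9.81
    DT = 1.0 / 60.0  # one 60 fps frame

    def step(positions, velocities):
        """Advance every particle by one frame."""
        for i in range(len(positions)):
            velocities[i] += GRAVITY * DT
            positions[i] += velocities[i] * DT
            if positions[i] < 0.0:                    # hit the floor
                positions[i] = 0.0
                velocities[i] = -0.5 * velocities[i]  # lose half the speed

    # 50,000 point masses, all starting 10 m up at rest
    n = 50_000
    pos = [10.0] * n
    vel = [0.0] * n

    start = time.perf_counter()
    step(pos, vel)
    elapsed = time.perf_counter() - start
    print(f"one frame of {n} particles: {elapsed * 1000:.1f} ms")
    ```

    Even interpreted Python gets through tens of thousands of particles per frame; a compiled, SSE-vectorized engine handles far more, which is the point about CPU horsepower.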
  • litenku Maryland Member
    edited February 2011
    Cliff:

    So you're not telling me anything I don't already know. Whether it's PhysX or a clever implementation of OpenCL that just so happens to do physics-like rendering isn't really that important, at least not in my view. What I see is that the direction game development is going isn't having bajillions of particles floating around all physics-y. I see that as an incremental (nearly infinitesimal) upgrade over current offerings.

    As Thrax says, the type of physics that currently makes a significant contribution to gameplay can easily be done on the CPU. Until the realistic scattering of hundreds or thousands of glass particles has a real impact on gameplay, or until accurately modeled cloth has a real impact, or truer motion of smoke (and, by extension, just about any viscous fluid) particles makes a real impact on gameplay, I just don't see the kinds of things that PhysX (or any other physics modeling done via OpenCL) can do providing real value to the end gamer.

    I understand and know what you're talking about with PhysX ultimately being bad for the consumer - it's the same argument that CUDA is actually bad for the consumer and ultimately the market (OpenCL = good, CUDA = bad, Stream = bad), or that Glide was ultimately bad for the market (fracturing the market into vendor lock-in is almost ALWAYS a bad precedent). PhysX is a well understood and clever application of generic programming (at its heart, at least), which could easily be done via OpenCL.

    I think that if AMD really wants to jump on board with this, they're going to have to improve their OpenCL implementation and push developers harder to support that "brand" of physics processing rather than the semi-proprietary PhysX. I understand that NVidia was musing about opening up PhysX at some point...

    Also, I don't remember that much of Half-Life 2 being that interactive. I remember the clever ragdoll physics, but that's about it. I also remember it ran well on my 9700 Pro, unlike Doom 3 (which I ultimately never really played all that much), but that's a side point.

    I remain unconvinced that significant interactivity is possible on any current hardware implementation, at least not without significant detriment to visual prettiness...
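    The "generic programming" point about PhysX is worth making concrete: GPU physics is data-parallel, meaning one small kernel is applied independently to every particle, which is exactly the shape OpenCL (or CUDA, or Stream) expects. A rough, vendor-neutral sketch in plain Python (names and numbers are illustrative; a real OpenCL version would run the kernel as one work-item per particle on the GPU):

    ```python
    # Illustrative sketch: a per-particle update with no cross-particle
    # dependencies - "embarrassingly parallel," so it ports to any
    # data-parallel API without vendor lock-in.
    DT = 1.0 / 60.0
    GRAVITY = -9.81

    def particle_kernel(state):
        """What one work-item would compute for one particle."""
        y, vy = state
        vy += GRAVITY * DT   # integrate velocity
        y += vy * DT         # integrate position
        return (y, vy)

    # On the CPU we simply map the kernel over all particles; a GPU
    # runtime would launch one thread/work-item per particle instead.
    particles = [(10.0, 0.0)] * 1000
    particles = list(map(particle_kernel, particles))
    ```

    Because no particle reads another particle's state, the same kernel runs unchanged whether it is mapped on one CPU core, vectorized with SSE, or dispatched across thousands of GPU lanes - which is why the middleware choice, not the math, is the lock-in.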