
radeon or geforce?

dodododo Landisville, PA
edited Jun 2003 in Hardware
First of all, hello everyone; it's nice to have a forum again.

My question:
Which video card is my best option?
Radeon 9500 (or 9600)
Radeon 9500 Pro (or 9600 Pro)
GeForce4 Ti4600 8x
GeForce FX 5600

Currently the 9500 Pro is about $25 more than the other options; is this worth the extra price?

Will I see an added benefit from the 8x AGP on the Ti4600?

What is the difference between the 9500 and 9600 series, performance wise?

Will the 9500 or the 9500 Pro overclock better? Does the 9500 softmod allow it to eclipse the 9500 Pro in performance?

Will DirectX 9 support allow the FX card to outlast the 9500? Basically which card will run future games better?

That's it!

~dodo

Comments

  • mmonninmmonnin Centreville, VA
    edited Jun 2003
    8x AGP doesn't mean much, I don't think. I don't think cards can saturate 4x yet. I could be wrong, though.

    The 9500 is better than the 9600. Probably your best card of the bunch, too.
  • MadballMadball Fort Benton, MT
    edited Jun 2003
    I think the 9600 is just an improved 9500 on a new core, but I could be wrong. It's probably the better card. I personally own a 9500 Pro and I'm happy with it, but if I were buying a card today, I think I'd spend the extra for the 9700. If you aren't in a hurry, the price of the 9700 should begin to drop soon.
  • danball1976danball1976 Wichita Falls, TX
    edited Jun 2003
    Go with a GeForce. I think it's better.
  • BlackHawkBlackHawk Bible music connoisseur There's no place like 127.0.0.1 Icrontian
    edited Jun 2003
    danball1976 said
    Go with a GeForce. I think it's better.
    From the reviews I've read, the GeForce FX 5600 is only better than the Radeon 9500 and 9600 in a few tests. IMHO the 9500 is your best bet unless you're willing to fork out a bit more for a 9700.
  • dodododo Landisville, PA
    edited Jun 2003
    Any thoughts about the differences between the 9500 and the 9500 Pro?
    (This is not addressed in the VGA charts.)

    Also, I'm unclear whether you guys are talking about the 9500 or the 9500 Pro in your comments; please specify.

    Thanks.

    ~dodo
  • ThraxThrax Professional Shill, Watch Slut, Mumble Hivemind Drone Austin, TX Icrontian
    edited Jun 2003
    The 9500 has four fewer rendering pipelines in its architecture, which means its pixel-pushing power (like that alliteration?) and bandwidth are significantly diminished.

    Essentially, when we speak of the 9500 hereabouts, we mean the Pro version. We mean the pro version of any Radeon card, actually. :D

    The 9500 Pro is currently the best product for your money on the market unless you can acquire one of the new GeForce FX 5600s based on the new NV31 GPU with the flip-chip design.

    Most prominently, this card can be purchased as the BFGtech GeForceFX 5600 ULTRA.

    In terms of performance, here's the breakdown of the current cards from nVidia and ATI:

    1. GeForceFX 5900 Ultra
    2. Radeon 9800 Pro
    3. Radeon 9700 Pro
    4. Radeon 9700
    5. GeForceFX 5600 Ultra
    6. Radeon 9500 Pro
    7. GeForceFX 5600
    8. Radeon 9600 Pro
    9. Radeon 9500
    10. GeForce4 ti4600
    11. GeForce4 ti4200
    12. GeForceFX 5200
    13. Radeon 9100
    14. Radeon 9000
    15. GeForce3 Ti500
    16. GeForce3
    17. GeForce3 Ti200
    18. GeForce4 MX 460
    19. GeForce2 Ti
    20. GeForce2 Ultra
    21. GeForce2 MX 440
    22. GeForce2 GTS
    23. GeForce4 MX 420

    Hope this helps muchly.
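The ranking above tracks theoretical fillrate fairly closely. A rough sketch of the pipeline point: peak pixel fillrate is approximately pipelines × core clock. The pipeline counts and clock speeds below are approximate figures from period reviews, not official specs, so treat them as assumptions.

```python
# Rough peak pixel fillrate: pipelines * core clock (result in Mpixels/s).
# Pipeline counts and clocks are approximate period figures, not official specs.
cards = {
    "Radeon 9500":     (4, 275),  # the softmod unlocks the other four pipes
    "Radeon 9500 Pro": (8, 275),
    "Radeon 9700 Pro": (8, 325),
}

for name, (pipes, core_mhz) in cards.items():
    print(f"{name}: {pipes * core_mhz} Mpixels/s")
```

Halving the pipelines at the same clock halves the theoretical fillrate, which is why the 9500 and the 9500 Pro land so far apart in the list.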
  • panzerkwpanzerkw New York City
    edited Jun 2003
    The 9700 also has that 256bit memory bus. As I understand it this is very significant when you jack up AA and AF settings, which is why the 9700 owned the NV30. Is this accurate?
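Putting rough numbers on the bus-width question: peak memory bandwidth is bus width × effective (double-pumped) memory clock. The memory clocks below are approximate period figures, not official specs.

```python
# Peak memory bandwidth = (bus width / 8 bits per byte) * memory clock * 2 (DDR).
# Result in GB/s (decimal). Memory clocks are approximate period figures.
def bandwidth_gbs(bus_bits: int, mem_mhz: int) -> float:
    return bus_bits / 8 * mem_mhz * 2 / 1000

print(bandwidth_gbs(256, 310))  # Radeon 9700 Pro (256-bit): 19.84 GB/s
print(bandwidth_gbs(128, 270))  # Radeon 9500 Pro (128-bit): 8.64 GB/s
```

AA multiplies the pixels read and written per frame, so the wide-bus card has far more headroom before bandwidth becomes the bottleneck.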
  • dodododo Landisville, PA
    edited Jun 2003
    panzerkw said
    The 9700 also has that 256bit memory bus. As I understand it this is very significant when you jack up AA and AF settings, which is why the 9700 owned the NV30. Is this accurate?

    I think it's the 256MB of memory that creates a significant advantage at higher resolutions, but I can't speak to the bit rate.

    ~dodo
  • panzerkwpanzerkw New York City
    edited Jun 2003
    Bit rate? I don't understand...:confused:
  • panzerkwpanzerkw New York City
    edited Jun 2003
    Thanks for the explanation. I've been looking for a monitor capable of some high refresh rates. Though I'm happy with my NEC Multisync at the moment, those Trinitrons are looking tempting.

    But from what you said, it would seem that LCD monitors preclude smooth gaming? The whole "LCDs don't have refresh rates" concept kind of confuses me as well.
  • edcentricedcentric near Milwaukee, Wisconsin Icrontian
    edited Jun 2003
    The one twist to the explanation that jdii gave is when you turn up AA and other effects: you can quickly use 2-4 times as much memory per screen.
    Right now the 9500 Pro is the ticket.

    CRTs have a scan rate. And while LCDs don't scan, they do have switching latency times; there are limits to how fast you can turn the pixels on and off. This is well described in some LCD reviews at TomsHardware (I can't believe that I said that).
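The 2-4x memory point above is easy to sketch: with multisample AA, the card stores several samples per pixel, so buffer memory scales with the sample count. This is a simplified color-buffer-only model (real hardware also stores depth/stencil per sample), so treat it as an illustration.

```python
# Simplified framebuffer size: width * height * 4 bytes/pixel * AA samples.
# Color buffer only; real cards also keep depth/stencil buffers per sample.
def framebuffer_mb(width: int, height: int, aa_samples: int = 1) -> float:
    return width * height * 4 * aa_samples / (1024 * 1024)

print(round(framebuffer_mb(1600, 1200), 1))     # 7.3 MB without AA
print(round(framebuffer_mb(1600, 1200, 4), 1))  # 29.3 MB with 4x AA
```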
  • dodododo Landisville, PA
    edited Jun 2003
    jdii1215 said
    Lessee-- GPUs can run 400 MHz plus top end these days??? RAM 350??? Monitors 100-250??? Limit is not the GPU speed, nor the Graphics RAM, it is the monitors in service. And the humans, who react slower yet cuz have to analyze and react. Methinks if you have a bleeding edge 128 to 256 MB RAM Graphics monster of a card, you need a faster monitor also.

    Sorry guys, the bottleneck is getting to be in the DISPLAY, not that which MAKES it. More REALISTIC displays you will get, but monitor will paint slower as given more dots to paint.

    At least we moved the bottleneck, used to be the cards. Now we get to reinvent monitors for speed.

    John Danielson.

    I don't think current graphics technology is limited by monitors. With the refresh rate set above 60Hz, you can't see the monitor refreshing. While it's true that any frames rendered above the refresh rate are unrealized, it's also true that anything faster than about 60fps is imperceptible to the human eye. That is why GPUs are now pushing antialiasing techniques and such to improve image quality; super-high framerates gain nothing.
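For concrete numbers behind the refresh-rate argument: the time a frame stays on screen is just the reciprocal of the rate, so the gap between 60 and 100 frames per second is only a few milliseconds.

```python
# Frame time in milliseconds for a given framerate or refresh rate.
def frame_time_ms(fps: float) -> float:
    return 1000 / fps

print(round(frame_time_ms(60), 1))   # 16.7 ms per frame at 60Hz
print(round(frame_time_ms(100), 1))  # 10.0 ms per frame at 100Hz
```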
  • panzerkwpanzerkw New York City
    edited Jun 2003
    John, you sound like a professional in this sort of stuff. What do you do for a living?
  • dodododo Landisville, PA
    edited Jun 2003
    Ah, I'm with you now, John, and I agree that current monitors should be improved resolution-wise and in picture clarity.

    ~dodo