What's up with Half-Life 2?? Jaggies??

edited May 2005 in Gaming
Ok, well, after making the wrong choice of getting an FX5900 Ultra 256MB card, I went out and got a PNY GeForce 6600GT 128MB card, and sold my Crucial CL3 RAM for OCZ Ultra CL2.5 RAM, which is 2x512MB PC3200.

Thing is, I set my graphics card to max and set Half-Life 2 to max, but I don't know what it is: the game still doesn't look smooth, it still looks like it's got jaggies and looks a bit sharp...

What are the best settings to use for this game?? I have set it to 16x anisotropic instead of trilinear; which is better?? And even though my card is set to max anti-aliasing of 8xS, the game is set to 6x, which I did myself, as the game does not enable anti-aliasing by default for some reason!!

I am using the latest Nvidia drivers and BIOS, etc.

Comments

  • Mt_Goat Head Cheezy Knob Pflugerville (north of Austin) Icrontian
    edited May 2005
    Sounds like you need to back off the game settings a notch. What res are you running the game at?
  • SoLo DirtySouth, USA
    edited May 2005
    mtgoat wrote:
    Sounds like you need to back off the game settings a notch. What res are you running the game at?
    Yes, I would back off the maxed-out anti-aliasing. Leave it low; I always have mine low and still get great gameplay. REMEMBER, you have a 6600GT, not a 6800 Ultra..

    Also try Nvidia Settings > Performance and Quality Settings and set the application profile to doom3.exe. I have my card on this setting and have found that the card runs VERY fast and hard with this profile loaded. And this does not overclock the card... overclocking is another story.

    Is your card overclocked at all?
  • edited May 2005
    Hi guys, just checked the settings and put them on the recommended ones, and it's all fine now! But to be honest, the walls and buildings are nice, but all in all Half-Life 2's graphics aren't all that spectacular... I was expecting something more amazing. Oh well...

    So if I put my card's settings on the Doom 3 profile, it should be good?

    The card is overclocked.. I have put it slightly lower than what Nvidia's own settings go up to, as I have unlocked the software overclocking utility.
  • SoLo DirtySouth, USA
    edited May 2005
    Well, if you're asking me about the graphics in HL2: I had the game for a day, then sold it to my brother b/c I did not like the gameplay, and the graphics to me were bland. It ran fast and was fun, but I did not like the engine.. but I'm not BASHing the game.. lol
  • SoLo DirtySouth, USA
    edited May 2005
    Oh yeah, and with my Apollo 6600GT, when I overclocked I did as you did, by auto-setting the overclock and then backing it down a little...
  • Mt_Goat Head Cheezy Knob Pflugerville (north of Austin) Icrontian
    edited May 2005
    Then I suggest you guys get better cards and crank up the settings. Then you will see what it is all about! ;)
  • SoLo DirtySouth, USA
    edited May 2005
    mtgoat wrote:
    Then I suggest you guys get better cards and crank up the settings. Then you will see what it is all about! ;)
    LoL, all the newest games play GREAT with my card. Maxed-out graphics, ALWAYS!!!
    Sorry to burst your bubble, Goat, but the 6600GT is a kick-ass card!!
  • Mt_Goat Head Cheezy Knob Pflugerville (north of Austin) Icrontian
    edited May 2005
    solo23 wrote:
    LoL, all the newest games play GREAT with my card. Maxed-out graphics, ALWAYS!!!
    Sorry to burst your bubble, Goat, but the 6600GT is a kick-ass card!!
    Solo

    I am not in a bubble. I compared a 6600GT and my current X800XL first-hand in my own machine, and there is a world of difference!

    Sorry dude.
  • lemonlime Canada Member
    edited May 2005
    ATI's X8xx series of cards seem to have an edge in HL2 and other DirectX games.

    16x anisotropic and 8x anti-aliasing is VERY high for a midrange card, especially in HL2. If you want the resolution above 1024x768, you'll likely have to take things down a bit. 4x/8x anisotropic and 2x anti-aliasing does clean things up a bit, and keeps the frame rates high.

    Another side note: if things look a little jaggy, ensure that you have vsync turned on, which caps the frame rate at your monitor's refresh rate (typically 60fps) and prevents tearing. You'll notice all sorts of jaggies and funny behavior when you get in small rooms or close to walls. For some reason, they leave it off by default in the settings.
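    If you'd rather not rely on the in-game menu, you can force these settings from Half-Life 2's config file (hl2/cfg/autoexec.cfg), which runs at launch. A minimal sketch; the cvar names below are from memory of the Source engine and worth double-checking in the developer console before relying on them:

```
// autoexec.cfg -- applied when the game starts (cvar names assumed, verify in console)
mat_antialias 2      // 2x MSAA; 0 disables anti-aliasing
mat_forceaniso 8     // 8x anisotropic filtering
mat_vsync 1          // sync to the monitor's refresh rate to avoid tearing
```

    Setting these here keeps the game from silently resetting them when it re-detects your hardware.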
  • edited May 2005
    Thanks for the heads-up. Talking about graphics cards... what's the point in a normal 6800 when a 6600GT is more powerful??
  • Mt_Goat Head Cheezy Knob Pflugerville (north of Austin) Icrontian
    edited May 2005
    Thanks for the heads-up. Talking about graphics cards... what's the point in a normal 6800 when a 6600GT is more powerful??
    The plain 6800 is not at all less powerful. Let's look at the numbers other than core and memory clock, which can be deceiving if you don't know or understand everything else behind them. First, the 6600GT has a 128-bit memory interface and the 6800 has a 256-bit memory interface, which means double the bus width: twice as much data can flow at one time per memory clock. It doesn't do any good to be faster if you can't get out of the gate. Then, the 6600GT has 8 pixel pipelines while the 6800 has 12. Instead of an 8-lane highway, the 6800 has a 12-lane highway, so even more information can move at one time.

    6600GT
    • Core Speed: 500 MHz
    • Memory Speed: 500 x 2 = 1000 MHz DDR
    • 128-bit interface
    • 8 pixel pipelines

    6800
    • Core Speed: 325 MHz
    • Memory Speed: 350 x 2 = 700 MHz DDR
    • 256-bit interface
    • 12 pixel pipelines

    Which one do you think is capable of doing more work faster?
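    The arithmetic behind that comparison is easy to check. A quick back-of-envelope sketch in Python, using only the stock clocks listed above (real-world performance also depends on the core architecture, drivers, and the game itself):

```python
# Rough theoretical throughput comparison of the 6600GT and the plain 6800,
# using the stock numbers quoted in the post above.

def mem_bandwidth_gbs(effective_mhz, bus_bits):
    """Peak memory bandwidth in GB/s: effective clock (MHz) x bus width (bytes)."""
    return effective_mhz * 1e6 * (bus_bits / 8) / 1e9

def fillrate_mpix(core_mhz, pipelines):
    """Theoretical pixel fillrate in Mpixels/s: core clock x pixel pipelines."""
    return core_mhz * pipelines

# 6600GT: 1000 MHz effective DDR on a 128-bit bus, 500 MHz core, 8 pipes
print(mem_bandwidth_gbs(1000, 128))   # 16.0 GB/s
print(fillrate_mpix(500, 8))          # 4000 Mpix/s

# 6800: 700 MHz effective DDR on a 256-bit bus, 325 MHz core, 12 pipes
print(mem_bandwidth_gbs(700, 256))    # 22.4 GB/s
print(fillrate_mpix(325, 12))         # 3900 Mpix/s
```

    So by this back-of-envelope math the two cards have nearly identical raw fillrate (4000 vs 3900 Mpix/s), but the 6800 has roughly 40% more memory bandwidth (22.4 vs 16.0 GB/s), which is exactly what high resolutions and anti-aliasing eat up.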
  • SoLo DirtySouth, USA
    edited May 2005
    mtgoat wrote:
    Solo

    I am not in a bubble. I compared a 6600GT and my current X800XL first-hand in my own machine, and there is a world of difference!

    Sorry dude.
    That's like comparing apples to oranges!!! A $125 video card compared to a $250 card.. lol
  • lemonlime Canada Member
    edited May 2005
    SoLo wrote:
    That's like comparing apples to oranges!!! A $125 video card compared to a $250 card.. lol

    The X800XL certainly costs more, but the performance jump is quite large, especially in DirectX games. The X800XL has the full 16 pixel pipelines open, versus 8 on the 6600GT. Most 6600GTs I've seen are around $150+ USD. It is worth the extra hundred bucks. 6600GTs do really well in SLI though :D, as long as your games support it.
  • Medlock Miramar, Florida Member
    edited May 2005
    Back to the original topic, I've been told that if a graphics card can't handle the settings specified by the user, Half-Life 2 dynamically lowers them to an acceptable level. I would imagine that 8x anti-aliasing is much too high for a mid-range card. I'd use 4x at the max; 2x would probably be better and still perform great. And since HL2 won't have to lower texture resolutions and such, the game will look better anyway. :thumbsup:

    Back off on the AA/AF settings, then complain about the graphics. :p

    EDIT:
    6800 > 6600GT, even at stock. With the 6800 there's the possibility of unlocking 4 more pixel pipelines. It's an overclocker's dream to get a 6800, unlock the last pixel pipelines, and overclock the hell out of it. But it will never be as fast as a 6800GT, because it uses DDR memory whereas the GT uses GDDR3.
  • SoLo DirtySouth, USA
    edited May 2005
    Anti-aliasing, I never use this setting..
  • lemonlime Canada Member
    edited May 2005
    Even 2x anti-aliasing will make a noticeable improvement in image quality. Take a look at some near-horizontal edges with and without anti-aliasing.. 2x is not too hard on most modern graphics cards..
  • edited May 2005
    Well, here is the thing... Half-Life 2 turns off AA by default, and unless I max out AA in my Nvidia settings, or at least turn it on, I get jaggies in the game.

    Why does Half-Life 2 turn AA off by default?? The scenery graphics are great, but the characters and the things you use in the game look pretty old hat.
  • SoLo DirtySouth, USA
    edited May 2005
    I would think it auto-turns off AA b/c it auto-detects your hardware.. and sets the settings based on what it detects.. Don't most games set AA off by default?