NVIDIA's official statement about HL2

Spinner Birmingham, UK
edited September 2003 in Science & Tech
Nvidia has released an official statement in response to statements made yesterday by Valve Software regarding the GeForce FX line of video cards.
Over the last 24 hours, there has been quite a bit of controversy over comments made by Gabe Newell of Valve at ATI's Shader Day.

During the entire development of Half Life 2, NVIDIA has had close technical contact with Valve regarding the game. However, Valve has not made us aware of the issues Gabe discussed.

We're confused as to why Valve chose to use Release 45 (Rel. 45) - because up to two weeks prior to the Shader Day we had been working closely with Valve to ensure that Release 50 (Rel. 50) provides the best experience possible on NVIDIA hardware.

Regarding the Half Life 2 performance numbers that were published on the web, we believe these performance numbers are invalid because they do not use our Rel. 50 drivers. Engineering efforts on our Rel. 45 drivers stopped months ago in anticipation of Rel. 50. NVIDIA's optimizations for Half Life 2 and other new games are included in our Rel. 50 drivers - which reviewers currently have in beta form. Rel. 50 is the best driver we've ever built - it includes significant optimizations for the highly-programmable GeForce FX architecture and includes feature and performance benefits for over 100 million NVIDIA GPU customers.

Pending detailed information from Valve, we are only aware of one bug with Rel. 50 and the version of Half Life 2 that we currently have - this is the fog issue that Gabe referred to in his presentation. It is not a cheat or an over-optimization. Our current drop of Half Life 2 is more than 2 weeks old. NVIDIA's Rel. 50 driver will be public before the game is available. Since we know that obtaining the best pixel shader performance from the GeForce FX GPUs currently requires some specialized work, our developer technology team works very closely with game developers. Part of this is understanding that in many cases promoting PS 1.4 (DirectX 8) to PS 2.0 (DirectX 9) provides no image quality benefit. Sometimes this involves converting 32-bit floating point precision shader operations into 16-bit floating point precision shaders in order to obtain the performance benefit of this mode with no image quality degradation. Our goal is to provide our consumers the best experience possible, and that means games must both look and run great.
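
(To put the precision point in concrete terms: the short C++ sketch below - not NVIDIA driver code or Valve shader code, with sample values invented purely for illustration - round-trips a 32-bit float through a 16-bit half-float encoding, which is roughly what demoting a shader operation from fp32 to fp16 does to its operands.)

// Illustrative sketch only: round-trip a 32-bit float through a 16-bit
// half-float encoding (normal-range values only; denormals, infinities and
// NaN are simply flushed to zero here for brevity).
#include <cstdint>
#include <cstdio>
#include <cstring>

static uint16_t float_to_half(float f) {
    uint32_t bits;
    std::memcpy(&bits, &f, sizeof bits);
    uint32_t sign = (bits >> 16) & 0x8000u;
    int32_t  exp  = static_cast<int32_t>((bits >> 23) & 0xFFu) - 127 + 15;  // rebias exponent
    uint32_t mant = bits & 0x007FFFFFu;
    if (exp <= 0 || exp >= 31)
        return static_cast<uint16_t>(sign);           // out of fp16 range: flush to zero (sketch only)
    uint32_t mant10 = (mant + 0x1000u) >> 13;         // round 23-bit mantissa down to 10 bits
    if (mant10 == 0x400u) { mant10 = 0; ++exp; }      // rounding carried into the exponent
    return static_cast<uint16_t>(sign | (static_cast<uint32_t>(exp) << 10) | mant10);
}

static float half_to_float(uint16_t h) {
    if ((h & 0x7FFFu) == 0)
        return (h & 0x8000u) ? -0.0f : 0.0f;          // signed zero
    uint32_t sign = static_cast<uint32_t>(h & 0x8000u) << 16;
    uint32_t exp  = ((h >> 10) & 0x1Fu) - 15 + 127;   // rebias exponent back to fp32
    uint32_t mant = static_cast<uint32_t>(h & 0x3FFu) << 13;
    uint32_t bits = sign | (exp << 23) | mant;
    float f;
    std::memcpy(&f, &bits, sizeof f);
    return f;
}

int main() {
    float colour   = 0.73456789f;   // typical lighting/colour term (made-up value)
    float texcoord = 1024.37f;      // large texture coordinate (made-up value)
    std::printf("colour   fp32=%.7f  fp16=%.7f\n", colour,   half_to_float(float_to_half(colour)));
    std::printf("texcoord fp32=%.3f  fp16=%.3f\n", texcoord, half_to_float(float_to_half(texcoord)));
    return 0;
}

(On these made-up inputs the colour value survives with an error far below one 8-bit display step, while the large texture coordinate loses its fraction entirely - which is why such demotion has to be applied selectively rather than across the board.)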

The optimal code path for ATI and NVIDIA GPUs is different - so trying to test them with the same code path will always disadvantage one or the other. The default settings for each game have been chosen by both the developers and NVIDIA in order to produce the best results for our consumers.
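
(As a rough, hypothetical illustration of what a vendor-specific code path means in practice - this is not Valve's actual engine logic, and the path names and policy are invented - an engine might inspect the adapter's PCI vendor ID and pixel shader support and pick a render path like so:)

// Hypothetical sketch of per-vendor render path selection.
// The PCI vendor IDs are the real ones; everything else is invented for illustration.
#include <cstdint>
#include <cstdio>

enum class ShaderPath {
    DX81_PS14,       // fall back to DirectX 8.1, Pixel Shader 1.4
    DX9_PS20_FULL,   // full-precision DirectX 9, Pixel Shader 2.0
    DX9_PS20_MIXED   // PS 2.0 with fp16 partial precision where quality allows
};

ShaderPath pick_path(uint32_t pci_vendor_id, int ps_major_version) {
    const uint32_t VENDOR_NVIDIA = 0x10DE;
    if (ps_major_version < 2)
        return ShaderPath::DX81_PS14;        // hardware without PS 2.0 support at all
    if (pci_vendor_id == VENDOR_NVIDIA)
        return ShaderPath::DX9_PS20_MIXED;   // GeForce FX: mixed-precision path
    return ShaderPath::DX9_PS20_FULL;        // ATI and everyone else: generic DX9 path
}

int main() {
    std::printf("GeForce FX 5900 (vendor 0x10DE, PS 2.0) -> path %d\n",
                static_cast<int>(pick_path(0x10DE, 2)));
    std::printf("Radeon 9800 Pro (vendor 0x1002, PS 2.0) -> path %d\n",
                static_cast<int>(pick_path(0x1002, 2)));
    return 0;
}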

In addition to the developer efforts, our driver team has developed a next-generation automatic shader optimizer that vastly improves GeForce FX pixel shader performance across the board. The fruits of these efforts will be seen in our Rel. 50 driver release. Many other improvements have also been included in Rel. 50, and these were all created either in response to, or in anticipation of the first wave of shipping DirectX 9 titles, such as Half Life 2.

We are committed to working with Gabe to fully understand his concerns and with Valve to ensure that 100+ million NVIDIA consumers get the best possible experience with Half Life 2 on NVIDIA hardware.

Brian Burke
NVIDIA Corp.
Source - HardOCP

Thanks to Nightshade737 for the heads up on this one

Link:
http://www.hardocp.com/article.html?art=NTIw

Related News:
http://www.short-media.com/forum/showthread.php?s=&threadid=3413
http://www.short-media.com/forum/showthread.php?s=&threadid=3412

Comments

  • Khaos New Hampshire
    edited September 2003
    The optimal code path for ATI and NVIDIA GPUs is different - so trying to test them with the same code path will always disadvantage one or the other. The default settings for each game have been chosen by both the developers and NVIDIA in order to produce the best results for our consumers.
    The ATI benches were done using the DX9 codepath for Half-Life 2, not a specialized one. In other words, GF-FX drivers need special optimizations in the game's code in order to be even partially effective. NVidia is forgetting how 3Dfx first captured the market... Improve ALL games dramatically, regardless of optimizations! It's stupid that GF-FX needs optimized code just to play the game at all, nevermind play it well.
  • Omega65 Philadelphia, Pa
    edited September 2003
    Didn't Nvidia beat 3DFX over the head with this before (32bit precision vs 16bit precision)?!?
    Sometimes this involves converting 32-bit floating point precision shader operations into 16-bit floating point precision shaders in order to obtain the performance benefit of this mode with no image quality degradation. Our goal is to provide our consumers the best experience possible, and that means games must both look and run great.

    The optimal code path for ATI and NVIDIA GPUs is different - so trying to test them with the same code path will always disadvantage one or the other. The default settings for each game have been chosen by both the developers and NVIDIA in order to produce the best results for our consumers.

    Uh Huh! So basically in retrospect 3DMark03 was right about Nvidia and DX9....... :shakehead

    Hopefully NV40 will arrive and fix this before it gets REALLY bad for Nvidia.
  • edited September 2003
    Khaos said
    The optimal code path for ATI and NVIDIA GPUs is different - so trying to test them with the same code path will always disadvantage one or the other. The default settings for each game have been chosen by both the developers and NVIDIA in order to produce the best results for our consumers.
    The ATI benches were done using the DX9 codepath for Half-Life 2, not a specialized one. In other words, GF-FX drivers need special optimizations in the game's code in order to be even partially effective. NVidia is forgetting how 3Dfx first captured the market... Improve ALL games dramatically, regardless of optimizations! It's stupid that GF-FX needs optimized code just to play the game at all, nevermind play it well.

    Actually, as I remember it, 3DFX just had most of the developers use Glide because it was easier to program for. Once DirectX came around and had gotten considerably faster, 3DFX lost the edge and was toast.
  • pseudonym Michigan Icrontian
    edited September 2003
    Yeah, damn you DirectX, ruining all my fun with my 3dfx cards.....

    Pretty scummy of Valve not to use the Det50s, though. The least they could have done is use them; not like it was going to gain them 30 fps anyway.
  • RWB Icrontian
    edited September 2003
    Well, if NVidia's rep is telling the truth, which he could be, this shouldn't be an issue, period. I wouldn't call myself an NVidia disciple, but I have been a fan of their work ever since I tried buying an ATi card a while back and it wouldn't play well at all. Lately, though, I have seen ATi go above and beyond and become what NVidia USED to be.

    In the game of corporate life, everyone is a cheating SOB, but it's the one who isn't seen that way that wins out.

    I am not sure if Valve is out to get NVidia for some reason, but I know that NVidia has a recent history of cheating with their drivers to get ahead, and even their GPU is very secretive, so it is just way too easy to single them out.

    Either way it goes, I have a GF4 Ti4600, gonna keep it for a while, and according to the benchmarks it should play the game just as well as the GFFX lol.
  • Geeky1 University of the Pacific (Stockton, CA, USA)
    edited September 2003
    Rel. 50 is the best driver we've ever built - it includes significant optimizations for the highly-programmable GeForce FX architecture

    Translation:
    Rel. "50 is the most specialized driver we've ever built- it includes a significant number of cheats/hacks/tweaks for various benchmarks and games to make the highly-overrated GeForce FX look much better than it actually is."

    Sounds like nVidia is pulling an Intel to me... :rolleyes:
  • SimGuy Ottawa, Canada
    edited September 2003
    As you all know, AquaMark 3, the first REAL DirectX 9.0 benchmark based on a brand-new DirectX 9.0 game engine, will be released September 15.

    It should be interesting to see how the CrapForce FX performs against the latest offerings from ATI in the first REAL DirectX 9.0 benchmark that's not synthetic. :)

    One great feature of this benchmark: you can see where NVidia (and ATI) have cut corners in their DirectX 9.0 compliance with the feature called "SVIST."

    SVIST:

    This test changes the display output in a similar fashion to the overdraw test. Again aspects of the screen are shown in different colours to identify the technique used. (DirectX 9 hardware required and Pro License)

    Blue: Simple shading without Pixel Shader
    Yellow: Predominantly Pixel Shader 1.x
    Red: Predominantly complex shading using Pixel Shader 2.0

    I.e., you'll be able to see where drivers have been rewritten to use PS 1.x instead of having full DX9 compliance (i.e., PS 2.0).
  • SimGuy Ottawa, Canada
    edited September 2003
    (Benchmarks): http://www.driverheaven.net/articles/aquamark3/index2.htm
    (Quality Testing): http://www.driverheaven.net/articles/aquamark3/index3.htm

    WOW. Look at those optimizations fly in the new Detonators! 51.75 :D