Revisited: The NVIDIA Quadro FX 3800 unleashed

UPSLynx :KAPPA: Redwood City, CA Icrontian
edited April 2010 in Science & Tech

Comments

  • Sledgehammer70 California Icrontian
    edited April 2010
    It sucks to find that one setting that held it back, but it's good to see a wrap-up review giving it its proper dues. Game on :)
  • primesuspect Beepin n' Boopin Detroit, MI Icrontian
    edited April 2010
    Surprising that such an important setting is not highly publicized. I, for one, never think to check default driver settings; that's not something I would have caught, and I'm probably an "average" user.
  • UPSLynx :KAPPA: Redwood City, CA Icrontian
    edited April 2010
    I'm curious whether big studios prefer to operate with such a setting enabled or not. Knowing that might explain why it defaults to enabled.
  • photodude Salt Lake, Utah Member
    edited April 2010
    I wonder what the importance of v-sync is, and why you might want it on, especially considering that having it on lowers performance.
  • drasnor Starship Operator Hawthorne, CA Icrontian
    edited April 2010
    Enabling Vsync prevents tearing: the image in the graphics card framebuffer is updated in sync with the display's vertical refresh rate. When disabled, the image in the framebuffer can be updated while the screen is refreshing, which produces artifacts.
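
    Here's a toy sketch of that in Python (everything is invented for illustration; this isn't any real graphics API, just the timing idea):

        # Toy model of one screen refresh with and without vsync.
        SCANLINES = 1080    # lines the monitor draws per refresh
        SWAP_LINE = 400     # without vsync, the GPU may swap buffers mid-refresh

        def scanout(vsync):
            frame = "A"     # frame currently in the framebuffer
            screen = []
            for line in range(SCANLINES):
                # Without vsync, the buffer swap can land partway through
                # the refresh, so the rest of the screen shows a newer frame.
                if not vsync and line == SWAP_LINE:
                    frame = "B"
                screen.append(frame)
            return screen

        print(set(scanout(vsync=False)))  # {'A', 'B'}: a tear at line 400
        print(set(scanout(vsync=True)))   # {'A'}: one coherent frame, because
                                          # the swap waited for the vertical blank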
  • UPSLynx :KAPPA: Redwood City, CA Icrontian
    edited April 2010
    If you play an FPS on a system that can render the game at a rate faster than 60 frames per second, you can spin the mouse around really quickly and, if you look closely, you'll notice tears and artifacts in the images.

    Vsync prevents that; as Drasnor said, it locks the card to the display's refresh rate. It's all about throwing away overkill performance for clarity of visuals.
  • Thrax 🐌 Austin, TX Icrontian
    edited April 2010
    Which, as I have long wondered, raises the question of why it's not the default for AMD.
  • photodude Salt Lake, Utah Member
    edited April 2010
    Sounds like vsync is a valid and needed feature for games, to limit artifacts and tearing. But in a workstation environment, for CAD/CAM/BIM or DCC, is vsync still a setting needed to keep "clarity of visuals" at the cost of additional performance?

    If vsync is primarily a setting needed for avoiding game rendering artifacts, what logic is there in making it a default setting for a workstation graphics card?

    Overall, it seems vsync could be reimagined. Since it's about "throwing away overkill performance," would it not be better to program vsync not to throw away the extra performance, but to limit what portion of the GPU is being used and make the "extra performance" available for other tasks?
  • Thrax 🐌 Austin, TX Icrontian
    edited April 2010
    The point to consider about vsync is that any FPS above and beyond that of the refresh rate is *already* thrown away. Sixty hertz is sixty screen refreshes per second, just as 60 FPS is sixty frames per second; it's not possible to show 61, 65, or even 65,000 FPS on a 60Hz display.

    Performance dips work just the same with vsync, too. If an 80 FPS task drops you down to 60 FPS, a PC without vsync drops to 60; the same PC running vsync wouldn't budge from 60 FPS, because it's already perfectly capable of running 60 FPS, just not more. If a PC lost 30 FPS, down to 50, without vsync, then a PC with vsync will lose only 10 FPS. Enabling vsync doesn't mean the program or game automatically subtracts the same chunk of performance regardless of the FPS cap; it just means that any dip that isn't big enough to push the uncapped frame rate below 60 won't move the FPS at all with vsync.

    Vsync doesn't "sacrifice" or "throw away" performance; it just throws away the frames that get thrown away anyhow.
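
    Here's that arithmetic as a quick Python sketch (assuming a 60Hz panel, and ignoring corner cases like strict double-buffered vsync quantizing to divisors of 60; the numbers are illustrative):

        # A 60Hz panel can only ever show min(render_rate, 60) frames per second.
        REFRESH_HZ = 60

        def displayed_fps(render_fps):
            return min(render_fps, REFRESH_HZ)

        for fps in (80, 60, 50):
            print(f"renders {fps} fps -> shows {displayed_fps(fps)} fps")
        # renders 80 fps -> shows 60 fps  (the extra 20 were never visible)
        # renders 60 fps -> shows 60 fps  (a 20 fps dip costs nothing on screen)
        # renders 50 fps -> shows 50 fps  (a 30 fps dip costs only 10 visible fps)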
  • UPSLynx :KAPPA: Redwood City, CA Icrontian
    edited April 2010
    And to add to what Thrax just said, viewports in 3D applications will experience the exact same tearing/artifacting effects that a video game does.

    The GPU is still pulling its full weight with vsync enabled; it's simply taking extra steps to ensure that the user doesn't have an unwelcome experience.
  • drasnor Starship Operator Hawthorne, CA Icrontian
    edited April 2010
    Thrax wrote:
    The point to consider about vsync is that any FPS above and beyond that of the refresh rate is *already* thrown away. Sixty hertz is sixty screen refreshes per second, just as 60 FPS is sixty frames per second; it's not possible to show 61, 65, or even 65,000 FPS on a 60Hz display.
    ...
    Vsync doesn't "sacrifice" or "throw away" performance, it just throws away the frames that get thrown away anyhow.
    That's not correct, due to a misconception about how the picture is drawn on the screen. Imagine, if you will, two painters sitting side-by-side. One painter is rendering an original image (the graphics card) and the second painter is attempting to copy the original painter's image (the monitor). These two painters paint the same way a computer draws: rasterizing the painting one pixel at a time, left to right, top to bottom. When each painter reaches the bottom right corner of their canvas, they go back to the top left corner and start over on a new image.

    If your graphics card can render the image faster than your monitor can display it, the original painter will be painting faster than the second painter, and eventually we reach a point where portions of the second painter's image come from different paintings. This is fine for image quality if both paintings are more or less the same, but if they're different you get artifacts. The exact same phenomenon occurs, for the same reasons, if the original painter is painting more slowly than the second painter.

    The only real difference between this analogy and the way it really works is that the graphics card refreshes its entire image at once instead of pixel-by-pixel the way the monitor does. The end effect is the same, though. Functionally, it's not that Thrax's "extra frames" are thrown away; it's that only parts of them get drawn.
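
    To put numbers on the analogy, here's a tiny Python simulation (the rates are invented for illustration): if the original painter finishes three images in the time the copier takes to finish one, the copy ends up stitched together from three different paintings.

        # The "monitor" copies one line at a time from whatever image
        # the "graphics card" happens to hold at that moment.
        LINES = 10      # lines per image
        GPU_RATE = 3    # the GPU finishes 3 images per monitor refresh

        def one_refresh():
            copied = []
            for line in range(LINES):
                # which image the GPU is on when this line gets copied
                copied.append((line * GPU_RATE) // LINES)
            return copied

        print(one_refresh())
        # [0, 0, 0, 0, 1, 1, 1, 2, 2, 2]
        # One refresh stitched from parts of three frames: that's a tear.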

    Unfortunately, while enabling Vsync improves image quality, it also introduces some latency penalties. When the graphics card is drawing frames faster than the monitor, it must begin drawing a frame after some delay so that both it and the monitor finish drawing at the same time. Anything that "happens" between the beginning of that delay and the end of the image getting drawn is lost. It's even worse if the graphics card is slower than the monitor: you may see the same image drawn multiple times while the graphics card renders the next frame. That sort of behavior is usually alright for viewports but sucks for video games, since lots of things happen between frames getting rendered (you getting shot at, for instance).
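
    Back-of-the-envelope for that latency penalty, sketched in Python (this assumes simple double-buffered vsync on a 60Hz panel; the numbers are illustrative only):

        # A finished frame has to sit and wait for the next vertical blank.
        REFRESH_MS = 1000 / 60    # ~16.7 ms between vertical blanks

        def vsync_wait(render_ms):
            # extra ms a finished frame waits before it can be shown
            return REFRESH_MS - (render_ms % REFRESH_MS)

        print(f"{vsync_wait(5):.1f} ms")   # ~11.7 ms: a fast frame idles until the vblank
        print(f"{vsync_wait(20):.1f} ms")  # ~13.3 ms: a slow frame missed a vblank,
                                           # so the old image stays up for two refreshes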

    Bottom line, it's a trade-off.
  • photodude Salt Lake, Utah Member
    edited April 2010
    Considering what has been said... it would be nice to see benchmarks for ATI and NVIDIA with and without vsync, so they can be compared more equally.