StarCraft 2 and anti-aliasing: a tempest in a teapot

Comments

  • Sledgehammer70 California Icrontian
    edited July 2010
    I am pretty sure Nvidia worked with Blizzard.
  • EX
    edited July 2010
    Doesn't really matter. I have a three-year-old laptop and the game looks & runs great even on the lowest setting.

    I feel like today's gamers care about the stupidest crap.
  • primesuspect Beepin n' Boopin Detroit, MI Icrontian
    edited July 2010
    I think, in this case, gamers don't care, though.
  • ardichoke Icrontian
    edited July 2010
    Bah to anti-aliasing I say. BAH! I've never found it made graphics look all that much better, especially for the performance penalty one pays for using it.
  • Thrax 🐌 Austin, TX Icrontian
    edited July 2010
    I love anti-aliasing. I think it's one of the best features any GPU can offer. I can see jaggies from across the room (yes, really); I have a hard time UNseeing them.

    With all that in mind, I never noticed that AA wasn't functioning during my time in the SC2 beta. It was a total non-issue because Blizzard's engine looks fantastic without it.

    Talk about manufactured fuss.
  • Tim Southwest PA Icrontian
    edited July 2010
    Here's one more for the list of issues: you can't custom-bind keys yet. I don't like using the arrow keys to move the screen around; I want to use WASD like in WoW. It works so much better like that.
  • shwaip bluffin' with my muffin Icrontian
    edited July 2010
    You should be moving the screen either:
    1) by double-tapping your control group key, or
    2) with your mouse, either by scrolling or clicking the minimap.
  • pigflipper The Forgotten Coast Icrontian
    edited July 2010
    Shwaip: you are wasting your figurative breath.

    As for the whole AA thing, don't care, game looks great on my backup system @ low/med settings.
  • shwaip bluffin' with my muffin Icrontian
    edited July 2010
    For those of you running low/med settings, you should see it at high/ultra :)
  • Koreish I'm a penguin, deal with it. KCMO Icrontian
    edited July 2010
    I hate wifi. I can play the game on ultra just fine in single player, but get me playing multiplayer and I'm struggling on low.
  • Cliff_Forster Icrontian
    edited July 2010
    Does AA matter? Sure, it does to a degree, but not like it used to.

    At one time, when you were playing Counter-Strike at 800x600, 4x AA could do some remarkable things to clean up the image you saw on screen. When you combined lower resolutions with lower poly counts, the jagged lines could be a massive distraction.

    On modern games playing at 1920x1080 and beyond with massive poly counts, the edge detail is so fine that jagged lines are far less noticeable without AA. When it comes down to it, you always want to play at your monitor's native resolution when you can, with all the eye candy you can turn on while maintaining a constant 60 frames per second. If you're doing that, and the one sacrifice you make for performance is AA, it's not like the game experience is ruined anymore. The game just looks too good as it is, and to some degree too much AA can actually soften some games.

    Take Unreal Tournament III, a game where the designers ultimately decided not to build AA into the engine, figuring that users might opt to force it from the card's driver if they really felt it necessary. Truth is, that game looks a little sharper and more defined with it off. It might just be my personal preference, but it's a case where AA actually deadens the image a little bit.

    Point being, AA is a nice option to have, but it's no longer a deal-breaker for a great visual experience.
  • mirage
    edited July 2010
    Yeah, I agree with the sentiment here. AA doesn't matter, just like tessellation. Radeon is the best as always. Go AMD!
  • Sledgehammer70 California Icrontian
    edited July 2010
    In the case of an RTS, AA helps with direct shadows in the game engine; most of the other things use other means to smooth out edges. If you play with AA and then turn it off, you will see the difference.

    But as noted above, Blizzard has built the game to run on a slew of system types, so if you have not seen it, then you're probably not going to miss it.
  • Sledgehammer70 California Icrontian
    edited July 2010
    mirage wrote:
    Yeah, I agree with the sentiment here. AA doesn't matter, just like tessellation. Radeon is the best as always. Go AMD!

    Tessellation is a new technology that can have a huge impact on games. Not many games are using it currently, but many new titles are building their games with it. Regardless of whether it's AMD or Nvidia, both companies support the tech in their current product offerings.
  • AlexDeGruven Wut? Meechigan Icrontian
    edited July 2010
    primesuspect wrote:
    I think, in this case, gamers don't care, though.

    Precisely.

    Fanboys care.
  • Thrax 🐌 Austin, TX Icrontian
    edited July 2010
    I'd be surprised if we ever see a game that strongly leverages tessellation. As much as it sucks to hear this, it's not a feature developers intend to push.
  • mirage
    edited July 2010
    Thrax wrote:
    I'd be surprised if we ever see a game that strongly leverages tessellation. As much as it sucks to hear this, it's not a feature developers intend to push.

    No, it does not suck, because I don't agree.
  • edited July 2010
    AlexDeGruven wrote:
    Precisely.

    Fanboys don't care.

    Corrected for you :buck:
  • Sledgehammer70 California Icrontian
    edited July 2010
    I like to buy products that put the technology they're built with to use. The game looks great without AA, but it looks even better with it.
  • Thrax 🐌 Austin, TX Icrontian
    edited July 2010
    mirage wrote:
    No, it does not suck, because I don't agree.

    It really is a shame, because tessellation could do away with the trend of designing games for the lowest common denominator, or the "average" installed hardware base. Rather than using LOD levels or a pedestrian poly count, a heavily-tessellated engine could dynamically scale the detail appropriately for the hardware of the player. In other words, the player would always see the most detailed experience their PC can deliver in the given scene.

    That's why it sucks that game devs won't be using tessellation for less gimmicky things than water.
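
    To put numbers on the idea, here's a minimal sketch (Python; the frame-time budget and nudge factors are invented for illustration) of the kind of feedback loop an engine could use to scale tessellation to whatever the player's hardware can sustain:

    ```python
    # A rough sketch of driving tessellation from frame-time headroom.
    # TARGET_FRAME_MS and the 0.9/1.1 nudge factors are invented numbers.

    TARGET_FRAME_MS = 16.7              # budget for 60 fps
    MIN_FACTOR, MAX_FACTOR = 1.0, 64.0  # D3D11 caps the tess factor at 64

    def adjust_tessellation(factor: float, last_frame_ms: float) -> float:
        """Nudge the tessellation factor toward the frame-time budget."""
        if last_frame_ms > TARGET_FRAME_MS * 1.05:    # running slow: coarsen
            factor *= 0.9
        elif last_frame_ms < TARGET_FRAME_MS * 0.85:  # headroom: add detail
            factor *= 1.1
        return max(MIN_FACTOR, min(MAX_FACTOR, factor))

    # Toy frame loop: fast frames push detail up, slow frames pull it down.
    factor = 8.0
    for frame_ms in (12.0, 11.5, 13.0, 19.0, 21.0, 14.0):
        factor = adjust_tessellation(factor, frame_ms)
        print(f"frame {frame_ms:5.1f} ms -> tess factor {factor:.1f}")
    ```

    A real engine would adjust per patch rather than globally, but the principle is the same: detail follows measured headroom instead of a fixed LOD table.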
  • mondi Icrontian
    edited July 2010
    Thrax wrote:
    It really is a shame, because tessellation could do away with the trend of designing games for the lowest common denominator, or the "average" installed hardware base. Rather than using LOD levels or a pedestrian poly count, a heavily-tessellated engine could dynamically scale the detail appropriately for the hardware of the player. In other words, the player would always see the most detailed experience their PC can deliver in the given scene.

    That's why it sucks that game devs won't be using tessellation for less gimmicky things than water.

    It means that the player would always see whatever the graphics driver deemed appropriate. Water works because it's abstract enough that you can rely on the underlying implementation to produce fairly similar results regardless of the details. Anything that cannot rely on runtime geometry wouldn't work.

    Remember, a "highly tessellated" cube is a sphere.
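
    Here's that effect in a toy 2D form (Python; Chaikin corner-cutting stands in for a smooth subdivision scheme, and there's deliberately no crease data): subdivide a square a few times and its hard corners simply melt away.

    ```python
    # Toy 2D version of "a highly tessellated cube is a sphere": Chaikin
    # corner-cutting (a standard smooth-subdivision scheme) applied to a
    # square with no crease data -- the 90-degree corners melt away.
    import math

    def chaikin(points):
        """One round of Chaikin subdivision on a closed polygon."""
        out = []
        n = len(points)
        for i in range(n):
            (x0, y0), (x1, y1) = points[i], points[(i + 1) % n]
            out.append((0.75 * x0 + 0.25 * x1, 0.75 * y0 + 0.25 * y1))
            out.append((0.25 * x0 + 0.75 * x1, 0.25 * y0 + 0.75 * y1))
        return out

    def sharpest_corner_deg(points):
        """Largest turning angle left anywhere on the polygon, in degrees."""
        n, worst = len(points), 0.0
        for i in range(n):
            ax, ay = points[i]
            bx, by = points[(i + 1) % n]
            cx, cy = points[(i + 2) % n]
            turn = math.degrees(math.atan2(cy - by, cx - bx)
                                - math.atan2(by - ay, bx - ax)) % 360
            worst = max(worst, min(turn, 360 - turn))
        return worst

    square = [(0, 0), (1, 0), (1, 1), (0, 1)]
    for level in range(5):
        print(f"level {level}: {len(square):3d} verts, "
              f"sharpest corner {sharpest_corner_deg(square):5.1f} deg")
        square = chaikin(square)
    ```

    By level 4 the sharpest corner left is about 6 degrees; nothing in the data says "this edge was supposed to stay square."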
  • mirage
    edited July 2010
    Tessellation can be implemented in an adaptive way depending on the capability/power of hardware. For example, when the game detects an HD5870, it can scale tessellation back and do it more extensively with a GTX480. Of course, just like they did with AA, ATI can still decide to disable tessellation to avoid losing in the benchmarks. ;)
  • mirage
    edited July 2010
    mondi wrote:
    Remember, a "highly tessellated" cube is a sphere.

    A cube is always a cube. But a "low tessellated" sphere might look like a cube.
  • Sledgehammer70 California Icrontian
    edited July 2010
    Thrax wrote:
    It really is a shame, because tessellation could do away with the trend of designing games for the lowest common denominator, or the "average" installed hardware base. Rather than using LOD levels or a pedestrian poly count, a heavily-tessellated engine could dynamically scale the detail appropriately for the hardware of the player. In other words, the player would always see the most detailed experience their PC can deliver in the given scene.

    That's why it sucks that game devs won't be using tessellation for less gimmicky things than water.

    I agree with Thrax 100%. But I do know some devs are starting to get creative with it.

    It's also a shame to see people say AA is not important, when just a few years ago AA and AF were the top things. "OMG!!!! 32x AA"
  • Thrax 🐌 Austin, TX Icrontian
    edited July 2010
    mirage wrote:
    Tessellation can be implemented in an adaptive way depending on the capability/power of hardware. For example, when the game detects an HD5870, it can scale tessellation back and do it more extensively with a GTX480. Of course, just like they did with AA, ATI can still decide to disable tessellation to avoid losing in the benchmarks. ;)

    Or do it like NVIDIA and rely exclusively on tessellation benchmarks to fool the public into believing that the 7 months spent catassing around on a deeply flawed architecture was worth more than an 8% lead. ;)

    DIRECTX 11 DONE RITE, GUISZ.

    //edit: Now with 100% more citation.
  • mondi Icrontian
    edited July 2010
    mirage wrote:
    A cube is always a cube. But a "low tessellated" sphere might look like a cube.

    Same three vertices:

    1 - Cubic
    2 - Straight
    3 - Bezier

    Without any extra information as to what the tangents / vertex tension should be, we end up with 3 very different curves. Which one is correct?

    [attached image: plot of the three curves through the same three vertices]

    If you add tension info per vertex, then you're effectively doubling (possibly tripling, depending on the implementation) the data sent to the graphics card _before_ you calculate the per-vertex information. If you're going to do that, why not LOD the model and send the appropriate vertices directly, bypassing a large number of bus transfers and a whole bunch of calculations per vertex?
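
    To make the ambiguity concrete, here's a quick sketch (Python; the three points and the cubic's tangent choices are invented for illustration) running the same three vertices through three interpolation schemes:

    ```python
    # The same three vertices, three interpolation schemes, three different
    # curves. The points and the cubic's tangent choices are invented for
    # illustration -- which is exactly the missing information at issue.
    P0, P1, P2 = (0.0, 0.0), (1.0, 1.0), (2.0, 0.0)

    def hermite(p0, p1, m0, m1, s):
        """Cubic Hermite from p0 to p1 with tangents m0 and m1."""
        h00, h10 = 2*s**3 - 3*s**2 + 1, s**3 - 2*s**2 + s
        h01, h11 = -2*s**3 + 3*s**2, s**3 - s**2
        return tuple(h00*p0[i] + h10*m0[i] + h01*p1[i] + h11*m1[i] for i in (0, 1))

    def straight(t):
        """Piecewise-linear: actually passes through P1."""
        p, q, s = (P0, P1, 2*t) if t < 0.5 else (P1, P2, 2*t - 1)
        return (p[0] + (q[0] - p[0]) * s, p[1] + (q[1] - p[1]) * s)

    def bezier(t):
        """Quadratic Bezier: P1 acts as a control point and is never touched."""
        u = 1 - t
        return (u*u*P0[0] + 2*u*t*P1[0] + t*t*P2[0],
                u*u*P0[1] + 2*u*t*P1[1] + t*t*P2[1])

    def cubic(t, tension=1.0):
        """Catmull-Rom-style cubic through P1; shape depends on tension."""
        m1 = (tension * (P2[0] - P0[0]) / 2, tension * (P2[1] - P0[1]) / 2)
        flat = (0.0, 0.0)  # guessed endpoint tangents
        if t < 0.5:
            return hermite(P0, P1, flat, m1, 2*t)
        return hermite(P1, P2, m1, flat, 2*t - 1)

    curves = [("straight", straight), ("quadratic Bezier", bezier),
              ("cubic, tension 1", cubic),
              ("cubic, tension 3", lambda t: cubic(t, 3.0))]
    for name, f in curves:
        x, y = f(0.25)  # same parameter, four different places
        print(f"{name:16s}: at ({x:.3f}, {y:.3f}) a quarter of the way along")
    ```

    Same vertices, same parameter, four different positions. Without tangent data the hardware has no way to know which curve you meant.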
  • fatcat Mizzou Icrontian
    edited July 2010
    So... when is StarCraft 2: Episode 1 (or will it be Episode 2? And yes, I know they're called Heart of the Swarm and Legacy of the Void) coming out? 12 years from now? I want more single-player campaign (namely Protoss).
  • mirage
    edited July 2010
    Thrax wrote:
    Or do it like NVIDIA and rely exclusively ...

    Sure, I would be only too happy to see ATI have such good performance with tessellation and AA. After all, this is state of the art that improves 3D realism. That flawed architecture, as you call it, is the "best architecture" I have seen to date. They went beyond designing traditional DX11 co-processors to push 100+ fps in two-year-old games.
  • Frylock Washington St...not DC
    edited July 2010
    Well. I just beat the game and can't wait for the others. I loved it!
  • Obsidian Michigan Icrontian
    edited July 2010
    I was a bit let down by the short ending. After all the lengthy cutscenes leading up to it I expected more. At least the game itself was pretty damn fun.