Nvidia says AMD is punishing gamers: Promptly gets bitch slapped

Comments

  • mas0n howdy Icrontian
    edited September 2009
    If the Radeon 5870 is punishment, then AMD can bend me over and spank me. Spank me hard, AMD; I have been a very naughty boy.

    Awesome.
  • Thrax 🐌 Austin, TX Icrontian
    edited September 2009
    If the Radeon 5870 is punishment, then AMD can bend me over and spank me. Spank me hard, AMD; I have been a very naughty fanboy.

    Fixt
  • _k P-Town, Texas Icrontian
    edited September 2009
    Um, Cliff, is there something you want to tell us?
  • Komete Member
    edited September 2009
    K, let him keep it to himself.

    Really starting to lose respect for Nvidia.
  • Cliff_Forster Icrontian
    edited September 2009
    All humor aside,

    What does everyone think of a company that approaches a journalist to offer a list of preferred bullet points about a competitor's product? Would anyone defend the ethics of that? What if Mr. Baxtor had just rolled over and given them what they wanted for fear of losing NVIDIA review samples, advertising revenue, whatever?

    NVIDIA should be interfacing with the press on the merits of their own products, not the competition's. They should compete by making better products, not by coaxing journalists to frame articles about a competitor's product in a way they find favorable.

    Of course Mr. Baxtor was not going to just say, "DX11 is pointless because most games are just DX9 ports," because he would have lost all credibility with the enthusiast community. That is why this is so ridiculous.

    Well played Mr. Baxtor, well played.
  • lordbean Ontario, Canada
    edited September 2009
    I like Baxtor's comments about PhysX. They strike at the heart of the issue, and he makes a point of highlighting the only real thing PhysX acceleration does for Nvidia: it increases their 3DMark Vantage score. As some people may have noticed, this is even accomplished in a fashion I would consider cheating: when GPU PhysX is enabled, the graphics card helps the CPU tests render more frames per second. What was supposed to be a test of your raw processor power becomes a cheap-ass way to inflate the Vantage score of an Nvidia system.

    /rant off
  • Thrax 🐌 Austin, TX Icrontian
    edited September 2009
    No, on a more serious note, you and Mr. Baxtor are absolutely correct, Cliff.

    It is completely improper for a company to coerce, suggest, or otherwise hint at the need for a favorable statement.

    As a journalist, I feel I have three huge responsibilities (no particular order):

    1) Write with honesty: I will never lie, distort or manipulate the facts. If I cannot determine a statement's validity, I will clearly indicate that its credibility is in question.

    2) Write with integrity: I will never be coerced, manipulated or influenced to alter my opinion. Any company that tries (in any manner) will get ratted out to 500,000 (and growing) readers, Reddit, Digg, and any other site that will run the story.

    3) Write with equity: I will never provide biased coverage. My job is to write about the industry, and any movements within it are equally deserving of note.

    No company is above my responsibilities and no company can change them.
  • ardichoke Icrontian
    edited September 2009
    Oh, Nvidia. I used to be solidly in your camp when it came to graphics cards. The utter uselessness of PhysX, the bug-ridden drivers, and now this ridiculous attempt to criticize your competitor simply for beating you to market have ensured my next card will not be sporting one of your chips.
  • Thrax 🐌 Austin, TX Icrontian
    edited September 2009
    My next card will be sporting their chip if it's faster than AMD's. :P All I care about is IQ and speed.
  • djmeph Detroit Icrontian
    edited September 2009
    These guys are all used to being able to buy favorable reviews from media outlets like Tom's Hardware and Maximum PC magazine. I'm willing to bet hardware manufacturers have been doing shit like this for years. It's just good that they're getting called out on their bullshit now.
  • edited September 2009
    I have to agree with Nvidia on this one. A card rushed out the door to satisfy those that HAVE to have a DX11 card ASAP kinda sucks, considering it will be surpassed in next to no time. I'm content with my dual 285s for now. Once we see some MAJOR performance gains... then I will upgrade, be it ATI or Nvidia.
  • lordbean Ontario, Canada
    edited September 2009
    It could just be me, but a near-50% increase over a GTX 285 seems like a MAJOR performance gain to me, especially when you consider the 5870 is damn near the same price as a GTX 285.
  • primesuspect Beepin n' Boopin Detroit, MI Icrontian
    edited September 2009
    Mastershredder, I think that even the most obstinate consumer can plainly see that, DirectX 11 support notwithstanding, the 5870 represents a giant leap forward in performance. Any way you look at it, there are major performance gains.
  • edited September 2009
    nvidia rules
  • edited September 2009
    Thrax wrote:
    My next card will be sporting their chip if it's faster than AMD's. :P All I care about is IQ and speed.

    ^^^^^^^^^^^^^^^^^^

    ATI and Nvidia have been competing like this for more than a decade. Here is what I can remember from the last decade; I owned most of the cards below.
    Radeon 8500 -> Geforce 4 Ti 4200 -> Radeon 9700 Pro -> Geforce 6800 -> Geforce 7900 -> Radeon X1950 -> Geforce 8800 -> GTX280 -> Radeon HD4800 -> Radeon HD5800 -> ..... and it will go on.
  • GnomeWizardd Member 4 Life Akron, PA Icrontian
    edited September 2009
    mirage wrote:
    ^^^^^^^^^^^^^^^^^^



    I would say speed is most important. I am running a 4850, and I don't think I'll be switching over to Nvidia's side this next gen either.
  • edited September 2009
    GnomeWizardd wrote:
    I would say speed is most important. I am running a 4850, and I don't think I'll be switching over to Nvidia's side this next gen either.

    I currently have both an HD4850@850/1200 and a GTX260-216@725/1450/1200. The HD5800 seems like a significant upgrade over both cards, but I have no motivation to upgrade with the current games. I will wait at least 6 more months. I agree that ATI will most likely dominate the next Nvidia generation as well. This has happened before: when ATI had no significant success between the X1950 and the HD4800, Nvidia dominated for two generations.
  • Cliff_Forster Icrontian
    edited September 2009
    TweakTown update -

    Added this morning; I quote:

    "UPDATE - NVIDIA's PR company for Asia Pacific in Singapore, CIZA Concept, has contacted us and claimed that the source of the questions was from them as "our observations, as industry watchers" and not NVIDIA. We always get these types of questions and they seem to read like they are from NVIDIA, so we are unsure. We were asked to state this in this news post, which has been done here.

    We were also contacted by phone about this by NVIDIA and we have offered them the chance to do a follow-up. We are in the process of preparing questions and will have them answered early next week in a new news post."


    Now, while I feel it is still fair to say that the PR company representing NVIDIA should not be offering a specific set of bullet points that they want a journalist to touch on about a competitor's product, it would be unfair for me to ignore that this may give NVIDIA some level of plausible deniability in the matter. Still, they may have some things to discuss with their PR firm.

    Apparently TweakTown is primed to offer NVIDIA a rebuttal, and I am also seeking comment.
  • lordbean Ontario, Canada
    edited September 2009
    So either the original set of questions was not from Nvidia, or someone in Nvidia's PR department has been fired for making a BIG mistake. Either way, we get to hear some things straight from the horse's mouth.
  • edited September 2009
    This is what I heard.
  • I won't even upgrade to the 5000 series. I'll just pick up an extra 4850 and Crossfire; that will be fine.
  • Starman Icrontian
    edited September 2009
    I'm still puttering about with an nvidia 7900.
  • Obsidian Michigan Icrontian
    edited September 2009
    Starman wrote:
    I'm still puttering about with an nvidia 7900.
    You poor bastard.
  • Obsidian Michigan Icrontian
    edited September 2009
    mirage wrote:
    ^^^^^^^^^^^^^^^^^^

    ATI and Nvidia have been competing like this since more than a decade. Here is what I can remember in the last decade. I owned most of the cards below.
    Radeon 8500 -> Geforce 4 Ti 4200 -> Radeon 9700 Pro -> Geforce 6800 -> Geforce 7900 -> Radeon X1950 -> Geforce 8800 -> GTX280 -> Radeon HD4800 -> Radeon HD5800 -> ..... and it will go on.
    Ummm, since when is any single-gpu HD 4800 series faster than the GTX280? The HD 4890 is debatable but in general, the GTX 200 series is still more powerful than the HD 4800 series.
  • edited September 2009
    I tried to list the significant releases in chronological order. There may be errors even in that order; the point is to illustrate the cycles in performance leadership.
  • UPSLynx :KAPPA: Redwood City, CA Icrontian
    edited September 2009
    Thrax wrote:
    My next card will be sporting their chip if it's faster than AMD's. :P All I care about is IQ and speed.

    This. I have my favorites among companies, but that won't stop me from buying the best chip. I'm waiting to see what Nvidia pulls out before I upgrade.

    Lordbean - I'm curious about your earlier comments about using PhysX to 'cheat'. You think it improper that hardware manufacturers are starting to build toward sharing the processing load across hardware? GPUs have an astounding amount of power today, most of which goes unused in typical situations. GPUs are taking excess processing responsibilities off of the CPU and taking care of business to yield higher overall performance for the user. This is the way the market is going, and every company out there is exploring its own methods of doing this. If the GPU has room to spare and can pull seemingly needless physics calculations off of the CPU, is this not a good idea?
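
    For anyone wondering what that offload actually looks like in code, here is a rough, hypothetical sketch (plain CUDA, not NVIDIA's actual PhysX library): a toy kernel that integrates a million particles under gravity on the GPU, the sort of embarrassingly parallel work a card with cycles to spare can take off the CPU.

    // Hypothetical sketch, not NVIDIA's PhysX code: a toy per-particle
    // physics step showing the kind of work a GPU can take off the CPU.
    #include <cuda_runtime.h>
    #include <cstdio>

    __global__ void integrate(float *pos, float *vel, int n, float dt, float g)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) {
            vel[i] += g * dt;        // apply gravity to this particle's velocity
            pos[i] += vel[i] * dt;   // advance its position
        }
    }

    int main()
    {
        const int n = 1 << 20;              // one million particles
        size_t bytes = n * sizeof(float);
        float *h_pos = new float[n]();      // host copies, zero-initialized
        float *h_vel = new float[n]();

        float *d_pos, *d_vel;               // device copies
        cudaMalloc(&d_pos, bytes);
        cudaMalloc(&d_vel, bytes);
        cudaMemcpy(d_pos, h_pos, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(d_vel, h_vel, bytes, cudaMemcpyHostToDevice);

        // One 60 Hz simulation step runs on the GPU while the CPU stays free
        // for game logic, AI, and everything else.
        integrate<<<(n + 255) / 256, 256>>>(d_pos, d_vel, n, 1.0f / 60.0f, -9.8f);
        cudaDeviceSynchronize();

        cudaMemcpy(h_pos, d_pos, bytes, cudaMemcpyDeviceToHost);
        printf("particle 0 position after one step: %f\n", h_pos[0]);

        cudaFree(d_pos);
        cudaFree(d_vel);
        delete[] h_pos;
        delete[] h_vel;
        return 0;
    }

    Whether spending those spare GPU cycles on extra physics is worth it is a separate question from whether it belongs in a CPU benchmark score, which is really lordbean's complaint.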
  • edited September 2009
    I can only commend Nvidia for their development of CUDA and for porting Ageia's PhysX. GPU computing has been a very important development, not only in games but especially for scientific computation. I am sure the Folding guys can testify to that.
  • Cliff_Forster Icrontian
    edited September 2009
    Whenever I read a graphics card review, I find the framerates measured at the resolutions I play at to be more important than the 3DMark score. Sure, when some 3DMark record is broken in some crazy overclocking competition I take notice, more because of what's going on with the extreme systems and the level of overclocking savvy, but as a real-world measurement, 3DMark is not as good as testing with real game timedemos.
  • lordbean Ontario, Canada
    edited September 2009
    Obsidian wrote:
    Ummm, since when is any single-gpu HD 4800 series faster than the GTX280? The HD 4890 is debatable but in general, the GTX 200 series is still more powerful than the HD 4800 series.

    Have to agree. From everything I've seen, the Radeon HD4890 is closer to a GTX 275. GTX 280 / 285 are superior cards.
    Cliff_Forster wrote:
    Whenever I read a graphics card review, I find the framerates measured at the resolutions I play at to be more important than the 3DMark score. Sure, when some 3DMark record is broken in some crazy overclocking competition I take notice, more because of what's going on with the extreme systems and the level of overclocking savvy, but as a real-world measurement, 3DMark is not as good as testing with real game timedemos.

    Also agree with this. Gamers tend to consider graphics upgrades in terms of the gains they will see in the games they play, not in synthetic benchmarks or games they have no interest in. As a result, the "overall best" card may not necessarily be the one chosen.