All this waiting... for that? Why?!?!


Comments

  • edited April 2010
    Not entirely sure today is the actual launch day. Nvidia said the week of the 12th... either way, Newegg and other retailers have been selling cards since last week. They seem to sell out within 10 minutes of being posted online.

    I guess people want their space heaters.
    But the heat and noise issue has been blown way out of proportion.
    :scratch:
  • Sledgehammer70 California Icrontian
    edited April 2010
    I was only keeping Cliff's mouth shut, as that was what he was focusing on.
  • Cliff_Forster Icrontian
    edited April 2010
    I was only keeping Cliff's mouth shut, as that was what he was focusing on.

    I was going to stay out of it, but since you brought me up....

    My 5870's load temperatures are a good 30% lower in similar "real world" gaming tests when the CCC controls the fan at near-silent levels.

    When I turn the fan up to 40% manually, I can't get the thing much past the mid-40s °C.

    If we are doing a comparative analysis of the more efficient design, I'm not sure how you're going to paint a picture where the 5870 does not win that battle handily.

    Listen, Fermi falls short of expectations no matter how you want to shape the argument. Nvidia over-promised and under-delivered. If you were to believe the pre-launch hype, Fermi was going to flat out whip it out and piss on the Radeon 58xx series. Sledge, that's just not the case.

    This is how it is today, without the Radeon parts refresh that's likely to happen in the next few months. Fermi is grossly inefficient by comparison, gaining a lousy 15% at most over AMD cards that have been out for at least six months. When AMD refreshes, what do you think will happen? Let's play a little devil's advocate: the whole Nvidia fanboy argument for months has been, hey, let's wait and see about Fermi. Now that it's here, why not just wait and see what the Radeon refresh is going to be like?

    See, the difference is that AMD's marketing department is honest with its customers. No outlandish claims coming from their camp. By the time Fermi cards are plentiful enough for Nvidia holdouts to purchase, AMD will likely have a parts refresh with a performance gain that at the very least narrows the gap, and at best beats them while maintaining a more efficient overall design.
  • Sledgehammer70 California Icrontian
    edited April 2010
    I never said it was the case... I only said the heat and sound issue was blown out of proportion. But leave it to Cliff to make it a battle... :)

    It may have fallen short of your expectations, but over the years I have learned that you never listen to marketing claims until the product is in your hands. Fermi may not be what marketing claimed it would be 6 months ago, but Nvidia does have a card that is DX11 and is slightly faster currently, with a very young tech that has a lot of room to grow. In the end you should be happy, as your precious ATI cards will just become cheaper for you to own.


    Also, if you think AMD's marketing has been truthful over the years, you must have your blinders on...
  • Thrax 🐌 Austin, TX Icrontian
    edited April 2010
    The point of marketing is to be truthful only to the extent that it can't be called inaccurate.
  • Cliff_Forster Icrontian
    edited April 2010
    Also, if you think AMD's marketing has been truthful over the years, you must have your blinders on...

    Give me one example in recent memory that even comes close to Fermi... (and the 2900XT does not count; after all, that was the old ATI :wink:)
  • Cliff_Forster Icrontian
    edited April 2010
    Thrax wrote:

    Hmmmmm, you might not agree with how they positioned that argument, but I would not accuse them of under-delivering to the customer here?

    I think we all fundamentally agree that an open physics API would be the best case scenario for gamers?
  • Thrax 🐌 Austin, TX Icrontian
    edited April 2010
    You asked for some AMD marketing that is as specious and ruthlessly close to a lie as Fermi has been at times. I'd say accusing your competitor of a grand conspiracy to bribe developers is on the level.
  • Cliff_Forster Icrontian
    edited April 2010
    Thrax wrote:
    You asked for some AMD marketing that is as specious and ruthlessly close to a lie as Fermi has been at times. I'd say accusing your competitor of a grand conspiracy to bribe developers is on the level.

    Okay, I'm not sure there isn't something to it, though :wink:

    Every time I see a "The Way It's Meant to Be Played" logo in front of a title, I want to hurl...
  • SnarkasmSnarkasm Madison, WI Icrontian
    edited April 2010
    Every time I see you post in a graphics or CPU/chipset thread, it makes me want to throw all of my computers out a window.
  • Cliff_Forster Icrontian
    edited April 2010
    Snarkasm wrote:
    Every time I see you post in a graphics or CPU/chipset thread, it makes me want to throw all of my computers out a window.

    Then I know my work is done... :p
  • Thrax 🐌 Austin, TX Icrontian
    edited April 2010
    Okay, I'm not sure there isn't something to it, though :wink:

    Every time I see a "The Way It's Meant to Be Played" logo in front of a title, I want to hurl...

    And you're positive that all the involved game studios, publishing houses, developers, and NVIDIA employees are able to keep the conspiracy a secret?

    You know, the funny thing about conspiracies is that as you add more people, the secret... Well, you know.
  • Cliff_Forster Icrontian
    edited April 2010
    Thrax wrote:
    And you're positive that all the involved game studios, publishing houses, developers, and NVIDIA employees are able to keep the conspiracy a secret?

    You know, the funny thing about conspiracies is that as you add more people, the secret... Well, you know.

    I understand what you're saying. What if I said the "how" in "paid to do it" was lost in the translation?

    I'll just defer to my old avocado farm analogy.

    I grow 'em, you grow 'em, and they need to become tasty guacamole. I go to the finest Mexican kitchens to help 'em develop their recipe, and while I'm at it, I get a couple of my lackeys to bus the tables.

    In any economy, much less the current one, it's hard to say no to a little free labor. You know, probably totally legal, but ethical? A fine line.

    It's kind of like me getting a sale by slipping a guy a couple of tickets to the ball game. It works, it's a tool, some guys do it, but...

    So payment does not necessarily have to be hard cash; it's that program and the support they supply developers with. I mean, it's free labor: they show up with a few high-priced programmers and help them code. Nvidia says it's all for gamers, but honestly, it's to forward their competitive agenda by pushing proprietary tech and, in some cases, to refine the game code to be optimized for their specific hardware. I won't say that's entirely wrong, or evil, or anti-competitive or anything, but it's probably a fairly close interpretation of what "The Way It's Meant to Be Played" program is and why developers choose to partake in it. What's in it for them? Free labor.

    Slap that label on your box, get PhysX into the title, bork AMD's AA implementation, and we will send you a couple of top-notch programmers to clean some things up for you. The developer does the math, they see free labor, and they say, okay, well, if you're writing the code anyway, go ahead, stick that in our title. That's how the developer gets paid.

    One salesman has donuts and free baseball tickets, another has free labor. Both are sort of unethical means to achieve your business objectives, but hey, bribes work!
  • primesuspect Beepin n' Boopin Detroit, MI Icrontian
    edited April 2010
    Do you think that if AMD could afford to do that very same thing, they would?
  • ardichoke Icrontian
    edited April 2010
    If I had known this thread was going to go nuclear today I would have extended my vacation by a day. Alas, no popcorn for me.
  • Cliff_Forster Icrontian
    edited April 2010
    Do you think that if AMD could afford to do that very same thing, they would?

    Well, that's hard to say.

    That's the Nvidia argument: hey, we have a better developer relations program, and AMD does nothing but boo-hoo about it...

    AMD says: our developer relations program is superb as well, but you don't see us trying to break Nvidia's code...

    Each company has certain showcase titles. AMD has Eyefinity now, and they have showcased it on certain titles. I know in 2004 both the ATI camp and the Nvidia camp bet on different horses, with Half-Life 2 and Doom 3 coming out around the same time, and it was obvious that each game was optimized for a specific vendor's drivers.

    So, in short, I'm not saying AMD is not involved in a little of that, but at the same time, I don't buy the whole Nvidia spiel that they just do it to give gamers the best experience. They do it to forward their competitive agenda, and part of that is to get PhysX support into games. Get PhysX in the game, and we will send you more programmers... I can't speak for Richard Huddy, but when he said paid, I think that's more what he meant than some shady back-room deal with a briefcase full of cash and narcotics.

    That's all I'm saying. Why embrace a proprietary tech that strains your relationship with another partner, especially when there are other options available? Is it because it really offers the best experience, or is it because Nvidia will show up and do it for them just to screw AMD? So, when Richard Huddy says, hey, that's not good for gamers who may want to choose our products, maybe he at least has a little gripe?
  • Aaronage
    edited April 2010
    Hey everyone, first post here *waves*

    I did skip through a lot of this thread, simply because I'm getting deja vu from discussions like this in the past :P

    I just wanna say I've had my 5850 from launch(ish) and used the 9.11 drivers and up. I've had no major issues at all, all games play fine, no GSOD, and performance has been fantastic from day one. Performance has increased with every driver release, but really, you'd be splitting hairs to say that's a bad thing! For example, I gained around 20fps in DIRT 2 recently; it was fine before, but now... wow :D

    I really do believe the whole "ATI drivers suck" thing is baseless; as someone else said, maybe back in the old ATI days, but definitely not now.

    In the last few years I've had 2400, 2600, 4850, 4770, 4350, 5850, and 3200 IGP cards from the "new" ATI and had not one major driver issue. I'm not saying issues don't exist, but really, it's blown out of proportion.

    @kryyst Love the user pic, Radiohead FTW!

    Hey @Cliff_Forster :)
  • Cliff_Forster Icrontian
    edited April 2010
    Aaronage = the voice of reason.
  • Snarkasm Madison, WI Icrontian
    edited April 2010
    Why does everybody always say the person that agrees with them is the voice of reason?

    I'm happy that your ATI drivers may not suck, but mine do. I dare you to disprove me.
  • Cliff_Forster Icrontian
    edited April 2010
    Snarkasm wrote:
    Why does everybody always say the person that agrees with them is the voice of reason?

    I'm happy that your ATI drivers may not suck, but mine do. I dare you to disprove me.

    Oh Snark, when will you learn? I'm always right... ;)
  • Sledgehammer70 California Icrontian
    edited April 2010
    I can tell you firsthand, working in the gaming industry, that Nvidia works with developers better than AMD does. I don't need either side to say that; it's firsthand experience.

    Also, Cliff, you keep saying AMD marketing... so if you go back to Phenom... you do know the same marketing team has moved over to ATI.
  • Thrax 🐌 Austin, TX Icrontian
    edited April 2010
    Personal attacks are verboten.
  • Cliff_Forster Icrontian
    edited April 2010
    Fixed

    I can tell you firsthand, working in the gaming industry, that Nvidia works with developers better than AMD does. I don't need either side to say that; it's firsthand experience.

    Also, Cliff, you keep saying AMD marketing... so if you go back to Phenom... you do know the same marketing team has moved over to ATI.

    Ohhhhh, looks like someone's stopped being polite and started getting real...

    I honestly don't know why AMD keeps the ATI logo. Radeon is a brand that I believe is synonymous with quality graphics products. They probably have some great reason for it that I don't understand (after all, I am a moron), but I tend to call them AMD Radeon.
  • mirage
    edited April 2010
    I am trying to analyze my position on the Nvidia side. Is it because of so much negativity against Nvidia, or so much fanboyism for ATI? I know one thing: I usually tend to do the opposite of common trends.

    Actually, not this time. I really, really like the design idea behind Fermi, and I have a gut feeling that it will be a big success for the company in the following years. It will be used in many areas, from game consoles to number crunchers. You guys can call it late; I call it ahead of its generation. That is why it is late, hot, and requires the not-yet-commonly-employed features of DX11 to show its real muscle.

    Regarding developer relations: Cliff, you said this before, "cut the crap and show me the product", or something like that :) Regarding developers, there are CUDA, PhysX, and OpenCL (Nvidia is the second-highest contributor, after Apple). Show me the developer products of AMD.
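
    To give a concrete sense of what CUDA offers developers, here is a minimal vector-add program. It's only a generic sketch of the programming model (not code from any real title, and the sizes and values are arbitrary), but it should compile as-is with nvcc:

    ```cuda
    // Minimal CUDA example: add two vectors on the GPU.
    // Generic sketch only; build with: nvcc vecadd.cu -o vecadd
    #include <cstdio>
    #include <cuda_runtime.h>

    __global__ void vecAdd(const float *a, const float *b, float *c, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;  // one element per thread
        if (i < n) c[i] = a[i] + b[i];
    }

    int main() {
        const int n = 1 << 20;                  // ~1M elements (arbitrary size)
        const size_t bytes = n * sizeof(float);

        // Host buffers with easy-to-check contents: c[i] should equal 3*i.
        float *ha = (float *)malloc(bytes);
        float *hb = (float *)malloc(bytes);
        float *hc = (float *)malloc(bytes);
        for (int i = 0; i < n; ++i) { ha[i] = (float)i; hb[i] = 2.0f * i; }

        // Device buffers: copy in, run the kernel, copy out.
        float *da, *db, *dc;
        cudaMalloc(&da, bytes); cudaMalloc(&db, bytes); cudaMalloc(&dc, bytes);
        cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);
        vecAdd<<<(n + 255) / 256, 256>>>(da, db, dc, n);
        cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);

        printf("c[123] = %.1f (expected %.1f)\n", hc[123], 3.0f * 123);

        cudaFree(da); cudaFree(db); cudaFree(dc);
        free(ha); free(hb); free(hc);
        return 0;
    }
    ```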

    To everyone (not Cliff alone): I am not knocking AMD in any way. I cannot afford a GTX 480 right now, and the affordable graphics card I would buy/recommend is the HD 5770. But I just cannot stand this much unfair negativity/fanboyism.
  • Cliff_Forster Icrontian
    edited April 2010
    mirage wrote:
    I am trying to analyze my position on the Nvidia side. Is it because of so much negativity against Nvidia, or so much fanboyism for ATI? I know one thing: I usually tend to do the opposite of common trends.

    Actually, not this time. I really, really like the design idea behind Fermi, and I have a gut feeling that it will be a big success for the company in the following years. It will be used in many areas, from game consoles to number crunchers. You guys can call it late; I call it ahead of its generation. That is why it is late, hot, and requires the not-yet-commonly-employed features of DX11 to show its real muscle.

    Regarding developer relations: Cliff, you said this before, "cut the crap and show me the product", or something like that :) Regarding developers, there are CUDA, PhysX, and OpenCL (Nvidia is the second-highest contributor, after Apple). Show me the developer products of AMD.

    To everyone (not Cliff alone): I am not knocking AMD in any way. I cannot afford a GTX 480 right now, and the affordable graphics card I would buy/recommend is the HD 5770. But I just cannot stand this much unfair negativity/fanboyism.

    Listen, the main point I want to make is this, and it does not stem so much from Icrontic as from the graphics enthusiast community as a whole.

    Fermi was perceived to be the Jesus card: it was going to come and save us all from the doldrums of ordinary PC gaming, all while curing cancer on the side. I'm not making this up; it's been Nvidia's line for months.

    It comes out, a few reviewers are like, meh, it costs a 25% premium for a 10% performance bump, it's not available, and it runs really hot and draws more power than any chip before it, and still we have people who will go fight the battle for them online. Well, it's a little faster; who cares if it costs too much, is widely unavailable, and the chip is inefficient? I mean, seriously guys, I may be the ultimate AMD fanboy, but I am not in denial...

    Nobody here, absolutely nobody, can tell me that the initial expectations for Fermi have been met by Nvidia. You can't tell me that. Mirage, I'm not saying that there aren't some interesting things about the architecture. I'm just saying the marketing promised the world that Fermi and CUDA would perform miracles, that it would be worth the wait... Forgive me if I call 'em out on it now that the reviews are in.
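
    Just to show where that "25% premium for a 10% bump" framing comes from, here is the back-of-envelope math. The prices and frame rates below are illustrative assumptions (roughly the GTX 480's launch MSRP against a typical HD 5870 street price, and a hypothetical ~10% average fps edge), not measured results:

    ```cuda
    // Host-only back-of-envelope math (nvcc or any C++ compiler will build it).
    // ASSUMED figures: ~$499 GTX 480 launch MSRP, ~$399 HD 5870 street price,
    // and a hypothetical ~10% average fps edge for the GTX 480.
    #include <cstdio>

    int main() {
        const double price_gtx480 = 499.0;  // assumed launch MSRP
        const double price_hd5870 = 399.0;  // assumed street price, spring 2010
        const double fps_gtx480   = 66.0;   // hypothetical average fps
        const double fps_hd5870   = 60.0;   // hypothetical average fps

        const double premium = (price_gtx480 / price_hd5870 - 1.0) * 100.0;
        const double speedup = (fps_gtx480 / fps_hd5870 - 1.0) * 100.0;
        printf("price premium: %.0f%%, performance gain: %.0f%%\n",
               premium, speedup);
        printf("frames per dollar: GTX 480 = %.3f, HD 5870 = %.3f\n",
               fps_gtx480 / price_gtx480, fps_hd5870 / price_hd5870);
        return 0;
    }
    ```

    With those assumed numbers it prints a 25% premium for a 10% gain, and the 5870 delivers more frames per dollar (0.150 vs 0.132).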
  • GooD Quebec (CAN) Member
    edited April 2010
    Fermi was perceived to be the Jesus card: it was going to come and save us all from the doldrums of ordinary PC gaming, all while curing cancer on the side.

    I LoL'ed IRL :P

    I'm with Cliff on the fact that NVIDIA didn't deliver what they promised they would deliver some time ago. They tried, but they failed. It's as simple as that.

    It may not be a complete fail (that would have been being unable to launch a card at all), but it's a fail anyway. It may be a good first step toward a future product that *could* deliver what they were trying to achieve in the first place. But that also gives ATI plenty of time to adjust and create a competitive product too.

    Once the new cards from NVIDIA become more available, you won't be able to say ATI is king of the hill anymore, but ATI still has a huge advantage over NVIDIA for the coming months. I don't think NVIDIA will be king of the hill for long, since ATI is the next one to release another bunch of cards.

    BTW, to get back to my original topic: I've been to my friend's house, and he got his replacement for his 5850; it doesn't GSOD anymore. When he was talking to me I thought it was the same problem with his new card, but it is not. Since he went to 10.3 with his new card he gets blue screens, more specifically crash-dump blue screens. I haven't been able to do much testing, but it may not be the video card's fault anymore.

    The blue screens occur less frequently than the GSODs did, and not with just any game. Dragon Age is the game that is causing his computer to blue screen very frequently.

    And I was thinking we were getting rid of blue screens; it has been like 8 years since I saw my last one, lol.

    Anyway, it was just to let you know ;) I'll do more benchmarks and diagnostics, like memtest, to see if his RAM could be the problem, and Windows benchmarks to see if high load in Windows could cause it to blue screen too. So you can remove some of the bad comments about the drivers; it seems like the driver correctly removed the GSOD problem with a non-defective card ;)
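
    For the curious, the kind of crude userspace pattern test I have in mind looks something like the sketch below. The buffer size and bit patterns are arbitrary choices, and it's no substitute for a proper bootable memtest86+ pass, which tests RAM outside the OS:

    ```cuda
    // Host-only sketch of a crude RAM pattern test (nvcc or plain C++).
    // NOT a replacement for memtest86+: the OS remaps and caches memory,
    // so a clean pass here proves very little. Size/patterns are arbitrary.
    #include <cstdio>
    #include <cstdlib>
    #include <cstdint>

    int main() {
        const size_t n = (256u * 1024u * 1024u) / sizeof(uint64_t);  // ~256 MB
        uint64_t *buf = (uint64_t *)malloc(n * sizeof(uint64_t));
        if (!buf) { fprintf(stderr, "allocation failed\n"); return 1; }

        const uint64_t patterns[] = { 0x0ULL, ~0x0ULL, 0xAAAAAAAAAAAAAAAAULL };
        for (uint64_t p : patterns) {
            for (size_t i = 0; i < n; ++i) buf[i] = p;   // fill with the pattern
            for (size_t i = 0; i < n; ++i) {             // verify it reads back
                if (buf[i] != p) {
                    printf("mismatch at word %zu\n", i);
                    free(buf);
                    return 1;
                }
            }
        }
        printf("no errors in this (very limited) userspace pass\n");
        free(buf);
        return 0;
    }
    ```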

    Peace!
  • Cliff_Forster Icrontian
    edited April 2010
    Dragon Age is unique because it actually loads the CPU pretty heavily. I'm not saying the culprit isn't the card, just that Dragon Age is one of the few titles that will scale well at high resolutions when you add more central processing power. It seems to be a CPU-hungry title.
  • coldalarm England, UK
    edited April 2010
    I used to get a lot of Blue Screens in FO3.
    My problem was my graphics driver (it's why I've stayed on the 18x series), and once I'd rolled back from 19x to 18x everything's been fine.
  • lordbean Ontario, Canada
    edited April 2010
    Listen, the main point I want to make is this, and it does not stem so much from Icrontic as from the graphics enthusiast community as a whole.

    Fermi was perceived to be the Jesus card: it was going to come and save us all from the doldrums of ordinary PC gaming, all while curing cancer on the side. I'm not making this up; it's been Nvidia's line for months.

    It comes out, a few reviewers are like, meh, it costs a 25% premium for a 10% performance bump, it's not available, and it runs really hot and draws more power than any chip before it, and still we have people who will go fight the battle for them online. Well, it's a little faster; who cares if it costs too much, is widely unavailable, and the chip is inefficient? I mean, seriously guys, I may be the ultimate AMD fanboy, but I am not in denial...

    Nobody here, absolutely nobody, can tell me that the initial expectations for Fermi have been met by Nvidia. You can't tell me that. Mirage, I'm not saying that there aren't some interesting things about the architecture. I'm just saying the marketing promised the world that Fermi and CUDA would perform miracles, that it would be worth the wait... Forgive me if I call 'em out on it now that the reviews are in.

    I consider myself sufficiently neutral to comment here...

    I don't think Fermi was supposed to come across as the "Jesus" card, to be honest. Yes, nvidia claimed that their next generation would be a major step in PC advancement, but what else would we expect them to do in the situation they were in? Simply put, straight out, no bullshit: their DirectX 11 card was late, by a whopping 8 months. Whether that was an R&D fuckup, a supply problem, a fine-tuning problem, or simply a "we want it to be perfect" situation, I don't know, and I don't think anyone else here can say for sure either. I find it completely natural that nvidia would claim their card would put all others to shame. Think about it for a moment: is nvidia going to keep more future customers by telling the public that they're going to wait 8 months for a mediocre product, or by telling them that what they're going to get will smash all other cards to pieces?

    Personally, I gambled on my upgrade. I bought ATI hardware fully knowing that nvidia might well release a much more attractive card. But, like so many other people, I wanted DirectX 11 hardware when DirectX 11 was released, not 8 months later. I can honestly say, now that I've seen nvidia's offerings, that I'd choose the same upgrade path again, meaning my gamble has paid off. I'm sure many other people will agree with this.