7950gtx 2

Sledgehammer70 California Icrontian
edited May 2006 in Hardware
With the new Quad SLI-based 7950GTX 2, what will become of the graphics market? Will quad SLI become the standard in a few years? Will Quad SLI setups allow Nvidia to push its new physics efforts to the next level, allowing gamers and game developers to give us that much more realistic gaming experiences?

What do you think?

Comments

  • entropy Yah-Der-Hey (Wisconsin)
    edited April 2006
    I think these companies need to stop spending money trying to get US to spend money, and work on better technologies instead. You know there's something wrong with the market when the standard answer is to bolt two products together to get one good one.

    How about spending more of that money on R&D to actually enhance these things? Better yet, let's ditch the entire architecture and come up with, gasp, something NOVEL!

    I refuse to spend $500 on one graphics card only to have it not even be the best. You're telling me I can just keep buying this crap and "making it better"? Come again?

    Let's come up with a good idea here, folks.
  • bothered Manchester UK
    edited April 2006
    It's like the everlasting light bulb though, no money in it. Far better to have people keep buying new bulbs.
  • edited April 2006
    bothered wrote:
    It's like the everlasting light bulb though, no money in it. Far better to have people keep buying new bulbs.

    Exactly. There's a light bulb in Livermore CA that's burned continuously for going on 90 years. The company that made that particular bulb went out of business many years ago because their bulbs lasted too long: everyone who was going to buy that brand bought it and basically never needed more, so demand dried up and they went under.

    If the GPU companies come up with the killer GPU, everyone will buy what they need, migrate that GPU from machine to machine, and demand will drop low enough that the supply will cease. This is the whole reason for the forced migration from AGP to PCI-e. There's no reason to run PCI-e other than to support the new GPUs and SLI. If they made an AGP 7900GTX and you ran it in an AGP-based system, there'd be an upheaval of people scrapping plans to upgrade simply for better graphics when their PCs are still competent in every other respect. Sales would start to slump, so the hardware manufacturers come up with new ways to suck the dollars out of your pockets. Eventually there will come a time when PCI Express is leveraged to the benefit of the end user, but at present it just isn't happening. Right now PCI Express is being leveraged to the benefit of the suppliers: new sales.
  • RWB Icrontian
    edited April 2006
    Bulbs and GPUs aren't the same... a bulb is a bulb is a bulb, but a GPU can always be made faster and games can always look more real. There is no reason they cannot just build an amazing graphics card and then come out with a new one like they used to do... GeForce 2 GTS, Ultra, GeForce 3, GeForce 4, etc.
  • Leonardo Wake up and smell the glaciers Eagle River, Alaska Icrontian
    edited April 2006
    Will quad SLI become the standard in a few years?
    Dual SLI won't even become a standard. It's a fad, but it may lead to real advances in mainstream technology. I don't doubt at all that SLI (2 cards, 4 cards, 628 cards) brings performance improvements, but it's just too cumbersome, expensive, and power-hungry to become more than a niche market. I would liken it to the dual-CPU market of a few years back: it never penetrated far into the home PC user base.

    Why not a dual core GPU? Why not a dual GPU card?
    There's a light bulb in Livermore CA that's burned continuously for going on 90 years. The company that made that particular bulb went out of business many years ago because their bulbs lasted too long
    Sorry, sounds too much like the wonder-gizmos that will give cars 300 miles per gallon, that everyone knows were bought up by the "oil companies", but of which no one can provide any documentation whatsoever. Not saying you're wrong, but it smells HIGHLY of urban legend. I'd love to be proved wrong.
  • edited April 2006
    Sorry, my father's the source. The bulb is burning outside the main firehouse in Livermore and has a dedicated backup genny. Pops is the lucky bastich who put it in in the late '80s, when the bulb already had 75 years on it.

    Booyah!
  • edited April 2006
    RWB wrote:
    Bulbs and GPUs aren't the same... a bulb is a bulb is a bulb, but a GPU can always be made faster and games can always look more real. There is no reason they cannot just build an amazing graphics card and then come out with a new one like they used to do... GeForce 2 GTS, Ultra, GeForce 3, GeForce 4, etc.

    The GF2 series is about as close to the GF3 series as the GF3 series is to the 6 and 7 series. Miles different. The 2 series featured hardware T&L but no programmable shaders; the 3 series introduced programmable shaders, but only fixed-point ones; the 5 series introduced floating-point shaders.

    What they're doing now with the 7800, 7900, 7600, etc., is exactly what you're describing: incremental upgrades.

    I'm talking about something that's future-proof (as much as it's possible to anticipate such things) and powerful as hell, powerful enough to be more than you'll need for a couple of years. If they made such a card, then once everyone who was going to upgrade to it had bought one, there'd be no reason to buy more aside from failures, and the GPU company foolish enough to provide that card would go under. Sales would start as a flood, then a slow flow, then a trickle. Get my point?
  • Leonardo Wake up and smell the glaciers Eagle River, Alaska Icrontian
    edited April 2006
    I am truly impressed!

    But I still think that if it were a technology that could be mass produced at a price attractive to consumers, someone would mass produce it. A case in point is compact fluorescent lightbulbs. They last several times as long as incandescents, but they aren't all that popular, probably because of the initial cost. And boy, they really do save a lot of money in energy costs. I use them throughout the inside and outside of my home. But the average guy walking through WalMart or Home Depot will look at them, won't think in the long term, and will say to himself, "why would anyone pay that much for a lightbulb?" Still, as prices fall on these, more and more people are buying them. LED lightbulbs are already available too, and they last darn near forever. Hardly anyone is buying them, though, because they are very expensive.
  • edited April 2006
    That's the problem, though: the guys who sold the bulb I'm talking about made them at a reasonable price at a time when electric lighting wasn't widespread, so there was limited demand, and we're talking about 96 years ago. Think about it: if you had X number of households using electric lighting (say you're on an island) and you introduced LED bulbs at a price equal to two incandescent bulbs, you'd immediately sell every bulb that the people willing to pay twice as much for a lightbulb wanted. After that, unless those folks came up with a bunch more sockets to fill, demand would fall off and you'd starve.

    In today's global market that's less likely to happen, but you get my drift. Back then they saturated the market, demand dried up, and they went under.
  • Leonardo Wake up and smell the glaciers Eagle River, Alaska Icrontian
    edited April 2006
    Concerning the SLI topic: Why haven't we seen a dual-core GPU? Or a dual-GPU graphics card? Does it have to do with a data transfer bottleneck to the motherboard?

    Concerning innovation: The link you provided, MadMat, implies the lightbulb in question did not go into greater production because it is cool running and low powered. And unless there is some type of secret-material filament (highly unlikely) burning inside that bulb, it MUST be cool running and very dim. The writer implied that no one, now or before, has been interested in dim lightbulbs.
  • Garg Purveyor of Lincoln Nightmares Icrontian
    edited April 2006
    Leonardo wrote:
    Concerning the SLI topic: Why haven't we seen a dual-core GPU? Or a dual-GPU graphics card? Does it have to do with a data transfer bottleneck to the motherboard?

    That's what I'm wondering, too. I also agree that adding additional graphics cards gets too cumbersome. Having most of the motherboard's expansion space taken up exclusively by graphics just isn't practical.
    Leonardo wrote:
    Concerning innovation: The link you provided, MadMat, implies the lightbulb in question did not go into greater production because it is cool running and low powered. And unless there is some type of secret-material filament (highly unlikely) burning inside that bulb, it MUST be cool running and very dim. The writer implied that no one, now or before, has been interested in dim lightbulbs.
    I'm sure there are (yet undiscovered) ways to produce long-lasting light bulbs that aren't dim, but judging from the slow acceptance of fluorescent lighting, it's unlikely anyone would invest heavily in the research. The bulbs would have to be priced extremely high, not (exclusively) because of materials or complex construction, but just to get a return on the R&D.

    On a side note, I've got fluorescent bulbs in my apartment from the previous tenant, and while they're great in some rooms, it drives me nuts how my bedroom is dim for about a minute until the bulbs warm up to their full output.
  • entropy Yah-Der-Hey (Wisconsin)
    edited April 2006
    Are you guys talking about those curly ones? If so, we've got those things all over. The great part about them is that I think they use somewhere around 5 watts to put out as much light as a regular 60 W bulb. And yeah, the bad part is that they take a while to warm up.

    If you mean the big long tubes, I hate those. I don't know if they're different, have different glass, different covers, or what, but I can't stand them. They make everything feel really harsh.
  • Leonardo Wake up and smell the glaciers Eagle River, Alaska Icrontian
    edited April 2006
    Wow, trying to tie these two topics together under one heading: SLI (is it a real innovation?) and light bulb innovation.

    Well, yes, I think these are compatible topics.

    SLI: I still maintain it's just not practical and is too expensive for the mainstream. For hardcore gamers? Sure, a hard-bitten hobbyist will do what it takes.

    Lightbulbs. LED is the technology of the future. It's here now and is available, but it will remain too expensive until there is enough of a consumer user base to encourage competition in production to bring down the prices. Compact fluorescent lightbulbs were very expensive until enough people purchased them, convincing more manufacturers that there was a viable market. The resulting increase in supply brought prices way down. You can now get multipacks of the bulbs for under $10. Yes, they do take a little while before they fully illuminate. I will start using LED bulbs when the price-to-energy-savings ratio is at what I think is a reasonable level. I have no idea how far in the future that will be. Sodium vapor and mercury vapor technology is also very good, but the color hues produced by these lamps are such that no one wants them in a home. I certainly wouldn't. (I'm referring to the hazy blue mercury and orange sodium lamps that light roadways, warehouses, and other such infrastructure.)
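
    Just to put a rough number on that price-to-energy-savings point, here's a quick back-of-envelope sketch. Every figure in it (hours of use, electricity rate, bulb price premium) is an assumption for illustration, not a quoted price:

    ```python
    # Rough payback estimate for swapping a 60 W incandescent for a 13 W CFL.
    # All of these numbers are illustrative assumptions.
    WATTS_OLD = 60            # incandescent bulb
    WATTS_NEW = 13            # compact fluorescent replacement
    HOURS_PER_DAY = 4         # assumed daily use
    RATE_PER_KWH = 0.10       # assumed electricity rate, $/kWh
    PRICE_PREMIUM = 3.00      # assumed extra cost of the CFL, $

    kwh_saved_per_year = (WATTS_OLD - WATTS_NEW) / 1000 * HOURS_PER_DAY * 365
    dollars_saved_per_year = kwh_saved_per_year * RATE_PER_KWH
    payback_years = PRICE_PREMIUM / dollars_saved_per_year

    print(f"~{kwh_saved_per_year:.0f} kWh/year saved, "
          f"${dollars_saved_per_year:.2f}/year per bulb, "
          f"payback in ~{payback_years:.1f} years")
    ```

    The same arithmetic applies to LED bulbs; only the wattage and the price premium change, which is why the up-front price ends up being the deciding factor.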
  • Leonardo Wake up and smell the glaciers Eagle River, Alaska Icrontian
    edited April 2006
    Yes, the curly ones. They use a fraction of the energy that incandescent bulbs use. The reason they are curly is that fluorescent lighting technology requires the tube to be a certain length, hence curling it into a compact design. I've been using fluorescents so long that I've just become accustomed to the warm-up time. Also, the compact fluorescents put out only a fraction of the heat that incandescent bulbs do.
  • Garg Purveyor of Lincoln Nightmares Icrontian
    edited April 2006
    Leonardo wrote:
    Lightbulbs. LED is the technology of the future...

    You might find the latest OLED advancement interesting.

    http://www.nature.com/news/2006/060410/full/060410-8.html
    The traditional light bulb's days could be numbered, according to scientists who have taken an important step towards making white organic light-emitting diodes (OLEDs) commercially viable.

  • Sledgehammer70 California Icrontian
    edited April 2006
    I know one day I will own a slew of Quad setups... I mean, you can get SLI in laptops now! The graphics world is growing and allowing us more advanced games, which is good in my opinion...
  • tmh88 Pittsburgh / Athens, OH
    edited April 2006
    But why would you ever need that much power? I mean, I'm assuming that games like FEAR would run fine at 1280x1024 on a single 7800GT, and maybe even at 1600x1200 on a 7800GTX or 7900GT/GTX.
  • Leonardo Wake up and smell the glaciers Eagle River, Alaska Icrontian
    edited April 2006
    The graphics world is growing and allowing us more advanced games, which is good in my opinion...
    And that right there is the key that might just take SLI to the mainstream, but I doubt it will. If the hardware can be packaged smaller, which it obviously can, and if it becomes less expensive, it just might take off. A few years ago, who would have thought that dual-core CPUs would become commonplace, and that in inflation-adjusted dollars they'd be no more expensive than the old single-core Athlons and Pentiums?

    It could happen, but I don't think it will be in a dual/quad card format. It will have to be new engineering.
  • Sledgehammer70 California Icrontian
    edited April 2006
    The reason tech like Quad SLI is needed is our demand for 30"-plus LCD screens to play games on while still seeing those games the way they'd look on a 15" LCD... If you look at benchmarks at resolutions of 2000+ and higher, a single card or even a dual setup drops into the 40s to 50s FPS. The human eye can separate up to 68 FPS, so anything higher than 68 gives you crisper, more defined images... (There's some rough pixel math at the end of this post.)

    Me, I now own two 24" LCDs sitting on my desk, run by two 7800GTX 512s, and they have issues keeping up with today's gaming on both screens. This is why Quad SLI is needed in today's market. And as long as Nvidia is going to strive to push HD and the other new features we as consumers demand... I can only think the GPU market will grow larger and larger. I don't know if it will be the same tech, but I do hope that within the next 5 to 10 years we will see realistic human characters outside of cinematics and actually in-game!
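
    A quick illustration of why those big panels hammer the GPU so hard: the resolutions below are the common panel modes of the day, and the pixel ratio is only a rough proxy for per-frame work (shaders, AA, and memory bandwidth matter too):

    ```python
    # Pixel-count comparison across common LCD resolutions (circa 2006).
    # The ratio is a rough proxy for how much more per-frame work the GPU does.
    resolutions = {
        "17in (1280x1024)": (1280, 1024),
        "24in (1920x1200)": (1920, 1200),
        "30in (2560x1600)": (2560, 1600),
    }

    base_w, base_h = resolutions["17in (1280x1024)"]
    base_pixels = base_w * base_h

    for name, (w, h) in resolutions.items():
        pixels = w * h
        print(f"{name}: {pixels / 1e6:.1f} megapixels, "
              f"~{pixels / base_pixels:.1f}x the per-frame work")
    ```

    A 2560x1600 frame is roughly three times the pixels of 1280x1024, which lines up with single cards falling out of the 60+ FPS range at those resolutions.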
  • Leonardo Wake up and smell the glaciers Eagle River, Alaska Icrontian
    edited April 2006
    Now it's really starting to get interesting, even though I'm not a gamer!
  • RWB Icrontian
    edited April 2006
    This argument reminds me of the scientist from The Simpsons in the episode where he was showing off a giant supercomputer from the '70s and says... "One day I predict a computer twice as powerful as this one and 10,000 times bigger!"

    SLI... IS OVERKILL. OK, I wish I had a quad SLI setup; it'd be sweet, and I don't care about the power requirements or the insane heat. I DO CARE ABOUT MY POCKET, though, and most of us here simply CANNOT afford to pay $200 x 4 for a mid-range setup, or $400 x 2. It's insane. Sure, it's there for those who can afford it, and that's all fine and dandy, but what happens when games are being built that basically require this crap!? I am liking consoles more and more... and I HATE CONSOLES...
  • Sledgehammer70 California Icrontian
    edited April 2006
    Agreed on consoles... but I will most likely buy a PS3 for the new Blu-ray player :)

    But it comes down to those of us who can afford the higher-priced items. We are the ones really driving the market, as we are the ones buying the product. I myself do not own a quad SLI system yet... for the pure fact that my two 7800GTX 512s run perfectly fine.

    Also, you have to remind yourself that not even my 4800+ X2 can keep enough info flowing to four Nvidia cards... so until they start doing dual X2s I won't be getting four cards, as it is technically pointless to have the power and not have the CPU to harness it.

    But when Nvidia physics rolls out, I can see a beneficial use for everyone who owns one of these setups :)
  • airbornflght Houston, TX Icrontian
    edited April 2006
    On the PS3 note, let's hope the Blu-ray player doesn't turn out like the PS2's DVD player, 'cause mine just won't play movies.

    On the video cards, I think they need to start focusing on miniaturization of the current technology. I mean, it has to be possible; they have 600 MHz processors in Palm Pilots now, and more. Back in '99 and '00 my P3 was twice the size of a Palm Pilot.

    And when it comes right down to it, I think they are going to run out of bus bandwidth well before they make any new innovations, which is probably the primary reason for dual SLI and quad SLI. I think it's time for AMD to create an optical processor and make all the connections high-bandwidth optical connections. This is just my dream, though.

    On the light bulbs... LEDs. I'm ordering one or two so I can take them apart and see what they did to run them on 120 V AC. Then I'm gonna build one bigger and better. I know LEDs can be run on AC, but they'll switch from on to off 60 times a second, so on a perfect sine I'd effectively only be getting about 60 steady volts, if you look at how PWM works. So I'm thinking I could do it and just pull the voltage drop down toward ground so that I need a minimal resistor. I think I'm gonna put them in a series-parallel config. (I'm not an EE, so this is just some backyard engineering; anything I said could be wrong.) Remember kids, do not try this at home.
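
    For what it's worth, here's a rough sketch of the dropping-resistor math behind that idea, treating the supply as a steady 120 V for simplicity (ignoring rectification and the 170 V peak) and assuming typical white-LED figures; the numbers are illustrative, not a design:

    ```python
    # Back-of-envelope sizing for a long series LED string on a nominal 120 V supply.
    # Forward voltage, current, and string length are assumed values.
    V_SUPPLY = 120.0   # treat the mains as a steady 120 V for the rough math
    V_F = 3.3          # assumed forward drop per white LED, volts
    I_LED = 0.020      # target current, 20 mA
    N_LEDS = 30        # LEDs wired in series

    v_string = N_LEDS * V_F                    # ~99 V dropped across the LEDs
    r_series = (V_SUPPLY - v_string) / I_LED   # resistor sets the remaining drop
    p_resistor = I_LED ** 2 * r_series         # power wasted in the resistor

    print(f"series resistor ~{r_series:.0f} ohms, dissipating ~{p_resistor:.2f} W")
    ```

    The longer the series string, the less voltage the resistor has to drop and the less power it wastes, which is exactly the "pull the drop down so I only need a minimal resistor" idea; extra parallel strings then just multiply the current.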
  • deicist Manchester, UK
    edited April 2006
    Everyone with an interest in CPU technologies is frenziedly working on non-semiconductor logic circuits at the moment, since we've basically hit a brick wall when it comes to semiconductors. There's only so much miniaturization you can do before physics kicks you in the ass...

    But anyway, about the 7950: are these DX10 parts? If not, it seems silly to spend a huge amount of cash on graphics cards when DX10 is due to hit at the end of this year.
  • Sledgehammer70 California Icrontian
    edited April 2006
    No, they're a shrink of the 7900GTX dual-card setup... just mounted on smaller boards.
  • jradmin North Kackalaki
    edited April 2006
    Quad SLI is just a money sink for graphics gurus atm. I highly doubt any computer built right now could make total use of quad SLI. In fact, I doubt that Socket M is going to make full use of quad SLI. Right now it's just for bragging/show-off rights in the hardcore gamer/enthusiast market (which is always a small market but always brings in a large amount of money quickly).

    We're going to see physics chips on cards before we see dual-core cards (I agree with your comment on this, Sledgehammer). With physics in games starting to really be refined and change the way games act and play... I think they're gonna milk that first, since the technology is already out and ready to go.

    I see dual-core GPUs coming out in about two years. For now, though, in my opinion... quad SLI is for people who don't have anything better to spend $$ on.
  • Sledgehammer70 California Icrontian
    edited April 2006
    I don't see why they would go dual-core, for the most part, since even with a single core the CPU can't feed it enough info.

    But now that I think of it, they might go dual-core so that they could run physics on one core and standard rendering on the other. Maybe this is where they are headed. Just maybe...

    Besides, they still need to move to a 65 nm shrink, which would boost all their graphics parts. They would run cooler and on less power, allowing quad setups to run on normal PSUs. There are a ton of things they can do to make things run better. But the main question is... are they going to do it?
  • jradmin North Kackalaki
    edited April 2006
    But the main question is... are they going to do it?


    Not as long as people are willing to pay $$$ out the nose for the current chips/cards.

    Graphics are the cash cow of the PC industry, and they are gonna milk that cow till the tits fall off. That's pretty much historically what PC component manufacturers do.
  • Sledgehammer70 California Icrontian
    edited May 2006
    From the looks of it, the 7950GTX 2 will be a huge thing for Nvidia at E3 this year. I have confirmed reports from game vendors across the board that they will be showing off their latest and greatest at E3 on Nvidia's new boards...
  • ryko new york
    edited May 2006
    Quad SLI is a joke... I forget where, but I saw two X1900 XTXs in CrossFire beat a quad 7900GTX SLI setup... where was that link?

    Anyway, SLI/CrossFire won't be worth it until two cards = two times the performance. I can't understand for the life of me why people think it is acceptable to purchase two cards and have one of them not live up to its full potential. I mean, two 7900 GTs in SLI are not twice as fast as one 7900 GT; the second card maybe adds 1/3 to 1/2 of a card's worth of performance. So you are paying full price for that second card and only 1/3 to 1/2 of it gets used, and that is totally lame. Maybe if they discounted your second GPU purchase to 1/3 of the regular price I would be all for it. (There's some quick cost-per-frame math at the bottom of this post.)

    I want more innovation from GPU manufacturers, and I feel that SLI/CrossFire has allowed them to be somewhat lazy. Why innovate when you can just slap another expensive card in there and count your profits for another few months before bringing out a new product?

    I want a single card that can do at least 1600x1200 with full effects (AA/AF/HDR) and costs between $200 and $300. And I don't want to buy a new PSU for it. And I want HDMI compatibility (or DisplayPort), and DX10.

    Also, I don't want to purchase a new monitor for it, but thanks to Vista I don't have a choice. Where in the hell are all of the HDCP-compliant monitors, anyway? I would love to pick up two HDCP 1600x1200 LCDs now, but I can't find any for a reasonable price, yet all of the non-HDCP LCDs seem to be dropping in price every week.

    I am fine gaming at 1280x1024 with my X800 XL for now. I am going to wait some time before any new purchase; I'll probably have to upgrade my entire system when Vista and HDCP rear their ugly heads.

    As for lightbulbs, I have been using the curly fluorescents in most of my house for a while, but I do kinda hate the light they produce. It's very much like the huge tube fluorescents found in office buildings that seem to make you tired. That's why in my office I have regular incandescent bulbs, so I don't fall asleep!
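
    To put rough numbers on that scaling complaint, here's a small cost-per-frame sketch. The card price, baseline frame rate, and the ~40% scaling figure are assumptions for illustration, not benchmark results:

    ```python
    # Cost-per-frame comparison: one card vs. an SLI pair with imperfect scaling.
    # All inputs are assumed, illustrative values.
    CARD_PRICE = 300.0    # assumed price of one 7900 GT-class card, $
    SINGLE_FPS = 60.0     # assumed frame rate on one card
    SLI_SCALING = 1.4     # assumed: the pair is ~40% faster than one card

    sli_fps = SINGLE_FPS * SLI_SCALING
    extra_fps = sli_fps - SINGLE_FPS

    print(f"one card: ${CARD_PRICE / SINGLE_FPS:.2f} per FPS")
    print(f"SLI pair: ${2 * CARD_PRICE / sli_fps:.2f} per FPS overall, "
          f"${CARD_PRICE / extra_fps:.2f} for each FPS the second card adds")
    ```

    With those assumed numbers, the frames the second card adds cost more than twice as much as the first card's, which is the gist of the "discount the second GPU" argument.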