ATI to make Quad CrossFire

Sledgehammer70 California Icrontian
edited April 2006 in Science & Tech
ATI has decided to combine two Gemini boards that feature two R590 chips each. You're reading that right: we're talking about dual R590 boards working together, a quad-GPU ATI product.
The final physical design is far from complete, but ATI's Gemini X1600 is a concept where two chips fit on a single PCB. However, in order to make room for two 256-bit memory interfaces, plus quadrupled power consumption and heat dissipation, you need to be very creative. Enter creative engineering.
Source: The Inquirer

Comments

  • Garg Purveyor of Lincoln Nightmares Icrontian
    edited April 2006
    We've gone beyond competition and entered a pissing contest. The winners? People with $3000+ budgets for GPUs alone.
  • edited April 2006
    I hate to say it, but ATI is just looking pathetic now. This is totally a "Me too!!" product. At least CrossFire was different enough from SLI not to be considered a copycat product, but this... this is just sad.
  • GHoosdum Icrontian
    edited April 2006
    It seems like ATI is doing the same thing as AMD - "If we can't beat their two with our two, we'll beat their two with our four!"
  • edited April 2006
    nVidia did the quad thing first; this is more of an "OMG, look at the money these fools are paying for that shiz! We gotta get in on that!!" sort of thing.
  • GrayFox /dev/urandom Member
    edited April 2006
    If you look back to September, ATI said that quad CrossFire was probably going to happen in the future... But yeah, this is ridiculous. Who the hell needs four GPUs? You will never use all that power.
  • Garg Purveyor of Lincoln Nightmares Icrontian
    edited April 2006
    I guess it's just a halo product. ATI/nVidia seem to think that whoever is perceived to have the performance crown will sell the most cards all across the performance spectrum.

    In other industries, I'm not sure that's the case, but for graphics cards, it does seem that some people shop that way. Even if it doesn't make sense :shakehead
  • Sledgehammer70 California Icrontian
    edited April 2006
    Well, the problem is that those of us who have a 30" LCD need the power to push higher frame rates on these huge panels. As much as you can argue otherwise, two 7900 GTXs or two X1900 XTXs will not give you the full punch a 30" LCD needs... this is where the quad setup comes into play (some rough pixel math below)...

    Now, I am a graphics nut, but I can't justify the expense... :( "Yet"
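    For a rough sense of why panel size matters, here's a minimal back-of-the-envelope sketch in Python; the resolutions and the 60 fps target are illustrative assumptions, not benchmark figures:

        # Pixel throughput at a few common 2006-era panel resolutions.
        # Resolutions and the 60 fps target are illustrative assumptions.
        RESOLUTIONS = {
            '19" LCD': (1280, 1024),
            '24" LCD': (1920, 1200),
            '30" LCD': (2560, 1600),
        }
        TARGET_FPS = 60

        for name, (w, h) in RESOLUTIONS.items():
            pixels = w * h
            print(f"{name}: {pixels / 1e6:.2f} Mpixels/frame, "
                  f"{pixels * TARGET_FPS / 1e6:.0f} Mpixels/s at {TARGET_FPS} fps")

    A 30" panel's 2560x1600 works out to about 4.1 megapixels per frame versus 1.3 at 1280x1024 - roughly triple the fill-rate demand before any shader work is counted.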
  • Qeldroma Arid ZoneAh Member
    edited April 2006
    Dang. I really need to buy power stock.
  • mmonnin Centreville, VA
    edited April 2006
    Uhh, ATI has had multi-GPU systems for a while now, before nVidia. ATI has been using multi-GPU setups in government systems.

    And what else would be next? Three GPUs? Four is the next logical step anyway.
  • edited April 2006
    I dunno when the ATI government setups came out, but I remember drooling over 20-GPU nVidia flight sim units used by the USAF back in late '02.
  • Garg Purveyor of Lincoln Nightmares Icrontian
    edited April 2006
    Multi-GPU use in government simulators isn't ridiculous. For home systems, though? I have a hard time believing a two-GPU CrossFire or SLI setup can't push the highest-resolution monitor available. It's just not a product we need. A few of us will want it, though.
  • Qeldroma Arid ZoneAh Member
    edited April 2006
    Largest flight simulator company in the world:

    CAE Electronics Ltd.
    Quebec, Canada
    with headquarters in Toronto.

    They also bought up key American flight simulator companies like Link Systems.

    For years they have put together the largest, wildest, and most expensive display systems I have seen.

    I think there is a connection with ATI. :canflag:
  • Sledgehammer70 California Icrontian
    edited April 2006
    The government setups are totally different from what we see today on the CrossFire and SLI front.
  • drasnor Starship Operator Hawthorne, CA Icrontian
    edited April 2006
    It doesn't matter how many GPUs you've got if your drivers make it run like a dog under *nix. I find it difficult to even read ATI press releases anymore.

    -drasnor :fold:
  • edited April 2006
    drasnor wrote:
    It doesn't matter how many GPUs you've got if your drivers make it run like a dog under *nix. I find it difficult to even read ATI press releases anymore.

    -drasnor :fold:

    I guess it's all a matter of who can afford to write drivers for *nix, which evidently ATI can't do right now. It's a lot easier to justify writing drivers for the 90+% of systems out there than for the ~5%.

    But back on topic: quad GPUs from either camp are just crazy, IMO. Like, wow, watch the power meter spin when I play games. :D (some rough math below)
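    To put a hedged number on that meter: assuming, purely for illustration, ~120 W per high-end GPU under load and four of them, the cards alone approach half a kilowatt. A minimal sketch in Python, where the per-GPU draw, daily hours, and electricity rate are all assumptions:

        # Rough quad-GPU energy cost. Every figure is an assumption
        # for illustration: per-GPU draw, gaming hours, electricity rate.
        GPU_COUNT = 4
        WATTS_PER_GPU = 120      # assumed draw for a high-end GPU under load
        HOURS_PER_DAY = 4        # assumed daily gaming time
        RATE_PER_KWH = 0.10      # assumed electricity price in USD

        gpu_watts = GPU_COUNT * WATTS_PER_GPU
        kwh_per_month = gpu_watts / 1000 * HOURS_PER_DAY * 30
        print(f"GPU draw: {gpu_watts} W")
        print(f"~{kwh_per_month:.0f} kWh/month, ~${kwh_per_month * RATE_PER_KWH:.2f} on the bill")

    Under those assumptions it comes to about 58 kWh and roughly $6 a month for the GPUs alone - real money, though less dramatic than the spinning meter suggests.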
  • mmonnin Centreville, VA
    edited April 2006
    There is nothing CPU/GPU manufacturers can do about that. Until a totally new technology arrives to replace the transistor, adding more transistors and using more wattage is the only feasible way to add processing power.
  • GooD Quebec (CAN) Member
    edited April 2006
    Personally, I don't really like where this is leading... What's after four? Six? I will miss those good days with only one video card for a while, I think :(

    I chose an X850 XT when I built my rig because I was not willing to have two video cards in SLI... I guess I'll have to deal with the fact that my next video card in two years will be SLI or CrossFire either way, lol.

    If there's no new technology soon (let's say, 4-5 years), it will be impossible for them to continue this crazy processing-power war, or else it will take a motherboard with six PCI-E slots to run all those video cards together, with a huge case, hah.
  • Sledgehammer70 California Icrontian
    edited April 2006
    I think the graphics world will finally change to a removable GPU and memory. I mean, if they were able to get that to work, buying GPU cores would be a nice thing to have... The problem is I am a graphics nut and I will be determined to own the best... just because...
  • jradmin North Kackalaki
    edited April 2006
    I think the next big step is going to include nanotube technology. They can already make nanotubes that act exactly like transistors and produce like 1/4 the heat.

    The big winner is going to be the company that tosses the silicon die out the window and starts working with a new palette.
  • Sledgehammer70 California Icrontian
    edited April 2006
    As cool as that would be, they will most likely keep what they have and focus on physics now :)
  • Leonardo Wake up and smell the glaciers Eagle River, Alaska Icrontian
    edited April 2006
    GHoosdum wrote:
    It seems like ATI is doing the same thing as AMD - "If we can't beat their two with our two, we'll beat their two with our four!"
    Nope, your history is a little bent here, GH. Intel was publicly scoffing at AMD's dual-core plans until AMD proved it was viable, real, and true innovation. Intel scrambled as fast as it could and shoved the Pentium D 8xx series out the door before it was really a finished technology. Intel has been playing follow-the-leader for lack of an inventive, comprehensive strategic plan. Merom and Conroe, though, may turn the tables.
  • Sledgehammer70 California Icrontian
    edited April 2006
    Leo is right! But on the second part of the post, I do believe AMD has something hiding and will knock the Intel parts off their rocker... I think they are letting Intel spend the cash to advertise their new parts, and then AMD is going to smack them again with a superior part - and not only take more market share but also hurt Intel for all the cash they spent trying to sell their new part.

    If I'm wrong about this, oh well; if I'm right, I will be back to say I told you so!
  • Leonardo Wake up and smell the glaciers Eagle River, Alaska Icrontian
    edited April 2006
    AMD is not sitting on its thumbs. Their technology advancements have momentum. But my point was that AMD's new additions to the CPU camp have not been "me too" responses. That has been Intel for the past two years, with the exception of NetBurst and Itanium, which have not been me-too but rather what-for.

    Yeah, I think the multiple-GPU race is getting rather silly. After a point, will there be any difference in playing a computer game, at least any detectable by a human? The CPU race, though, brings tangible results. Multi-core CPUs are used, and will increasingly be used, in servers and other high-end applications. And of course, those of us smitten by the distributed computing bug, Folding@Home, are quite fond of our dual cores.
  • jradmin North Kackalaki
    edited April 2006
    I agree with you. However, they're only going to be able to squeeze so much out of a wafer. At some point in the very near future, silicon is going to go bye-bye and something else is going to take its place.
  • Sledgehammer70 California Icrontian
    edited April 2006
    Well, from a tech standpoint, a company that makes a program that will utilize a faster GPU will score big... Using both the GPU and the CPU can benefit the end user. I mean, come on, it is all 1s and 0s.

    With Z-RAM as a boost, it will be interesting to see if a big L3 cache is applied. I mean, GPUs can benefit from all these CPU tech advancements, right?
  • Garg Purveyor of Lincoln Nightmares Icrontian
    edited April 2006
    Sledgehammer70 wrote:
    With Z-RAM as a boost, it will be interesting to see if a big L3 cache is applied. I mean, GPUs can benefit from all these CPU tech advancements, right?

    I'd like to see it go the other way, too. 1000 MHz main memory? Oh yeah. :rockon: