What is the meaning of "AMD/Intel x16"

edited December 2005 in Hardware
At the risk of asking a really stupid question, what is the meaning of
"AMD/Intel x16"

Seen on nVidia's driver download page: after you select "Platform / nForce drivers" in the first step, the list includes AMD and Intel, plus this weird option, "AMD/Intel x16" ... I'd never noticed this option before today, and it can't simply refer to PCI-E x16, because that's been around for a while and is even a native feature of nForce4. The Release Notes I include below mention that this support is new...

Notes from the driver download page:
http://www.nvidia.com/object/nforce_nf4_winxp32_x16_6.82
Installation Notes:

* Windows XP users must install Service Pack 1, at a minimum, prior to attempting to install this package. Failure to do so will result in the inability to support USB 2.0.
* Installation of DirectX 9.0 or higher is required in order to use NVMixer.

Release Notes:

* First driver release to support nForce4 AMD/Intel X16 motherboards

Windows XP/2000 Driver Versions:

* Audio driver version 4.62 (WHQL)
* Audio utility version 4.51
* Ethernet NRM driver version 5.09 (WHQL)
* SMBus driver version 4.45 (WHQL)
* Network management tools version 5.09
* SMBus driver version 4.5 (WHQL)
* Installer version 4.89
* IDE SATARAID driver version 5.52 (WHQL)
* IDE SATA_IDE driver version 5.52 (WHQL)
* RAIDTOOL application version 5.52

Comments

  • mmonnin Centreville, VA
    edited October 2005
    It just means it's for AMD or Intel motherboards with a PCIe x16 slot.
  • edited October 2005
    I think they added the "x16" to the driver package because the mobo makers are just releasing a new line of SLI boards that use 2 NVidia northbridges to give you 40 lanes of PCI-e, so you can have a full 16 PCI-e lanes to each x16 PCI-e slot on the mobo. They have a different driver package for this new setup and I think it's labeled "x32".
  • edited October 2005
    I just downloaded and installed the nForce4 AMD non-x16 chipset drivers for my nForce4 chipset on my Gigabyte GA-K8NF-9 motherboard. Would it have been better to have downloaded the AMD/Intel x16 drivers for my setup?
  • edited October 2005
    I think you've probably already got the right drivers for your board. The drivers labeled "x16" might be the ones for this new series of boards coming out; I wasn't sure of the naming conventions for this new series of Nvidia chipsets when I made my previous post. If they still have a non-x16 chipset driver download, then the ones with "x16" in them must be for the newest chipset release, which allows both PCI-e x16 slots to have a full 16 lanes in SLI mode.
  • edited October 2005
    Ok, thanks for that.
  • Omega65 Philadelphia, Pa
    edited November 2005
    muddocktor wrote:
    I think they added the "x16" to the driver package because the mobo makers are just releasing a new line of SLI boards that use 2 NVidia northbridges to give you 40 lanes of PCI-e, so you can have a full 16 PCI-e lanes to each x16 PCI-e slot on the mobo. They have a different driver package for this new setup and I think it's labeled "x32".
  • edited November 2005
    Ahhhh, I see. Thanks for the confirmation Omega65.
  • edited November 2005
    So I have an MSI Diamond but I'm using only one video card (an X850 XT); do I have to install this one, or the other (AMD non-x16)?

    tkz
  • Omega65 Philadelphia, Pa
    edited November 2005
    Install the regular NF4 driver v6.70
  • edited November 2005
    muddocktor wrote:
    I think they added the "x16" to the driver package because the mobo makers are just releasing a new line of SLI boards that use 2 NVidia northbridges to give you 40 lanes of PCI-e, so you can have a full 16 PCI-e lanes to each x16 PCI-e slot on the mobo. They have a different driver package for this new setup and I think it's labeled "x32".

    You are at least partially right.

    I don't see any reference to "two northbridges" anywhere, so that sounds a bit suspect. But I found some specifications here:
    http://www.nvidia.com/page/nforce4_family.html
    and here:
    http://www.nvidia.com/page/pg_20041015917263.html
    In the case of Intel it is 40 PCI-e lanes, and in the case of AMD it is 38. The important difference is that the old boards had a total of 16 PCI-e lanes for graphics, configured to connect all 16 to the first port when a single card is used and split 8/8 when two graphics cards are in use. The new X16 motherboards have 16 PCI-e lanes for each of the two "graphics" ports.

    Firstly, I put "graphics" in quotes because these ports are supposedly usable for things other than graphics. Secondly, what irks me is why the motherboard manufacturers even bother to re-direct the "other" 8 PCI-e lanes to the first port in a single-card config ... AS IF it adds some benefit!

    It is all marketing hype, designed to make the less well informed think there is some benefit to having 16 lanes right now. Motherboards would probably have been cheaper and more stable if they had just hard-wired 8 lanes to each port for the next few years, until we came close to using up a fair portion of those 8 lanes' worth of bandwidth!
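    To put rough numbers on that difference, here is a quick back-of-the-envelope sketch. It assumes the usual PCI-E 1.x figure of 250 MB/s per lane per direction; the slot configurations are just the ones described above, not taken from any datasheet:

```python
# Rough PCI-E 1.x bandwidth arithmetic (assumed figure: 250 MB/s per
# lane per direction; configurations are illustrative, per the post above).
LANE_MBPS = 250  # PCI-E 1.x, per lane, each direction

def slot_bandwidth(lanes):
    """Bandwidth of one slot in MB/s, one direction."""
    return lanes * LANE_MBPS

# Old nForce4 SLI: 16 graphics lanes total, split 8/8 with two cards
print(slot_bandwidth(16))  # single card: 4000 MB/s
print(slot_bandwidth(8))   # each card in SLI: 2000 MB/s

# New "x16" boards: a full 16 lanes to each of the two graphics ports
print(slot_bandwidth(16))  # each card in SLI: 4000 MB/s
```

    So the only case where the x16 boards change anything is the dual-card one: each card goes from 8 lanes' worth of bandwidth to 16, whether or not the cards can actually use it.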
  • Khaos New Hampshire
    edited November 2005
    For those not accustomed to block diagrams...

    SLI x16 doesn't employ two northbridges. There are two PCI-E controllers, but one is located in the northbridge (SPP) and the other in the southbridge (MCP), as opposed to ATI's dual-x16 Crossfire which has both controllers in a single northbridge. The only real difference is the physical location of the PCI-E controllers, which requires the second controller on the southbridge to use much longer pipelines than it would if it were on the northbridge. This was done by Nvidia in order to save costs and use up stock of previously manufactured northbridges while still being able to deliver SLIx16 to market before ATI has Crossfire Dual-x16 to market in a significant manner.

    An Nvidia northbridge in the (hopefully near) future will likely feature both controllers integrated, which has proven to provide a hefty performance boost. In fact, much of the performance that makes Crossfire even remotely comparable to SLI can be attributed to the integrated northbridge design. Nvidia's current implementation of dual-x16 is a sort of patch compared to the real deal. Not that it doesn't perform well. It just doesn't perform as well as it could.

    While it's true that the bandwidth isn't necessarily needed, tests have shown performance gains by virtue of using two separate controllers as opposed to one controller for two video cards. I don't think it is bandwidth related at all, even though it is being marketed as such. This is demonstrated by the fact that Crossfire is competitive (Or was until the 7800GTX 512 was released) with SLIx16 despite Nvidia currently having faster high end graphics cards.

    Dual-x16 on one northbridge > Dual-x16 split between north and south > Dual-x8 > Single-x16

    Or perhaps more appropriately, if not shorter (heh)....

    Dual video cards with dedicated controllers on a single northbridge > dual video cards with dedicated controllers split between the northbridge and southbridge > dual video cards sharing a single controller > one video card with a dedicated controller.

    Assuming all other factors remain constant. In the real world, this is impossible, which is why the absolute highest performance SLIx16 rigs outperform the highest performance Crossfire Dual-x16 rigs.

    Nvidia is currently riding the performance of their video cards, which makes up for the poor design and implementation of their SPP/MCP chipsets.

    Then again, I haven't researched any of this stuff in about a week... With the rate of change in graphics these days, all my information could be WAY off. </disclaimer>
  • Omega65 Philadelphia, Pa
    edited November 2005
    The current version of Crossfire is 8x/8x, just like regular SLI. Only Nvidia's x16 version currently has two full x16 video slots.

    ATI's next version of their chipset (RD600, I think) will have 44 PCIe lanes, giving you 2 full x16 PCIe video slots.
  • Shorty Manchester, UK Icrontian
    edited November 2005
    Khaos wrote:
    SLI x16 doesn't employ two northbridges. There are two PCI-E controllers, but one is located in the northbridge (SPP) and the other in the southbridge (MCP)...
    :D
  • Khaos New Hampshire
    edited November 2005
    Omega65 wrote:
    The current version of Crossfire is 8x/8x, just like regular SLI. Only Nvidia's x16 version currently has two full x16 video slots.

    ATI's next version of their chipset (RD600, I think) will have 44 PCIe lanes, giving you 2 full x16 PCIe video slots.
    Yeap, I was actually referring to the next version of Crossfire (RD580)... Which, if I'm not mistaken, has already been tested by Anandtech.

    Here's the article, which details what I was talking about rather nicely:
    http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2609

    It's an interesting read... With pretty pictures too! My favorite :D

    Edit:

    LMAO... Looking at the other threads in the forum, I see that you actually created one dedicated to the RD580 article at Anandtech. I love preaching to the choir. :D Or in this case, the preacher. Hey, speaking of Preacher, where the hell is that guy?
  • edited November 2005
    The x16 refers to the new x32-based boards; don't ask me why they call it x16 when it is effectively x32, but that's how the chipset is designated.

    I'm planning to build a PC for a customer based on that chipset; it's the ASUS A8N32-SLI Deluxe.
  • edited December 2005
    Have a look at this review of the VIA K8T900 chipset which actually allows nVidia graphics cards to run in SLI mode.

    http://techreport.com/reviews/2005q4/via-k8t900/index.x?pg=1

    Of particular interest is something not mentioned in the article: the "gain" from adding a second S3 "MultiChrome" card in multi-GPU mode on this chipset seems to be much higher than the benefit seen on nForce motherboards running a pair of GeForce cards in SLI mode (over a single card), at least in the 100 or so reviews I've read comparing SLI to non-SLI performance. In those benchmarks, the benefit of MultiChrome is between 180% and 200% of a single card, while the benefit of SLI has often been criticized for being low, at 120% to 130% over a single-card setup.
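    For anyone who wants to sanity-check those percentages, here's a trivial sketch of how that "scaling" figure is computed. The FPS inputs are made-up numbers chosen only to reproduce the quoted 190% and 125% figures; they are not real benchmark results:

```python
# Dual-card "scaling" expressed as a percentage of single-card frame rate.
# Input FPS values below are hypothetical, chosen to match the quoted ranges.
def scaling_percent(single_fps, dual_fps):
    """Dual-card throughput as a percentage of single-card throughput."""
    return 100.0 * dual_fps / single_fps

print(scaling_percent(50, 95))    # MultiChrome-like result: 190.0 (%)
print(scaling_percent(50, 62.5))  # SLI-like result: 125.0 (%)
```

    A figure of 100% would mean the second card adds nothing; 200% would be perfect scaling.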

    The other feature that is glossed over is the ability to change RAID layout online - in fact the article states that V-RAID offers the feature in order to "keep up" with the competition, but I am not aware that any of the other chipset makers' RAID solutions can do online RAID layout conversions (or even offline RAID conversion) at all.
    Khaos wrote:
    For those not accustomed to block diagrams...

    Or perhaps more appropriately, if not shorter (heh)....

    Dual video cards with dedicated controllers on a single northbridge > dual video cards with dedicated controllers split between the northbridge and southbridge > dual video cards sharing a single controller > one video card with a dedicated controller.

    Assuming all other factors remain constant.

    I suspect that the above is very much oversimplified. In particular, the relative power of the video cards needs to be considered, or a statement needs to be added about what is being measured and compared (FPS, total bandwidth, latency, or ...), but for simplicity's sake I think it will do.