GPU2: Shader clock influence on production

Leonardo · Wake up and smell the glaciers · Eagle River, Alaska · Icrontian
edited July 2008 in Folding@Home
In another thread, _K_ asked how varying the GPU shader clock affects production. Well, I did some testing, and at least for the card I tested, there is a direct, consistent positive relationship between shader frequency and production.

Hardware:

System No. 5 in my signature

The video card is an MSI 8800GT 512MB. Default clocks are 660MHz core, 950MHz RAM (1900MHz DDR effective), and 1650MHz shader. The card is standard as far as shaders go: 112.

Below are the clock configurations I tried with the corresponding production. Room temperature remained constant, and the computer was not used for anything but Folding@Home: one WinSMP client and one GPU2 client running simultaneously. All GPU2 clock/production tests were with the same project, 5006.

Core (MHz)  Memory (MHz)  Shader (MHz)  Production (PPD)  Delta (PPD)
750         1000          1650          5057              (baseline)
750         1000          1700          5057              0
750         1000          1750          5184              +127
750         1000          1800          5316              +259
750         1000          1850          5456              +399

It's an anomaly to me that the shader clock increase from 1650 to 1700MHz resulted in a delta of zero. I ran both the 1650MHz and 1700MHz tests three times to confirm.

This is obviously not an analysis of multiple cards, and it can't be extrapolated to project-wide trends; it's simply a demonstration of what this particular video card does with an increased shader clock.
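
For a rough sense of scale, here's a quick Python sketch using the numbers from the table above. The endpoint slope is just back-of-the-envelope math on my own runs, nothing more:

    # Back-of-the-envelope: PPD gained per nominal shader MHz on this
    # one card and project (numbers from the table above).
    runs = [
        (1650, 5057),
        (1700, 5057),
        (1750, 5184),
        (1800, 5316),
        (1850, 5456),
    ]

    (lo_clk, lo_ppd), (hi_clk, hi_ppd) = runs[0], runs[-1]
    slope = (hi_ppd - lo_ppd) / (hi_clk - lo_clk)
    print(f"~{slope:.1f} PPD per nominal shader MHz")  # prints ~2.0

That works out to roughly 2 PPD per nominal shader MHz across the 200MHz span, on this card and this project only.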

As an aside, I am very impressed with this card's GPU/RAM cooler. The stock heatpipe/fan unit is large and near dead quiet. With GPU2 running, an ambient room temperature of 76F, and a core overclock from 660MHz to 750MHz, the core temperature is 57C.

Comments

  • mas0n · howdy · Icrontian
    edited July 2008
    Leonardo wrote:
    It's an anomaly to me that the shader clock increase from 1650 to 1700MHz resulted in a delta of zero.

    It is because in both cases the shaders are actually running at 1674MHz. The shaders operate on dividers and cannot actually be adjusted in arbitrary 1MHz steps.

    Requested -> actual:
    1650 -> 1674
    1700 -> 1674
    1750 -> 1728
    1800 -> 1782
    1850 -> 1836

    This also applies to memory and core clocks, and can be confirmed by watching the Hardware Monitoring section of RivaTuner while adjusting the clocks.
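
    Here's a minimal Python sketch of that snapping behavior. The 54MHz step size is an assumption inferred from the numbers above (1674, 1728, 1782, and 1836 are all multiples of 54); the real driver logic may differ:

        # Sketch: snap a requested shader clock to the nearest 54MHz step.
        # The 54MHz figure is inferred from the observed clocks above and
        # is an assumption, not confirmed driver behavior.
        STEP_MHZ = 54

        def effective_shader_clock(requested_mhz: int) -> int:
            return round(requested_mhz / STEP_MHZ) * STEP_MHZ

        for req in (1650, 1700, 1750, 1800, 1850):
            print(req, "->", effective_shader_clock(req))
        # 1650 -> 1674
        # 1700 -> 1674
        # 1750 -> 1728
        # 1800 -> 1782
        # 1850 -> 1836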

    :cheers:
  • Leonardo · Wake up and smell the glaciers · Eagle River, Alaska · Icrontian
    edited July 2008
    ooh

    As you can tell, I'm quite a noob at video card overclocking. :D

    How does one know where the dividers are?
  • Ryder · Kalamazoo, MI · Icrontian
    edited July 2008
    Most cards operate on 13, 27, and 54MHz base clock jumps.

    Cores will clock in 27MHz (sometimes 13MHz) steps; memory usually in 54MHz steps.
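
    Here's a minimal Python sketch of that rule of thumb, assuming simple rounding to the nearest step. The step sizes are just this thread's rough figures, not a verified formula:

        # Rough sketch: snap a requested clock to the nearest step for its
        # domain. Step sizes are rule-of-thumb figures from this thread,
        # not confirmed hardware behavior.
        STEPS_MHZ = {"core": 27, "memory": 54}

        def snap(domain: str, requested_mhz: int) -> int:
            step = STEPS_MHZ[domain]
            return round(requested_mhz / step) * step

        print(snap("core", 750))     # 756  (nearest 27MHz step)
        print(snap("memory", 1000))  # 1026 (nearest 54MHz step)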

    Somewhere around this big old internet there is a formula... it started with the 7800 series as I recall, when clocking the 7800GTX became popular.

    I will see if I can find it. (after I get some sleep)