GPU2 folding?


Comments

  • Leonardo Wake up and smell the glaciers Eagle River, Alaska Icrontian
    edited June 2008
    What is the core clock on your GTS? Is it overclocked?

    I'm getting ready to purchase an 8800GT, 600MHz core clock.
  • DanG I AM CANADIAN Icrontian
    edited June 2008
    My GTS is at 675.
  • Leonardo Wake up and smell the glaciers Eagle River, Alaska Icrontian
    edited July 2008
    Alright, I sold some parts and purchased an 8800GT, 600MHz, 512MB from a trading forum. I should get it towards the end of the week.

    Preparation for GPU Folding:

    1) Downloaded Nvidia drivers 177.35
    2) Downloaded Folding client GPU2 beta, "6.12b8"
    3) Downloaded CUDA (for drivers 177.35)
    4) Bookmarked some pertinent threads

    I'm still confused about the .inf issue. The machine that will host the 8800GT is running WinXP SP3 (32bit). Is that a factor for the .inf thing? Please tell me what else I need to do to prepare.

    This thread at Folding Forum recommends using drivers 174.55. Are those recommended because that thread was opened before newer drivers were out?

    Is there a GPU Folding how-to guide somewhere? OK, I really haven't looked that hard yet, but I'm finding stuff piecemeal, not all in one place.

    This is for Team 93, boys. Get me started right! :bigggrin: :cool2:
  • _k_ P-Town, Texas Icrontian
    edited July 2008
    Just install the Nvidia driver. Try to install the CUDA driver; you should get an error partway through. Then go to the directory where the CUDA driver decompressed (the Nvidia folder) and make a spare copy of the .inf file that matches yours, just in case. Copy and paste the edited .inf file into the correct folder. Then tell the CUDA driver to install again, but do not allow it to replace files that are already present. After all the reboots, install the GPU2 client and run.
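
    If you'd rather script that backup/swap step, here's a rough sketch in Python (the paths are made-up examples; nv4_disp.inf is the usual name of the XP display-driver .inf):

    ```python
    import os
    import shutil

    driver_dir = r"C:\NVIDIA\DisplayDriver\177.35"  # hypothetical unpack location
    modded_inf = r"C:\Downloads\nv4_disp.inf"       # the edited .inf you downloaded

    orig = os.path.join(driver_dir, "nv4_disp.inf")
    shutil.copy2(orig, orig + ".bak")  # spare copy of the stock .inf, just in case
    shutil.copy2(modded_inf, orig)     # overwrite with the edited version
    ```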

    If you want to run CPU folding or SMP alongside this, you need to set the affinity of the cores so the GPU has its own core, or you can set the priority of GPU2 slightly higher in the install setup; in the config box during install it's somewhere in the top right.
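
    The same juggling can be scripted, too; a minimal sketch with the third-party psutil module (process names are the ones this thread talks about; the 3+1 core split is just an example for a quad, and the priority constant is Windows-only):

    ```python
    import psutil

    for proc in psutil.process_iter(["name"]):
        name = (proc.info["name"] or "").lower()
        if name == "fahcore_11.exe":        # the GPU2 core
            proc.cpu_affinity([3])          # give it the fourth core to itself
            proc.nice(psutil.ABOVE_NORMAL_PRIORITY_CLASS)  # "slightly higher"
        elif name == "fahcore_a1.exe":      # the SMP cores
            proc.cpu_affinity([0, 1, 2])    # keep them off the GPU2 core
    ```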
  • everyguy El Cajon, California
    edited July 2008
    Leonardo wrote:
    I'm still confused about the .inf issue. The machine that will host the 8800GT is running WinXP SP3 (32bit). Is that a factor for the .inf thing? Please tell me what else I need to do to prepare.

    This thread at Folding Forum recommends using drivers 174.55. Are those recommended because that thread was opened before newer drivers were out?
    You need a modded .inf for the 8800GT because 177.35 really isn't supposed to support that card (although it does). I had an easy install with the files from here: http://www.laptopvideo2go.com/forum/index.php?showforum=94. That link has the modded .inf. Make sure you use 177.35 and not a later one; I've heard the later ones don't have CUDA.

    I think the 175.44 driver is the one recommended in the GPU2 FAQ because it is the latest that "officially" supports CUDA for all cards, but the 177.35 driver seems a lot more stable and it folds twice as fast on an 8800GT.

    I had good luck just going to Add/Remove Programs in the Control Panel, uninstalling the previous nVidia drivers, and installing 177.35 with the modded .inf.
  • _k_ P-Town, Texas Icrontian
    edited July 2008
    I just changed to the 177.39 driver and I still get the same rate as with the older drivers. I haven't seen a difference in the folding rate between drivers.
  • everyguy El Cajon, California
    edited July 2008
    _k_ wrote:
    I just changed to the 177.39 driver and I still get the same rate as with the older drivers. I haven't seen a difference in the folding rate between drivers.
    What kind of rate are you getting? I'm getting about 4500-4700 with the 8800GT without overclocking. Maybe I had something wrong when I was running the 174.55 driver. I was only getting about 2500 with that one.

    This is in Vista by the way.
  • _k_ P-Town, Texas Icrontian
    edited July 2008
    I am getting 2970 per day, roughly. That is based on the production trend over an hour-long period and then projected out to a 24-hour period. I am also running SMP with it, as well as doing my daily stuff on here.
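
    The projection is nothing fancy, just the hour's output scaled out to a day; a quick sketch of the arithmetic (the hourly figure is an example chosen to match 2970):

    ```python
    points_last_hour = 123.75      # points earned over the measured hour (example)
    ppd = points_last_hour * 24    # project out to a 24-hour period
    print(ppd)                     # 2970.0
    ```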
  • everyguy El Cajon, California
    edited July 2008
    _k_ wrote:
    I am getting 2970 per day, roughly. That is based on the production trend over an hour-long period and then projected out to a 24-hour period. I am also running SMP with it, as well as doing my daily stuff on here.
    If your daily stuff is gaming, I can understand why it might take a hit. Otherwise it seems pretty low. I went from 174.55 directly to 177.35, and that's what I'm running now. It's running SMP too. It's a quad-core Phenom at 2.3 GHz.
  • mas0n howdy Icrontian
    edited July 2008
    _k_ wrote:
    I am getting 2970 per day, roughly. That is based on the production trend over an hour-long period and then projected out to a 24-hour period. I am also running SMP with it, as well as doing my daily stuff on here.

    That does seem low. If you are running SMP + GPU2, did you set GPU2 to use "Slightly Higher Core Priority" on the advanced tab of the configuration? If not, you need to. When you look in Task Manager, "FahCore_11.exe" should be at a solid 25% CPU usage. It should be getting one dedicated core/thread and not fighting for priority with SMP or whatever else you are doing. If you aren't already, run FahMon and see what it tells you after it's had an hour or so to average out your PPD.
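
    If you'd rather not stare at Task Manager, a small sketch with the third-party psutil module does the same check (psutil reports percent of a single core, so divide by the core count to match Task Manager's scale):

    ```python
    import psutil

    ncpu = psutil.cpu_count()
    for proc in psutil.process_iter(["name"]):
        if (proc.info["name"] or "").lower() == "fahcore_11.exe":
            # cpu_percent is relative to one core; dividing by the core count
            # matches Task Manager, where one dedicated core of a quad is 25%
            usage = proc.cpu_percent(interval=1.0) / ncpu
            print(f"FahCore_11.exe: {usage:.0f}% of total CPU")
    ```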
  • _k_ P-Town, Texas Icrontian
    edited July 2008
    Well, I just fixed the acid-vision problem, so we will see if it goes up any; real quick, it was saying 3020.
    //OK, after 30 minutes of it running while I watched stuff, the projection is 4547.87; fo'sho. So apparently the driver issue was slowing everything down a lot more than I was thinking. I uninstalled the video drivers...and a few others by accident.....and loaded up 177.35, and it's giving me those numbers as well as fixing the acid-vision problem in all video games. It's kind of sad to no longer be able to play TF2 with crazy Ghost Riders coming at you shooting boxes of flames in every direction and what looks like mysterious satanic text over all the walls.
    //4986 PPD
  • Leonardo Wake up and smell the glaciers Eagle River, Alaska Icrontian
    edited July 2008
    Alrighty, I've got GPU Folding now. Here are the pertinent hardware specifications and software settings:

    • System 4 (signature) running WinXP SP3
    • XFX 8800 GT (stock), 600MHz Core/1500MHz Shader/700MHz DDR3, 256MB, 256-bit
    • Video card settings - "3D Settings" option set to "Performance;" all other settings at default
    • Video drivers: ForceWare 177.35 with modified .inf
    • F@H GPU2 6.12B8
    • One WinSMP client running
    • Affinity Changer uninstalled
    • GPU2 Fahcore_11 affinity is set to Core 3 (fourth core of the Q6600); each WinSMP FahCore_A1.exe is set for affinity to cores 0, 1, and 2
    • GPU2 configuration: -forceasm and -verbosity 9 flags set, Core Priority set to "Slightly higher," CPU Usage percent set to maximum. "Do NOT lock cores to specific CPU" I left unchecked
    I'll let this run overnight and we'll see what production data develops.

    It's been years - about five I think - since I've tweaked video cards for performance, other than quality settings for photo editing. So I'll be looking for your advice and assistance in optimizing this video card. I've already installed Riva Tuner and familiarized myself with it. I experimented for a little while but set all the video card hardware settings back to stock. I will be removing the stock thermal paste on the GPU to replace with premium TIM.
  • lemonlime Canada Member
    edited July 2008
    Very cool :)

    I've got an older 640MB 8800GTS that I'll be giving this a try with.
  • mas0n howdy Icrontian
    edited July 2008
    FYI: I'm running GPU2 on my HD4870 now and am getting ~2200 PPD. The word from the folding community seems to be that the 4800 series will not see much better performance until the next core optimization and/or until the WUs contain larger proteins. When I run GPU2 and watch the "GPU Load Meter" in CCC, it hangs between 50 and 70 percent while CPU usage is double that of GPU2 on nVidia cards, so it looks like there is much optimization to be done. I'll miss the PPD from the 9800GTX, but man, this card rocks. :)
  • Leonardo Wake up and smell the glaciers Eagle River, Alaska Icrontian
    edited July 2008
    mas0n, K, and Everyguy, results are in for three GPU2 work units. The average PPD is about 3700. Just to refresh your memory:
    • System 4 (signature) running WinXP SP3
    • XFX 8800 GT (stock), 600MHz Core/1500MHz Shader/700MHz DDR3, 256MB, 256-bit
    • Video card settings - "3D Settings" option set to "Performance;" all other settings at default
    • Video drivers: ForceWare 177.35 with modified .inf
    • F@H GPU2 6.12B8
    • One WinSMP client running
    • Affinity Changer uninstalled
    • GPU2 Fahcore_11 affinity is set to Core 3 (fourth core of the Q6600); each WinSMP FahCore_A1.exe is set for affinity to cores 0, 1, and 2
    • GPU2 configuration: -forceasm and -verbosity 9 flags set, Core Priority set to "Slightly higher," CPU Usage percent set to maximum. "Do NOT lock cores to specific CPU" I left unchecked
    What tweaks for the card, Folding configuration, or whatnot should I try? I will also be overclocking the card, but will take that slow and easy. The last time I overclocked a video card was about five years ago. I'm reading that overclocking the shader clock is more effective than overclocking the core clock.

    But anyway, I'm listening closely for any suggestions...more points for Team 93!
  • everyguy El Cajon, California
    edited July 2008
    It seems a little low to me. I'm getting over 4500 with exactly the same card and drivers. What is the max PPD reported by FahMon? Are you using the computer for other stuff? If you use the newest GPU2 core, it allows you to set the affinity with an environmental variable. You probably know that you have to reset the SMP affinities every time a new WU starts or when you reboot.
  • everyguy El Cajon, California
    edited July 2008
    FWIW, here is the FahMon report and some information about my rigs that I recently sent to _k_ in a private message. I'm not overclocking anything.

    [Attached screenshot: FahMon_everyguy.jpg]
    AMD: AMD Phenom Quad Core 9850 "Black" 2.5 GHz, 32-bit Vista, 9800GTX
    Abexia: AMD Phenom Quad Core 9600 2.3 GHz, XP home, 8800GT
    Ku-chan: AMD Athlon X2 Dual-Core 5600 2.8 GHz, XP Pro, 8500GT (I'm not running GPU2 on this card at the moment. I lose from the SMP what I gain from GPU.)
    All running SMP client ver. 5.92 and GPU2 ver. 6.12, beta 8 with 1.06 core, 177.35 drivers.

    Also, Ideaholic is a Toshiba notebook with a Turion 64 X2 running the SMP client at 1.6 GHz.

    The ones that say GPU are the cards. Abexia is actually a bit lower in real life because my sons use it for gaming. The affinities on the quad cores are set with GPU2 running on core 3, and the SMP clients running on cores 0, 1, 2.
  • Leonardo Wake up and smell the glaciers Eagle River, Alaska Icrontian
    edited July 2008
    everyguy wrote:
    It seems a little low to me. I'm getting over 4500 with exactly the same card and drivers. What is the max PPD reported by FahMon? Are you using the computer for other stuff? If you use the newest GPU2 core, it allows you to set the affinity with an environmental variable. You probably know that you have to reset the SMP affinities every time a new WU starts or when you reboot.
    It seems low to me also. No, I was not aware that one had to reset affinities at every new work unit.

    No, I'm not using the computer for other things, maybe just half an hour per day.

    "Environmental variable?" I've set the Core Priority to "Slightly higher," if that's what you mean.

    In Task Manager, if you right-click on the GPU2 Core_11 and the FAH Core_a1, there is an option to set Priority - Realtime, High, AboveNormal, Normal, BelowNormal, and Low. What should those be set to, both for GPU2 and WinSMP?

    As for relatively low production, I had wondered if the card was underclocked or something. Riva Tuner and GPU-Z both show the card to be at default of 600/1500/700.
  • Leonardo Wake up and smell the glaciers Eagle River, Alaska Icrontian
    edited July 2008
    GPU2 units folded so far have been 5007 and 5214, which are credited for 480 and 479 points respectively.
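
    At those credits the turnaround per unit falls out directly; a quick sketch of the arithmetic, using my ~3700 PPD average as the example rate:

    ```python
    points_per_wu = 480            # credit for a project 5007 unit
    ppd = 3700                     # observed average production (example)
    hours_per_wu = 24 * points_per_wu / ppd
    print(round(hours_per_wu, 1))  # ~3.1 hours per unit
    ```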

    Is there any advantage to setting a flag for -advmethods? I did, though, set the config to allow upload and download of assignments over 10MB.
  • mas0n howdy Icrontian
    edited July 2008
    Looks like the AMD/ATI cards are getting totally different WUs (which makes sense): 4730, 4732, 4726, all worth ~200 points.
  • Leonardo Wake up and smell the glaciers Eagle River, Alaska Icrontian
    edited July 2008
    Leonardo wrote:
    In Task Manager, if you right-click on the GPU2 Core_11 and the FAH Core_a1, there is an option to set Priority - Realtime, High, AboveNormal, Normal, BelowNormal, and Low. What should those be set to, both for GPU2 and WinSMP?
    Update. I set GPU FAHCore_11 to Realtime and SMP cores each to Normal. PPD jumped from 3280 to 4220!

    Anything else to try? Can anyone recommend a utility/script or something that will automatically set affinities and priorities for the SMP and GPU cores?
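
    The auto-reapply part looks simple enough to script; here's a rough watchdog sketch with the third-party psutil module (hypothetical, not a tool anyone here has posted; the core split matches my Q6600 setup):

    ```python
    import time

    import psutil

    def enforce():
        for proc in psutil.process_iter(["name"]):
            name = (proc.info["name"] or "").lower()
            try:
                if name == "fahcore_11.exe":
                    proc.cpu_affinity([3])        # GPU2 gets core 3 to itself
                elif name == "fahcore_a1.exe":
                    proc.cpu_affinity([0, 1, 2])  # SMP stays on the other three
            except psutil.NoSuchProcess:
                pass  # a core exited between listing and pinning

    while True:
        enforce()
        time.sleep(60)  # new WUs spawn fresh processes, so re-check every minute
    ```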
  • everyguy El Cajon, California
    edited July 2008
    Leonardo wrote:
    It seems low to me also. No, I was not aware that one had to reset affinities at every new work unit.

    This applies only to SMP work units, not GPU2 units.
    Leonardo wrote:
    "Environmental variable?" I've set the Core Priority to "Slightly higher," if that's what you mean.

    An environmental variable is something like the "PATH" variable you set using Control Panel -> System -> Advanced tab -> Environment Variables. Combined with the 1.07 core, it allows you to fine-tune the affinity for GPU2. It's explained in this thread: http://foldingforum.org/viewtopic.php?f=43&t=3689
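
    The actual variable name is in the linked thread; purely to illustrate the mechanism (the name below is a made-up placeholder, not the real one), setting a machine-wide variable from Python might look like this:

    ```python
    import subprocess

    # "FAH_AFFINITY_PLACEHOLDER" is NOT the real variable name; see the
    # foldingforum.org thread above for the one the 1.07 core actually reads.
    # setx stores the value in the registry, so newly started programs see it;
    # "8" assumes a bitmask format (binary 1000 = core 3).
    subprocess.run(["setx", "FAH_AFFINITY_PLACEHOLDER", "8"], check=True)
    ```
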
    Leonardo wrote:
    In Task Manager, if you right-click on the GPU2 Core_11 and the FAH Core_a1, there is an option to set Priority - Realtime, High, AboveNormal, Normal, BelowNormal, and Low. What should those be set to, both for GPU2 and WinSMP?
    Realtime will kill you if you try to do anything else on the computer. It's not really necessary if you are using the computer just for folding. You can leave this at the default.
    Leonardo wrote:
    As for relatively low production, I had wondered if the card was underclocked or something. Riva Tuner and GPU-Z both show the card to be at default of 600/1500/700.

    If I forget to tweak the SMP affinity settings each WU and let them share core 3 with GPU2, I get about the PPDs you are showing. I wonder if that might be the problem. I'm not at the computer with the 8800, so I can't tell you what the clocks are on my card.
  • _k_ P-Town, Texas Icrontian
    edited July 2008
    Those are the defaults for the 8800GT, except the memory; it should be 900.
  • Leonardo Wake up and smell the glaciers Eagle River, Alaska Icrontian
    edited July 2008
    Alright, guys, I'll take a look at what you advised.

    In the meantime though, take a look at Process Lasso. You can set affinities and priorities for each process. When a process starts, Lasso automatically assigns whatever affinity and priority you have preset. It works perfectly. I'm using it now.

    4200 PPD seems to be steady now. That is with core 3 assigned exclusively to the GPU FAHCore and cores 0, 1, and 2 assigned to the WinSMP FAHcore (all four instances).

    BTW, the memory on my 8800GT is default 700MHz.
  • _k_ P-Town, Texas Icrontian
    edited July 2008
    http://www.newegg.com/Product/Product.aspx?Item=N82E16814133205 That is a base GT with no OC; of course, it shows the effective rate of the mem. Don't know what's going on with yours.
  • Leonardo Wake up and smell the glaciers Eagle River, Alaska Icrontian
    edited July 2008
    I don't think there's anything wrong with my card. GPU-Z shows the memory to be 700MHz, so I'm assuming the effective DDR rate to be 1400MHz. It's pulling 4300 now consistently. I believe everything is alright.

    Still though, does it make any difference with the advanced settings for the card, such as all the filtering, buffering, and texture options? I'm really quite a simpleton when it comes to 3D rendering.
  • edited July 2008
    I remember reading in the OCForums F@H forum that shader speed is what affects ppd most, Leo. You might try overclocking the shaders and see if the ppd comes up. :)
  • Leonardo Wake up and smell the glaciers Eagle River, Alaska Icrontian
    edited July 2008
    Yes, I read the same thing. I've played just a wee bit so far with overclocking this card. Both times GPU Folding errored out - "UNSTABLE MACHINE". Later today I'll be giving it another shot, perhaps boosting PCI-e voltage a bit more. From what I'm seeing though, 4300 seems to be a decent production for a no-frills 8800GT.
  • _k_ P-Town, Texas Icrontian
    edited July 2008
    Boosting the PCI-e voltage shouldn't help if you're increasing it and GPU2 still crashes. The voltage on the card stays the same unless you create a new BIOS with increased voltage to the memory and core. If I were you I would download ATITool and RivaTuner at least, since the thing failed on you. If you want to play with OCing the card, change the clocks in RivaTuner, and then use ATITool for the stress testing; it reads out the temp for you as well. If you set your fan to 31% it's fairly silent and lets you get close to 700/2000 core/mem. With the stock cooler you can push G92 88s pretty hard without having to do a voltage increase.
  • Leonardo Wake up and smell the glaciers Eagle River, Alaska Icrontian
    edited July 2008
    OK, I didn't know that ATITool worked with Nvidia cards. Yeah, my OC attempts so far have been with Riva Tuner. If I continue with any OC workups today, I'll post a separate thread. I'm afraid I've taken this thread a little off topic.