Leonardo · Wake up and smell the glaciers · Eagle River, Alaska · Icrontian
edited July 2008
OK, up to 4412 PPD on a WU 5006. I boosted the shader by 50 points, which was good for about an additional 100PPD. I also tried a shader clock of 1600MHz, a 100MHz increase. That was stable until the screen saver had run for about half an hour.
In the meantime though, take a look at Process Lasso
Thanks Leo, I've been on the lookout for something like this.
Leonardo · edited July 2008
4700, pretty consistent now
I spent a couple of days on and off looking for the sweet spot with my 8800GT. After MUCH trial and error, I finally found a completely stable configuration. For my XFX GT it's Core 750/RAM 900/Shader 1575. I couldn't get any setting with a shader frequency above 1575MHz to be stable for more than an hour. I ran 775/950/1625 for a short spell, which produced about 4930 points per day. The 750/900/1575 is averaging about 4700PPD.
To say the least, I am now a GPU Folding believer! Running one WinSMP client simultaneously with the GPU2 client, the machine (No. 4 in signature) is cranking out a rate of up to 7400PPD. No, I'm not exaggerating.
That's amazing! What an improvement! What are your affinity settings?
Leonardo · edited July 2008
Affinity settings:
GPU2 on CPU core 3
WinSMP client on CPU cores 0, 1, 2
Priority for both clients is set to "low"
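Those affinity and priority settings can also be applied from a batch file instead of by hand each boot. Windows' `start /affinity` switch takes a hex bitmask where bit N set means the process may run on core N. A minimal sketch of computing the masks for the split above (the client executable names are placeholders, not the real file names):

```python
# Build the hex affinity masks Windows' "start /affinity" expects:
# bit N set means the process may run on CPU core N.

def affinity_mask(cores):
    """Bitmask with one bit set per allowed CPU core."""
    mask = 0
    for core in cores:
        mask |= 1 << core
    return mask

smp_mask = affinity_mask([0, 1, 2])  # WinSMP on cores 0-2
gpu_mask = affinity_mask([3])        # GPU2 client on core 3

# /low matches the "low" priority setting above; exe names are placeholders.
print(f"start /low /affinity {smp_mask:X} fah_smp.exe")   # mask 7
print(f"start /low /affinity {gpu_mask:X} fah_gpu.exe")   # mask 8
```

So cores 0-2 come out as mask `7` and core 3 alone as mask `8`, which is what you'd put in the two `start` lines of a launcher batch file.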
That's what I had mine set to, also, but now I'm experimenting to see if setting all of them to 0-4 brings up the SMP without hurting the GPU too much.
I appreciate you posting your OC settings for the 8800 GT. I have the same card you do and I'm going to try those settings too.
Leonardo · edited July 2008
This is what makes me so depressed...
Look at the bright side:
1) even just a modest system with a dual core processor can knock out 4000PPD if fitted with an 8800GT or better, and
2) 8800GTs can be had for relatively low prices now. I paid $105 shipped for mine, new, sealed, just without the silly (purely my opinion) game.
Look at it this way - it's now probably easier to accumulate massive points than before.
I would say you could take your card even higher on the shader with a new BIOS loaded that had increased voltage. Though past 1700MHz there doesn't seem to be any real improvement in production.
Leonardo · edited July 2008
Yeah, I've considered updating the BIOS and configuring it for more voltage. I've updated motherboard BIOSes too many times to count. I've updated video card BIOSes before, too, but that was years ago, and I'm a bit queasy about it now. This is my first high-performance video card in, I think, five years. For now, I'll just leave it where it is. Later, maybe this winter, when I feel the urge to fix something that isn't broken, I'll update and configure the BIOS.
High performance... you have Quadros listed in your sig. Now if you could get those puppies folding, that would be sick.
Leonardo · edited July 2008
Well, no. Quadros aren't necessarily high performance. Their core clocks and memory specifications aren't any higher than their gaming counterparts. They are just optimized for precision and have drivers certified for a truckload of professional design and drafting applications.
So, is anyone on our team doing GPU2 folding on two NVidia cards? I'm going to get a second card in the next few weeks, and if someone else has already gone through the troubleshooting, that would be nice to know.
Last I saw on the GPU2 FAQ, dual video cards aren't supported for folding. It would be awesome though.
LeonardoWake up and smell the glaciersEagle River, AlaskaIcrontian
edited July 2008
I haven't looked into it very much, but my last readings seemed to show that a second video card's production on a quad core machine was 50% or less of the first card's. That would be about the same total single machine production as the same machine folding one WinSMP CPU unit plus one GPU2 unit. Again, I haven't looked at this closely.
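As a quick sanity check of that comparison, using only the PPD figures quoted earlier in this thread (these are the thread's numbers, not new measurements):

```python
# Rough comparison using the numbers quoted in this thread.
gpu_ppd = 4700              # one 8800GT at 750/900/1575
smp_ppd = 7400 - gpu_ppd    # WinSMP's share of the ~7400PPD combined rate

# If a second card only reaches 50% of the first card's output:
two_gpu_total = gpu_ppd + 0.5 * gpu_ppd

# versus one GPU2 client plus one WinSMP client:
one_gpu_plus_smp = gpu_ppd + smp_ppd

print(two_gpu_total)      # 7050.0
print(one_gpu_plus_smp)   # 7400
```

So at a 50% second-card penalty the two setups land in roughly the same ballpark, which is consistent with the impression above.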
Leonardo · edited July 2008
Again, I haven't looked at this closely.
Now I have looked more attentively. People are now running two cards and two GPU2 clients in the same computer. Over 10,000 PPD in some cases.
Two cards + 2xSMP clients work fine for me here (Vista64). "HTPC1" has an 8600GT and an 8800GT in it, and "TEST 2" has an 8800GT and an 8800GTX. No problems at all.
Holy hell, that's why you're closing so fast on me...
That would explain why you suddenly surged ahead of me!
Isevald, are both of your computer's PCIe slots x16?
Physically, yes (obviously), but the 8600GT+8800GT setup runs on an old 975-based board, so the lanes are split into x8 for each slot. Haven't had time to check the other one (8800GT + 8800GTX), but that board is X38-based (Asus Maximus Formula).
So with the multiple cards, do you have monitors hooked up to them and run GPU2 on each monitor, or do you use the scripts?
Oh, I forgot to mention, I'm using dummy adapters to fool Vista into thinking there are monitors connected to all cards. I also had to use one to get a picture on one of the computers that only had TV-out connected (no display with the 177.35 CUDA-enabled drivers).
But I simplified it, using a DVI-to-VGA adapter like this (no soldering required; just insert the 75-ohm resistors into the VGA connector and secure them with hot glue, takes like 10 minutes).
Leonardo · edited July 2008
Wow! That's so simple!
I simply MUST find more stuff to sell to finance more 8800 purchases.
By the way, Isevald and Enisada. I spent all that hard work and dedication to pass you guys on the Team rankings and then you sneak up on me in the very dark of the night, figuratively stab me in the back, and don't even call an ambulance. What, you guys got upset when I passed you a few weeks ago?
4. You need to set these variables either in a script that launches the client, or before you launch the client. Doing so after starting the client will have no effect until it is stopped and restarted.
That's from the link I posted earlier. I haven't read through the whole thread, but they don't go into any more detail than that.
So with that, you just launch multiple instances of GPU2? I keep reading how to do this, but I'm not grasping it.
I don't understand it either. Maybe it's interesting if you've got a mixed setup with NV and ATI (not all CUDA-capable)? I'm only using the -gpu x switch under advanced config.
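For what it's worth, the recipe people describe for multi-card folding is one client instance per card, each in its own working directory, each given its own `-gpu` index (the switch mentioned just above). A sketch that only assembles those command lines; the folder layout and executable name are made-up placeholders:

```python
# One GPU2 client per card: separate working directory, own -gpu index.
# Folder layout and executable name below are placeholders.

def launch_commands(num_cards, exe="fah_gpu2.exe"):
    """Build one launch command line per video card."""
    commands = []
    for idx in range(num_cards):
        workdir = f"C:\\FAH\\gpu{idx}"  # each instance gets its own folder
        commands.append(f"cd /d {workdir} && start /low {exe} -gpu {idx}")
    return commands

for cmd in launch_commands(2):
    print(cmd)
```

Each instance keeps its own config and work files in its own folder, so the two clients don't trample each other's queue; only the `-gpu` index differs.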
min - 29, max - 100; GPU temp has not exceeded 62C according to both SpeedFan and Riva Tuner.
What makes my day brighter... one new PC can make my numbers look GREAT again.
EDIT: Scripts? Haven't seen those...got a link?
I used the guide that seems to be linked to everywhere: http://soerennielsen.dk/mod/VGAdummy/index_en.php
?
Huh?
I think you called 'im out for a straight-up fight...
...and got one.