Geforce 6200 Overclocking... Unlocking Pipelines

Comments

  • edited April 2007
    azrael201 wrote:
    Believe me, there is still much interest now that the price of this card has dropped to $25 + shipping. This is, IMO, the best value card out there for what it can do.

    I am not familiar with how RivaTuner works, but I'm assuming the modifications it makes to the card require software to be loaded in Windows? What exactly is the trade-off between using RivaTuner and, theoretically, a working 6600 BIOS? Can you explain a bit more about RivaTuner scripts as well?

    Does your DVI port work after you flashed it to PX6600.rom? Or did you mean when you flashed it back to the original 6200.rom?

    What is your feeling about the Chinese article? Do you think they were able to unlock those pipes permanently (especially after examining their performance tests on the second page)? It seems like their main argument for using the 6600 BIOS was mainly a more stable and higher overclock than RivaTuner could provide?

    There is absolutely no trade-off with RivaTuner. Based on my experience, I recommend keeping the 6200 BIOS and using RivaTuner. There is detailed information in this thread, and you can use the following link to see how to use RivaTuner.
    http://www.doomedpc.com/?q=node/36

    You can find information about using the scripts in the RivaTuner documentation included in the download package.

    I did not test whether the DVI port was working with the 6600 BIOS. I only tested the analog port, and it was working.

    I added my overclock and 3DMark2005 score in the previous post. I did the overclocking using Coolbits in the Nvidia driver. The GPU went from 300MHz to 600MHz. I have never seen a 100% overclock on a video card before.
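    For anyone who has not used Coolbits before: it is nothing more than a registry value that tells the ForceWare control panel to show its hidden clock-frequency page. Below is a minimal sketch of setting it programmatically; the key path and the 0x3 bitmask are from memory for driver versions of that era and may differ on yours, so treat it as an illustration rather than a guaranteed recipe (back up your registry first, and run it as Administrator).

    ```python
    # Minimal sketch: set the "CoolBits" DWORD that older ForceWare drivers
    # read to expose their hidden clock-frequency (overclocking) page.
    # Key path and bitmask are assumptions from memory; adjust for your driver.
    import winreg  # Windows-only standard-library module; requires admin rights

    KEY_PATH = r"SOFTWARE\NVIDIA Corporation\Global\NVTweak"  # assumed location

    def enable_coolbits(bits: int = 0x3) -> None:
        # CreateKeyEx opens the key if it exists and creates it otherwise.
        with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                                winreg.KEY_SET_VALUE) as key:
            winreg.SetValueEx(key, "CoolBits", 0, winreg.REG_DWORD, bits)

    if __name__ == "__main__":
        enable_coolbits()
        print("CoolBits set; reopen the Nvidia control panel to find the clock page.")
    ```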
  • edited April 2007
    OK, I thought I read that the BIOS overclock was always the best way to do it, but I will try to find someone to translate that page for me and find out what's going on.

    I highly doubt they would write such an extensive article about flashing it to 6600 unless it was working, at least with unlocked pipes.

    Did the Vantec Orb help significantly with the OC?
  • edited April 2007
    azrael201 wrote:
    OK, I thought I read that the BIOS overclock was always the best way to do it, but I will try to find someone to translate that page for me and find out what's going on.

    I highly doubt they would write such an extensive article about flashing it to 6600 unless it was working, at least with unlocked pipes.

    Did the Vantec Orb help significantly with the OC?

    Yes, the Vantec cooler decreased the GPU load temperature by at least 10°C and enabled stability at 600MHz.
  • edited April 2007
    Hey mirage,

    Someone over at SlickDeals found the original Leadtek 6600 BIOS, and it works.

    It's over at MVKTech; look for the first revision of the Leadtek 6600 BIOS.


    I had it all along and never knew... I spent hours looking for it, and it was right under my nose. I feel so defeated that someone beat me to it.
  • edited April 2007
    azrael201 wrote:
    Hey mirage,

    Someone over at SlickDeals found the original Leadtek 6600 BIOS, and it works.

    It's over at MVKTech; look for the first revision of the Leadtek 6600 BIOS.


    I had it all along and never knew... I spent hours looking for it, and it was right under my nose. I feel so defeated that someone beat me to it.

    Thank you for finding that BIOS; I will try it right away. :rockon:
  • edited April 2007
    Are these cards SLI-able? I don't think there is an SLI connector, but some people claim there is. I remember you mentioned it was.

    Never mind, someone mentioned they were, through PCIe, and I remembered I saw that in the BIOS of my LANParty.


    Do you really think the BIOS flash will be good for anything other than 6600 SLI? The Taiwanese article does state it gave them better overclocks, but I'm skeptical.
  • edited April 2007
    azrael201 wrote:
    Are these cards SLI-able? I don't think there is an SLI connector, but some people claim there is. I remember you mentioned it was.

    Unlike the 6600GT, SLI-capable 6600 cards do not have the SLI bridge (i.e., the PCIe bus is fast enough for these cards). If the 6600 conversion is successful, those 6200 cards could work in SLI mode. I don't have any SLI motherboard here to test, though. If you test it, why don't you post here? I am curious.

    By the way, I tested the BIOS you mentioned. It indeed unlocked all 8 pipes and the card was recognized as a 6600. But my overclock was not as good: I could only push the GPU up to 450MHz, vs. 600MHz with the original BIOS. In the end, I went back to the original BIOS.
  • edited April 2007
    I find that interesting, since the whole premise of that Taiwanese article was that you could obtain higher overclocks after using the 6600 BIOS. Perhaps it varies from card to card, but I really doubt it.

    Thanks for the heads-up. I do have an SLI motherboard; however, I do not have 2 of these cards :(

    I bought one just for my media PC, to tide me over before I get a DX10 card (and pass down my 7800GTX). I really don't believe it would be economical to get super-cheap cards in SLI, especially since SLI still lacks considerable software support and a single-card solution is still better overall.
  • edited April 2007
    This is a fantastic thread - my thanks, and I'm sure many other people's, to RWB and all the contributors.

    I'm not sure whether this is a call for help or just a glum note on 6200s. I have a Celeron 2.26 on an ASRock P4i65G with 768MB RAM and an XFX GeForce 6200 256MB via AGP 8x with 93.71 drivers (WinXP Home SP2). Not the highest-spec machine, granted. Nor am I a hardcore action gamer -- in fact, we only bought the card to support my son's "Cars" game.

    Having done a bit of research, the 6200 seemed good enough for the purpose, with various reviews reporting a 3DMark05 score of between 850 and 1500 (depending, of course, on RAM and CPU, let alone the individual card). However, "Cars" was horribly jerky, and on other games performance was *worse* than the onboard Intel thing.

    Overclocking seemed a bit of a sledgehammer -- surely this card should perform better than the onboard straight out of the box? So I looked for obvious things but could find nothing. The pixel pipe unlock which started this thread is unavailable on the XFX (in fact, if you do it, it takes you down to 2 pixel pipes -- sweet! :rolleyes2). Finally, I overclocked to 409/540, took down all the background stuff on the PC (firewall et al.), took the colour down to 16-bit, and got a whopping 3DMark05 score of..... 521/775 (3D/CPU). :eek: Stunning.

    BTW, the only reason I'm using 3DMark05 is that it's what most of the reviews used, so I wanted to compare like-for-like. The 06 score, overclocked and with 16-bit colour, was a stupendous 348/182. Takes an age to run too :bigggrin:

    I think that's it, unless anyone has any other suggestions? Highly disappointed, I have to say, and more than a little peeved that I've had to shell out to be able to play a game but have been left with performance worse than the onboard :puke:
  • edited April 2007
    The XFX GeForce 6200 is probably not the NV43A. I would not be surprised if it was the NV44... the one that does not unlock and does not overclock AS well.

    I thought 3DMark only gave one score... I downloaded them recently off the internet, and they make you check your score online; there is a "comparison" button where you can check how your score compares to people with similar-spec rigs.


    Ah... I just Googled it, and it is the NV44.

    Oh yes... there is no reason why it should not outperform your onboard by a significant margin.
  • edited April 2007
    Thanks for the reply, azrael.
    azrael201 wrote:
    I thought 3DMark only gave one score... I downloaded them recently off the internet, and they make you check your score online; there is a "comparison" button where you can check how your score compares to people with similar-spec rigs.
    The second score I give is for the CPU tests, just for interest really.

    Oddly enough, there is no one else on the site with a similar-spec machine to mine ;D The weird combination of the Celeron and 6200, I think, neither of which I would guess hardened action gamers would touch with a bargepole. Still, it doesn't stop me being ranked 18,000+ :cool:
    azrael201 wrote:
    Oh yes... there is no reason why it should not outperform your onboard by a significant margin.

    Well, at least it's good to know. Still doesn't stop me being so grumpy though, 'cos it don't and I'm narked off. 'Scuse me while I go and have another little sulk.
  • RWB Icrontian
    edited April 2007
    I think it's the processor. I looked up some scores on the ORB and found some people right around your range with 351/501 speeds on their card, running P4s at 2GHz+ and getting like 880 points vs. your 521 with a Celeron... it's really hard to say. But really, 3DMark05 is too advanced for this card.

    Frankly, this card just blows compared to others out now; even on the cheap... though I am curious if your Intel graphics card is still enabled.
  • edited April 2007
    RWB wrote:
    I think it's the processor. I looked up some scores on the ORB and found some people right around your range with 351/501 speeds on their card, running P4s at 2GHz+ and getting like 880 points vs. your 521 with a Celeron... it's really hard to say. But really, 3DMark05 is too advanced for this card.

    Frankly, this card just blows compared to others out now; even on the cheap... though I am curious if your Intel graphics card is still enabled.

    I have no way of knowing if the Intel is enabled. All I get in the BIOS is setting the priority to AGP/PCI. It's obviously picking up the AGP as a) it's working (d'oh!) and b) the AGP-specific BIOS options are enabled and the PCI-specific ones disabled. Whether that means the onboard itself is disabled, I dunno though.

    Thanks for the thoughts though. All this goes back to when I got the extra half gig of memory. After that the original Celeron blew, so I got my friendly chap-with-a-screwdriver to replace the mobo; he was going to put in an AMD chip but then found the half-gig of memory needed an Intel chip. Live, learn, and always predict the future!!
  • edited April 2007
    I tried to run 3DMark06 with my unlocked PCIe GF6200 (NV43A2), but 3DMark06 deselected the HDR/SM3.0 tests and gave me a score of 1150. We all know that this is exactly the same chip as the GF6600, with HDR and SM3.0 capabilities, and I was wondering why 3DMark06 had disabled the HDR/SM3.0 tests. Then I found that Nvidia had configured the drivers to disable the HDR capabilities (not SM3.0) of all 6200 cards by default. The solution was in RivaTuner again. I went to the NVStrap driver settings and did the following:
    - selected "custom" for "Graphics adapter identification"
    - selected "0141 (NVIDIA GeForce 6600) PCI Device ID"
    - and selected "Use ROM straps for PCI DeviceID programming" option

    After clicking "Apply", RivaTuner rebooted the computer, and this time Windows recognized the card as a 6600. I had to reinstall the ForceWare driver and set the overclock to 600/1300 again using Coolbits. With these done, 3DMark06 ran the HDR/SM3.0 tests and the score went from 1150 to 1650.

    Just to let others know how to enable HDR on unlocked GF6200s without cross-flashing to a 6600. Without unlocking and overclocking, enabling HDR will not mean much, since the crippled chip does not have the power to handle HDR. But with unlocking and overclocking, my 6200 has the same level of performance as a 6600GT, so it makes sense to have HDR as well.
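    If you want to double-check that the NVStrap override actually took effect, the device ID Windows sees shows up in the video controller's PNP device ID string. Here is a rough, unofficial sketch of reading it (it assumes a Windows machine with the standard wmic tool available); after the RivaTuner change it should report DEV_0141 instead of the 6200's original ID.

    ```python
    # Rough sketch: read the video controller's PNP device ID via WMIC and
    # check whether the PCI device ID now reads 0141 (GeForce 6600).
    # Assumes Windows with the standard "wmic" command on the PATH.
    import re
    import subprocess

    def reported_device_ids():
        out = subprocess.run(
            ["wmic", "path", "Win32_VideoController", "get", "PNPDeviceID"],
            capture_output=True, text=True, check=True,
        ).stdout
        # PNP IDs look like: PCI\VEN_10DE&DEV_0141&SUBSYS_...
        return [d.upper() for d in re.findall(r"DEV_([0-9A-Fa-f]{4})", out)]

    if __name__ == "__main__":
        ids = reported_device_ids()
        print("Reported PCI device IDs:", ids)
        print("NVStrap override active:", "0141" in ids)
    ```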
  • edited May 2007
    I just want to say that I'm a PROUD owner of an XFX GeForce 6200 128MB GPU, the reason being that it's an amazing overclocker! I've read threads of people saying that it's a poor/cheap performer, and I do agree that the stock 6200 isn't the greatest GPU, although if you know how to overclock and you have the right hardware and driver combinations you can do amazing things with this GPU. First you want to make sure that you have the latest drivers
    (see the thread below for the rest)
  • edited May 2007
    Enjoy overclocking the very overclockable 6200. Just remember to get the latest GeForce driver. It's vital!
  • edited May 2007
    Have these last 2 messages been edited? I would love to know what "the right hardware and driver combinations" are. I already have the latest full Nvidia drivers (93.71), so what next? tc-v1.2 says "if you know how to overclock", so let's assume I don't. What should I be doing? Do tell, please.....:bigggrin:
  • mtrox Minnesota
    edited May 2007
    let's assume I don't. What should I be doing? Do tell, please.....:bigggrin:

    I didn't know s--t a couple of months ago. Mirage got me going on a 6200 card, and then I also read this thread. A great way to get info and get going. Check it out, britesprite.
  • edited May 2007
    Since Nvidia does not list the 6200 as a natural OC, you need to obtain Nvidia drivers that have been re-run through aftermarket software. These can be found on Google... search "nvidia drivers by tweaksRus". Hardware-wise, you need at least 768MB of RAM and a 2.6GHz+ CPU to minimize the strain on the GPU, and most importantly, disable all other tweak software like RivaTuner etc., as they conflict. Excuse my previous threads; I had trouble with my DSL.
  • edited May 2007
    tc-v1.2 wrote:
    Since Nvidia does not list the 6200 as a natural OC, you need to obtain Nvidia drivers that have been re-run through aftermarket software. These can be found on Google... search "nvidia drivers by tweaksRus". Hardware-wise, you need at least 768MB of RAM and a 2.6GHz+ CPU to minimize the strain on the GPU, and most importantly, disable all other tweak software like RivaTuner etc., as they conflict. Excuse my previous threads; I had trouble with my DSL.

    tc, you might need to read this thread, maybe again. Overclocking and unlocking the 6200 has nothing to do with the amount of RAM or the CPU frequency. Actually, you *need* RivaTuner instead of disabling it. Also, there is no need for any Nvidia driver other than the official ones. I had to correct every sentence you wrote; please stop posting nonsense.
  • edited May 2007
    Have these last 2 messages been edited? I would love to know what "the right hardware and driver combinations" are. I already have the latest full Nvidia drivers (93.71), so what next? tc-v1.2 says "if you know how to overclock", so let's assume I don't. What should I be doing? Do tell, please.....:bigggrin:

    Hi britesprite, I checked your first post. It seems to me that your XFX AGP card is not unlockable. I also searched for an unlockable AGP 6200 card, but I could not find one. You would need to upgrade to a PCIe system to use an unlockable 6200 card such as this one. The difference between your AGP 6200 and an unlocked/overclocked 6200 is huge, although both of them are 6200s. Please check my 3DMark06 score above; it is ~1650, and my 3DMark05 score is ~3300. Until then, you will need to put up with your AGP 6200; there is not much you can do with it.
  • edited May 2007
    I've run 3DMark03 with my Medion GeForce Ti 200 (stock) and my XpertVision GeForce 6200 AGP 64-bit 256MB 300/500 (stock).

    Results:
    Medion GeForce Ti 200: 1000 points
    XpertVision GeForce 6200: 804 points :confused:

    I even tested the Medion GeForce Ti 200 on a Pentium 4 1.8GHz with 512MB of 133MHz RAM,
    and my GeForce 6200 on a Pentium D 2.8GHz with 2048MB of 667MHz RAM.

    (Sorry for my crappy English, I'm Dutch.)
    Can anyone help???
  • edited May 2007
    Peddo123 wrote:
    I've run 3DMark03 with my Medion GeForce Ti 200 (stock) and my XpertVision GeForce 6200 AGP 64-bit 256MB 300/500 (stock).

    Results:
    Medion GeForce Ti 200: 1000 points
    XpertVision GeForce 6200: 804 points :confused:

    I even tested the Medion GeForce Ti 200 on a Pentium 4 1.8GHz with 512MB of 133MHz RAM,
    and my GeForce 6200 on a Pentium D 2.8GHz with 2048MB of 667MHz RAM.

    (Sorry for my crappy English, I'm Dutch.)
    Can anyone help???

    Do you mean the GeForce3 Ti 200 or the GeForce4 Ti 4200? Also, 3DMark03 uses DirectX 9.0a, and neither the GF3 nor the GF4 supports DX9.0. If you want an apples-to-apples comparison, you should run 3DMark01. Finally, both the GF3 Ti 200 and the GF4 Ti 4200 have a 128-bit memory bus, compared to the 64-bit bus of your 6200 card; that is a disadvantage. Some 6200s also have a 128-bit bus, but not in your case.
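    To put rough numbers on the bus-width point: peak memory bandwidth is roughly the bus width in bytes times the effective memory clock. A back-of-the-envelope sketch, using your 6200's 300/500 clocks and assuming the Ti 200 runs its memory at 400MHz effective (your card may differ), shows why the older 128-bit card is not starved next to a 64-bit 6200:

    ```python
    # Back-of-the-envelope peak memory bandwidth: bus width (bytes) x effective clock.
    # The Ti 200 clock below is an assumed stock figure, not a measured value.
    def bandwidth_gb_s(bus_bits, effective_mhz):
        return (bus_bits / 8) * effective_mhz * 1e6 / 1e9  # bytes/s -> GB/s

    print("64-bit GF6200  @ 500 MHz effective:", bandwidth_gb_s(64, 500), "GB/s")   # 4.0
    print("128-bit Ti 200 @ 400 MHz effective:", bandwidth_gb_s(128, 400), "GB/s")  # 6.4
    ```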
  • edited May 2007
    If you say so, Mr. Intellect..... You just do me a favour: hook up a 6200 to a Celeron 1.6GHz CPU with 128MB of RAM, overclock the core to 365MHz+ and the DDR2 to 750MHz, then you tell me how well it performs compared to a P4 @ 3.2GHz with 1GB of RAM. I'm trying to help; I've run the card plus others on a range of CPU/RAM combinations, as I am the "repair man" at the computer corp, so I have plenty of time to mess around ;-)
  • RWB Icrontian
    edited May 2007
    tc-v1.2 wrote:
    Since Nvidia does not list the 6200 as a natural OC, you need to obtain Nvidia drivers that have been re-run through aftermarket software. These can be found on Google... search "nvidia drivers by tweaksRus". Hardware-wise, you need at least 768MB of RAM and a 2.6GHz+ CPU to minimize the strain on the GPU, and most importantly, disable all other tweak software like RivaTuner etc., as they conflict. Excuse my previous threads; I had trouble with my DSL.

    I don't have an f'n clue what you're arguing about in the more recent posts, but as for this quoted post, I am curious whether you could give more info on these drivers. Do they unlock the card by default so you don't need RivaTuner? If so, post more.

    But IMHO, for anyone buying a new card on the cheap, there ARE better alternatives. This card is only unlockable to a certain extent, I don't care how much you can overclock it, not to mention that one card may OC well while another may not. There have simply been so many newer technologies released since this card's day that a modern card with similar core and memory speeds would run circles around this one in more modern games.
  • edited May 2007
    mirage wrote:
    Do you mean the GeForce3 Ti 200 or the GeForce4 Ti 4200? Also, 3DMark03 uses DirectX 9.0a, and neither the GF3 nor the GF4 supports DX9.0. If you want an apples-to-apples comparison, you should run 3DMark01. Finally, both the GF3 Ti 200 and the GF4 Ti 4200 have a 128-bit memory bus, compared to the 64-bit bus of your 6200 card; that is a disadvantage. Some 6200s also have a 128-bit bus, but not in your case.

    I've used a GeForce3 Ti 200 175/400.
    I will run the test again with 3DMark01.

    Thanks :wink:
  • edited May 2007
    Thanks for the help and advice guys.

    I downloaded some 101.02 drivers as suggested, but there was no difference. I re-installed RivaTuner, but the pipeline unlock still reduces the pixel pipes from 4 to 2 (which I still find hilarious).

    And I got all excited there for a mo .. oh well :cool:

    At least I know what to do ... get a better processor AND get a better card :rolleyes2 Probably go for the processor first, as a Celeron 2.26, I now realise, is just waaaaayyyy too slow. Spotted a P4 2.8 for < £50, so I may give that a whirl.

    Alternatively, as the card's only there to support one game, I could always just get the ol' screwdriver out again!
  • jamesberkhimer Maryland
    edited April 2009
    RWB wrote:
    OK you see in this image that part that has a number blurred or something? It says...

    "128-bit NV43 (A2 8x1 3vp)" I think that 8x1 part is the part that says you now have 8 pipes instead of the normal 4. kthxbye. :thumbsup:


    OK, I figured it out: the A1 is fixed; you cannot unlock the pipelines or overclock it, it will just freeze up. I've been doing it for an hour following the steps, and the card just crashes even with advanced cooling :sad2:

    If you got an A2, you're lucky :respect: