Quick Question
I have a Sony 21" Multiscan E500 Monitor and a VisionTek GF3 Ti 500 card...
Been using this combo for a long time now no probs..
I was messing around with the desktop settings, and 1600x1200 seemed to look the best to me, but the monitor-supported settings only let the refresh rate go as high as 75 Hz...
That was a little too low for this resolution, so I checked the box that shows all rates... When I clicked Apply I was able to select 85 Hz, so I chose that; the monitor flickered and the desktop came up, so I've decided to try it for a while...
My question / concern is that this is about 10 Hz higher than the default refresh rate that was allowed with the unsupported rates hidden...
Even though it worked, by leaving it this way am I endangering the GF card or the monitor in any way with this setting...?
I know from playing around that if a frequency and resolution combo was too high, the monitor told me by going to a black screen and showing an "out of scan range" message...
I ask because I don't want to be prematurely sucking the life out of these parts...
What are the effects of this?
Thanks,
"g"
Comments
I've attached the spec manual for the MultiScan CPD-E500.
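For reference, here's a quick back-of-the-envelope check (a few lines of Python, but it's really just arithmetic) of what 1600x1200 @ 85 Hz asks of the monitor, so you can compare it against the numbers in the spec sheet. The blanking totals below are GTF-style assumptions, so your driver's actual timings may differ a little:

# Assumed totals including blanking for 1600x1200 (GTF-style estimate)
h_total, v_total = 2160, 1250
refresh_hz = 85

h_scan_khz = v_total * refresh_hz / 1000.0   # horizontal scan rate the monitor must sustain
print(f"Horizontal scan needed: ~{h_scan_khz:.0f} kHz")   # roughly 106 kHz

If the maximum horizontal scan rate listed in the spec sheet is comfortably above that figure, the mode is within what the tube is designed for rather than something it merely tolerates; if it isn't, the monitor will usually protect itself with the same "out of scan range" message you already saw.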
I knew/figured out that the monitor supported it, but I don't think that the GF card does...
I'll use it if it works without causing programs and games to hang up or crash...
Speaking of crash... I installed the latest version of 3DMark03, v3.0.4 or whatever it was...
Anyhoo I've been trying all day to get it to go past the first test...
I can get the demo to run fine, and I have no problem looping 3DMark2001SE or running PCMark04 and AquaMark... Sandra Pro shows no probs, and the DirectX 9b tests all show that everything is installed and working properly...
Dunno..., this seems to be a head scratcher...
I've tried several versions of Nvidia drivers, including the new ones... Seems kinda like the return of the refresh rate problem, but I don't know.... I'm loading up some games now to see if I have probs.. if not, then I gotta say that it's just some kind of bug... who knows...
Thanks,
"g"
John.
Thanks... everything seems to be working fine for now.
BF1942 played just fine; it's really weird that the 3DMark03 bench won't run but the demo will, along with 2001SE, AquaMark, and PCMark04... Heck, all of the regular tests seem to work too, like memtest86, the Sandra burn-in, and Prime95... Everything works except for the 3DMark03 test... I guess at this point I'll just continue loading stuff up and let the chips fall where they may...
"g"
John, it's RAMDAC quality and speed that dictate the card's maximum refresh rate, not the GPU core.
Ti500s & Ti200s use the exact same core, the only difference being clock speeds (240/500 & 175/400). Manufacturers that offer cards based on both cores use the same PCB, RAMDACs and outputs, as they are interchangeable.
The Ti200, Ti500 and every card since VESA ratified UXGA have been capable of displaying 1600x1200 at 60 Hz or higher. I have in my hands the retail box for an ATI Xpert@Play 8 MB PCI, and said device can display 1600x1200 @ 75 Hz with 32-bit color.
As for a PCI Ti500, it would display 1600x1200 @ 32-bit color at a minimum of 60 Hz without trouble either. Furthermore, PCI cards cannot utilize main system memory the way AGP cards can, where the AGP aperture size marks out system memory for texture swapping in the event that the card's onboard memory is full.
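As a rough sanity check on the card side (ballpark figures only: 350 MHz is the commonly quoted RAMDAC rating for the GeForce3 line, and the blanking totals are GTF-style estimates):

ramdac_mhz = 350                               # commonly quoted GeForce3-series RAMDAC rating
h_total, v_total, refresh_hz = 2160, 1250, 85  # assumed totals including blanking

pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6
verdict = "within" if pixel_clock_mhz < ramdac_mhz else "beyond"
print(f"1600x1200 @ 85 Hz needs ~{pixel_clock_mhz:.0f} MHz pixel clock, "
      f"{verdict} a {ramdac_mhz} MHz RAMDAC")

So the mode itself is nowhere near the card's limit; the only real question is whether the monitor's scan range accepts it.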
AGP aperture size should be set to 50% of main system RAM, or, if you have more than 512 MB, set to 256 MB.
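In other words (just the rule of thumb above written out; the actual setting lives in the BIOS, not in software):

def suggested_aperture_mb(system_ram_mb):
    # 50% of system RAM, capped at 256 MB, per the rule above
    return min(system_ram_mb // 2, 256)

for ram in (256, 512, 768, 1024):
    print(f"{ram} MB RAM -> {suggested_aperture_mb(ram)} MB aperture")  # 128, 256, 256, 256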
I have a lot of problems with 3DMark03 too; I'd just make sure that nothing is running in the background at all, e.g. antivirus etc. Turn everything off, then see if it gets through the tests.
Let's say you have 512 MB RAM on board, a 128 MB video card, and a fast box. Windows XP likes to have 256 MB (the recommended amount, and for DirectX 9 I would say treat the recommended amount as the minimum base). A card can reasonably use up to twice its onboard RAM and will never really use much more. BUT if I give a card like this a 256 MB aperture and already have 384 MB in use, then when the full frame aperture is demanded you get problems: suddenly the virtual memory manager has to swap a bunch of stuff out to the swap file on the hard drive, and programs bog down. I run 768 MB RAM on a Barton box with 98 SE and DirectX 9 from the Microsoft redist file; with the frame aperture set to the GRAM (on-card RAM) size rather than half of main RAM, it can fold with two major apps open while doing a virus scan.
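To put numbers on that first scenario (same figures as in the example above, nothing measured):

system_ram_mb = 512
in_use_mb     = 384   # Windows plus the open applications, per the example
aperture_mb   = 256   # AGP aperture reserved for texture spill-over

shortfall_mb = in_use_mb + aperture_mb - system_ram_mb
print(f"If the full aperture gets claimed, ~{shortfall_mb} MB has to go to the swap file")  # ~128 MB

That 128 MB of disk traffic is the "bog" described above, which is why capping the aperture matters on a 512 MB box.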
Oh, a hardware tech note here: the Gainward Ti 4800 SE has a defective WDM 3D Accelerator package on its driver CD, but it will run fine with the Nvidia 4523 driver pack, which as of a visit to http://www.nVidia.com.tw/ yesterday is the latest for that specific card type. The Linux box fetched it and burned it, along with some other things (the DX9.0b redist and other video-related items), to CD for me. This card does not like a 1/2 GRAM aperture; older cards would live with that, as they were made for older boxes.
For game-intensive stuff, you are right. I end up multitasking a LOT in Windows with non-game apps, and there you want the minimal frame aperture that lets the card keep up with the box without bogging the box down with delayed display updates to the monitor. OK, then take this into account: lots of folks see flicker at 70 Hz vertical refresh or less, so 75 Hz is a decent minimum acceptable refresh. I see flicker myself at 70 Hz VR (vertical refresh).
So, it's a balancing game. If the card refreshes faster than the monitor and sends at a faster rate than the monitor can display, you get monitor damage and garbling. So pick a refresh rate the monitor can handle that doesn't flicker like an old TV for your eyes, and set the frame aperture in the BIOS to the minimum needed for your use and conditions. My ratio was not ideal for the card; it was the best balance for having usable RAM left even if the card claims the maximum frame aperture with the BIOS's help. Your ratio is CARD IDEAL, but it might crimp the RAM available for the games themselves if you are multitasking the box with three other programs open. So I set up for overall box stability under heavy load, not one game at a time.
I tend to bench under load (folding plus two apps, say Corel Draw 11 AND WordPerfect 2002), then with no apps and no folding, to get the bog rates for things, and then look at the worst cases.
John.