
View Full Version : GeForce FX 5200 AGP 8x video card- can't get to work properly



jonnyD
07-01-2007, 04:42 PM
My system specs: Windows XP Pro, SP2, AMD Athlon XP 2400+, 1.5 GB RAM
Motherboard: ECS L7VMM(2); chipset: VIA VT8375 ProSavageDDR KM266; Monitor: Gateway FPD1975W TFT LCD

I received a Father’s Day gift of a Gateway 19” widescreen LCD monitor (native resolution 1440 x 900). My onboard display adapter was not providing a 1440 x 900 mode, so the help desk at EZ Tune, the maker of the monitor’s software, suggested I get a display adapter that would support 1440 x 900. I bought a PNY Technologies GeForce FX 5200 AGP 8x video card and installed it. When I try to install the drivers from the manufacturer’s disc, it says “video card not found”. I took the card out, put it back, and checked all connections; same problem. I exchanged it for a new GeForce FX 5200 card. Same problem.

I then downloaded the AIDA32 Personal System Information utility and found out a couple of things: my GeForce FX 5200 is recognized as my current display adapter, and my VGA mode is disabled (how do I enable it, if that’s necessary?). Also, when I click on the monitor icon under the display section of AIDA32, NOTHING SHOWS UP — I figure that’s a big clue to something. Nothing shows up under the Windows Video icon, either. Separately, if I go into my computer’s Device Manager, no display category shows up there either.

I’m not a gamer, by the way, so that’s not a consideration for the card I chose (I know, for example, that my motherboard has an AGP 4x slot, but my card is an AGP 8x card. No biggie). I’m just looking for stutter-free video (movies) with the potential, down the road, to do some video editing as a hobby sort of thing.

In your Super PC paper on Video Cards, you wrote: “But if you're using a flat panel display monitor (FPD) then make sure the video card has a DVI-I connector. A DVI-I connector may or may not be included on a particular video card model, regardless of price, so check to be sure.” Do I need a DVI-I connector? Could that be the problem?

athlonfan
07-08-2007, 07:46 PM
If you have on-board graphics (i.e. integrated), did you disable it in the BIOS menu?
To enable VGA mode:
Go to Start → Run, type msconfig, and press Enter. Click the BOOT.INI tab
and make sure none of the boxes are checked under the Boot Options section
(in particular /BASEVIDEO, which forces Windows to boot with the basic VGA driver).
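For reference, the checkboxes on msconfig's BOOT.INI tab map to switches on the operating-system line in C:\boot.ini. A typical single-boot XP Pro entry looks roughly like this sketch (the disk/partition numbers and switches on your machine may differ):

```ini
[boot loader]
timeout=30
default=multi(0)disk(0)rdisk(0)partition(1)\WINDOWS
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /fastdetect
```

If /basevideo appears at the end of that last line (the /BASEVIDEO checkbox in msconfig), Windows boots with the standard VGA driver instead of the NVIDIA driver; leaving it unchecked lets the installed driver load normally.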

Cirndle
07-09-2007, 11:56 AM
The DVI connector is the white one, and it fits both the monitor and the video card. I am guessing you will use it, since most cards like yours have one. Your card (or monitor) probably came with an adapter that goes from DVI to VGA — from white to blue. If your monitor doesn't have a spot for the white DVI cord, put the adapter on the video card's DVI output and use the blue VGA cord instead.

Owned
07-18-2007, 08:15 PM
That video card most likely has both DVI and VGA outputs.

DVI cables are white.
VGA cables are blue.

Your LCD monitor probably has both DVI and VGA inputs as well. I'd choose DVI if it has it, but make sure your video card has a white DVI port first.