Basically I have a monitor that accepts both VGA and DVI input, but I only have a VGA cable... and the only card I have has nothing but DVI ports (this is a lol situation).

And I have an adapter that converts DVI to VGA... so I plug that into the graphics card, put my VGA cable into it, and plug the other end of the VGA cable into the monitor.

But the monitor does not detect any active display and goes straight into power-save mode.

But if I put in an old VGA card and use the VGA cable with the same monitor, it gives a display.

The card is not the issue (it's an X1300): it also has a DVI output, and when I use the VGA adapter on that, it again gives no display on the monitor. And it's not the VGA adapter that's the problem, because I've used different ones and they all give no display.

I'm trying to get this sorted so I can put in my spare 7950GT and use that instead... but it only has two DVI outputs. (Yes, I have tried both of them, before you ask.)

Have you recently confirmed the 7950GT works at all? Does it display on a DVI monitor? If not, I'd assume it is dead or incompatible with your motherboard (rare, but it can happen). If it doesn't work in another system with different hardware, it's a pretty sure bet the card is done for.

If they are functioning correctly, yes. DVI-I (what is on the cards) includes DVI-D (digital) and DVI-A (analog). The adapter you are using should be a DVI-A to VGA adapter, which basically means it is only pulling the analog signal out.

DVI-D will look better than VGA on a digital screen (like an LCD). Make sure the cable is DVI-D, though. Some monitors won't accept DVI-A/DVI-I.
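To put that in concrete terms, here's a minimal sketch (Python, with made-up names -- it just models the connector logic above, not any real driver API) of which signal types survive each link in the chain from card to monitor:

```python
# Minimal model of which signals each link in a video chain can carry.
# Port names and signal sets are illustrative assumptions; they only
# encode the rule that DVI-I = DVI-D (digital) + DVI-A (analog).

SIGNALS = {
    "DVI-I": {"digital", "analog"},  # carries both (what's on most cards)
    "DVI-D": {"digital"},            # digital only
    "DVI-A": {"analog"},             # analog only
    "VGA":   {"analog"},             # analog only
}

def chain_signals(*links):
    """Intersect the signal sets of every link; empty set = no picture."""
    carried = {"digital", "analog"}
    for link in links:
        carried &= SIGNALS[link]
    return carried

# DVI-I card -> DVI-A/VGA adapter -> VGA cable -> VGA monitor input:
print(chain_signals("DVI-I", "DVI-A", "VGA", "VGA"))    # {'analog'} -> should work

# Same adapter, but the monitor's DVI input is DVI-D only:
print(chain_signals("DVI-I", "DVI-A", "VGA", "DVI-D"))  # set() -> blank screen
```

The second case is the blank-screen scenario: if any link strips the analog signal, or the display's input is digital-only, nothing comes out the other end.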

I have to assume it really, really doesn't like VGA adapters. I have no idea how or why, but that's really the only conclusion I can draw. The cheapest thing to try (and a better picture to boot) is a DVI-D cable. Again, make sure it is DVI-D. Some monitors won't accept DVI-I.

Yes, D-Sub 15 (15 pins) with male or female ends. DVI-A, apart from the arrangement of pins, is exactly the same as VGA--at least in theory. I've encountered situations where DVI-D worked but DVI-A didn't, but as I said, that's rare, and the odds of it happening across three cards at once are non-existent.

If the TV takes VGA input via DVI, then maybe, but I doubt it.
At that point, only the analog signal (VGA) would be going over the wire.
I really don't know if monitors/TVs can take analog input through DVI.

FordGT90Concept will probably know more than I do about VGA input via DVI.

I highly doubt that will work. Most monitor DVI inputs are DVI-D because they only accept a digital signal. If it is a DVI-I input, you could certainly try it, but my money is on it not working. I think most monitors have a DVI-I port internally and then split it into VGA and DVI-D, which cuts costs.

I realize that this topic is a skeleton by now, but for anyone who cares: another issue could be that the new video card may be too demanding for your PSU and could take it over its wattage limit. Anyone who runs into this issue with an adapter of this sort should check power demands. It seems rather silly, but it's actually a very common mistake!
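If you want to sanity-check the power side, the arithmetic is simple. All the wattage figures below are placeholder assumptions -- substitute your actual PSU rating and the real draw of your card and CPU:

```python
# Rough PSU headroom check. Every number here is a placeholder
# assumption -- look up your own PSU rating and component draws.

PSU_RATING_W = 300   # e.g. a typical OEM desktop supply
HEADROOM = 0.80      # plan to use no more than ~80% of the rating

system_draw_w = {
    "cpu": 95,               # assumed CPU TDP
    "gpu": 80,               # assumed draw for an older card like a 7950GT
    "drives_fans_board": 50, # rough allowance for everything else
}

total = sum(system_draw_w.values())
budget = PSU_RATING_W * HEADROOM

print(f"Estimated draw: {total} W, safe budget: {budget:.0f} W")
if total > budget:
    print("Over budget -- the PSU may brown out under load.")
else:
    print("Within budget.")
```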

That obviously makes DVI-A to VGA adapters look like the guilty party, but no, you said you used 3 different ones with the same results.

I guess that leaves only one question: how many monitors are you trying this on? Just one? Can you try a different monitor to see if it yields the same results?

My monitor with DVI just died, and the only other monitor I have is VGA-only, so I went and bought a DVI-female to VGA-male adapter for my DVI cord to plug into the DVI port on the back of my tower, and the monitor is not detected at all. If I read your post correctly, does that mean it will never work and I just wasted money?

Well, it won't turn on and makes a quiet static sound as it tries to start. My second monitor works fine, but it only has VGA. The tower turns on fine, and I just got a 1 TB hard drive for it. If it's the graphics card, wouldn't the other monitor not work as well? Anyway, the tower is an HP, originally with Windows 7 but upgraded to 10. The working monitor is an HP vs17, the blown monitor is an HP w1907, and the monitor not working with the adapter is a roc LCD monitor (and the newest one I have).