Five years ago I bought an MSI 785GM-E65 motherboard with an MSI HD5770. I play games, but I'm also a full-time graphic artist, so I do some work on this machine at home and use a dual-monitor setup. What I did with this combo was run my main display from the graphics card and the secondary display from the onboard VGA connection, which is an HD4200. That setup worked great this whole time...until the games got better.

I just upgraded to a Zotac GTX660. Right off the bat, I noticed the secondary display did not fire up on the initial reboot. The primary appeared to be tip-top. I figured it was something with the Nvidia software, so I went into the control panel's display settings and noticed no second monitor was detected. I figured that maybe the Nvidia drivers had uninstalled the AMD drivers, so I screwed on the DVI-to-VGA adapter that came with the GTX660, but noticed that the pins on the adapter would not fit the output on the card. I never realized there were two types of DVI connectors, and I wondered why a second adapter wasn't included in the box.

Before I screw anything up, I figured I'd ask you guys...the pros. Do I screw with the onboard BIOS, drivers, or firmware, or is there a cheap adapter I can get to run the second monitor from the GTX660?

At least one of that 660's DVI connectors will support a DVI-to-VGA adaptor.

Put the DVI monitor on the card's other DVI connector, then use the display properties in Control Panel to select which screen is the primary (if you need to).

EDIT (by way of explanation): graphics cards tend to support only a single VGA output these days, so while you often have two DVI outputs, only one of them has the four extra little holes that the VGA signal is sent through.

I'd rather try to run the secondary off the onboard GPU if possible.

It should be possible (though I'm not sure why you'd bother).

Is the onboard graphics adapter showing up under display adaptors in Device Manager? At a guess I'd say it isn't, and the onboard graphics have decided to disable themselves because of the add-in card, though why they didn't when you had the 5770 I've no idea. If you have a hunt around in the BIOS, you should find something to force it on.
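
If you want a quick sanity check beyond Device Manager, here's a rough Python sketch (just my own illustration, assuming Windows and the standard Win32 EnumDisplayDevices call) that lists every display adapter Windows currently knows about. If the onboard chip has disabled itself, it simply won't appear:

    import ctypes
    from ctypes import wintypes

    # Mirrors the Win32 DISPLAY_DEVICE structure (wide-character version).
    class DISPLAY_DEVICE(ctypes.Structure):
        _fields_ = [
            ("cb", wintypes.DWORD),
            ("DeviceName", wintypes.WCHAR * 32),
            ("DeviceString", wintypes.WCHAR * 128),
            ("StateFlags", wintypes.DWORD),
            ("DeviceID", wintypes.WCHAR * 128),
            ("DeviceKey", wintypes.WCHAR * 128),
        ]

    DISPLAY_DEVICE_ACTIVE = 0x00000001

    dev = DISPLAY_DEVICE()
    dev.cb = ctypes.sizeof(DISPLAY_DEVICE)
    i = 0
    # Enumerate adapters until EnumDisplayDevicesW returns FALSE.
    while ctypes.windll.user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
        active = bool(dev.StateFlags & DISPLAY_DEVICE_ACTIVE)
        print(f"{dev.DeviceName}: {dev.DeviceString} (active={active})")
        i += 1

If the HD4200 never shows up in that list, it's the BIOS auto-disable at work rather than a driver problem.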

I do graphic work, and it helps having a second monitor off to the side. That, and chatting on Steam or Facebook and browsing the web while playing a game.

I disconnected the GTX660 and my second monitor fired up on a reboot like it was the primary display. When I reinstalled the GTX660, the second monitor no longer worked, and the AMD Catalyst software only shows the one monitor as well.

He can't; look at this Zotac, for example. The second DVI port is missing the analog pins. There is DVI-D, and then there is DVI-I (link). What the adapter does is carry the analog signals to VGA-port-compatible pins.

I'm confused. I had two monitors set up previously, without any adapters, using the exact same configuration. Other than raw power increases with the cards, the technology hasn't changed any as far as connections go.

Last night I removed the GTX660 and the onboard fired up my second monitor. When I installed my older HD5770, both monitors fired up. When I reinstall the GTX660, just the primary fires up.

I'll probably just have to sacrifice the really cool PhysX and grab another AMD card. If it works, it works.

I would seriously consider replacing the screen instead of investing in a converter box. When you can get a new 1080p 20" screen with a DVI input for under $90, spending significant money on what I'm assuming is an older CCFL monitor with a dimming, yellowing image and probably a slower 8ms response seems like a bad long-term investment.

I guess if the old screen is still very good, it makes sense, but VGA-only screens have always been at the low end of the spectrum.

What has changed is that many modern video cards only have one DAC on them, so they can only drive one analog display. If you want to use more than one monitor, only one can be analog; the rest all need to be digital.

As was the case with my HD5770: one DVI, one HDMI, and one DisplayPort. Not one problem running two monitors since 2008 using the DVI out on the HD5770 and the DVI out on my motherboard. It still works when I uninstall the GTX660 and reseat the HD5770. It's got to be an AMD vs. Nvidia driver conflict or something; the second monitor literally isn't recognized whatsoever. Like it's not plugged in.

At this point, I think it's a decision of which to sacrifice: PhysX or two monitors. I'm going to screw with BIOS settings for the next couple of days and see if I can get the second monitor to even show up in Device Manager.

You can throw away the Nvidia card, or you can listen to what we are telling you. The onboard graphics are disabling themselves because they are AMD and the discrete card is Nvidia. It might be possible to force them back on, maybe not; that's down to your particular motherboard, not my issue.

Nvidia cards now only have a single analog output. This is normal, as designed. Current AMD cards still support more than one analog output, but that will be changing before long. A few years after that, analog outputs will be removed entirely, since they're not DRM-compliant by design. The clock is running out for analog.

There are two issues here. First, the onboard graphics auto-disable when a non-AMD GPU is present. Bummer. Maybe a firmware update for your motherboard will help, maybe not; impossible to say from where I'm sitting.

The other issue is that you want more than one analog output, and those are being phased out aggressively. You can buy an AMD graphics card as a temporary solution. That will re-enable the onboard graphics and will also give you two or more analog outputs from the discrete GPU.

Alternatively, you can just buy a new display and kill twenty or thirty birds with one stone. You say you're a graphic artist professionally, but you're using two VGA-only monitors that are 5+ years old? Either you're not nearly as professional as you claim, or you don't know what you're missing.

Not all DVI ports are created equal. Historically, most DVI ports were DVI-I ports, which support both digital and analog (with an inexpensive adapter required for the analog). DVI-I ports are being phased out in favor of DVI-D ports, which support digital monitors ONLY. You CANNOT drive an analog monitor with a DVI-D port.
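
If you want to check what a given monitor actually declares, here's a minimal Python sketch (my own illustration, and the filename is hypothetical): byte 20 of a monitor's standard 128-byte EDID block says whether the display reports a digital or an analog interface, which is the same digital-vs-analog split as DVI-D vs. DVI-I:

    # Minimal sketch: reads a raw 128-byte EDID dump saved to disk.
    # In the base EDID block, bit 7 of byte 20 is set for a digital
    # input and clear for an analog one.
    def edid_input_type(edid: bytes) -> str:
        header = b"\x00\xff\xff\xff\xff\xff\xff\x00"
        if len(edid) < 128 or edid[:8] != header:
            raise ValueError("not a valid base EDID block")
        return "digital" if edid[20] & 0x80 else "analog"

    with open("monitor.edid", "rb") as f:  # hypothetical dump file
        print(edid_input_type(f.read()))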

Edit: And as Forge notes, mixing Nvidia and AMD GPUs is quite likely to cause you grief.

My two Dells are the cat's ass and have reproduced color perfectly in my home office, very similar to the two much more expensive, newer 30" Apple displays I use in my work office.

If it ain't broke, don't fix it.

Thanks for the help...minus the insult of my professionalism.

Sorry about your professionalism, but it is broke. Analog outputs are going away. You can do fairly small things to dodge the issue now, but the steps needed to preserve your current setup are going to become more involved all the time. I'm just trying to address the issue rather than put it off.

It's very likely that your 660 has a DisplayPort connection; that one can be adapted to VGA.

If he has a reference card, the DisplayPort output is just that, DisplayPort, and not DisplayPort++ (DP++), so there is no analog signal to pass along. Base-model DisplayPort can't drive anything other than DisplayPort monitors without an active converter. DP++ has been fairly common so far, but it's a lot like the DVI-I vs. DVI-D issue: only DVI-D is guaranteed by the name "DVI", and now that we're no longer getting the deluxe version, folks are confused. Likewise, DP++ is harder to provide and will slowly disappear in favor of baseline DP, assuming DP monitors become common.

I believe what was said earlier sums up the problem you are having: the AMD onboard does not play well with the Nvidia discrete card.

Analog-only LCDs? I don't think they will be better than the good old FW900. Some would argue no LCDs can match those, either. Care to reveal the models?

They are very early Dell 30s, which were made by Samsung back then. If I didn't have $5,000 in them, I would have gotten rid of 'em a long time ago.

IIRC, Samsung used MVA/PVA panels back in those days for their large monitors. A bit inferior to today's IPS, that's for sure. I remember I bought a very early 15" Panasonic LCD for close to $1,000, and I had to let it go when it was worth nothing. Have you written them off for tax purposes yet?

Either you switch monitors (you can get two 30" professional-grade Eizos/NECs/whatever for maybe less than $5K now), or you may have to consider upgrading to an AMD card that still supports two analog outputs (although it seems you may be able to use the onboard one with any AMD card).

I moved the Dell monitors to my home from my office years ago. I use newer 30" Apple IPS monitors at the office for my graphic design and printing. The Dells are for gaming and the occasional side job when one comes up. My work is covered; the setup we've gone around and around on is for my personal amusement. I'm well aware that I could buy three new 24" or 27" monitors for next to nothing, but I don't see the need when things have worked as well as they have for this long.