Don't you need cards from the same chipset family to do CrossFire? Is there a 48xx that supports HDCP?
– Shinrai Feb 22 '11 at 19:00

@Shinrai My HD 4650 does support HDCP; it's my monitors that don't support it. As long as the setup works and there isn't a ton of lag on my system, I don't mind if I have to get an older card.
– Patrick Feb 22 '11 at 19:07

Er, 46xx, yeah. I must have misread the question, because I thought you meant it didn't. (I'm not too familiar with this particular family of Radeons.) To my knowledge Josh's answer is good, though, and I've upvoted it as such.
– Shinrai Feb 22 '11 at 19:09

Why must it be Crossfire(X)? Why not just run 2 independent cards?
– Milind R Jan 2 '14 at 13:31

@MilindR Windows doesn't support two independent video cards for video output; Crossfire lets Windows treat both cards as a single card, with Crossfire managing the pair.
– Patrick Jan 2 '14 at 15:25

2 Answers

First of all, it depends on your budget. :) Crossfire is only worth considering if you want to upgrade your performance (by roughly 1.5x to 2x). However, if you want to support 3 or more displays properly, you need at least the Radeon 5xxx series.

There are several Radeon 5xxx and 6xxx cards that not only give you significantly better performance, but also provide as many as 6 display outputs without having to go Crossfire.

That said, the single upgraded-card approach would be my recommendation because:

You need a 5xxx series card or better to support multiple displays

Newer single cards have 3 to 6 (or more) connectors

A single card gives significantly better performance AND features

A single card has lower overall power consumption

A single card will produce less noise

Update: additional note on performance:

Having had some additional experience with Crossfire and SLI since this post, I should also point out that both (a) increase your CPU usage, and (b) do not always work properly - or even at all - in many games and apps.

I found the CPU usage to be an issue particularly with Crossfire, where it increased by nearly 30% or more even on a quad-core machine.

The best answer is ALWAYS to replace an old(er) video card. Technologies such as pixel shaders and vertex shaders are constantly evolving, and newer 3D renderers will require the latest versions of them. Sure, you can double your video memory, but video memory is not the only constraint on a video card.

Now, if the card isn't lacking in features and still supports the latest shader technology, then adding another card is an option. In that case, however, you shouldn't really need another video card. Multiple cards, in my opinion, are more of a gimmick than anything. Most games can be run perfectly fine with a single up-to-date card, and if they can't, there is a LOT more at play than just the card. The processor is probably playing a very large role if a new video card cannot run a game.