HDMI vs DisplayPort: which is best?

How to choose between an HDMI or DisplayPort connection

Look on the back of a top-flight AMD Radeon such as the HD 6870 and you'll find, along with the now-familiar DVI ports, HDMI and DisplayPort connectors. That's three distinct digital connectors on one card. Why so many?

Well, DVI is the current champion but is on the way out. It's been fun, but it's reaching the end of the line. It won't vanish just yet, though, because it can carry analogue VGA signals and there are still an awful lot of analogue monitors out there. That leaves us with two new contenders for high-definition video.

The two all-digital standards are not in direct competition, either: they 'complement' each other, or at least that's the semi-official line. Still, two new standards on one card means there is competition at some level, especially when one has to be dropped for budget editions. This isn't exactly a format war where only one will be left standing, though – both formats are here to stay.

So what's up then? Both offer high-speed all-digital connection for video and audio with allowance for copy protection and 3D images. What are the strengths of each and why do the card people put both onboard to entice us?

The basic specifications and capabilities are similar, and while the two ports essentially do the same job, they do it in different ways that reflect their origins.

HDMI hails from the world of TVs, DVD players and consumer electronics, taking as its starting point the S-Video and composite signals it replaces. DisplayPort hails from the computer chaps and uses a more sophisticated and flexible packet-based data transmission method.

HDMI explained

The High Definition Multimedia Interface first appeared in 2003 and was designed as a digital replacement for the multitude of analogue formats used in consumer AV standards (RF, SCART, composite, S-Video, RGB and so forth), all in a single compact cable.

It can carry any uncompressed TV signal with up to 48-bit colour and up to eight channels of audio, as well as control connections for the rare instances where one bit of kit can control another.

A DVI signal is fully compatible, so you can use a DVI-to-HDMI converter, no problem (although not for the analogue signals on DVI-A, obviously).

The standard was put together by a consortium of big names, including Panasonic, Sony, Philips and Toshiba. The specification has now reached version 1.4.

The big gain in the later versions is the maximum clock speed, which governs bandwidth. The original specification called for a maximum of 165MHz, which is just enough to handle 1080p.

Version 1.3 upped this to 340MHz, in order to comfortably handle 1600p (technically known as Wide Quad Extended Graphics Array, or WQXGA). The maximum data rate is 10.2Gbps.
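The arithmetic behind those figures is easy to check. Here's a rough sketch: the TMDS encoding overhead (10 bits on the wire per 8-bit byte, across three data channels) is part of the HDMI spec, but the 20 per cent blanking overhead is our own ballpark assumption – real video timings vary by mode.

```python
def tmds_rate_gbps(clock_mhz, channels=3, bits_per_clock=10):
    # TMDS sends 10 bits per clock on each of 3 data channels,
    # so raw bandwidth = clock x 3 x 10
    return clock_mhz * channels * bits_per_clock / 1000

print(tmds_rate_gbps(165))  # HDMI 1.0-1.2 ceiling: 4.95 Gbps
print(tmds_rate_gbps(340))  # HDMI 1.3+ ceiling: 10.2 Gbps

def pixel_clock_mhz(width, height, refresh_hz, blanking=1.2):
    # Assumed ~20% blanking overhead on top of the visible pixels
    return width * height * refresh_hz * blanking / 1e6

print(round(pixel_clock_mhz(1920, 1080, 60)))  # ~149MHz: fits in 165MHz
print(round(pixel_clock_mhz(2560, 1600, 60)))  # ~295MHz: needs 340MHz
```

By the same reckoning, a 4096 x 2160 cinema signal at 24Hz needs roughly 255MHz, which is why it also fits within the 340MHz ceiling.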

It comes in single-link and dual-link types (the 19-pin Type A and 29-pin Type B respectively). Type B equates to dual-link DVI, although we haven't seen one in the wild yet.

HDMI 1.4 launched in 2009 and adds an Ethernet channel, an audio return channel and more control protocols, and is ready for 3D signals. It can cope with a 4096 x 2160 display, enough for a very beefy home cinema setup.

Cables come in two main types: Standard or Category 1 cables can cope with the lower capacities of version 1.0 to 1.2, while High-Speed or Category 2 cables are certified for versions 1.3 to 1.4.

There's no standard set for the maximum cable length; it's essentially down to the cable company to get it working properly. The signal attenuates over distance, so the longer the cable, the thicker the conductors and the better the build quality required.

HDMI is fully compatible with DVI: just add a converter. This only covers single-link DVI, however, which means that high-end cards will still need to carry a dual-link DVI port for the highest resolutions. A nice bit of backward compatibility, though.

HDMI includes CEC: Consumer Electronics Control. The idea is that one bit of AV kit can pass instructions to another, such as turn on, change channel and so forth. Nice when it works, which isn't as often as you might like. No big loss on a PC.

HDMI has native support for the xvYCC colour space. This offers a gamut 1.8 times bigger than the standard sRGB model, by effectively using negative numbers for primaries. It was developed because modern panels are more capable than CRTs of displaying richer colours. Support for the xvYCC colour space is found in many graphics cards, but the main target is digital camcorders. You won't find it on Blu-ray, though, and its absence from DisplayPort is no great loss.

If you've a reasonably new television then it will have an HDMI port, making connecting a PC really easy at last. This is the big bonus: games coming at you with real screen acreage (albeit not at super-high resolutions).

HDMI kit is currently a bit cheaper, especially for compatible monitors.