Early reviews of Fuji’s FinePix Real 3D W1? Not so hot. However, a year later the $450 refreshed W3 seems to be doing better, scoring a “Recommended” review over at Photography Blog. The camera’s dual 1/2.3-inch, 10 megapixel CCDs and 3X zoom lenses are carried over from before, but a new design and a more user-friendly interface are said to make a huge improvement — even if it’s still too easy to stick a finger over either of the two light portals up front. Build quality is solid and the new 3.5-inch, glasses-free 3D LCD on the back is called “impressive,” far brighter than last year’s parallax barrier display. The machine will capture 720p 3D movies and can save both 3D MPO images and 2D JPEGs simultaneously, meaning your holiday snaps are future-proofed even if you haven’t jumped on the 3D bandwagon just yet.

There’s no word yet on when the Motorola Motoroi will make it to T-Mobile (in the industry we call that “the T-Motoroiola rumor”) but there has definitely been a good deal of chatter concerning this handset. The latest has Pocket-lint confirming a UK release with Moto itself. The handset bears similarities to Taipei’s HSPA-lovin’ XT701, and users in Ol’ Blighty (and the rest of the UK) can look forward to an 8-megapixel camera (with a Xenon flash), support for 720p video, mini HDMI, and an unspecified processor boost. In addition, Android 2.1 is likely to be part of the deal. We have neither a timeline nor a price, and the veracity of this rumor is yet to be established (although it does seem like a no-brainer), but if this all goes down as Pocket-lint says it will, you can color us Yanks mighty jealous.

Props to Lenkeng for dressing up their otherwise anonymous VGA-to-HDMI converter box with a PSP-related angle — the LKV8000 comes with the necessary cables to take your PSP-2000 or above’s 480p video output and push out a 720p HDMI signal complete with stereo audio. Not a bad idea — except that we can’t think of an HDTV that lacks both component and VGA jacks, or that doesn’t have a built-in scaler to do the same job. Maybe you’re just out of ports? In any event, this guy needs a Stateside distributor before we can tell you pricing or availability, so you’re stuck swapping cables for a while, Sparky.

Wireless docking stations have been around for years now, but the main issue has been bandwidth. Sure, it’s easy to send a wireless mouse signal through the air, but try shoving 720p video, four USB signals and a little bit of arrogance through those highly-spaced particles. Toshiba’s new dynadock wireless U USB docking station does a commendable job of doing the best it can with what it has, tapping into wireless USB technology in order to nix the need for your laptop to actually be seated in your docking station. The device can be set up to auto-connect when your machine is in range, and a one-touch undock button carefully shuts down all of your peripherals as you exit. There are six USB 2.0 sockets (including two of the Sleep-and-Charge variety), integrated 7.1 audio and support for a VGA / DVI monitor at resolutions as high as 1,680 x 1,050. Nah, that’s not quite 1080p, but we’ll take it for now. The $299.99 asking price, however, is a bit harder to swallow.

I warned you that it was back on. Monster’s priciest—a $250 35-foot HDMI cable—goes toe-to-toe with Monoprice’s longest and thickest—a $35 35-footer and a $53 50-footer. Which will win? Or more importantly, which will fail? Let’s have a look, shall we?

As I mentioned this morning, I skipped the testing on the shorter cables because, using Monster’s own gear, we showed that they could carry today’s 1080p signal without trouble. (One, from XtremeHD, had trouble with some extreme video simulations, but it passed all of the real-world simulations, so you can keep using it… for now.)

But as you know, both the 35-footer and 50-footer from Monoprice failed the 1080p test in the lab. I used the very same cables from the lab for the real world test below, and guess what? The 35-footer did just fine, as did the 35-ft cable from Monster. But Monoprice’s 50-footer gave me some unmistakable trouble signs, as you will see below.

The TV in all of these shots is a Samsung LN-T5265F 52-inch 1080p LCD. It’s nice and big, the better to spot any aberrant cable behavior. I recognize that you might think the TV’s error correction is interfering with the test, to which I reply:
• I ran preliminary tests with a Sony Bravia KDF-37H1000 rear-projection set, but since it has 1080p inputs but only a 720p display, I couldn’t use it for the finals.
• We are only testing the 1080p TV signal. Given that tightened criterion, wouldn’t all new “full HD” sets have at least some competent error correction?
• If error correction is truly the name of the game, then it especially doesn’t matter which cable you buy.

There was actually quite a lot of noise—a bouncing picture that happened so frequently I was able to capture the effect with a still camera. I was able to reproduce the noise with some consistency, too. Here’s the noise detail for you to scrutinize:

While it may seem conclusive that the 50-foot Monoprice is not a good choice, I was fortunate enough to have another 50-footer from the company, one that was not part of the original lab test. When I used it, I was not able to reproduce the noise. Furthermore, I double-checked the noisy cable on the Sony Bravia KDF-37H1000 with 1080p input (but 720p display) and again could not duplicate it.

The missing piece is Monster’s “No Frills” $300 50-footer. I know some of you wish I had tested it, that it had been part of this from the beginning. I don’t have a time machine to fix that, but I will say that, given how the Monster 35-footer (10M) did in the lab, chances are you’re not going to see noise on the Monster 50-footer.

For the love of God, what does it all mean???
I have to say I for one have learned a few things with all of this testing, and I hope you have too. The way I see it:

• It never pays to buy a Monster cable first. It doesn’t even make sense to grab the “marked down” $50 cable offered to shoppers who don’t want Monster. Go online, order your cables, and wait.

• Even if you’re going for the long haul, try a cheaper cable from a reliable vendor first. Monoprice isn’t the only one. During this process I’ve spoken with good people at FireFold, DataPro International, and others, and tested an assortment of discount products, with no noticeable problems. I am confident that, if a vendor has a solid return policy and satisfaction guarantee, you should feel free to buy even a super-long cable from a discount house. In the case of my 50-footer noise, a quick return would have been all that was required.

• Monster has a point about future-proofing. I have no doubt, given our testing, that Monster cables can outperform other cables in video formats that are not yet in use. What does this mean for a consumer? Does it make sense to spend $300 now on a 50-foot cable, assuming you will spend thousands to upgrade all of your video equipment around it in the next few years? Logic dictates that the answer is no.

• The only people who should buy Monster cable are people who light cigars with Benjamins. Fortunately for Monster, there are plenty of those people. They’re not even suckers, they are just rich as hell, and want the best. This testing did not prove that Monster is not the best. It just proved that the best is, for the most part, unnecessary.

This time, we brought along a bag full of awesomely priced cables, mostly from Monoprice, that we were ready to run bandwidth tests on, side-by-side with Monster’s finest (and most damned expensive) cables.

What were our findings?

1) At short distances up to 6 ft (2 meters), you can pretty much get away with any cable. Monoprice cables kicked ass at the 6-foot length that most everyone uses.

Not all cables are the same, however, and in truth, it’s the medium-priced cables that may be the real rip-off.

2) At longer distances, cheaper cable tends to choke up. A 720p signal will make it, but even today’s standard 1080p signal can fry out inside of a long cable that isn’t built as well. If you are trying to hook up a 1080p projector on your ceiling to a Blu-ray or HD DVD player, this is a concern.

The tests, which fired digital signal through the cable to synthesize high-definition video, can be divided into REAL-WORLD requirements (720p and 8-bit 60Hz 1080p) and FUTURE-WORLD requirements (12-bit 60Hz 1080p and even 12-bit 120Hz 1080p). Mind you, the future formats don’t exist now, so they should only be a concern when you are buying cables you intend to keep for five years, such as those you want to build into a wall.
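For a rough sense of scale, here’s a hedged sketch of the arithmetic behind those tiers. It assumes the standard CEA pixel clocks (74.25 MHz for 720p60, 148.5 MHz for 8-bit 1080p60) and TMDS’s 10-bits-on-the-wire-per-pixel-per-lane encoding; the 12-bit and 120 Hz figures are my own extrapolations, since those formats didn’t exist at the time:

```python
# Sketch, not the actual test procedure: total TMDS bit rate for each
# tier named above, plus the single-lane share (HDMI video rides on
# three data lanes, so a one-lane test signal carries one third of the
# format's total bit rate). Deep color and doubled refresh rates scale
# the effective pixel clock.

TMDS_LANES = 3
BITS_PER_SYMBOL = 10  # 8 payload bits are encoded as 10 on the wire

def total_gbps(pixel_clock_mhz, bits_per_component=8, refresh_scale=1):
    clock = pixel_clock_mhz * (bits_per_component / 8) * refresh_scale
    return clock * TMDS_LANES * BITS_PER_SYMBOL / 1000

tiers = [
    ("720p60",           74.25,  8, 1),
    ("1080p60 8-bit",   148.5,   8, 1),
    ("1080p60 12-bit",  148.5,  12, 1),
    ("1080p120 12-bit", 148.5,  12, 2),  # hypothetical future format
]
for name, clk, bits, scale in tiers:
    t = total_gbps(clk, bits, scale)
    print(f"{name}: {t:.2f} Gbps total, {t / TMDS_LANES:.2f} Gbps per lane")
```

The 8-bit 1080p60 line works out to about 4.46 Gbps total, which matches the figure quoted elsewhere in this series.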

To simulate high-def video, the test rig sends signal down one of three paths within an HDMI cable, so its signal at any given time is ONE-THIRD the bandwidth of that video format. The list of bandwidth tests we ran is as follows:

When the signal was sent out over the cable, its performance was measured on a Tektronix DSA8200 Digital Serial Analyzer. The argument goes like this: it may all be 1’s and 0’s, but what is being sent over that cable is electric current. When too much data is sent over a shabby cable, the device on the other end can’t tell what is a 1 and what is a 0. The end result is video that is either jittery, full of digital snow, or flat-out not there.

The Tektronix display shows two arcs, a high ridge that stands for the 1’s and a low ridge that stands for the 0’s. As bandwidth increases, you will see that the arcs get fuzzier, and at the failure point, there are too many 1’s that look like 0’s, and vice versa.

Bear in mind, in some cases, if the cable failed at one level, we didn’t go on to the next. Likewise, if we knew it passed the higher test, we might not go on to a lower test.

Even the Monster 10-meter couldn’t pass the Future World 1080p test. The Monster folks said they didn’t have a 50-footer in the building that they could test with, but I suspect it would have done a little bit better than the Monoprice, possibly even carrying today’s 1080p. But we did not test that.

Judging from these results, I would have to reiterate my original position, that it’s best to skimp at short distances, but you don’t want to be caught with the wrong cable installed in your walls. Even with the projector, it might be smart to buy a $30 cable first and see if it works, but be prepared, when upgrading your gear, to upgrade the cable too. Does it have to be Monster? Hell no, but you might have to pay something close to a Monster-sized price.

The truth is, the bigger rip-off appears to be the $20 XtremeHD cable. It didn’t perform as well as stuff one-fifth the price. (No wonder they don’t sell a 10-meter cable.) I would say beware of mid-priced cable of dubious origin. Our dealings with Monoprice lead us to believe that at least they know what they’re selling, even at such a tremendous discount.

Stay tuned for HDMI Cable Battlemodo: The Truth About Monster, Part 3, where we try to match the laboratory results with basic, in-home testing. If the Digital Serial Analyzer said a cable fails, but it works just fine in my basement, maybe I’ll have to call BS. – Wilson Rothman

Hey guys: I just got back from meeting with Noel Lee from Monster Cable, along with a posse of affiliated ladies and gentlemen, and their heavy equipment. I was there to talk to them about the fact that they sell—and have convinced a lot of retailers to sell—very expensive cable ($120 for 2 meters, last I checked). At the same time, there are cheaper non-Monster cables available on the Internet. My simple question, “Why?”, resulted in an organized, technical two-hour response. I won’t give you the blow-by-blow, but I have information that might make this debate interesting, and a bit more three-dimensional.

Let’s start with my allegation about Monster, which isn’t mine alone, because Lee helpfully pointed out the gist of it in the opening of his presentation:

I say, since everything is digital, and since HDMI is a spec, the cheap cable will get the data from point A to point B as well as any other cable. Additionally I say that if there are subtle (i.e. videophile-grade) differences in cables, the average consumer isn’t going to spot them on the TV.

Am I wrong? Monster says yes, but in Lee’s elaborate answer I felt both his POV and mine were justified.

Here are Monster’s truths:

Bandwidth is King.

The requirements of 1080p and beyond are what separate the high-end cables from the knock-offs. This is the same as Ethernet cable, in the sense that a cable certified for HDMI 1.3a “Highspeed” will guarantee greater throughput. The newest spec, 1.3a, means just over 10 Gbps of bandwidth. Standard 480p requires less than 1 Gbps, the current 8-bit 1080p requires 4.46 Gbps, but next-gen 1080p formats will require nearly 15 Gbps, more than the highest certified HDMI cable can support. (See chart if you can; if not, I’ll try to get a better one up later.)
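Those figures can be sanity-checked with a little arithmetic. A hedged sketch, assuming standard pixel clocks and TMDS’s 10-wire-bits-per-8-payload-bits encoding; the “next gen” row is my own extrapolation to 12-bit 120 Hz 1080p, which lands in the same ballpark as the quoted ~15 Gbps:

```python
# Sketch of the bandwidth math behind the figures above, checked against
# the 10.2 Gbps ceiling of an HDMI 1.3 "Highspeed" (Category 2) cable.

HIGHSPEED_LIMIT_GBPS = 10.2

def tmds_gbps(pixel_clock_mhz, bits_per_component=8, refresh_scale=1):
    # 3 lanes x 10 wire bits per lane per clock; deep color and higher
    # refresh rates scale the effective pixel clock.
    clock = pixel_clock_mhz * (bits_per_component / 8) * refresh_scale
    return clock * 30 / 1000

for name, clk, bits, scale in [
    ("480p60 (27 MHz clock)",   27.0,  8, 1),
    ("1080p60 8-bit",          148.5,  8, 1),
    ("1080p120 12-bit (ext.)", 148.5, 12, 2),
]:
    g = tmds_gbps(clk, bits, scale)
    verdict = "within" if g <= HIGHSPEED_LIMIT_GBPS else "exceeds"
    print(f"{name}: {g:.2f} Gbps ({verdict} Highspeed cert)")
```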

Not all cables are the same.

During Lee’s slideshow, he demonstrated via X-ray slides that pricier cables (OK, Monster’s) have a smaller chance of wear-and-tear damage at the point where the cable meets the connector. It’s a concept that’s easy for any musician to understand—remember all of those shorting-out patch cords?

Even if it has an HDMI-style connector, it may not be certified HDMI.

You have to look for the HDMI logo, says Steve Venuti of HDMI Licensing. There are tons of knock-offs, especially the bundled or online cables, since you can’t look at the packaging when you buy. Really high-end cables will certify other things, such as HDMI 1.3a and even “Highspeed.”

Just because digital information is made up of ones and zeros doesn’t mean it can’t degrade, especially over distance.

I get this now, because it’s not about the digital info just getting there, like packet data. It’s video, so it’s about the digital info getting there at the right time to make sense. It’s also audio, and over distances, there’s a greater chance that audio and video will get out of sync. The following pictures show a test that they run that measures data throughput. In the interest of brevity, I’ll just say that the more those lines crowd the center, the greater the risk of having crappy video.

Differences in cable are easily spotted by untrained eyes.

A PS3 feeding a 1080p signal to a Samsung 1080p LCD TV starts to jitter and throw digital noise lines across the screen if the cable can’t hack the bandwidth. We tested the two cables above on a PS3 showing a Blu-ray of Chicken Little, and it was totally noticeable: there were lines and jitters, none of this videophile matter-of-opinion stuff that I had anticipated. It was totally obvious, and something that Monster says people often blame on their TV, not their cable.

Future proofing and heavy-duty cable are crucial for in-wall installation.

This probably made the most sense of all. Given the fact that in-wall cable is longer than others, you’d need something that can handle the bandwidth. (In fact, when it gets to 50 feet, you don’t have many choices in the cable world for that reason—Monster says it’s soon headed for 100 feet of HDMI.) Couple that with staples, kinks and other weirdness that might happen with in-wall installation, and the fact that when you upgrade your TV, you don’t want to have to re-do your drywall, and Monster has a good point.

Lest you think I be drinkin’ Lee’s Kool-Aid, here are my caveats to Monster’s truths:

• If you are going from any source to a 720p or 1080i TV set, you should really be in the clear using a full-on crappy ass cable.

• As long as you’re not installing the wiring in your wall, start with the crappy cable. If it sucks and you only paid $20 for it, go back and spend more on something certified.

• In the demo, Monster even proved that good components can offset crappy cables: that PS3 and that Samsung 1080p were able to work around many of the problems, all the more reason why, in a non-custom, non-in-wall installation, you should try out the lower-grade stuff first.

So listen, you’ve heard it from me: there are differences in cable, but there are also differences in technical requirements. We don’t all need $120 cables for our components. As to the question of why Monster won’t offer a lower-priced product in recognition of these differences in technical requirements, Lee told me to “stay tuned.”

While high definition has become a reality for many consumers, the technical jargon associated with this exciting new technology is causing much confusion. Just as we were beginning to understand the differences between Blu-ray and HD DVD, along comes a new high-definition format, 1080p.

But why do we need another high-definition format anyway? Many of us have bought our HD Ready screens and were ready to sit back and enjoy this new viewing experience, but now we are all wondering if we bought the right kit in the first place.

Many of the more recent HD Ready flat screens feature a resolution of 1,366×768 pixels. This will display the commonly used 720p and 1080i formats, although 1080i/1080p signals will be downscaled to fit. To display 1080i/1080p signals in their entirety, you’ll need a screen with a resolution of 1,920×1,080 pixels, coined ‘Full HD’ by the marketing men.

However, just because a screen has 1,920×1,080-pixels it does not necessarily mean that it will accept 1080p input – so check before you buy.

Remember, 720p, 1080i and 1080p are formats in which ‘sources’ of high-definition content are presented for viewing on a particular output device such as your LCD/plasma screen. The source could originate from your TV cable provider, for example, or your Xbox 360. To restate the point, 1080i/1080p needs a screen resolution of 1,920×1,080 pixels to display in its entirety, but you don’t have to have a screen with this resolution to display a 1080i/1080p signal – lower-resolution screens downscale the signal to fit.
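To make the downscaling concrete, here’s a toy calculation (illustrative only, not how any particular TV’s scaler actually works):

```python
# Illustrative only: the scale factors involved in fitting a 1,920x1,080
# signal onto a 1,366x768 panel. Real scalers use filtered resampling,
# but the ratios show how much detail has to be folded away.

def scale_factors(src_w, src_h, dst_w, dst_h):
    return dst_w / src_w, dst_h / src_h

fx, fy = scale_factors(1920, 1080, 1366, 768)
print(f"horizontal scale: {fx:.3f}, vertical scale: {fy:.3f}")
print(f"fraction of source pixels representable: {fx * fy:.2f}")
```

In other words, a 1,366×768 panel can represent only about half the pixels in a Full HD signal, which is why ‘Full HD’ 1,920×1,080 screens matter for 1080i/1080p sources.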

720p and 1080i were initially set out as the two key standards for high-definition content, with Sky HD, HD DVD and the Xbox 360 supporting these formats. Any TV that supports 720p and 1080i is classed as HD Ready. Let’s take a step back for a moment and take a quick look at the development of TV technology to see how we arrived at these standards.

In a CRT display (the TV you grew up with), a stream of electrons is generated by a gun, and is scanned across the face of the tube in scan lines, left to right and top to bottom. The face is coated in phosphors, which glow when hit by the electron stream. A method of scanning was required that would reduce the transmitted TV picture’s bandwidth and work in accordance with the electricity supply frequency (50Hz in the UK and Europe and 60Hz in the US). The result was interlaced scanning.

A method of reducing bandwidth was required because early sets were not able to draw the whole picture on screen before the top of the picture began to fade, resulting in a picture of uneven brightness and intensity. To overcome this, the screen was split in half with only half the lines (each alternate line) being refreshed each cycle. Hence, the signal is interlaced to deliver a full screen refresh every second cycle. So if the interlace signal refreshes half the lines on a screen 50 times per second this results in a full screen (or frame) refresh rate of 25 times per second. The problem with interlacing is the distortion when an image moves quickly between the odd and even lines as only one set of lines is ever being refreshed.
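The arithmetic above can be sketched in a couple of lines (the mains frequency sets the field rate; two interleaved fields make one frame):

```python
# Interlacing: each cycle refreshes half the lines (one field), so the
# full-frame rate is half the field rate.

def frames_per_second(field_rate_hz):
    return field_rate_hz / 2  # odd-line field + even-line field = 1 frame

print(frames_per_second(50))  # 50 Hz regions (UK/Europe)
print(frames_per_second(60))  # 60 Hz regions (US)
```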

As TV screen technologies have progressed another system called Progressive Scan has also been developed. With progressive scanning the frames are not split into two fields of odd and even lines. Instead, all of the image scan lines are drawn in one go from top to bottom. This method is sometimes referred to as ‘sequential scanning’ or ‘non-interlaced’. The fact that frames are shown as a whole makes it similar in principle to the way film is shown at the cinema.

At this point it is worth considering what we mean by resolution in relation to TVs:

Resolution: HD-Ready TVs need to be able to display pictures at the resolution set by the new standard. Resolution can be described either in terms of “lines of resolution,” or pixels. The resolution you see on your TV depends on two factors, namely the resolution of your display and the resolution of the video signal you receive. Because video images are always rectangular in shape, there is both horizontal resolution and vertical resolution to consider.

Vertical resolution: This is the number of horizontal lines that can be resolved in an image from top to bottom. The old familiar CRT TV displays 576 lines, while Digital HD television operates at a resolution of either 720 or 1080 lines. This is the most important resolution as it is most noticeable to the human eye.

Horizontal resolution: This is the number of vertical lines that can be resolved from one side of an image to the other. Horizontal resolution varies depending on the source. The number of horizontal pixels is not quite so critical as vertical resolution as it is not as obvious to the human eye during normal viewing.

An analogue TV signal in Europe, where the PAL standard is used, has 625 horizontal lines of which 576 lines are displayed and the image (or frame) is refreshed 25 times a second. This is the standard we have been used to for years.

A high-definition digital TV signal delivers significantly more picture detail and audio quality than a standard signal, producing pictures that are noticeably better, sharper and clearer:

720p: 1,280×720 pixel resolution. High-definition picture that is displayed progressively. Every line of each frame is drawn in a single pass, so motion appears smoother than with an interlaced picture.

1080i: 1,920×1,080 pixel resolution. High-definition picture that is displayed interlaced. Each odd line of the picture is displayed, followed by each even line, and the resulting image is not as smooth as a progressive feed. 1080i offers more detail than 720p, making it suited to documentaries and wildlife footage, but it is less suitable for action-oriented material such as sports and movies.

1080p: 1,920×1,080 pixel resolution. High-definition picture that is displayed progressively. Every line of each frame is drawn in a single pass, so motion appears smoother than with an interlaced picture. This is the ultimate high-definition standard — the most detailed picture, displayed progressively.

There are two main formats for HDTV, namely 720p (i.e. a 720 line picture progressively scanned 50 times a second) and 1080i (1080 lines interlaced at 50 cycles per second). The picture resolution of a high definition digital TV is about 4 times greater than a typical 576 line TV picture.

Not having a screen that can display 1080p may not be important to you. However, there are exceptions, and if you are a serious game player you will probably already know one of them, or to be precise two of them. The Xbox 360 (with a little tweak) and the PlayStation 3 produce output at 1080p. Also, the new high-definition DVD format, Blu-ray, has been designed for 1080p output. Is the difference worth the extra investment? Maybe; something you will have to judge for yourselves …

HDMI devices are manufactured to adhere to various versions of the specification, in which each version is given a number, such as 1.0, 1.2, or 1.3a. Each subsequent version of the specification uses the same kind of cable but increases the bandwidth and/or capabilities of what can be transmitted over the cable. A product listed as having an HDMI version does not necessarily mean that it will have all of the features that are listed for that version, since some HDMI features are optional, such as Deep Color and xvYCC (which is branded by Sony as “x.v.Color”).

Version 1.0 to 1.2

HDMI 1.0 was released December 9, 2002 and is a single-cable digital audio/video connector interface with a maximum TMDS bandwidth of 4.9 Gbit/s. It supports up to 3.96 Gbit/s of video bandwidth (1080p/60 Hz or UXGA) and 8-channel LPCM/192 kHz/24-bit audio.

HDMI 1.1 was released on May 20, 2004 and added support for DVD-Audio.

HDMI 1.2 was released August 8, 2005 and added support for One Bit Audio, used on Super Audio CDs, at up to 8 channels. It also added the availability of HDMI Type A connectors for PC sources, the ability for PC sources to support only the sRGB color space while retaining the option to support the YCbCr color space, and required HDMI 1.2 and later displays to support low-voltage sources.

HDMI 1.2a was released on December 14, 2005 and fully specifies Consumer Electronics Control (CEC) features, command sets, and CEC compliance tests.
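The 4.9 and 3.96 Gbit/s figures quoted above both fall out of HDMI 1.0’s 165 MHz maximum pixel clock. A quick sketch, assuming the standard 3-lane TMDS link with 10 wire bits carrying 8 payload bits:

```python
# HDMI 1.0 headline numbers derived from its 165 MHz max pixel clock:
# raw TMDS moves 10 bits per lane per clock over 3 lanes; only 8 of
# those 10 bits are video payload.

PIXEL_CLOCK_MHZ = 165
LANES = 3

tmds_gbps = PIXEL_CLOCK_MHZ * LANES * 10 / 1000   # raw: ~4.95, quoted as 4.9
video_gbps = PIXEL_CLOCK_MHZ * LANES * 8 / 1000   # payload: 3.96, as quoted

print(f"raw TMDS bandwidth:  {tmds_gbps:.2f} Gbit/s")
print(f"video payload limit: {video_gbps:.2f} Gbit/s")
```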

Version 1.3

HDMI 1.3 was released June 22, 2006 and increased the single-link bandwidth to 340 MHz (10.2 Gbit/s). It optionally supports Deep Color, with 30-bit, 36-bit, and 48-bit xvYCC, sRGB, or YCbCr, compared to 24-bit sRGB or YCbCr in previous HDMI versions. It also optionally supports output of Dolby TrueHD and DTS-HD Master Audio streams for external decoding by AV receivers, and incorporates automatic audio syncing (audio/video sync) capability. It defined cable Categories 1 and 2, with Category 1 cable being tested up to 74.25 MHz and Category 2 being tested up to 340 MHz. It also added the new Type C mini-connector for portable devices.

HDMI 1.3a was released on November 10, 2006 and made cable and sink modifications for Type C, added source termination recommendations, and removed undershoot and maximum rise/fall time limits. It also changed CEC capacitance limits, clarified the sRGB video quantization range, and brought back CEC commands for timer control in an altered form, with audio control commands added.

HDMI 1.3b was released on March 26, 2007 and added HDMI compliance testing revisions. It has no effect on HDMI features, functions, or performance, since the testing is for products based on the HDMI 1.3a specification.

HDMI 1.3b1 was released on November 9, 2007 and added further compliance testing revisions, including testing requirements for the HDMI Type C mini-connector. Like 1.3b, it has no effect on HDMI features, functions, or performance.

HDMI 1.3c was released on August 25, 2008 and added compliance testing revisions that changed testing requirements for active HDMI cables. It likewise has no effect on HDMI features, functions, or performance.

Version 1.4

HDMI 1.4 was released on May 28, 2009, and Silicon Image expects its first HDMI 1.4 products to sample in the second half of 2009. HDMI 1.4 increases the maximum resolution to 4K × 2K (3840×2160p at 24 Hz/25 Hz/30 Hz and 4096×2160p at 24 Hz, a resolution used in digital theaters); adds an HDMI Ethernet Channel, which allows for a 100 Mb/s Ethernet connection between the two HDMI-connected devices; and introduces an Audio Return Channel, 3D over HDMI, a new Micro HDMI connector, expanded support for color spaces, and an Automotive Connection System.

Version Comparison

Note that a given product may choose to implement a subset of the given HDMI version. Certain features such as Deep Color and xvYCC support are optional.

A: 36-bit support is mandatory for Deep Color-compatible CE devices, with 48-bit support being optional.

B: Maximum resolution is based on CVT-RB, a VESA standard for non-CRT-based displays. Using CVT-RB, 1920×1200 would have a video bandwidth of 3.69 Gbit/s, and 2560×1600 would have a video bandwidth of 8.12 Gbit/s.

C: Using CVT-RB would have a video bandwidth of 8.12 Gbit/s.

D: Using CVT-RB would have a video bandwidth of 7.91 Gbit/s.

E: Using CVT-RB would have a video bandwidth of 7.39 Gbit/s.

F: Even for a compressed audio codec that a given HDMI version cannot transport, the source device may be able to decode the codec and transmit the audio as uncompressed LPCM.

G: CEC has been in the HDMI specification since version 1.0, but only began to be used in CE products with HDMI version 1.3a.

H: Playback of SACD may be possible on older HDMI versions if the source device (such as the Oppo 970) converts to LPCM.

I: A large number of additions and clarifications for CEC commands. One addition is a CEC command allowing for volume control of an AV receiver.

Capable of sending HDMI v1.3 digital audio/video signals through multimode optical fiber, the AT-HDF20SR and AT-HDF30SR let any HD source drive a display up to 1,320 ft away at WUXGA or HDTV resolutions. They offer self-adjustment with no compression, bit reduction, or signal degradation, and can run multiple signals in a single conduit without crosstalk. The CEC-compliant units support 12-bit color depth and have an 8 dB input equalizer, which compensates for losses over runs longer than 16 ft.

Atlona Technologies is releasing a new line of video extenders that transfer high-definition video over multimode fiber optic cable up to 1,000 ft without any signal loss. These new extenders not only let companies send video signal far beyond the lengths of conventional cables, but also provide a measure of security thanks to their use of fiber optic cable. Since fiber optic cable is immune to interference caused by electromagnetic fields (EMF/EMI), these extenders are perfect for medical imaging applications where a perfect picture is not only ideal, but necessary.

Along with being HDCP/DDC compliant, these UL/CE-approved Atlona units are equipped with advanced digital fiber optic technology allowing for self-adjustment with no compression, bit reduction, or signal degradation. Multiple signals can be run in a single conduit without crosstalk. These CEC-compliant units support 12-bit color depth and have an 8 dB input equalizer, which compensates for losses over 5 meters (16 ft).

The AT-HDF20SR is an adaptor-styled balun with a small form factor that lends itself perfectly to HDMI matrix switchers where space is limited. The larger of the two baluns is sold as two separate units, the AT-HDF30R and the AT-HDF30R, to let users convert among multiple HD video types depending on the receiver unit. It is compatible with Atlona’s RGB or DVI receiver units (models AT-DVIF30R and AT-RGBF30R), making it versatile enough to handle even the most complicated digital signage application.