26 Comments

As I sat there reworking the cables behind my TV stand last night, I kept wondering why we don't attempt to consolidate cables more often. The HDMI spec supports an Ethernet channel, yet I'm still running CAT5e to all of my devices! Even if it's just removing a single type of cable, it would bring me one step closer to cleaning up the clutter! Imagine if I could run a single cable from my AVR to my XBOX that provided power and Ethernet while also serving the video/audio from the XBOX to the AVR! =)

I think you have it a bit backwards; you would be better off running everything over Cat5e/6 instead (as the connectors, cables, etc. are much cheaper and more ubiquitous). There isn't much that can't be run over "ethernet" cable, including USB, VGA, HDMI, DVI, power, SATA, video cameras, alarms, and phones; heck, there's even PCI-e over Cat6.

While Cat5e and Cat6 cables sort of support 10GigE, even the official standard limits their range to well below the normal levels, even in perfect conditions. Interference is a big enough problem that Cat5e cables often have trouble with regular gigabit ethernet, let alone 10GigE. It's unlikely that, shoving ethernet cables into the rat's nest behind a TV, you're going to have success with anything less than Cat6a, which isn't cheap. Of course, there's also the issue of 10GigE NICs being way too expensive for consumer use...

You're also a bit off on the specs. 8.16 Gbps is the throughput after removing overhead. With overhead, they're at 10.2 Gbps, and HDMI 2.0 is expected to bump that up to 18 Gbps.

HDMI's ethernet support is also limited to 100 Mbps, which limits the usefulness somewhat.

I have my laptop down to three cables when I plug it in: USB hub, display, and power. This almost perfectly brings everything over one cable. The big downside is USB display lacking any graphics acceleration, but most users wouldn't mind that. Thunderbolt is the technically superior solution to get high bandwidth and graphics acceleration, but USB will hopefully be the cheap alternative.

The big advantage of Thunderbolt is that it actually passes video straight over the cable, whereas with USB you need a separate display adapter (here they're using DisplayLink). In the video, it seems the left monitor might have a DisplayLink controller built in, and the right monitor has a controller taped to the back.

DisplayLink has advanced quite a bit, but it's still not a native display connection. I'm actually driving a 27" 1440p LCD via DisplayPort from an HP USB 3.0 port replicator that has a DisplayLink chip inside (since HDMI 1.3 and MicroHDMI can't do more than 1080p). Performance is not bad with a high-end quad-core i7 laptop, but power users can tell that something's up-- and when connected to my i7 ultrabook (not exactly a $200 netbook), things start getting messy (feels like using an RDP connection to some PC).

It's a workable hack today (given so many PCs still lack DisplayPorts and are thus unable to drive >1080p displays), but not really comparable to a real video connection.

It blows my mind that USB hasn't been standardised as a native display interface yet. It would be great for running headless low-end servers, rescuing laptops with broken screens, etc.; just have a small USB-connected monitor on hand. Better yet, imagine if you could boot into a laptop's BIOS, enable an "Act as Monitor" mode, and use it as a monitor via a USB device port on the back. Lovely, and mainly a software effort.

Even though USB3 has enough bandwidth to drive a monitor well (USB2 wasn't even close), as a low-cost solution it still struggles due to a lack of QoS guarantees, because it has to wait for the CPU to become available to do anything.

Well perhaps USB hasn't been standardized as a native display interface because it isn't a native display interface, it's a general purpose serial bus. Also, due to protocol overhead, USB 3.0 SuperSpeed mode provides less usable bandwidth than single-link DVI or HDMI prior to version 1.3.

What baffles me is the statement that "the newer revision enables 10 Gbps by using more efficient coding." This makes absolutely no sense, seeing as advertised USB speeds have always been the physical-layer gross bitrate. In that case, using a less efficient encoding would actually make it easier for them to hit their target; 4b/10b would let them hit 10 Gbit/s in no time.
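For what it's worth, the coding-efficiency difference between the two generations is easy to quantify; a sketch, assuming USB 3.0 SuperSpeed uses 8b/10b and USB 3.1's 10 Gbps mode uses 128b/132b (both per the published specs):

```python
# Usable bandwidth after line-coding overhead for the two SuperSpeed rates.
# Advertised USB speeds are the gross physical-layer bitrates.

def usable_gbps(gross_gbps, data_bits, coded_bits):
    """Post-encoding throughput for a given line code."""
    return gross_gbps * data_bits / coded_bits

usb30 = usable_gbps(5.0, 8, 10)      # 8b/10b: 20% overhead -> 4.0 Gbps
usb31 = usable_gbps(10.0, 128, 132)  # 128b/132b: ~3% overhead -> ~9.7 Gbps

print(usb30, usb31)
```

So the more efficient 128b/132b code doesn't change the advertised 10 Gbps figure, but it does mean far more of that gross rate is usable data.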

Any sense of how the USB-IF / OEMs plan to brand/market USB power solutions? How will consumers know which USB ports can provide power, which can take power, and how much (i.e. the different profiles)?

I see the spec no longer fixes power direction, but if I wanted to plug in, say, an extra backup battery to my laptop via USB (as are currently available for smartphones), that would presumably have to be plugged into the special power-input-enabled USB port, not just any port?

The site also says they envision things like printers (presumably small inkjets) being powered by USB, but I assume they mean USB ports in hubs/bricks, not laptops (will a laptop be able to output 36 or 60W?). So if someone markets a printer as being USB-powerable, how will one communicate the requirements?

This sounds really cool, and I'd love to consolidate power bricks; just wondering how they plan to avoid completely confusing consumers.
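For context on the "different profiles" question: the USB Power Delivery 1.0 spec defines five power profiles with increasing maximum wattages. A quick sketch of how a device's power requirement maps onto those profiles (the wattage table is from the PD 1.0 spec; the helper function is illustrative, not any real API):

```python
# USB Power Delivery 1.0 profiles: maximum power each profile can supply.
# (Wattage figures per the PD 1.0 spec; the helper below is hypothetical.)
PD_PROFILES_W = {1: 10, 2: 18, 3: 36, 4: 60, 5: 100}

def min_profile_for(watts):
    """Smallest PD profile whose maximum power covers the requested load."""
    for profile, max_w in sorted(PD_PROFILES_W.items()):
        if max_w >= watts:
            return profile
    return None  # no defined profile is sufficient

print(min_profile_for(36))  # -> 3 (a 36 W printer needs at least profile 3)
print(min_profile_for(60))  # -> 4
```

This is presumably why the branding question matters: a port advertising "USB power" could mean anything from 10 W to 100 W depending on which profile it implements.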

Why does the interconnect use SSIC and not PCI-Express? Is it because it is cheaper? When everything else is moving to PCI-Express (Thunderbolt, SATA Express), why not USB? (Fundamentally it is still a CPU-hogging spec.)

And I hate the design of all the USB ports: Normal, Mini, and Micro.

Maybe the USB spec will continue to get better; I just wish Apple would open up Lightning and use that with USB 3.0 instead.