Posted
by
Soulskill
on Sunday September 27, 2009 @10:47AM
from the hopefully-they'll-come-up-with-a-less-goofy-name dept.

We recently discussed Light Peak, Intel's upcoming optical interconnect technology, which boasts data transfer rates of up to 10 Gbps. While some have speculated that Light Peak will compete directly with USB 3.0, Engadget has now unearthed information indicating that the idea for the technology originated with Apple, which apparently asked Intel to develop it.
"According to documents we've seen and conversations we've had, Apple had reached out to Intel as early as 2007 with plans for an interoperable standard which could handle massive amounts of data and 'replace the multitudinous connector types with a single connector (FireWire, USB, Display interface).' ... Based on what we've learned, Apple will introduce the new standard for its systems around Fall 2010 in a line of Macs destined for back-to-school shoppers — a follow-up to the 'Spotlight turns to notebooks' event, perhaps. Following the initial launch, there are plans to roll out a low-power variation in 2011, which could lead to more widespread adoption in handhelds and cellphones. The plans from October 2007 show a roadmap that includes Light Peak being introduced to the iPhone / iPod platform to serve as a gateway for multimedia and networking outputs."

USB nowadays is often used to charge devices too, which is not possible over these optical interfaces. Because of this, I don't think this has much of a future for portable devices. So nice try, but I'm not buying it.

Gee, Intel, thanks for the complete lack of information on your page. Licensing costs? Connector shape? Power? Protocol overhead? Though I'll admit, the cheap laser effect and the helpful conversion from x bits transferred per second to the height of x stacked dollar bills in miles do add a lot of class. Could we wait to announce new protocols until there's actual technical information on them to be had?

... cable system, too. It would be passively translated, using exactly the same bit level protocols, etc. It would be slower in most cases, of course. This would be so that metallic connection needs can be seamlessly integrated into the same bus architecture (which I hope fixes the mess they made of USB).

IEEE 1394 (a.k.a. FireWire or iLink) had IP issues, if I recall correctly, and it was more than just the name it was known by, I think. Will this new thing be even more heavily encumbered by patents? I really wish manufacturers would grow a pair and stand up against these emerging "standards" in favor of standards that everyone can use. This is especially true of those that utilize encryption and DRM schemes to control how the technologies are implemented. ("Oh sure! You can use our patented technology for free, but you have to sign here, here and here, and remember, you can only use it in ways that we tell you. If you use it to exercise 'Fair Use' rights, then we will yank your license and sue you into the ground.")

USB dominates the peripherals market because it allows for cheap peripherals. Monitor cables are specialised so the monitor doesn't have to do much work. Ethernet cables allow high transfer rates between expensive devices.

What is the market for this? Will it require "expensive" tech on both ends, or will the PC be able to do the lifting?

Actually, photovoltaic cells are more efficient when illuminated by monochromatic light than they are when illuminated by sunlight (narrower spectral spread means you can pick a semiconductor to hit the peak efficiency). You can easily get 50-60 percent conversion of laser light.
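To put a rough number on that, here's a back-of-the-envelope sketch (the specific wattage and loss figures are my own illustrative assumptions, not from the post) of how much electrical power a photovoltaic receiver could recover from a laser-lit fiber at the ~50-60 percent monochromatic conversion efficiency mentioned above:

```python
# Illustrative sketch: electrical power recoverable from laser power sent
# down a fiber. The 1 W laser, 55% PV conversion, and 5% fiber loss are
# assumed example figures, not anything from Intel or Apple.
def delivered_power_w(laser_w, pv_efficiency, fiber_loss_frac=0.05):
    """Electrical watts out, given optical watts in."""
    return laser_w * (1.0 - fiber_loss_frac) * pv_efficiency

# A hypothetical 1 W laser at 55% conversion with 5% fiber loss:
print(round(delivered_power_w(1.0, 0.55), 4))  # 0.5225 W
```

Roughly half a watt delivered — enough for trickle charging, but well short of what a wired bus hands a power-hungry peripheral, which is the parent poster's point.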

Tie a knot in an optical cable? Colour me skeptical. I am unaware of any optical material with a decent transmissive efficiency that has that kind of flexibility. Perhaps a polymer of some kind, but it will not be able to take that repeatedly, as the optical transmissiveness is dependent upon the material being fairly structurally dense, which rules out extreme flexibility in all of the polymers that I know of.

IMHO optic fibre has no place in consumer gear. The cable lengths do not necessitate them for high speed transmission, the cost of end devices will always be higher than for wire-devices due to the need to modulate to optical signals and back again, and the possibility of getting dust or dirt into the socket or otherwise abusing the equipment is far higher under consumer product conditions.

Optical may have outlived its usefulness for storage and backup, but it hasn't outlived its usefulness as a distribution medium. It is a lot cheaper for a software vendor to ship out their software on ~10-cent DVDs rather than ~$5 SD cards or USB drives. Entertainment firms especially like optical disks because, in addition to being cheaper, they are also more fragile and harder to use with computers, as opposed to locked-down, purpose-built, stand-alone players. Computers can better do unwanted things like skip the mandatory 30 minutes of previews, transfer the files to another medium, or strip out DRM altogether, so the entertainment firms want to discourage the playback of their files on computers as much as possible. The obvious distribution method of using the Internet is even more unappealing to software and entertainment distributors, as they think it makes piracy easier and makes their ridiculous pricing schemes based on "scarcity" look that much more ridiculous.

So while putting things on optical media may be pretty much useless for customers, suppliers love it and that's why we won't see optical media die for a good, long time.

You get complete galvanic separation, thinner cables, and no dodgy contacts.

Good points. The one very nice thing about optical power transmission--assuming it could be made practical in a consumer product--would be total electrical isolation between devices. No more ground loops.

If you look at a computer from around 1994, it will probably have all of these and things plugged into most of them. A modern laptop can have the same set of devices all plugged in to a USB hub, connected to a single USB port. This same laptop, however, will probably still have:

DVI, HDMI, or DisplayPort connectors for a monitor.

Ethernet.

FireWire for external disks, digital cameras, and so on.

eSATA for external disks.

USB doesn't yet replace these (some people use USB 2 for disks, but in the real world it doesn't perform as well as even FireWire 400, let alone FireWire 800). This should be able to replace all of them. A laptop in five years' time would then need only one type of port, and if it's fast enough, a palmtop with just one of them will be able to drive all the things that a modern desktop or laptop might have connected.
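The USB 2 vs. FireWire gap above is mostly protocol overhead, not raw signalling rate. A quick sanity check, using the high-speed USB rule that a bulk endpoint gets at most 13 512-byte packets per 125-microsecond microframe (real hosts land well below even this ceiling because of host-driven scheduling, which is where FireWire's peer-to-peer DMA wins):

```python
# Back-of-the-envelope ceiling on USB 2.0 bulk throughput. High-speed USB
# permits at most 13 bulk packets of 512 bytes in each 125 us microframe,
# so disk-style transfers cap out well under the headline 480 Mbit/s
# (= 60 MB/s raw); real-world throughput is lower still.
PACKETS_PER_MICROFRAME = 13
PACKET_BYTES = 512
MICROFRAME_S = 125e-6

ceiling_mb_s = PACKETS_PER_MICROFRAME * PACKET_BYTES / MICROFRAME_S / 1e6
print(round(ceiling_mb_s, 3))  # 53.248 MB/s protocol ceiling
```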

To be honest, I never quite understood why USB wasn't simply pluggable either way, with some sort of auto-negotiation to figure out which pins do what.
Then there would have been no guessing :(
Oh well, lesson learned, maybe?
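The "plug in either way" idea could work something like this sketch: the receptacle probes a sense line on each edge of the connector and swaps its pin mapping to whichever orientation it detects. All the names here are invented for illustration; real USB of this era had no such mechanism.

```python
# Hypothetical orientation auto-negotiation for a reversible connector.
# Pin names and numbering are made up for illustration only.
NORMAL  = {"VBUS": 1, "D-": 2, "D+": 3, "GND": 4}
FLIPPED = {"GND": 1, "D+": 2, "D-": 3, "VBUS": 4}

def negotiate_pinout(sense_a_low: bool, sense_b_low: bool):
    """Pick a pin map based on which sense line the plug pulls low."""
    if sense_a_low and not sense_b_low:
        return NORMAL
    if sense_b_low and not sense_a_low:
        return FLIPPED
    return None  # nothing plugged in, or a faulty cable

print(negotiate_pinout(True, False)["VBUS"])  # 1 (normal orientation)
print(negotiate_pinout(False, True)["VBUS"])  # 4 (flipped orientation)
```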

Firewire was designed to do just that, whereas USB was intended for cheaper and simpler devices. Of course, having two different technologies for different kinds of devices was too complicated, so now we "simply" have USB with a dozen different connectors, speeds, and host/device/OTG capabilities.

Actually, what's the matter with competition? If USB 3.0 is actually as good as they say, that's okay. If there's a real purpose to delivering multiple protocols through a single optical cable the thickness of a hair, at 10 Gbps now and up to 100 Gbps later, it would be relatively easy to connect six or seven peripherals, run them at max, and still add the connection to a 36" monitor running at 2048 x 3840. The two things are not competitors. There might be uses for both, or for neither. Let 'em compete.
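For what it's worth, the monitor claim above roughly checks out (assuming 24 bits per pixel and a 60 Hz refresh, figures I'm supplying, not the poster): an uncompressed 2048 x 3840 stream slightly exceeds Light Peak's initial 10 Gbps but fits comfortably inside the projected 100 Gbps with room left for peripherals.

```python
# Rough bandwidth arithmetic for the 36" monitor example. The 24 bpp and
# 60 Hz figures are assumed for the sake of the calculation.
width, height = 2048, 3840
bits_per_pixel = 24
refresh_hz = 60

gbps = width * height * bits_per_pixel * refresh_hz / 1e9
print(round(gbps, 2))  # 11.32 Gbit/s uncompressed
```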

I think Apple's move comes after the failure of FireWire to gain market share in competition with Intel's relatively slow -- that's right -- USB standard, which won because of the ubiquity of Intel chipsets, despite USB's need for processor arbitration. By the end, they could give the standard away and it would still never get traction. So, no FireWire 3200.

But optical? Well, with one connector to the motherboard, you get a multitude of protocols, all running on the same fiber. Theoretically, it's great. And Intel will be able to profit no matter which wins; or maybe it's USB 3.0 first and then Light Peak? Who knows?