
We recently discussed Light Peak, Intel's upcoming optical interconnect technology that boasts data transfer rates of up to 10 Gbps. While some have speculated that Light Peak will directly compete with USB 3.0, Engadget has now unearthed information indicating that the idea for the technology originated with Apple, which apparently asked Intel to develop it.
"According to documents we've seen and conversations we've had, Apple had reached out to Intel as early as 2007 with plans for an interoperable standard which could handle massive amounts of data and 'replace the multitudinous connector types with a single connector (FireWire, USB, Display interface).' ... Based on what we've learned, Apple will introduce the new standard for its systems around Fall 2010 in a line of Macs destined for back-to-school shoppers — a follow-up to the 'Spotlight turns to notebooks' event, perhaps. Following the initial launch, there are plans to roll out a low-power variation in 2011, which could lead to more widespread adoption in handhelds and cellphones. The plans from October 2007 show a roadmap that includes Light Peak being introduced to the iPhone / iPod platform to serve as a gateway for multimedia and networking outputs."

Put it on iPods and it becomes ubiquitous almost immediately. They could charge extra for a USB cable or dock.

Well, looking at the diagram, dongles to connect USB and other types would be the means to do that. Personally, if it works as well as they say it does, I'd opt for gadgets and devices that just support it natively.

From the photos, it looks like a standard USB connector. The optical part likely connects through the centre of the connector, and I imagine the standard four copper conductors are still in place. This makes sense, as it enables low-cost cables and peripherals by simply reusing the existing USB standard.

Future computers could use the physical connector as the only interface to the machine while retaining compatibility with existing USB devices. Kind of like how those Mini-TOSLINK [wikipedia.org] cables work.

The iPod was new then, and not yet ubiquitous. Also, they were fighting against Intel rather than with them. With Macs, iPhones, iPods, iTablets, and Intel, they can start a new standard overnight. BTW, when they switched to USB, I understood why, but it was soooo much slower than FireWire.

Apple had to drop FireWire? I don't know if you've looked at an Apple computer recently, but every single Apple computer sold today, with the exception of the entry-level white polycarbonate MacBook, has FireWire.

The iPod dock connector is what I believe you're referring to, and while it's proprietary, it's also very well documented for developers [apple.com] and carries a lot more than just plain ol' USB. It has, among other things, pins for FireWire (deprecated on iPods), analogue audio and video, and a control channel.

Actually, what's the matter with competition? If USB 3.0 is actually as good as they say, that's okay. If there's a real purpose to delivering multiple protocols through a single optical cable the thickness of a hair at 10 Gbps, and later up to 100 Gbps, it would be relatively easy to connect six or seven peripherals, run them at max, and add in the connection to the 36" monitor running at 2048 x 3840. The two things are not necessarily competitors. There might be uses for both, or neither. Let 'em compete.
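
A quick back-of-the-envelope check of that scenario (all figures here are illustrative assumptions, not from Intel's announcement):

    # Raw pixel data rate for an uncompressed display link, ignoring
    # blanking intervals and line-code overhead.
    def display_gbps(width, height, bits_per_pixel=24, refresh_hz=60):
        return width * height * bits_per_pixel * refresh_hz / 1e9

    monitor = display_gbps(2048, 3840)   # the 36" monitor above: ~11.3 Gbps
    # Six peripherals at an assumed 1 Gbps each, plus the monitor:
    total = monitor + 6 * 1.0
    print(f"monitor: {monitor:.1f} Gbps, total: {total:.1f} Gbps")

So that single monitor alone would overrun the initial 10 Gbps link; the scenario really belongs to the 100 Gbps end of the roadmap.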

Yeah, two "possessive" types would help, but a more direct syntax would help here too. The headline "Intel's USB Competitor" is ambiguous: the writer could mean "a competitor to Intel's USB standard" or "a competitor belonging to Intel which will compete against USB." For the former, "Apple Behind Competitor to Intel's USB" would work better; for the latter, "Apple Behind Intel's Competitor to USB."

USB nowadays is often used to charge devices too, which is not possible with these optical interfaces. Because of this, I don't think this will have much of a future for portable devices. Nice try, but I'm not buying it.

They will have a hybrid copper/optical [cnet.com] wire to power devices: "In addition, Intel said it's working on bundling the optical fiber with copper wire so Light Peak can be used to power devices plugged into the PC, he said."

In practice this is correct. OTOH, add a photocell and one could, at least in principle, power a device. Of course, as others have mentioned, running a wire alongside the fiber optic solves this problem.

For your comment, can I please get an explanation of why we lose both at the light-generation stage and then at the solar-cell stage (heat, I assume)? Also, what are you suggesting wire efficiency is? IIRC from the basic stuff I've read here and there, wire efficiency goes down over distance. Just curious.

Actually, photovoltaic cells are more efficient when illuminated by monochromatic light than they are when illuminated by sunlight (narrower spectral spread means you can pick a semiconductor to hit the peak efficiency). You can easily get 50-60 percent conversion of laser light.
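
To put numbers on the power-over-fiber idea (a minimal sketch; the laser power and coupling loss are assumptions, and only the 50-60 percent conversion figure comes from the comment above):

    # Rough power-over-fiber estimate. All inputs are assumed values.
    laser_power_w = 1.0      # optical power launched into the fiber
    fiber_coupling = 0.8     # assumed fraction surviving fiber + connectors
    pv_efficiency = 0.55     # mid-range of the 50-60% figure above

    delivered_w = laser_power_w * fiber_coupling * pv_efficiency
    print(f"Electrical power at the device: {delivered_w:.2f} W")  # ~0.44 W

Under those assumptions you'd get roughly the 0.5 W that a low-power USB device (100 mA at 5 V) is allowed to draw, which is why the hybrid copper/optical cable is the more practical route.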

You get complete galvanic separation, thinner cables, and no dodgy contacts.

Good points. The one very nice thing about optical power transmission--assuming it could be made practical in a consumer product--would be total electrical isolation between devices. No more ground loops.

In practice this is correct. OTOH, add a photocell and one could, at least in principle, power a device. Of course, as others have mentioned, running a wire alongside the fiber optic solves this problem.

Yeah, seriously. I mean, how much damage can a 10 watt laser really cause, anyway?

I think the idea is to have a homogeneous connector form factor for all data connections on the computer, so that all cables are interchangeable. As far as I know, the bandwidth of an optical transmission isn't limited by the transmitting medium itself, but by the hardware interpreting the signal on either end, which improves along the lines of Moore's law. So you set a standard for the cable and connector now and create interchangeable cables that are not device-specific, with the result that all changes to the technology occur completely on the backend, out of sight of the user.

If this is, indeed, the goal of Light Peak, I *really* hope that they learned a lesson from USB and make a connector that can be plugged in using tactile feedback, rather than requiring the user to guess-and-rotate as is the case today.

To be honest, I never quite understood why USB wasn't made to plug in either way, using some sort of auto-negotiation to figure out which pins do what.
So there would have been no guessing :(
Oh well, maybe lesson learned?
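
For what it's worth, the auto-negotiation being wished for here is easy to sketch (this is a purely hypothetical scheme, not anything USB actually does):

    # Toy model of a reversible 4-pin plug: the host samples the pins,
    # notices when the plug is upside down, and remaps accordingly.
    EXPECTED = ("VBUS", "D-", "D+", "GND")

    def negotiate(pins_as_seen):
        # Ground appearing where VBUS belongs means the plug is flipped.
        if pins_as_seen[0] == "GND":
            pins_as_seen = tuple(reversed(pins_as_seen))
        return dict(zip(EXPECTED, pins_as_seen))

    print(negotiate(("GND", "D+", "D-", "VBUS")))  # flipped plug, corrected

The catch is that the real obstacle was economic rather than logical: every host and device shipped would have needed the extra detection circuitry.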

FireWire was designed to do just that, whereas USB was intended for cheaper and simpler devices. Of course, having two different technologies for different kinds of devices was too complicated, so now we "simply" have USB with a dozen different connectors, speeds, and host/device/OTG capabilities.

So now users can call tech support with their mouse plugged into their monitor and say that their "computer doesn't work".

I don't understand the fixation on making a completely universal plug. It seems good in theory, but what does it actually get us beyond some cable interchanging possibilities and expensive upgrades?

Why aren't we working on better wireless communication so that we don't need wires at all? I can't get my wireless mouse 2 feet away from the receiver, and I sure as hell don't want another cable cluttering things up.

Because we can't power anything through wireless, at least not in any practical or inexpensive fashion. So we either need batteries and battery replacements or a power connector, which kind of defeats the purpose. Unless, of course, we can power them using alternative means ("solar" power panels, key clicks/mouse movement). Powering an antenna (array) requires quite some juice.

Since the cable can be 100 feet (30+ m), I'd put my computer in the basement, put even bigger fans on it and overclock it a bit more. Then I'd run a cable to my living room TV and bedrooms, so that the whole house can simultaneously use a single computer from many different local monitors/keyboards. It's pretty damn elegant and efficient if you ask me. Since you only need one computer for the house, it's worth it to make it awesome: multiple CPU sockets, multiple GPUs -- this is stuff that has entered the mainstream.

USB nowadays is often used to charge devices too, which is not possible with these optical interfaces.

Here's an optical interface that can transfer lots of power: a CO2 laser [wikipedia.org]. You wouldn't want to feel around the back of a computer with one of these behind one of the interface connectors, though.

Except this isn't just trying to be USB 4.0; it's ambitious enough to replace high-bandwidth interfaces like DVI/HDMI/DisplayPort. (Maybe Ethernet, too, but I think that'd be a bit too much of an uphill slog to pull off.)

Now, whether or not that actually happens is an open question, but can you imagine how cool it'd be to have a bunch of identical ports on your laptop, which you're free to plug your monitor, mouse, or video camera into?

Perhaps, perhaps not. If the transition is made as a step to a new generation of connectors, you will hopefully end up with a generation that has fewer connector types. After all, we have managed to go from:

Mice: used to be serial or PS/2; now: USB.
Keyboards: serial, PS/2, AT; now: USB.
External CD drives: used to be SCSI or whatever; now: USB.
External HDDs: the same, even if some enthusiasts also use eSATA.

And I'll bet you have nothing at all to say about the hundreds of other little things that use USB. Phones, flash drives, webcams, TV tuners, wifi, ethernet, bluetooth, and SO MANY MORE things I can't even remember, much less have seen before.

That would be true of most companies. But this is Apple we're talking about. They nearly went out of business back in 1997 because they got rid of standard serial/keyboard/mouse/parallel/SCSI connectors and replaced them with USB (and occasionally Firewire).

What exactly makes you think that Apple nearly went bankrupt (they didn't) because they dropped legacy ports? Besides, if Apple nearly went bankrupt in 1997, I fail to see how it applies, since it was the iMac that dropped the legacy technology (the floppy, and the only expansion ports it had were USB), and the iMac was released in 1998... And last time I checked, it was pretty popular....

Maybe, like Apple's previous poster child FireWire, it will be freaking awesome but have absolutely no uptake in the consumer market, leading its own champions to drop support for it (see the 5G iPod and recent MacBooks).

recent MacBooks

The Air never had FireWire, probably because it was always designed as a cut-down ultraslim machine.

The basic polycarbonate MacBook has always had FireWire 400.

All of Apple's other current machines have FireWire 800 (which is compatible with 400 via a wiring adaptor).

The 13-inch unibody didn't initially have FireWire, which many people at the time took as a sign of Apple dropping it. However, either the pundits were wrong or Apple decided the backlash was too much, because soon afterwards it came back.

I think Firewire served Apple's purpose well. During the time that USB was still catching up (for several years), Apple computers had a freaking fast peripheral bus which made them very desirable for video and photo people. That gave them the edge they needed to stay relevant and perhaps dominate in that area, to a limited extent.

It would be perfect to have a small, simple, single connection between a laptop, enhanced iPhone/iPod, or *cough* tablet *cough* and an external display (power would be the only other connection needed, unless the proposed connector contains power pins). The display would contain ports for wired networks, USB, FireWire, speakers, a "web" camera, a microphone, eSATA, etc. (much like Apple's and others' current display products).

This would be Apple's answer to docking stations, which often have rather large fix ...

... such as the long settling time when a new device is plugged in, and the loss of continuity when a device is unplugged and quickly plugged back in. Another pet issue is that there should be a means to address a device specifically by which port it is plugged into, as well as by the device's unique ID regardless of port (see the sketch after this comment).

BTW, they could have included a USB path via the DVI/HDMI cable connection, so USB devices could be plugged directly into the monitor. I do worry that even Light Peak ...
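
A minimal sketch of that "address by port or by unique ID" idea (the names and structure here are hypothetical, not any real bus API):

    # Devices reachable both by physical port path and by unique ID.
    class DeviceRegistry:
        def __init__(self):
            self.by_port = {}   # e.g. "root.hub2.port3" -> device
            self.by_uid = {}    # e.g. "04e8:6860:cam01" -> device

        def plug(self, port, uid, device):
            self.by_port[port] = device
            self.by_uid[uid] = device

        def unplug(self, port):
            device = self.by_port.pop(port, None)
            if device is not None:
                self.by_uid = {u: d for u, d in self.by_uid.items()
                               if d is not device}
            return device

    reg = DeviceRegistry()
    reg.plug("root.hub2.port3", "04e8:6860:cam01", "camera")
    assert reg.by_port["root.hub2.port3"] is reg.by_uid["04e8:6860:cam01"]

With both tables maintained, software that cares about *where* a device is (a fixed kiosk scanner, say) and software that cares about *which* device it is can each get a stable handle.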

Gee, Intel, thanks for the complete lack of information on your page. Licensing costs? Connector shape? Power? Protocol overhead? Though I'll admit, the cheap laser effect and the helpful conversion from x bits transferred per second to the height of x stacked dollar bills in miles do add a lot of class. Could we hold off on announcing new protocols until there's actual technical information on them to be had?

... cable system, too. It would be passively translated, using exactly the same bit level protocols, etc. It would be slower in most cases, of course. This would be so that metallic connection needs can be seamlessly integrated into the same bus architecture (which I hope fixes the mess they made of USB).

You do realise that the *point* of this is that there is one cable, one connector, and one standard. You can plug anything into anything, and it works. Adding a second cable standard would completely defeat the point. And why, by the way, might you have "metallic connection needs"?

You're wrong. USB is and was for hooking up peripherals like keyboard/mice/printers/low-bandwidth devices to effectively replace the old RS-232 serial and parallel ports of yore. USB was never intended to replace the interface that goes to your monitor, your hard drives*, and your ethernet.

* Yes, we're all aware of USB storage, but see all the comments above about how even low-end devices today can swamp USB... if USB was so great for this then eSATA never would have come into existence.

This new standard appears to be point-to-point, and with all the knowledge we have now it will hopefully be efficient. Additionally, 10 Gbps is just the starting speed... Intel has talked about scaling it to 100 Gbps without too much difficulty.

If you look at a computer from around 1994, it will probably have all of these, and things plugged into most of them. A modern laptop can have the same set of devices all plugged into a USB hub, connected to a single USB port. This same laptop, however, will probably still have: ...

Even with the criticisms (e.g., http://en.wikipedia.org/wiki/MagSafe [wikipedia.org]), one thing I've been impressed with Apple on (and there aren't that many) is the MagSafe connector. I've had way too many problems with other connectors wearing out and not working, and occasionally with unintentional yanking almost causing havoc.

I'd love to see the next generation of data connections (with power transfer) be magnetic. To avoid shorts, the power transfer could be inductive, and the optical connection isn't going to short. I'd be happy to have every single damn cable I ever have to use in the future be some variation of MagSafe.

IEEE 1394, or FireWire, or iLink, had IP issues if I recall correctly, and it was more than just the name it was known by, I think. Will this new thing be even more heavily encumbered by patents? I really wish manufacturers would grow a pair and stand up against these emerging "standards" in favor of standards that everyone can use. This is especially true of those that utilize encryption and DRM schemes to control how the technologies are implemented. ("Oh sure! You can use our patented technology for free, but you have to sign here, here and here, and remember, you can only use it in ways that we tell you. If you use it to exercise 'Fair Use' rights, then we will yank your license and sue you into the ground.")

We'll have to see if Apple has learned anything. I first heard about FireWire in maybe 1993. I went to work at Apple in 1995 and met with the people developing FireWire and there was lots of talk about having devices natively support it, yada yada. It didn't make it into shipping Apple hardware until 1999. Besides being late to market, Apple insisted on charging licensing fees to everyone who incorporated FireWire.

Had FireWire been out in 1996, they might have been able to get away with the licensing fees. Had they forgone the licensing fees in 1999 they might have kept USB a low-speed interconnect.

In order to succeed in today's market it will need to offer technical advantages over USB 3.0 and not come with a price premium. Having Intel introduce it is a pretty strong first step. We'll have to see how the rest of it plays out.

You can use our patented technology for free, but... If you use it to exercise 'Fair Use' rights

Fair Use is a defense to copyright infringement, not patent infringement. There's no way to claim "I used your patented technology without a license, but I only used it for educational purposes, or in a news reporting circumstance, or I only used it for 30 seconds."

USB dominates the peripherals market because it allows for cheap peripherals. Monitor cables are specialised so as not to require the monitor to do much work. Ethernet cables allow high transfer rates between expensive devices.

What is the market for this? Will it require "expensive" tech on both ends, or will the PC be able to do the heavy lifting?

Not sure what your measure of "unable to compete" is. FireWire is not dead; in fact, I think the number of FW800 devices has increased. This is much less than the number of USB devices, but "unable to compete" overstates the current situation. FW800 is fast enough for a file server in my home office (I'm not swapping to those drives; rather, it's for shared files). So FireWire has a nice niche market, generally sustained by its advantages and widespread use on Macs in their own niche market.

I wonder just why USB3 cannot be used as that one-connector-to-rule-them-all stuff. In fact, does anybody know why monitors aren't offering a USB2 option? Is it a bandwidth problem, or what? And why is the use of USB2 as a networking port not more widespread? Just a matter of speed? USB2 is speedy enough for most networking uses, and USB3 will be faster than most Ethernets. Of course you'd need routers with USB2 connections, but they could start with one or two connections at first and see if people bought it. ...

USB3 is pretty marginal for connecting a monitor. Your average single-link DVI interface has up to 3.96 Gbps of bandwidth, which a typical 1920x1080 LCD @ 60 Hz nearly saturates. USB3 is rated at 5 Gbps, but if it's anything like USB2, you'll probably see ~2 Gbps of actual throughput and a huge CPU load. USB2 is horrible as a display interface; it is really only good for connecting small secondary displays showing static 2D images. You have only 480 Mbps of theoretical bandwidth, which is enough to drive ...
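
The DVI arithmetic checks out, for what it's worth (24-bit colour assumed; DVI's TMDS line coding is ignored, so these are payload figures):

    # Single-link DVI ceiling: 165 MHz pixel clock x 24 bits/pixel.
    dvi_single_link = 165e6 * 24 / 1e9      # 3.96 Gbps
    # 1080p60 at the standard 148.5 MHz pixel clock (includes blanking).
    fullhd_60hz = 148.5e6 * 24 / 1e9        # ~3.56 Gbps

    print(f"1080p60 uses {fullhd_60hz / dvi_single_link:.0%} "
          f"of a single DVI link")          # ~90%

Which is exactly the "nearly saturates" situation described, and why ~2 Gbps of real USB3 throughput would not carry a 1080p60 panel without compression.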

Tie a knot in an optical cable? Colour me skeptical. I am unaware of any optical material with a decent transmissive efficiency that has that kind of flexibility. Perhaps a polymer of some kind, but it will not be able to take that repeatedly, as the optical transmissiveness is dependent upon the material being fairly structurally dense, which rules out extreme flexibility in all of the polymers that I know of.

IMHO optic fibre has no place in consumer gear. The cable lengths involved do not necessitate it for high ...

Even if that's correct (which I doubt), do you suppose that the next iteration of pretty much every device might have faster memory in it? Or that it will, if there's an interconnect that can take advantage of it?

The iPhone uses cheap MLC NAND flash. If Apple wanted faster flash memory, they could have installed more expensive and faster SLC flash. But it will be a while before Apple puts something in the iPhone that will even saturate USB 2.0.

I estimate the flash write speed on my 16GB iPhone 3G to be around 5 megabytes per second.

And what Apple wants to do with this interconnect is to replace things like DVI/Display Port, Firewire/USB, (e)SATA, etc., all on one bus.

I think this is probably what Apple is after. As I look at my MacBook Pro, I have the following connectors: MagSafe (power), Ethernet, FW800, miniDP, USB x2, SD card, line-in, and headphones. You could probably get rid of Ethernet, FW, miniDP, and USB and replace them with Light Peak. Since I'm rarely using more than two of those at a time, you could probably reduce the number of ports and start shrinking devices.

The other thing that Apple seems to be targeting is the optical drive. I think you're going to see Apple dropping optical altogether, and moving OS delivery to SD cards. Most other software/media will be downloads.

The other thing that Apple seems to be targeting is the optical drive. I think you're going to see Apple dropping optical altogether, and moving OS delivery to SD cards. Most other software/media will be downloads.

Interesting, and I think you're right, especially since you can already boot OS X from a flash drive, and one with the capacity of a DVD (8 GB) can probably be purchased in bulk for a buck apiece.

I also think optical is somewhat outliving its usefulness for storage or backup. HD and flash space has gotten larger and cheaper, much faster.

Optical may have outlived its usefulness for storage and backup, but it hasn't outlived its usefulness as a distribution medium. It is a lot cheaper for a software vendor to ship out their software on ~10-cent DVDs rather than ~$5 SD cards or USB drives. Entertainment firms especially like optical disks because, in addition to being cheaper, they are also more fragile and harder to use with computers, as opposed to locked-down, purpose-built, stand-alone players. Computers can better do unwanted things like skip the mandatory 30 minutes of previews, transfer the files to another medium, or strip out DRM altogether, so the entertainment firms want to discourage the playback of their files on computers as much as possible. The obvious distribution method of using the Internet is even more unappealing to software and entertainment distributors, as they think it makes piracy easier and makes their ridiculous pricing schemes based on "scarcity" look that much more ridiculous.

So while putting things on optical media may be pretty much useless for customers, suppliers love it and that's why we won't see optical media die for a good, long time.

Not from an SD card, I can't see it. That's too (pick one) copyable, counterfeitable, normal, cheap-looking. No, if they do away with releasing software optically, physical media will be in a very tasteful custom thumb drive of some sort, with lots of special DRM built in.

Do you only use it for your mouse and keyboard? If that's the case, then you'll probably be satisfied.

Now, back in the real world, it becomes the bottleneck for even low-end, high-capacity storage devices built around traditional spinning media. With us now moving towards solid-state storage, USB 2.0 fails us horribly. We only see 30% to 35% of the rated bandwidth under real-world read/write conditions (see the rough numbers after this comment).

The same goes for connecting high-end visual displays via USB. Once you get above a resolution of 200 ...
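
Putting numbers on that (the 30-35% figure is the poster's; the drive speeds are rough contemporary values assumed for illustration):

    # USB 2.0 signals at 480 Mbit/s; real-world bulk transfers see far less.
    usb2_raw_mbps = 480
    utilization = 0.35                           # upper end of the figure above
    effective_MBps = usb2_raw_mbps * utilization / 8
    print(f"Effective: ~{effective_MBps:.0f} MB/s")   # ~21 MB/s

    # A laptop hard drive of the era sustains ~60-80 MB/s, and SSDs
    # well over 100 MB/s, so the bus, not the disk, is the bottleneck.

Roughly 21 MB/s against drives that sustain several times that is exactly the "fails us horribly" situation described.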

The tricky bit with replacing video with a general-purpose interface would be sorting out the signal routing inside the computer. There still needs to be a GPU/framebuffer, and that GPU needs a high-bandwidth path (we are talking a couple of PCIe 1.x lanes' worth per display) from the framebuffer to the general-purpose interface.

Not saying this couldn't be done, but it would definitely require cooperation between the GPU vendor and the vendor of the general-purpose interface.

Because, as mentioned in another thread the other day, the reason a lot of devices don't have gigabit or 10-gigabit connections is that those interfaces take 6 watts, rather than the 1 watt or less for 100 Mbit or 10 Mbit. Optical is a good choice for the faster speeds because it requires less power than a high-bandwidth copper connection.

You don't have to buy a $40 HDMI cable. If the cables you buy are that expensive, then you're just getting fleeced. Do the barest amount of research before you purchase.

Also, the cheap HDMI cables are more expensive than "ethernet patch cables" because of licensing, a more expensive connector, more wires, and more stringent requirements on the quality of materials. The cable costs more than a dollar because it's the equivalent of several CAT-6a cables. It's designed to transmit raw video data at 1920x1080p30. That's roughly 1.4Gbps. The standard even defines faster rates. You'd need 2-3 CAT6a cables to transfer video at that rate and still cover everything else HDMI takes care of.
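
The "roughly 1.4 Gbps" figure is easy to verify (active pixels only; blanking intervals and TMDS encoding push the actual on-wire rate higher):

    # Raw video payload for 1920x1080 at 24-bit colour, 30 frames/s.
    raw_bps = 1920 * 1080 * 24 * 30
    print(f"{raw_bps / 1e9:.2f} Gbps")   # ~1.49 Gbps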

Even better: why does HDMI have "many wires" when just two optical (even TOSLINK!) cables would have worked?

One for the send path and one for the return path. That's it. No ground loops, no cable-quality issues, no switch complexity (no parallel wires that have to be *exact length* on the PC board traces).

Duh!

I have stopped expecting quality connector and cable standards from computer makers and the industry. SATA is a nightmare, SATA power is no better than the old-style 4-pin Molex drive power, HDMI is a nightmare ...

I have never understood why industry standards such as HD-SDI have never made it to the consumer market: a single coax cable terminated with BNCs that can deliver 4K (four times the resolution of 1080p) or higher with 16 channels of audio, all uncompressed, at lengths of over 100 m.

The obstacle I see with networking is that the world connects with RJ45, so for wired networking you'd still need an adapter. The other obstacle is security and management. Ethernet networks often contain (relatively) untrusted devices (so anything attached to them will need to implement security, and users will need to deal with setting that up) and are managed by someone other than the computer's user (so getting more addresses can be nontrivial). Compare this to something like USB or FireWire, where it is ...

Apple wants something better than the USB crap. Apple knows Intel has the IP to make something better. Apple lets Intel in on the gig so Intel will be more willing to eventually drop USB. All Apple needs to do is convince Intel that this will be big enough that Intel's share of the booty will still be bigger than with USB. That, and convince Intel that they can't go it alone, because Apple controls what goes on the iPods. That way Apple and Intel get to rape the consumer together. This is what you get when people buy ...