Towards the end of 2012, Intel will alter the Thunderbolt specification from a copper-based interconnect to a fiber-optic (photonic) one. An analysis by DigiTimes, citing industry sources, states that Thunderbolt will become a PC standard only in 2013, and only in its optical form. This comes even as copper-based Thunderbolt is beginning to feature on some high-end socket LGA1155 motherboards based on the Intel 7-series chipset.

Thunderbolt will get its big push in 2013, when the port will be standard on mainstream desktops and notebooks from OEM majors such as Lenovo, and from several PC motherboard vendors, such as ASUS. Optical Thunderbolt I/O, apart from allowing cable lengths beyond the 6 m limit of copper-wire Thunderbolt, could push bandwidth beyond 10 Gbps, a possible incentive for the industry to facilitate the transition to the optical variant.

Thunderbolt is provided by an additional ASIC, not incorporated in the chipset. The basic problem with the Thunderbolt implementation is its price: $25 per chip. Intel is the sole provider of the Thunderbolt controller; it is not licensed to anyone. From what I've heard, mobo vendors tend to like additional features when their price is $1-$3, $5 max. Thunderbolt is much pricier than $5, so nobody likes it. It's that simple.


That, and Thunderbolt peripherals are so scarce right now, and crazy expensive. Heck, there aren't even any Thunderbolt docks out yet AFAIK (I'd like to have Gigabit Ethernet for my MacBook Air).

Seems like AMD users aren't going to have the luxury of Thunderbolt unless mobo manufacturers incorporate it (at a premium, of course, for the reasons R_1 stated) or they get an add-in card to add that functionality.

Yeah, I think 2013 is overly optimistic. Yes, it is cost-prohibitive, but it also has USB to contend with. Everything the everyman owns is USB, so you're talking a large expense to convert all those items to Thunderbolt ports. And then you have the explosion of smartphones, whose manufacturers finally agreed to use a standardized micro-USB for charging and data exchange--something they aren't likely to change again. I doubt Thunderbolt will become standard until at least 2015, and even then there will be a ton of USB devices.

Perhaps the hardware implementation cost is part of what led to Apple being the one to introduce Intel's Thunderbolt (Light Peak) on its hardware. Many PC manufacturers or motherboard manufacturers may not have been interested at the initial introductory cost. Apple may have been willing if it were coupled with an exclusivity deal (for a limited time).

Who knows, but it's clear that even Thunderbolt's release on LGA1155 hardware will be limited to the higher end of the spectrum. So clearly it's not cheap. This quasi-exclusive nature may contribute to retarding its growth in the market. This could keep peripherals limited in number and relatively expensive, unlike USB.

As for AMD, the company is apparently working on something it calls "Lightning Bolt", the poor man's Thunderbolt:

Apple likely heard about Thunderbolt and demanded access to it on their hardware. Apple has always adopted non-industry-standard ports for their products (hell if I know why). Intel pushed it out the door as a copper product for Apple because optical wasn't ready.

Lightning Bolt is not optical and basically just extends the DisplayPort packet structure.

Naw, USB 3.0 is gonna thrive because of its backwards compatibility with 2.0 and 1.1 (devices o'plenty). USB 4.0 is trivial and that's fine by me. As demonstrated with Super Speed connectors, USB wasn't intended to be expanded in this way. 4.0 connectors would probably have more in common with Frankenstein than USB 2.0. USB also has severe length limitations due to its copper wiring. Intel is going to use Thunderbolt to make sure 4.0 never happens. If Intel were in complete control of USB, they wouldn't have allowed 3.0 to happen.

Apple likely heard about Thunderbolt and demanded access to it on their hardware. Apple has always adopted non-industry-standard ports for their products (hell if I know why). Intel pushed it out the door as a copper product for Apple because optical wasn't ready.

Lightning Bolt is not optical and basically just extends the DisplayPort packet structure.

They did not. Apple basically set the technical requirements they wanted met (like using a Mini DisplayPort connector) and likely funded the research. Intel did all the heavy lifting, with the stipulation that the first Thunderbolt products would be Apple computers. Intel acquired the name "Thunderbolt" from Apple through the deal.

AMD has been quietly demonstrating a cost-effective alternative to Intel's Thunderbolt at CES. It's unimaginatively called Lightning Bolt and, according to the company, it will deliver USB 3.0, DisplayPort and power over a single cable with mini DisplayPort connectors for a fraction of the cost Thunderbolt controllers and devices command.


Intel needs to make sure Thunderbolt is cost-effective in order for it to succeed. USB 4.0 is just around the corner, with approximate specs of 15.2 Gbit/s (1900 MB/s), and will have backward compatibility with USB 2.0 and 3.0 but not 1.0.
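Those Gbit/s-to-MB/s figures are easy to sanity-check; a quick sketch (assuming a raw 8 bits per byte and 1000 Mbit per Gbit, ignoring line-coding and protocol overhead):

```python
# Sanity-check the bandwidth figures quoted in this thread.
# Assumes raw conversion (8 bits per byte); real links lose some
# throughput to line coding (e.g. 8b/10b) and protocol overhead.

def gbit_to_mbyte(gbit_per_s: float) -> float:
    """Convert Gbit/s to MB/s (1 Gbit = 1000 Mbit, 8 bits per byte)."""
    return gbit_per_s * 1000 / 8

print(gbit_to_mbyte(15.2))  # rumored USB 4.0: 15.2 Gbit/s -> 1900.0 MB/s
print(gbit_to_mbyte(10.0))  # Thunderbolt per channel: 10 Gbit/s -> 1250.0 MB/s
```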

Wake me when a device is made for this that I'll actually care about...

Got a feeling TB is gonna turn into the Blu-ray of cabling. With USB 3 and then 4 coming... I'm sure TB will have its place, but I don't expect it to become anything more than FireWire was. The major standards like USB will still dominate for a while.

I think that's a little too optimistic. Putting graphics on a shared bus with NICs, keyboards, mice, external drives, and everything else is destined to create problems of its own (namely bandwidth issues). I think it will be a long time before any interface achieves that kind of bandwidth.


That's the whole point, and why it uses DisplayPort connectors. Video is the only really high-bandwidth application, and HDMI 1.4 already includes 100 Mb Ethernet.

Fiber is the next logical progression of data transfer to/from the PC.
I would think that good QoS prioritization would eliminate a lot of issues with devices becoming starved for bandwidth (if it's even needed).
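For what it's worth, the kind of QoS I mean can be sketched as a strict-priority allocator; the device names, priorities, and rates below are made up purely for illustration:

```python
# Toy sketch of priority-based QoS: allocate a fixed link budget to
# devices in priority order. Names and rates are hypothetical.

def allocate(link_gbps, demands):
    """demands: list of (name, priority, requested_gbps); a lower
    priority number means more important. Returns Gbit/s granted per device."""
    granted, remaining = {}, link_gbps
    for name, _, want in sorted(demands, key=lambda d: d[1]):
        got = min(want, remaining)
        granted[name] = got
        remaining -= got
    return granted

demands = [("display", 0, 6.0), ("10GbE NIC", 1, 10.0), ("ext. SSD", 2, 7.2)]
print(allocate(10.0, demands))
# The display gets its full 6.0, the NIC gets the remaining 4.0, and the
# SSD starves: prioritization keeps video smooth, but it cannot create
# bandwidth that isn't there.
```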

That's the whole point, and why it uses DisplayPort connectors. Video is the only really high-bandwidth application, and HDMI 1.4 already includes 100 Mb Ethernet.


The push for 10 gigabit Ethernet has already begun. That, by itself, can saturate the current Thunderbolt specification, not to mention the issues of interfacing with established networks. Add in a 900 MB/s external SSD and you're already well into the no-man's land of interfaces. Right now, if they smoosh them all into one port, there will be major drawbacks. Optical Thunderbolt also won't supply power initially.
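The back-of-the-envelope math behind that claim (using the figures from this thread; the conversion assumes 8 bits per byte and no protocol overhead):

```python
# Convert the 900 MB/s SSD figure to Gbit/s and compare the combined
# demand against a single 10 Gbit/s copper Thunderbolt channel.
ssd_gbps = 900 * 8 / 1000   # 900 MB/s -> 7.2 Gbit/s
nic_gbps = 10.0             # 10 gigabit Ethernet at line rate
channel_gbps = 10.0         # one copper Thunderbolt channel

total = ssd_gbps + nic_gbps
print(f"SSD alone: {ssd_gbps:.1f} Gbit/s")
print(f"SSD + 10GbE: {total:.1f} Gbit/s vs a {channel_gbps:.0f} Gbit/s channel")
# Either device can nearly saturate the channel on its own; together they
# oversubscribe it by 7.2 Gbit/s, which is the contention being described.
```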

The Thunderbolt implementation in Apple products likely used mini-DP connectors because it's cheap (and Apple probably demanded it). The mini-DP connectors are not likely to work for the optical implementation without substantially increasing the price of the cables.

Fiber is the next logical progression of data transfer to/from the PC.
I would think that good QoS prioritization would eliminate a lot of issues with devices becoming starved for bandwidth (if it's even needed).


Video cards, NICs, etc. all share the PCI Express bus. If you are aiming to combine their outputs, you need a separate bus for that. No PC hardware available now is intended to do that. As far as I know, the PCI Express bus isn't even capable of that without a serious overhaul.

I'm not saying it couldn't be done with smartphones and other low-bandwidth devices, but it is extremely unlikely to happen any time soon with laptops, desktops, workstations, or servers.