Posted
by
Soulskill
on Tuesday May 13, 2014 @11:10PM
from the apple-can-afford-life-support-for-a-while dept.

Lucas123 writes: "The USB SuperSpeed+ spec (a.k.a. v3.1) offers up to 10Gbps throughput. Combine that with USB's new Type-C connector, the specification for which is expected out in July, and users will have a symmetrical cable and plug just like Thunderbolt's, but one that will enable up to 100 watts of power depending on the cable version. So where does that leave Thunderbolt, Intel's other hardware interconnect? According to some analysts, Thunderbolt withers or remains a niche technology supported almost exclusively by Apple. Even as Thunderbolt 2 offers twice the throughput (on paper) of USB 3.1, or up to 20Gbps, USB SuperSpeed+ is expected to scale past 40Gbps in coming years. 'USB's installed base is in the billions. Thunderbolt's biggest problem is a relatively small installed base, in the tens of millions. Adding a higher data throughput, and a more expensive option, is unlikely to change that,' said Brian O'Rourke, a principal analyst covering wired interfaces at IHS."

I figured that all along. It took off on Apple hardware, with almost no pickup on normal PCs. That has finally started to happen a little - some upper end motherboards have 1 or 2 Thunderbolt ports now, and Asus has an add-on board for a few others - but it is really a niche thanks to its odd hardware requirements and lack of early adoption outside of Apple. USB is easier to use, and at least up to 3.0 has been backward compatible with older devices. With an even faster option, as long as they don't screw something up, I don't see how USB could not continue to be the leading connectivity standard.

This is exactly what I came here to post. It's a shame, because FW400 was far superior to USB2.0. The problem lay with the peripheral manufacturers who didn't want to put in more expensive controllers and dual ports on their enclosures. Heck, wasn't the iSight the only webcam for Firewire? No demand = no supply = high prices. FW800 was pretty much the same. Better tech, limited market, high prices, bang, whimper. I love that my old Mac Mini can transfer data between 3 daisy-chained FW400 drives much faster than it can transfer to a single USB2.0 drive.

Just had a look at my local online camera shop. None of the Canon home, prosumer or pro-broadcast cameras have FW anymore and only the Sony pro-broadcast cameras have it, not their prosumer or home devices. Sad.

Using firewire for external hard drives and other tech came long after firewire/i.link was added to video cameras.

This was the problem with Firewire. It has lots of technical things going for it, but unlike USB it was not actually designed to be general-purpose. Neither was Thunderbolt. Thunderbolt was designed to support the specific protocols that Intel was building into its chips and boards.

This is exactly what I came here to post. It's a shame, because FW400 was far superior to USB2.0. The problem lay with the peripheral manufacturers who didn't want to put in more expensive controllers and dual ports on their enclosures. Heck, wasn't the iSight the only webcam for Firewire? No demand = no supply = high prices. FW800 was pretty much the same. Better tech, limited market, high prices, bang, whimper. I love that my old Mac Mini can transfer data between 3 daisy-chained FW400 drives much faster than it can transfer to a single USB2.0 drive, but the fact that enclosures are expensive and basically non-interchangeable with any of my other devices makes it a pretty niche market. Thunderbolt will probably follow the exact same progression, right down to the "new" faster Thunderbolt. Sure, it's PCI-E, but 95% of consumers don't know, care, or need that capability. They buy on price and availability, plain and simple.

One of the security failures of Firewire was that it provided direct access to memory. In other words, a malicious external device could gain complete control of the computer. Having your peripheral interface be PCIe is just as bad. USB, for all its overhead, is still more secure (assuming you finally fix some of the stupid Windows autoexecute bugs).

Same niche market that Apple made popular ;)... it was Apple who made USB popular, with their first iMac. USB was also originally an Intel project, and did hardly anything for 5 years... then came the first iMac, and suddenly almost overnight USB was hot... it took PCs about 2 years to catch up, moving over from PS/2.

Correlation is not causation. What really made USB popular was when 2.0 came out, which was developed mostly by Intel and HP. At that point you could connect hard drives and get reasonable speeds, and hardware costs for slow devices like keyboards dropped too as HP found a way to do the timing required for USB 1.1 cheaply.

I had it with Windows 95 OSR2. As usual, Apple fanatics reinvent history by claiming Apple is responsible for innovation they clearly aren't responsible for. They did not develop the spec, did not develop the prototypes, did not develop most of the market, ad nauseam.

Since it's essentially a PCI bus extension, you can add an external PCI chassis attached via Thunderbolt without needing special drivers, and in theory you can do things like add additional GPUs and arbitrary PCI devices to your desktop, way beyond the expandability of your physical motherboard's or primary chassis' form factor.

There's really no way to accomplish something like that using USB, at least not without complicated specialized drivers being developed.

Adding external GPUs is a pretty niche application though, and of course you have to supply them with 300W+ for a high end one so that means yet another power brick. For external storage the performance gains are unlikely to be enough for most people to care and spend the extra money, especially once USB gets its shit together and provides enough power to run HDDs without an extra power supply.

Security is also a problem with Thunderbolt. Like Firewire, any Thunderbolt device has full and unrestricted access to system memory.

To my mind that's where Apple went wrong with the new Mac Pro. Looks pretty, no internal expansion. So you have this nice object on your desk - with a bunch of trailing wires to your storage, optical drive, card readers etc...

If it had an internal optical drive, multiformat card reader (SD and CF at least) and room for at least one or two extra drives (hot-swappable would be nice), it would be a great machine. I'd imagine more people would find that useful than the dual graphics cards. I know I'd prefer it.

Except that USB is the badly designed but good-enough-for-average-users standard. Like choosing IDE over SCSI, it was not a decision made on the technical merits but by low-margin motherboard manufacturers. The problem USB has won't be fixed with higher speeds; it'll still be a master-slave polling architecture and have latency issues.

I figured that all along. It took off on Apple hardware, with almost no pickup on normal PCs. That has finally started to happen a little - some upper end motherboards have 1 or 2 Thunderbolt ports now, and Asus has an add-on board for a few others - but it is really a niche thanks to its odd hardware requirements and lack of early adoption outside of Apple.

The real problem with Thunderbolt is that it is niche as designed, while USB is a general-purpose interface.

Thunderbolt is a slower-speed (because copper) variant of Intel's "Light Peak" interface... which has lots of potential because it's optical. I think doing it first over copper was a rational stepping-stone to fiber, but the problem is that it doesn't seem to be general-purpose. Instead it was made as a carrier for faster versions of existing standards: PCI and DisplayPort. This is a big limitation.

I figured that all along. It took off on Apple hardware, with almost no pickup on normal PCs. That has finally started to happen a little - some upper end motherboards have 1 or 2 Thunderbolt ports now, and Asus has an add-on board for a few others - but it is really a niche thanks to its odd hardware requirements and lack of early adoption outside of Apple. USB is easier to use, and at least up to 3.0 has been backward compatible with older devices. With an even faster option, as long as they don't screw something up, I don't see how USB could not continue to be the leading connectivity standard.

Try hooking an external SSD up to your machine via USB3 and then via Thunderbolt and you'll see why Thunderbolt is desirable if you are transferring large amounts of data: http://gizmodo.com/5980157/thu... [gizmodo.com]. Take a look at the "Time to write 16.9 GB of data" row in the table at the bottom and imagine you are transferring 300, 400 or 500 GB. There is about 250 GB of data on the SSD in my MacBook Pro; large amounts of that data can change frequently, meaning long backup times, and cutting the time it takes to write that stuff out to disk in half is a major bonus. The problem Thunderbolt has had is not just backwards compatibility, i.e. that there are so many USB 3 devices out there that it is going to take a looooong while to put a dent in the USB monoculture (as you correctly pointed out). Thunderbolt devices have also had a tendency to be more expensive, which didn't help either, nor did the fact that until now you have only really benefited from Thunderbolt when using SSDs, which are also expensive, which just aggravates the cost problem. When the USB 3 alternative is 2-3 times less expensive than Thunderbolt, the choice for the consumer is obvious. If there is going to be a USB standard that is comparable in speed to Thunderbolt, backwards compatible with all the old USB2 and USB3 devices, and that has a better connector, Thunderbolt is doomed. Intel should have pushed Thunderbolt way more aggressively, i.e. handed out Thunderbolt product licenses liberally, provided motherboard and peripheral manufacturers with incentives, or even sold Thunderbolt chips at cost.
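The back-of-the-envelope math is easy to sketch. The sustained-throughput figures below are illustrative assumptions for the sake of the arithmetic, not the Gizmodo benchmark numbers:

```python
# Rough transfer-time comparison for a large backup.
# The sustained-throughput rates are illustrative assumptions,
# not measured benchmark results.

def transfer_time_s(size_gb: float, throughput_mb_s: float) -> float:
    """Seconds needed to move size_gb gigabytes at throughput_mb_s MB/s."""
    return size_gb * 1000 / throughput_mb_s

backup_gb = 250          # size of the data set being written out
usb3_mb_s = 200          # assumed sustained USB 3.0 rate
thunderbolt_mb_s = 400   # assumed sustained Thunderbolt rate

for name, rate in [("USB 3.0", usb3_mb_s), ("Thunderbolt", thunderbolt_mb_s)]:
    minutes = transfer_time_s(backup_gb, rate) / 60
    print(f"{name}: {minutes:.1f} minutes")
```

Halving the write time matters a lot more at 250 GB than it does for the thumb-drive-sized transfers most people do, which is exactly why this stays a niche argument.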

Thunderbolt isn't going to replace USB in all cases, but Thunderbolt isn't about the speed. It's about the protocol. Thunderbolt is basically PCI-E over a wire. Can you connect a GTX 780 Ti (http://techreport.com/news/26426/thunderbolt-box-mates-macbook-pro-with-geforce-gtx-780-ti) with USB 3.1? No? Not really a replacement then. Same goes for any other device that has traditionally been a PCI-E card. Or, you know, you can get an adaptor (http://www.sonnettech.com/product/echoexpressiii.html) and directly connect a PCI-E card.

Speed-wise, Thunderbolt is evolving too. At this rate there isn't much of a chance of USB 3.1 catching Thunderbolt. As the OP mentioned, Thunderbolt is still ahead of USB 3.1, and 40 Gbps Thunderbolt is coming soon (http://www.extremetech.com/computing/181099-next-gen-thunderbolt-details-40gbps-pcie-3-0-hdmi-2-0-and-100w-power-delivery-for-single-cable-pcs). But again, even if USB catches Thunderbolt, or both become fast enough, the protocols and designs of the connections make them entirely unsuitable for each other's uses (you wouldn't connect a mouse and keyboard to your PCI-E bus directly via Thunderbolt).

Is there a real use case for connecting a PCI-E card to a system via an external port? The link you showed was basically an enthusiast/hobbyist novelty. If I actually need that sort of graphics power (gamers or CAD), I'm probably using a gaming rig or a workstation, which both have PCI-E slots in the case. I can't imagine what other sort of PCI-E cards I'd be carrying around with my laptop.

Is there a real use case for connecting a PCI-E card to a system via an external port? The link you showed was basically an enthusiast/hobbyist novelty. If I actually need that sort of graphics power (gamers or CAD), I'm probably using a gaming rig or a workstation, which both have PCI-E slots in the case. I can't imagine what other sort of PCI-E cards I'd be carrying around with my laptop.

The point isn't to make PCI-E cards portable. It's to make it so you only need one machine. Why buy a desktop when you can simply plug the PCI-E cards straight into your laptop? You COULD buy a desktop with a bunch of PCI-E slots, but you don't need to now. Why buy a redundant CPU with a redundant motherboard just to drive a few PCI-E cards?

And if you're a pro with a desktop, and you run out of PCI-E slots, do you simply buy a whole new machine? Thunderbolt can drive six PCI-E devices per bus (http://www.macworld.com/article/2146360/lab-tested-the-mac-pro-daisy-chain-challenge.html). Most desktops don't have six PCI-E slots total.

A lot of pros are adopting Thunderbolt because it allows them to use the devices that used to require a desktop quickly and easily with a laptop, and they can reduce their machine count by one. Thunderbolt doesn't need to displace USB because it has a niche that USB effectively can't replace.

Why buy a desktop when you can simply plug the PCI-E cards straight into your laptop?

What PCIe cards are you plugging in again? Graphics cards? You still have yet to demonstrate that it is not a novelty. I have never seen a CAD setup like that. Nor have I heard of a gaming rig that uses a laptop CPU but has an external graphics box. Maybe you're right and it will be all the rage in CAD houses.

And if you're a pro with a desktop, and you run out of PCI-E slots

You're kidding, right?

A lot of pros are adopting Thunderbolt because it allows them to use the devices that used to require a desktop quickly and easily with a laptop, and they can reduce their machine count by one.

What PCIe cards are you plugging in again? Graphics cards? You still have yet to demonstrate that it is not a novelty. I have never seen a CAD setup like that. Nor have I heard of a gaming rig that uses a laptop CPU but has an external graphics box. Maybe you're right and it will be all the rage in CAD houses.

I could go on, but really the answer is "every single PCI-E card that exists." Or "every single PCI-E card that is important to professional users; the fact that you don't know about it doesn't mean it doesn't exist."

Of those examples, they are still mostly in the video accelerator / transcode acceleration area, and a couple have USB 3.0 / SS versions. Outside of the die-hard MBP/MP users, anyone with a non-Apple laptop who works in industries where such hardware is necessary will have a dedicated render station to run those cards. You seem to forget that a MBP is going to have CPU, RAM, and I/O buses which simply can't match a regular desktop, much less server-class workstation motherboards. The other part that you are ignori

Look buddy, I don't doubt that there are PCIe cards that are useful to professionals. What I doubted was the desire to hook them up to a laptop. This post [slashdot.org] happily provided one example, so I clearly stand corrected. I still don't buy your premise that Macbook Pros with external boxes for these sorts of things are going to be common.

Just so you know, the second link in your list shows a (non-PCIe card, but rather meant-to-be-external) device available with either a Thunderbolt or USB3.0 interface. There i

Probably not for most people, but I do it all the time on film sets. Rather than carry a workstation, you can just carry a laptop and a Thunderbolt chassis with a RED ROCKET card for playback and transcoding. On location, this is a lifesaver.

The first? There are any number of "docking station"-style solutions that are less specialized and therefore legitimately useful to even more people - the primary one being the one integrated into Apple's Thunderbolt display (but there are cheaper solutions from Belkin, Sonnet, Matrox, CalDigit, etc). Get home, plug your laptop in, and with that one connector it instantly has access to your 30" display(s), gigabit ethernet, and your USB 3, Firewire, and other Thunderbolt peripherals (and the speakers, mic, and webcam built into the display, too). For a laptop, Thunderbolt can be remarkably useful. On a desktop less so IMO.

Exactly - the cost of the external enclosure alone is more than a decent desktop PC. Not to mention the Thunderbolt speed limits its performance to basically the level of a much cheaper video card, anyway.

The external enclosures are expensive because they're a niche item. They're manufactured in low volume and sold to a 'pro' audience with deep pockets.

In reality, Thunderbolt controllers aren't all that expensive [mouser.com]. Even if an external GPU cost $75 or $100 more than the internal equivalent, it would still be a great way to upgrade an Ultrabook, or a Steam box, or even a cheap name-brand desktop.

I've been thinking hard about a cable that will bring data to my CPU with the lowest latency. At the other end of the cable would be a guitar with several A/D converters, one for each pickup. Including piezos, that might add up to about ten 192 kHz/32-bit signals. That's still not a tremendous amount of bandwidth, but latency is much more important in this application. I don't think there is any dispute that the lowest-latency lane to the CPU in current PCs is over PCIE. If Thunderbolt is PCIE over a wire, it would be a natural technology to finally modernize the electric guitar for the digital age. Well, a guy can dream!
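For what it's worth, the raw bandwidth of that pickup array really is tiny. Assuming ten channels at a 192 kHz sample rate and 32 bits per sample:

```python
# Aggregate bandwidth of a multi-pickup guitar ADC array.
# Assumptions: ten channels, 192 kHz sample rate, 32-bit samples.
channels = 10
sample_rate_hz = 192_000
bits_per_sample = 32

bandwidth_bps = channels * sample_rate_hz * bits_per_sample
print(f"{bandwidth_bps / 1e6:.2f} Mbit/s")  # 61.44 Mbit/s

# Even USB 2.0's 480 Mbit/s signaling rate dwarfs this;
# the real constraint is latency, not throughput.
```

So the interconnect choice here is entirely about how fast a sample can reach the host, not about moving bits in bulk.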

To do what with, exactly? PCIe doesn't bring the data straight to your CPU, it brings it straight to your RAM, and then an interrupt will fire to signal the data has been transferred. You've then got the latency the OS takes to service the interrupt, after the latency of transferring an entire packet of data.

If you want low latency to do "something" with your signal, you buy a $2 DSP chip and connect that to the ADC.

If you need predictable latency and guaranteed bandwidth, USB already has a mode called "isochronous".

An external GPU is what will get me back on a laptop. I have a 'gaming rig'. There used to be all manner of reasons why laptops were worse - screens, speed etc. Now there's just the GPU.
I'd love to be able to have a laptop I could lug around, get home and dock with a proper mouse, a proper keyboard and a GPU plugged into a great big monitor.

Thunderbolt isn't going to replace USB in all cases, but Thunderbolt isn't about the speed. It's about the protocol. Thunderbolt is basically PCI-E over a wire.

Bad idea for security reasons. Any device plugged in can read and write memory. That's not a good thing. At least with USB, it's just packets to and from the driver.

FireWire had the same problem. Most FireWire PC interfaces allowed limiting the hardware's ability to accept packets that read and wrote memory (there were address limit registers). The default settings for Linux left memory wide open to FireWire attack. (Under Linux, all of memory was open on 32-bit systems, but because this was a bug, not

I have a really hard time caring about "up to 100 watts of power depending on the cable version", mostly because of the "depending on the cable version" part of the statement.

How is this different from DVI, which might or might not have multichannel audio, might or might not be analog, might or might not support 5-channel digital sound, etc., etc.?

One thing Thunderbolt has going for it is that a cable is a cable, and you don't have to worry about it. If you want negotiated power supplied over USB, fine, but don't make me search my cardboard box for the "most sincere USB 3.1 cable". Thanks.

and more to do with finding one with thick enough wires - the flimsy little cables which were specified to carry 0.75 watts (150mA @ 5V) for USB 1.0 would probably melt in a few seconds if made to carry 100 watts...
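The current involved makes the point concrete. The voltages below follow the general USB Power Delivery approach of negotiating higher bus voltages; the exact set of profiles here is an assumption for illustration:

```python
# Current required to deliver 100 W at various bus voltages.
# Higher negotiated voltages (as in USB Power Delivery) keep the
# current, and therefore the required wire gauge, manageable.
power_w = 100
for voltage_v in (5, 12, 20):
    current_a = power_w / voltage_v
    print(f"{voltage_v:>2} V -> {current_a:.1f} A")
# At 5 V you'd need 20 A, hopeless for a thin cable;
# at 20 V only 5 A, which ordinary conductors can carry.
```

That's why the 100 W figure only makes sense with voltage negotiation, not by pushing more current through a legacy 5 V cable.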

Currently you're limited to a pathetically low current meaning your tablet may take most of the night to charge. You grab the wrong cable? No worries, your tablet will still take all night to charge. You grab the right one and you have a faster charging rate. You are no worse off than you used to be.

On the flip side, most devices come with their own cables, and the only cables which are not going to be compatible with higher power will be flimsy. There's

A niche technology, used mainly by Apple fans. Part of it is just lack of need and increased cost. Most devices work just fine on USB, and Thunderbolt, being a PCIe bus more or less, has more hardware requirements on the device side than USB.

However, it is also because of Apple's meddling. Apple got involved with it back when it was an Intel project called Light Peak and paid Intel to influence the development. They wanted an exclusive on it, since Apple loves being "first", for a year, and convinced Intel to

The problem with all USB to this point is the fact that it has been largely CPU-bound. PCIe, Thunderbolt, SCSI and FireWire are DMA devices; not without their risks, but with proper management the performance is leaps and bounds above USB - sure, it costs a little extra, but that point quickly becomes moot when you see the benefits.

USB is fine for mice and keyboards and some other low-bandwidth and very cheap things. FireWire has been doing low-latency audio and video (high-res) since its inception. Even full-speed USB2 on a modern computer has difficulties getting a VGA frame buffer to work properly, while studios have been able to live-edit multiple streams using FireWire since the PowerMac G5.

CPUs have gotten really, really fast, and for many things are seriously undertasked. Like I said, I'm not knocking Thunderbolt for certain uses, but they are limited. USB3 on a modern system is capable of being "good enough" for most things. Audio? No problem, even USB2 has that licked. Video? Yep, USB3 can handle that. Data transfer? Well, it is fast enough that even fast sticks are slower than it, so no big deal. Network? It'll do 1Gbps, no issue.

The problem with all USB to this point is the fact that it has been largely CPU-bound. PCIe, Thunderbolt, SCSI and FireWire are DMA devices; not without their risks, but with proper management the performance is leaps and bounds above USB - sure, it costs a little extra, but that point quickly becomes moot when you see the benefits.

USB is fine for mice and keyboards and some other low-bandwidth and very cheap things. FireWire has been doing low-latency audio and video (high-res) since its inception. Even full-speed USB2 on a modern computer has difficulties getting a VGA frame buffer to work properly, while studios have been able to live-edit multiple streams using FireWire since the PowerMac G5.

This is largely a myth. USB has supported DMA since the beginning, so the transfers themselves don't use any processing power at all. What required processing power was periodically reading the status of transfers and creating a new schedule of things to be done. This was done 1024 times per second - not really a problem.

What USB (even with recent enhancements) has a problem with is latency - as transfers are mostly done at a fixed schedule, and that schedule is updated relatively seldom, there are many wa
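To put rough numbers on the latency point: assuming the classic 1 ms full-speed USB frame interval (the roughly 1024 schedule updates per second mentioned above is the same order of magnitude), frame scheduling alone adds up to a millisecond of waiting:

```python
# Worst-case added latency when transfers run on a fixed frame schedule.
# Assumption: USB full-speed 1 ms frame interval; high-speed USB uses
# 125 us microframes, which shrinks but doesn't eliminate this wait.
frame_period_ms = 1.0

# Data that becomes ready just after a frame boundary waits almost a
# whole frame for the next transfer opportunity; on average it waits half.
worst_case_wait_ms = frame_period_ms
average_wait_ms = frame_period_ms / 2
print(f"worst case ~{worst_case_wait_ms} ms, average ~{average_wait_ms} ms")
```

A millisecond is nothing for a file copy, but it is exactly the kind of jitter that matters for low-latency audio, which is why the scheduling model, not the bit rate, is the complaint here.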

That myth about Apple getting a one-year exclusive deal on Thunderbolt was debunked by Intel the day after its release [pcmag.com], three years ago. On top of that, Thunderbolt could never work as a standard PCI add-on card, because it is lower-level and needs to expose/act as an entire PCI bus itself. Asus makes add-ons [techpowerup.com] for certain of their motherboards that have an additional specific Thunderbolt header, though - and DisplayPort is optional there, busting yet another one of your claims.

The thing I dislike most about USB is the Intel style slathered all over it (or at least their host controller interfaces). The next problem is taking something designed only for low-speed devices and twisting it so that it can work with full and high speed (polling at very infrequent intervals and a master/slave interface is OK for a mouse or keyboard, but not for a high-speed device).

Thunderbolt 2 allows me to connect a 4k DisplayPort screen (or daisy chain two lower resolution DisplayPort monitors). Its connector is the same as mini-DisplayPort. It's small and convenient. Apple fit two TB 2 buses next to each other on my 13" MacBook Pro. Nice. Very high bandwidth, PCIe.

I don't want to plug a keyboard into this bus, because it's overkill. Thunderbolt will probably never have any cost-effective way to do a hub/star type topology. For general-use, lower-bandwidth (haha, 1 gigabit is low bandwidth now!) peripherals I need USB. And my MacBook has that too. I wouldn't want it any other way.

That said, USB 3.0 seems like a ball of hurt compared to the USB 1.0/1.1/2.0 transitions.
Just look at the ads for USB 3 hubs. Most of them state which chipset revision they use, so you can look up whether or not your motherboard / OS will have difficulty with them. I built a FreeBSD 9.1 file server using usb 3 / usb 3 docks, but I failed them all back down to using their 2.0 interface due to persistent flakiness/dropping off the bus type issues. Rock solid on USB 2.0. YMMV, but I hope that USB 3 gets over its growing pains soon.

I built a FreeBSD 9.1 file server using usb 3 / usb 3 docks, but I failed them all back down to using their 2.0 interface due to persistent flakiness/dropping off the bus type issues.

If you look at MacZFS you will notice that ZFS over a USB bus is garbage. Far too many problems - developer says to not even bother reporting the bugs. And in my experience, FreeBSD is not much different in this regard. Had major problems with ZFS over USB while UFS appears to work fine. Use a different connection, like eSATA or Firewire, and ZFS starts to work again.

I only mention this because it is quite possible that USB was working fine. Glitches / delays / disconnects, regardless of which layer

I considered eSATA, but that is a spiked ball of hurt, presuming you want to connect more than one drive per port via multipliers. Thanks for the suggestion, but it really was the USB 3.0... devices would drop off the bus (no longer present in usbconfig) on 3.0. Nary an issue with the exact same hardware on 2.0.

I think it's due to hacks like USB 3.0 hubs apparently also having USB 2.0 as a separate bus/hub logically, rather than attempting to unify the device tree somehow. Given some other comments here, ther

DisplayPort lets you connect a 4k DisplayPort screen, or multiple streams (specifically with 1.2 MST). Thunderbolt is not required. It's fine that it is a Thunderbolt connector as well, but don't get confused here. A DP connector coming off a regular video card in a desktop will drive the monitor just the same. It is the DP 1.2 signaling that matters, not the PCIe lane of Thunderbolt.

If all you are doing with your Thunderbolt connector is hooking up displays, that's an argument AGAINST Thunderbolt since you aren't using it, you are just using DisplayPort.

If all you are doing with your Thunderbolt connector is hooking up displays, that's an argument AGAINST Thunderbolt since you aren't using it, you are just using DisplayPort.

No, that's incorrect logic. Should we fill your unused PCIe slots with cyanoacrylate simply because you aren't using them right now?

Thunderbolt gives me a high speed expansion bus while conveniently not requiring a separate connector to do so. It duplexes with DisplayPort, for which I had immediate use.

Come on, I expect better arguments than this. Here... if you really hate Thunderbolt, have already made up your mind, and are just searching for stones to fling at it, then why not refer to the DMA attack vector?

My point is simply that your argument for Thunderbolt isn't actually an argument for it. You like DisplayPort, not Thunderbolt. An argument for Thunderbolt is if you are using one connector for display and for other things. If you are just using it for display, well then it could be DP for all you'd know/care.

That's the thing: Doesn't matter how good it looks on paper, doesn't matter how technically perfect it is, what matters is if it gets used in a meaningful way, such that people want to buy devices with

My point is simply that your argument for Thunderbolt isn't actually an argument for it.

Haha, yes, I understood the point I *thought* you were trying to make, but that wasn't what you said. It isn't my job to restate your conclusions correctly.

You are correct that my citation of using the DisplayPort functionality isn't support for Thunderbolt per se. 1394 was certainly niche, but it was great when it came to bulk data transfer, such as pulling video from cameras.

I like Thunderbolt because I want a high speed PCIe type bus, and I believe the approach is more elegant than slots on a motherboard. I'

I design USB3 H/W... what. a. piece. of. shit. I have truly given up hope of engineering anything that will ever work universally; even Intel's interfaces, which you would like to believe would be a model reference design, look like crap when you plug them into a gizillion-dollar Agilent USB3 analyzer. Should I be surprised? Probably not, USB has never exactly been the premium interface, has it? Firewire didn't go away because USB was technically superior, that's for sure. Thunderbolt just friggin' works, day in, day out, incredible and reliable performance. Sure, cables are expensive; they have all sorts of clever active electronics... because... that's what it takes to make 10G in a consumer application work... not a $1.99 piece of injection-molded crap from god knows what Asian hell chemical works. In fact Thunderbolt's worst problem is... Intel... who seem to have a bizarre attitude towards people who want to buy components from them to make peripherals... I honestly don't get it.

This whole mess started in USB2.0; its only saving grace was that it is low enough bandwidth not to get trashed by poor hardware design.

Just looking at the specs for differential impedance of traces gets you a trace over 40 mil wide and only a 5 mil gap between them on a standard 2-layer circuit board. Effectively none of the cheap USB hubs conform properly to the differential signalling requirements, as it's effectively impossible to achieve on the economic 2-layer PCB used by all cheap hubs. Then there are some who just don't seem to care about keeping signal lengths similar or any of that other unimportant stuff, and you end up with absolute garbage when you hook an analyser to it.
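A rough sanity check with the standard IPC-2141-style microstrip approximations shows why a 2-layer board forces those geometries. Everything here is an assumption for illustration (FR-4 permittivity, 1.6 mm dielectric, 1 oz copper), and these closed-form formulas are only approximate:

```python
import math

def microstrip_z0(h_mm: float, w_mm: float, t_mm: float, er: float) -> float:
    """Single-ended microstrip impedance (ohms), IPC-2141-style approximation."""
    return 87 / math.sqrt(er + 1.41) * math.log(5.98 * h_mm / (0.8 * w_mm + t_mm))

def edge_coupled_zdiff(z0: float, gap_mm: float, h_mm: float) -> float:
    """Edge-coupled differential impedance (ohms), common approximation."""
    return 2 * z0 * (1 - 0.48 * math.exp(-0.96 * gap_mm / h_mm))

# Assumed 2-layer board: 1.6 mm FR-4 (er ~4.5), 1 oz copper (0.035 mm),
# 40 mil (1.016 mm) traces with a 5 mil (0.127 mm) gap.
z0 = microstrip_z0(h_mm=1.6, w_mm=1.016, t_mm=0.035, er=4.5)
zdiff = edge_coupled_zdiff(z0, gap_mm=0.127, h_mm=1.6)
print(f"single-ended ~{z0:.0f} ohm, differential ~{zdiff:.0f} ohm")
```

With the full 1.6 mm dielectric between trace and plane, even huge 40 mil traces land only in the neighborhood of the ~90 ohm USB differential target, which is the point: cheap 2-layer stackups leave almost no room to hit the spec.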

I'm not surprised USB3.0 is hard to design for. Technically, most people failed at USB2.0.

I design USB3 H/W... what. a. piece. of. shit. I have truly given up hope of engineering anything that will ever work universally; even Intel's interfaces, which you would like to believe would be a model reference design, look like crap when you plug them into a gizillion-dollar Agilent USB3 analyzer. Should I be surprised? Probably not, USB has never exactly been the premium interface, has it? Firewire didn't go away because USB was technically superior, that's for sure. Thunderbolt just friggin' works, day in, day out, incredible and reliable performance. Sure, cables are expensive; they have all sorts of clever active electronics... because... that's what it takes to make 10G in a consumer application work... not a $1.99 piece of injection-molded crap from god knows what Asian hell chemical works. In fact Thunderbolt's worst problem is... Intel... who seem to have a bizarre attitude towards people who want to buy components from them to make peripherals... I honestly don't get it.

I can relate. The USB 3.0 spec is a committee beast. It is difficult to read it and understand very clearly how you are supposed to put it into practice, and that's a problem for someone like me who is writing an implementation from the ground up.

Probably the biggest gripe is the compliance tests. The idea is that every manufacturer goes to the USB Implementers Forum and runs standardized tests on their widget so they can say "hey, it passed the tests, it's ready for the market." But in real life, the t

DMA just means that you can program an I/O transaction to copy the data to the location you specify, rather than have the CPU copy each byte. Most USB 2.0 controllers use DMA, but they cannot write to any location, and certainly devices plugged into USB have no ability to dictate where their data goes.

Even with bus-master devices, DMA is not inherently insecure. The copying is always done by the local host controller, never by the external device. The problems arise if the host controller is naïve.
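A toy model of the distinction the two comments above are drawing (all class and method names here are hypothetical, invented for illustration; no relation to any real host-controller API): the copy is performed by the host side, and an IOMMU-like allowlist decides which memory an external bus-master device may target.

```python
# Toy model: DMA copies happen in the host controller, and an
# IOMMU-like allowlist limits where an external device may write.
# Hypothetical API, for illustration only.

class HostController:
    def __init__(self, memory_size):
        self.memory = bytearray(memory_size)
        self.allowed = []  # (start, end) ranges a device may target

    def map_region(self, start, length):
        """Grant the device DMA access to one buffer (IOMMU-style mapping)."""
        self.allowed.append((start, start + length))

    def dma_write(self, dest, data):
        """Device-initiated write: refused unless it lands in a mapped region."""
        if not any(s <= dest and dest + len(data) <= e for s, e in self.allowed):
            raise PermissionError("DMA outside mapped region")
        self.memory[dest:dest + len(data)] = data  # copied by the controller, not the CPU

hc = HostController(memory_size=4096)
hc.map_region(start=1024, length=256)
hc.dma_write(1024, b"payload")            # fine: inside the mapped buffer
try:
    hc.dma_write(0, b"overwrite this")    # blocked: device can't dictate the target
except PermissionError as e:
    print("blocked:", e)
```

A "naïve" controller in this model is simply one that skips the allowlist check, which is the scenario behind the classic Firewire DMA attacks.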

I thought USB had to pass through the CPU/driver. Firewire had device DMA access, and PCI is, well, as low as you can go. I've read about Thunderbolt 3, which is also being worked on; since both come out of Intel, you can expect Thunderbolt to stay ahead of USB in terms of speed and flexibility. I don't see much need for high power output when Thunderbolt devices like displays will probably need their own power supply anyway.

I would be the first to agree: Thunderbolt is technically better than USB3. However, that is not the point.

What is the advantage of USB? Simple: the connector. I can plug a USB1 device into any USB3 port and it works. The reverse is also true, albeit at a pretty slow transfer rate.

The point is, the USB plug is ubiquitous, while Thunderbolt is already planning to change the connector again. That means buying adapters. That also means that the next motherboard I buy most likely won't have Thunderbolt on it. Whic

That we keep talking about the two in language that exactly describes the two, but we completely ignore the language? EVERY spec for USB refers to the "up to" speed and quotes the maximum theoretical burst transfer rate, which is sustainable for only fractions of a second in host-to-single-peer communication. Thunderbolt's speed is the speed, period. One peer or 16 peers, it doesn't matter: you get 20Gb/s every second after every second. USB has never achieved that and likely never will.

This was true of Firewire vs. USB as well: USB claimed "up to 480Mb/s" but could never sustain that for any humanly perceptible length of time. Firewire 400 was flatly 400Mb/s. Firewire didn't advertise a theoretical maximum speed that you could hit once in a while; it was a real-world, measurable throughput when you were copying files.

So as long as people are ignorant enough to fall for marketing hype instead of actual useful data, USB will continue to dominate (and people will continue to purchase cars based solely on HP ratings).

Thunderbolt's speed is PCIe 2.0 speed, which uses 8b/10b encoding, so 20Gb/s is only 16Gb/s of data: 20% overhead at the physical layer. It can't deliver the claimed speed in terms of data transferred. If they moved to PCIe 3.0, that gets much better with 128b/130b encoding.
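The arithmetic is easy to check: line rate times encoding efficiency gives the usable bandwidth. This only accounts for line-code overhead; packet and protocol overhead eat more on top.

```python
def payload_gbps(line_rate_gbps, payload_bits, coded_bits):
    """Usable data rate after line-code overhead."""
    return line_rate_gbps * payload_bits / coded_bits

print(payload_gbps(20, 8, 10))     # 8b/10b at a 20 Gb/s line rate: 16.0 Gb/s
print(payload_gbps(20, 128, 130))  # same line rate with 128b/130b: ~19.7 Gb/s
print(payload_gbps(10, 128, 132))  # USB 3.1 SuperSpeed+ uses 128b/132b: ~9.7 Gb/s
```

Note that USB 3.1's 10Gb/s Gen 2 rate already moved to a 128b/132b code for exactly this reason, cutting the physical-layer overhead from 20% to about 3%.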

Aside from the technical advantages that people keep on bringing up, one of the main non-technical advantages that Thunderbolt has is its certification process. Any USB chipset that is faster than USB2's theoretical speed is certified as USB3, whereas in order to get certified as Thunderbolt 1 or 2 you must actually reach the advertised speed.

When you buy a USB device (unless it's from a reputable manufacturer such as Intel), its actual speed is usually an order of magnitude worse than the advertised speed.

Minor nitpick: the Thunderbolt connector is not symmetrical. The writer must be confusing it with Apple's Lightning connector for iDevices, from which the new USB connector probably copied this feature.

(Actually, I believe the Thunderbolt connector is more or less symmetrical with respect to the x-axis, but this is undoubtedly not what the writer meant.)

Same speed for an external hard drive: 30% CPU consumption for USB, 2% for Firewire. I also know my USB 3 dock very well; it sometimes decides to connect at USB 2.0 speeds for no particular reason. And don't get me started on the micro-USB connector that breaks if you sneeze in its general direction. So I hope that TB takes its sweet time to die out so I can get some use out of it.

I see a lot of concern here about backwards compatibility with any new interface. Why are we really concerned about this?

Your brand new server with quad-port Gig-E interfaces still auto-negotiates down to 10Mb speeds. Why? Because you might hook up your new $10,000 server to a $20 network hub you bought off eBay? Uh...no.

Apple had hundreds of millions of devices in the market with the old sync connector. Then they came out with an all-new connector, alienating entire product lines' worth of accessories. Did they go bankrupt? Was there some massive revolt in the industry? No, not quite.

My point is we should learn to move on. Stop worrying about backwards compatibility to ensure that we address scenarios that rarely happen, if ever. What exactly was Thunderbolt compatible with when it came out? Or Firewire? Didn't stop them from innovating.

Besides, there's a damn good chance that every single piece of computing hardware in your hands today will be replaced within 3-5 years, so I fail to see why we care even from a logistical standpoint. You won't even have the hardware in your hands to worry about backwards compatibility, and vendors will always see replacement as THE solution, so don't expect many long-term favors from them either.

Given Apple now has almost 15% of the laptop market (and way more if you count tablets, like some silly analysts do), it's clearly not a niche any more.

Especially since it's the top end of the market. Like iPhones, Apple may not have the #1 market share, but their customers spend a *lot* more per device than other hardware owners, which is plenty of motivation for peripheral manufacturers to build it into their high-end gear...