By Paul Miller March 23, 2015

The best kinds of questions start with a "What if..." and end with everyone scratching their heads. The other day I launched a seemingly innocent inquiry into the nature of USB-C (what happens if you plug Apple's new MacBook into Google's new Chromebook Pixel?) and got a lot more than I expected.

What most people initially mistook for a dumb joke became a not-entirely-healthy investigation on my part into whitepapers, specs, and slide decks about USB-C. Because, as it turns out, there seems to be no trivial answer to that question.

USB-C is the hot new plug everyone is talking about. Most notably, both Apple and Google — within days of each other — have launched laptops that charge over USB-C. Apple, infamously, made USB-C the only plug of any kind on its new MacBook, aside from a lone headphone jack. Google was more generous, putting charging-capable USB-C ports on both sides of the laptop, plus a pair of old-school USB ports for the nostalgic among us.

These laptops are among the first devices on the planet to ship with (apparently) full implementations of USB-C, and they're arriving a good number of months before most experts expected to see USB-C on the market. I, not being a plug fanatic, had no idea what USB-C was.

One of Pixel's fancy USB-C ports.

There are a few important things that make USB-C special. First off, "USB-C" actually refers to the plug type. USB-C has a tiny new connector that, unlike every previous form of USB, can't be plugged in "wrong." There's no right side up. But more importantly (in my opinion), USB-C is the official plug of USB 3.1. USB 3.1 doubles USB 3.0's theoretical throughput, from 5Gbps to 10Gbps, which is only natural for a new USB standard, but what's a little crazier is the new "USB Power Delivery" specification.
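As a back-of-the-envelope illustration of what that doubling means, here's a quick sketch comparing transfer times at the two signaling rates. These are theoretical maximums; real-world throughput is always lower once encoding and protocol overhead are accounted for, and the 10GB file size is just a made-up example.

```python
# Rough transfer-time comparison at USB 3.0 vs. USB 3.1 signaling rates.
# Theoretical maximums only -- real-world throughput is lower.

GBPS = 1_000_000_000  # bits per second in one gigabit/sec

def transfer_seconds(size_gb: float, rate_gbps: float) -> float:
    """Seconds to move size_gb gigabytes at rate_gbps gigabits per second."""
    bits = size_gb * 8 * 1_000_000_000  # gigabytes -> bits
    return bits / (rate_gbps * GBPS)

movie_gb = 10  # a hypothetical 10GB file
print(f"USB 3.0 (5 Gbps):  {transfer_seconds(movie_gb, 5):.0f} s")
print(f"USB 3.1 (10 Gbps): {transfer_seconds(movie_gb, 10):.0f} s")
```

Halve the time, same file — that's the whole pitch for the speed bump.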

If you look at most USB cables, they have different connectors on each end. This is important because USB traditionally needs a host / peripheral dichotomy to work. Like, if you plug your scanner, or phone, or camera into your computer, your computer (with its standard USB plugs) is the host, and the device you plugged in (with its miniature or weird USB plugs) is the peripheral. These roles are negotiable for data transmission, but non-negotiable for power delivery... until now.

There's no right side up.

USB-C allows devices to negotiate automatically who charges whom. Power flow is bi-directional. This is important because the prototypical USB-C cord will have the same plug on both ends. It's also important because USB-C has been designed to carry far more power. Back in the day, you could usually get about 5 watts out of USB: plenty to charge a phone, or maybe power a tiny hard drive, but not much more. When Apple's iPad, which needs around 10 watts to charge, came out, PC manufacturers and wall-plug designers had to scramble to start putting out that much power over USB; even now, many computers have only one or two ports that can pump out that much. USB-C, meanwhile, is capable of 100 watts. The new MacBook, for instance, charges at 29 watts, which is probably blowing your iPad's mind right now.
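All of those wattage figures come from the same grade-school formula: watts equals volts times amps. Here's a small sketch running the numbers. The 20V/5A pairing is the spec's 100-watt ceiling; the other voltage/current pairs are plausible charger ratings I'm using purely as illustrations, not figures from any spec sheet.

```python
# Power is just volts x amps. A few illustrative operating points;
# only the 20 V / 5 A line is the USB-C spec's actual 100 W maximum,
# the rest are assumed example ratings.

def watts(volts: float, amps: float) -> float:
    """Electrical power in watts from voltage and current."""
    return volts * amps

examples = {
    "classic USB port":    (5.0, 0.5),   # ~2.5 W
    "phone charger":       (5.0, 1.0),   # ~5 W
    "iPad-class charger":  (5.0, 2.0),   # ~10 W
    "new MacBook charger": (14.5, 2.0),  # 29 W
    "USB-C maximum":       (20.0, 5.0),  # 100 W
}

for name, (v, a) in examples.items():
    print(f"{name:>20}: {watts(v, a):5.1f} W")
```

Going from 5 volts to 20 volts is what does most of the heavy lifting: quadruple the voltage and you quadruple the power without needing a thicker cable's worth of current.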

Just to be clear: USB-C doesn't require that every device sling 100 watts around; it just makes that possible. Much of the spec leaves implementation decisions up to device manufacturers. That's why my initial question is still a head-scratcher: I have no idea how Apple and Google have implemented USB-C. I know their laptops can accept a lot of power for charging, but how much power can they output, and how will they decide who gets power and who gives it?
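To make the "who charges whom" question concrete, here's a toy model of the decision. This is emphatically not the real USB Power Delivery protocol (which negotiates over the cable's dedicated signaling wire); the device names, wattage figures, and tie-breaking rule are all invented for illustration. It just captures the idea that two devices compare capabilities and elect a power source.

```python
# A toy model of USB-C power-role negotiation. NOT the actual USB
# Power Delivery protocol -- every number and rule here is assumed,
# purely to illustrate the concept of electing a power source.

from dataclasses import dataclass

@dataclass
class Device:
    name: str
    battery_pct: int          # current charge level
    max_source_watts: float   # how much power it could supply
    wants_charge: bool        # is it asking to be charged?

def negotiate(a: Device, b: Device) -> str:
    """Pick a power source: prefer a device that can supply power and
    isn't itself asking to be charged; break ties by battery level."""
    candidates = [d for d in (a, b)
                  if d.max_source_watts > 0 and not d.wants_charge]
    if not candidates:
        return "no power contract -- data only"
    source = max(candidates, key=lambda d: (d.max_source_watts, d.battery_pct))
    sink = b if source is a else a
    return f"{source.name} sources power to {sink.name}"

macbook = Device("MacBook", battery_pct=80, max_source_watts=15, wants_charge=False)
pixel   = Device("Pixel",   battery_pct=20, max_source_watts=15, wants_charge=True)
print(negotiate(macbook, pixel))  # MacBook sources power to Pixel
```

The real spec has to answer the same question, just with actual electrical handshaking, and each manufacturer gets latitude in how its device advertises what it can give and take — which is exactly why nobody could answer my MacBook-meets-Pixel question off the cuff.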

But some use cases are already becoming clear. Take a standard office laptop setup, for instance. To be productive, you usually need to plug in a power cable from a wall charger, a display cable of some kind, and a USB cable running to your monitor's USB hub, which in turn connects a mouse, keyboard, and perhaps an external hard drive. And, of course, that hard drive and monitor are each plugged into the wall as well. Power strip manufacturers love this. With USB-C you could plug one cable from a USB-C monitor into your laptop, and it would charge your laptop, work as the display connector, and communicate with the monitor's USB hub. It's not even crazy to imagine that during a power outage, your laptop's battery could send electricity back through the monitor's USB hub and keep your hard drive alive until you could shut it down safely. If you really wanted to kill your battery, you could probably power the monitor as well.

We'll need fewer plugs, fewer wall warts, and less hassle to get more done.

I can imagine other handy uses for USB-C, too. Maybe your friend's phone is dying but you have a full charge: plug the two into each other and share some of your battery life. There's also the fact that backup-battery makers can finally build external laptop batteries that don't require a million adapters; even a tiny external charger might give you the five extra minutes you need on your laptop, while providing hours of charge for a phone.

It's a brave new world, people. Apple's decision to put a single USB-C connector on the new MacBook might rub you the wrong way, but it's also a signal that in a future powered by USB-C we'll need fewer plugs, fewer wall warts, and less hassle to get more done.

And no, I still have no idea what will happen if you plug a MacBook into a Pixel over USB-C. But, thankfully, we don't have to throw all of our old USB stuff in the garbage right away: my friend Joanna Stern plugged a Chromebook Pixel's USB-C port into an old-fashioned MacBook Air, and it charged the Pixel without a problem, like a good old-fashioned USB "host" should. Baby steps into the future of plugging things.