Apple’s newest headphones, the AirPods, are not actually headphones: they’re aural implants. Absent any wires, AirPods are so unobtrusive that you may never take them out. Like the eyeglasses on your face, they’ll sit in your ears all day, and you’ll remove them only at night.

For Apple, this is more than just an incremental update to their headphone line. This is a paradigm shift, a Jedi move in which the seam between human and computer disappears.

At first blush, AirPods seem the same as today’s headphones, except with the wires gone. Sure, we’ll use them just like the old ones, to make phone calls and listen to music. But the crux of what’s interesting here is what we’re doing when we’re not using them. Without the lanky wire getting caught on clothes, doorknobs, and backpacks, persistently begging to be wound up and put away, there’s no longer a compelling reason to take them out when you’re done using them. So we won’t.

AirPods are default in, as opposed to traditional wired headphones, which are default out. Much of the criticism lobbed at the design is that you’ll put them down and lose them — but what if you never put them down in the first place?

When you’re expecting an important SMS during a meeting, instead of checking your phone (or watch), you’ll hear the message discreetly whispered in your ear. Imagine having lunch with a friend, and your headphones whisper "your next appointment is in 10 minutes." Message received, and you haven’t even blinked. Imagine sharing a glass of wine with your romantic partner, and the voice in your earbuds effortlessly lets you know that your dinner table reservation is ready. As though the maître d' just whispered it in your ear.

Unlike vision, hearing is nondirectional. Eyes can look at only one place at a time; you cannot keep eye contact while checking your phone for a text. But when you listen, you don’t have to point your ears anywhere. You can hear music on the radio while hearing the friend next to you talk. Because we can easily take in multiple channels of sound at once, an audio notification is a degree less invasive than a visual one. Apple’s AirPods are Google Glass done right: where Glass tried to deliver information passively through a visual channel, AirPods recognize that audio is a better channel for passive information. You can interact with it invisibly.

And let’s not forget the microphone is always there, too. Yes, we’ll probably be able to say things like "Siri, call an Uber." But that’s not the most creative use case. There will be entirely new uses we can’t possibly know now, just like nobody imagined Snapchat when the iPhone first launched. Imagine organic keyword matching that allows invisible use within a conversation: "That’s a great point, Rebecca—I’ll be sure to make a note to call Adam tomorrow about his new designs" and—voilà—the note is created without missing a beat. Or maybe it whispers contextual factoids, pings you when you say filler words like "um," "uh," and "y’know," or lets you tap to save the last 30 seconds of conversation so you can send a great punchline to a friend.

What makes AirPods a winner is that Apple will throw their whole ecosystem behind them. We’ll have a better Siri, we’ll have interesting interactions using the Watch as a controller, and we’ll see a flurry of apps we can’t even imagine. These aren’t just wireless headphones; this is the entire internet implanted in your ear.

This inversion, from default out to default in, is not actually new. People with hearing loss, out of medical necessity, wear hearing aids and take them out only to sleep or to change the battery. Where medical necessity has blazed a trail, consumerization is following. Technology has improved and prices have fallen to the point where millions of people can now install computers in their ears. And pretty soon, when implants are powered by blood sugar, we won’t even need batteries.

We’re seeing our prediction of humans as the next platform come true before our very eyes. From nootropics to sleep monitors to aural implants, we’re seeing the development of mainstream products that let us enhance our biological abilities significantly and in seamless ways. Things are going to get smaller, more personalized, and more natively integrated, equipping us to become better versions of ourselves.

What’s the next implant we’ll see in the mainstream? I predict that we’ll see a consumer product for continuous blood monitoring, for biomarkers such as cholesterol, glucose, and ketones. The value is enormous for a hook into our bodies that lets us seamlessly track macronutrient levels, the immediate and long-term effects of our diet and environment, and early signs of disease.

It’s not far off, either. Similar technology already exists for diabetics: there are products that are essentially a Band-Aid with a microchip in it, worn out of medical necessity to measure blood glucose levels. I recently bought one for personal use, and it’s admittedly new, limited in capability, and expensive, and it had to be shipped from the U.K. But it won’t be long before we see a cheap and easy consumer version. I predict we’ll see a continuous blood monitoring product make the jump to the mainstream within the next three years.

The fact that Apple is launching the first generation of mainstream aural implants makes this an exciting time for biohackers like myself.