Everyone understands what it means to own a plunger. Go to a store, buy the tool, take the physical object home, use it. No contract is required. If you give it away, all its plunging capabilities go with it. If it is stolen, it is lost. If you don't know how to use it, there is no customer service line to explain. There is no way to hack a plunger from eastern Europe or even from across the street. Modern Americans own most things the plunger way.

But consider the phone. The physical object of the phone is the center of an ecosystem that spans the Earth and reaches up to medium Earth orbit, where GPS satellites help companies pinpoint devices on the ground. To make calls or send text messages, one must have a contract with a service provider. These phones come with all kinds of restrictions on their physical capabilities. You may not take them apart. Depending on the plan, not all software can be downloaded onto them, not every device can be tethered to them, and not every cell-phone network can be tapped.

"Owning" a phone is much more complex than owning a plunger. And if the big tech players building the wearable future, the Internet of things, self-driving cars, and anything else that links physical stuff to the network get their way, our relationship to ownership is about to undergo a wild transformation.

We won't own almost anything the way we own our plungers.

* * *

But the coming transformation is about more than ownership. It’s about attempting to take a way of thinking (and selling) from one technological world and apply it to another. It’s about making things “smart”—which means, in effect, making them function like our smartphones.

Sometimes that pitch is more explicit. When it announced its first wearable computer early this month, Apple never called its Watch a “smartwatch.” (Such a clunky title.) Instead, it was all too happy to compare it to the iPhone. That’s partly because the iPhone is the company’s marquee product, the one that millions have warm-fuzzies for—but it’s also because turning the watch into a zone of smartphone-dom is exactly what Apple hopes to do.

And make no mistake: The Apple Watch is indeed Smart. At base, smartness means the thing has processing power onboard: it’s a little computer hooked up to a sensor or three and a networking node. A smart thing probably reports some fact of its condition or location, and an aspect of its state might be modifiable from afar.

Smartness often entails a whole ecosystem, too. Smart things can do more with software—which means they have an app store. And because they have a computer and storage onboard, smart things need to be replaced at a decidedly un-plunger-like rate. “Smartness” implies a smartphone-like upgrade cycle.
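That definition can be made concrete with a toy sketch. Every name below is invented for illustration—no real product exposes exactly this interface:

```python
# A minimal sketch of "smartness" as defined above: onboard state,
# something to report, and a hook that lets the vendor change the
# device's behavior from afar. Names are hypothetical, not a real API.

class SmartThing:
    def __init__(self, name: str):
        self.name = name
        self.firmware = "1.0"
        self.powered_on = True

    def report(self) -> dict:
        # "reports some fact of its condition or location"
        return {"name": self.name,
                "firmware": self.firmware,
                "powered_on": self.powered_on}

    def remote_update(self, **changes):
        # "an aspect of its state might be modifiable from afar"
        for key, value in changes.items():
            setattr(self, key, value)

fan = SmartThing("ceiling-fan")
fan.remote_update(firmware="1.1", powered_on=False)
print(fan.report())
```

Note that `remote_update` is the whole story in miniature: the same channel that delivers upgrades is the one that lets someone other than you change what your ceiling fan does.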

This aspirational smartness is not just coming to watches. Any product that's worth more than a few bucks will soon have some intelligence and communication ability embedded in it. Companies are already trying to build smart umbrellas and smart crockpots.

Do you want a crockpot that has to be replaced every few years—or that, at the least, will be forever upgrading itself? Would apps change your mind?

* * *

With smartness comes something else: hackability. If you have a computer, and it’s on the network, it can be breached by someone or something else. Build a smart ceiling fan and you have a ceiling fan that can be hacked.

This is the future that Kevin Meagher, the general manager of Lowe's SmartHomes initiative, is paid to think about. Talking to GigaOm this summer, he implied that the smarthome—even the smart air conditioner—will exist in a tremendously messy environment.

“The only way you can actually control [a] device or integrate it into your ecosystem is by actually going through the cloud-to-cloud. So, for example, whatever the device is, is talking to the manufacturer or the vendor’s cloud platform, and then they are allowing you access to their cloud’s data,” he said.

What this means: Most “smart-home” devices right now aren’t talking to each other directly over a local wifi network. Instead, they’re talking to enormous and centralized data centers, which are then talking to each other, which are then (finally) sending instructions back to different devices in your home.

Meagher continued: “If I am responsible to you with a level of service for your home so [that] when you push a button, you want your air conditioning to go on or off, I’m dependent on a third party that won’t give me any guarantee of service—they won’t give you any guarantee of service, so how is that going to work?”

How is that going to work?
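One way to see why Meagher is worried is to trace the hops a single button press takes. Here is a toy sketch of the round trip he describes—every hostname and function is invented for illustration, not any real vendor's API:

```python
# Toy model of the "cloud-to-cloud" round trip: your phone never talks
# to the air conditioner directly, even though it sits a few feet away.
# All names are hypothetical stand-ins for real vendor services.

def route_command(command: str) -> list:
    """Trace the hops one button press takes to reach your own device."""
    hops = []
    # 1. The app sends the command up to the hub vendor's cloud.
    hops.append(f"phone app -> hub vendor cloud: {command}")
    # 2. That cloud relays it to the manufacturer's cloud, the only
    #    party permitted to talk to the device.
    hops.append(f"hub vendor cloud -> AC manufacturer cloud: {command}")
    # 3. The manufacturer's cloud finally pushes it back down to the
    #    air conditioner in your living room.
    hops.append(f"AC manufacturer cloud -> air conditioner: {command}")
    return hops

for hop in route_command("ac_off"):
    print(hop)
```

Each hop is a point of failure you don't control and—as Meagher says—one that comes with no guarantee of service.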

Set aside even the specter of foreign criminals commandeering a ceiling fan and using it against you or your country. (Which isn’t as outlandish as it sounds: Anything with memory and a web connection can join a botnet.)

Consider instead that most recent “hacking” incidents have barely involved forced manipulation of computer code at all. They have relied instead on clever tricks and workarounds to gain access to cloud-services accounts. As often as not, hackers get in by gaming security questions, not by brute-forcing a backdoor. (This seems to have been the case with the most recent celebrity iCloud hackings.)

In other words, the recent spate of user hackings stands as a symbol not of black-hat engineering prowess but of failed consumer products. And devices in your home will rely on the exact same kind of service that so regularly fails us already.

And that’s just in home appliances. Our self-driving cars, too, are supposed to be locked down. But they could be just as jailbreakable and hackable as smartphones.

Ryan Gerdes, a professor at Utah State, has been given a $1.2 million grant from the National Science Foundation to think about autonomous vehicles and security. When we talked to him earlier this month, he posed questions that don’t have good answers yet.

“What happens when you have two advanced cruise control vehicles and the one in front starts accelerating and braking such that the one behind it starts doing the same thing in a more amplified fashion?” he asked.
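The scenario Gerdes describes is what control engineers call string instability, and a toy simulation shows how fast it blows up. The controller below is a deliberately naive stand-in with made-up numbers, not any real cruise-control algorithm:

```python
# Toy simulation of Gerdes's scenario: a lead car gently alternates
# between accelerating and braking, and a follower tries to match it
# with an over-aggressive controller. The gain of 2.5 is illustrative;
# any gain above 2 makes this simple feedback loop unstable.

GAIN = 2.5  # how hard the follower corrects toward the leader's speed

def simulate(steps: int = 6) -> list:
    lead, follower = 30.0, 30.0   # both cruising at 30 m/s
    delta = 1.0                   # leader's gentle +/- 1 m/s wobble
    errors = []
    for _ in range(steps):
        lead += delta
        delta = -delta            # accelerate, then brake, then ...
        # Naive follower: close the whole speed gap, amplified by GAIN.
        follower += GAIN * (lead - follower)
        errors.append(abs(follower - lead))
    return errors

errors = simulate()
print([round(e, 2) for e in errors])
# The follower's speed error grows every cycle instead of damping out.
```

A one-meter-per-second wobble by the leader turns, within a few cycles, into wild speed swings by the follower—exactly the amplification Gerdes is asking about.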

What if someone hacked their car’s logic to get ahead of other cars? Would people be allowed to make those changes to their car? What constitutes hacking and what constitutes a permissible manipulation of software?

In some ways, these are easier questions to answer when they’re about cars. As Americans, we accept that some automobile modifications should be illegal to ensure the safety of everyone else on the road. That doesn’t mean it’s not technically feasible, but it changes the legal calculus somewhat.

The legal environment that smartphones exist in, meanwhile, is quite different. Until last month, the Librarian of Congress got to decide whether unlocking a cellphone was a violation of federal copyright law. (Because unlocking it required altering its source code, and source code has copyright protections.)

And beyond strict illegality, there are practical and financial limits on how you can modify a smart thing. To jailbreak a phone—to install software of your choosing on hardware you purchased—often voids the phone’s warranty. Such rules would likely carry over to other smart devices.

In short, you can never own a smart device like a plunger—you’re always to some degree at the mercy of the company that made the software.

* * *

But it gets weirder. While you have limited ability to modify your phone, it’s also grafted to you—forever. In 2013, a Londoner’s laptop was stolen. He had installed anti-theft software on it, but it wasn’t pinging him, so he figured it was lost.

But then the software sprang to life and began sending him regular updates. He watched its new users type and play music. Eventually, he figured out how to get in touch with them, and they sent his computer back.

He isn’t alone. Once you own a smart device, even if it’s stolen, it remains yours to some degree—always pinging back to tell you where it is in the world. In other words, even if you cease to have physical dominion over a smart thing, it allows you a certain amount of functional control.

It’s a funny mode of ownership. It’s almost mystical. The thing remains subject to you, somehow, no matter how far away you are from it. It can send updates about its status to you, even if you’ve long forgotten about it.

And yet: If you’re holding it in your hand and want to install a certain piece of software on it, you might be breaking the law.

* * *

When it announced its two new iPhones, Apple also granted every iTunes user a free copy of U2’s album Songs of Innocence. We write “granted,” but users didn’t have a choice: The files just showed up on their laptops or phones, unbidden. Not every user was pleased, and Apple eventually had to release a tool to let users remove the album.

In this brave new farrago of medium and message, U2 seem to have distilled all of rock-and-roll’s misguided egotism into one ridiculous statement: Our music is technically worthless and everyone in the world should hear it. That’s what this band is “all about,” and Apple is happy to do its part, making you the owner of these songs without asking your permission. Which is disgusting.

So as you delete “Songs of Innocence” from your memory — as you should, without hesitation — remember the fleeting heebie-jeebies as they crawl around your follicles.

That utopian philanthrocapitalist democracy that Bono is always stumping for will also be a place where your belongings will be chosen for you.

It might be a stretch to call our current world "a place where your belongings will be chosen for you," but the smarter one's things, the greater the possibility that they'll be conscripted into schemes you never would have imagined and might not like.
