The recent release of the Samsung Galaxy Gear smart watch has started a new wave of rumors about the much-anticipated iWatch, including the prediction by at least one research firm that Apple could sell as many as 10 million of the devices. But the history of wearable technology says otherwise—and in fact, I think anything attached to the wrist is more likely to fail than thrive.

Several centuries ago, a technological innovation encouraged the development of individual privacy, secrecy, mobility, and connectivity. It was called the pocket. Sewn into jackets, bodices, skirts, and pants, the pocket was a great improvement over the detached pouch. It was where secrets were kept, keys and coins stashed, and everyday items like cases of needles, snuff boxes, combs, and mirrors jumbled together.

The pocket also enabled an early instance of wearable technology: the pocket watch. In their pants, vests, waistcoats, or skirts, millions of open-faced and closed-cased watches lived in the pockets of men and women around the world. These watches came of age in the 1800s, the century when railroads, steamships, and telegraphs encouraged imperial and individual adventuring.

Pocket watches had only limited timekeeping utility at first. Not only were their timekeeping mechanisms unreliable, but nowhere in the world before the 1780s was there an agreed-upon time standard. It would be like owning a smartphone today without belonging to a network—or even the possibility of belonging to a network.

Despite the absence of time standards, demand for pocket watches exploded in the 19th century. Owners of pocket watches became adept at clockwatching, comparing their watches with the position of the sun in the sky, ringing bells, and public clocks. Various innovations worked to solve the problem of asynchronicity, none more so than the introduction in the United States in 1883 of standard time and time zones, the very system we continue to follow today. By the 1920s, communications technology (radio and telephone time services) had made standardized time accessible throughout the nation.

As this modern network of time took shape, it created new possibilities. Individuals could autonomously coordinate their movements and actions with others, whether scheduling an 8 a.m. meeting or a midnight rendezvous. News organizations began to note the exact clock time when important events happened, whether it was the death of President Lincoln, the assassination of President McKinley, or the end of World War I (a New York Times headline of November 11, 1918 read, “War Ends at 6 o’clock This Morning”). Simultaneity became a watchword: simultaneous chess matches played in distant parts of the world presaged today’s Internet gaming.

So what happened to pocket watches? The wrist, that’s what happened. Wristwatches, first known as “bracelet watches” and “strap watches,” captured the fancy of a people trying to be modern. Fashionable ladies, German sailors, and British soldiers began wearing versions of wristwatches in the early 20th century. While the prospects for wristwatches seemed dim through World War I—skeptics went on about “the latest idiocy in fashion,” that is, effeminately wearing “one’s watch on a bracelet”—they stuck around, due in no small part to the martial associations they gained from being worn by daring aviators and tank drivers.

Nevertheless, the end of World War II marked the true opening of the age of the wristwatch. Some of its symptoms—notably the withering of certain aspects of social decorum—are very similar to those now associated with the smartphone. As one midcentury observer noted, “the action of looking at the time [on one’s watch] is perceived as a serious infringement of the most elementary conventions of polite society.” Whether nonchalant or studious, a glance at one’s watch or smartphone conveys punctiliousness, narcissism, and impatience in varying degrees.

By the 1970s, quartz had made wristwatches ubiquitous, cheap, and reliable. It was unthinkable that wristwatches would be eclipsed; modern life attested to their necessity. Just to be safe, however, various watchmaking firms tried to bring watches into the computer age. In 1977, multifunctional wristwatches inspired by microprocessor calculators, like the Pulsar Pulse Time Computer, came to market. In 1982, Seiko introduced the TV watch. By the end of the 1980s, several different watches doubled as pagers; and in 1993 Casio marketed the Zapping, a wristwatch that was also a TV remote control. As it turned out, not many people wanted to wear miniature calculators, televisions, phones, or remote controls. The wrist would not become an information hub, at least not back then.

But there was demand for portable information and communication devices, which is why the pocket came back. The rise of the smartphone coincided with a decline in the universality of wristwatches, especially among younger people. Many of my college students don’t wear wristwatches, but instead use their smartphone as, among other things, a pocket watch.

Now, decades after the pocket watch became antiquated, the pocket-to-wrist cycle may repeat itself. More and more tech companies are betting on the proposition that the next big thing will be wearable technology. Although Google has launched Google Glass eyewear, many of its competitors are eyeing the wrist as prime real estate for tomorrow’s wearable information hub. Samsung recently introduced its Samsung Galaxy Gear smart watch with a fabulous commercial conjuring up pop culture’s long anticipation of a companionable wristwatch that would double as a phone and command center. Sometime in 2014, it is rumored, Apple’s next big thing will arrive: an iWatch.

Still, at the risk of sounding like those first wristwatch haters, I am skeptical. The wrist had a good run, but it simply cannot afford the privacy, security, mobility, or safety of the pocket, not to mention its carrying capacity. Unless we perfect a way of implanting all our apps in a chip in our brains, stowing our technology in our clothes remains the best choice for most of us. For that reason, the pocket continues its centuries-long reign. At least until someone comes up with something better.

Alexis McCrossen, Professor of History at Southern Methodist University, is the author of Marking Modern Times: Clocks, Watches and Other Timekeepers in American Life and Holy Day, Holiday: The American Sunday. She rarely wears a watch. She wrote this for Zocalo Public Square.