Measuring the History of Electricity

Take the Blades! We'll Sell You the Razor

The invention of the electric meter made it possible to bill customers for electricity, creating the incentive to build out the nation's first network for moving electrons. The Grid, the system of dumb, buzzing wires that allows power to move across the country, is so important that it topped the National Academy of Engineering's list of the 20 greatest engineering achievements of the 20th century.

This gallery tours the history -- and future -- of making you pay for juice. Sometime within the next few years, you're likely to get a new type of so-called "smart meter" that will mark the first real upgrade to electrical billing since your grandparents were born.

Until the 1870s, electrical power wasn't used for much aside from telegraphs and telephones. But after Edison's improvement of the incandescent light bulb, power was suddenly much more useful. The problem was that the few metering systems tinkerers had built up to that point didn't actually work.

So Edison resorted to a low-tech method: He charged for electricity on a per-lamp basis. In modern business model terms, Edison was giving away the blades to sell the razor. He would not have received venture capital for that idea.

Pay Per Dunk: The Era of the Rube Goldberg Meter

Throughout the 1880s, various inventors thought hard about the problem of how to measure the flow of electrons through time. Edison himself tried a two-electrode chemical system in which your charge was determined by how much zinc moved from one electrode to another. Workers actually had to weigh the electrodes to determine the price you paid.

Elihu Thomson developed a walking-beam meter that functioned much like a toy dunking bird (left). The heating and cooling of alcohol inside a pair of bottles caused a periodic exchange of liquid, rocking the bottles back and forth, and that mechanical motion is what the meter measured. It was an excellent hack, but it couldn't scale.

AC-DC Dispute Settled: AC Wins by Transforming

By 1888, a major, long-lasting dispute within the power industry was on the verge of getting settled. Edison had been promoting the use of direct-current power, despite the difficulty that the technology encountered transmitting electricity over long distances and changing the voltage. Both problems limited the uses of electricity.

George Westinghouse, meanwhile, purchased a patent for a transformer that could increase the voltage of alternating-current power. With a working transformer, his company, Westinghouse Electric, was able to send power over long distances, allowing for larger, centralized power-generating stations. These stations could power factories as well as your great-grandfather's school reading lamp.

But they needed to bill for it. And that's where Westinghouse employee Oliver Shallenberger came in. His design (left) paved the way for Westinghouse to purchase a patent from Nikola Tesla for an improved AC system. The modern electrical grid was about to take root.

Image: Library of Congress

The Standard Model of American Power

With early success fueling investment in the electrical sector, a variety of new technologies began to converge to create the standard model for electrical generation and distribution in the United States.

Through the 1890s, various iterations of the induction watt-hour meter were becoming standard technology. These meters count the rotations of a metal disk driven by magnetic flux within the meter. The power being drawn is proportional to the speed of the disk's rotation, so the meter can accurately measure a wide range of energy usage levels. In most places, this is still how your utility knows how much power your home or business is drawing.
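The disk-counting principle is simple enough to sketch in a few lines. Each induction meter carries a nameplate constant, usually called Kh, giving watt-hours per disk revolution; the 7.2 Wh/rev value below is a common US residential figure, used here only as an illustrative assumption.

```python
# Sketch of how an induction watt-hour meter's reading is derived.
# Kh (watt-hours per disk revolution) is the meter's nameplate constant;
# 7.2 Wh/rev is an assumed, typical residential value.

def energy_kwh(revolutions: int, kh_wh_per_rev: float = 7.2) -> float:
    """Convert disk revolutions into kilowatt-hours of energy used."""
    return revolutions * kh_wh_per_rev / 1000.0

def average_power_watts(revolutions: int, seconds: float,
                        kh_wh_per_rev: float = 7.2) -> float:
    """Average power over an interval: a faster-spinning disk means more power."""
    return revolutions * kh_wh_per_rev * 3600.0 / seconds

# A disk that turns 1,000 times has registered 7.2 kWh:
print(energy_kwh(1000))            # 7.2
# 10 revolutions in 60 seconds corresponds to a 4,320 W load:
print(average_power_watts(10, 60))  # 4320.0
```

This is why the same mechanism handles both a reading lamp and a factory floor: the disk just spins faster or slower, and the register accumulates revolutions either way.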

Meanwhile, transmission-line technologists were steadily upping the voltage of the power lines running from ever-larger power plants, like this one, to increasingly large cities filled with more and more electricity users. The higher the voltage, the lower the losses over a given distance.
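The voltage advantage falls out of basic circuit math: for a fixed power P sent down a line of resistance R, the current is I = P / V, and the resistive loss is I² × R, so raising the voltage cuts losses quadratically. The sketch below uses made-up round numbers purely to illustrate the scaling.

```python
# Why higher transmission voltage means lower losses:
# for fixed delivered power P, current I = P / V, and loss = I^2 * R.
# All figures below are illustrative assumptions, not real line data.

def line_loss_watts(power_w: float, volts: float, resistance_ohms: float) -> float:
    current = power_w / volts          # amps flowing in the line
    return current ** 2 * resistance_ohms

P = 1_000_000.0   # 1 MW to deliver
R = 10.0          # assumed line resistance in ohms

low  = line_loss_watts(P, 10_000.0, R)    # at 10 kV
high = line_loss_watts(P, 100_000.0, R)   # at 100 kV
print(low, high)  # 100000.0 1000.0 -> 10x the voltage, 1/100th the loss
```

Ten times the voltage means one-tenth the current and one-hundredth the loss, which is exactly why Westinghouse's step-up transformers made long-distance AC transmission practical.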

Power Becomes Ubiquitous

During the Great Depression, the government began to regulate private utilities and push to extend electricity to rural areas far from urban centers through agencies like the Rural Electrification Administration and the Tennessee Valley Authority.

The Edison Electric Institute Bulletin had a special issue in 1942 on "entering the seventh decade of electric power." By this time, almost all Americans had access to cheap and reliable electric power, but many could remember a time when they didn't.

The horsepower available per factory worker had increased from about 3 in 1914 to 6.5 in 1942, with most of the increase coming from purchased electrical power. As one professor chillingly put it, engineering advances had made 6 billion "manpower" available to the country, "equivalent to 50 slaves for each man, woman, and child."

Image: Edison Electric Institute

Innovation Ends, Coal Begins

With most of the metering and transmission infrastructure in place, all electrical companies had to do was make as much power as cheaply as possible. And that's all they did. Innovation in transmission and metering largely stopped. This 1940s meter technician would probably understand most meters in use today.

Most capital investment went to building power plants that could exploit the nation's ready source of cheap energy: coal. In 1949, only 84 million tons of coal were used for electrical power production. By 1970, coal consumption by the power industry had nearly quadrupled to 320 million tons per year. Last year, American utilities burned about 1.05 billion tons of coal to make electricity.

Photo: Library of Congress

The System Starts to Break

The golden age of cheap power came to an end sometime in the last decade. Coal, which made electricity cheap and abundant, also happens to generate massive amounts of carbon dioxide, the chief greenhouse gas responsible for climate change. It's widely expected that the next president will sign a law taxing carbon dioxide emissions, as is already the case in many places around the world.

The specter of energy regulation and rising natural gas, coal and petroleum prices has raised interest in new emission-free technologies like wind turbines and solar power. But the adoption of these technologies isn't as simple as it sounds. Both wind and solar -- which are abundant and clean -- will require substantial changes to the nation's transmission and billing systems.

Wind and solar, unlike coal, do not produce power at the same rate at all times. If they are adopted at scale, the grid infrastructure and meters like this one will have to be much more flexible than what was built 100 years ago.

Power generation has been centralized since the very early days of the industry, but now wind and solar open up the possibility of generating power right on or near your home. For that to make economic sense, though, we need meters and grid tie-ins that can easily handle this type of "reverse billing."
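The bookkeeping behind "reverse billing" (usually called net metering) can be sketched simply: a bidirectional meter nets what a home draws from the grid against what its panels export. The interval data, flat rate, and one-for-one credit policy below are all simplifying assumptions; real tariffs vary widely by utility.

```python
# A minimal sketch of net metering ("reverse billing"), under assumed
# rules: exports earn one-for-one credit at a flat rate of $0.12/kWh.

def net_bill(hourly_net_kwh, rate_per_kwh=0.12):
    """hourly_net_kwh: positive = drawn from the grid, negative = exported.
    Returns (net kWh, bill in dollars); a negative bill is a credit."""
    net = sum(hourly_net_kwh)
    return net, net * rate_per_kwh

# Morning draw, midday solar export, evening draw:
usage = [1.5, 1.0, -2.0, -2.5, 2.0, 1.5]
net, bill = net_bill(usage)
print(net, round(bill, 2))  # 1.5 0.18
```

The old spinning-disk meter can only accumulate in one direction, which is why this simple subtraction requires new hardware: a meter that records flow both ways.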

The Next Big Thing May Be Little

So, we find ourselves in a new era of electric meter innovation. A host of companies is trying to find just the right mix of features that will satisfy utilities and provide consumers with more flexibility in how they make, buy and use power.

Like everything else in the internet age, electricity-billing systems are about to make the transition from a centralized, one-way mode of operation to two-way systems that are connected to the internet. In addition to the back-end differences, the next generation of meters has received a facelift that will let consumers see their energy usage in near real-time.

Of course, people have been talking about "smart meters" for years. But after years of delayed rollouts, utilities finally appear ready to scale them up.