Mar. 10, 2011

Beckett is studying measurement in school this week, so we decided we would take a closer look at what it means to measure something and some of the techniques used to measure things. It turns out that measuring things -- and agreeing how to measure things -- is very, very difficult! For thousands of years humans have been devising systems to measure things accurately, with varying levels of success.

We started with a discussion of the ancient unit of measurement called the cubit. The Ancient Egyptians used the cubit to build the pyramids, and there are cartouches that illustrate the cubit as slightly larger than the average Egyptian arm. The cubit was divided into seven palms, and each palm was subdivided into four digits. The Ancient Egyptian cubit was just over 20 inches -- and since the average Egyptian 4,000 years ago stood just under 5 1/2 feet tall, it is highly unlikely that the cubit was anywhere near the actual length of an arm.

The cubit had a long history and was used as a unit of measurement for centuries by many different cultures -- but each culture had a different length for it. I asked Beckett to notice the lengths of the various arms he saw -- and the difficulty of using body parts as units of measurement. He immediately said that differences in size would make it difficult to find a standard based on bodies.

From the cubit, we moved on to a slightly more modern measure that is also based on the body -- the foot. Today the foot is 12 inches long, though its official definition is given in meters: exactly 0.3048 meters. Both the Ancient Greeks and Romans used a measure called the foot. But again, the difficulty of agreeing on whose foot should be the standard caused the measure to vary from country to country and even town to town. In England, tradition has it that King Henry I's foot was 12 inches long. Yet the average Englishman of the 11th century was short, with correspondingly shorter feet. In fact, many kings, emperors, mayors, and local rulers redefined the local unit of measure to their own liking. Most villages and towns had an 'official' or 'standard' foot on display. While I am not a king, my foot is exactly 12 inches in length!

The Platinum-Iridium Meter, used from 1889-1960. Image courtesy NIST.

Finally, we talked about the meter -- which most Americans both ignore and take for granted as a unit of measure. But the history of the meter is just as difficult and full of change as that of the cubit and the foot. Two early approaches came close to the modern meter. In the first approach, a meter was defined as the length of a pendulum whose half swing takes exactly one second. This was problematic because gravity varies from location to location with altitude and other factors -- and the length of such a pendulum varies with it.
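You can check this pendulum idea yourself with the small-angle formula T = 2π√(L/g): a half swing of one second means a full period of two seconds, which pins down the length L. A quick sketch (the 9.80665 value is the standard gravity figure; the whole point is that real gravity drifts slightly from place to place):

```python
import math

g = 9.80665   # standard gravity in m/s^2 (actual local value varies)
T = 2.0       # full period in seconds (one second per half swing)

# Small-angle pendulum formula: T = 2*pi*sqrt(L/g)  =>  L = g*T^2 / (4*pi^2)
L = g * T**2 / (4 * math.pi**2)
print(round(L, 4))  # length in meters -- just shy of one meter
```

The result lands at about 0.994 m, which is why the "seconds pendulum" was such a tempting candidate for the meter.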

The second approach was to define the meter as one ten-millionth of the distance from the equator to the North (or South) pole. The French sent out an expedition to measure this distance precisely, expecting it to come out to exactly ten million meters. It took another 80 years for the world to acknowledge that the Earth is not perfectly round, and therefore that the meter could not be 'defined' by the slightly bulging Earth. While this convention defined the meter for more than 100 years, the meter was finally given the definition it has today in 1983 (!): the meter is the length of the path traveled by light in a vacuum during a time interval of 1/299,792,458 of a second. It is astonishing to me, and to Beckett as well, that the problem faced by the Ancient Egyptians thousands of years ago wasn't fully 'solved' until 1983!

25 toothpicks in two inches

Next, we talked about things that were hard to measure -- both big and small. First: how do you accurately measure something small? We tried measuring the width of a toothpick with a tape measure -- and failed. Then I asked Beckett to line up toothpicks side by side until the row spanned a whole number of inches. He came up with 25 toothpicks in two inches, so the average toothpick is 2/25 of an inch wide. This method was used by early scientists to measure small things, like the width of a hair, or anything too small to register accurately in a system of measurement.
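The "line up many, divide by the count" trick is just a division, but it's worth seeing with our toothpick numbers plugged in:

```python
total_span_in = 2.0  # inches spanned by the whole row of toothpicks
count = 25           # number of toothpicks lined up side by side

# Average width = total span / number of items
width = total_span_in / count
print(width)  # 0.08 inches per toothpick -- i.e. 2/25 of an inch
```

The beauty of the method is that the tape measure's error gets divided by 25 too: a quarter-inch slip in reading the row is only a hundredth of an inch per toothpick.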

Second: how do you measure really big things? Astronomers defined a parsec, for example, as the length of the long leg of an imaginary right triangle whose short leg is the distance from the Sun to the Earth (which astronomers call an astronomical unit) and whose angle opposite that short leg is one arc second.

A parsec. Image courtesy http://astronomyonline.org

The word parsec comes from the combination of two words -- parallax, which is a method of measuring by looking at an object from two different points and comparing how it appears from each, and arc second, which is 1/60 of an arc minute, or 1/3,600 of a degree (just 1/1,296,000 of a full circle). A parsec, if you do the math, is therefore about 3.26 light years.
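"Doing the math" here is one line of trigonometry: the long leg of the triangle is the short leg divided by the tangent of the tiny angle, so 1 parsec = 1 AU / tan(1 arc second). A sketch, using the standard values for the astronomical unit and the light year:

```python
import math

AU = 1.495978707e11              # astronomical unit in meters
LIGHT_YEAR = 9.4607304725808e15  # meters light travels in one Julian year

one_arcsec = math.radians(1 / 3600)  # one arc second, converted to radians

# Long leg of the right triangle = short leg / tan(opposite angle)
parsec_m = AU / math.tan(one_arcsec)
print(round(parsec_m / LIGHT_YEAR, 2))  # ~3.26 light years
```

Because one arc second is such a tiny angle, tan(θ) ≈ θ, which is why the parsec comes out so close to simply dividing the AU by the angle in radians.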

But even a parsec, which is a really huge unit of measurement, isn't big enough to measure something as big as the universe. Many scientists now describe faraway objects by how much redshift they exhibit -- which means they use the Theory of Relativity and the color of the light the objects emit to describe how far away they are.

There are so many things to be measured and so many ways to measure them that it is impossible to list them all here. Next time you pick up a ruler or tape measure, or glance at a clock, or turn your music up to 'eleven', pause a second to think of all the work that it took to get to that point. While I am lucky enough (just like King Henry I!) to have a foot that measures exactly a foot, I rarely use it to measure distance. And think about measurement next time you order a 'large' cup of coffee -- remember when a cup of coffee was just that, a cup? Or as you watch the news, how do scientists measure the force of an earthquake or hurricane? And get ready: the Federal Trade Commission is planning to redefine how you measure and purchase light bulbs -- from watts to lumens. There is a measure for everything, whether we understand it or not!