The class blog for Math 3010, fall 2014, at the University of Utah

Tag Archives: applications of imaginary numbers

Imaginary numbers, which combine with the real numbers to form the complex numbers, have had a pretty bad reputation. When most people think of imaginary numbers, they probably break out in a cold sweat from the horrific memories of high school math class. They think that imaginary numbers are utterly incomprehensible and useless in the “real” world. “Imaginary numbers” sound intimidating to people who are not familiar with them, and highly theoretical, with little or no use outside of pure mathematics. In fact, the exact opposite is true.

The most common imaginary number is i, which is formally defined as i = √-1. Since squaring any real number always yields a nonnegative result — whether the number began as negative or not — it is impossible to find the square root of a negative number without using i. Thus, i made possible an entire class of math problems that could not be solved before. For example, √-64 = 8i cannot be computed without i, because √-64 does not exist on the real number line. Additionally, i can easily be changed from an “imaginary” number into a “real” number simply by squaring it: i² = -1.
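These facts are easy to check on a computer. Here is a small sketch in Python, which has complex numbers built in (it writes i as j, following engineering convention):

```python
import cmath

# i squared is -1: the "imaginary" number turns "real" when squared
assert (1j) ** 2 == -1

# The square root of -64 is 8i, a number that does not exist on the
# real number line, so the real-valued math.sqrt would raise an error
assert cmath.sqrt(-64) == 8j

# Squaring 8i takes us back to -64, a real number
assert (8j) ** 2 == -64
```

Python's `cmath` module is the complex-valued counterpart of the familiar `math` module; `math.sqrt(-64)` would fail, but `cmath.sqrt(-64)` happily returns 8i.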

The first known person to stumble upon the idea of using an imaginary number to take the square root of a negative number was the Greek mathematician Heron of Alexandria in 50 CE. He was trying to find the volume of a section of a pyramid using a formula that involved the slant height of the pyramid. However, certain values for the slant height would produce the square root of a negative number. Heron was very uncomfortable with this result, so to avoid taking the square root of a negative number, he fudged his calculation by dropping the negative sign.

Girolamo Cardano was an Italian mathematician who was particularly interested in finding the solutions to cubic and quartic equations. In 1545, he published a book titled Ars Magna, which contained the solutions to cubic and quartic equations. One of the equations in his book gave the solution 5 ± √-15. Commenting on this equation, Cardano wrote, “Dismissing mental tortures, and multiplying 5 + √-15 by 5 - √-15, we obtain 25 - (-15). Therefore the product is 40. … and thus far does arithmetical subtlety go, of which this, the extreme, is, as I have said, so subtle that it is useless.”
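Cardano's “useless” arithmetic checks out: the imaginary parts cancel, leaving a perfectly real answer. A quick sketch in Python confirms it (floating-point rounding aside):

```python
import cmath

# Cardano's product: (5 + √-15)(5 - √-15) = 25 - (-15) = 40
root = cmath.sqrt(-15)                 # √-15, an imaginary number
product = (5 + root) * (5 - root)

# The imaginary parts cancel, leaving (essentially) the real number 40
assert abs(product - 40) < 1e-9
```

Two quantities that individually “do not exist” multiply to give an ordinary real number — exactly the subtlety that unsettled Cardano.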

Perhaps the first champion of imaginary numbers was the Italian mathematician Rafael Bombelli (1526-1572). Bombelli understood that i times i should equal -1, and that -i times i should equal 1. However, Bombelli could not find a practical use for this property, so his ideas were generally not believed. Bombelli did have what people called a “wild idea” — that imaginary numbers could be used to get real answers.

Imaginary numbers continued to live in disgrace until the work of a series of mathematicians in the 18th and 19th centuries. Leonhard Euler helped clear up some of the problems with using imaginary numbers by developing the notation i to mean √-1. He also introduced the notation a+bi for complex numbers. Carl Friedrich Gauss made imaginary numbers much more concrete and less “imaginary” when he graphed imaginary numbers as points on the complex plane in 1799. Then in 1833, William Rowan Hamilton delivered the coup de grâce to imaginary numbers’ bad name when he advanced the idea that complex numbers could be expressed as a pair of real numbers. For example, 4+3i could be written simply as (4,3). This made complex numbers much easier to understand and use.
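Hamilton's idea can be sketched in a few lines of Python: treat a complex number as nothing more than an ordered pair of reals, with addition and multiplication defined purely on the pairs. (The function names below are illustrative, not standard.)

```python
# Hamilton's view: a complex number a+bi is just the pair (a, b).
# No mysterious √-1 appears anywhere — only rules on pairs of reals.

def pair_add(p, q):
    a, b = p
    c, d = q
    return (a + c, b + d)

def pair_mul(p, q):
    # (a, b)(c, d) = (ac - bd, ad + bc), mirroring (a+bi)(c+di)
    a, b = p
    c, d = q
    return (a * c - b * d, a * d + b * c)

# i is simply the pair (0, 1); squaring it gives (-1, 0), i.e. -1
assert pair_mul((0, 1), (0, 1)) == (-1, 0)

# 4+3i as the pair (4, 3): (4, 3)(1, 2) = (4-6, 8+3) = (-2, 11)
assert pair_mul((4, 3), (1, 2)) == (-2, 11)
```

Nothing “imaginary” is needed at all — the minus sign in the multiplication rule does all the work that √-1 used to do, which is precisely why Hamilton's formulation rehabilitated the subject.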

Today, imaginary numbers are an essential part of the everyday calculations that make modern technology work. They are indispensable in the field of electrical engineering, particularly in the analysis of alternating current, like the electrical current that powers household appliances. Cell phones and air travel would also not be possible without imaginary numbers, which are necessary in the computations behind signal processing and radar. Imaginary numbers are even used by biologists when studying the firing events of neurons in the brain. Imaginary numbers have come a long way in the nearly five hundred years since they were scoffed at for being absurd and totally useless.
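To give one concrete flavor of the electrical-engineering use mentioned above, here is a hedged sketch of how complex numbers describe alternating current: the impedance of a resistor and capacitor in series is a single complex number whose magnitude and angle capture how the circuit resists and phase-shifts the current. The component values below are made up for illustration.

```python
import cmath
import math

R = 100.0                    # resistance in ohms (illustrative value)
C = 1e-6                     # capacitance in farads (illustrative value)
f = 60.0                     # AC frequency in hertz (household mains)
omega = 2 * math.pi * f      # angular frequency

# A capacitor's impedance is 1/(jwC); in series, impedances simply add.
Z = R + 1 / (1j * omega * C)

magnitude = abs(Z)           # how strongly the circuit impedes current
phase = cmath.phase(Z)       # phase shift between voltage and current
```

One complex number does the bookkeeping that would otherwise require tracking two coupled quantities (amplitude and timing) separately — which is exactly why engineers never gave imaginary numbers up.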