The idea of using light to represent data isn't particularly new. Over a
hundred years ago, Alexander Graham Bell demonstrated that light could carry
voice through the air. What is new, however, is the ongoing improvement
of the fibers and other basic optical components. While scientists in the
'60s predicted that fibers would be limited to 500 meters, today's
optical fibers can reach 1000 times that distance.

Those improvements are made possible by an understanding of light's
basic properties. Reflection and refraction play critical roles in enabling
light to travel down a fiber, while scattering becomes important on extended
lengths of fiber.

The irony is that, while scientists know how light behaves, they are less
clear on what comprises light. Experiments show that light can be explained both
as a particle and as an electromagnetic wave, a form of energy caused by the
excitation of an atom's electrons. Both explanations will be important to
our understanding of optical networking.

Atoms, Electrons, and the Like

Until the 19th century, light was conceived as a series of particles emitted
by some object and in turn viewed by another object. The particle theory of
light, as it's called, largely stems from the Newtonian understanding of
light that stretched back to the Greeks, who dubbed these tiny particles
corpuscles. Under the Newtonian understanding proposed in 1666, light was a
series of particles emitted by a light source that stimulated sight in the eye.
By conceiving light as a series of particles, Newton was able to explain
reflection and refraction.

Even within Newton's lifetime, however, the particle theory was
challenged. In 1678, Christiaan Huygens, a Dutch physicist, argued that
reflection and refraction could also be explained by understanding light as a
wave. Huygens' views were rejected at the time by opponents who argued that
if the wave theory were true, light should bend around objects and we
should be able to see around corners. As we'll see later, light in fact
does bend around corners (what we call diffraction), though it isn't
easily observable because light waves have short wavelengths.

The wave theory of light lay largely dormant for just over a century until, in
1801, Thomas Young, a British physicist, physician and Egyptologist (who also
helped decipher the Rosetta Stone that led to the understanding of Egyptian
hieroglyphics), first demonstrated light's wave properties. Young's
experiment proved that light rays interfere with one another, a phenomenon
unexplainable under the particle theory, as no two particles could cancel
each other out. Additional research during the 19th century gradually swayed
the scientific community towards viewing light as a wave that passed through an
invisible substance, called ether, the same substance after which Metcalfe named
the popular local area network, Ethernet (see Chapter 2).

This theory, that light is a wave, can be easily understood by using the atomic
model proposed by Ernest Rutherford (1871-1937), the
"layman's" view of the atom. In the Rutherford model,
electrons orbit a nucleus, comprised of protons and neutrons, like
planets orbiting around the sun. The nucleus exerts a force, the electric force,
on the electrons, holding them in their respective states (see Figure 3.1). The
closer an electron is to the nucleus, the greater the attraction. The area of,
shall we say, "reach" of the protons is called the force field (in
physics, not Trekkian, terms). Positive charges placed in this field are
repelled by the protons; negatively charged particles, like electrons, are
attracted.

As energy is introduced, the electrons are excited and begin to vibrate in
their place. The electrons' vibrations distort the electric field holding
them, forming an electromagnetic wave (see Figure 3.2).

Figure 3.2 Under the classic view, electromagnetic waves are formed when
an electron vibrates, causing a distortion in the electric force field exerted
by a positively charged particle.

Quantum View

With their understanding of waves coupled with their understanding of
electricity and magnetism, 19th century scientists were able to explain most
known properties of light. Yet some phenomena, notably the photoelectric
effect, could not be explained. A new model was to be developed, the quantum
model of light, that combined elements of both particles and waves.

In the photoelectric effect, electrons can be released when light strikes
a semiconducting material. Using the wave theory of light, the kinetic energy of
the released electron should increase with the intensity of the light.
Experiments, however, showed that the amount of additional energy was
independent of the light's intensity.

It was not until the start of the 20th century that this problem was solved.
Albert Einstein proposed a theory based on Max Planck's theory of
quantization, which assumes energy to be present in a light wave in packages,
called photons. Einstein theorized that the energy of these photons is
proportional to the frequency of the electromagnetic wave.
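Einstein's relation can be made concrete with a quick calculation. The sketch below computes the energy and frequency of a single photon from its wavelength using E = hf = hc/λ; the 1550 nm wavelength is an illustrative choice (it happens to be a common long-haul fiber wavelength), not a figure from the text.

```python
# Photon energy E = h*f = h*c/lambda, illustrating Einstein's relation
# that a photon's energy is proportional to the wave's frequency.
H = 6.62607015e-34      # Planck's constant, J*s
C = 2.99792458e8        # speed of light in vacuum, m/s
EV = 1.602176634e-19    # joules per electron-volt

def photon_energy_ev(wavelength_m):
    """Energy of a single photon, in electron-volts."""
    return H * C / wavelength_m / EV

def frequency_thz(wavelength_m):
    """Frequency of the electromagnetic wave, in terahertz."""
    return C / wavelength_m / 1e12

print(photon_energy_ev(1550e-9))   # ~0.80 eV
print(frequency_thz(1550e-9))      # ~193.4 THz
```

Note that a shorter wavelength means a higher frequency and therefore a more energetic photon, a fact that matters again when we look at the photoelectric effect.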

Using Planck's original quantum theory and Einstein's conception of light
as a series of photons, Niels Bohr in 1913 introduced a new model of the atom to
replace the Rutherford model. The problem with Rutherford's model is that
if energy is produced through an electron's vibrations, then according to
this model electrons should be emitting energy all the time. Do you see why?
When you map an electron's orbit onto a two-dimensional plane, the electron
appears to be constantly vibrating, because in fact it is! (see Figure 3.3).

Figure 3.3 An electron's orbit appears the same as an electron
vibrating in place, which under the classic view should lead to an
electromagnetic wave.

This means, then, that according to the conservation of energy, the electron
would slow down each time energy was emitted. After enough time, the electron
would be unable to hold its position, eventually crashing into the nucleus,
destroying the atom. Matter would exist for a fraction of a second and this book
would never have been written.

Bohr postulated that classical radiation theory doesn't hold for
atomic-sized systems. He theorized that electrons were confined to certain
energy levels around the nucleus. The term energy levels is used for many
reasons, one of which is that although electrons might appear to move, they
don't actually orbit around the nucleus (see Figure 3.4).

Figure 3.4 Under Bohr's model, electrons are shown as inhabiting
different energy states; the farther they are from the nucleus, the more energy
they contain.

Whereas classical physics allowed for nearly any orbit, the quantum view says
that only "special" energy levels are possible. Electrons are pushed
to higher energy levels by absorbing photons whose frequency matches the
energy gap. When the electron drops from a higher energy level
to a lower one, it emits a photon equal to the energy difference between the two
states. When enough photons are emitted of the right frequency, visible light is
produced.
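The emission rule above can be sketched numerically. Assuming the standard Bohr energy levels for hydrogen (an illustrative example, not taken from the text), the wavelength of the emitted photon follows directly from the energy difference between the two states:

```python
# Wavelength of the photon emitted when an electron drops between two
# Bohr energy levels: delta_E = E_high - E_low, lambda = h*c / delta_E.
RYDBERG_EV = 13.605693   # hydrogen ground-state binding energy, eV
HC_EV_NM = 1239.84193    # h*c expressed in eV*nm

def level_energy_ev(n):
    """Energy of the n-th Bohr level in hydrogen (negative = bound)."""
    return -RYDBERG_EV / n**2

def emitted_wavelength_nm(n_high, n_low):
    """Wavelength of the photon emitted dropping from n_high to n_low."""
    delta_e = level_energy_ev(n_high) - level_energy_ev(n_low)
    return HC_EV_NM / delta_e

print(emitted_wavelength_nm(3, 2))  # ~656 nm
```

A drop from level 3 to level 2 in hydrogen yields a photon near 656 nm, in the red part of the visible spectrum, which is exactly the kind of frequency-specific emission Bohr's model predicts.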

The quantum understanding of light might sound much like the original
particle view and, in fact, there is a strong similarity between the two.
What's important here though is that Einstein's theories contain
aspects of both the wave and particle theories. The photoelectric effect then
results from the energy transfer of a single photon to an electron in the metal.
Yet this photon's energy is determined by the frequency of the
electromagnetic wave.
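That last point, that the ejected electron's energy depends on the photon's frequency rather than the light's intensity, is captured by Einstein's photoelectric equation, E_k = hf - W, where W is the work function of the material. The sketch below uses a 400 nm photon and a 2.3 eV work function (roughly that of sodium); both are illustrative assumptions, not values from the text:

```python
# Einstein's photoelectric relation: E_k = h*f - W. Brighter light means
# more photons (and more ejected electrons), but each electron's maximum
# kinetic energy depends only on the photon frequency.
HC_EV_NM = 1239.84193    # h*c expressed in eV*nm

def max_kinetic_energy_ev(wavelength_nm, work_function_ev):
    """Maximum kinetic energy of an ejected electron, in eV."""
    photon_energy = HC_EV_NM / wavelength_nm   # set by frequency alone
    return photon_energy - work_function_ev

print(max_kinetic_energy_ev(400, 2.3))  # ~0.80 eV
```

Since intensity appears nowhere in the relation, making the light brighter ejects more electrons but gives each one no extra energy, matching the experimental result the wave theory could not explain.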

So is light a particle or wave? It's both, or perhaps more accurately,
light exhibits qualities of both particles and waves depending on the situation.
Much of optical networking can be explained with the wave theory of light.
We'll resort to the particle theory where necessary.