The production of iron by humans began probably
sometime after 2000 BCE in south-west or south-central Asia, perhaps in the
Caucasus region. Thus began the Iron Age, when iron replaced bronze in
implements and weapons. This shift occurred because iron, when alloyed with a
bit of carbon, is harder, more durable, and holds a sharper edge than bronze.
For over three thousand years, until replaced by steel after 1870 CE, iron
formed the material basis of human civilization in Europe, Asia, and Africa.

Iron is the fourth most abundant element in the
earth’s crust, making up more than five percent of it. Iron exists naturally in
iron ore (sometimes called ironstone). Since iron has a strong
affinity for oxygen, iron ore is an oxide of iron; it also contains varying
quantities of other elements such as silicon, sulfur, manganese, and
phosphorus. Smelting is the process by which iron is extracted from iron
ore. When iron ore is heated in a charcoal fire, the iron ore begins to
release some of its oxygen, which combines with carbon monoxide to form carbon
dioxide. In this way, a spongy, porous mass of relatively pure iron is formed,
intermixed with bits of charcoal and extraneous matter liberated from the ore,
known as slag. (The separation of slag from the iron is facilitated by
the addition of flux, that is, crushed seashells or limestone.) The
formation of this bloom of iron was as far as the primitive blacksmith
got: he would remove this pasty mass from the furnace and hammer it on an anvil
to drive out the cinders and slag and to compact the metallic particles. This
was wrought iron (“wrought” means “worked,” that is, hammered) and
contained generally from .02 to .08 percent of carbon (absorbed from the
charcoal), just enough to make the metal both tough and malleable. Wrought
iron was the most commonly produced metal through most of the Iron Age.
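The reduction chemistry sketched in this paragraph can be written out explicitly. Assuming the ore is hematite (the text does not name a particular iron oxide), the standard textbook reactions in the charcoal fire are:

```latex
% Charcoal burns in the limited air of the fire, producing carbon monoxide:
2\,\mathrm{C} + \mathrm{O_2} \longrightarrow 2\,\mathrm{CO}
% The carbon monoxide strips oxygen from the ore, leaving metallic iron
% and carbon dioxide:
\mathrm{Fe_2O_3} + 3\,\mathrm{CO} \longrightarrow 2\,\mathrm{Fe} + 3\,\mathrm{CO_2}
```

In a primitive furnace these reactions proceed below the melting point of iron, which is why the product is the spongy bloom rather than liquid metal.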

At very high temperatures (rare except in a
blast furnace -- see below), a radical change takes place: the iron begins to
absorb carbon rapidly, and the iron starts to melt, since the higher carbon
content lowers the melting point of the iron. The result is cast iron,
which contains from 3 to 4.5 percent carbon. This high proportion of carbon
makes cast iron hard and brittle; it is liable to crack or shatter under a
heavy blow, and it cannot be forged (that is, heated and shaped by hammer
blows) at any temperature. By the late Middle Ages, European ironmakers had
developed the blast furnace, a tall chimney-like structure in which
combustion was intensified by a blast of air pumped through alternating layers
of charcoal, flux, and iron ore. (Medieval ironworkers also learned to harness
water wheels to power bellows to pump the air through blast furnaces and to
power massive forge hammers; after 1777, James Watt’s new steam engine was also
used for these purposes.) Molten cast iron would run directly from the base
of the blast furnace into a sand trough which fed a number of smaller lateral
troughs; this configuration resembled a sow suckling a litter of piglets, and
cast iron produced in this way thus came to be called pig iron. Iron
could be cast directly into molds at the blast furnace base or remelted from
pig iron to make cast iron stoves, pots, pans, firebacks, cannon, cannonballs,
or bells (“to cast” means to pour into a mold, hence the name “cast iron”).
Casting is also called founding and is done in a foundry.

Ironmakers of the late Middle Ages also learned
how to transform cast pig iron into the more useful wrought iron by oxidizing
excess carbon out of the pig iron in a charcoal furnace called a finery.
After 1784, pig iron was refined in a puddling furnace (developed by the
Englishman Henry Cort). In the puddling furnace the molten metal, kept separate
from the coal fire, was stirred through an aperture by a highly skilled
craftsman called a puddler; this exposed the metal evenly
to the heat and combustion gases in the furnace so that the carbon could be
oxidized out. As the carbon content decreases, the melting point rises,
causing semi-solid bits of iron to appear in the liquid mass. The puddler
would gather these in a single mass and work them under a forge hammer, and
then the hot wrought iron would be run through rollers (in rolling mills)
to form flat iron sheets or rails; slitting mills cut wrought iron
sheets into narrow strips for making nails.

While blast furnaces produced cast
iron with great efficiency, the process of refining cast iron into wrought iron
remained comparatively inefficient into the mid-1800s. Historian David Landes
writes: “The puddling furnace remained the bottleneck of the industry. Only men
of remarkable strength and endurance could stand up to the heat for hours, turn
and stir the thick porridge of liquescent metal, and draw off the blobs of
pasty wrought iron. The puddlers were the aristocracy of the proletariat,
proud, clannish, set apart by sweat and blood. Few of them lived past forty.
Numerous efforts were made to mechanize the puddling furnace – in vain.
Machines could be made to stir the bath, but only the human eye and touch could
separate out the solidifying decarburized metal. The size of the furnace and
productivity gains were limited accordingly” (The Cambridge Economic History
of Europe, Vol. VI, Part I, 1965, p. 447).

Another important discovery in the 1700s (by the
Englishman Abraham Darby) was that coke (a contraction of “coal-cake”), or
coal baked to remove impurities such as sulfur, could be substituted for
charcoal in smelting. This was an important advance since charcoal production
had led to severe deforestation across western Europe and Great Britain.

Steel has a carbon content ranging from
.2 to 1.5 percent, enough carbon to make it harder than wrought iron, but not
so much as to make it as brittle as cast iron. Its hardness combined with its
flexibility and tensile strength make steel far more useful than either type of
iron: it is more durable and holds a sharp edge better than the softer wrought
iron, but it resists shock and tension better than the more brittle cast iron.
However, until the mid 1800s, steel was difficult to manufacture and
expensive. Prior to the invention of the Bessemer converter (described below),
steel was made mainly by the so-called cementation process. Bars of
wrought iron would be packed in powdered charcoal, layer upon layer, in tightly
covered stone boxes and heated. After several days of heating, the wrought
iron bars would absorb carbon; to distribute the carbon more evenly, the metal
would be broken up, rebundled with charcoal powder, and reheated. The
resulting blister steel would then be heated again and brought under a
forge hammer to give it a more consistent texture. In the 1740s, the English clockmaker
Benjamin Huntsman, searching for a higher-quality steel for making clock
springs, discovered that blister steel could be melted in clay crucibles and
further refined by the addition of a special flux that removed fine particles
of slag that the cementation process could not remove. This was called crucible
steel; it was of a high quality, but expensive.

To sum up so far: wrought iron has a little
carbon (.02 to .08 percent), just enough to make it hard without losing its
malleability. Cast iron, in contrast, has a lot of carbon (3 to 4.5 percent),
which makes it hard but brittle and nonmalleable. In between these is steel,
with .2 to 1.5 percent carbon, making it harder than wrought iron, yet
malleable and flexible, unlike cast iron. These properties make steel more
useful than either wrought or cast iron, yet prior to 1856, there was no easy
way to control the carbon level in iron so as to manufacture steel cheaply and
efficiently. Yet the growth of railroads in the 1800s created a huge market
for steel. The first railroads ran on wrought iron rails which were too soft
to be durable. On some busy stretches, and on the outer edges of curves, the
wrought iron rails had to be replaced every six to eight weeks. Steel rails
would be far more durable, yet the labor- and energy-intensive process of
cementation made steel prohibitively expensive for such large-scale uses.
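The three carbon ranges in this summary can be captured in a few lines of code. The function below only illustrates the thresholds given in the text; how to treat carbon contents falling between the named ranges (e.g. 0.1 percent) is an assumption of this sketch, not a metallurgical standard.

```python
def classify_iron(carbon_pct):
    """Classify an iron-carbon alloy by percent carbon, using the
    ranges given in the text. Values between the named ranges are
    reported as unclassified (an assumption of this sketch)."""
    if 0.02 <= carbon_pct <= 0.08:
        return "wrought iron"   # tough and malleable
    if 0.2 <= carbon_pct <= 1.5:
        return "steel"          # hard, yet flexible and strong
    if 3.0 <= carbon_pct <= 4.5:
        return "cast iron"      # hard but brittle; cannot be forged
    return "unclassified"

# Example: a rail steel at about 0.5 percent carbon
print(classify_iron(0.5))  # prints "steel"
```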

The mass-production of cheap steel only became
possible after the introduction of the Bessemer process, named
after its brilliant inventor, the British metallurgist Sir Henry Bessemer
(1813-1898). Bessemer reasoned that carbon in molten pig iron unites readily
with oxygen, so a strong blast of air through molten pig iron should convert
the pig iron into steel by reducing its carbon content. In 1856 Bessemer
designed what he called a converter, a large, pear-shaped receptacle
with holes at the bottom to allow the injection of compressed air. Bessemer
filled it with molten pig iron, blew compressed air through the molten metal,
and found that the pig iron was indeed emptied of carbon and silicon in just a
few minutes; moreover, instead of freezing up from the blast of cold air, the
metal became even hotter and so remained molten. Subsequent experimentation by
another British inventor, Robert Mushet, showed that the air blast actually
removed too much carbon and left too much oxygen behind in the molten metal.
This made necessary the addition of a compound of iron, carbon, and manganese
called spiegeleisen (or spiegel for short): the manganese
removes the oxygen in the form of manganese oxide, which passes into the slag,
and the carbon remains behind, converting the molten iron into steel. (Ferromanganese
serves a similar purpose.) The blast of air through the molten pig iron, followed
by the addition of a small quantity of molten spiegel, thus converts the whole
large mass of molten pig iron into steel in just minutes, without the need for
any additional fuel (as contrasted with the days, and tons of extra fuel and
labor, required for puddling and cementation).
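The converter chemistry described above can be sketched with standard textbook equations (a simplification; these particular equations are not from the source):

```latex
% Air blast: silicon and carbon in the molten pig iron oxidize, and the
% heat released by these reactions keeps the charge molten:
\mathrm{Si} + \mathrm{O_2} \longrightarrow \mathrm{SiO_2}
\qquad 2\,\mathrm{C} + \mathrm{O_2} \longrightarrow 2\,\mathrm{CO}
% Spiegeleisen addition: manganese scavenges the dissolved oxygen
% (present largely as FeO) into the slag as manganese oxide:
\mathrm{Mn} + \mathrm{FeO} \longrightarrow \mathrm{MnO} + \mathrm{Fe}
```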

One shortcoming of the initial Bessemer process,
however, was that it did not remove phosphorus from the pig iron. Phosphorus
makes steel excessively brittle. Initially, therefore, the Bessemer process
could only be used on pig iron made from phosphorus-free ores. Such ores are
relatively scarce and expensive, as they are found in only a few places (e.g.
Wales and Sweden, where Bessemer got his iron ore, and upper Michigan). In
1876, the Welshman Sidney Gilchrist Thomas discovered that adding a chemically
basic material such as limestone to the converter draws the phosphorus from the
pig iron into the slag, which floats to the top of the converter where it can
be skimmed off, resulting in phosphorus-free steel. (This is called the basic
Bessemer process, or the Thomas basic process.) This crucial
discovery meant that vast stores of iron ore from many regions of the world
could be used to make pig iron for Bessemer converters, which in turn led to
skyrocketing production of cheap steel in Europe and the U.S. In the U.S., for
example, in 1867, 460,000 tons of wrought iron rails were made and sold for $83
per ton; only 2,550 tons of Bessemer steel rails were made, fetching a price of
up to $170 per ton. By 1884, in contrast, iron rails had virtually ceased to be
made at all; steel rails had replaced them at an annual production of 1,500,000
tons selling at a price of $32 per ton. Andrew Carnegie’s genius for lowering
production costs would drive prices as low as $14 per ton before the end of the
century. (This drop in cost was accompanied by an equally dramatic increase in
quality as steel replaced iron rails: from 1865 to 1905, the average life of a
rail increased from two years to ten and the car weight a rail could bear increased
from eight tons to seventy.)

The Bessemer process did not have the field to
itself for long as inventors sought ways around the patents (over 100 of them)
held by Henry Bessemer. In the 1860s, a rival appeared on the scene: the open-hearth
process, developed primarily by the German engineer Karl Wilhelm Siemens.
This process converts iron into steel in a broad, shallow, open-hearth furnace
(also called a Siemens gas furnace since it was fueled first by coal
gas, later by natural gas) by adding wrought iron or iron oxide to
molten pig iron until the carbon content is reduced by dilution and oxidation.
Using exhaust gases to preheat air and gas prior to combustion, the Siemens
furnace could achieve very high temperatures. As with Bessemer converters, the
use of basic materials such as limestone in open-hearth furnaces helps to
remove phosphorus from the molten metal (a modification called the basic
open-hearth process). Unlike the Bessemer converter, which makes steel in
one volcanic rush, the open-hearth process takes hours and allows for periodic
laboratory testing of the molten steel so that steel can be made to the precise
specifications of the customer as to chemical composition and mechanical
properties. The open hearth process also allows for the production of larger
batches of steel than the Bessemer process and the recycling of scrap metal.
Because of these advantages, by 1900 the open hearth process had largely
replaced the Bessemer process. (After 1960, it was in turn replaced by the basic
oxygen process, a modification of the Bessemer process, in the production of
steel from iron ore, and by the electric-arc furnace in the production of steel
from scrap.)

Unlike many of his competitors, Andrew Carnegie
was quick to recognize the importance of the Bessemer, Thomas basic, and
open-hearth processes. He was also among the first steelmakers to grasp the
vital importance of chemistry in steelmaking. These became keys to his success
as a steel manufacturer.

In view of his moral failings, can
we really consider Carnegie a “portrait of human greatness”? The case for an
affirmative answer is this. We are heirs to thousands of years of technological
progress, and we benefit every day from the ingenuity and hard work of many
thousands of blacksmiths, ironworkers, steelworkers, engineers, inventors,
chemists, metallurgists, and entrepreneurs, long since deceased, one of whom
was Carnegie and few of whom were saints. Our standard of living today owes
much to Carnegie’s entrepreneurial drive, self-education, and genius for
efficiency. Whatever his flaws – and who among us has none? – Carnegie embodied
a type of human greatness that deserves our appreciation and gratitude.
Without forgetting the contributions of others (especially his workers), we
should make the same judgment about Carnegie that Stephen Ambrose makes about
the men who built the first transcontinental railroad: “Things happened as they
happened. It is possible to imagine all kinds of different routes across the
continent, or a better way for the government to help private industry, or
maybe to have the government build and own it. But those things didn’t happen,
and what did take place is grand. So we admire those who did it – even if they
were far from perfect – for what they were and what they accomplished and how
much each of us owes them.” (Nothing Like It In the World [New York:
Simon and Schuster, 2000], p. 382)

Landes, David. “Technological Change and Development in
Western Europe, 1750-1914,” in Postan and Habakkuk, eds., The Cambridge
Economic History of Europe, Volume VI, Part I. Cambridge: Cambridge
University Press, 1965, pp. 274-601, esp. pp. 444-8 and 477-496.