In this section, you will find out why the fancy word "paradigm" came to be
so popular, and what its technical meaning really is; you will also learn the
meaning of some other amusing words, such as "hagiography." But most
importantly, you should get an appreciation for the complex history and actual
inner workings of science and technology.

One theme for this section is that science and technology are inseparable,
from which (given the previous section) it follows that science, technology
and society are inseparable. For example, society determines what basic
research gets done, which in turn influences what technologies can exist,
which then again influences the social choices of what basic research to
invest in.

But before pursuing this, we need a deeper understanding of science. This
is a very nontrivial task. The history, philosophy and sociology of science
are each huge areas, in fact, each is a whole department (or at least a
program) in most major universities, with dozens of courses, and with degrees
at all levels. We will have to get along with about one hour for each. Some
of you may find parts of this rather tough going, a fast trip through some
dense, deep material. Sorry! Fasten your seat belts!

5.1 The Renaissance and Classical Periods

Let us begin gently, with some of the stories that are often told
to justify the existence of science as it is currently practiced. Heroic
characters like Galileo, Bruno and Newton typically play important roles in
these narratives. For example, we are told that Bruno and Galileo showed
courage in standing up against the Catholic Church for their beliefs, that
Bruno died for doing so, and that Galileo might have. Giordano Bruno,
born circa 1548, was burned at the stake in Rome on 17 February 1600, for
saying things like

Innumerable suns exist; innumerable earths revolve around these suns in a
manner similar to the way the seven planets revolve around our sun. Living
beings inhabit these worlds.

Today nearly all educated people believe essentially the same things (though
we know that there are more than 7 planets around our sun), and it is hard for
us to understand why Bruno got such harsh treatment at the beginning of the
seventeenth century. The SETI web page on
Bruno has (or had) a picture of the monument to intellectual freedom
at Campo dei Fiori in Roma, erected on the site of his martyrdom; I have been
there, and noticed that the local people place ashes and flowers at the base
of the monument every day. (By the way, SETI stands for Search for
Extra-Terrestrial Intelligence, and their webserver seems a bit flaky. Just
in case, here is a link to
another site on Bruno.)

Bruno was a mystically inclined Dominican monk, not a scientist in the
modern sense, despite his interest in astronomy. By contrast, Galileo Galilei
(1564-1642) was a scientist, who arrived at his theories far more rigorously
than Bruno did; but he too had trouble with the Church's Inquisition, in
particular, for his belief that the Earth revolved around the Sun, rather than
vice versa. If he had been as stubborn as Bruno, he too would probably have
been killed, instead of merely sentenced to life imprisonment (though he spent
most of it in house arrest at his country villa). Galileo is famous for his
experiments with falling bodies, done from the Leaning Tower of Pisa; what is
not well known is that the experiments were a failure, and as a result he was
run out of town! For more information on Galileo, see for example the St Andrews webpage on Galileo.

The moral of such stories is usually taken to be something like this: we
need science in order to find out objectively what is really true,
independently of all religious, political, and commercial interests. Science
is the search for truth, and its results are far more reliable than results
found by other methods, which tend to be tainted by various special interests.
Some scientists are taken to be great heroes, and their stories often read
like hagiographies (the literal sense of this word is a written history
of a saint, but we will use it metaphorically to refer to any biography which
makes a hero of its subject).

It is interesting to treat such texts as data, i.e., to view them
critically and analytically, asking questions such as the following: Why are
hagiographies so common, even though this form is often misleading and
error-ridden? And what values lie behind this phenomenon?

We should go back to the ancient Greeks for the origins of modern science,
or even further, to the ancient Egyptians, who used practical geometry for
very complex engineering projects like the pyramids. The contribution of the
Greeks was to systematize this knowledge, by showing how it could be derived
from a small number of basic principles, called axioms; this
development reached its peak in the famous Elements of Euclid. (The
Egyptians also had sophisticated schemes for doing arithmetic, as did the
Babylonians, Assyrians, etc., who used theirs mainly for accounting.) The
most distinctive feature of Greek geometry is that it is deductive;
this was a major advance, and the beginning of modern mathematics. But
because there was less emphasis on experiment than on rationality, modern
empirical science was not yet visible.

Moving very quickly through time now, the Romans did little to advance
mathematics or science; their interests were largely practical, and their
contributions were more in law, warfare and engineering. After the sacking of
Rome by the (so called) barbarians came the period called the "middle ages" or
sometimes the "dark ages," during which only a few monks had any knowledge of
what the Greeks had achieved, and no major advances occurred. During the
period called the "renaissance," and mainly in Italy, things began to change.
Bruno and Galileo were among those in the forefront of this; there were also
of course many very great artists, like Michelangelo, Giotto, and Leonardo da
Vinci, who was also a great engineer and inventor.

Rene Descartes (1596-1650) is another important figure in the development of
modern science; his ideas provide philosophical foundations for much of modern
thought. His aim was to justify the separation of science from theology, so
that science could proceed without interference from the Church. He did this
by asserting that matter and spirit were two completely different realms,
which he called res extensa and res cogitans (things with
extension in space, and things of thought); this doctrine is called
dualism. Under this view, the Church has authority over the spiritual
realm, while the material realm remains open to empirical investigation. Of
course Descartes could not state this goal publicly, or he would likely have
had as much trouble as Bruno, or at least Galileo; but he did state it in a
letter to a friend. Galileo and Descartes appear early in the period called
classical or Enlightenment, the age of rationalism, and of
mechanism. Descartes made major contributions to mathematics, especially his
algebraicization of geometry, called "analytic geometry" and enshrined in the
phrase "Cartesian coordinates." This is perhaps the prime example of
reductionism.

Sir Isaac Newton (1642-1727) is undoubtedly the greatest scientist of the
classical period. He is best known for his physics, including his laws of
motion, his theory of gravitation, his proof that the orbits of the planets
are elliptical, his work in optics, and more. He was the Lucasian Professor
at Trinity College, Cambridge University, and also served as Master of the
Mint for England, and hence was an important public figure. It is now fairly
well known that the "apple" story was made up by an early biographer. It is
less well known that most of his written work consists of attacks on orthodox
theological positions, especially the trinity; this was long kept secret,
because his professorship was at Trinity College. It is even less well known
that most of his experimental work was not in physics at all, but in alchemy!
(He died of mercury poisoning, contracted from his alchemical experiments.)
So Newton is not the great hero of pure rationality that he is often made out
to be. (For details, see the books by Michael White and James Gleick in the
list of recommended books for this class, or more simply, see John Banville's
review of Isaac Newton, by James
Gleick, from the Guardian.)

In fact, early scientists had little understanding of what science is; this
only developed gradually, and is still hotly debated today, as we will see.
Francis
Bacon (1561-1626) and Robert Boyle (1627-91) were early promoters of the
experimental method; Bacon was Lord Keeper of the Seal and later Lord
Chancellor of England; he too died of the effects of his experiments, bronchitis
after stuffing a fowl with snow (pun: he died of fowl play). He is also one
of the people sometimes claimed to have written the plays of Shakespeare.
Boyle is famous for Boyle's law, and is one of the most important founders of
modern chemistry.

5.2 Some Later Developments

Later, a split developed between rationalism and empiricism, the latter
championed by two British philosophers, John Locke (1632-1704) and David Hume
(1711-1776), the former by the German philosopher Immanuel Kant (1724-1804),
following Descartes. Roughly speaking, rationalism is the view that we
can study nature using logical inference, and empiricism is the view
that we can study nature by use of our senses, i.e., that our senses give us
information that corresponds to reality. Both of these presuppose
realism, the view that there is an objective reality, independent of
our ability to perceive it. Today, rationalism and empiricism are no longer
considered to be at odds, and all three views are important epistemological
assumptions underlying modern science (epistemology is the area of
philosophy devoted to studying how we come to know things); that is,
modern science is generally considered to use both reasoning and
experiment, in order to discover what is real.

Dualism seems consistent with the traditional physical sciences (physics,
chemistry, astronomy, etc.), but advances in the human sciences, especially
recent sciences of the mind, e.g., neuroscience, call dualism into doubt. If
science is devoted to material reality, then it must study the mind from a
material point of view, and hence it cannot accept the assumption that the
mind is non-material. Monism is the opposite of dualism; it asserts
that there is just one thing in the world; that one kind of thing might be
material, in which case we have materialism, or it might be spirit,
which, for example, was Plato's view. Thomas Hobbes (1588-1679)
was an important early proponent of materialism (he is also famous for his
political philosophy). Modern neuroscience accepts the view of materialist
monism. This has the effect of eliminating Descartes' mind/body dualism,
but it also seems to exclude a lot of what actual living breathing human
beings regard as important. In a somewhat extreme statement, Francis Crick
wrote about his readers (in parody of Lewis Carroll) that

You're nothing but a pack of neurons!

Both Descartes and Hobbes were said to have had mystical insights about the
certainty of mathematics, and the profound role that this might play in
science, inspired by Euclid's axiomatic geometry, and Galileo's mathematical
theories of falling bodies and moving planets, which were the beginnings of
modern mathematics and physics, respectively. Let me emphasize that the
quantitative, mathematical deterministic character of Galileo's laws
was of absolutely fundamental importance. Hobbes also tried to extend this
kind of rational determinism into the social, with mixed success, but enormous
influence, particularly in theories of government and law.

Another absolutely fundamental characteristic of science is its attempt to
achieve objectivity, excluding all "merely" subjective
factors, such as the beliefs, hopes, fears, prejudices, etc. of the
experimenters, and of others (especially the Church). There is a paradoxical
situation with words here, since we say that the subject of the
experiment is regarded as an object, while the experimenter bans his
subjectivity by becoming objective. This duality between
the experimenter and the experimented upon is exactly parallel to the
Cartesian duality between mind and body, so that these two seemingly opposite
poles actually serve to reinforce each other, and this is what is reflected in
the seemingly strange language discussed above. (Such dualities are very
common, but it is less common for them to leave such clear linguistic
footprints.)

In music, Wolfgang Amadeus Mozart (1756 - 1791) is the epitome of the
classical period, clearly exhibiting an elegant symmetry, restrained emotion,
and self-containment that correspond to the values of rationalism,
objectivity, and individualism that characterize this period. In a common
large grain classification of historical periods, the classical period is
generally considered to be followed by the Romantic period, epitomized
by the music of Ludwig van Beethoven (1770-1827), and characterized by
unrestrained emotionalism, aggression (often called "freedom"), and egoism. In
this period, science conquers nature, and the European nation states conquer
the world.

As you might expect, things get much more complicated in the 20th century.
We will be able to cover only a small part of this huge territory, a little
bit here, and then an even smaller bit later in the course.

5.3 Entering the Twentieth Century

Scientists of the classical era were inspired by the certainty of
mathematical results, and by their amazing applicability to the physical
world. The fact that Newton's physics applies to the planets, to ballistics
(cannon balls, etc.), to automobiles, railway engines, watches, and so much
more, seems to confirm this. Even quantum mechanics supports this view, since
no other physical theory has ever been accurate to so many decimal places (13
at last count). The physicist Eugene Wigner called this the unreasonable
effectiveness of mathematics, wondering what it can mean about the world
that mathematics can describe so many aspects of it so very well. Or does it
perhaps mean something about us instead? Pythagoras (circa 572-510
BC) maintained that the world actually is mathematical, giving
evidence from music and geometry, but few have been willing to go so far in
more recent times. Plato held a weaker view, that mathematical truths, and
all true ideas, lived in their own ideal world, of which we can see only
glimpses. This is called Platonism.

One result of the success of mathematical physics was that other subjects
sought to achieve the same precision and deductive rigor, by employing
mathematical methods in what they hoped was a similar way. Another result was
that some philosophers decided that the ideal kind of knowledge was scientific
knowledge expressed with mathematical precision, and that all other kinds of
knowledge were inferior; in particular, during the 1920s, the so called
Vienna circle developed such views; this was a group in Vienna that
included Rudolf Carnap, Moritz Schlick, Hans Reichenbach, and to some extent,
the great logician Kurt Gödel; they were influenced by the early work of
Ludwig Wittgenstein (although he refused to become a member). Their
philosophy of logical positivism held that the only meaningful
sentences are those that are expressed in logic, and are either empirically
verifiable, or else are logical truths (which are necessarily tautological).
From this, they concluded that all metaphysics is nonsense, including
religion, art, and ethics, which should therefore be de-valued. Their so
called verifiability criterion came under attack from many quarters,
especially in the later work of Wittgenstein, and it now has few adherents.

However, the influence of logical positivism lives on in so called
analytic philosophy, which is now the dominant school in the US and
Britain. And for society as a whole, the view called modernism can be seen as
coherent (though not identical) with logical positivism. Although the term is
used in many different ways by many different people, roughly speaking
modernism calls for a homogeneity of society, an interchangeability of
workers, mass consumerism in the media and in physical goods (which are called
"commodities"), plus predictability, and rationality. Society is composed of
autonomous rational consumers. Science is considered to support modernism.
We are said to live in "modern times" (or perhaps, in early "post-modern
times").

High school science textbooks (and even many college textbooks) give an
outline of the scientific method that looks something like the
following: (1) state a hypothesis H; (2) devise an experimental test for H;
(3) carry out the experiment; (4) and then analyze the data so as to either
confirm or deny H. It is often said that this leads to an ever growing body
of sound empirical knowledge, and therefore to the unending progress of
science, and hence of technology, and therefore society. All this is also
generally considered to be part of modernity, and is highly consistent with
logical positivism and the myth of progress. It is also far from what
actually happens in scientific labs.

5.4 Paradigms and Paradigm Shifts

Thomas Kuhn is famous for introducing a very different way to conceptualize
scientific progress, in his book The Structure of Scientific
Revolutions, using the notions of paradigm, crisis, revolution, and
paradigm shift; see the readings for details.
But please note that Kuhn's own version differs from that of some of his
interpreters, and the fact that Kuhn introduced these ideas does not
necessarily mean that his versions are better; on the contrary, just as
Newton's physics is better than Galileo's, it seems more likely that (at least
some of) the later interpretations of Kuhn may be better than the original.
Moreover, many similar ideas were introduced earlier by Ludwik Fleck
(1896-1961), a Polish MD who wrote about the history of syphilis in 1935.
Once again, I really want to encourage you to think it through for yourself.

As an example, we can consider the Ptolemaic paradigm vs. the Copernican
paradigm for the heavens, noting that the Ptolemaic paradigm is deeply entwined
with the Aristotelian world view, which in turn had become deeply entwined
with Catholic theology. As Kuhn notes, the Copernican paradigm was
not initially better than the Ptolemaic; in fact, Ptolemy's approach
gave more accurate results, until the original Copernican theory was improved
(by Kepler) to view the orbits of the planets as ellipses rather than circles.

Contrary to the high school model (and most philosophy of science until
recently), experiments are not purely objective determinations of fact, but
rather are theory laden, in the sense that they only make sense in the
context of some particular theory; it is impossible to devise an experiment
for measuring how long it takes objects to fall without first having some
theoretical context, including (for example) notions of length and time; more
generally, experiments only make sense within particular paradigms. Galileo's
experiment made sense in terms of his opposition to the Aristotelian paradigm,
and his own fledgling more quantitative theories. Moreover, theories are
underdetermined with respect to data: this means that any given set of
experiments can always be explained in more than one way.

Experiments are also value laden, because they are always embedded
in a paradigm, and paradigms are value laden, in the sense that they involve a
community with shared values, which determine what is and is not worth
pursuing, what are good and bad results, what counts as data, what counts as
theory, and even what counts as a problem; perhaps (somewhat contrary to Kuhn)
these values constitute the real essence of the notion of paradigm.

An important point about successive paradigms follows from this: they
are incomparable (Kuhn's term is incommensurable), in the sense that using
the values of one paradigm to criticize another will at best give misleading
results, and in general will be
just plain wrong. For example, Aristotle's physics was not really about
"motion" in the same sense as Galileo's. Nevertheless, it is quite usual for
each paradigm to give a rational reconstruction of the preceding
paradigm, reevaluating the older material in terms of its own values. This
makes for shorter, more coherent textbooks, but it also makes for bad history.
In particular, Galileo's experiments did not prove that Aristotle's
theory of motion was wrong, because Aristotle's notion of motion was more
phenomenological, i.e., it was more concerned with our natural perception of
motion, what we intuitively feel about motion, than with the results of
objective (and artificial) experimental measurements.

As a result of the incomparability of paradigms, it is not correct to say
that a later paradigm is better than an earlier paradigm in absolute terms,
although of course it will be better in its own terms, and it may well be
better for certain particular purposes. Another interesting observation is
that a paradigm is likely to be more coherent with the values of the culture
that produced it than with some earlier (or later) culture. This, plus the
rational reconstruction of earlier paradigms, and the fact that progress does
occur within a paradigm during normal science (that is, until
a crisis appears) helps support the myth of steady progress. Thus, standing
within our own culture and some current paradigm, we may genuinely be entitled
to say that things have progressed. But we should also realize that this is
relative to a set of values that is not absolute. For example, the Nazis no
doubt saw things getting better and better during the 1930s, relative to their
own values.

The Nazi example suggests why we should not give in to the total moral
relativism that is found in some quarters. For example, I am quite willing to
say that taking life is bad, while still recognizing that this is not a value
that everyone shares at every point in time, or interprets in the same way
that I would. The fact that people may hold different values does not imply
that there are no values, nor does it imply that value systems cannot be
compared, with some being found better than others. We will discuss this in
more detail later on.

Paradigms are naturally conservative, in that there is a great
reluctance to overturn their fundamental values, paradigmatic experiments,
etc.; this makes sense because these define the paradigm. Rather,
things that don't fit are seen as puzzles to be worked on and solved,
and if after long effort they still don't fit, then they are ostracized as
anomalies. If some field is too willing to change its own
fundamentals, then it will not be seen as scientific, but rather as
disorganized and chaotic, and therefore as pre-paradigmatic, i.e., not
yet realizing normal science in a fixed paradigm. For example, contemporary
linguistics is in the middle (or just past the middle) of a paradigm shift
from the theoretical linguistics of the Chomsky school to a more empirically
based cognitive linguistics.

Contemporary theoretical physics is in a state of crisis (in Kuhn's
technical sense of that term), because its two major field-based theories seem
to make incompatible assumptions about nature, and no one as yet knows how to
reconcile them. These theories are quantum mechanics and general relativity,
and their as yet speculative combination is called quantum gravity (a full
unification with the other forces is sometimes called a "theory of
everything"). One of the
difficulties involved is that quantum mechanics is statistical (see Section
5.5 below), whereas relativity is deterministic. Periods of crisis are quite
different from pre-paradigmatic science, in which there is not yet any
agreed upon paradigm; although disagreement is usually rampant, there are no
paradigmatic doctrines or experiments around which disagreement can center.
An example of a pre-paradigmatic field is consciousness studies, since
researchers cannot even agree on what consciousness is, let alone how to study
it; nevertheless, one can feel consensus beginning to emerge in several
subareas.

It is interesting to contrast the Kuhnian view with that of Peter
Galison, who emphasizes the material infrastructure of science rather than
the concepts of science. For example, in his recent book Einstein's
Clocks, Poincare's Maps: Empires of Time, he brings out the role of the
problem of clock synchronization in the thinking of these two giants, who
developed very similar ideas on the relativity of time. Actually neither Kuhn
nor Galison excludes the factors emphasized by the other, and Actor Network
Theory nicely combines the two.

5.5 Statistics

Statistics plays a fundamental role in most science today, because it is
well known that measurements are always somewhat inaccurate, and that repeated
measurements are necessary to ensure accuracy. Furthermore, it's not enough
to just compute an average and proclaim "Well that looks close enough to me".
Indeed, statistics has become a very sophisticated subject, and we will just
skim a few main points here. First, a statistic is a function for
computing a value that summarizes some dataset. Each statistic, in this
sense, has its own probability distribution, and therefore has a certain
likelihood of giving incorrect values in a given experiment. So experimenters
should ensure that the probability of drawing a false conclusion from a given
statistic is sufficiently small for the purpose at hand.
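The point that a statistic is itself subject to chance can be made concrete with a small simulation (a sketch in Python using only the standard library; the population and sample sizes are invented for illustration):

```python
import random
import statistics

random.seed(0)

def sample_mean(population, n):
    """One statistic: the mean of a random sample of size n."""
    return statistics.mean(random.choices(population, k=n))

# An invented "population" of measurements centered near 100.
population = [random.gauss(100.0, 15.0) for _ in range(10_000)]

# Compute the same statistic on many independent samples; its values
# scatter around the true population mean, i.e., the statistic has its
# own probability distribution.
means = [sample_mean(population, n=30) for _ in range(1_000)]

print(round(statistics.mean(means), 1))   # near the population mean
print(round(statistics.stdev(means), 1))  # the spread of the statistic itself
```

The second number, the spread of the statistic itself, shrinks as the sample size grows, which is one reason repeated measurements improve accuracy.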

The standard approach is called hypothesis testing: there is a so
called null hypothesis, which says that what you are testing is false;
you then hope to show that, if the null hypothesis were true, data like yours
would be very unlikely, which entitles you to reject the null hypothesis
(though this does not by itself prove that the hypothesis you are testing is true).
This corresponds to the dictum that you can never prove hypotheses in
science, but only disprove them. Karl Popper is famous for his
doctrine that only falsifiable assertions can be scientific (in part,
this was an attempt to improve on the logical positivists). But science as it
is actually practiced very often takes a looser approach than discussions of
this kind suggest; e.g., very few doubt that cosmology is a science, even
though experiments are impossible, since we only have one universe. Moreover,
Kuhn and followers argue that there are several other very important factors
in the development of scientific theories.
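The logic of the null hypothesis can be sketched with a permutation test, one simple way of computing how surprising the observed data would be if the null hypothesis were true (a hypothetical illustration; the data values are invented):

```python
import random
import statistics

random.seed(1)

def perm_test(a, b, trials=10_000):
    """Permutation test: how often does randomly relabeling the data
    produce a difference of means at least as large as the observed one?
    A small p-value means the null hypothesis ("no real difference")
    makes the observed data look very unlikely, so we reject it."""
    observed = abs(statistics.mean(a) - statistics.mean(b))
    pooled = a + b
    hits = 0
    for _ in range(trials):
        random.shuffle(pooled)
        d = abs(statistics.mean(pooled[:len(a)])
                - statistics.mean(pooled[len(a):]))
        if d >= observed:
            hits += 1
    return hits / trials

treated = [12.1, 11.8, 13.0, 12.5, 12.9, 13.3]  # invented measurements
control = [10.2, 10.9, 11.1, 10.5, 11.4, 10.8]

p = perm_test(treated, control)
print(p < 0.05)  # True: reject the null hypothesis at the 5% level
```

Note that rejecting the null hypothesis at the 5% level does not prove the tested hypothesis; it only says the data would be surprising if nothing were going on.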

Especially in the social sciences and medicine, statistical tests are often
used to determine the degree to which variables are correlated or
covariant, that is, the degree to which they vary together. In many
cases the goal is to determine whether or not one variable "causes" another. For
example, cigarettes and cancer have been clearly shown to covary, but this
does not in itself prove that cigarettes cause cancer; it might be that some
other factor predisposes people to both cancer and cigarettes; it might even
have been the case that cancer predisposes people to smoking! In the 1950s,
when attacks on the cigarette manufacturers began, exactly this kind of
argument was made in the courts, and at that time, it won! Now we know more
about the underlying mechanisms, so the situation is very different, and the
argument that statistical tests do not prove causation cannot prevail. There
are also many examples where absolutely false causal inferences have been
drawn from statistics, so the cigarette example should not be taken as
paradigmatic! See the reading Bayesian Critique of
Statistics in Health (but please regard it as a document to be read
and evaluated critically, rather than as a source of utterly reliable facts,
noting in particular that it argues for a paradigm shift in statistics, from
standard hypothesis testing to Bayesian inference, taking the poorer than
expected reliability of medical studies as an anomaly in the normal Fisher
paradigm; the problem is that it does not provide evidence that Bayesian
methods would eliminate the anomalies.)

We should note that, shockingly to many people, probabilities enter into
the very foundations of quantum mechanics; QM does not directly predict
outcomes, but only the probability distributions of outcomes; and furthermore,
the Heisenberg uncertainty principle says that attempting to measure one
variable (say position) more accurately will cause another variable (such as
momentum) to become more uncertain than it was before. So absolute certainty
is no longer something that modern science can promise, and the interventions
of scientists have become part of the theory.
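For the record, the uncertainty principle has a precise quantitative form (a standard statement, not from the original text); for position x and momentum p it reads

```latex
\Delta x \, \Delta p \ge \frac{\hbar}{2}
```

where ħ is the reduced Planck constant: shrinking the uncertainty in position forces the uncertainty in momentum to grow, and vice versa.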

5.6 Summary and Discussion

One clear trend during the periods discussed in this section is a gradual
diminution of the influence of the Church, and of the spiritual in favor of
the material. This can be seen in the progression of positions held by Bruno,
Galileo, Descartes, Kant, and Wittgenstein (and generally speaking, with some
exceptions of course, those in between).

Although the material in this section should give us some idea of the
history and philosophy of science, it does not give much of an idea about how
science and technology are related. One obvious point is that technology
provides infrastructure for science. The huge experiments of modern physics
are also huge engineering projects, e.g., consider the Stanford Linear
Accelerator (SLAC), with its two miles of magnets accelerating particles in an
incredibly straight line; whole teams worked on designing, testing and
building just these very special magnets; another whole team used lasers to
ensure the linear alignment of the beam.

And of course, it is also said that science underlies technology. For
example, Newton's optics was used in working with the laser beams that aligned
the magnets at SLAC. Finally, it is said that technology converts the
abstract truths of science into tangible benefits for society. For example,
what is learned about particle physics at SLAC may help us build better bombs,
and even better consumer products. But is the picture really so simple as
this? A recent area called "Social Studies of Science and Technology" (SSST)
does not think so, as we will see in the next section.