From the 18th century through the late 20th century, the history of science, especially of the physical and biological sciences, was often presented in a progressive narrative in which true theories replaced false beliefs.[3] More recent historical interpretations, such as those of Thomas Kuhn, tend to portray the history of science in different terms, such as that of competing paradigms or conceptual systems in a wider matrix that includes intellectual, cultural, economic and political themes outside of science.[4]

In prehistoric times, advice and knowledge were passed from generation to generation in an oral tradition. For example, the domestication of maize for agriculture has been dated to about 9,000 years ago in southern Mexico, before the development of writing systems.[5][6][7] Similarly, archaeological evidence indicates the development of astronomical knowledge in preliterate societies.[8][9]

The development of writing enabled knowledge to be stored and communicated across generations with much greater fidelity. Combined with the development of agriculture, which allowed for a surplus of food, writing made it possible for early civilizations to develop, because more time could be devoted to tasks other than survival.

Many ancient civilizations collected astronomical information in a systematic manner through simple observation. Though they had no knowledge of the real physical structure of the planets and stars, many theoretical explanations were proposed. Basic facts about human physiology were known in some places, and alchemy was practiced in several civilizations.[10][11] Considerable observation of macroscopic flora and fauna was also performed.

From their beginnings in Sumer (now Iraq) around 3500 BC, the Mesopotamian people began to attempt to record observations of the world with numerical data. But their observations and measurements were seemingly taken for purposes other than the formulation of scientific laws. A concrete instance of the Pythagorean rule was recorded as early as the 18th century BC: the Mesopotamian cuneiform tablet Plimpton 322, dated to c. 1800 BC, records a number of Pythagorean triples such as (3, 4, 5) and (5, 12, 13), possibly more than a millennium before Pythagoras,[2] but an abstract formulation of the Pythagorean theorem was not given.[12]
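A Pythagorean triple is a set of three whole numbers satisfying the relation that was later stated abstractly as the Pythagorean theorem; the triples recorded on the tablet can be checked directly, for example:

\[
a^2 + b^2 = c^2, \qquad 3^2 + 4^2 = 9 + 16 = 25 = 5^2 .
\]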

In Babylonian astronomy, records of the motions of the stars, planets, and the moon are left on thousands of clay tablets created by scribes. Even today, astronomical periods identified by Mesopotamian scientists, such as the solar year and the lunar month, are still widely used in Western calendars. Using these data they developed arithmetical methods to compute the changing length of daylight in the course of the year and to predict the appearances and disappearances of the Moon and planets and eclipses of the Sun and Moon. Only a few astronomers' names are known, such as that of Kidinnu, a Chaldean astronomer and mathematician. Kidinnu's value for the solar year is still in use in today's calendars. Babylonian astronomy was "the first and highly successful attempt at giving a refined mathematical description of astronomical phenomena." According to the historian A. Aaboe, "all subsequent varieties of scientific astronomy, in the Hellenistic world, in India, in Islam, and in the West—if not indeed all subsequent endeavour in the exact sciences—depend upon Babylonian astronomy in decisive and fundamental ways."[13]

Ancient Egypt made significant advances in astronomy, mathematics and medicine.[14] The Egyptians' development of geometry was a necessary outgrowth of surveying to preserve the layout and ownership of farmland, which was flooded annually by the Nile River. The 3-4-5 right triangle and other rules of thumb were used to build rectilinear structures and Egypt's post-and-lintel architecture. Egypt was also a center of alchemical research for much of the Mediterranean world.

The Edwin Smith papyrus is one of the first medical documents still extant, and perhaps the earliest document that attempts to describe and analyse the brain: it might be seen as the very beginnings of modern neuroscience. However, while Egyptian medicine had some effective practices, it was not without its ineffective and sometimes harmful practices. Medical historians believe that ancient Egyptian pharmacology, for example, was largely ineffective.[15] Nevertheless, it applied the following components to the treatment of disease: examination, diagnosis, treatment, and prognosis,[3] which display strong parallels to the basic empirical method of science and which, according to G. E. R. Lloyd,[16] played a significant role in the development of this methodology. The Ebers papyrus (c. 1550 BC) also contains evidence of traditional empiricism.

In Classical Antiquity, the inquiry into the workings of the universe took place both in investigations aimed at such practical goals as establishing a reliable calendar or determining how to cure a variety of illnesses and in those abstract investigations known as natural philosophy. The ancient people who are considered the first scientists may have thought of themselves as natural philosophers, as practitioners of a skilled profession (for example, physicians), or as followers of a religious tradition (for example, temple healers).

The earliest Greek philosophers, known as the pre-Socratics,[17] provided competing answers to the question found in the myths of their neighbors: "How did the ordered cosmos in which we live come to be?"[18] The pre-Socratic philosopher Thales (c. 640–546 BC), dubbed the "father of science", was the first to postulate non-supernatural explanations for natural phenomena, for example, that land floats on water and that earthquakes are caused by the agitation of the water upon which the land floats, rather than by the god Poseidon.[19] Thales' student Pythagoras of Samos founded the Pythagorean school, which investigated mathematics for its own sake, and was the first to postulate that the Earth is spherical in shape.[20] Leucippus (5th century BC) introduced atomism, the theory that all matter is made of indivisible, imperishable units called atoms. This was greatly expanded by his pupil Democritus.

Subsequently, Plato and Aristotle produced the first systematic discussions of natural philosophy, which did much to shape later investigations of nature. Their development of deductive reasoning was of particular importance and usefulness to later scientific inquiry. Plato founded the Platonic Academy in 387 BC, whose motto was "Let none unversed in geometry enter here", and turned out many notable philosophers. Plato's student Aristotle introduced empiricism and the notion that universal truths can be arrived at via observation and induction, thereby laying the foundations of the scientific method.[21] Aristotle also produced many biological writings that were empirical in nature, focusing on biological causation and the diversity of life. He made countless observations of nature, especially the habits and attributes of plants and animals in the world around him, classified more than 540 animal species, and dissected at least 50. Aristotle's writings profoundly influenced subsequent Islamic and European scholarship, though they were eventually superseded in the Scientific Revolution.

"Men were weighing for thousands of years before Archimedes worked out the laws of equilibrium; they must have had practical and intuitional knowledge of the principles involved. What Archimedes did was to sort out the theoretical implications of this practical knowledge and present the resulting body of knowledge as a logically coherent system."

and again:

"With astonishment we find ourselves on the threshold of modern science. Nor should it be supposed that by some trick of translation the extracts have been given an air of modernity. Far from it. The vocabulary of these writings and their style are the source from which our own vocabulary and style have been derived."[24]

In medicine, Hippocrates (c. 460 BC – c. 370 BC) and his followers were the first to describe many diseases and medical conditions and developed the Hippocratic Oath for physicians, still relevant and in use today. Herophilos (335–280 BC) was the first to base his conclusions on dissection of the human body and to describe the nervous system. Galen (129 – c. 200 AD) performed many audacious operations—including brain and eye surgeries— that were not tried again for almost two millennia.

One of the oldest surviving fragments of Euclid's Elements, found at Oxyrhynchus and dated to c. 100 AD.[26]

Theophrastus wrote some of the earliest descriptions of plants and animals, establishing the first taxonomy and looking at minerals in terms of their properties such as hardness. Pliny the Elder produced one of the largest encyclopedias of the natural world in 77 AD, and must be regarded as the rightful successor to Theophrastus. For example, he accurately describes the octahedral shape of the diamond, and proceeds to mention that diamond dust is used by engravers to cut and polish other gems owing to its great hardness. His recognition of the importance of crystal shape is a precursor to modern crystallography, while his mention of numerous other minerals presages mineralogy. He also recognises that other minerals have characteristic crystal shapes, but in one example confuses the crystal habit with the work of lapidaries. He was also the first to recognise that amber was a fossilized resin from pine trees, because he had seen samples with insects trapped within them.

Mathematics: The earliest traces of mathematical knowledge in the Indian subcontinent appear with the Indus Valley Civilization (c. 4th–3rd millennium BC). The people of this civilization made bricks whose dimensions were in the proportion 4:2:1, considered favorable for the stability of a brick structure.[30] They also tried to standardize the measurement of length to a high degree of accuracy. They designed a ruler—the Mohenjo-daro ruler—whose unit of length (approximately 1.32 inches or 3.4 centimetres) was divided into ten equal parts. Bricks manufactured in ancient Mohenjo-daro often had dimensions that were integral multiples of this unit of length.[31]

Astronomy: The first textual mention of astronomical concepts comes from the Vedas, religious literature of India.[37] According to Sarma (2008): "One finds in the Rigveda intelligent speculations about the genesis of the universe from nonexistence, the configuration of the universe, the spherical self-supporting earth, and the year of 360 days divided into 12 equal parts of 30 days each with a periodical intercalary month."[37] The first 12 chapters of the Siddhanta Shiromani, written by Bhāskara in the 12th century, cover topics such as: mean longitudes of the planets; true longitudes of the planets; the three problems of diurnal rotation; syzygies; lunar eclipses; solar eclipses; latitudes of the planets; risings and settings; the moon's crescent; conjunctions of the planets with each other; conjunctions of the planets with the fixed stars; and the patas of the sun and moon. The 13 chapters of the second part cover the nature of the sphere, as well as significant astronomical and trigonometric calculations based on it.

Linguistics: Some of the earliest linguistic activities can be found in Iron Age India (1st millennium BC) with the analysis of Sanskrit for the purpose of the correct recitation and interpretation of Vedic texts. The most notable grammarian of Sanskrit was Pāṇini (c. 520–460 BC), whose grammar formulates close to 4,000 rules which together form a compact generative grammar of Sanskrit. Inherent in his analytic approach are the concepts of the phoneme, the morpheme and the root.

Medicine: Findings from Neolithic graveyards in what is now Pakistan show evidence of proto-dentistry among an early farming culture.[39] Ayurveda is a system of traditional medicine that originated in ancient India before 2500 BC,[40] and is now practiced as a form of alternative medicine in other parts of the world. Its most famous text is the Suśrutasamhitā of Suśruta, which is notable for describing procedures for various forms of surgery, including rhinoplasty, the repair of torn ear lobes, perineal lithotomy, cataract surgery, and several other excisions and surgical procedures.

Metallurgy: Wootz, crucible and stainless steels were developed in India and were widely exported to the Classical Mediterranean world. The material was known to Pliny the Elder as ferrum indicum. Indian wootz steel was held in high regard in the Roman Empire and was often considered to be the best. Later, in the Middle Ages, it was imported into Syria, where by the year 1000 it was worked with special techniques to produce "Damascus steel".[41]

The Hindus excel in the manufacture of iron, and in the preparations of those ingredients along with which it is fused to obtain that kind of soft iron which is usually styled Indian steel (Hindiah). They also have workshops wherein are forged the most famous sabres in the world.

Mathematics: From the earliest times, the Chinese used a positional decimal system on counting boards in order to calculate. To express 10, a single rod is placed in the second box from the right. The spoken language uses a similar system to English: e.g., four thousand two hundred seven. No symbol was used for zero. By the 1st century BC, negative numbers and decimal fractions were in use, and The Nine Chapters on the Mathematical Art included methods for extracting higher-order roots by Horner's method, for solving linear equations, and for applying Pythagoras' theorem. Cubic equations were solved in the Tang dynasty, and solutions of equations of order higher than 3 appeared in print in 1245 AD in the work of Ch'in Chiu-shao. Pascal's triangle for binomial coefficients was described around 1100 by Jia Xian.
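The nested evaluation at the heart of what is now called Horner's method can be illustrated with a generic cubic (written here in modern notation, not as the counting-board procedure itself):

\[
a_3x^3 + a_2x^2 + a_1x + a_0 = \bigl((a_3x + a_2)x + a_1\bigr)x + a_0 ,
\]

which reduces the work of evaluation to three multiplications and three additions.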

Although the first attempts at an axiomatisation of geometry appear in the Mohist canon in 330 BC, Liu Hui developed algebraic methods in geometry in the 3rd century AD and also calculated pi to 5 significant figures. In 480, Zu Chongzhi improved on this by discovering the ratio 355/113, which remained the most accurate value for 1200 years.
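The longevity of this value is easy to check; assuming the ratio in question is the well-known approximation 355/113, it agrees with pi to six decimal places:

\[
\frac{355}{113} = 3.1415929\ldots, \qquad \pi = 3.1415926\ldots, \qquad \left|\pi - \tfrac{355}{113}\right| < 3\times10^{-7} .
\]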

Astronomy: Astronomical observations from China constitute the longest continuous sequence from any civilisation and include records of sunspots (112 records from 364 BC), supernovae (such as that of 1054), and lunar and solar eclipses. By the 12th century, they could predict eclipses reasonably accurately, but the knowledge of this was lost during the Ming dynasty, so that the Jesuit Matteo Ricci gained much favour in 1601 by his predictions.[44] By 635 Chinese astronomers had observed that the tails of comets always point away from the sun.

From antiquity, the Chinese used an equatorial system for describing the skies, and a star map from 940 was drawn using a cylindrical (Mercator) projection. The use of an armillary sphere is recorded from the 4th century BC, and a sphere permanently mounted on an equatorial axis from 52 BC. In 125 AD Zhang Heng used water power to rotate the sphere in real time. This sphere included rings for the meridian and ecliptic. By 1270 they had incorporated the principles of the Arab torquetum.

Seismology: To better prepare for calamities, Zhang Heng invented a seismometer in 132 CE which provided an instant alert to authorities in the capital Luoyang that an earthquake had occurred in a location indicated by a specific cardinal or ordinal direction.[45] Although no tremors could be felt in the capital when Zhang told the court that an earthquake had just occurred in the northwest, a message came soon afterwards that an earthquake had indeed struck 400 km (248 mi) to 500 km (310 mi) northwest of Luoyang (in what is now modern Gansu).[46] Zhang called his device the 'instrument for measuring the seasonal winds and the movements of the Earth' (Houfeng didong yi 候风地动仪), so named because he and others thought that earthquakes were most likely caused by the enormous compression of trapped air.[47] See Zhang's seismometer for further details.

However, cultural factors prevented these Chinese achievements from developing into what we might call "modern science". According to Needham, it may have been the religious and philosophical framework of Chinese intellectuals which made them unable to accept the ideas of laws of nature:

It was not that there was no order in nature for the Chinese, but rather that it was not an order ordained by a rational personal being, and hence there was no conviction that rational personal beings would be able to spell out in their lesser earthly languages the divine code of laws which he had decreed aforetime. The Taoists, indeed, would have scorned such an idea as being too naïve for the subtlety and complexity of the universe as they intuited it.[50]

While the Byzantine Empire still held learning centers such as Constantinople, Western Europe's knowledge was concentrated in monasteries until the development of medieval universities in the 12th and 13th centuries. The curriculum of monastic schools included the study of the few available ancient texts and of new works on practical subjects like medicine[51] and timekeeping.[52]

Muslim scientists placed far greater emphasis on experiment than had the Greeks.[53] This led to an early scientific method being developed in the Muslim world, where significant progress in methodology was made, beginning with the experiments of Ibn al-Haytham (Alhazen) on optics from c. 1000, in his Book of Optics. The law of refraction of light was known to the Persians.[54] The most important development of the scientific method was the use of experiments to distinguish between competing scientific theories set within a generally empirical orientation, which began among Muslim scientists. Ibn al-Haytham is also regarded as the father of optics, especially for his empirical proof of the intromission theory of light. Some have also described Ibn al-Haytham as the "first scientist" for his development of the modern scientific method.[55]

Ibn Sina (Avicenna) is regarded as the most influential philosopher of Islam.[68] He pioneered the science of experimental medicine[69] and was the first physician to conduct clinical trials.[70] His two most notable works in medicine are the Kitāb al-shifāʾ ("Book of Healing") and The Canon of Medicine, both of which were used as standard medicinal texts in both the Muslim world and in Europe well into the 17th century. Amongst his many contributions are the discovery of the contagious nature of infectious diseases,[69] and the introduction of clinical pharmacology.[71]

Islamic science began its decline in the 12th or 13th century, before the Renaissance in Europe, due in part to the 11th–13th century Mongol conquests, during which libraries, observatories, hospitals and universities were destroyed.[80] The end of the Islamic Golden Age is marked by the destruction of the intellectual center of Baghdad, the capital of the Abbasid caliphate, in 1258.[80]

As well as this, Europeans began to venture further and further east (most notably, perhaps, Marco Polo) as a result of the Pax Mongolica. This led to increased awareness of Indian and even Chinese culture and civilization within the European tradition. Technological advances were also made, such as the early flight of Eilmer of Malmesbury (who had studied mathematics in 11th century England),[83] and the metallurgical achievements of the Cistercian blast furnace at Laskill.[84][85]

At the beginning of the 13th century, there were reasonably accurate Latin translations of the main works of almost all the intellectually crucial ancient authors, allowing a sound transfer of scientific ideas via both the universities and the monasteries. By then, the natural philosophy contained in these texts began to be extended by notable scholastics such as Robert Grosseteste, Roger Bacon, Albertus Magnus and Duns Scotus. Precursors of the modern scientific method, influenced by earlier contributions of the Islamic world, can be seen already in Grosseteste's emphasis on mathematics as a way to understand nature, and in the empirical approach admired by Bacon, particularly in his Opus Majus. Pierre Duhem's provocative thesis that the Catholic Church's Condemnation of 1277 gave rise to modern science led to the study of medieval science as a serious discipline, "but no one in the field any longer endorses his view that modern science started in 1277".[86] However, many scholars agree with Duhem's view that the Middle Ages were a period of important scientific developments.[87][88][89][90]

The first half of the 14th century saw much important scientific work being done, largely within the framework of scholastic commentaries on Aristotle's scientific writings.[91] William of Ockham introduced the principle of parsimony: natural philosophers should not postulate unnecessary entities, so that motion is not a distinct thing but is only the moving object,[92] and an intermediary "sensible species" is not needed to transmit an image of an object to the eye.[93] Scholars such as Jean Buridan and Nicole Oresme started to reinterpret elements of Aristotle's mechanics. In particular, Buridan developed the theory that impetus was the cause of the motion of projectiles, which was a first step towards the modern concept of inertia.[94] The Oxford Calculators began to mathematically analyze the kinematics of motion, making this analysis without considering the causes of motion.[95]

In 1348, the Black Death and other disasters brought a sudden end to the previous period of massive philosophic and scientific development. Yet the rediscovery of ancient texts was stimulated after the Fall of Constantinople in 1453, when many Byzantine scholars had to seek refuge in the West. Meanwhile, the introduction of printing was to have a great effect on European society. The facilitated dissemination of the printed word democratized learning and allowed a faster propagation of new ideas. New ideas also helped to influence the development of European science at this point: not least the introduction of algebra. These developments paved the way for the Scientific Revolution, which may also be understood as a resumption of the process of scientific inquiry, halted at the start of the Black Death.

The renewal of learning in Europe that began with 12th century Scholasticism came to an end about the time of the Black Death, and the initial period of the subsequent Italian Renaissance is sometimes seen as a lull in scientific activity. The Northern Renaissance, on the other hand, showed a decisive shift in focus from Aristotelian natural philosophy to chemistry and the biological sciences (botany, anatomy, and medicine).[99] Thus modern science in Europe resumed in a period of great upheaval: the Protestant Reformation and the Catholic Counter-Reformation, the discovery of the Americas by Christopher Columbus, the Fall of Constantinople, and also the rediscovery of Aristotle during the Scholastic period all presaged large social and political changes. Thus, a suitable environment was created in which it became possible to question scientific doctrine, in much the same way that Martin Luther and John Calvin questioned religious doctrine. The works of Ptolemy (astronomy) and Galen (medicine) were found not always to match everyday observations. Work by Vesalius on human cadavers found problems with the Galenic view of anatomy.[100]

The Romantic Movement of the early 19th century reshaped science by opening up new pursuits unexpected in the classical approaches of the Enlightenment. Major breakthroughs came in biology, especially in Darwin's theory of evolution, as well as physics (electromagnetism), mathematics (non-Euclidean geometry, group theory) and chemistry (organic chemistry). Romanticism declined as a new movement, Positivism, took hold of the ideals of the intellectuals after 1840 and lasted until about 1880.

The Scientific Revolution established science as a source for the growth of knowledge.[103] During the 19th century, the practice of science became professionalized and institutionalized in ways that continued through the 20th century. As the role of scientific knowledge grew in society, it became incorporated with many aspects of the functioning of nation-states.

The history of science is marked by a chain of advances in technology and knowledge that have always complemented each other. Technological innovations bring about new discoveries and are bred by other discoveries, which inspire new possibilities and approaches to longstanding science issues.

The Scientific Revolution is a convenient boundary between ancient thought and classical physics. Nicolaus Copernicus revived the heliocentric model of the solar system described by Aristarchus of Samos. This was followed by the first known model of planetary motion given by Kepler in the early 17th century, which proposed that the planets follow elliptical orbits, with the Sun at one focus of the ellipse. Galileo ("Father of Modern Physics") also made use of experiments to validate physical theories, a key element of the scientific method.

The beginning of the 20th century brought the start of a revolution in physics. The long-held theories of Newton were shown not to be correct in all circumstances. Beginning in 1900, Max Planck, Albert Einstein, Niels Bohr and others developed quantum theories to explain various anomalous experimental results, by introducing discrete energy levels. Not only did quantum mechanics show that the laws of motion did not hold on small scales, but even more disturbingly, the theory of general relativity, proposed by Einstein in 1915, showed that the fixed background of spacetime, on which both Newtonian mechanics and special relativity depended, could not exist. In 1925, Werner Heisenberg and Erwin Schrödinger formulated quantum mechanics, which explained the preceding quantum theories. The observation by Edwin Hubble in 1929 that the speed at which galaxies recede positively correlates with their distance, led to the understanding that the universe is expanding, and the formulation of the Big Bang theory by Georges Lemaître.
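Hubble's observation is commonly summarized in what is now called the Hubble–Lemaître law, written here in modern notation:

\[
v = H_0 \, d ,
\]

where v is the recession velocity of a galaxy, d its distance, and H_0 the Hubble constant.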

In 1938 Otto Hahn and Fritz Strassmann discovered nuclear fission with radiochemical methods, and in 1939 Lise Meitner and Otto Robert Frisch wrote the first theoretical interpretation of the fission process, which was later improved by Niels Bohr and John A. Wheeler. Further developments took place during World War II, which led to the practical application of radar and the development and use of the atomic bomb. Though the process had begun with the invention of the cyclotron by Ernest O. Lawrence in the 1930s, physics in the postwar period entered into a phase of what historians have called "Big Science", requiring massive machines, budgets, and laboratories in order to test theories and move into new frontiers. State governments became the primary patrons of physics, recognizing that the support of "basic" research could often lead to technologies useful to both military and industrial applications. Currently, general relativity and quantum mechanics are inconsistent with each other, and efforts are underway to unify the two.

Modern chemistry emerged from the sixteenth through the eighteenth centuries through the material practices and theories promoted by alchemy, medicine, manufacturing and mining.[104] A decisive moment came when 'chymistry' was distinguished from alchemy by Robert Boyle in his work The Sceptical Chymist, in 1661, although the alchemical tradition continued for some time after his work. Other important steps included the gravimetric experimental practices of medical chemists like William Cullen, Joseph Black, Torbern Bergman and Pierre Macquer, and the work of Antoine Lavoisier (the "Father of Modern Chemistry") on oxygen and the law of conservation of mass, which refuted phlogiston theory. The theory that all matter is made of atoms, which are the smallest constituents of matter that cannot be broken down without losing the basic chemical and physical properties of that matter, was provided by John Dalton in 1803, although the question took a hundred years to be settled as proven. Dalton also formulated the law of mass relationships. In 1869, Dmitri Mendeleev composed his periodic table of elements on the basis of Dalton's discoveries.
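As a simple modern illustration of the conservation of mass (a textbook example, not one of Lavoisier's own experiments), the masses in the combustion of hydrogen balance:

\[
2\,\mathrm{H_2} + \mathrm{O_2} \rightarrow 2\,\mathrm{H_2O}, \qquad 4\ \mathrm{g} + 32\ \mathrm{g} \rightarrow 36\ \mathrm{g}\ (\text{to the nearest gram}),
\]

so the mass of the products equals the mass of the reactants.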

The synthesis of urea by Friedrich Wöhler opened a new research field, organic chemistry, and by the end of the 19th century, scientists were able to synthesize hundreds of organic compounds. The later part of the 19th century saw the exploitation of the Earth's petrochemicals, after the exhaustion of the oil supply from whaling. By the 20th century, systematic production of refined materials provided a ready supply of products which provided not only energy, but also synthetic materials for clothing, medicine, and everyday disposable resources. Application of the techniques of organic chemistry to living organisms resulted in physiological chemistry, the precursor to biochemistry. The 20th century also saw the integration of physics and chemistry, with chemical properties explained as the result of the electronic structure of the atom. Linus Pauling's book The Nature of the Chemical Bond used the principles of quantum mechanics to deduce bond angles in ever-more complicated molecules. Pauling's work culminated in the physical modelling of DNA, "the secret of life" (in the words of Francis Crick, 1953). In the same year, the Miller–Urey experiment demonstrated, in a simulation of primordial processes, that basic constituents of proteins, simple amino acids, could themselves be built up from simpler molecules.

Geology existed as a cloud of isolated, disconnected ideas about rocks, minerals, and landforms long before it became a coherent science. Theophrastus' work on rocks, Peri lithōn, remained authoritative for millennia: its interpretation of fossils was not overturned until after the Scientific Revolution. Chinese polymath Shen Kua (1031–1095) first formulated hypotheses for the process of land formation. Based on his observation of fossils in a geological stratum in a mountain hundreds of miles from the ocean, he deduced that the land was formed by erosion of the mountains and by deposition of silt.

Geology did not undergo systematic restructuring during the Scientific Revolution, but individual theorists made important contributions. Robert Hooke, for example, formulated a theory of earthquakes, and Nicholas Steno developed the theory of superposition and argued that fossils were the remains of once-living creatures. Beginning with Thomas Burnet's Sacred Theory of the Earth in 1681, natural philosophers began to explore the idea that the Earth had changed over time. Burnet and his contemporaries interpreted Earth's past in terms of events described in the Bible, but their work laid the intellectual foundations for secular interpretations of Earth history.

Modern geology, like modern chemistry, gradually evolved during the 18th and early 19th centuries. Benoît de Maillet and the Comte de Buffon saw the Earth as much older than the 6,000 years envisioned by biblical scholars. Jean-Étienne Guettard and Nicolas Desmarest hiked central France and recorded their observations on some of the first geological maps. Aided by chemical experimentation, naturalists such as Scotland's John Walker,[105] Sweden's Torbern Bergman, and Germany's Abraham Werner created comprehensive classification systems for rocks and minerals—a collective achievement that transformed geology into a cutting-edge field by the end of the eighteenth century. These early geologists also proposed generalized interpretations of Earth history that led James Hutton, Georges Cuvier and Alexandre Brongniart, following in the steps of Steno, to argue that layers of rock could be dated by the fossils they contained: a principle first applied to the geology of the Paris Basin. The use of index fossils became a powerful tool for making geological maps, because it allowed geologists to correlate the rocks in one locality with those of similar age in other, distant localities. Over the first half of the 19th century, geologists such as Charles Lyell, Adam Sedgwick, and Roderick Murchison applied the new technique to rocks throughout Europe and eastern North America, setting the stage for more detailed, government-funded mapping projects in later decades.

Midway through the 19th century, the focus of geology shifted from description and classification to attempts to understand how the surface of the Earth had changed. The first comprehensive theories of mountain building were proposed during this period, as were the first modern theories of earthquakes and volcanoes. Louis Agassiz and others established the reality of continent-covering ice ages, and "fluvialists" like Andrew Crombie Ramsay argued that river valleys were formed, over millions of years, by the rivers that flow through them. After the discovery of radioactivity, radiometric dating methods were developed, starting in the 20th century. Alfred Wegener's theory of "continental drift" was widely dismissed when he proposed it in the 1910s, but new data gathered in the 1950s and 1960s led to the theory of plate tectonics, which provided a plausible mechanism for it. Plate tectonics also provided a unified explanation for a wide range of seemingly unrelated geological phenomena. Since 1970 it has served as the unifying principle in geology.
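In outline, radiometric dating rests on the exponential law of radioactive decay (stated here in general form, not tied to any particular isotope system):

\[
N(t) = N_0 e^{-\lambda t}, \qquad t = \frac{1}{\lambda}\ln\frac{N_0}{N(t)}, \qquad t_{1/2} = \frac{\ln 2}{\lambda},
\]

where N_0 is the initial number of parent atoms, N(t) the number remaining after time t, λ the decay constant, and t_{1/2} the half-life.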

Geologists' embrace of plate tectonics became part of a broadening of the field from a study of rocks into a study of the Earth as a planet. Other elements of this transformation include: geophysical studies of the interior of the Earth, the grouping of geology with meteorology and oceanography as one of the "earth sciences", and comparisons of Earth and the solar system's other rocky planets.

In 1847, Hungarian physician Ignác Fülöp Semmelweis dramatically reduced the occurrence of puerperal fever by simply requiring physicians to wash their hands before attending to women in childbirth. This discovery predated the germ theory of disease. However, Semmelweis' findings were not appreciated by his contemporaries and came into use only with discoveries by British surgeon Joseph Lister, who in 1865 proved the principles of antisepsis. Lister's work was based on the important findings by French biologist Louis Pasteur. Pasteur was able to link microorganisms with disease, revolutionizing medicine. He also devised one of the most important methods in preventive medicine, when in 1880 he produced a vaccine against rabies. Pasteur invented the process of pasteurization, to help prevent the spread of disease through milk and other foods.[108]

Perhaps the most prominent, controversial and far-reaching theory in all of science has been the theory of evolution by natural selection, put forward by the British naturalist Charles Darwin in his book On the Origin of Species in 1859. Darwin proposed that the features of all living things, including humans, were shaped by natural processes over long periods of time. The theory of evolution in its current form affects almost all areas of biology.[109] Implications of evolution on fields outside of pure science have led to both opposition and support from different parts of society, and profoundly influenced the popular understanding of "man's place in the universe". In the early 20th century, the study of heredity became a major field of investigation after the rediscovery in 1900 of the laws of inheritance developed by the Moravian[110] monk Gregor Mendel in 1866. Mendel's laws provided the beginnings of the study of genetics, which became a major field of research for both science and industry. By 1953, James D. Watson, Francis Crick and Maurice Wilkins had clarified the basic structure of DNA, the genetic material for expressing life in all its forms.[111] In the late 20th century, the possibilities of genetic engineering became practical for the first time, and a massive international effort began in 1990 to map out an entire human genome (the Human Genome Project).

Successful use of the scientific method in the physical sciences led to the same methodology being adapted to better understand the many fields of human endeavor. From this effort the social sciences have been developed.

In Western culture, the study of politics is first found in Ancient Greece. The antecedents of European politics trace their roots back even earlier than Plato and Aristotle, particularly in the works of Homer, Hesiod, Thucydides, Xenophon, and Euripides. Later, Plato analyzed political systems, abstracted their analysis from more literary- and history-oriented studies, and applied an approach we would understand as closer to philosophy. Similarly, Aristotle built upon Plato's analysis to include historical empirical evidence in his analysis.

The Arthaśāstra, an ancient Indian treatise on statecraft, economic policy and military strategy, is attributed to Kautilya[112] and Viṣṇugupta,[113] who are traditionally identified with Chāṇakya (c. 350–283 BCE). In this treatise, the behaviors and relationships of the people, the King, the State, the Government Superintendents, Courtiers, Enemies, Invaders, and Corporations are analysed and documented. Roger Boesche describes the Arthaśāstra as "a book of political realism, a book analysing how the political world does work and not very often stating how it ought to work, a book that frequently discloses to a king what calculating and sometimes brutal measures he must carry out to preserve the state and the common good."[114]

During the rule of Rome, famous historians such as Polybius, Livy and Plutarch documented the rise of the Roman Republic, and the organization and histories of other nations, while statesmen like Julius Caesar, Cicero and others provided us with examples of the politics of the republic and Rome's empire and wars. The study of politics during this age was oriented toward understanding history, understanding methods of governing, and describing the operation of governments.

With the fall of the Roman Empire, there arose a more diffuse arena for political studies. The rise of monotheism and, particularly for the Western tradition, Christianity, brought to light a new space for politics and political action. During the Middle Ages, the study of politics was widespread in the churches and courts. Works such as Augustine of Hippo's The City of God synthesized current philosophies and political traditions with those of Christianity, redefining the borders between what was religious and what was political. Most of the political questions surrounding the relationship between Church and State were clarified and contested in this period.

In the 1920s, John Maynard Keynes prompted a division between microeconomics and macroeconomics. Under Keynesian economics, macroeconomic trends can overwhelm economic choices made by individuals, and governments should promote aggregate demand for goods as a means to encourage economic expansion. Following World War II, Milton Friedman created the concept of monetarism. Monetarism focuses on using the supply and demand of money as a method for controlling economic activity. In the 1970s, monetarism was adapted into supply-side economics, which advocates reducing taxes as a means to increase the amount of money available for economic expansion.

Other modern schools of economic thought are New Classical economics and New Keynesian economics. New Classical economics was developed in the 1970s, emphasizing solid microeconomics as the basis for macroeconomic growth. New Keynesian economics was created partially in response to New Classical economics, and deals with how inefficiencies in the market create a need for control by a central bank or government.

The above "history of economics" reflects modern economic textbooks and this means that the last stage of a science is represented as the culmination of its history (Kuhn, 1962). The "invisible hand" mentioned in a lost page in the middle of a chapter in the middle of the to "Wealth of Nations", 1776, advances as Smith's central message. It is played down that this "invisible hand" acts only "frequently" and that it is "no part of his [the individual's] intentions" because competition leads to lower prices by imitating "his" invention. That this "invisible hand" prefers "the support of domestic to foreign industry" is cleansed—often without indication that part of the citation is truncated.[115] The opening passage of the "Wealth" containing Smith's message is never mentioned as it cannot be integrated into modern theory: "Wealth" depends on the division of labour which changes with market volume and on the proportion of productive to Unproductive labor.

The end of the 19th century marks the start of psychology as a scientific enterprise. The year 1879 is commonly seen as the start of psychology as an independent field of study. In that year Wilhelm Wundt founded the first laboratory dedicated exclusively to psychological research (in Leipzig). Other important early contributors to the field include Hermann Ebbinghaus (a pioneer in memory studies), Ivan Pavlov (who discovered classical conditioning), William James, and Sigmund Freud. Freud's influence has been enormous, though more as cultural icon than a force in scientific psychology.

The 20th century saw a rejection of Freud's theories as being too unscientific, and a reaction against Edward Titchener's atomistic approach to the mind. This led to the formulation of behaviorism by John B. Watson, which was popularized by B. F. Skinner. Behaviorism proposed epistemologically limiting psychological study to overt behavior, since that could be reliably measured. Scientific knowledge of the "mind" was considered too metaphysical, hence impossible to achieve.

The final decades of the 20th century have seen the rise of a new interdisciplinary approach to studying human psychology, known collectively as cognitive science. Cognitive science again considers the mind as a subject for investigation, using the tools of psychology, linguistics, computer science, philosophy, and neurobiology. New methods of visualizing the activity of the brain, such as PET scans and CAT scans, began to exert their influence as well, leading some researchers to investigate the mind by investigating the brain, rather than cognition. These new forms of investigation assume that a wide understanding of the human mind is possible, and that such an understanding may be applied to other research domains, such as artificial intelligence.

Ibn Khaldun can be regarded as the earliest scientific systematic sociologist.[116] Modern sociology emerged in the early 19th century as an academic response to the modernization of the world. Among many early sociologists (e.g., Émile Durkheim), the aim of sociology was structuralism: understanding the cohesion of social groups and developing an "antidote" to social disintegration. Max Weber was concerned with the modernization of society through the concept of rationalization, which he believed would trap individuals in an "iron cage" of rational thought. Some sociologists, including Georg Simmel and W. E. B. Du Bois, utilized more microsociological, qualitative analyses. This microlevel approach played an important role in American sociology, with the theories of George Herbert Mead and his student Herbert Blumer resulting in the creation of the symbolic interactionism approach to sociology.

American sociology in the 1940s and 1950s was dominated largely by Talcott Parsons, who argued that aspects of society that promoted structural integration were therefore "functional". This structural functionalism approach was questioned in the 1960s, when sociologists came to see this approach as merely a justification for inequalities present in the status quo. In reaction, conflict theory was developed, which was based in part on the philosophies of Karl Marx. Conflict theorists saw society as an arena in which different groups compete for control over resources. Symbolic interactionism also came to be regarded as central to sociological thinking. Erving Goffman saw social interactions as a stage performance, with individuals preparing "backstage" and attempting to control their audience through impression management. While these theories are currently prominent in sociological thought, other approaches exist, including feminist theory, post-structuralism, rational choice theory, and postmodernism.

Anthropology can best be understood as an outgrowth of the Age of Enlightenment. It was during this period that Europeans attempted systematically to study human behaviour. Traditions of jurisprudence, history, philology and sociology developed during this time and informed the development of the social sciences of which anthropology was a part.

At the same time, the romantic reaction to the Enlightenment produced thinkers such as Johann Gottfried Herder and later Wilhelm Dilthey whose work formed the basis for the culture concept which is central to the discipline. Traditionally, much of the history of the subject was based on colonial encounters between Western Europe and the rest of the world, and much of 18th- and 19th-century anthropology is now classed as forms of scientific racism.

During the late 19th century, battles over the "study of man" took place between those of an "anthropological" persuasion (relying on anthropometrical techniques) and those of an "ethnological" persuasion (looking at cultures and traditions), and these distinctions became part of the later divide between physical anthropology and cultural anthropology, the latter ushered in by the students of Franz Boas.

In the mid-20th century, many of the methodologies of earlier anthropological and ethnographical study were reevaluated with an eye towards research ethics, while at the same time the scope of investigation broadened far beyond the traditional study of "primitive cultures" (scientific practice itself is often an arena of anthropological study).

Paleoanthropology, a scientific discipline which draws on the methodologies of paleontology, physical anthropology and ethology, among other disciplines, and which has increased in scope and momentum since its emergence in the mid-20th century, continues to yield further insights into human origins, evolution, genetic and cultural heritage, and perspectives on the contemporary human predicament.

As an academic field, history of science began with the publication of William Whewell's History of the Inductive Sciences (first published in 1837). A more formal study of the history of science as an independent discipline was launched by George Sarton's publications, Introduction to the History of Science (1927) and the Isis journal (founded in 1912). Sarton exemplified the early 20th-century view of the history of science as the history of great men and great ideas. He shared with many of his contemporaries a Whiggish belief in history as a record of the advances and delays in the march of progress. The history of science was not a recognized subfield of American history in this period, and most of the work was carried out by interested scientists and physicians rather than professional historians.[117] With the work of I. Bernard Cohen at Harvard, the history of science became an established subdiscipline of history after 1945.[118]

The history of mathematics, history of technology, and history of philosophy are distinct areas of research and are covered in other articles. Mathematics is closely related to but distinct from natural science (at least in the modern conception). Technology is likewise closely related to but clearly differs from the search for empirical truth.

Much of the study of the history of science has been devoted to answering questions about what science is, how it functions, and whether it exhibits large-scale patterns and trends.[119] The sociology of science in particular has focused on the ways in which scientists work, looking closely at the ways in which they "produce" and "construct" scientific knowledge. Since the 1960s, a common trend in science studies (the study of the sociology and history of science) has been to emphasize the "human component" of scientific knowledge, and to de-emphasize the view that scientific data are self-evident, value-free, and context-free.[120] The field of Science and Technology Studies, an area that overlaps and often informs historical studies of science, focuses on the social context of science in both contemporary and historical periods.

Humboldtian science refers to the early 19th century approach of combining scientific fieldwork with the sensibility, ethics and aesthetic ideals of the age of Romanticism.[121] It helped to establish natural history as a separate field, laid the groundwork for ecology, and was based on the role model of the scientist, naturalist and explorer Alexander von Humboldt.[122] The positivism of the later 19th century asserted that all authentic knowledge allows verification and that the only valid knowledge is scientific.[123]

A major subject of concern and controversy in the philosophy of science has been the nature of theory change in science. Karl Popper argued that scientific knowledge is progressive and cumulative; Thomas Kuhn, that scientific knowledge moves through "paradigm shifts" and is not necessarily progressive; and Paul Feyerabend, that scientific knowledge is not cumulative or progressive and that there can be no demarcation in terms of method between science and any other form of investigation.[124]

The mid-20th century saw a series of studies relating to the role of science in a social context, starting from Thomas Kuhn's The Structure of Scientific Revolutions in 1962. It opened the study of science to new disciplines by suggesting that the evolution of science was in part sociologically determined and that positivism did not explain the actual interactions and strategies of the human participants in science. As Thomas Kuhn put it, the history of science may be seen in more nuanced terms, such as that of competing paradigms or conceptual systems in a wider matrix that includes intellectual, cultural, economic and political themes outside of science. "Partly by selection and partly by distortion, the scientists of earlier ages are implicitly presented as having worked upon the same set of fixed problems and in accordance with the same set of fixed canons that the most recent revolution in scientific theory and method made seem scientific."[125]

Further studies, e.g. Jerome Ravetz's 1971 Scientific Knowledge and its Social Problems, referred to the role of the scientific community, as a social construct, in accepting or rejecting (objective) scientific knowledge.[126] The science wars of the 1990s concerned the influence of, in particular, French philosophers who denied the objectivity of science in general or seemed to do so. They also described differences between the idealized model of a pure science and actual scientific practice, while scientism, a revival of the positivist approach, saw in precise measurement and rigorous calculation the basis for finally settling enduring metaphysical and moral controversies.[127][128] However, more recently some of the leading critical theorists have recognized that their postmodern deconstructions have at times been counter-productive, providing intellectual ammunition for reactionary interests. Bruno Latour noted that "dangerous extremists are using the very same argument of social construction to destroy hard-won evidence that could save our lives. Was I wrong to participate in the invention of this field known as science studies? Is it enough to say that we did not really mean what we meant?"[129]

^"For our purpose, science may be defined as ordered knowledge of natural phenomena and of the relations between them." William C. Dampier-Whetham, "Science", in Encyclopædia Britannica, 11th ed. (New York: 1911); "Science comprises, first, the orderly and systematic comprehension, description and/or explanation of natural phenomena and, secondly, the [mathematical and logical] tools necessary for the undertaking." Marshall Clagett, Greek Science in Antiquity (New York: Collier Books, 1955); "Science is a systematic explanation of perceived or imaginary phenomena, or else is based on such an explanation. Mathematics finds a place in science only as one of the symbolical languages in which scientific explanations may be expressed." David Pingree, "Hellenophilia versus the History of Science", Isis83, 559 (1982); Pat Munday, entry "History of Science", New Dictionary of the History of Ideas (Charles Scribner's Sons, 2005).

^Golinski, Jan (2001). Making Natural Knowledge: Constructivism and the History of Science (reprint ed.). University of Chicago Press. p. 2. ISBN 9780226302324. When [history of science] began, during the eighteenth century, it was practiced by scientists (or "natural philosophers") with an interest in validating and defending their enterprise. They wrote histories in which ... the science of the day was exhibited as the outcome of the progressive accumulation of human knowledge, which was an integral part of moral and cultural development.

^Kuhn, T., 1962, "The Structure of Scientific Revolutions", University of Chicago Press, p. 137: "Partly by selection and partly by distortion, the scientists of earlier ages are implicitly presented as having worked upon the same set of fixed problems and in accordance with the same set of fixed canons that the most recent revolution in scientific theory and method made seem scientific."

^Greek Science, many editions, such as the paperback by Penguin Books. Copyrights in 1944, 1949, 1953, 1961, 1963. The first quote above comes from Part 1, Chapter 1; the second, from Part 2, Chapter 4.

^Boyer (1991). "Euclid of Alexandria". A History of Mathematics. p. 119. The Elements of Euclid not only was the earliest major Greek mathematical work to come down to us, but also the most influential textbook of all times. [...] The first printed versions of the Elements appeared at Venice in 1482, one of the very earliest of mathematical books to be set in type; it has been estimated that since then at least a thousand editions have been published. Perhaps no book other than the Bible can boast so many editions, and certainly no mathematical work has had an influence comparable with that of Euclid's Elements.

^Calinger, Ronald (1999). A Contextual History of Mathematics. Prentice-Hall. p. 150. ISBN 0-02-318285-7. Shortly after Euclid, compiler of the definitive textbook, came Archimedes of Syracuse (c. 287–212 BC), the most original and profound mathematician of antiquity.

^Henig, Robin Marantz (2000). The Monk in the Garden: The Lost and Found Genius of Gregor Mendel, the Father of Genetics. Houghton Mifflin. ISBN 0-395-97765-7. OCLC 43648512. The article, written by an obscure Moravian monk named Gregor Mendel...

^Mabbett, I. W. (1 April 1964). "The Date of the Arthaśāstra". Journal of the American Oriental Society 84 (2): 162–169. doi:10.2307/597102. JSTOR 597102. Trautmann, Thomas R. (1971). Kauṭilya and the Arthaśāstra: A Statistical Investigation of the Authorship and Evolution of the Text. Leiden: E.J. Brill. p. 10. "while in his character as author of an arthaśāstra he is generally referred to by his gotra name, Kauṭilya."

^Mabbett 1964; Trautmann 1971: 5: "the very last verse of the work...is the unique instance of the personal name Viṣṇugupta rather than the gotra name Kauṭilya in the Arthaśāstra."

^Boesche, Roger (2002). The First Great Political Realist: Kautilya and His Arthashastra. Lanham: Lexington Books. p. 17. ISBN 0-7391-0401-2.

^Compare Smith's original phrase with Samuelson's quotation of it. In brackets what Samuelson curtailed without indication and without giving a reference: "[As] every individual … [therefore, endeavours as much as he can, both to employ his capital in the support of domestic industry, and so to direct that industry that its produce may be of the greatest value; every individual necessarily labours to render the annual revenue of the society as great as he can. He generally, indeed,] neither intends to promote the general [Smith said "public"] interest, nor knows how much he is promoting it. [By preferring the support of domestic to that of foreign industry,] he intends only his own security, [and by directing that industry in such a manner as its produce may be of the greatest value, he intends only] his own gain; and he is in this, [as in many other cases,] led by an invisible hand to promote an end which was no part of his intention. [Nor is it always the worse for the society that it was no part of it.] By pursuing his own interest, he frequently promotes that of the society more effectually than when he really intends to promote it" Samuelson, Paul A./Nordhaus, William D., 1989, Economics, 13th edition, N.Y. et al.: McGraw-Hill, page 825; Smith, Adam, 1937, The Wealth of Nations, N.Y.: Random House, page 423

^Reingold, Nathan (1986). "History of Science Today, 1. Uniformity as Hidden Diversity: History of Science in the United States, 1920-1940". British Journal for the History of Science 19 (3): 243–262. doi:10.1017/S0007087400023268.

^Lears, T.J. Jackson. "Get Happy!!". The Nation. Retrieved 21 December 2013. ...scientism is a revival of the nineteenth-century positivist faith that a reified "science" has discovered (or is about to discover) all the important truths about human life. Precise measurement and rigorous calculation, in this view, are the basis for finally settling enduring metaphysical and moral controversies—explaining consciousness and choice, replacing ambiguity with certainty.

Lakatos, Imre. History of Science and its Rational Reconstructions, published in The Methodology of Scientific Research Programmes: Philosophical Papers Volume 1. Cambridge: Cambridge University Press, 1978.

Levere, Trevor Harvey. Transforming Matter: A History of Chemistry from Alchemy to the Buckyball (2001)