29 January 2008

If you've never heard of Olszewski, that's no real surprise. Outside of certain physics and chemistry circles, the Polish scientist is not well remembered, but he made a few very significant additions to our knowledge of the most common elements in our atmosphere, as well as exciting new ways to use those elements.

Olszewski was an alumnus of the Jagiellonian University in his hometown of Kraków, long a distinguished center of learning in Poland. Jagiellonian boasts among its other famous alumni Pope John Paul II, science fiction writer Stanislaw Lem, and none other than Nicolaus Copernicus.

Olszewski was an accomplished scientist, but is best known for his work on the creation of liquid oxygen, nitrogen and carbon dioxide. He and colleague Zygmunt Wróblewski were the first to achieve stable liquid states for these gases, and their discoveries are still in wide use today. Liquid nitrogen is, by far, the most common product of their work, used in all kinds of freezing applications, from the freezing and transporting of food to the removal of skin lesions and warts. Liquid oxygen was a vital propellant in early rockets, and is still used in some rocket systems, most notably the U.S. space shuttle.

In 1888, just five years after they achieved those liquid states, Olszewski's most valued partner, Wróblewski, was killed in a laboratory accident during their work studying the basic properties of hydrogen. Olszewski bravely continued the work, eventually achieving a stable liquid hydrogen state and setting a new world record for the lowest achieved temperature, at −225 °C.

28 January 2008

On this date in 1986, all seven crewmembers of the Challenger space shuttle were killed.

The dates this week all seem strangely connected to the worst episodes in the history of space exploration. With my last post, discussing the loss of the Ranger 3 probe, I wanted to talk about the imperfection of human endeavors, but in that case, I could afford to speak lightly of the probe missing the moon. It was, after all, an unmanned probe, and all that was lost was time, effort, and money. Some have had to give much more to the cause of exploring space.

Yesterday was the 41st anniversary of the Apollo 1 disaster. Three astronauts died when a fire started and rapidly spread through the spacecraft as it sat on the launchpad. Gus Grissom, Edward White, and Roger Chaffee were found to have died of smoke inhalation, though all three were also badly burned.

Today marks another sad date in NASA history, and one that, unlike the Apollo 1 fire, I was around to witness. 22 years ago today, the Challenger shuttle broke apart 73 seconds after liftoff from Cape Canaveral. I remember watching, back in the days when every school child in America watched every launch. It was still an occasion, and a great place to start a lesson, especially in a science class. This launch was even more heavily watched by American classrooms, because a teacher, Christa McAuliffe, was aboard, about to become the first teacher ever to leave the atmosphere of Earth.

McAuliffe's name is still, and will always be, inextricably connected to the Challenger explosion. She was the public face of the mission, the poster child for NASA, and when the mission turned so horribly and suddenly tragic, she was still the name in the headlines. I asked a few friends today what they remembered of the explosion, and the only name that any of them could remember was Christa McAuliffe. Six professional astronauts also lost their lives that day, in the performance of their duties and in service of not only our country, but our species. Their names are Greg Jarvis, Ronald McNair, Ellison Onizuka, Judith Resnik, Francis Scobee, and Michael Smith. They all deserve to be remembered.

26 January 2008

In my opinion, one of the most impressive feats of science and technology, not to mention luck, in recent decades has been the success of the Mars rovers. The small (and relatively cheap) "little data collectors that could" have survived on the surface of Mars for years now, collecting hundreds of times the data originally planned for the mission. It is now estimated that the mission will finally end because of dust that has settled on the rovers' solar panels. We never even considered that they could last long enough for dust to be a problem.

Looking at this sort of success, it's easy to forget just how many times we as humans have tried to reach for the stars and failed. Aside from Apollo 13, which was lucky enough to get its own movie, most failed missions are happily forgotten.

One such mission that we often overlook in the history of space exploration is the Ranger 3. Launched from Cape Canaveral on January 26, 1962, it was meant to fly over the moon taking data on radar reflectivity and surface features, then land and continue to collect more data on the surface itself. It was, in many ways, the direct ancestor of the Mars rovers. Only it missed ... by a lot.

The Ranger 3 was launched by an Atlas-Agena B rocket. The idea was to clear the Earth's atmosphere, make a single course correction, and then ride the forces of gravity for the rest of the trip. It's the sort of calculation that anyone outside NASA would have called impossible then, and that many of us would still call impossible today. But this is the reason you and I don't work for NASA. Their calculations were, in all likelihood, perfectly correct, but a faulty transistor on the rocket threw off its guidance system, and Ranger 3 missed the moon by an impressive 37,000 km on January 28. It then joined the thousands of other small objects orbiting our sun, and it remains out there today, spinning around the same star as us. Maybe someday we'll pick it up and put it in a museum, to remind ourselves that, as smart as we are, we and our technology are not perfect.

18 January 2008

When I choose which events and births and deaths to profile in this blog, it's often difficult to pick the most significant. Often I have to hope that I continue to do this for more than a year, just to get back to some of the interesting stories I had to pass up the first time around. For January 18, there was no question that Bronowski would be my subject.

Jacob Bronowski is best known to most people, myself included, for his excellent work on the BBC's The Ascent of Man. Ascent, for those of this generation who have not yet seen it, is a masterwork explaining the various ways that science and technology have shaped human civilization. While Darwin traced humanity's "Descent" back to our evolutionary ancestors, Bronowski chose the opposite direction, starting at the point we became truly modern humans and noting the way that science fostered our growth and development. It was a brilliant show.

One of the most memorable moments from Ascent was also one of the formative moments in my understanding, limited as it may be, of science. In the eleventh of thirteen episodes, Bronowski visited Auschwitz, the infamous Nazi death camp. He proceeded to make one of the most important points I have ever heard: that certainty is the enemy not only of science, but of humanity. No explanation I could possibly offer would do better than Bronowski's own, and so here, thanks to the glory that is YouTube, is that segment. I encourage you to buy the entire series, and see why it was Carl Sagan's inspiration for making Cosmos.

15 January 2008

"Of course, the whole point of a Doomsday Machine is lost if you keep it a secret." - Dr. Strangelove

On this date in 1908, Edward Teller was born.

The brilliant but eccentric Teller was born in Hungary as Teller Ede (in Hungarian, the family name comes first). His family was Jewish, and growing up in the politically tempestuous Hungary of the early 20th century made him deeply distrust totalitarian government. He studied under both Enrico Fermi and Niels Bohr, and emigrated to the United States along with his wife, Mici. There he made several interesting breakthroughs, such as his prediction of Jahn-Teller distortions, and there he also began his interest in nuclear energy.

Teller joined the Manhattan Project in 1942. In an episode that speaks volumes about his level of confidence in his abilities, and his scientific curiosity, he had a famous conversation with Fermi about the potential of the atomic bomb. They argued over whether an even more powerful blast would be possible by using an atomic fission explosion to ignite a fusion reaction. While the first atomic bomb was still three years away, Teller had already begun imagining the hydrogen bomb. He contributed in several ways to the development of the first atomic bomb, and always kept the H-bomb, or "Super" as he called it, in the back of his mind.

He would go on to be known as the father of the H-bomb. The whole idea was largely ignored or marginalized until the Soviets tested an atomic bomb of their own. Teller's hydrogen bomb idea seemed like the obvious next step, and he was central in its design and execution.

Teller is remembered mainly for his H-bomb work, but he is also a symbol of the optimism, sometimes blind, that we had toward atomic power in the 1950s and '60s. He was a leading advocate of peacetime uses for the atomic bomb, most famously recommending the use of hydrogen bombs to create a deep-water port near Point Hope, Alaska. At some point in the early 1960s, cooler heads prevailed, but the plan, called Project Chariot, went farther than anyone likes to admit. He also pressed for the use of nuclear weapons in excavation for oil, a seemingly questionable bit of logic that also failed to see the light of day.

Teller remained a public figure well into the 1980s. In 1979, he took on what he perceived as an unfair attack on nuclear energy in the form of the film "The China Syndrome". He took out an ad in The New York Times blasting Jane Fonda and Ralph Nader for their rants against nuclear power, claiming that his tireless efforts against them had actually caused his heart attack. He wrote, "You might say that I was the only one whose health was affected by that reactor near Harrisburg (referring to Three Mile Island). No, that would be wrong. It was not the reactor. It was Jane Fonda. Reactors are not dangerous."

Teller was a unique man, even among the eccentric brain trust that was the Manhattan Project. While many of his colleagues eventually saw the dangers of nuclear proliferation and became strong opponents of the use of nuclear weapons, he remained a stalwart supporter of them. He made enemies of Oppenheimer, Stanislaw Ulam, and Nobel laureate Isidor Rabi, among others, and his isolation made him that much easier to label a "mad scientist". But though I disagree with many of his views, to remember Edward Teller as "mad" would be grossly unfair. He was a fine physicist, a brilliant man, who believed that he was serving the cause of peace. I personally have always questioned the logic of peace through the most extreme destruction man has ever devised, but I also find it hard to imagine the terror of believing that a Nazi atomic bomb was possible, especially for an ethnic Jew from Hungary cursed with the knowledge of what such a device could do.

14 January 2008

Matthew Fontaine Maury was a true cross-disciplinary scientist. A published astronomer, oceanographer, and geologist, he is best known for his pioneering work in oceanography, or perhaps as one of the great mediators between science and religion, a role as badly needed today as it was 200 years ago.

As one would expect of a 19th century man of learning, Matthew grew up in an educated family. His grandfather was Reverend James Maury, teacher of both James Madison and Thomas Jefferson. Jefferson even lived with the Maurys for two years. James emphasized geography, making it a central part of his lessons, and is widely credited with encouraging young Jefferson's later push to settle the West.

After the death of his beloved brother John, an officer in the Navy, Matthew's parents forbade him to enter naval service. Then, as now, there was no way to better encourage a curious mind than to forbid it something, and Matthew entered the Navy in 1825. He would, from that point forward, always remain in love with the sea, even after a shipboard injury forced him to abandon sailing at the age of 33.

One of his best-known studies had rather mixed effects, depending on your perspective. He was the first to systematically chart whale sightings, realizing that the great whales were migratory and followed certain paths. While this was, of course, a revelation in marine biology, and a hugely important biological finding, it also led to far more efficient harvesting of whales, to a degree that whale populations could not withstand.

Maury was also a founding member of the American Association for the Advancement of Science. During the Civil War, he developed electric naval mines (then called torpedoes) for the South that took a serious toll on Union shipping. Had he been on the Northern side, he might have been remembered as a hero, but his scientific achievements are largely lost in his allegiance to the South.

Above all else, though, Maury was a man of faith. He quoted scripture, read the Bible daily, and was by all accounts a strictly Christian man. His science was a profession, and a great love of his, but never interfered with his beliefs, nor vice versa. Some so-called creation scientists would put him forward as an example of someone that would clearly take their side today, but I have to disagree. I think he would be, as many scientists today are, more of a spectator in the conflict between religion and science, and I like to believe that a man so dedicated to both ways of knowing saw no reason why they should be so at odds.

13 January 2008

On this date in 1938, the Church of England officially recognized, and accepted, the theory of evolution.

That's 70 years ago today. An interesting thing to ponder as we face another round of religion vs. evolution in our own time. In Florida, Bill Foster is running for mayor of St. Petersburg, and has publicly linked Darwinism with the Columbine shootings and other tragedies. (Thanks to Pharyngula for the link). On the federal scale, we have a major candidate for President, in Mike Huckabee, who is clearly against the teaching of evolution in schools.

So how did the Church of England, a church whose roots in Britain reach back to the fourth century, come to accept Darwin's theory of evolution by natural selection? Perhaps the decision had its roots in the church's pride in its native son. After all, Darwin did attend a Church of England school. The decision was part of the 1938 Commission on Doctrine, but if one man was responsible for its passing, it would have to be Bishop Ernest William Barnes.

Barnes was part of what may have been the last generation of the scientist/priest. A mathematician and fellow of the Royal Society, he was also an avid biological collector, and made frequent trips into the wild to observe and collect. In 1927, he wrote a famous letter to the Archbishop of Canterbury, in which he outlined his beliefs in evolution and promised "to show why they did not seem to upset the main Christian position". In a brilliant turn of phrase, he finished his letter "No man shall drive me to Tennessee or to Rome." Only two years before, the state of Tennessee had made its famous case against John Scopes, in the so-called "Monkey trial".

He was hardly the first or the only voice in favor of evolution, but he may have been the best known. His view that theology "must take into account the God of Nature revealed by science" was an insight that many of us would welcome eighty years after he wrote it. He and his kind managed to convince the Church of England to officially accept evolution, and to deny any claim that it somehow undermined or conflicted with their teachings. Though the debate continues even within the Church of England, we can look back on this as a victory not only for the proponents of evolution, but for the much larger populace that yearns for science and religion to more easily coexist, as two forces which should, at their best, make human life better.

12 January 2008

On this date in 1808, the first organizational meeting of the Wernerian Natural History Society was held.

You'd be forgiven for not having heard of the Wernerian Society. Even among Scottish learned societies, it is not terribly well known, but it was an important meeting place for the intellectuals of Edinburgh, which was, many forget, a major center of European scientific thought in the 1800s.

The Society was founded by Robert Jameson, a professor at Edinburgh University. He had worked, briefly, under the great geologist Abraham Gottlob Werner. Werner is one of those tragic historical scientists whose name will always be associated with one wrong idea. His theory of Neptunism, which should at least get some credit for having a very cool name, postulated that all rocks were the result of the crystallization of minerals from an early ocean that had covered the entire world: the Earth was assumed to have once been under miles of water, with minerals coming out of suspension and settling into sedimentary layers. We now know many aspects of this theory to be false, though I still love the idea of a pure water planet.

Werner should be remembered not only for his one wrong idea, but for his other contributions to geology. He wrote the first modern textbook on mineralogy in 1774, and was above all else a fantastic teacher. One student, Jameson, who studied with him for only a year, proceeded to dedicate a large portion of his own life to a society named after Werner, a tribute to his gift for captivating his students.

Members of the Wernerian Society included botanist William Wright, whose collections of plants in Jamaica and elsewhere led to the descriptions of over 800 new species. The chemist Thomas Thomson was also a founding member. The society had 44 meetings in its history, which included visits by such luminaries as American ornithologist John James Audubon and Sir John Richardson, who described to the society the animals found on his overland Arctic expedition.

The Wernerian Society did not long survive the death of its founder, Jameson, in 1854, but while it lasted, it was a hotbed of new ideas and an important part of the intellectual history of Scotland, Europe, and indeed the world.

11 January 2008

If the name doesn't immediately ring bells, that's understandable. Jedlik is best known in engineering circles, but even there, he gets nowhere near the respect or remembrance he deserves. Jedlik is one of the Alfred Russel Wallaces, the men who made discoveries along with, or in Jedlik's case before, the people credited with them, but who for one reason or another have faded into the background of history. Wallace has enjoyed a bit of a comeback, with modern biologists granting him the credit he deserves for basically having the same idea as Darwin at about the same time. Jedlik has gotten no such surge, and it's a shame.

Ányos Jedlik was a true Renaissance man who excelled in many areas of study. He was a priest, a physicist, an inventor, and an engineer. But he should, above all else, be remembered as a teacher. He wrote the first textbook, and taught the first university classes, in physics in the Hungarian language. Up until that point, physics had been taught in Latin, and Jedlik did nothing less than translate an entire science into his native tongue for his countrymen. He lectured at Budapest University of Sciences for 40 years, and helped to foster a whole generation of Hungarian scientists.

Jedlik's most important invention, and the one for which he is truly an unsung genius, is the dynamo. Though Werner von Siemens is given credit for the invention of the device, Jedlik's dynamo predates Siemens' by six years. However, by the time he got around to publishing it, Siemens had published his own and begun marketing it commercially. Soon, Siemens was a well-known name, and he would go on to have such honors as a unit named for him: the siemens is the SI unit of electrical conductance. There are no scales that measure in jedliks.

Jedlik's accomplishment shouldn't necessarily take anything away from Siemens. After all, having an idea isn't always the hardest part, and credit often belongs to the person with the ability to make an invention commonly applicable to people's lives, not just the one who invents it. Just as the attention paid to Wallace takes nothing away from Darwin, we should accept that many of history's geniuses were not alone in their discoveries, but instead were simply the ones most able to bring those discoveries to market. And then there are the Einsteins, who really do come out of nowhere with discoveries and theories that might not have been made for another 50 years otherwise, but that's a story for another day.

09 January 2008

It seems appropriate today to do a post about extreme weather. I live in the Pacific Northwest, and though the winter is always wet, this season has been unusually gray and rainy. A puddle that stresses the definition of puddle and probably deserves to be a pond sits right outside my front door, and occasionally seeps in a bit. You only have to go up about 500 feet for all the water to turn into snow, and there's plenty of that at elevation.

But if I had been alive 128 years ago today, this nasty winter weather would have seemed but a pleasant fantasy, because on this date in 1880, one of the greatest winter storms in American history hit the Pacific Northwest.

Unfortunately, the region lacked the modern measurement devices that would tell us exactly how fast the wind blew and how low the barometer sank. One estimate puts it at 998 mb, an extreme storm event, but we have to go on words instead of numbers to understand the ferocity of the storm. Most of the words surviving to this day are from letters to the Daily Oregonian, a newspaper in Astoria, OR. Here are a few excerpts:

"We have just experienced one of the severest gales; nothing like it has occurred since the settlement of the bay." Astoria had been officially settled 70 years earlier, in 1810.

"The tide rose seven feet higher than was ever known; nearly all the old wharves are taken away." I don't care if it's accurate or not - the phrase "seven feet higher than was ever known" is poetic in a way that letters to the editor seldom approach.

In Portland, far from the center of the storm's fury, we do get some scientific measurement of the gale. A barometer read 28.556 inches, the second lowest reading in the history of the state, and even more impressively, the fastest rate of decline in Oregon history. The barometer fell by over a quarter of an inch in 2 hours, a massive drop in so short a time. Wind speed rose from 4 to 39 miles per hour in a span of fifteen minutes. And remember, this was not where the storm first hit, in Astoria, but 72 miles inland.
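For readers more used to millibars than inches of mercury, a quick conversion puts Portland's numbers in context. This is just arithmetic on the figures quoted above, using the standard inch-of-mercury-to-hectopascal factor (a hectopascal is the same thing as a millibar):

```python
# Convert the historical Portland barometer figures from inches of
# mercury to hectopascals, using the standard conversion factor.
INHG_TO_HPA = 33.8639  # 1 inch of mercury ≈ 33.8639 hPa

portland_low = 28.556 * INHG_TO_HPA        # the recorded minimum reading
drop_per_hour = (0.25 * INHG_TO_HPA) / 2   # a quarter inch over two hours

print(f"Portland minimum: {portland_low:.0f} hPa")
print(f"Rate of fall: {drop_per_hour:.1f} hPa per hour")
```

That works out to a minimum of about 967 hPa and a fall of over 4 hPa per hour, striking figures for a station 72 miles from where the storm came ashore. Interestingly, 967 hPa is lower than the 998 mb estimate mentioned above, a reminder of how rough these historical reconstructions are.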

So, if you're like me, and you complained today about the winter weather in the Northwest, you can be somewhat comforted that it has been a lot worse than this, and we made it through.

08 January 2008

By any measure, January the Eighth is an unusually tragic day. There may have been more separate events known popularly as "disasters" on this day than on any other of the year.

In 1906, a landslide in New York state killed 20 people. In 1908, it was a train collision in New York City. In 1962, another train wreck in Holland. Plane crashes in 1989 and 1996. An earthquake in 2006.

Even with all the disasters in world history, both natural and man-made, January 8th seems unusually rife with tragedy, and it is. Considering how rare train wrecks and plane crashes are, to have a combined four on the same calendar day within a century is a bit odd. But is it statistically notable, or just a blip on the radar, a random association of events?

In order to be fair in looking at how unusual the congregation of disasters on January 8th is, we have to define disaster in its various forms. It might be a bit morbid, but bear with me. I will define a major air disaster as one which kills 100 people or more. There have been 76 such crashes worldwide, and two occurred on January 8th. That's more than the average if they were evenly spread (about one such disaster every 4 or 5 calendar days, or roughly 0.2 per day). Let's call it nearly 10x background levels.
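Out of curiosity, we can sanity-check that figure with a minimal sketch. The crash count comes from the paragraph above; the assumption that crashes land independently and uniformly on calendar days is mine, a toy model rather than a claim about aviation:

```python
from math import comb

crashes, days = 76, 365  # major air disasters on record, calendar days

# Expected crashes per calendar day if they were spread evenly
expected = crashes / days  # about 0.21, i.e. one every 4 or 5 days

# Probability that one specific day (say, January 8) collects two or
# more of the 76 crashes, under the uniform-random model (binomial)
p = 1 / days
p_at_least_two = 1 - (comb(crashes, 0) * (1 - p)**crashes
                      + comb(crashes, 1) * p * (1 - p)**(crashes - 1))

# By linearity of expectation, the number of calendar days we should
# expect to have doubled up across the whole year
expected_doubled_days = days * p_at_least_two

print(f"expected per day: {expected:.2f}")
print(f"P(>=2 on a given day): {p_at_least_two:.3f}")
print(f"expected days with >=2 crashes: {expected_doubled_days:.1f}")
```

Under this toy model, a specific day has slightly under a 2% chance of collecting two major crashes, yet across the whole calendar we should expect roughly seven days to have done so. Two crashes on January 8th, in other words, is exactly the kind of coincidence randomness produces on its own.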

There have been 48 train disasters on record, using the same criterion of 100 deaths. Fortunately for the souls aboard the two trains that crashed on past January 8ths, neither wreck was so deadly, and statistics are harder to find on smaller crashes.

I'm losing my point a bit. It is tempting to take a look at the history of such days and look for patterns, and there may be patterns, or at least a lack of complete randomness, but it's pretty weak. Travel is much more intensive during the holiday season, and we're coming to the end of that season, so more transport inevitably means more accidents. Beyond that, though, I wouldn't run to the bomb shelter just yet. January 8th is as safe as any other day.

07 January 2008

On this date in 1610, Galileo Galilei discovered the four largest moons of Jupiter.

Well, to be honest, we aren't sure of the exact date he discovered them. As conscientious a note-taker as Galileo was, we can't quite pin down his first glimpse of Io, Europa, Ganymede and Callisto, but it was this date that he first let anyone else know about them. He wrote about them in a letter.

Although Galileo is still today given credit for discovery of the moons, he did not give them their lasting names. He originally named them after the great patrons the de' Medicis. Nothing curries favor with the boss like naming a planetary body after them. The four lasting names were chosen by Simon Marius, a figure of somewhat less lasting fame than Galileo, who claimed to have discovered the moons at the exact same time. He chose to name them for four of the many, many lovers of the Greek god Zeus (Jupiter in the Roman pantheon). The appropriateness of the names seemingly overcame Galileo's right of priority, and I have to admit I'm glad. I'm usually quite a fan of priority in naming conflicts (possibly the nerdiest of all conflicts, by the way) but I would much rather think of the possibility of life on Europa than ponder the frozen seas of Ferdinandipharus. Seriously. Ferdinandipharus.

Europa, whatever you call it, is one of the most promising places for life beyond Earth in our solar system. Though covered in ice, many researchers believe there may be a subsurface ocean of liquid water. This speculation has, of course, spawned volumes of science fiction, from Arthur C. Clarke's 2010 to the works of Caitlin Kiernan and Dan Simmons.

There were several important consequences to the discovery, whoever made it first, of the four moons. Galileo was a believer in Copernicus' theory that the Sun, not the Earth, was the center of the Universe. He famously and unfortunately paid the price for these beliefs, being basically imprisoned by the Inquisition and forced to recant. What he could not take back, though, was his discovery of the moons. They were the most powerful evidence yet that the Earth could not be the center of everything, as Ptolemy said it was. A few relatively lame arguments were made about the moons and Jupiter having incredibly close orbits, making it only seem like they were revolving around each other when in fact they were both revolving around the Earth, just together. It was a ridiculous theory, but one becomes more amenable to such things at the point of a sword, and geocentrism would remain a while longer.

Still, the discovery of the moons did more than deepen our understanding of our own solar system. It opened up the universe. Many astronomers of the time believed that all such major bodies in the universe had been found, and the discovery of the moons basically legitimized the use of the telescope for finding new things in the sky. It made our universe seem bigger, more mysterious, and more worthy of study than it had before, and we've never really stopped looking up since.

06 January 2008

On this date in 1974, the United States enacted year-round daylight savings time.

The concept of daylight savings time has always been a strange one to me. No matter what number one attaches to a particular time of day, it has no effect on the number of hours of sunlight, but it can have a huge effect on human activities, mostly because the majority of us working folks are on a relatively strict work schedule. Nine to five may be a bit of a cliche, but it still remains the most common work day, at least in the United States. There are a lot of interesting consequences to the difference between it being light and dark outside when that 5-o'clock bell rings.

Studies have shown a reduction in both crime and traffic accidents during daylight savings time. It makes a certain amount of sense when you think about it. If the average person is home and (relatively) secure by the time it gets dark, there is both less driving in the dark and less walking in the dark through what might be suspicious territory.

Daylight savings also has a huge effect on energy use, and that is the effect that brings up today's blog post. Because of our enforced work schedules, people tend to go to sleep at about the same time regardless of the timing of sunlight. So, the energy used on electric lighting in the home tends to be less during daylight savings time because there are fewer hours between returning home from work and turning almost everything off to go to sleep. If gas happens to be in low supply, this little difference can be important.

In 1974, gas was, to say the least, in low supply. The energy crisis was in full stride, after OPEC decided to use the West's oil dependence as a weapon to punish us for our support of Israel in the fourth Arab-Israeli War. Price controls and rationing were already well in place when the New Year came around and the decision was made to extend daylight savings to further reduce energy use. As it turned out, the problems had reached their worst point and were starting to abate. It would be two months later, on March 17, that OPEC dropped the embargo and things started to normalize. Still, for those of us who complain about $4 gas and marvel at $100/barrel oil, it is worth remembering a time when we not only had to wait our turn to get gas, but also actually changed our clocks to save it.

05 January 2008

Daughter of the Biblical scholar Sir Frederic Kenyon, Kathleen was born into the study of history. Her father would later become Director of the British Museum. She studied history at Oxford before beginning her career as a photographer on the Great Zimbabwe expeditions of Gertrude Caton-Thompson. As rare as a female archaeologist was in Kenyon's day, one was rarer still in Caton-Thompson's, and the elder scientist must have been a powerful inspiration to young Kathleen. Her next work was with Sir Mortimer Wheeler, and it is this work that I want to center on in today's blog.

Wheeler was a star in the world of archaeology. Keeper of the London Museum, he was a precious rarity in science, an extremely talented researcher with an uncanny ability to translate his findings to a larger audience. He was the first to lay out excavations in a grid system to keep track of where items were found to a more specific degree than was previously common. Kenyon learned this system from him at Verulamium, a Roman excavation north of London. She then took Wheeler's ideas and expanded on them, and improved them.

Where Wheeler had concentrated primarily on the horizontal arrangement of artifacts, Kenyon was just as interested in the vertical, the depth at which things were found. Her tweaks to Wheeler's strategy became the Wheeler-Kenyon Method.

In the Wheeler-Kenyon Method, excavation sites are divided into 5x5 meter squares, in a grid. Anytime you see old video of excavation sites, with large squares cut into the earth, they were using Wheeler-Kenyon. Kathleen Kenyon's addition was to leave walls, 1 meter thick, standing between all of Wheeler's squares. This allowed the layers of soil and other materials to be analyzed alongside the artifacts they held. So much information had previously been swept away, because the artifacts themselves seemed to be the only important thing. The value of what Kenyon did cannot be overstated. She realized that the context of an artifact could be as important as the piece itself, and that the layers of sediment surrounding it could tell volumes about when the piece was buried and what was going on there at the time.

Kenyon went on to become Principal of St. Hugh's College, Oxford, and was honored as a Dame Commander of the Order of the British Empire upon her retirement. She died five years later, but left behind a science firmer and more robust than she had found it, a legacy toward which every scientist would do well to strive.

04 January 2008

Hello and welcome to Days of Science, my new blog on topics of scientific history. From now on, each post will look at a different historical event, or series of events, that impacted the world of science in some significant way. These posts are organized by date, with each post occurring on the same date as the event it describes. Though my background is in biology, I will try to touch on as many subjects and disciplines as possible. Thanks for reading, and I hope you enjoy the blog.

It seems appropriate to start such a venture on the 365th birthday of Sir Isaac Newton, one of my scientific heroes. Newton's Philosophiæ Naturalis Principia Mathematica remains today one of the most influential books in the history of science, introducing his theories on gravity and motion. One of the great true geniuses in history, his legacy is so much more than this one book, and includes the entire discipline of calculus (which he developed independently of Gottfried Leibniz). His ideas were some of the most fundamental seeds of the Enlightenment, and his name lives on today as perhaps the most recognized in the history of science.