15 November 2011

Genesis 1 lays out in a poetic manner a six-day sequence for the creation of the Earth. It's very simple, something I imagine an unschooled shepherd could easily deal with.

19th Century geologists in Europe (especially England - the British Geologists' Association formed in 1858) had watched as sediments accumulated in basins and puddles. They realized that they could rather easily calculate the rate of sediment accumulation. They had also seen and mapped huge stacks of similar, consolidated sediments in many places - and had even begun to correlate distinctive bedding sequences in one place with bedding sequences far away. This permitted British geologists, blessed with pretty much the full range of geologic ages on one small island, to figure out which layer sat on top of which: the layer beneath must be older, and the units below those older still. By the turn of the 20th Century, even conservative geologists looking at their numbers had concluded that the Earth had to be many millions of years old.

Geologists could also see another kind of time-line: progressively more sophisticated fossils, remains of ancient life forms not currently found walking or swimming the Earth, as the sediments got younger - more towards the "top" or modern day of the stratigraphic stack. A curator at the BYU Geology Museum once walked me through a series of dinosaur vertebrae, showing me how with time the vertebrae became lighter but at the same time structurally stronger. This meant they could run faster. Evolution.

Then Pierre and Marie Curie, Becquerel, and others discovered radioactive decay. They could measure a decay rate for a given amount of a particular element, and they could see the daughter products forming as a result of that decay. It's an easy step to measure the ratio of radioisotope to its daughter products (uranium-lead, for example, and of course there are intermediate steps) and so you should be able to figure how long that particular crystal (a zircon, for example, containing uranium and lead) has been sitting there since it solidified out of a magma somewhere.
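The arithmetic behind this step is compact enough to sketch. Here is a minimal illustration in Python of the textbook closed-system age equation; the half-life is the accepted U-238 figure, but the daughter-to-parent ratio is hypothetical, chosen only for illustration:

```python
import math

def radiometric_age(daughter_parent_ratio, half_life_years):
    """Age since solidification, assuming a closed system and no
    initial daughter product: t = (t_half / ln 2) * ln(1 + D/P)."""
    decay_constant = math.log(2) / half_life_years
    return math.log(1 + daughter_parent_ratio) / decay_constant

# U-238 -> Pb-206 has a half-life of about 4.47 billion years.
# A zircon with roughly equal amounts of parent and daughter
# (D/P = 1) is therefore about one half-life old:
age = radiometric_age(1.0, 4.47e9)
print(round(age / 1e9, 2))  # 4.47
```

Real laboratory work layers corrections on top of this (initial lead, intermediate decay products, concordia plots), but the core idea is just this logarithm.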

Whoa. Geochronologists started coming up with HUGE numbers. As more and more rock units were sampled and dated, the push-back for an oldest rock - homing in on an origin of the Earth - passed into the hundreds of millions of years, and then billions of years.

Radiometric dating currently suggests that the age of the Earth is 4.54 +/- 0.05 billion years.

Times 365 days per year, this is 1,657,100,000,000 days, at least, that this rock has been orbiting what is now our Sun. This number is not really comprehensible to people who count on their fingers. "...7, 8, 9, 10!" Can you count higher than that? "Sure," (raises both hands overhead) "1, 2, 3, 4..."
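The multiplication is easy to sanity-check:

```python
# 4.54 billion years times 365 days per year.
days = 4.54e9 * 365
print(f"{days:,.0f}")  # 1,657,100,000,000
```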

However, as you consider the actual processes involved, as we understand them from geology and astronomy, it certainly can't be this precise. There was a protoplanetary disk, gravitational clustering, segregation; a crust formed, modified repeatedly by continued heavy early bombardment; and later in the game an impact with another protoplanet led to the formation of the Moon. So it's unrealistic to treat such a precise age as a single "start" date - better to point at the oldest piece of unmelted material ever found. The current record-holder, a zircon crystal from the Jack Hills of Western Australia, is about 4.4 billion years old. If you compare the mass and luminosity of the Sun to other stars, and age-date meteorites, it is apparent that the Solar System can't be much older than that.

Certainly there are complications with radiometric dating; the Carbon-14 production rate in the atmosphere varies over time with cosmic ray flux, for instance. You can calibrate for this using tree-rings, however. Radiometric dating must also make some assumptions, among them that the decaying radioisotope and its daughter products remain together for the entire span being dated (no remelting), and that the decay rate is the same now as it was when the original material solidified out of a melt. It's more complicated even than that, but at this point we're only quibbling about small plus-and-minus stuff.

Science writing being as pervasive as it is, rather few people take the Six Days of Genesis as literal truth these days. Genesis IS, after all, a translation of a translation, and the original writer had rather little experience with orbital mechanics, conservation of angular momentum, and the weak nuclear force. Which is to say, he had a limited vocabulary to work with.

In this context, I found some interesting things in the writings of Latter-Day Saint apostles who were also scholars:

John A. Widtsoe wrote about the "vast periods of time" required for each class of animal to rise, dominate the Earth, and then become extinct. (Joseph Smith as Scientist, manual distributed by the General Board of the YMMIA, Church of Jesus Christ of Latter-Day Saints, 1908).

Here's something even more specific, unlike anything I've seen in any other church doxology: "What is a day? It is a specified time period; it is an age, an eon, a division of eternity; it is the time between two identifiable events. And each day, of whatever length, has the duration needed for its purposes." --Bruce R. McConkie, Mormon Church Apostle, during General Conference, 1982.

I also was pointed at an interesting quote from the (atheist) astronomer Carl Sagan:

"How is it that hardly any major religion has looked at science and concluded “This is better than we thought! The Universe is much bigger than our prophets said, grander, more subtle, more elegant. God must be even greater than we dreamed”? Instead they say, “No, No, No! My god is a little god, and I want him to stay that way.” A religion, old or new, that stressed the magnificence of the Universe as revealed by modern science might be able to draw forth reserves of reverence and awe hardly tapped by the conventional faiths. Sooner or later, such a religion will emerge." - Carl Sagan, "Pale Blue Dot: A Vision of the Human Future in Space" (1994).

I think that religion has emerged, actually.

But let's put this all in perspective. Can you count to a million? Neither can I, so the difference between a million years and a billion years seems somewhat irrelevant - unless you are a geochronologist. I once poured over $250,000 into instrumentation for a rock-dating laboratory - because it was important to know how long ago a volcanic eruption had taken place, in order to get a sense of how dangerous that particular volcano was. That can be important, right? Especially if you live in, say, Seattle or Tokyo... or anywhere in the Mediterranean or Pacific Rim.

But consider this: allow for a minute the possibility that there is life after life. I have agnostic friends, even atheist friends, who go back and forth on this one. I myself have a number of strong experiential reasons for no longer questioning this. If you are an atheist, then the Age of the Earth doesn't matter. If you are faith-based, then... it doesn't really matter either. It's sort of like Pascal's Wager.

When we die and make that transition, cross the Veil, I think there may be some questions asked of us. Like: Where is your family? What did you do to help others?

Somehow I don't think that Someone is going to ask me "While you were in your mortal state, what was your opinion about the age of the Earth?"

07 November 2011

Wait a minute. What about the First World War? The Second World War? The deaths of 20,000,000 people during Stalin's purges in the 1930's in between? The Crack epidemic of the 1980's? Two million people in American jails? What about 9/11? The Iraq and Afghanistan and Libyan Wars?

In fact, this is how proximity weights the history that WE remember: recent events loom largest. The effect is dramatically amplified by the rise of the 24-hour news cycle since the advent of CNN. This is called a "recency bias." A careful statistical and historical analysis - Steven Pinker's new book The Better Angels of Our Nature - makes a compelling case that violence has in fact declined throughout history.

Put another way: the actual likelihood of being assaulted or killed has been falling for centuries.

How could this possibly be?

Pinker's book moves through the historical record first (Hey! Ever hear of the Hundred Years War? It represents a century of continual European warfare, famine, and death). It then addresses the intellectual revolutions of the last several centuries, and even delves into modern studies of the human mind and human behavior. Pinker's lasting achievement is that his intellectual quest really knew no bounds: he runs the gamut from psychology and neuroscience through evolutionary biology to history and social science.

He didn't operate in a vacuum, however. Pinker homes in on, and gives full credit to, a particular inspiration. He calls Norbert Elias, a German-born scholar who wrote during Hitler's 1930's, "the most important thinker you have never heard of."

Elias proposed that the growth of the nation-state all over the world in the past several millennia has had profound effects (described in Thomas Hobbes's 1651 book Leviathan) on stabilizing human behavior. It created physical boundaries, and it established behavioral norms with consequences. The consequences were profound, too: outlaw behavior drew out the posse - stirred up the hornet's nest - and sociopaths were removed from the gene pool. In the United States we incarcerate more than 2,000,000 people, mostly men, but in past centuries there weren't resources to hold people in jail. Beheading, hanging, and feathering with arrows accomplished the same goal much less expensively. With time, violent tendencies have been steadily filtered out of the human race, and all of this stemmed from the establishment of nation-states.

The other thing that Elias and Pinker noted was the rise of commerce. Mutual gains from trade created a common purpose, and raised most of humanity above the tribal state. The xenophobia common throughout the world earlier became progressively more untenable - xenophobia interfered with the common gain, and has been increasingly less tolerated by the majority of humanity.

There has also been a "rights revolution" in the past century: women's rights, civil rights, gay rights, animal rights... with the accompanying increase in sensitivity that goes with these. There is also a somewhat more controversial idea: that there has been a rise in human reasoning ability. However, it's hard to separate this from evolving culture. Pinker also tends to dismiss income inequality. However, numerous studies have shown that income inequality correlates closely with homicide rates in country after country, and areas within countries. If you wish to see low rates of violence (the anomalous Breivik massacre last summer notwithstanding) go to Norway. Norway has an income disparity range far smaller than the United States or even many other countries in Europe - and is one of the most peaceful nations on the planet.

Something neither Elias nor Pinker noted was an additional factor that I have noticed: the establishment of sports as a normative social activity. Sports in aggregate constitute a legally-sanctioned opportunity to compete with others without loss of life or limb (Rugby or Hockey or Lacrosse notwithstanding). Sports are a way to release pent-up energy and frustration; they are also a means for organizing small armies and using strategies to win... and gain fame and riches at the same time.

I may have come up with this sports issue on my own because it seems an odd part of our culture. I (and several of my children) have never been able to see any point to golf, baseball, or football. The potential aerobic benefits of basketball and soccer seem counterbalanced by the risk (some say inevitability) of knee and spinal injury. In fairness, people look at me, someone in his 60s practicing Jujitsu, and think I'm crazy. In my defense, it makes me more flexible/younger, and gives me a means to perform community service outside the range of Church opportunities.

The bottom line: the Angels in our natures seem to be winning the battle for the soul of humanity.

05 November 2011

This is a shorthand way of describing the life-work of a visionary Microsoft research scientist named Jim Gray. A few weeks after he gave a talk on the subject in 2007, he was lost at sea off the coast of California.

Gray was proposing the Fourth Paradigm: a quasi-new scientific approach that says insight can be gathered from manipulating large amounts of data. Manipulating, sorting, and graphically expressing relationships in very large data sets: new stuff pops out. You can apply statistics to very large data sets and have far greater confidence in the results.

The First Paradigm is sometimes called empirical science - observational or descriptive science. This is the science carried out by interested folk like you and me over the past several thousand years. That, for instance, is Drosophila melanogaster... what you've been calling the Fruit Fly. You named it, classified it as a fly of the fruit-eating variety.

The Second Paradigm is analytic science: analysis of scientific observations that leads to an understanding of electricity and magnetism, for example. By careful experiment and observation, Michael Faraday was able to connect electricity with magnetism. From this work, James Clerk Maxwell developed... yep, the famous Maxwell's Equations. I'm not making this up: Maxwell built on the scientific experiments and papers of Faraday to develop a working theory of electromagnetism, complete with an elegant mathematical formalism that haunts undergrad physics students to this day. Actually, these guys are heroes to physicists as much as Fermi, Bohr, and Einstein are.

An Example: In 1994 I was working in northern Saudi Arabia on a phosphate project. A monster sandstorm beginning in the Sahara far to the west engulfed us, and for a day it was very hard to work. For the next several days the dust haze hung in the air, and I realized that each afternoon I could look directly at the Sun without a filter - with my naked eyes and without injury. I noticed a huge Sunspot cluster in the upper left quadrant, and was so impressed that I could actually see this without instrumentation that I sketched it into my field notebook. The next day I could see it again... and it had migrated downward and to the right. By the fourth day I had a complete sketch of the movement of this Sunspot cluster. That is an example of First Paradigm science: observation. FROM those observations, I could deduce (a) that the Sun rotated, (b) where the axis of that rotation was (upper right of the observed disk), and (c) how FAST it rotated (I figured roughly 10 days would bring that cluster, if it still existed, back to the same initial point). That part is the Second Paradigm: I analyzed the data and drew some conclusions from them. (PS: Data are always plural - there is always more than one number.)
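The deduction in (c) can be sketched in a few lines of Python. The degrees-per-day figure below is hypothetical, chosen only to reproduce the rough ten-day estimate from the notebook:

```python
def rotation_period_days(longitude_shift_deg, elapsed_days):
    """Estimate a rotation period from how far a surface feature
    drifts in longitude over a known span of days."""
    deg_per_day = longitude_shift_deg / elapsed_days
    return 360.0 / deg_per_day

# Hypothetical: if the sketches showed the cluster drifting about
# 144 degrees of solar longitude over the four days of observation,
# the implied rotation period matches the notebook's ~10-day guess.
print(rotation_period_days(144.0, 4))  # 10.0
```

For reference, the Sun's equatorial rotation as seen from Earth is closer to 27 days, so naked-eye estimates like this are rough at best - which is rather the point about observation versus careful analysis.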

The Third Paradigm is sometimes called computational science; sometimes it's called simulation science. Think ever larger computers, calculating results from ever finer grids of models of the galaxy, models of a complex earth being deformed by stress leading to an earthquake, giant models used to predict weather. More or less.

An Example of this is my use of a powerful software package called Geosoft Oasis Montaj: this software allows me to bring in vast amounts of data from any source and process the entire mess. It's generally known among geophysicists that you can only "see" about 15% of the content of magnetic data by hand-contouring many measurements on paper. If I pass frequency filters through the data, I can separate the deep sources from the shallow sources. If I pass derivative filters through it, I can find the edges of the sources of magnetic anomalies. If I then do two-dimensional (or higher-dimensional) modeling, I can obtain a probable shape of the sources of those anomalies. Say, an electric pig in a magnetic bathtub. This is computational or simulation science.
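To illustrate the derivative-filter idea, here is a toy Python sketch (not Oasis Montaj's actual algorithm) that runs a simple central-difference derivative over a synthetic magnetic profile; the gradient spikes where the buried source begins and ends:

```python
def first_derivative(values, spacing=1.0):
    """Central-difference horizontal derivative of a profile of
    readings; large values flag the edges of buried sources."""
    return [(values[i + 1] - values[i - 1]) / (2 * spacing)
            for i in range(1, len(values) - 1)]

# Toy profile: flat background with an anomaly over a buried body.
profile = [0, 0, 0, 10, 10, 10, 0, 0, 0]
grad = first_derivative(profile)
# Positive spikes mark the left edge of the body, negative the right.
print(grad)  # [0.0, 5.0, 5.0, 0.0, -5.0, -5.0, 0.0]
```

Real processing works in two dimensions with frequency-domain filters, but the principle - differentiate to sharpen edges - is the same.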

The Fourth Paradigm is a step beyond this. Gray's point was that hey!* We are collecting vast amounts of data - more data in seconds now than in all of history before 1950. There MUST be some relationships, connections, new things in all that mess. If we don't DO something with all these numbers, then what is the point in COLLECTING them?

Data mining is an obvious outcome of this sort of work. Clever digital types can use many different sources of data, search for links - relationships or connections - and from all this can pretty much tell some company what you are going to buy this Christmas, where, and how much money you will spend. That is valuable to a company - it allows the company to save money on inventory and helps them set up displays that will get even MORE money out of you. That's a good thing, right? Maybe.
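A toy sketch of the idea, with made-up shopping baskets: count how often pairs of items are bought together, and the strongest associations fall out of the data.

```python
from collections import Counter
from itertools import combinations

# Hypothetical purchase histories; each set is one customer's basket.
baskets = [
    {"tent", "stove", "lantern"},
    {"tent", "stove"},
    {"tent", "stove", "matches"},
    {"novel", "bookmark"},
]

# Count how often each pair of items appears in the same basket.
pairs = Counter(
    pair
    for basket in baskets
    for pair in combinations(sorted(basket), 2)
)

# The strongest pairing pops out - a hint for the next store display.
print(pairs.most_common(1))  # [(('stove', 'tent'), 3)]
```

Commercial systems scale this up to millions of baskets and smarter statistics (association-rule mining), but the kernel is just counting co-occurrences.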

It's already well-established that corporate recruiters need little training in data mining to find out how you party, what you really do, who your friends are, and how honest you are... no matter what your resume may say. A good thing for the HR people, a bad thing for the careless and dishonest job-hunter.

This same data mining can have unequivocally terrible consequences: people supporting the revolutions in Iran and Syria using Twitter, Facebook, and Anonymizer have been identified because regime agents connected different sources of data and figured out who was trashing their regimes... and those people have been found, arrested, and in some cases killed as a consequence of this kind of data mining.

For better or worse, we have all reached - and fallen into - the ocean of data lying at the end of our continent of former human interaction. Our lives will never be the same again. The Internet is self-healing and in effect self-replicating.

Big Brother is Skynet, and it has found us. You may run, but you cannot hide.

~~~~~

* No pun intended, but a book assembling the ideas Jim Gray was promoting, The Fourth Paradigm: Data-Intensive Scientific Discovery, has been published online by Tony Hey and others (2009).