October 25, 2017

Yesterday, a one-sentence prescription for happiness, written by Einstein in 1922 on the letterhead of a hotel where he was staying while giving a series of lectures in Japan, was auctioned for over $1.5 million. Einstein wrote it for a courier who delivered the notification that he had just been awarded the Nobel Prize. He gave the note to the courier in lieu of a tip because he had no money on him at the moment, saying “it might be worth something someday.”

The advice was: “A calm and humble life will bring more happiness than the pursuit of success and the constant restlessness that comes with it.”

My favourite Einstein quote, though, is “If at first the idea is not absurd, there is no hope for it.”

On the other hand, I suspect most absurd ideas – including my own – are simply absurd. It takes a genius to turn one into a theory of relativity or a string theory. Or a mere gravitational wave!

July 9, 2016

I was intrigued when I recently read what I initially thought was a serious review of the research into dementia. The author – a medical doctor – claimed that curcumin (the active compound in the spice turmeric) drastically reduces the rate of Alzheimer’s disease, a fact supposedly demonstrated by India, where the reported percentage of this debilitating disease is lower than in any other country in the world.

Then I realized what I was reading was an advertisement for turmeric supplements. Not just any turmeric supplements, either. Only high-quality supplements will bring about the desired results.

I started to ask a few obvious questions:

What percentage of the population over the age of 60 in India has been in contact with a qualified professional who might have made a diagnosis of some kind of dementia? I know of more than one case, in both the US and Britain, where an elderly person suffering from dementia is being cared for by family members and has not seen a doctor in years.

To make matters even less clear, a definitive diagnosis of Alzheimer’s disease is extremely difficult, if not impossible, without a post-mortem examination of the affected person’s brain.

And since the advertisement insisted that the quality of turmeric supplements was important, it may be relevant to ask just what kind, and how much, of this treasured spice is consumed on average every day in India.

There was no discussion of any of these issues vital to substantiating the claims made.

So to claim that India has a lower incidence of Alzheimer’s disease than any other country in the world, let alone to claim that this is a result of the fact that so many Indians eat curry spiced with turmeric, is highly dubious.

I have turmeric in something I eat almost every day because I like it. I am aware of the claims made for curcumin as an antioxidant, and for reducing joint pain, the incidence of cancer, brain and heart disease, depression, and the side effects of many cancer treatments. I strongly suspect that turmeric, like many herbs and spices, is very good for us.

But if it’s a miracle, science has not yet proven it.

Sometimes I think the differences between religious faith, political promises, and scientific claims are indiscernible.

December 28, 2014

My musician sister sent me the Colbert farewell YouTube video. It has since been removed from the internet by Viacom, which owns the copyright, so I have attached here instead Vera Lynn’s rendition of the same song, the one that gave hope to so many during WWII.

I have heard the Vera Lynn version many times and understood why it meant so much to so many. But this post is about my unexpected response to the Colbert version.

First of all, let me assure any doubters that I personally do not believe in heaven as most people understand the term. And if I did, I would not be motivated to try to get there. Sitting around in a perfect world, with no problems ever to solve, no one in need of an extra act of thoughtfulness, and no creativity because everything is already perfect, sounds excruciatingly boring.

But as I watched the Colbert video, I suspended my unknowing, and began to wonder if, in some mysterious way that I cannot fathom, we will, indeed “meet again” in a next life. What would that be like?

I imagined sitting around a fire, when our two dogs burst into the room, barking in wild enthusiasm as they recognized us. And then Mom and Dad and my sister Mary who died almost twenty years ago joined us. We each had a glass of wine and began to exchange stories. And I asked them all the questions about what they thought about this and that, questions I couldn’t ask after they’d died. And then four more dear friends came, and we continued to talk late into the night.

Of course, I would want them all eventually to leave. Except the dogs. I mean, sitting around the fire with a glass of wine forever would get to be pretty boring too. I need sleep. And besides, I don’t have a very high tolerance for alcohol.

So I don’t think I’ve figured out the great mystery of life and the universe in which it is evolving after all. The scenarios offered by various religions are inadequate metaphors at best. Some super-mathematical scientists suggest that there are an infinite number of universes in which life repeats itself in every possible version. And another scientist has just seriously suggested that when the Big Bang happened, Time began to run both forward and backward in two different parallel universes. Maybe we are in the universe where time is running backward and will eventually run into the universe where time is running forward. I confess it doesn’t make a lot of sense to me.

The best I can hope for is that when we die we become part of some kind of transcendent consciousness. And I say that only because I haven’t the faintest idea of what that means either.

I think I’ll just listen to the Vera Lynn YouTube again and be grateful for the mystery of life that has been given to me right now.

July 10, 2014

A friend sent me a reference to a series of books by Ilia Delio, which he said seemed to echo some of my ideas and which he thought I might like to read. So I checked Delio out on Amazon and saw that the introductory quote in one of her books was Einstein’s “The most incomprehensible thing about the universe is that it is comprehensible.”

No. I am among that group of scientists, including Stephen Hawking, who believe that we will never reduce the universe to the totally comprehensible – that there is an infinity we will never exhaust.

I find a deep and profound peace in that acceptance. I don’t have all the answers; I never will. I live surrounded by mystery. Somehow I am immensely comfortable here. That knowledge and that peace are probably the single most important contribution to my coming to terms with my childhood socialization as a Roman Catholic. There were several other significant steps as well.

One was the realization that the concept of matter as totally inert had been exploded by Einstein’s equation E=mc² – the equation that demonstrated that energy and matter are two forms of the same thing. We know now that matter is not a passive blob sitting there until something else pushes it along. Matter is a seething mass of movement and energy at its very core.
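To get a feel for just how much energy is seething inside ordinary matter, here is a back-of-the-envelope calculation of my own (my illustration, not from any of the sources discussed here):

```latex
E = mc^2
% For a single gram of matter, with m = 0.001\,\text{kg} and c = 3 \times 10^8\,\text{m/s}:
E = 0.001 \times \left(3 \times 10^8\right)^2 = 9 \times 10^{13}\,\text{joules}
```

That is on the order of the energy released by a large nuclear explosion, all of it locked inside one gram of supposedly “inert” matter.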

Why is this so exciting? Well, for me, it brought the problem of the emergence of consciousness into the scientific world. Even today, in my opinion, the single most important unresolved question for science is that we have no idea how the brain produces something as seemingly immaterial as consciousness. Consciousness in all of life is totally dependent on a functioning body. Today, through MRI scans, we are even learning some of the minute pathways in the brain that are activated by various kinds of consciousness. But we do not have a theory about how this conversion takes place. It is a problem parallel to the one we had when we believed, less than two centuries ago, that matter and energy were two completely different things. I do not have the answer to what many philosophers call “the mind-body problem,” but I am convinced now that the answer lies in the natural world.

In other words, we do not have to have recourse to Plato’s “spiritual” world which Christianity eventually adopted as “heaven” and “hell,” populated by spiritual beings including God, the angels, and the souls of those who have died before us. I remember the almost ecstatic feeling I had when I realized that I was already home in this universe. I am not living in exile. For all its pain and trouble and difficulties, I am already where I belong. And whatever happens after death, I will not be spirited away to some other plane, to some ethereal heaven or fire of hell. However it happens, what I am will continue to evolve as part of this natural universe.

Another giant step in my coming to terms with Roman Catholicism was the discovery that the original meaning of “faith” as understood by the Hebrews and the early Christians did not reflect adherence to a strict set of doctrines, but is more accurately translated as “faithfulness.” “Faithfulness” does not require that everyone in the community always agree, or always accept the same doctrines. The switch to belonging to the community on the basis of faith as unquestioning acceptance of universal dogmas did not occur in the Christian church until the 4th century. Until then, the essence of the Christian message was that “the greatest of these is love,” that “we are neither Jew nor Gentile, slave nor free, male nor female; we are all one.” In other words, we are all — all — in this together. All of us in the human family.

Refusing to reduce faithfulness and universal love of all humanity to a set of doctrinal and liturgical rituals might diminish the power of religious leaders. It certainly destroys the “one of us” attitude of so many religions, and the claims of any single religious tradition that it is the “one and only true church.” Roman Catholicism with its proclivity today for excommunicating dissidents and its insistence on papal infallibility is benign compared to its torture and execution of those who refused to accept church authority for over a millennium until papal power was finally separated from the secular authority of the state. But this commitment to literally killing those who disagree with us is still rife in the world today. Turn on the news tonight and look at what is going on in Iraq, in Syria, in South Sudan, even in the United States where some fundamentalists are trying to change the law to match their own religious beliefs. In this war-torn, trigger-happy world, we badly need to understand the original Christian message that we are all one.

August 11, 2013

The Sunday papers today are reporting that two British professors have patented a test that analyzes endothelial reactivity.

Oh good, you say – just what I always wondered about myself.

The paper is calling it a Death Test, but if the Americans get hold of it, it will undoubtedly be called a Life Test. Either way, the test measures endothelial reactivity – the oscillations of blood flow in the capillaries, our smallest blood vessels. The results indicate just how well an individual is functioning overall, and so can predict the undiagnosed presence of cancer and dementia.

But the results are also graded for optimal functioning between 0 and 100, and with sufficient data will ultimately be able to make a reasonably accurate prediction of when an individual will die – even if that event is decades away.

The good news is that it is a laser test that is completely painless, non-invasive, even user-friendly. The expectation is that the test will be available to GPs within three years.

If it were on offer, would you take it? At my age now I would – it would make it much easier to plan for the rest of my limited future here. But would I want to know at the age of eighteen how long I probably had to live, barring accident or epidemic? Or at the age of forty? Fifty?

One thing is for sure: once the data are reliable enough, insurance and pension companies are going to want to know.

June 4, 2013

We usually think of the future as the uncertain part of time. But I’m beginning to think what has happened in the past is just as uncertain.

This week we’ve been trying to figure out events in my family a mere generation ago. Simple questions like “Who was it that was engaged to my mother before she married dad? When did our parents meet, and who introduced them?” have baffled us.

Why did the Neanderthals die out when Homo sapiens did not? The Neanderthals had brains as big as ours, and had survived in Europe for tens of thousands of years before Homo sapiens even left Africa.

And the Neanderthals were as big and as fast as Sapiens, so it is unlikely that we hunted them to extinction. But we might have out-competed them for resources. Neanderthals had larger eye-sockets and probably better eyesight than Sapiens. This made them good hunters, but it used up much of the brain capacity that Sapiens instead developed for social networking and more abstract thought. So Sapiens became more cooperative, better at sharing and learning from each other.

June 2, 2013

I’ve just read an interview with Noam Chomsky in which he suggests that effective education doesn’t teach to tests, but teaches students to discuss and explore processes, events, issues, problems.

This reflects my own philosophy of education. I never tried to teach my students the right answers. Their grades didn’t depend on their agreeing with me or with any particular theory we might be studying. Their grades depended on their ability to describe each theory, or each side of an issue, in a way that someone espousing that theory would agree fairly reflected their thinking. Then, and only then, I think, do we have the credentials to make our own decisions.

Every once in a while, a student would say they didn’t want to learn about some theory or other because they didn’t agree with it.

How in heaven’s name can we legitimately disagree with someone if we don’t know what they are saying?

I’ve once again gotten so excited about the value of understanding the points of view with which we disagree that I’ve even fantasized writing a book for teachers who are mandated by state law to teach both Darwin’s theory of evolution and Creationism.

What an incredible opportunity for a teacher in this position!

This is a topic about which feelings run so deep that they often suffocate rational discussion. And I am not talking only about the view of Creationists. I have met Evolutionists (with whom I happen to agree, by the way) who are as dogmatic, close-minded, and judgemental about Creationists as anyone. Wouldn’t it be wonderful if we could learn in the classroom to take the alternative seriously, seriously enough to grapple with the legitimate claims of both sides?

Wouldn’t we have brighter students? And wouldn’t we have a society that is more tolerant of those with whom we disagree?

Oh no, I’m not going to write another book! There’s too much work to do in the garden anyway.

March 18, 2013

In the face of little concrete evidence, it wasn’t totally unreasonable to assume that the bigger the brain, the more intelligent the head that held it.

Gradually, though, doubts about that simple equation have begun to creep in. The brains of Neanderthal man were larger than the brains of Homo sapiens. Surely they couldn’t have been smarter than we are, could they? And then we’ve been discovering that dumb animals can do all sorts of things that we can’t. Birds can navigate half-way around the world without getting lost, and without the help of even an old-fashioned compass. Dogs can hear things we can’t, and bees can see colors we don’t. Dolphins can communicate with each other. So, it seems, can trees. Then we discovered that some parrots can correctly use as many words as the average two-year-old.

And now, using MRI scans, scientists have found that female brains are more efficient than male brains. The brain of the average woman is 8% smaller than the average male brain. But the research isn’t suggesting that men are 8% smarter. Research from the Universities of California and Madrid has found that, on the whole, men have better spatial intelligence than women. (So when they won’t stop and ask for directions, maybe they aren’t as lost as we women think they are.) Women, however, outperform men in inductive reasoning, and are better at keeping track of a changing situation and at some numerical tasks.

Ah, but not to worry: Trevor Robbins, professor of cognitive neuroscience at Cambridge University in England, said the finding was fascinating, but controversial.

February 16, 2013

I was a young adolescent when I first learned that Luther had taught that doubt was an inevitable part of belief. I wondered at the time if my father’s ancestors had been Lutherans rather than Roman Catholics, because it was my lawyer-father who first taught me to doubt.

But what my father did not teach me, and what has taken me a lifetime to learn, is that there aren’t any Right Answers available to us humans either. Whether I was discussing theology or cooking a chicken, I thought there was One Right Way. I was usually open to exploring the possibility that my way wasn’t the Right Way, but I was always looking for it; I always thought it was there.

A graduate course on Immanuel Kant gave me my first glimmer of the realization that Right Answers might not be absolute. And being married to someone from a different cultural and religious background (not to mention the opposite sex) was almost a daily reminder that my Right Answers were not quite as obvious as I thought.

In recent years I have found not looking for Right Answers is amazing fun. Whether I’m putting supper together or planning a garden or even a book, it’s so freeing, so exciting to feel that the possibilities are endless. There isn’t just one Right Way. There isn’t even just one Best Way.

Heisenberg’s principle of indeterminacy showed conclusively that we cannot, and never will be able to, predict exactly what is going to happen at the level of quantum physics. Yet many people clung to the idea that the world of our everyday lives is predictable. Most recently, economists thought that they could develop statistical patterns that would predict the stock markets. Despite their blatant failure and the crisis of 2008, whose fall-out remains with us, many people, economists and non-economists alike, still believe they know, still believe they have the absolutely non-negotiable Right Answer for, how we should revive our economy.

Now I have just finished reading two books which further undermine our hopes for certainty and for right answers. Any surviving hopes among deterministic, mechanistic scientists for absolute predictability are dangerous and illusory. The first book is by Nassim Nicholas Taleb, the New York trader who says that the crisis was an example of what he calls “a black swan event.” A black swan event is extremely rare, and so extraordinarily difficult – in fact, ultimately impossible – to predict using the statistical tools of probability. Taleb argues that not expecting the unexpected is the worst possible way to prepare for it.

The second book is The Signal and the Noise by Nate Silver, the New York Times guru who correctly predicted the Presidential election outcome in 49 out of 50 states in 2008 and in all 50 states in 2012. Despite this success, Silver says that prediction is getting less certain, partly because we know so much. So much of the data is sheer noise, distraction from hearing the true signal.

In a way, in order to survive, we must live with the assumption that some things are going to happen. We even need to plan for them, and we need to believe that to some extent we can control the outcomes. And to some extent we do.

But actually, we cannot predict even the next second with absolute certainty. We can’t know for certain what we should do about anything. We can plan for the future, we can put money into pensions, we can try to take care of our health, we can do our best to put effort into relationships that are significant, we can get an education, we can follow our dreams. But we might be poor, we might die young, our relationships may not last, our dreams may shatter.

That’s a bit scary.

But it’s also the way it is.

And having spent many years in the cage of Right Answers, I also find it liberating. What will be, will be.

But what’s the difference between an asteroid that misses Earth and a meteor that comes crashing down? And as long as we’re on the subject, is there a difference between an asteroid and a comet? or between a meteor and meteorite? or a meteoroid?

The differences are not always clear-cut, because their identities often change: an asteroid might not remain an asteroid, and a meteor might become a meteorite. I know these questions do not qualify among the great epistemological questions of the age, but here are the current definitions. It helps organize one small corner of chaos anyway.

An asteroid is basically a planetoid made of rock and basic metals that orbits the Sun in the same way that Earth and the other planets revolve around the Sun.

A comet is different from an asteroid in that it is composed of dirt and ice. When a comet is close to the Sun, it is possible to see its tail of dust and gas. Apart from that, a comet is pretty much like an asteroid, and it is sometimes hard to tell which is which.

A meteoroid is a small chunk from an asteroid or comet. Meteoroids also have their own orbits around the Sun, but are too small to be asteroids or comets.

A meteor is a shooting star. It’s a meteoroid that enters the atmosphere of another object – like Earth – and streaks through the sky as it burns up.

A meteorite is a meteoroid that survives its passage through the atmosphere and impacts an object like Earth. These are the ones that create the most damage to life on Earth. Historically, meteorite strikes may have been responsible for some of the 54 major extinctions we know have taken place in the last 550 million years. The dinosaurs may have been felled by the catastrophic destruction caused by one such strike.

Well, you never know. This is information that might be useful one day.

January 24, 2013

I was aghast to read yesterday that a professor at Harvard Medical School is seeking a woman to carry the embryo of a Neanderthal baby. George Church believes he can reconstruct the DNA of the Neanderthals, and is seeking a surrogate mother for our extinct human relative.

I am not aghast for religious reasons. I am aghast because I think this reflects a terrifying lack of sensitivity and respect for life. This is a human child that Church wants to bring into the world. It is not a member of Homo sapiens, but our cousin Homo neanderthalensis. Folklore represents Neanderthal man rather like a club-wielding thug with limited intelligence.

But archaeology is revealing this characterization to be a chauvinist assumption of Homo sapiens rather than the reality. Neanderthalensis was a species that buried its dead, made musical instruments, and, we now know, interbred in some places with our own species. Either they copied our tools or we copied theirs – probably both. We were cousins. Although we like to point out that the brain of Homo sapiens is bigger than anybody else’s, the Neanderthal’s brain was larger. There isn’t a lot of evidence that they died out because they weren’t intelligent enough.

So as a scientific experiment, geneticists are going to try to bring a child of this species to live on our planet. Are there any plans for rearing this human child? Any concerns about its potential isolation? Will it be treated like a laboratory animal, subject to experiments and tests all its life? Will it be granted human rights? And what about the “mother”? Having borne the child in her womb for presumably nine months, will she then pass it over to – to whom? Its presumed owners?

And what might be the benefits if such an experiment were to succeed? Presumably Church believes he will earn a place in history, if not in infamy. But apart from personal gain, are there great scientific benefits that might arise from this endeavour? My own imagination fails to identify potential gains sufficient to justify the attempt.

I hope George Church fails comprehensively and utterly.

Unfortunately, if Church doesn’t do it, that doesn’t mean somebody else won’t.

Or even a dumb human being!

As Lynn Margulis said – intelligence and life are co-extensive.

December 4, 2012

NASA has ended some feverish speculation with the announcement of a tantalizing discovery by the rover Curiosity, which is slowly creeping around Mars:

Curiosity has found carbon on Mars.

This is potentially exciting because carbon is one of the essential building blocks of life as we know it. So there may have been life on Mars at some point.

But then, there may not have been. The carbon may be a contamination from Earth. Or it may have been brought there by an asteroid. At this point there isn’t enough evidence to know which of these three possibilities these few atoms of carbon represent.

Actually, one would not expect remnants of life to be left on the wind-swept, dusty surface of the planet. If there was life on Mars that left some mark, the chances are that the evidence is buried.

Curiosity hasn’t started digging yet. But it is equipped with a drill, and it will be probing beneath the surface during its two-year mission on the planet.

October 22, 2012

I’ve just been reading an article about the high-flying mathematical calculations that are leading some of the world’s best scientists to think that they might be able to uncover actual evidence for a big bang before the one that produced the universe we now live in.

If they are right, the laws of physics in the previous universe are the same as the laws of physics in our own universe today. And if they are right, our own universe will eventually contract, and another big bang will produce another universe, etc. Although the subsequent universes may obey the same laws of physics, they are not carbon copies of each other. In other words, just because we’ve been through this once, doesn’t mean we’ll do it all again in the same way.

Whatever else, I’m not going to write a book about another Big Bang. I don’t think I could fit it into six chunks anyway.

The Big Bang to Now is available immediately in paperback, and will be available as a Kindle version within the next ten days or so.

The cover suggests a waxing moon that grows larger in the sky as it moves from being a New Moon to the Full Moon. This is traditionally a time for spells that attract, that bring positive change, spells for love, for good fortune, for growth. Above all, it is a time for new beginnings, for new ideas, for new hopes.

I would be most interested in your comments on The Big Bang to Now – either here, or even better on Amazon. Does the book help, as it promises, to make sense of billions and billions of years of time? Is it really Big Time for the Non-Genius?

August 2, 2012

I grew up in a family in which there was never a suggestion that there is an irreconcilable conflict between religion and science. My views of both science and religion have matured over the years, but I am still of the opinion that apparent conflicts between these two approaches are based on a profound misunderstanding of science, religion, or both.

One of the most widespread misunderstandings, I think, concerns the certainty of science. Many people think that scientific facts are absolute, that they are permanently and unquestionably certain. They aren’t. Even such “facts” as that the earth is round and revolves around the sun are not unassailable.

I’m not suggesting here that I’m not convinced of these particular facts. I do rather think the world is round and goes around the sun. But what I am saying is that, as Kant pointed out, we are always limited by the perspectives of the time and space in which we live, as well as by the abilities of the human mind to process what we observe. From another perspective – perhaps from the perspective of a different universe, or with different sensory abilities than we humans possess – the world may look completely different.

Indeed, even Newton’s theory of gravity has undergone serious revision since he first formulated it. We now know that the universe does not operate like a huge machine, and that while Newton’s calculations apply to what happens on our planet earth, the further away we get from earth, the less accurate they become. Without adjustments directed by Einstein’s theory of relativity, the rockets we landed on the moon would have missed their target and might even now be spiralling into outer space.

Many “facts” require far less extreme shifts of perspective to bring them into doubt. As a simple example, many major illnesses such as schizophrenia, manic depression, cancer, and heart disease run in families. Isn’t it obvious, therefore, as many thought, that they are genetic?

Well, no. Language runs in families, but it’s not genetic. Occupations tend to run in families, but they are not genetic. Discovering just how much any such correlation is due to genetic factors, to environmental conditions, or, most often, to an interaction between the two requires extraordinarily complex research and subsequent interpretation.

Besides facts, which frequently change as scientific theories change, there are also many important, even extremely significant, questions which science simply cannot address, let alone answer. One obvious example among many: what happens after we die? We may make assumptions, but we don’t know. And we can’t use the scientific method to find out.

So if one wants certainty, if one cannot live without absolute answers, science is not the place to go.

August 1, 2012

Arthur C Clarke once observed that the products of science are often indistinguishable from magic.

As a scientist, I have always felt that scientific conclusions are superior to those achieved as a result of what is generally labelled superstition and magic.

Right. But a great deal of what I accept as based on science I actually take on faith. One need not go into the depths of quantum physics where time and space are completely upended, where particles go in and out of existence, where before and after or up and down don’t obey the obvious rules we learn to live with every day, where particles seem to communicate with each other over hundreds or even thousands of miles by no visible means.

Even in everyday life as we all know it, the difference between magic and science isn’t really as obvious as we assume. Take something as ubiquitous as electricity, for example. I have taken the word of others that the two electrical wires leading to a light switch are carrying positive and negative charges. And do I really understand how throwing the switch actually produces energy, which then produces light?

Do I really understand how a message on the internet travels around the world in minutes? Not really. I just assume that some scientists somewhere have made it work. Or how my car runs by converting gasoline into rolling tires? No, I don’t. I could probably write a paragraph or two on the subject before it became apparent that I was babbling.

And do scientists or anybody else have hard evidence of what happens when we die? No. We have predispositions, hopes, fears, beliefs, fantasies. But we don’t know.

So today I turned the thesis around. If science can so often sound like magic, can magic also seem like science? Can magic and superstition, in other words, seem sensible?

I think they can. And to make the question even more complex, it is difficult, unless you are an expert in a particular field, to know for sure which is which. Even then, scientists capable of any honesty at all have said to themselves more than once, “I don’t understand how this could have happened. But all the evidence available to me is that it did happen.”

So these days I’m thinking that there is often more wisdom, more common sense, even, in what we often dismiss as superstition and magic, than I have thought.

Not that I’m recommending that we all start avoiding black cats or that we sit safely at home on every Friday the 13th.

But superstitions, unsubstantiated beliefs, and intuitions are often hypotheses that are worth examining for some underlying validity.

Or to put it another way: we often know more than we know we know. Neurologists estimate that about 10% of the activity of the brain actually reaches consciousness. But the other 90% is not gibberish.

July 9, 2012

I haven’t been able to find a lot of things in my life, but I can’t think of a single thing for which my search could possibly have lasted 40 years. I’d have given up long before then, if only because by the time I found the lost shoe, the misplaced spectacles, or the missing widget, it would be totally useless anyway.

Obviously the Higgs boson belongs in a different category altogether. Some of the best minds in the world, spending billions of dollars, have been looking for it. And now they think they’ve found it.

They are only pretty sure, mind you, because nothing in science is ever 100% certain. But they think they found it.

So why were they looking for the Higgs boson? Because up until now scientists have been unable to explain why the almost infinitesimally small particles that emerged with the Big Bang haven’t all just kept whizzing around at the speed of light in splendid isolation from each other. What made some of them slow down and mass together? In other words, what made the particles give up their individuality and bind together, ultimately forming atoms and molecules, the very stuff of matter? It must have been something very powerful, because bound together in the atomic nucleus are positively charged particles that one would expect to repel each other.

The Higgs boson is the telltale particle of the Higgs field, which slows down other particles moving through it and so creates bunches, or mass, or matter. It’s why irreverent scientists dubbed it the “God particle.”

Okay, it’s pretty easy to understand what the Higgs boson does.

What I can’t find anybody to explain is how it does it. How, for instance, did scientists know to search in a particular place of the particle spectrum? And how did they know it when they found it?

I haven’t a clue to the answer to this question.

I have the feeling I am going to have to ask one of those brainy scientists working on the problem to explain it.

But I have the scary feeling I won’t have any idea what any of them are talking about.

Anyway, scientists say that now that they have a pretty comprehensive theory about ordinary matter, which accounts for only about 4% of the Universe, they are free to start looking for dark matter, which makes up about 23% of it. The remaining 73% of the Universe is composed of dark energy, and even the brainiest scientists don’t really know what that is. A few scientists think it doesn’t exist at all. They say we only think it does because we are misinterpreting what we see.

May 24, 2012

One would think that of all the questions we humans might be unable to answer, a core issue at the heart of everyday experience, about which we all agree everywhere in the world, would not be one of the biggest problems physics faces.

But it is, and it’s a problem that revolves around the very nature of time and space. The laws of physics describing what happens on the quantum level of particles frequently contradict the laws of time and space as we all know them. Particles seem to go in and out of existence, for instance. And particles seem to be able to transcend space, influencing what happens halfway around the earth, and quite possibly much, much further. Up and down, before and after, inside and outside don’t operate on the quantum level in the ways we expect them to in the world in which we live.

I won’t embarrass myself by trying to explain this in greater depth. I would undoubtedly make a fool of myself.

But I do know enough to know that it is an unresolved problem that Einstein spent his life trying to solve. And couldn’t.

If even a genius can’t get out of bed in the morning and truly feel as if he understands what’s going on around him, there’s an awful lot we don’t know. If we don’t understand even the basics of time and space, and if our grasp of the nature of matter and energy is so tenuous, I think we can say there’s a big chunk we don’t understand.

Knowing how much we don’t know should keep us from getting too arrogant, shouldn’t it? I find it very liberating, but I think sometimes it might also be quite scary, sending us running for cover with a dubious collection of Right Answers.

April 8, 2012

I said in my post yesterday that Christianity has been facing the destruction of many of its myths since the beginning of the scientific revolution some 500 years ago. In a way, this is surprising, because historically Christianity has been absolutely brilliant at incorporating the myths of the peoples it converted. Christmas and Easter are perhaps the best-known examples, but there are hundreds of myths like this which have enriched Christianity over the centuries.

Christianity, like other great religions, addresses questions which cannot ultimately be answered by science – questions like: What happens after we die? Why do the innocent suffer and the guilty go unpunished? Why, then, has Christianity been so flummoxed by the discoveries and inventions of science?

I’m wondering if it is because so often Christian missionaries have arrived with so much more knowledge, so many treasures, so much temporal power that they have not felt the need to incorporate science into their myths. Christianity has so often been almost like a cargo cult. Christians had so much, it must have looked as if their god was indeed better than those already in residence.

And yet there is so much in science that could resonate with the basic Christian message. There are mysteries at the center of science that, for me at least, are as awe-filled and awful as the mystery of god. Aren’t concepts like dark energy, or the singularity out of which the universe burst some 13.7 billion years ago, an opportunity to suggest that we perhaps need to forego our anthropomorphic concepts of god arising from an era when gods lived in the mountains and growled in the thunder?

This is not an argument for God. “God” is a word that, perhaps because of my socialization, I cannot use to describe the mystery which seems to reside at the heart of existence. For a similar reason, perhaps, many of the Christian myths strike me as childish. But that is not true for hundreds of thousands of people who can draw great strength and courage from these myths. I watched my own mother face death from cancer at the age of 48, leaving behind ten children under the age of 20. The myth of heaven brought her great strength, and she faced her death with a courage that I still find simply incredible.

But it is my hope that more people who understand Christianity might be able to develop myths that incorporate the essence of the Christian message within a framework that does not require the believer to fear or distort science. Primack and Abrams, in their book The View from the Center of the Universe, are trying to do this in a way that I find quite tantalizing. They argue that science is showing that we are, in a most extraordinary way, quite literally in the center of the universe.

Similarly, Tony Equale in his Easter post has taken the message of god’s universal love to a new and challenging level. Five hundred years ago Galileo challenged the Church to stretch its world beyond that of Earth. This post challenges us to stretch god’s love beyond our universe, beyond time as we know it.

It seems to me that our religious myths must begin to develop this way, or there will be an ever-widening gap between science and religion. This would be a terrible loss. Science is too powerful to go unexamined; it is potentially too destructive not to be submitted to the demands of respect and even love for our fellow man.

February 9, 2012

Peter’s computer crashed last night. I don’t mean crashed/reboot/you’ve lost the afternoon’s work you haven’t saved. I mean crashed, terminally. The computer has no more life: it is beyond all the revival techniques that 35 years of experience have taught us.

All right, I recognize it’s not up there with life’s Tier One challenges, like the diagnosis of an incurable disease or the break-up of a significant relationship. But it does belong a little higher up the list than the washing machine breaking down or burning the bacon.

As I was looking up a local contact for computer data recovery (also frequently sympathetically referred to as “disaster recovery”) I was remembering Google’s enthusiasm for the prospect of a brain implant that may one day soon enable one to access the internet merely by thinking about it.

I’m not kidding. There are people seriously thinking about how this can be done. Researchers are already exploring how to blunt traumatic memories that cause acute and ongoing distress. While this sounds quite benevolent, the potential possibilities also sound terrifyingly malevolent to me.

Already, having someone hack into my desktop computer is disruptive, and losing my computer altogether can make me feel as if half my brain has been disabled. What if I have a brain implant that spreads the entire internet out before me as a mere thought? On good days I might start feeling like a genius. But what if somebody hacked into my brain?

September 26, 2011

Physicists at the CERN laboratory in Geneva and at the underground facility of Italy’s Gran Sasso National Laboratory seem to have observed the impossible.

Particles sent from Geneva to the facility in Italy seem to have arrived faster than the speed of light.

Well, so what?

What this means is that if anything in the entire universe can travel faster than light, the very foundations of our understanding of the physical world are profoundly shaken. Our very understanding of causality begins to fall apart. And useful as it may be for getting us around the universe, Einstein’s theory, based on the hitherto unquestioned assumption that nothing can, ever has, or ever will travel faster than light, is deeply flawed.

Of course, scientists are not absolutely sure yet about their data. The scientists who made the original observations are asking the entire scientific community to look at their data for potential errors. And the trials will be repeated.

In fact, they are so baffled that many scientists fully expect some kind of glitch in the procedure to explain the apparently inexplicable.

What I personally find so wonderful about this report, though, is not that light might not be the fastest thing around. What I love is that scientists can look at what might be an incredibly disruptive piece of evidence and not deny it. They can’t explain it, but they are totally fascinated, and totally – if painfully – open to the possibility that this key idea which has been accepted as just about absolute for over a century, and on which so much of our understanding of the universe depends, might be wrong.

July 8, 2011

It is sometimes surprising how radically the meaning of something can change when we put the exact same words in a different context.

My father-in-law used to refer to his wife as “the rose in his garden.” He was obviously not referring to a botanical reality. He did, however, have a rose bush in his garden, and when we pruned it in the fall he said we should “murder” it – just as obviously not referring to his wife. Actually, he didn’t mean to literally murder the bush either.

Sometimes we can mistake the context of an idea, or not recognize that there is a context at all. When I was about three, my mother once explained that my dad would be late for dinner because he was “tied up on the road.” I took her literally, and wondered how my father was going to unknot the ropes which were tying him up.

It’s not just the thinking of individuals that we need to understand in context. Stories, literature, art, myth, even laws have to be understood in context.

So too, scientific thinking, like all other thought, has developed in a context. Modern science emerged in fifteenth- and sixteenth-century Europe, when the political power of the Roman Church was still paramount. As the story of Galileo demonstrates, it was often dangerous to think scientifically, and scientists for centuries trod a careful path between evaluating observable evidence and avoiding censure.

One of the attempts to do this was to emphasize that science was concerned only with explanations provided by natural laws. Theological thought, God, and the rest of the spiritual world belonged solely to the authority of the Church. This was not a totally successful strategy – it still isn’t today – but it often created the space for some kind of co-existence.

But the thinking of scientists was not uninfluenced by this compromise. One of the most critical concessions was the scientists’ implicit agreement to keep hands off the supernatural world. Without realizing it, I believe, scientists surrendered a part of reality to the religious authorities. Plato had posited a supernatural world in the first place to explain how it was possible for us to have ideas about perfect things that do not actually exist in the concrete world of our experience.

In giving up the supernatural world as an explanation for natural events, scientists inadvertently consigned consciousness to the spiritual realm. Consciousness, intention, and goal-seeking were dismissed as potentially authentic scientific explanations.

This surrender was given a major boost by Newton’s theory of gravity. What Newton’s theory did was to mechanize the scientific view of the universe. It was – and is – an incredibly powerful theory, and it displaced angels and similar explanations for how the stars stay up in the sky while apples fall to the ground.

Ironically, Newton himself recognized that gravity was not exactly a mechanical dynamic. Gravity does not involve any physical contact between the bodies acting on each other, and Newton was forced to posit an additional mysterious force which he did not elaborate. It is perhaps comparable to the concept of dark energy, which scientists today posit must be out there, but which they cannot yet define.

This mechanization represented a huge leap forward in our understanding of the universe. But it also led to what I think was a terrible loss from which we are only just beginning to recover in the last one hundred years or so.

With mechanization, the universe became passive. Dynamic forces were dismissed as epiphenomena – not real in themselves. Animals were said to be no more than sophisticated machines, and were subjected to horribly cruel experiments on the grounds that they were no more capable of feeling than a car engine. Everything that ever happened or ever would happen was the result of the operation of completely mechanical forces rolling irrevocably forward to final entropy.

Science today sees the universe as far more dynamic. Matter itself is no longer perceived as purely passive, but as containing within itself the very energy that pushes the unfolding universe forward.

In the past, and for some even today, the suggestion that the universe possesses an intrinsic dynamic seemed dangerously close to a return to supernatural explanations. It was sneaking God in by the back door.

But today as quantum mechanics on one hand and astrophysics on the other are revealing realities to us that are incomprehensible in terms of our common sense conclusions, mysteries such as the thought processes that Plato found so inexplicable, or the potential of an intrinsically dynamic universe are no longer so intolerable. Scientists are less terrified of questions that in the past might have been held up as evidence of God and a supernatural world.

I think that in some ways this is giving us our world back. One can be a scientist without having to deny the essential reality of so many of those realities that were originally surrendered to the Church.

It’s why I myself find it so liberating to accept that I live inevitably in mystery. It means I don’t have to distort my experience to fit theory.

I can just say “we don’t know yet.”

And we’re never ever going to get to the point where there isn’t something we can’t explain.

April 27, 2011

I’ve just read a blog post on cairns and other stone monuments and messages in Japan. They focus on earthquakes and tsunamis, both as memorials to the thousands of people who have died in these recurring disasters in Japan, and as advice to those now living – sometimes centuries later.

Ever since I was introduced to them in the hills and mountains of Britain, I have been fascinated by these stone monuments, some of which are probably as old as ten thousand years.

Sometimes they are piles of small stones left as markers along the way for other travellers. Sometimes they are carvings, often representing religious symbols. Sometimes the stones are very big, arranged in circles where the community gathered. Sometimes, just as in our cemeteries today, they are markers of burial grounds. I understand from personal experience how awe-inspiring standing amidst some of these stones and circles can be.

We now recognize that some of the stone arrangements are extraordinarily sophisticated. Stonehenge, for instance, may have worked like a huge astronomical calculator, predicting eclipses and other crucial events in the astronomical calendar.

Until now, I thought these stone monuments were concentrated in Britain and western Europe. Now that I have discovered that they are as far-flung as Japan, I have begun to wonder again about stones.

Where does the sense of awe so often inspired by these stones come from? Why is there this sense that stones possess some kind of sacredness?

The more I learn about modern science, especially about astrophysics on the one side and quantum mechanics on the other, the more I realize just how profound the mystery of this universe in which we live is. I realize that if I were as smart as some of the scientists working in these fields, the mystery would be even greater, not smaller. It is they who talk about completely revolutionizing physics in order to make sense of what they are observing. One example among many is the difficulty of explaining in any reasonable way how particles thousands of miles apart are able to influence each other apparently instantaneously – far faster than any light signal could travel between them.

When I read these things, I feel less worried that my own hunches or hypotheses are whacky. They aren’t nearly whacky enough to explain what scientists are discovering.

And so without further apology, I offer my hypothesis that ancient people sensed something legitimate about stone – some potential for life that modern science lost when it separated the universe into inert matter which belongs to the domain of science and living spirit, which belongs to the domain of philosophy and religion.

What if matter itself is dynamic? Einstein’s theory suggests that it is – that matter and energy are interchangeable forms of the same thing. It suggests that life does not belong to another world but is part and parcel of this one.

What if stones in their very substance really do represent the quivering potential of life?

April 26, 2011

The Times here on Saturday did a full-page feature on ten things that they predict will shape the near future. By near future they are talking about this century, though some of the predictions are for the next twenty years. These include internet contact lenses – LED arrays that will enable people wearing them to access the internet simply by blinking – a computer-supported kind of telepathy, and an unlimited capacity to grow spare organs and body parts as needed.

Within 50-60 years they are predicting that extinct animals will have been brought back to life – maybe even Neanderthal man – and that aging will have been dramatically slowed down.

By 2100, cancer will no longer be life threatening, people will be building star ships and space elevators, and robotic machines will have merged with some forms of life.

My first thought was that I’m not all that sorry that I won’t be around to live in this brave new world.

My second thought was that some things never change. If we are still thinking beings in a century, we will still be wondering about the meaning of life, the nature of the universe, and whether there is any way to beat death rather than simply delay it until we are a thousand years old or so.

March 30, 2011

I have often thought that a key problem with religious fundamentalism is that believers have lost the capacity to understand symbolic thought. They have lost the capacity to value poetry as a kind of truth. Metaphor is dismissed as soft-headed thinking.

According to fundamentalists, God could not possibly be speaking in metaphors or symbols. The only valid interpretation of the Bible is a literal one. And so they get themselves into the impossible position of arguing, against all the evidence, that the world was created in six days about six thousand years ago, and that Adam and Eve lived in the garden of Eden until they listened to the advice of the snake.

This is actually a rather recent interpretation of biblical truth. For thousands of years the scriptures were interpreted metaphorically. The Hebrews themselves understood metaphor, a fact illustrated by Jesus’ frequent use of parables to teach a universal truth or principle.

So where has this anti-symbolic prejudice come from?

As a scientist I hate to say this. But I fear that at least part of the problem comes from us. For hundreds of years there have been scientists who have argued that the only reliable, verifiable source of truth is through the application of the scientific method.

Valuable as I think it is, the scientific method does not “do” metaphor. The very first thing one learns in applying the scientific method is to define one’s terms clearly, unambiguously, and concretely. In other words, science defines its terms literally.

This is fine. This is legitimate and immensely valuable.

But over the years, scientific thought has in the minds of many taken a place of precedence. It is seen as superior to any other kind of thinking. And so when funds are cut by stretched governments or educational institutions, the arts are cut rather than science. Film, literature, sculpture, poetry, music are all “soft” subjects, rather like dessert following a meal. They are pleasant and enjoyable, but in tough times, they are a luxury.

Similarly, only highly intelligent students are considered suitable for careers in the hard sciences. Poetry and story-writing and composing music again are seen as wonderful contributions to a society that can afford such luxuries.

Which is why I fear that scientific thinking has contributed to the impoverishment that I believe is reflected in so much fundamentalist thought.

Yes, I am a scientist by nature, by education, by choice. I value what it has given me, which has included moments of intense, even transcendent, joy and profound insight.

But to think that the arts are somehow secondary in our endeavour to become fulfilled human beings is a terrible impoverishment. They aren’t a luxury. We need them in the worst of times possibly even more than we need them in the best.

March 8, 2011

The work of a professor from Cambridge University studying elephants was reported by the BBC this morning. The report includes a video of elephants cooperating with each other to solve a problem that could not be solved by either of them alone, and suggests that this is an astonishing discovery.

It goes on to say – as if this were revolutionary – that researchers have actually documented elephants in the wild helping each other, and that we see “amazingly complex behaviours – culture, tool use, social interaction” throughout the animal kingdom.

How is it that we are just discovering what our ancestors before us knew tens of thousands of years ago – that we share with other animals our intelligence, our problem-solving abilities, our capacity to work together for mutual benefit?

My guess is that it started when Christianity adopted the Platonic two-world theory by teaching that Homo sapiens is unique, that we were created directly by God, that we have souls that set us apart from every other living thing in creation. Bit by bit, that alienated us from all the other animals.

In some ways, it may even have alienated us from any other members of our species whom we did not recognize. So that we have the Spanish conquistadores in what is now Latin America asking the pope in Rome to determine whether the natives were actually human. Even in the 20th century, we have American immigration authorities declaring that newly arrived European immigrants were significantly less intelligent than those already here, because intelligence tests were given to non-English-speaking immigrants in English.

February 9, 2011

There is a theory which states that if ever anyone discovers exactly what the Universe is for and why it is here, it will instantly disappear and be replaced by something even more bizarre and inexplicable. There is another theory which states that this has already happened.

I don’t actually remember when people thought the world was flat and that the sky was held up by sturdy poles at the four corners.

But the world must have seemed quite simple then. And it would have explained most of one’s daily experiences as one walked around weeding the garden, hunting rabbits, or starting the fire for the evening meal.

Yes, there would have been a few niggling questions for people who were never satisfied. Like why the positions of the stars in the night sky changed in such complicated ways. Or why sailing ships seemed to rise out of the sea when they appeared on the horizon, and to sink into it when they sailed out.

But mostly things made sense for most people.

Then Copernicus, and then Galileo, came along and answered those niggling questions. The ships weren’t sinking as they went out to sea: it just looked like that because the world is round, not flat. And the stars’ positions aren’t really that complicated once we understand that the earth is whirling around the sun, not the other way around, while turning on its own axis at the same time.

The problem is that these answers created a whole handful of even more niggling questions. Like why apples fall from the tree but the stars don’t fall from the sky. What is holding them up?

Eventually Newton came along with the theory of gravity, which “explained” why things fall or don’t fall to the earth. Or why, for that matter, we don’t all fall off the earth if it’s whirling around so fast in empty space.

Except that Newton didn’t really explain what gravity is. What he did was develop a mathematical theory that enables us to predict with a fair amount of accuracy (though not absolute accuracy) when and what will fall to earth and what won’t. But to this day we don’t really know what gravity is. We only know the conditions under which it works.
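
Newton’s achievement, in other words, was predictive rather than explanatory, and the prediction really is remarkably compact. As a sketch (the constants are standard textbook values, and the function name is mine), the same inverse-square formula that governs falling apples recovers the familiar acceleration at the Earth’s surface:

```python
# Newton's law of universal gravitation: F = G * m1 * m2 / r^2.
# For an object at the Earth's surface, the predicted acceleration
# is g = G * M / R^2, independent of the object's own mass.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24   # mass of the Earth, kg
R_EARTH = 6.371e6    # mean radius of the Earth, m

def surface_gravity(mass_kg, radius_m):
    """Predicted acceleration due to gravity at the surface of a body."""
    return G * mass_kg / radius_m**2

g = surface_gravity(M_EARTH, R_EARTH)
print(f"Predicted g at the Earth's surface: {g:.2f} m/s^2")
```

Out comes the familiar 9.8 metres per second squared – a prediction, note, not an explanation of what gravity actually is.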

And then Einstein came along and answered another niggling problem, about the relationship between energy and matter: E = mc².

Except that Einstein’s theory made even Newton’s world look simple. It demonstrates that time and space – both of which seem fundamentally non-negotiable to most of us most of the time – are relative. “Time” for a fast-moving object in outer space runs at a different rate than time on earth. And when time is different, so is space – which doesn’t run in straight lines anyway once you get off earth, because space curves, which is why parallel lines can eventually meet if you go out far enough.

Quantum mechanics is another world of answers that you don’t want to know about if you really just want your questions settled. Once we get to the quantum level of the super-small, particles don’t seem to obey any of the rules we mortals in the grown-up world have to obey. Particles go in and out of existence. And a phenomenon called entanglement suggests that they can somehow influence each other across the expanse of the universe apparently instantaneously – far faster than light could travel between them.

It goes on and on like this. The more little niggling questions we answer, the more imponderable the next questions seem to become.

So all in all, I think Douglas Adams was right.

On the other hand, I personally agree that we should stop shooting the elephants. I don’t think it’s all that complicated to figure out.

February 3, 2011

This blog, if it stands for anything, stands for the idea that there is always another way of looking at things. Most often I find alternative and unexpected interpretations liberating and exciting. But that does not mean that I think all interpretations and explanations are equally convincing. I don’t. Some possibilities must be dismissed as wishful thinking at best, as wilful distortions at worst.

The BBC is airing a documentary that explores what it is calling “the darker side” of Abraham Lincoln. I have not seen the programme yet, but the BBC claims it explores evidence that Lincoln secretly planned to deport back to Africa the black slaves whom he had freed with the Emancipation Proclamation.

I’m dubious but also disturbed. I am aware that the new vogue among historians is to look at the evidence and put together a different story than the one we have always told ourselves. The newer or more radical the story, the more likely it is to be accepted for a Ph.D. dissertation, a best-selling book, or the basis of a television series.

I can’t help but support the call to question whether our classical interpretations of the past are always right. But as with all research, some results are more professionally rewarding than others. Some results provide the author with greater prestige, greater financial reward, greater professional recognition.

We have seen, for instance, that finding a statistically significant improvement among those who take a new, experimental (and expensive) Drug A compared to those who take an old (and usually much less expensive) Drug B is more apt to lead to big funding than finding no important difference between Drugs A and B.

And I fear it is the same with history. Some stories are more radical, more surprising, more innovative. So they are much more apt to attract television audiences.

But I don’t necessarily trust all these new stories and the new evidence upon which they are purportedly based.

Just because they are on television doesn’t make them any more valid than the promises made by internet sources trying to sell cheap Viagra and other enhancing products.

There’s no way around the hard work of looking at the evidence.

And living with the knowledge that even then, one risks being absolutely wrong.

January 7, 2011

As I’ve said in my last two posts, The Decline Effect reveals another source of uncertainty, another reason why scientific facts might be wrong. So is science so riddled with potential error that we should give it up? Is it too biased to bother with?

Only people who have not understood the nature of science have ever thought that science was infallible. What the Decline Effect has done is open up a source of doubt that might be much more gaping than we had previously suspected. We don’t really know how big the problem might be. But we know it’s there.

So is science worth all the trouble? Should we go back to relying on common sense and intuition, and to believing what our elders tell us? Isn’t that just as good? Or maybe even better?

No, I don’t think we should discard the scientific pursuit.

First of all, look at what science – faulty as it may be – has done for us that no other approach has come near. Science has put us on the moon and will probably get us to Mars. It has eliminated smallpox from the face of the Earth, and through vaccinations has saved millions of people from the devastations of polio, whooping cough, and measles. It has parked cars in our garages, put computers on our desks, mobile phones into our pockets, televisions into our homes. In the last 150 years, it has increased human life expectancy in the world by more than 25 years.

Has science ever gotten things wrong? Indisputably yes! But it is invariably scientists themselves who have noticed that it was wrong and often righted it. The Decline Effect was first noticed by a scientist and it is scientists who are going to find ways to reduce its distorting effects.

In that sense, the Decline Effect hasn’t changed anything. An attitude of questioning has always been the most scientifically intelligent approach.

January 6, 2011

Yesterday’s post described the Decline Effect, the phenomenon which seems to have appeared in almost every area of science, in which the results which seemed initially to be very strong get weaker and weaker as studies replicate the original research.

Carelessness in carrying out the original research, or even deliberate fraud, does not seem to explain why the decline effect is happening, if only because nobody seems to be gaining from it. The decline was first identified by someone trying to replicate his own research.

There might still, however, be a weakness in the way the scientific method is being implemented by the scientific community. Ironically, globalization might be both making it harder to detect and exacerbating the decline.

The great benefit of replication, of research being repeated many times by many different researchers, is that it should reveal the errors and assumptions that any individual researcher may unconsciously be bringing to his or her work and which are distorting the outcomes.

Some of the strongest and most frequently unquestioned assumptions arise out of earlier research which has been broadly accepted. So nobody is questioning them, even if they may be erroneous. Some of these assumptions are widespread and held by the general public. For instance, that gravity is what makes objects fall to earth. Some assumptions are less well-known but are apt to be held by the majority of specialists in a particular field. For instance, the assumption that the ability to see in three dimensions is limited if one is using only one eye. Some assumptions are so deep that they are not even recognized as assumptions. Whatever the combination of assumptions, they are always there. It is impossible for any of us to perceive the world without making them. What is observed is always and inescapably influenced by the perspective of the viewer as well as by the nature of the object being examined.

The increasing globalization of scientific theory and research, especially through the internet, might make recognizing some of these assumptions even more difficult because they are more universally shared. Scientists everywhere may be subject to the same distortions in their research, and so come up with similar errors. This would mean it would take much longer for the Decline Effect to emerge. Instead of a year or two before it becomes clear that the original research results are not being replicated, it may take a decade or two.

Sometimes, of course, research might not be replicated at all for a very long time. It may be too expensive, or take too long, or not seem to be of sufficient importance.

Finally, there is the problem of statistical significance. There is a huge bias in science toward publishing research with statistically significant results – in other words, research demonstrating that some variable makes a difference. Publishing research which concludes that no distinguishing effects were found feels like failure. It’s also much harder to get results which are not statistically significant published at all, because professional journals prefer research with positive results. Besides, positive findings are conclusive in a way that negative ones aren’t.

For example, suppose a study of Vitamin B12 supplements concludes that they can reduce memory loss in old age. That’s good news. We can do something about some of that distressing memory loss. But if, on the other hand, the research finds no discernible differences, there is always the possibility that different research might still turn up something. Different doses, different combinations of vitamins, different populations, different measurements of cognitive functioning – all of them might turn up some result. In the meantime, the message is that there’s no new advice for dealing with our forgetfulness. Not nearly as many people are going to be interested in that headline.

In fact, when scientists don’t get a statistically significant result, they will often trawl through the data they have already collected to see if they can’t find something they hadn’t realized was there and had not, actually, been planning on looking for. I know. I’ve done it. And I know dozens of reputable researchers who have done it. It is not considered second-rate science to do this. On the contrary.

But it creates a bias against finding results that are less striking than have already been reported in the research.
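The arithmetic behind this trawling bias is simple. Assuming each outcome is tested independently at the conventional 5% level, the chance of at least one spurious “significant” finding grows quickly with the number of outcomes examined – a rough sketch, with the number of outcomes chosen purely for illustration:

```python
# Probability of at least one spurious 'significant' finding when
# trawling k independent outcomes in data with no real effects,
# each outcome tested at the 5% level.
for k in (1, 5, 10, 20):
    p_any = 1 - 0.95 ** k
    print(f"{k:2d} outcomes tested -> {p_any:.0%} chance of a false finding")
```

With twenty outcomes, the chance of stumbling on at least one “discovery” in pure noise is nearly two in three, which is why a finding fished out of already-collected data deserves far more suspicion than one the study was designed to test.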

Imagine you are a young, ambitious scientist eager to make a mark. Are you going to deliberately put your professional time and energy into doing research that is going to refute research which has already been published and is quite possibly lauded by the professional community? Is it the wisest thing to begin by pitting your findings against received wisdom? Wouldn’t it be better to strike out and find out something new and positive and statistically significant?

My strong suspicion is that the Decline Effect is the result of a widespread but indeliberate failure to adequately replicate initial research that reports some supposedly significant finding. And so it is taking us longer to sort out the wheat from the chaff, to identify those findings that are robust, and those that were based merely on chance.

The almost universally accepted cut-off is a 5% probability that the results are caused by chance. The real error rate is even higher than that if widespread biases are influencing more than one researcher in similar ways.

Almost certainly then, at least 5% of all research reporting a positive result is simply chance and not a real effect at all.
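That 5% figure can be demonstrated with a small simulation. The sketch below (Python, standard library only; the set-up is invented for illustration) runs thousands of comparisons between two groups where there is truly no difference at all, and counts how often the result nonetheless crosses the conventional significance threshold:

```python
import random

random.seed(42)  # fixed seed so the run is repeatable

def t_stat(a, b):
    """Welch t statistic for two independent samples."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    return (ma - mb) / ((va / na + vb / nb) ** 0.5)

def false_positives(n_tests=20, n=100):
    """Run n_tests comparisons where there is truly no effect;
    count how many look 'significant' at the 5% level (|t| > 1.96)."""
    hits = 0
    for _ in range(n_tests):
        treated = [random.gauss(0, 1) for _ in range(n)]
        control = [random.gauss(0, 1) for _ in range(n)]
        if abs(t_stat(treated, control)) > 1.96:
            hits += 1
    return hits

# Across many simulated studies, roughly 1 comparison in 20
# crosses the threshold despite there being nothing to find.
total = sum(false_positives() for _ in range(500))
rate = total / (500 * 20)
print(f"false positive rate: {rate:.3f}")
```

The simulated rate hovers around 0.05, as the theory predicts: a twentieth of null results dressed up as discoveries, before any bias is even added.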

The one thing science cannot afford to do is to let go of real, robust replication – including studies that report that they have been unable to find anything significant at all.

January 5, 2011

I was a young graduate student when I first read a book by Thomas Kuhn which convinced me that scientific fact could never be absolutely certain. The reason is that a new theory might always displace an earlier theory, which then puts everything we thought we “knew” into a different perspective. And indeed this has happened more than once in the last four centuries.

As a result of understanding this, I have always been comfortable with the changing landscape and fundamental potential uncertainties of science. But a friend has just sent me an article published last month in the New Yorker magazine that is, for me at least, as big a bombshell to the certainties of scientific fact as Kuhn was.

The article is about what is called the “Decline Effect.” Fundamentally, this is a phenomenon in which scientific findings get progressively weaker as they are replicated. The effects of a new generation of anti-psychotic drugs, for instance, which twenty years ago seemed to reduce psychotic symptoms dramatically, have bit by bit dribbled away.

The Decline Effect, though, has not just appeared in relation to research into the effectiveness of new drugs and medicines. It has shown up in psychology, in biology, in physics, in fact in almost every area of science.

What in heaven’s name is going on? Two holy grails of the scientific method – replication and statistical analysis – were supposed to prevent this from happening. Are they breaking down?

Replication has been intrinsic to the scientific method since it was first formulated and refined five hundred years ago. Around the world, it has been almost an article of faith that scientific results must be repeated by other scientists, doing their own observations and producing their own data independently. This seemed to be the cast-iron assurance against not only outright fraud but against careless research or against the inevitable bias that any individual might, in all innocence, impose on his or her research.

The essence of statistical analysis is deciding whether some event probably happened by chance or not. Although analyses have become extraordinarily sophisticated with the increasing computer power in the modern world, the principle remains the same.

For instance, walking down the street, there is one chance in 365 that the next person I see will have been born on the same day of the year as I was. There is one chance in two that if I flip a penny, it will come up heads. Science uses statistical analysis to decide if the effect connected to some variable happened by chance or is what is called “statistically significant.”

Did this group of people, for instance, who took an aspirin every day for the last ten years have lower levels of cancer than a similar group of people who didn’t regularly take aspirin? And if the first group did have lower levels of cancer, was that merely a chance difference, or were the differences so great that the probability that it was chance is minuscule?
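As a rough illustration of the aspirin question – with wholly invented numbers – the comparison can be run as a standard two-proportion z-test, where a statistic beyond roughly ±1.96 means the difference would arise by chance less than 5% of the time:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """z statistic for the difference between two proportions."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)  # pooled rate under 'no real difference'
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Invented numbers: 120 cancers among 10,000 aspirin takers
# versus 160 among 10,000 non-takers.
z = two_proportion_z(120, 10_000, 160, 10_000)
print(f"z = {z:.2f}")  # |z| > 1.96 -> unlikely to be mere chance
```

With these made-up counts the statistic lands a little beyond the cut-off, so the difference would be declared significant – but only just, which is exactly the sort of result the Decline Effect tends to erode on replication.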

What the decline effect is suggesting is that results that at first look as if they could almost certainly not have happened by chance gradually look more and more like chance with repeated replication.

The Decline Effect, of course, creates a cloud of doubt over huge swathes of accepted scientific findings. And scientists are not yet totally agreed as to its causes or its cures.

But there are some potential explanations, and their exploration will almost certainly have an influence on the way science is practiced in the future. More about what they might be in my next post on the subject.

December 26, 2010

Bad news inevitably makes better news than good news. Good news gets put into a small specialty spot to create a feel-good moment, but unless it’s our team that has just won, it hardly ever makes the headlines.

For instance, which of these headlines do you think are most apt to make the front page of The Times:

Factory head indicted for stealing factory goods over a period of ten years!

When things are getting better, it’s really hard to tell.

So on this dark day of 2010, here is a statistic that is actually honestly real. And it really is good news.

Two hundred years ago, only two countries in the entire world had an average life expectancy above 40 years. And even in the United Kingdom and the Netherlands, where life expectancy was above 40, it was only by a few years.

Today, average life expectancy worldwide has risen to around 70 years. This is in spite of two world wars, in spite of an unprecedented increase in population, in spite of AIDS and a flu epidemic. In spite of our selfishness, our greed, our stupidity. In spite of everything else we think is wrong, and everything that really is wrong.

Okay, okay, I agree. That doesn’t mean things are going to continue to get better. It doesn’t even mean we might not still destroy our environment and ourselves in the future.

But still, I had no idea that the change in two hundred years had been so encouraging.

An awful lot of what a lot of people have done must have been worth it after all.

December 20, 2010

Here in Britain we are facing cold that is literally breaking records. Not just for the last ten years, or “since the 1962 winter,” but quite possibly the lowest temperatures ever recorded in modern Britain. Thankfully the records do not extend back to the ice age 15,000 years ago. They don’t even extend to the 17th-century mini-ice age, when people in London crossed the River Thames on foot and in New York walked from Manhattan to Staten Island.

Yet modern records are bad enough. During the winter of 1962 (which those who lived here then remember with a shudder), it did not get over 5 degrees anywhere in Britain before mid-March.

This morning a meteorologist reminded me of something I’ve known for years but have tried to forget. Instead of global warming making Britain a warmer, if wetter, place, it could shift the Gulf Stream south. In that case our weather here in Britain would resemble that of Siberia or Alaska, which lie at the same latitude.

This winter the Gulf Stream has shifted south, and is highly unlikely to return here this year. And it does feel like Siberia or Alaska right now.

Well, we will survive that. But what if the Gulf Stream never returns? I do not find the scientific evidence comforting.

December 16, 2010

Several billion years ago – that is, when Earth was about half as old as it is now – algae were one of the most common visible forms of life on the planet. This kind of pond life was a key player in generating the oxygen that made it possible for life to leave its ocean home and set up a permanent home on land. Algae remain an essential part of our ecosystem, providing food for many sea animals and contributing significantly to the oxygen supply for those of us who breathe on land.

Algae may now be poised to make another significant contribution. The US Navy has taken delivery of 20,000 gallons of jet fuel produced by – yes – algae.

If it works, and if algae-based jet fuel can be scaled up so that it can be produced cheaply and in sufficient quantity, it can transform life once again.

As an alternative to fossil fuel, the contribution of algae-based fuel to the environment could be immense. The political implications could be equally monumental. Countries whose economies rely on oil exports as well as those relying on its import to meet their energy needs could find that they are dealing with completely revolutionized variables.

Will it happen? It would require a lot of water under cultivation. An area the size of South Carolina would be required to meet America’s transport needs alone. Algae farms covering a country the size of Portugal would be needed to service the transport needs of Europe.

The farms would not, of course, all have to be located in a single vast pond. But if they are divided up into many small puddles, the algae would then have to be transferred to central refineries for processing. And that, of course, would add to the fuel costs.

I am not benignly confident, but I am hopeful.

The human species, along with a number of arguably less admirable qualities, is marvellously inventive.

December 12, 2010

Unlike other societies, who made images of their gods and carried them around with them, the Hebrews were forbidden to make images of God. Their prophets destroyed such images in righteous fury.

Instead, the nomadic Hebrews carried an empty tent which symbolized the presence of their God in their midst. God could not be reduced to images. Yahweh is He-Whose-Name-Cannot-Be-Spoken — He-Whom-We-Cannot-Know. The Hebrews did not think their sacred writings were inspired by God with a message meant to be understood literally.

Actually, what I like about the Hebraic concept of Yahweh is that in some ways it resembles the Universe. It is an amazing, fantastic, inspiring and inexhaustible reality about which we can always understand more and never comprehend totally.

December 11, 2010

Earlier this week we watched a tv programme in which Jacob Bronowski’s daughter, now in her sixties, tried to understand her father, who died more than 30 years ago. If you are old enough, you may remember the tv series The Ascent of Man, in which Bronowski argued that science was one of the greatest forces for good that mankind has at its disposal.

His daughter, now an eminent historian, was greatly influenced by her father and his ideals, and it was a great shock for her to discover recently that during WWII he had used his rarefied mathematical ability to help Allied planes drop bombs on German cities like Dresden and Berlin so as to maximize damage. How, she wondered, did he live with this? And what did he think and feel when he was sent after the war by the military to assess the damage of the atomic bombs dropped on Hiroshima and Nagasaki?

She discovered that Bronowski was not a man who went through life undisturbed by the decisions he had made. In fact, she discovered and presented some gripping video in which her father faces the issue straight on.

He stands on the edge of a field at Auschwitz where the human ash of hundreds of thousands of Nazi victims of the gas ovens was dumped. He bends down and takes the ashes into his hands and looks at them intently. It is not possible, he said, to be absolutely sure about the choices we make. Life faces us with dilemmas in which the moral choice is not always clear, in which doing anything or even nothing seems tainted with guilt. But we must, he said, make the choices to the best of our ability at the time.

But we may be wrong.

We may be horribly, terribly wrong.

And then he quoted again that wrenching plea from Oliver Cromwell: I beseech you, in the bowels of Christ, think it possible you may be mistaken. If only people like Hitler or Stalin had remembered that. It was not science, he said, that murdered these people. It was people who would not accept that they could be wrong.

I think this is the key to why Bronowski so valued science. Because built into science is the unrelenting possibility that evidence will prove one wrong.

The history of science is littered with theories that have been accepted and then replaced by new theories because there is new evidence, or a new way of looking at old evidence. Even Newton’s theory of gravity has not remained unscathed. No scientist now believes the universe runs solely on the mechanical principles that Newton thought it did.

For science, there is always the potential for looking at things from another point of view. There is always the possibility that I am wrong.

To live with that knowledge, not just in relation to science, but in relation to all the judgements I make, seems to me to demand great strength of character and discipline, but also to offer great rewards.

To be in the middle of a heated fight and to consider that there may be another legitimate point of view is not easy. But it may save a relationship.

To be able to consider the possibility that someone who has done great harm may be something else besides a selfish brute who should be kept in prison for life may be the path to forgiveness, even of ourselves.

To be able to see something from a different point of view is the essential ingredient of creativity.

To believe that what seem to be two diametrically opposed ideas might somehow be compatible might reduce my bigotry.

Whatever else, it invariably makes life a surprise.

I have thought for a long time that the key moral value underlying science was to tell the truth – not to lie about what one saw. But I think now there is another moral principle just as important, just as hard to practice, and just as applicable in the world beyond science: to always remember that I might be wrong; that there might be another point of view that is just as legitimate, or even sometimes more legitimate, than the one of which I am so convinced.

December 9, 2010

It’s not clear to me whether Iannis Xenakis thought of himself more as a musician or an architect. He was outstandingly brilliant in both. Mathematics was a bridge for him: he found music in his modernistic architecture, and geometry and set theory in his music.

But he was no mere translator of music into numbers.

Music, he said, was a way to find answers to phenomena we don’t understand.

Xenakis does not seem to have talked much about what he came to understand through music. Perhaps he felt the music said it for him.

If so, I think I understand. Music often gives me a glimpse of things I do not yet have words for. It’s like an arrow pointing to something which I cannot yet reach through rational or scientific analysis.

I have too great a legacy of distrust for belief based on “faith” to place unquestioning trust in insights music suggests to me. But I take it seriously. And it has given me immense joy and strength.

December 2, 2010

Raised as a Catholic, I was taught from an early age that human beings are damaged. The doctrine of original sin is the explanation for the loss of the mythical Garden of Eden, when everything was idyllic, without conflict or injustice. We are the reason everything went wrong. Adam and Eve committed some grave sin, and were thrown out of Eden by an angry God who cursed all human offspring with the stain of sin from birth.

Put that way, it makes God sound like a vengeful, vindictive, unforgiving, and unloving tyrant, and it makes no sense to me. But this idea that we are cursed by an essential self-seeking sinfulness goes back much further than the Old Testament and survives in much of current scientific theory.

I think part of the reason the idea is so tenacious is that it carries the message that we are not helpless – that by being good we can influence what happens to us. I think we would often prefer to feel responsible rather than helpless, even if the cost is admitting to inescapable guilt.

I abandoned the idea of original sin many decades ago. But it is only now that I have begun to question the source of the similarity between original sin and Freud’s id, that self-seeking pleasure principle which Freud believed is the core of our energy. Behaviorism’s insistence that what we do is determined by punishment and reward is, paradoxically, another version of the same thing.

It is in this context that I found myself objecting to Dawkins’ reference to our genes as “selfish.” In his comments on my earlier post, Chris Lawrence equates Dawkins’ use of “selfish” to describe genes with the poetic description of the sea as “cruel” or the wind as “angry.” This impresses me as quite reasonable, and I am willing to accept that Dawkins was speaking metaphorically.

But we still have the deeper question: are human beings essentially selfish? Until very recently, I have always assumed we are.

But I’m not so sure any more. Darwin’s theory of evolution initially had difficulty in explaining altruistic behavior, especially altruistic behavior exhibited toward other species when their preservation was – genetically speaking – quite distant from the preservation of our own genes.

Initially, this altruism was often used as an attack on the adequacy of evolutionary theory, as evidence that human beings are far superior to other organisms who blindly sought their own advancement. I was never convinced by this idea and still see it as fairly pathetic.

But I am looking now at the scientific evidence and wonder if the drive to survive is as intrinsically selfish as we have assumed.

I say I’m looking at the evidence, but it is inconclusive. We can’t possibly add up all the instances of behavior and compare the two columns of “selfish” and “unselfish” behavior – even assuming we could agree about which behavior goes in which column, which we can’t. From some points of view, even Hitler’s grisly policy of genocide can be interpreted as a desire to create utopia that went horribly awry.

Similarly, is it selfish that we pass on to the next generation only our own genes? We have absolutely no choice. I can’t pass on your genes or the genes of my dog or even of my own mother. So ethically, the mechanics of genetic transmission can’t be called selfish.

The conscious human decision to have children might come from essentially self-serving motives such as “someone will be there to care for me when I am old” or “I will receive a benefit payment from the state if I have children and won’t have to work.” But there are also apparently unselfish reasons, sometimes extending to selflessly caring for children who aren’t even one’s own.

I’m beginning to think that the genetics of survival has generated organisms which are intrinsically unselfish. That goodness is not simply wrung out of us by fear or need or power or self-serving greed. It’s in our very genes. We like sharing; we like being part of a functioning community. There are scientific studies showing that people find being kind, helping someone, solving a problem, or delighting in the good fortune of others highly pleasurable in themselves. I don’t think we have to learn them.

Okay, perhaps there is a streak of unselfishness in the very striving to survive. What about the cruelty, the blind indifference, the impulse that seems to be present in every living organism on the planet that seems to say: “If it’s a choice between you or me, I’m fighting for me”?

Yes, it’s there. But is that what we really are, as opposed to being unselfish? Are we not perhaps both? Or was Buddha right when he argued that what we call evil or selfish is our incompleteness?

Are we actually evolving toward greater goodness?

I don’t know. I know my contrarian self well enough to know that on this question I would find myself arguing whichever side is being attacked. The evidence for either side of the case impresses me both as very strong and as unconvincing.

But that in itself represents quite a big change in my fundamental view. As I said, I’ve always taken it as pretty much a given that living organisms, and humans especially, are basically selfish.

November 30, 2010

In an earlier post, I wondered if genes could legitimately be called “selfish” in their drive for survival. In a comment, Chris Lawrence suggested that Dawkins, the accomplished evolutionist who coined the term in the title of his book The Selfish Gene, was referring to a pattern of numbers, not a moral quality.

My first thought was that this was a fair comment. But my second thought is to wonder what makes a pattern of numbers resemble selfishness rather than unselfishness, or any other characteristic usually used to describe ethical behaviour. I don’t think there is one.

We don’t call atoms “selfish” when they combine with other atoms to form molecules. We don’t call water “selfish” when it crawls into the tiniest conceivable space and lodges there. We don’t call the universe “selfish” as it expands to encompass trillions of light years of space. Living things, though, get to be called selfish, even by those who do not believe that invading dandelions or marauding locusts are free to choose.

No, I’m going back to my original hypothesis: I think the use of the word “selfish” to describe the survival instinct reflected in genetic transmission is a deep-seated but unrecognized hangover from world views that taught that when things go wrong, they are somehow our fault. Even today, preachers have claimed that recent tsunamis and volcanoes are the result of human sinfulness.

The advantage of this teaching, of course, is that it suggests that we have some control over what happens to us. If we would just stop sinning, we could begin to re-establish the order of the Garden of Eden. Science also suggests that we can have some control over what happens to us, but its recommendations are quite different from typical religious admonishment. (I do know that if I were in an earthquake, I would greatly prefer to be in a building constructed to withstand the tremors than standing next to a holy man whose virtue is unquestioned.)

But along with dismissing any human guilt for natural happenings, I’m also questioning the assumption that the drive for survival is blindly selfish. If one wants to ascribe any moral characteristics to it at all, why not say it is unselfish? It is trying to preserve life, even at the cost of passing it on to a younger, fitter offspring. I can hardly fault a lion for not passing on genes to preserve a species of spiders, or a fish for failing to pass on genes for the increase of tomato plants. Genes are not essentially destructive mechanisms but creative ones, though they are not free to pass on just any characteristic they please.

So, no, I still disagree that genes can legitimately be called “selfish.” Not that they can essentially be called “unselfish” in the moral sense either.

But it does seem to me that the results of evolution are overwhelmingly plentiful. Yes, disordered and chaotic too. But absolutely marvellous.

November 28, 2010

Roger Penrose, a cosmologist of considerable distinction at Oxford University, is publishing a new book, Cycles of Time, in which he argues that our universe did not begin with the Big Bang.

He thinks we can see in otherwise inexplicable patterns of low variation around various galaxies the evidence of what he calls earlier “aeons” before the Big Bang. He thinks it suggests that Big Bangs have happened more than once in the past and will continue to happen in the future. This is different from other, now discarded, theories hypothesizing that the universe may one day deflate back into the singularity – the incredibly dense dot of energy out of which scientists hypothesize the universe originally emerged – and then inflate again in another Big Bang.

Penrose’s theory explains why there are clearly visible variations in the rings surrounding some galaxies.

Penrose thinks Big Bangs may occur when galaxies run into each other, and that these events are cyclical.

In a BBC interview Penrose was asked what he thought were the implications for a creator God. Penrose says that although he himself is not a believer, sources in the Vatican have told him that from their point of view, his theory neither confirms nor eliminates the possibility of an initiating creator.

November 22, 2010

Have you ever wondered why Darwin assumed that genes are selfish? That is, that the drive for survival is centered entirely on the individual organism involved?

Darwin himself wrestled with the problem created for this position by altruism. The classic, but by no means only, example is worker bees, who are sterile but seem to work all their lives solely for the survival of the larger bee community. Modern evolutionary theorists reason that altruism is a means of ensuring the survival of one’s own genes carried by one’s closest relatives. A second-best choice, no doubt, to passing on one’s own genes, but it does leave the basic mechanism of evolution intact.

But I’ve begun to wonder if Darwin was influenced by his Christian beliefs. Darwin himself took a lot of stick, not least from his deeply religious wife, when he finally published his work. But I know from personal experience that one can hold on to ideas arising out of one’s earliest socialization even after one thinks one has abandoned the entire system.

This phenomenon, of course, is not unique to me. Science itself is shaped by its determination to replace supernatural explanations of natural events with empirical, observable, and testable causes. But the influence of that rejection of the supernatural has not altogether disappeared. Many of the mistakes we scientists have made reflect a fear of letting the supernatural in by the back door.

Psychologists, for instance, in order to eliminate the concept of the “soul,” tried to develop theories of human behavior in which the mind was completely absent. Feelings and thoughts were at best epiphenomena, shadows of the real reality that could be seen and measured and subjected to the laws of physics. Not all of psychology is completely free of this assumption even today.

What has got me thinking in particular about this selfish gene is the concept of sin. At the core of Christianity is the belief that mankind is basically sinful, that we need to be redeemed, and that this is why Jesus died on the cross. But is it possible to give up any belief in God at all and still cling to this assumption that we are basically sinful?

Are we hiding from ourselves that we still cling to this belief by using the word “selfish” instead of “sinful”? Do we think selfish is somehow more secular, less spiritually contaminated, less influenced by an idea that reaches as far back as we can see into our own history? Even before Christianity, we tried to placate the gods, sacrificed our virgins, danced our rain dances, apparently reasoning that we had done something terribly wrong and that we were the reason things were so badly askew.

In the light of this radical idea, I’m wondering whether the survival instinct is primarily creative rather than primarily selfish.

The problem with answering this question, it seems to me, is that there is so much evidence to support either perspective. The media, of course, are hardly an unbiased source of data. Stories that make news are inevitably the ones about disasters and crime, about impending doom, greed and ignorance. Feel-good stories are add-ons.

How much unselfishness, how much creative impulse, how much caring and concern actually exist in our world? It may be overwhelming. Even the awful apparent evilness of a Hitler or Stalin might be interpreted as failed attempts to create a utopia rather than totally self-absorbed obsession with preserving their personal genetic legacy.

This idea is not entirely original to me. I got the kernel of the thought from Tony Equale’s book The Mystery of Matter, soon to be available on Amazon. Tony suggests that there is more reason to think genes are unselfish than selfish. I take responsibility, however, for the elaboration of this idea here. Equale may totally disagree.

I’m not initially inclined to trust ideas that are too optimistic. But the more I think about it, the less outlandish the possibility seems that genes are more creative than selfish.

November 12, 2010

I’ve been thinking about my post yesterday describing research finding that income increases up to $75,000 a year are correlated with higher reported levels of happiness among the half a million people interviewed in 2008-09.

This represents a significant inflation since a similar study found an annual income of about $20,000 would do it.

Personally, I think we benefit from recognizing the role money plays in reducing life’s stresses. As my step-mother used to say “Money might not make you happy; but it sure helps solve the problems that make you unhappy.”

Yet I’m inclined not to go overboard about just how much happiness money can buy. Along with moderating our worries, I suspect money often makes more room in our lives for those things that do bring us positive happiness – love, work, friends, learning, music, all the beautiful things and places and experiences that bring us joy.

But I have to make an effort to achieve those things. Money doesn’t come with any of them automatically attached.

November 11, 2010

Religious teaching around the world has tried to convince people that money cannot buy happiness. Why worry, when God even takes such good care of the lilies of the field?

Unfortunately, a recent study of almost half a million people carried out by researchers at Princeton University suggests that more money does lead to greater personal happiness.

Their findings are complex, but they don’t provide a lot of support for those who believe that money can’t buy happiness.

Everyday happiness – the sum total of daily ups and downs – doesn’t just depend on one’s bio-chemical legacy. Up to a maximum of about $75,000 in annual income, more money can increase one’s daily quotient of feeling good.

It’s not so much that people dislike not having money as that not having money makes them worry more about life’s daily stresses and problems. Low-income asthma sufferers, for instance, are less happy than high-income asthma sufferers. Low-income people worry more about failing relationships than people who can cushion the blows with income.

This stress-reducing effect does not increase once people are earning more than $75,000 annually. But what does increase, and what does not seem to have a ceiling, is the self-esteem that comes with increasingly higher incomes.

The more money people make, the more important they believe their work is, and the greater the contribution they believe they are making. They don’t report necessarily being happier than their poorer counterparts below the $75,000 mark. But they do seem to think their lives are more worth the effort involved.

Given these results, I find myself wondering whether comparable results would be found for both people who earned their money and those who have come upon their wealth through inheritance or other good fortune not the result of work.

November 10, 2010

This is a fractal. It represents a mathematical formula for things that at first glance look disordered and imperfect. Fractals describe reality as it is, rather than describing it as a failure of perfection, which is what traditional geometric shapes have done.

One of the earliest fractals was modelled on electrical noise which, it turns out, is not the random, annoying mess it so often sounds like. Fractals have given order to everything from clouds to seashores, from the growth of organisms to the way milk spills on the floor. The father of fractal geometry, Benoit Mandelbrot, always thought they could be used to make sense of the apparently irrational moves of financial markets.
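To make that a little more concrete, here is a tiny sketch of my own (nothing taken from Mandelbrot himself): his famous set comes from iterating one simple formula, z → z² + c, and asking whether the result stays bounded.

```python
# A toy illustration (my own, hypothetical): membership in the Mandelbrot set
# is decided by iterating z -> z*z + c from z = 0 and watching for escape.

def escape_time(c, max_iter=50):
    """Return the number of iterations before |z| exceeds 2,
    or max_iter if it never does (i.e. c appears to be in the set)."""
    z = 0j
    for i in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return i
    return max_iter

# Points in the set survive every iteration; points outside escape quickly.
print(escape_time(0j))        # 50 -- c = 0 never escapes
print(escape_time(1 + 0j))    # 2  -- c = 1 blows up almost immediately
```

The endlessly intricate pictures come from colouring each point c by its escape time.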

I think they are just beautiful. It’s not something I would have understood before I realized that looking at what actually exists is incomparable. I missed so much when I thought it was more important to look for “the essence” of things.

They are one of the things I think about when I contemplate what an incredible, marvellous, simply breathtakingly awesome place this universe is.

November 5, 2010

For the last couple of months I have been trying to find out what scientists mean when they say particles go in and out of existence. I had been under the impression that this was related to Heisenberg’s uncertainty principle, but as I’ve been studying it again, I can’t see how. Heisenberg says that the position and momentum of a particle cannot both be measured with precision at the same time. I understand this and I think I understand why.
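For what it’s worth, the standard textbook statement of Heisenberg’s principle (my addition here, not anything I can derive myself) is usually written:

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}
```

where Δx is the uncertainty in position, Δp the uncertainty in momentum, and ħ the reduced Planck constant. The more tightly one is pinned down, the more the other must spread out.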

But particles moving in and out of existence might be a different matter. So might the fact that particles seem to be able to affect the movement of their partners even when they are separated by huge distances. Scientists studying quantum mechanics say it happens, but as far as I can tell, nobody has offered any explanation whatsoever for how it happens.

I struggle with questions like these and take them seriously.

But on some level, I look bonkers even to myself. In my seemingly more rational moments, even taking seriously the assumptions on which questions like this depend sounds absurd.

And yet I do.

I suppose it sounds as crazy to some people as the belief that the world is only four thousand years old and that the entire universe came into existence in less than a week sounds to me.

This is not to say I think there is nothing to choose between these two radically different world views. I think there is a lot more evidence to support the assumptions of quantum mechanics. And I also think it is reasonable, even for those who believe that the Bible reflects the inspired word of God, to view it as a metaphor leading to a higher truth rather than as literal truth representing a divine history lesson.

But I still don’t understand what physicists mean when they say particles go in and out of existence. What do you mean, to exist and not to exist and then to exist again? Is this a mathematical description of what they observe? Or is it meant to describe what actually happens?

October 29, 2010

Perhaps twenty years ago, I read an analysis of what scientists might do to make Mars a planet that is habitable for human life. It included ideas like hanging giant solar-reflecting shields in space to increase the Martian temperature and growing plants to generate oxygen and food.

The estimate at that time was that humans could start to colonize Mars in about a thousand years.

Yesterday, I read that NASA has begun funding plans to send astronauts on one-way trips to Mars or its moon to set up permanent human colonies there.

They estimate that the project would cost about ten billion dollars, and that it would begin in about twenty years.

I think the idea is that about four men would be the initial colonizers and that gradually additional settlers would go out and the community would expand.

I’m gob-smacked.

And yet, psychologically I imagine such a trip to Mars in twenty years would be little different from the great ocean voyages barely more than 500 years ago. The voyagers, then, were going into a great unknown. They had no idea how big the earth was, when or if they would hit land and, if they did, whether they would find food and water. None of them could know if they would ever return, and many of them didn’t. They might not even have been absolutely sure the world wasn’t flat.

Actually, colonizers to Mars in twenty years might know more about their voyage and their destination than the men who set out on sea voyages and discovered America. And the rest of the world.

October 25, 2010

I’ve just been reading about the concepts of existence in science. What it means to exist in our everyday world seems pretty clear. Once we have reached the stage of a two-year-old, we realize that things continue to exist even if we can no longer see them, and unless something goes seriously wrong, we don’t tend to change that conviction.

I mean, just because I’ve dropped my contact lens on the floor and can’t find it, I don’t decide that the contact lens has actually gone out of existence.

But on the quantum level, what scientists mean by existence is radically different. Things do go in and out of existence.

The Nobel prize for physics this year was awarded for the discovery of graphene, a form of carbon one atom thick. What is so astonishing about it – at least for people who live in what up to now I would have called “the real world” – are its characteristics.

When graphene is placed in a magnetic field, its charge carriers behave as quasiparticles. That means they behave like an electron or proton without actually being particles at all.

I don’t pretend to understand this.

Scientists can actually demonstrate that something like this exists under the right circumstances. But I find it even more astonishing that they think it has a whole host of practical applications in our modern world. Just like electricity. Entire systems on which we depend rely on particles that go in and out of existence.

Do people who really understand these things think they are as incredibly fantastic as I do?

October 14, 2010

Several posts back, I mentioned that I had no idea how particles could move in and out of existence, which scientists say they do at the minuscule level of quantum mechanics.

Subsequently, whether out of compassion, insight, or generosity, or all three, a friend sent me the book The View from the Center of the Universe by Primack and Abrams.

It is one of the most fascinating books I’ve read in decades. Primack is a physicist of some significant stature, and his wife is an attorney, a writer, a poet, and, I might add, a brilliant teacher.

One thing she did was to explain how things go in and out of existence on the quantum level. Existence does not mean the same thing on this level as it does in our ordinary world. I was taught in school that electrons are very small things, a little like tiny balls, that go whizzing around the nucleus of the atom. But they aren’t. They aren’t “things” at all. We can’t see them no matter how much they are magnified, and if it weren’t for the fact that they do things like turn on the lights when we flip the switch, we might say they don’t exist at all.

What scientists do say is that they are probability clouds. All we know is that they exert their influence within a certain area, following very clear, predictable laws. But since they are merely clouds of probability, when the probability that they exist dips to zero, they stop existing.
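To give the “probability cloud” idea a concrete shape, here is a sketch of my own, using the standard textbook particle-in-a-box example rather than anything taken from the book: the electron has no definite position, only a probability of being found in each region, given by the square of its wavefunction.

```python
import math

# Hypothetical illustration: ground state of a particle in a 1-D box of
# length L. The wavefunction is psi(x) = sqrt(2/L) * sin(pi * x / L), and
# the chance of finding the particle between a and b is the integral of
# psi(x)**2 over [a, b].

def probability(a, b, L=1.0, steps=10_000):
    """Integrate psi^2 from a to b with the trapezoidal rule."""
    def density(x):
        return (2.0 / L) * math.sin(math.pi * x / L) ** 2
    h = (b - a) / steps
    total = 0.5 * (density(a) + density(b))
    for i in range(1, steps):
        total += density(a + i * h)
    return total * h

print(round(probability(0.0, 0.5), 3))   # 0.5 -- half the cloud in the left half
print(round(probability(0.0, 1.0), 3))   # 1.0 -- certain to be somewhere in the box
```

The cloud is densest in the middle of the box and thins to nothing at the walls – the particle is nowhere in particular, but more likely in some places than others.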

I am tempted to reread the entire book when I am finished with it the first time round and summarize it chapter by chapter here.

I realize you might not learn anything as a result of an escapade like this, but I’m pretty sure I will.

October 11, 2010

I think I’m beginning to understand why so many people don’t believe anything we are told – whether it’s by governments or the media or science or religious leaders.

Every single one of these institutions has claimed to be presenting “facts,” when far more often it is opinion, spin, best guess, doctrine, a mistake, ignorance, or sheer self-serving lies.

If we could just put our faith in the validity of something it would at least give us something firm to hold onto. But that’s not on offer.

Even science and scientists themselves often claim more than they have a right to. Scientific facts are not infallible and are not eternal truths, and if scientists admitted this more often, they might appear more credible.

If scientists were more candid about the nature of scientific “facts,” it might be more of a challenge for people to argue that the universe is no more than ten thousand years old, that the dinosaurs did not die out 65 million years ago, or that evolution is a myth concocted by the Communists. Because the argument would not be based on “eternal truth” but on reasonable conclusions drawn from empirical evidence.

For instance, most scientists in the mid-19th century were convinced that our earth was about 5,000 years old at the most. Less than 150 years ago, physicists, applying the laws of thermodynamics, were sure that the sun could not burn more than 40 million years and that the earth would soon cool and die with the already-waning sun. It was not until 1956 that the true age of the solar system was known. It was barely more than ten years ago that scientists realized that the entire universe is just over 13 billion years old. Not several thousand. Or several hundred thousand years. Or even several million years, as they had previously thought.

Being a little more candid about the fact that scientific facts change with the evidence available to us might even make it more difficult not to take global warming seriously.

In other words, it might make our suspicions a little less superstitious and a little more realistic.

October 6, 2010

I think sometimes I have failed to learn the difference between knowing about the problems of the world and taking total responsibility for them.

Certainly it is a form of high self-conceit to think that I need to anguish over the solution to every single injustice, every single stupidity, every single fallacy, every single instance of suffering, every single social or political failure I read about.

And if it is a conceit, then the workings of the entire world are not only not my responsibility, but to worry about them as if they were my responsibility can hardly be a fulfilling way to live.

Besides that, I don’t know what the world needs. I doubt it needs perfection. Stephen Hawking points out that if the particles emitted with the Big Bang had been in perfect alignment, nothing would have ever happened.

Imperfection is the engine that keeps the entire universe running.

I’m beginning to think there is a grain of sanity in isolating oneself from the continuous onslaught of the outside world, made so much noisier these days by our access to global media.

Okay, that’s my thought for the day. I’m off to watch the evening news.

October 5, 2010

Walking in our local park this morning, I noticed that the dandelions are still flowering with abandon. And I remembered that dandelions were one of my earliest introductions to the unyielding realities of life.

I used to pick bunches of dandelions on the farm where I grew up and bring them into my mother. She always thanked me generously and put them in a vase of water. But it didn’t take me long to notice that dandelions wilt very fast. Within hours, they were drooping miserable specimens. There was nothing that could be done to revive them nor could my mother even offer any advice from her vast store of wisdom about how to keep them alive in the future.

Dandelions simply don’t last.

I remember one of my brothers being crushed by a similar experience when he discovered that Dad could not mend a burst balloon. It was that moment of truth when one discovers that one’s Dad is not quite up to the standard of almighty perfection which we had taken for granted. It wasn’t that he wouldn’t fix it. He couldn’t.

Like dandelions, when balloons burst, they are dead forever.

I’ve been reading a book sent by a friend, The View from the Center of the Universe, which discusses the great loss we have experienced with the loss of our creation stories – the stories that tell us how we got here and – more importantly – what we are doing here. Every culture we know about has had a creation story, and it inevitably tells the people in that culture what life is about, what it means, and why they are especially and uniquely important.

Today, it isn’t just Christians who have lost their creation story. The combination of science and globalization has changed everybody’s story from one that forms the foundation of our very identities to one that is no longer valid in the same way. It has moved from being a story of unquestioned truth accepted by everyone to being interesting anthropological data.

I think losing our creation stories is, for a culture, a lot like learning that burst balloons can’t be mended or that dandelions don’t last. But the loss is so much greater. Which is why, I think, fundamentalism is on the rise throughout the world.

The Tea Party in America thinks it is threatened by the loss of what it means to be an American. We have parallel fears throughout the Muslim and Jewish worlds. It’s happening in India and China and South America. The entire 20th century is marked by its appearance in Germany with the rise of Nazism.

Many individuals can live without a creation story and an accompanying cosmology that gives existence its meaning.

But I wonder if whole societies can endure for very long without them. Will fundamentalism win out simply because we as a species cannot bear to accept that balloons break and dandelions wilt?

Or will we succeed in constructing a new cosmology based on the creation story science is now putting before us?

October 2, 2010

As graduate students, some of us used to play a game in which we constructed paragraphs composed to sound like learned articles but which were absolute nonsense. It made us laugh which helped to keep us sane.

Occasionally, I read a serious article that in my ignorance sounds exactly like one of our graduate creations, and I cannot resist the impulse to laugh. I had that experience yesterday reading an article about black holes published in this week’s Economist. Here is an excerpt that illustrates my point. I’ve underlined two statements I think deserve special attention.

Dr. Hawking’s insight came from considering what happens in the empty space just outside the event horizon (of a black hole). According to quantum mechanics, empty space is anything but empty. Rather, it is a rolling, seething cauldron of evanescent particles. For brief periods of time, these particles pop into existence from pure nothingness, leaving behind holes in the nothingness – or antiparticles, as physicists label them. A short time later, particle and hole recombine, and the nothingness resumes.

If, however, the pair appears on the edge of an event horizon, either particle or hole may wander across the horizon, never to return. Deprived of its partner, the particle (or the hole) has no “choice” but to become real.

I’ve read this kind of thing before, but it went so far over my head that I barely noticed. This time I was really trying to understand.

I must confess that I have failed utterly. I cannot conceive how a particle that is not yet real must become real because it has lost its partner. And how does one leave a hole in nothingness? Or, if the particle does find its partner – which is a hole – why do both return to nothingness, which replaces the hole? My questions seem so obvious that I am practically speechless.

And I already knew that particles on the quantum level “pop in and out of existence,” but in truth I have absolutely no idea what the physicists are talking about.

What can I say? I asked somebody to explain this to me once but he sailed into math. Are there any sources that plain ordinary people like me could understand?

(Not that I generally consider myself “plain and ordinary.” But faced with this look at reality, I feel the need for a little unaccustomed humility.)

September 26, 2010

For most of my professional life as a psychologist, I have been fascinated by consciousness. I have studied how it works, asked how it emerges and develops. I have looked at what organisms or even non-organisms are capable of consciousness, how it differs depending on where and how it operates, its relationship to intelligence.

I have been particularly fascinated with the private nature of consciousness. Why can’t we ever access the consciousness of anyone or anything else beyond our own? What is the usefulness in Darwinian terms of consciousness that is so circumscribed? And why or how is it that two different people can look at exactly the same thing and yet be conscious sometimes of vastly different things? It is even possible for two people to experience diametrically opposed things in the face of apparently exactly the same situation.

But above all, I have never stopped wondering what consciousness is. I have little doubt that some day scientists will have built a model of the brain and the nervous system, with its complex of synapses and feedback loops, which outlines with credible accuracy the paths along which information travels throughout the organism and the extent to which various parts of the brain exercise an executive function.

But this does not begin to address the question of consciousness. What I want to understand is how this system of electrical impulses and bio-chemical interactions which comprise our nervous system results in the experience of consciousness. How do bio-chemical interactions become experiences such as awe or delight or fear or love or convictions?

This question is similar to the problem of the relationship between matter and energy which Einstein finally cracked with his famous equation E = mc². I suspect it might even be an amplification of this problem. If we understood the nature of matter and the nature of energy better, we might be better able to understand how the nervous system and consciousness are related.

For centuries, this question has been almost invisible to science because consciousness was equated with the soul or spirit and therefore considered beyond the realm of science.

But consciousness so clearly is real – it is a phenomenon that we all experience no matter what our circumstances, and it is so clearly intimately related to what we do – that science cannot afford to leave it to the theologians, who explain it in terms of a “soul,” or to philosophers, who on occasion dismiss its intrinsic existence altogether.

September 21, 2010

Yesterday, a young television journalist reported on the body of a polar bear that had been washed up on the shore of Cornwall. It turned out to be rather embarrassing though because the polar bear was a cow. I mean, everybody knows that polar bears live thousands of miles away from Cornwall.

On the other hand, the BBC is running a programme this evening on tigers living in the Himalayas thousands of feet above the tree line. But everybody knows tigers don’t live 13,000 feet up in the Himalayas.

But they do.

The problem with needing to be right – at least it was for me – is that one can’t think the unthinkable or the ridiculous. Needing to come up with the sensible right answer severely limits creativity.

So reporting on polar bears in Cornwall might be embarrassing. But I think that reporter has the potential to discover something fantastic.

When it’s not in my field of psychology, I often find it difficult to know whether a scientist sounds off the wall because his or her thinking is so brilliant, so far ahead of mine, that I can barely grasp it, or because he or she is simply kooky.

Which is why I’ve been wondering lately about the concept of emergence. I’d begun to get a little suspicious about it, and I wanted to know if it belonged in the same category as Intelligent Design or if it was a legitimate concept within the mainstream. Kauffman in his book Reinventing the Sacred argues that emergent phenomena cannot be predicted using the scientific method and equates emergence with creativity.

The evidence that science fails to predict emergent phenomena is quite solid and broadly accepted. I first began to wonder, though, when Kauffman suggested that we might equate this emergence or intrinsic creativity with the Sacred. Even with God. I knew that was a step too far for me, but scientists are always going beyond the limits of the proven. If they didn’t, we’d never have any hypotheses, no breakthrough theories. So that is fine.

I really started to question, though, when I discovered that many writers are using the concept of emergence as that elusive question which so many believers are looking for, that question to which the only possible answer is God.

So although I know it’s hardly a return to the original sources, I asked Wikipedia about emergence.

Art by Holly Werner

It is, indeed, a highly respectable idea that goes back at least as far as Aristotle, and it does indeed describe phenomena from physics to psychology that are greater than or different from the simple sum of their parts. At the moment it seems to be an aspect of the mystery of the universe that is attracting particular attention.

Some people want to find God in the universe. Some people don’t. I’m in the Don’t group. For me adding God doesn’t elevate the universe but downgrades it. I remember as a young teenager saying that I didn’t want someone to love me because I was made “in the image and likeness of God.” I wanted them to love me because they loved me.

And now I sort of feel that way about the universe too. It’s fantastic. It’s incredible. It constantly brings me to a state of stunned awe.

I’m sure it isn’t true for everybody, but for me, adding God flattens everything; it reduces it. Almost as if there were nothing particularly impressive to notice about the universe in its own right.

I can see where Plato was coming from with his world of perfect forms and I understand how the Church transformed that to a supernatural world presiding over this one. But I can see why Buddha never added God to his world view.

It would be interesting to know if Stephen Hawking’s new book, The Grand Design – a grand design but no Grand Designer – has caused as big a stir in the American media as it has here. It isn’t quite another Galileo moment, but it has the same ring.

What Hawking is saying is that we do not need to posit a Creator for the universe, that we know the laws of physics now well enough to understand how the universe may have spontaneously emerged out of nothing. I don’t claim to understand this, but I am sure that Hawking, one of the pre-eminent physicists in the world today, knows what he is talking about. In other words, he is talking physics, not theology.

But Hawking is knocking down one of the great “We don’t knows” used to buttress the existence of God for almost half a millennium. If we don’t need a Creator to understand how we got here, what rationale do we have left for believing there is a God?

Personally, I have long felt that substituting “God” for “I don’t know” in any question to which we don’t have the answers was always going to put faith on very shaky ground. Scientists have been answering our “I don’t knows” for centuries — how the stars stay in the heavens, how the eye works, how the great diversity of life occurred, why volcanoes erupt and earthquakes crack open the face of the earth. And now this blow: how the universe started.

The stubborn, unyielding, rejection of science on the grounds that it can’t be so because the Bible says so is not terribly new, but it does seem to be activated by fresh virulence and irrationality. Yet, it seems to me, the direction is irrevocable: believing in God because we have no explanation for an event convinces fewer and fewer people.

Though you might not think so listening to many of the theologians being interviewed over here. Few seem to have a concept of God that can withstand the possibility that the universe is the result of the application of natural laws rather than of supernatural intent.

But if I have no patience with the backward kind of thinking that says “this can’t be true because there is a God,” I equally have little patience for the “It’s nothing but” proclamations of absolute reductionism.

The great majority of people in the world today have neither the opportunity nor the ability to analyze the scientific or philosophical issues related to these questions. But I do think people often reject science because they think the scientific attitude requires an “it’s nothing but” conclusion.

And so when someone of Hawking’s stature says that we have no need for a Divine Creator to explain the existence of the universe, what they hear is that life has no meaning, that love and generosity have no value, that their lives, their families, their work, have no purpose. It is a message of despair that many people think must follow from a universe without a Divine Creator.

I disagree. I trust my own sense that my life, that all of life, has intrinsic value. I happen to think that it is we who must create meaning, rather than a God presiding above from a supernatural throne.

But I do think I have some appreciation of the whiff of despair that science like that of Hawking can create in those who have been taught that it is only God who gives us meaning.

September 2, 2010

When I rather lazily asked Google this morning what percentage of all games of Solitaire are potentially winnable, I was merely wondering how I compared with the best possible outcome. I didn’t expect it to generate another post on reductionism.

The answer absolutely astonished me. They don’t know. They don’t know because the answer seems to involve the Riemann zeta function (I haven’t the faintest idea what that is), which is also used in quantum physics. The Riemann zeta function is used to study prime numbers, but the hypothesis attached to it has yet to be proved correct or incorrect and has been one of the biggest mysteries in mathematics since it was proposed over 140 years ago.

To figure out Solitaire?! I tried to calculate 52! using Excel, which would tell me how many potential games of Solitaire there are altogether. The number, I admit, is very, very large. But it is not infinite. And if scientists can’t even predict the potential outcome of 52 inert cards, what chance have they of predicting much of anything?
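For anyone curious, the exact number is easy to get without fighting Excel. A short sketch of my own:

```python
import math

# Number of possible orderings of a standard 52-card deck -- an upper bound
# on the number of distinct Solitaire deals.
deals = math.factorial(52)

print(deals)             # an exact 68-digit integer, roughly 8.07 * 10**67
print(len(str(deals)))   # 68
```

Very, very large indeed – but, as the post says, not infinite.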

As I’ve said before, what’s brilliant about reductionism is the potential outcome of its analysis, not its ability to predict and explain beforehand.

August 28, 2010

A friend just sent me the address of a video and asked if it was predictable. What do you think?

Well, no, it wasn’t predictable. Not in the way traditional reductionism would argue events can be predictable.

In that sense this creative approach to making escalator-walking more appealing is an illustration of the limits of reductionism. And I suspect the evidence is almost incontrovertible now that this emergent creativity can’t be predicted, that it is intrinsically unpredictable using the scientific method.

But I’ve come in this post to praise reductionism, not to bury it, and the case of the musical escalator illustrates some of the reasons why.

Because it would be too easy to say simply “ah, well, if it’s unpredictable, that’s it. It’s unpredictable.” And that would ignore the tremendous contribution that reductionism has to make to our understanding. Because although reductionism fails pretty miserably in making predictions, it is incomparable in its analytical successes.

The anti-reductionism position (which I share) is that the whole is more than the sum of its parts. In other words, the way something is organized actually creates something new, something more. But that something is still also the sum of its parts. A symphony is still made up of individual notes, for instance, or a bird is still made of a handful of molecules.

What the reductionist approach does is to analyze those parts and, although reductionists may not often say so, to look at how they are put together to make something new.

After the analysis, the reductionist then asks if what has been taken apart can be put together in a new way to make it work better or work differently or not work at all. So the reductionist examines the elements which constitute a cancerous tumor, for instance, to try to find why it does not stop multiplying the way healthy cells do. A reductionist analyzes a flu virus in order to develop a vaccine that will inhibit its virulence. This approach has been repeated in thousands of different fields in thousands of different ways.

In relation to the musical escalator versus the static stairs, it would be a great loss simply to say it was unpredictable. Once it has happened, what questions might arise from a reductionist perspective?

The key question would be what it is about this situation that led to so many more people using the escalator than before? “Fun theory” is fun, but it isn’t quite specific enough. Partly because “fun” is a pretty foggy word.

Let us ask instead things like:

how long did people continue to use the escalator instead of the stairs? Did they switch routinely to the escalator or try it out only once or twice? In other words, how influential was uniqueness?

is this phenomenon widespread? is it limited to urban populations? does it occur equally during rush hours and less crowded or hurried times?

do people tend to take the escalator more or less often if they are alone? or if they are with children? is there an age variable? do men try it out more often than women? or vice versa?
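Questions like these come down to comparing proportions across conditions, and quite simple statistics can answer them. As a sketch (every count below is hypothetical, invented purely for illustration), a two-proportion z-test distinguishes a real shift in escalator use from chance fluctuation:

```python
import math

def two_proportion_z(hits_a, n_a, hits_b, n_b):
    """Z statistic comparing two observed proportions, using a pooled standard error."""
    p_a, p_b = hits_a / n_a, hits_b / n_b
    pooled = (hits_a + hits_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical counts: of 1,000 passers-by before the installation, 300 used
# the escalator; of 1,000 afterwards, 660 did.
z = two_proportion_z(660, 1000, 300, 1000)
print(round(z, 1))  # values well beyond 1.96 suggest a shift bigger than noise
```

With these made-up numbers the statistic is enormous, but the real interest is in running the same comparison across the sub-groups listed above: alone versus accompanied, rush hour versus quiet times, first week versus later weeks.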

The advantage of getting answers to the reductionist’s analytical questions like these is that, paradoxically, it opens the way for greater creativity. If we can understand cancer cells, we might be able to find a way to cure cancer. If we know the variables that increase the likelihood of cooperation among strangers, we might be able to reduce incidents of violence and exploitation. If we know the variables that make it more likely that people will enjoy exercise, we might be able to reduce the incidence of heart disease and obesity and cancer.

So hurrah for the reductionists. They might have thought they could figure out everything. They can’t.

But they can figure out a lot. And they are among the most creative people I know.

August 23, 2010

The Kepler space telescope is out there looking for planets that might harbour intelligent life. I’m not sure we want to find it. If that life has evolved with a drive for survival anything like ours, both we and they will shoot first and ask questions later. In which case, the meeting may be as deadly as the meetings that have occurred between human groups reintroducing themselves to each other over the last 500 years. And those meetings were simply between various groups of human beings separated for as little as ten thousand years.

Religious groups won’t like it either. Dealing with the discovery that we humans are not at the centre of all things was hard enough when Galileo convinced the world that the earth revolves around the sun. Darwin’s theory of evolution dealt another blow to our uniqueness, a blow that continues with our recent creation of new life forms in a laboratory. Many people already dislike the idea that plants and animals are intelligent in a way akin to human intelligence, and discovering that there are whole groups of intelligent beings out there seems intolerable to some who believe that their god could only love us Homo sapiens if we are at the top of the tree.

August 22, 2010

If the laws of physics do not totally explain the universe, as reductionism claimed they would, what does? Does anything? Is there a scientifically viable alternative to reductionism? Can the universe be explained in terms of natural law?

Although there seems to be increasing debate among philosophers and scientists, there is no consensus about what needs to be added to physics in order to understand the universe. The possibilities are broad:

I. Traditional reductionism might be re-invented. Mathematicians have devised proofs that all possible events cannot be encompassed by mathematics (don’t ask me to explain: it’s the kind of thing I take on faith, except there do seem to be seriously brainy types who understand this and who, therefore, do not take it on “faith”.) Nonetheless, it is possible that there will be some stunning paradigm that emerges within physics that returns it to its position as a completely comprehensive explanation.

II. The universe is controlled by natural laws which emerge as organization becomes more complex. Biology, therefore, operates in terms of its laws of genetics and evolution. An individual person operates according to principles of psychology, cultures and communities in terms of principles of sociology. Higher levels of organization can be analyzed in terms of their component parts, but they cannot be totally reduced to them. In other words, when new levels of organization appear, something new has appeared which must be understood both in its own right and in terms of its component parts.

This approach requires a broader definition of “natural law” than that assumed by traditional reductionism, and thus far I have not found a completely satisfactory definition. How does one recognize whether a proposed law is “natural”? Saying simply that everything is “natural” and nothing is “supernatural” may be valid but it is vague and incomplete.

III. Life, or even all the universe, is emergent, that is, essentially creative – and therefore necessarily partially unpredictable. This possibility limits the potential of science to predict events, but it does seem to describe much of what we observe. Darwin’s theory, for instance, explains why and how species evolve, but it cannot predict the course of that evolution. It can predict that if the environment changes substantially, new organisms will emerge that are better adapted to the new conditions. But it cannot tell us ahead of time which changes will make an organism better fitted to survive.

The problem with a creativity hypothesis is that it is an idea that can be used to explain just about everything we don’t understand. It makes scientists like me nervous that it can be used as a blanket excuse for not trying to understand. “Creativity,” attractive as the idea is, tends to hit a blank wall unless, like “natural law,” it is more closely defined to keep it from being used as a catch-all for whatever unpredictable event may occur.

IV. There is an immanent sacredness, even an immanent god, in the essential unpredictability and apparent creativity of the universe. This god does not put aside the natural laws of the universe, but is manifest in its creative adaptability.

This position is similar, but with an added religious dimension, to the more secular creativity hypothesis. It seems to me, however, to share both its potential strengths and weaknesses. It can be used to explain some of our most profound human experiences. On the other hand, it is even more difficult to define and can be used as an all-encompassing explanation for everything we don’t understand.

V. There also remains the traditional dualist position, that God operates from beyond this universe to direct its course. This God may intervene in events, superseding the natural laws governing the universe. This is the idea of God of much of modern Christianity and Islam.

The difficulty with this position is that, because it is intrinsically untestable in empirical terms, it rejects the essential validity of the scientific approach.

Most scientists today, including strict reductionists of the traditional mold, now accept that the Newtonian belief that some day we could potentially explain everything that has ever happened or ever will happen is theoretically unachievable. We cannot ever understand the universe totally. But we can always understand more.

We may live in an eternal, infinite mystery, but we can always delve into it more deeply, unravel more of its inscrutability. The mystery may not, in truth, become smaller. But our understanding, our witness to it, can become greater.

August 21, 2010

According to traditional reductionism, all phenomena in the universe can ultimately be explained in terms of the laws of physics and the interactions of fundamental particles. Nothing else is real in its own right. Originally, reductionism argued that all events are both completely determined and predictable.

With quantum physics and the principle of uncertainty, the commitment to determinism has been greatly diluted. There is still no room for choice or free will, for values or meanings, but there is a certain wiggle room for random events within the limits of probability.

But a commitment to the total sufficiency of the laws of physics for explaining events has remained firmly in place.

Until quite recently.

There are actually several problems, but two difficulties I find particularly intriguing. According to reductionism, events should be reversible and predictable. Yet once we are beyond the simplest levels, they are often neither.

If all objects are really only the interaction of particles, it should be possible to return the objects to their initial state. In other words, they should be reversible, the way water can be separated again into two atoms of hydrogen and one of oxygen. But except on the simplest level, they do not reverse. A drop of ink dropped into water will disperse spontaneously. But it never coalesces into a drop of ink again. A boiled egg cannot be returned to its unboiled state.

This may seem a fairly niggling, unimportant observation. And it might be. Except that the higher up the level of complexity and organization one goes, the less reversible events are. A fertilized egg develops forward, not backward. Single-celled organisms may ultimately evolve into egg-laying birds, but birds do not seem to evolve back into simpler organisms again. Even the universe as a whole does not show the uniform run-down towards equilibrium that we would expect if combinations of particles broke apart as often as they formed.
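The ink-drop example can be made vivid with a toy model (my sketch, not anything from the physics literature): a thousand “particles” each take individually reversible random steps, yet the ensemble only ever spreads out on average; it never drifts back into the original tight drop.

```python
import random

random.seed(1)  # fixed seed so the run is repeatable

# A thousand "ink particles", all starting at the same point.
particles = [0.0] * 1000

def mean_square_spread(ps):
    """Average squared distance from the starting point."""
    return sum(p * p for p in ps) / len(ps)

history = []
for _ in range(200):
    # Each individual step is perfectly reversible: +1 or -1, equal probability.
    particles = [p + random.choice((-1.0, 1.0)) for p in particles]
    history.append(mean_square_spread(particles))

# The spread grows roughly linearly with time and never returns to zero,
# even though nothing in the step rule forbids it.
print(history[9], history[199])
```

Nothing in the rules privileges dispersal over re-gathering; the one-way behaviour is purely a matter of how many more dispersed arrangements there are than gathered ones.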

What seems to be happening instead is that the universe is increasing in complexity at higher and greater levels of organization. Fundamental particles combined to make simple atoms, which combined to make molecules, which combined in stars, where further combinations produced the molecules out of which simple life emerged. Here on earth we have seen single-celled bacteria combine over billions of years into higher and higher forms of living organisms.

The second problem for reductionism is its inability to predict. If everything is the result of the laws controlling the operations of fundamental particles, then it should be possible to predict what is going to happen.

But we can’t predict the future. Except for the motion (within a certain tolerance of error) of the planets and stars, the future cannot be foretold with anything resembling accuracy or precision. Economists didn’t see the credit crunch coming, and can’t predict what will happen if governments increase spending or raise interest rates and taxes to pay down the deficit. Nobody foretold the popularity of the internet or foresaw the floods in Pakistan which thus far have displaced twenty million people. Nobody knows when or if we will ever find a cure for cancer, whether a pandemic might again reduce the human population the way the Black Death did, or whether humans will colonize another planet. We don’t know how technology or cultures will develop or whether books and newspapers will survive the cyber age.

Even after the fact, we are unable to point with certainty to those conditions which produced events like these or almost anything else that surprises us when it happens.

Life does not break any laws of physics. But it seems to go beyond them. On a grand scale, the path of evolution cannot be predicted. On a more immediate scale, we cannot predict with certainty how even a single individual will vote, who they might marry, or the career they will embark on.

Reductionism continues to notch up an incredible analytical record. It has moved into the very center of living organisms, analyzed the way DNA is structured and operates, and has even been able to create new life forms by re-organizing it. But no one, at this point, is under the illusion that science can predict the future. In practice, we all live in a world we cannot fully predict or understand.

The question is why. And what are the alternatives to scientific reductionism?

August 19, 2010

The assumptions of traditional reductionism grew out of physics, and I do not accept the original version of reductionism, that the explanation of all things can be found in physics.

Yet modern physics has introduced me to the world that I find so magnificent. So incredible, so awful, so absolutely marvellous.

But when I say that, it may sound as if I understand much more than I do.

I was reading an article in the New Scientist this morning theorizing about what made and continues to make the universe expand. And I barely understood a single thing I was reading. I could tell the sentences were in English, but it might as well have been one of those extinct languages that nobody speaks any more. Does the answer lie in SUSY? Can it be tested using the LHC in Geneva? Are the gravitational waves generated by the original sparticles too small to detect with the Planck satellite?

Plain ordinary-looking men and women, probably wearing blue jeans as a disguise, debate and argue over questions like these.

August 8, 2010

A team of American researchers in Oregon has figured out how to make electricity from raw sewage. They started by adding a smattering of gold particles to the sewage, along with bacteria that release electrons, which creates an electric current. If it works at scale, it could have an immediate impact on agriculture, and be especially valuable in the Third World, where untreated sewage often causes disease.

The technology already works in a laboratory, but there are several problems they have to work out before this process can be implemented on an industrial scale. One is the cost of gold – even the small smatterings of it that are required to make the process work fast enough to be viable. At this moment, it looks as if iron might speed up the process just as well, and would produce electricity at a competitive cost in today’s market. The cathode chamber also needs to be improved, and choosing the best microbial species to do the job requires more study.

A little less ambitiously, we are exploring the option of putting photo voltaic solar panels on our roof to generate electricity. Right now it looks as if with the government subsidy on offer, the process would pay for itself in about eight years. At that point, our electricity would essentially be free, and anything we don’t use would go back into the grid, generating a small income of its own.
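The payback arithmetic behind an estimate like that is simple enough to sketch. Every figure below is hypothetical (I am not quoting the actual installer’s numbers), chosen only so that the result lands near the eight-year mark:

```python
# Back-of-envelope payback estimate for a rooftop photovoltaic system.
# All figures are hypothetical, for illustration only.
install_cost = 12000.0        # upfront cost after any rebates
annual_generation_kwh = 3400  # expected yearly output of the panels
tariff_per_kwh = 0.44         # subsidized rate earned or saved per kWh

annual_return = annual_generation_kwh * tariff_per_kwh
payback_years = install_cost / annual_return
print(round(payback_years, 1))  # about 8 years with these assumptions
```

The real calculation is more sensitive than this, of course: the subsidy rate, panel degradation, and how much power is used at home versus exported all shift the answer by a year or two either way.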

The biggest problem we have is that we might have to cut down an absolutely gorgeous blue spruce about 100 feet tall. Unfortunately, it is hogging the sun that the panels need.

August 6, 2010

I can hear the sighs of boredom as I say I’m writing yet another post about reductionism in science.

So I’m going to defend myself by trying to explain why it matters so much.

Basically reductionism mechanized the universe. Everything – everything – happens automatically in the same way a ball rolls down a hill or water turns into ice when it reaches 32 degrees Fahrenheit. Essentially a plant, a couple in love, a dog protecting a child, a man climbing Mt. Everest are also all operating on these same basic mechanistic principles. The feeling that each of us has of being self-propelled, and the sense that others and all other living things are too, is an illusion.

So if reductionism is right, we not only live in a huge machine. We are merely cogs in a huge machine over whose direction and operation we have no control and for which we have no responsibility.

Personally, I think if reductionism is right, kicking and screaming like a two-year-old isn’t going to change things. Nor will electing politicians who agree with us and who are willing to make laws forcing everyone else to live as if they agree with us too. If this is the way the universe is, either because God created it that way or because there isn’t a God at all, then the honest, courageous act of integrity is to accept it.

And so it is an important question to ask not as a religious believer but as a scientist if reductionism really describes the universe as we observe it.

As a scientist I myself am convinced that reductionism is an incomplete description of the universe. I’ve indicated earlier some of the evidence I find convincing, and I suspect I will write at least one more post on the problems arising in physics today with a purely reductionist view.

But not being a reductionist doesn’t make me a believer in god or in a supernatural world. I personally am committed to the view that all laws are “natural.” It’s just that I think the “natural” world is a whole lot more mysterious than the traditional reductionist may think.

In that sense, reductionism did a violent disservice to life. For centuries, it convinced people that animals are no more capable of suffering than a car engine is. Human thoughts and feelings were not really real, nor were we responsible for anything – good or bad – that we did, or for any consequences of our choices. Even today, insights we may gain through the arts – poetry, literature, music – are somehow evaluated as “soft,” not quite as valuable as “facts” we can prove scientifically. We don’t know how to evaluate scientifically beauty, or the value of wild land, or the pleasure of a hug, for their own sakes, and so if they do not contribute some countable economic gain, we feel free to destroy them or pollute them or build on them.

I think these things all represent a great loss.

And that’s why I’ve been banging on about reductionism for weeks. It’s not a defense of religious belief, it’s not an argument against the scientific method.

It’s an argument for that approach to science that doesn’t level the world to a single dimension. Whatever else the universe is, I don’t believe it is reducible to a mega-machine. We live in a much more mysterious, overwhelming, astonishing place than that.

August 4, 2010

Once when I was talking about my book, The Big Bang to Now, someone asked me where the Garden of Eden was located. I didn’t understand the question at first because I think Genesis was never meant to be understood literally, and that Eden is a symbolic place where mankind is in harmony with himself, with God, and with nature.

I answered the question saying that all the evidence is that our species began in Africa.

That is still the case, but some intriguing research now suggests that there actually was a place we might call a Garden of Eden.

About 195,000 years ago, an ice age that lasted more than 70,000 years moved over Africa. Only a strip of land on the southern coast remained habitable, where there was rich land vegetation and abundance of food in the sea.

Along this small strip, it is possible that the human population fell to as few as a hundred people. The evidence they left behind shows, among other things, that they heat-treated the stone from which they made their blades to make the blades stronger. In general, they seem to have been capable of a level of abstract thought many researchers thought had not developed until 40,000 years ago.

August 3, 2010

The finest thing we can experience is the mysterious. It frees the fundamental emotion which stands at the cradle of true art and true science. He who knows it not and can no longer wonder, no longer feel amazement, is as good as dead. A snuffed-out candle. It was the experience of mystery, even if mixed with fear, that engendered religion. A knowledge of the existence of something we cannot penetrate, of the manifestations of the profoundest reason and the most radiant beauty – it is this knowledge and this emotion that constitute the truly religious attitude. In this sense, and in this alone, I am a deeply religious man.

Albert Einstein –

Address at the Conference on Science, Philosophy and Religion, New York, 1941

August 2, 2010

Somewhat to my surprise, in the last three days I have been engaged in no fewer than four discussions about the alternative to God.

If, as we all agreed, the traditional concept of god is anthropomorphic and coercive, and if, as we all agreed, this concept of god appears to be incompatible with science, what is the alternative?

It was apparent during these various conversations how central the question of reductionism is to this question.

Traditional reductionism says that everything that has ever happened or will happen in the entire universe is ultimately completely explicable in terms of the interactions of fundamental particles. In terms of God, the most this position can say is that there might be a Creator God who set the universe in motion and then dissociated himself from any further interactions with it. It is a machine which now runs on its own and will inevitably go forward according to the mechanical laws which govern it.

The alternative within science to traditional reductionism is not as fully defined as this. There is agreement that the traditional approach has been breathtakingly successful in explaining some phenomena. There is also agreement that at this point traditional reductionism is incapable of predicting the course of developments once we enter the realm of living things. It cannot predict with any precision the course of evolution, of the development of the economy, of technological advances, or even the direction of the stock market within the next week. It doesn’t even have any suggestions about how to go about making these predictions in a more scientifically precise way.

The alternative or additions required to reductionism are far from agreed, however. Not everyone agrees whether all the laws governing the universe can be called “natural.” Or whether a creative impetus is intrinsic to the universe. Or whether we need a new concept of an immanent, even emerging “god,” or whether the word “sacred” describes the mystery in which we find ourselves immersed.

I have already said that I do not believe in the traditional concept of god. Even the word “sacred” makes me very jittery. I have heard people describe what they mean by “sacred.” They use words like overwhelming reality, stunning, awesome, astonishing, mysterious, amazing, all of which I can use without a qualm. But “sacred” carries too much baggage for me.

No matter how much people argue that they don’t mean “god,” when they say “sacred,” I cannot banish the image of votive candles and priests in vestments lifting their hand in blessing. “Sacred” for many is liberating with a suggestion of infinity. For me it is coercive and suffocating.

I suppose my hang-up with the word sacred stems from my childhood when I was taught that “sex is sacred.” This piety didn’t succeed in permanently ruining the possibility of sexual pleasure, but for some time it was pretty effective.

And it does seem to have permanently influenced my reaction to the term “sacred”.

July 27, 2010

I’ve read an argument recently that scientific reductionism was framed the way it was because of the predominant religious view from which it was separating itself.

That view was the dualism of Roman Catholicism of the Middle Ages. Adopted from Plato’s world of perfect forms, Catholic theology made God the vital force which informed matter with souls, giving it life and making it capable of thought and decision-making.

Most of the early scientists continued to believe in God and thought that they were studying his work as they unfolded the marvels of the universe. But they believed that although God had created the world, it was now controlled by purely natural laws undirected by forces from a supernatural world.

Reductionism, therefore, eliminated God as an explanation of dynamism and vitality. Or more precisely, it eliminated dynamism and vitality as scientific forces. Without the supernatural, matter itself became inert. So convincing was this view that it was adopted by almost everybody.

Of course, changes were seen to occur in the universe, but those changes were the result of purely mechanical processes. They were never self-propelled; they followed from the laws of physics which determined the interactions first of particles, then of atoms and molecules, and finally of the complexity of life.

With quantum mechanics and Heisenberg’s principle of uncertainty, the absolute determinism assumed by the reductionists was abandoned. But this did not usher in an acceptance of free will. The commitment to the adequacy of the laws of physics to explain everything that happens in the universe remains.

Except it is being undermined from two fronts. The first is a religious argument, which, frankly, I regret. Many religious people believe that reductionism and God are incompatible, and depending on one’s particular theology, they may be. I think, though, that it is unfortunate that scientific doubts about the adequacy of the reductionist view should be confused with religion.

God can neither be proved nor disproved through science. Believers who cling to their beliefs on the grounds that X or Y cannot be explained scientifically are inevitably destined to be disappointed as our understanding of the universe continues to expand. “God” as the answer to questions that we cannot yet answer is based on a flimsy faith.

The second assault on reductionism is coming from within science itself, and this, I believe, is a valid battlefront. Some scientists, looking at the evidence, now doubt that we can explain the phenomena of our amazing universe within the reductionist assumptions. They believe that the original assumption that matter is inert in itself is no longer scientifically viable. Einstein has shown us that matter and energy – that dynamic vitality pervading the universe – are two forms of the same thing.

The increasing organization of the universe, therefore, springs from an intrinsic dynamism of matter. It is no longer only Gestalt theorists who are arguing that new levels of organization are governed by new laws appropriate to each level. They are joined by physicists and biologists from below, by psychologists and social scientists from above.

I’m going to take a break on the subject of reductionism now until I’ve done more reading on contemporary thought. I don’t know how great my need is to actually understand the research that has started a whole new dialogue on the subject among physicists and biologists, but it is provocative and fascinating.

But not because it has anything to do with the possible existence – or otherwise – of God.

July 26, 2010

The mind-body problem has been written about by great minds for at least a couple of thousand years. I’ve read some, understood a few, and failed to read even some of the best on the subject.

I think I understand the problem in psychology, though, if not the solution.

The fundamental question asks where thought comes from. How does the mind work such that it produces what seem to be immaterial thoughts, sometimes about things we have never experienced or that aren’t even remotely possible?

For the sake of getting off first base, scientists (including me and most everybody else most of the time) assume that the physical world actually exists independently of our thinking about it. In other words, we distinguish between our thoughts about a thing and the thing itself. We don’t usually think of our plans, or our dreams as reflecting concrete objective reality. But we do think of the world we experience in our waking hours as real.

Those real things include everything from gluons to galaxies, from pebbles to people, from electrons to elephants. They are all material, all made up of matter. But a thought doesn’t seem to be made up of matter.

In this life, at least, thought does depend on the matter of which our brain is made. But what is a thought itself made of? How do the operations of the material world produce a seemingly immaterial thought?

There are three popular alternatives, none of which is totally satisfactory:

One is that living organisms have souls which are immaterial and are the source both of life and the capacity to think. The predominant religious view is that the human soul is created by God with the beginning of each life, and that it will never die.

The difficulty for scientists with the Soul Solution is that it cannot be subjected to scientific proof or analysis.

Scientific reductionism takes a second position – that thought is no more than the operations of the brain itself, with its multiplicity of feedback loops and incredibly complex and rapid functions. Modern brain research has made huge advances identifying parts of the brain where various functions are supported. We know where short-term memories are encoded, for instance, or what part of the brain supports various senses like vision and hearing. The mapping of the brain through MRIs is telling us almost daily more about how the brain functions.

The limitation of this position, however, is that scientists are unable to identify the content of even the simplest thought. We can often make a good guess. Even a very, very good guess. If the relevant part of the brain fires after hearing a particular word, it is a good guess that the person heard that word and responded to it. But we still have no direct access to the person’s private consciousness. So it is possible that the word was misunderstood and the response was really to a different word. The problem is compounded for more sophisticated thought processes. Scientists could not guess from watching MRI read-outs, for instance, the content of an author’s thought if he were asked to think about how he intended to develop the next six chapters of his novel in progress.

Looking at the question from the other side, we run into a similar difficulty. We have built complex computers that can solve many of the same kinds of problems we can. They even produce what we call “artificial intelligence,” and computers are often much faster and more accurate than we are. But all this computer complexity does not seem to have produced computer consciousness. Admittedly, since consciousness is a private experience, it is possible that computers do experience consciousness or thought as we do. But if so, we still don’t know how a computer produces consciousness akin to ours when it is plugged in and turned on.

The third alternative is the possibility hinted at by Gestalt psychology, and which is reflected in what some biologists today call the emergence of agency. This view sees mind as a phenomenon arising out of the operations of the natural world in which we live. Mind has evolved as naturally as have vision to see with and wings to fly with.

This point of view agrees with the reductionists that the mind is natural, but argues that when a new organization emerges, something more than the sum total of the parts now exists on both the physiological level of brain functions and on the psychological level of experience. This new organization then operates with an additional set of laws. The basic laws of physics are not violated, but they are insufficient to describe or predict the totality of the new organization. Biologists, then, for instance, develop additional laws for living organisms which add to the fundamental laws of physics.

This third position also has its limitations. We still do not know how the mind produces apparently immaterial thought through the operations of a physical body.

July 24, 2010

As I said in an earlier post, there were and there are theories of psychology which espouse reductionism. But not all.

Psychologists from a German school of thought called Gestalt Psychology rejected reductionism in the late 19th and early 20th centuries because, they argued, it did not adequately describe the way the human mind worked or the way the physical world is actually constructed.

The Gestalt psychologists focused initially on the nature of perception. What they demonstrated in a large and varied series of experiments is that human perception is not built up from the parts to the whole. Rather it is the other way around.

Look at the following image, for instance:

What is your first impression of what you see? A dog, or black dots and yellow circles? Most people see a dog.

The image below illustrates the same principle:

Unless you are extraordinarily unusual, you see five figures, including three rectangles across the bottom.

In other words, the Gestalt psychologists argued, we see the whole before we see the parts.

What’s more, the whole is different from the sum of the parts.

The 36 dots in the middle square, for instance, would be perceived differently if they were arranged in a circle instead of a square. And the whole would be recognized as similar if the square were composed of small X’s instead of small dots, even though all of the individual parts were completely different.

On a more complex level, a shelf containing jars of the chemicals out of which a person is made is quite a different thing from those same chemicals when they are arranged as me. Furthermore, the me that I am stays the same despite the fact that the chemicals out of which I am made have changed completely every seven years.

Gestalt psychologists argued that the way something is organized actually changes what it is. The reductionist agenda of trying to understand the world as if it were a result only of the parts out of which it is composed is an incomplete way of trying to understand the physical world. The analysis pursued by the reductionist is valid. But it isn’t enough.

Whether a person sees a vase in this image or two faces depends on how the image is organized. An analysis of the individual parts alone cannot predict which it will be.

And this is why I have never been a reductionist. It seems to me it does not account sufficiently for the world that I know.

Many scientists think that the only reason for rejecting the reductionist agenda in science is that ultimately it eliminates the need for God. Actually, it is quite possible to be a scientific reductionist and to believe in God. One simply accepts that the scientific pursuit of knowledge is necessarily limited to only that which can be learned through an analysis of an object’s component parts. Belief in God lies outside that realm. We have always known that belief in God is beyond proof. That is what faith means.

One of the things Gestalt psychology did do by rejecting reductionism was to highlight the mind-body problem, one of the most fascinating questions in the history of philosophy and of science.

July 21, 2010

Although some of the first reservations among scientists came from the field of psychology, several psychological theories which are still alive and well embraced reductionism.

The best known are the theories collectively called Behaviorism. Classical behaviorism, based primarily on Pavlov’s experiments with the dogs he trained to salivate at the sound of a bell, endorsed reductionism without qualification.

Reductionism in its strongest form says that all reality can be reduced to subatomic particles and that these particles constitute the only true reality. Nothing else is real in its own right.

Human consciousness, then – the mind, feelings, intention, thought – has no independent existence. Scientifically these do not constitute real things and therefore cannot be used to explain or predict behavior. Classical behaviorism argued originally that what we thought of as thought was really only the subtle, almost imperceptible, movements of the larynx and nerves involved in producing speech.

Behaviorists argue that what we do is determined not by what we think but by the environment. They accept that genes limit what effects the environment can have, but genes are hard to change, and in any case, whatever our genetic make-up, we will always be controlled by the rewards and punishments the environment produces following anything we do.

For instance, if we are rewarded for our temper tantrums, if they ultimately produce what we want, then our temper tantrums will increase. On the other hand, if we find that saying “please” is more apt to get us our way, that is the behavior that will increase. And so forth through absolutely everything we do.

For the Behaviorists, then, the way to change behavior is not to change our mind, not to analyze our motives, but to change the punishments and rewards in the environment.

Like the reductionists in general, Behaviorists are essentially deterministic. They do not believe in human choice or free will, and so of course, logically do not believe in either sin or virtue as deliberate acts.

There is a great body of evidence supporting the Behaviorists’ claims. The response to environmental contingencies, to reward and punishment, has been demonstrated in probably every known living organism, from one-celled bacteria to university professors.

Behaviorism was in part given an impetus by Freud’s psychoanalytic theory, which claimed to study the unconscious. Although Freud himself believed that one day we would identify the physical realities reflected in the pleasure principle he called the id, the conscious self called the ego, and the moral principle termed the superego, nobody has found them. Freudian psychoanalysts to this day study the human psyche through the patient’s verbal reports of their thoughts and dreams. In other words, they study a patient’s descriptions of private experience which is not directly accessible to anyone but the patient him/herself.

Since the essence of the scientific method requires that results reported by one scientist can be replicated by other scientists, the Freudian method in its present form fails utterly as a scientifically verifiable theory. It cannot repeat the conditions of private experience nor even test whether the original descriptions provided by a patient were truthful.

The Behaviorists threw out the psyche as a scientific concept altogether.

But both modern Behaviorists and Freudians, ironically, agree on one important principle. That is that “thought” is essentially a physical reality and can therefore be studied by observations of the brain and the nervous system. “Thought,” therefore, can be reduced to brain waves and synapses, and ultimately to the sub-atomic particles of which our brains are made.

Not every school of psychology agrees – which will be the topic of my next post on reductionism in science.

July 20, 2010

Newton’s theory of gravity was revolutionary. With its strong mathematical foundation and stunning accuracy in predicting the movement of the stars and planets and in explaining why some objects seem to fall to earth while others don’t, it was a major influence in the development of reductionism.

Essentially Newton’s theory sees the universe as a giant machine which runs according to the laws of physics. As a result of this simple set of assumptions, scientists believed it would be theoretically possible to predict exactly where every single particle, atom, and molecule in the universe had ever been in the past and how it would move forever into the future. The universe and everything in it were like a gigantic and complex clock.

The first thing the assumptions of reductionism did, backed by the success of Newton’s theory, was to eliminate the need for a vital force in natural events. Not only did explanations no longer need a supernatural world of God and angels and saints; humans did not require souls, and we did not even need human intention or mind. The laws of physics ultimately explain absolutely everything we see and experience and are.

This mechanistic model was therefore also totally deterministic. Save for the possibility of God’s initial act of creating it, nothing that happened in the universe required deliberate intent. Intention was a chimera, an illusion that we or other living organisms are the cause of any changes whatsoever, however small. This eliminated both sin and virtue as causes of natural events. Prayers of petition were without power, and the supposed wrath of God reflected in natural disasters was a conceit, making us, even in our sinfulness, far more important than we are.

Newton’s theory also seemed to show that the laws of physics, once they were understood, were predictive of what was going to happen in the future. In relation to any event, it should be possible to state the conditions under which that event would happen. As surely as the chemist can predict that two hydrogen atoms will combine with one oxygen atom to create water, everything else in the universe could ultimately be subject to absolute predictability.

These assumptions of reductionism have been hugely productive for the three and a half centuries since Newton lived. They produced electricity and telecommunications, cars and rocket ships; they eliminated smallpox from the face of the earth and produced vaccines that can eliminate polio, measles, whooping cough, and chicken pox. This is the approach that got us to the moon, and has given us sight of the beginnings of the universe.

Religious believers have always had a fundamental problem with a reductionism that eliminates the need for God, for free will and human choice, and reduces all life to mere bio-chemistry.

But after three and a half centuries, problems are emerging from within science itself. They began with psychology. These will be the topic of the next post on the subject of reductionism.

July 19, 2010

The beginnings of reductionism, which many believe to be the bedrock of science, can be traced back to Galileo. The traditional belief is that Galileo got in trouble with church authorities because, following Copernicus, he argued that the earth revolved around the sun, not the other way around.

But that isn’t really the core of what was so threatening about Galileo. He was called to Rome, threatened with the rack, and ultimately confined for the rest of his life to house arrest for a different reason. Galileo believed that physical evidence and his observations of heavenly bodies seen through his telescope provided a more accurate account of how the natural world worked than did revelation and the teachings of the church.

The belief that natural law is not subordinate to interference of supernatural or higher powers is one of the revolutionary foundation stones of scientific thought.

When Newton’s theory of gravity explained why apples fall from trees but stars don’t fall from the sky, the faith in the study of these natural laws in their own right provided a confidence in science and its methodology that was unprecedented. With its powerful mathematical base, Newton’s theory could not only tell where stars and planets had been in the past but also predict where they would be for thousands of years to come.

Science became unequivocally committed to explanations of the universe based on natural laws.

Galileo and Newton and indeed most scientists continued to believe in God and most often to accept church teachings. God created the universe and the laws under which it was governed, but then did not interfere with their impersonal operation. Mankind might still be held accountable on the day of final judgement and sentenced to eternal heaven or hell, but the rain today did not come because we prayed for it. Nor was the earthquake a punishment for our sinfulness.

This insistence that science seeks to explain the universe only through the discovery of natural laws has changed the very metaphysics of Western thought and the role of religion in society.

But the specific assumptions of reductionism include more than a commitment to the exploration of natural law in its own right. An examination of these further assumptions will be the topic of the next post.

July 18, 2010

As a graduate student in psychology, I was introduced to the issue of reductionism in science. Since psychology was one of the first disciplines where the problems both solved and posed by reductionism emerged, it was apparent that reductionism was a serious question.

At the time, the question of reductionism was discussed solely in terms of its scientific validity. Whether reductionism represented an assault on religious belief was not on even the distant horizon. So I was fortunate enough to be exposed to the arguments from professors representing theories on both sides of the reductionist question without the complications that the religious right often add today.

As a graduate student, after much serious thought, I rejected the reductionist position. But this issue has not been laid to rest in the scientific community, and in the next few posts I am going to re-examine current research and thinking on the question.

In my next post, I will describe the reductionist position, and explain why it matters.

July 16, 2010

I used to think the idea of telepathy was more off the wall than I do now.

Mostly scientists have dismissed it as superstitious nonsense – not because there aren’t some strange things they can’t explain, because there are. But scientists tend to dismiss explanations of phenomena when we have no hypotheses, no idea at all how they might occur. Since we have no scientifically respectable idea how telepathy might actually take place, we tend to say it doesn’t.

But two things give me pause.

The first is that science, from physics to biology, from physiology to psychology, is full of the most extraordinary phenomena, and of explanations for them that surpass the understanding of most humans. Quantum mechanics above all seems to me to be almost literally incredible. And yet it is at the cutting edge of seriously serious science, challenging some of the brightest people the world can find.

The second thing that gives me pause is the increasing evidence that communication seems to pervade the universe. Particles seem to communicate with each other across vast distances at incredible speed, plants communicate with each other, the cells in our body communicate with each other, birds and worms and fish and whales communicate with each other.

So maybe – just maybe – telepathy isn’t such a wild idea.

Though, like everybody else, I have no idea how it might occur. And I suspect I’m not the genius who’s going to make the breakthrough in figuring it out either.

July 12, 2010

Almost since psychologists devised tests of intelligence, it has been accepted as proven that intelligence peaks in a person’s 20s or early 30s, when a slow and inevitable decline relentlessly begins.

It now looks as if this proven “fact” isn’t correct.

It depends on what is tested. Speed and short-term memory do generally peak before we hit 30. But vocabulary, emotional intelligence, social skills, problem-solving, and decision-making all improve with age and do not necessarily decline unless some debilitating disease intervenes.

There are some things that one can do to keep in peak cerebral condition as one moves toward one’s second century of life. One, of course, is to remain involved in life.

Alas, another is exercise. A 73-year-old from Wales goes sky-diving about once a month. He’s sure it’s helping to improve his thinking.

July 10, 2010

Stephen Hawking is occasionally seen on the sidewalks of Cambridge in his wheelchair. Nobody thinks that because he’s in a wheelchair and needs assistance even to speak he isn’t a genius.

In fact, until recently, nobody has done any extensive research into disease and intelligence at all.

But now scientists have found a strong correlation between the level of disease in a country and general levels of intelligence. Some of the worst effects seem to come from pathogens and parasites – like worms that live in the gut or that cause diarrhoea – because they deprive the host of the energy the brain needs first to develop and then to function. The brain of a newborn child consumes 87% of the child’s energy as it develops. Even the brain of a fully mature adult uses 25% of our energy.

That’s the bad news, but hidden inside is a kernel of hope. Eliminating disease seems to be an immensely effective way of making us smarter.

July 9, 2010

We still don’t altogether understand why we need to sleep, but there’s a lot of evidence piling up that it’s not smart to skip it. Yesterday’s post described the intellectual benefits to teenagers of sleeping in. Researchers at the University of California, Berkeley, have found that taking a siesta increases our ability to remember specific events, places and times as well.

There is also another bit of good news. People who take siestas are less apt to die of heart disease than the 9-5 crowd.

In fact, the benefits to memory of a mid-day nap are so great that, if well-timed, the beneficial memory effects can equal an entire night’s sleep. Naps taken too late in the afternoon, however, can lead to grogginess (the official term for which is “sleep inertia”), which interferes with functioning altogether and is worse than no nap at all.

Ideal nap times vary. Even a fifteen-minute nap can benefit many people. In general, however, the first half hour of sleep improves motor performance – like driving the car. The most substantial benefits, unfortunately, come in the second hour.

It might be possible to convince the boss that a 15 minute break lying back in the office chair is beneficial. Two hours is probably going to be a little trickier.

Unless the boss is taking a siesta at the same time.

Come to think of it, that might be the smartest way to start off the process.

July 8, 2010

Science has told us for so long that our intelligence is inherited that not nearly enough people know how terribly important environment is. Or what factors in the environment might make a really big difference.

Sleep is one of those factors. Though it’s not actually quite as simple as saying that getting more sleep makes people smarter.

Most people know that jet lag is a result of our body clocks getting out of sync. We keep wanting to sleep and wake according to the sunlight hours from the place we’ve just been. So we are wide awake in the middle of the night and fall asleep in mid-day until our clocks get re-adjusted.

Scientists have now discovered that during adolescence – the ages between 13 and 21 – our body clocks go through a massive change. Teenagers’ body clocks keep them awake later, and turn them into lazy slobs in the morning. Or at least “lazy slobs” is what they have seemed to resemble. In actual fact, they are struggling with their circadian rhythms.

A scientific trial of 800 teenagers in Britain last year experimented with starting classes at 10 am every morning instead of 9 am. The data thus far are astonishing. Punctuality has improved, absenteeism has plummeted, and test grades have gone up a dramatic 10%.

I think the lesson is that smart cats stay in bed longer. At least teenage cats do.

July 7, 2010

I’ve just listened to a research report suggesting that people whose diets are rich in Omega 3 fatty acids are less apt to suffer from Alzheimer’s disease. Apparently the typical Greek diet is unusually high in omega 3’s, while incidence of Alzheimer’s is relatively low.

I was already acquainted with research suggesting that omega 3’s are good for the heart, and are considered “brain food” for children. But this is the first time I ever read that they could be implicated in reducing the chance of Alzheimer’s.

Omega 3’s are particularly concentrated in oily fish. After that, soya, flaxseed, and canola (known as rapeseed over here in Europe), walnuts, pumpkin seeds, and green leafy vegetables like spinach and broccoli are the best sources. And supplements.

It’s recommended, of course, not to wait until my age before acting on this. But it’s probably better late than never.

June 19, 2010

If a person’s ring finger is longer than their index finger, the chances are that he or she will tend to have some characteristics that are considered to be more male than female.

And indeed, to have a ring finger longer than the index finger is much rarer in females than in males. Those females who have it may be better than average in mathematical and analytical abilities, be more prone to take risks, or even be better at a game like soccer.

The assumption has long been that this strange correlation is a result of a burst of testosterone in utero when both the fingers and the brain are developing.

Makes sense, but it seems to be wrong. The long ring “finger” seems to be apparent even in birds, with an accompanying emphasis of male characteristics in both males and females. But in birds at least it seems to be due not to the male hormone testosterone, but to a surfeit of the female hormone oestrogen.

Interesting, no? Who would have predicted that the female hormone would be responsible for emphasizing male characteristics? I wonder if increased testosterone is associated with emphasized female characteristics.

For myself, my ring fingers suggest that I should be pretty good at fixing things. But I’m ghastly at soccer. In fact, I was nominated as the worst player on our senior high school basketball team, which itself was the only senior team in the entire history of the school ever beaten by the freshmen. My class wanted to elect me as the captain, but the nuns said no. Apparently basketball was too serious to be frivolous about.

I was rather hoping for the honour of captaining the team myself. It’s a big disappointment that has remained with me all these years. I’m sure it has greatly undermined my self-confidence.

I am not going to have access to my computer for the next ten days so I won’t be blogging again until July 1st. If you would like to know when there’s a new post, click on the tab in the right-hand column to get an email.

January 31, 2010

Most people don’t think of science as moral. In fact, for many people, science is quintessentially amoral. Science does not submit the validity of its findings to any religious belief and does not claim the right of judgement over the personal conduct of scientists. Except in relation to one totally non-negotiable demand.

Science demands that scientists tell the truth about what they claim they have observed.

This may sound obvious. But the skeletons in the closet of scientific history suggest that it is far harder to tell the truth than it might seem. Lured by promises of promotion, professional recognition, honour, fame, and sometimes simply stubborn conviction, scientists have not always been able to resist the temptation to massage the data or even to fabricate it.

And it’s just happened again. Not only has it happened again, but it has happened among scientists held in the highest international regard, in relation to one of the most critical issues facing humanity today.

The summit on global warming held in Copenhagen last month was riveted by a warning from the United Nations’ Intergovernmental Panel on Climate Change (IPCC) that climate change would melt most of the Himalayan glaciers by 2035. Since millions of people in India and China depend for their very livelihood on the water generated by these glaciers, it was a claim of terrifying significance. In fact, one of the key points of disagreement which the conference was unable to resolve was the extent to which rich nations should pay the underdeveloped nations for damage being inflicted by global warming caused by the policies of developed countries.

Since then, scientists behind the warning have admitted that this claim was wrong.

Okay, scientists make mistakes, and scientists studying global warming have been clear that their predictions are not absolute.

But what is so critically wrong is that the IPCC knew before the Copenhagen conference that their claim was wrong. If the Himalayan glaciers continue to melt at their current pace, they will last for at least another 300 years. Not the 25 years they claimed.

Presumably the IPCC argued that rescinding this claim before the conference would do more harm than good for the case of scientific credibility related to global warming.

It’s far more likely to have the opposite effect, giving another boost to those who think global warming, if it is occurring at all, is not the result of human action. But even if the effects of this particular deceit had not been so demonstrably negative, pretending that the data suggested what it did not was a lie.

And that kind of lie violates the one non-negotiable moral imperative of science.

This blog is to help me remember that there is inevitably another way of looking at things besides the one that seems obvious to me. I find that if I can't see another possibility myself, other people are usually able to help with amazingly little effort.

Your comments to disrupt my point of view are welcome.

