Living in Glass Houses with Bricks: What about science should concern us?

This weekend, I attended several seemingly disparate panel discussions on agriculture, statistics, science playwrights, and doomsday theories, but by the end, a common theme had begun to emerge. The conference focuses on educating the world about science, and that goal requires facing one major challenge. Education isn't just about offering up new information; it is also about battling misinformation, which somehow manages to sneak into everything. False information even makes its way into the world of the scientist, despite all best efforts to safeguard against such an invasion.

1. In which we learn how to survive an earthquake, and how to really scare people who need scaring.

One day last year, I had just finished attending to some brains in lab, when I suddenly felt very dizzy. Once I finally figured out that it was the earth that was shaking and not me, I really freaked out. As a child I was terrified of natural disasters, but just then I realized that I had absolutely no idea what to do in the actual event of one. I ran out into the main corridor of the lab, and then, intent on saving my life, left the building.

Turns out, I did exactly the opposite of what you are supposed to do in an earthquake: you are supposed to get under a table. I had never bothered to learn this, and apparently didn't care enough to prepare myself. Many people in California feel the same way, even the natives, for anyone can get too distracted or busy with more immediate concerns. But today, after listening to a panel discussion on earthquakes in southern California, I'm ready to go out, get some jars of peanut butter and a flashlight, and make a few earthquake kits. Let's just say that the earthquake talks were not as boringly reassuring as I had hoped.

"We are overdue for a massive 7.8 magnitude earthquake. The Big One. Historically, on average, every 150 years southern California gets hit by a major earthquake, and the last one was 300 years ago." The first speaker, Lucile Jones, a seriously super-prepared woman in perhaps her mid-fifties, probably had a fallout shelter, twenty gallons of emergency water, and a storehouse full of power bars in her backyard.

"The Big One. What does it mean? There are two groups of responses. Some people say 'why bother?'" These people, she explained, are convinced that California is going to break in half and fall into the ocean, so nothing they could do could possibly help. So why bother?

"Others think, 'I have been through some big earthquakes. Nothing really happened. This one can't be any worse, so I don't need to bother.'"

The goal, she insisted, was to teach people that they should actually just bother, because the quality of our lives before, during, and after this scary, apocalyptic-sounding disaster rests on our own preparedness. When she said this, everyone in the room suddenly became very concerned. Any minute now, we could be hit by an earthquake 150 years overdue. Thank God we were attending an earthquake seminar. I felt bad for everyone else at the conference who was learning about microorganisms at that particular moment. They would have no idea what to do when The Big One came.

The Coming of the Big One, it turns out, is a very scary story. The predicted stats did not sound so nice. Based on historical data and models created by a team of scientists, hundreds of thousands of homes and buildings would likely collapse, 1,600 major fires would break out in the aftermath, and a thousand people would die. How do you convince people that they should be afraid enough to prepare? How do you scare people into seeing the risk of a natural disaster that we cannot control or even predict - a disaster that is out of sight and therefore out of mind?

The panel members were on a mission to help the state get through The Big One, as well as any smaller ones that might come along. Two years ago, "The Great California ShakeOut," an educational program to get people to bother preparing for earthquakes, introduced the largest earthquake drill in the US. Since then, outreach programs have gotten more than 5 million residents involved, all based on a few simple principles. One of these was "Avoid Sensationalism," which sounded practical until the next, perhaps more useful principle was introduced: "Use Engaging Imagery."

The imagery probably worked the best of any of these techniques. The panelists showed a video produced as part of the ShakeOut -- an animation graphically depicting all the horrors we would have to deal with, and how much worse things would be if we didn't have fire extinguishers and three gallons of water per person per day. It was pretty terrifying. It seems you need some very scary movies to push Californians that little extra bit required to get us prepared.

2. In which we are reminded that science aims to discover and not destroy the universe.

"We are experiencing some technical difficulties. Nothing to be concerned about," said the pilot, right before the plane exploded in a catastrophic nuclear blast.

I had walked into the symposium entitled "Doomsday Versus Discovery" about half an hour late, just as the room was finishing up some movie clips from the end of the world. Turns out, everything from our psychology to our religion, our distrust of science and new technology, and media sensationalism leads us to panic about dangerous-sounding scientific experiments. And since we can only spend so much time worrying about things, these irrational fears prevent us from attending to and assessing real concerns, such as the aforementioned earthquake issue, as well as dangers that come from crossing the street in the rain, riding a bike without a helmet, and baking while fatigued, to name a few.

3. In which we hear all about accidental lies in research, and learn that we shouldn't blindly accept other people's analyses.

On Saturday afternoon, I attended the symposium entitled "False Discoveries and Statistics: Implications for Health and the Environment." This, perhaps, was the scariest talk of them all. For any scientist, the end of the world as we know it might not quite measure up to the idea that much of science is actually based on lies. Even if these lies are innocent and accidental, they are still misinformation that can not only affect the direction science takes and the methods and resources researchers use to get there, but can also have a culture-changing effect on citizens who trust research completely and blindly. How sad it seems that people lose when they mistrust science, as well as when they trust it too much.

"There are three types of lies: lies, damned lies, and statistics," goes the old quotation of disputed origin. The symposium just rubbed it in. The panelists discussed atrocities such as failing to control variables, claiming too much from uncontrolled experiments, and going along with findings that are "totally biologically impossible," simply because of a few manipulated numbers. As an undergraduate, I was always taught that you could say anything you wanted with statistics, if you did them wrong.
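None of the panelists showed code, but the false-discovery problem they were circling is easy to demonstrate for yourself. Here is a minimal sketch (my own illustration, not from the symposium): simulate many experiments in which the true effect is exactly zero, run a standard significance test on each, and count how many still come out looking "significant." Roughly 5% will cross the p < 0.05 threshold by chance alone, which is why running enough comparisons will always "find something positive."

```python
import math
import random

random.seed(42)

def null_experiment(n=50):
    """One experiment with NO real effect: both groups are drawn
    from the same distribution, so any difference is pure noise."""
    a = [random.gauss(0, 1) for _ in range(n)]
    b = [random.gauss(0, 1) for _ in range(n)]
    mean_a, mean_b = sum(a) / n, sum(b) / n
    var_a = sum((x - mean_a) ** 2 for x in a) / (n - 1)
    var_b = sum((x - mean_b) ** 2 for x in b) / (n - 1)
    # two-sample z statistic and a two-sided p-value
    # from the normal approximation
    z = (mean_a - mean_b) / math.sqrt(var_a / n + var_b / n)
    return 1 - math.erf(abs(z) / math.sqrt(2))

# Run 1000 "experiments" where nothing is really going on.
p_values = [null_experiment() for _ in range(1000)]
false_positives = sum(p < 0.05 for p in p_values)
print(f"{false_positives} of 1000 null experiments look "
      f"'significant' at p < 0.05")
```

A researcher who runs twenty comparisons and reports only the one that clears the threshold has produced exactly the kind of accidental lie the panel was describing.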

"Experimenters always want to find something positive. The more they find, the happier they are," said Juliet Shaffer of the University of California, Berkeley. While this is definitely true, it seems a bit of a stretch to say that researchers will drop their ethical obligations to honest science just to find something positive. If that were true, there would be many more successful grad students out there.

Stanley Young, a panelist from the National Institute of Statistical Sciences, claimed that it is the system of applying for funding that pushes researchers to emphasize exciting-sounding new results over being sure of the old ones. "The problem is with the system," he repeated over and over. "The problem is not the worker, it's the manager, because they set up the system."

"Science is supposed to be self-correcting," Young sighed. "I won't criticize your statistics, you won't criticize mine. We both live in glass houses, we both have bricks, we just won't throw them."

How do we fix this problem of not holding scientists to the highest statistical standards? From my experience, this is very difficult, because statistical methods often have to be tailored so carefully to the experiment that they can be hard for other scientists to fully understand, even after putting in a lot of time and effort. Here are a few ideas that I have heard or come up with:

1. Holding more scientists responsible for research. In order for a paper to be published in a reputable journal, it has to be "peer-reviewed" - read and given the thumbs-up by a few expert readers. If these reviewers had to put their names on the articles they vouch for, then perhaps publishing would get a bit more stringent?