Dr. Julie, a.k.a. Scientific Chick, brings you insights into what's happening in the world of life sciences. Straight from the scientific source, relevant information you should know about, in plain language.

Wednesday, May 26, 2010

There’s quite a bit of hullabaloo (I can’t believe I just used that word) surrounding the recent creation of the first “man-made” synthetic cell. Genomics, the study of DNA sequences of entire organisms, has come a long way in a very short time: after all, Watson and Crick discovered the structure of DNA less than 60 years ago, and already we’re toying with the thing like it’s silly putty.

DNA can be thought of as the recipe book to make an organism. In your DNA, there is all the information necessary to make every single part of your body. Strangely enough, DNA is only made out of four units: A, C, T and G (remember the movie “Gattaca”? That’s a play on a DNA sequence). The sequence of these units determines your genes (small chunks of DNA), your chromosomes (larger chunks of DNA comprised of many genes), and your genome (the sum of all your DNA). To create the synthetic cell, the researchers started with the genome of one bacterium (let’s call it the “donor”). They mapped out the genome of the donor into a computer file, and then edited the code, adding a few specific chunks here and there. These added chunks (the “watermarks”) would later serve as markers to confirm that the resulting organism only hosts the synthetic code. Once the researchers had a DNA sequence they were happy with, they fed the code into a machine that spurts out the As, Cs, Ts and Gs in the correct order and generates small pieces of synthetic DNA. The researchers then used elaborate techniques to stitch together the small pieces into a complete genome. Finally, the researchers transplanted this synthetic genome into a different bacterium (the “recipient”) previously emptied of its own genome. Shazam! Synthetic “life”.
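For the programmatically inclined, the watermark trick boils down to string matching: since a genome is just a long sequence of As, Cs, Ts and Gs, you can check whether an organism carries your inserted marker the same way you'd search for a word in a text. Here's a toy sketch of the idea in Python (the sequences are made up for illustration and bear no resemblance to the actual watermarks):

```python
# Toy illustration of the watermark idea: insert a known marker sequence
# into a genome string, then check whether a genome carries it.
# All sequences here are invented for illustration only.

def add_watermark(genome: str, watermark: str, position: int) -> str:
    """Return a copy of the genome with the watermark spliced in."""
    return genome[:position] + watermark + genome[position:]

def carries_watermark(genome: str, watermark: str) -> bool:
    """True if the genome contains the watermark sequence."""
    return watermark in genome

donor = "ATGCGTACCTTAGGA"   # stand-in for the donor genome
watermark = "GATTACA"       # stand-in for a watermark chunk

synthetic = add_watermark(donor, watermark, 5)
print(carries_watermark(synthetic, watermark))  # True: the code is synthetic
print(carries_watermark(donor, watermark))      # False: natural genome lacks it
```

Real genomes are millions of units long and the real watermarks even encode hidden messages, but the verification principle is the same: only the computer-designed genome contains the markers.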

At this point, I don’t think we can call this breakthrough “artificial life”. Life was not created from scratch. What we have here is a functioning cell (and that includes the ability to replicate) with a synthetic genome. This synthetic genome, even though it was designed on a computer, only includes genes found in nature (and the watermarks, but they don’t code for anything). And while it’s fun to imagine that we’re only days away from reviving the woolly mammoth, it’s important to keep in mind that this synthetic cell has a very tiny genome. What’s more, the genome of the synthetic cell had some mistakes in it: it wasn’t exactly as the researchers had intended.

Still, I think there is cause for concern, both in terms of ethics and biodiversity. If we had already cracked open Pandora’s box with transgenic animals and plants, this represents a rip-the-lid-off-and-break-the-hinges advance. Especially considering that current legislation allows the patenting of such organisms (a subject for a later post, perhaps). However, with all the bad comes some hope: like stem cells and gene therapy, this technique may represent a new hope to cure diseases.

What’s your take on this new breakthrough? A step closer to mastering life and curing diseases, or a scary slide down a slippery slope?

Reference: Gibson DG et al. (2010) Creation of a bacterial cell controlled by a chemically synthesized genome. Science, May 20 [Epub ahead of print]

Saturday, May 15, 2010

Those of you familiar with the Bible will remember the famous words uttered by Pontius Pilate, the judge at Jesus’ trial, as he washed his hands in front of the crowd: “I am innocent of this man’s blood; see to it yourselves.”

While I have no intention to change my name to Religious Chick, I just wanted to point out that the link between physical and moral cleanliness goes back, waaay back. In a fascinating recent study published in the journal Science, researchers investigated one aspect of this link.

To better understand the study, you must first know the word(s) of the day: postdecisional dissonance. When you have to make a choice between two options that seem equally good (say, having lobster for dinner versus having King crab for dinner), you enter a state called “cognitive dissonance”, which is essentially that “I can’t decide!” feeling. In order to get rid of this uncomfortable feeling once you’ve made your choice (say, crab), your mind does this funny trick where it now perceives your choice (the crab) as way better than the alternative you rejected (the lobster). This happens even though initially, you thought crab and lobster were equally attractive. This little mind trick is called postdecisional dissonance, and it’s been well documented. In the recent study, the researchers wanted to find out if washing your hands can affect this postdecisional dissonance mind trick.

The researchers set up a mock consumer survey where they asked 40 subjects to choose, out of 30 CDs, 10 they would like to own. To thank them for participating in the “survey”, the subjects were offered a CD, either their 5th or 6th ranked selection. After they made their choice, the subjects were made to believe that they would now participate in a survey on liquid soap: they could evaluate it any way they saw fit. About half the subjects only looked at the bottle, while the other half tested the soap by washing their hands. Finally, the participants were asked to rank the 10 CDs again, allegedly because the company behind the CD study wanted to know what people thought of the CDs before they left the store.

Using this (somewhat complicated) study design, the researchers were able to test if washing your hands can attenuate the need to justify a recent choice. The subjects who only looked at the soap ended up ranking the CD they chose in the first part of the experiment better (closer to 1 than 10). Their mind successfully played its trick where it convinced them that the choice they made was much better than the alternative, and this confirms the standard dissonance effect I described above. Interestingly, the subjects who washed their hands ranked the CDs the same way they had initially.

The researchers suggest that hand washing cleanses us of past decisions, and reduces the need to justify them. It seems very strange to me that a common saying (“I wash my hands of what we have for dinner, lobster or crab, it’s all the same!”) and a biblical anecdote are actually biologically grounded. I guess I shouldn’t be so surprised. What do you think? Can you come up with other examples?

Sunday, May 9, 2010

The race to develop a drug to treat Alzheimer’s disease is a top priority for many pharmaceutical companies. With the aging population, the size of the market is ever increasing, and some people estimate that the winner of the race will pocket several billion dollars in the first year of a drug being on the market, in the US alone. However, it would be far easier if we could simply find a way to decrease our risk of getting Alzheimer’s disease in the first place. One recent study suggests that it’s as easy as eating healthy.

We’ve known for a long time that what you put on your plate is the single most important modifiable environmental factor for your risk of getting a variety of diseases, ranging from obvious ones like scurvy (drink your OJ!) to less obvious ones like prostate cancer (stay away from red meat!). Given these relationships, a team of researchers decided to study what combination of foods relates to the risk of getting Alzheimer’s disease. They asked over 2000 elderly subjects to describe their eating habits in great detail, then followed them for 4 years.

After the 4 years had passed, 253 subjects had developed Alzheimer’s disease. After careful analysis of the subjects’ diets, the researchers found that the following dietary pattern was significantly associated with the risk of developing Alzheimer’s disease:

Eating high-fat dairy, red meat, organ meat, and butter is correlated with an increase in the risk of Alzheimer’s disease.

Subjects whose diets were lowest in these foods saw their risk of developing Alzheimer’s disease drop by 38%. By now, I’m sure my alert Scientific Chick readers are already wondering if the researchers looked at other factors, such as age. Differences in age, education, ethnicity and sex didn’t change the relationship between diet and Alzheimer’s disease. However, other factors did lessen the importance of the diet, such as smoking, body mass index, and caloric intake. But even when controlling for all these factors, the relationship between diet and risk of developing Alzheimer’s disease remained significant.

The major strength of this study is that it looked at existing dietary patterns instead of trying to impose them, which is often unreliable. However, some of the results can be misleading. A low intake of vitamin B12 may seem to protect you against Alzheimer’s disease, but that’s because many foods that contain B12 (such as meat and dairy) also contain high levels of saturated fats, which increase your risk for Alzheimer’s disease. Overall, as with any scientific study, the results must be interpreted carefully. It’s a combination of foods that is important: you can drink antioxidant-rich blueberry juice all you want, but if you’re having it with a side of ground beef, you’re missing the point. In addition, there could be some factor other than diet at play that the authors did not control for.

So while we don’t have a miracle drug yet, and no single food can prevent Alzheimer’s disease, it seems like a healthy, varied and unprocessed diet is a good place to start to ensure healthy aging.

About Me

Dr. Julie is an Assistant Professor of Neurology at the National Core for Neuroethics and the Djavad Mowafaghian Centre for Brain Health at the University of British Columbia. She holds a PhD in Neuroscience.