by Rebecca Schwarzlose

Tag Archives: genetics

The history of science is littered with bones. Since antiquity, humans have studied the remains of the dead to understand the living. The practice is as common now as ever; only the methods have changed. In recent years, high-tech analyses of human remains have solved mysteries ranging from our ancestors’ prehistoric mating patterns to the cause of Beethoven’s death. The latest example of this morbid scientific tradition can be found in the e-pages of this month’s PLOS Pathogens. The colorful cast of characters includes European geneticists, a handful of teeth, a 6th century plague, and the US Department of Homeland Security.

Although the word plague is often used as a synonym for disease, plague actually refers to a particular type of illness caused by the bacterium Yersinia pestis. Rampant infection by Y. pestis was responsible for the most recent plague pandemic, which spanned the 19th and 20th centuries. Before that it caused the 14th to 17th century pandemic that included the epidemic known as the Black Death.

Yet the pestilence of pestis may have swept across human populations long before the Black Death. According to historical records, a terrible pandemic killed people from Asia to Africa to Europe between the 6th and 8th centuries. It struck the Roman Empire under the watch of Emperor Justinian I, who contracted the disease himself but survived. The pandemic now bears his name: the Justinianic Plague. But was Justinian’s malady really a plague or has history pinned the blame on the wrong bacterium? A group of researchers in Munich decided to find out.

How?

By digging up ancient graves, of course. And helping themselves to some teeth.

The ancient graves were in an Early Medieval cemetery called Aschheim in the German state of Bavaria. The site was a strange choice; the authors reveal in their paper that the historical record shows no evidence that the Justinianic Plague reached Bavaria. However, the site was conveniently located within driving distance of most of the study’s authors. (It’s always easiest to do your gravedigging closer to home.) The authors did have solid evidence that the graves were from the 6th century and that each grave contained two or more bodies (a common burial practice during deadly epidemics). In total, the group dug up 12 graves and collected teeth from 19 bodies.

The scientists took the teeth back to their labs and tested them for a stretch of DNA unique to Y. pestis. Their logic: if the individuals died from infection by Y. pestis, their remains should contain ample DNA from the bacteria. Of course, some of this DNA would have deteriorated over the course of 1.5 millennia. The scientists would have to make do with what they found. They used three different methods to amplify and detect the bacterial DNA, but they found a reliably large amount of it only in the teeth of one individual, a body they affectionately nicknamed A120. They genotyped the Y. pestis DNA found in A120 to see how the bacterial strain compared with other versions of the bacterium (including those that caused the Black Death and the 19th-20th century plague pandemic). The analysis showed that the Justinianic strain was an evolutionary precursor to the strain that caused the Black Death. Like the strains that sparked the second and third pandemics, this strain bore the genetic hallmarks of Y. pestis from Asia, suggesting that all three plague pandemics spread from the East.

The authors write that they have solved their historical mystery.

“These findings confirm that Y. pestis was the causative agent of the Justinianic Plague and should end the controversy over the etiological agent of the first plague pandemic.”

Ordinarily, the discussion sections of scientific papers are littered with qualifiers and terms like might be and suggestive. Not so here, even though the authors’ conclusion explains a phenomenon that killed many millions of people worldwide based on data from the decomposing remains of a single person who lived in a region that historians haven’t connected with the pandemic. In most branches of science, sweeping conclusions can only be made based on large and meticulously selected samples. In genetics, such rules can be swept aside. It is its own kind of magic. If you know how to read the code of life, you can peer into the distant past and divine real answers based on a handful of ancient teeth.

As it turns out, the study’s result is more than a cool addition to our knowledge of the Early Middle Ages. Plague would make a terrible weapon in the hands of a modern bioterrorist. That’s why the US Department of Homeland Security is listed as one of the funding sources for this study. So the next time you hear about your tax dollars hard at work, think of Bavarian graves, ancient teeth, and poor old A120.


I recently bought my baby new pajamas with a decal that says, “50% Dad + 50% Mom = 100% Me!” I couldn’t resist an outfit that doubles as both math and biology lessons. But on further reflection, I’ve realized that this simple formula is wrong in more ways than one.

To begin with, my baby doesn’t look like she’s 50% Mom. At best, she looks about 10% Mom. I’ve written before about how our daughter would be a mixture of traits from European and Indian peoples, reflecting her mom and dad’s respective heritages. Yet she arrived looking like a wholly Indian baby. This is fine, of course. I think she’s absolutely perfect with her caramel skin and jet black eyes and hair. But it’s hard to keep a straight face when friends politely ask us who we think she resembles. And when I’m out with her in public I’m aware that I look like her nanny, if not someone who’s stolen a baby. She truly doesn’t look like she’s mine.

How else is the formula wrong? Genetically. Sure, our daughter’s nuclear genes are composed of DNA sequences from both my husband and me. But she has another sort of DNA in her body, one that vastly outnumbers the conventional type in copies. This DNA lives in her mitochondria, the bacteria-like structures that populate our every cell. Mitochondria are like tiny internal combustion engines, generating most of our energy through respiration and releasing heat that makes us warm-blooded animals. Although mitochondria don’t have many actual genes, each one carries several copies of those genes. Multiply that by the 10 million billion or so mitochondria in our bodies and you’ll find that we each carry far more copies of the mitochondrial genome than of the nuclear one. And these mitochondrial genes are inherited entirely from the mother.
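For the numerically inclined, the copy-count claim is easy to sanity-check. Here is a back-of-envelope sketch in Python; every figure in it is a loose, commonly cited estimate (mitochondrial counts vary enormously by cell type), not a measurement from any particular study:

```python
# Rough comparison of mitochondrial vs. nuclear genome copies in one
# typical human cell. All numbers are loose, textbook-style estimates.

MT_GENOME_BP = 16_569              # length of the human mitochondrial genome
NUCLEAR_GENOME_BP = 6.4e9          # diploid nuclear genome (~2 x 3.2 Gbp)

MITOCHONDRIA_PER_CELL = 500        # varies from tens to thousands by cell type
MT_GENOMES_PER_MITOCHONDRION = 5   # each organelle holds several copies

mt_copies = MITOCHONDRIA_PER_CELL * MT_GENOMES_PER_MITOCHONDRION
mt_bp = mt_copies * MT_GENOME_BP

print(f"mitochondrial genome copies per cell: {mt_copies}")
print("nuclear genome copies per cell: 2")
print(f"mtDNA base pairs per cell: {mt_bp:.2e}")
print(f"nuclear base pairs per cell: {NUCLEAR_GENOME_BP:.2e}")
```

Run it and you'll see the asymmetry: thousands of mitochondrial genome copies against just two nuclear copies per cell, even though the nuclear genome still dominates by sheer base-pair count.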

Mitochondrial genes can’t claim credit for your eye color, jaw shape, or intrinsic disposition. Their reach is mostly limited to details of your metabolism and your susceptibility to certain diseases. But mitochondrial DNA is significant for another reason: scientists use it to trace human lineages across the globe. After all, mitochondrial genes don’t get reshuffled in each generation the way our nuclear genes are. Mitochondrial inheritance can be traced back hundreds of thousands of years, following the maternal lineage at every generation. Unlike the historian’s genealogy, which often follows surnames passed down from fathers, the scientist’s genealogy is a tree built of mothers alone.
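If you like to think in code, the "tree built of mothers" is just a walk that ignores fathers at every step. This toy sketch (the names are invented for illustration) follows only the maternal link, the same way mitochondrial DNA is passed down:

```python
# A toy sketch of the maternal genealogy: at each generation, follow only
# the mother's link, the way mitochondrial DNA is inherited.

family = {  # person -> (mother, father); names are made up
    "baby":    ("mom", "dad"),
    "mom":     ("grandma", "grandpa"),
    "grandma": ("great-grandma", "great-grandpa"),
}

def maternal_line(person, tree):
    """Walk mother-to-mother until the records run out."""
    line = [person]
    while person in tree:
        person = tree[person][0]   # index 0 = mother; fathers are skipped
        line.append(person)
    return line

print(maternal_line("baby", family))
# ['baby', 'mom', 'grandma', 'great-grandma']
```

Notice that "dad" and "grandpa" never appear in the output: their nuclear genes are in the baby, but they are invisible to this particular family tree.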

So it is through our mothers that our heritages can be traced into the distant past. In every one of her cells, my baby carries a map leading back through me and my mother and her mother and beyond . . . unbroken all the way back to our earliest origins as modern humans. And since my baby is a girl, she can continue that line. So long as she has a daughter and she has a daughter and so on, I will remain a part of that ongoing chain.

My condolences to all you men out there. Same to all you women who only had sons. You’ve passed on your nuclear genes and your child may be the spitting image of you, but your mitochondrial chain has been broken and you will be left out of the biologist’s tree. Although my daughter looks classically Indian, her mitochondrial DNA reveals only her European lineage. Despite the hair, eyes, and skin she inherited from her daddy, my baby’s mitochondria are mine all mine. She and I are links in a traceable chain of human life while my husband is nowhere to be found.

That’s something I can remember the next time I’m mistaken for the nanny.


My husband spotted another one yesterday. A half-Indian, half-Caucasian blend. The woman had an Indian first and last name, but her features were more typical of a Persian ethnicity than either Indian or white. My husband overheard her describing her heritage and smiled. These days, with a half-Indian, half-white baby on the way, we’re hungry for examples of what our baby might look like. We’ve found a few examples among our acquaintances and some of my husband’s adorable nieces and nephews, not to mention the occasional Indian-Caucasian celebrity like Norah Jones. We think our baby will be beautiful and perfect, of course, although we’re doubtful that she’ll look very much like either one of us.

Many couples and parents-to-be are in the same position we are. In the United States, at least 1 in 7 marriages takes place between people of different races or ethnicities, and that proportion only seems to be increasing. It’s a remarkable statistic, particularly when you consider that interracial marriage was illegal in several states less than 50 years ago. (See the story of Loving Day for details on how these laws were finally overturned.) In keeping with the marriage rates, the number of American mixed race children is skyrocketing as well. It’s common to be, as a friend puts it, a “halfsie.” At least in urban areas like Los Angeles, being mixed race has lost the stigma it carried decades ago and many young people celebrate their mixed heritages. Their unique combinations of facial and physical features can be worn with pride. But the mixture goes deeper than just the skin and eyes and hair.

At the level of DNA, all modern humans are shockingly similar to one another (and for that matter, to chimpanzees). However, over the hundreds of thousands of years of migrations to different climates and environments, we’ve accumulated a decent number of variant genes. Some of these differences emerged and hung around for no obvious reason, but others stuck because they were adaptive for the new climates and circumstances that different peoples found themselves in. Genes that regulate melanin production and determine skin color are a great example of this; peoples who stayed in Africa or settled in other locations closer to the Equator needed more protection from the sun while those who settled in sites closer to the poles may have benefited from lighter skin to absorb more of the sun’s scarce winter rays and stave off vitamin D deficiency.

In a very real way, the genetic variations endemic to different ethnic groups carry the history of their people and the environments and struggles that they faced. For instance, my husband’s Indian heritage puts him at risk for carrying a gene mutation that causes alpha thalassemia. If a person inherits two copies of this mutation (one from each parent), he or she will either die soon after birth or develop anemia. But inheriting one copy of the gene variant confers a handy benefit – it makes the individual less likely to catch malaria. (The same principle applies for beta thalassemia and sickle cell anemia found in other ethnic populations.) Meanwhile, my European heritage puts me at risk for carrying a genetic mutation linked to cystic fibrosis. Someone who inherits two copies of this gene will develop the debilitating respiratory symptoms of cystic fibrosis, but thanks to a handy molecular trick, those with only one copy may be less susceptible to dying from cholera or typhoid fever. As the theory goes, these potentially lethal mutations persist in their respective populations because they confer a targeted survival advantage.

Compared to babies born to two Indian or two Caucasian parents, our baby has a much lower risk of inheriting alpha thalassemia or cystic fibrosis, since each of these diseases requires two copies of the same mutation. But our child could potentially inherit one copy of each of these mutations, endowing her with some Superbaby immunity benefits but also putting her children at risk for either disease (depending on the ethnicity of her spouse).
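The two-copies logic above is just a Punnett square, and it's simple enough to compute. The sketch below enumerates the allele combinations a child can inherit; here "a" stands for a generic recessive disease allele like the ones discussed above, not for any specific gene:

```python
from itertools import product
from collections import Counter

def offspring_genotypes(parent1, parent2):
    """Return the probability of each child genotype given two parents.
    A genotype is a two-letter string, e.g. 'Aa' = one mutant copy."""
    counts = Counter("".join(sorted(pair))
                     for pair in product(parent1, parent2))
    total = sum(counts.values())
    return {g: c / total for g, c in counts.items()}

# Two carrier parents (the classic recessive-disease scenario):
print(offspring_genotypes("Aa", "Aa"))
# {'AA': 0.25, 'Aa': 0.5, 'aa': 0.25} -> 1-in-4 risk of the disease

# One carrier parent, one non-carrier (the mixed-heritage case above):
print(offspring_genotypes("Aa", "AA"))
# {'AA': 0.5, 'Aa': 0.5} -> no affected children, but half are carriers
```

The second case is why a mixed-heritage child is so unlikely to get either disease: the odds that both parents carry the same population-specific mutation are low, so "aa" rarely even appears among the possibilities.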

The rise in mixed race children will require changes down the road for genetic screening protocols. It will also challenge preconceived notions about appearance, ethnicity, and disease. But beyond these practical issues, there is something wonderful about this mixing of genetic variants and the many thousands of years of divergent world histories they represent. With the growth in air travel, communication, and the Internet, it’s become a common saying that the world is getting smaller. But Facebook and YouTube are only the beginning. Thanks to interracial marriage, we’ve shrunk the world to the size of a family. And now, in the form of our children’s DNA, it has been squeezed inside the nucleus of the tiny human cell.


The results are in. The ultrasound was conclusive. And despite my previously described hunch that our growing baby is a boy, she turned out to be a girl. We are, of course, ecstatic. A healthy baby and a girl to boot! As everyone tells us, girls are simply more fun.

As I was reading in my pregnancy book the other day, I came across an interesting bit of trivia about baby girls. At this point in my pregnancy (nearly 6 months in), our baby’s ovaries contain all the eggs she’ll have for her entire life. As I mentioned in a prior post, the fact that a female fetus develops her lifetime supply of eggs in utero represents a remarkable transgenerational link. In essence, half of the genetic material that makes up my growing baby already existed inside my mother when she was pregnant. And now, inside me, exists half of the genetic material that will become all of the grandchildren I will ever have. This is the kind of link that seems to mix science and spirituality, that reminds us that, though we are a mere cluster of cells, there’s a poetry to the language of biology and Life.

But after stumbling upon this factoid about our baby’s eggs, I was also struck by a sense that somewhere someone seemed to have his or her priorities mixed up. If our baby were born today, she would have a slim chance of surviving. Her intestines, cerebral blood vessels, and retinas are immature and not ready for life outside the womb. Worse still, the only shot her lungs would have at functioning is with the aid of extreme medical intervention. The order of it all seems crazy. My baby is equipped with everything she’ll need to reproduce decades in the future, yet she lacks the lung development to make it five minutes in the outside world. What was biology thinking?

Then I remembered two delightful popular science books I’d read recently, The Red Queen by Matt Ridley and Life Ascending by Nick Lane. Both described the Red Queen Hypothesis of the evolution of sex, which states that the reason so much of the animal kingdom reproduces sexually (rather than just making clones of itself) is to ‘outwit’ parasites. In short, if each generation of humans were the same as the next, parasites large and microbial could evolve to overtake us. By mixing up our genetic makeup through sexual reproduction, we make it harder for illnesses to wipe us out. Like the Red Queen from Lewis Carroll’s classic, we keep running in order to stay in the same place (which is one step ahead of parasites and disease).

Just as there are parasitic organisms and bacteria, one might say that there are parasitic genes. For example, mutations in the DNA of our own replicating cells can cause cancer, which is essentially a self-made, genetic parasite. Moreover, retroviruses like HIV are essentially bits of genetic material that invade our bodies and can insert themselves into the DNA of our cells. And the ultimate road to immortality for a parasitic gene would be to hitch a ride on the back of reproduction. Imagine what an easy life that would be! If a retrovirus could invade the eggs in the ovaries, it would be passed on from one generation to the next without doing one iota of work. It’s the holy grail of parasitic invasion – get thee to the ovaries! According to Matt Ridley in another of his books, The Origins of Virtue, the human germ line is segregated from the rest of the growing embryo by 56 days after fertilization. Within two months of conception, the cells that will give rise to all of the embryo’s eggs (or sperm, in males) are already cordoned off. They are kept safe until they are needed many years in the future.

So perhaps my little baby’s development isn’t as backwards as it seemed at first. Yes, lungs are important. But when you’ve got something of value to others, it makes practical sense to hurry up and lock it away.


It’s the early 19th century, before Darwin’s Origin of Species. Before Mendel’s peas and Watson and Crick’s double helix. Scientists are struggling with the big questions of inheritance and reproduction without the aid of modern scientific methods. In this vacuum of concrete information, odd theories gain traction – some based on racial or social agendas, others on intuition or supposition.

Lamarckism, or soft inheritance, was one of the more pervasive of these ideas. According to the theory, organisms can inherit acquired traits. In the days before Darwin’s evolutionary theory, Lamarckism helped explain why organisms were so well adapted to their environments. Take the example of the giraffe’s long neck. A giraffe of yore (when giraffes had shorter necks) had to stretch its neck to reach the luscious leaves further up on tree branches. All that stretching lengthened its neck a little, and this longer neck was passed on to its offspring, who in turn stretched their necks and sired offspring who could reach even higher and munch the choicest leaves. It went on like this until giraffes were tall enough that they didn’t have to strain to reach leaves anymore.

It was a neat explanation that appealed to many 19th century scientists; even Darwin occasionally made use of it. But the theory had a nasty side as well. People applied it to humans and used it to explain differences between races or socioeconomic classes, calling the phenomenon degeneration. The mental and physical effects of years spent boozing and behaving badly would be passed down from father to son to grandson, each successively worse than his predecessor as the collective sum of each reckless lifetime added up. There was a technical term for the poor souls who wound up literally inheriting the sins of their fathers: degenerates. Certain scientists (or pseudoscientists) of the era, such as Benedict Morel and Cesare Lombroso, used the ideas of soft inheritance and degeneration to explain how violence, poverty, and criminality were heritable and could be categorized and studied.

Lamarckism, in the hands of Morel and others, offered a credible explanation of why the son of an alcoholic was more likely to be an alcoholic himself. But it did so by implying that the poor, the miserable, the suffering were inherently inferior to those with better, healthier (and probably wealthier) lifestyles. The poor were genetically degenerate, and they had no one to blame but themselves.

Thank god, thank god, Lamarckism and its corollary, degeneration, were debunked. By the 20th century, scientists knew that inheritance didn’t work that way. Our genetic information isn’t changed by what we do during our lifetimes. Besides, our sex cells are segregated from the other cells in our bodies. We don’t descend from our mothers, subject to all the stresses, strains, and yes, even boozing that their brains and bodies may have experienced. Instead, we descend from their ovaries. And thankfully, those things are well protected.

Only there’s a catch. In the last few decades, we’ve learned that while Lamarckism isn’t correct, it isn’t entirely wrong either. We’ve learned this through the field of epigenetics (literally, above genetics). This burgeoning field has helped us understand why the causes of so many heritable diseases still elude us, nearly a decade after we sequenced the human genome. Epigenetics adds untold complexity to an already complex genome. Some of its mechanisms are transient, others last a lifetime, but they all regulate gene expression and are necessary for normal growth and development. Thanks to them, females inactivate one of their X chromosomes (so women don’t get a double dose of proteins from that set of genes). Epigenetic mechanisms also oversee cellular differentiation, the process by which embryonic cells containing identical genetic material become skin cells, hepatocytes, neurons, and every other diverse cell type in the human body.

It now appears that epigenetic factors play an enormous role in human health. And what we do in our lives, the choices we make, affect our epigenome. Exposure to chemicals, stressors, or dietary changes can cause long-lasting tags to sit on our DNA or chromatin, controlling which genes are read and transcribed into proteins. For example, chronic cocaine use causes lasting epigenetic changes in the nucleus accumbens, a brain area linked to addiction. These changes boost plasticity and drug-related gene expression, which in turn probably contribute to the reinforcing effects of the drug.

But that’s not all. Epigenetic effects can span generations. No, the hardships of your parents’ lifetimes aren’t literally passed on to you in a cumulative fashion, giving you that longer neck or boozier disposition that Lamarckism might predict. Nonetheless, what your parents (and even your grandmother) did before you were born can be affecting your epigenome today.

It’s pretty wild stuff. Even if you’ve never met your maternal grandmother, even if she died long before your birth, her experiences and behavior could be affecting your health. First of all, the prenatal environment your mother experienced can have epigenetic effects on her that then propagate on to the next generation (you). Moreover, all the eggs a female will ever make have already formed in her ovaries by the time she’s born. They may not be mature, but they are there, DNA and all. I think that’s a pretty amazing transgenerational link. It means that half the strands of DNA that wound up becoming you were initially made inside your grandmother’s body. As science reveals the power of the prenatal environment, evidence is mounting that even what your grandmother ate during your mother’s gestational period and whether she suffered hardships like famine can alter your own risk for heart disease or diabetes.

Luckily, epigenetic gene regulation is softer and less absolute than its cousin Lamarckism. It is reversible and it can’t accumulate, generation upon generation, to create a degenerate class. The science of today is more humane than the old guys predicted, but it doesn’t let us off the hook. Epigenetics should remind us that we must be thoughtful in how we live. Our choices matter, for ourselves and for our offspring. We don’t yet understand how epigenetic mechanisms control our health and longevity, but that isn’t stopping our bodies from making us pay for what we do now.


Thanks to a recent kerfuffle over the Earth’s precession and its effect on our astrological signs, many people have spent this week questioning their personality traits. I went from being a life-long Gemini (changeable, duplicitous) to a possible Taurus (stubborn, steady), neither of which I think describe me. I’ve never believed in astrological signs, but many people do, and this week must have been a confusing one for them.

The whole thing got me thinking about how we look outward for explanations and definitions of our inner selves. No one has a better vantage point than we do to observe our own personal thoughts, feelings, attitudes, and behaviors. How funny that we once looked to the stars in order to understand ourselves! Those of us who consider ourselves scientific and modern are no better. Although we scoff at sun signs and palm readings, increasingly we are turning to our brains and our DNA for answers that they simply can’t give.

In the 1800s, the Phrenological Fowlers (later Fowlers and Wells) founded a nationwide industry on reading people’s personalities based on the bumps on their heads. They published extensively and sent emissaries to small towns throughout the U.S. so that, for a small fee, the masses might come to know themselves better. The company and its methods were an unrivaled success. America was obsessed with phrenology. Sometime in the 1860s, a curious Mark Twain visited Fowler’s office under an assumed name. Fowler read his head and said that his skull dipped in at a particular point where it should have bulged out – a sure sign that Twain, the preeminent American humorist, utterly lacked a sense of humor.

Nowadays, many still look to their brains for answers. When I used to scan participants in fMRI experiments, they would often ask what I could tell them about their brains. I couldn’t tell them anything; all the analysis took place later, back at the lab. But as a frequent subject in pilot experiments for my own and colleagues’ studies, I’ve had unfettered access to data from my own brain. I know that I have a large and robust fusiform face area (a region thought to be critical for face recognition) and a rather dinky visual word form area (implicated in identifying letters of the alphabet). What does that mean, when I am an avid reader and often embarrass myself with my poor ability to recognize faces?

While people still look to the stars and to brains (if not skulls) in order to understand themselves, the next big thing has arrived. The age of personal genomics is upon us and countless startups out there are eager to swap a check and a swab of our cells for a glimpse into our futures and ourselves. I have to admit, I fantasize sometimes about having my genome read. I would love the chance to pore over details about my ancestral line or learn what type of diseases I am predisposed to developing. But the biggest draw is to learn about myself. What forms of the anxiety genes do I have? What about genes linked to mental illness, intelligence, novelty-seeking? As a scientist, I know that complex traits are determined by a mixture of environment and numerous genes, many of which we haven’t yet discovered. Beyond that, epigenetic factors influence the expression of our genes in ways we don’t yet understand. Yet I still find myself wishing someone would hand me that printout with the secrets to myself.

The cognitive scientist Steven Pinker wrote a wonderful essay wading through the results of his own genome sequencing. In it, he struggles with the discrepancies. His genome says he should be sensitive to bitter flavors, yet he enjoys beer, broccoli, and brussels sprouts. His genome says he has a high risk of baldness, yet he is known for his thick mane of overflowing, curly hair. Other results he believes or would like to believe. What is a person seeking direction and self-wisdom to do?

So at the end of this astrologically confusing week, I find myself at a loss. Why do we crave external guidance to help us understand our internal selves? It may be because we are less static and more changeable than we like to believe. As I alluded to in my post about our potential to do evil, psychology experiments (and history) have shown that human beings are heavily influenced by their circumstances. Because we are adaptable, we behave very differently depending on who we are with and what we are doing. Although the adaptability may be advantageous, I suspect it unsettles us. We want to believe we have a solid, stable identity, and we will look to mystics or scientists – anyone who can give us that assurance: I know who I am and who I always will be.

The hard (but in its own way beautiful) truth is that we are each a complex and contradictory landscape of traits, behaviors, and passions. Be wary of those who try to describe you with a handful of paltry adjectives. Know thyself. Or keep trying, anyway. It should take at least a lifetime.


My Indian-descent fiancé and I have a politically incorrect game going. I call it ‘Whose Race is Genetically Inferior?’ And I lose every time. My fiancé has a number of good points for why brown is better than white.

One: fewer genetic diseases. He says this is because whities had several population bottlenecks in Europe, which caused inbreeding that proliferated mutations. I think his term may have been ‘genetic backwaters.’ And it may be true that some of my ancestors got it on with their cousins or siblings. Whatever.

I do point out that a disproportionate amount of medical and genetic research is done in Europe and America, so we would be more likely to detect rare genetic illnesses and identify their causes. My fiancé points out that there are enough people in India that even a rare disease would be detected, to which I roll my eyes but concede.

Which brings us to point number two: reproduction. I’m no expert at whether Indians are more fertile than Caucasians; I’d suspect not. But lord knows they’re doing something right.

Three: success. I argue that success is not genetic, but my fiancé just raises his bushy eyebrows and responds, “who’s to say?” I can’t deny that Indians seem to show remarkable determination and success in the face of often disadvantaged circumstances. The Indian families I know in America arrived here with very little, only to build medical or business careers that left them quite well off. And in India, despite an exploding population, over a century of subjugation, and an effective Cold War with Pakistan, they’ve managed to become one of the fastest growing economies in the world.

So yes, I tell him, I’m taking a step up on the genetic ladder. For the sake of the kids.

Whichever of their parents is superior, our kids will benefit from our genetic diversity. For example, they’ll have a lower risk of cystic fibrosis than white children and a lower risk of sickle-cell anemia and thalassemia than children of Indian descent.

So take this as a warning, people. We’re comin’ at you with the Vanilla-Chocolate Swirl.