Imagine this: What if scientists had a tool that allowed them to edit genes directly, altering their underlying DNA? The science-fictional applications, like designer babies or Frankensteined organisms, would be obvious—although ethical and legal rules in science and medicine might prevent such uses. Immediate applications would be more mundane, but also more significant: understanding and treating disease, manufacturing new types of pharmaceuticals, and engineering more resilient foods, for starters.

There’s no need to imagine, actually. Such a tool does exist, and scientists have been refining it over the last decade or so. But despite massive hype in the science and general press, it probably remains unfamiliar or misunderstood to many people, especially those who don’t follow science news regularly. The reason might have to do with its terrible branding.

* * *

The gene-editing tool is called CRISPR, an acronym for Clustered Regularly Interspaced Short Palindromic Repeats. More confusingly, the strands of DNA called CRISPR have been around for billions of years. Bacteria use CRISPR to slice out portions of an attacking virus, storing them in their own DNA for later defense. Recently, scientists found a way to harness this mechanism for genetic inspection and editing. Unlike earlier gene-editing techniques, CRISPR is faster, cheaper, and more reliable.

Things get more confusing. CRISPR (the research apparatus, not the biological technique it adapts) is but a nickname; the tool’s proper name is CRISPR-Cas9. Cas9 is the “molecular scissors”—an enzyme that does the cutting. A guide RNA (gRNA) helps Cas9 find the desired gene to cut. The engineering innovation in CRISPR (the tool, not the DNA strands themselves) involved synthesizing the guide RNA so that geneticists could make specific genetic cuts and pastes. Modifications of CRISPR also allow scientists to activate, suppress, or control those genes.

It’s a lot to take in. And no surprise, since genetics is a specialized field. It requires expert knowledge and precise terminology, much of which seems esoteric to outsiders. Though alienating for ordinary folk, expert terminology is useful and necessary. It helps specialists interact with clarity and efficiency.

But in science today, specialist language bleeds into the public interest. Gene-editing isn’t just an impressive new practice, but also one with implications for humanity at large. The applications are so varied, from gene identification for medical research to the actual, if technically illicit, manufacture of alien organisms. The commercial and legal stakes are high, too. Hundreds of millions of dollars have been invested in CRISPR-driven biotech businesses, and the technology has been subject to a bitter patent dispute between the research institutions that supported its development.

Given its sweeping impact on the future of biotech, one might hope all informed citizens would have a basic understanding of what CRISPR is and what it does. But conveying such knowledge becomes difficult when the setup is so onerous and unwelcoming. Coverage of CRISPR is frequent, thanks to its rapid evolution and application. But every article about CRISPR must start by laboriously explaining what it is, before moving on to the new news about it.

Its name should bear a lot of the blame. CRISPR is an acronym. The unwieldy esotericism of “Clustered Regularly Interspaced Short Palindromic Repeats” is obvious. Experts use acronyms to make technical terms easier to reference. They work even better when truly acronymous—that is, pronounced as a word rather than as a series of initials.

But once transformed into words, acronyms obscure more than they reveal. From “honey” to “Jenny” to “CRISPR,” nicknames imply familiarity and intimacy. For that familiarity to be valid, the speaker must have earned the right to use the nickname. In the case of CRISPR, that’s probably not true of most people who utter, read, or type it. That’s nothing new for technical and medical acronyms, of course. How many people have heard of—or been administered—a PET scan, CAT scan, or MRI, without knowing what those letters stand for or what they mean?

CRISPR is a particularly egregious example. PET and CAT scans might justify their acronymous, consumer-facing names thanks to those terms’ association with domestic animals—a gentle comfort for a strange test run in a claustrophobic tube. But CRISPR has a harder time squaring gene editing with a name that sounds like it names a drawer in the refrigerator, or a breakfast cereal.

In fact, there’s already a Snickers Crisper candy bar (no relation), which adds crunchy rice to that treat’s trademark peanuts, nougat, and caramel. And a German-language CRISPR explainer video starts by forgiving the viewer for thinking that it might be a cereal bar. At first, those comparisons might seem like a cute way to apologize for scientific dorkship. But perhaps the name for a gene editor should telegraph more gravitas than a trip to the vending machine—future CRISPR-facilitated, designer foods notwithstanding.

CRISPR makes things worse by conflating CRISPR the DNA strands with CRISPR-Cas9, the enzyme-and-gRNA duo—which is still something distinct from the biotechnical apparatus that makes use of the two together to evaluate and alter individual genes. Calling a gene-editing mechanism CRISPR is a little like calling your refrigerator Maggie because your wife Margaret stores produce from her garden in its crisper drawer. In short, CRISPR is a terrible name that does a massive disservice to a revolutionary biotechnology.

* * *

How to name and explain scientific esoterica is a problem normally relegated to science communications and journalism. Once the science is done, it’s up to the rest of us to make sense of it. But there’s reason to see the branding of CRISPR and other esoteric but influential technologies as a first-principles aspect of scientific research and publishing. When scientists stop describing the natural world and begin altering it, then they have a responsibility to help facilitate the public’s understanding of how their technologies work.

Admittedly, it’s a problem that plagues all the sciences as they move from pure to applied. But biotechnological innovation at the cellular or chromosomal level is particularly challenging to make comprehensible to the general public, because its operation and impact take place in hiding. Furthermore, unlike pharmaceuticals, medical devices, or even nanotech equipment, the family of technologies known as CRISPR operates in an unfamiliar, almost alien way.

Rather than inserting foreign media or apparatuses into an organism, CRISPR makes the organism itself do the work on its own nucleotides. It’s more like genetic puppeteering than genetic editing. Ideally, the public names for methods and systems built atop CRISPR would telegraph their uses through clearly named and defined functions of particular kinds of Cas9-guided genetic manipulations.

One model for improving biotechnical branding comes from standards organizations. After the industrial revolution, the rise of interchangeable tools and parts both demanded and facilitated standardization. Thanks to the increased precision of machining, it was possible to define parts at a granular level—the sizes and thread patterns of screws, bolts, and nuts, for example. Over time, standards bodies developed to guide and manage the creation of technologies so that they would interoperate. While public communication wasn’t a primary goal of standards organizations, their impact couldn’t help but embrace public interest and public knowledge, particularly as the technologies they sought to influence became more universal, and thereby more public. For example, the Universal Postal Union (UPU) coordinates international postal policies, relieving individual nations of the burden of negotiating individual treaties.

Computing is hardly a great model industry for clearly naming and explaining its apparatuses to the public. But information technology does offer a possible precedent for biotech, thanks to its widespread embrace of standards bodies as intermediaries between technical implementation and public use. The World Wide Web Consortium (W3C) develops standards for the web, for example, including the operation of HTML and CSS. W3C standards are often adopted (or ignored) by the tech industry, but admittedly, the web’s underlying systems remain largely invisible and unknown to the general public.

The Institute of Electrical and Electronics Engineers (IEEE), by contrast, is a professional association for electrical engineering that operates an internal standards association. IEEE standards not only help computer engineers develop interoperable products, but also facilitate the branding and marketing of those standards to the general public. The IEEE 802.3 standard, for example, is better known as Ethernet, a local-networking technology. Anyone who has purchased a wireless router may have encountered IEEE 802.11, the wireless networking specification known as Wi-Fi. Or take the short-range wireless communication method known as Bluetooth. While originally invented (and, aptly, named) by the Swedish telecom company Ericsson, it was standardized as IEEE 802.15.1, although it is now managed by a separate standards body.

To be sure, names like Ethernet, Wi-Fi, and Bluetooth do obscure the underlying operation of those technologies. But that’s a matter for experts. In exchange, the public gets an abstraction that does a pretty good job naming and distinguishing three otherwise similar data networking technologies.

Standards organizations in health care and biotech do exist, but they are mostly focused on data interoperability. One exception is the American Type Culture Collection (ATCC). It operates a Standards Development Organization (SDO), which develops and manages standards for biomaterials. Last year, ATCC licensed the CRISPR-Cas9 technology, with an eye toward standardization. But ATCC is mostly interested in using CRISPR to generate standardized cell lines, rather than to help standardize the operation and use of the CRISPR technologies as such. And with problems arising—from implementation errors to potentially dangerous, engineered viruses—even scientists are beginning to recognize the downsides of a CRISPR wild west. It would be nice if there were a less laborious way for everyone else to participate in that conversation.

* * *

Gene editing is both exciting and terrifying. The medical, pharmaceutical, and agricultural applications are particularly enthralling—but likewise, the specter of human engineering, genetic screening, and bioweaponry manufacture is just as paralyzing. The speed of CRISPR-related development has already raised concern in the scientific and defense communities. But both CRISPR’s promise and threat remain opaque to the human beings whose lives might be altered by their impact. For the time being, scientists, journalists, policymakers, and the public lack a clear reference on what the family of molecular gadgets known as CRISPR can—and do—make possible.

Meanwhile, CRISPR research and development is accelerating. CRISPR hype is skyrocketing. That pace will only increase now that the patent dispute has concluded, clearing the way for lucrative licensing. It may seem silly to raise a flag about the name used to signify a technology when so many exciting (and terrifying) applications of that technology are already rolling out. But it is impossible to debate a technology’s applications in the public forum without an effective way to refer to the thing being debated. Unless the scientific community acts to explain and clarify the technology, its uses, and its dangers, it will cede that right to others.

And those proverbial, genetically modified wolves are already circling. If science doesn’t step in first, the first general-audience take on gene editing might come in the form of a Jennifer Lopez-produced network television drama about saving humanity from a mad scientist. The show’s working title: “C.R.I.S.P.R.”

About the Author

Ian Bogost is a contributing editor at The Atlantic. He is the Ivan Allen College Distinguished Chair in media studies and a professor of interactive computing at the Georgia Institute of Technology. His latest book is Play Anything.
