How An Entire Nation Became Russia's Test Lab for Cyberwar
Andy Greenberg | June 20, 2017

In 2009, when the NSA’s Stuxnet malware silently accelerated a few hundred Iranian nuclear centrifuges until they destroyed themselves, it seemed to offer a preview of this new era. “This has a whiff of August 1945,” Michael Hayden, former director of the NSA and the CIA, said in a speech. “Somebody just used a new weapon, and this weapon will not be put back in the box.”

Now, in Ukraine, the quintessential cyberwar scenario has come to life. Twice. On separate occasions, invisible saboteurs have turned off the electricity to hundreds of thousands of people. Each blackout lasted a matter of hours, only as long as it took for scrambling engineers to manually switch the power on again. But as proofs of concept, the attacks set a new precedent: In Russia’s shadow, the decades-old nightmare of hackers stopping the gears of modern society has become a reality.

And the blackouts weren’t just isolated attacks. They were part of a digital blitzkrieg that has pummeled Ukraine for the past three years—a sustained cyber­assault unlike any the world has ever seen. A hacker army has systematically undermined practically every sector of Ukraine: media, finance, transportation, military, politics, energy. Wave after wave of intrusions have deleted data, destroyed computers, and in some cases paralyzed organizations’ most basic functions. “You can’t really find a space in Ukraine where there hasn’t been an attack,” says Kenneth Geers, a NATO ambassador who focuses on cybersecurity.

People think they know everything about slavery in the United States, but they don’t. They think the majority of African slaves came to the American colonies, but they didn’t. They talk about 400 years of slavery, but it wasn’t. They claim all Southerners owned slaves, but they didn’t. Some argue it was a long time ago, but it wasn’t.

As a scholar of slavery at the University of Texas at Austin, I welcome the public debates and connections the American people are making with history. However, there are still many misconceptions about slavery.

I’ve spent my career dispelling myths about “the peculiar institution.” The goal in my courses is not to victimize one group and celebrate another. Instead, we trace the history of slavery in all its forms to make sense of the origins of wealth inequality and the roots of discrimination today. The history of slavery provides deep context to contemporary conversations and counters the distorted facts, internet hoaxes and poor scholarship I caution my students against.

Four myths about slavery

Myth One: The majority of African captives came to what became the United States.

Truth: Only about 380,000 enslaved Africans, or 4 to 6 percent of the total, came to the United States. The majority of enslaved Africans went to Brazil, followed by the Caribbean. A significant number of enslaved Africans arrived in the American colonies by way of the Caribbean, where they were “seasoned” and mentored into slave life. They spent months or years recovering from the harsh realities of the Middle Passage. Once they were forcibly accustomed to slave labor, many were then brought to plantations on American soil.

Myth Two: Slavery lasted for 400 years.

Popular culture is rich with references to 400 years of oppression. There seems to be confusion between the Transatlantic Slave Trade (1440-1888) and the institution of slavery, confusion only reinforced by the Bible, Genesis 15:13:

Then the Lord said to him, ‘Know for certain that for four hundred years your descendants will be strangers in a country not their own and that they will be enslaved and mistreated there.’

Listen to Lupe Fiasco – just one Hip Hop artist to refer to the 400 years – in his 2011 imagining of America without slavery, “All Black Everything”:

[Hook]
You would never know
If you could ever be
If you never try
You would never see
Stayed in Africa
We ain’t never leave
So there were no slaves in our history
Were no slave ships, were no misery, call me crazy, or isn’t he
See I fell asleep and I had a dream, it was all black everything

[Verse 1]
Uh, and we ain’t get exploited
White man ain’t feared so he did not destroy it
We ain’t work for free, see they had to employ it
Built it up together so we equally appointed
First 400 years, see we actually enjoyed it

Truth: Slavery was not unique to the United States; it is a part of almost every nation’s history from Greek and Roman civilizations to contemporary forms of human trafficking. The American part of the story lasted fewer than 400 years.

How do we calculate it? Most historians use 1619 as a starting point: 20 Africans referred to as “servants” arrived in Jamestown, VA, on a Dutch ship. It’s important to note, however, that they were not the first Africans on American soil. Africans first arrived in America in the late 16th century not as slaves but as explorers, together with Spanish and Portuguese explorers. One of the best known of these African “conquistadors” was Estevanico, who traveled throughout the Southeast from present-day Florida to Texas. As for the institution of chattel slavery – the treatment of slaves as property – in the United States, if we use 1619 as the beginning and the 1865 Thirteenth Amendment as its end, then it lasted 246 years, not 400.

Myth Three: All Southerners owned slaves.

Truth: Roughly 25 percent of all Southerners owned slaves. The fact that one quarter of the Southern population were slaveholders is still shocking to many. This truth brings historical insight to modern conversations about the Occupy Movement, its challenge to the inequality gap and its slogan “we are the 99%.”

Take the case of Texas. When it achieved statehood, the Lone Star State had a shorter period of Anglo-American chattel slavery than other Southern states – only 1845 to 1865 – because Spain and Mexico had occupied the region for almost half of the 19th century, with policies that either abolished or limited slavery. Still, the number of people impacted by wealth and income inequality is staggering. By 1860, the Texas enslaved population was 182,566, while slaveholders represented 27 percent of the population, controlled 68 percent of government positions and held 73 percent of the wealth. These are shocking figures, but today’s income gap in Texas is arguably more stark, with 10 percent of tax filers taking home 50 percent of the income.

Myth Four: Slavery was a long time ago.

Truth: African-Americans have been free in this country for less time than they were enslaved. Do the math: Blacks have been free for 149 years, which means that most Americans are only two to three generations removed from slavery. However, former slaveholding families built their legacies on the institution and generated wealth that African-Americans have not been privy to, because enslaved labor was forced, segregation maintained wealth disparities, and overt and covert discrimination limited African-American recovery efforts.
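The arithmetic behind this myth and the previous one can be checked directly. The sketch below uses the article's own endpoints; the "149 years free" figure implies the piece was written around 2014, which is an inference, not something the article states:

```python
# Dates from the article: chattel slavery from 1619 (Jamestown)
# to 1865 (Thirteenth Amendment). "Free for 149 years" implies
# a writing date of about 2014 (an inference from the article).
years_enslaved = 1865 - 1619
years_free = 2014 - 1865
print(years_enslaved)               # 246 years, not 400
print(years_free)                   # 149 years
print(years_enslaved > years_free)  # freedom is still the shorter span
```

The comparison makes the point plainly: by the article's own numbers, the free period is roughly a century shorter than the enslaved one.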

The value of slaves

Economists and historians have examined detailed aspects of the enslaved experience for as long as slavery existed. Recent publications related to slavery and capitalism explore economic aspects of cotton production and offer commentary on the amount of wealth generated from enslaved labor.

My own work enters this conversation looking at the value of individual slaves and the ways enslaved people responded to being treated as a commodity. They were bought and sold just like we sell cars and cattle today. They were gifted, deeded and mortgaged the same way we sell houses today. They were itemized and insured the same way we manage our assets and protect our valuables.

Enslaved people were valued at every stage of their lives, from before birth until after death. Slaveholders examined women for their fertility and projected the value of their “future increase.” As the enslaved grew up, enslavers assessed their value through a rating system that quantified their work. An “A1 Prime hand” was one term for a “first rate” slave who could do the most work in a given day. Values decreased on a quarter scale, from three-fourths hands to one-fourth hands, down to a rate of zero, which was typically reserved for elderly or differently abled bondpeople (another term for slaves).

Guy and Andrew, two prime males sold at the largest slave auction in U.S. history in 1859, commanded different prices. Although similar in “all marketable points in size, age, and skill,” Guy commanded $1,240 while Andrew sold for $1,040 because “he had lost his right eye.” A reporter from the New York Tribune noted “that the market value of the right eye in the Southern country is $240.” Enslaved bodies were reduced to monetary values assessed from year to year, and sometimes from month to month, for their entire lifespan and beyond. By today’s standards, Andrew and Guy would be worth about $33,000-$40,000.
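The conversion to today's dollars can be sketched with a simple price multiplier. The ~32x factor below is an assumption backed out of the article's own $33,000-$40,000 range, not an official inflation statistic:

```python
# Rough 1859-to-mid-2010s price adjustment. The 32x multiplier is an
# assumption implied by the article's $33,000-$40,000 estimate for
# these two men; it is not an official CPI figure.
MULTIPLIER = 32
for name, price_1859 in (("Andrew", 1040), ("Guy", 1240)):
    print(f"{name}: ${price_1859 * MULTIPLIER:,} in today's dollars")
```

Running this gives figures inside the article's stated range, which is the only check the sketch is meant to perform.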

Slavery was an extremely diverse economic institution, one that extracted unpaid labor from people in a variety of settings, from small single-crop farms and plantations to urban universities. This diversity is also reflected in enslaved people’s prices. Enslaved people understood they were treated as commodities.

“I was sold away from mammy at three years old,” recalled Harriett Hill of Georgia. “I remembers it! It lack selling a calf from the cow,” she shared in a 1930s interview with the Works Progress Administration. “We are human beings,” she told her interviewer. Those in bondage understood their status. Though Hill was too little to remember her price at age 3, she recalled being sold for $1,400 at age 9 or 10: “I never could forget it.”

Slavery in popular culture

Slavery is part and parcel of American popular culture, but for more than 30 years the television miniseries Roots was the primary visual representation of the institution, except for a handful of independent (and not widely known) films such as Haile Gerima’s Sankofa or the Brazilian Quilombo. Today Steve McQueen’s 12 Years a Slave is a box office success, actress Azie Mira Dungey has a popular web series called Ask a Slave, and in Cash Crop sculptor Stephen Hayes compares the slave ships of the 18th century with third world sweatshops.

From the serious – PBS’s award-winning Many Rivers to Cross and the interactive Slave Dwelling Project, whereby school-aged children spend the night in slave cabins – to the comic at Saturday Night Live, slavery is today front and center.

The elephant that sits at the center of our history is coming into focus. American slavery happened — we are still living with its consequences.

A panel of cancer experts recommended approval Wednesday of an experimental therapy for treating children and young adults with advanced leukemia that could be the first gene therapy approved in the United States, potentially opening the door to a wave of treatments custom-made to target a patient’s cancer.

The advisory panel to the federal Food and Drug Administration voted 10-0 in favor of the treatment developed by the University of Pennsylvania and Novartis Corp. The FDA usually follows the recommendations of its expert panels.


“This is a major advance,” said panel member Dr. Malcolm A. Smith of the National Cancer Institute.

Smith said the treatment is “ushering in a new era.”

Called CAR-T, it involves removing immune cells from a patient’s blood, genetically altering them — in effect, reprogramming them to create an army of cells that can zero in on and destroy cancer cells — and then injecting them back into the patient to fight these blood cancers.

The unanimous vote came after lengthy discussion and impassioned pleas from the fathers of two young patients whose lives were saved by the therapy. The one-time leukemia treatment would be for children and young adults with the most common form of childhood cancer, known as ALL.

“Our daughter was going to die, and now she leads a normal life,” said Tom Whitehead of Philipsburg, Pennsylvania.

Five years ago, Emily Whitehead, now 12, was the first child to receive the experimental therapy.

“We believe when this treatment is approved, it will save thousands of children’s lives around the world,” Whitehead said.

After decades of setbacks and disappointments in efforts to fix, replace, or change genes to cure diseases, several companies are near the finish line in a race to bring CAR-T and other types of gene therapy to patients. Kite Pharma also has a CAR-T therapy in FDA review, and Juno Therapeutics and others are in late stages of testing.

Human T cells from cancer patients arrive at Novartis Pharmaceuticals Corp.’s Morris Plains, N.J., facility, where they are processed and turned into super cells as part of a new gene therapy-based cancer treatment. | Novartis Pharmaceuticals Corp. via AP

Novartis is seeking approval to use the treatment for patients aged 3 to 25 with a blood cancer called acute lymphoblastic leukemia whose disease has spread or failed to respond to standard treatment. That happens to more than 600 patients in the U.S. each year. At that point, they have limited options — all more toxic than the CAR-T therapy — and survival chances are slim. ALL accounts for a quarter of all cancers in children under age 15.

In a key test, results were far better than chemotherapy and even newer types of cancer drugs. Of the 52 patients whose results were analyzed, 83 percent had complete remission, meaning their cancer vanished. Most patients suffered serious side effects, but nearly all recovered.

CAR-T therapy starts with filtering key immune cells called T cells from a patient’s blood. In a lab, a gene is then inserted into the T cells that prompts them to grow a receptor that targets a special marker found on some blood cancers. Millions of copies of the new T cells are grown in the lab, then injected into the patient’s bloodstream where they can seek out and destroy cancer cells. Doctors call it a “living drug” — permanently altered cells that continue to multiply in the body into an army to fight the disease.

During the patient testing, the whole process took about 16 weeks, which can be too long a wait for some desperately ill patients, the FDA advisers noted during the meeting in Silver Spring, Maryland. Drug company officials said they can now produce a treatment and get it to a patient in about three weeks.

Novartis said in a statement that it has long believed CAR-T therapy could “change the cancer-treatment paradigm.”

The cost of CAR-T therapy is likely to be hundreds of thousands of dollars, but it’s only given once. Typically, cancer patients take one or more drugs until they stop working, then switch to other drugs, so treatment — and side effects — can go on for years.

The treatment’s short-term side effects, including fever and hallucinations, are often intense as the body’s revved up immune system goes on the attack. The long-term side effects of the treatment are unknown. It’s also unclear if patients whose cancer goes into remission will be cured or will have their cancer return eventually. The FDA panel recommended that patients who get the treatment be monitored for 15 years.

Other biotech and pharmaceutical companies are developing types of gene therapy to treat solid cancers and rare gene-linked diseases. A few products have been approved elsewhere — one for head and neck cancer in China in 2004 and two in Europe, most recently GlaxoSmithKline’s Strimvelis. That was approved last year for a deadly condition called severe combined immunodeficiency and launched with a $670,000 price tag.

UniQure’s Glybera was approved for a rare enzyme disorder. It was used only once in five years, likely due to its $1 million-plus price tag, so uniQure is pulling it from the market.

It’s an iconic image of fall—a V of geese high overhead, migrating south for the winter. Several other bird species, such as flamingos, also fly in V formations, but geese are the most well known. On any given fall day, numerous Vs can be seen passing gracefully below the clouds. Most birds actually don’t migrate in the V formation; smaller birds tend to fly in huge amorphous flocks. Others fly in a simple line. But why do geese and other birds fly in V formations?

Writing in the journal The Auk, John Badgerow examined numerous formations of geese, hoping to come up with a definitive answer. Aerodynamics was not initially seen as the primary reason for the behavior; previous analyses had suggested that the birds flew too far apart to benefit from any energy savings. Instead, the angle of the V suggested that its primary purpose was visual communication: the geese composing the two diagonal lines, each slightly angled from the other, would have an unobstructed view of the lead goose, who determined the flight path of the flock.

Badgerow, unconvinced, decided to test both hypotheses. Working from film, he analyzed the geometry of the formations. Badgerow calculated that maximum energetic advantage (compared with solo flight) is achieved at exactly 0.16 meters between the wingtip of one bird and that of the bird following. The spacing requirement explains why only certain birds fly in V formations: only birds with large wingspans and slow wingbeats can achieve the energy saving. Rapid or erratic flapping creates too much wake turbulence, which disrupts the formation. Migrating geese effectively draft on the upwash shed from the wingtips of the birds ahead, much as formation-flying aircraft can.

At the same time, information about changes in course and velocity must be communicated to other members of the flock, so visual communication does play a role. The angle seems to be chosen so all members of a flock can see the leader and adjust to any changes. Flocks contain both experienced and new migrants, so communicating information about rest and feeding areas is vital.

In practice, turbulence, air currents, and other factors make it extremely difficult for the geese to maintain the exact spacing needed to maximize efficiency or communication, and actual efficiency improvements are only about 20% of the theoretical maximum. Geese go to great lengths to maintain their formation as best as possible, so Badgerow suggests that while both communication and efficiency play a role in V-formations, energy efficiency is the primary motivator. More recent studies have confirmed that flying in the lead is the most tiring position, so geese take turns at the head of the V in order to allow leaders to rest.

“The biggest buzzword right now is FOMO [fear of missing out], and that’s a huge factor for Millennials,” said Aubri Nowowiejski, 28, an executive producer at Coterie Spark, a global meeting and corporate event-planning firm based in Houston, Texas.

“It’s all about projecting to your social media network, and painting a picture of a phenomenal lifestyle. They chase experiences over things to get those likes and comments and interactions, and that dopamine fix,” said Nowowiejski, who founded the Student Event Planners Association in 2009 while a student at Texas State University. It currently has more than 2,000 members nationwide.

In fact, the study found that nearly half of Millennials surveyed attend live events so they have something to share online; and 78 percent reported enjoying seeing other people’s experiences on social media.

The Millennial obsession with FOMO and cultivating the perfect social media feed can be a boon to consumer-focused marketing events, especially if they are free, but it also presents challenges to conference and meetings organizers who value engagement. In an effort to lure the young demographic, organizers may reach for the unique “wow” factor, and wind up with an event that is more style than substance.

“That’s one of my biggest frustrations: I want us to put down our phones,” said Nowowiejski. “We’re experiencing the whole world through a lens and we’re not present in the moment, and it is an issue. Conference planners are trying to integrate social media and meet Millennials where they are. You go to a conference and are told to live-tweet, but if I’m tweeting, am I really absorbing what the speaker is saying? It’s a double-edged sword. What did you really experience if you didn’t put your phone down?”

Did your email spam filter keep junk out of your inbox? Did you find this site through Google? Did you encounter a targeted ad on your way?

We constantly hear that we’re on the verge of an AI revolution, but the technology is already everywhere. And Coursera co-founder Andrew Ng predicts that smart technology will help humans do even more. It will drive our cars, read our X-rays and affect pretty much every job and industry. And this will happen soon.

As AI rises, concerns grow about the future of humans. So how can we make sure our economy and our society are ready for a technology that could soon dominate our lives?

And why wouldn’t there be? One of the smartest humans alive, Stephen Hawking, says AI could end mankind.

But the question isn’t whether to worry about AI, it’s what kind of AI to worry about.

Tesla founder Elon Musk recently warned a gathering of governors that they need to act now to put regulations on the development of artificial intelligence. “I keep sounding the alarm bell, but until people see robots going down the street killing people, they don’t know how to react, because it seems so ethereal,” he said.

Musk is not talking about the sort of artificial intelligence that companies like Google, Uber, and Microsoft currently use, but what is known as artificial general intelligence — some conscious, super-intelligent entity, like the sort you see in sci-fi movies. Musk (and many AI researchers) believe that work on the former will eventually lead to the latter, but there are plenty of people in the science community who doubt this will ever happen, especially in any of our lifetimes.

To understand the threats AI may or may not pose to society, it’s best to understand the types of AI that do and don’t (yet) exist. Wait But Why has a great summary:

AI Caliber 1) Artificial Narrow Intelligence (ANI): Sometimes referred to as Weak AI, Artificial Narrow Intelligence is AI that specializes in one area. There’s AI that can beat the world chess champion in chess, but that’s the only thing it does. Ask it to figure out a better way to store data on a hard drive, and it’ll look at you blankly.

AI Caliber 2) Artificial General Intelligence (AGI): Sometimes referred to as Strong AI, or Human-Level AI, Artificial General Intelligence refers to a computer that is as smart as a human across the board—a machine that can perform any intellectual task that a human being can. Creating AGI is a much harder task than creating ANI, and we have yet to do it. Professor Linda Gottfredson describes intelligence as “a very general mental capability that, among other things, involves the ability to reason, plan, solve problems, think abstractly, comprehend complex ideas, learn quickly, and learn from experience.” AGI would be able to do all of those things as easily as you can.

AI Caliber 3) Artificial Superintelligence (ASI): Oxford philosopher and leading AI thinker Nick Bostrom defines superintelligence as “an intellect that is much smarter than the best human brains in practically every field, including scientific creativity, general wisdom and social skills.” Artificial Superintelligence ranges from a computer that’s just a little smarter than a human to one that’s trillions of times smarter—across the board.

Type 1 exists. This is what we use every day. This is what is reshaping our social networks, advertising and economy. The threat here is already visible. “Fake news” designed to hoax humans games algorithms to reach a wider audience. Automation is replacing human jobs.

Types 2 and 3 cause the anxiety. Futurist Michael Vassar, who has worked with AI, has used Nick Bostrom’s thinking on artificial intelligence to predict that “if greater-than-human artificial general intelligence is invented without due caution, it is all but certain that the human species will be extinct in very short order.”

Even though very smart people disagree over whether this AI will ever exist, the concept of a science-fiction dystopia is simultaneously terrifying and alluring. It’s easy to imagine a Terminator-like world where machines do battle with their human creators and think of it as both unlikely to happen in our lifetimes and also inevitable. And this can make it hard to think about taking steps to stop it from happening. At least one study has found that people are worried about smart machines killing them.

“In our current society, automation pushes people out of jobs, making the people who own the machines richer and everyone else poorer. That is not a scientific issue; it is a political and socioeconomic problem that we as a society must solve,” wrote AI researcher Arend Hintze. “My research will not change that, though my political self – together with the rest of humanity – may be able to create circumstances in which AI becomes broadly beneficial instead of increasing the discrepancy between the one percent and the rest of us.”

The inflation debate among Federal Reserve policy makers comes down to peering through a mud-caked windshield to figure out what's ahead, or being guided by looking in the rear-view mirror. But a third option, the U.S. Future Inflation Gauge, cuts through the confusion and has a solid track record of providing turn-by-turn directions to the road ahead like a GPS.

Traditionalists on the Federal Open Market Committee who place their faith in the Phillips curve worry that, with the jobless rate below the Fed’s estimate of “full employment,” the central bank will fall behind the curve on inflation if it stops tightening. Others argue that the Phillips curve has stopped working. They prefer the rear-view mirror approach, extrapolating core inflation, which is by definition a coincident indicator of inflation, to guide their expectations of future inflation. That methodology never foresees turns in the road ahead, instead projecting a straight extension of the road just traveled.

Some history is helpful. Following the first interest-rate hike in the 1994-95 tightening cycle -- which produced a “soft landing,” thanks to the only pre-emptive moves by the Fed in recent memory -- then-Fed Chairman Alan Greenspan endorsed the work of Geoffrey H. Moore in his congressional testimony. The Wall Street Journal described Moore as “the father of leading indicators,” whose work on inflation is the basis for our firm's U.S. Future Inflation Gauge, or FIG. Justin Martin's biography of the former Fed chairman noted that the FIG “would prove to be one of Greenspan’s favorite indicators.”

For those who think that today’s economy somewhat resembles that of the late 1990s, featuring growth without inflation, we'd point out that the FIG correctly anticipated that unusual combination of circumstances, which the Phillips curve failed to foresee. Rather than using the Phillips curve or extrapolating recent inflation data, the FIG leads cyclical upturns and downturns in inflation by tracking underlying cyclical inflation pressures -- in essence, the co-movement of cyclical leading indicators of inflation.

The news today is that even as the Fed and other central banks get more hawkish the FIG is starting to turn down, flagging a change in the direction of the inflation cycle that the Fed is likely to miss. Skeptical? The big run-up in the FIG a year ago was spot on in anticipating the reflation trade, at a time when inflation expectations were near multiyear lows. Inflation expectations then ran up with the reflation trade through the beginning of this year, falling in line behind the upswing in the FIG.

As for wage growth, most remain confused about why, even in a tight labor market, it is logical to see nominal wage growth rising when economic growth decelerates and falling when it accelerates. Moreover, wage inflation typically lags CPI inflation at cyclical troughs and leads it at peaks. Given that dynamic and the ongoing cyclical downswing in the FIG, wage inflation is unlikely to see significant gains in the months ahead.

The FIG is clearly signaling a fresh cyclical downswing in inflation, which is being obscured, ironically, by the dip in inflation due to so-called idiosyncratic factors such as the declines in wireless phone service and prescription drug prices. The bottom line: Regardless of the decline in the jobless rate, the inflation cycle is turning down.

Yet another outbreak of foodborne illness last week at Chipotle Mexican Grill did what such outbreaks usually do to the burrito chain: The stock price plummeted. It's bad news—particularly for the patrons who got sick—but it's a boon for anyone who had the foresight to short the stock.

The latest outbreak was first noted by iwaspoisoned.com, a website that crowdsources reports of customer illnesses following visits to restaurants. The goal, it says, is "safer food, safer communities and a healthier economy." Yet, as Bloomberg reported last week, hedge funds looking to profit from others' bad luck can also access a "souped up" version of the site for a $5,000 monthly fee.

Aaron Allen, principal at Aaron Allen & Associates, a restaurant industry consultancy, posited in a LinkedIn post on Monday morning that the Chipotle illness might not just be a matter of luck. "A lot of things stacked up that made it suspicious," he told Bloomberg in an interview on Monday, "and when you look at it from a statistical point of view, even more suspicious." His group has no financial interest in the chain, Allen said, and he has previously lauded the chain's pre-scandal marketing.

He's not the first to publicly speculate about the possibility that corporate sabotage is behind Chipotle's woes. (There was even a recent plot line on Showtime's "Billions" in which a hedge fund manager contaminated the customers of a fictional company, Ice Juice, whose stock he had just shorted.) A similar theory—which was largely dismissed—circulated two years ago when the chain was hit by outbreaks of E. coli, norovirus, and salmonella. The stock price plummeted at the time, costing the company billions of dollars in market capitalization.

While Allen can't prove his theories about Chipotle, he argues that corporate sabotage of a similar nature has happened in the past. A woman planted a severed finger in her Wendy's chili more than a decade ago. In the 1980s, Tylenol capsules were purposely tainted with potassium cyanide, leading to the deaths of seven people in the Chicago area.

Plus, there was a lot of money on the table for Chipotle short sellers—as much as $459 million, according to Allen's calculations—before there were any inklings of a food safety problem. “Chipotle short-sellers saw their ambitions rewarded with $55 million in less than one day,” he wrote of the latest scare.

Allen said his team, which includes a statistician and food safety experts, found a number of "statistical anomalies." First, he wrote in his post, the time of year raises questions; while 60 percent of food safety outbreaks occur from December to May, Chipotle's happened from August to December. Second, Chipotle experienced four times the number of norovirus outbreaks a chain of its size would be expected to have, and that's not even counting the E. coli and salmonella. And, he notes, significantly more people got sick from each of the outbreaks than normally do from the same pathogens. The average norovirus outbreak causes about 18 illnesses, while salmonella usually leads to about 25. At Chipotle, more than 200 people were sickened from a single August 2015 norovirus outbreak, and 64 fell ill from the same month's salmonella problem.
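Allen's "more people got sick than normal" point can be made concrete with a back-of-envelope calculation. The sketch below assumes outbreak case counts are roughly Poisson-distributed around the cited average, which is a simplification (real outbreak sizes are far more dispersed), so treat it as illustration of why the statistician flagged the numbers, not as evidence of anything:

```python
import math

def poisson_log10_pmf(k, mu):
    # log10 P(X = k) for a Poisson(mu) count, computed in log space
    # via lgamma so large k does not overflow the factorial.
    return (-mu + k * math.log(mu) - math.lgamma(k + 1)) / math.log(10)

# Figures from the article: a typical norovirus outbreak sickens ~18 people;
# Chipotle's August 2015 norovirus outbreak sickened more than 200.
print(poisson_log10_pmf(200, 18))  # hugely negative: vanishingly unlikely
print(poisson_log10_pmf(18, 18))   # near the mean, quite ordinary
```

Even allowing for heavy overdispersion in real outbreak data, the gap between 18 and 200 is exactly the kind of anomaly a statistician would flag; it does not, of course, distinguish sabotage from an unusually contaminated supply chain.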

"We're not saying this as a definitive," he said. "But if you were a short seller and you were looking for where there would be the most financial gain in the restaurant industry, the best way is a food safety scare, and the best stock would be Chipotle."

Chipotle did not respond to Allen's post but told Bloomberg that it was aware of the 2015 theories. "We ... did not see any evidence to support them," spokesman Chris Arnold said in an email. He also pointed out that the company implemented food safety enhancements after those incidents.

"I can tell you unequivocally that none of the 2015 outbreaks were caused by some terrorist or criminal acts," said Bill Marler, a leading food safety attorney who has litigated foodborne illness claims for decades and who represented clients in each of that year's Chipotle outbreaks. "It's conspiratorial nuts."

Marler examined the fact patterns in each case and believes they were all caused by the usual suspects: sick employees, failure to pay attention to detail and ready-to-eat food showing up already tainted.

He allows that a conspiracy is theoretically possible. "But I don't believe in aliens, or that humans walked the earth with dinosaurs, either," he added.

Allen said he's just raising a question about a statistical anomaly, not accusing anyone of purposely poisoning burrito lovers. "It really is one of those situations, like who would put a finger in the chili?"