After 9/11, the bodies at Ground Zero were made heroic; the immigrant bodies that cleaned them up, less so

If money comes into the world with a congenital blood-stain on one cheek, capital comes dripping from head to toe, from every pore, with blood and dirt.

—Karl Marx, Capital, Volume 1

“We have still not had a death,” he said. “A person does not belong to a place until there is someone dead under the ground.”

—Gabriel García Márquez, One Hundred Years of Solitude

We can tell the story in an easy calculus. The South Tower of the World Trade Center once stood 1,362 feet tall. At 9:03 a.m. on the morning of September 11, 2001, it was struck by a Boeing 767 flying at 590 miles an hour and burning 9,100 gallons of jet fuel. The quarter-mile-long tower burned at 1,800°F until 9:59 a.m., when it collapsed, sending 500,000 tons of material to the ground at 120 miles per hour and instantly killing 600 people. The North Tower, roughly its equal in size and weight, collapsed 29 minutes later, killing some 1,400 people. Before “Ground Zero” was ever invoked as an election-season rhetorical salvo, before it was ever emblazoned on a special-edition enamel plate anywhere near an eagle, and before it inspired George W. Bush to call it, in his diary, the 21st century’s Pearl Harbor, it constituted 16 acres of a “working fire” that burned 50 meters deep, a designation that gave immediate jurisdiction of the site to the FDNY. The firemen first brought in blueprints and floor plans, rushing to locations where they believed elevators and stairwells would have collapsed with the people they carried. Next, they introduced a map that used global positioning technology to plot patterns among the locations where bodies, or parts of bodies, were being found. There were few survivors in the rubble—only 11, in fact—and it soon became clear that the mapping technology would instead be used to locate the dead.

The fire was mean and hard to extinguish. It burned long and deep, flaring when exposed to oxygen and fueled by tons of highly conductive paper and furniture soaked in jet fuel. On September 16, NASA sent an airplane over Ground Zero to gather infrared data that the U.S. Geological Survey made into a thermal heat map, one that showed patches of rubble burning at temperatures above 1,292°F, hotter than the melting point of aluminum. New Scientist dubbed it “the longest-burning structural fire in history.” Heavy rains fell all day on the 14th and on the night of the 20th, into the dawn of the 21st. A total of four million gallons of water soaked through the debris and pooled at the World Trade Center’s “bathtub,” the 60-foot-deep rectangular foundation on which the towers stood. Officials worried the foundation’s weak walls could give way, letting water from the Hudson River seep into the PATH and subway tunnels and effectively flooding the city’s underground.

The fires burned for 100 days; alternatively, the fires were allowed to burn for 100 days. Officials had not stopped calling Ground Zero a “rescue operation” and they wanted to communicate that it was still possible to find survivors, something they could not do while simultaneously submerging all 16 acres in water and flame retardants. After the 100th day, the last flame was extinguished. It took Rudy Giuliani to stoke it back to life. At a New York City gala, he reminded listeners of the three firemen who raised the American flag atop the rubble in the now-famous photograph: “They were standing on top of a cauldron. They were standing on top of fires 2,000 degrees that raged for a hundred days. And they put their lives at risk raising that flag. They put the flag up to say, You can’t beat us, because we’re Americans. And we don’t say this with arrogance or in a militaristic way, but in a spiritual way: Our ideas are better than yours.” American exceptionalism was thus reified with the possibility of burnt citizen flesh.

Rescue workers called the 16 acres of debris on Ground Zero “the Pile.” The powdered debris in the Pile contained more than 150 compounds and elements, including plaster, talc, synthetic foam, glass, paint chips, charred wood, slag wool, 200,000 pounds of lead from 50,000 computers, gold and mercury from 500,000 fluorescent lights, 2,000 tons of asbestos, and 91,000 liters of jet fuel. The nearly 3,000 human beings who died made up such a minuscule part of the debris that the odds of finding identifiable remains among this city of dust were less than 1 in a quadrillion. Before any tests could be conducted on human remains, however, scientists worried about getting institutional review board approval for the use of human samples in a scientific investigation.

On September 26, 15 days after the attack, the National Institute of Environmental Health Sciences sent researchers an email issuing a strict definition of “human subject.” According to the board, a human subject is someone for whom “private information must be individually identifiable,” and “individuals whose remains ‘could’ be in the dust did not qualify as human subjects.” While the NIEHS framed its response as permission to use the bodily remains in scientific inquiry, its words carry a grave discursive weight. Defining personhood according to a human being’s ability to be personally identified implies that naming is a category of political recognition based on documentation, given that parts of the corporal palimpsest must match information about the body in the state’s archive—paper matched with paper. Individuals who leave bodily traces but who do not officially “exist” in state archives—because their livelihoods depend upon clandestine existence—cannot easily be matched to individually identifiable private information. According to the NIEHS’s language, these persons would be excluded from the category of “human subject.”

• • •

Fresh Kills, once the largest landfill in the world, is 10 miles south of Manhattan. A makeshift forensic laboratory city was set up on the landfill for nearly a year to sort through 1.6 million tons of debris. At the project’s peak, workers went through 7,000 tons of it in a single day. Recovery workers did find whole bodies, but they found many more parts. Although skin fragments and hair abounded, these were poor materials with which to identify victims: towers that had welcomed visitors and employees for over three decades had accumulated so much human hair and so many dead skin cells in rugs, chairs, elevators, bathrooms, lobbies, and every hard surface that researchers would have been confronted with a seemingly infinite number of false-positive identifications. Inside refrigerated trailers, cadavers were stored in body bags and tissue and bones were kept in red bins. Workers divided the bones among themselves. After washing the bones, they used small blenders called mills to grind them into a coarse powder that one worker remembers smelling like burnt steak. After they were done, they washed the mills with bleach and alcohol, a long and difficult process abrasive to the workers’ hands.

Legal scholars have called DNA an “alternative source of corporeality” whose personally identifying potential can be used in courts of law as a facilitator of the Western jurisprudential philosophy of habeas corpus, meaning “you have the body [to show].” When a DNA sample leads to the naming of a full human being—first name, last name, ethnicity, Social Security number—it means a body has been begotten “through translations, disassembly and reassembly, and conversion between states, properties and machines.” This follows Elizabeth Grosz’s definition of a body as a “concrete, material, animate organization of flesh, nerves, muscles, and skeletal structure which are given unity, cohesiveness, and organization only through their physical and social inscription as the surface and raw materials of an integrated and cohesive totality.” Whole parts like tissue and sinew and limb or microscopic parts like chromosome links become a “body” through translation and inscription—writerly motions.

Paul Lioy, an expert on toxin exposure who has studied the WTC dust, has observed that the debris “contained materials that were used to make the things that represent the raw material and by-products that defined our civilization.” This doesn’t just mean company letterhead and swivel chairs but also poisons. Asbestos and lead have long been part of the inner city’s ecological landscape, cheaply lining walls, pipes, and ceilings while leaving trace deposits of their dry powders in the blood and urine of inner city children. The EPA did not want to test for those particular substances in the blood of sick New Yorkers in the years following 9/11 because doing so would “lead down an endless rabbit hole costing billions and lasting years,” meaning they would have had to find a way to “distinguish trade center material from common urban dust.” Normalizing the bestiary of noxious powders implied in “common urban dust” ignores the many generations of inner city children—disproportionately African-American and Latino/a—who learn to prime asthma inhalers before they learn to sign their names. The pervasive expectation that some lungs and bloodstreams could be born or made impervious to poisonous dusts is evident in a study of WTC search-and-rescue dogs. The dogs, described as “canine equivalents of canaries in a coal mine” because they could show signs of asbestosis and mesothelioma far faster than human organisms could, had their chests x-rayed. The x-rays came back normal, but the study was criticized for excluding “many of the rescue dogs that were brought to New York from other parts of the country,” meaning that New York dogs must have been so accustomed to quietly menacing substances in the air that their organisms weren’t trustworthy.

The federally funded New York City cleanup program lasted two years and cost $37.9 million, $30.4 million of which went to contractors and subcontractors. Rosa Bramble Weed, a social worker and psychologist who has met with undocumented cleanup workers exhibiting signs of post-traumatic stress, explains that undocumented workers received calls from “a very underground kind of network of people who are undocumented and need work. They called at night. They said, tomorrow there is work, come work.” Port Authority management contacted major contracting firms, who sent the projects to subcontractors; by the time the jobs trickled down to cleanup workers, the contractors held hundred-thousand-dollar contracts while the immigrants were routinely paid $60 for 12-plus-hour days. Vans drove up and down Roosevelt Avenue in Jackson Heights and through Nassau and Suffolk Counties on Long Island, looking for day laborers to bring to Ground Zero. The Polish and Latino construction workers—all men—arrived first, but the women were the last to leave. Undocumented women had cleaned Lower Manhattan residential spaces and offices for years and knew people who would call them up when there was work. And there was work.

Jaime—whose surname I have omitted from this essay—is a Colombian immigrant who used to work as a nighttime security guard near the World Trade Center and helped with cleanup efforts from day one. When he first arrived at the site, “it looked like a Western, like a desert,” he told me. “Everything was dust-water and there was no light.” Jaime walked into building basements flooded by water and chemicals. He had no protective gear, so he tied plastic bags around his ankles and waded through the waters. “The city was great at picking up the garbage,” he remembers. He was instructed to discard any debris in black trash bags and throw paper in clear bags—the black bags were for waste. The dust was the hardest to clean because it blinded him and stuck to his clothing, especially when he got wet. The wind took care of loose paper. Police arrived on the scene early and guarded the site fiercely. People were asked to show ID cards to enter the scene, but cleanup workers just needed to prove they were there to clean. At first, tourists applauded and took pictures of them. Then, people started yelling, “Leave! Leave!” The women from Rosa’s group that I spoke to all know other women, hired by subcontracting firms, who were paid with checks that bounced. They are sick. They are also broke, uninsured, and, when I spoke to them, they were all hoping they would somehow qualify for “la Zadroga,” a federal bill that would help cover their healthcare costs. What they really want, though, is a green card. Jaime penned an open letter to President Barack Obama on behalf of the group asking him to grant documentation to immigrants who could prove they had worked at Ground Zero.

The man in charge of evaluating applications for the September 11th Victim Compensation Fund was Kenneth R. Feinberg, a lawyer who had successfully settled the Agent Orange lawsuits and “was given the broadest prerogative to decide, in essence, how much each life was worth.” The Fund had unlimited funds but a clear end date. In 2003, it stopped accepting applications, even though some individuals were only beginning to exhibit symptoms. Workers who did not make the deadline fell back on unsubsidized treatment in city hospitals or workers’ compensation payments. Treatment options were slim for undocumented immigrants. Dr. Charles Hirsch, New York City’s Chief Medical Examiner, became “the gatekeeper to the official list” of WTC victims. He looked at each death and decided whether to legally consider it a murder. The final list contained 2,749 names, deaths that had been ruled homicides; those 2,749 names would be inscribed on the memorial.

• • •

On September 11, 2002, New York State Governor George Pataki recited the Gettysburg Address. In his last speech as mayor, Rudy Giuliani also quoted from it. The space of Ground Zero was almost immediately narrativized as a battleground, with those who died that day discussed as heroes, “the fallen.” Ground Zero tells the story of blood and Fresh Kills tells the story of bone. The dust, just as it was beginning to settle, materialized this narrative through coloring. The World Trade Center dust had, as Anthony DePalma has written, “an odd pinkish tint, a blush that was as curious as it was repulsive because it suggested blood and human remains. It was probably caused by some chemical reaction, and it did not last long. Eventually, the dust took on the more neutral color of dry bone.”

Blood and bone have a longstanding symbolism in national narratives of kinship and trauma. Think, for instance, of the Civil War. During the bloodiest war ever fought on American soil, Confederate soldiers disinterred buried Union soldiers to steal their skulls and use the skeletal remains as trophies. Union soldiers often mimicked these acts in retaliation. The 19th century saw the emergence of a black market in cadavers: because bodies could be sold to medical schools and amateur dissectors, affluent Americans began burying their dead in well-secured tombs. The poor, chief among them immigrants and African Americans, were obviously just as fierce in their desire to protect their dead, but lacked the resources to do so. They were thus relegated to, as Simon Harrison has explained, “burying their dead under a layer of straw to deter digging, or posting armed guards in the cemetery for several nights after a funeral.” Bone stealing arose during the Civil War because there was great political value in “representing Southerners and Northerners as two different peoples.” Among American soldiers, blood spilt over the same land was less of a catastrophic blow to imagined notions of kinship than was the theft itself.

The wars that broke out on Fresh Kills over the fate of the pulverized bones whose extracted DNA identified individual victims were divisive and individual—they were about private grief and families’ rights to mourn their own dead. To the families, the mass burial of thousands of people’s bones under equalizing and anonymizing layers of dirt was obscene, and the co-existence of victim bone with terrorist bone unfathomably disrespectful. Blood, however, had the opposite effect. It was used to collectivize trauma to an exaggerated, though politically advantageous, degree. When researchers began to monitor the health of New Yorkers who had worked on Ground Zero and New Yorkers who had not, they first tested firefighters because of their proximity to the toxic dust and debris. The control group used to compare the level of toxins in the blood of potentially sick firefighters was made up of firefighters who did not work at Ground Zero. The blood of the firefighters who had not been at Ground Zero was also the control group used when testing the blood of other New Yorkers, i.e. non-firefighters. The blood of venerated heroes was the gold standard for all residents of a city in the throes of collective post-trauma. When the last traces of debris were removed from Ground Zero, the bedrock foundation lay naked like bone without sinew.

It is said, by people who would know, that at its peak, Colombia’s infamous Medellín drug cartel was spending $2,500 a month on rubber bands to wrap around bricks of cash. The arithmetic of human excess begins to acquire mythic status when money becomes nearly impossible to count and we are left to communicate chiefly through estimates and legends, like the one in which Pablo Escobar burned $2 million in cash to keep his daughter warm when they were on the run and she got cold. During Colombia’s dark and bloody 1980s, the cartels’ pecuniary abundance was not the only thing of legendary proportions. Death, too, became grimly innumerable—and at the intersection of cartel, guerrilla, and paramilitary violence was the question of how to respond to its ubiquity.

For communities that have been ravaged by violent deaths, the dignity of a burial and the indignity of a mass grave co-exist as parallel possibilities that seem arbitrarily assigned to victims. Family members of the so-called disappeared in countries like Chile, Argentina, and Colombia have fought for the identification and return of their loved ones, in some cases demanding exhumation of mass graves in order to bring what remains of the dead bodies back home to bury them according to local funerary practices.

Human grief and corresponding funerary traditions are sophisticated—we invented the Kübler-Ross model and can recite the five stages of grief by rote—but the existence of rituals around dead or dying bodies is not unique to our species. We are in the company of elephants, chimpanzees, mole rats, even honeybees. These animals have complex and varied ways of reacting to the deaths of familiars—covering corpses with sticks and leaves, moving a dead insect away from the hive to deposit its body elsewhere—and the communities of the living remain ordered. If dead bodies can be seen to represent, to borrow a term from anthropologist Mary Douglas, “matter out of place,” this suggests that there are spaces in which this matter, now dead, is meant to lie.

Since the 1950s, several funeral homes in Colombia had been offering funerary plans you could purchase in a state of prenecesidad, or “pre-necessity.” But this particular practice really got off the ground in the drug-torn 1980s with four funeral homes: Cristo Rey in Bogotá, Funerales del Valle in Cali, Funerales y Capillas la Aurora in Bogotá, and Funerales la Esperanza in Medellín. The locations of these funeral homes are no coincidence. Bogotá, Medellín, and Cali had been seeing alarming numbers of civilian deaths due to the narco-conflict. And so it began to look like a market.

The plans are called previsión, something like “foresight” or “precaution.” At the national level, one out of two Colombians has opted into such a program. Coverage hovers at about 50 percent in Barranquilla and Cali, and 65 percent in Bogotá, the capital. One of the highest coverage rates, unsurprisingly, is 80 percent in Medellín, home to the Medellín cartel, once the epicenter of the global narco-enterprise.

The need for pre-necessity as a result of drug-war violence brings us to the United States, to the Jackson Heights, Queens, travel agency owned by a large, warm man named Orlando Tobón, sometimes called “the mayor of Little Colombia.” Tobón moved to the U.S. from Colombia some 40 years ago, when he was 21. He started working as soon as he landed, first as a dishwasher at a Jackson Heights restaurant, eventually earning an accounting degree from a community college, then establishing a travel agency, Orlando Travel.

It was at Orlando Travel that Tobón oversaw his first corpse repatriation. His neighbor died in a car accident, survived by a sister who was newly arrived in the country and spoke little English. Tobón accompanied her to the morgue. Once there, he was struck by the sight of unidentified corpses. He was told that they were Colombians who had died transporting drugs and that nobody had stepped up to identify or claim them. They had not gone by their real names, making it difficult to accurately identify them. And so they lay in the morgue. Matter out of place.

Since then, Tobón has organized the repatriation of more than 400 Colombians, many of them young women. They are known as mulas, or drug mules, so called because they are paid to carry drugs inside their bodies, in the cut-off fingers of latex surgical gloves or in condoms packed tightly with cocaine. Chloraseptic numbs their throats so they can swallow the drug pellets whole—they practice first on large grapes to suppress the gag reflex—or they insert the pellets into their vaginas. Once they land, they’re taken to a guarded location and given laxatives to excrete the contraband. They don’t always make it this far. Sometimes the pellets explode inside their bodies and the women die immediately, painfully. Other times, they cannot excrete the drugs and they die in other, violent ways.

Law enforcement in both the U.S. and Colombia proved unhelpful in returning the young women’s bodies home, so Tobón decided to take care of things himself. He went around to nightclubs, churches, restaurants, local businesses, asking people in Jackson Heights to give a few dollars toward sending the girls’ bodies back to Colombia. Sometimes, when he was unable to locate living relatives or when it became clear there would not be a funeral, Tobón brought in a Jackson Heights priest to pray over the bodies. He would attend the service alone.

Tobón is in his 60s now and thinking about retirement. He shares his office with a young man named Mauricio Palacios, the U.S. director of Previsión Exequial, a for-profit company that offers Colombians in Colombia and in New York bicontinental funerary insurance. It has Tobón’s blessing. In Colombia, Palacios worked as a journalist and a consultant in political marketing. He is a gifted speaker. According to Tobón, he discovered Palacios after the journalist’s reporting got him in trouble and angered the wrong people. Palacios’s editor—a friend of Tobón’s—offered his name as a candidate to head Previsión.

Repatriation, in the hands of Tobón, was about responding to emergencies. It was about charitable giving and pre-emptive reciprocity. It was about individuals. Repatriation, in the hands of Palacios, is about planning for tragedy. It is about entire families. It is also for-profit.

Dying in the U.S. is expensive and confusing, especially for vulnerable populations like non-English speakers and undocumented immigrants. Previsión handles every step of the funerary repatriation process, from U.S. permits, which vary state by state, to death certificates, coroners’ reports, health and sanitation licenses, and dealings with the airlines. The entire process takes about four days, from first contact with Previsión to a final landing in a Colombian city.

A single repatriation from the U.S. to Colombia costs from $10,000 to $15,000. Previsión’s insurance rates start at $4.12 per person per month. There are two kinds of packages: $21.99 a month for four people, and $32 for eight. Although Previsión primarily works with New York-based immigrants, at least one family member still living in Colombia is included in its plans. Since 2005, more than 14,000 people have signed up. More than 330 have already been repatriated.

In a flier from 2010, a folk singer holds up his membership card. Shooting stars in red, blue, and yellow—colors of the Colombian flag—float by his head, surrounding the numbers “1810–2010.” “It’s been 200 years of independence, passion, and love for Colombia,” the flier reads, “and for the past 5 years, we Colombians have been counting on funerary Previsión in the United States!”

The tie-in to the bicentennial is a savvy move, making the purchase of a pre-necessity plan a patriotic duty. Over the past 20 years, Colombia has launched initiatives to incorporate the Colombian diaspora into its national project: the right to dual citizenship (1991), the right for Colombians abroad to have their own representative in Congress (1991), the right to run for office as a representative from their home region (1997). Pre-necessity plans in Colombia were born of the senseless ubiquity of death in its drug-ravaged cities and the public’s need to secure an easy funeral and burial for their families.

The repatriation of corpses from the U.S. began as a simple one-man effort to send back the bodies of young women who bore witness, in death, to the failures of a government that did not protect them and instead dumped them in the hands of a citizen and his colecta. The men and women purchasing the plans stateside do so for many reasons, chief among them the desire to be buried in their homeland. Repatriations require extensive paperwork, and many of the clients acquire documentation from the U.S. at long last. The ongoing war on drugs provides a helpful lens through which to track the popularity and necessity of pre-necessity programs, which grant disenfranchised citizens—drug mules, undocumented families—political afterlives.

The popularity of botanicas points to the failures of the Catholic Church to properly provide for its own

Before I migrated to the United States, meaning shortly before I turned five years old, my father’s sister decided I had been given the evil eye and needed a cleansing. She took me to see a healer who ran a small botanica out of her home in Cotopaxi, Ecuador. My aunt was fervently Catholic at the time, but she walked into the botanica like it was a church, and looked on stoically as the healer rubbed a live guinea pig all over my body and spit alcohol into my face to ward off evil spirits.

Botanica means botany or botanist, so named for the medicinal herbs sold inside the shops that also stock oils, soaps, sprays, washes, statues, rosaries, amulets, books, and animal skulls. The shops vary in size and content, but they’re everywhere in immigrant neighborhoods. The Bronx, for example, is home to Original Products Co., reportedly the largest botanica in the Northeast. Formerly an A&P supermarket, it is the size of a warehouse but has the intimate feel of a small head shop. Upstairs, rent-free, is The Pagan Center, run by a lesbian Wiccan couple. Lady Rhea, a high priestess, and Lady Zoradia have their own small section in the store called the Magickal Realms Enchanted Candle shop. There are books (Cunningham’s Encyclopedia of Magical Herbs) and oils (Better Business Oil, Break Up Oil). I searched for something to do with long-distance relationships but found only the Mile and Distance Oil, which promises to keep my enemies far away. The powder version comes in one-ounce, half-pound and one-pound packages.

A portion of the store is dedicated to Santería, an oft-misunderstood religion that dates back centuries and has its roots among the Yoruba people in what is now Nigeria. For centuries, thousands of Yoruba were trafficked to the New World. When the Iberians forced them to convert to Catholicism, many found a way to continue practicing their rituals by combining them with Catholic customs. They assigned each Catholic saint a corresponding orisha, or natural spirit, and worshipped them as they normally would according to their customs. There are some 2 million practicing Santería followers today, 50,000 of them in the U.S. alone. Some have rejected the syncretic nature of Santería’s development, tracing and returning to its West African roots; others navigate a happy medium by belonging to the Catholic Church while still practicing Santería.

My aunt, who took me to the botanica in Cotopaxi, eventually joined an evangelical church after becoming disillusioned with her local church and began to deny my “healing” ever took place. Her new church cautioned against spiritism and false idols—guinea pigs included. But although she stopped going to the shops, some remnants of her formerly syncretic Catholicism lingered—rosaries, holy water, red string bracelets to protect us against the evil eye.

Mainstream Catholicism does not openly condone Santería but typically adopts a laissez-faire attitude toward the varied influences that have seeped into local practices of Catholicism in Latin American countries, especially ones with large Afro-Caribbean populations. Protestant groups, even evangelical ones, are much more aggressive in their denouncement of idolatry (the statues, the candles, the prayers) and the occult, so they openly frown upon places like botanicas. But like my aunt, who kept both the holy water and the red thread bracelets, many newly converted evangelicals keep vestiges of their past syncretic worship. Some continue to frequent the botanicas even as they continue to worship in their disapproving churches. The historical syncretism of Latin American Catholicism as it is practiced by millions carves out a space of casual permissibility, if not outright approval, of churchgoers’ extracurricular spiritual activities. Frequenting botanicas is a complementary aspect of many people’s spirituality, not a replacement for it.

Catholic converts bring this worshipping framework with them to their new evangelical churches: 68 percent of Latinos in the U.S. identify as Roman Catholic, but many are leaving the church in favor of evangelical faiths, and four-fifths of practicing Latino evangelicals are former Catholics. There are doctrinal differences that explain the conversions, but driving the shift is also a broader socioeconomic issue: Evangelical churches provide a social safety net that is especially attractive to immigrants who lack the support systems and networks they had in their home countries. And botanicas, too, serve as spaces of support and community for immigrants who would otherwise have limited social networks to rely on when faced with unemployment or the need to find the best dance hall in which to throw a baby shower.

Botanicas thus recreate and satisfy the same needs as the new evangelical churches, many of which pop up in storefronts and former factories, offering charismatic pastors, services that encourage audience participation, song and dance, books and pamphlets, preaching and canvassing, movies, concerts. Because of immigration, a third of Catholics in the U.S. are now Latino, a coalition that brings with it the syncretic nature of Afro-Caribbean and Mesoamerican homeland practices, convergences that have found a welcoming space outside churches and inside the botanicas of their new home.

It is difficult to know how many botanicas there are in the U.S. because they are commonly registered as religious or herbal stores and are thus not subject to the strict regulations that would apply should they be registered as anything clinical. Some botanicas have healers or Santería priests on-site, and others refer clients to healers outside the practice. But since botanicas are not registered as medical practices, they toe a careful line on what healers can and cannot promise or sell, lest they face the wrath of medical-licensing laws. The potential danger of botanicas becoming the only or primary health care source for Latinos in the U.S. is very real. Latino unemployment is two points above the national average, and a quarter of the 50 million Latinos in the U.S. currently live in poverty. Historically poor access to health care has left many Latinos, among them undocumented immigrants, in dire straits, and many resort to botanicas for serious medical treatment.

Undocumented immigrants are routinely villainized in the U.S. as burdens on the national healthcare system because it is assumed that they cannot pay for their care. Studies have found, however, that they are actually less likely to seek out health-care professionals than their documented counterparts or other Americans. A small recent study found that while some immigrants used the botanica as a complementary treatment alongside biomedical care, others—33 percent of the study sample—were using it as a primary source of health care. They weren’t only seeking care for folk illnesses either: 71 percent of study participants sought care for somatic ailments like asthma, cough, digestive problems, swollen legs, and nervous system problems. Typical obstacles to obtaining health care for Latinos, such as language, transportation, legal status, lack of insurance, and inconvenient operating hours, are overcome in the botanicas. People can receive treatment in Spanish, the shops are open later and on weekends, health insurance doesn’t matter, and legal status is immaterial.

But botanicas are not merely independent shops that sell good luck charms and love potion oils; they are also places through which to access other goods and services, like medical information and referrals, that might not otherwise reach clients. In Lawrence, Massachusetts, botanica owners were paid to take an HIV/AIDS workshop where they learned basic information about protection, transmission, and testing. After the workshop, one store owner’s referrals accounted for nine of the 53 high-risk individuals tested within three months of the initial training—a result deemed a success.

The reliance on botanicas for medical treatment can be dangerous. A lag in diagnosing and treating communicable diseases may lead to an outbreak, a delayed visit to an emergency room can be fatal, and the unsupervised dispensation of antibiotics can lead to unusually resistant strains of bacteria. Powders used in the treatment of infant colic have been found to contain high quantities of lead. Studies have shown that newly arrived Latin American immigrants slowly develop the same kinds of illnesses endemic to the U.S., such as obesity, diabetes, asthma, and high cholesterol, all conditions meriting medical treatment, yet one in six undocumented male immigrants has never seen a doctor.

This will only get worse under the Affordable Care Act, which calls for U.S. citizens and permanent residents to purchase insurance while excluding undocumented immigrants from non-emergency programs. The repercussions could be devastating to immigrant communities that already have a fraught relationship with healthcare. Further alienation from conventional medical coverage may push Latino immigrants to rely more on the alternative medicine offered in places like the botanica, which serves a segment of the population that typical biomedical channels cannot easily reach because of patients’ fear, positions of precarity, and lack of awareness.

The popularity of botanicas today thus points to the failures of the church to properly provide for its own, highlighting its postcolonial fracture, the failed missions of conversion, and its inability to keep believers from converting and leaving. It also points to neoliberalism’s inability to care for its most vulnerable populations.

For many disenfranchised groups, the botanica serves as safety net and important social network, a destination and a portal for services offered both in this world and the next, and above all a space of casual dissent, where religious men and women openly and publicly reject the counsel from the pulpit that warns against the occult and Santería. The space of dissent embodies these significant protests against church and state while still making room for the occasional love potion or enemy dispeller—a sense of safety, home, and God, dispensed in powder form, in one-ounce, half-pound and one-pound quantities.

Despite the bleak imaginary landscape of food deserts, urban nutritional politics is all about color

December solstice announces the beginning of the ghetto’s wintertime Rorschach foot game. Is the dark red slush on the pavement frost and blood, or spilled food coloring? If the latter, sidestep. If the former, sidestep coolly. The key is to look bored by whatever the red is, and if someone guesses wrong, to look bored. The red slug trail starts at the very top of Myrtle Avenue in Queens and ends in Brooklyn. If you keep following the crimson-red drops down Myrtle and onto Knickerbocker Avenue, you’ll enter an urban foodscape that is, by the Obama Administration’s definition, not a food desert.

No, this is not a food desert. The streets are lined with McDonald’s, Wendy’s, White Castles, and Taco Bells, separated by some storefronts that pawn gold and others that will dip just about everything into day-old oil, that wet brass-gold. There is a bodega on just about every corner. It is within walking distance of a large supermarket adorned with carnival flags waving above old men in bright polyester parkas rolling vanilla-flavored cigars in the parking lot. This is not a food desert. It is not “a low-income census tract where a substantial number or share of residents has low access to a supermarket or large grocery store.” Myrtle Avenue is a food biome the White House has left unnamed.

Popular discourse has left the concept of a “food desert” suspended in a place of no context and incomplete metaphor. Unlike literal biomes, which fall along a geographic spectrum of wet to dry or barren to fertile, the food desert stands alone. When we think about biomes, we think in terms of inverses (hot is the opposite of cold; wet is the opposite of dry), and that’s where it gets strange. What is the opposite of a food desert? A food oasis? A food forest? A food lake?

If we follow the industrial ontology of deserts, we can think of Myrtle Avenue as a nutritional timberline. At the timberline, the forest has dried up and the desert begins—with it, the heavy, hopeless permanence that thickens the air “just past that moment when the desert has become the only reality,” as Joan Didion once described the Sonoran Desert.

Of course, our tour through the biomes doesn’t mean a whole lot until you understand neoliberal magical realism, and the way it presses itself onto bodies. Think about Ronald Reagan’s Secretary of Agriculture pushing to have ketchup and relish labeled as vegetables in order to justify cutting funds for school lunches. Or think about our Congress’s recent passage of an agricultural appropriations bill that would make it easier to count tomato paste (in pizzas, basically) in school lunches as a vegetable. Gabriel García Márquez describes his imaginary land of Macondo as a place “so recent that many things lacked names, and in order to indicate them it was necessary to point.” Enter a supermarket in a working-class neighborhood and you’ll encounter the same. Cool Whip™, Kool-Aid™, Tang™, and ten-for-a-buck neon drinks do not look like anything else because they aren’t quite real. They’re all a product of modernity’s magic—what is Splenda if not some kind of weird zero-cal devil alchemy?—and though the new foods that it offers are always recent, they don’t lack names. We simply can’t pronounce them.

The anthropologist Sidney Mintz has proposed that “what we like, what we eat, how we eat it, and how we feel about it are phenomenologically interrelated matters; together, they speak eloquently to the question of how we perceive ourselves in relation to others.” The issues at the core of delicate food debates haven’t ever just been about calories and nutrients and tax dollars. The way the American diet is publicly discussed reveals an ugly underbelly of American cultural practices that we desperately try to cover up through coded rhetorical pyrotechnics. The focus on landscape as culprit—this bogeyman “food desert”—is a diversionary tactic. The phenomenological, political, and uncomfortable crux in the popular food debate is color.

Note the code here: during this year’s Republican presidential primaries, Newt Gingrich publicly denounced President Barack Obama as “the most successful food stamp president in American history,” a racially charged declaration against a black man whose single white mother was on food stamps during parts of his childhood. The other Republican contenders, including Mitt Romney, scandalized their bases with denunciations of welfare and food stamps that painted people of color as fat and lazy freeloaders, despite statistics revealing that 70 percent of food stamp benefits go to white people.

Or here: In 2000, Stanford University ran a weight loss program for Mexican American families that categorized foods as good or bad using the traffic light colors of red (no), green (yes), and yellow (careful). Mexican bread was red, so one family moved on to cereal and skim milk for breakfast. They were dismayed that some foods they thought were healthy were in fact “red” foods: vegetable oil, flavored yogurt, peanut butter. One Mrs. Rodriguez praised the program: “My children, I don’t have to be saying, ‘Don’t eat more ice cream.’ They don’t want to eat more ice cream. They don’t want red lights.”

Red is bad. Red is stop, and these bad Mexican-American mothers need to know to stop and learn how to mother. But red is also Red 40, a good dye to which American bodies have been systematically and corporationally acclimating for decades. Although scientists, behavioral psychologists, and parents have for years been bringing attention to a link between diets rich in artificial food colorings and hyperactivity in children prone to attention deficit disorders, it took until 2007, when The Lancet published a study suggesting the dyes had a significant behavioral impact on healthy children, for the FDA to begin to listen. In 2011, the FDA proposed mandatory labeling of foods containing artificial colorants. It also proposed comparing food coloring-related risks to peanut allergies, arguing that some people are simply more susceptible to those allergies—a susceptibility, the argument went, that has nothing to do with “any inherent neurotoxic properties” of the dyes. Just don’t eat peanuts, right?

Not everyone has that choice.

The history of food dyes reveals an undeniably aggressive genealogy of color and violence. In 1856, British scientist William Perkin made the first known food dye, “aniline purple,” using coal tar, an ingredient that’d eventually be swapped out for petroleum because coal tar is a tad carcinogenic. In 1950, the FDA banned Orange 1 after too many kids got sick eating candy that Halloween. In 1976, the agency banned Red 2 (again, because of the pesky carcinogenic situation). And in 1986, Brooklyn stores were forced to pull Kool-Aid from their shelves because an anonymous man called in to admit he’d poisoned some boxes with cyanide.

So how does it feel to hear Kraft announce, as it did last year, that it would triple the advertising budget allocated to cornering the Latino market? Three years ago, the company created an internal group called the Latino Center of Excellence. Its first focus was Kraft Singles, thin squares of cheese, and it worked out wonderfully. Focus groups found that Latinos bought cheaper products on the whole but were willing to spend more money on the same product if they were told it was made using real milk, something the cheaper brands didn’t offer. Now, the focus is Kool-Aid.

In 2011, nearly all of Kool-Aid’s advertising budget went into the Latino market. Univision and Telemundo airwaves were inundated with commercials, designed by the advertising firm Ogilvy & Mather, that portrayed the warm family gatherings Latina mothers would respond to because they would be “really worried about how fast-paced the American way of life is today,” as a Kool-Aid senior brand manager told The New York Times. Kool-Aid sponsored a summertime family TV movie series on Telemundo, one of the two major basic network TV channels to which families without cable have access.

So, shouldn’t we just avoid the goddamn peanuts? We’ve arrived at the new timberline, the point where liberal narratives of agency and body acceptance and choice sometimes get it wrong. Even good things, like the fat positivity movement, when bled pale of color, can be a repackaged version of the same old thing, just like Kraft’s invisible Kool-Aid, their new “unsweetened” no-color flavor. The performance of a good, neoliberal subjectivity means consuming and consuming while remaining happy and thin, and the weight loss industrial complex has expertly cashed in on this fantastical, impossible imperative—in 2011, the weight loss industry in the United States made $61 billion. It’s one of the more brilliant conundrums of modern-day capitalism.

• • •

Chains like Whole Foods and Trader Joe’s don’t stock products that contain artificial food dyes. Whole Foods is like a low-budget cathedral—there’s a good-cheer spiritual geometry to the aisles, the stacked specials, even the checkout lines. But the best thing about it is the lighting. Everyone looks good, virtuous, clean. When the local supermarket is clean, it smells chloroformed. The lighting is one dead fluorescent bulb away from feeling like a DMV office.

The problem with the Obama administration’s discursive handling of mounting rates of obesity, type 2 diabetes, and heart attacks among Americans is that it has gotten away with sidestepping structural problems, like the corporate exploitation of America’s most vulnerable citizens, by blaming landscapes like food deserts and fat bodies. For example, the White House announced Wal-Mart’s commitment to Michelle Obama’s anti-childhood obesity campaign with promises to open as many as 300 stores in “food deserts” by 2016. The real food crisis in our country has at its heart complicated matters concerning gender, race, class, and education. Dropping down a Wal-Mart selling dozens of heads of lettuce in a poor neighborhood will not solve those problems. It will exacerbate them.

And so it happens that the most important predictor of obesity remains income level. Fast-food companies are dropping obscene amounts on advertising in low-income communities of color, and are targeting children. African-American kids see at least 50 percent more fast-food ads than do white children their age. A full 25 percent of all Spanish-language fast-food advertising in the U.S. is from McDonald’s, and the average Latino/a preschooler will see about 290 McDonald’s ads a year. In 2006, 9 percent of Upper East Side residents were obese, compared with 21 to 30 percent in East and Central Harlem and North and Central Brooklyn, two of the poorest stretches of land in New York. Five percent of Upper East Side residents had diabetes, compared with 10 to 15 percent in Harlem and Brooklyn. In these neighborhoods, between 1985 and 2000, the cost of fruits and vegetables increased by 40 percent while the price of junk food and soft drinks decreased by 15 and 25 percent respectively.

We remain at the timberline where liberal narratives of agency and body acceptance and choice sometimes get it wrong.

The fat acceptance movement has found an incredible home on the Internet, where activists use personal anecdotes, facts and figures, and rebloggable images to debunk myths equating thinness with good health and fat with laziness or bad health. They have mostly been right to point out that hatred toward fat bodies is coded as concern for the “cost” to the healthcare system, while smokers, heavy drinkers, drug users, and people with fast metabolisms and bad habits get off the hook. This is because in today’s America, thin, white, and male are still constructed as ideal default positionalities, and to be fat, black, brown, and/or woman is to require more imagination from our compatriots to believe we are human. Fat acceptance activists have been putting up a good fight, but their online biomes would benefit from a more intersectional approach to the spaces fat occupies—that is, it is important to acknowledge that poor black and brown bodies have very different experiences of fatness than their white counterparts do, and the nuance is often lost in Tumblr posts filled with decontextualized pictures of admittedly beautiful fat bodies.

Sometimes fat on a body is normal, and simplistic BMI readings, lady magazines, and everyone else should just go eat cake. But often, it’s not. If some communities have had to watch their members’ bodies be changed over centuries by the poisons forced upon them, defining “pride” and “love” as the only appropriate counterhegemonic affective reactions is an oppressive expectation. The literary scholar Elaine Scarry has written brilliantly about human pain, and one of her theories is that “the ceaseless, self-announcing signal of the body in pain, at once so empty and undifferentiated and so full of blaring adversity, contains not only the feeling my body hurts but the feeling my body hurts me.”

“My body hurts me”—the “me” in this statement is different from the “me” implied in “my body.” These four small words reveal a bifurcated knowledge of bodily pain that makes it all the more important to insist that neither fat without context nor bones without context be granted rhetorical personhood. Defining no-questions-asked fat pride as some sort of subversive affect is a privileged act of placing some affective experiences above others, and even further, of privileging some affective experiences above structural realities. For some, “fat” is a carnal inscription on the organ-and-bone mixed-media marvel we call the human body, and like all bodily inscriptions, it does not inherently possess a value, good or bad. This is why variations in corpulence have been regarded so differently across cultures and time periods. Other corporal inscriptions we know to be beautiful depending on context: the crescent-shaped birthmark on your girlfriend’s stomach, the lightning-shaped scar on Harry’s forehead. Others are sad: the calluses on my father’s hands, the makeup caked over the bruise on my neighbor’s cheek. Sometimes inscriptions are given to us by chromosomes or a tattoo artist’s needle, but sometimes they are forced upon our bodies by hegemonic forces. That inscription is more like branding; that inscription burns.