
In the 1950s and ’60s, a set of social psychological experiments seemed to show that human beings were easily manipulated by low and moderate amounts of peer pressure, even to the point of violence. It was a stunning research program designed in response to the horrors of the Holocaust, which required the active participation of so many people, and the findings seemed to suggest that what happened there was part of human nature.

What we know now, though, is that this research was undertaken at an unusually conformist time. Mothers were teaching their children to be obedient, loyal, and to have good manners. Conformity was a virtue and people generally sought to blend in with their peers. It wouldn’t last.

At the same time as the conformity experiments were happening, something that would help change how Americans thought about conformity was being cooked up: the psychedelic drug LSD.

Lysergic acid diethylamide was first synthesized in 1938 in the routine process of discovering new drugs for medical conditions. The first person to discover its psychedelic properties — its tendency to alter how we see and think — was the chemist who first synthesized it, Albert Hofmann. He ingested it accidentally, only to discover that it induced a “dreamlike state” in which he “perceived an uninterrupted stream of fantastic pictures, extraordinary shapes with intense, kaleidoscopic play of colors.”

By the 1950s, LSD was being administered to unwitting Americans in a secret, experimental mind control program conducted by the United States Central Intelligence Agency, one that would last 14 years and operate in over 80 locations. Eventually the fact of the secret program would leak out to the public, and so would LSD.

It was the 1960s and America was going through a countercultural revolution. The Civil Rights movement was challenging persistent racial inequality, the women’s and gay liberation movements were staking claims on equality for women and sexual minorities, the sexual revolution said no to social rules surrounding sexuality and, in the second decade of an intractable war in Vietnam, Americans were losing patience with the government. Obedience had gone out of style.

LSD was the perfect drug for the era. For its proponents, there was something about the experience of being on the drug that made the whole concept of conformity seem absurd. A new breed of thinker, the “psychedelic philosopher,” argued that LSD opened one’s mind and immediately revealed the world as it was, not the world as human beings invented it. It revealed, in other words, the social constructedness of culture.

In this sense, wrote the science studies scholar Ido Hartogsohn, LSD was truly “countercultural,” not only “in the sense of being peripheral or opposed to mainstream culture [but in] rejecting the whole concept of culture.” Culture, the philosophers claimed, shut down our imagination and psychedelics were the cure. “Our normal word-conditioned consciousness,” wrote one proponent, “creates a universe of sharp distinctions, black and white, this and that, me and you and it.” But on acid, he explained, all of these rules fell away. We didn’t have to be trapped in a conformist bubble. We could be free.

The cultural influence of the psychedelic experience, in the context of radical social movements, is hard to overstate. It shaped the era’s music, art, and fashion, giving us tie-dye and the Grateful Dead.

The idea that we shouldn’t be held down by cultural constrictions — that we should be able to live life as an individual as we choose — changed America.

By the 1980s, mothers were no longer teaching their children to be obedient, loyal, and to have good manners. Instead, they taught them independence and the importance of finding one’s own way. For decades now, children have been raised with slogans of individuality: “do what makes you happy,” “it doesn’t matter what other people think,” “believe in yourself,” “follow your dreams,” or the more up-to-date “you do you.”

Today, companies choose slogans that celebrate the individual, encouraging us to stand out from the crowd. In 2014, for example, Burger King abandoned its 40-year-old slogan, “Have it your way,” for a plainly individualistic one: “Be your way.” Across the consumer landscape, company slogans promise that buying their products will mark the consumer as special or unique. “Stay extraordinary,” says Coke; “Think different,” says Apple. Brands encourage people to buy their products in order to be themselves: Ray-Ban says “Never hide”; Express says “Express yourself,” and Reebok says “Let U.B.U.”

In surveys, Americans increasingly defend individuality. Millennials are twice as likely as Baby Boomers to agree with statements like “there is no right way to live.” They are half as likely to think that it’s important to teach children to obey, instead arguing that the most important thing a child can do is “think for him or herself.” Millennials are also more likely than any other living generation to consider themselves political independents and be unaffiliated with an organized religion, even if they believe in God. We say we value uniqueness and are critical of those who demand obedience to others’ visions or social norms.

Paradoxically, it’s now conformist to be an individualist and deviant to be a conformist. So much so that a subculture, “normcore,” has emerged to make opting into conformity a virtue. As one commentator described it, “Normcore finds liberation in being nothing special…”

Obviously LSD didn’t do this all by itself, but it was certainly in the right place at the right time. And as a symbol of the radical transition that began in the 1960s, there’s hardly one better.

Flashback Friday.

Responding to critics who argue that poor people choose not to eat healthy food because they’re ignorant or prefer unhealthy food, dietitian Ellyn Satter developed a hierarchy of food needs. Modeled on Maslow’s hierarchy of needs, it illustrates Satter’s ideas about which elements of food matter first, second, and so on… starting at the bottom.

The graphic suggests that getting enough food to eat is the most important thing to people. Having food be acceptable (e.g., not rotten, something you are not allergic to) comes second. Once those two things are in place, people hope for reliable access to food and only then do they begin to worry about taste. If people have enough, acceptable, reliable, good-tasting food, then they seek out novel food experiences and begin to make choices as to what to eat for instrumental purposes (e.g., number of calories, nutritional balance).

As Michelle at The Fat Nutritionist writes, sometimes when a person chooses to eat nutritionally deficient or fattening foods, it is not because they are “stupid, ignorant, lazy, or just a bad, bad person who loves bad, bad food.” Sometimes, it’s “because other needs come first.”


Monica C. sent along images of a pamphlet, from 1920, warning soldiers of the dangers of sexually transmitted infections (STIs). In the lower right hand corner (close up below), the text warns that “most” “prostitutes (whores) and easy women” “are diseased.” In contrast, in the upper left corner, we see imagery of the pure woman that a man’s good behavior is designed to protect (also below). “For the sake of your family,” it reads, “learn the truth about venereal diseases.”

The contrast between those women who give men STIs (prostitutes and easy women) and those who receive them from men (wives) is a reproduction of the virgin/whore dichotomy (women come in only two kinds: good, pure, and worthy of respect, or bad, dirty, and deserving of abuse). It also does a great job of making invisible the fact that women with an STI likely got it from a man, and that women who have an STI, regardless of how they got it, can pass it on. Men’s role in all this, that is, is erased in favor of demonizing “bad” girls.

Dana Berkowitz PhD on December 30, 2016

Botox has forever transformed the primordial battleground against aging. Since the FDA approved it for cosmetic use in 2002, eleven million Americans have used it. Over 90 percent of them are women.

In my forthcoming book, Botox Nation, I argue that one of the reasons Botox is so appealing to women is because the wrinkles that Botox is designed to “fix,” those disconcerting creases between our brows, are precisely the lines that we use to express negative emotions: anger, bitchiness, irritation. Botox is injected into the corrugator supercilii muscles, the facial muscles that allow us to pull our eyebrows together and push them down. By paralyzing these muscles, Botox prevents this brow-lowering action, and in so doing, inhibits our ability to scowl, an expression we use to project to the world that we are aggravated or pissed off.

Sociologists have long speculated about the meaning of human faces for social interaction. In the 1950s, Erving Goffman developed the concept of facework to refer to the ways that human faces act as a template to invoke, process, and manage emotions. A core feature of our physical identity, our faces provide expressive information about our selves and how we want our identities to be perceived by others.

Given that our faces are mediums for processing and negotiating social interaction, it makes sense that Botox’s effect on facial expression would be particularly enticing to women, who from early childhood are taught to project cheerfulness and to disguise unhappiness. Male politicians and CEOs, for example, are expected to look pissed off, stern, and annoyed. However, when Hillary Clinton displays these same expressions, she is chastised for being unladylike, as undeserving of the male gaze, and criticized for disrupting the normative gender order. Women more so than men are penalized for looking speculative, judgmental, angry, or cross.

Nothing demonstrates this more than the recent viral pop-cultural idiom “resting bitch face.” For those unfamiliar with the not so subtly sexist phrase, “resting bitch face,” according to the popular site Urban Dictionary, is “a person, usually a girl, who naturally looks mean when her face is expressionless, without meaning to.” This same site defines its etymological predecessor, “bitchy resting face,” as “a bitchy alternative to the usual blank look most people have. This is a condition affecting the facial muscles, suffered by millions of women worldwide. People suffering from bitchy resting face (BRF) have the tendency look hostile and/or judgmental at rest.”

Resting bitch face and its linguistic cousin are nowhere near gender neutral. There is no name for men’s serious, pensive, and reserved expressions because we allow men these feelings. When a man looks severe, serious, or grumpy, we assume it is for good reason. But women are always expected to be smiling, aesthetically pleasing, and compliant. To do otherwise would be to fail to subordinate our own emotions to those of others, and this would upset the gendered status quo.

This is what the sociologist Arlie Russell Hochschild calls “emotion labor,” a type of impression management, which involves manipulating one’s feelings to transmit a certain impression. In her now-classic study on flight attendants, Hochschild documented how part of the occupational script was for flight attendants to create and maintain the façade of positive appearance, revealing the highly gendered ways we police social performance. The facework involved in projecting cheerfulness and always smiling requires energy and, as any woman is well aware, can become exhausting. Hochschild recognized this and saw emotion work as a form of exploitation that could lead to psychological distress. She also predicted that showing dissimilar emotions from those genuinely felt would lead to the alienation from one’s feelings.

Enter Botox—a product that can seemingly liberate the face from its resting bitch state, producing a flattening of affect where the act of appearing introspective, inquisitive, perplexed, contemplative, or pissed off can be effaced and prevented from leaving a lasting impression. One reason Botox may be especially appealing to women is that it can potentially relieve them from having to work so hard to police their expressions.

Even more insidiously, Botox may actually change how women feel. Scientists have long suggested that facial expressions, like frowning or smiling, can influence emotion by contributing to a range of bodily changes that in turn produce subjective feelings. This theory, known in psychology as the “facial feedback hypothesis,” proposes that expression intensifies emotion, whereas suppression softens it. It follows that blocking negative expressions with Botox injections should offer some protection against negative feelings. A study confirmed the hypothesis.

Taken together, this work points to some of the principal attractions of Botox for women. Functioning as an emotional lobotomy of sorts, Botox can emancipate women from having to vigilantly police their facial expressions and actually reduce the negative feelings that produce them, all while simultaneously offsetting the psychological distress of alienation.

Jacqueline Clark PhD on October 3, 2016

In 1985, Zeneca Pharmaceuticals (now AstraZeneca) declared October “National Breast Cancer Awareness Month.” Their original campaign promoted mammography screenings and breast self-exams, and supported fundraising efforts for breast cancer-related research. The month continues with the same goals, and is still supported by AstraZeneca, in addition to many other organizations, most notably the American Cancer Society.

The now ubiquitous pink ribbons were pinned onto the cause when the Susan G. Komen Breast Cancer Foundation distributed them at a New York City fundraising event in 1991. The following year, 1.5 million Estée Lauder cosmetics customers received the promotional reminder, along with an informational card about breast self-exams. Although now a well-known symbol, the ribbons elide a less well-known history of Breast Cancer Awareness co-opting grassroots organizing and activism targeting women’s health and breast cancer prevention.

The “awareness” campaign also opened the floodgates for other companies to capitalize on the disease. For example, Avon, New Balance, and Yoplait have sold jewelry, athletic shoes, and yogurt, respectively, using the pink ribbon as a logo, while KitchenAid still markets a product line called “Cook for the Cure” that includes pink stand mixers, food processors, and cooking accessories, items which the company first started selling in 2001. Not to be left out, Smith and Wesson, Taurus, Federal, and Bersa, among other companies, have sold firearms with pink grips and/or finishing, pink gun cases, and even pink ammo with the pink ribbon symbol emblazoned on the packaging. Because breast cancer can be promoted in corporate-friendly ways and lacks the stigma associated with other diseases, like HIV/AIDS, these companies, and others, have been willing to endorse Breast Cancer Awareness Month and, in some cases, donate proceeds from their merchandise to support research affiliated with the disease.

Yet companies’ willingness to profit from the cause has also served to commodify breast cancer, and to support what sociologist Gayle Sulik calls “pink ribbon culture.” As Sulik notes, marking breast cancer with the color pink not only feminizes the disease, but also reinforces gendered expectations about how women are “supposed” to react to and cope with the illness, claims also corroborated by my own research on breast cancer support groups.

Based on participant observation of four support groups and in-depth interviews with participants, I have documented how breast cancer patients are expected to present a feminine self, and to also be positive and upbeat, despite the pain and suffering they endure as a result of being ill. The women in the study, for example, spent considerable time and attention on their physical appearance, working to present a traditionally feminine self, even while recovering from surgical procedures and debilitating therapies, such as chemotherapy and radiation. Similarly, members of the groups frequently joked about their bodies, especially in sexualized ways, making light of the physical disfigurement resulting from their disease. Like the compensatory femininity in which they engaged, laughing about their plight seemed to assuage some of the emotional pain that they experienced. However, the coping strategies reinforced traditional standards of beauty and also prevented members of the groups from expressing anger or bitterness, feelings that would have been justifiable, but seen as (largely) culturally inappropriate because they were women.

Even when they recovered physically from the disease, the women were not immune to the effects of the “pink ribbon culture,” as other work from the study demonstrates. Many group participants, for instance, reported that friends and family were often less than sympathetic when they expressed uncertainty about the future and/or discontent about what they had been through. As “survivors,” they were expected to be strong, positive, and upbeat, not fearful or anxious, or too willing to complain about the aftermath of their disease. The women thus learned to cover their uncomfortable emotions with a veneer of strength and courage. This too helps to illustrate how the “pink ribbon culture,” which celebrates survivors and survivorhood, limits the range of emotions that women who have had breast cancer are able to express. It also demonstrates how the myopic focus on survivors detracts attention from the over 40,000 women who die from breast cancer each year in the United States, as well as from the environmental causes of the disease.

Such findings should give pause. If October is truly a time to bring awareness to breast cancer and the women affected by it, we need to acknowledge the pain and suffering associated with the disease and resist the “pink ribbon culture” that contributes to it.

Jacqueline Clark, PhD is an Associate Professor of Sociology and Chair of the Sociology and Anthropology Department at Ripon College. Her research focuses on inequalities, the sociology of health and illness, and the sociology of jobs, work, and organizations.

Sunny Moraine on June 6, 2016

Rose Eveleth’s piece for Fusion on gender and bodyhacking was something I didn’t know I needed in my life until it was there. You know how you’ve always known something or felt something, but it isn’t until someone else articulates it for you that you truly understand it, can explain it to yourself, think you might be able to explain it to others – or, even better, shove the articulation at them and be all THAT RIGHT THERE, THAT’S WHAT I’M TALKING ABOUT. You know that kind of thing?

Yeah, that.

Eveleth’s overall thesis is that “bodyhacking” isn’t new at all, that it’s been around forever in how women – to get oversimplified and gender-essentialist in a way I try to avoid, so caveat there – alter and control and manage their bodies (not always to positive or uncoercive ends), but that it’s not recognized as such because we still gender the concept of “technology” as profoundly masculine:

Men invent Soylent, and it’s considered technology. Women have been drinking SlimFast and Ensure for decades but it was just considered a weight loss aid. Quantified self is an exciting technology sector that led tech giants such as Apple to make health tracking a part of the iPhone. But though women have been keeping records of their menstrual cycles for thousands of years, Apple belatedly added period tracking to its Health Kit. Women have been dieting for centuries, but when men do it and call it “intermittent fasting,” it gets news coverage as a tech trend. Men alter their bodies with implants and it’s considered extreme bodyhacking, and cutting edge technology. Women bound their feet for thousands of years, wore corsets that altered their rib cages, got breast implants, and that was all considered shallow narcissism.

As a central personal example, Eveleth uses her IUD, and this is what especially resonated with me, because I also have one. I’ve had one for about seven years. I love it. And getting it was moderately life-changing, not just because of its practical benefits but because it altered how I think about me.

The insertion process was not comfortable (not to scare off anyone thinking of getting one, TRUST ME IT IS GREAT TO HAVE) and more than a little anxiety-inducing ahead of time, but I walked out of the doctor’s office feeling kind of cool. I had an implant. I had a piece of technology in my uterus, that was enabling me to control my reproductive process. I don’t want children – at least not right now – and my reproductive organs have never been significantly important to me as far as my gender identity goes (probably not least because I don’t identify as a woman), but managing my bits and what they do and how they do it has naturally been a part of my life since I became sexually active.

And what matters for this conversation is that the constant task of managing them isn’t something I chose. Trying to find a method that worked best for me and (mildly) stressing about how well it was working was a part of my identity inasmuch as it took up space in my brain, and I wasn’t thrilled about that. I didn’t want it to be part of my identity – though I didn’t want to go as far as permanently foreclosing on the possibility of pregnancy – and it irked me that it had to be.

Then it didn’t have to be anymore.

And it wasn’t just about a little copper implant being cool on a pure nerd level. I felt cool because the power dynamic between my self and my body had changed. The relationship between me and this set of organs had become voluntary in a way entirely new to me.

I feel like I might not be explaining this very well.

Here: Over thirty years ago, Donna Haraway presented an image of a new form of self and its creation – not creation, in fact, but construction. Something pieced together with intentionality, the result of choices – something “encoded.” She offered a criticism of the woman-as-Earth-Mother vision that then-contemporary feminists were making use of, and pointed the way forward toward something far stranger and more wonderfully monstrous.

The power of an enmeshing between the organic and the technological lies not only in what it allows one to do but in what it allows one to be – and often there’s no real distinction to be made between the two. We can talk about identity in terms of smartphones, but when we come to things like technologies of reproductive control, I think the conversation often slips into the purely utilitarian – if these things are recognized as technologies at all.

Eveleth notes that “technology is a thing men do,” and I think the dismissal of female bodyhacking goes beyond dismissal of the utilitarian aspects of these technologies. It’s also the dismissal of many of the things that make it possible to construct a cyborg self, to weave a powerful connection to the body that’s about the emotional and psychological just as much as the physical.

I walked out of that doctor’s office with my little copper implant, and the fact that I no longer had to angst about accidental pregnancy was in many respects a minor component of what I was feeling. I was a little less of a goddess, and a little more of a cyborg.

Sunny Moraine is a doctoral candidate in sociology at the University of Maryland and a fiction author whose work has appeared in Clarkesworld, Lightspeed, Shimmer, Nightmare, and Strange Horizons, as well as multiple Year’s Best anthologies; they are also responsible for both the Root Code and Casting the Bones novel trilogies. Their current dissertation work concerns narrative, temporality, and genocidal violence. They blog at Cyborgology, where this post originally appeared, and can be followed on Twitter at @dynamicsymmetry.

Last week PBS hosted a powerful essay by law professor Ekow Yankah. He points to how the new opioid addiction crisis is being talked about very differently than addiction crises of the past. Today, he points out, addiction is being described and increasingly treated as a health crisis with a human toll. “Our nation has linked arms,” he says, “to save souls.”

Even just a decade ago, though, addicts weren’t victims, they were criminals.

What’s changed? Well, race. “Back then, when addiction was a black problem,” Yankah says about 30 years ago, “there was no wave of national compassion.” Instead, we were introduced to suffering “crack babies” and their inhuman, incorrigible mothers. We were told that crack and crime went hand-in-hand because the people involved were simply bad. We were told to fear addicts, not care for them. It was a “war on drugs” that was fought against the people who had succumbed to them.

Yankah is clear that this is a welcome change. But, he says, for African Americans, who would have welcomed such compassion for the drugs that devastated their neighborhoods and families, it is bittersweet.

Earlier this year Brandy Zadrozny interviewed me for a Daily Beast story about the new CDC guidelines for alcohol consumption by women. It caused an outcry because it advised all women who could potentially become pregnant to completely abstain from alcohol as a way to prevent fetal alcohol spectrum disorders.

Responses across the blogosphere raised several objections, including the fact that research shows that alcohol alone is not sufficient to cause fetal harm (enter poverty as a major confounding factor) and that paternal drinking prior to conception is believed to contribute to the incidence of these disorders, too, despite no advice to men of fertile age to refrain from alcohol consumption.

Interesting points, but an argument made by Renée Ann Cramer in Pregnant with the Stars gave what I thought was some interesting historical perspective.

Until feminists fought to make it otherwise, she explains, it was perfectly legal in America to refuse to allow women access to certain jobs because they might get pregnant. If the working conditions were too challenging or involved exposure to dangerous chemicals, women were considered unfit for the work by virtue of their always-potentially-pregnant status. And if they did this work and harm did come to a child, it was considered a failure of the state to adequately protect her.

Feminists fought to make this “protectionism” illegal, demanding that women themselves have the right to decide, alongside men, if they wanted to take occupational risks. And they largely won this fight.

In turn, though, women themselves came under scrutiny. They were no longer excluded from certain jobs, but if they chose to do them, it was reasonable to judge them harshly for doing so. Cramer calls this the “responsibilization” of pregnancy. Now that women had the right to handle their pregnancy (or pre-pregnancy) however they wished, they (and not the state) would be held responsible for doing so in ways that society approved or disapproved.

This is what the CDC guidelines are doing. It’s not legal to “protect” a woman from harming her not-yet-existing fetus by refusing to serve her alcohol. Women have the same rights as men. But with rights comes responsibilization, and if women don’t make the choices endorsed by their communities, the health industry, and even the federal government, they can expect to be surveilled, judged, and possibly bullied into doing so.