A not so revolutionary blog about feminism, socialism, activism, travel, nature, life, etc.


Deconstructing Duluth’s Demographic Crisis

H. Bradford

4/11/18

On February 24th, the Duluth News Tribune ran an article about Duluth’s impending demographic crisis. I wanted to write a socialist feminist response to this, but never got around to it. Not that I am the authority on socialist feminism, but I am a feminist and a socialist…and I do think about these things…so, why not break it down? Now, whenever I hear the word “demographic crisis” I want to run for the hills, or burn something, or both. Not really, but I think it is one of those sexist, ageist, racist, pro-capitalist concepts that begs to be dismembered. Here is why…

Ageism:

Early into the Duluth News Tribune article, when describing the shifting population of the Duluth region, the aging population is described as problematic.

“If population levels were even across age groups, this wouldn’t be much of a problem. But, as you may have heard, the largest generation in the country’s history is marching into retirement, leaving many jobs vacant just as unemployment levels are bottoming out and productivity growth is stalling (Johnson, 2018).”

It is true that our population is aging, but one must consider why this is a problem. According to the article, it is a problem because there will not be enough workers to replace those who retire. On the surface, this seems like a problem, as society needs workers to produce things. However, this frames the post-retirement age population as the cause of a social problem. Framing the older population as a “problem” is ageist. It also ignores their labor, as labor does not end when wage labor ceases. Their contributions to society do not cease when they reach the age of 65 (or higher ages for the many people who do not have retirement savings, pensions, or the ability to survive on social security alone). Older adults do unpaid work such as volunteering, caring for grandchildren, gardening, baking, canning, sharing their knowledge, checking up on one another, and a plethora of other important economic activities that are dismissed because they are unpaid. Just as the invisible, unpaid labor of women is dismissed as natural or unimportant, this invisible labor and its contribution to society is also ignored.

This connects to the socialist feminist concept of social reproduction. Basically, in capitalist society, the labor force must reproduce itself. This can literally mean that the work force must replace itself through biological reproduction, but it also means that each worker must sustain themselves through sleep, eating food, washing clothes, maintaining their health, relieving stress, and all the many things that are required to survive and work another day. Typically, women have played an important role in providing the invisible, unpaid labor that keeps the work force …working. Caring for children, giving birth, caring for the elderly, washing clothes, cleaning a home, doing dishes, making meals, grocery shopping, etc. are all important unpaid activities that ensure that capitalism will continue. Of course, older adults who leave the work force also provide some of these services as they are “free” to (my own grandparents made many meals for me, babysat me, bought me school clothes, taught me information, etc.). Thus, is it really a problem that people grow old? Aging is a natural process. It may happen that we have an aging population, but why is this a problem? Some people might respond that it is a problem because this group requires more care and there are not enough young people to care for them. The article itself argues that it is a problem that there are not enough workers to fill jobs and that productivity will decline.

I am not an expert on matters of aging, but I imagine that the “problem of aging” could be mitigated by providing quality, free health care to people of all ages, along with clean environments, living wages, robust pensions, housing, etc. The aging population might very well “age better” if a high quality of life was ensured for people of all ages. What does it mean to “age well” anyway? I think to most people it means the ability to care for one’s self, enjoy a high quality of life, and live independently for as long as possible. If so, the locus of “aging well” is framed as an individual responsibility, and the very human need for care is viewed as burdensome. This concept is very individualistic and lets the rest of society off the hook for providing and caring for the variable needs of older adults. It is also ageist, as aging well is basically the ability to live as similarly to a young person for as long as possible. Maybe it is okay to be wrinkly, sedentary, crabby, or anti-social. Society is awful. Living through decades of economic ups and downs, cuts to social programs, pointless wars, and the general nonsense of everything deemed meaningful by society might sour a person against living with youthful optimism and vibrancy. After years of being alive, “aging well” might seem like a racket to sell beauty products, skin treatments, fitness memberships, etc.

(This image leads me to believe that aging well has something to do with being white and wealthy. Capitalism doesn’t have resources to spare on caring for the elderly, so make certain you stay healthy with fresh air and bike rides in the country.)

If indeed there is a shortage of workers, there are certainly plenty of people in the world and in the United States itself. These people might be more inclined to move to this frigid region and provide elder care if this was not low paid, underappreciated service work but unionized with benefits (including retirement plans!), better wages, and better working conditions. A true shortage of workers might require open borders to allow new workers to enter the country, but this would require a move away from our current racist, xenophobic, nationalist, and exploitative immigration policy. The “aging population problem” is not a problem with age, but the ageless failure of capitalism to meet the basic needs of humanity.

Of course, the notion of declining productivity must also be challenged. Why is it a problem when productivity declines? Why must productivity always increase? What does this mean for the environment? When have we produced enough?! Productivity is a problem in capitalism because of the tendency for profits to decline. Because competition lends itself to increased investment in fixed capital, and because there are human limits to how much surplus value can be extracted from workers, profits decline over time. Markets also become saturated, as there is only so much people can buy (again because wages only allow so much consumption). When too much is produced and too little is consumed, capitalism falls into a crisis, which Marx called the crisis of overproduction. Therefore, productivity is not necessarily good. It is not good for the workers (who must work longer or harder). It is not good for the environment (as it creates waste and overuse of resources). And it is not even good for capitalism, since it lends itself to instability. I think it is important to think against blind productivity and instead think about rational, careful production in the interest of human needs.
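For readers who want the formal version, the tendency described above can be sketched in Marx’s own notation (this is my gloss of the standard presentation, not something from the Duluth News Tribune article):

$$ r = \frac{s}{c + v} $$

where $r$ is the rate of profit, $s$ is surplus value, $c$ is constant capital (machinery and other fixed investment), and $v$ is variable capital (wages). Dividing the top and bottom by $v$ gives

$$ r = \frac{s/v}{c/v + 1} $$

so if competition pushes up the “organic composition of capital” $c/v$ while the rate of exploitation $s/v$ is bounded by human limits on how hard and long people can work, the rate of profit $r$ tends to fall over time.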

Sexism:

Another reason why I dislike the concept of “demographic crisis” is that it is sexist. Although the article only mentions it briefly, increasing birth rates is often suggested as a way of averting the crisis. Even if it is not mentioned in detail in the article, it is implicit in the premise of the argument. If the population is aging and this is a problem, that means that not enough new people are being born. Thus, not only are older adults the problem, the bigger problem is that women are not gestating enough babies. The bodies of women have long been treated as public property, inasmuch as their reproductive power is harnessed for state interests. The fight for reproductive rights is a fight to liberate women from their role as the producers of the next generation of soldiers and workers. The birth rate in the United States (according to 2018 CIA World Factbook information) is 12.5 births per 1000 people. Our birth rate is slightly higher than the UK, Sweden, France, and Australia, which all have 12.x births per 1000. The rate is higher than Finland, Canada, Switzerland, the Netherlands, and Denmark, which have 10.x births per 1000 people. Our birthrate is certainly greater than South Korea, Japan, and Germany, which range from 8.x to 9.x births per 1000 people. Despite our higher birth rate, there is enormous pressure upon women to reproduce- to the point that the organized movement against abortion has made birth nearly compulsory in many parts of the country due to restricted access to abortion. In many of these countries with lower birth rates, the issue of abortion is far less controversial. Here, anti-choice activists bemoan the loss of millions of fetuses, which they argue contributes to our demographic crisis (fewer workers, fewer students, etc.). At the core of demographic crisis is a demand to control reproduction- because if population is viewed as a resource, women’s bodies are responsible for producing this resource.

In the context of capitalism (and unfortunately many economic systems), population is treated as a resource. Workers need to reproduce so that there are more workers. This leads to a precarious balance. Capitalists do not provide for the reproduction of labor (this has often fallen upon women and families), as this requires an investment in workers. At the same time, workers have to have a basic level of sustenance to continue working and to allow for a new generation. For instance, if a woman works too hard or consumes too few calories, she may stop menstruating. Therefore, there is generally a threshold of exploitation beyond which workers will no longer be able to survive and reproduce. In the United States in particular, our status as a world power has an economic component and a military component. The military domination of the world is an extension of the economic component, as military might ensures access to markets, thwarts competitors, offers access to capital (for instance natural resources and labor), etc. For the United States to remain an economic and military power, babies must be born. Babies are needed so that there will always be a supply of soldiers and workers. Reproduction is a national interest. I think this contributes to the controversy around abortion and the drive to limit it.

(A piece of art that I created called Capitalism is Built on the Bodies of Women)

As I alluded to in the previous paragraph, capitalism has a contradiction. On one hand, it seeks to increase profit by extracting more surplus value from workers. Because profits decline over time, workers are pressured to work harder and longer. This increased exploitation limits the ability to reproduce labor (to reproduce biologically, but also to maintain a certain level of health as workers). In the United States, not a lot of profit is redistributed towards caring for our existing population (i.e. ensuring the reproduction of labor). We do not offer paid parental leave. We do not have free day cares. There is a shortage of housing. Health care is expensive. The list goes on. The conditions of capitalism are so extreme that 5.8 infants die out of 1000 born. In Japan, two infants die per 1000 births. In Iceland, Norway, Finland, and Sweden, there are slightly more than 2 infant deaths per 1000. In the European Union as a whole, there are about 4 deaths per 1000, according to the CIA World Factbook. Once again, rather than a demographic crisis, our crisis is an inability to care for our population. Certainly, anyone worried about our economic or military strength might begin by tackling the causes of infant mortality. But, this would mean diverting profits towards human needs. Re-thinking profits and capitalism itself would undermine the logic of militarism and nationalism.

Suppose that the United States provided free access to abortion, birth control, and all health care, along with social conditions favorable to reproduction (paid leave, free day care, adequate housing, etc.). Even if these conditions were met, women would have no obligation to reproduce the next generation. They should not be scapegoated for a demographic crisis. In the end, it is up to society to creatively adapt to changing populations- not women.

Racism and Classism:

The article concluded that a key to averting Duluth’s demographic crisis is promoting immigration to the city. Regarding this point, Mayor Larson said, “Duluth needs to be a community that is welcoming and open to new experiences, new faces, new ethnicities, new races to solve workforce shortages (Johnson, 2018).” I think that it is generally a positive, feel-good conclusion, since, well, who doesn’t want Duluth to be a more welcoming city? The mayor suggests working with education and health care partners to attract more diversity to the city. Hmm…alright. What does this really mean?

In a subtle way, the statement hints at what kind of diversity is acceptable in Duluth. I interpret working with education and health care partners to mean attracting diversity by attracting professionals of color. The center of this argument is not “let’s build more low income housing so we can attract all of the African Americans in Chicago or Minneapolis who are on housing waiting lists and house those who already exist in our community!” Duluth DOES have some racial diversity, BUT this diversity is segregated into poor neighborhoods, homeless shelters, and jail. Yet, because they are poor and people of color, this population is not seen as a solution to the “demographic crisis” because they are an OTHER at best and a problem at worst. They are those people. Those people who are blamed for crime or making things not like they used to be for white people. This is another problem with the notion of “demographic crisis”- since demographic crisis always refers to the shortage of a desirable population. We have a low income population that would probably be happy to invite friends and relatives and grow if Duluth were more welcoming and less racist, if housing were expanded, if landlords and employers ceased discriminating against criminal backgrounds, if day care were expanded, if public transportation were more reliable, if schools were not segregated and plainly racist, etc.

Truly making Duluth a city for everyone, as the Mayor suggested, would mean changing what Duluth is right now. Right now, Duluth is focused on being a city for business. In particular, it is a city for businesses that serve tourists. Centering the city on the tourist industry makes Duluth a city not for everyone, but for middle class, mostly white people, who have the leisure and money to stay at a hotel or the outdoor gear to enjoy our nature. Duluth can’t be a city for business and for everyone. We CAN be a city that is for everyone and that happens to attract tourists, but the reverse is not possible. The reverse is what has made Earned Safe and Sick Time so controversial, as the segments of the business community most opposed to it are those sectors that serve tourists (restaurants and hotels). The reverse is also what has stalled the Homeless Bill of Rights- because homeless people are a “problem population,” not one that should be accounted for in “demographic crisis,” and certainly not one that deserves to be treated with basic dignity. After all, they might just spook the customers! If we want to be a city for everyone, then we should start by being a city for workers, for the homeless, for people of color, and all of the oppressed in our community.

Conclusion:

Duluth is just one city. It would be pie in the sky to think we can build socialism in a single city. Many of my suggestions require a massive struggle on a national scale to accomplish. I do believe that we have local activists with the talent and audience to contribute to such a national struggle. I am not one of them, but am a small and marginal voice in that struggle. Beyond the national, there are some things that can be done on a local level. We can focus local priorities on meeting human needs and support things such as Earned Safe and Sick Time and the Homeless Bill of Rights. We can challenge the policies of our schools and police to make the city less racist and classist. We can also think against business interests and promote diverting profits towards social good. Beyond these material things, I wrote this because I wanted to challenge the ideological logic of “demographic crisis.” Like many crises and panics, it is a social construct. Inherent in this constructed crisis is ageism, racism, sexism, nationalism, and classism. There are no population problems. There are only failures of societies to address the needs of populations. It is only through struggle that we will win the means to address these needs.

Bright Eyed and Bushy Tailed: Reflections on Being the Easter Bunny

H. Bradford

4/3/18

This spring, I saw an interesting opportunity posted on Facebook. The post was a call-out for anyone interested in becoming the Easter Bunny at the mall. Despite the fact that I already have two jobs, or three if you count subbing, I posted my interest and was interviewed later that week. The interview was pretty informal, mostly consisting of questioning why I was interested in the job and trying on a giant Easter Bunny head. With little effort, I was hired on for a two week stint as a costumed Easter Bunny at a mall kiosk for seasonal photos. I thought the whole thing seemed silly and certainly would provide the raw materials for a good story.

The Costume:

The costume itself was hot and claustrophobic. When I first tried the whole thing on, I felt a little overwhelmed by the sense of being trapped. The trapped feeling came from the general heaviness and stuffiness of the head, which provided a dim and limited view of the world. The head does not allow for adequate peripheral vision or the ability to look down. The rest of the body is less challenging. It consisted of oversized rabbit feet, baggy fur pants, a velvety blue jacket fastened with Velcro, and furry gloves. One thing that I appreciated about the costume was that the bunny looked intellectual, with round glasses and gold-trimmed velvet clothes. This was not a rowdy Peter Rabbit, but perhaps his pedantic uncle who is allergic to carrots (unless they are boiled) and whose favorite painting is Gainsborough’s Blue Boy.

(My first time wearing the costume)

In any event, the costume could become hot. Thankfully, there was a fan aimed at the bunny. The only downside was that sometimes the fan upset children or messed up their hair, so it was turned away or tilted up, resulting in a sweltering rabbit. On the upside, I tried to think what skills wearing the suit might translate to. Paul (a fellow rabbit) said that maybe I would be more comfortable in a gas mask, since those are also claustrophobic. I thought perhaps I would do better underwater (with a lessened sense of the space around me or a sense of confinement in a wetsuit or scuba/snorkel mask). Yes, I want to believe that being the bunny better prepares me for revolution, apocalypse, or underwater adventures.

Gender:

The Easter Bunny was usually gendered as male by parents and children. The bunny doesn’t have any specific gender markers, but might be viewed as male due to the blue velvet vest and jacket. In a Twitter poll, 80% of respondents believed the Easter Bunny to be male. Though, velvety anything seems pretty gender ambiguous in my opinion. Only Paul suggested that the bunny could use they/them pronouns. Otherwise, parents almost universally used masculine pronouns with the rabbit. A few people inquired about the gender of the person inside of the costume. For instance, a girl asked me if I was a girl bunny or a boy bunny. An older woman asked one of the cashier/photography workers if the person inside was male or female. I don’t expect that most customers would have the knowledge or experiences to envision the bunny outside of the binary of male or female. I myself tended to gender the bunny as male, hence my Peter Rabbit’s uncle story. I often wondered how parents felt about setting their child on the lap of the Easter Bunny. Did the parents envision the person inside as male? If so, how did this make them feel? Male gender and sexuality are always viewed as more potentially threatening to children. This is because we are socialized to view women as more “naturally” disposed to caretaking, more nurturing, and more invested in children. Statistically, men are more likely to be perpetrators of child sexual abuse, though women make up 14% of the abusers of male children and 6% of the abusers of female children. With this in mind, I wondered how parents might react differently based upon their perceptions of the gender of the person in the costume. As far as I could tell, most parents were extremely comfortable putting their child into the lap or company of a stranger in a rabbit costume. This leads me to my next point…

Consent:

I was not able to speak as the Easter Bunny. This made negotiating consent difficult. As I mentioned, parents were pretty comfortable with placing their child in the temporary care of the Easter Bunny. However, many children were not at all comfortable meeting the bunny. It seemed that children over the age of two and under the age of five were often quite terrified of the bunny. From a distance, they seemed excited. As they grew nearer, the magnitude of meeting the bunny struck them- as well as the general weirdness of having to sit on this character’s lap or beside them. This resulted in reactions ranging from shyness to terror. Parents addressed this a number of ways. A common tactic was to bribe the children. Children were promised that they could ride the train, have candy, go to Build a Bear, or get a toy if they endured a photo with the bunny. Parents also assured their children that the bunny was safe and nice. This was done by approaching the bunny, touching its paw, high fives, sitting next to the bunny with the child in arms, and other tactics to increase the child’s exposure to the bunny and demonstrate that it was no threat at all. Some parents threatened their kids, telling them there would be no candy or that they would go straight home. A final tactic was to simply place the child on the bunny’s lap or on the bench, then run, hoping that the photographer would grab a few shots before the child inevitably ran away.

Parents played an important role in mediating the child’s consent. However, most parents wanted a photo for their own collection of memories or to send to relatives. They had a vested interest in forcing their child to endure a photo. This put me in an awkward position. When one parent placed a child on my lap, the child immediately thrust themselves off my legs and flopped onto the floor. This resulted in more crying. Since I did not want more children to fall over, I would hold them securely on my lap- a violation of their consent. Parents encouraged this, even telling me to hold on tight to their child. When I finally released one child, the crying boy wailed that he would never return to the Easter Bunny again. I felt bad that many kids did not consent to being photographed with the bunny. While I think that with time and patience, many frightened children would warm up to the bunny, the length of the line or the impatience of the parents did not allow for this to happen in some cases. In other cases, children naturally became more comfortable with the giant rabbit and ended up having a positive experience. Thus, I think it is alright for parents to challenge their children to overcome their fears in a patient and supportive manner. But, I do think it sends the wrong message for parents to threaten or force the encounter.

As for my own strategies for trying to make children comfortable, I would sometimes grab an egg for the children to hold. This seemed to distract them from the frightening, giant rabbit. I would also try to make the children comfortable with high fives and thumbs ups. If kids rushed towards me (without showing fear) I might gesture for a hug. I didn’t want to be a cold Easter Bunny with walls of boundaries, but I also didn’t want to make children uncomfortable. I found that this was a little challenging to balance, as I naturally am more reserved when it comes to showing warmth and affection.

Working with Kids:

While I work with children at Safe Haven Shelter, I enjoyed my interactions as the Easter Bunny far more. Within the context of the shelter, I am just me. If a child is placed in my care, it is usually in the office, where there are computers, office supplies, and phone calls. Thus, I always feel pretty stressed out about childcare at the shelter because 1.) I have nothing to entertain them with. 2.) I am in a room full of expensive or breakable things- i.e. computers. 3.) I often don’t know how long the encounter will last. 4.) I may have other work to attend to. 5.) I am not actually all that fun or interesting to children. As the Easter Bunny, I was immediately fun and likeable. After all, I am the one who brings candy and hides eggs. On several occasions, I was able to ride the mall train, which made for a grand entry and an opportunity for sort-of dancing. While I could not speak, I could wave, gesture, high five, and pretend to hop. In all, it was great to NOT be boring old Heather, who has nothing to offer children. Really, being the Easter Bunny is the closest I will ever be to being a celebrity or God.

(a photo of a photo- of my friend Jenny’s niece)

Labor:

From a Marxist perspective, all workers sell their labor power in exchange for a wage. Labor power is not only labor (i.e. selling shoes, making shirts, paving roads, or other examples of the act of working). It is time and work, along with the whole human being. In short, every worker sells their work and time, but also their personality, body, and the sustenance of their person (physical health, mental health, caloric use, bodily wear and tear, etc.). My temporary gig as the rabbit was a “hobby” job, or one that I did more on a whim than for my actual survival. Therefore, I didn’t feel particularly exploited. At the same time, I think it would be very hard to be the bunny all year long or as a professional job. There are some people (such as Disneyland workers) who do not have the luxury of a two week gig. Thus, I think it is useful to illustrate the way in which this form of work is exploitative (as all work is).

When a worker sells their labor power, they are selling themselves. In the bunny example, the worker is invisible, hidden inside a stuffy, hot suit. The sweat of the worker and the inability to scratch an itchy nose, immediately use the toilet, easily drink water, move hair that has flopped into the face, speak, or see beyond the periphery of the eye holes are all ways in which the body is subjugated in the sale of labor. Playing the character is how the personality of the worker is subjugated in the interest of the emotional labor of entertaining children. The way in which work subjugates the body and personality of a worker is pretty obvious inside the confines of a costume. Even other workers tended to ignore the bunny, sometimes neglecting to turn on or move the fan. The bunny can’t easily communicate needs. Another hardship as the rabbit was the lack of a sense of time. There was no nearby clock, so time could move quickly or slowly depending upon how many customers were visiting. At the same time, the bunny was paid better than other workers. Workers who were not the bunny were pretty adamant that they did not want to end up in the costume. I believe that at some level they realized that the bunny produced more “value” in terms of labor output (i.e. had a harder job but also contributed more to overall profits).

But, a person does not have to be in a bunny suit to realize the bodily oppression of labor. A waitress who has to smile and look pretty for more tips, a social worker whose stress or compassion is a strain on their mental health, and a janitor whose heavy routine deteriorates physical health are all examples of how labor is more than just our work and time, but our whole being.

(A little house of capitalist horrors)

Conclusion:

I would say that the job was certainly novel. Towards the end, I was happy that the season was over, since my coworkers seemed worn out and the hours, in addition to my regular work hours, were making me weary and eager for free time. It was a fun side job and more insightful than one might imagine. While hidden in my costume, I had plenty to think about in terms of gender, consent, and labor itself. There were fun moments. I liked to make children happy. I liked to play a character. I liked the opportunity to be something other than the more serious and quiet version of myself that I sometimes am as an activist and worker at my other jobs. I enjoyed eating at Noodles and Company at the mall and visiting the mall at all! It was something different from my normal routine. I was also happy to have stories to share with my friends, coworkers, and family. I even had several people visit me as the bunny. If the opportunity arises, I may be the bunny again next year. Being the Easter Bunny made me feel more inclined to celebrate Easter. I visited my family and even purchased myself an Easter basket full of candy I don’t need. But, even the Easter Bunny needs a little treat! Anyway, we’ll see what next Easter brings. And who knows, maybe I will be one of Santa’s helpers…

More Fluid Than Blood

Vampires and Bisexuality

H. Bradford

12/10/17

Each month, Pandemonium meets to discuss issues related to bi+ identities and organizing. This month, the group gathered to discuss vampires and bisexuality. Anyone who has watched or read vampire themed media might have observed that vampires are often portrayed with ambiguous sexualities, if not outright gay, lesbian, or bisexual. The following presentation seeks to uncover the history of how vampire sexuality has been depicted as well as the implications of these representations. Vampires are very much a reflection of the times in which they were imagined. As monsters, they represent challenges to the social order. Since bisexuality, or for that matter any non-heterosexual sexuality, is a challenge to heteronormative patriarchy, it makes sense that vampires often lend themselves to a queer reading.

Before they were the subject of books or television series, vampires were long-imagined beings from the folklore of many cultures. Blood drinking spirits appear in the stories of many cultures, but vampires as they are understood today were mostly based upon the tales of Eastern Europe. These stories entered the public consciousness of Western Europeans during the 18th century with several highly publicized cases of vampirism within East Prussia and the Habsburg Empire. Incidents of vampirism and the related hysteria were investigated by 18th century scholars, and Maria Theresa of Austria sent her physician to uncover the truth about vampires. He concluded that they were not real, and she subsequently passed laws against opening graves or desecrating the dead, which put an end to outbreaks of vampire panics. From then on, vampires, at least in Western Europe, were mostly a matter of fiction. Thus, vampires began appearing in Western European fiction in the early 1800s. The spread of vampires from Eastern Europe to Western Europe represents a transition of folklore and superstition from the less developed parts of Europe to the large, urbanized, mostly literate population of the West (Paolucci, 2000).

A “vampire” skull from 1500s Venice, found among plague victims

One of the first works of vampire fiction was Polidori's 1819 story The Vampyre, which featured a vampire named Lord Ruthven. Polidori served as Lord Byron's physician, and his character, Lord Ruthven, established the trope that vampires are aristocratic and seductive (Primuth, 2014). The plot of the book involves Lord Ruthven travelling around Europe as he seduces various women, often accompanied by his friend Aubrey. Ruthven and Aubrey have a falling out, but reconcile. Later, Aubrey watches Lord Ruthven die and promises not to tell anyone of his death. Aubrey stays true to the promise, even after Ruthven is later discovered to be alive. Only when Ruthven tries to marry Aubrey's sister does he confess his oath in a letter. Ruthven kills Aubrey's sister on their wedding night, and Aubrey dies as well. While the story is interesting because it establishes the notion that vampires are alluring, sexual, and aristocratic, it is also of interest because Lord Ruthven may have been based on Lord Byron.

Early Vampire Literature

Prior to the publishing of The Vampyre, Byron wrote a poem about a vampire, published in 1813 and inspired by his travels in Greece. The poem, entitled The Giaour, takes place in Greece, then ruled by the Ottomans, wherein a character named Leila is killed for her infidelity to her husband, Hassan. Her lover avenges her death by killing Hassan, but Hassan curses him to become a vampire. This early take on the vampire lacks common conventions such as fangs, sleeping in coffins, and aversion to sunlight. Yet, the vampire character in the poem is a Byronic hero inasmuch as he is cursed, dangerous, and an outsider (as giaour means infidel) (Luchsinger, 2015). Later, in 1816, Lord Byron stayed at Lake Geneva with his physician, Polidori, as well as Mary Shelley, Percy Bysshe Shelley, and Byron's mistress, Claire Clairmont. During their stay, stormy weather kept them indoors, and they challenged each other to invent stories for entertainment. Mary Shelley developed the story of Frankenstein. Byron began a story about a vampire, which Polidori fleshed out and published as The Vampyre (Lord Byron's image inspired modern take on vampires, 2010).

Lord Byron, or George Gordon, was a controversial, larger-than-life figure in his day. He may have had a child with his half-sister Augusta. It is also speculated that he may have been more than friends with Mary Shelley and Percy Bysshe Shelley. There is evidence that he was not strictly heterosexual. Under the female name Thyrza, he wrote poetry to John Edleston, a young choir member he had fallen in love with at the age of 17. In letters written during his travels in Greece and Turkey in 1810, he expressed his interest in seeking same-sex encounters in these places, which were more tolerant at the time. He also bragged to friends back home that he had 200 sexual encounters while in Greece and Turkey. At the same time, the punishment for sodomy in England in the early 1800s was death. In 1815, he married Annabella Milbanke, who left him a year later with their infant daughter. She went to stay with her parents and requested a separation, which unleashed various rumors about his relationship with his sister, adultery, and sodomy. He negotiated a separation from his wife outside of the courts and left for Europe, where The Vampyre was written (MacCarthy, 2002). Certainly, the plot of the story mirrors his life, as the vampire travels through Europe seducing and harming women in locales such as Italy and Greece, then eventually England. The plot line of Aubrey following Lord Ruthven around Europe, then having a falling out, also mirrors the falling out that Polidori had with Lord Byron. Finally, while Ruthven mostly preys upon women, the relationship between Ruthven and Aubrey may hint at bisexuality. Paolucci (2000) suggests that a cave scene between Ruthven and Aubrey is suggestively sexual and that Aubrey's refusal to believe in the supernatural is a rhetorical denial of queerness.

It is difficult to classify Lord Byron's sexuality, since modern sexual identities had not yet developed. The word bisexual was not used until 1892, in Psychopathia Sexualis, a book about sexual pathologies. While Byron might be viewed as bisexual, inasmuch as he expressed attraction to both men and women, caution should be used in applying modern notions of sexuality to people and situations that pre-date these understandings. Still, he was one of the first famous writers to be labeled bisexual, though literary scholar Emily Bernard Jackson warned against this, arguing instead that his sexuality was too fluid and complex for labels (Lord Byron, n.d.). Nevertheless, in studying the history of bisexuality and vampires in the media, it is certainly important to recognize that the first vampire in English literature was modelled after Byron, who was controversial, charismatic, and attracted to both men and women. In this sense, bisexuality is built into the fabric of vampire literature, even if Ruthven's character is not overtly bisexual. At the same time, this inclusion isn't necessarily positive, as homosexual and bisexual behaviors and attractions were viewed as deviant.

Lord Byron, the inspiration for Lord Ruthven, an early vampire in Western fiction

In the 1840s, Varney the Vampire appeared as a newspaper serial (Primuth, 2014). Varney the Vampire introduced some modern staples of vampire stories, such as fangs, nocturnal visits, entry through a window, super strength, and hypnotic power. Varney is also important because he was the first sympathetic vampire, even though part of the plot of the series involves him trying to take advantage of the Bannerworth family. He feels guilty and alone, and tries to control his predatory nature. He mourns his wife and children from 180 years earlier and is the first vampire to commit suicide. He is attracted to young, virginal women and seems primarily interested in women (Paolucci, 2000). Varney is not a virtuous vampire, but he is a conflicted one who is not always villainous. It is possible that his perceived heterosexuality is used to cast him as a "good" vampire rather than a deviant, villainous one. There is less scholarly work on the sexuality of Varney than on other vampires of the 1800s.

Carmilla and Lesbian Vampires

While Varney the Vampire has not lent itself to extensive and rigorous analysis for sexual themes, the novella Carmilla has. Published in 1871, Carmilla predates Dracula by 26 years. Joseph Sheridan Le Fanu's novella follows the story of a girl named Laura, who befriends a mysterious girl named Carmilla. Carmilla makes romantic advances on Laura, does not join her family in prayer, sleepwalks at night, and sleeps during the day. Girls in the nearby village begin to sicken and die, while Laura herself begins to have strange dreams, poor health, and mysterious bites on her chest. Eventually, it is discovered that Carmilla is actually Countess Mircalla, a noblewoman from two hundred years prior who had a relationship with a woman whose descendant became a vampire hunter. Laura, whose memory of the events is unsteady, does not grasp Carmilla's romantic inclinations towards her and even theorizes that perhaps Carmilla was a boy in disguise. Laura explains how Carmilla took her hand, breathed heavily, and kissed her neck. The novella is unique in that Carmilla is not interested in the blood of women and men alike, but exclusively in that of women. She also explicitly has sexual interest in women, with no interest in men mentioned. She was, by modern understandings of sexuality, a lesbian vampire. Since Carmilla visits peasant girls in the area, she may also be viewed as polyamorous, as she is not uniquely attracted or bonded to Laura. In the end, it is male power which restores the patriarchal, heterosexual, monogamous order, as Carmilla is staked, then beheaded and burned by Baron Vordenburg and General Spielsdorf (Künnecke, 2016).

Perhaps owing to Carmilla, lesbian vampires are stock characters in vampire media. For instance, between 1968 and 1970, twenty lesbian vampire films were released in the US, Britain, and Western Europe. These films often drew from the story of Carmilla or from the tale of Elizabeth Bathory, a 17th century aristocratic woman who used the blood of young women to stay youthful and whose history includes rumors of lesbianism and vampirism. The accusations of lesbianism ignore the fact that she was married to a man, so perhaps she would more accurately be considered a bisexual woman. Bi-erasure aside, there was a proliferation of lesbian vampire films in the 1970s, which may have been in part an attempt to generate interest in horror films, a dying genre at the time. Censorship was also relaxed in the United States in the 1960s, and the sexual revolution opened society up to sexuality. Another explanation is that lesbian vampires, especially those who preyed upon men, appealed to male anxiety regarding feminism. In these films, the vampire woman must compete with mortal men for the mortal woman. The vampire is killed and the threat to the order of patriarchy is destroyed. Lesbians are doubly marginalized, in that they are women and homosexuals. They are also doubly threatening to patriarchy, which makes them particularly dangerous or sinister vampires. Whereas Lord Ruthven escaped without punishment, this is not possible for Carmilla, because of her challenge to male power. While many films of the 1970s had lesbian vampire characters, the use of violence to restore male power is graphically evident in the 1974 film Vampyres, in which the opening scene depicts two women having passionate sex until they are suddenly shot. Through flashbacks, it is revealed that these two women are vampires who take men home with them to suck their blood after sex (Uygur, 2013).

The most well known films of the lesbian vampire genre were The Vampire Lovers, Lust for a Vampire, and Twins of Evil, which featured Carmilla but changed her to a bisexual woman. A 1973 film called Female Vampire is a pornographic film wherein the vampire Irina has graphic sex with men and women, yet it is classified as a lesbian vampire film. Again, there has been a problem with the conflation of bisexuality and lesbianism in film discussions. Regarding The Vampire Lovers, Baker (2012) cites Weiss's argument that the film presents a bisexual triangle, wherein the man is aligned with the forces of good and the vampire with evil, while the woman sought after by both is a neutral party. After Carmilla is destroyed, Emma is united with Carl, and Emma's response to Carmilla's seduction is reframed as delirium. In seven of the twenty films of the era, the bisexual triangle is employed as a plot device (Baker, 2012). Later films also use the bisexual triangle. For instance, The Hunger centers its story on the love triangle between John, Miriam, and Sarah. Miriam is a married bisexual vampire who falls in love with Sarah, who is also bisexual. Miriam is haunted by her male and female lovers from over the centuries. Blood and Roses and Daughters of Darkness are two additional films which feature bisexual love triangles (Ritscher, 2013).

A scene from Daughters of Darkness

Despite the common use of bisexual love triangles, many of these films are classified as lesbian vampire films. Ritscher (2013) argued that bi-erasure is a perennial problem in media studies, as bisexual characters are often miscategorized as gay, lesbian, or queer. For instance, the movie Brokeback Mountain is often called a gay film in the media. However, several bisexual theorists have argued against this and used it as an example of bi-erasure, as the men in the film are romantically and sexually involved with each other, but also with their wives. Ritscher identifies several ways that bisexuality is erased from film. The first is when scholars or reviewers refer to same-sex attraction or behaviors as homosexual. This creates a false dichotomy wherein sexual acts are either gay or straight; by this logic, only threesomes can be coded as bisexual. In The Hunger, bisexuality is rendered invisible when Sarah and Miriam have a "lesbian sex scene," which is how it is discussed and remembered by critics, scholars, and film viewers. Another way that bisexuality is erased is by downplaying opposite-sex relationships. For instance, in the film Daughters of Darkness, Elizabeth, who is a vampire, has an erotic scene with Tom, a human. Despite this, she is still considered a lesbian. In Blood and Roses, the character Carmilla is depicted as in love with Leopold. In the end, she kills her female lover Georgia and takes her place so that she can marry Leopold. Nevertheless, Carmilla is classified as a lesbian character. Downplaying opposite-sex relationships is done to bolster same-sex relationships, but in doing so it reinforces the binary between straight and gay. Ritscher (2013) cited Kenji Yoshino, who argued that both straights and gays have an interest in erasing bisexuals, though this may not be intentional and malicious so much as an unconscious social norm.
Lesbian vampire film theory is problematic because it has assigned homosexuality to characters who may instead be viewed as bisexual. In doing this, homosexuality is contrasted against heterosexuality as an opponent. According to bisexual theorists, bisexuality is not merely a sexual identity, but an undoing of the two oppositional poles of sexuality and a challenge to the notion of sexual identity as a category. As such, the goal of bisexual scholarship should not be to spot the bisexual, but to challenge thinking about the gay/straight binary. Bisexual theorists argue that bisexuality threatens not only the order of male supremacy but also sexual rigidity (Ritscher, 2013). I would argue that both male supremacy and sexual rigidity uphold patriarchy. Sexualities that are fluid or non-monogamous threaten capitalist patriarchy because they threaten the structure of the family and the gendered roles of men and women. In doing so, they threaten the social reproduction of labor.

Examining Dracula:

While Carmilla serves as the foundation of lesbian and female bisexual vampirism, it is not the most famous or generally influential vampire novel. That would be Dracula, published in 1897 by Bram Stoker, an Irish writer. Stoker himself was believed to have been gay, or at least this argument was made in Something in the Blood: The Untold Story of Bram Stoker, the Man Who Wrote Dracula. Stoker wrote letters to his close friend, the British author Hall Caine, which may be interpreted as romantic. He also wrote a gushing letter to Walt Whitman about Whitman's ability to be natural and unashamed, and about how he himself felt shackled or unfree (Cardamone, 2017). Presumably, the letter was about his closeted homosexuality. In 1895, when Oscar Wilde, a friend of Stoker's, was convicted of gross indecency, Stoker disavowed the twenty-year friendship in a panic. Perhaps this was due to his own anxiety as a closeted gay man at a time when being out of the closet was a criminal offense (as in the case of Oscar Wilde, who was sentenced to hard labor). It is in this context that Dracula was written. McCrea (2010) suggests that the novel depicts closeted heterosexuality; that is, it is written from the view that heterosexuality is foreign and frightening. By this interesting argument, the story of Dracula is a marriage plot. In the story, Mina and Jonathan are going to be married, their lives are interrupted by the chaos wrought by Dracula, and when this resolves, they become a happily established married couple with a son. At the same time, Lucy, Mina's less sensible friend, considers her marriage prospects, but her life is cut short by vampirism and death. McCrea (2010) notes that many scholars have analyzed Dracula as an "other" in post-colonial, feminist, Marxist, queer, and other interpretations of the novel. However, McCrea (2010) proposes that Dracula is familiar.
For instance, Jonathan Harker passes deeper into Eastern Europe in the novel, into increasingly uncomfortable superstitions, spicy food, slow trains, and unnerving sights. Yet, when he arrives at Dracula’s home, his first thoughts are to pause and consider how he is moving up in his career and what Mina would think of this. Even Dracula himself is courteous, well read, and welcoming. Dracula saved Jonathan from the three vampire women who tried to seduce and bite him, for which he is thankful. Dracula even treats Jonathan’s stay at the castle as a marriage contract, saying that he has entered freely of his own will. When Dracula leaves on unknown business, Jonathan waits for him in the castle, like an imprisoned wife (McCrea, 2010). In this way, the novel is a dark fantasy about heterosexual marriage.

Although Stevenson's (1988) analysis is dated, it makes specific mention of bisexuality in Dracula. His piece is mainly focused on the theme of exogamy: Dracula is a foreigner trying to seduce the female characters, and as such represents an external threat that must be fought against. He represents British imperialist anxieties over their racial order of the world. Aside from exploring this theme, Stevenson (1988) took time to examine female sexuality in the novel and suggested that vampires are bisexual. However, his view of bisexuality was very narrow and conflated with understandings of gender, or even sexual roles. His main argument is that vampires are bisexual because both female and male vampires are penetrators and receivers: their fangs penetrate while, at the same time, they ingest the fluids of their victims. Female vampires in the novel become more sexually aggressive, a demonstration of their masculinity. Stevenson's (1988) analysis is interesting, but it lacks the language and nuance to explore gender as something apart from sexuality, which is instead generically labelled as bisexuality. This may be due to the fact that in 1988 there was less awareness of bisexuality as an identity and little visibility for it as a distinct part of the LGBTQ movement. BiNet USA, the oldest bisexual organization in the United States, was not founded until 1990. To broaden this analysis, it might be argued that if blood sucking is a metaphor for sex, then vampires are bisexual in that they prey on any human victim: male, female, trans, gender non-conforming, etc. Gender is not as important to vampires as blood itself. Even if drinking blood is not viewed as a metaphor for sex, it is an intimate act in that it usually involves drinking directly from the neck, which is often viewed as a sensual location for kissing in Western societies.
This act is usually done privately and at night, again, if not blatantly sexual, at least following social conventions regarding sex. Upon closer examination, there may be hints of bisexuality in Dracula. One example of a homoerotic or bi-erotic scene is the passage wherein Jonathan Harker is passively seduced by a group of vampire women living in Dracula’s castle. This is interrupted when Dracula arrives and tells the women that Jonathan belongs to him (Künnecke, 2015). Dracula affirmed the trope that vampires are threatening to both men and women.

Many of the vampires of the 1800s have "deviant" sexualities. According to Foucault, the development of capitalism resulted in the increased repression of sexuality, so that by the Victorian Era, when many of these famous vampire novels were written, sex had been relegated entirely to the personal sphere. That is, sex and sexuality were not to be expressed or discussed in public. At the same time, the roles of men and women were more sharply defined than at any other time in history, and homosexuality or any other "deviant" sexual behavior or identity was driven underground. Male homosexuality became highly regulated, whereas female homosexuality was given less attention. Women were viewed as more emotional in general and given more social leniency to express affection towards one another (Künnecke, 2015). Perhaps this accounts for why Carmilla was depicted as a lesbian, whereas male vampires were not overtly homosexual or bisexual. Foucault also noted that monsters are individuals whose behaviors must be corrected. At the core of monstrosity is deviance and irregularity. Monstrosity is threatening because it calls social norms into question. To Foucault, homosexuality became understood as deviant because society had come to believe that the strength of a nation was bolstered by its citizens, their marriages, and their families. In this understanding, sex was a tool used by the state for regeneration. A monstrous vampire always represents a threat to the order and is constructed as somehow deviant; defeating a vampire re-establishes that order. As such, if blood drinking is a metaphor for sex, a vampire is a bisexual or homosexual threat to society. Early folkloric vampires may have represented fear of plague. However, 18th century vampires were written about in a time that was beginning to fear homosexuality, and as such they represent anxieties over violations of sexual norms (Uygur, 2013).

Early vampire characters were mostly constructed as monstrous and evil, with some exceptions, such as Varney the Vampire. The ability of vampires to be portrayed as anything other than overtly heterosexual is a function of social movements which sought to expand the rights of the LGBTQ community. In general, gays or lesbians who appeared in the media before the late 1960s were tragic, unstable, or miserable characters. Some films, such as the 1943 Creature of the Devil, may hint at homosexuality or bisexuality, in that the main character becomes jealous of his twin brother's relationship with a woman and sends a hunchback to kill him. The 1944 short story The Bat is my Brother may allude to homosexuality or bisexuality, in that the main character is shown how to be a vampire by an older vampire mentor. The younger vampire is guided through his vampirism, coming out and coming to accept his condition. Still, there were no overtly bisexual or homosexual vampire characters. The 1931 film version of Dracula was directed by Tod Browning, who was gay, but in general, queerness was consigned to the shadows due to social conservatism and the active persecution of gays, lesbians, and bisexuals (Primuth, 2014). Vincent (2015) noted that the 1960s and 1970s saw an opening of sexuality in America with the feminist movement and gay rights movement. The FDA approval of the birth control pill in 1960, the legalization of contraception for married couples in 1965, the elimination of homosexuality as a mental disorder in 1973, and the Stonewall riots in 1969 all contributed to the broadening of sexual expression in society. The landmark piece of vampire media created during this era was Anne Rice's Interview with the Vampire. She created morally ambiguous characters who subverted traditional sexuality. For instance, Louis described his first encounter with Lestat in suggestive terms, describing Lestat as extraordinary, graceful, like a lover, and opening up new possibilities.
Yet, even though the transformation into a vampire is coded in homoerotic imagery, Louis becomes interested in a woman named Babette Freniere, who spurns him as unholy. Louis and Lestat have fluid sexualities, which may be due to their dependence on the blood of women and men, their outsider status relative to human societies, and a sexuality unbound by reproduction (Vincent, 2015).

The 1980s saw a backlash against the gains of the 1960s and 1970s. During the 1980s, vampires were often villainized again, as in Fright Night (1985) and The Lost Boys (1987). The AIDS epidemic also influenced vampire media. For instance, in the 1991 novel Dracula Unbound, Dracula contracts syphilis. In the 1998 film Blade, Blade takes a serum to stay alive, which might be compared to the cocktail of pills that HIV patients must take to ward off AIDS (Primuth, 2014). It was not until the 1990s that more positive representations of LGBTQ characters began to appear. For example, the heroine of the 1990s young adult book series The Last Vampire is bisexual, though the series mostly focuses on her relationships with men. In 1997, Buffy the Vampire Slayer began airing and featured Willow as a positive lesbian character.

Buffy the Vampire Slayer:

Willow was a popular character on Buffy the Vampire Slayer because of her dialectical nature: she had destructive power, but also the power to help. She was also one of the first characters on television to be depicted in a lesbian relationship, when she became involved with her fellow witch Tara. However, Willow is also an example of bi-erasure, because her sexual fluidity is ignored in the series. Her previous heterosexual relationships, such as her crush on Xander and her relationship with Oz, were ignored or dismissed when she asserted that she was "gay now" (Muscat, 2014). Mo (2016) noted how in seasons one and two, Willow was depicted as interested in men. First she was interested in Xander, an attraction which went unreciprocated in season one. She later became involved with Oz, but cheated on him with Xander, eventually reconciling with Oz, whom she dated until season four. However, in season five she reminded Anya that she was "gay now" when Anya expressed concern that Willow would steal Xander away from her. Later, Tara worried that Willow wasn't really a lesbian and would return to dating men. Willow defended herself against the suggestion that her sexuality was fluid, and the narrative reinforced this denial by allowing no deviation from her being fully lesbian from then on. Muscat (2014) argued that Willow was reduced to a binary of totally straight or totally gay, which denied the possibility that she might have been bisexual or fluid. In an episode wherein all of the female characters vie for the love of a character named RJ due to the effects of his magical letter jacket, Willow only falls for him after she uses her magic to alter his gender. This reinforces the notion that homosexual attraction is only authentic when absolute. Muscat (2014) also noted that within the Buffyverse, bisexuality is coded as dangerous and often associated with vampire characters.
For instance, Vamp Willow, an alternate universe version of Willow, was coupled with Xander but propositioned a girl at The Bronze and licked the neck of regular Willow. In the series Angel, there are homoerotic undercurrents to Drusilla and Darla's relationship as well as Spike and Angel's. Only if a character is evil or morally ambiguous can they experience fluid sexuality. Even Willow called her vampire self "skanky" (Mo, 2016).

Mendlesohn (2002) argued that, in contrast to other characters, the series denies a queer reading of the relationship between Buffy and Willow. A "queer reading" is when a reader, or in this case viewer, constructs homosexual desire in situations wherein this sort of attraction is not overt and heterosexuality is normalized. It is a way for readers who are oppressed or excluded to identify codes for same-sex relationships, or cues that two characters may be flirting, loving, or passionate towards one another. Willow is coded as young and innocent, as she wears pinks and reds rather than darker colors. Throughout the series, her behaviors are rarely sexualized, and intimacy with male or female partners is usually shown off screen. Buffy, on the other hand, behaves in more overtly sexual ways and tends to look to male characters for support and validation. Throughout the series, Willow grows, changes her appearance, makes new friends, and becomes more confident. Buffy, by contrast, does not grow, nor does her appearance change. Their relationship lacks the necessary tension to drive it towards a queer reading. It is easier to do a queer reading of the Buffy and Faith relationship, because Faith is the opposite of Buffy in appearance, unrestrained, and sexual, whereas Willow is more of a backdrop to Buffy than her equal or antagonist (Mendlesohn, 2002). Casano (2013) agreed that while there is no overt bisexuality in Buffy the Vampire Slayer, the relationship between Buffy and her fellow slayer Faith is sometimes speculated to be bisexual. Faith appeared in season three of the series, following the death of the slayer Kendra. Eliza Dushku, who played the character, felt that Faith had feelings for Buffy and was bisexual. Faith is a promiscuous, fearless bad girl who is an outsider to the Scooby Gang (Casano, 2013).
Any hinting that her character is bisexual would play into the stereotype that bisexuality is deviant, or that only a morally ambivalent character could be bisexual. Certainly, in the late 1990s and early 2000s when the series aired, there was growing awareness of bisexuality: BiNet was established in 1990, the book Bi Any Other Name: Bisexual People Speak Out was released in 1991, the first international bisexual conference was held in Amsterdam in 1991, the bi flag was created in 1998, and the first Celebrate Bisexuality Day was celebrated on September 23, 1999. The 1990s was a pivotal time for bisexuals because it saw the establishment of organizations and the inclusion of bisexuals in Pride festivals. Still, despite the flourishing of bisexual identity in the 1990s, it is disappointing that Buffy the Vampire Slayer did not handle the issue of bisexuality as well as it might have.

An example of wholesome, non-sexualized Willow

In the decades since the 1990s and early 2000s, there has been some improvement in the portrayal and visibility of bisexuals. HBO's series True Blood, which aired from 2008 to 2014 and was based upon Charlaine Harris's Southern Vampire Mysteries novel series, depicted many LGBTQ characters. One prominent bisexual character was Sophie-Anne LeClerq, the Vampire Queen of Louisiana, who was sexually and romantically involved with male and female characters, including Sookie's cousin Hadley (Reynolds, 2014). Sophie-Anne appeared in eight episodes and was portrayed as a mentally unstable but powerful antagonist. She wears glamorous clothes, longs to be in the sunlight, collects birds, plays Yahtzee, and seems genuinely attached to Hadley. In the series, she acquires debts and resorts to selling vampire blood. Facing an IRS audit, she is forced to marry Russell Edgington, the vampire King of Mississippi. Sophie-Anne is a capricious, immature, unstable, frivolous character, so in a way she may pander to stereotypes about bisexuals being mentally unstable. However, the character is also a survivor who clambered her way up in the world to become the vampire queen of Louisiana, then submitted to marriage to Russell Edgington to overcome her financial troubles. Evan Rachel Wood, who played Sophie-Anne, is openly bisexual. Yet in an interview with US Magazine, her character was called a lesbian, even as Wood has said her bisexuality has been a part of her for as long as she can remember (Ravitz, 2011).

Sophie-Anne in True Blood

Pam de Beaufort, the bar manager of Fangtasia, was also depicted as bisexual and had a relationship with Tara Thornton (Reynolds, 2014). Pam appeared in 63 episodes and is loyal to Eric Northman. She is depicted as more interested in women than men, has a dry sense of humor, and dislikes children. In her human life, she ran a brothel and was romantically involved with Eric Northman, who later turned her into a vampire. Generally, the character was developed well enough that she doesn't particularly fall into bisexual stereotypes. Like most vampires in the series, she is morally ambiguous and in some ways deviant, but she is a well-rounded, likeable character for the setting and tone of the show. Pam does have a fun quote: "Let bygones be bygones and bi girls be bi girls" (Nicolaou, 2017). At the same time, her bisexuality was erased when Sookie told her that she didn't have time for her "lesbian weirdness."

Pam attacking Sarah Newlin

Tara Thornton was the most prominent bisexual character in the series (Reynolds, 2014). Tara is Sookie’s best friend in the show. She is sour towards vampires and a survivor of abuse. When she becomes a cage fighter, she begins dating a fellow female cage fighter. She is later turned into a vampire by Pam de Beaufort, and the two eventually have a relationship (Zakarin and Fleenor, 2017). Eric Northman is also depicted as bisexual. He is the owner of Fangtasia and a love interest of Sookie Stackhouse. In the series, he seduces Talbot, the partner of Russell Edgington, the King of Mississippi (Nicolaou, 2017). The series features many bisexual, gay, and lesbian characters. Even characters that are not portrayed as bi or gay are never rigidly straight. For instance, in season three, Sam, the shapeshifting bartender, has a sexual dream about Bill Compton (Gray, 2011). Finally, while Sookie Stackhouse, the main character, is depicted as straight, Anna Paquin, who portrayed her, is bisexual. She has been very open about her bisexuality, but it has been the subject of confusion. In an interview with Larry King, she discussed her marriage to her co-star Stephen Moyer and the birth of her twins. Larry King assumed this meant that she was no longer bisexual. She had to correct him by stating that a straight person does not stop being straight if their partner dies or they become single, so her bisexuality does not change when she is in a monogamous relationship (Nichols, 2014). This demonstrates the misunderstandings that persist about bisexuality.

Tara had a variety of relationships in the series, including a relationship with Pam, a short-lived relationship with “Eggs,” a longtime crush on Sookie’s brother Jason, and a relationship with Sam.

The series itself was produced by Alan Ball. Beyond portraying a variety of queer characters, the show had many clear and obvious parallels to LGBT issues. For instance, the series takes place after vampires have “come out of the coffin” and are publicly known to exist. Coming out of the coffin is, obviously, the vampire equivalent of coming out of the closet. Vampires have their own vampire rights organizations, and vampire-human marriage has been legalized, again mirroring the LGBT movement. Not everyone is on board with vampire rights, and the opposition consists mostly of far-right Evangelical Christian figures. One hateful church has “God hates Fangs” as a slogan on a sign outside the church, a parody of “God hates Fags” (Primuth, 2014). According to Campbell (2013), queer politics seeks to challenge heteronormativity, resist assimilation, embrace difference, and combat social forces that discipline and normalize. Rather than focusing on identity, it focuses on fluid and contextual opposition to dominant norms of gender, race, class, and sexuality. In True Blood, vampires are a metaphor for queerness and queer politics. Campbell (2013) cites Cathy Cohen’s argument that U.S. institutions seek to appropriate and assimilate queer life and, in doing so, marginalize queer women, the poor, the working class, and queer people of color. This mirrors the vampires of True Blood. While all vampires drink blood, some vampires are treated as better than others. For instance, some vampires drink “Tru Blood,” a Japanese blood substitute, which allows them to assimilate into society and be viewed as safer than others. Bill Compton, for instance, is presented as a protagonist in early seasons. He is a white, heterosexual vampire who values monogamy, in contrast to other vampires. Bill is gentlemanly and presented as better than other vampires, such as the hedonistic Malcolm, Liam, and Diane. Malcolm was presented as a gay character who was against coming out of the coffin and assimilation.
As such, he is seen as dangerous, immoral, and a stand-in for an anti-assimilationist queer identity. In the series, deviant vampires are signified by drug use, hedonism, and promiscuity. Cohen called this secondary marginalization. So, although there are many queer characters in the series, many of them are vampires, who enjoy power, wealth, comfort, beauty, and immortality. Many, like Eric Northman, Bill Compton, Pam de Beaufort, and Sophie-Anne, are white, conventionally attractive, and generally privileged. The show could be critiqued for promoting an assimilationist viewpoint. Nevertheless, the show generally did a good job portraying a large number of queer characters and developing many of them beyond stereotypes.

Other recent television and film series have not handled LGBT issues as well. The television series The Vampire Diaries did not introduce its first same-sex couple until season seven. The characters were Nora and Mary Louise, villainous vampires. Mary Louise was captured by the Armory, an organization that hunts supernatural creatures, where she was injected with vampire hunter blood. She and Nora died together in an attempt to destroy the magical sword of Rayna, a vampire hunter (Anders, 2016). Their deaths were rather pointless, and the characters were not allowed to stay in the series long enough to become compelling. Their sacrificial deaths also harken back to film norms that LGBT characters must die or experience tragedy. The Vampire Diaries introduced Luke, a gay character, in season five. He was a witch with a twin sister named Liv. He sacrificed his life to save Liv, but was never well developed nor shown in a relationship. Again, the series used the old trope that gays must die tragic deaths. While Caroline Forbes’ father was gay, he was given little screen time and was referred to disparagingly. Once again, this was not a positive depiction. Finally, Matt, Rebekah, and Nadia had a threesome in the series, but Rebekah and Nadia’s bisexuality is never expanded upon beyond this scene. Because this is the only context for their bisexuality, the show seems to depict bisexuality as a performance for the pleasure of men (LGBT Characters in the Vampire Diaries and the Originals, 2015).

While vampire fiction has historically been an arena for expressing subversive sexualities, this is not the case with Twilight. Twilight goes against earlier traditions of gender non-conforming characters by creating characters that are very traditional. Edward Cullen and Jacob Black are brave and muscular, while Bella and the other female characters have slumber parties, bake cookies, and cook meals for men. Bella is often a damsel in distress, and Edward Cullen shuns intimacy before marriage. When Bella and Edward are finally married and do have sex, Bella finds herself bruised from the encounter and blames herself (Ames, 2010). The Cullens themselves, though not related by blood, live as a family unit of heterosexual couples, with Edward being the only character not coupled until he meets Bella. Other vampires, such as the nomads and the Volturi, do not live in the same traditional family units. They drink human blood and act more like traditional vampires. The Volturi allow for more of a queer reading, as they are led by a trio of men, Aro, Marcus, and Caius, who spend more time together than with their wives and are presented as feminine men. The nomads are also a trio, which begs the question of how the third person relates to the couple. In contrast, the Cullens, led by Carlisle and arranged in male-female pairs, represent the monogamous, heterosexual ideal. Bella marries early in life and becomes immediately pregnant, then fights to keep the pregnancy even after it threatens her life. Throughout the relationship, Edward is protective and watchful of Bella, which could be viewed as controlling and stalking behavior. When Bella is injured by sex, she is more concerned with comforting Edward than with her own well-being (Hofstatter, 2012). In this sense, the series is not only heterosexual, it is violently heteronormative.
Despite the confining heterosexuality and gender roles of Twilight, Kristen Stewart, who played Bella, is openly bisexual and told The Guardian that she was not confused about her sexuality and that, in general, she saw sexuality as grey and fluid (Brooks, 2017).

The popularity of vampire fiction has declined over the last several years, but more recent vampire stories offer insights about the future. Obviously, True Blood came a long way from Dracula in its overt depiction of sexuality, and from other media in general in its positive and prominent depictions of LGBTQ characters. At the same time, The Vampire Diaries was centered upon heterosexual relationships and kept queerness in the margins. Twilight was even worse in its hammering assertion of heterosexuality. The stark differences between these series demonstrate that queer liberation is incomplete. Twilight represents the alluring hold that tradition and conservatism continue to have on society. It represents a world where deviance from heterosexuality does not dare name itself, or where it simply does not exist. This is the same world as Dracula’s, where sexuality is quieted, impulses are controlled, and deviance is exiled or destroyed. In The Vampire Diaries, queerness can exist as an auxiliary to heterosexuality, so long as it stays quiet, does not distract, and dies when necessary. True Blood made the most ground, but it still portrayed queerness as preferable when expressed by those with beauty, wealth, power, and whiteness. As for bisexuality, there have been many missteps in its presentation over history, the largest being its invisibility, fetishization, or conflation with gay or lesbian identities. However, bisexual social movement organizations are only a few decades old. Better representation of bisexuals in the media hinges upon the success of this movement, along with the larger LGBTQ movement, in asserting itself in society as a whole. Hopefully this is done with a mindfulness towards the rights and representation of people of color, people with disabilities, the working class, people who are poor, people of diverse sizes and appearances, and all the many other ways that groups of people are marginalized in society.
Liberation expands the lens of who is portrayed and how they are portrayed in the media. Vampires have long made for fascinating characters and storylines. They are also a mirror for how society constructs deviance and acceptability. In this reflection, there is plenty to see.


Fungi and Feminism

H. Bradford

8/12/17

Once a month, the Feminist Justice League hosts a feminist frolic. This month, the goal was to go on a hike to learn more about fungi, edible and otherwise. We asked Ariel, one of our members, if she would be willing to tell us a little about edible fungi, as she forages for fungi and sells them to a local grocery store. As for myself, I undertook the task of trying to connect fungi with feminism for a short presentation on that topic. Connections between these two topics are not commonly made, but almost anything can be connected to feminism. Indeed, fungi can be connected to feminism through an exploration of women’s roles as foragers and food preparers, the connection between fungi and witchcraft, and the contributions women have made to mycology, the science of fungi.

An Introduction to Fungi:

To begin, it is useful to outline some basic information about fungi. Fungi are a diverse group of organisms that include everything from the yeast in bread and beer, to infections like athlete’s foot or ringworm, to mushrooms and toadstools, to mold on bread. Most people are probably most familiar with fungi in the form of mushrooms, the fruiting bodies of some fungi. However, this is just a small portion of the diversity of this kingdom. Taxonomy is always changing, but fungi are often considered to be one of five or six kingdoms of organisms: plants, animals, protists, archaebacteria, fungi, and bacteria. For most of history, fungi were lumped into the plant kingdom, and it was not until the 1960s that they were separated into their own category of lifeforms. It might be easy to confuse fungi with plants, due to the fact that both grow in soil and tend to be stationary. In actuality, fungi are more closely related to animals; 1.1 billion years ago they shared a common evolutionary ancestor with the animal kingdom (Staughton, 2002). Fungi are similar to animals in that they cannot produce their own food, as plants do through photosynthesis. Rather, they feed on dead and living organisms, breaking them down by excreting enzymes and absorbing nutrients through their cell walls (Fungi-an introduction, 2009). This means that they differ from animals in that they do not ingest their food; rather, they absorb it. Another similarity between animals and fungi is that both use oxygen in cellular respiration to convert nutrients into energy. That is, both take in oxygen and release carbon dioxide as waste, as opposed to plants, which take in carbon dioxide and release oxygen (Bone, 2011). Yet fungi are similar to plants in that both have cell walls, although the cell wall of plants is made of cellulose while the cell wall of fungi is made of chitin. Chitin is the same substance that the beaks of squids and the exoskeletons of crustaceans and insects are made of.

Despite the clear differences between plants and fungi, historically fungi have been lumped together with plants, and even today mycology tends to be housed within botany departments rather than zoology departments. While fungi have had a sort of identity crisis over history, they do indeed have a very close relationship with plants. Over 90% of all plants have a mycorrhizal fungal partner. In other words, plants often have fungi that live on or in their roots for the purpose of helping them extract more nutrients from the soil. In exchange, the fungi obtain sugar, which the plant produces. This is why a person often sees mushrooms at the base of trees. Some unusual plants, such as monotropes (more commonly known as Indian pipe or ghost plant), do not produce chlorophyll and depend upon fungi to obtain energy from nearby trees. Almost every plant has fungi living between its cells. In addition, 85% of all plant diseases are caused by fungi. In fact, chili peppers evolved their hotness as a defense against fungi (Bone, 2011). Therefore, it is no wonder that plants and fungi are associated with one another.

One of the most interesting things about fungi is how diverse the kingdom is. While the animal kingdom contains a wide array of organisms, including lifeforms as different as horseflies, sea horses, horseshoe crabs, and horses, fungi vary even more greatly. Fungi include organisms that reproduce sexually, asexually, or both, which makes them extremely interesting from a sexual standpoint. Unlike animals, they can be one-celled or made up of many cells. Fungi include such diverse phyla as the club fungi, which include mushrooms, toadstools, puffballs, and shelf fungi. This is the phylum most people are probably familiar with; these fungi often have club-shaped structures with gills containing spores. Another phylum is the sac fungi, which produce spores in tiny sacs. This group includes yeast, truffles, molds, and morels. Another phylum, Zygomycota, features both sexual and asexual reproduction and includes black mold. Finally, there are the imperfect fungi, which have unknown methods of reproduction and include Penicillium and Aspergillus. There are about 1.5 million species of fungi, but only one tenth of these are known to science. Interestingly, the mass of the world’s fungi is far greater than the mass of all the world’s animals, amounting to about a quarter of the world’s entire biomass (Fungi-an introduction, 2009). Fungi also outnumber plants six to one. Finally, the largest organism on the planet is actually a honey fungus in Oregon which is over 2,400 years old and larger than 1,666 football fields (Bone, 2011). Truly, fungi are among the most fascinating forms of life on the planet.

Mushrooms, Women, and Foraging:

For most of history, fungi were not given much attention as a unique group of organisms. Thus, most early humans would have understood fungi mostly through the sexual phase, or fruiting body, of a mushroom (Garibay-Orijel, Ramírez-Terrazo, and Ordaz-Velázquez, 2012). Humanity’s earliest encounters with fungi would have been with mushrooms and shelf fungi. Humans lived as hunters and gatherers, in small communities that foraged for their food, for 190,000 of our 200,000 years as modern humans, and some human societies continue to live this way. For most of human history, then, humans foraged for fungi for food, medicine, ritual, dyes, and so on. However, mushroom foraging is complicated by the fact that mushrooms may appear only at certain times of the year or under certain conditions. They may not appear in the same place each year, making them harder to forage than plants. Mushroom foraging is also made difficult by the fact that some mushrooms are extremely toxic, which means that misidentification or experimentation could result in illness or death. Around 2,800 species of mushrooms are used today by humans. Much of the mushroom foraging in the world is done by women (Garibay-Orijel, Ramírez-Terrazo, and Ordaz-Velázquez, 2012). This comes as little surprise, as in a study of 175 modern hunter-gatherer societies, women provided four fifths of the food. According to Crane’s research (2000), the food typically gathered by men was further away and harder to obtain. Today, in Mexico, Bahrain, Guatemala, Guyana, Nigeria, Zaire, Southeast Asia, Australia, and Russia, mushroom foraging is largely women’s work. However, in Poland and Switzerland, it is more often done by men. In some tropical areas, women collect mushrooms closest to their homes whereas men collect mushrooms deeper in the forest (Garibay-Orijel, Ramírez-Terrazo, & Ordaz-Velázquez, 2012).
This is not unlike the gender dynamics of collecting honey and may reflect the importance of women in society for their reproductive capacity (Crane, 2000). In Guyana, men pick up mushrooms that they find incidentally on hunting trips, whereas women engage in active, premeditated mushroom collecting. Beyond this, there are gendered ways in which mushrooms are collected, with men tending to be solitary foragers who seek out more valuable and hard-to-find mushrooms, and women collecting them together and in more energy-efficient locations. Mushroom collecting for ritual purposes is often done by both genders. Mazatec healers in Mexico can be women or men, and Maria Sabina was an important informant on mushroom rituals for ethnographers (Garibay-Orijel, Ramírez-Terrazo, and Ordaz-Velázquez, 2012).

While it seems that women play an important role in obtaining mushrooms among many hunting and gathering cultural groups, this is not the experience in the industrial United States. Bone (2011) found that many of the people she encountered while foraging for mushrooms were men. Professional mushroom foragers, who often travelled the country in search of various mushrooms, were often men. In particular, men from Mexico and Southeast Asia made a living by foraging and selling mushrooms. At the same time, even amateur or more casual mushroom foragers were men, and when she sought to learn more about foraging mushrooms, it was always men who shared their expertise. She also noticed a certain machismo among mushroom foragers, as some took risks by eating mushrooms that were known to be toxic or to have negative health effects. Bone (2011) was focused on developing her knowledge of mycology and experiencing fungi from the perspective of a foodie; her book, Mycophilia, does not examine the gender dynamics of mushroom foraging at any length. However, it does very clearly support the idea that in the United States, mushroom science, foraging, commercial production, and preparation are all largely dominated by men. This begs the question of why mushroom foraging here is so different from the women-centered foraging that is prevalent elsewhere in the world and presumably elsewhere in history.

There may be a few explanations for this phenomenon. For instance, until the 1600s in France, mushroom foraging was women’s work. With the scientific revolution, however, mushrooming became a men’s activity as men began to monopolize the science of mycology (Dugan, 2008). The shift from mushroom foraging as women’s work to men’s work represents a shift in which knowledge is given privilege in society. As men took control of institutions of learning, medicine, publishing, and science and systematized scientific knowledge, the folk knowledge of women, but also of poor people, indigenous people, criminals, people with disabilities, and other marginalized groups, was denigrated, ignored, or suppressed. This might explain why, according to Dugan (2008), mushroom collecting was mainly conducted by women in the United States until the 19th century. It was during the 19th century in the United States that women’s knowledge of childbirth, medicine, and the natural world in general was suppressed by emergent medical and professional institutions. As this knowledge was professionalized and monopolized, the knowledge of men was empowered and given social value at the expense of women. Long before the advent of science, many groups of people developed a body of knowledge about mushrooms that scientists would only later rediscover. For instance, Russian peasants had a deep knowledge of mushrooms, and some of their common names for mushrooms were associated with the tree the mushroom grew near. Europeans were latecomers to mushroom identification, and even Darwin was indifferent to fungi when writing about evolution. However, the Mayans developed their own system of classifying mushrooms, as did the Chinese. Chen Jen-yu’s Mycoflora, written in 1245, proposed 12 types of mushrooms (Dugan, 2008).
In all, this should illustrate that humans have had thousands of years of interactions with fungi and, through use and observation, developed a body of knowledge. Some of this knowledge was dismissed or overlooked on racist, sexist, and classist grounds.

Mushroom hunting, a painting by Bernardina Midderigh-Bokhorst

The ability of women to forage for mushrooms is also challenged by capitalism. Capitalism negatively impacts women more than men, because under capitalism women are oppressed both as workers and on account of their gender. The oppression of women includes being paid less than men, doing more unpaid labor in the home, experiencing sexual harassment and sexual assault, having limited reproductive freedom, enjoying less political representation, having less social legitimacy, and a myriad of other expressions of oppression. Thus, at least on the amateur end of mushroom collecting, women may not be as involved because of the ways in which capitalism and patriarchy shape women’s relationship to nature. Within the United States, time in nature is usually associated with leisure, which women have less of due to spending more time on care work and household work. Women are often economically dependent upon men and make less money than they do, which may mean that taking up hobbies and traveling around to pursue them is a greater economic burden. In less developed societies where women continue to forage for mushrooms, women have a harder time obtaining wage labor, survive on lower wages, and struggle to support their families. In some areas of the world, foraging and selling mushrooms to middlemen is an important way that widows and single mothers generate income for themselves. Historically, women sold vegetables and mushrooms in markets in Europe. This tradition continues in Eastern European countries like Latvia, Russia, Bulgaria, Lithuania, and the Czech Republic, where women are often the source of mushrooms in markets (Garibay-Orijel, Ramírez-Terrazo, and Ordaz-Velázquez, 2012). Therefore, mushroom foraging is an important source of income for women. But because it is work outside of the formal economy, these women are more vulnerable to difficult labor conditions.
And, because of the environmental problems wrought by more developed countries in the context of capitalism, women are vulnerable when the environment they depend upon for their livelihood is threatened. For instance, women in Puebla, Mexico must obtain permits to go into the forest and collect mushrooms. In other places, such as Burundi, logging has diminished the abundance of mushrooms. Other ecological challenges include acid rain and soil nitrification in Europe. Mushroom collectors are often independent workers, so they are not afforded health or safety benefits (Garibay-Orijel, Ramírez-Terrazo, and Ordaz-Velázquez, 2012). Indeed, mushroom yields around the world have decreased over the years, perhaps as a result of climate change.

Women and Food:

Closely related to foraging, women are engaged in cooking and eating fungi. The preparation of mushrooms, including cooking and storing, is mostly done by women around the world (Garibay-Orijel, Ramírez-Terrazo, and Ordaz-Velázquez, 2012). According to the Bureau of Labor Statistics, on an average day American women spend about twice as much time as men preparing food and drinks. On an average day, 70% of women reported preparing food, compared to 43% of men. This means that women not only do more food preparation than men, but that more women are engaged in this activity than men (Charts by Topic: Household Activities, 2016). This should come as little surprise to feminists, who have long articulated that women do more unpaid household labor than men. This work is often devalued, taken advantage of, and taken for granted as part of the normal gender roles and relationship between men and women. Although women do more unpaid cooking, men dominate professional cooking. Women and men attend culinary school in equal proportions, but most celebrity chefs and paid culinary professionals are men. Men also outnumber women 7 to 3 at more prestigious culinary schools, and when women do go into the culinary arts, they are disproportionately represented in baking and pastry programs (Jones, 2009). For instance, the B.A. program in pastries at the American Culinary Institute is made up of 86% women (Tanner, 2010). Both of these trends represent how “women’s work” is undervalued in society. At culinary schools, pastry sections are called the “pink ghetto” or “pink section” because they are dominated by women. Food and work are both gendered in society, and baking and desserts are associated with femininity (Brones, 2015). This relationship to cooking also creates a special relationship to fungi, even if that relationship is not immediately obvious.

The first relationship to fungi is the relationship between women and yeast. To begin, bread of some kind or another has been eaten by humans for at least 30,000 years, but early breads were unleavened flat breads, made from ingredients other than grains. The first recorded use of yeast is from Ancient Egypt, where yeast was used to leaven bread and make beer 6,000 years ago. No one knows how yeast was discovered. It may have been floating in the air and landed in some bread, resulting in lighter, fluffier bread. Or, it is possible that yeast entered bread when ale was added to dough instead of water. In any event, the discovery of yeast necessarily coincided with several other developments in human history. First of all, it arose out of settled societies which domesticated and grew grains. Grains were domesticated by ancient farming civilizations about 8,000 years ago, but for most of human history, people foraged for their food. Settled agriculture allowed for population growth, the birth of cities, the invention of written languages, private property, and social stratification. It is also considered to be the beginning of patriarchy, as with the invention of private property, monogamy and the associated control of women ensured the transmission of property through sons. Settled agricultural societies were possible because of a surplus of food. This surplus also allowed for the creation of professions; thus, in Egypt, there were professional bakers, herders, teachers, doctors, scribes, and so on. Egyptian art depicts both men and women engaged in bread making. However, it is more likely that men were involved in the actual profession of baking, while women made bread in the home or as supporters. This gendered dynamic continued through time. For instance, in Medieval Europe, women prepared food for their families or homes, whereas men were professional breadmakers in guilds.
In both examples, the work of women was essentially the same, but not given the same social value. So, although women are more likely to work with yeast, or for that matter cook with any other fungi, their work is not seen as mattering in the same way professional culinary work matters.

While women have a close relationship to food, and by extension to fungi as a food, due to their role as cooks for their families, this often goes unnoticed or unheralded. Despite gender inequalities, women have managed to influence society through cuisine. For instance, countries can roughly be divided into mycophobic and mycophilic depending upon their relationship to mushrooms. France is viewed as a mycophilic culture, with many recipes calling for mushrooms and a history of foraging for them. It was largely through women that this French passion for mushrooms spread to other countries. For instance, Hannah Glasse wrote an English cookbook in 1747, The Art of Cookery Made Plain and Easy, which drew from French cuisine and included 110 mushroom recipes. Eliza Acton’s cookbook Modern Cookery for Private Families (1845) and Mrs Beeton’s Book of Household Management (1861) also included dozens of mushroom recipes, while cookbooks focused on the historical cuisine of the British Isles tended to have few. The first American cookbook, by Amelia Simmons in 1796, does not feature any mushroom recipes, but by the 1800s various cookbooks featured mushroom dishes. Campbell’s Cream of Mushroom Soup, introduced in 1934, popularized mushrooms as part of American casserole cuisine. And one of the most popular American cookbooks of the 20th century, Julia Child’s Mastering the Art of French Cooking (1961), included dozens of mushroom recipes, while Irma Rombauer’s The Joy of Cooking included 30 recipes with mushrooms (Bertelsen, 2013). In each of these examples, women were able to influence culture by working within the traditional social space offered to them. The household has traditionally been viewed as women’s sphere of influence, and books about cooking, by women for women, were a way that women exerted power within the confines of tradition. In doing so, in a small way, these cultures were changed. Today, mushroom consumption has exploded.
The global export value of mushrooms was almost 1.75 billion dollars in 2010, compared to 250 million dollars in 1990 and a negligible amount in 1970.

Another way in which women relate to fungi is through the ways that food is gendered in society. Because mushrooms are a viewed as a vegetable and something healthy, one might assume that women eat more mushrooms than men. After all, women are told to watch their weight, monitor their food intake, and make healthy food choices. At the same time, masculinity is connected to meat eating. Eating mushrooms seems to be something lowly and feminine. There is even a racial and ethnic component to eating mushrooms, as they are associated with mycophilliac cultures such as India, China, Japan, and Russia. Surprisingly, men and women in the United States actually eat roughly the same amount of mushrooms each year. According to the USDA, women consume about 8% more fresh mushrooms then men, but men are more likely to eat processed mushrooms. As a whole, men ate about 49% of all mushrooms produced in the United States, whereas women ate about 51% (Lucier, Allhouse, and Lin, 2003). Yet, this isn’t to argue that gender does not shape mushroom consumption. In Mycophilia, Eugenia Bone, a food writer from New York, expressed disdain when she attended a Midwest mushroom foraging event and the men in attendance planned on battering their mushrooms or putting them on steaks (Bone, 2011). In this example, gender, geography, and class intersected to generate a different sense of taste from the Midwestern men with less social capital. In another example, the white truffle is the most expensive food in the world, at $3000 per pound (Bone, 2011). However, men with power are more likely to obtain and ingest truffles. For instance, a 3.3 pound truffle was auctioned for $330,000 to a billionaire named Stanley Ho, a Macau casino owner. The truffle itself was discovered by an Italian truffle hunter and his father, along with their dog. Gordon Wu, a property tycoon from Hong Kong purchased two truffles at an auction for 125,000 euros. 
An anonymous Chinese writer purchased a truffle for $120,000 at an auction. Globally, women and children are more likely to be among the world's poor and less represented among the super wealthy. The truffle's value stems from the fact that it is rare, labor intensive to harvest, and hard to cultivate commercially. At the same time, some of its value is more symbolic than material, as truffles are abundant in China, where labor is cheap enough (i.e., exploited) that they are raked from the earth by humans rather than found by trained dogs and pigs. Yet these black truffles are viewed as inferior to European black truffles. In this sense, when food is associated with power and privilege, women are less likely to partake in the indulgence. So, while men and women may eat equal amounts of mushrooms, how they eat them may differ. I would hypothesize that men eat them more often on pizza, battered, on burgers, or on steaks, while women eat them more often in salads and as a meat substitute. Class certainly shapes mushroom consumption as well, not only in access to elite foods like truffles, but in consumption of mushrooms in general. Bone (2011) noted that the biggest consumers of mushrooms were those with incomes 350% above the poverty line.

(image stolen from National Geographic…)

Mushrooms, Women, and Witchcraft

Another way in which mushrooms have been associated with women is through medicine and witchcraft. In Europe, mushrooms have often been associated with mischief and evil. French names for mushrooms translate to eggs of the devil, devil's paintbrush, and toad bread. Toadstool and toad hat are names derived from Danish words for mushrooms. In Estonia, Fuligo septica, a large yellow slime mold, is called "shit of a witch" (Dugan, 2008). An edible yellow fungus commonly found on dead branches is called "witches' butter." Western Europe, and the British Isles in particular, associated mushrooms with witchcraft (Bertelsen, 2013). In Russia, Baba Yaga is associated with magical tree mushrooms. In one story, she spares the life of a hedgehog that is eating a mushroom, with the understanding that the hedgehog will become a boy and serve her. She is also accompanied by spirits that live under mushrooms. In Italy, there is a story of a witch who disguised herself as a mushroom to figure out who was stealing her cabbages. Mushrooms have also been associated with fairies: in 1599, the term "fairy ring" was first used to describe a ring of mushrooms left behind by dancing fairies. In Germany, fairy rings were known as Hexen rings, where witches would dance in a circle on Walpurgis Night, the night before May Day (Dugan, 2008). Plant diseases caused by fungi were sometimes blamed on witches, as exemplified by a decree of Pope Innocent VIII, which noted that witches cause crop failure. Witches were also blamed for the poisoning of cattle, which was itself often caused by grain fungi. Witches were believed to use fungi in herbalism; Inquisition documents indicate the belief that witches used puffballs in potions in the Basque Country. Amanita muscaria is known as the "witch's mushroom" in Austria, and witches in Portugal were said to use a hallucinogenic mushroom called Panaeolus papilionaceus.
There is also a Finnish belief that if someone is bothered by a kobold-like creature, a certain species of mushroom could be fried in tar, salt, and sulfur, then beaten, whereupon the woman who controlled the kobold would appear to release the creature. In the Balkans, dried mushrooms were placed in windowsills to ward off witches (Dugan, 2008). It seems that mushrooms have been associated with witches, mischief, powerful women, and misfortune. There are some exceptions, though. For example, in China, the lingzhi mushroom, or mushroom of immortality, was associated with Kuan Yin, the goddess of healing and mercy (Bertelsen, 2013).

(Witches Butter Fungus- Image from Birds and Blooms)

There may be some actual connections between witchcraft and fungi. For instance, there is a connection between ergotism and witch trials. Ergotism is caused by the grain fungus Claviceps purpurea, which colonizes cereal crops, producing nectar-like droplets containing spores. The disease is called ergot, from the French word for spur, because the fungus forms rooster-spur-like growths on the infected plant. In medieval times, due to wet weather conditions, up to 30% of the harvested grain was actually fungus. When humans or animals ingest the fungus, many symptoms can arise. The infected can feel intense heat over their body and lose blood flow to their extremities, causing the limbs to rot and fall off; the condition was called St. Anthony's Fire because of these symptoms. The alkaloids produced by the fungus can also cause vomiting, diarrhea, the sensation of ants on the body, twitching, hallucinations, seizures, and distortions of the limbs. Ergotism outbreaks occurred through the 1800s. Peasants were especially vulnerable, as they had to eat lower quality grain or could not afford to waste the diseased grain. Children were particularly vulnerable, with 56% mortality in some outbreaks. Historians such as Mary Matossian have hypothesized that witch trials and "bewitchings" may actually have been the result of ergotism. She argued that most witch trials happened in river valleys in southwestern Germany and southeastern France, where cool and wet conditions would have promoted fungal growth. Both regions grew rye, and peasants in the area would have consumed up to three and a half pounds of bread a day. There was only one witch trial in Ireland, where comparatively little grain was grown. Witch trials often happened in the fall or winter following wet years. Even the Salem witch trials followed this pattern, occurring after a cool spring.
The symptoms reported in the witch trials were similar to those of ergotism, and the fact that children reported them is consistent with children's greater vulnerability to the disease. It is interesting to note that Albert Hofmann developed LSD while studying ergot grain fungi (Hudler, 2000). In any event, it is possible that outbreaks of ergotism were blamed on witches and served as a catalyst for witch hunts.

(A vintage Halloween postcard featuring a costumed witch with fungi)

Beyond this association with witch trials, it is useful to dissect what a witch is. A witch is symbolic of a woman with power and knowledge. For thousands of years, humans obtained an immense amount of knowledge from the natural world in terms of edible foods, useful medicines, dyes, animal movements, etc. Because women had an important role in gathering foods, they held special knowledge. Further, prior to the invention of patriarchy, women likely had important roles as religious or spiritual leaders and healers, and religions often centered on goddesses. Over time, with changes in social structures and the introduction of Christianity, the role of women was diminished and their knowledge was viewed as threatening and connected to paganism. In this way, the idea of the witch is a way to diminish and persecute the traditional knowledge and roles of women. Witches may be associated with mushrooms because of how mushrooms were used in healing and rituals. Indeed, some fungi have healing properties. Mushrooms are valued in Chinese cuisine, culture, and medicine. Chinese medicine includes 100 species of mushrooms, including the wood ear mushroom, which was eaten for its perceived improvement to circulation and breathing. The health effects of mushrooms have only recently begun to be recognized in the West. Mushrooms contain polysaccharides, which boost the immune system, and can be a source of protein, potassium, riboflavin, niacin, vitamin D, copper, and selenium. Chanterelle mushrooms are 11 to 24% protein; in contrast, the average potato contains 3.9% protein. Mushrooms also secrete antibiotics (Bertelsen, 2013). The most famous fungal cure is penicillin, but fungi are used in many modern medicines. Beano is made with the fungus Aspergillus niger, which produces an enzyme that breaks down the complex sugars that cause flatulence. Lovastatin and pravastatin are both derived from fungi and used to treat high cholesterol. Cyclosporin comes from a fungus and is used to suppress the immune system for organ transplants.
Shiitake mushrooms may have cancer-fighting properties (Hudler, 2000). Gypsy mushrooms may be effective against herpes, the steroids used in birth control come from fungi, turkey tail mushrooms may be a treatment for hepatitis C, and Fomitopsis officinalis has been used to treat tuberculosis and E. coli infections. Midwives in Germany and Italy used ergot, the deadly grain fungus, to induce labor (Bone, 2011). Mold was used by the Chinese, the Ancient Egyptians, and the French to treat wounds (Hudler, 2000). Of course, the benefits of fungi should not be overstated. They may be hard to digest due to their chitin cell walls. Some fungi are deadly. Designating fungi a superfood is a marketing ploy to sell more mushrooms. Still, the healing properties of many mushrooms suggest that witches were associated with mushrooms because healers traditionally used them as medicine. By associating healing with evil and witchcraft, women's knowledge, experience, and power were delegitimized. At the same time, through witch hunts and trials, women themselves were terrorized with violence and the threat of violence as a form of social control.

Women and Mycology

It should be clear that one of the themes connecting women and fungi is the value placed on women's knowledge and work in society. It is fitting, then, that the final point concerns how women have contributed to the science of mycology. In this feminist narrative of history, women have probably been closely connected to fungi for most of human history as foragers for food and as healers. With the end of hunting and gathering societies in many parts of the world, women took on new, but subservient, roles in society. Still, women continued to be connected to fungi through their preparation of food and role as caregivers, even if this labor was not given social importance. This final segment of history is about women struggling to assert themselves in male-dominated science. Outside the realm of formal science, women are often responsible for passing down knowledge of mushrooms to their children. Even the science of mycology depended upon the knowledge of women. For instance, Carolus Clusius and Franciscus van Sterbeeck, who lived in the sixteenth and seventeenth centuries respectively, were two of the first pioneers of mycology. These men relied upon the knowledge of wise women, known as herb wives, to obtain information about mushrooms (Garibay-Orijel, Ramírez-Terrazo, and Ordaz-Velázquez, 2012). It is tragically ironic that while men were developing a science based upon the knowledge of women, these very same women were being persecuted as witches for their knowledge of nature.

Later in history, Mary Elizabeth Banning was a pioneer in mycology who sought to identify mushrooms in the 1800s (Bertelsen, 2013). She identified 23 new species of fungi and completed one of the first guides to the mushrooms of the New World. She worked as a teacher to support her mother and sisters after her father died, but found time to pursue mycology, then considered a branch of botany. Men dominated professional botany, though women sometimes worked as amateur botanists. For 20 years, she studied the mushrooms of her home state of Maryland at a time when there was only one book on American fungi. She never earned money or recognition and was often viewed as a lunatic by those outside the scientific community, though she did correspond by mail with various scientists (Pugliosi, 2016). Her life illustrates several barriers facing women who wish to pursue science. For one, she was burdened with care work for her family, and her mushrooming adventures were limited by its constraints. At the same time, her work was stymied by the fact that she also had to earn a wage as a teacher; her "hobby" as a scientist was an unpaid third shift. While she produced useful information, she never published it, owing to a lack of confidence and her outsider status in relation to scientific institutions.

(An illustration by Mary Elizabeth Banning)

In a similar but less tragic example, Beatrix Potter was interested in mycology and painted hundreds of scientifically accurate portraits of fungi. She studied fungi under a microscope and submitted a paper on fungal spores to the Linnean Society of London. She began creating watercolor paintings of mushrooms at the age of 20 and sent her paintings to the naturalist Charles McIntosh. In turn, McIntosh gave her scientific advice and sent her specimens to paint. Potter also began studying lichens, which she wrongly believed were simply fungi rather than a symbiotic relationship between fungi, algae, and bacteria. The mycologist George Murray rebuffed her, both for her position on lichens and for her earlier work on spore germination, which he said had already been studied in Germany decades earlier. She was told to make revisions, and her paper was never published. Female students were not accepted into the society until 1905, so she was unable to present the research herself. Her biggest contribution to mycology was her illustrations, which were used for fungus identification (Flemming, 2016). Potter went on to achieve fame as a children's book author and illustrator, but her scientific endeavors went largely unnoticed by history. Again, she was shut out of a world controlled by men, and men mediated her access and legitimacy within science.

(Mushroom watercolor painting by Beatrix Potter)

With the successes of the early women's rights movement and other social movements, the social space within science slowly expanded for women. In 1950, Elizabeth Hazen and Rachel Fuller Brown discovered nystatin while trying to isolate antibiotics from Streptomyces noursei (Hudler, 2000). Nystatin was one of the first anti-fungal drugs and is used to treat various Candida infections such as diaper rash, yeast infections, and thrush. The two scientists worked together for the New York Department of Health and went on to develop two antibiotics. Developing anti-fungal drugs is particularly challenging because, as noted earlier, fungi are closely related to animals, which makes fungal infections harder to fight than bacterial ones. Bacteria are simpler organisms, with a cell wall but without the complex cellular structures of animals and fungi, which makes them easier to destroy. Drugs developed to fight fungal infections may attack healthy human cells, as the two are more similar (Staughton, 2002).

Another contribution to mycology was the discovery of the cause of Dutch elm disease, a fungal disease that destroyed elm trees in Europe and the U.S. The cause was discovered by a team of five female Dutch scientists (Hudler, 2000). The source of the devastating tree disease was uncovered in 1921 by a team led by Johanna Westerdijk, a plant pathologist and the first female professor in the Netherlands. She wrote over 70 papers on mycology and plant diseases and supervised over 55 PhD students, half of whom were women. It was her student Marie Beatrice Schwarz who isolated the fungus infecting elms, and another student, Christine Johanna Buisman, who developed Dutch elm disease-resistant elms. The project Westerdijk started continued until the 1990s.

“Moldy Mary” was another contributor to mycology. Alexander Fleming discovered penicillin after observing mold attacking bacteria in a petri dish. Later, a young lab assistant named Mary Hunt, nicknamed “Moldy Mary,” was hired to collect moldy produce so the molds could be tested for penicillin production. Some of the cantaloupes she collected indeed contained a culture of Penicillium chrysogenum, and many strains used in modern penicillin production descend from her moldy melon (Hudler, 2000). Another contributor to knowledge about fungi was Valentina Wasson. Unfortunately, her husband, R. Gordon Wasson, is more famous than she is for research into the cultural relationship between people and mushrooms. He was struck by the cultural difference between them when, on their honeymoon, Valentina, a Russian, began collecting mushrooms. He was terrified that they were toxic, a reaction that highlighted the difference between his American upbringing and her Russian one, and how each shaped their relationship to mushrooms. The incident inspired the couple to research these cultural differences together, and they authored Mushrooms, Russia and History in 1957. They went on to travel to Mexico, where they studied the relationship of indigenous people to mushrooms and introduced psychoactive mushrooms to a mass American audience through Life magazine (Hudler, 2000). Unfortunately, this attracted droves of Western visitors to the Mazatec community, and especially to Maria Sabina, who was interviewed in their book. Sabina was investigated by the Mexican police for selling drugs to foreigners and had her house burned down. Thus, while the Wassons examined cultural differences in how people relate to mushrooms, their work had a negative impact on the indigenous people of Mexico.
Finally, as one last tidbit of mycological history: all button mushrooms, the mushrooms commonly used on pizza, in salads, in canned mushrooms, and in cream of mushroom soup, come from a spore discovered by the Dutch scientist Gerda Fritsche in 1980 (Bone, 2011).

A depiction of “Moldy Mary”

While women have made contributions to mycology over time, gender inequality in the field persists today. There are twice as many male members of the Mycological Society of America (MSA, founded in 1932) as female members. Only 13% of MSA presidents have been women, starting with Marie Farr in 1980. MSA secretaries have been consecutively female since 1991, but treasurers have historically been men. Various MSA awards have also gone disproportionately to men, although female students have won travel grants in greater proportion than their male counterparts. The majority of articles published in Mycologia are written by men (Branco and Vellinga, 2015). Mycology is not unique among the sciences; its gender inequality is comparable to that of related fields such as botany, ecology, and lichenology. This raises the question of why women do not enter the sciences, or why, when they do, they are less active in leadership roles.

Oddly enough, I wanted to be a botanist when I was a kid. I even went through a period in the 5th grade when I wanted to be a mycologist. I attended science camp and continued to be interested in science through high school. However, I think a major deterrent for me was a lack of confidence and a fear of math. Low self-esteem is pretty common among girls. There are varying statistics on its occurrence, but if one believes the figures put forth by Dove's Self-Esteem Fund, as many as seven in ten girls believe they are somehow deficient. If girls believe they are not smart or capable enough, they may be deterred from science. And if they do enter the sciences, they still must contend with the social expectations placed on women, such as having a family and doing unpaid labor at home. This cuts into time for research and conferences and limits their ability to become leaders in their fields. They may also face sexism and sexual harassment in their work environments, as many women do. Finally, as has already been outlined, scientific institutions have not been welcoming to women in the past and have suppressed women's knowledge. Rationality itself is associated with masculinity, whereas femininity is associated with emotion. But rather than viewing one as inferior, or reason and feeling as opposed, they should be seen as interconnected. The drive to study the natural world, interest in research, dedication to a subject, and passion for science all come from an emotional place.

Conclusion:

I am certainly not a scientist, but I hope that the presentation and accompanying hike provided a few insights about fungi. Personally, I find fungi pretty fascinating and hope to learn more about them in the future. That is the goal of feminist frolics, to get together, share knowledge, and hopefully open the door to future learning. For thousands of years, the knowledge and experiences of women have not been valued. I think that learning together and sharing builds confidence, community, and self-efficacy. It is also a way to find a place in nature, science, and history. Hopefully you will join the Feminist Justice League in future feminist frolics. I think you will find we are a bunch of fun gals and fungi!

My Adventures as an Egg Donor

H. Bradford

8/1/17

I remember when I was a student teacher, I taught a lesson on the social construction of gender. A seventeen-year-old smarty pants wanted to argue that gender was not socially constructed. After all, a woman can't get another woman pregnant! With a smile, I told him that I had, in fact, impregnated three women. He was taken aback by this and retreated from the argument (which to him really was more about biology than the social construction of gender). The story of egg donation came up again tonight at Socialism and a Slice, a monthly meeting of local activists. The topic was again the social construction of gender, but also the promise that reproductive technologies can usurp some aspects of biological determinism in reproduction. Of course, reproductive technologies exist in a social context, and I am not for the blind worship of science and technology. Yet, at the same time, I like to think that someday technology can be used to grant people of all genders and biological arrangements access to parenthood.

In 2007 and 2008 I was really struggling. I had a large bill with St. Scholastica, was making less than minimum wage as an AmeriCorps volunteer, worked two to four jobs, and was just beginning to pull myself out of the black hole that is depression. My long experience with depression is another story. But, to make that long story short, I spent a good portion of my 20s as a non-existent person. I hid from the world, didn't pay my bills, and waited patiently for death. Needless to say, I had a lot of financial things to deal with once the clouds began to clear. One solution to this problem was working myself into a demoralizing frenzy of drudgery to climb out of the hole. Another solution, in addition to that one, was to donate eggs. I began to look into this option. The closest place to donate eggs was a hospital in Minneapolis, but it paid around $3,000 per successful donation. I filled out a long application; I believe it was over 25 pages. The application was accepted and I was invited to the hospital to continue the process, which would include a mental health evaluation, a health exam, and an interview.

I believed that at any part of the process, I would be weeded out. But I am generally a pretty healthy person. I have never smoked, drunk alcohol, had surgery, had a major illness, been hospitalized, or tried an illegal drug. On paper, I seemed like a good candidate: I have many hobbies, was a healthy weight at the time (they had weight restrictions), intelligent, driven, etc. I even passed the mental health evaluation; despite some struggles with anxiety and depression in my early 20s (which I can talk about later), they were not red flags. I passed each barrier, which was a relief, as I had invested my meager resources in traveling to Minneapolis for the evaluations. Finally, they took my photo and told me I would be put on the roster of possible egg donors. Within a few weeks, I was told that I had been chosen to donate. It should be noted that it was an anonymous donation, so I would never know the recipient of the eggs, nor would that person know me. I was simply donor number 306.

The donation process was involved. It began with a visit to the hospital to go on birth control pills so that my menstrual cycle would align with the recipient's. I was told to begin them on a certain date, after which I would begin a series of injections. I was given a large amount of hormones, as the goal was to make my body produce a dozen or more mature eggs. I injected myself with Gonal-F once or twice a day, depending upon the stage of the process. Towards the end it was more, and in all I spent about three weeks taking hormones. In addition to the Gonal-F, I also took injections of a medication (Lupron) that suppressed ovulation, simulating menopause. Beyond this strict schedule of injections, the process also involved early morning drives to Minneapolis, as my blood was tested for its estrogen level and I was given ultrasounds to check on the progress in my ovaries. It was an intense time, as I would rush to the Cities and then drive back for work. Towards the end, my ovaries felt like bags of marbles. I felt heavy. I am sure it was imagined, but I felt droopy and weighed down. The first time that I donated was in November, and I remember making a large Thanksgiving meal for my family. I remember them attributing this to my mega-dose of estrogen, as if they believed that I had somehow been magically domesticated by the hormones. I was deeply offended. Despite being pumped full of estrogen and in a fake state of menopause, I was not weepy, crabby, plagued by hot flashes, or somehow more feminine. Really, I just like cooking things from time to time…hormones or no hormones. I felt entirely like myself, just weighed down and worn out from the driving. In any event, after daily trips to the Cities for a week…the time finally came to donate. I was given a dagger-sized syringe and a date. I was told to impale myself in the butt and then show up the following morning for the extraction. No eating. No drinking.
This final injection was some sort of magic potion (HCG) that would mature the follicles and trigger the release of the eggs.

I made several dishes for Thanksgiving this past year. Not one of them was the outcome of my hormone level.

The extraction itself was uneventful. I was put to sleep, a needle was inserted into my vagina, and the eggs were somehow sucked out of my ovaries (I believe?). The extraction took less than an hour, but I was moved to another room to rest for an additional hour. In all, around 15 or 16 eggs were removed, though I believe that my second time donating it may have been as many as 23. These eggs would go on to be fertilized, and the most promising would then be implanted in the recipient. The failures and duds would be destroyed, with the option of freezing some eggs for later use. Thus, I am responsible not only for three pregnancies (since I donated three times and each time resulted in a pregnancy), but also for some abortions (depending upon how one defines such a thing). Because of the large number of eggs, I was told that I was a good donor. I also did not experience much pain or any complications after the first donation. Again, I handled it pretty well! I was given a check for my efforts as well as parting gifts from my recipient. The first time, the gift included a card and some gift cards. In all, it was pretty cool. I used the money earned from three donations to pay off my car loan, put money towards my St. Scholastica bill, and put a little towards a trip to Cuba.

As I said, I donated three times. The first two times were uneventful and largely successful, but I was kept on a pretty tight schedule. Not long after I donated the first time, I was asked to donate again, and once I had donated the second time, I was asked to donate a third time. It is a pretty intense process. It was a lot of driving, a lot of early mornings on top of working over 60 hours a week, a lot of hormones, and a lot of sedation. Plus, I was saving up for an expensive trip to Cuba. In order to afford it, I worked from March to June without a day off. I have never worked that long a stretch in my life, and I hope never to do it again. Even with the money from the donation process (which I mostly put towards bills), I still had to save several thousand dollars for the Cuba trip. My third donation actually happened shortly after the trip, so I was taking hormone injections while on vacation. That third donation did not go as well.

When I awoke from sedation, I began having odd body spasms. My arms and legs shook. I felt nauseated. The nurse and doctor asked if I had taken any drugs, but I had not done anything unusual. Eventually the uncontrolled trembling stopped, though for the next week, whenever I was resting, I would spasm a little. Because of this reaction, I was told that I could no longer donate. I have no idea why it happened, but I felt angry at myself. I felt angry at my body for betraying me. Had I been a trooper…the kind of person who could soldier on through exhaustion and hormones without complaint or complication…I could have donated my way out of debt. I felt so upset with myself. So, so, so upset! But three times was an accomplishment. Perhaps it was hard on me. Perhaps I was overly tired. Maybe I was anxious. Maybe I hadn't been taking care of myself. Why did the third time go awry? I will never know. But that was the end of my short-lived career as an egg donor.

Having gone through that experience, I have mixed feelings. On one hand, I feel great. It helped me pay off some debt and go on a trip to Cuba. I also feel like I cheated evolution, gender, and biology. In evolutionary terms, success is passing on your genes. I am not sure if the three recipients had successful pregnancies, but supposing that they did, I may have three offspring in the world. I may have more, because of the high incidence of twins from IVF and the possibility that some eggs were frozen. I cheated biology, since for a person born female, reproduction normally requires a lot of effort, and raising a child requires a huge amount of resources and labor. Thus, I feel like the equivalent of a brood parasite, such as a cowbird. I laid my eggs in some other bird's nest and got to fly away, without effort or consequence. Egg donation is a bit of biological trickery on my part. Finally, I have suffered some gender dysphoria in the past. It is not something I am particularly open about, nor is it immediately obvious, because of my feminine gender presentation. In this regard, I feel that I transcended some of the limits of my gender and biology. I was able to express both in a non-conventional way. I've impregnated multiple women whom I don't even know. I kind of felt like a stud.

On the other hand, there is a darker side to all of this. Egg donation was hard on my body. After the third donation, I actually developed wrinkles around my eyes. The skin on my face became like crepe paper…very fragile and wrinkled. It was an odd reaction that went away over the months following the donation (thus I know it was connected to the egg donation rather than to natural aging). I also woke up convulsing on a hospital bed. Then, I felt that I was blamed for this reaction (as I was barred from donating again and asked if I had taken drugs). The reason I donated was that I was in debt. I was overworking myself. My debt was related to my depression and the high cost of education. In the context of capitalism, those who donate will always mostly be lower-income women. The cost of IVF is extremely high, so the recipients will always be women with access to money. Of course, both women in the situation are oppressed. Why do women feel that they must spend tens of thousands of dollars on reproductive technologies? Why not adopt? Why is going through the process of pregnancy so important? I don't blame the women for their choices, nor do I look down upon them. However, choice exists in a social context, and our society does tell women that motherhood and pregnancy give value and meaning to life. Women who choose not to have children are seen as deviant, selfish, or of lesser character. To make matters more complex, there are plenty of women with infertility issues who can't afford IVF or adoption (which itself costs tens of thousands of dollars). For instance, now that I am older and my fertility is waning, I know that I would never be able to afford to have children through adoption or IVF. It is plainly too expensive. Additionally, why was I considered a "good donor?" Partially because of supply and demand. The demand is for young, educated, talented WHITE women, as most recipients are professional white women.
So, while I support reproductive technologies, in the context of capitalism and patriarchy, there is inherent exploitation involved. I was so miserably poor I really didn’t care if there were medical complications. I wanted a better life. I became upset when my own body became a barrier to a better life.

Despite the negatives, I mostly draw a positive balance sheet from the experience. I needed to pay off a bill with St. Scholastica so that I could further my education. I have…furthered my education a bit too much…but it certainly opened a door for me. I feel proud of my unique gender experience. I feel smug about my place in evolutionary history. I traveled to Cuba, which was a wonderful and educational experience. I paid off my car early, improving my credit score and freeing up more spending money. In all, I have little to complain about. As for the exploitative nature of the situation, that could be mitigated by free higher education, living wages, universal medical care, etc. It was certainly odd that I used money from the donation process to travel to Cuba, where education and health care are free, despite a much smaller GDP and an embargo. As for the recipients, I am thankful that I was selected and hope that they have a happy family. I hope that their children turned out to be smart, talented, well-behaved, thoughtful, independent, creative, angelic little creatures. I hope that donor 306 was a blessing in their life and a mystery to puzzle over, rather than an accursed brood parasite.

Feminist T-Shirts: Severing the Thread between Capitalism and Feminism

H. Bradford

7/5/17

I’m not going to lie. I like to wear things that advertise my politics. It’s terrible. It’s hypocritical. But it is also a way to tell the world that I am weary of the status quo, and sometimes it’s a way to start a conversation. It is also an expression of self (which itself should not be idealized) and a message to others like me that they are not alone. Recently, when I saw a really cool feminist t-shirt at a store at the mall, I really wanted to buy it. I didn’t. Still, I am not a saint, and my wardrobe is made from the blood and sweat of exploited workers. Thus, this piece of writing is not a call for people to be perfect. Certainly, I have a lot of room to grow. Instead, it is a call to analyze a disturbing trend in feminism with the hope that this knowledge can shape our organizational tools and demands. The trend this piece examines is the rise of the feminist t-shirt and the accompanying ideology of corporate feminism. To this end, this topic is July’s educational component of the Feminist Justice League’s “feminist frolic.” Many topics have been discussed over the past year, and it seems appropriate to explore how feminism has been appropriated by capitalism, while doing some small act to combat this trend: making our own t-shirts.

This year has seen an increase in feminist activism. Locally, there has been an explosion of feminist events to partake in. Nationally, between three and five million people in the United States participated in the Women’s March, making it the largest single-day protest in American history. More people participated in the Women’s March than are members of the U.S. military. This burst of feminist activism is certainly a welcome development. Feminism is cool right now. As a result of the rise in popularity of feminism, there has been an increased demand for feminist t-shirts. T-shirts with slogans such as “Feminist AF,” “The Future is Female,” and “Nevertheless, She Persisted” are a few examples of popular mantras this year (Spinks, 2017). While it is great that feminists proudly wear their politics on their sleeve or chest, the trend is problematic in that it may ignore the working conditions behind the production of these shirts and reduce feminism to a profit-making fashion statement. For instance, in 2014 women at a sweatshop in Mauritius were paid 80 cents an hour to make t-shirts that said, “This is what a feminist looks like.” In all, they earned less than $155 a month working at a factory which produced over 40 million t-shirts a year for Urban Outfitters, Next, and Top Shop. Further, women who did not meet the quota of 50 shirts a day were subjected to discipline (Ellery, 2014). At their meager wages, it would take the workers about 72 hours to buy one of the shirts produced at the factory. The workers themselves shared cramped rooms with 15 other women. The women often lived in the dorms for months without seeing their families overseas. The iconic shirt was worn by celebrities and politicians, and was even featured in Elle magazine. Astonishingly, the shirt itself was used by the Fawcett Society, a nonprofit that promotes the labor rights of women. When confronted with the conditions of the Mauritius factory, the non-profit argued that the garments were ethically produced (Bianco, 2014).

The case of the “This is what a feminist looks like” t-shirt is appalling, but it is far from an isolated incident. Beyonce, a self-proclaimed feminist, also came under fire because the clothing of her company, Ivy Park, was produced at a sweatshop in Sri Lanka. Workers made less than 65 cents an hour and would need to work over a month to afford the leggings that they produce. Similar to the women from the factory in Mauritius, they worked 60 hours a week and stayed at a boarding house, as many came from rural areas (Euroweb, 2016). Dior sold a $710 t-shirt with the slogan, “We should all be feminists.” The quote was from an essay by Chimamanda Ngozi Adichie, a Nigerian writer. Some proceeds from the shirt were donated to the Clara Lionel Foundation, founded by Rihanna (Ngabirano, 2017). The investment of some proceeds of an overpriced shirt into a celebrity foundation should raise some eyebrows. All of these examples illustrate how corporations have sought to profit from the popularity of feminism, but they also offer insight into the troubling nature of clothing production.

Globally, around ¾ of all garment workers are female (Spinks, 2017). The garment industry has historically been dominated by women and has traditionally been very dangerous. In the United States, one of the deadliest industrial accidents was the Triangle Shirtwaist Fire, which took the lives of 123 women and 23 men. The accident occurred on March 25, 1911, when a fire erupted in the 10-story building near the end of the workday. A fire that is believed to have originated in a scrap bin quickly consumed the building. There were no alarms in the building, and the doors were locked to prevent theft. Many people jumped to their deaths onto the street or into the elevator shaft, as the firefighters’ tallest ladder only reached the seventh floor. The Triangle Shirtwaist Fire is noteworthy because it radicalized the labor movement, generated demands for more safety regulations, and was important in the founding history of International Women’s Day. Yet, little has changed since 1911. While clothing production has shifted away from industrialized countries like the United States, the working conditions are as inhumane as a century ago. The Tazreen Factory Fire of 2012 in Bangladesh illustrates this point. The massive fire killed 112 Bangladeshi workers at the Tazreen Factory, which produced clothes for Walmart, Gap, and Disney, among other companies. Walmart refused to offer compensation to the survivors and families of victims. Just like in the Triangle Shirtwaist Fire, the factory doors were locked. Victims had to break windows to try to escape the inferno. When a foreman told workers that there was a fire, a manager told the workers that there was no fire and they should continue working (Survivor of Bangladesh’s Tazreen Factory Fire Urges U.S. Retailers to Stop Blocking Worker Safety, 2013).

While there are many accidents each year in the garment industry, another startling example of the horrors these workers face was the Rana Plaza collapse, also in Bangladesh. On April 24, 2013, the eight-story Rana Plaza in Dhaka, Bangladesh collapsed, killing 1,134 workers in the largest garment industry accident in history. JCPenney, Walmart, Benetton, and other brands and stores were connected to the garments produced at the factory. Since then, North American companies have signed a safety plan they call the Alliance to ensure safety standards in Bangladesh. However, the plan has been criticized as industry-driven and not transparent. Companies must pay for inspections, but are not obligated to pay for upgrades related to safety concerns. Instead, Alliance signees have financed loans to suppliers for safety improvements (Kamat, 2016). This is surely a boon to brands, who can make more money from loans than they can from investing their profits into safety improvements. The safety issues are not a matter of bad luck, but characteristic of capitalist production. Bangladesh is one of the cheapest places in the world to make garments. It is second only to China in garment exports and employs 5 million people in the garment industry (Kamat, 2016). The low cost of production comes at the expense of safety. Since October 2015, 3,425 factories in Bangladesh have been inspected, but only eight have passed inspection (Tomes, 2017).

Many feminists are mindful of the horrific conditions of the garment industry. Officially, merchandise for the Women’s March was made and printed in the United States, but there were knock-offs, and other organizations may have produced garments not made in the United States. However, simply because a t-shirt was made in the United States does not mean that it was ethically made, since a t-shirt has many inputs (Spinks, 2017). At the same time, “Made in the United States” does not necessarily mean sweatshop-free, as 50% of sewing shops in the United States fit the definition of sweatshops, i.e. they break one or more federal or state labor laws. 85% of sweatshop workers are women between 15 and 25 years old (Feminists Against Sweatshops, n.d.). To make matters worse, even if a person purchased clothing from an ethical source, the cotton used in the clothing itself is produced with extremely exploited labor. In Turkmenistan and Uzbekistan, hundreds of thousands of citizens are mobilized to grow and pick cotton under the threat of loss of land, punishment, and public humiliation. The governments of these countries maintain monopolies on cotton production, selling it at under the cost of production to remain competitive (Skrivankova, 2015). US cotton is heavily subsidized, as US cotton farmers receive a total of about $490 million in subsidies. The Chinese government offers $8.2 billion in subsidies to cotton farmers. The large subsidies make it harder for poorer countries to compete, which in turn increases the level of exploitation needed to maintain profits. Beyond the human cost of this, there is an environmental toll. Although cotton is grown on only 2.5% of the world’s agricultural land, it uses 16% of the insecticides and 7% of the herbicides used in agriculture. It also requires huge amounts of water. In Central Asia, this resulted in the destruction of the Aral Sea, once the fourth largest lake in the world and now 10% of its original size (Organic Cotton, n.d.). In India, where cotton has been cultivated for thousands of years, 400,000 children under the age of 18 work in the cotton industry. Children are often employed in pollinating the cotton by hand to increase yields. The children are said to have nimble fingers, and girls are preferable to boys, as they require less punishment to work (Neal, 2014). Of course, these same arguments have been used to justify the exploitation of women in the garment industry.

It is difficult to know the conditions under which a t-shirt is made or all of the inputs that went into the t-shirt because we are alienated from labor. That is, in Marxist terms we are not in control of how things are produced. A t-shirt might have labels that offer clues to the working conditions, but because we are estranged from production, we never see the entire process. This makes it easy to mindlessly consume. It also makes it challenging to ethically consume goods. At the same time, consumers are individuals who exist in a social world. Focusing on consumption atomizes social problems to a matter of consumer choices. Thus, while it is important for feminists to consider where and how a shirt is made, it is also important to consider the dynamics in the world which produce exploitative labor conditions in the first place. This is where the situation becomes far more complicated.

To begin to unravel this, let’s first examine the role of women in the labor force. The vast majority of garment workers are women. This is the case in 2017 as much as it was in 1917. The relationship between women and labor is complex. On one hand, women’s access to waged labor is a basic demand for gender equality. Women’s entrance into the labor market has allowed women to support themselves without male support. This was a historical gain for women, as it allowed women to access such things as divorce, their own housing, their own careers, etc. That is, wage labor has allowed women to be something more than just the property or dependents of men. However, it has also subjected women to harsh working conditions, sexual harassment, and lower wages than their male counterparts. Women can participate in society, but they are still not equal and still dependent upon men. Work alone did not liberate women; it simply subjected them to the oppressions that wage workers face, combined with the gender oppression of patriarchy. Within capitalism, women continue to perform a larger share of unpaid labor, which has resulted in a second shift for women, as they do unpaid household labor as well as paid labor. Since paid labor is, well, paid, it is given more value in society. Unpaid labor is invisible and taken for granted as part of the role of women. Thus, paid labor has also created a dichotomy between labor that matters and labor that does not. The problem is not with wage labor, but with the conditions of labor in capitalism and the challenge of connecting the struggle of women with the struggle for workers’ rights. To make matters worse, the divide between wage labor and unpaid labor has sometimes created an antagonism between women who work at home and those who do not. These sorts of antagonisms are useful in blinding people to their common oppression.

Some writers, such as Leslie Chang and Naila Kabeer, have argued that waged work, even in third world sweatshops, liberates women. It allows women to contribute to their families and find economic independence. Schultz (2015) argued that this position ignores the dynamics that create the exploitative conditions in workplaces of the global south. Free trade policies create a race to the bottom for wages and conditions, generating pressure for countries to have the cheapest labor or production costs. Longer hours, lower wages, and greater environmental destruction are all outcomes of fast, flexible production. At the same time, because of the lower social position of women, they have less ability to resist and organize for better wages or make demands. Thus, they often make much less than men and find themselves discriminated against without much option for social mobility. These exploitative conditions grant super profits to capitalists, while denying the most basic human rights to the workers (Schultz, 2015). This dynamic answers why women are part of the garment industry to begin with. It is an industry that seeks to profit by seeking out the lowest wage workers. Women are not equal to men in society, which puts them at a disadvantage in the labor market. Women in some developing countries may even be new to wage work, having instead grown up in communities that still earn a living from farming. Since its origin, capitalism has pushed farmers off their land into wage work. This dynamic is still at play in the developing world and will only increase with climate change. Bangladesh is extremely vulnerable to natural disasters and climate change, as it is a low-lying country that is often battered by powerful cyclones. This creates pressure on rural populations to move to cities and seek work there.
Bangladesh was not always a major producer of clothing, but became one due to free trade agreements which incentivized a focus on exports, ended textile quotas, and set up export processing zones in the country. This, combined with war, famine, and natural disasters, resulted in the development of its sweatshop economy. So while sweatshops may provide women with jobs, the working conditions are not the inevitable growing pains of development. They are instead constructed by trade organizations, trade agreements, and the larger dynamics of global capitalism. The working conditions within the garment industry can be analyzed, understood, organized against, and changed.

(Bangladesh workers struggling for a union at the Orchid Sweater Factory)

While the right to paid labor is a basic feminist demand, this demand has also been used by the World Bank and International Monetary Fund to justify women’s employment in export processing zones. Export processing zones (EPZs) are free trade zones wherein businesses are exempt from taxes, tariffs, health and safety regulations, etc. Since the 1960s, these special economic zones have sprung up in Asia, spreading to Latin America, the Caribbean, and elsewhere. Once again, some feminists have argued in favor of work in EPZs as a means of escape from patriarchal family dynamics. This argument is unfortunate because it accepts the inevitability of capitalist oppression. EPZs do nothing more than replace the patriarchy of the family with the economic exploitation of capitalism. Women who work at Haiti’s Ouanaminthe free-trade zone making Levi’s jeans face verbal abuse, beatings, interrogation, and threats with guns. Women who work at EPZs in Mexico are subjected to health screenings for pregnancy, personal questions about their sex lives, and short-term contracts. EPZs are only liberating in the same way that capitalism is liberating compared to feudalism (Eisenstein, 2015). As absurd as it seems for a feminist to support sweatshops and EPZs, many feminists did not make the connection between Hillary Clinton and the exploitation of working women. This is either symptomatic of the lack of anti-capitalist analysis in mainstream feminism or of the fear wrought by the lesser-evilism and abysmal candidates of the two party electoral system. Hillary Clinton’s first high profile job was on the board of Walmart at a time when the company was enmeshed in a lawsuit over gender discrimination (Barrett and Kumar, 2016). Yet, she was endorsed by NOW and viewed by many as a feminist candidate. ⅔ of Walmart employees are women, and yet, during labor disputes, Clinton kept quiet while serving on the board. She also accepted campaign donations from the Walton family that were much larger than the average annual wage of a Walmart employee. She also bragged that welfare rolls had dropped 60% while her husband was in office, but this was not because of an accompanying decrease in poverty. Clinton also supported the Trans-Pacific Partnership, a free trade agreement to open new markets for American business in Asia (Young and Becerra, 2015). Bill Clinton supported the passage of NAFTA, which forced Mexican farmers from their land into maquiladoras, or sweatshops along the border with the United States (Barrett and Kumar, 2016). Despite her support of policies which create the conditions for sweatshops and her service to Walmart, which actually uses sweatshop labor, Hillary Clinton was viewed as a feminist candidate.

Hillary Clinton was not elected president, and perhaps it is unfair to target her more than any other ruling class candidate. She is simply an easy target because she exemplifies “corporate feminism” so well. “Corporate feminism” wants to see more women in board rooms and as leaders. But this brand of feminism can never be intersectional and can never truly liberate women, because it encourages women to partake in the exploitative mechanisms of capitalism. Capitalism will never allow the feminist struggle to be intersectional, as capitalism itself pits men against women, white workers against people of color and immigrants, etc. The problem with “Girl Boss” feminism is that girl bosses exert power over other women. Consider the “Fearless Girl” statue on Wall Street, wherein a young girl stands up to the iconic bull of the market. The statue is meant to depict female power and send a message that there should be more female leaders on Wall Street. Yet, this panders to the basest, most atomized version of feminism. Feminism is not simply about girl power. bell hooks very simply defined feminism as “a movement to end sexism, sexist exploitation, and oppression” (Sow, 2017). A feminism that envisions women as leaders on Wall Street lures and distracts women from connecting feminism to other social struggles to end oppression. One such struggle was the Occupy Wall Street movement, wherein thousands of protesters occupied parks and other public spaces in protest of the growing economic inequality in America, in which the wealth of the top 1% has increased 450% since the 1970s. It was a movement in protest of bank bailouts and costly wars. Corporate feminism, with its celebrity endorsements and a vision of women’s equality based upon an equal share in capitalist leadership, feeds into the oppression of women. Of course, a simple t-shirt is not a statement of alignment with the ruling class, but it is a subtle and insidious expression of the corporate appropriation of feminism.

What is to be worn?

A woman should not be shamed if she chooses to wear a feminist t-shirt. Many people are new to feminism, may like the message, may not know about the labor conditions, may not have the money or access to other clothes, or any number of other reasons. Feminism should not be a war against each other, but a war against capitalist patriarchy. To this end, there are a number of things that are far more productive than policing the clothes worn by others. Feminist organizations can certainly be mindful of where and how their t-shirts are produced, but the alienation of labor makes this rather difficult. Feminist organizations can host events that involve crafting, clothing swaps, or DIY t-shirt making as an alternative to buying clothes. This is a way to recycle clothes while building community. However, this is limited because it does not do much to challenge the conditions of capitalist production. To broaden the impact, feminists can connect with anti-sweatshop groups or labor organizations. This tactic can amplify the impact of feminists and feminist groups by challenging institutions through boycott and protest. Connecting with labor organizations can broaden the impact, as some may have connections to workers in other countries and may even be involved in organizing them. The global organization of the working class is a key to improving global working conditions, as capital is extremely mobile. Factories can easily move in search of the lowest paid, most complacent workers if workers try to organize for their rights. The goal must be to make this difficult through solidarity and fierce organizing. Beyond this, feminists can challenge free trade agreements and organizations and the status quo of American imperialism. The ruling class should fear putting the word feminist on their shirts. The word will not be a trend, but will spell their doom as part of the untamed, unbought, intersectional expression of the unyielding power of working people to create a better world.

Socialism, Feminism, and the Plight of Pollinators

H. Bradford

5/11/17

The Feminist Justice League meets once a month for a feminist frolic. These events involve an educational presentation and an outdoor activity. This month, the Feminist Justice League will meet to do some seed bombing and learn about pollinators. The goal of the event is to learn more about the challenges faced by pollinators and do something small to benefit them. In preparation for the event, I researched the history of and troubles facing pollinators. However, since it is a feminist group, I wanted to add a theoretical component. It is not enough to learn about pollinators. To grow as feminists, it is important to analyze them from a lens that is critical of patriarchy and capitalism. I am not a scientist, nor am I an expert on this topic. Nevertheless, I hope this offers some insight into the history of pollinators and how this history is deeply connected to economic and social trends in human history. Understanding this history can help us understand the present plight of pollinators, as well as how to move forward in protecting nature.

A Feminist History of Pollinators:

Both flowers and humans depend upon bees and other pollinators for survival. One third of our diet consists of food that requires pollination by bees, though it should be noted that beetles, ants, birds, bats, flies, and butterflies can also act as pollinators. Bees themselves are believed to have evolved 140-110 million years ago during the Cretaceous Period, which is around the same time that flowering plants appeared (Cappellari, Schaefer, & Davis, 2013). It is astonishing to think that flowers and bees are relatively new in evolutionary history. Turtles, sharks, frogs, and fish have hundreds of millions of years of evolutionary history that predate the appearance of flowers and bees. Even mammals and birds predate bees and flowers by tens of millions of years. Butterflies evolved about 130 million years ago, also appearing after the advent of flowering plants. Since plants cannot move around in search of a mate, they evolved to attract pollinators to spread their pollen, the male gametophyte of plants. Pollen could roughly be described as something akin to the plant equivalent of sperm. Plants produced pollen before the evolution of flowering plants, or angiosperms, but prior to this, all plants were pollinated by the wind. Angiosperms evolved nectar, attractive colors, and fragrances to attract pollinators. Millions of years of natural selection have produced very specialized relationships between some flowers and particular pollinators. For instance, some flowers are red and tubular to appeal to the beaks of hummingbirds. Some flowers are so specialized that only certain species of hummingbirds can pollinate them, such as the sword-billed hummingbird of South America, which has a ten centimeter bill (and a 4 cm body) that can reach the nectar deep within the tubular petals of Passiflora mixta. Hummingbirds are relative newcomers, which evolved from swifts and tree swifts over 22 million years ago, flourishing in South America (Sanders, 2014).

All pollinators are important, but bees have been particularly important in human history. Humans have a long history with bees. Even our closest relatives, chimpanzees, are known to use sticks to obtain honey from hives (Kritsky, 2017). Interestingly, both male and female chimps collect honey, with female chimps able to collect honey with babies on their backs. However, humans are less proficient at climbing, so it might be assumed that collecting honey was historically men’s work. For about five million years of hominid evolution, humans and their ancestors hunted and gathered their food. Modern humans have existed for about 200,000 years, but it is only in the last 10,000 years that some human societies moved away from hunting and gathering. From a Marxist feminist perspective, hunter-gatherer societies were likely more egalitarian and placed more value on women than societies that existed after the advent of private property. These societies were small, and there was little social stratification, since there was less ability for individuals to accumulate significant wealth. Although there is little stratification in hunter-gatherer societies, there are gender based divisions of labor. As such, women likely had a different relationship to pollinators, and bees in particular, than men. In a study of 175 modern hunter-gatherer societies, women provided four fifths of the food to these societies. Typically, the food gathered by men is further away and harder to obtain. Thus, men may have been involved in collecting honey, as this would involve travelling larger distances and climbing trees. This seems to be true in some modern examples of hunter-gatherer societies. In the Democratic Republic of Congo, Ngandu women and children would look out for hives, which men then collected honey from. Some honey hunting societies ban women from gathering honey, such as the Ngindo tribe in Tanzania and the Bassari in Senegal. Hunter-gatherer men have been observed eating honey when it is found, but bringing some back home to be divided and then stored by women (Crane, 2000). Rock paintings in Spain depict humans stealing honey from bees 7,000-8,000 years ago (Kritsky, 2017). The paintings do not clearly depict a man or a woman, so it is hard to know the exact gender roles concerning bees.

Some societies moved away from hunting and gathering and adopted settled agriculture. The development of agriculture allowed private property to arise, as well as larger populations and cities based upon stored and surplus food. The first agrarian societies emerged 10,000-8,000 years ago in the Middle East. Thus, it is no wonder that the first evidence of beekeeping arose in civilizations of the Middle East. In contrast to previous hunter-gatherer societies, agrarian societies developed classes and specialized occupations. The oldest evidence of actual beekeeping is from Ancient Egypt, where pyramid artwork depicts beekeeping in 2450 BCE. In Egyptian society, it appears that beekeeping was an established profession. Likewise, in 1500 BCE, various Hittite laws were passed regarding stealing hives and swarms of bees. The oldest bee hives themselves have been found in Israel. Early hives were made from straw and later from pottery (Kritsky, 2017). The oldest record of beekeeping in China dates from around 158 CE. A relief at Angkor Wat in Cambodia depicts beekeeping and dates from 1000 CE. The Mayans also raised bees, a practice that arose independently of Western culture. They depicted bees in art and hieroglyphs, and developed cylindrical, ceramic hives. It is interesting to note that honey bees had gone extinct in North America, but the Mayans encountered stingless tropical bees (Kritsky, 2017). Stingless bees do not produce as much honey as honey bees, but modern Mayans continue to cultivate them. Deforestation has caused these bees to become endangered.

From a Marxist feminist perspective, the status of women fell with the invention of agriculture. Thus, in all of these examples, the status of women would have been lower than the status that women enjoyed during the long history of hunting and gathering. The development of private property marks the origin of patriarchy, as the exchange of property from one generation to the next required monogamy and close control of female sexuality. These societies were often based upon slaves, who were used to build monuments but also required warfare to obtain. Because agriculture created surplus, it resulted in more specialization and stratification. There emerged groups such as scholars, priests, kings, etc. who could live off of the labor of others. Laws and written language were also developed for the purpose of managing property. However, many of these civilizations continued to worship female goddesses, some of which were connected to bees. For instance, the Minoans worshiped a goddess of nature, birth, and death, who was symbolized by a bee. In Greek mythology, a nymph named Melissa discovered honey and shared it with humans. She is also credited with feeding baby Zeus honey, and was later turned into a bee by Zeus after his father tried to kill her. The Greek myths were probably drawn from the stories of nearby societies and societies that predated the Greeks. Lithuanian, Hindu, Mayan, Greek, and Minoan societies had bee goddesses, though there are also examples of bee gods in other cultures.

Moving along in history, bees were kept during medieval times, and it was even ordered by Charlemagne that all manors raise bees and give two thirds of the honey produced to the state. In the middle ages, Germany, Poland, Ukraine, Russia, and the Baltic states were engaged in less formal beekeeping in the form of forest beekeeping. This involved hollowing out trees to encourage bees to form colonies in them, then sealing up the tree once the colony was established as a sign of ownership and to protect it from bears. Early hives could not be dismantled. Therefore, obtaining honey meant destroying the hive and the bees (Kritsky, 2017). In Poland in 1337, a statute said that women and men had equal rights in buying and selling honey. Husbands and wives were both able to own land related to forest beekeeping, and either a son or a daughter could inherit this land. Some evidence suggests that tree beekeeping was done by women. Nuns and monks were known to raise bees. In one story, Saint Gobnait, a nun from County Cork in Ireland, is said to have sent away cattle thieves by unleashing bees upon them. Hildegard of Bingen also wrote about bees. Examples of artwork from the 1400s and 1500s depict both men and women involved in beekeeping. In the 1600s in England, there are literary references to housewives as beekeepers and to beekeeping being commonly done by country women. The first use of the word “skep” in the English language appeared in 1494 and referred to female beekeepers (Crane, 2000). Perhaps during European feudalism, women were more involved in beekeeping than in other periods of history. It is hard to know why this might be, as the status of women in feudalism was no better than in earlier agrarian societies. Women were controlled by the church, had limited opportunities, were controlled by their husband or father, and were burned as witches.
Perhaps women’s involvement in beekeeping could be attributed to various wars or plagues that would have decimated or occupied the male population, or to the role of women in general food production. Interestingly, when European thinkers saw a single ruler bee, they assumed it was male. Aristotle called this ruler bee the king bee, and through the middle ages, bees were seen as entirely male. Through the 1500s and early 1600s, queen bees were referred to as King Bees or Master Bees (Crane, 2000). So, even though women may have had an expanded role in beekeeping during European feudalism, the imagined social organization of bees themselves reflected a very masculine and feudal worldview.

Capitalism arose in the 16th century in England with the privatization of public lands. The enclosure movement turned former peasants into workers, driving them off the land into cities for paid work. Landlords maintained the best lands, which were rented, again requiring paid labor. New laws were passed against vagrancy, again encouraging paid work. The invention of the working class and the increased agricultural production of paid farm workers laid the groundwork for capitalism. Of course, capitalism really took off with the Industrial Revolution in the mid-1800s. It was not until the 1800s that hives with removable frames were developed. Until the invention of modern hives in the 1800s, both the bees and the hives were destroyed to obtain honey. Large scale production of honey also coincided with the Industrial Revolution and the invention of a centrifugal honey extractor in 1865. The 1860s also saw the commercial sale of honey. Prior to this, it was produced and sold locally. Honey was shipped in wooden drums at the end of the 1800s, but shipments later switched to 60 lb metal cans. Specialized honey packing plants emerged in the 1920s (Oertel, n.d.). In the U.S., women mostly worked to assist their husbands in beekeeping, but in 1880 Mrs. L. Harrison of Illinois was a commercial beekeeper in her own right who later published a book about beekeeping. In the early 1900s, work related to beekeeping was gendered, with women participating in extracting, selling, handling, and bottling honey and men tending to the hive and bees. Today, women make up 42% of the membership of local beekeeping clubs. Women make up 30% of state beekeeping organizations and around 30% of national beekeeping associations as well. However, women are not often in leadership roles and often serve as secretaries or supporters in the clubs. Some clubs do not allow women as leaders, or women who become leaders do not last long. A few clubs even have auxiliaries just for women.
As such, women make up less than ⅓ of the leadership of beekeeping organizations. As a whole, in the United States, about 31% of farmers are women (Calopy, 2015).

Capitalism and Pollinators:

Capitalism will be given special attention from here on. Despite the beauty and importance of pollinators, as well as their long history with humans, they are in peril. According to a UN report, 2 out of 5 invertebrate pollinators are on the path to extinction. 1 out of 6 vertebrate pollinators like birds and bats are also facing extinction (Borenstein, 2016). There are over 20,000 species of bees in the world and 17% of them face extinction. Pollination is important because without it, plants cannot reproduce. 75% of the world’s food crops require pollination. Without pollinators, there will be no food. 87% of the money made globally comes from food crops that require pollination (Okeyo, 2017). More than half of the 1400 species of bees in North America are facing extinction (Worland, 2017). Monarch butterflies have also garnered attention, as over the past several decades their population has declined by 96.5%. There are several reasons for this, including deforestation of their habitat in Mexico, climate change, loss of milkweed plants, and pesticides. Habitat has been turned into farmland. Nevertheless, there have been efforts to restore monarch butterfly populations, such as planting $2 million of milkweed at 200,000 acres of land administered by the Fish and Wildlife Service. In Mexico, the Monarch Butterfly Biosphere Reserve is a project to expand their winter habitat (The Monarch Butterfly is in Danger of Extinction). There are many factors related to the decline in pollinators, such as loss of habitat and biodiversity, pesticides, farming practices, diseases, and climate change. 97% of Europe’s grasslands have disappeared since WWII, often turned to farmland. Pesticides containing neonicotinoids have been found in some studies to reduce the chances of bee survival and reproduction (Borenstein, 2016). Aside from pesticides, bees are vulnerable to climate change.
Whereas butterflies can migrate to new areas with climate change, bees have difficulty establishing themselves in new areas. At the north end of their range, they have failed to move towards the north pole. At the south end, they have died off. Together, bees have lost a range of about 200 miles on their north and south ends (Worland, 2015). Disease and parasites can also be blamed for the decline of bees. The Varroa mite first appeared in the United States in 1987 and within ten years had spread to bee colonies across the US. Bees infected with the mite may be deformed and have shorter longevity, less ability to reproduce, and lower weight. Pesticides used in mosquito control have also been linked to colony collapses. Additionally, some scientists believe that pollen from transgenic crops can be harmful to bees, as the pollen itself may have insecticidal proteins (Status of Pollinators in North America, 2007).

The rusty patched bumblebee was the first wild bee listed as endangered in the continental United States, when it was added to the endangered species list in January 2017. The bee was once common in 28 states and now can only be found in small populations in 13 states. In September 2016, several species of yellow-faced bees were listed as endangered in Hawaii. Once again, neonicotinoids are blamed since they are commonly used in agriculture, forestry, and lawn care, and are absorbed into a plant’s leaves, nectar, and pollen (Gorman, 2017). The problem with neonicotinoids was first noticed in 1994 in France, when the country first began using them. The pesticide was produced by Bayer and first used on sunflower crops. Bees that collected pollen from treated sunflowers showed symptoms of shaking and would abandon their hives. One quarter of a trillion bees perished before French farmers protested the use of the pesticide, which resulted in its ban. In the United States, the symptoms were first observed in 2006 and coined Colony Collapse Disorder. There was confusion over the cause of Colony Collapse Disorder, but a prominent theory suggests that beekeeping has shifted from being centered on producing honey to using bees to pollinate cash crops. 2.5 million hives are trucked around the country each year. The bees are transported to farms and fed corn syrup rather than wildflowers. The corn syrup may be laden with neonicotinoids, which results in Colony Collapse Disorder. Almonds, apples, blueberries, avocados, cucumbers, onions, oranges, and pumpkins are just a sample of some of the crops that could not be grown without pollinators (10 crops that would disappear without bees, 2012). Of course, there are some crops, such as soybeans, corn, cotton, alfalfa, beans, tomatoes, pecans, and peanuts, which do not require honeybee pollination. Nevertheless, our diets would be much different without pollinators. Entire ecosystems would be quite different.

The plight of pollinators can largely be connected to industrial agricultural practices. The key challenges to pollinators (loss of habitat, loss of wildflowers, pesticide use, and agricultural monoculture) are all broadly connected to agriculture in the context of capitalism. Pollinators have been around for millions of years, so it is startling that it is only the past few decades that have pushed them towards extinction. This begs the question of why agriculture happens as it does and what can be done. Karl Marx was a critic of agriculture in capitalism. Marx observed in Capital that capitalism divides the city from the countryside. Capitalism itself emerged as the result of the privatization of common land. When people were pushed off their land, they were separated from their ability to feed themselves. That is, they had to work for another person to earn the money needed to buy the things that are needed for survival rather than grow or make them themselves. Capitalism depends on workers, whom Marx called wage slaves because of their dependency upon wages to survive. The birth of capitalism meant the death of a certain relationship to the land. This connection is part of the Marxist concept of metabolic rift. Just as workers are alienated from production and one another, they are alienated from nature and human nature. Humans are deeply connected to the environment, but according to Marx, it is capitalism which severs this connection (Williams, n.d.). Marx also observed that capitalism reduces the rural population while expanding the urban population (Westerland, 2015). Human societies always depend upon the natural world to exist. In this sense, humans metabolize nature. For most of history, nature has been experienced in terms of its use-value, or the ways in which it is useful to our existence. However, capitalism has commodified nature and separated humans from it. Our economy is dominated by exchange value rather than use value.
This has resulted in metabolic rift, or a separation from our place in ecosystems (Foster, 2015).

Aside from the original sin of moving people off of public land and the privatization of land, Marx was a critic of how land was used in capitalism. He noted that capitalism resulted in the exhaustion of the soil in the interest of profits. Marx believed that it was possible to increase the productivity of soil through good management or use of manure, but that it was not profitable to do so in capitalism. He observed that when land became exhausted it was often abandoned in search of new lands to exploit (Saito, 2014). Capitalist agriculture robs not only the laborers but also the soil (Westerland, 2015). Oddly enough, despite the surplus of human and horse manure in cities, countries like Great Britain and the United States scoured the globe in the 1800s for fertilizers for their over exploited agricultural land. Wars were even fought to obtain guano as fertilizer. Capitalism is so wasteful and illogical that it made more sense to colonize empty islands for their bat manure than to sustainably manage agricultural land or obtain manure locally. But capitalism is not driven by what is sustainable, rational, or healthy. It is driven by profits. It is the pursuit of profits that results in the vast environmental destruction the world experiences today and the agricultural practices that imperil our food supply by destroying pollinators. As such, around 75 billion tons of soil are washed or blown away each year after ploughing. 320 million acres of agricultural land are salinated due to agricultural practices. 40% of the world’s agricultural land is in some way degraded. Over half to three fourths of all industrial inputs return to the environment as waste within one year. At the same time, pollinators are worth over 14 billion dollars to the US economy. Despite their use value, the profit motive trumps sustainable agricultural practices which might protect pollinators.
As a result, farmers in China have actually had to pollinate their own apples with brushes and pots of pollen due to the decline of bees (Goulson, 2012).

Industrial agriculture in capitalism could be described as not very diverse, pesticide intensive, and wasteful. Agriculture is not very diverse since crops are grown to make a profit. Therefore, a few reliable varieties of crops are planted because they will grow predictably, ship easily, have uniform qualities, or other desirable traits. This means a loss of biodiversity, as heirloom varieties of crops go extinct because they are not grown widely. At the same time, since its beginning, capitalism has needed to divide people from their ability to sustain themselves. This forces individuals into the economy as workers. Farmers around the world are drawn into the economy when their seeds or agricultural inputs are privatized and sold on the market. Farmers who may have once saved seeds have found that the seeds are now patented and they must buy them. Again, this leads to a loss of biodiversity as old farming practices are replaced by paid farm labor and commercialized seeds. In pursuit of profits, capitalism overuses fertilizers, as the land is overexploited. Pesticides are also used because it is cheaper to dump chemicals on plants than to practice sustainable, organic agriculture with natural pest control. Fertilizers and pesticides themselves are often the product of chemicals developed for war. After World War II, factories which produced nitrogen for bombs were converted to fertilizer factories. DDT, which was used as a pesticide with devastating effects on bird populations, was actually used in WWII to protect soldiers from fleas and mosquitoes. Capitalism requires war to open up new markets, destroy competitors, and access new raw materials and cheap labor. But it also develops new technology and weapons. Agriculture’s chemical age in the 1950s was the peacetime application of war technology. Finally, capitalism is wasteful. It is wasteful because the drive for profit requires more production.
Production occurs to create more value, from which profit is derived. Pollinators are in trouble because of the destructive, wasteful, and polluting nature of industrial agriculture within the context of capitalism.

There are many things that can be done to help pollinators. However, most solutions are individual solutions. This is a flaw with the environmental movement, as it often focuses on consumer choices or individual behaviors rather than the larger issue of dismantling capitalism. These small scale activities are not useless, but must be coupled with movements that challenge industrial agriculture within capitalism. Individuals can plant gardens that attract pollinators. Community groups can plant milkweed plants or seed bomb for pollinators. Individuals and communities can partake in beekeeping. Partaking in community gardens, visiting farmer’s markets, buying locally, saving seeds, etc. are all small scale actions that can be done. However, these activities will not tip the scale towards saving the planet as they do not challenge capitalist production. Capitalism must be overthrown so that giant agribusinesses can be dismantled, food production can be more locally centered and worker controlled, and rational choices can be made of how, what, and where to grow food. The environmental and labor movement must work together towards empowering workers to take control of the economy in the interest of a sustainable future. Agribusinesses and the fossil fuel industry donate millions of dollars to both of the major capitalist parties. Neither can save pollinators or the planet as they pursue free trade and market solutions to environmental problems. The anarchy of capitalist production could result in the destruction of pollinators we depend upon for survival and which have inhabited the planet for millions of years. But, each society contains the seeds of its own destruction. For capitalism, it is its instability and the immiseration of workers. It is my hope that social movements that can seriously challenge capitalism will emerge and that the labor movement can be reinvigorated and mobilized towards ecosocialism. 
Anything less will condemn the planet to a hotter, less biodiverse, more socially strained future.

Lessons and Myths About Domestic Violence from the Case of Graham Garfield

H. Bradford

5/7/17

For the past few weeks, local activists in Superior, Wisconsin have worked together to challenge domestic violence in their community following the arrest of city councilor Graham Garfield. On April 4th, Graham Garfield was re-elected to represent the 6th district. It was a very tight race wherein he won the election by a single vote. Aside from serving as a city councilor, he has served in other positions including Vice Chair of the Democratic Party of Wisconsin Labor Caucus, President of the Superior Federation of Labor, Vice President of the National Association of Letter Carriers-337, and Chair of the Parks and Recreation Commission. He was endorsed by the Superior Federation of Labor and viewed by local progressives as a labor candidate who stood up against Islamophobic statements from the previous mayor. Only a few days after being sworn into office, he was arrested on charges related to domestic violence.

On the evening of April 20th, Superior police responded to a report of a domestic dispute involving Garfield and his fiancée. To summarize the official police report, Graham’s fiancée informed the police that they had been arguing that evening. Graham had been drinking and had become verbally abusive. When she tried to remove him from their residence, he bit her, and when she slapped him, he pretended to call 911. She retreated to another room. Graham followed her and grabbed a gun, which he pointed at her chest from a few feet away. Graham was located several hours later, 10 miles outside of Superior at Pattison Park, and arrested. He posted bail the following day and faces one felony charge and three misdemeanor charges. The felony is for recklessly endangering safety, and the misdemeanors are for possession of a firearm while intoxicated, pointing a firearm at someone, and disorderly conduct. He will appear in court for an arraignment hearing on May 26th.

Following his arrest, various activists and politicians requested his resignation. On April 24th, Jim Payne, Superior’s newly elected mayor, called for Garfield’s resignation, arguing that because of the felony charges against him, he could spend months in court. That would impede his ability to serve on the city council. Meanwhile, Garfield did not release any public statements regarding the incident nor regarding his resignation. When it seemed that he would be attending the bi-monthly city council meeting on May 2nd, members of the Feminist Action Collective and Feminist Justice League simultaneously called for activists to show up at the meeting dressed in purple, as purple is symbolic of domestic violence awareness. Both groups mobilized their members to attend the meeting as a way of drawing attention to domestic violence, supporting the victim, and pressuring for his resignation. In addition to this action, the Feminist Action Collective also developed an open letter asking for Garfield’s resignation. Garfield remained silent until shortly before the city council meeting, when he released his first public statement. In the statement, he said that he would not be resigning.

“In response to ongoing legal matters and the mayor’s request that I resign my position, I have decided it will be best for my district and the council that I continue to serve in my existing capacity. Just as the election process is sacred, so too is the American justice system; a system that maintains that I am entitled to a fair legal process before judgment is passed against me. It was unfortunate that the mayor sought to inappropriately pass that judgment. Regardless, I continue to support his agenda and believe in the principles on which I was elected. I would also like it noted that I am now living a sober life and have begun to attend AA meetings. I appreciate the public’s support and understanding as I continue on the path of recovery. I will have no further comment for the press at meeting time.” -Graham Garfield, May 2nd

Around twenty activists attended the city council meeting wearing purple. Because his statement was released shortly before the meeting, many activists had not yet read his statement. He arrived late and was treated cordially by some of his peers. The meeting itself was rather short, with time allotted for public commentary. Several local activists spoke out during the public commentary section of the meeting. Fellow city councilor, Brent Fennessy, who appeared wearing purple, also voiced his concern regarding the allegations and asked Garfield to resign.

After the meeting and reading over Garfield’s statement, several activists from the Feminist Justice League discussed the next steps in pressuring for Garfield’s resignation. To this end, a petition was developed and the Feminist Justice League called upon activists to not only attend the next city council meeting but to hold a picket before the meeting. It was felt that in order to pressure him into resigning, the activism against him would have to intensify. This justified the more public action of a picket, as well as the development of a strongly worded petition meant for the city council. Furthermore, Garfield’s decision to remain on the council and his abhorrent statement earlier that day inflamed activists, as it did not reference domestic violence, seemingly shirked responsibility for his actions, and pinned his behaviors on alcohol.

Two local news stations drew attention to the petition and picket the following day. Within forty-eight hours, the petition attracted over 150 signatures. The picket event on Facebook had attracted the interest of over seventy individuals. The media coverage of the petition coincided with coverage of Graham Garfield’s first court hearing. The same day, a motion was made at the monthly meeting of the Superior Federation of Labor that he should be asked to resign from that body. This motion was not seconded, but it expanded the discussion of domestic violence to representatives of the labor movement. On the evening of May 4th, just as the movement against Garfield seemed to be gaining momentum, Garfield unexpectedly released a statement that he had changed his mind and that he was going to resign. Various media outlets attributed his change of mind to the public pressure put upon him. His own statement cited concern for his colleagues and the community.

“Out of concern for the well-being of the community and wishing no harm upon my colleagues, I announce that I will be stepping down. It has been one of my life’s greatest pleasures to serve the people of this city, and I hope that I can be an asset to the community again someday. I continue to support, as a citizen, a progressive agenda that will benefit all members of the community and make our city a better place to live.” -Graham Garfield, May 4th

His resignation and the activism related to it offer many valuable lessons. For one, it shows that social movements can be effective in making change. At the same time, it revealed some flaws with how domestic violence is discussed and understood in society. His resignation is a small victory, but the fight is not over. It is important that he is held accountable by the criminal justice system. It is also important that the Superior Federation of Labor and other organizations he is involved with also hold him accountable for his actions. Thus, moving forward, future actions will be focused on making certain that the criminal justice system does not fail the victim and that the community holds him accountable. Activists are also tasked with drawing lessons from their successes and failures, as well as further challenging and shaping the discourse around domestic violence. To this end, there are several components of the public discourse regarding the Garfield case that should be challenged.

The Myth of Alcohol and Domestic Violence:

In Garfield’s May 2nd statement, he said that he was now living a sober life and attending AA meetings. While it is encouraging that he sought treatment for an addiction, the statement was problematic for a number of reasons. One persistent myth about domestic violence is that it is caused by alcohol or that alcohol plays a role in violence because users are less inhibited. There are a few things wrong with framing domestic violence this way. On one hand, if alcohol means a loss of inhibitions, that implies that ordinary people want to be violent towards others but do not act upon this until alcohol has lowered their inhibitions. I would hope that most people are not forcing down their dark urges to physically abuse someone, especially since most abuse is directed at women (97% of abusers are men with female partners). Another problem with this narrative is that it ignores abuse that happens when an abuser is not drunk. Financial control, emotional abuse, limiting where a victim goes or who they see, stalking a victim, etc. are kinds of abuse that may be ongoing in a relationship, irrespective of whether the abuser is drunk. Thus, the alcohol argument reduces abuse to a one-time occurrence rather than a pattern of behaviors that exert power and control. This argument is also problematic since if alcohol is blamed, it is easier to dismiss abusive behaviors as the result of being impaired. This makes it easier to dismiss the abuse and, in doing so, fails to hold abusers accountable. Finally, alcohol exists in a social context. If an abusive person is indeed more impaired by alcohol, they are still acting in a way in which they have been socialized. Alcohol exists in society. How alcohol use is expressed in society is shaped by gender roles, social expectations, and gender inequalities. Some of the countries with the strictest prohibitions against alcohol have the highest rates of violence against women.
For instance, according to the WHO, in North Africa and the Middle East, 40% of women have experienced intimate partner violence. These regions have the lowest rates of alcoholism in the world. One would assume that if alcohol is used less frequently, there would be less violence. As a whole, blaming alcohol ignores the broader context of abusive behaviors and the patriarchal social context which shapes alcohol use and behaviors while under the influence.

The Myth of Loss of Control:

Another myth about domestic violence is that it is about a loss of control, such as losing one’s temper. This myth is problematic since, once again, it does not hold the abuser accountable for their actions. In this narrative, the abuser might be an otherwise good person who has a problem with anger or who lost control of themselves. This ignores how the abuser can control themselves in other situations and how the violence was directed at their partner. If a person suffered from a loss of control, one could assume that they would attack their boss, the checkout person at Walgreens, their mother, the police, or a stranger. Instead, abusive behaviors are targeted at a partner. Only 5-10% of abusers have records of assaults with victims other than their partner, which implies that most abusers are very capable of controlling their behaviors. It is also problematic since it frames the abuse as a one-time incident, rather than an ongoing exertion of power and control over another person. The abuser maintains control inasmuch as they choose who, when, how, and where to exert their power and control. For instance, it is more likely to occur in the home where it is private, than in front of a group of coworkers or family members that the abuser wants to impress or who may not condone the behaviors. Rather than framing abuse as loss of control, it should be viewed as a means of maintaining control over the victim. For instance, in the police report, Garfield faked calling 911 after his partner slapped him. This was a way of controlling her by making her feel that he was the victim and that she would get in trouble with the police.

The Myth of the Single Incident:

Closely related to loss of control is the myth that a domestic violence incident is simply that, a singular incident. Instead, it should be viewed as a pattern of behaviors. Almost every woman who comes to the shelter where I work experienced various kinds of controlling or abusive behaviors before being physically or sexually abused. Abusers may defend their actions by stating that they have anger issues or lost control, but usually their anger is not directed at everyone and they maintain control in other situations. Viewing abuse as a single incident ignores the power and control that was exerted through jealousy or controlling behaviors, stalking, monitoring, put downs, threats, using isolation, destroying property, blaming, denying, gaslighting, etc.

The Myth of the Criminal Justice System:

Many individuals in the community expressed that there should not have been actions to ask for Garfield’s removal. In their perspective, it is an issue that should be left to the criminal justice system. These individuals also argued that he is innocent until proven guilty. Even Garfield himself called the criminal justice system sacred and said that he would remain in office as he deserved a fair trial. This enormous faith in the criminal justice system ignores the ways in which the criminal justice system has failed poor people, women, racial minorities, and other oppressed groups. It is true that individuals are innocent until proven guilty in our court system, but the outcomes in the criminal justice system are shaped by power, privilege, and money. For instance, a study noted that there were 64 cases of reported domestic violence perpetrated by professional athletes in the NFL, NBA, and MLB between 2010 and 2014. Only one of these allegations resulted in a conviction. Athletes, politicians, celebrities, or others with wealth, resources, and prestige are treated very differently in the criminal justice system.

Furthermore, the criminal justice system has not, and often does not, take domestic violence and sexual assault as seriously as it should. The feminist movement and the movement against domestic violence and sexual assault have worked for decades to be taken seriously by the criminal justice system. It is important to note that for most of U.S. history, wife beating was viewed as the legitimate right of a husband. While wife beating has been illegal since the 1920s, it was not until the 1970s that law enforcement began viewing domestic violence as something more than just a private, family matter, thanks to the efforts of feminists to educate and organize around the issue. It was not until 1994 that the Violence Against Women Act was passed, which included the first federal laws against battering as well as provisions to fund shelters, legal aid, and other victim services. Although there have been many gains in how the criminal justice system handles domestic violence, there is still much to be done. One in four women experience domestic violence in their lifetime and each day, three women are murdered by their partners. Only one in four incidents of domestic violence is actually reported to the police, and in a study that appeared in Psychology Today, only three out of five domestic violence calls to the police resulted in an arrest. The same study reported that only 2% of domestic violence offenders received any jail time. In South Carolina, a study found that 40% of the domestic violence cases handled by the General Sessions Court since 2012 were dismissed. There are many reasons for these numbers. Domestic violence cases may be hard to prosecute because they occur within the home, often without witnesses. Since few offenders actually see jail time (over 90% did not in the Psychology Today study), it may seem pointless to call in the first place.
African Americans, Native Americans, and other oppressed groups may fear calling the police due to negative experiences with the police. Skepticism regarding the criminal justice system is understandable based upon these statistics.

Aside from the fact that the criminal justice system fails victims, the argument that the community should take a hands-off approach is disempowering. Any thinking person should be able to draw conclusions about a public figure based upon police reports and media coverage. While ordinary citizens do not have all of the facts, the facts that are available are pretty damning. It is a serious matter that an elected official reportedly pointed a gun at his fiancée, bit her, and left the scene while intoxicated. Just as you don’t need a weatherman to know which way the wind blows, you don’t need a judge or jury to form an opinion on what appears to be a very serious and terrible incident of domestic violence. Having an opinion is not antithetical to believing in a fair trial. Holding a public official, or any abuser, accountable is not opposed to working within the court system. The “hands off, innocent until proven guilty” argument deflates the potential for social organizing. Social organizing is important because, if these recent events have taught us anything, it is that there is a need to continue to educate our community about domestic violence and to work to end it.

Moving Forward:

On Monday, the Feminist Justice League will be meeting to discuss future actions related to this case. We will likely be calling for activists to attend the court hearings wearing purple. Other actions will also be discussed at that time. On Monday, I will also be appearing on Henry Banks’ radio program at 4 pm. In fact, this article was developed so that I could better organize my thoughts before appearing on the radio. Moving forward, I hope that we are able to support the victim in this situation, but also draw lessons from what has happened so that positive change can be made in the community. Changing how domestic violence is talked about, holding public officials and abusers accountable, and identifying the ways in which our criminal justice system is imperfect are important components of future organizing.

Activist Notes: Solidarity Valentine Cards to Prisoners

H. Bradford

2/16/17

On February 13th, the Feminist Justice League (formerly the Twin Ports Women’s Rights Coalition) collaborated with Letters to Prisoners and Superior Save the Kids to hold a solidarity Valentine card event. The event was attended by about nine individuals, who met at the Superior Public Library for an hour and a half to write letters and cards to various incarcerated individuals. The event was a great way to celebrate Valentine’s Day and produced a large pile of letters. As I report back on this event, I wanted to share a little history and explain why this is a feminist issue.

Now, when I pitched the idea to Meghan, the organizer for Letters to Prisoners, she was a little worried that the idea of women sending Valentine cards to prisoners was a little….iffy. Not that there is anything wrong with women forming relationships with men who are in prison, but this sort of letter writing doesn’t seem like a particularly feministy activity. In a way, this represents how it is almost impossible to think about Valentine’s Day in a non-romantic way! Valentine cards are almost always about romantic love. (Though in my own card shopping this year, I was surprised to find that there are a fair number of Valentine cards sold on behalf of cats and dogs…) In any event, it was very important to make clear that this event was not about romance. It was about sending cards that express love for freedom, social justice, humanity, and a better world. Hence, these were solidarity Valentine cards.

Sending cards to prisoners on Valentine’s Day makes a lot of sense to me. Now, eons ago, I used to be a Lutheran. Lutherans aren’t known for their support of saints (hence, the whole Protestant Reformation). However, I remember that in confirmation class I learned about Saint Valentine, probably around Valentine’s Day. I learned that he was imprisoned for performing Christian weddings in the pagan Roman Empire and sent a letter to his followers before his execution. Even Catholics don’t know if St. Valentine actually existed as a historical individual. According to the Catholic Education Resource Center, the first person named St. Valentine was beheaded on February 14th, 270 AD for comforting Christian martyrs. There are two other saints named Valentine, one who was killed in Africa and another who was a bishop in Terni (north of Rome). It is the bishop from Terni who may be the Valentine most associated with the holiday, as he allegedly married couples and thus became the patron saint of young people, marriage, and love. He is also the saint of beekeeping, epilepsy, and plagues, though these don’t sound quite as romantic. Like the original Valentine, his feast day was February 14th. In some stories, he either befriended or was romantically involved with the daughter of his jailer/judge, but this may have been a later addition to his story. Over time, the story and feast day became more closely associated with romance than with Christian martyrdom. The romantic associations of the holiday may have come from other celebrations, such as the Roman holiday of Lupercalia (which involved matchmaking) between February 13th and 15th, and Galatin’s Day (“lover of women” day).

Like Halloween, Christmas, and Thanksgiving, modern Valentine’s Day took off in the United States in the mid-to-late 1800s. This was bolstered by the industrial revolution, which allowed for the mass production of media, gift items, and cards (a boon for popularizing holidays). Like the other holidays, the modern celebration was also buoyed by increased space for secularism in society. It would be interesting to write a history of Valentine’s Day, but for lack of space and time, suffice it to say that the holiday has had some romantic connotations since the beginning. Yet if the story is boiled down to its most basic elements, it is a story of a man who is imprisoned and executed by a powerful empire on the basis of religious belief. It is a story about capital punishment and religious intolerance. Christianity may be the dominant religion in this country today, but religious persecution certainly continues through the violence, incarceration, and surveillance of Muslims in our country, as well as recent attempts to ban Muslims from entering the country. In terms of capital punishment, the United States is the only country in the Americas that executed prisoners in 2015. Most industrialized countries have abolished the death penalty, so although we are among the 54 countries that practice capital punishment, most of the others that do are our so-called enemies…you know, the poor countries that we want to bring democracy to in the Middle East and Africa. Because of the actual history of St. Valentine, I think that the holiday provides a great opportunity to put a spotlight on our criminal justice system.

I have only gone to a few Letters to Prisoners events, but a person can learn a lot about our criminal justice system by simply sending a letter or card. For instance, although the event was pitched as a Valentine card making event, prisoners are not allowed to receive glittery, pretty, colorful handmade cards. The cards must be done in black and white ink. Likewise, the prisoners cannot receive letters with colorful birds or flowers. The stamps must be Forever FLAG stamps. Officially, this is to control what kinds of stamps are sent, to prevent drugs from being sent in the guise of stamps. But really? Really? Forever FLAG stamps. I think this sends a powerful message that they are owned and controlled by the United States. The letters are stamped with mandatory patriotism. I also observed that most of the prisons are in the southern United States. For instance, I sent cards to four political prisoners with birthdays in February. Three of the four were in prisons in the southern U.S. The south has the largest prison population. Louisiana’s incarceration rate is 867 prisoners per 100,000 residents; Alabama’s is 677 and Mississippi’s is 740. Georgia’s rate is over 550 per 100,000 and Texas’s is 669. Minnesota’s is 194 and California’s is 365. The states with the highest incarceration rates have the poorest populations and a history of slavery. While the bulk of the U.S. prison population lives in the south, as a whole, the United States has 5% of the world’s population but 25% of the world’s prison population. We have more prisons than colleges. Finally, many of the political prisoners that I have written to have been imprisoned for decades. Many, like Leonard Peltier, will likely die in prison. If we look to our neighbors in the Americas, many countries limit life sentences. In Brazil, Nicaragua, Venezuela, and Uruguay the maximum prison sentence is 30 years. It is 25 years in Paraguay and 35 years in Ecuador. I find it ironic that many of the countries that the United States has tried to bring democracy to through supported coups and military training are actually more democratic and humane than our own country.

To connect the issue more closely to feminism specifically, I had my friend Lucas construct a list of female prisoners for participants to write to. Participants in the letter writing event chose from this list or sought other lists online. While the United States holds 25% of the world’s prison population, it detains 33% of the world’s female prisoners. It is astonishing to think that 1/3 of all of the women in prison, in the entire world, are held in the United States. While we did not discuss female specific issues related to prisoners at the event, there are some unique challenges that female prisoners face. For one, while imprisoned women legally retain the right to abortion, in practice this right has often been denied to them. States such as Georgia, Wisconsin, Nebraska, Wyoming, and Missouri have no policy regarding pregnancy, which leaves the decision in the hands of correctional facilities. In the past, correctional facilities in Arizona and Missouri have refused to transport female prisoners for abortion procedures. Of course, the barriers that all women face in obtaining abortions pertain to imprisoned women as well, including waiting periods, mandatory ultrasounds, limited access in some parts of the country, parental consent, etc. Female inmates with children must navigate custody issues and expensive phone calls if they want to remain in touch with their children. A collect phone call from a prison in Minnesota costs about 75 cents a minute, but in Kentucky, the cost is $5.70 a minute! In North Dakota, the cost is over $6.00 a minute! These calls are expensive because prisons have contracts with phone companies, which offer kickbacks to the agency that contracted with them. From my own experience working in a domestic violence shelter, many of the women who come to shelter have criminal histories. However, some of this includes arrests for assaults that were really actions taken in self-defense.
These criminal backgrounds make it harder to obtain housing, as they may disqualify women from some programs or make landlords less willing to rent to them. They also make employment more difficult, as only the lowest paid sectors of the service industry will hire them. This creates barriers to escaping domestic violence and to building a life outside of crime and poverty for their families. But, gender aside, as human beings, we are all diminished by a racist, ableist, and classist criminal justice system which divides us, removes sectors of the population from their families and communities, and steals the lives of fellow humans at a profit to corporations!

White Winter:

Racism and Winter Sports

H. Bradford

1.28.17

This past fall, the Twin Ports Women’s Rights Coalition began hosting small events called “Feminist Frolics.” These events were meant to educate our participants about feminism while enjoying the outdoors. The very first frolic was entitled “Patriarchy in the Parks.” This talk explored how patriarchy shapes women’s relationship to nature and participation in outdoor recreation. The original talk discussed how history, gender roles, safety, and leisure influence how women participate in nature. Since that talk, I have wanted to connect how racism, classism, ableism, and other “isms” shape how individuals participate in the outdoors. As such, this talk puts a special focus on race and recreation. In particular, it explores racism and winter recreation. In my own experience, when I spend time outdoors in the winter, I don’t often see racial minorities participating in skiing, snowshoeing, and hiking. This talk hopes to shed some light on why that is.

The Myth of Geography:

When one considers the racial composition of winter recreational activities, the whiteness of these activities seems almost a given. In our racist imaginations, it seems natural that white people would participate in winter activities. After all, Europeans live in the northern hemisphere, where there is snow and cold. Thus, one might argue that geography plays a role in why winter sports tend to be more popular among white people. But arguments about geography ignore larger issues of racism and classism. It is true that many parts of the earth do not receive snow and that these warmer regions are inhabited by darker skinned ethnic groups. However, geography does not entirely account for participation. For instance, some parts of Africa actually have ski areas. Algeria has two ski resorts and Morocco has three. Morocco has participated in six Winter Olympics, but has never won a medal. Algeria has competed in the Winter Olympics three times, but again, has never won a medal. South Africa has one ski resort, which operates three months out of the year. Lesotho also has a ski resort, which is open during the winter months and is located about 4.5 hours away from Johannesburg and Pretoria in South Africa. Despite having a ski area, Lesotho has never participated in the Winter Olympics, and South Africa has never participated in ski events. In 2014, Sive Spielman, a black South African teenage skier, was denied entry into the Sochi Olympics. He qualified to compete in slalom skiing, but the South African Sports Confederation and Olympic Committee disqualified him on the grounds that they did not think he was good enough. Considering that he came from a poor area of South Africa, was black, and learned to ski through a ski club at his public school, his participation would have been remarkable (South Africa withdraws only athlete, 2014). It would have been even more remarkable considering that black South Africans would have been barred from ski clubs and the country’s single ski area until apartheid ended in 1994. Because under apartheid black athletes could not compete alongside white athletes, South Africa was barred from competing in the Olympics between 1962 and 1992 (the country was allowed to return to the Olympics before apartheid had ended). Thus, four South African figure skaters competed in the 1960 Winter Olympics in Squaw Valley, and the country did not compete in a Winter Olympics again until 1994.

In contrast to South Africa, Zimbabwe has no ski areas, but had a skier compete in the 2014 Sochi Olympics. Their skier, Luke Steyn, was white. Unlike Spielman, he was quite privileged, as his family moved to Switzerland when he was two years old and he attended college in Colorado. Furthermore, he was provided financial support by the Zimbabwean government (Blond, 2014). It is odd to think that Zimbabwe’s athlete was a white skier who left the country around 1995. Although he was celebrated in the media, the celebration was oddly colorblind. While many Americans adopt colorblindness as a way to avoid the sticky issue of racism, it actually perpetuates racism by skirting around issues of oppression and invalidating the continued racism in society. While I am not sure about Luke Steyn’s history, his race in contrast to his country of origin seems like an elephant in the room. His family would have been among the 120,000 whites living in Zimbabwe in the mid 1990s and likely left, like many did, because the political situation was not favorable for white people. That is, his family probably left because of land reforms which sought to turn white landholdings over to the largely black population. This was done to rectify a history of colonization, wherein white farmers were offered large tracts of land in exchange for the conquest of the country in the late 1800s. It was also done to dismantle the economic foundation of apartheid in that country. While I don’t know his family’s history, judging by his Dutch surname and his family’s ability to move to Switzerland, I can only assume that they were privileged if not landowners. The stories of Steyn and Spielman make for an interesting juxtaposition, as it shows how a white man can still succeed in a black country whereas a black man struggled for recognition even though he was part of the majority population in South Africa. One was privileged by race and class, the other disadvantaged.

All Olympic athletes are to some degree privileged, but in Africa, and when it comes to winter sports, this is even more pronounced. For instance, in 2014, Togo sent its first Winter Olympics athlete, Mathilde Petitjean Amivi, a cross country skier who grew up in France but has a Togolese mother. At the 1984 Winter Games in Sarajevo, Lamine Gueye became the first black African to compete in the Winter Olympics. But like Amivi and Steyn, he grew up outside of Africa. He went to live in Switzerland after the death of his grandfather, also named Lamine Gueye, who headed the Senegalese Party of Socialist Action. Gueye has been an advocate for changing the rules of the Winter Olympics to allow more countries to compete. In fact, 96 nations have never participated in the Winter Olympics.

While a tropical climate is certainly an impediment to participation in winter sports, there are many countries with snowy areas which have not participated in the Olympics to the same degree as European countries. For instance, India has eleven ski areas and Pakistan has nine. Iran has almost twenty ski areas. Kazakhstan has four ski areas, Kyrgyzstan has three, and Lebanon has six. Ski areas indicate that these countries have elevations high enough for snow, which lends itself to skiing, along with snowboarding and sledding sports. Iran has participated in the Winter Olympics ten times, but has never won a medal. Kyrgyzstan has never participated in the Winter Olympics, while Kazakhstan has participated six times. Kyrgyzstan is 94% mountainous and has 158 mountain ranges. Soviet Olympic skiers trained at Kyrgyzstan’s Karakol Mountain Ski Base (Krichko, 2016). Pakistan has participated in two Winter Olympics, as has Nepal. Chile, which has eight ski resorts, has participated in sixteen Olympics, but has never won a medal. Argentina has ten ski resorts, has participated in eighteen Olympics, and has never won a medal.

The trend is not so much that a country has to have snow to earn medals, as there are plenty of countries with snow, mountains, and wintry conditions which have not won medals. Instead, it seems that the countries with the highest medal counts are European and high income countries. The top ten countries for medals are Norway, the United States, Germany, the Soviet Union, Canada, Austria, Sweden, Switzerland, Russia, and Finland. China, South Korea, and Japan each make the top twenty. These countries have more money to devote to developing sports programs and more citizens with the income required to compete at a higher level. Thus, high income countries tend to be more competitive in the Olympics, and high income individuals have more opportunities to participate and compete. This explains why diverse countries like the United States do not have more athletes of color in winter sports. Athletes of color have excelled in baseball, basketball, soccer, running, and many other sports. African Americans have long participated in the Summer Olympics. For instance, George Poage competed in the 1904 Summer Olympics and won two bronze medals in the 200m and 400m hurdles. In contrast, the first African Americans to compete in the Winter Olympics did so almost 80 years later, at the 1980 Lake Placid Games, when Willie Davenport and Jeff Gadley competed as part of a four person bobsled team. The first African American woman to win a medal was Debi Thomas, who medaled in figure skating at the 1988 Calgary Games (Winter Olympics: Why Team USA is Nearly as White as Snow, 2010).

Rather than geography, the reason few African Americans participate in winter recreation is that winter sports require more money for equipment, training, and coaching. Facilities to practice winter sports are often far from the urban centers where African Americans might live (Winter Olympics: Why Team USA is Nearly as White as Snow, 2010). While I could not find any recent statistics, as of 2003, 2% of skiers in the United States were African American, 3% were Latino, 4% were Asian, and 1% were Native American. Among the membership of the National Brotherhood of Ski Clubs, an African American ski organization, 74% of members are college graduates and 60% live in households with incomes of $50,000 to $100,000 a year (Rudd, 2003). Thus, at the international level but also at the level of individual local participation, access to resources shapes these sports, and this is a barrier to participation among racial minorities. So, even in places with wintry conditions, there is still the barrier of the cost of participation. On the low end, a beginner snowboarder can expect to pay $500-$1000 for a board, bindings, and boots. Adult skis can range from $200 to $1200. A winter season ski pass for Spirit Mountain costs over $400. Since 27% of African Americans live in poverty, compared to 11% of the general population, these kinds of expensive outdoor activities are beyond the reach of many in their community.

The Role of History:

Another reason why winter sports are white is the history of these sports. After all, when an individual imagines winter sports, they might imagine their white ancestors participating in some form of skiing, hockey, or skating. However, this version of history ignores that other cultures have their own winter sports. For instance, Pakistan hosts a Baltistan Winter Sports and Culture Festival wherein participants play Ka Polo and ice football. Pakistan actually has the highest concentration of glaciers outside of the poles (“Traditional Winter Sports festival and ice sporting in GB,” 2016). Likewise, every two years, various circumpolar regions compete in the Arctic Games. Participants from northern Canada, Alaska, Greenland, Sami areas of northern Europe, and northern Russia compete in snowshoeing, snowboarding, volleyball, futsal, and skiing, as well as traditional Dene games like finger pulling, pole push, and stick pull. Additionally, while there is evidence that skiing originated in Fennoscandia, based on the discovery of rock drawings in Norway and a 4,500-year-old ski in Sweden, Iran also has a long history of skiing. In 2000 BC, ancient people in Iran produced skis made of hides and boards (History of skiing, 2005). Cree women would play a marble game wherein marbles carved from buffalo horns were slid towards holes made in ice (Christensen, 2008). Snowshoeing originated in Central Asia 6,000 years ago, then migrated across the Bering Strait to the Americas. The Anishinabe, Cree, and Inuit invented sledding. The word toboggan comes from the Algonquian word odabaggan. Dog sledding was an indigenous invention, and the John Beargrease sled dog race was named after an Ojibwe postal worker who delivered mail from Two Harbors to Grand Marais in often treacherous conditions. The Iroquois also invented a sport called snow snakes, or snow darts. In this game, players throw a smooth stick underhand along the snow to see whose stick slides the farthest (“Winter workout: Enjoy traditional native snow sports,” 2011). Thus, many cultures have robust histories of winter games and sports. However, these winter games were either lost or diminished by colonization, appropriated by colonizers, or simply not promoted as mainstream winter activities.

Colonialism continues to play a role in winter sports. The Ktunaxa tribe of Canada has been fighting the construction of a ski resort for 25 years. The tribe has argued that the site is sacred to them, as it is a place called Qar’muk, where a grizzly bear spirit resides. The Supreme Court of Canada is reviewing whether or not the resort will impinge on their religious rights, as the tribe has argued that the resort will scare away the spirit and render their rituals meaningless (“Skiers v the religious rights of Canada’s indigenous peoples,” 2016). Even Spirit Mountain in Duluth was one of seven sites sacred to the Anishinabe people. It was a place for burials and worship before the development of the ski area and the subsequent golf course and hotel. Spirit Mountain was a meeting place for the Anishinabe and had historical significance as a place on their western migration route (Podezwa and Larson). Environment and culture did not stop a ski resort from being built in Arizona. In 2012, the Navajo and twelve other tribes appealed a judge’s decision to allow Arizona Snowbowl to use wastewater to make snow for its ski resort. The Navajo argued that the land was sacred and that the use of wastewater to make snow was a threat to human health. Navajo people collect medicinal plants from the mountain, which have been contaminated by the wastewater. Using only natural snowfall, the resort would have a nine day ski season. However, the artificial snow extends the season to 121 days. Once again, geography is not necessarily an impediment to winter sports if there is money involved. As of 2015, the issue had not been resolved (Finnerty, 2012). While it would be unheard of to construct a skating rink in a cemetery or cathedral, the religious and cultural practices of Native Americans have been ignored, suppressed, and mocked. It is little wonder that they would not be interested in participating in high priced, environmentally destructive leisure activities on sacred land.

While the lack of Native American participation in some winter activities could be attributed to a different relationship to the land, this doesn’t account for why Native Americans do not participate in snowshoeing. Rudimentary snowshoes originated in Central Asia 6,000 years ago and moved across the Bering Strait to the Americas with the migration of aboriginal peoples. Differing snow conditions resulted in various designs: longer snowshoes were developed by Cree people, who faced warmer, wetter snow conditions, and shorter snowshoes were developed by Iroquois people (Carr, n.d.). Snowshoes were developed as a matter of survival, as they allowed indigenous people to travel and hunt during the winter. The construction of snowshoes was a traditional craft undertaken by both men and women (Boney, 2012). As with many things, European colonizers adopted snowshoeing for their own uses, eventually converting it into something used for recreation. Snowshoeing first became a sport in Canada, then in the U.S. By the 1970s, it began to grow in mainstream popularity, and during the 1980s, aluminum snowshoes grew in popularity (King, 2004). With the advent of manufactured snowshoes, the craft of snowshoe making has been declining. This has also rendered snowshoeing a profitable industry for the companies that make snowshoes. Companies such as Red Feather, Tubbs, Atlas, and Yukon Charlie are not owned by Native Americans, nor do they specifically seek to benefit them. While Tubbs boasts about inventing the first snowshoe for women in 1998 and donating money to Susan G. Komen for the Cure, there is no mention of how their snowshoes might benefit anyone other than white women. Likewise, Redfeather snowshoes, based in La Crosse, Wisconsin, mentions on its website that it hires people with disabilities, but does not mention anything about helping Native Americans, even though its name and company logo invoke Native American imagery. It is no wonder that a simple Google image search for snowshoeing features hundreds of pictures of white people, but no images of Native Americans partaking in the activity. It has become a thoroughly white pastime. It is an example of cultural appropriation so normal and commonplace that the historical and cultural meaning of snowshoeing is almost entirely invisible.

The Role of Racism:

The lack of participation in winter sports may seem trivial, but in many ways it is a microcosm of the larger racial issues in society. For instance, in 1997, Mabel Fairbanks became the first African American woman inducted into the U.S. Figure Skating Hall of Fame. She was 82 at the time of her induction and had never been allowed to skate competitively. Because of segregation, she was not allowed to practice at skating rinks. However, she went on to do her own skating shows for black audiences and was a coach to Debi Thomas and Tai Babilonia. Thomas cited income as a barrier to competitive skating, as she was raised by a single mother and the cost of training can be $25,000 on the low end (Brown). In U.S. society, class intersects powerfully with race. African American children are four times as likely to live in poverty in the United States as white children (Patten and Krogstad, 2015). In 1967, the median income of African Americans was 55% of that of white Americans. By 2013, this had increased to 59%, but a 4% increase over four and a half decades is hardly impressive. Looking at wealth, or such things as retirement savings and home ownership, African Americans owned 7% of the wealth of white people in 2011. This was actually down from 9% in 1984 (Vara, 2013). The segregation that Mabel Fairbanks faced continues today in the form of economic segregation that relegates African Americans to poor communities and low paying service industry jobs. It also persists through the criminal justice system. After all, an African American male born in 2001 has a 32% chance of going to jail, compared to a 6% chance for a white male born in the same year (Quigley, 2011).

Aside from the racist structures that may prevent individuals from partaking in winter recreation in the first place, there is racism within these sports. Surya Bonaly, a black French figure skater of the 1990s, was the only figure skater in history to do a backflip and land on one blade. This astonishing feat actually disqualified her in the 1998 Olympics. She did the flip to flip off the judges, whom she felt scored her lower because of her race. At the time, the rule was that a jump must land on one blade, which was meant to deter backflips, as these would normally be two bladed landings. However, she landed on one blade to test the judges, who disqualified her anyway (Surya Bonaly is the biggest badass in Winter Olympics history, 2014). At the time, critics called her inelegant and more powerful than graceful. Surya was accused of damaging the nerves of fellow ice skater Midori Ito, which caused Ito to fall in her performance (Du, 2016). These critiques demonstrate both racism and sexism, as she did not meet the judges’ expectations of what a figure skater should look like. To them, a powerful black woman was threatening not only to the sport, but to other skaters. The nine-time French national champion, five-time European champion, and three-time World silver medalist now resides in Minnesota, where she teaches skating lessons.

There are many examples of more blatant racism against athletes of color. Irina Rodnina, who lit the torch for the Sochi Olympics, posted an image on her Twitter account depicting Barack and Michelle Obama as monkeys with bananas (Myerberg, 2014). The Northwestern University Ski Team, consisting of 65 individuals, hosted a racially themed party in April 2012, where students dressed as South Africans, Ugandans, Irish, Canadians, Bangladeshis, and Native Americans. The students participated in a “Beer Olympics” wherein they portrayed various nations competing with each other in drinking games, dressing in a stereotypical and mocking fashion. This caused a controversy on campus in which the ski team offered an apology but was also portrayed as the victim of aggression from students of color who were offended by the party (Svitek, 2012). Val James, the first American born black player in the NHL, experienced racism when he played for the Toronto Maple Leafs and Buffalo Sabres in the early 1980s. Bananas were thrown onto the rink and a monkey doll was hung from a penalty box. He was born into a low income family in Florida and did not start skating until he was 13. Despite his accomplishment in overcoming racial and class barriers, mocking spectators would eat watermelons with his name on them. Even today, only 5% of NHL players are black (Sommerstein, 2015). These blatant acts of racism send the message that people of color are not welcome in winter sports.

Another example of racism is evident in the story of the Jamaican bobsled team. Jamaica debuted its famous bobsled team at the 1988 Calgary Olympics, a story that was made into the highly fictionalized movie Cool Runnings. The national team appeared again at the Salt Lake City and Sochi Olympics. At the Lillehammer Olympics, the team placed 14th, beating the US, Russia, and Italy. Bobsledding was easier for Jamaicans to adapt to, since it entails pushing a 600-pound sled as fast as possible and then jumping in. The Jamaican bobsled team crashed during its first Olympics, but its members were treated as national heroes. The team inspired other unlikely countries to form bobsled teams, such as Mexico, the Philippines, Trinidad and Tobago, and several U.S. territories (Atkin, 2014). Nigeria wants to field its own bobsled team at the 2018 Olympics in South Korea. The Nigerian team of former Olympic sprinters has formed and is practicing with a wooden sled until it can raise enough funds for an actual sled and track (Payne, 2016).

The Jamaican bobsled team could be seen as heroic, considering the challenges of becoming winter athletes in an impoverished tropical country. Yet, the team continues to be treated as a joke at best and a racist trope at worst. For instance, in 2013, two San Diego high school football coaches wore “Cool Runnings”-inspired Jamaican bobsled costumes, complete with blackface (Walsh, 2013). In 2015, a group of UW-Stout students attended a private Halloween party as the Jamaican bobsled team, again in blackface; the college issued a statement that it did not affiliate itself with those actions (Perez, 2015). In 2014, a group of Brock University students dressed up as the Jamaican bobsled team and won a $500 costume prize. A critic of these students wrote that such costumes represent the limit of the white imagination, which struggles to envision black people as anything other than rappers, gangsters, or athletes. These costumes are also a way of controlling how black people are understood. The film Cool Runnings itself represented Jamaicans in a stereotypical way, played by actors who were not even Jamaican. Blackface dehumanizes black people, and the Jamaican bobsled costumes affirm a racial hierarchy by reducing the athletes to a stereotype or a joke (Traore, 2014).

While much of this discussion has focused on Africans and African Americans, other racial minority groups face similar challenges. Out of 11,000 U.S. Olympic athletes, only 14 have identified as Native American, and only two of those 14 were female. One of the two was Naomi Lang, who in 2002 became the first Native American-identified woman to compete in the Winter Olympics. She is a member of the Karuk tribe of California but was mocked for wearing traditional regalia at the 2010 Vancouver Olympics. Skating cost her family $60,000 a year; to afford this, she slept on a mattress and wore hand-me-down clothes as a high schooler. Lang resisted competitions, since she felt that her culture stressed cooperation and community. Aside from differences in culture and challenges such as racism and poverty, Native Americans face the added challenge of health. Thirty percent of Native American four-year-olds are obese, twice the rate of any other ethnic group (Sottile, 2011), and Native Americans are three times as likely to develop diabetes as white people. These health problems can be traced back to colonization, which removed Native Americans from their land and traditional food sources and created historical trauma that continues to cause stress and health problems.

Conclusion:

The goal of feminist frolics is to enjoy the outdoors while learning. As we venture outdoors this winter, perhaps we will notice how very white the forests, trails, and hills are. Hopefully, this can be connected back to the larger racial disparities that exist in society. It is my hope that this can help us become attuned to other spaces that are largely white. For instance, one critique of the recent Women’s March on Washington was the whiteness of the feminists in attendance. Many of the issues that keep racial minorities out of winter sports also keep them out of politics. The media and police had an easier time imagining the march as non-violent because it was undertaken by large crowds of white women, whereas Standing Rock and Black Lives Matter protests were viewed as more threatening and treated more violently by police and the media. Becoming aware of why certain groups may feel excluded or unwelcome can help us build stronger and broader movements. That is the larger mission of this discussion. There should be more springtimes for oppressed groups than endless, white winters.