Author: katemschaefer

World War II changed a multitude of things, but not American gender norms and stereotypes. The war reinforced the differences between men and women and deepened the power struggle. Allan M. Winkler drew a direct correlation between women’s involvement in the war effort and the development of the women’s rights movement, but this only tells part of the story.[1] It was not participation, but the gender-based barriers and limits to women’s participation in the war effort that reinvigorated the women’s civil rights movement. “Utilizing American woman power was a matter of military expediency,” wrote Michaela M. Hampf in “‘Dykes’ or ‘Whores’: Sexuality and the Women’s Army Corps in the United States during World War II.”[2] Expediency does not connote acceptance or appreciation, a distinction that followed women throughout the war. “Opponents to even a temporary participation of women felt that not only the efficiency of the military was threatened, but also the traditional system of male dominance and the roles of female homemaker and male breadwinner” continued Hampf.[3] In other words, women who did not stick to hearth and home were seen as more likely to burn down the house than to keep the home fires burning. The response to the possible subversion of traditional gender roles was an increased effort to keep women in their place.

One effective way to reinforce the traditional structure was to play up the differences between men and women by highlighting the ways in which women could never measure up to the ideal represented by American manhood. Low wages and low expectations concerning the duration of female employment were blatant reminders of women’s worth in the workplace relative to their male counterparts; others were less transparent. Articles on industry beauty contests, fashion shows, and “war fashion tips for feminine safety” shared pages with war reports in the monthly newsletters of a New England shipyard, for example.[4] These articles framed women workers as both “helpless” and “glamorous,” two decidedly nondesirable traits in workers meant to keep the economy and the war effort on track.[5]

Media depictions took contradictory representations of women even further. Women were depicted in images like “Rosie the Riveter,” but were also prominent in posters warning soldiers of venereal disease, “penis propaganda” that implied any woman could present a threat to manhood.[6] Male promiscuity was excused, accepted, and even expected, but female promiscuity threatened the health of American society and of its fighting men. The “virgin/whore binary” (coined by Lisa Wade in her essay for Sociological Images) was not limited to factory work or propaganda.[7] Women who served in military capacities had to be careful not to be too ambitious lest they be branded as lesbians, prostitutes, or a combination of both. Linda Grant DePauw noted more work on military prostitution has been published than on women who served as combat soldiers during the war.[8] The relative lack of research on women’s combat service compared to their illicit sexual service preserves the hypersexualized “otherness” of women in war, reminding us that historians are not immune to the social norms and cultural mores of the environment in which they research and write.

Participation in the WWII workforce did not magically give women agency, nor did it open society’s eyes to their worth and abilities. If it had, there would have been no need for the women’s civil rights movement. Society does not change on its own, and the process is brutal. Some women simply could not reconcile the “new sense of self” and “self-reliance” fostered by working outside of the home with the societal expectation that they would “cheerfully leap back to home” when the men returned from war.[9] As Dellie Hahne told Studs Terkel in an interview for his “The Good War”: An Oral History of World War II, “a lot of women said, Screw that noise. ‘Cause they had a taste of making their own money, a taste of spending their own money, making their own decisions.”[10] As the hands that rocked the cradle learned their hands could handle many other tasks, they were not content to go back to how things were. The war had changed them, but it was up to them to change their world.

[2] Michaela M. Hampf, “‘Dykes’ or ‘Whores’: Sexuality and the Women’s Army Corps in the United States during World War II,” Women’s Studies International Forum 27 (2004): 13, accessed December 14, 2016, EBSCOhost.

There were no turning points in the Civil War. There were, however, several turning points identified after the war. Determining a “turning point” is an academic exercise: we see the events that determined the ultimate outcome of the war only after we know the ultimate outcome. A turning point is, as historian Erik Rau described, “ultimately a construct of historical reflection.”[1] Historian Carl Becker went a step further in a 1926 speech to the American Historical Association, describing a turning point as a “symbol, a simple statement which is a generalization of a thousand and one simpler facts which we do not for the moment care to use.”[2]

The symbol a historian chooses depends on the outcome she wishes to highlight. Social historians interpret Antietam as a turning point because the Union victory gave Lincoln the impetus to proclaim the emancipation of slaves in areas in rebellion against the Union. The Union’s hard-won victory at Vicksburg could be seen as an economic turning point because it effectively broke the Confederacy in two along the Mississippi, further stressing already over-extended Confederate supply lines.[3] Gettysburg, on the other hand, could be understood as a military turning point because it demonstrated the Union Army’s ability to definitively defeat Lee on the battlefield.[4] It is also considered the “high-water mark of the Confederacy,” a characterization that only makes sense in hindsight, as we can only recognize the Confederacy’s highest point in reference to its lowest point, Lee’s surrender at Appomattox. Turning points–symbols–are very personal things: one hundred fifty years later, historians continue debating the relative importance of individual battles. Gary Gallagher, for example, rejects the importance of Vicksburg and Gettysburg in favor of the Seven Days Battles, writing that Gettysburg “looms largest in the public imagination as the war’s grand turning point…(but) affected the long-term shape of the war relatively little” while Vicksburg “generated a greater emotional than military result.”[5]

These debates can get historians into trouble, as it can be very difficult to square the knowledge gifted by historical hindsight with the information Americans had at the time. A turning point, Rau reminds us, is not something that “reveals itself to the people living through it at the time.”[6] The soldiers fighting at Antietam, Vicksburg, Gettysburg, and countless other battles did not know the tide of battle had shifted as they took and left the field.

As a budding military historian focused more on wars’ effects on society than on how they are fought, I am much more interested in the war’s point of no return than in the war’s various turning points. In 1861, the North and South alike anticipated a speedy end to the conflict. Twelve months of battle proved this expectation dead wrong. Union and Confederate corpses littered the battlefield of Shiloh in alarming numbers, yet somehow the armies’ determination to keep fighting did not die. Grant responded to a demoralizing Confederate attack at Shiloh on April 6, 1862 with his own counterattack, telling one of his officers that retreat was not an option: “Retreat? No. I propose to attack at daylight and whip them.”[7] Soldiers seemed to share the same dogged determination despite the horrors they witnessed. “If my life is spared I will continue in my country’s service until this rebellion is put down, should it be ten years,” wrote a Union soldier after the battle of Shiloh.[8] Fifteen months and thousands of casualties later, another soldier’s correspondence echoed the same sentiment. Concluding a letter to his family written after the Battle of Gettysburg, Lieutenant William Wheeler of the 13th New York Battery wrote,

“The time may vary a few months, a few years, or even a few decades, but the job will be settled and that all right too. I am…ready to bear, believe, hope, and endure all things for the cause, knowing that if we do so, we also, like Charity, shall never fail.”[9]

Shiloh was a wakeup call to the North and South, but the soldiers also experienced it as a point of no return. It was their first taste of the brutal battles to come, yet it did not deter them from fighting. Whether their motivation was honor, moral conviction, or some other amorphous justification, the soldiers fought on. Both sides knew the short war they predicted was simply not possible: that bridge was crossed and burned. Something changed around Pittsburg Landing in Spring 1862. As the armies progressed to Antietam, Vicksburg, and Gettysburg, they did not know the war was shifting in favor of Union victory, but the battle experiences of 1862 and 1863 gave rise to different soldiers and different armies. In contrast to the “turning points” identified by historians years after the fact, the renewed determination to keep fighting despite the brutality, trying conditions, and uncertainty experienced after Shiloh was “reveal(ed) to the people living through it at the time.”[10] As poet Walt Whitman wrote in 1883,

“Future years will never know the seething help and black infernal background of countless minor scenes and interiors (not the official surface-courteousness of the Generals, not the few great battles) of the Secession War…the real war will never get in the books…The actual soldier of 1862-’65, North and South, with all his ways, his incredible dauntlessness…will never be written.”[11]

Amid all this speculation I also wonder how women experienced the inertia of the war. Their experiences were very different from those of the men, but no less difficult and certainly of no less importance. How did one feel the war on the homefront, especially as the line between homefront and battlefront blurred? “I do not write often now – not for want of something to say, but from a loathing of all I see and hear. Why dwell upon it?” wrote diarist Mary Chesnut in 1865.[12] These words demonstrate Chesnut’s own feeling of loss and awareness of an ending, but how did Northern women feel? Or the emancipated slave women? Did they sense the beginning of the end or recognize the turning of the tide?

Historians are quite adept at finding “the few great battles,” but they sometimes overlook the minutia of individual experience. The difference between a turning point and a point of no return is quite simple, really: one we find in the words of historians, the other in the words of the soldiers themselves.

On March 15, 1895, Michael Cleary burned his wife Bridget alive. He claimed his real wife had been taken by the fairies, and a changeling left in her place. After days of folk remedies (including dousing her with urine and force-feeding her herbal concoctions) and attempts to coax the fairy to leave through exposing it to the lit hearth (in other words, burning Bridget with the flames), Cleary finally poured paraffin oil on her smoldering clothing, setting her aflame. The media descended on Cleary’s trial in a frenzy, digging into the details of the witness statements and “evidence” of the supernatural at work in modern times.

There was more to this fairy story, however. “The overwhelming message of the fairy legends is that the unexpected may be guarded against by careful observance of society’s rules,” explained Angela Bourke in her 1999 book, The Burning of Bridget Cleary.[1] To Bourke, Bridget presented a more potent challenge to her local society than the supernatural ever could. A trained dressmaker who owned her own Singer sewing machine and also raised her own chickens, she was an educated tradeswoman who earned her own money. Her clientele brought her into contact with men and women in higher social classes, and through them, new ideas about what she wanted and expected from life. A woman who could support herself financially could not be as easily controlled by a husband or society in general. Adding the fact that she had not performed her wifely duty and borne a child to carry on the Cleary name, Bridget was a dangerous anomaly within the social norms of her community.

News coverage of battered spouses always seems to turn up warning signs far too late, and Cleary’s story is no different. A few months before she was killed, Bridget confided in her aunt Mary Kennedy about her troubles at home, saying “He’s making a fairy of me now, and an emergency…he thought to burn me three months ago.”[2] Cleary could have been speaking figuratively, saying her husband was disappointed in her and wished she would revert to the naïve, uneducated woman he married. It also could have been a literal cry for help, voicing her fears that her husband planned to harm her physically. History does not allow us to say with certainty which of these possibilities is true, but we do know Michael Cleary justified burning his wife to death because she was a “fairy.”

Cleary went to jail for fifteen years and his wife became the “last witch of Ireland,” a neat label that both sold papers and kept the public from developing too much empathy for the woman. Bridget Cleary was not a witch. At most she was a victim of the supernatural, or at least a victim of a society that used the supernatural as a cover for forcibly bringing women into line with accepted conventions.

One hundred years later, we pat ourselves on the back for disdaining the supernatural. We say we don’t burn witches, but that’s not exactly true. Modern society retains its own system of rules and punishments to regulate female behavior, rules that more often than not contradict those it holds for males. Our worst censure is reserved for women who defy convention: the ones who speak when they are supposed to be silent, rage when they are supposed to be resigned, act when they are supposed to be accepting. We don’t burn women at the stake; we roast them on social media. There is a reason putting someone in their place with a well-timed insult is called a “burn.”

The survivors of the school shooting at Marjory Stoneman Douglas High School in Parkland, Florida, on February 14, 2018, have come under fire for their response to the massacre. It defies the resigned “thoughts and prayers” that bolster the status quo. Channeling their grief and anger into action, the teenagers built one of the most powerful and compelling challenges to the American gun lobby in recent memory, if not ever. The sincerity of their message, spoken and shouted through tears, is difficult to deny, so detractors took aim at the messengers themselves. NRA leaders and other opponents of gun control insisted the teens are too young to be so poised and must therefore be talking heads for adult anti-gun/anti-Second Amendment groups already in place.

The worst insults seem to be reserved for Emma Gonzalez, a young woman whose words are as cutting as her hair is close-cropped. She called B.S., so Leslie Gibson, a Republican candidate for Maine’s House of Representatives, referred to her as a “skinhead lesbian” on Twitter. Outrage over Gibson’s comments forced him to drop out of the race, but branding Miss Gonzalez in this manner shows modern America has its own answer to the Irish changeling fairy tale. Women must look and act a certain way to be accepted and must parrot the approved message if they are to be respected. Her haircut is not threatening in itself. Her sexual orientation, whatever it may be, has absolutely no bearing on her stance on gun control. Gibson may have attacked her classmates for their message, but he refused to hear Gonzalez because of her appearance and his interpretation of her sexuality. A non-white female with the courage to stand up to established adult politicians and the strength to stay on message as she attended a month of friends’ funerals and memorial services? Threatening does not begin to describe the woman. Neither does powerful. She again did the unthinkable at the March 24 March for Our Lives rally in Washington, D.C. by staying silent. For six long minutes and twenty interminable seconds, Gonzalez stood on the stage, silent for most of them as tears dripped down her face. She weaponized silence, bringing the crowd to its feet and her detractors to their knees. The gun control crusader was without speech but had the last word.

In looking to history for lessons, we must remember we will sometimes see things we don’t want to see, including the fact that repeated “thoughts and prayers” are historically ineffective at keeping history from repeating itself. That prejudice, hate, and fear turn words like “lesbian” (and “Pocahontas,” for that matter) into slurs and insults. That over a hundred years of experience, growth, and technology cannot keep us from behaving in the same ways as our “backward” ancestors did when confronted by change and challenge. We don’t burn young women as witches anymore, but we are very keen to crush the spirits of women and men who refuse to conform to societal expectations.

Describing the Cleary case in 1901, historian Michael J. McCarthy bemoaned the fact that the “events took place, not in Darkest Africa, but in Tipperary; not in the ninth or tenth, but at the close of the nineteenth century.”[3] Another century has passed. When will America stop burning its “witches,” or at least accept the fact that we aren’t as enlightened and modern as we would have others believe?

One hundred one years ago today, thousands of Russian women took to the streets to protest high prices and food scarcity. “Down with high prices” and “down with hunger,” they shouted. Their voices did not go unheard. Thousands joined them the next day as a labor strike broke out. On March 9 (February 25 according to the Russian calendar), approximately 200,000 workers filled Petrograd. Their new battle cry? Down with the tsar.

In Hemingway’s The Sun Also Rises, one character describes how he went bankrupt as happening “gradually and suddenly.”[1] “Gradually and suddenly” is also an extremely apt way to characterize the 1917 February Revolution in Russia. The revolutionary spark kindled by the massacre of Father Gapon’s followers in 1905 was temporarily dimmed by Nicholas II’s creation of the Duma and other assorted attempts at reform. The next twelve years saw steady economic decline, rampant inflation, military setbacks and defeats in World War I, and a continued increase in the people’s distrust of and dissatisfaction with their autocratic government. All of these factors contributed to the February Revolution, but what caused it to occur at that specific time? Why not in January, or the previous December? The revolution needed a flashpoint, and that came in the form of a loaf of bread. The person who wants to identify the roots of the February 1917 Revolution need look no further than what was on (and more importantly, what was not on) Russian dinner tables. More than allegiance to any revolutionary dogma or nationalist feeling, Russians of every class and creed shared the experience of persistent food insecurity. Food scarcity does not link to the entire revolutionary movement in a straight line, but it is both a common theme and a symbol of the problems within the Russian government, military, and people themselves.

“It all began with bread,” wrote historian Orlando Figes in his social history of the Revolution.[2] As the country mobilized for war, the majority of the nation’s food production was earmarked for sustaining the millions of men (and women) serving at the front (and rear).[3] Even this was not enough, as soldiers complained of the lack of provisions, arms, and other necessities. “In Ivov, before the eyes of 28 thousand soldiers, five people were flogged for leaving their courtyard without permission to buy white bread,” wrote soldier A. Novokov.[4]

Food insecurity was even worse on the home front. As peasant farmers realized they could not buy enough food to support their families, they turned to farming subsistence crops like potatoes and oats instead of traditional grains. In the cities, workers had money to buy food, but near constant food shortages meant there was no food to buy. “We will soon have a famine,” wrote Maxim Gorky to his wife, Ekaterina. “I advise you to buy ten pounds of food and hide it.”[5] Most would not be able to make such preparations. “They say: work calmly, but we are hungry and we cannot work,” wrote a group of female workers in June 1915. “They say there is no bread. Where is it then? Or is it only for the Germans that the Russian land produces?”[6]

Everyone seemed to recognize Russia’s situation was dire except the tsar. While his country starved, the “little father” of Russia dined in style. Describing a typical meal at Tsar Nicholas II’s table, Alexander Mosolov wrote of “soup…followed by fish, a (game or chicken) casserole, vegetables, sweets, (and) fruit.”[7] The ruling family washed down this abundance of food with “madeira, white and red wines for breakfast… and different wines served at lunch, as is the custom everywhere else in the civilised (sic) world.”[8] There is no more powerful demonstration of the tsar’s disengagement from the people he ruled than the royal family enjoying the finest Bordeaux while the Russian people could barely scrape together enough food to keep themselves alive. Nero is said to have fiddled while Rome burned, but the Romanovs did feast while their people starved.

The people were hungry, the army was in disarray, and the government seemed out-of-touch at best, but the situation was still not quite ripe for revolution. The people needed a common cause they could rally behind. This cause crystallized in the bread lines of Petrograd. Figes described the Petrograd bread lines as almost “a sort of political forum or club, where rumours, information, and views were exchanged.”[9] As they realized common experiences and concerns, the people began to organize. Put quite simply by Figes, “(t)he February Revolution was born in the bread queue.”[10] Organized civil disobedience took a more violent turn as bread shortages led to bread riots. Strikes and walkouts in factories increased the number of people demonstrating in the streets, making it ever more difficult for the police to regain control. The tsarist government fell, the Romanov Dynasty ended, and a Provisional Government made up of leaders of the Duma was left to pick up the pieces. It should be no mystery why the Bolsheviks captured the imagination of the people. Their promises of peace, land, and bread neatly summed up the needs of every Russian soldier, farmer, and worker, man or woman, child or adult.

When compared to other causes of the Revolution—World War I, failed reforms, tsarist incompetence—bread seems insignificant. Lack of bread, however, is extremely significant. The Russian government could not meet the needs of its people. Hundreds of thousands died at the front lines and at home while the Duma struggled against a tsar who had no understanding of his country’s issues or impending demise. The 1917 February Revolution in Russia continued a legacy of protest that included the 1789 women’s bread riots in revolutionary France and the bread riots in Richmond, Virginia (then capital of the Confederacy) in 1863. Food insecurity inspired women to speak, and gave them a message to which their societies listened.

One hundred one years later, women still wait in bread lines, walk miles for clean water, and struggle to care for their families. Equal pay and equal rights continue to escape even the most modern, “civilized” nations. It is easier to create hashtags and slogans than real change. On this International Women’s Day, we recognize the women who spoke up and walked out. We salute the women who continue to refuse to let the status quo determine their present and stifle their future.

On February 29, 1940, African-American actress, singer, and entertainer Hattie McDaniel won an Academy Award for her portrayal of Mammy in Gone With The Wind. Though 1939 also saw the premieres of movies like The Wizard of Oz and Mr. Smith Goes to Washington, GWTW earned thirteen nominations and eight awards, including Best Picture, Best Director, Best Actress, and McDaniel’s Best Supporting Actress accolade. Given the racism and discrimination rampant in the United States in the 1940s, the decision to award Best Supporting Actress to an African-American woman seemed to be a tremendous step forward.

It was.

It also wasn’t.

Born in 1893, Hattie McDaniel began performing when she was in high school as part of a troupe called The Mighty Minstrels. By the time she was in her 20s, she was performing on the radio, the first African-American woman to do so. The performing life did not pay well, and McDaniel often worked as domestic help to make ends meet. After moving to Los Angeles, she was cast as an extra in a Hollywood musical. After earning her Screen Actors Guild (SAG) card, McDaniel went on to small roles in I’m No Angel, The Little Colonel, Judge Priest, and Show Boat. She worked with and became friends with many of the major stars of the day, including Shirley Temple, Henry Fonda, Clark Gable, and Olivia de Havilland. Her relationships with the latter two helped her win the role of Mammy in Gone With The Wind.

McDaniel’s acting ability was never in doubt. Mammy was the soul of Margaret Mitchell’s novel and of the film adaptation. The reception of the film (and its actors), however, demonstrates the dark soul of American racism. None of the African-American actors were able to attend the film’s opening night at Atlanta’s Loew’s Grand Theater. Jim Crow also showed up at the Oscar ceremony the following year. GWTW director David O. Selznick had to petition for McDaniel to be able to attend the ceremony at the Ambassador Hotel. She and her date ended up sitting at a table at the back of the room, separate from her GWTW costars. It was easier to award McDaniel one of the top acting awards in the nation than to find a place for her in American society as an African-American woman. She could be a star, but not an equal.

McDaniel also faced censure from the African-American community, who saw GWTW and the character of Mammy as romanticizing the Old South and slavery. They criticized her for taking roles as slaves and servants, saying she was preserving the stereotypes that fueled discrimination against black Americans. McDaniel disagreed, arguing African-American women did not have the luxury of choosing their roles if they wanted to continue to work (and to eat). “The only choice permitted us is either to be servants for $7 a week or to portray them for $700 a week,” she said.[1] McDaniel believed “a woman’s gifts will make room for her,” but we cannot forget for a moment that a woman is rarely in control of the room’s location or its conditions.[2]

Almost eighty years later, the Academy Awards, and the United States, still struggle with diversity. The 2015 Awards earned the hashtag #OscarsSoWhite when the Academy of Motion Picture Arts and Sciences nominated Caucasian actors for all twenty major acting nominations, the first occurrence since 1998. American society is diverse, but depicting and honoring that diversity continues to be difficult. It is hard to believe we can still celebrate the “first black,” “first Asian,” “first Hispanic,” “first LGBTQ,” “first woman” (the list goes on and on) anything in the year 2018, but that is our reality and our society.

Tonight’s 90th Academy Awards is not without its own firsts: the first female cinematography nominee (Rachel Morrison for Mudbound); nominations for African-American director/comedian/actor Jordan Peele (Get Out) and for female director Greta Gerwig (Lady Bird); and of course, the first Oscars since Harvey Weinstein was dethroned by industry leaders finally taking the sexual assault and abuse allegations against him seriously. 2018 seems to be the year where Hollywood is at least ready to listen to disenfranchised voices, but that does not mean the path ahead is certain. Some have criticized the film Call Me By Your Name, the story of a young man’s summer affair with his father’s research assistant, as promoting sexual promiscuity and underage sexual relations. This is especially interesting during a cinematic season that also saw the opening of Fifty Shades Darker, the second film in a trilogy that regularly substitutes softcore pornography for plot and character development. Others criticize Guillermo del Toro’s The Shape of Water for not going far enough in its development of its disabled characters, namely the protagonist, Elisa.[3] The Oscars, like society itself, is perpetually caught in a game of one step forward, one step back, not far enough—wait, too far. It is only by stretching boundaries that we will ever arrive at a new, more equitable normal.

McDaniel said “we respect sincerity in our friends and acquaintances, but Hollywood is willing to pay for it.”[4] Perhaps the best way forward is to keep reminding Hollywood, and other sources of American power, that it will only get what it pays for. We must also remember that we, the consumers, get what we pay for. Hollywood films what sells. If we continue to demand film art that is inclusive and also put our money where we say our priorities lie, #OscarsSoWhite can become part of history, not a recurring pattern. Race and gender shape art, but do not and cannot determine its worth. Perhaps we must also keep reminding them we have the receipts.

Two statues guard the entrance of the U.S. Army Women’s Museum at Fort Lee, Virginia: Pallas Athena, the Greek goddess responsible for wisdom and war, and a female American soldier, the personification of those attributes. The USAWM was originally part of Fort McClellan, Alabama, but moved to Virginia after that base closed in 1999. It reopened at Fort Lee in 2001, but it was only five years ago that the museum unveiled the first statue of a female soldier displayed on an American military installation. This timeline parallels women’s fight to both participate in the U.S. military and be recognized for their participation. The museum does a very good job of establishing the fact that women have always been involved in American wars; it was official recognition of their contributions that trailed behind.

The museum begins and ends with a large tree adorned with replicas of dog tags left behind by fallen female soldiers. One electronic exhibit allows the visitor to select the names of individual soldiers and pull up their pictures and a short biography and service record. Sacrifice is key to the USAWM: from the “unofficial” soldiers in the Revolutionary and Civil Wars to the WACs of WWII and combat soldiers of Desert Storm and following, female sacrifice was essential to American military success.

As described on its website, the USAWM is a “repository of artifacts and archives,” but also “an educational institution.” The curators have done a fantastic job integrating elements that will keep younger visitors interested and entertained. A theater in a small alcove explains the role of Walt Disney animation in the war effort and shows several Donald Duck cartoons produced during the war. There is also an area that allows children to try on the various caps, headgear, uniform pieces, and arms mentioned and depicted in the exhibits. Kids can also take home free coloring pages as a souvenir.

The USAWM is also an important resource for historians and researchers. Appointments to view the archives or explore their service member oral histories can be made on the museum website. The archive holds over 1.5 million documents, including books, photographs, scrapbooks, correspondence, and other materials. The museum also allows visitors to take home copies of the U.S. Army Center for Military History’s books on the Women’s Army Corps (WAC) by historians Bettie J. Morden and Mattie E. Treadwell. As military history has long been the domain of men, finding sources about military women written by military women is refreshing to say the least. Morden’s and Treadwell’s works would be extremely useful to students and researchers interested in investigating female participation in the army from 1942 to 1978. Treadwell’s Special Studies text provides additional information on mid-century American interpretations of gender and war and reflects on how these interpretations shaped what military women were and were not allowed to do during wartime.

The U.S. Army Women’s Museum is easy to overlook, but well worth a visit. At the time of our visit (March 1, 2018), several exhibits were under construction, including redesign of a gallery and an expansion of one area. I’m definitely planning a return visit to see the new pieces and check out the U.S. Army Quartermaster Museum next door.

“The right of citizens of the United States to vote shall not be denied or abridged by the United States or by any State on account of sex. Congress shall have power to enforce this article by appropriate legislation.”

The U.S. Constitution is, and is intended to be, a living document, but that does not mean changes are easy. Constitutional amendments are hard-fought and hard-won. The debates they spur often inspire strange political alliances. The long fight for female suffrage is the story you know, but the woman’s suffrage movement’s awkward opposition to the 15th amendment is not widely publicized (for obvious reasons).

The 19th amendment to the Constitution, ensuring “(t)he right of citizens of the United States to vote shall not be denied or abridged by the United States or by any State on account of sex,” was ratified in 1920. The amendment was the result of an over eighty-year battle for women’s rights in the United States. The 1848 Seneca Falls Convention laid out the women’s movement’s battle strategies and goals, but the changing political landscape would often thwart the suffragists’ plans.

The Civil War divided the nation; Reconstruction would end up dividing the woman’s rights movement. The proposed 15th amendment stated suffrage “shall not be denied…on account of race.” This gave white and non-white men the right to vote, but as it did not specify suffrage could not be denied based on sex, women were again denied the right. Woman’s rights leaders Elizabeth Cady Stanton and Susan B. Anthony bristled at the idea and withdrew their political support for the amendment. “If that word ‘male’ be inserted,” wrote Stanton, “it will take us at least a century to get it out.”[1]

“That word” was not included, but the implication was enough to bar women from voting. The women’s movement split into two groups: Anthony and Stanton formed the National Woman Suffrage Association (NWSA), while Lucy Stone and others who supported the ratification of the 15th amendment formed the American Woman Suffrage Association (AWSA). Disagreements over ideology and methodology hampered the movement’s efficiency and its ability to present a united message regarding woman’s rights. Further complicating matters, some groups interpreted the NWSA’s anti-15th amendment stance as evidence of racism within the movement. Though the NWSA’s connection to groups that supported racial discrimination was tenuous at best, it impacted their image and message. It was not a particularly effective way to court the thousands of African-American women who also wanted civil rights as American citizens, to say the least.

As is often the case, an American war was the ultimate impetus for American social change. Women’s contributions to mobilization for World War I finally convinced male leaders and politicians that women’s participation could not be ignored. Anyone who gave so much for their country, and made do with so little, deserved the civil rights too long denied them. (Of course, women’s protests and other forms of mobilization for suffrage also forced politicians’ hands.) “I regard the concurrence of the Senate in the constitutional amendment proposing the extension of the suffrage to women as vitally essential to the successful prosecution of the great war of humanity in which we are engaged,” said President Woodrow Wilson in an address to Congress in 1918.[2] Congress agreed, passing the 19th amendment in 1919. It was ratified the following year.

The final challenge to the amendment’s constitutionality came in the 1922 Supreme Court case, Leser v. Garnett. In the original case, lawyer Oscar Leser sued to have two women removed from Maryland voting rolls, saying women did not have the right to vote in Maryland because the state had not ratified the 19th amendment. Justice Louis Brandeis, writing for the Court, ruled that women’s suffrage applied to all American women regardless of whether or not their state had ratified the amendment. Ratifying the amendment put the law on the books, but the 1922 decision in Leser v. Garnett ensured it was a law that women could use.

February 21 was the 170th anniversary of the publication of Karl Marx’s and Friedrich Engels’ magnum opus, Manifest der Kommunistischen Partei, commonly known as The Communist Manifesto. This controversial work was built on a controversial philosophy: end distinctions between social classes; abandon capitalism and the free market system; and divorce society from religion and religious practices. Religion, class, and economics were critical drivers of thousands of years of European history. Borrowing a phrase from the Disney movie Pocahontas, if an endeavor did not promise glory, God, or gold (and preferably some combination of the three), it was quickly abandoned.

Amid the censure of his community and frequent run-ins with local law officials, Marx never stopped working for revolution. This is the story we know. What history rarely mentions is the woman who made it possible: his wife, Jenny von Westphalen.

Johanna Bertha Julie Jenny von Westphalen was born into Prussian aristocracy and all of the privileges that entailed. Her father, Ludwig von Westphalen, seemed to enjoy Karl Marx as a conversationalist, but the idea of him becoming part of the family was out of the question. Jenny loved him, however, and turned her back on her family to marry Marx.

It was not an easy life for the former aristocrat—she went from aristocracy to proletariat in one fell swoop, trading salons and dinner parties one day for pawn shops and bread lines the next. She firmly believed in her husband’s ideas and teachings, possibly even more so because she had to live them. Her liberal views extended to her stance on women’s place in society, which skewed towards proto-feminism:

“In all these struggles we women have the harder part to bear because it is the lesser one. A man draws strength from his struggle with the world outside, and is invigorated by the sight of the enemy, be their number legion. We remain sitting at home, darning socks.”[1]

Jenny did more than sit at home and darn socks. In addition to giving birth to seven children and enduring the pain of losing several, she kept the household together as the family fled from country to country. In an interesting twist of irony, the economic historian and philosopher could not keep his own accounts straight. If the family owed money, and it always seemed to owe something to someone, Jenny went to the local pawn shop and sold whatever she could to make ends meet. Her ability to keep the family fed and clothed allowed Karl the time to write, think, and occasionally philander (one affair resulted in a son whom Friedrich Engels claimed as his own to protect Karl’s reputation).

Jenny was also directly responsible for the publication of Marx’s writings. Karl Marx’s handwriting was so messy that his first drafts were illegible. Jenny recopied the pages in her own hand, producing manuscripts that could be sent to publishers for printing. She also acted as Karl’s personal correspondence secretary, answering letters for him when he was too ill to take on the task.[2]

Jenny von Westphalen Marx fought for her husband, for her family, and for the class revolution she believed to be inevitable. The only fight she could not win was against cancer. She died on December 2, 1881, after battling the illness for years. Karl was not well enough to attend the funeral, but family friend Friedrich Engels spoke at the graveside on his behalf. Buried “at the cemetery of Highgate in the section of the damned,” historians also buried Jenny in the historical record.[3] Without Jenny’s work as copyist and editor, the Communist Manifesto and Das Kapital may never have seen the light of day. Marx and Engels may have given birth to Communist revolution, but Jenny was the revolution’s midwife.

kms

Author’s Note: For more information on Jenny von Westphalen and her relationship with her husband, please see the following sources:

Love and Capital: Karl and Jenny Marx and the Birth of a Revolution by Mary Gabriel

On February 20, 1985, the Republic of Ireland legalized the sale of non-medical contraceptives. Whether in the form of pills, condoms, or spermicides, birth control’s positive impact on women’s history is difficult to argue against. A woman’s ability to decide whether she wanted to have a baby allowed her the freedom to prioritize other aspects of her life. For some, that meant joining (or rejoining) the workforce. For others, it meant they could simply choose not to bear children.

The year 1985 seems very late for a nation to legalize contraception, but we should remember that other nations also had checkered relationships with the topic and continue to struggle with the idea that a woman should have the last word on her body. American nurse and women’s rights activist Margaret Sanger was imprisoned several times for trying to educate early twentieth-century women on their reproductive health and options. In what is surely the irony to end all ironies, Sanger was arrested on the grounds of spreading pornography. Her attempts to mail copies of her newsletters and journals ran afoul of the 1873 Comstock Act, which outlawed the circulation of “obscene and immoral materials.”[1]

To Sanger, the true obscenity was forcing women into an occupation for which they were unprepared or in which they had no interest. Personal experience seemed to be her guide: she both witnessed her mother’s early death from multiple pregnancies and miscarriages and nursed many women who suffered the consequences of back-alley abortions and other do-it-yourself methods intended to end unwanted pregnancies. “No woman can call herself free who does not own and control her body,” wrote Sanger in a 1919 article for Birth Control Review. “No woman can call herself free until she can choose consciously whether she will or will not be a mother.”[2] Believing reproductive education and accessible, reliable, and effective contraceptive methods would enable women to make informed decisions about their own health, Sanger worked tirelessly for women’s civil rights until her death in 1966.