
How much of a story about life in the good old days is fact and how much is fiction? In the HBO miniseries John Adams, a mob of Patriots attacks a British customs officer, strips him naked and covers him in tar and feathers. The scene shows the victim slathered in asphalt tar – a substance that did not exist in the 1770s. Mobs instead used pine tar, which is brown rather than black, but the filmmakers of course knew that modern viewers would not recognize it as readily as they would asphalt.

Such artistic license is arguably negligible, and John Adams deserves distinction as a period drama that is predominantly accurate, rendering its characters and indoor scenes as gray and as musty as life was before electricity and indoor plumbing. Most filmmakers prefer to embellish period dramas, opting for audience appeal over historical accuracy. In the 2002 film version of The Importance of Being Earnest, the Victorian protagonists serenade their beloveds with an upbeat jazz number, which is the equivalent of playing disco music in 1945. And for most of the story, Colin Firth and Rupert Everett look like they always do – that is, clean-shaven and sporting the boyish coiffures they previously wore in romantic comedies set 100 years later. While parasols and top hats abound, no one in the film flaunts the glistening hair gel and heavy handlebar mustaches of the play’s original stage production in 1895.

Directors almost always decide that lovers and heroes in period pieces should adhere to contemporary fashion rules from the neck up, lest audiences be less likely to swoon. Thus pretty much any film set in Ancient Egypt, Rome or the Early Modern Era pretends that men never wore eyeliner or lip rouge. (And that all the good guys looked white.) Films set in the Victorian era correctly leave cosmetics off the men but wrongly apply them to the female characters, who would have been insulted by anything more than face powder. (Makeup was for actresses and prostitutes, and Victorians didn’t see much difference between the two.) Even though Queen Elizabeth II is the most famous woman in the world, the actress who portrays her in the award-winning series The Crown has a far daintier nose and jaw, with eyebrows plucked to evoke the cover girls of today. Filmmakers who wish to forgo such historical inaccuracies face an uphill battle, according to John Adams director Tom Hooper: “Wherever possible I wanted to do things that weren’t about making people beautiful. The truth is there’s a whole machine of filmmaking that’s all about making people look great. And you have to really intervene in every department to sort of say, ‘No, I don’t want that. I don’t want people to wear any makeup. You’re not allowed to wash people’s hair.’ ”

Hollywood takes such liberties in the hopes that the audience will barely notice. Viewers watch period dramas in order to ooh and aah at the finery, and imagine that they could easily slip into an earlier era and have a grand old time. They can imagine this because they are protected from unpleasant information such as the fact that the powdered and painted aristocrats of Louis XIV’s court regularly relieved themselves in the gilded corridors and behind the velvet curtains of the palace. Horace Walpole noted the stench at the time, but Hollywood has yet to. The audience’s comfort comes at the expense of the opportunity to learn that standards of attractiveness, cleanliness, and morality are far from universal, shifting continuously throughout human history. Lost, likewise, is the chance to learn that our feelings of disgust are often not innate but a product of where and when we grew up.

A handful of films and plays have thrived by underscoring the changes between then and now. Mad Men earned critical acclaim and a loyal following not only for its meticulously authentic fashion but for subtly laying bare the secrets of everyday life in the early 1960s that TV shows of the era had omitted: rampant infidelity, casual racism, sexual harassment, anti-Semitism, misogyny, covert homosexuality and vicious homophobia, legal date rape, domestic violence, and health hazards as far as the eye can see. Hamilton has been a Broadway sensation for deliberately altering the facts and urging the audience to take notice – wanting all to be fully aware of the historical significance of people of color portraying national heroes who owned slaves.

Mad Men and Hamilton have garnered attention precisely because they deny audiences the escapism so commonly peddled by period pieces. Escapism can be innocuous, but not when it warps our sense of reality and the world as it is, once was, and should be. When wildly popular stories like Gone with the Wind and Song of the South portray plantation life as merry, influential social conservatives argue that African-Americans had no complaints before the Civil Rights Movement. When populist politicians inform voters who pride themselves on a lack of “elitist knowledge” that they can make their countries “great again,” difficult truths about the past remain problems unsolved. Too often our glorious history as we like to think of it is more fantasy than fact – which is why sociologists call it The Way We Never Were.

Last week there was much discussion on the blog about the social ramifications of height, but what about high heels? The Women and Equalities Committee of the U.K.’s House of Commons recently found that employee dress codes requiring heels for women violate laws banning gender discrimination. The Committee reviewed the matter after receiving a petition signed by 138,500 people and started by Nicola Thorp, a London receptionist who in December 2015 had been suspended by her employer without pay for violating the company’s dress code for women by showing up for work in flats.

I personally find high heels often quite becoming. I also find them physically hazardous. Pretty much anyone with any sort of orthopedic disability has been advised by their specialists again and again to keep the time they spend in heels to a minimum. While reporting on the U.K. ruling, NBC News let women in on “an essential secret — carrying a pair of trainers in your handbag.” This is cold comfort to those of us who know that back pain is also caused by carrying more than 5% of your body weight in your handbag. One twentysomething friend with an invisible disability was told by her spinal surgeon that she should wear heels pretty much never. Thorp was right to challenge the policy as gender discrimination because only women are required by some employers to toddle about on their toes, but a case could be made on the basis of disability discrimination as well.

That disabled women could be fired—or simply looked upon unfavorably in the workplace for “not making an effort”—is indeed a social justice issue. We in the West have come to regard heels as a sign of female beauty and professionalism not so much because they are inherently smart looking, but because they were invented to signify wealth.

Heeled shoes were designed to be painful and inefficient if you walked around much because the upper classes around the world have traditionally used their fashion statements—from foot-binding to corsets to flowing robes and long fingernails—to prove that they were wealthy and didn’t need to labor to survive like the lowly workers. Prof. Lisa Wade offers a wonderful breakdown of the history of the high heel at Sociological Images, pointing out that heels were first considered manly because men were the first to don them to display social status. Women began wearing them to imitate this status, which led to men abandoning them. Wade explains:

This is a beautiful illustration of Pierre Bourdieu’s theory of class distinction. Bourdieu argued that aesthetic choices function as markers of class difference. Accordingly, the elite will take action to present themselves differently than non-elites, choosing different clothing, food, decor, etc. Expensive prices help keep certain things the province of elites, allowing them to signify their power; but imitation is inevitable. Once something no longer effectively differentiates the rich from the rest, the rich will drop it. This, I argue elsewhere, is why some people care about counterfeit purses (because it’s not about the quality, it’s about the distinction).

Eventually men quit wearing heels because their association with women tainted their power as a status symbol for men. (This, by the way, is exactly what happened with cheerleading, which was originally exclusively for men.) With the Enlightenment, which emphasized rationality (i.e., practical footwear), everyone quit wearing high heels.

What brought heels back for women? Pornography. Mid-nineteenth century pornographers began posing female nudes in high heels, and the rest is history.

In many moments in the history of many cultures, extra pounds of body fat have also signified high social status because wealth was needed to keep someone well-fed. The prices of sugar and meat plummeted in the 20th century in the West, and the two were soon no longer considered delicacies only the wealthy could afford. This shift, coinciding with the eugenics craze of the early 20th century, gave birth to our modern preoccupation with not just longevity and bodily cleanliness but physical “fitness.” These shifts are why modern fashion dictates that those who wish to project high social status should dress inefficiently, like a traditional aristocrat, while remaining physically strong, slim and active, like a traditional laborer.

High-status men are now encouraged to wear expensive attire in addition to building and maintaining a muscular physique that can get down in the dirt – something the manly dukes and earls of yore would have considered horrifically common. High status women are now encouraged to diet and exercise to be “healthy” in addition to wearing heels to hint at sexiness in their physique via the historical association with both princesses and porn stars – at the risk of breaking down their bodies as they rush off to work and back like the peasant women of yore.

Indeed, our modern fashion rules for professional women are still quite young because upper-class women who worked were an anomaly in the Modern Era until the 20th century. First- and Second-Wave feminists successfully fought for our right to vote and to become actors, bankers, flight attendants, and politicians, but we have yet to expunge the idea that a woman who suffers for beauty is admirable rather than irresponsible. Nicola Thorp’s petition, however, has dealt it a blow.

Women should feel free to wear heels almost whenever they wish, but disabled women should not have to suffer social consequences for choosing to protect their bodies. True equality may also come when men can wear heels like Mozart and Louis XIV without fear of gay bashing, as long as such a fashion shift does not harden into a fashion decree. If it does, then disabled men will have to use their right to petition against discrimination.

No matter how you personally feel about them, just remember that modern ideas about fashion, gender/sex, class, and disability all meet whenever we consider a pair of high heels. That’s why we call it intersectionality.

There are undoubtedly those who find the idea of a night club offering its VIP-members a “free midget” for the evening hilarious. (It’s just so novel, ain’t it?) And there are certainly those who find the idea offensive. (“That was offensive,” comedienne Joanna Hausmann points out, is the third most-uttered phrase in America.)

And then there are those of us who know that the idea is not original. Far from it. It is at least 2,000 years old. Records show people with dwarfism were purchased as slaves in Ancient Rome and China up through the Renaissance. In bondage for their entertainment value, they were made to dance like monkeys and sometimes kept in cages.

From the Early Modern Era through the 18th century—and, in some parts of the world, into the late 20th century—they remained ubiquitous as lifelong servants and entertainers to aristocrats and dictators. Whether such servitude constituted slavery is difficult to ascertain. There is no evidence to suggest dwarfs were relegated by law to slave status at birth like other minorities were, perhaps because dwarf entertainers and servants were a frivolity for monarchs rather than a source of cheap labor for major industries. Records predating the 20th century reveal a handful of people with dwarfism who lived independent lives. But, like the freak shows of the circus, servitude was often dwarfs’ best hope for sustenance in a world where families often abandoned them as children.

Dwarf advocacy organizations have condemned the Manchester night club’s offer as “discriminatory.” But rather than entangle ourselves in another battle between the that’s-so-offensive crowd and the hey-lighten-up crowd, I would prefer to ask both sides if they are aware of the history of servitude and enslavement. And if, as I suspect, most are not aware of it, it is necessary to consider why.

“Is the world becoming a more dangerous place?” This is not a subjective question, but it is all too often answered by entirely subjective findings. Do you watch the local news and listen to a police scanner? Do you see graffiti as street art, or cause to clutch your valuables and not make eye contact with anyone? Do you know someone personally who has been robbed, attacked, or murdered?

The objective answer to the original question, however, is no. The world is in fact safer than it has ever been in human history because we humans have become drastically less violent. Never before has there been a place with such high life expectancy and such low levels of violence as Western Europe today. Around the globe, there are lower rates of war and lower rates of spankings. There is no guarantee that the decline in violence will continue. But most of us have a hard time even believing that it exists at all.

In his book The Better Angels of Our Nature, Harvard psychologist Steven Pinker shows that the human emotional response to perceived danger—especially danger to ourselves or to someone with whom we can easily empathize—always risks distorting our perceptions of safety. One of the problems of empathy, he argues, is that we more readily feel for those we perceive to be similar to us. This results in our investing more time, money and emotion in helping a single girl fighting cancer if she speaks our language and lives in a house that looks like our own than in helping 1,000 foreign children fighting malaria. We are more likely to disbelieve a victim of abuse if we more quickly identify with the accused, and the same is true in reverse. And if you have been the victim of a horrendous crime or are struggling to survive in any one of the countries ravaged by war this year, you may become angry at any suggestion that the world is getting better, lest the world ignore the injustices you have suffered.

Those of us working in human rights must beware these problems whenever we trumpet a cause. Every activist’s greatest enemy is apathy, and fear of it can lead us to underscore threats while downplaying success stories in order to keep the masses mobilized. But any method founded on the claim that we have never lived in such a dangerous time is spreading lies.

The only sound way to appraise the state of the world is to count. How many violent acts has the world seen compared with the number of opportunities? And is that number going up or down? … We will see that the trend lines are more encouraging than a news junkie would guess.

To be sure, adding up corpses and comparing the tallies across different times and places can seem callous, as if it minimized the tragedy of the victims in less violent decades and regions. But a quantitative mindset is in fact the morally enlightened one. It treats every human life as having equal value, rather than privileging the people who are closest to us or most photogenic. And it holds out the hope that we might identify the causes of violence and thereby implement the measures that are most likely to reduce it.

There is a risk that some will see the decline in violence as reason for denying crime (“Rape hardly ever happens!”), dismissing others’ pain (“Quit whining!”), and justifying their disengagement (“See? We don’t need to do anything about it!”). Pinker and Mack, however, claim the decline can be attributed in the modern era to the efforts of those in the human rights movements. In the example of violence against women:

The intense media coverage of famous athletes who have assaulted their wives or girlfriends, and of episodes of rape on college campuses, have suggested to many pundits that we are undergoing a surge of violence against women. But the U.S. Bureau of Justice Statistics’ victimization surveys (which circumvent the problem of underreporting to the police) show the opposite: Rates of rape or sexual assault and of violence against intimate partners have been sinking for decades, and are now a quarter or less of their peaks in the past. Far too many of these horrendous crimes still take place, but we should be encouraged by the fact that a heightened concern about violence against women is not futile moralizing but has brought about measurable progress—and that continuing this concern can lead to greater progress still…

Global shaming campaigns, even when they start out as purely aspirational, have led in the past to dramatic reductions of practices such as slavery, dueling, whaling, foot binding, piracy, privateering, chemical warfare, apartheid, and atmospheric nuclear testing.

The decline of violence undermines the arguments of those who invest their energy in fear-mongering (“People are evil and out to get you!”), self-martyrdom (“I’ve tried for so long—I give up!”) or indifference (“There’s no point to even trying.”). In his excellent book, which is well worth your time, Pinker demonstrates that all humans are tempted to use violence when we are motivated by feelings of greed, domination, revenge, sadism, or ideology (i.e., violence for a greater good), but we have proven that we can overcome these temptations with our capacity for reason, self-control, sympathetic concern for others and the willingness to adhere to social rules for the sake of getting along. There is much work to be done, but the decline is ultimately cause for hope.

Nothing divides a country quite like a national holiday. When I was studying in St. Petersburg ten years ago, there was as much apathy as there was celebration on the Russian Federation’s June 12th decennial. German reactions to Reunification Day every October 3rd are anything but united. And on the United States’ Fourth of July last month, Chris Rock tweeted, “Happy white peoples independence day, the slaves weren’t free but I’m sure they enjoyed fireworks.”

Amid the outbursts of “unpatriotic!”, conservative blogger Jeff Schreiber shot back, “Slavery existed for 2000yrs before America. We eradicated it in 100yrs. We now have a black POTUS. #GoFuckYourself.”

Schreiber has since written a post on his blog, America’s Right, apologizing for cursing and conceding that the slave trade was unconscionable. But for all his insistence that he never intends to diminish the horrors of American slavery, he adds that President Obama’s policies are now “enslaving Americans in a different way.” (Real classy.) And for all his reiteration that slavery was always wrong, he still hasn’t straightened out all the facts skewed in his tweet.

“Slavery existed for 2,000 years before America.” He uses this supposed fact to relativize the oppression, as if to shrug, “Well, everyone was doing it back then.” His tweet implies that the ubiquity of the slave trade makes America’s abolition of it exceptional, not its participation. This argument hinges on fiction. Slavery did not exist for 2,000 consecutive years. In the West, it was pervasive in Antiquity and the Modern Era, but it was downright uncommon in the Middle Ages. (While anathema to our modern ideas of freedom for the individual, medieval serfdom was not slavery.) Slavery was reinstituted in the West roughly 500 years ago with the advent of colonialism. And the United States held on to it long after most other colonial powers had abolished it. Critics can say what they want about the effectiveness of Chris Rock’s rain-on-a-parade tactics, but his argument did not distort history.

In my last post, I argued against concealing the human rights abuses of the past for the sake of nostalgia, not least because it is the height of inaccuracy. But portraying history as an unbroken tradition of straight, white, able-bodied male dominance, as Schreiber did, is also inaccurate. The universal human rights movement in its modern form is indeed only a few decades old, but the idea of equality for many minorities can be found throughout history at various times and places. The Quakers have often been pretty keen on it.

And almost no minority has been universally condemned. People with dwarfism appear to have been venerated in Ancient Egypt. Gay men had more rights in Ancient Greece and in many American Indian societies than in 20th-century Greece or the United States. Muslim women wielded the right to divorce long before Christian women. English women in the Middle Ages were more educated about sex than their Victorian descendants. Much of the Jewish community in Berlin, which suffered such unspeakable crimes culminating in the mid-20th century, was at earlier times better integrated into the city than Jewish communities were in many other capitals of Central Europe. In short, history does not show that racism, misogyny, homophobia, ableism, transphobia, and our current beauty standards are dominant social patterns only recently broken by our ultra-modern culture of political correctness. The oppression of minorities may be insidious and resilient throughout history, but it has never been universal.

Downplaying the crimes of the past by claiming everybody did it is both historically inaccurate and socially irresponsible. It is perverse when such misconceptions fuel arguments for further restrictions on human rights. In 2006, Republican Congressman W. Todd Akin of Missouri claimed that “Anybody who knows something about the history of the human race knows that there is no civilization which has condoned homosexual marriage widely and openly that has long survived.” Even if this were true, the argument is absurd. (It appears that no civilization has regularly chosen women with dwarfism for positions of executive power, but does that mean it’s a bad idea?) But the argument collapses because it relies on facts that are untrue.

Granted, hyperbole is a constant temptation in politics. Stating things in the extreme is a good way to grab attention. In an earlier post on sex, I asserted that mainstream culture assumes women’s sex drive is lower than men’s because female sexual expression has been “discouraged for millennia.” Patriarchy has certainly been a major cultural pattern around the world and throughout history, and we cannot overstate its power on both the collective and individual psyche. But patriarchy is by no means a cultural universal. Ethnic groups in Tibet, Bhutan, and Nepal continue to practice polyandry to the present day, while history shows many others that have done the same at various times. These exceptions call into question the biological theory that heterosexual male jealousy is an insurmountable obstacle to sexual equality. And they deny conservatives the excuse that insists, “Everybody’s been doing it.”

They haven’t been. Xenophobia has never been universal. Humans may have a natural fear of the unfamiliar, of what they perceive to be the Other, but our definitions of the Other change constantly throughout time and space, as frequently and bizarrely as fashion itself. This makes history craggy, complex, at times utterly confusing. Like the struggle for human rights, it is simultaneously depressing and inspiring. But whatever our political convictions, we gotta get the facts straight.

Anytime my partner and I don’t know what to do or say, one of us asks, “What’s in the news?” and we dive into a political discussion. So it’s no surprise that we’ve become somewhat embarrassingly addicted to Aaron Sorkin’s The Newsroom. The news media has been (unsurprisingly) critical of a show founded on the idea of chastising the news media. Feminists have been (sometimes rightly) critical of its portrayal of women. The show has almost countless strengths and weaknesses, but I find myself still obsessing over the brilliant, captivating opening scene that kicked off the series. If you can’t watch the clip, it basically boils down to a flustered news anchor named Will McAvoy overcome with disgust at the state of the nation and nostalgia for the 1950s and 60s: “America’s not the greatest country in the world anymore,” he sighs. “We sure used to be.”

We stood up for what was right. We fought for moral reasons. We passed laws, we struck down laws for moral reasons. We waged wars on poverty, not poor people. We sacrificed, we cared about our neighbors. We put our money where our mouths were, and we never beat our chests… We cultivated the world’s greatest artists and the world’s greatest economy. We reached for the stars, acted like men. We aspired to intelligence. We didn’t belittle it. It didn’t make us feel inferior… We didn’t scare so easy.

“Nostalgia” literally means “aching to come home.” It’s the temporal form of homesickness, time rather than place being the source of pain. We all do it. It can be oddly soothing at times to be in awe of another era, especially the one you were born in. But Will McAvoy should watch Woody Allen’s Midnight in Paris for proof that nostalgia is an ultimately futile pastime that every sad sack of every era has hopelessly indulged in. (If “things were better back in the day,” then how come every generation says this?) But since McAvoy’s nostalgia is an earnest, political battle cry, heaping laurels on the good old 1950s and 60s when the leaders of the day did their job right, I’m more inclined to have him watch Mad Men. Or just open up the 1960 children’s illustrated encyclopedia I found at my great aunt’s house, which states, among other things: “The Australian aborigine is similar to the American negro in strength, but less intelligent.” Didn’t scare so easy, indeed.

The problem with nostalgia is that it is far more emotional than intellectual and thereby lends itself to inaccuracy all too easily. America was indeed doing great things sixty years ago. And reprehensible things. We hid our disabled and gay citizens away in institutions, asylums and prisons. We enforced the compulsory sterilization of mentally disabled and Native American women. We took decades to slowly repeal segregationist laws that the Nazis had used as models. We maintained laws that looked the other way when husbands and boyfriends abused their partners or children. In short, we handed out privilege based on gender, sexuality, ethnicity, religion, physical and mental capabilities with far greater frequency and openness than we do today. Perhaps we were the “greatest country in the world” compared to the others. (Europe and East Asia were trying to recover from the devastation of World War II, after all, while other nations were trying to recover from the devastation of colonialism.) But McAvoy’s wistful monologue is much more a comparison of America Then with America Now. And that is hard to swallow when considering that a reversion to that society would require so many of us to give up the rights we’ve been given since then.

Am I “another whiny, self-interested feminist” out to bludgeon the straight, cis, WASPy male heroes of history? Am I “just looking to be offended”? No, I’m struggling. Next to literature and foreign languages, history has always been my favorite subject. And pop history always touches upon this question:

“If you could go back to any period in history, which would it be?”

From an architectural point of view? Any time before the 1930s. From an environmental point of view? North America before European contact. From a male fashion point of view? Any period that flaunted fedoras or capes. From a realistic point of view? No other time but the present. Because if I am to be at all intellectually honest in my answer, there has never been a safer time for me to be myself.

Last year, I read The Lives of Dwarfs: Their Journey from Public Curiosity toward Social Liberation by Betty Adelson. Despite my love of history, I hated almost every minute of it. Lies My Teacher Told Me by James Loewen had helped me understand how so many black American students feel uninspired by U.S. history and the figures we hold up as heroes because so many of those men would have kept them in shackles. But it wasn’t until I read The Lives of Dwarfs that I understood how nasty it feels on a gut level to face the fact that most of history’s greatest figures would more likely than not consider you sub-human.

With the exception of Ancient Egypt, my own lifetime has been the only period wherein someone with dwarfism could have a fair chance of being raised by their family and encouraged to pursue an education and the career of their choice, as I was. At any other point in Western history, it would have been more probable that I would have been stuck in an institution, an asylum or the circus (the Modern Era before the 1970s), enslaved by the aristocracy (Rome, Middle Ages, Renaissance) or left for dead (Ancient Greece). Of course inspiring cases like Billy Barty show that a few courageous/decent parents bucked the trends and proved to be the exception to the rule, but that’s what they were. Exceptions.

I am fortunate to have been born when I was and for that reason, nostalgia for any other period in time can never be an intellectually honest exercise for someone like me. The moment someone says, “Yeah, well, let’s not dwell on odd cases like that. I’m talking about the average person,” they’re essentially saying, “Your experience is less important than mine.”

Everyone is entitled to have warm, fuzzy feelings about the era in which they grew up. If any period can put a lump in my throat, it’s the 1970s. The Sesame Street era. The boisterous, primary-colored festival flooded with William’s Doll, Jesse’s Dream Skirt, inner city pride à la Ezra Jack Keats, and androgynous big hair all set to funky music can evoke an almost embarrassing sigh from me. Donning jeans and calling everyone by their first name, that generation seemed set on celebrating diversity and tearing down hierarchies because, as the saying goes, Hitler had finally given xenophobia a bad name. Could there be a more inspiring zeitgeist than “you and me are free to be you and me”?

But I’m being selective with my facts for the sake of my feelings.

Sesame Street and its ilk were indeed a groundbreaking force, but they were hardly the consensus. Segregation lingered in so many regions, as did those insidious forced-sterilization laws. LGBT children were far more likely to be disowned back then than today—Free to Be... You and Me had nothing to say about that—and gay adults could be arrested in 26 states. The leading feminist of the time was completely screwing up when it came to trans rights. Although more and more doctors were advocating empowerment for dwarf babies like me, adult dwarfs faced an 85% unemployment rate with the Americans with Disabilities Act still decades away. And Sesame Street was actually banned in Mississippi on segregationist grounds. When the ban was lifted, its supporters of course remained in the woodwork. We have made so much progress since then. It would be disingenuous for me to ignore that simply for the sake of nostalgia.

To be fair to Sorkin, it’s a hard habit to kick. We have always glorified the past to inspire us, no matter how inaccurately. Much of American patriotism prides itself on our being the world’s oldest democracy, but we were not remotely a democracy until 1920. Before then, like any other nation that held free elections, we were officially an androcracy, and of course we didn’t guarantee universal suffrage until the Voting Rights Act of 1965. That my spellcheck doesn’t even recognize the word “androcracy” signifies how little attention we afford our history of inequality. But we have to if accuracy is going to have anything to do with history. A brash statement like “We sure used to be [the greatest country in the world],” even as a battle cry for self-improvement, is asking to be called out on its inanity.

Everyone is entitled to appreciate certain facets or moments in history, just as everyone is entitled to look back fondly upon their childhood. Veracity falters, however, with the claim that not just certain facets but society as a whole was all-around “better.” This is never true, unless you’re comparing a time of war to the peacetime preceding it (1920s Europe vs. 1940s Europe, Tito’s Yugoslavia vs. the Balkans in the 1990s), and even then the argument is sticky (Iraq during the insurgency vs. Iraq under Saddam Hussein). In the words of Jessica Robyn Cadwallader, concealing the crimes of the past risks their reiteration. Whenever we claim that something was socially better at a certain point in history, we must admit that something was also worse. It always was.

But such a sober look at the past need not be depressing. It reminds me how very grateful I am to be alive today. My nephews are growing up in a society that is more accepting than almost any other that has preceded it. That is one helluva battle cry. Because what could possibly be more inspiring than history’s proof that whatever our missteps, things have slowly, slowly gotten so much better?