
Last week there was much discussion on the blog about the social ramifications of height, but what about high heels? The Women and Equalities Committee of the U.K.’s House of Commons recently found that employee dress codes that require heeled shoes for women violate laws banning gender discrimination. The Committee reviewed the matter after receiving a petition signed by 138,500 people and started by Nicola Thorp, a London receptionist who in December 2015 had been suspended by her employer without pay for violating the company’s dress code for women by showing up for work in flats.

I personally find high heels quite becoming much of the time. I also find them physically hazardous. Pretty much anyone with any sort of orthopedic disability has been advised by their specialists again and again to keep the time they spend in heels to a minimum. While reporting on the U.K. ruling, NBC News let women in on “an essential secret — carrying a pair of trainers in your handbag.” This is cold comfort to those of us who know that back pain is also caused by carrying more than 5% of your body weight in your handbag. One twentysomething friend with an invisible disability was told by her spinal surgeon that she should wear heels pretty much never. Thorp was right to sue on the basis of gender discrimination because only women are required by some employers to toddle about on their toes, but a case could be made on the basis of disability discrimination as well.

That disabled women could be fired—or simply looked upon unfavorably in the workplace for “not making an effort”—is indeed a social justice issue. We in the West have come to regard heels as a sign of female beauty and professionalism not so much because they are inherently smart looking, but because they were invented to signify wealth.

Heeled shoes were designed to be painful and inefficient if you walked around much because the upper classes around the world have traditionally used their fashion statements—from foot-binding to corsets to flowing robes and fingernails—to prove that they were wealthy and didn’t need to labor to survive like the lowly workers. Prof. Lisa Wade offers a wonderful breakdown of the history of the high heel at Sociological Images, pointing out that they were first considered manly because men were the first to don them to display social status. Women began wearing them to imitate this status, which led to men abandoning them. Wade explains:

This is a beautiful illustration of Pierre Bourdieu’s theory of class distinction. Bourdieu argued that aesthetic choices function as markers of class difference. Accordingly, the elite will take action to present themselves differently than non-elites, choosing different clothing, food, decor, etc. Expensive prices help keep certain things the province of elites, allowing them to signify their power; but imitation is inevitable. Once something no longer effectively differentiates the rich from the rest, the rich will drop it. This, I argue elsewhere, is why some people care about counterfeit purses (because it’s not about the quality, it’s about the distinction).

Eventually men quit wearing heels because their association with women tainted their power as a status symbol for men. (This, by the way, is exactly what happened with cheerleading, originally exclusively for men). With the Enlightenment, which emphasized rationality (i.e., practical footwear), everyone quit wearing high heels.

What brought heels back for women? Pornography. Mid-nineteenth century pornographers began posing female nudes in high heels, and the rest is history.

At many moments in the history of many cultures, extra pounds of body fat have also signified high social status because wealth was needed to keep someone well-fed. The prices of sugar and meat plummeted in the 20th century in the West, and they were soon no longer considered delicacies only the wealthy could afford. This shift, coinciding with the eugenics craze of the early 20th century, brought about the birth of our modern preoccupation with not just longevity and bodily cleanliness but physical “fitness.” These shifts are why modern fashion dictates that those who wish to project high social status should dress inefficiently, like a traditional aristocrat, while remaining physically strong, slim and active, like a traditional laborer.

High-status men are now encouraged to wear expensive attire in addition to building and maintaining a muscular physique that can get down in the dirt – something the manly dukes and earls of yore would have considered horrifically common. High status women are now encouraged to diet and exercise to be “healthy” in addition to wearing heels to hint at sexiness in their physique via the historical association with both princesses and porn stars – at the risk of breaking down their bodies as they rush off to work and back like the peasant women of yore.

Indeed, our modern fashion rules for professional women are ever so young because upper class women who worked were an anomaly in the Modern Era until the 20th century. The First and Second Wave feminists successfully fought for our right to vote and become actors, bankers, flight attendants, and politicians, but we have yet to expunge the idea that a woman who suffers for beauty is admirable, rather than irresponsible. Nicola Thorp’s petition, however, has dealt it a blow.

Women should feel free to wear heels almost whenever they wish, but disabled women should not have to suffer social consequences for choosing to protect their bodies. True equality may also come when men can wear heels like Mozart and Louis XIV without fear of gay bashing, as long as such a fashion shift does not harden into a fashion decree. If it does, then disabled men will have to use their right to petition against discrimination.

No matter how you personally feel about them, just remember that modern ideas about fashion, gender/sex, class, and disability all meet whenever we consider a pair of high heels. That’s why we call it intersectionality.

As I wrote on Facebook after I saw friends posting them, I really don’t like those #TinyTrump memes. I’m not outraged. I’m just really, really uncomfortable whenever human size is used as an insult or a sight gag. (And yes, I have had friends, and admire several human rights activists, who are almost as short as Trump appears in those memes.) Being physically small isn’t hilarious or humiliating. It just is.

Two hundred years after Napoleon, political discourse is still rife with the insidious concept of small man syndrome. Male acquaintances still report conversations coming to a screeching halt on Tinder after they answer an interested woman’s inquiry about their height. So here is an old, popular post on the subject that is just as apt as it was when I first published it:

I’ve said it once and I’ll say it again. I did not undergo limb-lengthening to “look normal.” I did it to function better in everyday life with less difficulty and less pain. Height has mattered tremendously to me as an issue of accessibility. But as an issue of social interaction, I tend to find it only slightly more significant than eye color.

Throughout high school, I had a Yoko Ono quote taped to my bedroom wall: “You call me ‘little,’ but I have a universe in my head.” Every teen needs role models. I got excited when I lived for six months in southern France, where I encountered several women my size. There is something inexplicably pleasing about being at eye-level with someone. Which is what made the moments when guys got down on their knees to dance with me utterly touching.

But my husband stands at 6’5” (1.96 m), more than a foot taller than I am. Being at eye-level with someone can feel important, but it’s not that important.

And we’ve gotten compliments for being such a striking couple due to our height difference. (Should we thank John and Yoko for blazing the trail?) But as I’ve said before, when we tell our loved ones what exquisite hair or adorable hands or gorgeous eyes they have, it’s more a display of affection than a statement of what we require to be intrigued. When we tell someone, “You are so beautiful,” and we mean it, it’s a testament to the sum of their parts. To the entrancing union of their perfections and imperfections. Height is what you make of it.

I generally find a preoccupation with height amusing. When my father-in-law, who is from the Black Forest, married my mother-in-law, who was from Stockholm, they had their wedding photos shot only in close-up, so that you can’t tell that he was standing on a box.

When I was undergoing my first limb-lengthening procedure at age 11, I explained to one of my teachers, “I’ll never be super-model tall. The muscles tighten up when you stretch them and that’s why there is a limit to how far you can lengthen your legs.”

“Well, that’s actually good for you as a girl,” she said.

“Why?” I asked.

“Well, you wouldn’t ever want to be too tall and end up walking alongside a man who’s shorter than you!”

I looked at her quizzically and then smirked to myself. Sure. That was my first concern about undergoing limb-lengthening.

Eighteen years later, as I prepared for my wedding, I came across a discussion on a forum for brides-to-be about the ubiquity of complaints about heels that were too high.

“Why am I hearing so many comments about not wanting to be taller than your husbands?” the main commenter wrote. “I mean, seriously? This is the 21st century. We’re all liberated about LGBT rights and feminism and healthy body image and equality, but we’re still convinced it’s unfeminine for a woman to be taller than her husband?”

Nine out of ten replies said, “Well, I don’t want to look like some freak.”

This week, HuffPost Live features an interview in which dwarf reality TV star Ben Klein reveals his past struggles with depression and suicidal thoughts due to social isolation and bullying. Earlier today on Germany’s ZDF Sunday morning news show, opera singer Doris Michel revealed that no man has ever been able to get over her dwarfism and see her as a romantic partner.

It’s easy to shake our heads and feel sorry for these individuals, and then to be inspired by the courage they have demonstrated in overcoming such hardship. We praise them for raising their children to be self-confident enough to face adversity. But when the adversity is inflicted by our society’s lingering attachment to something as silly as height, it is crucial that we own up to our collective responsibility for it.

Surely if Klein and Michel can overcome bullying and denigration, we can overcome any hang-ups we have about size. And in the nature vs. nurture debate, we gotta stop saying “nurture” and start saying “culture” because it takes more than one set of parents to change the world.

The saleswoman shook her head sympathetically. “These doctors just don’t understand. They make it so difficult for women looking for shoes.”

Um, I don’t think that’s who’s making it difficult, I said to myself. Because she isn’t the one flooding the market with shoes that discriminate against disabled bodies, it didn’t feel necessary to confront this one saleswoman with the issue. But we as a society probably should confront it.

If I don’t wear my orthotics, I burden my achondroplastic back in very unhealthy ways. The same goes if I wear heels regularly, instead of only occasionally, as my orthopedist advises. When I was younger I would often flout the rules, but my tolerance for pain-inducing shoes has lessened since I turned 30 and needed back surgery to avoid paralysis, as one-third of all people with achondroplastic dwarfism do. A friend who has undergone a few operations on her spine absolutely cannot wear heels. Yet wearing orthotics every day is not seen as being healthy and responsible in the same way that, say, running a marathon is.

Will chronic pain management never be seen as bad-ass because it lacks the thrill of breaking records or leaving others in the dust? Or is it because it defies the “no pain, no gain” rule? In which case, forgoing orthotics and swallowing the pain would seem to be the bad-ass choice.

Eat something sugary or fattening and you can easily attract disapproving looks or even commentary. (“Do you know how many calories/toxins are in that?!”) But risk back pain in a pair of stilettos that make you teeter like a giraffe and you’re suffering for beauty like any self-respecting woman would.

Why? Is it because, as Jessica Valenti wrote last year, too few women are willing to endure “the social consequences of aesthetic apathy”? Does bodily beauty always require some degree of discomfort? Even the love-your-body yoga crowd pushes the back-to-the-earth barefoot aesthetic, which can be supremely painful for many disabled people.

Fashion is fickle and ever-changing. In a world where humans can find beauty in everything from body-builder biceps to heroin chic, and can switch from viewing heels as manly to sexy, it seems possible for us to stop marginalizing medically responsible choices and perhaps even someday tout them as fashionable. Why haven’t we managed this yet? What will it take to get us there?

As Halloween approaches along with all the stomach-turning caricatures of minorities and foreigners, I find myself repeating the same question over and over: When is it okay to wear or adopt something from a culture you don’t belong to?

Obviously, the most offensive appropriations rely on inane stereotypes most people I know would never go near. But this doesn’t mean that globe-trotting, multicultural enthusiasts—like myself—can do no wrong. Since the 1960s, upper/middle class whites dabbling in other cultures has been celebrated under the banner of “Diversity!” But from the point of view of certain cultures, Nigerian writer Jarune Uwujaren argues, it’s often just another chapter in the long tradition of Westerners “pressing their own culture onto others and taking what they want in return.” American Indians do not appreciate headdresses used as fashion statements. Hindus do not applaud non-Hindus flaunting bindis. And Mexicans don’t enjoy seeing Day of the Dead re-appropriated as just another Halloween costume.

Yet the Mexican Día de los Muertos is the result of Catholics adopting what was originally a pre-Columbian tradition. Modern German children meanwhile have taken to celebrating Halloween, much to their parents’ chagrin. There isn’t a holiday on earth that hasn’t been adapted from something else, leading atheist comedian Mitch Benn to observe, “If only practicing Christians can celebrate Christmas, then only Vikings can say, ‘Thursday.’ ”

Indeed, intercultural contact always leads to intercultural mixing. Nowadays brides in China often wear two wedding dresses on their big day: a traditional Chinese red dress and a traditional Western white gown. When a friend from Chengdu married her German husband in Berlin, she turned this trend on its head, wearing a Western designer dress that was red and then a cheongsam that was white. Borders move and cultures blend constantly throughout history, often blurring the line between cultural appropriation and cultural exchange.

For this reason, it is important to remember that absorbing the fashions and customs of another culture is not always offensive. But it is just as important to remember that it is not always open-minded, either. After all, colonial history is rife with Westerners who filled their homes with foreign gear and lectured others about the noble savage. Among the most ardent fans of Tibetan Buddhism, American Indian animism, and Norse mythology were the Nazis.

We all love to show off what we’ve learned and delving into another culture can be enriching. But minorities tend not to like it when an outsider appoints herself an expert and lectures more than she listens. Or thinks that listening to minorities is a heroic act, rather than common courtesy. Visiting another country feels special when we’re the first of our friends and family to go, but there is no guarantee we’ll truly be acquainted with the culture. Thanks to language barriers and the insular nature of expat bubbles and tourist tracks, it is fairly easy to study or even live in another culture for several years without getting to know a single person from that culture. (Waiters and receptionists don’t count.)

Whether venturing to the other side of the world or the other side of the tracks, it is always much easier to buy something, taste something, or get a bit of history from a book than to talk to someone from another culture. Because books and merchandise can’t talk back. They won’t call us out if we make false assumptions. If we do actually strike up a conversation with someone from another ethnic group, whether Liverpudlian or Laotian, the temptation to flaunt the experience like a feat of greatness can be overwhelming. Jarune Uwujaren wrote about this pervasive temptation last month:

I remember that at my sister’s wedding, the groom – who happened to be white – changed midway through the ceremony along with my sister into modern, but fairly traditional, Nigerian clothes.

Even though some family members found it amusing, there was never any undertone of the clothes being treated as a costume or “experience” for a white person to enjoy for a little bit and discard later. He was invited – both as a new family member and a guest – to engage our culture in this way.

If he had been obnoxious about it – treated it as exotic or weird or pretended he now understood what it means to be Nigerian and refused to wear Western clothes ever again – the experience would have been more appropriative.

But instead, he wore them from a place of respect.

Appreciating the beauty in other cultures is always preferable to xenophobia. Enjoying a trip abroad that happened to involve minimal interaction with the locals is perfectly fine. But drawing attention to oneself for reveling in the mysteriousness of a culture is to revel in its supposed Otherness. Whenever an entire culture is reduced to its exoticism, it becomes nothing more than an accessory or a toy – not a sign of cultural understanding.

And while adopting a sacred custom “just because it looks cool” can be inconsiderate, imbuing our reasons for adopting a trinket with too much meaning can also make a native roll their eyes. It’s one thing to buy a handbag on a trip to Tokyo simply because it’s beautiful. A Japanese woman is buying it simply because it’s beautiful, after all. But it’s another thing to flaunt it like a badge of enlightenment.

The blog Hanzi Smatter documents and explains the snafus and utter nonsense that so often result when Westerners get tattoos of Chinese characters copied off the Internet. Such incidents demonstrate that vanity is often mistaken for art. We’re all a little vain, yet the difference between art and vanity is crucial because vanity is an indulgence, not a challenge or an attempt to communicate. When Dita von Teese donned yellowface for a London performance titled “Opium Den,” fellow burlesque artist Shanghai Pearl wrote:

I am not saying artists should not tackle controversial or challenging subjects. However, if we choose to take on challenging material, we should be prepared to have challenging conversations. I absolutely believe that art will not suffer from sensitivity. Sensitivity should make us work harder, research more, and think more. Art can only benefit from that.

Indeed, nothing suffers from genuine sensitivity. The lesson from colonialism is not to stop exploring the world and reading about it, but to always bear in mind that there can be no cultural understanding without dialogue. When deciding whether to adopt a tradition or style from another culture, we should consider what several people from that culture have to say about it. Because there are no cultures without people.

It’s been a good week in the media for dwarfs. Not only did Peter Dinklage’s Emmy win allow him to speak out once again against bullying, but Fashion Week just ended in the city I call home and I couldn’t help but squeal a little “OMG!” at seeing history being made.

With her collection “At Eye Level,” Berlin-based designer Sema Gedik presented clothes made for and modeled by Laura Christ, Mick Mehnert, Eva Ehrmann and others with dwarfism. Gedik was inspired to do so after observing the difficulty her cousin Funda, who has achondroplasia, faced in finding clothes that fit—not to mention stylish ones. That the fashion industry has never seemed interested in offering dwarfs clothing made for their bodies imbued Gedik with “an intense feeling of injustice.” She tells Berlin’s Tageszeitung, “Fashion should not be restricted by social conventions.”

But those restrictions are there, which is why she reports being surprised that she even managed to get the project off the ground and into Fashion Week. Indeed, a feature in The Washington Post earlier this year about the work of American designer Kathy D. Woods did little to help her Kickstarter campaign to fund her line of clothes for fellow dwarfs. The campaign ended up falling far short of its goal.

El Mundo has declared Gedik’s debut “a revolution,” but the revolution is arguably the easiest step for any social justice movement. The trick is getting the new ideas to stick. Distributors argue that dwarf clientele wield too little purchasing power to be worth investing in due to their small numbers. Gedik rebuffs this claim, pointing out that the number of women with typical catwalk measurements also constitutes a minority of consumers.

Ever since New York Fashion Week featured a handful of disabled models, some cultural critics have wondered whether the ulterior motive of the world of haute couture is to exploit those who stand out for shock value. After all, Francis Bacon’s truism that “there is no exquisite beauty without some strangeness in the proportion” both honors diversity and draws inordinate attention to an individual’s Otherness at the expense of anything else they may offer or need. Which is perhaps why my excitement and pride at Gedik’s breakthrough is tempered by a slightly more cynical It’s about time.

Gedik is adamant that her goal is to finally and fiercely open up the fashion market to dwarf consumers. “This is only the first step,” she insists. But there is a risk that the opening will close as soon as the novelty wears off. The final step in the path to justice will be to see work like Gedik’s so often that all that’s worthy of note is the choice of colors.

One of my responsibilities at my day job is to coordinate photo shoots for employee portraits. I’ve done this three times now, and it always requires warmly coaxing reluctant coworkers into saying yes, and chatting with them while the flashbulbs fire off in their face. Because, as the photographer told me the first time, “I need someone there to hold their hand. To keep them calm and smiling. Otherwise, a bunch of them will get all self-conscious and fussy. Sometimes it really feels like taking kids to the dentist.”

Indeed, even getting them to show up can be a challenge. A fair number of people flat-out refuse; most but not all of them women, who cut me off mid-sentence and insist, “No photos! I hate being photographed.”

Last week, just after I’d heard this for the umpteenth time, my cell phone rang. It was a reporter who is doing a television piece about Painting On Scars.

“Emily, my team and I just came up with a new idea for our story. We’d like to film you having your picture taken in a photo shoot to show how self-confident you are in front of a camera!”

I couldn’t hold back my laughter.

And then I thought, what is self-confidence in front of a camera?

My experience watching others has shown me that there are unspoken, commonly held beliefs that dictate so much behavior during photo shoots.

For one thing, we tend to believe that selfies are empowering, but that it’s embarrassing to be photographed by someone else. Which goes to show that it’s not about being photographed but about relinquishing control over the photograph. Most of us have an idealized view of ourselves that includes seeing our own faces at a particular angle, but we hate it if someone captures us from an angle that deviates too much from our ideal. (Research on self-perception bears this out.)

We tend to prefer smiling photos of others but closed-mouth photos of ourselves. Showing teeth often strikes us as warm and welcoming on someone else, but the fear of looking too uninhibited results in many of us appearing overly serious in our portraits.

We tend to loudly list every physical feature we don’t like about ourselves, believing it signifies modesty. Even though it often comes off as fishing for compliments.

So we tend to reject direct compliments, again believing it to be a sign of modesty. Even though John Cleese famously told Stephen Fry:

“You genuinely think you’re being polite and modest, don’t you?”

“Well, you know …”

“Don’t you see that when someone hears their compliments contradicted they naturally assume that you must think them a fool? Suppose you went up to a pianist after a recital and told him how much you had enjoyed his performance and he replied, ‘Rubbish, I was awful!’ You would go away thinking you were a poor judge of musicianship and that he thought you an idiot.”

“Yes, but I can’t agree with someone if they praise me, that would sound so cocky. And anyway, suppose I do think I was awful?” (Which most of the time performers do think of themselves, of course.)

“It’s so simple. You just say thank you. You just thank them. How hard is that?”

You must think me the completest kind of arse to have needed to be told how to take a compliment, but it was an important lesson that I (clearly) never forgot. So bound up with not wanting to look smug and pleased with ourselves are we that we forget how mortifying it is to have compliments thrown back in one’s face.

Indeed, the photographers I’ve worked with remember subjects in terms of their agreeableness versus their fussiness. I bore this in mind as I prepared for my own photo shoot.

How much preparation was required? Recovering from surgery and combating unanticipated complications, I wasn’t feeling that I looked my best. I won’t reveal what about my looks was particularly displeasing to me because there is no right way to hate your body. Many in the Body Image movement have argued that it’s fair, not rude, to voice our insecurities. In fact, isn’t it good to let others know that they are not alone in their struggle for self-acceptance? But these insecurities do not exist in a vacuum. They exist in a hierarchy, and this hierarchy dictates that if I’m ashamed of gray hair, someone with more gray hair should be more ashamed. If I’m upset about having noticeable scars, someone with more noticeable scars should be more upset. And so on. Body-bashing upholds the hierarchy.

And ignoring the effects one’s own body-bashing has on others is, no matter how you look at it, self-involved.

So instead of spending time and energy on whatever might disrupt my ideal self-image, I thought about what makes a photo shoot enjoyable.

A kind, charismatic photographer.

People who make you laugh.

Someone who truly loves you saying something particularly nice about their favorite photo.

Hearing from the photographer, “Thanks for being so easy-going! That was really fun.”

For two years, a friend would never let me or anyone else take his picture. Only on very rare occasions—group photos, flirty hugs with a close friend—would he not turn away or cover his face. Whatever hang-ups he had about physical imperfection, he carried himself in a manner that attracted both sexes from miles around. He visited me in college once and we noticed four of my fellow students check him out during his first hour on campus.

On another visit, I snapped his picture and declared, “Hey, you didn’t cover your face this time!”

“Yeah, I’ve stopped doing that.”

“Why?”

“ ’Cuz I found it’s really annoying when other people do that when I want to take their picture.”

Barbie turns 55 today and her birthday risks being overshadowed by a rival. Designer Nickolay Lamm has kicked off a very successful crowdfunding campaign to fund the production of Lammily, a doll whose body is modeled after the mean proportions (taken from the Centers for Disease Control) for an American 19-year-old because, as her slogan goes, “average is beautiful.” The center photo above shows Lammily at her earliest design stage in contrast to Barbie. The left and right photos show her updated, final form.

Although her name sounds like the way most toddlers mangle mine, Lammily does seem quite lovely. But mostly because the problems with her competitor are countless. Barbie represents—and was very much intended to represent—an idea born in the middle of the last century that little girls should play not just with baby dolls or girl dolls, but with a woman doll, a post-pubescent beauty they should aspire to. The very first Barbie was inspired by the German Lilli, a character featured in tabloid comics who worked as a secretary by day and an escort by night. While it’s disputed whether or not the Lilli doll was in fact a sex toy, the longer you look at Barbie, the more that explanation makes sense.

Barbie is all fantasy: too thin to menstruate, with breasts so big she’d have to crawl on all fours to get around. (Sporty Lammily could knock her to the floor with a light kick.) Fantasies about beauty are fine as long as they remain a niche, not a standard. If her fame and influence were not so unparalleled, Barbie wouldn’t be a cause of much trouble. But she is the most famous doll in the world, and while she often changes jobs and outfits to bend to society’s trends, her body type never budges from the sex toy standard.

My mother swore I would never own a Barbie—how could it be healthy for a girl with dwarfism to idolize a lady who’s all legs?—but a neighbor bought me one for Christmas, and within the next 10 years I owned 12: Tropical Barbie, Superstar Barbie, Ice Capades Barbie, Gymnast Barbie, Fun-to-Dress Barbie, Loving You Barbie, Hollywood Hair Barbie, Cool Times Barbie, Dreamtime Barbie, Dream Glow Barbie, Dream Date Barbie, and my mother’s own, dragged-out-of-the-attic Barbie from the 1960s, whose earrings had turned her cheeks green. The funny thing is that every one of these Barbies had a slightly different face and slightly different blond hair with varying lengths and textures. But, just like the Disney Princesses, the bodies were all exactly the same. Barbie’s oh-so-80s Rocker friends Diva (brunette), DeeDee (black), and Dana (possibly Asian?) represented a broader range of hair and skin, but their bodies were all replicas of Barbie’s. This is what makes Lammily so radical.

But I don’t want an answer to Barbie. I want many answers to Barbie. Lammily correctly demonstrates that an average girl in the Western world is not blond. But blondes shouldn’t be any more excluded or celebrated than anyone else. Declaring “average” bodies and physical features a beauty standard continues to marginalize girls who deviate from the average. Another word for average is “normal” and it’s never fun for a young girl to hear that her body is “not normal.” Both Barbie and Disney have dared to dabble in the beauty of different ethnicities, but they haven’t been brave enough to try different body types – short, curvy, bony, disabled, with freckles or scars or glasses or birthmarks in the shape of Mexico.

As Hanne Blank shows in her stellar piece, “Real Women,” there is no wrong way to have a body. If Mattel can invent over 50 varieties of blond hair for their preeminent princess, surely doll manufacturers can find a way to profit from providing a rainbow of body types. Maybe they will be brave enough by the next time International Women’s Day rolls around. That’s my fantasy, anyway.

Orthopedic casts hadn’t changed much in 50 years, until now. Engineering student Jake Evill of New Zealand has designed the Cortex cast, a 3-D-printed brace. While all casts could effectively be described as exoskeletons, the Cortex actually looks like one. Its lattice structure allows for ventilation, which Evill advertises as its greatest asset. The Cortex is still in its conceptual stage, but, as with almost all new technology, reviews in the media have been pulsing with excitement.

The problems of plaster and fiberglass casts are well known to anyone who’s had to wear one. They’re fairly heavy and very bulky. Worst of all, they make your skin itch like the dickens and you are forbidden from using any implements to scratch because the smallest cut can become badly infected in the dark, suffocating conditions damp with sweat and dead skin. I had to wear casts on both legs after two tendon surgeries and once after having Ilizarov fixators removed. The itching alone was bad enough to make me wish I had the fixators back on.

Anything that claims to be lighter and breathable is a very attractive proposition. But while the Cortex website boasts that the cast is waterproof and therefore perfect for bathing and swimming, this probably means that there is no cloth involved. The cloth lining between a traditional cast and your skin contributes to the itching, but it’s there to prevent abrasion. Watchmakers, jewelers and BDSM professionals all know that any material other than cloth or leather can pose serious risks to human skin.

And the claims that the innovative appearance of the new cast is stylish? What exactly makes a cast stylish? While I could see goths maybe being partial to the Cortex if they could order it in black, reviewers seem to be fawning over the look of it simply because it’s new. And the promotional photo for the Cortex features a well-toned, scarless, unbruised arm that looks a bit too healthy to contain a broken bone. (I half-expect the owner of the model’s fist to be shouting, “BY THE POWER OF CORTEX!”)

Style is all about what you do with what you’ve got. Fiberglass casts come in assorted colors. I had hot pink ones while performing in a school play and ended up enhancing one dream-like scene lit only by ultra-violet light. When I had neon green casts, friends painted my toenails to match. And the good old tradition of letting your loved ones cover your limbs in graffiti is worth mentioning. A friend who is a professional painter adorned the bottoms of my feet with elaborate sunflowers.

Then again, injuries are not the only things casts conceal. A young friend of mine once stuck a chunk of steak down her cast in order to get out of having to eat it before dessert. She managed to retrieve only part of it after dinner – the rest tore away and remained lodged deep in the plaster caverns enveloping her arm. Her parents remained unaware for days until the entire house began to reek of rancid meat. With the new cast design, families with deceptive children need not fear such hazards. The Cortex offers not only porousness but transparency!

Twenty-one-year-old Mariah Serrano was born with a club foot. By the time she was a teenager, she faced increasing chronic pain, and her doctors strongly advocated amputating her leg and replacing it with a prosthesis. Now an assistant designer for American Rag and author of the blog Confessions of a One-Legged Fashionista, she recently shared her story with the New York Post:

Serrano struggled to look like the other girls in her high school who often called her “gimpy.”

“I felt silly in pictures, I was the only one in these shitty little ballet flats,” she recalled.

“I had to wear all sorts of braces. It was uncomfortable and frustrating because they weren’t solving the problem and I often felt embarrassed.”

The glamour girl wore patterned knee highs and flashy tights to mask her deformity. She even dyed her hair pink to distract people from staring at her leg. She eventually stopped going to classes and was home-schooled.

“Kids are mean,” she said. “It made things very hard.”

“A lot of times I felt left out because I loved to dance and go out.”

But even more mortifying for the teenage girl was being forced to wear sneakers to prom. “I was really devastated in the mall,” she recalled, after shopping for four hours to find a chic shoe.

The article never mentions any medical purpose for the amputation. Serrano is only quoted as hating the limited number of footwear options that had been available to her prior to the operation. The story ran four days ago and was quickly picked up by the British tabloids. And Serrano is not pleased. She explains on her blog:

I did not choose to cut my leg off so I can wear high heels, I had my leg amputated because I was very sick and the quality of my health and life were suffering. Doctors do not welcome the idea that you are unhappy with your footwear choices, so you should remove body parts.

This event was a real decision that I took very seriously. It was a decision my family and I made together, so that I would be able to live my dreams, and not mind you, dreams of footwear, but dreams of waking up and going about my life not in chronic pain.

I think it’s safe to say that The New York Post is not a feminist crusader on the issues of body image and beauty standards. So why then would they decide to warp Serrano’s words to feed the image of the fashionista lifestyle as a vile instigator of self-mutilation? The story of a young girl simply but bravely electing to trade chronic pain for a prosthesis is severely lacking in vitriol. This means there is no surefire guarantee that it will unleash a deluge of jaw-dropping, eye-rolling, and catty comments from readers about the girl in question. That guarantee is essential to the business the Post is in.

Serrano is hardly the first individual to be misrepresented by the tabloids. But who’s keeping the tabloids going by hungering after such headlines? It’s this hunger that drives journalists across the spectrum to emphasize the most soap opera-like elements of a person’s life story. I’ve seen the most loving, supportive families with disabled children portrayed as walking tragedies based on a few of their more emotional quotes taken out of context. This approach knows that readers and viewers will consequently feel sorry for the pathetically confused freaks, and good about themselves. Not unlike the mean classmates Serrano cites from her high school days.

So if anyone is interested in ending the tabloids’ tradition of tearing people’s personal lives to shreds, we can curb their sales by curbing our desire to use bits of information about people we don’t know as an easy way to prop ourselves up. Of course this is asking a lot, and so, once again, we must decide which is harder – altering the way we think or altering our bodies?

Perhaps not quite in the spirit of Thanksgiving, I’m about to alienate half the people I know: I don’t have much patience for openly picky eaters over the age of 20. The covert ones don’t bother me at all. But announcing to your host that you simply won’t eat mushrooms or mustard or millet is to revert to your 10-year-old self, brazenly acting on the assumption that anyone cooking for you will find your pig-headedness as endearing as your parent or guardian apparently did. I don’t particularly like pears or peas or plenty of other things, but if I learned anything from my time as an exchange student with the American Field Service, it’s that intellectually curious, culturally respectful, fully-grown adults eat whatever is set in front of them. Or at least try a few bites and then leave it to the side without advertising their distaste.

Unless, of course, it threatens their health. I recently hosted a friend who has celiac disease and who apologized several times in advance for the inconvenience. While the sincerity of his remorse was indeed helpful because it was convincing, I assured him there was no need for shame. I’ve had friends with juvenile diabetes and colitis and who need to be fed through tubes. At my wedding, where guests had been requested to bring cakes instead of presents, I chased down every single baker in order to mark any desserts containing traces of peanuts, oranges, or coconut. I never mind offering vegetarian options because they accommodate a wide array of dietary restrictions with both cultural and medical bases, just as alcohol-free beverages are helpful to kids, recovering alcoholics, devout Muslims, and pregnant women alike. But my tolerance generally ends there. Because beyond that boundary seems to be where stubborn intolerance for all sorts of food spreads like the plague.

According to the cover story of Die Zeit this week, less than 1% of Americans are gluten intolerant, yet 28% of Americans purchased gluten-free products last year. The percentage of Germans who purchase lactose-free products has tripled in just five years. Such sky-rocketing numbers sound much more like successful marketing trends than biological shifts in the population. Inordinate media attention to rare medical issues always inspires swathes of people to self-diagnose rather than check with their doctor. In response to what sometimes does seem like an epidemic of hypochondria, a kindergarten in Hamburg has recently taken to demanding medical documentation for any alleged food restrictions among its students. Die Zeit writes that parents were insisting their untested children had food allergies after the appearance of the slightest yucky face. Of course a child at risk for anaphylactic shock is better safe than sorry, but to teach a child to regularly cry wolf is to teach them to rely on their most narrow-minded instincts.

This is not a call to villainize health advocates or burn certain cookbooks. On the contrary, the greatest thing about the human culinary tradition is its diversity. When I grew up in the Eighties on Long Island, skim and lite and sugar-free products were in fashion, but anything organic or “foreign” or “ethnic” was scarce because what’s wrong with some good old American spray-on cheese? Sushi was gross (“It’s raw fish, you know!”), vegetarian dishes were for pansies, and escargot was what made the French so weird in the first place. (See this Indiana Jones clip.) Kids today are growing up more environmentally conscientious and more open to exploring new cultures, and I am glad to see the American tradition of grimacing at all the icky cuisine of the savages and the smelly Europeans go the way of the Twinkie. But there’s no progress in simply switching the grimace from the sight of imported cuisine to the sight of anything that isn’t in line with the latest imported health fad.

While it seems many finicky eaters think their aversion to certain foods resembles a disability (“Please don’t criticize me for something I can’t do!”), it often resembles ableism (“I refuse to budge on this issue!”). We cannot be open-minded and at the same time refuse to leave our comfort zone. As the Food Commander writes in his excellent Huffington Post article, “Unless you suffer from a disease or real (unlike imagined) food allergies, … kindly embrace the fact that your body is not all that fragile. Humans survive every day in conditions way worse than, say, a four-course dinner in an Upper East Side townhouse.”

Outside of a meal, it can be fun to explore cultural differences and personal preferences: why so many Chinese love meat but dislike butter, or why German senior citizens detest turnips. It’s also amusing to try to argue the illogic of taste. (One such argument culminated in one of my relatives bellowing, “I am not a fan of the bean!”) It is also imperative that we eventually discover which food restrictions have been caused by environmental changes and which have been encouraged by marketing trends. But the fun comes to a screeching halt when these discussions ooze onto the dinner table.

Such candidness often has innocent origins. In these rather unrepressed times, where dinner guests discuss everything from politics to polyamory, why not share our honest opinion of what’s on our plates? This approach, however, ignores two very important facts. Firstly, unlike in a restaurant or your own home, the meal laid out for you has been paid for by someone else. Secondly, unlike the selection of films or games or whatever else it is you don’t like about the home you’re in, the meal laid out for you is the result of someone else’s time and effort. To go so far as to scrutinize it (“Is it organic?”) or disparage it (“It’s too bad it has olives in it!”) is to spit on the dinner invitation that was extended to you out of sheer generosity.

I know what it’s like not being able to participate in communal activities. This blog is all about those who have no choice about being an exception to the rule. Those who have bona fide difficulties digesting certain foods—perhaps akin to my difficulty walking long stretches—should not feel ashamed. But shame should not be supplanted with complacency, either. As my friend with celiac disease said, it is usually regrettable to have to limit one’s range of experience, and it is always regrettable when it involves rejecting an offer of kindness.

Indeed, my proudest moment during my bout of stenosis last year was pulling off a Thanksgiving dinner with my partner that fed 17 people while I was still recovering from spinal surgery. If any of this year’s guests cannot stomach something, they will hopefully follow the example of my more gracious friends who keep things discreet, at least at the table. I don’t subject them to judgment by examining their plates for leftovers and threatening to deny them dessert because they spare me the insult of telling me exactly how my offering failed to satisfy them.

They also resist the temptation to dive into an unsolicited monologue of healthier-than-thou moralizing, a tendency that accompanies food more than any other health issue. I’m usually the last to squirm at medical stories, but I’ve been thinking lately that if I have to hear about the details of the latest nutritional research every time I put a spoon to my mouth, maybe I should start lecturing about my back problems every time I see someone wearing heels or sitting at a computer.

Eating is a necessity and a health issue and an environmental issue and a cultural tradition. I love learning from friends and researchers about the different ways we all eat, and the socio-political forces of the food industries are absolutely fascinating. But I won’t ever admire someone merely for eating homemade bread or fine delicacies or simple fare or whatever it is that the Paleo diet currently dictates. Those I do admire cook joyfully in their own homes and, when invited to someone else’s home, plunge their hands obligingly into whatever their host has set out for them, whether it’s okra or Oreos. As minority rights activist Andrew Solomon has pointed out, a truly tolerant culture celebrates additive social models, not subtractive ones.

Or, more simply, I will always care a lot more about your table manners than your diet.

This holiday weekend I’m sparing you my deep and profound thoughts about the Barbie Dreamhouse exhibit that opened this week in downtown Berlin and the protest that accompanied it. Instead, I’ll let the issues and problems of beauty standards and femininity and sexuality and body image and fashion and pink and sparkles be summed up by a little story I discovered this year:

In 1999, Jon Stewart was invited to be featured in People magazine’s annual list of 50 Most Beautiful People. (I’ve written about the List before in The Body Image Series, highlighting Michael Chabon’s excellent reaction to it.) Stewart agreed to be featured but insisted on wearing a pink prom dress and a tiara for the photo shoot. Why?

While chatting with colleagues over coffee this week, I ended up “outing” myself as a dwarf who’s had limb-lengthening. (Experience has taught me that some people notice right away when they meet me that something is up, while others go a long time without the slightest idea, especially in the wintertime when my scars are hidden under sleeves and pants.) We arrived at this topic by discussing fashion—and the recent scandal in Sweden that’s left me almost speechless—and then beauty and self-confidence. Several of my colleagues pointed out that every person they know who’s undergone cosmetic surgery never struck them as unattractive before the fact. Only an idiot would think that there’s only one kind of beautiful nose or mouth or what have you. And only a jerk would tell someone to have cosmetic surgery.

As you may have guessed, I agreed wholeheartedly. But what about telling someone to wear makeup?

This week, a man writing to Slate’s Dear Prudence advice column confessed he feels simultaneously guilty and helpless about the fact that some of his female friends are unlucky in love because “their looks are probably the only thing holding them back.” Prudence tends to give good, progressive advice, but this time, instead of telling him the ladies should move in less superficial circles, she suggested he pair them up with some similarly “average-looking” male buddies. She then added, “If the problem with your female friends is not their intrinsic looks but the fact that they dress like schlubs or never wear makeup, then a guy’s perspective that they aren’t doing everything with what they’ve got could spur them into action.”

Ugh. Say what you want about clothes, but the makeup debate is as messy and gunky as makeup itself, which is why I’ve avoided it up until now. But am I the only one who thinks telling someone to start using makeup is entirely different from giving them your opinion about the way they dress?

Everyone, from my partner to my grandmother, rolls their eyes at certain fashion choices and, as I’ve said before, anyone who denies they ever do it is lying. It betrays a pathetic insecurity to trash others’ dress for the sake of your own self-aggrandizement—e.g. “I wouldn’t be caught dead in that!”—but it is fair to say what just isn’t your cup of tea. We can snark a little about someone’s clothes, hairstyles, accessories, headgear or makeup style (if they have one) without too much malice because someone is probably snarking about ours. No one on earth dresses in a way that is universally attractive because there is no such thing as a universal beauty standard. And as the saying goes, there is no arguing taste. Someone thinks this is kick-ass, and someone else thinks it’s sloppy:

Someone thinks this is dreamy and someone else thinks it’s one big yawn:

Someone thinks this is sexy and someone else thinks it’s garish:

People find beauty in this:

Or this:

Or this:

Or this:

Or this:

Or this:

And that’s just a tiny sample from around the world. There is even more variation across time because, as Oscar Wilde said, “Fashion is a form of ugliness so intolerable that we have to alter it every six months.” I think some of my friends, like some of the subjects above, have a great sense of style, while others do not. They in turn probably think the same about me. But if any of them thought I should wear makeup more often than I do—which is almost never—and told me so, they wouldn’t be my friends. But what if they’re my supervisors?

In January, a study featured in The New York Times revealed that (American) women who wear makeup are considered more competent and more likable in the workplace. A panel of stylists and professors made various points about this that basically all boiled down to, “It’s a choice. If it makes women feel more confident, they should go for it.” But if the study indicates that their confidence would result from garnering more positive attention for their looks, then their lack of confidence without makeup would result from a fear of not getting attention for their looks.

Many modern women, especially lipstick feminists, repeat, “Empowerment is all about being free to choose!” There is truth in this. I know guys who were bullied in school for wearing concealer or plucking their eyebrows. Women meanwhile are often forced into a nearly impossible balancing act wherein no makeup = plain Jane, but too much = slut, and kudos to anyone who refuses to play that game. Good girl culture, as well as the results from the study, asserts that “less makeup is more – you should look like you’re not wearing any.” This rule seems potentially problematic to me because it is insidious. If someone gets used to just slightly “improving” their face every day, it is more likely they’ll feel insecure without these improvements. I occasionally enjoy wearing heavy makeup bordering on the outrageous (like glitter), but it feels like a mask and everyone knows it’s a mask. When it’s so obviously part of a costume, there’s not much danger that I’ll start considering it an inalienable component of myself. But subtle makeup seems to be a lot harder for people to let go of. I know women who refuse to be photographed without their makeup on—and you probably do, too—and if that doesn’t sound like an unhealthy insecurity, I don’t know what does.

In any case, it doesn’t sound like they are “free to choose,” as lipstick feminists advocate. As I’ve written before in explaining my choice to have my limbs lengthened, we should be free to make complex decisions about our bodies without others making snap judgments about our motivations. Anyone who does is a coward. But it is also cowardly of us to voice hatred for our natural faces and simultaneously deny that this has any impact on others. In the words of philosopher Arthur W. Frank, “When we make a choice, we confront others with that choice.” The freedom to choose diminishes when a strong majority bends in one direction, because majorities create social pressure. In a society that literally rewards women who wear makeup—i.e., with higher salaries—it is undeniable that many do so in order to win these rewards, ultimately playing by the rules under the guise of empowerment. The cosmetics industry, like any industry, always aims to make its customers feel that they cannot live without its product, and so it too has embraced the slogan of “Empowerment!”, leading The Onion to smirk, “Women Now Empowered By Everything A Woman Does!”

It would be obnoxious of me to assume that every woman with a compact in her purse carries it in order to acquiesce. I know and admire self-confident women who love putting on bright red lipstick and self-confident men who wish they could, too, without being gawked at. Primping can be fun. Painting your skin certain colors can make you feel fine and refreshed, like slipping into a brand-new top or getting a new haircut. Or brushing your teeth after a hangover.

But it’s not quite the same thing, is it? Once again, it’s a mask. A friend of mine who loves dressing up but hates wearing makeup recently said, “I guess, ultimately, it’s weird looking in the mirror and seeing something that doesn’t look like me. I don’t really like makeup on other people either though, so perhaps it’s a general class of trying to hide oneself that bugs me.”

Indeed, that is one of my many reasons for rarely ever using cosmetics, why I graciously declined friends’ offers to do me up on my wedding day, why I cringe at the idea of anyone pressuring women into it. I also like being able to rub my face without having to worry about smudging. I’d rather spend the money on a million other things. My partner hates the taste of cream, gloss or powder—“Kissing someone wearing foundation is like kissing a sandbox!”—and I must say I don’t blame him. Most importantly perhaps, I don’t understand why our culture believes that women’s faces require some paint in order to be attractive but men’s faces don’t. If I can’t compensate for the plainness of my natural face with my charisma, then no one should be able to.

Of course, almost all of us conform to our culture’s beauty standards to some degree. I’ve worn concealer for blemishes and plucked my eyebrows to make them even, but I feel a strong attachment to my scars and so I’ve kept them. I don’t always like my face—don’t we all have those days when we look in the mirror and just feel yucky and dissatisfied?—but even if I thought putting on some modern Western style of makeup would make me look “better,” it wouldn’t look like me. Experience has also taught me that a dissatisfaction with one’s looks is almost always rooted in something more substantial: feeling not very fit, feeling overtired and stressed, feeling lazy because there’s been too much or too little to do. And even if it’s not, I often feel very satisfied with my face, so on a bad day why not simply walk away from the mirror, focus on something a little more profound than my appearance, and have confidence that the feeling of self-satisfaction will return?

Women who feel that makeup use is obligatory but unwanted, that it requires a forced confrontation with the mirror when they’d rather put their attention elsewhere, do not feel more confident after using it. Research suggests that women can feel objectified by makeup, and for such women, any potential advantage may be offset by the emotional labor of wearing it.

And, in an excellent article on weddings, Ariel Meadow Stallings of Offbeatbride.com writes:

I’ve been thinking a lot lately about the pursuit of authenticity versus the pursuit of attention. The first feels very internal, like you really have to look within yourself with a lot of introspection and thought to determine what’s important … while the other feels very external, like you’re hunting for other people’s eyeballs. And why does one seem like so much fun, while the other seems like so much work? …

I guess it comes down to this: Attention gives you the cheap high of other people’s energy focused at you … but authenticity gives you that deep, long-lasting satisfaction of knowing that you’re on the right path and you’re doing the right thing. While the quick high is more fun in the short run, the deep satisfaction is ultimately more filling.

This is why it is fine to wear makeup but wrong to tell someone else to. Not only is it a ludicrously presumptuous, boundary-crossing thing to say—like telling someone to switch careers or leave their spouse—but it’s vacuous because it has nothing to do with matters of justice or morality. It is purely a matter of beauty standards. The worst thing about beauty standards is that they create peer pressure based merely on taste. The best thing about them is that, as seen above, there are millions of them, and they are constantly changing. If humans are capable of thinking the lip-plate is attractive, then surely we are capable of thinking a woman without makeup is attractive.

Women and men should feel free to smear their faces with whatever they wish or go without, to pluck their eyebrows or leave them be, to shave any body part or refrain. (Bearing in mind doctors have recently explained the cringeworthy risks of shaving certain parts.) But the moment they say that someone should do the same in order to feel better or lure lovers or advance their career, we have a problem. And it’s not physical.

A new study claiming that Overweight and Class 1 Obese people have a lower mortality rate has been bouncing around the world since Thursday. National Public Radio’s report seems to be the most comprehensive but hints at the two most extreme, polarized viewpoints:

Cosmetic: This is a victory for the overweight—now we can trash skinny people (again)!

Medical: If people hear about this, everyone will stop exercising and eating their vegetables and then everyone’s going to die!

Both views treat the public like infants who can’t possibly think for themselves.

Doctors are right to worry that a sizeable portion of the population will use this news as an excuse for whatever unhealthy habits they love. This is why it is important to include the many possible factors skewing the results. But many people will always cherry-pick whatever statistics suit their lifestyle or claim to be the exception to the rule. I don’t have any political solutions for engaging with contrarians—whether we’re debating eating habits or global warming—but talking down to them and using scare tactics have a pretty high failure rate.

And from the disability rights perspective, there are exceptions to the rule when it comes to health. Thousands of them. As said before, a round belly is not always a sign of fat. A bony body is not always a sign of an eating disorder. Many forms of exercise can be more hazardous than beneficial to people with certain conditions. And many life-threatening conditions are invisible. Medical tests, not appearance, are always the most reliable indicators of health. This robs us of the easy answers we crave and which facilitate public debate, but there has never been and never will be a one-size-fits-all health program for the 7 billion humans on the planet.

You and your doctor know better than anyone else if you are healthy or not. If she says you are overweight but your genes and cholesterol levels put you at no risk for heart disease, she’s probably right. If she says your weight is ideal but your eating habits put you at risk for malnutrition, she’s probably right. And if her advice seems sound but her delivery makes you feel too ashamed to discuss it, go find someone with better social skills to treat you. At the individual level, it’s no one else’s business. Outside of the doctor’s office, it shouldn’t be any more socially acceptable to discuss someone else’s weight or waist size than it is to discuss their iron levels, sperm count, or cancer genes.

But beauty standards and health trends often go hand-in-hand. And what really needs to go is the lookist idea that we’re all semi-licensed doctors who can diagnose people just by glancing at them and deciding how they measure up according to the latest medical research. The reason we have a hard time letting this go is that it’s fun to point out others’ supposed weaknesses. It’s self-elevating and validating to snicker that ours is the better body type because it calms our insecurities. Beauty standards are cultural and constantly morphing throughout history, but they have always remained narrow. (This is especially the case for women, though I sincerely apologize for not providing more research on men.) Whether fawning over big breasts or flat tummies, public praise for certain body types has almost always been at the expense of others:

After decades of the Kate Moss heroin chic, Christina Hendricks (see above) of Mad Men has garnered lots of attention for her curves and this week’s study is likely to encourage her fans. “Christina Hendricks is absolutely fabulous…,” says U.K. Equalities Minister Lynne Featherstone. “We need more of these role models. There is such a sensation when there is a curvy role model. It shouldn’t be so unusual.” She is dead right that it shouldn’t be hard for curvy women to find sexy heroines who look like them in film and on television, just as skinny women or disabled women or women of any body type shouldn’t have to give up on ever seeing celebrities with figures like theirs. But “Real women have curves!” is just as exclusionary as the catty comments about fat that incite eating disorders. And when Esquire and the BBC celebrate Hendricks as “The Ideal Woman,” they mistake oppression for empowerment.

We can accept the idea that people of all sorts of different hair colors and lengths can be beautiful. Will mainstream medicine and cosmetics ever be able to handle the idea that all sorts of different bodies can be healthy? History says no. But maybe it’s not naïve to hope.

And what does Christina Hendricks have to say about all of this? “I was working my butt off on [Mad Men] and then all anyone was talking about was my body.”

This family portrait of a father and son from a small, deeply religious town in the provinces of Southern Germany has been traveling around the world. Writing in Emma magazine, Nils Pickert explains that when his five-year-old boy expressed a love for dresses but found himself alone on the playground, the only way to make sure his son knew that he supported him 100% was to be a role model of self-confidence and don a skirt himself.

“Yeah, I’m one of those fathers who believes in liberation when it comes to parenting,” he writes. “I am not one of those academic dads who ruminates and lectures about equality between the sexes, and then, the moment a child arrives, slips back into the old comfortable gender roles: He does his own thing by having a career, she takes care of the rest.”

When he switched to a new kindergarten, the teasing got to be too much and the author’s son stopped wearing dresses to pre-school. But he turned to his father and asked, wide-eyed, “Papa, when are you going to wear a skirt again?” So Dad made sure to keep wearing his skirt out in public. He writes, “I’m very grateful to the woman who stared at us on the street until she walked into a lamppost. My son roared with laughter. And the next day, he fished a dress out of his closet again.”

I don’t have much to add to this story besides the smile it brought to my face. And a hope that someday these two will be models for a poster that will take its place in history alongside Rosie the Riveter.

When I was about 10 years old, a friend of mine with achondroplasia was being teased at her school for being so short. After being shunned at lunchtime repeatedly—“No freaks at this table!”—her mother finally called her local chapter of Little People of America, which sent a spokesman into the school to give a presentation. After he read Thinking Big to the class, explaining thoroughly in an age-appropriate manner why my friend looked the way she did, one of the biggest bullies raised his hand. “So, you mean, she’s little because she’s a dwarf?” he asked.

The spokesman offered to let my friend answer the question herself and she replied, “Yes.”

The boy who had teased her so much suddenly had tears in his eyes. It later came out that his new baby brother had just been diagnosed with dwarfism. He had had no idea until that moment that his brother was going to grow up to look just like the girl he’d targeted.

To anyone who insists, “He couldn’t have known,” he could have. We could have let him know. What is school for, if not the pursuit of knowledge? With the exception of women, all minorities risk marginalization not only by others’ lack of empathy but by the lack of visibility automatically brought on by their lower numbers. Any place that prides itself on learning should pride itself on learning about other perspectives, other identities, other behaviors, no matter how rare.

So “What’s Wrong With A Boy Who Wears A Dress?” asks The New York Times magazine on its cover this week. Though the flippant headline sacrifices sensitivity for saleability, at least it’s shedding light on the subject. I know so many men and boys and trans individuals who wear dresses for so many different reasons, and they do it a lot more than mainstream movies, TV, and advertising suggest:

The Times article has its flaws. When discussing how boys who wear dresses turn out later in life, the article stuffs them into three overly simplistic boxes: a) gay, b) heterosexual, and c) transsexual. Such labels do not encompass all the ways and reasons people of various gender identities and sexualities wear dresses into adulthood. As one friend observed, “The path of least resistance for so many is to wear dresses in secret. By using these limiting categories, the article implies that and also does nothing to change that.” The use of the categories also implies that these individuals owe us a clear-cut, sex-based explanation for their behavior, which is itself a symptom of narrow-mindedness. No one demands a woman explain why she likes wearing jeans.

And yet the article also keeps its subjects silent. While documenting the struggles of both conservative and liberal parents, the author would have been wise to include the perspective of adults who wore or wear dresses. In the absence of their agency, their nervous parents are essentially speaking for them. (Rule Number One in Battling Intolerance: Never, ever let a minority’s agency be ignored.)

But for all these errors, the article concludes with those who ultimately support their sons as best they can. One dad heard that his five-year-old was being taunted in kindergarten for wearing pink socks, so he bought himself a pair of pink Converse sneakers to wear in solidarity. The kindergarten teacher jumped in, too, opening up a class discussion about the history of gender rules and shocking the kids with the information that girls were once not allowed to wear pants.

Whenever reports on “different” children list the anxieties parents have about their kids not being accepted, the message often starts to get muddled. Sometimes the article is clear that we as members of society need to get over our hysterical hang-ups and start accepting these children as they are so that they and their parents no longer have to worry what we and our own children will say. Too often, however, the article spends so much time quoting the parents’ fears that the source of the problem starts to sound more and more like the child’s disruptive identity, not others’ clumsy reactions to his identity. And that’s wrong.

Whenever a child is made fun of for being himself, it’s our problem, not his. Biologists can say what they want about a fear of difference being an evolutionary adaptation, but our culture values differences two ways, either as “abnormal” (i.e., strange and pitiful) or “super-normal” (strange and admirable). The Beatles’ mop-tops were abnormal to parents of the time (“They look like girls!”), and super-normal to their teenage children. In the nature vs. nurture debate, we need to stop saying “nurture” and start saying “culture,” because changing the environment a child grows up in means changing the behaviors of more than just one set of parents. Mine never once told my younger brother, “Only sissies cry,” but his little league coach told the team just that.

This is our culture and we are the ones shaping it as the creators and consumers. By making and watching films and TV shows that state what’s “gay,” “wimpy,” “ugly,” “freaky,” or “gross.” By stating, “Guys just don’t do that,” or letting such remarks go unchallenged. By repeating traditional views of minorities—e.g. the dwarfs of Snow White and Lord of the Rings—and failing to provide more realistic portrayals with greater frequency. As adults, we bear so much responsibility for shaping the world the younger generation is trying to navigate. (As this German Dad proved so well.)

Since the Sixties, many parents and teachers and educational programs have embraced books that promote understanding of ethnic diversity such as People and of disability such as I Have A Sister: My Sister Is Deaf to broaden our children’s perspective and nurture empathy toward people they do not encounter every day. Yet books like My Princess Boy or The Boy In The Dress have yet to break into the standard curriculum. There seems to be an unspoken assumption that such books are primarily for the boys they’re about. (Buy them only after your son starts actively asking for a tiara.) But everyone should be reading them, for the same reason everyone should be reading Thinking Big. By waiting to address the idea of free gender expression until a little boy gets bullied, we are cultivating the assumption that the problem never existed until that little boy came along. The problem was always there.

Critics have argued The Boy In the Dress is unsuitable for any boy in real life who feels like the protagonist because the schools he attends are far less likely to rally around him so enthusiastically. But that’s exactly why this book needs to be read and discussed and picked apart by school classes around the world, not just by boys alone in their bedrooms.

As a teacher, babysitter and relative, I encourage the little boys in my life to play dress-up, house or princess with their female playmates because I’ve yet to hear a convincing argument as to why it’s any different from encouraging the girls to get down and dirty in the mud with their brothers. Sure, it’s radical—just as my mother’s wearing jeans to school 42 years ago was radical—and the last thing I want to do is turn a child into something he’s not. But as with a girl, I want him to feel that every option is open to him, despite any hang-ups tradition has about it. And if it becomes evident that he truly has no interest in anything soft or sparkly, I at least want to do my best to ensure that he never, ever makes fun of any boys who feel otherwise.

Nothing divides a country quite like a national holiday. When I was studying in St. Petersburg ten years ago, there was as much apathy as there was celebration on the Russian Federation’s June 12th decennial. German reactions to Reunification Day every October 3rd are anything but united. And on the United States’ Fourth of July last month, Chris Rock tweeted, “Happy white peoples independence day, the slaves weren’t free but I’m sure they enjoyed fireworks.”

Amid the outbursts of “unpatriotic!”, conservative blogger Jeff Schreiber shot back, “Slavery existed for 2000yrs before America. We eradicated it in 100yrs. We now have a black POTUS. #GoFuckYourself.”

Schreiber has since written a post on his blog, America’s Right, apologizing for cursing and conceding that the slave trade was unconscionable. But for all his insistence that he never intends to diminish the horrors of American slavery, he adds that President Obama’s policies are now “enslaving Americans in a different way.” (Real classy.) And for all his reiteration that slavery was always wrong, he still hasn’t straightened out all the facts skewed in his tweet.

“Slavery existed for 2,000 years before America.” He uses this supposed fact to relativize the oppression, as if to shrug, “Well, everyone was doing it back then.” His tweet implies that the ubiquity of the slave trade makes America’s abolition of it exceptional, not its participation. This argument hinges on fiction. Slavery did not exist for 2,000 consecutive years. In the West, it was pervasive in Antiquity and the Modern era, but it was downright uncommon in the Middle Ages. (While anathema to our modern ideas of freedom for the individual, medieval serfdom was not slavery.) Slavery was re-instituted in the West roughly 500 years ago with the advent of colonialism. And the United States held on to it long after most other colonial powers had abolished it. Critics can say what they want about the effectiveness of Chris Rock’s rain-on-a-parade tactics, but his argument did not distort history.

In my last post, I argued against concealing the human rights abuses of the past for the sake of nostalgia, not least because doing so is the height of inaccuracy. But portraying history as an unbroken tradition of straight, white, able-bodied male dominance, as Schreiber did, is also inaccurate. The universal human rights movement in its modern form is indeed only a few decades old, but the idea of equality for many minorities can be found all over history at various times and places. The Quakers have often been pretty keen on it.

And almost no minority has been universally condemned. People with dwarfism appear to have been venerated in Ancient Egypt. Gay men had more rights in Ancient Greece and in many American Indian societies than in 20th century Greece or the United States. Muslim women wielded the right to divorce long before Christian women. English women in the Middle Ages were more educated about sex than their Victorian descendants. Much of the Jewish community in Berlin, which suffered such unspeakable crimes culminating in the mid-20th century, was at earlier times better integrated into the city than Jewish people were in many other capitals of Central Europe. In short, history does not show that racism, misogyny, homophobia, ableism, transphobia, and our current beauty standards are dominant social patterns only recently broken by our ultra-modern culture of political correctness. The oppression of minorities may be insidious and resilient throughout history, but it has never been universal.

Downplaying the crimes of the past by claiming everybody did it is both historically inaccurate and socially irresponsible. It is perverse when such misconceptions fuel arguments for further restrictions on human rights. In 2006, Republican Congress member W. Todd Akin from Missouri claimed that, “Anybody who knows something about the history of the human race knows that there is no civilization which has condoned homosexual marriage widely and openly that has long survived.” Even if this were true, the argument is absurd. (It appears that no civilization has regularly chosen women with dwarfism for positions of executive power, but does that mean it’s a bad idea?) But the argument collapses because it relies on facts that are untrue.

Granted, hyperbole is a constant temptation in politics. Stating things in the extreme is a good way to grab attention. In an earlier post on sex, I asserted that mainstream culture assumes women’s sex drive is lower than men’s because female sexual expression has been “discouraged for millennia.” Patriarchy has certainly been a major cultural pattern around the world and throughout history, and we cannot emphasize its power on both the collective and individual psyche enough. But patriarchy is by no means a cultural universal. Ethnic groups in Tibet, Bhutan, and Nepal continue to practice polyandry into the present day, while history shows many others that have done the same at various times. These exceptions call into question the biological theory that heterosexual male jealousy is an insurmountable obstacle to sexual equality. And they undercut any conservative excuse that insists, “Everybody’s been doing it.”

They haven’t been. Xenophobia has never been universal. Humans may have a natural fear of the unfamiliar, of what they perceive to be the Other, but our definitions of the Other change constantly throughout time and space, as frequently and bizarrely as fashion itself. This makes history craggy, complex, at times utterly confusing. Like the struggle for human rights, it is simultaneously depressing and inspiring. But whatever our political convictions, we gotta get the facts straight.