A Sydney woman has been declared fit to stand trial after being charged with murder over the 2010 death of her infant daughter. The judge concluded that before the child died, the mother was “obsessed with perfection” and panicked that her daughter had achondroplasia, the most common type of dwarfism. She insisted that skin tags, a flat nose, and the shape of the baby’s forehead were proof of the condition, and subjected the child to round after round of x-rays and genetic tests, all of which came back negative. The Sydney Morning Herald reports: “When one friend got ‘fed up’ and told her she wasn’t dealing with something like cancer, the mother replied: ‘Sometimes it’s better to deal with a terminal illness than to live with a dwarf for the rest of your life.’ ”

That her daughter did not have achondroplasia is wholly irrelevant. Neglecting or harming a child on the basis of a bodily deformity she did or did not have is tragic no matter how you cut it. It sends two extra shivers down my spine because I have achondroplasia myself, and any biological children of mine would have a 50% chance of inheriting it. In several previous articles, I’ve examined the complicated issue of children with rare conditions and parents who lack the skills to give them the support they deserve. I am equally preoccupied with what it means for the child and what it means for the parent.

I’m not interested, however, in judging the accused woman personally, because we can draw few accurate conclusions from the reports of her case. Many will argue that her schizophrenic disorder was the sole catalyst of her actions. Yet experts on mental illness have long labored to convince a skeptical public that having schizophrenia does not make someone more likely to commit murder or manslaughter, and bigotry against achondroplasia is certainly not a symptom of the illness. Schizophrenic disorders are complex, and armchair diagnosis is a dangerous game far too many of us like to play. The temptation is best left resisted.

But it is safe to say that the likelihood of incidents like these would dramatically decline if our society saw nothing wrong with looking like a dwarf. Humans have a long history of parents abandoning or murdering deformed or disabled children. It goes as far back as Ancient Sparta and was codified into law here in Germany under the Nazi regime. And even in cultures where disabled or deformed citizens have generally not had to fear a death sentence, being humiliated or abandoned for having a certain body type is horrid enough. Firm belief in bodily hierarchy can be found in countless corners of modern society, from the glossy pages of lifestyle magazines, to Nobel Prize winner James Watson’s lectures on inherent attractiveness, to capitalist icon Ayn Rand’s arguments about who should be considered subnormal.

Yet while the long history of ableism and lookism may be a daunting fact, it is also a fact that fashion is constantly in flux. Humanity’s habit of relentlessly coming up with new ideas for how bodies should look is a cause for hope. Not because a woman with achondroplasia winning a beauty pageant could ensure our universal acceptance once and for all. It couldn’t. But by understanding how utterly diverse beauty standards, athletic standards, and intelligence standards really are throughout time and space, and by facing the very real dangers of xenophobia in extremis like the horror in Sydney, we should be able to agree that we’re all better off never being “obsessed with perfection” when it comes to bodies.

Can we stop using the words “nerd” and “geek” interchangeably? Forgive me if this doesn’t sound like the most pressing social justice issue of our time, but hear me out. I think the distinction is subtle but significant.

Geeks are a subculture. They usually like science fiction because it’s built around ideas posed by math and the natural sciences, just as literature is built around ideas posed by the humanities. If you don’t have a big appetite for Star Trek, the Hitchhiker’s Guides, or video games, you’re probably not a geek. Just as, if you don’t enjoy nature, long hair, or folk rock music, you’re probably not a hippie.

Nerds, in contrast, simply share one trait: wanting to learn almost everything there is to know about a subject at the expense of their cool factor. And it seems to me that there’s a little nerd in all of us. From trivia and statistics to random factoids, a nerd examines a topic down to what Slate calls “the granularity that would glaze the eyes of a normal, well-adjusted human.” Sometimes the eye-rolling this brings on is fueled by inane rules for style that value keeping the lowest common denominator very low. But anyone with social intelligence knows that it’s also unfair to demand everyone share your love for a subject, no matter what it is.

I try not to look bored when friends expound upon existentialism, or when my dad gets excited about weather statistics, but I can likewise put them to sleep with monologues about typography or Russian grammar. I have a hard time looking thrilled when my husband analyzes the meal he cooked for us in too much detail, or when my uncle gets out his car magazines, but I get the same looks from outsiders whenever I discover a fellow classic rock fanatic. An obsession with trivia—in any area—will forever be the opposite of a social lubricant. Saying, “I’m such a nerd” with a sheepish grin usually means, “I love something to a degree that might ruin the evening if you ask me about it.”

But traditionally, the nerd word is used much more specifically. Nerd hobbies are thought to be geeky. Nerd intelligence almost always means “book smart.” The Urban Dictionary says a nerd is “one whose IQ exceeds his weight.” A gardener and a mechanic can be skilled, but only botanists and engineers can be nerds. Why?

One summer in my early teens, I was sunbathing at a friend’s house and talking about the new atlas I had bought. “I’m hoping that someday I can identify all the flags of the world,” I smiled, with perhaps a bit too much enthusiasm.

My friend’s mother frowned and asked, “Why?! Just to be better than everyone else?”

She knew how to repair a motorcycle. I knew the names of the world’s nations. Why was my knowledge automatically seen as a pretension? (I was too embarrassed and too young to dare to ask her, but I wish I had.)

A lot of it has to do with social status, however ridiculous that is. We tend to see bookish people as the inventors of ideas and therefore the brains. People working in production and maintenance are the realizers of the ideas and therefore the salt of the earth. Artists are classified depending on which of these two groups they appeal to: Classical composers and jazz musicians make high art for the “elite,” while rappers and country singers make soul for “the people.” (Artists who appeal to both are gods and everyone wants to sleep with them.)

Self-proclaimed nerds sometimes defend these rigid categories, reassuring themselves that the only reason anyone would malign their expertise is meat-headed jealousy. This is certainly true in many cases. The stereotypical anti-intellectual will lash out when someone’s way of life threatens to highlight his weaknesses. But the stereotypical ivory tower snob will sneer when someone’s way of life threatens to highlight his weaknesses. Both the belligerent athlete and the arrogant mathlete lack the emotional intelligence to recognize that both trigonometry and football require brains. Both topics can be obsessed over in nauseating detail. But Western society—which places an inordinate emphasis on IQ—has yet to be convinced of this. IQ tests define “intelligence” as strong mathematical and/or verbal skills, and so do most of us when we describe someone as “smart.” This is wildly inaccurate and unhelpful.

Howard Gardner’s theory of multiple intelligences turns 30 this year, but we have yet to adopt the concept into our common parlance. The theory currently identifies seven forms of intelligence: linguistic, logical-mathematical, musical, bodily-kinesthetic, spatial, interpersonal, and intrapersonal.

(Some groups have promulgated a theory of Culinary Intelligence, as well as Sexual Intelligence.)

So there are more than two ways to be “smart.” It seems logical to conclude that people choose their jobs based on combinations of intelligences. A speech therapist needs both linguistic and interpersonal intelligence, whereas a songwriter needs linguistic and musical intelligence. A dancer needs musical and kinesthetic intelligence, while a soldier needs kinesthetic and spatial intelligence. Take that, IQ tests.

But this shouldn’t come as a big surprise. Every one of us knows someone who’s read a hundred books but can’t fill out a tax form. Or who can identify every bit of green in your backyard but can’t analyze news stories in a historical context. Or who can counsel people with all sorts of problems but can’t dance for the life of them. Or who can sew the coolest costumes but can’t make strangers feel comfortable. We should all be big enough to take pride in our talents and to be teased for our weaknesses. Especially if we’re going to start fully accepting people with certain disabilities.

The theory of multiple intelligences does not claim that everyone is a genius in their own way. Everyone knows a good guitarist isn’t as smart as a great guitarist. But the theory asserts that a great guitarist is no smarter than a great nurse or a great ballerina or a great chemist. So why then do we call the chemist “smart” and the others “talented”?

And why isn’t the soccer nut who won’t stop analyzing the semi-final games called a nerd? Why isn’t the housewife who goes on and on about how to master pie crust recipes called a nerd? Maybe it’s because these activities are socially condoned: A guy is expected to love sports and a housewife is expected to love baking. Maybe by choosing less socially accepted hobbies, people of high IQ monopolize the term “smart” as a consolation prize. Maybe the term “nerd” still carries too much stigma for socialites to desire it. Maybe if we broaden the use of these words, maybe if everyone recognizes their inner nerd, then maybe some social barriers will be knocked down along the way.

I’m not expecting utopian results. We’re all doomed to clash over our passions because no one can be expected to obsess over the intricacies of every subject on earth. Whenever I get together with a friend who works as a computer programmer, it’s a fight over whether we play games that reward strategy (like Monopoly), or games that reward vocabulary (like Scattergories). He’s geekier than I am, but he’s not nerdier. In any case, I always get my way because I’m bossier.

I was about to help a 5-year-old remove her tricycle helmet when we were cut off by a man staggering slowly down the street. Unnerved by his sudden presence and unusual gait, she stepped back and did a double-take. She stared at him and then turned to me. “He walks strange.”

I smiled but waited a few more beats until he seemed to be out of earshot. In the meantime, I wondered what to say to her. The adult/cynic in me was responsible for my gut feeling that he must be struggling with drugs or alcohol.

But then I considered how useful gut feelings really are in such situations. Annette Funicello complained of being accused of drunkenness when she was struggling with the early stages of multiple sclerosis. I have fielded enough questions about my sway back and achondroplastic gait that I can only guess how many people don’t bother to ask me and simply make their own silent assumptions.

And while some have claimed gossip can be beneficial, it is so often responsible for misinformation and arrogance – the bedrock of ableism.

“He might be sick,” I said to her. “But we don’t know. He hasn’t told us. Sometimes when you’re sick your legs don’t work right. Do you remember when I had a brace on my leg last year?”

She nodded, and then peered once more down the street at him. “I think it’s ‘cuz he’s old! He has a gray beard and lots of old people have gray beards…”

“Some do! Like Santa Claus, right?”

She nodded.

“And my dad has a gray beard and lots of people call him Santa Claus!”

She laughed.

“But [my husband] has little gray whiskers, too, and he’s not really old yet, is he?”

“No… ”

“Does your daddy have little gray whiskers, too?”

“One or two… ”

“Yeah. Do they scratch when he gives you a kiss?”

“Yup!”

Neither she nor I will ever be fully liberated from the temptation to silently classify many of the strangers we encounter throughout our lives. But it is worth remembering that we ultimately cannot know for sure, and that such conversations need not be engulfed in tones of complacency or pity.

The tiff between comedienne Amy Schumer and Glamour magazine this week has reached the media coverage level of Big Deal. In an issue featuring plus-size models on its cover, Glamour listed Schumer under “Inspiring Women We Admire” alongside Melissa McCarthy and Adele. Schumer took to Twitter to complain:

I think there’s nothing wrong with being plus size. Beautiful healthy women. Plus size is considered size 16 in America. I go between a size 6 and an 8. @glamourmag put me in their plus size only issue without asking or letting me know and it doesn’t feel right to me. Young girls seeing my body type thinking that is plus size? What are your thoughts? Mine are not cool glamour not glamourous.

The Glamour editors apologized for any hurt feelings, emphasizing their respect for Schumer and insisting they never meant to suggest she is plus-size.

The public has divided into two camps, with Schumer’s supporters claiming she has helped to question not only the definition but the very idea of “plus-size.” After all, as the children’s book You Are (Not) Small shows, size is relative. “Plus size” is, to be sure, an utterly made-up idea, necessary to absolutely no one on earth.

The other faction has criticized Schumer’s seemingly contradictory praise for plus-size models in the same breath that she insists she doesn’t belong with them. While I am not interested in coming to any conclusions about Amy Schumer’s true personality and values, her actions thus far represent an all too common problem in the body positive movement. The problem leaves women who are larger than a size 6 or 8 to fend for themselves not only against the hideousness of lookism in general, but against the implication that their smaller sisters are all quietly consoling themselves with the mantra, “At least I don’t look like that!”

Spend decades working to pick apart body image and lookism, and you’ve heard this all before. A woman—usually a woman—is an out and proud feminist, ready to roar about restrictive beauty standards while cracking jokes about her curves, but she cannot and will not stand anything less than compliments on her looks from others. In some cases, she goes fishing for compliments as much if not more than your average beauty pageant queen:

The reason so many of us end up doing this is because we like to be thought of as confident, yet we behave based on fear. We fear being called ugly, we fear not having broad appeal, and we do nothing to confront those fears. We talk openly about them. And stop there. And in doing so, we spread them.

We don’t face up to the fact that “winning” the beauty pageant game by having fashionable looks is no guarantee of lasting love or happiness. Instead, we keep on envying the winners and ever so quietly echoing the Mean Girls we met in high school: It is very important that most people think you are attractive. Beauty contests matter. Hierarchies matter, at least a little. No one wants to be last. You need someone to look down on in order to build yourself up. That’s natural. It’s a mess of a message to women and men, young and old alike. And it helps no one.

Sometimes it helps to switch from the high school mindset to an even less mature one. Spend a lot of time around pre-school children, and you know you can’t control what they notice:

“I think you’re pregnant!”

“Your skin’s all wrinkly!”

“Why are you so short?”

“Why do you walk so funny?”

“This hair is gray!”

“What’s that stripe on your arm?”

“What are those dots on your face?”

“Twenty-two is old!”

Pre-school teachers won’t last—let alone make it through their first week—if they let such comments get to them. The best response, of course, is to engage the child and together examine the bodily feature they want to understand. If you don’t have the energy for a teaching moment, however, you can simply shrug it off. Or say, “I am short/scarred/disabled because that’s just how my body looks. I like it that way.”

And if you want them to believe that—or anyone to believe that—then it helps if you believe it, too.

I was only slightly startled to find nothing but solipsistic snickering and overdone puns in most discussions of sex and dwarfism. The Atlantic doesn’t win any points for ending its article on a pun, either. But praise is due for addressing the topic at all. Based on an extensive interview with Dr. Marylou Naccarato, who has Kniest dysplasia, the article takes a wonderfully sex-positive approach to the experiences of people with dwarfism and the physical obstacles they can face in bed.

As with nearly every feature on dwarfism in the mainstream media, it contains some factual errors. For example, one dwarf couple is quoted claiming that people with achondroplasia require “no medication, surgeries, special needs, nothing.” (See here for a list of the many complications we are at risk for.) But Naccarato is doing great work that is revolutionary in light of the fact that Little People of America, and probably most disability advocacy organizations, repeatedly shy away from the topic of sexuality.

A simple reason for their silence is that almost all disability organizations are made up of just as many parents and relatives of disabled people as disabled people themselves. And who wants to debate the best way to masturbate with Mom or Dad sitting next to you? A more sinister reason for the silence is one of the building blocks of modern prejudice against disabled people: the presumption that they are innocent, and therefore asexual. Most positive portrayals of disabled people are cute and cuddly. Is that the only way society can accept us? Refusing to see a minority as anything but asexual is to deny them their full humanity, on par with slut-shaming, prude-shaming, queer bullying, and objectification.

Before I go any further, let me say this: I do not want to talk publicly about what I do in the bedroom and I do not want to know what you do in the bedroom. My firm belief in sex-positive feminism and equality does not mean I think that you are sexy or exciting or impressive. Unless we’re close confidantes or I’ve indicated otherwise, please assume I don’t want any mental images of you and your naughty bits, no matter what they look like.

That said, I fully support anyone’s right to desire any sort of consensual sex imaginable. Without double-standards. Without the pressure of competition. Without the nuisance of others turning their personal preferences into rigid rules.

Take, for example, the way virginity is so frequently turned into not just a game but a high-stakes tournament. When and how you lost it is something all of us are expected to base much of our identity on, even as adults. This is despite the fact that, medically speaking, virginity doesn’t exist. After all, what kind of sex does a guy have to engage in to officially “lose” it? And what about girls born without hymens? When exactly do lesbians lose their virginity?

Like race, virginity is a social construct and, in the words of a very wise person on Tumblr, what can be socially constructed can be socially changed. Last year the great Tracy Clark-Flory interviewed acquaintances about the sexual experience they considered to be their “first time.” The glorious thing about her inclusive project was that it revealed human sexuality to be just as diverse as everything else about us. Some defined their first time by their first orgasm, others by a particular first touch or experience of being touched. The problem with her stretching the definition of “losing your virginity” so broadly is that it robs competitive, insecure people of their ability to set standards with which they can gloat and put others down. Wait, no. That’s another glorious thing about it. There really is no problem with recognizing everyone’s experience as equally valid.

Failing to include everyone not only causes unnecessary humiliation, but it causes us to miss out on opportunities for true enlightenment. To quote the authors of You Can Tell Just By Looking: “Sexual minorities—people whose sexual desires, identities, and practices differ from the norm—do a better job talking about sex, precisely because they are constantly asked to explain and justify their love and their lust to a wider culture and, even, to themselves.” The more you examine harmful traditions, the less necessary they become.

This does not mean that minorities have better sex. Indeed, too many activists in the sexual revolution end up repulsing readers and listeners when they allow pride in their sexuality to devolve into arrogance, insisting their sex life is better than yours, rather than merely different. For a year, the BDSM club at my alma mater ran the slogan: “I do what you’re scared to fantasize about.” Not helpful. And kinda pathetic the more you think about it.

I will never judge someone for liking any particular kind of consensual sex, but I will judge anyone who tries to turn sex into a competition to calm their own self-doubts. Whether you’re a wise-cracking online commenter or a sex-positive pioneer, true sexual liberation is about moving beyond the middle school clique mentality, not indulging in it. It’s pretty much the least attractive thing there is.

In the 1990s, Cristina Hartmann was among the first few hundred deaf and hearing-impaired children in the United States to undergo surgery for a cochlear implant. She has written extensively about the experience of hearing sound for the first time after the implant in her right ear was activated, most recently this month on Quora.com:

My mother was the one who told me, “Raise your hand when you hear something.” That statement left me baffled. What was I looking for? It was a bit like searching for Waldo when you didn’t know what he looked like.

In that tiny, windowless room deep in the large Manhattan hospital, the audiologist began tapping away at her keyboard. Everyone stared at me, even a woman standing in the doorway whom I had never seen before. I felt the heavy weight of expectations on my shoulders. I had to do something. I concentrated very hard, searching for the mysterious, indefinite Waldo. Whenever I felt anything, an itch or a breeze, I raised my hand slowly, searching everyone’s expressions for whether I had gotten it right or wrong. Nobody gave me any confirmation, so I went on guessing. Twenty-five years later, I realize the whole thing was a show that I performed. I knew this was a momentous event, and I didn’t want to disappoint….

As a congenitally deaf child (who was a bit long in the tooth at 6), I had never formed the neural pathways for my brain to even begin processing auditory stimulation. In the fashion of the ostrich, my brain ignored the strange stuff, and I remained as deaf as I had been an hour prior…

It took months and plenty of therapy for her brain to adapt. Thirteen years later, the activation of a second implant, this time in her left ear, proved a more harrowing experience than the first:

As the audiologist began the beep sequence, I burst into tears and involuntarily clenched the left side of my face. She looked up, puzzled. “Why are you crying? You’ve had this before!” she said. The pain was like sparklers going off on the left side of my head. The stimulation, as little as it was, completely overwhelmed me.

Even though I had already laid the neural pathways for auditory stimuli for my right ear, my brain was unprepared for the stimuli coming from the left side. Since my brain had already experienced this type of stimuli, it could process it, but it was still sensory overload. That stuff hurts. It took me months to acclimate myself to the new implant, but in the meantime, I cringed every time I turned it on. As I said, laying new neural pathways takes work.

Hartmann was later told by the mother of another patient, “Once they started with the beeps, [my daughter] screamed and cried.”

Such narratives stand in stark contrast to the YouTube videos of newly activated implant users laughing and smiling—and, in one case, crying for joy—that have been bouncing around the Internet with far greater frequency. While both kinds of narrative provide important information for those considering cochlear implants for themselves or their children, they also deepen the general public’s understanding of what it means to be deaf.

It makes sense that crossing out of the world of silence into the world of sound is just as disorienting as its opposite. A hearing person with a middle ear infection strains to perceive the sound of speech, and a deaf person with a new cochlear implant strains to tune out noise pollution: the knocks of a radiator in another room, car doors slamming on the street, wind, footsteps, not to mention the countless background beeps and clicks of the Digital Age. After all, when a baby leaves the womb, she does not instantly adapt to her new home. She comes out crying. There’s too much light and not enough warmth. And, if she is not deaf, there is too much sound.

Speech is no less difficult to learn than Sign language, just as English is no less difficult than Chinese. The ease with which we learn one form of communication or the other depends entirely upon our personal experience and place in the world. For those of us who have grown up hearing speech, the viral videos communicate something very different than for those who grew up in Deaf culture.

While the experiences of utter delight portrayed in the videos are valid, their popularity contributes to an oversimplification of the issue. Watching a toddler smile upon finally hearing his mother’s voice for the first time sends a very strong subliminal message: Being deaf must be worse than not being deaf, and therefore anyone would want to join the world of the hearing. But the general public as an audience is already biased toward the hearing world’s standards of happiness. We are moved by the sound of loved ones uttering our names but not at the image of them signing our names because our culture does not rely on—and therefore does not highly value—Sign language.

I want to make it clear that I don’t have a problem with people who choose to get cochlear implants. Medical decisions are painfully personal… I’m all for people making the health choices they think are best for them. What bothers me are the maudlin videos produced out of someone’s intense, private moment that are then taken out of context and broadcast around the world. What bothers me is how the viewer never learns how the individual came to the decision about their implant, which factors they took into account, whether their medical insurance covered it. Sometimes we don’t even learn their names.

This gives me pause. I think of the clip of me removing my casts to look at my newly lengthened legs, featured 15 years ago in the HBO documentary Dwarfs: Not A Fairy Tale and last year on Berlin’s public station. The moment was simply joyous—as was the moment I stood up, let go of my friend’s hands, and took my first steps—but the story behind it was abundantly complex. I hope both documentaries portray that complexity.

Limb-lengthening and cochlear implant procedures differ in several marked ways. Limb-lengthening, for example, does not threaten to endanger a language. But it does threaten to split the dwarf community over the controversy of altering versus accepting extraordinary bodies. Both procedures have proven to draw vitriol from proponents and detractors alike.

Hartmann reveals:

Most of my deaf friends were good about my CI. They didn’t mind it, except for the fact that my speech therapy cut into play time. That being said, people in the Deaf community felt free to make pointed and derisive comments about my CI. I still get these comments, even almost 24 years after my surgery. To some, I’ll always be a CI-wearer and a turncoat.

The CI advocates aren’t any better, if not worse.

I have very pleasant relationships with many parents of implanted children and CI users. I, however, have also been called a failure because I still use [American Sign Language] and don’t speak perfectly. I’ve also seen a mother run across a room to prevent her child from signing to another deaf child. I’ve been scolded for making gestures and looking too “deaf.”

But those of us who will never have to decide for or against a cochlear implant face a challenge of our own: overcoming our bias and remembering that Deaf culture is no less valid than the hearing culture we inhabit. Especially when those admittedly tantalizing videos wind up in our Facebook feeds.

Fourteen years ago, I made a trip to Hot Topic—that quintessential 90s chain store for all things goth—in search of some fishnet stockings for a friend. It was my first visit to the store since I was back in a wheelchair for my third and final limb-lengthening procedure and the narrow aisles prevented me from venturing beyond the entrance. My first time in a wheelchair, from ages 11 to 12, had been a completely humbling experience as I was forced to see how very inaccessible the world is for the non-ambulatory. This time around I was battling the hot-cheeked self-consciousness that adolescence attaches to any signs of dependency.

As I tried to look casual while flipping through black gloves, black stockings, and black dog collars, a guy approached me sporting crimson hair, eyebrow rings, an employee badge and a smile. “This store is easily adjustable,” he grinned, and with that he began shoving aside the display cases and clothes racks—which were, like me, on wheels—clearing a path for me right through to the back and taking little notice of the other shoppers, some of whom took one to the shoulder. It was one of those crushes that disappear as quickly as they develop but leave a lasting memory: my knight in shining jewelry.

Thanks to experiences like this, I have a special place in my heart for the acceptance of physical differences that can often be found in the subcultures of punks, hippies, and goths. From the imagining of monsters to the examination of anything taboo, counter-culture is often unfazed by physical qualities that fall outside of mainstream beauty standards. The first kid in my high school who chose not to stare at the external fixators on my arms but instead held the door for me had green and purple hair. About a month after my trip to Hot Topic, I showed a death-metal-loving friend my right fixator (shown above) for the first time, with the six titanium pins protruding from open wounds in my thigh. He grinned, “That is the ultimate piercing, man!” He hardly could have come up with a more pleasing reaction. That my wounds were cool instead of “icky” or “pitiful” was a refreshing attitude found almost exclusively outside mainstream culture. This attitude more readily understands my belief that my scars are merit badges I earned, not deformities to erase.

However, this tendency toward decency over discomfort is just one side of the alternative coin. Every subculture has its strengths and its weaknesses, and for all the freaky heroes I’ve encountered, I’ve also met plenty whose celebration of difference devolves into a sick fascination with the grotesque. “Weird for the sake of weird” is progressive when it asserts that weird is inescapable, that it is in fact as much a part of the natural order as any of our conventions, and when it serves as therapy for the marginalized. But it is problematic when it involves self-proclaimed artists using others’ reality as their own personal toys.

In a previous post, I referred to a friend of a friend including me in an Internet discussion about limb-lengthening. His comments were in reaction to a photo of a leg wearing an Ilizarov fixator that had been posted on a Tumblr page focused on the “wonders of the world.” There are countless sites like it, where photos of conjoined twins, heterochromatic eyes, intersexual bodies, and medical procedures are posted alongside images of animals, vampires, robots, cosplay, self-harm, manga and bad poetry. I get it. The world is “crazy” and it’s all art. But if that’s not a freak show, what is?

Disabled people are no longer put behind glass or in the circus—at least not in the U.S., Canada or Western Europe—but many people still believe they reserve the right to stare, both in public and on the Internet. Whether under the guise of promoting diversity or admiring triumph in the face of adversity, they suppress any realization they may have that no one likes being stared at. Unless it’s on our terms.

I see endless art in my medical experiences and it can be so therapeutic. During my first limb-lengthening procedure I also had braces on my teeth, leading my dad to observe, “She’s now 95% metal.” Kinda cool. During my third procedure, I had Botox injected into my hips twice to paralyze my muscles lest they resist the lengthening. At the time, when I along with most people had no idea what it was, it was described to me as “basically the most deadly poison known to man.” Whoa, hardcore. When I happened upon photos of my anterior tibialis tendon graft surgery, I was enthralled: “I’m so red inside!” And when a fellow patient recently alerted me to the fact that a high-end jeweler designed a bracelet strongly resembling the Ilizarov frame, I laughed my head off. Almost all of us like looking at our bodies, and perhaps this is especially so for those of us who have had real scares over our health. It’s a matter of facing our fears and owning it. But no one likes the idea of others owning it. This subtle but severe preference, this desire for dignity, determines the difference between human rights and property rights.

Two years ago, NPR featured a piece by Ben Mattlin, who is non-ambulatory and who said he used to be uncomfortable with the idea of Halloween and its objectification of the grotesque. From my very first costume as a mouse to my most recent stint as the Wicked Witch of the West, my love of Halloween has not so much as once flickered, but his point is worth discussing. Costume play, Halloween and any celebration of “weird” that is primarily attention-seeking inherently assumes there is a “natural” basis to be disrupted. (And all too often Halloween devolves into offensive imitations of all sorts of minority identities.)

I have my own collection of artsy photos stolen off the Internet that I use as screensavers and montages for parties, but they do not include photos of bodies taken outside the context of consensual artistic expression. Re-appropriating a photo in a medical journal for a site about all things bizarre is protected under freedom of speech, but it can feel like disregard for consent. And in any case, such xenocentrism will always be just as superficial as the status quo it seeks to disrupt.

When conjoined twins Abigail and Brittany Hensel agreed to be interviewed once—and only once—for a documentary about their lives (which I highly recommend), they explained that they don’t mind answering strangers’ questions at all. (Ben Mattlin has said the same, as do I.) What they hate more than anything is being photographed or filmed without permission. While they were attending a baseball game outside their hometown, a sports film crew quickly turned its cameras on the girls. Even though they were already being filmed by their own documentary team, the strangers’ camera and its invasive, presumptuous stare ruined the day for them.

Sensitivity toward others’ experience with medicine and death should never kill the discussion. These discussions are imperative and art is the most glorious way we relate to one another. But just as there’s more to good manners than simply saying “Please,” there’s more to genuine learning and artistic expression than poking at anything we can get our hands on. Nuance, deference and respect are prerequisites for anyone with artistic or scientific integrity not only because they are the building-blocks of common decency, but because history has shown that curiosity will more likely harm the rat than the cat.

On Friday the British Parliament resoundingly struck down a bill that would have guaranteed its citizens the right to physician-assisted death. Yesterday California’s legislature voted to make it the sixth state in the U.S. to legalize it.

Robust, nuanced arguments have been made for and against physician-assisted death for terminally ill patients, and none of these arguments could be successfully summarized within a single article. This is why a conclusive stance on the issue will never appear on this blog. It is nothing but moving to hear the deeply emotional pleas from those in the right-to-die movement who have thought long and hard about the prospect of death, who feel empowered by having some choice when facing down a daunting fate, who don’t want to find out which of their loved ones may turn out to be unskilled at care-giving. And it is equally moving to hear the experiences of those working in hospice and palliative care who face the approach of death every day with the determination to make it as minimally painful and emotionally validating as possible for all involved.

However, despite the emotional validity of both sides, there are tactics the right-to-die movement should avoid if it does not wish to make our culture more ableist than it already is. Openness about end-of-life decisions can shed light on a subject previously cloistered away, but the more the right-to-die movement celebrates the idea of ending someone’s life before it transforms into a certain condition, the less willing the public may be to engage with and invest in those who live in that condition.

Which is why no one should call physician-assisted death “Death with Dignity,” as lawmakers in Washington, Oregon, and New York have done. The implication that anyone who opts out of assisted death might live an undignified life is reckless and arrogant. A patient declaring the prospect of invasive life-saving interventions “too much” is fair. A writer declaring the quality of life of those who opt for them “pathetic” is ostracizing. It insults not only those enduring late-life debilitation, but the everyday conditions of many, many disabled people of all ages around the world.

Even today, when so many movements push to integrate disabled people into the mainstream, the average person is generally isolated from the reality of severe deformity, high dependence, and chronic pain. This isolation feeds fear and is therein self-perpetuating. As opponents have pointed out, many right-to-die arguments quickly snowball, equating terminal illness with chronic illness and disability, and portraying all three as a fate worse than death. Hence the name of the New York-based disability rights group Not Dead Yet.

Vermont’s recent law, the Patient Choice and Control Act, bears a far less polemical name than the others currently on the books. That’s a start. Experts are divided as to whether the current openness about end-of-life decisions in the U.S. has led to more terminally ill Americans considering and opting for hospice and palliative care. Regardless, both sides should be encouraging well-informed discussions that honor a patient’s right to voice his beliefs based on personal experience, and a disabled person’s right not to be further marginalized by a culture that has historically feared her existence.

*Note: I use “physician-assisted death” and other terms in deference to World Suicide Prevention Day this past Thursday and the media guidelines from the Centers for Disease Control and Prevention, which discourage use of the word “suicide” in headlines to avoid contagion.

This week marks the 25th anniversary of the passage of the Americans with Disabilities Act. As others have noted, the law was ground-breaking not only because of its international ripple effect, but because it recognized disability not as an issue of health, but of human rights.

Robert L. Burgdorf Jr., the author of the bill, writes in The Washington Post why this was so necessary:

People with disabilities were routinely denied rights that most members of our society take for granted, including the right to vote (sometimes by state law, other times by inaccessible polling places), to obtain a driver’s license, to enter the courts and to hold public office. Many states had laws prohibiting marriage by, and permitting or requiring involuntary sterilization of, persons with various mental or physical conditions, particularly intellectual disability, mental health conditions and epilepsy. A number of states restricted or denied the right of people with mental disabilities to enter into contracts. Several U.S. cities, including Chicago, Columbus and Omaha, had what became known as “ugly laws” that banned from streets and public places people whose physical condition or appearance rendered them unpleasant for other people to see. These laws were actually enforced as recently as 1974, when a police officer arrested a man for violating Omaha’s ordinance.

In some instances, discrimination threatened the very lives of individuals with disabilities: Lifesaving medical treatments that would routinely have been made available to other patients were denied to patients with disabilities; in 1974, the New York Times cited an estimate that unnecessary deaths of babies with disabilities in the U.S. resulting from withholding of medical treatment numbered in the thousands each year.

Things have improved substantially, which is cause for celebration. But not complacency. Which is why NPR’s article “Why Disability and Poverty Still Go Hand-In-Hand” is well worth your time, as is the above TED Talk by the late, great Stella Young, whose unexpected death last winter was a tremendous loss to the disability rights movement and to anyone who enjoys a good dose of sarcasm with their social critique.

Pharmaceutical company BioMarin announced last week the first results of its clinical trials for the drug BMN-111, now named vosoritide by the World Health Organization. Researchers have been developing vosoritide in hopes of one day curing achondroplasia, the most common type of dwarfism. Vice-President Dr. Wolfgang Dummer reported:

In children receiving the highest dose of 15 micrograms per kilogram daily, we observed a 50% increase in mean annualized growth velocity compared to their own natural history control growth velocity. This increase in growth velocity, if maintained, could allow children with achondroplasia to resume a normalized growth rate. More importantly, vosoritide was well tolerated in all dose cohorts and we have observed no major safety concerns to date.

Since many of my readers are new to the blog, I’m re-posting my article “Will We Live to See the End of Dwarfism?” about how some of us with achondroplasia feel about all of this.

* * *

Medicine has been transforming the fate of human society since the first moment someone bandaged a wound. Bearing this in mind, along with the more recent advances in genetics, I have realized for the past decade or so that there is a future, however near or distant, that promises a world without dwarfism. But what if this world arrives as soon as the next generation?

Pharmaceutical company BioMarin reported earlier this year the start of clinical trials for a drug called BMN-111. If it ends up doing what it promises, repeated injections could transform the bone and cartilage growth of children born with achondroplasia, essentially curing them of the condition. Could this mean that I might someday belong to the last of the dwarfs?

To be clear, BMN-111 could cure only achondroplasia, the most common type of dwarfism, not the other 200+ types. (So the attention-grabbing name of this article is a tad misleading.) Dwarfism caused by growth hormone deficiency—which affected circus performer General Tom Thumb and most of the actors playing the Munchkins in The Wizard of Oz—has already been cured by hormone injections invented at the end of the last century. But 70% of all dwarfs have achondroplasia. Without us, the small number of people identifiable as dwarfs would become much smaller.

Because I’m a fully grown adult, I can’t ever cure my achondroplasia. But would I have chosen to do so if I could? Were my doctor to offer me a pill that would transform my joints and my muscle tone, allowing me to walk and stand around for longer than an hour without my feet swelling with pain, I would take it in an instant. The same goes for a pill that would endow me with more normal fine motor strength, so that I could open jars and push down sticky buttons and do all those tasks that leave me swearing and/or asking someone else for help. I would gladly have taken a pill that would broaden my Eustachian tubes so that I would stop getting ear infections every year. And I would have embraced any sort of medicine that would have widened my spinal column so that I would never have had to have a laminectomy, and so that I could cook and clean my house without back pain. All of the discomfort and inconvenience I just listed are part and parcel of achondroplasia – parts that limb-lengthening could never alter.

But when I consider a pill that, in ridding me of all that pain, would also rid me of every physical marker of achondroplasia, I suddenly hesitate. My wrists, my feet, my skull, my face would look significantly different from the one I have. The idea of never having had to learn how best to react to being the most physically remarkable person in school, of never having undergone limb-lengthening, of never having lived in an institution with children with all sorts of serious conditions, of never having had to explain my unique history to others – it makes me have a hard time imagining an Emily Sullivan Sanford that is anything like the one I know today. My dwarfism is only part of who I am, but it has been a significant part of who I am. This is why I understand the Little People of America members who balk at BMN-111, put their fingers in their ears and chant, “Go away, go away, go away!”

We must approach the future rationally because our emotional attachment to life as we know it can lead us to delude ourselves with an unrealistic sense of control. History after all demonstrates that future generations will never know all kinds of things we treasure today. Give or take a few centuries, people in our part of the world will most certainly not face the same illnesses, speak the same language, wear the same clothes, eat the same foods, or observe the same traditions we do. Whether we’re debating the politics of Hawaiian Pidgin or that punk’s not dead, we do not get the final say on what future generations will know and what will be lost to the ages.

Identity is a construct, but a construct that is as powerful as any other. As Andrew Solomon writes, “I don’t wish for anyone in particular to be gay, but the idea of no one’s being gay makes me miss myself already.”

Granted, achondroplasia is not merely a difference like a dialect or homosexuality. It is a medical condition that causes very real physical pain and health risks. Like diabetes. I can write with certainty that the vast majority of people with diabetes, while rightfully proud of the obstacles they’ve overcome, would happily rid themselves of the disease. They would celebrate never having to check their blood sugar, inject themselves with insulin, or worry about developing dangerous complications. We can safely make the same assumption for people who have to deal with migraine headaches or deep-vein thrombosis.

But let’s consider a condition that, like achondroplasia, has as many social ramifications as medical ones. I bet most people who wear glasses would gladly take a pill that guaranteed perfect vision. No more headaches, no more pressure sores on the bridge of your nose, no more wondering where you set them down, no more worrying if they break, no more bills! But would they so easily let go of their bespectacled appearance? Although he no longer needs glasses since his laser surgery, comedian Drew Carey wears non-prescription glasses to maintain his look.

I surveyed a handful of friends in Europe and the U.S., and most answered that they would indeed take a pill guaranteed to improve their vision, and also that they would never wear anything but sunglasses again. If this scenario ever becomes reality, the movement of the past 100 years to broaden beauty standards to include the bespectacled will begin to fade. The 20% of my respondents who answered, “I would wear non-prescription glasses because it’s a part of my identity,” will belong to a shrinking minority left to fend for itself. They will likely start counting the minutes until they hear something marginalizing like: “Isn’t it great you won’t have to look like a nerd anymore?”

Once again, people with achondroplasia must admit that our distinguishing condition involves far more innate physical complications than simply needing glasses or being gay. Activist Harry Wieder bemoaned the reticence among people with dwarfism to even admit that we are disabled, and he was right to be so critical. Downplaying the pain and surgical risks everyone with achondroplasia faces is a matter of denial. But such denial is often rooted in the worry that others will overemphasize our pain, distancing themselves from us in a way all too similar to the fear and pity that fuels ableism. Such distance imposed by other minorities can break solidarity and lead to hierarchical thinking along the lines of, “At least I’m not like that!”

Anyone who reacts to the idea of BMN-111 ridding humanity of the achondroplastic appearance with a sigh of relief has a problem. It’s a problem we can never afford to ignore. The lessons of diversity awareness and inclusion are priceless. If dermatologists someday offer a cure for vitiligo, Winnie Harlow’s recent successes in the world of modeling will still have been nothing but a good thing.

My attachment to my starfish hands, my achondroplastic nose, and my scars is not rational. But the human experience is never purely rational. And self-acceptance is an achievement like no other. Almost every person with achondroplasia has a jarring moment when they see themselves in photos or on film and are reminded that their hands are not at all slender, like most of the hands they see in photos or on film. Or that their hips sway when they walk. Or that their skulls are larger. Learning to live with the shock is a difficult but worthwhile experience. When a mother of a girl with achondroplasia wrote to me, asking about her four-year-old daughter’s future, my family awwwwwed at the photos she sent us. “I remember having an adorable little girl with a forehead like that!” my dad grinned.

I was not nearly so moved by the recently published images of celebrities photoshopped to “reimagine them with dwarfism” next to an image of Peter Dinklage photoshopped to “reimagine him without” because only their legs were modified.

The project itself is thought-provoking, but Daniel Radcliffe simply wouldn’t get into the achondroplasia club with those ridiculously long arms. And Peter Dinklage—whom GQ declared a “stud” in its 2011 Men of the Year list—would have a dramatically different forehead, cheekbones, jaw, and nose.

One of the respondents to my survey who said he would keep his glasses explained, “Not really for aesthetic reasons, exactly, though that’s part of it (and it is fun to buy glasses). But because they’re a part of my face! I’ve never considered contacts, either, come to think of it. They serve some other function, beyond utility and style, I guess.”

Similar feelings have been expressed by people who underwent surgery to remove the sixth finger on their right hand for convenience, while opting against the removal of the sixth finger on their left: “Why would I cut it off? It’s a part of me.”

Syndactyly runs on both sides of my family. One relative remarked about her child, “I was so happy when she was born to see she didn’t have those fused toes!”

To which another relative with fused toes later said, “Why? It hurts a bit more when you stub them, but otherwise, what’s the big deal?”

Replace the word “fused toes” with red hair or monolids or pale skin or dark skin or freckles or whatever intrinsic part of you might somewhere be considered unfashionable and you’ll know a little how dwarfs feel about BMN-111. As with limb-lengthening, BMN-111 threatens to out the uglier feelings some people have about our appearance. We must remember that it’s the feelings that are ugly, not the body.

Talking out my endlessly complex thoughts about a world without dwarfism feels like moving through a labyrinth that is partly of my own making. During one such recent talk, a close friend said to me, “If we could look at a version of you that never had achondroplasia, I understand that you would miss yourself and I would miss you, too. But you would be awesome in a different way that would still be your own way, and it would be without all the pain and complications and danger.”

This is what people with achondroplasia need to hear from those who truly accept them.

I recently read Good Kings Bad Kings by Susan Nussbaum, winner of the PEN Bellwether Prize for Socially Engaged Fiction and several other accolades. When describing it to friends as a story told from the perspectives of patients and staff at an institution for severely disabled minors, I got a common response: “Well, that sounds like a fun read!”

I will perhaps never fully grasp what distinguishes a depressing story that brings you down from a great drama that hooks you from the start. The bestselling books in the English language are about a boy who must face down his parents’ killer, a girl who spends hours in her lover’s Red Room of Pain, and a high schooler who can’t wait to have a monster baby with an emotionally disturbed vampire. Crime shows and novels continue to be wildly popular through the generations. If you turned on the closed captioning for most of the top-grossing films of the last 30 years, you would be reading, “[scary music],” every few minutes.

Why do we embrace all this while believing that a book that starts off with the rants of a teen in a wheelchair might be too heavy to handle?

Of course, realistic portrayals of suffering pack a far more visceral punch than contrived ones. Pirates of the Caribbean and Star Wars will widely be perceived as less distressing than The Piano and Love Is Strange because, despite their carnage, the adventure stories never get inside their victims’ heads. Touchy-feely tales embraced by mass audiences tend to have happy endings, or at least the satisfying downfall of an easily identifiable villain. This is why, as Salon’s book critic Laura Miller has pointed out, a story is schlocky and sentimental insofar as it lies to the audience.

And Good Kings Bad Kings does not lie to its audience. I embarrassingly ended up having to conceal tears streaming down my cheeks while sitting on a bus as I read about one particularly beguiling character who (SPOILER ALERT) dies after getting third-degree burns in the shower due to human error and then catching pneumonia after surgery. I can attest that such a tragic scene is representative of reality, not sheer melodrama. I lived in a pediatric hospital for five months when I was a pre-teen, and the next year I learned that one of my friends had died after his breathing apparatus failed due to human error, and another one had died from catching pneumonia after surgery.

Living at that hospital was far from easy. As I’ve written before, listening to others share their realities in group therapy was one of the most humbling experiences I’ve ever had. But while the human fear of death and suffering is rational and something I never lost, living alongside the patients did knock down many of my fears of illness and disability that were irrational.

Within a few weeks on the ward, I was no longer disconcerted at the sight of head injuries, tracheostomy tubes, stumps, or burned faces. At first I stared. Many of the owners stared back at me and my Ilizarov fixators. We all stared at anyone with a condition we hadn’t seen before. And sometimes we stared at each other’s wheelchairs out of envy. But the constant exposure soon rendered such features as mundane to us as glasses, braces, and freckles. We were used to it. What is the harm in allowing the rest of the world to get used to it, both through inclusion in society and representation in books and film?

As a study published in Science found, reading literary fiction makes you more emotionally intelligent. As The New York Times reported, “This was true even though, when asked, subjects said they did not enjoy literary fiction as much. Literary fiction readers also scored better than nonfiction readers — and popular fiction readers made as many mistakes as people who read nothing.” The results are unsurprising when literary fiction distinguishes itself from popular fiction by avoiding formulas and stereotypes. We’ve already seen that avoiding stereotypes fosters more creative, innovative thinking. Now it makes us better at understanding each other, too.

Indeed, literature provides characters who are realistic because they are just as complex as we all are. Realistic characters don’t make us, the readers, like them. They make us understand them, while simultaneously being a little bothered by them because we recognize their faults and selfish impulses in ourselves. In other words, a great literary feat doesn’t show you good people triumphing over the bad. It shows you how and why we hurt each other.

The harm in Good Kings Bad Kings is not wrought by cackling villains upon innocent angels. It comes from the fear, anger, and selfishness easily recognizable in everyday life. And it is visited upon disabled people who are not dying to escape their diagnoses but who are sick of the condition our society has left them in. As Susan Nussbaum writes in her afterword:

I used to wonder where all the writers who have used disabled characters so liberally in their work were doing their research. When I became a wheelchair-user in the late seventies, all I knew about being disabled I learned from reading books and watching movies, and that scared the shit out of me. Tiny Tim was long-suffering and angelic and was cured in the end. Quasimodo was a monster who loved in vain and was killed in the end, but it was for the best. Lenny [in Of Mice and Men] was a child who killed anything soft, and George had to shoot him. It was a mercy killing. Ahab [in Moby Dick] was a bitter amputee and didn’t care how many died in his mad pursuit to avenge himself on a whale. Laura Wingfield [in The Glass Menagerie] had a limp, so no man would ever love her…

None of the characters I write about are particularly courageous or angelic or suicidal, bitter for their fate, ashamed to be alive, apt to kill anyone because they have an intellectual or psychiatric disability, or dreaming of being cured or even vaguely concerned with being cured.

And that’s what makes realistic portrayals of disabled people so significant. Not for the sake of inspiration porn. Not to make us proud of how good we have it. But to welcome disabled people’s lives, stories, and perspectives into the arts and therein mainstream society.

The assumption that a story about severely disabled characters must be overwhelmingly upsetting is precisely the mentality that marginalizes severely disabled people. If we won’t read their stories because they’re too sad, we’re not very likely to know how to approach them in real life.

And for all its lines about the importance of realistic stories for the sake of galvanizing greater empathy, The New York Times never reviewed Nussbaum’s award-winning book.

Prologue: My three-month-long hiatus from blogging was due to tendon surgery I underwent in January and rare complications that arose from it. I am now gradually returning to work from sick leave and thrilled to be back.

* * *

Medicine has been transforming the fate of human society since the first moment someone bandaged a wound. Bearing this in mind, along with the more recent advances in genetics, I have realized for the past decade or so that there is a future, however near or distant, that promises a world without dwarfism. But what if this world arrives as soon as the next generation?

Pharmaceuticals company BioMarin reported earlier this year the start of clinical trials for a drug called BMN-111. If it ends up doing what it promises, repeated injections could transform the bone and cartilage growth of children born with achondroplasia, essentially curing them of the condition. Could this mean that I might someday belong to the last of the dwarfs?

To be clear, BMN-111 could cure only achondroplasia, the most common type of dwarfism, not the other 200+ types. (So the attention-grabbing name of this article is a tad misleading.) Dwarfism caused by growth hormone deficiency—which affected circus performer General Tom Thumb and most of the actors playing the Munchkins in The Wizard of Oz—has already been cured by hormone injections invented at the end of the last century. But 70% of all dwarfs have achondroplasia. Without us, the small number of people identifiable as dwarfs would become much smaller.

Because I’m a fully grown adult, I can’t ever cure my achondroplasia. But would I have chosen to do so if I could? Were my doctor to offer me a pill that would transform my joints and my muscle tone, allowing me to walk and stand around for longer than an hour without my feet swelling with pain, I would take it in an instant. The same goes for a pill that would endow me with more normal fine motor strength, so that I could open jars and push down sticky buttons and do all those tasks that leave me swearing and/or asking someone else for help. I would gladly have taken a pill that would broaden my Eustachian tubes so that I would stop getting ear infections every year. And I would have embraced any sort of medicine that would have widened my spinal column so that I would never have had to have a laminectomy, and so that I could cook and clean my house without back pain. All of the discomfort and inconvenience I just listed are part and parcel of achondroplasia – parts that limb-lengthening could never alter.

But when I consider a pill that, in ridding me of all that pain, would also rid me of every physical marker of achondroplasia, I suddenly hesitate. My wrists, my feet, my skull, my face would look significantly different from the one I have. The idea of never having had to learn how best to react to being the most physically remarkable person in school, of never having undergone limb-lengthening, of never having lived in an institution with children with all sorts of serious conditions, of never having had to explain my unique history to others – it makes me have a hard time imagining an Emily Sullivan Sanford that is anything like the one I know today. My dwarfism is only part of who I am, but it has been a significant part of who I am. This is why I understand the Little People of America members who balk at BMN-111, put their fingers in their ears and chant, “Go away, go away, go away!”

We must approach the future rationally because our emotional attachment to life as we know it can lead us to delude ourselves with an unrealistic sense of control. History after all demonstrates that future generations will never know all kinds of things we treasure today. Give or take a few centuries, people in our part of the world will most certainly not face the same illnesses, speak the same language, wear the same clothes, eat the same foods, or observe the same traditions we do. Whether we’re debating the politics of Hawaiian Pidgin or that punk’s not dead, we do not get the final say on what future generations will know and what will be lost to the ages.

Identity is a construct, but a construct that is as powerful as any other. As Andrew Solomon writes, “I don’t wish for anyone in particular to be gay, but the idea of no one’s being gay makes me miss myself already.”

Granted, achondroplasia is not merely a difference like a dialect or homosexuality. It is a medical condition that, like diabetes, causes very real physical pain and health risks. I can write with certainty that the vast majority of people with diabetes, while rightfully proud of the obstacles they’ve overcome, would happily rid themselves of the disease. They would celebrate never having to check their blood sugar, inject themselves with insulin, or worry about developing dangerous complications. We can safely make the same assumption about people who have to deal with migraine headaches or deep-vein thrombosis.

But let’s consider a condition that, like achondroplasia, has as many social ramifications as medical ones. I bet most people who wear glasses would gladly take a pill that guaranteed perfect vision. No more headaches, no more pressure sores on the bridge of your nose, no more wondering where you set them down, no more worrying if they break, no more bills! But would they so easily let go of their bespectacled appearance? Comedian Drew Carey, who has not needed glasses since his laser surgery, wears non-prescription glasses to maintain his look.

I surveyed a handful of friends in Europe and the U.S., and most answered that they would indeed take a pill guaranteed to improve their vision, and also that they would never wear anything but sunglasses again. If this scenario ever becomes reality, the movement of the past 100 years to broaden beauty standards to include the bespectacled will begin to fade. The 20% of my respondents who answered, “I would wear non-prescription glasses because it’s a part of my identity,” will belong to a shrinking minority left to fend for itself. They will likely start counting the minutes until they hear something marginalizing like: “Isn’t it great you won’t have to look like a nerd anymore?”

Once again, people with achondroplasia must admit that our distinguishing condition involves far more innate physical complications than simply needing glasses or being gay. Activist Harry Wieder bemoaned the reticence among people with dwarfism to even admit that we are disabled, and he was right to be so critical. Downplaying the pain and surgical risks everyone with achondroplasia faces is a matter of denial. But such denial is often rooted in the worry that others will overemphasize our pain, distancing themselves from us in a way all too similar to the fear and pity that fuels ableism. Such distance imposed by other minorities can break solidarity and lead to hierarchical thinking along the lines of, “At least I’m not like that!”

Anyone who reacts to the idea of BMN-111 ridding humanity of the achondroplastic appearance with a sigh of relief has a problem. It’s a problem we can never afford to ignore. The lessons of diversity awareness and inclusion are priceless. If dermatologists some day offer a cure for vitiligo, Winnie Harlow’s recent successes in the world of modeling will still have only been a good thing.

My attachment to my starfish hands, my achondroplastic nose, and my scars is not rational. But the human experience is never purely rational. And self-acceptance is an achievement like no other. Almost every person with achondroplasia has a jarring moment when they see themselves in photos or on film and are reminded that their hands are not at all slender, like most of the hands they see in photos or on film. Or that their hips sway when they walk. Or that their skulls are larger. Learning to live with the shock is a difficult but worthwhile experience. When a mother of a girl with achondroplasia wrote to me, asking about her four-year-old daughter’s future, my family awwwwwed at the photos she sent us. “I remember having an adorable little girl with a forehead like that!” my dad grinned.

I was not nearly so moved by the recently published images of celebrities photoshopped to “reimagine them with dwarfism” next to an image of Peter Dinklage photoshopped to “reimagine him without” because only their legs were modified.

The project itself is thought-provoking, but Daniel Radcliffe simply wouldn’t get into the achondroplasia club with those ridiculously long arms. And Peter Dinklage—whom GQ declared a “stud” in its 2011 Men of the Year list—would have a dramatically different forehead, cheekbones, jaw, and nose.

One of the respondents to my survey who said he would keep his glasses explained, “Not really for aesthetic reasons, exactly, though that’s part of it (and it is fun to buy glasses). But because they’re a part of my face! I’ve never considered contacts, either, come to think of it. They serve some other function, beyond utility and style, I guess.”

Similar feelings have been expressed by people who underwent surgery to remove the sixth finger on their right hand for convenience, while opting against the removal of the sixth finger on their left: “Why would I cut it off? It’s a part of me.”

Syndactyly runs on both sides of my family. One relative remarked about her child, “I was so happy when she was born to see she didn’t have those fused toes!”

To which another relative with fused toes later said, “Why? It hurts a bit more when you stub them, but otherwise, what’s the big deal?”

Replace the word “fused toes” with red hair or monolids or pale skin or dark skin or freckles or whatever intrinsic part of you might somewhere be considered unfashionable and you’ll know a little how dwarfs feel about BMN-111. As with limb-lengthening, BMN-111 threatens to out the uglier feelings some people have about our appearance. We must remember that it’s the feelings that are ugly, not the body.

Talking out my endlessly complex thoughts about a world without dwarfism feels like moving through a labyrinth that is partly my own making. During one such recent talk, a close friend said to me, “If we could look at a version of you that never had achondroplasia, I understand that you would miss yourself and I would miss you, too. But you would be awesome in a different way that would still be your own way, and it would be without all the pain and complications and danger.”

This is what people with achondroplasia need to hear from those who truly accept them.

If you’ve happened to set aside 14 hours in the last month for Ken Burns’ The Roosevelts: An Intimate History, which aired on public television in the U.S., you know it affords considerable attention to FDR’s disability. Most touching is a 10-minute feature about Warm Springs, the Georgia health spa and rehabilitation center for polio patients, which Roosevelt founded and which soon became his primary vacation destination throughout his political career. Former employees and patients tell of him shaking the hands and asking the names of every patient, swimming alongside them and dunking whoever got within arm’s reach.

His biographer Geoffrey C. Ward explains:

It allowed him to be unself-conscious about polio… I don’t care how magnetic or self-confident you are, or you think you are… At Warm Springs, he could: not wear his braces, and go to the swimming pool, and have everybody see how small his legs were and it didn’t bother him at all because there were people there with worse problems…

He loved being one of them and the number one of them at the same time… To see someone so famous, who suffered from exactly the same problems that you suffered from, meant an enormous amount to all of the people who went there. Most of the people who went there went there mostly out of despair, at least at first. There wasn’t any other place to go. And here was this laughing giant who would kid them, and who would make the kind of awful sick jokes about being handicapped that other handicapped people love, but that you can’t share with anybody else. He loved doing that.

FDR told the staff that all at Warm Springs were equals, and many interviewees point to this as the beginning of his dedication to humanitarian, egalitarian projects. “It is tempting and probably true to say that polio gave FDR the gift of empathy,” says George F. Will. “There was no suffering that he could not in some sense relate to. And also, just as soon as the iron [brace]s were clapped onto his legs, the steel entered his soul. By having to fight through the constant pain of therapy that was unforgiving in its demands and not very fulfilling in its success.”

FDR had intended to market Warm Springs as both a vacation resort and a health spa, hoping the profits from the hotel would fund the rehabilitation center. The hotel ultimately failed, according to Burns’s documentary, “because prospective guests were scared off by the presence of polio patients.” Outside Warm Springs, attitudes toward disabled people were hardly tolerant. When voters elected a disabled president in 1932, 1936, 1940 and 1944, they did so in spite of his disability, not in acceptance of it.

Doctors attested to his physical and mental fitness in newspaper articles that asked, “Is he healthy enough to be president?” When Teddy Roosevelt’s family publicly opposed FDR’s candidacy, TR’s daughter Alice took an ableist tack. Her famously hyperactive father had had the strength and will power to overcome his affliction, she argued, referring to TR’s childhood bout with asthma, while FDR’s paralysis from polio was a sign of his weakness and the reason why he embraced such wimpy social policies.

Both Ken Burns and Geoffrey C. Ward contend that FDR could not be elected today. Ableism was pervasive in the 1930s and 40s, and it was well understood that publishing photographic evidence of his disability—his braces hidden by the podium, his difficulty getting in and out of cars, his regular falls—would be too detrimental to his image. And the press obliged: photos like this one remained out of the public eye. Today neither the media nor bystanders with cell phone cameras afford anyone such privacy.

Appearance is as important as ever to politicians, if not more so, since images in film, in print, on television, and online are countless times more prevalent now than they were in FDR’s time. This ubiquity is both the cause and the result of our expecting to see celebrities up close and from every angle. While Germany distanced itself from the idea of demanding charm and showmanship from its political leaders in the post-war era, America became ever more preoccupied with it, giving more credence to the photogenic Kennedys than to any other presidential family.

The power of representation cannot be overestimated. We all like to be able to identify with famous and successful people because it imbues us with optimism about our own chances for success. We watch documentaries about celebrities’ lives in the hopes of discovering that they are the kind of person we would like, and who therefore would like us, if they ever had the chance to get to know us. Such idol worship, whether severe or mild, is of course ultimately irrational. But it satisfies the emotional need for recognition. If we cannot go on to be president for whatever reason, we can enjoy living vicariously through someone who does.

Ward is right when he speaks of how meaningful it was for ordinary patients with polio to see a sitting president with polio. But it is discouraging to consider that only those who could make the trek to Warm Springs were able to have the experience. And it is discouraging to consider Ward and Burns’ contention with its implication that disabled people today cannot have the experience of seeing a visibly disabled president because the American people will not elect one. Are they right?

In our age of a million media images, we commonly see senators, singers, elite athletes and film stars visiting disabled and ill children to boost their morale. But none of these celebrities are simultaneously as enormously powerful and as visibly disabled as Franklin Roosevelt was. Indeed, no one since his time ever has been.

In researching the history of eminent dwarfs who enjoyed some degree of success outside of freak shows before the minority rights movements of the late 20th century, I observed that most of them were not born with dwarfism. Most, such as Toulouse-Lautrec, experienced stunted growth as the result of an accident or an illness well after birth – well after it would have been socially acceptable for their parents to give them up or hide them away. Such cases account for a very small minority of people with dwarfism, yet they dominated the scene of non-marginalized dwarfs for most of Western history. This got me thinking.

I conducted a crowd-sourcing experiment on Facebook, asking friends to name very famous people with severe physical disabilities. They had to be household names, nothing along the lines of “that little guy on Game of Thrones” or “that comic on that show from the Eighties who had a muscle problem.” The list of responses bore no surprises: Helen Keller, FDR, Beethoven, Frida Kahlo, Ray Charles, Christopher Reeve, Stephen Hawking, Michael J. Fox, Stevie Wonder, Oscar Pistorius. All but two of them—Stevie Wonder and Oscar Pistorius—incurred their disability after infancy. Was this another sign of congenitally disabled people being hidden away? The vast majority (85%) of disabled people become disabled after birth. But the 15% whose conditions are congenital appear to be underrepresented in public.

Does society more readily accommodate those who lose certain abilities than those who never had them to begin with? Anthropologists know that for most of human history any injury or illness without a visible cause was presumed to be the result of black magic or a vengeful deity. From the European mythology of the changeling right up to the Nazi condemnation of genetic “monsters,” congenitally disabled people have traditionally been viewed as non-human and segregated accordingly. Vestiges of this remain in our general tendency to simply not consider congenitally disabled people as potential friends or partners or even peers, in contrast to the conviction that we should stick by our loved ones no matter what befalls them. Pop icon Dick Clark was warmly welcomed back to television as a co-host after his debilitating stroke, but I’ve yet to find a TV presenter in America who was born with a speech impairment like the one Clark developed. I don’t have the funding to empirically test my hypothesis, but you don’t have to delve too far into mainstream media to come up with stories, articles and interviews spotlighting someone who seemed to have it all until one fateful day when tragedy struck – and to notice the comparative paucity of such coverage of people who have always lived that way.

I squirm as I write this for fear of implying that those who become disabled have an easy time of it. Far from it. It would be utterly callous to ignore the often indescribable strain illness and injury can inflict on relationships, and the horrific social isolation that too many patients face. There’s a reason that “fair-weather friend” is a well-known term. And the human fascination with suffering can be more voyeuristic than empathic.

But no matter the motive, such fascination is always accompanied by the unspoken understanding that no one would ever want to become disabled. This is, in essence, the most universal view of disability: Who on earth would want to lose an ability of any kind?

Even as a congenitally disabled person I understand this. I would never choose to erase my dwarfism from my life experience. But I do not like becoming more disabled than I already am. After tendon injuries and surgery to combat stenosis, I miss being able to ride a bike, to walk barefoot, to cook and type and sit on benches for long periods without pain. And if tomorrow I were to lose my ability to hear, see, or walk, I would be distraught, to put it mildly.

But in voicing this, it is crucial for me—and everyone listening to me—to recognize that my becoming deaf would be a profoundly different experience from that of my friend who has been Deaf since he can remember. Many Deaf people with cochlear implants have told of how overwhelmingly unpleasant hearing sound for the first time can be: One man has “discovered that, far from being adorable, the voices of his grandchildren were rather shrill and often best experienced with the implant turned off.” The comic strip That Deaf Guy tells of the authors’ son pitying people who don’t know how to sign.

Similarly, those who have always needed a wheelchair to get around tend to see it as no worse than needing shoes to get around. Yes, it’s inconvenient in a world where ramps are all too rare, just as it would be inconvenient for those of us who are ambulatory if most public facilities didn’t accommodate the shoes on our feet. But that difficulty is imposed by a society that fails to accommodate certain minorities, not by the disability itself. Congenitally disabled bodies do not notice what they lack. As so many have said before me, How can you miss something you never had to begin with?

Researching all of this has brought me to the following conclusion: As individual humans, it is harder for us to deal with becoming disabled than with being born disabled. But as a society, the reverse is true – it is harder for us to accept someone who is born disabled than someone who has become disabled.

As a result, those who were born disabled and those who have become disabled often find themselves on opposite ends of the argument. A woman like Stella Young, who has never been able to walk, is rightly insulted when people tell her she is brave and inspiring just for getting up every morning. (Her TED Talk below is worth every minute.) But a woman like Christine Miserandino, who is slowly losing the ability to walk, is rightly seeking others’ encouragement and support as she struggles to do something she once took for granted. (Her oh-so-quotable Spoon Theory has already been linked on this blog before.)

Because the majority of disabled people are like Miserandino, not Young, the discourse on disability is dominated by sympathy, fear and lamentation. It is hard for us to remember that we shouldn’t pity a woman with cerebral palsy for her spasticity when so many people with multiple sclerosis openly mourn their loss of agility. Those who become injured or ill are entitled to their grief and no one should ever attempt to silence them. But everyone should think beyond their own experience before they publicly decry their condition as unbearable. Especially when it ends up joining the chorus of ableism led by non-disabled people.

One of the most read articles at The Atlantic this month is a piece by bioethicist Ezekiel Emanuel explaining why he hopes to die at age 75:

[Living too long] renders many of us, if not disabled, then faltering and declining, a state that may not be worse than death but is nonetheless deprived. It robs us of our creativity and ability to contribute to work, society, the world. It transforms how people experience us, relate to us, and, most important, remember us. We are no longer remembered as vibrant and engaged but as feeble, ineffectual, even pathetic.

Emanuel is an amateur mountain-climber and a professional writer. He is entitled to feel upset at the idea of losing the abilities he currently holds most dear. And his other arguments about the drawbacks to longevity are as thought-provoking as physicians’ personal opinions on life-saving interventions. But his decision to openly denounce dependence and weakness as unproductive and undignified increases the lifespan of our culture’s ableism. How can we ever stop marginalizing disabled people if we continue to openly voice our fear of becoming like them?

The experiences of those who are born disabled and the experiences of those who become disabled are profoundly different and equally valid. Simply remembering that would change a lot.

Imagine your body just as it is, in a world that looks exactly like the one we live in, but for one crucial difference:

You’re at a dinner party with some friends and some new faces, and you excuse yourself before heading to the bathroom. When you return to the table, you notice a few people exchanging looks. You wonder if your friends explained your absence to those who don’t know you well. You’re not sure if you should explain it yourself. Do you owe it to them? You’re not embarrassed, but they look a bit embarrassed for you. Then again, maybe you’re just being paranoid? You’re not really in the mood to get into it, and maybe these people would find it inappropriate dinner conversation. Maybe they’re the kind of people who would cringe, and you’d rather not discover that about them just now because the evening has been going pretty well and they seem nice so far.

Just when you decide not to say anything, the woman next to you asks, “Is everything okay?”

You reply, “Um, yeah.”

“Why did you get up and leave? Do you smoke?”

“Oh, no. No, I had to use the bathroom.”

“Oh… Wait, you mean like… you’re one of those… um… what’s the word for it?”

“Yeah. I’ve got frequent excretion syndrome. I have to use the bathroom a couple times a day.”

If 99% of humankind evolved in a way that they only needed to excrete once a week—as ball pythons do, for example—then modern society would look pretty different. The number of toilets available in public facilities would decrease significantly. A home with a bathroom would not be unheard of, but it would be a bit of a luxury, like an apartment building with an elevator. No one would assume that dinner at a restaurant or a friend’s house would guarantee access to such facilities. And the 1% of people who still needed a bathroom a few times a day would be considered disabled.

Eventually debates would crop up as to whether needing to excrete so often is “defective” or “just different.” There would be arguments as to who should accommodate whom: Should society provide more bathrooms, or should the minority wear diapers? Would you date someone who did? It must be so hard for parents and partners to deal with someone like that! I read on the Internet that those freaks do it in the shower! I would never get in a pool with one. You shouldn’t let your kids near them!

If you lived in this world, where most people’s bodies did not need to excrete more than once a week but yours did, you would undoubtedly experience frustration, as most disabled people do. But the source of your frustration would depend upon how you got to be the way you are.

If, after an accident, you suddenly belonged to a small minority of people who needed a bathroom more than once a week, you would experience a good deal of stress adjusting to your new schedule. Losing an ability you had taken for granted would feel unfair. Life was so much easier before this happened! Why me?! Some would argue that they would rather die than live like that. Depending on your support network and self-image, you might join that argument. But no matter how accepting your friends and family were, you would probably struggle with some internal shame about being less independent.

But if your body had always functioned that way for as long as you could remember—as it presumably does in the real world—most of your problems would stem from how alien the majority would make you feel. In the real world, we can all admit that needing a bathroom a few times a day can be inconvenient, especially on car trips, but it doesn’t feel “wrong,” “sick,” “crippling,” “freakish,” or “sad.” In a world where you’re the minority, you might accept the idea of wearing diapers rather than demand more bathrooms be built for you, or you might be deeply insulted by it. You might decide to combat the stigma of diapers. You would likely be upset hearing people say they would rather die than live like you.

That’s the difference between people who are born disabled and those who become disabled. The latter understandably experience stress, sometimes trauma, adjusting to a new condition. The former rarely feel the need to miss what they never had to begin with. Society likes to offer both groups pity, but the two groups often respond to that pity differently because their experiences differ so greatly. As we’ll see next week, conflict can only be avoided if everyone involved—those who were born disabled, those who became disabled, and those who are non-disabled—tries to understand the others’ point of view.

If you haven’t caught it already, Jonathan Novick’s video documenting his experiences in public as a person with achondroplasia is worth your time. Having grown up in a small town where almost everyone knew his backstory, Novick found his move to New York City a rude awakening to the problem of street harassment. A day out and about, recorded by a hidden camera, features strangers shouting at him from afar, “Hey, short stuff!” “What is he?” “Little midget! Big man, big penis!” A few ask him, “Have you ever been on TV?” “Are you on that show with the little people?” “Can I take your picture?” Two people walk by while photographing him, without asking for permission.

Although I did not undergo limb-lengthening to blend in (more on that here), it has undeniably spared me a lot of this unpleasant commentary which so many dwarfs endure, and which I used to endure as a child. Writing from the U.K., Eugene Grant’s blog demonstrated last year that Novick’s tales of being incessantly photographed and called “Big man!” are far from rare. On Tumblr a college student reported this incident last September:

Walking home from coffee, a random car driving by yelled, “Slut” out their window. I’m not sure who it was directed toward. I was technically showing more skin than the other two in our party, but I also am the height of a 9 year-old and from a distance in the dark it’s hard to determine my age.

Either way assholes are assholes.

This is what sociologist Lisa Wade has called the burden of not being able to assume it’s not about you. This is a burden most people who are visible minorities carry with them. In a review of a street photography project by an artist regularly harassed for being fat, Wade explains:

The truth is that [she] often does not know what’s going on in the minds of her subjects. Yet, because she carries a body that she knows is disdained by many, it is perfectly reasonable for her to feel like every grimace, look of disgust, laugh, shared whisper, and instance of teasing is a negative reaction to her body. In fact, this is how many fat people experience being in public; whether they’re right about the intent 100% of the time is irrelevant to their lived experience.

And this is how people of color, people who speak English as a second language, disabled people and others who are marginalized live, too. Was that person rude because I speak with an accent? Did that person say there were no vacancies in the building because I’m black? Was I not chosen for the job because I’m in a wheelchair? Privilege is being able to assume that the person laughing behind you is laughing at something or someone else, that the scowl on someone’s face is because they’re having a bad day, and that there must have been a better qualified candidate.

While I’ve had my fair share of strangers asking about my scars, hands, and gait, they usually have to be particularly nosy in order to take notice of these features in the first place. This happens to me a lot more often in certain rural areas than in the urban setting I call home.

This is why the small town vs. big city debate isn’t quite as simple as Novick presents in his film. I understand the idea that extraordinary-looking people can benefit from living in a close-knit community, where most are already aware of your condition and don’t need you to explain it to them. Conjoined twins Abby and Brittany Hensel’s parents have also claimed their daughters benefited from this. But plenty of people who belong to minorities can attest that small towns do not always embrace diversity in their community. And while there are tremendous advantages to an atmosphere where people are outgoing and unrepressed, there is a fine line between friendliness and nosiness: In places where everyone knows everyone’s business, the assumption that everyone has the right to find out what they don’t know about you can be pervasive. In the choice between small-town gossip and big-city street harassment, I’d choose neither.

In my experience, what matters is not the size of the place but the culture. Cities do not have to be hostile environments of street harassment, and villages do not have to be breeding grounds for judgmental hearsay. As Novick says, “I’ll ask that the next time you see someone who is different from you, think about their day. Think about what their day might be like… And then think about what part of their day you want to be.”

And I was only slightly startled to find nothing but solipsistic snickering and overdone puns. The Atlantic doesn’t win any points for ending the article on a pun, either. But praise is due for addressing the topic at all. Based on an extensive interview with Dr. Marylou Naccarato, who has Kniest dysplasia, the article takes a wonderfully sex-positive approach to the experiences of people with dwarfism and the physical obstacles they can face in bed.

As with nearly every feature on dwarfism in the mainstream media, there are some factual errors. For example, one dwarf couple is quoted claiming that people with achondroplasia require “no medication, surgeries, special needs, nothing.” (See here for a list of the many complications we are at risk for.) But Naccarato is doing great work that is revolutionary in light of the fact that Little People of America, and probably most disability advocacy organizations, repeatedly shy away from the topic of sexuality.

A simple reason for their silence is that almost all disability organizations comprise just as many parents and relatives of disabled people as disabled people themselves. And who wants to debate the best way to masturbate with Mom or Dad sitting next to you? A more sinister reason for the silence is one of the building blocks of modern prejudice against disabled people: the presumption that they are innocent, and therefore asexual. Most positive portrayals of disabled people are cute and cuddly. Is that the only way society can accept us? Refusing to see a minority as anything but asexual is to deny them their full humanity, on par with slut-shaming, prude-shaming, queer bullying, and objectification.

Before I go any further, let me say this: I do not want to talk publicly about what I do in the bedroom and I do not want to know what you do in the bedroom. My firm belief in sex-positive feminism and equality does not mean I think that you are sexy or exciting or impressive. Unless we’re close confidantes or I’ve indicated otherwise, please assume I don’t want any mental images of you and your naughty bits, no matter what they look like.

That said, I fully support anyone’s right to desire any sort of consensual sex imaginable. Without double-standards. Without the pressure of competition. Without the nuisance of others turning their personal preferences into rigid rules.

Take, for example, the way virginity is so frequently turned into not just a game but a high-stakes tournament. When and how you lost it is something all of us are expected to base much of our identity on, even as adults. This is despite the fact that, medically speaking, virginity doesn’t exist. After all, what kind of sex does a guy have to engage in to officially “lose” it? And what about girls born without hymens? When exactly do lesbians lose their virginity?

Like race, virginity is a social construct and, in the words of a very wise person on Tumblr, what can be socially constructed can be socially changed. Last year the great Tracy Clark-Flory interviewed acquaintances about the sexual experience they considered to be their “first time.” The glorious thing about her inclusive project was that it revealed human sexuality to be just as diverse as everything else about us. Some defined their first time by their first orgasm, others by a particular first touch or experience of being touched. The problem with her stretching the definition of “losing your virginity” so broadly is that it robs competitive, insecure people of their ability to set standards with which they can gloat and put others down. Wait, no. That’s another glorious thing about it. There really is no problem with recognizing everyone’s experience as equally valid.

Failing to include everyone not only causes unnecessary humiliation, but it causes us to miss out on opportunities for true enlightenment. To quote the authors of You Can Tell Just By Looking: “Sexual minorities—people whose sexual desires, identities, and practices differ from the norm—do a better job talking about sex, precisely because they are constantly asked to explain and justify their love and their lust to a wider culture and, even, to themselves.” The more you examine harmful traditions, the less necessary they become.

This does not mean that minorities have better sex. Indeed, too many activists in the sexual revolution end up repulsing readers and listeners when they allow pride in their sexuality to devolve into arrogance, insisting their sex life is better than yours, rather than merely different. For a year, the BDSM club at my alma mater ran the slogan: “I do what you’re scared to fantasize about.” Not helpful. And kinda pathetic the more you think about it.

I will never judge someone for liking any particular kind of consensual sex, but I will judge anyone who tries to turn sex into a competition to calm their own self-doubts. Whether you’re a wise-cracking online commenter or a sex-positive pioneer, true sexual liberation is about moving beyond the middle school clique mentality, not indulging in it. It’s pretty much the least attractive thing there is.

Tonight 60 Minutes will feature the very first interview with the Australian couple who have attracted international scorn ever since the Thai woman they hired to be their surrogate mother publicly accused them of adopting one of the twins she gave birth to while refusing Baby Gammy, the one with Down syndrome. Hiring a surrogate mother who lives abroad is both legal and unregulated in Australia, with none of the criminal background checks or counseling that are required for domestic surrogacy arrangements.

The Digital Age has seen the rise of prospective parents independently seeking out surrogate mothers online without any oversight, as well as a rise in “re-homing,” wherein adoptive parents join Facebook or Yahoo groups to seek out new parents for a child they’ve decided is harder to handle than they had thought. A disturbing Reuters report last fall profiled a couple who handed over a girl with medical problems they had adopted from Liberia to a new family they had found online, only to later discover that the new parents were known sex offenders.

Yet while black market adoption may be on the rise thanks to the Internet, the history of people rejecting only certain kinds of children is depressingly long. Only 2% of all babies born are disabled, yet half of the children up for adoption in the United States are disabled. Half of them are also black. Chad Goller-Sojourner told NPR this year that prior to his adoption by a white family, he was passed over by more than one black couple for being “too dark.”

I am deeply grateful that my parents did not put me up for adoption, like so many parents of dwarfs before them. Being rejected by your own parents simply for your body feels like a rejection of your very life. But I will not start chanting that parents should never ever make adoption plans for their children until we admit that not everyone is capable of being the sort of parent certain children need. The skills required for accepting your child’s skin color or body shape are not the same skills required for accepting a lifetime of waiver agreements about the deadly risks of invasive surgery. In the real world, some marriages do break down and some parents do become abusive and some parents do murder their half-grown children when they try and fail to cope with their child’s disability. I know a good number of people who are great at working independently but terrible at caregiving. In Far From the Tree, Andrew Solomon profiles a British woman who eventually relinquished custody of her severely disabled daughter to a foster mother, telling the NHS, “I’m not the right mother for this child.” Such honest humility requires some degree of bravery and, as Solomon points out, honors the skills of the foster mother and all parents who keep their commitments to disabled children.

Do some parents give up too easily? Absolutely. But are some children better off far away from their parents? Evidently. Because no two parents are alike, what is best for the child is best decided on a case-by-case basis. The Australian case sounds dreadful, but I’m withholding judgment until the parents have had their say. And as long as there is reproduction, there will always be parents who put their children up for adoption or terminate pregnancies, and society must thus ensure that the means for doing so are absolutely safe and heavily regulated.

But we cannot deny that too many parents end up failing to support certain kinds of children because the society they live in fails to support such kinds of people. Parents can usually see through the B.S. of those who urge them to stand by their kids no matter what and who also regularly make disparaging remarks about scars, fat, or dark skin, and openly wince at the idea of looking like a freak, a wimp, or a pussy. We won’t ever lower the disturbing number of prospective parents who would reject a child with an extra finger or toe until we as a society confront what would cause a parent to think that having an extra finger or toe is too horrific to endure.

During a discussion in college about the individual’s right to make their own medical decisions, I was shocked to hear a bunch of my friends insist that they would rather die than lose the ability to walk. Is it possible to attach such extreme shame to a hypothetical situation for yourself without attaching shame to the situation of others who live that way every day?

When I told one of my fiftysomething mentors about how upset I was by the incident, she smiled and said, “Well, that’s something young people are certainly more likely to say than anyone else.”

A fortysomething friend piped up, “Yeah, that is a very young person thing to say. I swore when I was young that I’d shoot myself if I ever went bald and yet here we are!”

Indeed, while the strains of physical pain and special accommodations and repeated doctor’s appointments are very real, perfection is not. And no matter how far technology advances, the belief that we can guarantee ourselves “normal” children is delusional. After all, unlike Baby Gammy and me, 85% of all disabled people were not born disabled. That’s something to bear in mind when heading to the obstetrician’s or the adoption agency.

A South Carolina woman was arrested earlier this month for allegedly letting her 9-year-old daughter play alone in the park while she went to work at McDonald’s. The mother had given her daughter a cell phone for safety’s sake, but a concerned stranger’s call to Child Protective Services led to the mother’s incarceration and loss of custody. Bloggers on both sides of the political spectrum are outraged over what they are calling a case of helicopter parenting gone mad. On Twitter, stories of “When I was a kid…” abound.

I wholeheartedly share their shock and dismay. (Seriously, couldn’t CPS have merely talked to the mother and helped her find a friend or a caregiver whose home could be a base for the girl during mom’s eight-hour shift?) But I am concerned about the mounting vitriol aimed at those whose job it is to protect the child. I grew up among social workers. And these bloggers, while rightfully critical, are failing to acknowledge that the mind-your-own-damn-business mentality they advocate is exactly what prevails in societies where everyone looks the other way when a child is neglected or abused.

Of course there are terrible social workers out there, just as every profession has people who should really be working elsewhere. More importantly, it is dangerous to pretend that institutionalized bigotry does not exist. A 2012 report revealed that ableism appears to be a tremendous problem at CPS, with many disabled parents living in fear of being declared incompetent by social workers with a poor understanding of their abilities. In the South Carolina case, it seems reasonable to postulate that two of the American South’s most infamous cultural institutions—classism and authoritarianism—are what led to a cruel and unusual punishment doled out for what was, at best, a misdemeanor by a working mother.

But while attention to this case is warranted, news outlets tell real-life tales of wrongly accused parents to such an extent that one would assume most actions by CPS are unjustified. The media bias tends toward parents because parents are legally allowed to talk publicly about their children. Were a social worker to attempt to tell his side of the story, he would be breaking the law. And children and families grateful to CPS for repairing broken homes rarely head to their local news station to rehash their past personal struggles.

We must acknowledge and condemn every instance of misconduct by social workers, just as we must acknowledge and condemn every case of medical malpractice, and of police brutality. But unlike doctors or police officers, social workers do not enjoy a wealth of Hollywood blockbusters and TV shows glamorizing what they do. Most portrayals in film and on television are fiercely unflattering: from the soulless bureaucrat too obsessed with rules to know love when she sees it, to the more sinister instrument of a government conspiracy to threaten political dissidents by taking away what they hold most dear. These stereotypes invariably evoke sympathy for the devastated parents and children, who wish those heartless busybodies would just learn to stay out of other people’s business. Rarely are social workers featured fighting the good fight.

And yet, that’s what they are there to do. Not to get a thrill from ripping crying kids away from their distraught parents, but to listen to every member of the family until they understand the source and extent of the problem. While pop culture promotes individual therapy as a path to wellness on par with yoga or meditation, the idea of family therapy tends to be seen as an outrageous invasion of privacy imposed by some glaring ice queen who is just waiting for the parents to slip up. Yet adept social workers know that the parents of neglected children sometimes have significant learning disabilities or were the victims of abuse themselves. When funding allows, parenting courses are available for those who have a hard time remembering how often diapers need to be changed, or that there are often alternatives to screaming and spanking. Adept social workers also know that neglected children are often overly forgiving of an abusive loved one, just as victims of domestic violence often are. And adept social workers know that children are far more likely to be abused, molested, or kidnapped by a member of their family than by a stranger. As with women, the most dangerous place for a child is their own home.

When I was an 11-year-old on Long Island, there was a report that a girl my age named Katie Beers had been kidnapped from a local arcade where I’d attended birthday parties. The perpetrator turned out to be a friend of the family, who kept her locked in his basement for 17 days. When he broke down and confessed to police, Beers was not returned to her mother, but placed in a foster home. I clearly remember the mother’s tearful face plastered across the headlines: “I just got her back and now they’re taking her away from me!” CPS investigators had discovered that, prior to the kidnapping, Beers’s mother had left her for years in the care of her godparents, where she was treated “like a slave” and repeatedly raped by her godfather. Beers writes today that she was ultimately relieved to be placed in foster care and that, had she not been taken out of her home, she never would have graduated high school, let alone college.

When it comes to the legal rights of the child versus the rights of the parent, the court of public opinion will always be fueled by vitriol. Family court, of course, should transcend this, putting reason and research first and foremost. CPS is undoubtedly rife with problems, many due to its miserable lack of funding. But we as a society will never put forth a sincere effort to endow social workers with enough funding to do their job well until we truly value what they do in the first place.

* Please note that while my sympathy for the social worker’s perspective is inspired by what I’ve learned from those I know, the views and conclusions expressed here are mine and mine alone.

Leaving you this holiday weekend with the brilliant Maysoon Zayid, whose TED Talk above includes myriad revelations well worth your time, among them:

One fun fact I learned while on the air with Keith Olbermann was that humans on the Internet are scumbags. People say children are cruel, but I was never made fun of as a child or an adult. Suddenly, my disability on the world wide web is fair game. I would look at clips online and see comments like, “Yo, why’s she tweakin?” “Yo, is she retarded?” And my favorite, “Poor Gumby-mouth terrorist. What does she suffer from? We should really pray for her.” One commenter even suggested that I add my disability to my credits: screenwriter, comedian, palsy…

Disability is as visual as race. If a wheelchair user can’t play Beyoncé, then Beyoncé can’t play a wheelchair user. People with disabilities are the largest minority in the world and we are the most underrepresented in entertainment.