Category Archives: Mental Health

The word ‘crazy’ has the remarkable power to instantly render invalid whatever person, perspective, or practice it is applied to.

It suggests behavior that is illogical or irrational; that is so unpredictable as to defy the bounds of ‘normal’ human reason. It therefore invalidates through implicit othering — crazy people cannot be reasoned with, their behaviors can be neither interpreted nor explained, their beliefs carry little more meaning than noise.

Perhaps this is why ‘crazy’ is typically used as a pejorative.

Yet, the beliefs and behaviors that are deemed to be ‘crazy’ change over time. They are continually interpreted and reinterpreted to fit the narratives of the day. Madness, in other words, is a social construct.

Foucault documents this in detail, pointing to stories of the mad, insane, and crazy that seem absurd to our modern sensibilities. Scientifically defended theories of hard bile and hot blood, concerns over contagious epidemics of women’s ‘hysteria,’ illness interpreted as a failure of morality.

Again and again in the West, cognition and behavior have been interpreted through a narrow normative lens: anyone who thinks or acts outside this framework is taken to be crazy.

‘Crazy’ then, is perhaps better understood not as a property of a person, but as a property of society. To call something crazy is to place it outside the bounds of standard social norms, to say that it is too far out there to be reasoned with rationally. It is the intellectual equivalent of throwing up your hands and declaring there is nothing to be done — a reasonable person simply cannot engage with crazy.

Yet, its very nature as a social construct raises the question: who determines what is crazy? Creative works are full of stories in which those deemed mad are perhaps the only reasonable ones. The French film King of Hearts, for example, contrasts the world created by asylum inmates with the brutal and senseless killing of World War I.

I find myself particularly drawn to the word ‘crazy’ because it is inexplicably gendered. It’s not quite as causal as the relationship between old and spry — but women are much more likely to be described as ‘crazy’ and the word has a long history of being used to discredit women and their experiences.

Given my description of ‘crazy’ above, this makes sense — if you can’t reason with someone who is crazy, if you can’t meaningfully interpret their words or actions, then you are free to dismiss their claims. There is simply nothing to be done. In this sense, the epithet intrinsically provides authority to the person using the word while diminishing the power of the person it’s applied to. It’s actually quite a brilliant tactical maneuver.

For this reason, many people prefer to avoid the word ‘crazy.’ There are other good reasons to avoid it, too — as you may have already inferred from the shaky language of this piece, ‘crazy’ has a deeply problematic tendency to casually lump together several different concepts. It dismisses mental health challenges, disparages neurodiversity, and glibly ostracizes any deviance from the supposed norm.

Yet — as someone who is ‘crazy’ along multiple of these dimensions — I find the word can give me power, too.

I wrote above that ‘crazy’ locates a person outside the bounds of the ‘norm.’ I think that’s true, but — I don’t find that the word itself places a normative judgement on that positioning. That is, we interpret ‘crazy’ to be bad because we implicitly assume that being outside the norm is bad. We accept that crazy people cannot be reasoned with because we implicitly assume that people who are outside the norm cannot be reasoned with. We feel embarrassed or ashamed when labeled as ‘crazy’ because we implicitly assume that falling within the norm is good.

I reject those claims.

For one thing, I don’t really believe in ‘normal.’ We are all crazy. But more deeply — what we generally take to be ‘normal’ only refers to an idealistic conception of a small slice of humanity. Why should any of us fall over ourselves trying to fit into a norm that doesn’t exist?

I refuse to feel shame for who I am.

In that sense, I find being labeled crazy to be quite freeing, actually. Oh, you thought you could diminish me by saying that I exist outside the norm? Oh, no no no, my friend – this is where I thrive.

Being crazy means being free to discover and create yourself, it means not worrying about conforming to the norm, and it means not letting anyone dictate your truth for you.

To be clear, there are still plenty of other things to worry about. I hardly mean to suggest that nothing is true and everything is permitted. Rather, the types of things one ought to worry about — being good, compassionate, respectful — are very different from trying to be ‘normal’ or trying to fit someone else’s mold of who you should be.

And that, perhaps, is the best thing about accepting the mantle of crazy: it gives other people permission to be crazy, too. When we shy away from talking about mental health, when we assume a neurotypical view, when we accept ‘crazy’ as a personal fault, we implicitly reinforce the idea that these are somehow shameful or wrong.

Embracing and even showcasing those pieces of ourselves not only can be personally fulfilling, it implicitly sends the message: None of us should have to hide who we are.

So that is why I frequently choose to refer to myself as ‘crazy,’ why I tend to talk about my thoughts, actions, choices, and diagnoses with such levity. I cannot hide who I am, and more than that — I don’t want anyone else to do so either.

So, though it may defy all norms and reason, I will continue to describe myself with that word. I will continue to think my crazy thoughts, act on my crazy impulses, and aim to be the best person I can be with no regrets for the fact that person will never be ‘normal.’ And I will do my best to create spaces where others feel they can genuinely do the same. I feel no shame or hesitation in this commitment; it is simply who I am: a total crazy person.

I woke up this morning to a flood of #MeToo comments, as women from all spectra of my life stepped forward to share their personal stories of sexual abuse and harassment; stories of being silenced, of not being believed, of being told it was their fault, of normalizing the incessant stream of misogyny.

It’s a powerful campaign, and – at least in my feeds – has successfully emphasized how widespread these experiences are. Yes, all women have suffered some form of abuse or harassment. All women.

To be honest, it was more than I was prepared to handle on a Monday morning. Of course, I am glad to see these issues gaining mainstream attention, and I’m hopeful that the current wave of shock and indignation will ultimately lead to greatly needed change. But…reading about sexual assault and harassment is not my ideal way to start the day.

I don’t want to think back to remember just how young I was when I was first harassed. 9? 10, maybe? Before then, my memory is too fuzzy to be reliable.

I don’t want to figure out how old I was when strange men regularly took to following me down the street, making comments far too inappropriate to be repeated here. I may as well ask how old I was when I started going out alone – harassment is so indelibly intertwined with the way I experience the world.

And I certainly don’t want to think back to my own stories of assault. Stories I’m barely prepared to whisper privately, much less share publicly. I don’t have energy for that today.

I don’t want to think about such things, and I don’t want to relive such things, except on my own time on my own terms. I’m glad to see so many women empowered to share their stories, and I’m glad to see so many men seeming to take their words seriously. But at the same time, I just want to yell:

YES, OF COURSE, ME TOO.

These experiences happen to all women, and it shouldn’t take a flood of “me too” for us to admit we have a problem. We shouldn’t be forced to relive our traumas, or prove our traumas, or justify our traumas. It shouldn’t be solely on women to fight this battle. We shouldn’t have to say, “me too.”

Ringing in my ears are the words of Shakespeare’s Desdemona. Shortly before her husband murders her in retribution for imagined infidelity, confronted with increasing abuse and a situation beyond her control, she shakes her head and sighs:

Oh, these men, these men.

Desdemona is caught in a double-bind. She can neither speak up nor stay silent. She is alone and truly powerless to act.

There are so many levels of horror to assault. The act itself is an abuse beyond accounting, but there’s also the fact that many women are assaulted not by some shadowy stranger but someone that they know. Many women are forced to live in contact with their abuser, picking up the pieces of their life as though nothing happened at all. Seeing him succeed in life while leaving behind a restless wake of harassment charges.

Too often, the actions of these abusers are an open secret. Everybody knows. Women try to warn each other off, knowing that open complaints will only result in retribution while doing nothing to harm the assailant. The men know, too. Nothing is done.

And that big nothing only makes it more clear who has the power, who is protected. Women continue to suffer, surrounded by a sea of men who claim to care but who fail to act.

Scattered in among the “me too” posts have been a number of men offering their support and solidarity. I appreciate many of these. I know a lot of genuinely good men who I’m glad to see in the fight.

But, there’s also another type. As one Twitter personality put it: “ok. its happened. a man who sexually assaulted me has faved another woman’s tweets about calling out harassment and assault.”

I’m hardly surprised. I know some of those men, too.

The problem isn’t a few bad apples who go around assaulting women at the drop of a hat. It isn’t simply about identifying the most virulent harassers and bringing them to justice.

The problem is a culture in which men feel entitled to sexual attention; in which they commit abuse without even knowing it. Or, at least, without the slightest acknowledgement that their actions were problematic.

As long as assault and harassment can be written off as “boys being boys,” as long as it’s a possibility that “she was asking for it,” as long as men fail to call each other out for inappropriate behavior and allow abusive men to go unchallenged amongst us, we will perpetuate a culture of abuse – no matter how many women come forward to share their stories and to say, yes –

I’ve been writing a lot recently about two potentially conflicting views.

On the one hand are scholars like John Dewey and Maurice Merleau-Ponty, who see the self as something largely or entirely created by others. As Merleau-Ponty writes, “I am a psychological and historical structure.”

On the other hand is the modern yearning for “authentic” selves – for me encapsulated by scholars such as Kenji Yoshino, who sees the suppression of the authentic self as a civil rights issue, one that disproportionately affects minority populations.

These views perhaps seem like they’re in conflict: how can one express their authentic self if their authentic self isn’t their own creation? Furthermore, there are a host of other questions: what if your authentic self is a terrible person – is it still good to be authentic? Surely, your “self” – if such a thing can be said to exist – doesn’t exist in some static state, waiting for you to discover it, so no matter how much agency you put behind the notion of “self,” the idea of finding it seems foolish.

I have more thinking to do on this, to be sure, but I’m not sure these ideas are in as much conflict as they seem on the surface. I can be changing and co-created and still be. Furthermore – and perhaps this comes from Yoshino’s framing of authenticity as a civil rights issue – I can’t shake the feeling that there is something important there. Saying an authentic self doesn’t matter does an injustice to the people who have fought so hard to express themselves.

I see ‘self’ as intrinsically linked to agency.

The question of self is deeply important to civil society – after all, what is a society if not some collection of self-like beings seeking to coexist? An ideal society built with the notion that we are each discrete pockets of uniform consciousness would look quite different from one in which ‘self’ is conceived entirely as a social construct: there is no self, only interactions, and the separation between ‘I’ and ‘you’ is much smaller than we’re currently inclined to think.

So the question matters, yet I haven’t yet stumbled upon my answer.

I love the imagery of interconnected selves, of a ‘self’ that loses substance if separated from the world; but I cannot fully abandon the headstrong, ego-centric notion of self which says: I am a person. I exist.

This thought has perhaps become bastardized by generations of egotistical posturing, but for the oppressed, it is something profoundly radical. And this, perhaps, is why I can’t let my notion of the ‘self’ go. When society says you don’t matter, when it says you’re nothing, you’re no one, it is this concept of the self which quietly stares back: I exist.

As author Adam Grant argues, our “authentic selves” would most likely do and say things that we – and everyone around us – would regret in the morning. Being true to yourself becomes rather ignoble if your authentic self is deeply flawed.

Rather than being authentic, Grant urges that we aim to be sincere. “Instead of searching for our inner selves and then making a concerted effort to express them…Pay attention to how we present ourselves to others, and then strive to be the people we claim to be.”

This is an interesting argument, but I’m not convinced there’s an inconsistency in being true to your authentic self and having a malleable social self.

First, dismissing the value of an authentic self seems to very much come from a position of privilege. If being authentic means nothing more to you than blurting out every thought that passes through your head, then your authentic self does not need to be found.

In Covering: The Hidden Assault on our Civil Rights, legal scholar Kenji Yoshino examines the disproportional social and legal pressures some people face to hide their authentic selves. And this ‘covering’ can do real, psychological damage. Our laws have come to protect people from certain overt forms of discrimination – you can’t fire someone because of the color of their skin or because of their gender. But, Yoshino points out, you can force them to cover.

You can forbid certain hairstyles, for example. In fact, it’s perfectly legal for employers to ban hairstyles predominantly worn by African American women. Don’t Ask, Don’t Tell banned service men and women from expressing their sexual identity for nearly two decades. Yoshino provides numerous other examples of legal precedent which supports the suppression of minority identities in favor of the norms of white, straight identity. (Yoshino also argues that women face the particular challenge of being told to “act like men” in the workplace while also being told to be ‘feminine’. Employers can even mandate that women wear makeup or otherwise alter their appearance.)

This is what I think of when I hear ‘authentic self.’ I don’t imagine there’s some isolated island of ‘me’ that I need to discover and remain statically true to in order to be virtuous. Rather, it means there are some elements of my identity which are fundamental to who I am, and losing those elements, or having them submerged by society, is harmful to me.

I don’t see such an idea as being in conflict with the idea I’ve been writing about much of this week: that a ‘self’ is more a reflection of social interactions than it is an isolated entity.

A self can be co-created and still have distinctive qualities which are worth being authentic to.

I can joke with one group of friends and be serious with another; I can show different sides of myself and express myself in different ways. I can have different types of relationships with different types of people – and I can sometimes even keep my mouth shut so as to not say something inappropriate. None of that is inconsistent with being authentic. None of that is inconsistent with striving to be the best person I can be. And none of that is inconsistent with the idea that the core of who I am is formed, not as some Athena sprung from my head, but mainly by my interactions with others.

I have complained before about the common solution to the so-called “confidence gap” – that those with less confidence (typically women) should simply behave more like their confident (typically male) peers.

There’s a whole, complex, gender dynamic to this conversation, but even putting that issue aside, I have a hard time accepting that the world would be better if more people were arrogant.

Of course, those advocating for this shift don’t call it arrogance, preferring the positive term of confidence, but there is a fine line between the two. If a person lacks the confidence to share a meaningful insight, that is a problem. But it is just as problematic – perhaps even more problematic – when someone with unfounded confidence continually dominates the conversation.

Confidence is not intrinsically good.

Thinking before you speak, questioning your own abilities – these are good, valuable traits. It’s only at their extreme of paralyzing inaction that these traits become problematic. Similarly, confidence is appropriate in moderation, but quickly becomes tiring at its own extreme of arrogance.

Finding a balance between the two is a skill we all ought to work on.

Unfortunately, there doesn’t seem to be a good word for the opposite of over-confidence. Modesty is one, but it doesn’t quite capture the concept I’m trying to get at. Modesty is a trait of accomplished people who could reasonably be arrogant but manage not to be. Can you be modest while sincerely unsure of yourself?

I’ve started using the term self-skepticism: a sort of healthy self-critique.

The word skeptic has a somewhat complicated etymological history, but is derived in part from the Greek skeptesthai meaning, “to reflect, look, view.” This is the same root as the word “scope.”

It implies a certain suspension of belief – an ability to step back and judge something empirically rather than biased by what you already believe. And, it implies that skeptical inquiry is a valuable process of growth. The skeptic neither loves nor hates the subject they are skeptical about – rather, they hope to get at a better, deeper understanding through the process of inquiry.

Applied to one’s self, then – though perhaps more typically called by the general term of self-reflection – self-skepticism can be seen as the process of trying to become a better person through healthy skepticism of yourself as you currently are.

This, to me, lacks the judgement implied by “lacking confidence,” while embracing that we are all flawed and imperfect in our own ways – though we can always, always work to become better.

All these self-help articles are written in the blasé tone commonly found in fat-shaming weight loss articles. If you want to lose weight, eat less. If you want to be a morning person…just get up in the morning.

This advice does not seem that helpful.

For one thing, sleep habits are – at least in part – biologically determined. In one 2013 study, researchers used the standard Munich Chronotype Questionnaire to sort participants into “morning” and “night” type people. They then studied melatonin levels in participants’ saliva samples, finding that the difference in circadian rhythms could be “detected at the molecular clockwork level.”

I am certainly reaching far beyond my areas of expertise, but it seems as though there is sufficient evidence for the conclusion that it is unproductive to simply tell a night owl to try harder to get up in the morning.

To compound the matter, there is some evidence to suggest that “misalignment of circadian and social time may be a risk factor for developing depression” – that is, “night owls,” whose preferred timing is disconnected from what is generally socially acceptable, are at higher risk of depression.

To be clear, chronotype is not a binary state. On the whole, a population may skew towards early or late, but diurnal preferences form a distribution in which most people fall in the middle. So those individuals glibly writing guides for how they became morning people were most likely not particularly night people to begin with.

If you really want to be a morning person, it seems reasonable to give it a try…but if it really doesn’t work for you, it may be best to find a lifestyle that better supports your given sleep preferences.

Anger is often intense, powerful, and unpleasant for everyone around it. The emotion may have several negative health effects, and may be especially bad for your heart. Anger management resources are widespread. Because anger is so problematic – “such a forceful negative emotion and makes people uncomfortable,” as one Psychology Today article puts it – “taboos about expressing it are widespread.”

To further complicate matters, many psychologists “believe that holding anger in is bad for you, that it only builds pressure to be expressed.” On the other hand, the American Psychological Association (APA) now says that freely expressing anger may be “a dangerous myth” used “as a license to hurt others.” Furthermore, “research has found that ‘letting it rip’ with anger actually escalates anger and aggression and does nothing to help you (or the person you’re angry with) resolve the situation.”

Feeding into the taboo nature of anger, it seems as though our best solution is to simply not have any anger in the first place – thus avoiding the conundrum of holding it in or letting it out.

Recognizing the seeming impossibility of simply deleting anger from our lives, the APA puts this a little more constructively, recommending: “It’s best to find out what it is that triggers your anger, and then to develop strategies to keep those triggers from tipping you over the edge.”

This strikes me as the advice you give when you don’t know what to say.

Most notably, this advice seems to imply that most anger is unjustified. Figure out what makes you angry and avoid it, the way a person with Celiac ought to avoid gluten.

But what if what makes you angry is…injustice? What if you are angry because of historical legacies of power and oppression, because of deep disparities which are so entrenched as to seem normal?

A coping mechanism hardly seems appropriate for the task.

In one of the few memorable lines from The Phantom Menace, Yoda uses a line of thought similar to the APA when he proclaims, “Fear is the path to the dark side…fear leads to anger…anger leads to hate…hate leads to suffering.”

Yet, this is the logic of someone in power – it subtly assumes that anger is little more than the selfish reaction of someone who doesn’t get their way.

There is, of course, a certain truth to Yoda’s claim – there are plenty of instances throughout history where fear mongering has proven to be an effective, though unfortunate, tool for power, hate, and suffering.

But the idea that all anger intrinsically leads to hate goes too far.

This is a danger, no doubt, but the power of justified anger is a force to be reckoned with. A power which can critically be harnessed for positive social change.

As Hitendra Wadhwa writes in a 2012 piece on Martin Luther King:

Great leaders do not ignore their anger, nor do they allow themselves to get consumed by it. Instead, they channel the emotion into energy, commitment, sacrifice, and purpose. They use it to step up their game. And they infuse people around them with this form of constructive anger so they, too, can be infused with energy, commitment, sacrifice, and purpose. In the words of King in Freedomways magazine in 1968, “The supreme task [of a leader] is to organize and unite people so that their anger becomes a transforming force.”

Researchers recruited individuals who had been diagnosed with some form of psychosis, as well as a comparative “healthy” population. They “collected a large panel of biomarkers of known relevance to psychosis and functional brain activity” and “refined a subset of the biomarker panel that differentiated people with psychosis from healthy persons.” Clustering the relevant biomarkers, researchers found three distinct biotypes (“biologically distinctive phenotypes”).

Interestingly, the three biotypes identified “did not respect clinical diagnosis boundaries.” That is: the biological expression of psychoses differed from their clinical diagnosis, highlighting the need to refine current diagnosis techniques.

However, the clusters did reveal a meaningful lens through which to view psychosis. For example, “the biotypes significantly differed in ratings on the Birchwood Social Functioning Scale, which assesses social engagement, psychosocial independence and competence, and occupational success; biotype 1 showed the most psychosocial impairment, and biotype 3 had the least impairment.”
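As a rough, purely illustrative sketch of the clustering step described above: the code below runs plain k-means on synthetic two-feature “biomarker” vectors. The data, the choice of k-means, and the number of features are all assumptions for illustration – the study’s actual biomarker panel and clustering method were far richer.

```python
import math
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means: returns (centroids, labels) using Euclidean distance."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # initialize from random data points
    labels = [0] * len(points)
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid's cluster.
        labels = [min(range(k), key=lambda j: math.dist(p, centroids[j]))
                  for p in points]
        # Update step: move each centroid to the mean of its members.
        for j in range(k):
            members = [p for p, lab in zip(points, labels) if lab == j]
            if members:
                centroids[j] = tuple(sum(coord) / len(members)
                                     for coord in zip(*members))
    return centroids, labels

# Synthetic "biomarker" vectors: three hypothetical groups with different
# means, standing in for measured biomarker profiles.
rng = random.Random(42)
data = ([(rng.gauss(0.0, 0.5), rng.gauss(0.0, 0.5)) for _ in range(30)] +
        [(rng.gauss(3.0, 0.5), rng.gauss(0.0, 0.5)) for _ in range(30)] +
        [(rng.gauss(1.5, 0.5), rng.gauss(3.0, 0.5)) for _ in range(30)])

centroids, labels = kmeans(data, k=3)
print(sorted(round(c[0], 1) for c in centroids))
```

The point of the sketch is only the shape of the analysis: a panel of continuous measurements, an unsupervised grouping, and cluster labels that can then be compared against clinical diagnoses – which is exactly where the study found the mismatch.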

Particularly interesting are the implications of this work:

The biotype outcome provides proof of concept that structural and functional brain biomarker measures can sort individuals with psychosis into groups that are neurobiologically distinctive and appear biologically meaningful. These outcomes inspire specific theories that could be fruitfully investigated. First, biotypes 1 and 2 should be of greater interest in familial genetic investigations, while perhaps biotype 3 would be more informative for explorations of environmental correlates of psychosis risk, spontaneous mutations, and/or epigenetic modifications.

This is fascinating research and certainly worthy of further study, but it also raises the haunting specter of modernity. As Gordon Finlayson describes in Habermas: A Very Short Introduction:

There is a sinister aspect to the assumption that science and rationality serve man’s underlying need to manipulate and control external nature: that domination and mastery are very close cousins of rationality. Not only science and technology, but rationality itself is implicated in domination.

Unlike true scientific scholarship, high modernism was, as James C. Scott writes in Seeing Like a State, “a faith that borrowed, as it were, the legitimacy of science and technology. It was, accordingly, uncritical, unskeptical, and thus unscientifically optimistic about the possibilities for the comprehensive planning of human settlement and production.”

In short, high modernism is the authoritarian imposition of a planned social order, designed by bureaucrats foolish enough to fancy themselves benevolent conquerors of nature.

To be clear, the study itself is not inherently high modernist. Better understanding and diagnosis of psychosis is a worthy scientific goal. But you’ll forgive me if I’m somewhat wary of the profession which considered homosexuality a mental ailment until the 1970s. Social understandings of “mental health” have long been propped up by the scientific understanding of the day – with the current scientific research miraculously changing to validate social norms.

Michel Foucault perhaps best documents this phenomenon in Madness and Civilization, a brilliant historical account of “madness” as a social construct which shifts to fit the norms of the day.

Perhaps this seems unlikely in our modern world – surely our modern scientific understanding of biology far outshines the dark, half-science of the Middle Ages. Finding biological underpinnings of madness, biotypes that reveal psychosis, seems, on its face, reassuring: madness can be rationally explained.

Yet it is exactly that reassurance which ought to give us pause. Perhaps we have only found what we wanted to find – irrefutable proof that the mad are somehow different from the healthy, that there is something fundamentally, biologically, different about “them.” And, of course, it is the implied outcome which should worry us most – if we can define the root of their madness, we can at last fix these poor, broken souls.

In 1925, T.S. Eliot, already an established and respected poet, published The Hollow Men.

It was a transitional time for the author. Two years later, Eliot – who had been born to a prominent Missouri family and raised in the Unitarian church – would convert to Anglicanism and take British citizenship, a conversion reflected in his 1930 poem, Ash Wednesday.

He was in an unhappy marriage. In his sixties, Eliot confessed in a private letter, “To her, the marriage brought no happiness. To me, it brought the state of mind out of which came The Waste Land.”

Eliot had composed the epic poem largely while on three months of enforced bedrest following a nervous breakdown.

It was in that state of mind – post-Waste, as Eliot later described it, yet without the peace he found later in life – that Eliot wrote The Hollow Men.

We are the hollow men
We are the stuffed men
Leaning together
Headpiece filled with straw. Alas!

The poem is full of allusions to hollow men – Guy Fawkes of the infamous Gunpowder treason and plot; Colonel Kurtz, the self-professed demigod of Joseph Conrad’s Heart of Darkness; Brutus, Cassius, and other men who conspire to take down Julius Caesar; and the many cursed shades who call to Dante as he travels through the afterlife in The Divine Comedy.

Ultimately, these hollow men, full of ferocity and determination in life – worshipped, perhaps, as gods among men – are nothing. They are only the hollow men, the stuffed men.

In the vestibule of Hell, Dante finds such hollow men. “These shades never made a choice regarding their spiritual state during life (neither following nor rebelling against God) instead living solely for themselves. Neither heaven nor hell will let them past its gates.”

They are remembered – if at all – not as lost,

Violent souls, but only
As the hollow men
The stuffed men.

There is an in-betweenness to their existence, a nothingness far worse than the torturous circles of hell. They are shape without form; gesture without motion. They are paused, eternally, in that inhalation of oblivion.

Between the idea
And the reality
Between the motion
And the act
Falls the Shadow

Perhaps we are all hollow men. Perhaps we are all doomed to that empty pause. Perhaps, we, like Fawkes, will be found – seemingly on the eve of our victory – standing guard over our greatest work not yet accomplished.

This is the way the world ends
This is the way the world ends
This is the way the world ends
Not with a bang but a whimper.

Or, perhaps not. Perhaps, as Eliot did himself, we can find new beginnings out of our quiet darkness. As Eliot writes in Ash Wednesday:

Because I do not hope to turn again
Because I do not hope
Because I do not hope to turn
Desiring this man’s gift and that man’s scope
I no longer strive to strive towards such things
(Why should the aged eagle stretch its wings?)
Why should I mourn
The vanished power of the usual reign?

Earlier this month, the Corporation for National and Community Service (CNCS) and the National Conference on Citizenship (NCoC) released research showing that Americans’ volunteering rate remains strong. Over a quarter of U.S. adults volunteered through an organization last year, while nearly two thirds volunteered informally.

This is welcome news, but also disconcerting: recent trends point to steady volunteering rates but drops in other civic activities. A December 2014 report found that “16 of the 20 civic health indicators dropped,” with volunteering as one of the few positive outliers. Collected by the U.S. Census Bureau, civic health indicators cover topics including voting, volunteering, political expression and group membership.

Personally, I am quite concerned about indicators related to civic voice. A 2010 NCoC report found that, among Americans who are not engaged with a community group, less than 15% express their political voice in one or more ways. This number rises significantly for those who are involved in a group (about 40%) and especially for those with a leadership role within a community group (nearly 70%).

But the disparity indicated by these gaps is alarming. Under 10% of the population falls into the category of “leaders,” raising important questions about the socioeconomic and gender disparities represented in that gap.

For example, a 2012 study by my former colleagues at the Center for Information and Research on Civic Learning and Engagement (CIRCLE) looked specifically at the civic lives of young people with no college experience – some of the most underrepresented people in our society.

When one focus group was asked whether they had a voice in their school, they all simply laughed. One young man in Little Rock argued that student voice in school was a myth: “Even when you are class president and school president you still don’t have a say, so … it’s only a show.”

Another student is quoted as saying, “even if you do voice your opinion it won’t do any good—the suits are the ones who are gonna make all the decisions.”

That is deeply problematic for civil society.

Too many people feel as though their voice does not matter, as though their perspectives don’t add to the world.

This is a fallacy. A myth perpetuated by false social standards laying claim to what types of people have value and what types of views have value.

All people have value; all voices matter.

Unfortunately, too many people have been taught that their voices don’t have value – that they would only add to the noise if they ever dared to speak up. Once this message is internalized, the civic silence is hard to break.

But that’s why it’s important to remember – speaking out is not a luxury, it’s not an activity you do to show off how important you are. It is a civic duty. Sharing your own voice and perspective – particularly for those whose voice and perspective is often overlooked – is critical to transforming the state of civic dialogue. Everyone’s voice needs to be heard.

There’s this great and terrible irony in the world – it’s the people who worry about being rude, incompetent, or otherwise terrible who are the least likely to actually be rude, incompetent, or otherwise terrible.

The same can be said about civic voice – if you never speak up because you are so convinced that your own voice can’t possibly add value, then you are depriving the rest of us of your wisdom. I know it is awkward, and I know it feels self-aggrandizing, but forget about all that: we need your words and your perspective. It’s a civic duty to share your voice. Really. We can’t tackle the hard problems without you.