Allie Brosh, creator and author of the blog Hyperbole and a Half, has recently returned after an extended absence from the blogosphere. Before her unplanned hiatus, she published a post about her experience with depression; now, upon her return, she has published a follow-up on her continuing struggle with the illness. Brosh’s posts are as insightful as they are entertaining. I cannot recommend them enough. Those who are suffering, or have suffered, from depression will find much in common with Brosh’s experiences. And those who haven’t been touched by depression will find her posts eye-opening; Brosh does an amazing job of describing this often frustratingly ineffable experience. Follow the links below to Brosh’s blog, and prepare for a take on mental health like you’ve not seen before.

If any post is fitting to publish late, it is surely this one, since the reason for my delay is none other than the antagonist of today’s entry: perfectionism. Perfectionism is a trait seen by some as a blessing and by others as a curse. Unfortunately, in the academic community, it’s treated as an almost unambiguous virtue. This mentality is not just unhealthy, it is downright dangerous. In this post, I will make the case for what might seem like a pedantic distinction; but it is—as so many are—a distinction with purpose. In short, I want to distinguish the virtue from the vice in the hope of excising perfectionism from our culture without compromising other truly virtuous academic values.

Before continuing, it is worth noting that there is so very much that can be written on the topic of perfectionism. We might quarrel with the academic atmosphere that values a certain kind of “greatness”. We might challenge the value of striving after perfection altogether. In this post, I want to challenge the culture of perfectionism on a more ‘grassroots’ level. I will take it for granted that excellence is to be valued and sought after. Some may find this contentious, and as always, I invite those who do to share their thoughts in the comments below.

Perfectionism Condoned

For the individual, perfectionism manifests in the answer to the following question: “How much is enough?” In my case, I find myself convinced that there is no answer to this question. Nothing is enough. The result is that I often work myself to exhaustion, as the only limits I (grudgingly) recognise are those my body forces upon me.

The trouble is, as one student of mine put it to me, there is a certain culture in academia that fetishises this kind of approach to one’s work. “I’ve been in the library all night,” some will announce. “I’ve barely slept all week,” another will declare. And while they may sound like complaints, one can’t help but get the feeling that they are also badges of honour. There’s often a certain bravado that accompanies such remarks. A sense that this is the way of things at university—or, more precisely, the way of things if you want to be successful. Getting enough sleep? You’re doing it wrong. Have time for hobbies? You’re doing it wrong. Generally speaking, if you’re not exceptionally stressed about your work and your grades at all times, then—you guessed it—you’re doing it wrong.

What’s worse is that this is not only perpetuated by the students who participate, but also by the teachers who condone it. “Good! They should be scared,” you’ll hear some say. As though frayed nerves and frantic, caffeine-fuelled nights were a kind of academic rite of passage.

Room for Excellence?

But surely, traditionalists will complain, striving for excellence is precisely what academia is all about! And on a certain understanding of “excellence”, I quite agree. But the pursuit of excellence and perfectionism are two very different things. And it is this all-important distinction that the academic culture often blithely ignores.

So, how do we distinguish between these two? Well, as it happens, psychology already has the tools we seek for doing just this. Since the publication of a seminal paper on the topic in 1978, many psychologists have distinguished between the more positive perfectionist strivings and the far more negative perfectionist concerns (so called by Stoeber and Otto (2006)). For clarity’s sake, I will refer to the former as the ‘pursuit of excellence’, and to the latter as ‘perfectionism’ simpliciter. Where the pursuit of excellence is associated with “high personal standards” (Stoeber and Otto 2006: 296), perfectionism is associated with “concern over mistakes, doubts about actions, socially prescribed perfectionism, and perceived discrepancy between actual achievements and high expectations” (ibid.).

Excellence without Perfectionism

To be clear, I don’t for a second want to discourage academics from having high personal standards; I take such standards to be admirable. Indeed, I pride myself on my own. What is essential, however, is that these standards be achievable, that our successes be recognised, and that our mistakes be accepted.

In the academic community, we are each of us teacher to some and student to others. We learn our expectations from our teachers before passing them on to our students. And so, I feel a cultural shift such as the one I propose is best effected by way of a shift in our approach to instruction. If perfectionism lies, at least in part, in the belief that nothing—no amount of effort, no level of success—is enough, then, as teachers, we have a responsibility to show our students otherwise.

Here are five ways we can begin to do this:

1. Celebrate successes, big and small.

Make sure your students walk away knowing more than just what they’ve done wrong. Acknowledging successes is our way of leading by example. It demonstrates to the student that we, their instructors, take “enough” to be achievable. And be specific! Nothing is more frustrating to a student than hearing that they’ve done well, but not knowing what they’ve done well. Nebulous comments encourage students to set unspecific, and therefore unachievable, goals.

2. Offer clear, precise, and constructive criticism.

As much as students want to know what they’ve done well, they equally want to learn what to do better. And, as with the previous point, specificity is essential. It’s not enough to say, for instance, that the argumentation was weak. Which argument? What made it weak? How might they fix it? The student should not be left wondering about these questions. If it’s not at all clear what is wrong and why, it becomes impossible to determine whether one has improved at all the next time around. By making our suggestions clear and precise, we provide achievable goals, and in so doing, help students determine how much is enough with respect to a given piece of work.

3. Empower students to make mistakes.

Show students that mistakes need not be feared. Far from unforgivable sins, mistakes are inevitable, and indeed, expected. What’s more, they are often highly instructive. Fear of mistakes is nothing short of paralysing—so it is our responsibility as teachers to expose mistakes as the stepping-stones they are, rather than the spectres they are often seen to be.

4. Encourage questions.

We need to create a safe environment in which students can feel comfortable speaking up when they don’t understand. Ask if what you’ve said makes sense to them, and remind them that it’s okay if it doesn’t! I’ve lost count of the number of students who have felt the need to apologise when they found something unclear. This is precisely not how we want our students to feel. If we make it clear that we don’t expect perfection, students will be less likely to expect it of themselves.

5. Remember when you were in your student’s position.

It’s too easy to forget what it’s like to learn our respective subjects for the first time. What now seems obvious is likely far from obvious to your student. As much as I can, I try to avoid editorialising as I teach; that is, I avoid describing the content of what I’m teaching as “clear” or “easy”. Using words such as these encourages students to feel shame when they don’t understand; in avoiding such language, we go a long way toward achieving #3 and #4 from above.

Thoughts?

Do you have experience coping with perfectionism? Have you overcome your perfectionistic tendencies? Teachers, do you have strategies for discouraging perfectionism? Leave your thoughts and experiences in the comments below.

In this short, interesting article, Depression’s Collateral Damage considers the benefits of a shift from the language of “mental” illness to that of “brain” illness.

I must confess, as someone with a mental illness, I have my reservations about the shift toward thinking of mental illness as a strictly physiological affliction. I worry that it will lead to an overly pharmaceutical approach to its treatment. I fully acknowledge the role that medication has to play in wellness (indeed, it plays a significant role in my own), but I think that talk-therapies are essential to wellness too. If, however, we start to think of mental illnesses as strictly physiological conditions, I worry that sufferers will neglect the importance of therapy. If mental illness is nothing more than a flaw in chemistry, talking would no more solve the problem than it would cure diabetes.

Admittedly, treatment schemes are, and should be, as individual as the people who suffer from these illnesses. But, when coupled with increasing pressures on public health systems to cut back on spending, I worry that a strictly biological conception of mental health will lead to a severe underemphasis of the value of talk-therapies.

Apologies to my subscribers for the brief hiatus. I will return with an article on perfectionism in the coming days.

I have become aware of something very extraordinary these past few days. It has completely altered how I look at particular things. And that’s the gift of new language.

When I started using brain illness instead of mental illness, I thought it was perhaps just an exercise in semantics. I didn’t think it would make any difference, but it has.

Now, when I look at my husband and think brain illness, something has subtly shifted. I am processing what is happening in terms of something wrong with him physically, not that his mind is haywire. Sure, the illness can affect his thinking, causing him to obsess, but I know that there is something physically wrong with him and that makes all the difference.

When I think of others suffering other forms of brain illness, I find myself with more hope than when I approached it from the point of view…

Application forms are a nuisance, no matter who you are. But, for me, there is one question in particular that I dread each time I fill one out:

“Do you have a disability?”

I have never once answered ‘yes’ to this question. But should I? Is my mental health condition a disability?

Am I disabled?

Before I continue, a brief aside: today’s post will be rather more informal than those previous. I want to share my thoughts on this question, but I am woefully short of answers. As always, I have cast about for other material on the subject, but have found precious little. However, while this week’s piece is little more than a collection of musings, I hope it will still resonate with some of you, and perhaps inspire more conversation on the topic.

To Declare or Not To Declare…

I find myself in a difficult situation where this question is concerned. My mental illness is not so serious that I am unable to work; indeed, I am fortunate that it has hitherto not disrupted my work. And yet, it is serious enough that it can at times be a very real challenge requiring professional attention. This middle ground in which I find myself makes answering the question above complicated at best.

There are two different reasons why declaring my mental illness gives me pause.

Equal Treatment

First, I loathe the thought of receiving special treatment on account of my mental health. Does my application really deserve special attention because of my condition? In my case, I am convinced that it doesn’t. After all, my condition hasn’t compromised my work. It seems unfair to plead special circumstance when those circumstances were not so challenging as to adversely affect my results. And what is more, I should like to think that my place at university or my offers of funding were granted on the merit of my academic excellence alone. Indeed, it is for this same reason that I also never declare myself to be a member of a visible minority; but of course, the two questions are not quite analogous. And so-called “positive discrimination” is a topic for another day.

Who am I?

My second concern, however, I find far more compelling than the first. To me, the more challenging question is this: Am I prepared to adopt the label ‘disabled’? More than a question about legal definitions, it is a question about my identity. Do I see myself as a person who is disabled?

disabled, adj. …2. Of a person: having a physical or mental condition which limits activity, movement, sensation, etc.…

But I do not feel less able than my colleagues; my scholarly activities aren’t limited. I attend talks, teach classes, write papers. In all the relevant ways—that is, all the ways relevant to being an academic—I am as able as my peers.

And yet, there are nevertheless days when emerging from my room is an ordeal, and days when my anxiety erupts into full-blown panic.

So am I disabled…?

I suppose my activities are indeed limited on the days described. But there are equally days on which I experience no such distress. Unlike many other disabilities that pose a constant challenge, my mental health difficulties are intermittent. And it is unclear to me what degree of limitation constitutes a disability.

Finally, I am particularly fearful of the label because I am an academic. Qua academic, I am my mental capacities. So, to say of myself that I have a disability on account of my mental illness feels like a threat to that identity. But this, I recognise, is an oversimplification.

And perhaps so too are many of my concerns. After all, we are none of us reducible to the conditions that we suffer. But knowing this, I find, does not make answering the question any easier.

Thoughts?

Do you declare your mental health condition when completing applications? Do you have concerns about the label ‘disability’? How has declaring/not declaring affected you? Share your thoughts and experiences in the comments below.

The experience known as “impostor syndrome” has been a topic of increasing interest in recent years, featuring in many blog posts and newspaper articles. I have found it reassuring to read the experiences of so many academics who share this experience, and so would like to add my own contribution in the hopes that others might take some comfort in it. As such, at the risk of repeating much of what has been said by others already, I present to you my thoughts on this pervasive phenomenon.

“Syndrome”

First, though, a note about terminology: although called a “syndrome”, impostor syndrome is not a medical condition. It is rather a convenient expression for referring to a quite common experience in academia—that of feeling that one is a fraud. (NB: the phenomenon is not restricted to academics, but is alarmingly common among them.) I have my qualms about referring to the experience as a “syndrome”; it is, I think, indicative of our growing desire to pathologise the entirety of the human experience—a trend to which I am staunchly opposed. Clinical psychologists Pauline Clance and Suzanne Imes (1978), the first to identify the experience, named it the “impostor phenomenon”; and so, in what follows, I shall use Clance and Imes’s terminology.

It is worth noting that some mental health professionals also share my reluctance to apply the term ‘syndrome’ to this experience. In an article in Nature, Dr. Mayada Akil, Director of Outpatient Psychiatry at Georgetown University Hospital, Washington DC, is quoted as saying that while the experience certainly occurs, it is “not a disorder or a syndrome”; Dr. Akil feels that much has been made of the so-called syndrome for “commercial purposes”.

Now, it might seem odd to devote this amount of space to a terminological discussion. The point, however, is not a minor one; there is power in words. In labels. In this case, I think to pathologise the experience is to absolve the academic environment of too much responsibility—an environment that, as I wrote last week, allows phenomena like this one to thrive.

So much, then, for terminology.

Hiding in Plain Sight

“I’ve just pulled the wool over their eyes…”

This line, and many others like it are a constant refrain in my internal dialogue. Worse than a doubt, I harbour an unfaltering conviction that I have not earned my place here in the hallowed halls of academia. A conviction, I hasten to add, that is utterly irrational—one that is dogmatically impervious to all assurances and evidence to the contrary. And yet, as much as I’m aware of its irrationality, it is a belief from which I seem unable to divorce myself.

What is more, the impostor phenomenon is not just a belief about one’s own unworthiness. It is also the constant fear of being exposed. The fear that the next question you ask, the next presentation you give, the next page you write will be the one that gives you away. It is the fear that you are surrounded by people who can smell deceit—by, as Colonel Pickering might have it, “impostorologists.” Indeed, combined with the combative atmosphere that prevails in academia (see “Shame and Academic Darwinism”), the impostor phenomenon makes one feel like a sheep in wolf’s clothing, one amidst many hungry wolves.

The tragedy of it is, the impostor phenomenon is strikingly common. And yet we each feel alone in our experience; we don’t realise that we’re in what amounts to a fancy dress party—so many supposed sheep in wolves’ clothing.

As many of the articles on this topic explain (see “References and Further Reading” below), one of the most effective ways of combating the impostor phenomenon is to talk about it. This is why I felt it important to add to the already sizable literature on this all too common experience. That being said, I must admit, it is with no small amount of trepidation that I confess to feeling this way. Despite having met numerous students and peers who have expressed fears very similar to my own, I cannot help but think, “their beliefs may be mistaken, but I am in fact a fraud.” A truly laughable thought! And one that demonstrates the self-reinforcing nature of this troubling phenomenon.

Cognitive Distortion and Negative Self-Image

I take my experience of the impostor phenomenon to be part and parcel of my mental health issues (namely, anxiety and depression). Indeed, in my experience, all three feed into one another, each reinforcing the deeply negative beliefs engendered by the others. The worse my depression, the more I feel an impostor, and so the more anxious I become. This often leads to a deepening of the depression, thus completing the circle.

One of the most notable symptoms of depression is the possession of a negative self-image and the cognitive distortions that accompany that image. There are many different kinds of cognitive distortion, among them one called “disqualifying the positive.” As the name suggests, it is typified by a tendency to explain away positive experiences. For instance, dismissing compliments as merely polite gestures, or attributing successful results to luck.

Sound familiar?

The impostor phenomenon is itself a product of this cognitive distortion. To be clear, not all who experience the impostor phenomenon suffer from a mental health condition. Cognitive distortions admit of degrees of severity, not all of which suffice for clinical diagnosis; so, while cognitive distortion is common to both depression and the impostor phenomenon, this does not imply that all who experience the latter suffer from the former. What this overlap does explain, however, is why those suffering from depression are particularly susceptible to the impostor phenomenon. It suggests that the two experiences are deeply related.

Fighting Back

So, what can we do to overcome the impostor phenomenon? To answer that, I will point you to a marvelous article at GradHacker entitled “Banishing Impostor Syndrome.” In her article, author Andrea Zellner outlines four main strategies for combating this phenomenon:

(1) Share how you feel.
(2) Be kind to yourself.
(3) Fake it ‘til you make it.
(4) Help others.

“People are hurting. They’re looking for a chance to talk about how mental illness is affecting their lives. Too often the stigma prevents them from discovering that others are living in similar situations. Loneliness, isolation, abandonment abound. One simple act of sharing can change all that.”

The relation between shame and mental health is an obvious one, and it manifests in so many different ways. Less obvious—or, as I shall suggest, less explicit—is the role shame has to play in the world of academia. I’m certainly not the first to observe the culture of shame in academia—a quick Google search pulls up articles at Pittsburg PhD, Legally Sociable, Anne Brannen (who’s written a how-to guide on coping with this culture), The Chronicle, and JAC (articles from 2005 and 2006). Nevertheless, I’d like to take a moment to reflect on the role of shame in academia, and more specifically, on its impact on our attitude towards mental health in the academy.

What is shame?

No self-respecting academic (and certainly not a philosopher!) could proceed much further without defining her terms, so let’s sort out what we mean by ‘shame’. I found a number of different definitions in my search, but all shared a common theme: shame involves a judgement of the person we are. In this way, shame is often defined in contrast to guilt; according to Fossum and Mason (1986), where “guilt is a painful feeling of regret and responsibility for one’s actions, shame is a painful feeling about oneself as a person” (my emphasis). Similarly, according to Brené Brown, Ph.D., LMSW, “the difference between shame and guilt is best understood as the difference between ‘I am bad’ and ‘I did something bad’ [respectively]” (2012). In other words, shame concerns a judgement about our identity, whereas guilt concerns a judgement about our actions.

Identity and Darwinism in the Academic Jungle

Unsurprisingly, this action-identity distinction figures in academic life as well. In my experience, there are broadly two kinds of people who attend university: (1) those who regard themselves as persons engaged in academic activities, and (2) those who regard themselves as academics. People belonging to the second category wed their identities to their academic achievement. (Indeed, we seem to romanticise such individuals—the tortured intellectuals who live for their work.) What is more, among academics, there is a tendency to judge peers in just the same way, i.e. to identify the person with their academic contributions. Add to this the highly adversarial approach to our practice, and you have the recipe for a severely destructive environment.

Listen and watch carefully the next time you’re at a conference, or in a seminar. Listen to the chatter after the official goings-on have finished. The discourse can be ruthless. Indeed, I’ve attended some seminars that have been Darwinian in their atmosphere. One is tempted to remind everyone, “We’re all on the same side here!” From the rhetoric and the posturing, one often gets the impression that there is a battle being waged, rather than a mutual pursuit of knowledge. As Professor Linda Hutcheon writes in her excellent article on the topic, “[t]he academy rightly values critical thinking, but increasingly we seem to define that quality in terms of the wolfish belittling and even demolishing of opposing positions” (2003: 43). And is it any wonder, when we take our performance to be reflective of our worth as academics? And, for too many of us, our worth as persons? The more threatened we feel, the more threatening we often become, in defence. It’s the academic equivalent of a porcupine’s quills.

(Of course, I don’t believe for a second that this is a problem unique to academia; after all, it is a basic capitalist assumption that competition encourages productivity… but that’s a topic for a different blog.)

Define ‘Fittest’…

If academia is, as I’ve been suggesting, a community that behaves according to the principle of “survival of the fittest,” how does that community define ‘fitness’? Who are the fittest among us? Naturally, we take those demonstrating excellence in critical thinking, analysis, and so on, to be among the best of us. Indeed, these traits are part of the everyday discourse. But there are unspoken measures, too. The fittest, it’s assumed, are the ones who can “handle it”, who can “hold it together”, who can tolerate the stress without “falling apart.” These are all expressions I’ve heard students and peers alike use in conversation. And, if I’m quite honest, they’re expressions I’ve used in my own internal dialogue, too.

To be sure, it’s not explicit discussion that leads to this attitude; I’ve not heard anyone say that, to succeed in academia, you have to “hold it together.” No, it’s a sin of omission that is to blame. How many professors can you think of who made it a point to mention university counselling services at some point in a class? I’ve had two such professors. Two. In the six years I’ve spent at university. And that’s still more than many of the people I’ve asked can think of.

So now, we have an environment in which:

(a) many individuals wed their identity to their academic achievements and mental prowess;
(b) antagonism among its members is all but encouraged; and
(c) mental health is seldom, if ever, discussed.

We’ve created an institution in which we are spurred on by shame, but are unwilling to acknowledge shame’s consequences on our wellbeing. And this, predictably, inspires yet further shame in all who suffer those consequences. It’s nothing short of poisonous.

Towards a Solution: Inspiring Collaboration

As members of this grand institution, it is incumbent on us to change this poisonous atmosphere. So, what can we do, as individuals within the system, to move things forward?

1. Start the conversation.

Teachers, talk to your students about mental health. Start a dialogue about the prevalence of mental health issues and the resources available for coping with them. It’s been written that it can alleviate feelings of shame “if the person can admit them openly to others, and feels respected instead of judged by [those people]” (Dr. Thomas Scheff, cited in a 1987 New York Times article). Take steps to create that safe space.

2. Be aware of your approach.

Antagonism escalates by feeding into itself, so don’t be part of the problem. Pay attention to how you phrase your questions in discussion. Be aware of your tone. Try to change your own attitude toward the purpose of critique. We don’t have control over the way others act; the best we can do is to change ourselves. If we each make a conscious effort to approach academic discourse with a collaborative attitude, we can start to change the tenor of that discourse as a whole.

3. Separate the person from the work.

Try to maintain this separation with respect to yourself and with respect to others. We can still identify as academics without reducing our identity to that part of ourselves. And similarly for our peers. Compare the following two comments: (1) “What you’re doing there is _______”, versus (2) “The issue with that argument is _________”. The first makes a remark about a person; the second remarks on an argument or position. Be aware that in discussion, we aren’t fighting a person, we’re considering an idea. And, with respect to your own work, recognise that very clever people can have very bad ideas. Remember the difference between action and identity. Try to think, “I made a bad argument,” instead of “I am bad at argumentation.”

Thoughts?

What are your experiences with shame in academia? Do you have any strategies for shifting away from our present atmosphere of academic Darwinism? Share your thoughts in the comment section below!