Category: The self

A lot has been written about the downward spiral of loneliness. People who crave more social contact often develop behaviours and thinking styles that only serve to accentuate their isolation, such as turning to drink and becoming more sensitive to perceived slights and rejections. Less studied is the question of whether some people have personality traits that buffer them against these loneliness-related risks. A new study published in the Journal of Health Psychology finds a promising candidate that appears to fit this description – authenticity, or being true to yourself.

Jennifer Bryan and her colleagues surveyed 537 undergrads (average age 22; age range 18 to 60), nearly three quarters of whom were female. The students filled out questionnaires about how lonely they felt; their mood; any unpleasant physical symptoms they’d experienced in the last month; how much alcohol they typically drank on a daily basis and whether they had a drink problem; and finally their authenticity.

To get a sense of what the researchers really mean by “authenticity”, let’s look in more detail at that last questionnaire. It consisted of 45 items in four categories: Awareness, which means how much someone is motivated to understand themselves (points are awarded for agreement with statements like “For better or worse I am aware of who I truly am”); Behaviour, which measures how much the person actually acts in accordance with their values and beliefs; Relational Orientation, which is about how open and honest the person is in their relationships; and finally, Unbiased Processing, which speaks to how much someone can accurately evaluate themselves without being misled by what other people say or do. The researchers averaged across these subscales to give their participants an overall authenticity score.

The main result is straightforward. Across the whole group of students, feeling more lonely tended to correlate with feeling more depressed and anxious, having more physical symptoms, and having more drink problems. Sadly, this is consistent with prior research on the sequelae of loneliness. But here’s the thing: among those students who scored more highly on authenticity, these associations were all reduced. That is, if you felt lonely but also scored highly on authenticity, then your depression and anxiety tended to be lower, and so too your drink problems and physical symptoms.

This is a cross-sectional study – it only involved taking measures at one point in time – so we need to interpret the results with caution (we also don’t know if the same findings would apply to a different demographic group, such as elderly people). But one hopeful interpretation of these results is that being true to yourself provides a kind of protection against the usual negative effects of being lonely.

Why might this be? Bryan and her colleagues posit a couple of explanations: First, perhaps highly authentic people don’t overanalyse their lonely feelings – they don’t see their loneliness as some kind of indictment of their personality, it’s just the way things currently are. Second, authentic people are likely less inclined to try to get out of their lonely situation by hanging out with people they don’t want to be with, or doing stuff they don’t want to do. Yes, this might increase their isolation at first, but it probably helps prevent them from growing more bitter and resorting to counter-productive coping mechanisms like drinking too much.

Of course there’s a lot of speculation here. We need a replication of the finding with a more robust longitudinal research methodology (that follows people’s changing feelings and traits over time), and to test other demographics. What’s exciting though, is that if the effect proves to be real, then it hints at a useful way to help lonely people – simply encourage them to be true to themselves. “Such an intervention would be uniquely beneficial,” the researchers said, “as it would not require effort from others (who need to interact with the lonely individual).”


The directed abstraction technique acts as a springboard, allowing the timid to gain confidence from initial success

Last week Kathleen finally put aside her fears about public speaking to give a presentation… and it went pretty well! But when you caught her at lunch today and asked if she wanted future opportunities to present, you found she was as pessimistic about her ability as ever.

This story reflects an unfortunate truth: people with low self-belief are liable to hold onto negative assumptions about themselves despite concrete evidence to the contrary; that is, they fail to “generalise from success”. Thankfully, in a new paper, psychologist Peter Zunick and his colleagues describe a technique, called directed abstraction, that can help the self-critical change their mindsets.

Directed abstraction means stopping to consider how a specific success may have more general implications – this is the abstraction part – while ensuring this thinking is directed towards how personal qualities were key to the success. Let’s see what this means in practice.

In a first study, 86 students guessed the number of dots flashed up on screen, and were given fake but convincing positive feedback on their performance. Half the students were then asked to explain how they completed the task, which kept their thoughts on a very concrete, specific level. The other half were prompted to engage in directed abstraction by completing the sentence: “I was able to score very high on the test because I am: … ” This query is not about how, but why – a more abstract consideration – and also focuses on the individual’s own qualities.

Engaging in directed abstraction appeared to give a particular boost to those participants who’d earlier reported believing they have low competence day to day: afterwards, they not only had more confidence in their estimation ability (than similarly self-critical control participants), they also believed they would do better at similar tasks (like guessing jelly beans in a jar) that they faced in the future.

In another experiment, Zunick’s research team sifted through hundreds of students to find 59 with low faith in their public speaking skills. Each of them was given a few minutes to prepare and then make a speech to camera on the topic of the transition to college life, a fairly easy one to tackle. Each participant then watched themselves on video, with the experimenter offering reassuring feedback and implying that they had done surprisingly well.

The same participants then engaged in directed abstraction (or the control “how” query) before being thrown once more into the breach with a second speechmaking experience, this time on a tough topic, with no coddling feedback afterward – this was the real deal. Did the directed abstraction participants gain confidence from their early success that could survive a rockier second round? They did, reporting more confidence for future public speaking than their peers.

The technique seems appropriate for a range of settings, although obviously it should only be used following an event that can reasonably be seen as a success, otherwise it could backfire. And it’s simple to use to help a friend or yourself: just take the time after a success to think through what it owes to your personal qualities. Then confidence can follow.


To help people who perform non-lethal self-harm, such as cutting and burning themselves, we need a better understanding of the thoughts and feelings that contribute to them resorting to this behaviour. Risk factors are already known, including depression and a history of sexual abuse. However, Noelle Smith and her colleagues wondered if these factors increase the risk of self-harm because they lead people to experience self-disgust. Viewed this way, the researchers believe “self-disgust may serve as an emotional trigger” for self-harm.

Over five hundred undergrads, men and women, answered questions about whether they’d ever intentionally harmed themselves (including cutting, burning and scratching); when they’d last performed such an act; their depression symptoms; any history of physical or sexual abuse; their anxiety; and crucially, their feelings of self-disgust, as measured by 18 items, such as “I find myself repulsive”.

Consistent with the researchers’ predictions, the more self-disgust a student reported, the greater the likelihood that they had previously performed self-harm (statistically speaking, a one standard deviation increase in self-disgust was associated with a two-fold increase in the odds of reporting self-harm).
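For readers unfamiliar with odds ratios, here is a minimal arithmetic sketch (illustrative numbers only, not the study’s actual data) of what “a one standard deviation increase doubles the odds” means in a logistic model:

```python
import math

# Hypothetical illustration: if a one-SD increase in a predictor is
# associated with a two-fold increase in the odds of an outcome, the
# corresponding logistic regression coefficient (per SD) is ln(2).
beta_per_sd = math.log(2)

def odds_multiplier(sd_change):
    """Multiplicative change in the odds for a change of sd_change SDs."""
    return math.exp(beta_per_sd * sd_change)

print(round(odds_multiplier(1), 2))  # 2.0: one SD doubles the odds
print(round(odds_multiplier(2), 2))  # 4.0: two SDs quadruple the odds
```

Note that doubling the odds is not the same as doubling the probability: for rare outcomes the two are close, but as the baseline rate rises the probability increase is smaller than the odds ratio suggests.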

Levels of self-disgust were highest in those students who said they’d performed self-harm in the last year. These were also the students who tended to report depression symptoms and a history of physical or sexual abuse. It’s notable, though, that depression was no longer associated with self-harm once self-disgust was taken into account, suggesting that self-disgust is the key mediating factor.

These findings jibe with past research on the more cognitive aspects of self-disgust – for example, there’s evidence that self-harm is associated with being self-critical and having an excessive focus on one’s own mistakes. Other studies have highlighted reductions in self-disgust after acts of self-harm, but also increases. Smith and her colleagues suggested the link could be bi-directional: self-harm may assuage feelings of self-disgust, but performing a self-harming act may then trigger feelings of shame at one’s own actions.

The cross-sectional nature of this study means it can’t shed light on the direction of causality – whether self-disgust contributes to self-harm behaviours, or whether the reverse is true. Self-disgust was also measured as a trait, rather than as an acute state of mind. The researchers acknowledged these issues, but they note theirs is the first study to look at the emotion of self-disgust as a precipitating factor for self-harm, and they call for more research. For now, they said their results suggest reducing self-disgust may help people who are at risk of self-harm.


The human mind can be its own worst enemy. When we want to do well in sports, we often intensify attentional focus on bodily movements that are best off left on automatic pilot. The result, even for elite athletes, can be a dire instance of choking. The muscles stiffen or shake. Fluid, expert movement is lost, and the learning of new skills is impaired.

A common assumption is that an internal focus is harmful to performance because it directs unhelpful conscious attention to bodily control. But what if the costs of self-focus are more general and profound than that? Perhaps merely thinking about ourselves in any way is harmful to performance and learning because to do so activates the “self-schema”.

The self-schema is “more than a philosophical construct”, argue Brad McKay and his colleagues in a new paper; it is in fact a “functional neural network located anatomically in cortical midline structures.” Their theory is that anything that activates this network – be that over-focus on bodily movements, memories of past performance, or the scrutiny of an audience – will be detrimental to skilled performance and learning.

The researchers began by dividing 36 students (26 men) into two groups and asking them to throw 10 balls underarm at a bulls-eye style target. Throws nearer the target earned more points. Both groups performed equally well. Now one group spent a minute “thinking about their previous throwing experience including their strengths and weaknesses as a thrower”; the other group acted as controls and just waited out the time. Both groups then performed 10 more throws. The students who’d spent time thinking about themselves showed inferior performance compared with their earlier standard; the control group maintained their skill level.

“A simple manipulation designed to activate the self-schema … was sufficient to degrade performance,” the researchers said.

Next, 37 more students were recruited (18 women) and split into two groups. They spent time practising using a bat to hit golf-ball-sized balls, travelling at 25mph, at a target. None of them had played organised baseball or softball in the last year.

Over two practice days, all the students completed a writing task in the various short breaks between hitting. Those in the self-reflective group wrote for one minute either about their experience at baseball; their personal attributes as an athlete; their emotional experiences related to baseball; or their strengths and weaknesses as a hitter (all the different topics were covered during the different breaks). The other group acted as controls – they spent the same breaks writing about objects in the laboratory where the training took place, either focusing on colours or shapes or the names of the objects.

A few days later, the students had a final go at the ball hitting challenge. Adjusting for initial performance differences, the control group significantly outperformed the self-reflective group. The control group also outperformed the self-reflective group when the task was changed slightly by speeding up the delivery of the balls.

McKay and his team said their results were surprising – as one of their participants remarked, you’d think spending time writing about one’s past glory days at baseball would have provided a confidence boost, and maybe rekindled old movement patterns too. Instead, the researchers said their results showed how the “ostensibly innocuous activity of contemplating one’s own experiences, emotions, strengths, weaknesses and attributes, might have activated a lurking neural self-network that interfered with the process of motor learning.”

Critics may feel this study raises as many questions as it answers – with no measures of muscle tension, mood, or myriad other possible mediating factors, we’re left in the dark about why writing about the self appeared to be detrimental to motor skill learning.

However, the researchers believe their study has broken new ground. Where previous research has shown instructions to focus on parts of the body can be harmful to performance, McKay and his team said their “experiments are the first to show that self-reflection alone is sufficient to interfere with motor skill activation and performance.”

If you’ve been on the internet at all this year, you may have noticed an explosion of fiction-based personality quizzes. What house would you belong to in Hogwarts—or in Westeros? Which “Mad Man” are you? What Shakespeare role were you born to play?

While there is a clear, bright line between real people and imaginary people (I exist, Hermione Granger does not), there is no such line dividing real and imaginary relationships. (As far as you are concerned, dear reader, both Ms. Granger and I are studious women who exist only on the page or screen.) Even in our most intimate personal relationships, we are often interacting with a mental model of our partner or parent, imagining their current state of mind, or how they would respond to whatever situation we find ourselves in. Although operationalised in this article as relationships with fictional characters, other researchers have included connections with real people whom we don’t personally know (artists, politicians, athletes) and historical figures in the spectrum of parasocial relationships.

Parasocial relationships enable us to explore emotional and social realities without the risks inherent in the real world. The authors dryly note: “Readers and viewers are protected from social rejection and the physical danger of threatening circumstances; thus, forming a relationship with an interesting but potentially dangerous character (e.g., Tony Soprano) does not present the same obstacles in the narrative world as it might in the physical world.”

Can our fictional friends make us better people?

Other than safe distance, what might a relationship with a fictional mobster have to offer? This study examines the extent to which parasocial relationships facilitate “self-expansion,” or the sense of greater possibilities for the self. Real-world relationships lead to self-expansion when people view their relationship partner as “a valuable source of new knowledge and experiences.” Can fictional characters have the same effect of helping us envision a bigger, better version of ourselves?

They can. University students were asked to read an unfamiliar short story about a young person competing in a race, and then to rate the story’s protagonist, along with two real-life contacts (a close friend and a classmate) and two television characters (the participants’ favourite and a non-favourite character), across various dimensions of likability and relevance to the self. Self-expansion was measured by a 14-item scale (e.g. “How much does X help to expand your sense of the kind of person you are?” and “How much has knowing X made you a better person?”) and was found to increase with the intensity of the relationship, regardless of its real-life or fictive origin.

Close friends inspired the most self-expansion, followed by favourite television characters, then non-favourite characters, and finally casual acquaintances. The more a character was perceived as being like the participant’s ideal (as opposed to actual) self, the stronger the effect. Participants’ “narrative transport,” or the degree to which they felt engaged and absorbed in a fictive world (this was manipulated via instructions given to participants before reading the short story) also enhanced self-expansion.

While no one claims that parasocial relationships can replace mutual ones, the authors see their study as largely good news, as it implies that our capacity to learn and grow from relationships is not constrained by our daily environment. “[I]mmersion into narrative worlds can create opportunities for growth in which experiences, perspectives, and knowledge of fictional characters prompt readers’ own development,” the authors maintain, pointing out that parasocial relationships can provide role models “especially for those who are temporarily or chronically isolated, those who have limited social relationships, or those with homogenous social groups.”

The authors note two shortcomings of the study—the lack of developmental and personality perspectives. What are the effects of long-term parasocial relationships? Are they as beneficial as brief ones, or are there potential dangers to an extended commitment to someone, real or imagined, who can never reciprocate? Secondly, why are some people more likely than others to identify themselves with fictional characters, and use that identification as a source of personal growth?

Personal experience suggests, unsurprisingly, that both temperament and upbringing play a role. Self-enhancing parasocial relationships require a fair amount of imagination and psychological-mindedness. Real-life peers and authority figures, meanwhile, can encourage such relationships or mock them as “imaginary friendship” or a pop-culture obsession. Of course organised religion has harnessed the power of parasocial relationships for self-betterment for millennia: Asking one’s self “What would Jesus [or Mohammed, Buddha, or Martin Luther King Jr.] do?” is, after all, a classic case of transcending the self through a relationship with a person one has never met.


We know self-talk can help people’s self-control (e.g. “Don’t do it!”) and boost their morale (e.g. “Hang in there!”) in sporting situations. However, before now, no-one has investigated whether self-talk is more effective depending on whether you refer to yourself in the grammatical first person (i.e. “I can do it!”) or the second person (i.e. “You can do it!”).

Sanda Dolcos and her team first asked 95 psychology undergrads to imagine they were a character in a short story. The character was faced with a choice [we’re not given any detail about these vignettes], and the participants were asked to write down the advice they would give themselves in this role, to help make the choice. Crucially, half the participants were instructed to use the first-person “I” in their self-advice, the others the second-person “You”. Right after, the participants completed a series of anagrams. Those who’d given their fictional selves advice using “You” completed more anagrams than those who’d used the first-person “I” (17.53 on average vs. 15.96).

A second study with 143 more psych students was similar, but this time the students gave themselves self-advice specifically in relation to completing anagrams, and this time the researchers finished up the study by tapping the students’ attitudes to anagrams, and their intentions to complete more in the future. Students who gave themselves self-advice in the second-person managed to complete more anagrams, and they said they would be happier to work on more in the future (as compared with students who used the first-person, or a control group who did not give themselves advice). The greater success rate for the second-person students was mediated by their more positive attitudes.

Finally, 135 more psych students wrote down self-advice in relation to exercising more over the next two weeks. Those who referred to themselves as “You” in that advice subsequently stated that they planned to do more exercise over the next two weeks, and they also went on to report more positive attitudes towards exercising, than those students who referred to themselves as “I”.

Dolcos and her colleagues said theirs was the “first experimental demonstration” that second-person self-talk is more effective than the first-person variety, thus complementing “past intuitions and descriptive data” suggesting that people resort to second-person self-talk when in more demanding situations. The researchers speculate that second-person self-talk may have this beneficial effect because it cues memories of receiving support and encouragement from others, especially in childhood. “Future work should examine ways of actually training people to strategically use the second-person in ways that improve their self-regulation …” they said.

Many readers will likely be disappointed by the dependence on purely psychology student samples. You might wonder too whether writing down self-advice is truly equivalent to internal self-talk; and maybe you’ll have doubts about the extent to which anagram performance and exercising intentions tell us about potential effects in the real world. Another issue is that the study didn’t investigate people’s preferences for self-talk – is it a blanket rule that second-person self-talk is superior for everyone?

_________________________________
Dolcos, S., & Albarracin, D. (2014). The inner speech of behavioral regulation: Intentions and task performance strengthen when you talk to yourself as a You. European Journal of Social Psychology. DOI: 10.1002/ejsp.2048


How does a man feel if another man opens a door for him? The researchers Megan McCarty and Janice Kelly conducted a field study to find out.

Male research assistants waited near two university building entrances and looked out for men and women approaching. On some trials the research assistant went through a door adjacent to the arriving person (so that the person had to open the door for themselves). On other trials, the research assistant leaped into action, held open the door for the approaching person, then stepped aside for them to enter first. Once inside, the targeted men and women were approached by a female assistant bearing a clipboard. She asked them questions about their self-esteem and self-efficacy (measured by their agreement with statements like “I feel that I have a number of good qualities” and “I can learn almost anything if I set my mind to it”). In total 221 people were tested this way (122 women).

Men who had the door held open for them scored lower on self-esteem and self-efficacy than men who didn’t have the door held open for them. Women’s self-esteem and self-efficacy scores were no different regardless of whether a man held a door open for them or not.

McCarty and Kelly said their findings are likely explained by the fact that men holding open doors for other men is socially unusual, and could be taken by the recipient of the gesture as implying that they look needy and vulnerable. This likely clashes with their masculine self-concept and leaves them feeling deflated. Women, by contrast, are more used to men holding doors open for them, and aren’t so bothered by the connotations (although note there is a literature on the potentially harmful effects of benevolent sexism).

The researchers’ interpretation is based on the idea that door-holding for men is socially unusual (or in the jargon, “non normative”). This was backed up by some earlier observation and survey work. When research assistants spent time observing university entrances, they found that men and women were equally likely to have the door held for them by the person passing through an entrance ahead of them, but that women more often than men had the door held for them in a chivalrous manner, in which the door-opening person steps aside and lets the recipient of the gesture pass through first. This chivalrous door-holding was the form used in the field study. A further survey of male and female students also found that they thought it was more typical for both types of door-holding to be performed for women than for men.

“This work demonstrates that simple but unexpected helping behaviours as fleeting and seemingly innocuous as door holding can have unforeseen negative consequences,” the researchers said. “Thus, this work contributes to a growing literature on the consequences of helping for the recipients of help, as well as the growing literature on the influence of seemingly inconsequential everyday social behaviours.”

Critics might point out that it was a shame the researchers didn’t look at the effect of door-opening by women as well as by men. Perhaps the adverse effects on men’s self-esteem were due to receiving help from a member of the same sex, and perhaps women would have shown these effects too if women had been doing the door opening. For similar reasons it would have been preferable if, inside the buildings, both male and female research assistants had asked the questions about self-esteem and self-efficacy. Finally, future research needs to examine social context – for instance, would men still experience a dent to their self-esteem if a male subordinate opened the door for them?


You wouldn’t believe the amount of ink spilled by neuroscientists and psychologists attempting to explain the simple fact that we can’t tickle ourselves. A popular, long-standing theory posits that the self-tickle failure occurs because of the way that the brain cancels out sensations caused by its own movements. To do this, so the theory states, the brain uses the motor command underlying a given action to make a prediction of the likely sensory consequences of that action. When incoming sensory information matches the prediction, it’s recognised as self-generated and cancelled.

If this explanation is true, then any situation that confuses the brain’s ability to predict the sensory consequences of its own actions should scupper the sensory cancellation process, thereby making self-tickling a possibility. George Van Doorn and his colleagues have put this principle to the test in dramatic fashion, measuring the potential for self-tickling in 23 participants who underwent a body-swap illusion (the article is open access).

The experimental set-up involved each participant sitting opposite the experimenter. The participant wore a pair of goggles that displayed a video feed from a camera that was either placed forward-facing on the participant’s own head (giving them a conventional first-person perspective), or was positioned forward-facing on the experimenter’s head, thus giving the participant a view from the experimenter’s perspective and provoking a body-swap illusion.

During both of these camera arrangements, the participant and experimenter each held one end of a wooden rod with foam at each end. Either the participant moved the rod rhythmically with their right hand, causing the foam to rub against their own left palm (potentially causing self-tickling) and the experimenter’s left palm; or the experimenter moved the rod, causing the foam to rub against the participant’s left palm (i.e. potential for tickling by another person) and his own left palm.

During the body-swap illusion, the participants said they felt the sensation of the foam not where their real hand was located, but at the position of the experimenter’s hand. Given the illusion, they perceived this to be their own hand, even though it looked like someone else’s. Crucially, even in this strange situation, the participants were still unable to tickle themselves if they were the ones moving the rod (they felt the foam, but it didn’t tickle). They felt much more of a tickling sensation when it was the experimenter who moved the rod.

The classic theory for why we can’t tickle ourselves cannot explain why tickling remains impossible even in such extreme illusory contexts, when the brain’s ability to predict the sensory outcomes of its actions is thrown into disarray. Moreover, self-tickling was still not experienced in variations of the experimental setup in which the body-swap illusion was combined with the “rubber hand illusion” and the movement of the foam was felt in a baseball bat viewed from the experimenter’s perspective!

Van Doorn and his colleagues said their “remarkable” findings are consistent with an alternative neuroscience theory that’s gaining currency. This “active inference” theory states that self-generated movements cause non-specific suppression of sensory input, regardless of whether predictions of the consequences of one’s own movement are accurate or not.

The researchers concluded: “We asked ‘can you tickle yourself if you swap bodies with someone else?’ The short answer is ‘no’.”


To be capable of laughing at oneself is usually considered a mark of good character and the foundation of a robust sense of humour. Yet this is a behaviour that’s barely been touched on by psychologists. Opinions have been expressed – for example, La Fave and his colleagues thought that laughing at oneself was never genuine and couldn’t be a truly happy event. But for largely practical reasons, experiments on the topic are non-existent. Now Ursula Beermann and Willibald Ruch have shown one way to do it.

Sixty-seven undergrads rated their own ability to laugh at themselves and they nominated one or two peers to provide third-party ratings of the same. Sneakily, whilst the participants filled out these and other questionnaires at a computer, a screen camera took pictures of them. A little later the participants were asked to rate distorted pictures of the faces of unfamiliar men and women. To their surprise, included in the selection were the sneaky photos taken earlier of themselves. These photos of the participants had also been distorted to be, for example, stretched wide as if looking in a spoon (the Mac “Photobooth” software was used to create these effects).

The participants were filmed while they rated the photos so the researchers could later analyse the footage to see whether they laughed at the distorted images of themselves. Ekman’s Facial Action Coding System, which focuses on the flexing of specific facial muscles, was used to decode the participants’ facial expressions, and in particular to look for signs of genuine “Duchenne smiles”, which are symmetrical and involve creasing of the muscles around the eyes. Signs of laughter were also noted.

The findings seemed to validate the new methodological approach. Although 80 per cent of participants flashed a genuine smile at least once on seeing their own distorted image, it was those who claimed to be able to laugh at themselves, and whose peers agreed with this verdict, who showed more frequent and intense smiling and laughter in response to the distorted self-images, and fewer signs of fake smiles or negative emotion. On the other hand, there was no correlation between participants’ ability to laugh at themselves (based on self- and peer-report) and the amount of laughter triggered by distorted images of other people’s faces. This suggests that proclivity for laughing at oneself really is a distinct trait, separate from a general readiness to laugh.

Finally, those participants who laughed more at themselves tended to have more cheerful, less serious dispositions and to be in a better mood on the day of testing.

“…[T]he current study succeeded in providing the first empirical evidence on the phenomenon of laughing at oneself,” the researchers said.
_________________________________

The performance of young children on the ‘mirror self-recognition test’ varies hugely across cultures, a new study has shown. This is the test that involves surreptitiously putting a mark on a child’s forehead and then seeing how they react when presented with their mirror image. Attempts by the child to touch or remove the mark are taken as a sign that he or she recognises themselves in the mirror. Studies in the West suggest that around half of all 18-month-olds pass the test, rising to 70 per cent by 24 months. Chimps, orangutans, dolphins and elephants have also been shown to pass the test, and there’s recent debate over whether monkeys can too.

Tanya Broesch and her colleagues began by taking a simplified version of the mirror self-recognition test to Kenya, where they administered it to 82 children aged between 18 and 72 months. This version of the test involved a small, yellow post-it note rather than a red splodge, and children weren’t given the usual verbal prompts such as ‘who’s that in the mirror?’. Amazingly, just two of the children ‘passed’ the test by touching or removing the post-it note. The other eighty children ‘froze’ when they saw their reflection – that is, they stared at themselves but didn’t react to the post-it note.

Next, Broesch and her team took their test to Fiji, Saint Lucia, Grenada, Peru, Canada and the USA, where they tested 133 children aged between 36 and 55 months. The performance of the North American children was in line with past research, with 88 per cent of the US kids and 77 per cent of the Canadians ‘passing’ the test. Rates of passing in Saint Lucia (58 per cent), Peru (52 per cent) and Grenada (51 per cent) were significantly lower. In Fiji, none of the children ‘passed’ the test.

So, what’s going on? Are children in these non-Western nations seriously delayed in their mirror self-recognition? The researchers don’t think so. First of all, they deliberately tested a wide age range – in Kenya up to age six – and they think it’s highly unlikely mirror self-recognition could be delayed that far. ‘Our impression,’ the researchers said, ‘was that they [the children] understood that it was themselves in the mirror, that the mark was unexpected, but that they were unsure of an acceptable response and therefore dared not touch or remove it.’

Inspired in part by past research conducted in Cameroon, in which children who failed the mirror test tended to be the most compliant and obedient, Broesch and her colleagues speculated that the performance in the non-Western, more interdependent cultures may have been affected by the fact that children in these societies are often discouraged from asking questions (they’re expected to learn by watching). ‘This is in sharp contrast with the independence and self-initiative that tends to be encouraged and nurtured in the Industrial West,’ the researchers said. Another factor could be the non-Western children’s relative lack of familiarity with mirrors.

More research is needed to test the truth of these assertions. Meanwhile, this study provides a compelling example of why we must be cautious when extrapolating from Western psychology research. ‘Negative results (whether in monkeys or humans) must be examined more closely and results remind us that transporting culture-specific tests among diverse human populations has the potential to lead to flawed interpretations of cognitive differences and developmental processes,’ the researchers said.
_________________________________