Saturday, 16 November 2013

Earlier this year I
committed my research group to pre-registering all studies in our recent BBSRC
grant, which includes fMRI, TMS and TMS-fMRI studies of human cognitive
control. We will also publicly share our raw data and analysis scripts,
consistent with the principles of open science. As part of
this commitment I’m glad to report that we have just published our first pre-registered
study protocol at the Open Science Framework.

For those unfamiliar with
study pre-registration, the rationale is simply this: to prevent different
forms of human bias from creeping into hypothesis-testing, we need to decide before starting our research what our hypotheses are and how we plan to test them. The best way to
achieve this is to publicly state the research questions, hypotheses, outcome
measures, and planned analyses in advance, accepting that anything we add or
change after inspecting our data is by
definition exploratory rather than pre-planned.

To many scientists (and
non-scientists) this may seem like the bleeding obvious, but the truth is
that the life sciences are suffering a crisis in which research that is purely
exploratory and non-hypothesis-driven masquerades as hypothetico-deductive.
That’s not to say that confirmatory (hypothesis-driven) research is necessarily
worth any more than exploratory (non-hypothesis driven) research. The point is
that we need to be able to distinguish one from the other, otherwise we build a false certainty in the theories we produce. Psychology and cognitive
neuroscience are woeful at making this distinction clear, in part because they
ascribe such a low priority to purely exploratory research.

Pre-registration helps
solve a number of specific problems inherent in our publishing culture,
including p-hacking
(mining data covertly for statistical significance) and HARKing
(reinventing hypotheses to predict unexpected results). These practices are common
in psychology because it is difficult to publish anything in ‘top journals’
where the main outcome was p > .05
or isn’t based on a clear hypothesis.

Evidence of such
practices can be found in
the literature and all around us. Just last week at the Society for
Neuroscience conference in San Diego, I had at least three conversations where
presenters at posters would say something like: “Look at this cool effect. We
tested 8 subjects and it looked interesting so we added another 8 and it became
significant”. Violation of stopping rules is just one example of how we think
like Bayesians while being tied to frequentist statistical
methods that don’t allow us to do so. This bad marriage between thought and
action endangers our ability to draw unbiased inferences and, without appropriate correction
for Type I error, elevates the rate of false discoveries.
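The inflation caused by this kind of optional stopping is easy to demonstrate with a quick simulation (a hypothetical sketch, not part of the original post): under a true null, test 8 simulated subjects, and whenever the result misses significance, add 8 more and test again. Even with only one "peek", the false positive rate climbs well above the nominal 5%.

```python
import math
import random

def p_value(sample):
    """Two-sided z-test of mean = 0 with known SD = 1 (kept simple for the sketch)."""
    n = len(sample)
    z = (sum(sample) / n) * math.sqrt(n)
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

def run(optional_stopping, trials=20000, seed=1):
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        sample = [rng.gauss(0, 1) for _ in range(8)]       # the null is true
        if p_value(sample) < 0.05:
            hits += 1
        elif optional_stopping:
            sample += [rng.gauss(0, 1) for _ in range(8)]  # peek, then add 8 more
            if p_value(sample) < 0.05:
                hits += 1
    return hits / trials

print(run(False))  # close to the nominal 5% Type I error rate
print(run(True))   # noticeably higher: optional stopping inflates false positives
```

The fixed-sample rate hovers around .05 as advertised; the peek-and-extend rate does not, even though each individual test is "valid" at the 5% level.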

In May, the journal Cortex launched a new
format of article that attempts to solve these problems by incentivising pre-registration.
Unlike conventional publishing models, Registered Reports are peer reviewed
before authors conduct their experiments and the journal offers provisional
acceptance of final papers based solely on the proposed protocol. The model at Cortex not only prevents p-hacking and
HARKing – it also solves problems caused by low statistical
power, lack of data transparency, and publication bias. Similar
initiatives have been launched or approved by several other journals, including
Perspectives on Psychological Science,
Attention Perception & Psychophysics,
and Experimental Psychology. I’m glad
to say that 10 other journals are currently considering similar formats, and so
far no journal to my knowledge has decided against offering pre-registration.

In June, I wrote an open
letter to the Guardian with Marcus
Munafò and more than 80 of our colleagues who sit on editorial boards. Together we
called for all journals in the life sciences to offer pre-registered article
formats. The response to the article was overall
neutral or positive, but as expected not
everyone agreed. One of the most striking features of the negative
responses to pre-registration was how the critics targeted a version of pre-registration
we did not propose. For instance, some felt that the Cortex model would prevent publication of serendipitous findings or
exploratory analyses (it doesn't), that authors would be “locked” into
publishing with Cortex (they aren’t),
or that we proposed the model as mandatory or universal (it is
explicitly neither). I would ask those who responded negatively to reconsider the
details of the Cortex initiative
because we don’t disagree nearly as much as it seems. In regular seminars
I give on Registered Reports at Cortex
I include a 19-point list of FAQs and responses to each, which you can
read here.
I will regularly update this link as new FAQs are added.

I believe we are in the early stages of a revolution in the way we do research – one not driven
by pre-registration per se, and
certainly not by me, but by the combination of converging future-oriented
approaches, including emphasis on replication (and replicability), open
science, open access publishing, and pre-registration. The pace of evolution in
scientific practices has shifted up a gear. Clause 35 of the revised
Declaration of Helsinki now explicitly requires some form of study
pre-registration for medical research involving human participants. Although
much work in psychology and cognitive neuroscience isn’t classed as ‘medical’,
many of the major journals that publish basic research also ask authors to adhere to the Declaration, including
the Journal
of Neuroscience, Cerebral
Cortex, and Psychological
Science.

The revised Declaration of
Helsinki has caused some concern among psychologists, and I should make it
clear that those of us promoting pre-registration as a new option for journals had
no role in formulating these revised ethical guidelines. However, we shouldn’t necessarily
see them as a problem. There are many simple and non-bureaucratic ways to
pre-register research (such as the OSF),
even if the journal-based route is the only one to reward authors with advance
publication.

One valid
point that has been made in this debate is that those of us who are promoting pre-registration should practice
what we preach, even when there is no journal option currently available (and
for me there isn’t another option because Cortex
– where I am section editor – is so far the only cognitive neuroscience journal
offering pre-registered articles). Some researchers, such as Marcus Munafò,
already pre-register on a routine basis and have done for some time. For my
group it is a newer venture, and here is our first attempt. Our protocol describes an fMRI experiment of response
inhibition and action updating that forms the jumping-off point for several upcoming
studies involving TMS and concurrent TMS-fMRI. We are registering this protocol
prior to data collection. All comments and criticisms are welcome.

Writing a protocol for an
fMRI experiment was challenging because it required us to nail down in advance our decisions and
contingencies at all stages of the analysis. The sheer number of seemingly
arbitrary decisions also reinforced my belief that many, if not most, fMRI
studies are contaminated by bias (whether conscious or unconscious) and
undisclosed analytic flexibility. I found pre-registration
rewarding because it helped us refine exactly how we would go about answering our research questions. There is much to be said for taking the time to
prepare science carefully, and time spent now will be time saved when it comes
to the analysis phase.

Most of the work in our
first pre-registration was undertaken by two extremely talented young
scientists in my team: PhD student Leah
Maizey and post-doctoral researcher Chris
Allen. Leah and Chris deserve much praise for having the courage and conviction
to take on this initiative while many of our senior colleagues 'wait and see'.

Pre-registration is now a
normal part of the culture in my lab and I hope you’ll consider making it a
part of yours too. Embracing the hypothetico-deductive method helps protect the outcome of hypothesis-driven research from our inherent weaknesses as human practitioners. It also prompts us to consider deeper questions. As a community we need to reflect on what sort of scientific culture
we want future generations to inherit. And when we look at the current status quo of questionable research practices,
it leads us to ask one simple question: Who are we serving, us or them?

NeuroChambers is my personal blog, where I write mostly about
science-related things but occasionally post more personal
stuff.

First, a bit about me. I’m a researcher at the Cardiff
University School of Psychology. I’m originally from Australia, where I did
a PhD about 10 years ago in an area called ‘psychoacoustics’ – the psychology
of auditory perception. After that I got interested in the relationship between
the brain and cognition, so I moved to an area called cognitive neuroscience,
which bridges the gap between neurobiology and traditional experimental psychology.
I now run a research group in Cardiff, where we use brain imaging and brain
stimulation methods to understand human cognitive control and attention. At the
moment I’m particularly interested in the psychology and neuroscience of
response inhibition, impulse control, and addiction.

I started NeuroChambers in 2012 after taking part in a debate
on science journalism at the Royal Institution. Following some energetic arguments
in the press about the good,
bad, and ugly of science reporting, we came to the conclusion that scientists
and journalists need to cooperate far more constructively in the service of public understanding
(you can watch the debate here
and read more about it here).
One area, in particular, that I feel scientists need to work on is the process of communicating science to non-scientists. And a great way to
do this, of course, is through blogging.

There are four main types of article I post here on my personal blog:

1. Research
Briefings: these are (hopefully) accessible summaries of our recent research. Whenever
we publish an article in a scientific journal that I think might have broader
appeal, I write an overview of the work for a general audience. Here are a few
I wrote about human
vision, impulse
control, and human
brain stimulation. I’m not the only scientist to do this – Mark Stokes at Oxford University also does it over at Brain Box
(and does it well!)

2. Calls to Arms:
I’m a psychologist and I think psychology is an important and fascinating
discipline. But I’m actually quite critical about what passes for acceptable research
practices these days, and lately I’ve been working on possible solutions. One approach I’ve been advocating is called study
pre-registration. In short, what this means is that scientists should specify
the predictions and statistical tests in their experiments before they conduct
them. Doing so helps us stay true to the scientific method and avoid fooling
ourselves into believing that we’ve discovered something real when in fact we're only staring at the reflection of our own bias. For me, study pre-registration is
common sense, but
not everyone agrees. Psychological science is in the midst of a revolution, and revolutions are never easy. We’ll be writing more about
this at Head Quarters as we gradually reform the field.

Another area that I’ve been fairly vocal about recently is
the importance of evidence-based policy in government. Last year, Mark Henderson, head of Communications at
the Wellcome Trust, published a very important book called the Geek Manifesto, which
explains why science is so important and yet so undervalued in modern politics.
Mark’s book inspired me and many other scientists to do something proactive to
address this issue. Together with Tom
Crick and several colleagues – as well as 60 generous donors from across the UK
– I helped coordinate a campaign to send
one copy of the Geek Manifesto to each elected member of the National Assembly for Wales. I’m also following up on this initiative with Natalia Lawrence at
the University of Exeter. Natalia and I are aiming to establish a rapid-response ‘evidence
information service’ for politicians and civil servants.

3. Advice columns for
students and junior scientists: These posts will have less general appeal
as they're usually written for those already pursuing a career in science. Still, my
most popular post on this blog has been a (probably overly) blunt list
of do’s and don’ts for the aspiring PhD student.

4. Whinges: I’ve
lived in Britain long enough to cherish the art of a good whinge, and part of
being a scientist is challenging bullshit. I occasionally write critical
pieces questioning (what I see as) flawed
or overegged science, or bad practice. You’ll see more of this style of
piece over at Head Quarters as well.

Also, a warning. As you’ll have noted above, I’m a bit
sweary at times (for which you can blame my Australian upbringing). Apologies in
advance if I write or say something that offends! Don’t worry, my Guardian posts will be more civilised – usually!

So that’s a quick overview of me and the things I write about at NeuroChambers. Meanwhile stay tuned for more posts at Head Quarters – we’ve got some exciting topics
in the pipeline.

Finally, for no reason whatsoever, here’s a picture of our
two cats...doing what cats do best.

The Research

Let’s start
by talking about the science. Our aim in this study was to test for a link between
two types of visual spatial bias called ‘neglect’ and ‘pseudoneglect’.

Neglect (also
known as ‘unilateral
neglect’) is a neurological syndrome that arises after brain injury – most
often due to a stroke that permanently damages the right hemisphere. Patients
with neglect present with a striking lack of attention and awareness to objects
presented on the left side of their midline. Such behaviours may include ignoring
food on the left side of a dinner plate or failing to draw the left side of
objects. Importantly, the patients aren’t simply blind on their left side. The visual parts of the brain are generally
intact while the damage is limited to parietal, temporal, or frontal cortex.

Neglect has
been studied for many years and we know a lot
about how and why it arises. But one unanswered question is how the spatial
bias of neglect relates to other spatial biases that are completely normal. We felt this was an important question because
we don’t know enough at a basic level about how the brain represents space, so testing for neurocognitive links
between spatial phenomena helps us build better theories. Furthermore, if there happens
to be a predictive relationship between neglect and other forms of bias, we may
be able to estimate the likely severity of neglect before a person has a stroke. This could have a range of useful applications
in clinical therapy and management.

Enter ‘pseudoneglect’. Pseudoneglect
is a normal bias in which people ignore a small part of their left or right side of space. One simple way to measure this is to ask someone to mark
the centre of a straight horizontal line. Most people will
misbisect the line to the left or right of its true centre. This effect is tiny (in the order of millimetres) but reliable.
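To make the measure concrete, here is a minimal sketch of how a bisection bias score is typically computed (the numbers are hypothetical, not data from the study): average the signed deviation of each mark from the true centre, with negative values meaning marks to the left of centre.

```python
import statistics

# Hypothetical bisection marks in mm, signed relative to the true centre of the line
# (negative = mark placed left of centre, i.e. the classic leftward pseudoneglect).
marks_mm = [-2.1, -1.4, -3.0, -0.8, -1.9, -2.5, -1.2, -2.8]

bias = statistics.mean(marks_mm)              # mean signed bisection error
direction = "left" if bias < 0 else "right"   # sign gives the direction of pseudoneglect
print(f"mean bias: {bias:.2f} mm ({direction} pseudoneglect)")
```

Averaging over many trials is what makes such a tiny effect (a millimetre or two) statistically reliable.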

In this
study we wanted to know whether patterns of pre-existing bias, as reflected by
pseudoneglect, predict the patterns of actual neglect following neurological
interference. Of course, we couldn’t give our participants permanent brain
injury, so we decided to use transcranial magnetic stimulation (TMS) to
simulate some of the effects of a brain lesion. Using a particular kind of
repetitive TMS called ‘theta burst stimulation’, we temporarily suppressed activity
in parts of the brain while people did tasks that measured their spatial bias. To
see if there was a link between systems, we then related these effects of TMS
on spatial bias to people’s intrinsic pseudoneglect.

Consistent with previous studies, we found that TMS of the right parietal cortex
induced neglect-like behaviour – compared to a sham TMS condition (placebo), people bisected lines more to the right of centre, indicating that TMS caused a
subtle neglect of the left side of space. This effect lasted for an hour (upper figure on the left). But what was particularly striking
was that the effect only happened in the participants who already showed an
intrinsic pattern of left
pseudoneglect. In contrast, those with right
pseudoneglect at baseline were immune to the effects of TMS (lower figure on the left).

There were
a number of other aspects to the study too. We compared the effect of TMS using
two different methods of estimating bias, and we also asked whether the TMS influenced
people’s eye movements (it didn't). I won’t go into these details here but
the paper covers them in depth.

What do
these results mean? I think they have two implications. First, they provide evidence that neglect and pseudoneglect arise from
linked or common brain systems for representing space – and they provide
a biological substrate for this association in the right parietal cortex. Second, the
results provide a proof of principle that initial spatial biases can predict
subsequent effects of neurological interference. In theory, this could one day lead
to pre-diagnostic screening to determine whether a person is at risk of more severe neglect symptoms in the event of suffering a stroke.

All that said, we
need to be cautious. There is a world of difference between the subtle and
reversible effects of TMS and the dramatic effects of brain injury. We simply don't know whether
the predictive relationship found here would translate to patients – that
remains to be established. Also, our study had a small sample size, has yet to be
replicated, and provides no indication of diagnostic or prognostic utility. But
I think these preliminary results provide enough evidence that this avenue is
worth pursuing.

Open Access, Open Science, and
Publishing Reform

Apart from
the science, our paper represents a milestone for my group in terms of our publishing
practices. This is our first article in PLOS ONE and our first publication in
an open access journal. Also, it is our first attempt at open science. Interested
readers can download our data and analyses from Figshare (linked here
and in the article
itself). I increasingly feel that
scientists like me who conduct research using public funds have an obligation
to make our articles and data publicly available.

This paper
also represents a turning point for me in terms of my attitude to scientific
publishing. We originally submitted this manuscript in 2012 to the journal Neuropsychologia, where it was rejected
because some of our results were statistically non-significant. Rejecting
papers on the basis of ‘imperfect’ results is harmful to science because it
enforces publication
bias and pushes authors to engage in a host
of questionable research practices to generate outcomes that are neat and eye-catching.
With some ‘finessing’ of the analyses, we could probably have published our
paper in a more ‘traditional’ outlet. But we decided to play a straight bat,
and when we were penalised for doing so I realised on a very personal level that
there was something deeply wrong with our publishing culture. As a consequence I severed
my relationship with Neuropsychologia.

My small
part in this reform traces back to having this manuscript rejected by Neuropsychologia editor Jennifer
Coull in September 2012. So, in a very true sense, I owe Jennifer a debt of
gratitude for giving me the kick in the butt I needed. Sometimes rock bottom
can be a great launching pad.

Wednesday, 10 April 2013

Last October I joined the editorial board of Cortex, and my first
order of business was to propose a new format of article called a Registered Report.
The essence of this new article is that experimental methods and proposed
analyses are pre-registered and peer reviewed before data is collected. This
publication model has the potential to cure a host of bad practices in science.

In November the publisher approved the new
article format and I’m delighted to announce that Registered Reports
will officially launch on May 1st. I’m especially proud that Cortex will
become the first journal in the world to adopt this publishing mechanism.

For those encountering this initiative for the
first time, here are some links to background material:

Why should we want to review papers before
data collection? The reason is simple: as reviewers and editors we are too
easily biased by the appearance of data. Rather than valuing innovative
hypotheses or careful procedures, too often we find ourselves applauding
“impressive results” or bored by null effects. For most journals, issues such
as statistical power and technical rigour are outshone by novelty and
originality of findings.

What this does is furnish our environment with
toxic incentives. When I spoke at the Spot On conference last year, I began by
asking the audience: What is the one aspect of a scientific experiment that a
scientist should never be pressured to control? After a pause – as though it
might be a trick question – one audience member answered: the results. Correct!
But what is the one aspect of a scientific experiment that is crucial
for publishing in a high-ranking journal? Err, same answer. Novel,
ground-breaking results.

The fact that we force scientists to touch the
untouchable is unworthy of a profession that prides itself on behaving
rationally. As John Milton says in The Devil’s Advocate,
it’s the goof of all time. Somehow we've created a game in which the rules are
set in opposition.

With little chance of detecting true effects,
experimentation reduces to an act of gambling. Driven by the need to publish,
researchers inevitably mine underpowered datasets for statistically significant
results. No stone is left unturned; we p-hack, cherry
pick, and even reinvent study hypotheses to "predict" unexpected
results. Strange phenomena begin appearing in the literature that can only be
explained by such practices – phenomena such as poor repeatability, an implausibly high prevalence
of studies that support their stated hypotheses, and a
preponderance of articles in which obtained p values fall just below the significance threshold.
More worryingly, a recent
study by John et al shows that these behaviours are not the actions of a
naughty minority – they are the norm.

None of this even remotely resembles the way
we teach science in schools or undergraduate courses, or the way we dress it up
for the public. The disconnect between what we teach and what we practice is so
vast as to be overwhelming.

Registered Reports will help eliminate
these bad incentives by making the results almost irrelevant in reaching
editorial decisions. The philosophy of this approach is as old as the
scientific method itself: If our aim is to advance knowledge then editorial
decisions must be based on the rigour of the experimental design and likely
replicability of the findings – and never on how the results looked in
the end.

We know that other journals are monitoring Cortex to gauge
the success of Registered
Reports. Will the format be popular with authors? Will peer reviewers be
engaged and motivated? Will the published articles be influential? This success
depends on you. We'll need you to submit your best ideas to Cortex –
well thought-out proposals that address important questions –
and, crucially, before you’ve collected the data. We need your support to help
steer scientific publishing toward a better future.

For my part, I’m hugely excited about Registered Reports
because it offers hope that science can evolve; that we can be self-critical,
open-minded, and determined to improve our own practices. If Registered
Reports succeeds then together we can help reinvent publishing as it was
meant to be: rewarding the act of discovery rather than the art of performance.

In the process of setting
up TMS – a new technique for Cardiff at the time – I had to submit a lengthy application
for ethics approval. After several weeks of discussion, the committee and I
decided that building a TMS lab offered the opportunity to do
some novel research on the side effects of brain stimulation.

Since it was developed
in 1985, TMS has been generally considered safe for human use. Serious adverse effects,
such as seizures, are rare, and few incidents have been reported since international
guidelines for TMS safety were established in 1998 (updated in 2009). However,
TMS has been suspected to cause a range of more mild adverse effects, such as
headache and nausea. Much less is known about these lesser side effects, even
though they can be very unpleasant for participants.

So back in 2008 we decided
to put in place a system for monitoring side effects. After every experimental
session involving TMS, participants were given a form to complete that listed a
series of possible symptoms occurring within 24 hours of the session (the forms
can be downloaded here). Then, when the participant returned for their next
session, we collected and archived these forms. Over several years of TMS
experiments – and many different variants of the technique – we amassed more
than 1000 such forms from over 100 unique participants. Last year, after
four years, we decided we had enough data to commence the analysis.

I’m now happy to report that the paper documenting this analysis has appeared in the
journal Clinical Neurophysiology, written primarily by my PhD student,
Leah Maizey. To our knowledge this paper reports the largest TMS safety study yet conducted
by a single research team.

Overall,
participants in our study reported mild adverse effects (or MAEs) following ~5% of sessions,
although 39% of participants reported at least one MAE at some point during
their experimental regime. When MAEs did occur, the most common was headache
(41%). Rates of adverse effects were higher for active TMS compared to sessions
involving ‘sham’ (placebo) TMS, although a small number of adverse effects could nevertheless
be attributed to coincidence or placebo effects.
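The gap between the two headline numbers (roughly 5% of sessions versus 39% of participants) is simply the difference between a session-level and a participant-level rate. A small sketch with made-up records (hypothetical data, not the paper's) shows the two computations side by side:

```python
from collections import defaultdict

# Hypothetical session records: (participant_id, any_mild_adverse_effect_reported)
sessions = [
    ("p1", False), ("p1", True),  ("p1", False),
    ("p2", False), ("p2", False),
    ("p3", True),  ("p3", False), ("p3", False),
    ("p4", False), ("p4", False),
]

# Session-level rate: fraction of all sessions followed by an MAE
session_rate = sum(mae for _, mae in sessions) / len(sessions)

# Participant-level rate: fraction of people reporting at least one MAE ever
by_participant = defaultdict(bool)
for pid, mae in sessions:
    by_participant[pid] |= mae
participant_rate = sum(by_participant.values()) / len(by_participant)

print(f"MAEs followed {session_rate:.0%} of sessions")
print(f"{participant_rate:.0%} of participants reported at least one MAE")
```

Because most people complete many sessions, the participant-level rate will always be much higher than the session-level rate, so the two figures are complementary rather than contradictory.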

Two other findings
are notable and may be of special interest to TMS researchers. First, MAEs were
more likely to occur following a participant’s first session, even controlling for
various extraneous factors. We believe this tendency
could be explained by anxiety when receiving TMS for the first time, so steps
taken by researchers to ensure that participants are relaxed and comfortable
are likely to help.

Second – and most
striking – nearly 80% of MAEs were reported after participants had left
the laboratory at the end of their session. We don’t have a good explanation
for why this is, but 80% is too big to ignore. Maybe the physiological
aftereffects of TMS are longer lasting than is generally assumed, or maybe the
immediate aftereffects can have knock-on effects to other physiological systems.
This was a serendipitous finding, so it will be important to see whether other researchers can independently replicate
such long-lasting effects.

The good news for
TMS researchers is that our study adds to a body of evidence that TMS
is safe for human use under carefully controlled conditions. The adverse effects we did observe were mostly very minor (no
seizures!) and only a few participants withdrew from the experiments. Our main
recommendation is that it would be useful for the TMS community to monitor adverse
effects more closely and to adopt standard methods for doing so. We
provide relevant monitoring forms as part of our paper.

---

* Special thanks to Matthew Rushworth for helpful discussion
at the outset of this project.

Tuesday, 19 February 2013

I’m none of the above (I used to be an expert at one
of them) but today I had a funding application rejected by the Wellcome Trust.

The application was for a research fellowship in basic biomedical science. I
was planning a study in two parts:
first, how training people to inhibit actions toward food and alcohol changes their brain chemistry
and physiology; and second, how we might combine brain stimulation with
inhibition training to help people recover from alcohol addiction and obesity. I felt these were
closely linked themes: a basic strand followed by an applied strand that took the
results to the streets.

I felt a bit like Sauron and the One Ring with this application, but with a bit less malice. Into it I poured everything I had
done and learned over the last ten years of my research. I used all of my (very
helpful and generous) connections and collaborators to devise a project that included such aspects as:

a mass online internet experiment that would have
been hosted by the Guardian and provided the world’s largest study of human inhibition to date

the use of simultaneous brain stimulation (TMS) and
brain imaging (fMRI) to study the effects of inhibition training on key connections in the brain

randomised controlled trials on the effects of
brain stimulation and inhibition training in alcoholism and obesity (a promising combination)

Still, my application wasn’t good enough – not even to make
it to interview – and there’s a lesson in that. Science doesn’t care about
effort, only about outcomes. I won’t quote the feedback from the Wellcome Trust's Expert Review Group, as
it is intended to be confidential (for all concerned). But suffice to say, I
felt the single paragraph of feedback rather misunderstood the project and made
some factual errors. Of course, this is not the Committee’s fault – it is mine.
In science, if you fail to communicate your message clearly then you have only
yourself to blame.

I’ve had a lot of rejections in my career – far more than I’ve had successes – and I think you
can learn a lot about yourself in terms of how you deal with them. In the junior years they feel
like getting shot (sometimes stabbed), but with time the trauma gives way to the gentle thud of "not good enough" meteorites bouncing off your own rhinoceros hide.

A few tips for beginners:

1) Remember it probably isn’t personal. Even if the
reasons for rejecting your application are unfounded or based on a
misunderstanding, it’s rare for decisions to be driven by personal grudges.

2) The decision makers are human like you. They
will make mistakes. Sometimes those mistakes will go in your favour and the
panel will overlook a genuine weakness in your application. Other times they
will pounce on non-existent problems. We have to accept that this decision
process is noisy, like every other biological system.

3) The basis for decisions is never entirely random, so getting things wrong
helps you get them right next time. When I repackage and resubmit my
application somewhere else, I'll use the feedback from the Trust to
make it stronger. Never just blindly resubmit your application; always try to
learn something from the rejection and improve it. Dealing constructively with rejection will make you a better scientist.

4) Resist
the urge, implicitly or explicitly, to take out your disappointment on others. This
is a surprisingly easy trap to fall into, and I suspect many scientists do. Next time a grant application
(particularly one from the Wellcome Trust) lands on my desk to review, I might be
tempted to treat it particularly harshly because I feel I was treated the same
way. Or, what if I happen to be editing a manuscript submitted to Cortex or PLOS ONE by a
member of the panel? We must resist being led by (natural) negative emotions because "an eye for an eye" is the antithesis of science.

5) Finally, remember that reviewers and panel
members are ultimately doing you a favour, whatever the outcome. They took the time to read something you wrote. They thought about it and gave you feedback on it. This is actually a pretty
remarkable thing and we should be grateful.

So, my feeling about today’s grant rejection is that yes, it
sucks! And yes I think the committee made a mistake because I could have
settled all those concerns within the first minute of an interview (and yes, of course, I would say
that!)

But I’m also grateful for the feedback and I recognise that comparing funding applications is difficult and noisy. Could my application have been stronger? Nope, I gave it everything I had. Could I have done a better job reviewing grants than this panel? No, definitely not. Science is a human enterprise on all fronts.

So I'm going to wallow for another day or so, then I'm going to scrape myself off the floor and rework that
application.

Monday, 7 January 2013

How’s that for an ominous, grandiose title. Being the new
year there are some wonderful, inspirational posts kicking around. If you
haven’t seen them already, make sure you read this by Tania Browne and this by Gia Milinovich. Now really is the time to
get up and make the world turn. Don’t let anyone stop you.

Early January is always an interesting time because it
reminds me on the one hand of how important it is to start things fresh, while
at the same time – for me – it carries the echo of death. Now, I can understand completely if
you find self-indulgent posts about personal loss rather irksome. So if you
read my blog purely for professional reasons, feel free to sneak out quietly and I
won’t hold it against you.

On 2 January 1989, my mother died of cancer. I remember
that her room in the hospital had a window that looked out on to a small garden. She
loved nature and I like to think that having a garden in view made things
easier for her at the end. Her life and death shaped my life – and my choice of
career – in ways that I am still coming to understand as an adult.

At the time of her death I was a cocky, extraverted, smart-arsed
11-year-old kid who liked to run around in a tracksuit labelled “LIGHTNING BOLT”. I
had a big mouth and wanted to be a film director. I thought I was the smartest
kid in town and that my mum would live forever despite being sick. I marched
around that hospital like I owned it. I even stamped around my mum’s hospital room
bragging how long I would live because I was sure she was going to make it too. I
was a cheeky, embarrassing little turd.

All that changed pretty suddenly.

On 25 December 1988 my family had our last Christmas together
with mum. She came home from hospital for the first time in weeks. I have no
memories of that Christmas day, save one. In it my mum is sitting on the sofa in
the lounge room while most of the others are in the dining room
finishing off leftovers. My dad is sitting next to my mum, and they are
speaking together in hushed voices that I can barely discern, accepting some
grim reality that my immature senses read as a defeat. She isn’t
going to make it.

Years later, I now know that mum came home that Christmas
knowing for certain that she was going to die soon after – the cancer had
metastasized and she was having her lungs drained with a needle every day to prevent
drowning in her own lymphatic fluid. I shudder to think of the pain she must
have endured during that time, especially while away from her morphine drip.
But she simply wanted to see her family in her own home one last time. If only I
had known all of this then, I’d have hugged her more. If only I could go back
as the adult I am now, I would do more than act like a dumb arrogant kid.

Part of my 11-year-old identity was aware of this inevitability. Yet,
at the time I maintained the doublethink that she wouldn’t die. I was confident for two reasons.
First, I knew she had fought cancer for four years and it hadn’t beaten her yet
– in fact she had gone into remission not so long before - so why now? And, in
darker moments when I wondered if she might die, I prayed. Those who know me
will balk. It seems strange (and inexplicably shameful) to admit this now as a
scientist and atheist, but there it is, I prayed. I wasn’t sure what to, or how
to do it. My mum was a failed Catholic who turned to Taoism and meditation, on
top of chemo, when she was diagnosed with cancer. So I wasn’t sure where those higher
powers resided. The sky? The earth? I was pretty ecumenical about it. I even remember asking the apple tree in
our garden to let her live (I fell short of asking the cat).

When I overheard the conversation between my mum and dad
that Christmas I realised something was wrong, so I upped the ante. I started
bargaining with those higher powers. I prayed to get sick instead of my mum. A fair
trade, surely. How could any god say no? I promised to behave better, to be
less of a prank-playing, punishable shit. I asked for forgiveness. I did everything
my religious instruction teacher (who I tortured mercilessly with questions
about dinosaurs and evolution) said I should do. And more.

Still, the 2nd of January 1989 came and she died shortly
after midnight. Soon after Christmas she had returned to the hospital and we
saw her one last time in the afternoon. Then, that evening, we all sat on the floor, leaning
on the wall of the back room in our house – me, my dad, my sister and my grandmother.
After the phone call confirming her death, my father took off his wedding ring.
I don’t remember what he said but it was at this point in life that I realised
how alone we truly are. There was simply no reason for god and I never
came close to believing in anything like it again. Gods were for the stupid and gullible.

The worst part was when my mum’s clothes and belongings returned from the
hospital. They carried the biological smell of death: a sweet, cloying odour that
I know now reflects the breakdown of metabolites in the body. That smell
remained in the house for weeks and such was its strength that I can conjure it
from memory.

My father became a shell of grief and much of my childhood ended
as well. The confident extraverted kid was replaced by a shy death-conscious child
who retreated into a world of books and rarely came up for air. My mother’s
death coincided with starting in a new school, transitioning from primary
school to high school. In the first year of that school I felt like an alien
dropped in a zoo, surrounded by stupid creatures who I had absolutely no
interest in, and who had absolutely no ability to understand me. I wished I
could just skip school altogether.

The teachers were hopeless too. One time, early in first
form, a particularly odious teacher made us stand up and say what our mothers
and fathers did for a living. I immediately dreaded it. After seemingly endless
examples of “dad works in insurance and mum does the shopping”, it was my turn.
I didn’t even mention my dad and just cut to the chase. “My mother is dead”. Muffled
laughter from other kids and a scowl from the teacher, followed by a loud reminder
that this was a Serious Exercise. I was prompted again and repeated myself. The
class laughed harder this time, a galvanised herd, as though I was performing a
perverse comedy act for their amusement. I was promptly dispatched to the
Principal’s office for lying and issued a detention notice. Once it became
apparent that I was in fact telling the truth, I had the edifying experience of
receiving (for the first time in my life) a grovelling apology from an adult.

As the years went on, my father declined further into
depression but I slowly regained something resembling confidence and found
myself gravitating toward the world of science. Like nothing else, science offered actual answers to existence, a
taste of the future, and the opportunity to stand out and be noticed for
something other than running fast or kicking a football. While my father
retreated to his cave, David Attenborough and Carl Sagan stepped up to tell me
about nature, while Gene Roddenberry, David Eddings, and Margaret Weis kept
my imagination alive. The ghost of my mother began to take shape and it had a
clear message: The world is a big place. Don’t rely on higher powers to solve your problems, whether
people or groups or gods. Be your own master, then you can solve your own problems and those of the people you care about. My mother’s
death destroyed the confident, extraverted child but created something else,
something more resilient and self-reliant.

Coming face-to-face with death as a child taught me – in the
words of Hobbes – that life can be brutish and short (though of course, as
an educated adult I now realise that it can be far worse on all fronts than
what I experienced). Knowing this at such a young age was strangely empowering and helped me understand the importance of seizing opportunities when they arise.

It's in early January that I’m reminded why: because that 11-year-old kid isn’t dead after all. He’s
certainly a lot less noisy than he used to be and (thankfully) doesn’t wear bright tracksuits
anymore. But his 35-year-old replacement still has an overblown sense of
self-entitlement and gets out of line pretty often.

As I conclude this indulgence about life, let me offer some advice. Nature, above all, rewards two things: creativity and persistence. So go forth in 2013 and do something you’ve never done
before. Change something. Be innovative. Don’t be
afraid of critics or criticism. Be self-entitled, just like an 11-year-old pain in the arse I used to know.

About Me

I'm a psychologist and neuroscientist at the School of Psychology, Cardiff University. I created this blog after taking part in a debate about science journalism at the Royal Institution in March 2012.
The aim of my blog is to give you some insights from the trenches of science. I'll talk about a range of science-related issues and may even give up a trade secret or two.
You can follow me on Twitter: @chrisdc77