This weekend I read April Dembosky’s article, “Cerebral Circuitry” in the Financial Times. In the article, Dembosky explains how researchers are studying how our use of technology is changing our brains, particularly in terms of empathy and human interaction.

From the article:

“There is growing concern that our emotional and empathetic pathways are being eroded by all the screen time. We spend so much time on our computers and gadgets that we are starting to think like them. Brain circuits are being rewired to accommodate these tools of modern life. We process more bits and bytes of information, and we are quite fast at it. But there could be a trade-off – our motivations to act like Superman are diminishing.”

“Our ability to pay attention and focus is also being taxed. Most studies show the human brain is not equipped to handle multiple streams of information at once. But we sit for many hours in front of multiple screens, flitting back and forth between various windows. A 2009 study published in the Proceedings of the National Academy of Sciences found that people who had become practised at ‘chronic media multitasking’ were worse at filtering out irrelevant distractions and at switching between tasks than people who spent less time on gadgets.”

Looks like my concerns aren’t entirely unfounded. Reading this article on my laptop while sitting next to my son who was playing a game on my iPad, I began to despair over what we’re doing to our brains. But then I read this:

“Matt Langione lies on his back in an MRI machine, reading a copy of Jane Austen’s Mansfield Park. Neuroscientists in Stanford’s imaging laboratory are comparing the patterns in his brain when he skims the pages leisurely, and when he concentrates hard on the literary form. The technicians are surprised by what they find. The areas of the brain that light up during close reading are not just those associated with attention, but also those involved in movement and touch. It is as if readers physically place themselves in the story when they analyse it more carefully… That could mean the more people read superficially, the less they put themselves in other people’s shoes.”

And it hit me: this is why story matters. Getting lost in a book connects us to the emotional core of what it means to be human. It teaches us empathy and compassion by causing us to live in someone else’s world and walk a different path than our own.

Perhaps I’m off base here, teasing a theory from between the lines of the study. But my bet is if they focused on studying concentrated reading of fictional texts (which I hope they do), they’d find that reading stories makes us better people.

One thought on “Why Story Matters”

[Editor’s note: This is fiction, but read between the lines; it might be true
in the future.]
Dr Ellen Marker studies reading. But not off screens or in
paper books.

Her research is done in a laboratory in Boston.

The pioneering neuroscientist analyzes brains in their most enthusiastic
reading state, hoping to understand the differences between reading
off screens and reading on paper.

Like me, Dr Marker feels that her studies will show reading on paper
is superior to reading off screens in terms of
retention, processing, analysis and critical thinking.

But first, let’s see what the scans will be like.
Dr Marker asks me to put myself into an fMRI machine so she and her
team can study which areas of the brain are activated by reading text
on paper compared to reading the same text on a computer screen or a
Kindle e-reader.
And this is why I’m here. Today I will donate my brain scans to science.
Among the things that Marker has discovered so far is that reading on
paper might be
something we as a civilization should not ever give up.
“Even though reading on screens is useful and convenient, and I do it
all the time, I feel that
reading on paper is something we should never cede to the digital
revolution,” Marker, 43, says. “We need both.”
On the day I climb into the brain imaging cocoon, I am thinking about
what it all might mean.

But since I am just a guinea pig and not a scientist, I will have to
wait for the results.
I enter a sterile lab, and Marker and her four associates greet me,
all in white lab coats.

As they hand me a pale blue gown to change into, I have
second thoughts — “How can I read while lying flat on my
back, not my preferred reading mode?” — but decide to push myself.
Science needs me!

The scientists load me into the machine and I’m off.

Next step: They strap my head down, because any movement distorts the
brain imaging. Ever try to read a book without facial movements?
I feel as if I’m being shoved into the middle of a toilet paper roll,
the walls so close my eyelashes almost graze them.
Then I hear a voice through the earphones I’m wearing. It’s Dr Marker.
“You okay in there?” she asks.
Graduate student Dan Smith, 32, tells me to relax before
running around to join the other scientists in the control room.
The invention of fMRI only 20 years ago brought with it the
ability to look at brain activity. Marker says that by understanding a
function as gigantic as reading, how the reading brain does its magic
dance, a response that hijacks all of
one’s attention, she might also learn how reading on screens could be
inferior to reading on paper.

“The more we understand how the brain works,” she says, “the more we
will be able to help people modulate its activity.”

As the machine switches on, it sounds like a jackhammer. I follow
Marker’s instructions and as I do, the group watches my brain on
their computer monitors. I will read passages from a printed novel, and then
later I will read
the same passages on a Kindle. I just hope the Kindle does not blow up
inside the brain scan machine!

Research and teaching take up most of Marker’s time, but when she has a
spare moment, she thinks about what all this might mean for the future
of humankind.

During my first hour in the fMRI machine, researchers map my brain’s
reading paths to find out which parts of the reading process correlate to
which regions of the brain.

“You have 10 minutes,” Marker says through my earphones near the end
of our test. “Keep reading.”

On the
other side of the glass pane, the scientists can see my brain lighting
up as I read on paper and as I read on a screen. Regions light up in
different ways, Marker says.

Marker discusses what her research could do for the future of
humankind. “We need to know
if reading on screens is going to be good if it replaces all our
reading on paper.”

Marker’s lab has paid me a
$100 subject fee, so I want to give them their money’s worth.
After all, it’s not easy to get funding for this stuff — Marker
says she spends at least half of her time applying for grants.
“There’s no premium on studying paper reading modes versus
screen-reading modes in this society,” she tells me
as Smith murmurs, “What do you expect? The gadgetheads want to take over.”
When the tests are over, Marker tells me the data takes two hours to
convert, but it can take much longer to
make sense of it.

“We’ll be at this for a while,” she says.

One of the biggest conundrums turns out to be a nagging
question for all humankind: What if reading on screens is not good
for retention of data, emotional connections and critical thinking skills?

Marker begins slipping more and more
into her thoughts. “Neurons, little bags of chemicals, create
awareness,” she says, “but how? How does the brain create the mind?
What is reading, really?”

I see that at the heart of all her research, there is a
philosopher trying not only to understand reading, but also to figure out
the nuts and bolts that make up the human experience.