Outsourcing memory: the internet has changed how we remember

Disclosure statement

Ryan Wittingslow does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations.

When Nicholas Carr’s article “Is Google Making Us Stupid?” hit newsstands in the July/August 2008 edition of The Atlantic, the reaction was predictably vociferous.

The essay itself – a 4,175-word editorial monolith of the kind The Atlantic does so well – was a thoughtful exploration of the fear that heavy reliance upon the internet is detrimental to certain cognitive faculties, including (but not limited to) concentration, memory and the capacity for quiet reflection.

It has been four years since that particular tempest in a teacup, but it seems uncontentious to claim that these concerns still resonate.

If nothing else, there is certainly no shortage of evidence in favour of Carr’s observations that the internet is changing our relationship with information in some fairly profound ways.

In a study published in Science in 2011, US scientists claimed the internet has become a form of “external or transactive memory”, with information being stored outside ourselves.

In the face of this transition, the imperative to remember information has been replaced with the imperative to remember where that information is located.

This is what is commonly known as “the Google effect”, and it accords with theories of extended cognition, such as that advanced by philosophers Andy Clark and David J. Chalmers in their 1998 paper “The Extended Mind”.

Meanwhile, another study, conducted by UCLA Professor of Psychiatry Gary Small, showed that experienced internet users demonstrated increased brain activity in parts of the prefrontal cortex associated with problem-solving and decision-making, compared with novice users.

These changes were not manifest when the two groups were asked to read printed text.

In Small’s words, this provides evidence that “the current explosion of digital technology not only is changing the way we live and communicate, but is rapidly and profoundly altering our brains.”

The observation that our relationship with information changes along with our artefacts seems a plain fact, and by no means a necessary cause for dismay.

That being said, it is my claim that the widespread use of these devices is necessarily changing what it means to be a successful learner.

When I was but a wee lad, barely knee-high to a grasshopper, my mother used to regale me with horror stories about her classroom experiences in the early 1960s.

Times tables and chemical formulae were learned by rote and devoid of inferential and internal coherence. Information was divorced from praxis, with each claim nacreous and glib, like banal pearls.

Of course, we would like to think contemporary pedagogy has overcome those shortcomings – and indeed it has, to a large extent.

But – at least in my experience – undergraduate university students still exhibit many of the educational conceits of prior decades. That is, although they retain information perfectly well, they often have a limited capacity to manipulate it in any meaningful way.

Of course, gifted students will continue to do well, as they have always done. But those other students – perhaps less gifted, or having had fewer opportunities – appear to be unaware of, and unfamiliar with, the shifting informational landscape.

Mere retention has never been sufficient to achieve academic excellence and now, with the informational ubiquity afforded to us by smartphones, tablet computers and netbooks, it is even less relevant.

But although students are no longer burdened by the requirement to remember (catalysing the aforementioned “Google effect”), other pedagogical demands make themselves known.

In his 1941 short story “The Library of Babel”, Argentine author and librarian Jorge Luis Borges asked us to imagine a library that contains all possible books of 410 pages (40 lines per page, 80 characters per line), written with 22 letters plus spaces and punctuation marks – 25 orthographic symbols in all, giving somewhere in the region of 25^1,312,000 volumes.
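The arithmetic behind that figure is easy to sketch. Taking Borges’ stated dimensions (410 pages of 40 lines, 80 characters per line, drawn from an alphabet of 25 orthographic symbols), each book is a fixed-length string, and the library holds one book for every possible string. The total is far too large to write out, but its size can be estimated with a few lines of Python:

```python
import math

# Dimensions of a single volume in Borges' library.
pages, lines_per_page, chars_per_line = 410, 40, 80
symbols = 25  # 22 letters, plus the space and two punctuation marks

# Characters per book: every book is a string of this length.
chars_per_book = pages * lines_per_page * chars_per_line  # 1,312,000

# The library holds symbols ** chars_per_book distinct volumes.
# That number is too large to print, so we count its decimal digits
# via logarithms instead: digits(n) = floor(log10(n)) + 1.
digits = math.floor(chars_per_book * math.log10(symbols)) + 1

print(chars_per_book)  # 1312000
print(digits)          # 1834098 – about 1.8 million decimal digits
```

In other words, merely writing down the *number* of volumes would itself fill a small book – a compact illustration of why the librarians’ search is hopeless.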

Although the library contains maximal information, the librarians live in a state of near-suicidal despair – their task is vast and impossible, because a collection containing every possible text conveys, in effect, nothing at all: its signal-to-noise ratio approaches zero.

Of course, the state of the internet is nowhere near as dire, but the story does serve to illustrate the quandary that informational ubiquity presents for both learners and educators.

It is unquestionable that the internet has facilitated the spread of information more than any invention since the printing press. But greater ease of both creation and transmission means it is easier to spread information that is spurious, misleading or just plain incorrect (see the otherwise inexplicable staying power of “wind turbine syndrome”). The signal-to-noise ratio has suffered accordingly.

It is hardly a novel observation that there is a lot of garbage on the internet. But phenomena such as the Birther Movement – those who believe Barack Obama was born outside the US and is therefore ineligible for the presidency – only serve to demonstrate the need to reappraise the way we teach students to navigate informational topographies.

It is a truism that a dedicated internet user can find something to support their views no matter how ridiculous they are – a phenomenon that many, I suspect, would consider undesirable.

Although I do not share Carr’s pessimistic view that heavy reliance upon the internet will be detrimental to humanity as a whole, I do believe greater emphasis needs to be placed upon the teaching of incredulity, given the ease with which ostensibly credible misinformation can be accessed.

Although we already attempt to imbue our students with a degree of hard-headed cynicism with respect to sourcing information, our attempts thus far have plainly been at least marginally inadequate.