Open University research explodes myth of 'digital native'

Gerald Haigh visits his alma mater to learn that a good attitude to technology correlates with good learning habits

A new research project by the Open University explores the much-debated concept of “the digital native”, drawing on the rich resource of its own highly diverse student body.

It concludes that while there are clear differences between older and younger people in their use of technology, there’s no evidence of a clear break between two separate populations.

Is there really a distinct group of younger people who are not only at ease with technology because they’ve grown up with it, but actually think and learn differently as a result? The idea gained considerable traction after Marc Prensky wrote about it ten years ago in Digital Natives, Digital Immigrants, with other writers weighing in, such as Bradley Jorgensen with Generation X and Generation Y.

Since then, the concept has often been questioned, and even Prensky’s own ideas have changed somewhat. The notion persists in the public imagination though. After all, it seems to bear the fatal hallmark of “common sense”. On one side of the divide is the young person who uses technology like she drives her car, without the need for conscious attention to the process. On the other side sits a grizzled and mature individual, maybe a would-be ‘silver surfer’, frowning impotently at a keyboard and calling for his granddaughter.

This isn’t, though, just a saloon bar debating point, or material for yet another Grumpy Old Men TV programme. If there really is a clear generational separation of brain process, then we need to know more about it because there are important implications for learning.

The OU research drew responses from more than 4,000 students aged between 21 and 100

Research, in fact, is called for, and who better to undertake it than the Open University? After all, you can enrol as a student at the Open University at any adult age, with no upper limit. And in order to do a degree, or any other qualification at the OU, you need a computer and internet access. Putting those two conditions together seems to create the ideal conditions within which to examine the reality of the inter-generational divide.

So, the University’s Institute of Educational Technology set about the task by putting together an age-stratified, gender-balanced cohort of 7,000 students aged between 21 and 100. There were 2,000 between ages 60 and 69, 1,000 aged 70 and over, and, for comparison, four groups, 1,000 in each, from students respectively in their twenties, thirties, forties and fifties. All were surveyed by detailed and carefully constructed questionnaires.

These began by asking for information about access to technology, including mobile phone features. The questions then moved to exploring levels of confidence and frequency of use across 13 different computer tasks. And finally, students were given nine statements about their attitudes to technology and 18 statements intended to explore their approaches to studying – whether their approach was “deep” or “surface” for example. They were given the choice of responding online or by post.

A total of 4,066 students responded to the questionnaire – 58.1 per cent, which is regarded as a good response rate. The response rate across age groups, though, was remarkably uneven: 81.2 per cent of the over-seventies responded, but only 30.8 per cent of those in their twenties. Perhaps that’s understandable. What’s rather more counter-intuitive is that while over 60 per cent of the over-sixties chose to respond online rather than by post, only 46.4 per cent of those in their twenties did so.
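The headline figures are easy to check against the cohort design described above. A minimal sketch (the per-decade cohort sizes are those given in the study design; the per-group response counts are simply implied by the quoted percentages):

```python
# Check the response-rate arithmetic reported in the article.
# Cohort sizes per age band come from the study design; 4,066 students responded overall.
cohort = {
    "20s": 1000, "30s": 1000, "40s": 1000, "50s": 1000,
    "60-69": 2000, "70+": 1000,
}

total_surveyed = sum(cohort.values())  # 7,000 students in the stratified sample
responses = 4066

overall_rate = 100 * responses / total_surveyed
print(f"Overall response rate: {overall_rate:.1f}%")  # 58.1%

# Response counts implied by the per-group rates quoted in the article:
print(round(0.812 * cohort["70+"]))  # 812 over-seventies responded
print(round(0.308 * cohort["20s"]))  # 308 twenty-somethings responded
```

The uneven rates mean the respondent pool skews markedly older than the original stratified sample, which is worth bearing in mind when reading the results.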

Other results are, at first sight, relatively predictable. Younger students are more likely to have worked on a wide range of computing tasks, and to have used technology over longer periods. Older students are more likely to have access to a desktop computer, while younger ones more frequently have laptops and handheld devices – phones, music and games players. And there’s a group in the middle years doggedly sticking to their palmtops.

Almost all the respondents use the internet – not surprisingly, given its importance in OU courses. More of the younger users than the older ones, though, are likely to have access beyond the home computer – at work, at a public facility, or anytime, anywhere with a mobile device.

When it comes to mobile phones, the differences are again in line with common perception – older users are just as likely as younger ones to make calls, but are less likely to use all the other features – text, camera, music, internet, wi-fi.

So Prensky was right first time? 'No, certainly not'

Across all of these comparisons, incidentally, the gender balance remains more or less even. Again, perhaps in line with expectation, actual attitudes to technology differ across generations, with younger participants more positive. At the same time though, attitudes at all ages fall at the positive end of the scale.

So there are generational differences, then? Certainly there are. Nobody’s arguing about that.

So Prensky was right the first time – there really is a digital native generation? No, certainly not – and that’s what’s important about this study. It shows that while those differences exist, they are not lined up on each side of any kind of well-defined discontinuity. The change is gradual, age group to age group. The researchers regard their results as confirming those who have doubted the existence of a coherent ‘net generation’.

“We found no evidence for any discontinuity in technology use around the age of 30 as would be predicted by the Net Generation and Digital Natives hypothesis," says the report. What the researchers do find interesting and worthy of further study is the correlation – which is independent of age – between attitudes to technology and approaches to studying. In short, students who more readily use technology for their studies are more likely than others to be deeply engaged with their work.

“Those students who had more positive attitudes to technology were more likely to adopt a deep approach to studying, more likely to adopt a strategic approach to studying and less likely to adopt a surface approach to studying.”

So, in conclusion, first, there’s no evidence of a clear-cut digital divide. Use of technology varies with age, but it does so predictably, over the whole age span. And secondly, although younger people are more likely to be positive about technology, there is evidence that a good attitude to technology, at any age, correlates with good study habits.

More information

The research is presented in a paper currently under review, “Older Students’ use of Digital Technologies in Distance Education”, by Chetz Colwell, Anne Jelfs and John T E Richardson. Chetz Colwell is a project officer, Anne Jelfs is the learning and teaching development manager, and John T E Richardson is the professor of student learning and assessment in the Institute of Educational Technology at the UK Open University.

It's really pretty simple. It's a metaphor and like all metaphors there's a point you can take it to where all you are proving is metaphorical truths, but before you get to that point it is useful as an illustration of some genuine (and genuinely interesting) features of the social landscape (not technological landscape - this is not a technology question to anything like the same degree that it is about people). Immigrants have experience of two cultures (or more, but let's keep this really simple). Natives (i.e. their children born in the new country) do not. Most of the time the cultural experience they lack is that of their parents' old culture. They might read about it or hear about it from the immigrants, but they haven't lived it. That's what Prensky is inviting us to agree on. I should know: he told me so himself the day before yesterday at the ECIS Technology Conference I helped to organise. It amuses Prensky that people attempt to take his metaphor to an assumed apotheosis, fail and then blame him for getting it wrong in the first place. Apparently here in the UK, we're the most likely to take issue with it. He can't decide why. There is little wrong with the metaphor, and you can conduct research designed to disprove mistaken assumptions drawn from it and still end up with something worthwhile. But please, can we dispel the notion rapidly acquiring momentum around the world that we are incapable of recognising a metaphor? OK, I've got to go. The Mayflower sails in ten minutes...

I am not siding for or against the digital native concept, just want everyone to critically think about making broad assertions based on one research study. This concern was also addressed above in a comment.
An interesting element (and I think confounding variable) of this study was that each person involved was already pre-disposed to technology and learning: they had to use a computer and have internet access to be involved with the Open University, and they had to be actively involved in a course of study. So for this population, for this study, the results are accurate. But I'm not sure how much further we can generalise into the entire population based on these results (same bias as Prensky). These are people who already are comfortable with computers and the internet, at least enough to participate as students in an online experience. So any divide based on technology would be minimized.
These are already digital natives (to use Prensky's term). For a population already pre-disposed to learning and technology, there is no digital divide based on age, although differences were noted. Some of the question depends on how you define "divide". Is it subtle differences in the use and adoption of technology, or is it something more? Is it more stark and contrasting, an all or nothing? As a comment indicated above, perhaps it's more of a continuum. For a more general population, a divide may indeed exist. We don't know from this select sample.

Why should teens rule the use of technology? What about the upcoming kids under 5 who are the first generation to grow up using touchscreen technology? Look at the little girl in the photograph with this article if you want to see the future.

I note this finding: "It shows that while those differences exist, they are not lined up on each side of any kind of well-defined discontinuity. The change is gradual, age group to age group." This simply indicates that it is the construct of "generation" that is flawed, not the "digital native" hypothesis as a description of behavior. In fact, one should expect a gradual change if the hypothesis is correct.

Karl and Abby and other commentators may find useful research done by UCL CIBER for the British Library and JISC in 2008 which reported on the 'Information Behaviour of the Researcher of the Future'. See http://www.ucl.ac.uk/infostudies/research/ciber/downloads/ To quote the Executive Summary 'Many of the claims made on behalf of the Google Generation in the popular media fail to stack up fully against the evidence'.

One of the underlying issues raised by this is the fact that the English education system was based on design principles rooted in Taylorism. Our children today live in a digital world predicated on Googleism. This leads to what Martin Bean, VC of the OU, describes as the "growing crisis of relevance" facing our schools and colleges.

It's a typical move of faith-based arguments to simply say 'Oh, they haven't studied this other group who are certainly digital natives'. Like someone who says 'It will rain tomorrow' and then it doesn't and replies 'Tomorrow!' Believers do not care about evidence when they have belief. So it's those born since 1984, then 1990, then 1995 then... and on and on. There are as many if not more differences within generations than between them. The sooner believers face the facts, the better. You are encouraging the waste of education's limited budget on nonsense-based policies that require DNs to exist. But, as Popper said, a rational argument has no effect on someone who doesn't want to be rational.

Devin Byrka said: "I actually think that the Google generation's fundamental conception of knowledge and information is inherently different than of those born before the 90's." You are absolutely allowed to think that and suggest it, but if the conception of knowledge and information (and I really do think you need to explain what you mean by that in terms of epistemology) have changed in fundamental ways then a lot of learning theory will need tearing up. Now, that's no bad thing, but you had better have some serious data if you want to convince people that you're not just building the next bandwagon.
Marshal
BTW 2 (it's back, and this time it's personal) the spelling drop-down now works, but it also displays the formatting menu - right on top of the spelling suggestions :O(

Is it not a teeny bit skewed given someone would have to be fairly confident they could cope with the technology before they would sign up for an online degree..? My Mum (in her 60s) was taught in an environment where technology (and any non-classroom based item) was wheeled in to view. Much teacher-led knowledge relating to said item was then laboriously transferred, and after passing some form of assessment the top of the class were perhaps allowed to use it, maybe. It's not surprising she now waits to be shown how to use anything: a new dvd player; sky box; mobile; anything, will lie unused until a lesson has been given and everything fully understood and a test run completed. My Mum would never sign up for an online course and is completely unrepresented in this research. Compare that to a classroom today...try taking a games console into a primary 6 class and see what happens :) The generational focus of Prensky's theory can't be ignored but as with any complex social theory there are bound to be lots of contributing factors and it's always useful to remember that learners are different. Expectations surely are?!

Yeah, I know - it's actually older than 15 years, but for all intents and purposes, it really hasn't been that important in our lives until about 1996, and even then we were probably afraid to enter our credit card info into Amazon or eBay. The reality is that the internet is just picking up steam, entering our lives at every junction. Wikipedia, Facebook, and YouTube are only ten, seven, and six years old, respectively. Who knows what the next 15 years of the net have in store? Not including the under 20's essentially nullifies any argument this study is trying to make. The under 20's were born in the 90's. If you're a 15-year-old in 2011, you were born in the same year as Google. A lot has changed in those 15 years, and a lot has changed since Marc Prensky first wrote of digital natives.
Like Abby said, Prensky appears to have been ahead of the times, and perhaps now we're seeing the divide between natives and immigrants more sharply as the internet, smartphones, and social media permeate our lives more readily than ever before. If you've grown up with Google at your fingertips from Day 1, you are a different type of individual than someone who learned how to use Google as an adult. I actually think that the Google generation's fundamental conception of knowledge and information is inherently different than of those born before the 90's. All that being said, as humans we've gone through these types of transitions (adjusting to new technologies) before and eventually some type of equilibrium or 'happy medium' will be reached. It will likely take some time to smooth itself out as new generations of educators slowly enter the workforce and we change our conceptions of learning, teaching, and education. I'd suggest reading Douglas Thomas' and John Seely Brown's "A New Culture of Learning" for a great take on the subject.

There is something about the world of education that seems to lend itself to the raising of particular ideas to almost divine status – often well beyond what the original creators of those ideas intended. Something is given the fatal hallmark of “common sense” and a bunch of people who have probably only read an abstract or a review in The TES feed it to teachers who don't really have the time to question it, but who are always really glad of simple answers. The whole digital native thing has been used to justify all sorts of initiatives – some of them still good by the way – and further a lot of careers. The same was true of emotional intelligence, the Initial Teaching Alphabet, Synthetic Phonics, learning styles, VLEs, the whole shebang. We don't seem to be capable of devising a system that filters this out – though the socialist in me can't help thinking it's got something to do with privatisation, tendering etc. This is really just an extended sigh from someone who saw this begin 30 years ago and has been ducking and weaving to try and find little islands of sanity in the educational asylum ever since – pay me no mind :O). Marshal By the way, Merlin (hi – been a long time) your spell-checker generates an error and this text box doesn't allow Firefox's own checker to work – just sayin' :O)

I have always found Prensky's terminology misleading as this research shows but in January I spoke with James Clay who is the ILT and Learning Resources Manager at Gloucestershire College. He said that the terms are now less often used and have instead been largely replaced with 'digital residents and visitors'. This is very well explained here (http://tinyurl.com/4hxvfe) so there is no need for me to repeat it. I think this works much better as a model and I would be surprised if it doesn't appear somewhere in the OU report.

I've always thought that the 'generation Y' stuff made about as much sense as astrology. It would be good if we could all be judged by our abilities and aptitudes, not our age group. I've responded in a blog post: http://olderthanelvis.blogspot.com/2011/08/dont-call-me-silver-surfer.html

It always struck me that the categorisations of digital natives and immigrants only applied to extremes, and a full continuum ran between. "Across all of these comparisons, incidentally, the gender balance remains more or less even." I assume they only recorded sex, male/female, not gender. http://karldrinkwater.blogspot.com/2011/08/sex-versus-gender.html

Abby, I agree with the points you make re: the exclusion of a specific group within the study and the need for us to think differently. However, this sentence "They are the only generation who have been brought up with technology as second nature and are the complete epitome of Prensky's 'Digital Native'." is predicated on the same assumption that causes Prensky's 'Digital Native' thesis to be so easily refuted. The world has not seen an entire generation brought up with technology. And perhaps it never will. This argument falls down as it assumes that everyone under 20 has access to modern technologies: Internet, Games, Mobile, etc. This is just not the case and as such Prensky is wrong to suggest such an exclusive division. Additionally, many of the pro-'digital native' arguments assume that users of technology are critical users, making effective use of the technology to aid learning. This is often not the case. With many of my own students, I know/understand ways to use technology in a far more useful/sophisticated way than they do. Many young people understand the rudiments of a technology quickly but this does not constitute critical use. It is the critical use of tools that is important here – where learners (and/or educators) make use of tools in sophisticated ways to transform their learning.

Thank you for such a smart, common sensical analysis. In what generation is there even homogeneity on anything, never mind technology adaptation? And I really applaud this comment: "And secondly, although younger people are more likely to be positive about technology, there is evidence that a good attitude to technology, at any age, correlates with good study habits." Yes. Thank you for some wisdom and light.

This is just one of many studies that have debunked the digital native myth. See http://netgenskeptic.com for more. Despite the overwhelming evidence educators continue to be seduced by the simplistic discourse and consultants continue to profit by advising uncritical educators how to deal with the "natives".

As an organisation with more than a quarter of a million students in more than 20 countries, and one which has broken access records on Apple's iTunes U service, the Open University would be the very last outfit to "discount the revolutionary change that is taking place". It rides on that change. The point of this article is to share insights from OU research about how students of most ages use technology. This is of interest to many educators. In doing so, it undermines the popular, rather crude oversimplification of Prensky's Digital Natives, Digital Immigrants analogy that seems to have developed over the years. The article's conclusion is very positive about technology and learning.

What's the point of this article? To simplify the notion of 'digital native' to such a degree that one can discount it? Prensky's work allows us to acknowledge a world of change. To oversimplify the premise makes it easy to discount the revolutionary change that is taking place and the challenges we face as educators in developing digital literacy in all age groups.

Doesn't your observation hold some truth for all generations Abby? Given that the Open University had no reason to extend the research to under-20s, didn't its over-20s grow up with mobiles, texting, gaming, and social networking (Myspace started in 2003, Facebook in 2004 and Twitter in 2006)? In fact aren't these over-20s some of the "natives" Prensky was writing about in Digital Natives, Digital Immigrants as it was published in 2001?

While the study was obviously extensive, there was a huge failure to incorporate an entire generation: the under-20s. They are the only generation who have been brought up with technology as second nature and are the complete epitome of Prensky's 'Digital Native'. Had the study incorporated this age group, you would have found that the majority use technology for educational purposes without even realising; there is a generation gap, and failing to acknowledge such is to fail an entire generation. Technology is the future, people need to learn to think differently in order to prepare the younger generation for a future we can't even comprehend. Twenty years ago no one would have imagined things like the iPhone, and with the speed of technological advances now it would be irresponsible of us not to accept that today's primary children are being brought up in an entirely different world from us. I think Prensky was ahead of the times, and while the generation divide may not be noticeable in large degrees from 20 upwards it most certainly is from 20 down. This needs to be acknowledged because as you said it has massive implications for education.