We hear a lot about the notion of Digital Natives and Digital Immigrants, a concept originally suggested by Marc Prensky in a paper of the same name. It presumes that those born before the widespread introduction of digital technologies are somehow out of step with the world of technology, while those who were born and raised in the digital age are naturally able to function within it. Prensky contends that these younger folk – the “natives” – are born into a technology-rich environment and are therefore akin to those who grow up natively speaking a given language, immersed in its use and able to converse fluently in it, while the “immigrants” are like those who arrive in a foreign land and need to learn a whole new language. He argues that the immigrants will always have a digital “accent”, and that their non-native heritage will therefore always be conspicuously obvious.
To be a native implies that you are not only comfortable, but knowledgeable about the culture in which you have grown up. Being a native – of a country for example – suggests that you know the words to the anthem, have an idea about your country’s history and geography, that you have become steeped in its many traditions, culture and language. It suggests that a certain amount of understanding and knowledge comes from being immersed in it, such that you may not always know how you know things, but you know them nonetheless.
The Natives vs Immigrants concept serves as a neat, tidy metaphor that is useful on a basic level for understanding some of the differences between Gen-Y and those who grew up in the primitive pre-Google world. The problem, however, is that for all its neatness and tidiness, the metaphor is demonstrably wrong on many levels.
Here are three simple examples from my own personal experience…
Exhibit A: My class of Year 11 students doing a course in computer applications. These students are 16 and 17 years old, which means they started school around 1996. By 1996 – when they were in kindergarten – personal computer software had been around long enough that certain standards had emerged, making its operation relatively easy to understand. Computers had been in most schools for the better part of a decade. The World Wide Web had been invented several years earlier in Switzerland by Sir Tim Berners-Lee and, although it had not yet reached its full stride, it had already started to make a significant impact on the world. Windows 95 – an operating system which brought the Internet directly to every computer desktop – had been around for a year. These students had certainly grown up in an environment that immersed them in technology from their very earliest days at school, and they all grew up with computers at home.
And what do I observe these students doing with technology? They know how to search Google … badly. They mostly use single words for searches and click on the first or second result on the first page, assuming that the top result must be what they were looking for. They are mostly unaware of any search tool besides Google. They have never heard of tags. They can add content to their Facebook or Myspace pages, but they mostly do not know the basics of how HTML works, what embed code is or how to use it, and their sense of graphic design on their own pages is quite poor. They mostly use the clunky Hotmail service for email, partly because of a mistaken belief that a Hotmail account is required to use MSN Messenger, and partly because they have no real idea that alternative webmail options even exist. They have never heard of Twitter, Gmail, Google Docs, Flickr or Delicious. Their use of older, more conventional productivity tools like Word or PowerPoint is basic at best, with almost no knowledge of even semi-advanced features like Find and Replace, Change Case, the use of Styles, Tracked Changes or Index tools… all of which are extremely useful to a senior student. Their understanding of a tool like Excel for analysing data is almost non-existent. They rarely use any software beyond what they need to be technologically functional in their own little world.
Sure, they can text on their cellphones pretty quickly, most have large numbers of friends on IM services and social networks, and they are good at sharing photos and illegally downloaded music, but beyond a sort of functional literacy in using a fairly small set of popular online tools, I would hardly describe them as “digital natives”.
Exhibit B: Two boys I know, one 16 and the other 18, each get a new laptop for Christmas and want to connect them to their existing home wireless network. Their father struggles with the wifi on the new Vista laptops for several hours but cannot get it working, so I am asked to lend a hand. Despite having neither the router’s admin password nor the WEP key, I manage to look up the router’s default password using Google and log into it (because, of course, it was never changed). I reset the router, create a new WPA2 key and within a few minutes, despite having never worked with Vista before, all the computers in the household are connected and working.
The 16-year-old boy then asks whether I can help get his Xbox 360 connected to the wireless as well, since he has had it for over a year and neither he, his brother, nor his father has managed to figure out how to connect it to the wifi network. Let me repeat that… a 16-year-old boy gets an Xbox and a year later he still has not worked out how to connect it to the household wireless! I show him what to do and within minutes he is online. He then says that he was given an Xbox Live subscription last Christmas and has not yet activated it because he did not know how. I help him step through the instructions and, aside from him lying about his age during the setup process, it’s up and running in a few minutes. He waited over a year to do this.
This didn’t particularly strike me as “digital native” behaviour.
Exhibit C: My own two kids have grown up in a house that was always full of computers and gadgets. They saw plenty of examples of technology being used in interesting ways and they had access to pretty much any hardware or software tool they wanted. Despite this, my 13-year-old daughter needed help setting up her new iPod, did not know how to insert an SD memory card into her mobile phone, and had to ask for assistance to get her photos off the camera. My 16-year-old son, although an avid gamer, complained that he could not understand OpenOffice when I switched him from Microsoft Office, and until I showed him what to do, could not work out how to save a document from OpenOffice in a format that the Microsoft computers at school could open.
I love both my kids dearly, but that seems to me to be a pretty bad example of what it should mean to be a “digital native”.
So is there such a thing? Is being “digitally native” really a function of being born into a particular generation, as Prensky suggests? Is it true that our youth are just naturally better at adapting to technology? Is it purely a function of age, or is it far more complicated than that?
Despite these examples, I also know of many kids at the other end of the spectrum; those who are incredibly adept at using and learning technology. I’ve had students who are amazing digital artists, others who can easily create complex computer code, and some who can take apart and put back together almost any piece of hardware you can throw at them. I know some kids who learn new software almost instantly, who seem to “get” whatever technology they encounter almost immediately, and who do it all with such comfort and ease that onlookers are astounded. But when we see these kids we make the mistake of thinking that they are representative of their generation, that all kids are like them. These are the ones we hold up as the “digital natives”, the ones that make us marvel at just how intuitive they are when it comes to using technology. The problem is that they are not really representative of their whole generation. They are freaks – naturally good at technology in the same way that others are naturally good at swimming or gymnastics or drawing or singing.
Prensky’s logic falls down for me when I see older folk – those who were clearly born before most people had even heard of a microchip – behave with just as much “native-ness” as many of their Gen-Y counterparts. Many of the cleverest, most insightful technology users I’ve ever met are in their 40s, 50s and 60s, and should – according to Prensky – be speaking with an unmistakable “digital accent”, and yet they don’t. So I’m convinced that age has very little to do with it. I’ve seen 80-year-olds who can surf the web effectively, use a digital camera, carry their music around on an iPod and use a mobile phone. And I’ve seen teenagers who can’t figure out how to Google a piece of information properly, don’t realise that Wikipedia can be edited, and have no idea how to listen to a podcast.
So if it’s not age, then how can we say that someone is “digitally native” in a generational sense? How can we support an argument that suggests anyone not born into this technological revolution will always have a “digital accent”?
I think we make a huge error of judgment if we assume that just because a 14-year-old takes a lot of photos with their phone and sends 300+ texts a month, they have some sort of innate “digital native” status. We seem to assume that because they use tools like Google to find information, they understand how to do it well. And we assume that because they might have 200 friends on Facebook, they understand what it means to live in a digital world.
I’ll agree that being young does, on average, tend to make one more at ease with technology. It usually (though I’d argue, not always) means that someone born into a technology-rich world is less afraid of the digital world, not scared of trying a new device or piece of software, and quicker to pick up its use. Kids are usually not afraid to learn new skills and software and tools… they just aren’t always very good at doing these things in a particularly broad or deep way. My observations of most younger “natives” suggest that although they are generally quite good at using technology for a fairly narrow set of tasks that matter to them (as you’d expect), such as sending text messages, playing games, downloading digital music and managing their collections of online friends, they can often be pretty lacking in further technological depth. The wider perception held by many, that “they are young and they spend lots of time online, so they must be whizzes when it comes to anything to do with technology”, just doesn’t hold water. When you can find plenty of examples of those who should be naturally adept with technology but are not, and an equal number of those who shouldn’t be but are, I think we need to rethink this whole natives and immigrants myth.
It’s a dangerous myth because it has some real implications for how we approach technology in schools. If we believe that “all kids are good with technology and all adults aren’t”, which, in its most basic terms, is the kind of polarised thinking that the native/immigrant myth perpetuates, it can play out in schools with all sorts of bizarre unstated beliefs…
- “As long as the hardware and software is available, it will make the learning more effective since the kids already know how to use it”
- “We don’t need to actively teach the responsible use of social tools… the kids already know how to use them”
- “As a teacher I don’t need to really understand this stuff, since the kids will figure it out”
- “It’s ok to be a basic user of technology, since the kids are all experts at using computers”
- “Using technology in class is not that important, since the kids spend so much time using it out of school anyway”
… all of which are ridiculously untrue, of course, but if you look for these unspoken beliefs it’s amazing how often you find them.
Perhaps we need a greater meeting of the minds. Instead of thinking in terms of us and them – natives and immigrants – maybe we need to value the qualities that both parties bring to the table – combining the fearless sense of exploration of our natives with the wisdom and experience of our immigrants – and work harder on teaching and learning from each other, regardless of age, so that we all live happily in this shared digital land of ours.
Image: ‘gran´pa, gran´ma n´ pa´‘ http://www.flickr.com/photos/36613169@N00/41294219
The Myth of the Digital Native by Chris Betcher is licensed under a Creative Commons Attribution 4.0 International License.
Hi Chris,
As has already been expressed by many people before now, I whole-heartedly agree with you that digital “nativeness” is not dependent on age, and for many of the reasons previously outlined. What I am concerned about, though, as a teacher of IT, is that this assumption is having such a detrimental impact on my subject area! The move seems to be away from teaching IT as a subject, towards integrating it effectively into other subject areas and removing the need for it to be addressed separately.
Now, I’m all for improving how ICT is used across the school, don’t get me wrong, but it annoys me that the people who do NOT understand IT are the ones deciding what is and isn’t important, and I’m sure there are teachers in other subject areas who feel exactly the same way. This myth that we’re dealing with a whole generation of digital natives is laughable – just because they grow up in that environment doesn’t mean they understand it.
As an aside, I found it interesting that a similar point was made in a movie I saw today, Slumdog Millionaire. Without spoiling it for anyone, there is a point in the movie where a comment is made that the policeman’s 5-year-old daughter would be able to answer a particular question (about the phrase that appears on India’s coat of arms) and that it was ludicrous that a man who grew up in India wouldn’t know the answer. Jamal turned to him and asked if he knew the details of some minor crimes that had occurred in the area – the policeman didn’t – and he made the comment back that any 5-year-old living in the slums would have been able to answer that question.
It’s not safe to make an assumption so broad without really understanding all of the circumstances of someone’s life. That is the case we have here – kids are only native users of the technology THEY CHOOSE TO USE, and generally that doesn’t include online research tools, word processors and spreadsheets. What it does include is gaming consoles, portable media players and mobile phones. That doesn’t make them digital natives – it just demonstrates the role of these particular technologies in defining who these kids are and how they interact with one another.
I find it less exasperating to teach my students than to teach older colleagues who need to be shown numerous times how to perform a basic technical task. I’ve found that younger people pick things up more quickly and then run with them. Sometimes I have to muster all my patience while teaching other teachers, some of whom seem a little lazy and would prefer me to do the task for them – although teachers do deserve credit when they are trying to learn.
Some teachers are scared to push a button, while my students are far more willing to experiment and try new things. It makes me want to cry that there are probably a few (top pay-scale) teachers at my school who have never googled subject-related learning materials in their lives. Our kids deserve better.
Hi, I commented about this issue way back in November last year at:
http://efoliointheuk.blogspot.com/2008/11/towards-new-digital-divide.html
My main argument being that there are too many ‘elitists’ in the world of ICT and it is these people, of whatever age, who are causing a so-called ‘Digital Divide’.
I have a long and continuous history of digital innovation stretching way back to the early 50s, i.e. pre-digital: radio control, radio telescopes, audio-visuals, developing CEEFAX subtitling for the Deaf, teaching machines, and then the first PCs in schools. To cut a very long story short, I’ve been in ICT teaching and staff development ever since – I’m now 67.
So, what would Prensky think of me? For more of my rants see:
P: http://www.raytolley1.xfolioworld.com
B: http://www.efoliointheuk.blogspot.com/
W: http://www.maximise-ict.co.uk/eFolio-01.htm
I guess this is just like learning any skill: some people will pick things up very quickly and be a natural (native) at it, and others won’t. But you still need expert instruction, coaching and guidance to become really good. Think of a swimmer who has a natural affinity with the water but still needs to master technique to become a great swimmer. We as teachers need to be teaching kids the techniques of how to become great ICT users, which means that teachers themselves need to have the knowledge to teach it! Great thought-provoking post, Chris!
Interesting post. I have to say, the term “digital native” seemed to make sense to me before reading your post. But I now realize that “digital native” really refers to the younger generation’s basic comfort level with technology. They are basically willing to click on things many adults would be afraid to try. But as far as getting deeper into technology and having a good proficiency with new tools… I think you are correct – most just don’t have the skills.
Great post! Just because we are all exposed to the same things doesn’t mean we retain them at the same rate, use them, or even like them. I find that in a class of 18 students, given the same lesson, some will “run with it”, some will be somewhat interested and others will just “tune it out”. Just because you are a digital native doesn’t mean you are proficient in all aspects of technology, just as not all of us in Algebra 101 became proficient in it!
Perhaps a more useful differentiation is the type of learner… or mindset.
Fixed mindset = sticks with the status quo, not so adventurous, conservative, knows what they know, rule-dependent.
Growth mindset = constantly seeking a better way, challenging, questioning, enquiring, having a go because they can… dare I suggest creative and constructivist as well!?
This fits students and teachers alike… nothing to do with age.
Well said, Chris. There is risk and unending frustration (especially in teaching) in making assumptions about prior skills. We teach alphabetical order, various comprehension skills, skimming and scanning all throughout the early education process. Why would we assume basic technology skills would not also need to be taught and revisited each year at various levels?