It’s an old adage, perpetuated by the media, that children and young people understand technology better than adults in their mid-40s and beyond, but is this correct?
As an ICT teacher, I have, on occasion, had the pleasant experience of a student approaching me to say that, while playing around at home with a software package we’ve been using in school, they have discovered something really cool. They then demonstrate their new knowledge and, with a mischievous glint in their eye, ask me, “Did you know you could do that, sir?” When I declare that I did not, they respond with, “But how come you’re the teacher and now I’m teaching you stuff?”
In its recently published annual study of British consumers, the communications watchdog OfCom states that the advent of broadband, in or around the start of the new millennium, created a generation of digital natives, the youngest of whom are learning to operate digital devices before they are able to talk.
According to OfCom, these youngsters are “…developing fundamentally different communication habits from older generations, even compared to the 16-24 year old age group.”
For this research, the watchdog developed a ‘digital quotient’ (DQ) test measuring awareness of, and self-confidence with, technology ranging from smartphones and smart watches to super-fast broadband, 4G networks and mobile apps.
The watchdog studied 800 children and 2,000 adults. In the 6-7 year old age group, the average DQ score was 98, whilst for those in the 45-49 age group it was 96. The research demonstrated that digital understanding peaks around the ages of 14-15, with a DQ score of 113; it then gradually reduces through the adult years and drops significantly in old age.
Back to my cheeky student, standing in front of me with a gleeful, self-satisfied grin on their face. I explain that although I’m the teacher, they, as a carefree youngster, possess far more of a precious commodity than I do, one which allows them, at will, to meander through the tools and functionality of an app…
And I believe this is one of the reasons why this phenomenon apparently occurs. The average young person comes home from school and, after (hopefully) completing homework and eating a meal, the evening is pretty much their own to do with as they please. Should they be inclined to play with a web design package, try to write a program in a particular computer language, or simply become familiar with an app all their friends have been using, they have the time to do just that. Adults, on the other hand, have careers, family duties and other commitments that constantly demand their time, and therefore have limited opportunity (and energy) to raise their awareness and learn what the latest technology can offer them, unless it is necessary for their work.
It is also my opinion that young people’s susceptibility to fashions and trends is a contributory factor.
OfCom’s report stated that whereas 18% of the children used the picture-messaging app Snapchat, and a further 11% knew a lot about it, almost half of the adults questioned had never heard of it. Could it be that once adults find out about, say, a social networking app and learn how to use it, they do not see the point of learning another app that appears to do the same thing, just with pictures, purely because it happens to be ‘in’?
Indeed, many teachers use Facebook, the original social networking site, but how many have you ever overheard say, “D’yer know what, Facebook is just so ‘yesterday’, I’m moving over to Snapchat!”?
What do you think of this issue? Do you believe that, just because young people have grown up in the ‘digital age’, they are inherently better equipped to use technology and to immerse themselves in it? Or do you think it’s simply that they have the free time to find out what tech is cool to use and how to use it? Do you think there are other reasons? Comment below and let us know your thoughts.