Is the whole concept of digital natives based on flawed logic? Research suggests that while technology might change, humans don’t.
HRM has written previously about millennial myths, and how categorising employees by generation can be counterproductive. But new research says the most common assumption about the ME generation might be incorrect. It turns out “digital native” is a misnomer because there is no evidence that when you were born has any impact on how tech-savvy you are.
The theory of digital natives is that everybody born after the mid-80s has a special affinity for technology absent in older generations. They were born into a digital world, and it changed them. As reported in Discover magazine, the term was coined by educator and Games2train CEO Marc Prensky in a 2001 essay. In it, he argued for a radical shift in how students are taught, based on his theory. “My own preference for teaching Digital Natives is to invent computer games to do the job, even for the most serious content,” he wrote.
It’s likely Prensky’s theory gained popularity because it makes intuitive sense that growing up with technology would give you a talent for it. But Generation X grew up watching a lot of television, and no company thinks delivering professional development via MTV-style videos is smart.
What this means for HR
Two of the most common things you will hear about younger employees and candidates are that they need to be viewed through the prism of the technology they love (in recruitment, training and so on) and that they are natural multitaskers.
The former has been questioned for a while, with a paper by ECDL demonstrating that while younger generations are more familiar with lifestyle technology, that doesn’t make them any more adept at technology in the workplace. Remember, most consumer products are designed to be very user-friendly. Even babies and pets can use an iPad (though that doesn’t mean they should).
The latter claim fares no better. The new research found:
- that there is no evidence “information-savvy digital natives” exist;
- people don’t multitask, they switch between tasks in a way that negatively impacts learning;
- and that trying to teach anybody on the assumption that either of these things is true also hinders their education.
It’s important not to mistake taste for talent. Younger generations might prefer to communicate digitally, but that doesn’t mean they’re better at it. Nor for that matter does it mean they’re worse at communicating face-to-face or on the phone. Much of our technology is new but the human mind is a very old contraption, and it can only adapt so far.
The most harmful byproduct of the digital native theory might be what it means for older workers. When it comes to hiring, there is a sense that if you’re of a certain age you won’t be as comfortable with, or adept at using, the latest technology. A report from earlier this year found that this bias was being directed at people as young as 45.
Considering that when people older than 75 learn how to use technology it becomes an integral part of their lives, and that Generation X is more addicted to social media than millennials, anyone with recruiting responsibilities should think twice before dismissing candidates based on technological assumptions.