Technology has fundamentally changed the way we learn. It’s undeniable. Without technology, you wouldn’t be reading this. While people may debate whether this change has been positive or negative, there’s no denying that learning has changed significantly in the digital era. Just think of the progression: from abacuses to smartphones, from chalkboards to laptops, and from trawling through libraries to searching Google in an instant. In many ways, learning today barely resembles learning 50 years ago.
However, dig deeper, and you will find that technology has changed how we learn in subtler, more intriguing ways. Far from merely changing the learning environment and the tools we rely on, technology has changed how we process information on a neurological level.
While this may sound alarming at first — technology is rewiring our brains! — it is a natural evolution that L&D professionals must strive to understand. After all, if we want to offer the most engaging learning strategies and the most up-to-date eLearning technologies, we must understand how people learn.
With this in mind, we’ll take you step-by-step through how technology has changed the way we learn, before offering crucial lessons for L&D leaders.
So, how exactly has technology changed the way we learn? The short answer is that it has changed almost everything about the way we process information.
In general, this has been a positive development. A study by PBS LearningMedia found that 74% of teachers think that learning technology has helped them motivate students and reinforce learning materials. Additionally, 73% of teachers said that it helped them engage students with different learning styles.
Let’s dig deeper to see how this has occurred.
Gen Z provides the clearest example of technology changing the way we learn. Often the butt of jokes about social media addictions and poor work ethic, members of Gen Z have grown up immersed in technology. As a result, you have probably heard that Gen Z’s attention span is shrinking — eight seconds, on average, compared to 12 seconds for Millennials. At best, this is a half-truth. In reality, it is an example of technology changing the way we process information.
According to a study by Altitude, the truth is that members of Gen Z have developed sophisticated, eight-second ‘content filters’. Since they have grown up with access to so many platforms and so much information, Gen Z has evolved to quickly assess large quantities of information. As such, they excel at judging what is worthy of their attention and what isn’t.
Basically, members of Gen Z process information differently due to their relationship with technology, using these content filters to accurately screen out irrelevant information and home in on things that matter to them. Altitude’s study explains that “they’ve grown up in a world where their options are limitless but their time is not. As such, Gen Z have adapted to quickly sorting through and assessing enormous amounts of information.”
Gen Z’s content filter is a fascinating example of how technology has changed the way we process information. While attention spans have ostensibly decreased, our ability to select relevant information has increased. L&D professionals would do well to keep this in mind when targeting Gen Z with eLearning content.
If Gen Z’s content filter wasn’t enough to convince you that technology is redefining the way we process information, consider this: according to a study by Dundee University in Scotland, people over 55 who grew up in a household with a black-and-white television are more likely to dream in black and white.
On the other hand, younger participants (you guessed it) almost exclusively dream in colour. Several other studies have found similar results, including one published by the American Psychological Association in 2011.
Remarkably, exposure to technology influences how we process information so much that it can change our dreams.
A 2008 study by Gary Small, a leading neuroplasticity researcher from UCLA, reinforces the idea that technology fundamentally changes the way we learn and process information. We already know that our brains are capable of reorganising and forming new neural pathways due to neuroplasticity. For a deeper dive into neuroplasticity and learning, check out our article on Fixed mindset vs growth mindset.
Small’s experiment sought to show that technology has a similarly rapid effect in reorganising our neural pathways. To do this, he put a group of “internet naive” people into an MRI machine to observe their baseline brain activity while they browsed the internet. Following this, he asked participants to browse the internet for an hour a day for the next week.
When participants returned to the MRI machine after a week, Small observed that “those subjects now toted brains that lit up significantly in the frontal lobe, where there had been minimal neural activity beforehand.”
According to Wired, “neural pathways quickly develop when we give our brains new tasks, and Small had shown that this held true — over the course of just a few hours, in fact — following internet use.”
A 2019 study observed similar results, this time through the medium of video games. In the study, researchers asked people to play video games for half an hour a day for two months, after which their brain volumes were compared with those of a control group. The researchers found that people who had been playing video games had larger grey matter structures in areas of the brain associated with memory, spatial navigation, strategic planning, and fine motor skills.
Simone Kuhn, the study’s lead author, explained that these findings demonstrate “the direct causal link between video gaming and a volumetric brain increase. This proves that specific brain regions can be trained by means of video games.”
Technology has also had a significant impact on our memories. After all, why would you waste precious brainpower memorising a phone number when you can store it in your phone instead? Why memorise the route for a road trip when Google Maps is always at your fingertips?
Researchers have dubbed this phenomenon ‘the external brain’, with a survey of 1,021 experts and stakeholders concluding that many young people now use the internet as “their external brain”. In other words, they outsource certain memory tasks to the internet and other technologies.
Susan Price, the chief Web strategist at San Antonio’s Firecat Studio, believes that this is a natural evolution, saying, “those who bemoan the perceived decline in deep thinking... fail to appreciate the need to evolve our processes and behaviors to suit the new realities and opportunities.”
Several studies support this external brain theory. For example, Harvard researchers asked participants to memorise a series of statements, such as “an ostrich’s eye is bigger than its brain.” Participants were more likely to remember these statements if the researchers told them that they had been erased from the computer, and more likely to forget them if told that they had been saved on the computer.
Put simply, participants were more likely to remember the statements when they did not have an external memory source to rely on. The study also found that participants were more likely to remember the folder locations where the statements were stored on the computer than the statements themselves.
According to Harvard psychologist Daniel Wegner, this is an example of “transactive” memory, where we share the work of remembering “because it makes us collectively smarter, expanding our ability to understand the world around us.”
Similarly, a 2014 study found that people who read a short story on a Kindle were less likely to remember the order of events than people who read the same short story in paperback.
Anne Mangen, the study’s lead researcher, explains that “Kindle readers performed significantly worse on the plot reconstruction measure i.e. when they were asked to place 14 events in the correct order.” She suggests that this is because “the haptic and tactile feedback of a Kindle does not provide the same support for mental reconstruction of a story as a print pocket book does.”
It is indisputable that technology has changed the way we learn and process information. The lessons — and opportunities — for L&D are far-reaching. Above all, L&D teams must adapt to offer the most compelling and up-to-date learning experiences.
In an article investigating the science of learning, Technology & Innovation Director AJ Juliani explains how technology has transformed the four stages of learning identified by Peter Nilsson: attention, encoding, storage, and retrieval.
As Nilsson explains, “almost everything we do or know, we learn through these stages, for our learning is memory, and the bulk of our memory is influenced by these four processes: what we pay attention to, how we encode it, what happens to it in storage, and when and how we retrieve it.”
However, technology has redefined all of these processes. In a world of notifications, dings, and vibrations, attention has fundamentally changed, shifting from deep attention spans to sophisticated content filters. Encoding has also changed, with Juliani explaining that “the ‘internet of things’ is connecting our experiences to others’ experiences and learning at a rapid exponential pace, making knowledge double every 12 months.”
Finally, technology has redefined the ‘storage’ and ‘retrieval’ phases, as we saw with the external brain. Instead of falling back on the same patterns and processes, L&D teams must account for these changes when designing and implementing eLearning.
Shift eLearning views these changes through a positive lens, offering four lessons for L&D teams. Firstly, they suggest moving from individual to collaborative learning to take advantage of our transactive memory. According to Shift eLearning, collaborative learning “not only supports cognitive processes, but also socio-emotional processes by involving learners in getting to know each other, committing to social relationships, developing trust and belonging, and building a sense of online community.”
Further, they suggest moving from passive to active learning, as access to interactive technologies and a glut of information means “learners are no longer content-receptors merely taking down notes or listening to teachers talk for hours without pause.” Similarly, they suggest building multitasking into your eLearning strategy to account for how younger generations work and learn.
Finally, they recommend a personalised, differentiated approach to learning, saying, “it's extremely crucial to apply different types of instruction to different learners. No single method can accommodate all their learning needs. A flexible and personalised approach to content delivery is a must.”
With memory being outsourced, brains continually restructuring, and younger learners’ attention at a premium, planning an engaging L&D strategy may feel like an intimidating task. However, it is L&D’s responsibility to adapt and offer learners an experience that suits their current learning needs. Technology has changed the way we learn. Now, L&D must change too.