As deep-fake technology and voice cloning become more sophisticated, it’s becoming harder to discern what’s authentic, especially for older, more vulnerable groups.
Age UK, a U.K.-based charity that provides support and services to older people, has warned that the repercussions of scams, from diminished trust in digital platforms to a sense of violation and isolation, highlight a growing need to safeguard older people. There's also a financial motivation to enhance protections against AI-enabled fraud: Deloitte estimates that these scams resulted in $12.3 billion in losses in 2023, a figure projected to reach $40 billion by 2027.
Earlier this year, I designed and facilitated a workshop for Age UK that aimed to introduce older generations to AI, specifically generative AI chatbots.
My goal was to demonstrate the practical capabilities of generative AI, enhancing digital literacy and fostering an understanding of associated risks. When we think about AI and digital literacy, we rarely consider older generations as engaged learners. However, the workshop revealed something surprising: Older adults are eager to understand AI and emerging technologies. The problem isn’t a lack of interest; it’s a lack of accessibility and exposure.
Workshop Findings
A few things surprised me during the workshop, notably the wide range of familiarity with AI. One attendee proudly shared that he had 10 Alexa devices connected to his doorbell and used voice commands to turn on lights when his hands were busy with laundry. While this wasn't representative of the majority, it highlighted that some older audiences already engage with AI in sophisticated ways, demonstrating high digital literacy.
On the other end of the spectrum, one participant asked ChatGPT, “Is there some office far away where people are doing nothing but typing these ingredients down?” This exchange reaffirmed the opportunity for this workshop to have an impact.
I designed and facilitated the workshop to introduce AI capabilities, explore potential risks and improve digital literacy. We used creative prompts, generating recipes, poems and even a song, to guide participants from text to visuals to sound, gradually expanding their expectations of what this technology could do. I aimed to demystify AI by showing its relevance in everyday life, not just in self-driving cars and humanoid robots.
What emerged was not only gleeful reactions to an instant satsuma and okra stew recipe tailored precisely for the seven of us, but also organic conversations around the risks it poses to artistic authenticity and skill.
When we used Suno, a generative AI music creation program, to create a bespoke reggae song about driving down a country lane in merely 20 seconds, initial awe and group dancing turned into concern. “That’s amazing, incredible,” one participant said, before another added: “It would make me think that every new song I hear, you can never trust that it’s an original.”
This shift from delight to critical reflection captured the essence of AI’s impact; it’s a powerful tool for creativity and innovation, but it also raises important questions around originality, authorship and the future of creative industries.
As one participant aptly said of newspapers, "You can't always believe what you read." The same applies to ChatGPT. We emphasized the need for due diligence in verifying information and encouraged a healthy skepticism: everything should be taken with a pinch of salt, particularly in areas like medical or legal advice.
In a short period of time, AI and emerging technologies have reshaped how we communicate, create and consume. This rapid evolution often presents a steeper learning curve for older generations, accompanied by skepticism and distrust. However, with proper education and training programs, like those championed by Age UK, these tools hold great potential for helping older generations stay connected, informed and equipped to recognize AI-generated misinformation.
As user experience designers, we follow accessibility requirements to ensure that users with disabilities can use digital products. Inclusive design, which receives less attention, goes a step further: it considers a broader range of factors such as age, technical literacy, socioeconomic background and cultural context. To build a truly equitable digital future, we must create accessible and inclusive learning experiences, encourage intergenerational tech dialogue and implement regulations that protect vulnerable groups without stifling innovation.