Older Patients Aren't the Problem. The Design Is.

Written by Tina Stenzel | April 24, 2026

When an elderly patient struggles with a new technology, the instinct is to conclude the technology is wrong for them. It is a natural reaction, and in some cases it is correct. But it can also obscure a more important question: was the person who designed this system thinking about that patient at all?

The difficulties some older patients have experienced with AI voice systems in GP surgeries are real and should be taken seriously. But they are symptoms of a design challenge, not evidence that the technology itself is incompatible with older people. There is a significant difference between those two things, and confusing them leads to the wrong conclusions.

What the Data Actually Shows About Older People and Technology

There is a persistent and largely outdated assumption that older people do not engage with technology. The evidence tells a different story.

Of the 4.5 million people in the UK who have never been online, 94 per cent are over 55, according to the Centre for Ageing Better. That is a real and serious challenge that nobody should dismiss. But it sits alongside a different and equally important truth: the majority of older people in the UK are online, are using smartphones, and are increasingly comfortable with digital tools.

Smartphone usage in the UK has risen across every age group. Among adults aged 55 to 64, adoption has grown more rapidly over the past decade than in almost any other demographic, according to Ofcom data. Globally, 62 per cent of adults over 70 now use smartphones. Among those aged 60 to 69 the figure rises to 81 per cent. AI usage among older adults has nearly doubled in a single year, rising from 18 per cent in 2024 to 30 per cent in 2025, according to AARP research. Around half of older adults currently use, or are interested in using, a voice personal assistant such as Siri or Alexa.

The picture is not of a generation that refuses technology. It is of a generation that adopts technology when it is designed well enough to be worth adopting.

The Real Barrier Is Design, Not Age

The National Pensioners Convention estimates that between 500,000 and 700,000 older people in the UK would be unable to access online NHS consultation or patient record tools as currently designed. That is a damning figure, but notice what it actually says. It is not that older people cannot use technology. It is that the specific tools as currently designed do not work for them.

This distinction matters enormously. A voice system that requires knowledge of the NATO phonetic alphabet excludes people who were never taught it, which skews heavily older. A system with no clear fallback to a human when interactions fail excludes people who need a different pathway, again disproportionately older. These are not failures of the users. They are failures of the designers to think carefully enough about who would actually be using the system.

Inclusive design research is consistent on this point. When technology is designed with older users genuinely in mind, not as an afterthought but from the beginning, adoption rates are higher, satisfaction is better, and the improvements made for older users tend to improve the experience for everyone. The touchpad technology that became the foundation of the iPhone's interface was originally designed for users with hand disabilities. Captions, developed for deaf viewers, are now used by millions of people watching video on mute. Designing for the edges of the user population does not limit a product. It makes it better for the whole population.

Building Inclusivity In, Not Bolting It On

There is a meaningful difference between a company that responds to inclusivity complaints and one that builds inclusivity into how it works from the start. The latter requires a specific kind of discipline, particularly in healthcare, where the stakes of getting it wrong are higher than in almost any other context.

QuantumLoopAI works directly alongside NHS partners and GP surgeries to test and improve EMMA using Patient Participation Groups, the patient representative bodies that exist within practices specifically to ensure the patient voice shapes how services are delivered. Testing with real patients across specific age ranges and demographics is how genuine inclusivity gaps get identified, not through assumptions about who will use the system but through actual evidence of how different people experience it.

Language is one of the most tangible examples of this in practice. EMMA is available in multiple languages, with new languages added on a rolling basis as they pass clinical sign-off. That sign-off process is not a formality. Clinical safety and adherence to NHS guidelines are treated as non-negotiable requirements before any new capability is released, because a healthcare tool that has not been properly validated is not a healthcare tool; it is a risk.

This matters particularly for older patients and those for whom English is not a first language, two groups that overlap significantly in many NHS practices and two groups that have historically been underserved by both human and digital systems. A voice system available in a patient's own language, designed with their specific needs tested and validated, is not just more inclusive. It is clinically safer.

What This Means in Practice

The removal of the phonetic alphabet requirement was one step toward greater inclusivity, directly addressing a barrier that fell disproportionately on older patients. Improvements in name recognition mean patients no longer need to spell out their details in ways that feel unnatural. Call flows have been refined so that patients who are struggling are identified earlier and routed to a human more quickly rather than being left to navigate a system that is not working for them.

Perfection is not the claim here, and it never should be. Inclusivity in a healthcare voice system serving the full diversity of the NHS patient population is not a destination you arrive at. It is a standard you keep working toward, informed by patient feedback, clinical guidance, and the lived experience of the people using the system every day. The commitment is not to have solved it. It is to keep improving, to keep testing with real patients, and to take every gap seriously when it is identified.

The Fallback Is Not a Failure

One principle that should be non-negotiable in any AI deployment in NHS primary care is that there must always be a clear, accessible route to a human for patients who need it. Not buried in a menu, not available only after multiple failed attempts, but present and obvious from the start.

This is not an admission that AI does not work. It is a recognition that different patients have different needs, and that a good system serves all of them. NHS England's digital inclusion framework makes exactly this point: alternative contact methods must remain available, not as a concession but as a design standard.

The older patients who have struggled with AI receptionists deserve better designed systems. They also deserve a conversation that treats them as the capable, adaptive people the evidence shows them to be, rather than assuming that difficulty with a poorly designed tool reflects something about their generation.

The problem was never the patients. It was always the design. And design, when approached with the right discipline and the right partners, can always be improved.