How driverless vehicles can be made safer for deaf and hard of hearing people

Self-driving cars are no longer a vision from science fiction – they are a reality. In the UK, automated vehicles (AVs) such as self-driving shuttles are already being tested on public roads.

Self-driving taxi services are expected to launch in 2026, and the Automated Vehicles Act is scheduled for implementation in 2027. This act establishes the legal groundwork for driverless cars to operate on Britain’s roads.

As these vehicles move from research labs to our streets, one question becomes critical: how will they communicate safely with the people around them? Researchers and designers have proposed installing equipment on the vehicles called external human–machine interfaces. These are designed to help driverless vehicles signal their behaviour to pedestrians and other road users (cyclists, wheelchair users and human drivers).

The driverless vehicles would employ pulsing lights around the vehicle, text displays showing the car’s intentions, and auditory cues that announce forthcoming actions, such as “I’m stopping” or a truck-like reversing sound.

However, much of this research still overlooks people with disabilities, including pedestrians with hearing loss. When accessibility isn’t built in from the start, the resulting designs often fail. So how can this be improved?

There are many examples of where current driverless vehicles fall short. Text-only displays may appear universal, but they can be less accessible for people whose primary language is sign language. They are also inaccessible to blind people. Auditory cues, such as hums or droning sounds, could help the blind, but are difficult or impossible to detect for many people with hearing loss – even those with hearing aids.

Speech-based cues, meant to help people with low vision, can unintentionally introduce new risks. Hearing loss can distort speech, so a message like “I’m stopped” may be heard only as “stop” – completely altering its meaning.

One size fits all

Driverless vehicles are not inherently unsafe for deaf and hard of hearing people – the challenge lies in a design process that assumes a universal, one-size-fits-all approach. Historically, communication interfaces in regular vehicles have been built with an assumed “typical” hearing pedestrian in mind.

When accessibility becomes an afterthought, communication becomes unreliable, and the systems meant to increase safety may end up excluding the people who need them most. Technology alone cannot solve this problem.

[Image: a man with a hearing aid. Cars could use lights and text to signal their ‘intentions’ to deaf people. Peakstock / Shutterstock]

Only thoughtful, inclusive design can. Our research shows that combining visual (pulsing lights and a text display) and audio (speech) cues can significantly increase trust and support safer decisions for pedestrians in general. But much more development is needed to ensure these communication interfaces are equitable for all people with special needs.

This gap between technological promise and lived experience reflects a broader pattern. Even though the Automated Vehicles Act aims to improve accessibility, most research in this area still neglects people with special needs, including those with hearing loss.

If we want driverless vehicles to create more accessible streets – and not merely introduce new barriers – then people with special needs must be included in research, design and policy from the beginning.

Drawing on a series of user studies, we offer several practical recommendations to guide industry, researchers and policymakers toward a safer, more inclusive driverless car ecosystem.

Manufacturers should include diverse populations in the design and evaluation of their vehicles. We found that pedestrians with hearing loss may experience external human–machine interfaces differently from hearing people. Designers cannot fully anticipate the potential risks unless they involve diverse user groups in testing.

People need to understand not just that a vehicle exists, but what it intends to do. Displaying the vehicle’s “state”, such as “stopped”, and transitions, such as “slowing down”, helps pedestrians accurately judge the situation and feel more assured.

Combining audio and visual cues increases trust, acceptance and perceived safety. No single mode of communication is effective for everyone, but together, they offer back-ups and clarity.

Relying on just one type of visual cue is risky – lights, text or icons can fail in certain conditions. Providing combined visual information helps ensure that if one fails, another still supports pedestrian understanding.

Urban soundscapes can interfere with audio cues, especially for pedestrians with hearing loss. Studying external human–machine interfaces in realistic environments is essential for ensuring they work when it matters.

Vehicle manufacturers must work with hearing aid and cochlear implant manufacturers to help ensure that audio cues are distinguishable, rather than confusing.

In many cases, barriers to inclusion arise not from technology itself, but from a lack of awareness or consultation. When people with special needs are excluded from design decisions, systems are built on assumptions rather than lived experience.

When they are actively involved, however, we move a step closer to an inclusive and equitable future. Driverless vehicles have the potential to make our roads safer for everyone. But that future depends on purposeful, inclusive design choices today.

If developers, policymakers and researchers commit to engaging with deaf and hard of hearing people, along with others, we can help create streets that are safer, more accessible and more equitable for all.

The Conversation

Wenge Xu does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
