In our previous blog on iPhone accessibility, we spoke about features such as:
- VoiceOver
- Speak Selection
- Speech Controller
- Audio Descriptions
- Voice Control
- Zoom
- Magnifier
- Display and Text Size
These features are extremely useful for people who are blind or have low vision. They allow these individuals to access the digital world, but only if that world has an accessible UI/UX design and was developed with people of all abilities in mind.
However, there are many other accessibility features built into iPhones. This blog covers the features that can help people with hearing loss participate in everyday activities independently.
Hearing Aids
The iPhone can connect to many modern hearing aids via Bluetooth. This allows calls, notifications, music, and all other sounds to be streamed directly to the hearing aid. Depending on the type of hearing aid, the streamed sound can be the only thing the person hears, or it can be blended with the ambient sound of the surrounding environment.
Live Listen
Many hearing aid users purchase expensive Room Listeners. These act as external microphones that transmit sound to the hearing aid. They are often used in meetings where, for example, the user would place the Listener in the centre of a boardroom table.
With the Live Listen feature, the iPhone itself can act as this listener and be used in the same scenarios as a bespoke Room Listener.
Sensory Alerts
iPhone and iPad users can set up sensory alerts for notifications. For example, incoming calls, new texts, emails, and calendar events can come through as vibrations or a quick LED flash.
Sound Recognition
This feature uses on-device intelligence to alert you when it detects one of 15 different types of sounds, including alarms, door knocks, car horns, or even a crying baby. iPhones and iPads can even be trained to recognise sounds unique to a user's environment, such as their doorbell. Users with hearing loss are alerted to these sounds via a visible notification or vibration.
Live Captions
iOS devices also offer the new Live Captions feature. This provides real-time transcription of speech to text and can be used during phone calls or with any audio and video content. The font, size, and background colour of Live Captions can be customised to your preferences. Mac users have the additional option of using Type to Speak to type responses and have them read aloud to others in a real-time conversation.
Conclusion
Apple continues to lead the way in providing accessibility features to its users. The above are just a few examples of the many ways iPhones can enhance the everyday lives of people with disabilities, but it is the responsibility of app and website designers to build their content to WCAG 2.1 standards so that these tools work correctly. If you would like any advice on these standards, please don’t hesitate to contact IA Labs.