Apple has announced software updates for cognitive, vision, hearing and mobility accessibility, along with innovative tools for people who are nonspeaking or at risk of losing their ability to speak.

Users with cognitive disabilities will be able to use iPhone and iPad with greater ease and independence through Assistive Access, which delivers a customised experience for Phone and FaceTime (combined into a single Calls app), as well as for Messages, Camera, Photos and Music. Messages includes an emoji-only keyboard and the option to record and share a video message.

Nonspeaking people will be able to type to speak during calls and conversations with Live Speech, while those at risk of losing their ability to speak can use Personal Voice to create a synthesised voice for connecting with family and friends. Live Speech works on iPhone, iPad and Mac, speaking typed text aloud during phone and FaceTime calls as well as in-person conversations.

For users who are blind or vision-impaired, Detection Mode in Magnifier offers Point and Speak, which helps them interact with physical objects such as household appliances. Point and Speak is built into the Magnifier app on iPhone and iPad, works with VoiceOver, and can be used alongside other Magnifier features such as People Detection, Door Detection and Image Descriptions to help users navigate their physical environment.

Deaf or hearing-impaired users can pair Made for iPhone hearing devices directly to Mac and customise them for hearing comfort. Voice Control Guide offers speech-impaired users tips on using voice commands as an alternative to touch and typing across iPhone, iPad and Mac. Meanwhile, users with physical and motor disabilities who use Switch Control can turn any switch into a virtual game controller to play games on iPhone and iPad.

Users who are sensitive to rapid animations can automatically pause images with moving elements, such as GIFs, in Messages and Safari. Users can also customise the rate at which Siri speaks to them, with options ranging from 0.8x to 2x.