[Image from Apple]

Apple yesterday unveiled next-generation software features designed for people with mobility, vision, hearing and cognitive disabilities.

Cupertino, Calif.-based Apple’s operating systems are set to undergo software updates later this year that will allow people with limb differences to navigate the Apple Watch using AssistiveTouch, while the iPad will support third-party eye-tracking hardware for easier control by blind and low-vision communities, according to a news release.

AssistiveTouch for watchOS uses built-in motion sensors like the gyroscope and accelerometer, along with the optical heart rate sensor and on-device machine learning, to detect subtle differences in muscle movement and tendon activity, allowing users to navigate a cursor on the display through a series of hand gestures, such as a pinch or a clench.

The iPad’s eye-tracking support makes it possible for users to control their iPad using just their eyes through third-party eye-tracking devices, with a pointer moving to follow the person’s gaze while extended eye contact performs an action, such as a “tap.”

VoiceOver’s new capabilities will allow users to navigate a photo of a receipt by row and column like a data table, and it can describe a person’s position along with other objects in images, letting users relive memories in detail, Apple said.

Additionally, Apple’s VoiceOver screen reader will use on-device intelligence to explore objects within images, and new background sounds will be introduced to minimize distractions in support of neurodiversity and for those who are deaf and hard of hearing. The Made for iPhone (MFi) program will also support new bi-directional hearing aids.

The company’s added support for bi-directional hearing aids enables those who are deaf or hard of hearing to have hands-free phone and FaceTime conversations with the MFi hearing devices program. Apple’s Headphone Accommodations will soon recognize audiograms, while the new background sounds feature offers balanced, bright or dark noise, along with ocean, rain or stream sounds to mask unwanted environmental or external noise.

Apple also said it plans to launch SignTime today, offering a service that allows customers to communicate with AppleCare and retail customer care by using American Sign Language (ASL) in the U.S., British Sign Language (BSL) in the U.K. or French Sign Language (LSF) in France through their web browser.

SignTime also offers remote access to a sign language interpreter without booking ahead of time. Apple plans to expand the service into additional countries.

“At Apple, we’ve long felt that the world’s best technology should respond to everyone’s needs, and our teams work relentlessly to build accessibility into everything we make,” Apple senior director of global accessibility policy & initiatives Sarah Herrlinger said in the release. “With these new features, we’re pushing the boundaries of innovation with next-generation technologies that bring the fun and function of Apple technology to even more people — and we can’t wait to share them with our users.”