
Eye Tracking, Music Haptics For Physically Disabled: All About Apple's New Accessibility Features


By ETV Bharat Tech Team

Published : May 16, 2024, 8:25 AM IST

Apple announced new accessibility features coming later this year, including Eye Tracking, a way for users with physical disabilities to control iPad or iPhone with their eyes. Additionally, Music Haptics will offer a new way for users who are deaf or hard of hearing.

The Apple logo is displayed at an Apple store, Jan. 3, 2019. (AP Photos)

Hyderabad: Apple on Wednesday announced new accessibility features for iPhone and iPad, including Eye Tracking, Music Haptics, and Vocal Shortcuts, which will arrive later this year.

According to the company, the Eye Tracking feature will provide a way for users with physical disabilities to control their iPad or iPhone with their eyes. "We're continuously pushing the boundaries of technology, and these new features reflect our long-standing commitment to delivering the best possible experience to all of our users," said Tim Cook, Apple's CEO.

Music Haptics will offer a new way for users who are deaf or hard of hearing to experience music using the Taptic Engine in iPhone; Vocal Shortcuts will allow users to perform tasks by making a custom sound; Vehicle Motion Cues can help reduce motion sickness when using iPhone or iPad in a moving vehicle; and more accessibility features will come to visionOS, Apple said in a statement.

Eye Tracking on iPad and iPhone

According to Apple, Eye Tracking gives users a built-in option for navigating iPad and iPhone with just their eyes. Designed for users with physical disabilities, Eye Tracking uses the front-facing camera to set up and calibrate in seconds. With on-device machine learning, all data used to set up and control this feature is kept securely on the device and isn't shared with Apple.

Eye Tracking works across iPadOS and iOS apps and doesn’t require additional hardware or accessories. With Eye Tracking, users can navigate through the elements of an app and use Dwell Control to activate each element, accessing additional functions such as physical buttons, swipes, and other gestures solely with their eyes.

Music Haptics

Music Haptics is a new way for users who are deaf or hard of hearing to experience music on iPhone. With this accessibility feature turned on, the Taptic Engine in the iPhone plays taps, textures, and refined vibrations to the audio of the music. Music Haptics works across millions of songs in the Apple Music catalog and will be available as an API for developers to make music more accessible in their apps.
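Apple's announcement does not include developer documentation for the Music Haptics API. As a rough, hypothetical illustration of how an iPhone app can already drive the Taptic Engine with taps and textures, the following Swift sketch uses the existing Core Haptics framework; it is not the Music Haptics API itself, and the pattern values are purely illustrative.

    import CoreHaptics

    // Hypothetical sketch: plays a short "tap" followed by a softer "texture"
    // on the Taptic Engine using Core Haptics (not the Music Haptics API).
    func playSampleHapticPattern() throws {
        // Confirm the device's hardware supports haptic playback.
        guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

        let engine = try CHHapticEngine()
        try engine.start()

        // A sharp transient tap at time 0.
        let tap = CHHapticEvent(eventType: .hapticTransient, parameters: [], relativeTime: 0)
        // A gentler continuous vibration shortly afterwards.
        let texture = CHHapticEvent(
            eventType: .hapticContinuous,
            parameters: [CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.4)],
            relativeTime: 0.1,
            duration: 0.5
        )

        let pattern = try CHHapticPattern(events: [tap, texture], parameters: [])
        let player = try engine.makePlayer(with: pattern)
        try player.start(atTime: CHHapticTimeImmediate)
    }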

Other Accessibility Features

The Vocal Shortcuts feature will allow users to perform tasks by making a custom sound, whereas Vehicle Motion Cues can help reduce motion sickness when using an iPhone or iPad in a moving vehicle. Listen for Atypical Speech, another new feature, gives users an option for enhancing speech recognition for a wider range of speech. Listen for Atypical Speech uses on-device machine learning to recognise user speech patterns.

Designed for users with acquired or progressive conditions that affect speech, such as cerebral palsy, amyotrophic lateral sclerosis (ALS), or stroke, these features provide a new level of customisation and control, building on features introduced in iOS 17 for users who are nonspeaking or at risk of losing their ability to speak.

Additional Features on visionOS

In addition, the company said more accessibility features will come to visionOS, including systemwide Live Captions to help everyone -- including users who are deaf or hard of hearing -- follow along with spoken dialogue in live conversations and in audio from apps.

With Live Captions for FaceTime in visionOS, more users will be able to easily enjoy the unique experience of connecting and collaborating using their Persona. Apple Vision Pro will also add the capability to move captions using the window bar during Apple Immersive Video, as well as support for additional Made for iPhone hearing devices and cochlear hearing processors, the tech giant stated.

Read More

  1. Apple Shares Five Tips to Maximise the Battery Life of an iPhone
  2. Apple's Biggest Announcements From Its iPad Event: Brighter Screen, Faster Chips, Pencil Pro
  3. Apple 'Accelerates' Work on Foldable Devices; May Launch in 2026: Report

Copyright © 2024 Ushodaya Enterprises Pvt. Ltd., All Rights Reserved.