iPhones and iPads Are Getting Eye Tracking and Other Accessibility Features


Apple today detailed new accessibility features coming to its mobile devices later this year, including Eye Tracking for the iPhone and iPad. The feature, which will leverage AI and the devices’ front-facing camera, will allow users to navigate iOS and iPadOS with just their eyes.

“Designed for users with physical disabilities, Eye Tracking uses the front-facing camera to set up and calibrate in seconds, and with on-device machine learning, all data used to set up and control this feature is kept securely on device, and isn’t shared with Apple,” the company explained today. “Eye Tracking works across iPadOS and iOS apps, and doesn’t require additional hardware or accessories.”

Microsoft added eye-tracking support to Windows 10 years ago, and the feature is also available on Windows 11, though it requires a separate eye-tracking device. To my knowledge, Android doesn’t have native support for eye tracking yet.

Apple announced other accessibility features today that will be coming later this year:

Music Haptics: This feature will use the Taptic Engine on iPhones to let users who are deaf or hard of hearing experience music through taps, textures, and refined vibrations synced to the audio. It will work first in Apple Music, and Apple will provide an API for developers of third-party music apps.

Vocal Shortcuts: This feature will let iPhone and iPad users teach Siri to recognize custom words or phrases that launch shortcuts and complete complex tasks.

Listen for Atypical Speech: This feature will use on-device machine learning to enhance speech recognition and understand atypical user speech patterns.

Vehicle Motion Cues: This feature aims to reduce car motion sickness by displaying animated dots on the edges of the screen that represent changes in vehicle motion. Apple said this should help to reduce the sensory conflict between what a person sees and what they feel, which is one of the main causes of motion sickness.

New CarPlay accessibility features: Apple’s in-car smartphone mirroring system is getting support for color filters and bold and large text. Voice Control will also allow users to navigate the interface with their voice, while Sound Recognition will trigger alerts to notify users who are deaf or hard of hearing of car horns and sirens.

New visionOS accessibility features: Apple’s Vision Pro headset is adding support for system-wide Live Captions and for new Made for iPhone hearing devices and cochlear hearing processors. visionOS will also allow users to reduce transparency effects and dim flashing lights.

While Apple didn’t share precise release dates for these accessibility features, we may get more details during the company’s WWDC developer conference in June. That’s where the company will also reveal the next major updates to its software platforms, and AI should be one of the main topics this year.


Thurrott