Apple Details New Accessibility Features Coming to iOS, iPadOS, and macOS Later This Year

Apple today detailed new accessibility features coming to iOS, iPadOS, and macOS later this year. They include Assistive Access, which will make Apple’s core apps easier to use for people with cognitive disabilities, as well as a new Personal Voice feature that can recreate a user’s own voice.

“Accessibility is part of everything we do at Apple,” said Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives. “These groundbreaking features were designed with feedback from members of disability communities every step of the way, to support a diverse set of users and help people connect in new ways.”


With Assistive Access, Apple will simplify how its core iOS and iPadOS apps work. The Phone and FaceTime apps will be combined into a single “Calls” app, while the Messages, Camera, Photos, and Music apps will get an optimized interface with high-contrast buttons and large text labels. Assistive Access will also let users choose between a larger grid-based layout and a row-based layout for the iOS and iPadOS home screens.

Apple is also working on a Live Speech feature that will allow iPhone, iPad, and Mac users who are unable to speak to type what they want and have the OS say it out loud. This will work for in-person conversations as well as during phone and FaceTime calls.
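For context, Apple already exposes on-device text-to-speech to developers through the AVSpeechSynthesizer API. The short Swift sketch below is only an illustration of that general type-and-speak capability, not Apple’s Live Speech implementation, and the voice selection shown is an assumption.

```swift
import AVFoundation

// Minimal sketch: speak typed text aloud using Apple's existing
// AVSpeechSynthesizer API. This illustrates the general capability
// Live Speech builds on; it is not Apple's Live Speech feature itself.
let synthesizer = AVSpeechSynthesizer()

func speak(_ text: String) {
    let utterance = AVSpeechUtterance(string: text)
    // Assumed voice choice for the example; any installed system voice works.
    utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
    synthesizer.speak(utterance)
}

speak("Hello, I'll have a coffee, please.")
```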

With Personal Voice, Apple will also let iPhone and iPad users recreate a voice that sounds just like them by recording their voice for 15 minutes. “This speech accessibility feature uses on-device machine learning to keep users’ information private and secure, and integrates seamlessly with Live Speech so users can speak with their Personal Voice when connecting with loved ones,” Apple explained today.

Apple announced many other upcoming accessibility features today including a new Point and Speak capability in the Magnifier app. This feature will work on iPhones and iPads with a LiDAR scanner, and it will help users with vision disabilities to interact with objects and devices that have text labels, such as microwaves.

Apple didn’t say when these features will roll out to the public, but they will likely ship with the next major updates for iOS, iPadOS, and macOS coming later this year. We should hear more about consumer-facing features coming with these software updates during the company’s WWDC developer conference on June 5-9.
