
Apple Will Be Your Eyes, Ears, and Voice With New Accessibility Features

Hannah Stryker / Review Geek

Apple users with disabilities are getting some good news this week. In a press release, the company announced a slew of upcoming accessibility features that can serve as their eyes, ears, and voice to empower them to navigate the world with greater ease and independence.

The forthcoming software covers a range of accessibility needs, including cognitive, speech, vision, hearing, and mobility. In the press release, Apple said it worked in deep collaboration with disability community groups to ensure the features address the wide range of challenges users with disabilities face and make a tangible impact on their lives.

An iPhone and iPad in Assistive Access mode

One of the most notable new accessibility tools is Assistive Access, designed for users with cognitive disabilities. It uses a streamlined interface to distill core iPhone and iPad experiences down to their essentials, lightening the cognitive load required to make calls, send texts, capture and view photos, and listen to music.

A woman using Live Speech on an iPhone

Live Speech is a feature aimed at assisting nonspeaking individuals in phone communication. During calls and video chats, users can type messages into Live Speech that are then read aloud to those on the call. The feature even lets you save commonly used phrases for quick access.

Perhaps the most impressive new tool Apple announced is Personal Voice. It uses on-device machine learning to create a synthesized voice that sounds like the user. This will be especially helpful for people diagnosed with degenerative conditions such as ALS, allowing them to retain their own voice after they lose the ability to speak.

Also worth mentioning is Detection Mode in Magnifier, which offers a form of sight to users who are blind or have low vision. By combining the Camera app, the LiDAR Scanner, and on-device machine learning, Magnifier can identify and read aloud text on physical objects like household appliances, enabling a more independent experience.

All of these features and more will be available on iPhone and iPad later this year.


Source: Apple

Danny Chadwick
Danny has been a technology journalist since 2008. He served as senior writer, as well as multimedia and home improvement editor, at Top Ten Reviews until 2019. Since then, he has been a freelance contributor to Lifewire and a ghostwriter for Fit Small Business. His work has also appeared on Laptop Mag, Tom’s Guide, and business.com.