PRESS RELEASE May 16, 2023

Apple introduces new features for cognitive accessibility, along with Live Speech, Personal Voice, and Point and Speak in Magnifier

New software features for cognitive, speech, and vision accessibility are coming later this year
Apple’s new accessibility features, including Assistive Access, Live Speech, and more, will arrive later this year.
CUPERTINO, CALIFORNIA – Apple today previewed software features for cognitive, vision, hearing, and mobility accessibility, along with innovative tools for individuals who are nonspeaking or at risk of losing their ability to speak. These updates draw on advances in hardware and software, include on-device machine learning to ensure user privacy, and expand on Apple’s long-standing commitment to making products for everyone.
Apple works in deep collaboration with community groups representing a broad spectrum of users with disabilities to develop accessibility features that make a real impact on people’s lives. Coming later this year, users with cognitive disabilities can use iPhone and iPad with greater ease and independence with Assistive Access; nonspeaking individuals can type to speak during calls and conversations with Live Speech; and those at risk of losing their ability to speak can use Personal Voice to create a synthesized voice that sounds like them for connecting with family and friends. For users who are blind or have low vision, Detection Mode in Magnifier offers Point and Speak, which identifies text users point toward and reads it out loud to help them interact with physical objects such as household appliances.
“At Apple, we’ve always believed that the best technology is technology built for everyone,” said Tim Cook, Apple’s CEO. “Today, we’re excited to share incredible new features that build on our long history of making technology accessible, so that everyone has the opportunity to create, communicate, and do what they love.”
“Accessibility is part of everything we do at Apple,” said Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives. “These groundbreaking features were designed with feedback from members of disability communities every step of the way, to support a diverse set of users and help people connect in new ways.”

Assistive Access Supports Users with Cognitive Disabilities

Assistive Access uses innovations in design to distill apps and experiences to their essential features in order to lighten cognitive load. The feature reflects feedback from people with cognitive disabilities and their trusted supporters, focusing on the activities they enjoy that are foundational to iPhone and iPad: connecting with loved ones, capturing and enjoying photos, and listening to music.
Assistive Access includes a customized experience for Phone and FaceTime, which have been combined into a single Calls app, as well as Messages, Camera, Photos, and Music. The feature offers a distinct interface with high contrast buttons and large text labels, as well as tools to help trusted supporters tailor the experience for the individual they support. For example, for users who prefer communicating visually, Messages includes an emoji-only keyboard and the option to record a video message to share with loved ones. Users and trusted supporters can also choose between a more visual, grid-based layout for their Home Screen and apps, or a row-based layout for users who prefer text.
With Assistive Access on iPhone, users can choose between a more visual, grid-based layout for their Home Screen and apps, or a row-based layout for those who prefer text.
“The intellectual and developmental disability community is bursting with creativity, but technology often poses physical, visual, or knowledge barriers for these individuals,” said Katy Schmid, senior director of National Program Initiatives at The Arc of the United States. “To have a feature that provides a cognitively accessible experience on iPhone or iPad — that means more open doors to education, employment, safety, and autonomy. It means broadening worlds and expanding potential.”

Live Speech and Personal Voice Advance Speech Accessibility

With Live Speech on iPhone, iPad, and Mac, users can type what they want to say and have it spoken aloud during phone and FaceTime calls, as well as in-person conversations. Users can also save commonly used phrases to chime in quickly during lively conversation with family, friends, and colleagues. Live Speech has been designed to support the millions of people globally who are unable to speak or who have lost their speech over time.
For users at risk of losing their ability to speak — such as those with a recent diagnosis of ALS (amyotrophic lateral sclerosis) or other conditions that can progressively impact speaking ability — Personal Voice is a simple and secure way to create a voice that sounds like them. 
Users can create a Personal Voice by reading along with a randomized set of text prompts to record 15 minutes of audio on iPhone or iPad. This speech accessibility feature uses on-device machine learning to keep users’ information private and secure, and integrates seamlessly with Live Speech so users can speak with their Personal Voice when connecting with loved ones.1
Personal Voice allows users at risk of losing their ability to speak to create a voice that sounds like them, and integrates seamlessly with Live Speech so users can speak with their Personal Voice when connecting with loved ones.
“At the end of the day, the most important thing is being able to communicate with friends and family,” said Philip Green, board member and ALS advocate at the Team Gleason nonprofit, who has experienced significant changes to his voice since receiving his ALS diagnosis in 2018. “If you can tell them you love them, in a voice that sounds like you, it makes all the difference in the world — and being able to create your synthetic voice on your iPhone in just 15 minutes is extraordinary.”

Detection Mode in Magnifier Introduces Point and Speak for Users Who Are Blind or Have Low Vision

Point and Speak in Magnifier makes it easier for users with vision disabilities to interact with physical objects that have several text labels. For example, while using a household appliance — such as a microwave — Point and Speak combines input from the camera, the LiDAR Scanner, and on-device machine learning to announce the text on each button as users move their finger across the keypad.2 Point and Speak is built into the Magnifier app on iPhone and iPad, works great with VoiceOver, and can be used with other Magnifier features such as People Detection, Door Detection, and Image Descriptions to help users navigate their physical environment.

Additional Features

  • Deaf or hard-of-hearing users can pair Made for iPhone hearing devices directly to Mac and customize them for their hearing comfort.3
  • Voice Control adds phonetic suggestions for text editing so users who type with their voice can choose the right word out of several that might sound alike, like “do,” “due,” and “dew.”4 Additionally, with Voice Control Guide, users can learn tips and tricks about using voice commands as an alternative to touch and typing across iPhone, iPad, and Mac.
  • Users with physical and motor disabilities who use Switch Control can turn any switch into a virtual game controller to play their favorite games on iPhone and iPad.
  • For users with low vision, Text Size is now easier to adjust across Mac apps such as Finder, Messages, Mail, Calendar, and Notes.
  • Users who are sensitive to rapid animations can automatically pause images with moving elements, such as GIFs, in Messages and Safari.
  • For VoiceOver users, Siri voices sound natural and expressive even at high rates of speech feedback; users can also customize the rate at which Siri speaks to them, with options ranging from 0.8x to 2x.

Celebrating Global Accessibility Awareness Day Around the World 

To celebrate Global Accessibility Awareness Day, this week Apple is introducing new features, curated collections, and more:
  • SignTime will launch in Germany, Italy, Spain, and South Korea on May 18 to connect Apple Store and Apple Support customers with on-demand sign language interpreters.5 The service is already available for customers in the U.S., Canada, U.K., France, Australia, and Japan.
  • Select Apple Store locations around the world are offering informative sessions throughout the week to help customers discover accessibility features, and Apple Carnegie Library will feature a Today at Apple session with sign language performer and interpreter Justina Miles. And with group reservations — available year-round — Apple Store locations are a place where community groups can learn about accessibility features together.
  • Shortcuts adds Remember This, which helps users with cognitive disabilities create a visual diary in Notes for easy reference and reflection.
  • This week, Apple Podcasts will offer a collection of shows about the impact of accessible technology; the Apple TV app is featuring movies and series curated by notable storytellers from the disability community; Apple Books will spotlight Being Heumann: An Unrepentant Memoir of a Disability Rights Activist, the memoir by disability rights pioneer Judith Heumann; and Apple Music will feature cross-genre American Sign Language (ASL) music videos.
  • This week in Apple Fitness+, trainer Jamie-Ray Hartshorne incorporates ASL while highlighting features available to users as part of an ongoing effort to make fitness more accessible to all. These include Audio Hints, which provide additional short descriptive verbal cues to support users who are blind or have low vision, and Time to Walk and Time to Run episodes that become “Time to Walk or Push” and “Time to Run or Push” for wheelchair users. Additionally, Fitness+ trainers incorporate ASL into every workout and meditation, all videos include closed captioning in six languages, and trainers demonstrate modifications in workouts so users at different levels can join in.
  • The App Store will spotlight three disability community leaders — Aloysius Gan, Jordyn Zimmerman, and Bradley Heaven — each of whom will share their experiences as nonspeaking individuals and the transformative effects of augmentative and alternative communication (AAC) apps in their lives.

  1. Personal Voice can be created using iPhone, iPad, and Mac with Apple silicon, and will be available in English.
  2. Point and Speak will be available on iPhone and iPad devices with the LiDAR Scanner in English, French, Italian, German, Spanish, Portuguese, Chinese, Cantonese, Korean, Japanese, and Ukrainian.
  3. Users will be able to pair Made for iPhone hearing devices with select Mac devices with M1 chip, and all Mac devices with M2 chip.
  4. Voice Control phonetic suggestions will be available in English, Spanish, French, and German.
  5. SignTime sessions are available in the U.S. and Canada using American Sign Language (ASL), in the U.K. using British Sign Language (BSL), in France using French Sign Language (LSF), in Japan using Japanese Sign Language (JSL), and in Australia using Australian Sign Language (Auslan). On May 18, SignTime will be available in Germany using German Sign Language (DGS), in Italy using Italian Sign Language (LIS), in Spain using Spanish Sign Language (LSE), and in South Korea using Korean Sign Language (KSL).

Press Contacts

Apple Media Helpline

media.help@apple.com

Apple Japan Public Relations

press@apple.co.jp