Apple unveils powerful accessibility features coming this year to Mac, iPhone, iPad, Apple Watch, and more

Apple today announced new accessibility features coming later this year, including Accessibility Nutrition Labels, which will provide more detailed information for apps and games on the App Store. Users who are blind or have low vision can explore, learn, and interact using the new Magnifier app for Mac; take notes and perform calculations with the new Braille Access feature; and leverage the powerful camera system of Apple Vision Pro with new updates to visionOS. Additional announcements include Accessibility Reader, a new systemwide reading mode designed with accessibility in mind, along with updates to Live Listen, Background Sounds, Personal Voice, Vehicle Motion Cues, and more. Leveraging the power of Apple silicon — along with advances in on-device machine learning and artificial intelligence — users will experience a new level of accessibility across the Apple ecosystem.
“At Apple, accessibility is part of our DNA,” said Tim Cook, Apple’s CEO, in a statement. “Making technology for everyone is a priority for all of us, and we’re proud of the innovations we’re sharing this year. That includes tools to help people access crucial information, explore the world around them, and do what they love.”
“Building on 40 years of accessibility innovation at Apple, we are dedicated to pushing forward with new accessibility features for all of our products,” said Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives. “Powered by the Apple ecosystem, these features work seamlessly together to bring users new ways to engage with the things they care about most.”
Accessibility Nutrition Labels Come to the App Store
Accessibility Nutrition Labels bring a new section to App Store product pages that will highlight accessibility features within apps and games. These labels give users a new way to learn if an app will be accessible to them before they download it, and give developers the opportunity to better inform and educate their users on features their app supports. This includes VoiceOver, Voice Control, Larger Text, Sufficient Contrast, Reduced Motion, captions, and more. Accessibility Nutrition Labels will be available on the App Store worldwide, and developers can access more guidance on the criteria apps should meet before displaying accessibility information on their product pages.
“Accessibility Nutrition Labels are a huge step forward for accessibility,” said Eric Bridges, the American Foundation for the Blind’s president and CEO, in a statement. “Consumers deserve to know if a product or service will be accessible to them from the very start, and Apple has a long-standing history of delivering tools and technologies that allow developers to build experiences for everyone. These labels will give people with disabilities a new way to easily make more informed decisions and make purchases with a new level of confidence.”
An All-New Magnifier for Mac
Since 2016, Magnifier on iPhone and iPad has given users who are blind or have low vision tools to zoom in, read text, and detect objects around them. This year, Magnifier is coming to Mac to make the physical world more accessible for users with low vision. The Magnifier app for Mac connects to a user’s camera so they can zoom in on their surroundings, such as a screen or whiteboard. Magnifier works with Continuity Camera on iPhone as well as attached USB cameras, and supports reading documents using Desk View.
With multiple live session windows, users can multitask by viewing a presentation with a webcam while simultaneously following along in a book using Desk View. With customized views, users can adjust brightness, contrast, color filters, and even perspective to make text and images easier to see. Views can also be captured, grouped, and saved to return to later. Additionally, Magnifier for Mac is integrated with another new accessibility feature, Accessibility Reader, which transforms text from the physical world into a custom legible format.
A New Braille Experience
Braille Access is an all-new experience that turns iPhone, iPad, Mac, and Apple Vision Pro into a full-featured braille note taker that’s deeply integrated into the Apple ecosystem. With a built-in app launcher, users can easily open any app by typing with Braille Screen Input or a connected braille device. With Braille Access, users can quickly take notes in braille format and perform calculations using Nemeth Braille, a braille code often used in classrooms for math and science. Users can open Braille Ready Format (BRF) files directly from Braille Access, unlocking a wide range of books and files previously created on a braille note taking device. And an integrated form of Live Captions allows users to transcribe conversations in real time directly on braille displays.
Introducing Accessibility Reader
Accessibility Reader is a new systemwide reading mode designed to make text easier to read for users with a wide range of disabilities, such as dyslexia or low vision. Available on iPhone, iPad, Mac, and Apple Vision Pro, Accessibility Reader gives users new ways to customize text and focus on content they want to read, with extensive options for font, color, and spacing, as well as support for Spoken Content. Accessibility Reader can be launched from any app, and is built into the Magnifier app for iOS, iPadOS, and macOS, so users can interact with text in the real world, like in books or on dining menus.
Live Captions Arrive on Apple Watch
For users who are deaf or hard of hearing, Live Listen controls come to Apple Watch with a new set of features, including real-time Live Captions. Live Listen turns iPhone into a remote microphone to stream content directly to AirPods, Made for iPhone hearing aids, or Beats headphones. When a session is active on iPhone, users can view Live Captions of what their iPhone hears on a paired Apple Watch while listening along to the audio. Apple Watch serves as a remote control to start or stop Live Listen sessions, or jump back in a session to capture something that may have been missed. With Apple Watch, Live Listen sessions can be controlled from across the room, so there’s no need to get up in the middle of a meeting or during class. Live Listen can be used along with hearing health features available on AirPods Pro 2, including the first-of-its-kind clinical-grade Hearing Aid feature.
An Enhanced View with Apple Vision Pro
For users who are blind or have low vision, visionOS will expand vision accessibility features using the advanced camera system on Apple Vision Pro. With powerful updates to Zoom, users can magnify everything in view — including their surroundings — using the main camera. For VoiceOver users, Live Recognition in visionOS uses on-device machine learning to describe surroundings, find objects, read documents, and more. For accessibility developers, a new API will enable approved apps to access the main camera to provide live, person-to-person assistance for visual interpretation in apps like Be My Eyes, giving users more ways to understand their surroundings hands-free.
Additional Updates
• Background Sounds becomes easier to personalize with new EQ settings, the option to stop automatically after a period of time, and new actions for automations in Shortcuts. Background Sounds can help minimize distractions to increase a sense of focus and relaxation, which some users find can help with symptoms of tinnitus.
• For users at risk of losing their ability to speak, Personal Voice becomes faster, easier, and more powerful than ever, leveraging advances in on-device machine learning and artificial intelligence to create a smoother, more natural-sounding voice in less than a minute, using only 10 recorded phrases. Personal Voice will also add support for Spanish (Mexico).
• Vehicle Motion Cues, which can help reduce motion sickness when riding in a moving vehicle, comes to Mac, along with new ways to customize the animated onscreen dots on iPhone, iPad, and Mac.
• Eye Tracking users on iPhone and iPad will now have the option to use a switch or dwell to make selections. Keyboard typing when using Eye Tracking or Switch Control is now easier on iPhone, iPad, and Apple Vision Pro with improvements including a new keyboard dwell timer, reduced steps when typing with switches, and enabling QuickPath for iPhone and Apple Vision Pro.
• With Head Tracking, users will be able to more easily control iPhone and iPad with head movements, similar to Eye Tracking.
• For users with severe mobility disabilities, iOS, iPadOS, and visionOS will add a new protocol to support Switch Control for Brain Computer Interfaces (BCIs), an emerging technology that allows users to control their device without physical movement.
• Assistive Access adds a new custom Apple TV app with a simplified media player. Developers will also get support in creating tailored experiences for users with intellectual and developmental disabilities using the Assistive Access API.
• Music Haptics on iPhone becomes more customizable with the option to experience haptics for a whole song or for vocals only, as well as the option to adjust the overall intensity of taps, textures, and vibrations.
• Sound Recognition adds Name Recognition, a new way for users who are deaf or hard of hearing to know when their name is being called.
• Voice Control introduces a new programming mode in Xcode for software developers with limited mobility. Voice Control also adds vocabulary syncing across devices, and will expand language support to include Korean, Arabic (Saudi Arabia), Turkish, Italian, Spanish (Latin America), Mandarin Chinese (Taiwan), English (Singapore), and Russian.
• Live Captions expands language support to include English (India, Australia, UK, Singapore), Mandarin Chinese (Mainland China), Cantonese (Mainland China, Hong Kong), Spanish (Latin America, Spain), French (France, Canada), Japanese, German (Germany), and Korean.
• Updates to CarPlay include support for Large Text. With updates to Sound Recognition in CarPlay, drivers or passengers who are deaf or hard of hearing can now be notified of the sound of a crying baby, in addition to sounds outside the car such as horns and sirens.
• Share Accessibility Settings is a new way for users to quickly and temporarily share their accessibility settings with another iPhone or iPad. This is great for borrowing a friend’s device or using a public kiosk in a setting like a cafe.
MacDailyNews Take: Go to Settings > Accessibility on your Mac, iPhone, iPad, Apple Watch, and Apple TV and explore Apple’s many accessibility tools – there’s something in there for everybody!
Apple’s accessibility features are simply unmatched. They’re light years ahead of would-be rivals. — MacDailyNews, May 14, 2018