
Apple introduces new accessibility features coming later this year across the Apple ecosystem

Apple has been introducing a range of new features across its ecosystem to enhance the user experience and security of its devices and apps. Recently, the company unveiled several new accessibility features that will arrive later this year across the Apple ecosystem.

“At Apple, accessibility is part of our DNA,” said Tim Cook, Apple’s CEO. “Making technology for everyone is a priority for all of us, and we’re proud of the innovations we’re sharing this year. That includes tools to help people access crucial information, explore the world around them, and do what they love.”

Apple introduces new accessibility features

Accessibility Nutrition Labels come to the App Store

Accessibility Nutrition Labels bring a new section to App Store product pages that will highlight accessibility features within apps and games. These labels give users a new way to learn if an app will be accessible to them before they download it, and allow developers to better inform and educate their users on features their app supports. This includes VoiceOver, Voice Control, Larger Text, Sufficient Contrast, Reduced Motion, captions, and more. Accessibility Nutrition Labels will be available on the App Store worldwide.
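
The labels themselves are declared by developers in App Store Connect rather than in code, but the features they describe map to Apple's standard accessibility APIs. As a minimal, illustrative SwiftUI sketch (the view and its content are hypothetical, not from Apple's announcement), support for VoiceOver, Larger Text, and Reduced Motion might look like this:

```swift
import SwiftUI

// Hypothetical view illustrating the kind of support an Accessibility
// Nutrition Label would describe; the label itself is declared in
// App Store Connect, not in code.
struct FeatureCard: View {
    // Respect the user's Reduce Motion setting.
    @Environment(\.accessibilityReduceMotion) private var reduceMotion
    @State private var expanded = false

    var body: some View {
        VStack(alignment: .leading, spacing: 8) {
            // Dynamic Type text styles scale automatically for Larger Text.
            Text("New accessibility features")
                .font(.headline)

            // A concise description for VoiceOver users.
            Image(systemName: "figure.roll")
                .accessibilityLabel("Accessibility features icon")

            Button(expanded ? "Show less" : "Show more") {
                if reduceMotion {
                    expanded.toggle()                   // skip the animation
                } else {
                    withAnimation { expanded.toggle() } // animate when allowed
                }
            }
        }
    }
}
```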

All-New Magnifier for Mac

The Magnifier feature has been available on iPhone and iPad for users who are blind or have low vision, with tools to zoom in, read text, and detect objects around them. This year, Magnifier will be made available on Mac as well. The Magnifier app for Mac connects to a user’s camera so they can zoom in on their surroundings, such as a screen or whiteboard.

With multiple live session windows, users can multitask by viewing a presentation with a webcam while simultaneously following along in a book using Desk View. With customized views, users can adjust brightness, contrast, color filters, and even perspective to make text and images easier to see. Views can also be captured, grouped, and saved to add to later on. Additionally, Magnifier for Mac is integrated with another new accessibility feature, Accessibility Reader, which transforms text from the physical world into a custom legible format.

A new Braille experience

Braille Access is an all-new experience that turns iPhone, iPad, Mac, and Apple Vision Pro into a full-featured braille note taker that’s deeply integrated into the Apple ecosystem. With a built-in app launcher, users can easily open any app by typing with Braille Screen Input or a connected braille device. Features include support for Nemeth Braille, Braille Ready Format (BRF) files, and an integrated form of Live Captions that transcribes conversations in real time directly on braille displays.

Accessibility Reader

Accessibility Reader is a new systemwide reading mode designed to make text easier to read for users with a wide range of disabilities, such as dyslexia or low vision. Available on iPhone, iPad, Mac, and Apple Vision Pro, Accessibility Reader gives users new ways to customize text and focus on content they want to read, with extensive options for font, color, and spacing, as well as support for Spoken Content. Accessibility Reader can be launched from any app and is built into the Magnifier app for iOS, iPadOS, and macOS.

Live Captions on Apple Watch

Live Listen controls come to Apple Watch with a new set of features, including real-time Live Captions. Live Listen turns an iPhone into a remote microphone that streams content directly to AirPods, Made for iPhone hearing aids, or Beats headphones. When a session is active on an iPhone, users can view Live Captions of what their iPhone hears on a paired Apple Watch while listening along to the audio. Apple Watch also serves as a remote control to start or stop Live Listen sessions, or to jump back within a session to capture something that may have been missed.

Live Listen can be used along with hearing health features available on AirPods Pro 2, including the first-of-its-kind clinical-grade Hearing Aid feature.

Enhanced view with Apple Vision Pro

visionOS will expand vision accessibility features using the advanced camera system on Apple Vision Pro. Updates include Zoom, which lets users magnify everything in view using the main camera, and Live Recognition, which describes surroundings, finds objects, reads documents, and more. For accessibility developers, a new API will enable approved apps to access the main camera to provide live, person-to-person assistance for visual interpretation in apps like Be My Eyes.
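
Apple doesn't spell out the new camera API here. As a rough sketch only, assuming it follows the pattern of the existing ARKit CameraFrameProvider on visionOS (where main camera access is already gated behind an entitlement and user permission), an approved visual-interpretation app might stream frames along these lines:

```swift
import ARKit  // ARKit for visionOS

// Rough sketch: the accessibility API announced here isn't documented yet,
// so this reuses the existing CameraFrameProvider pattern as an assumption.
func streamMainCameraFrames() async throws {
    let session = ARKitSession()
    let cameraProvider = CameraFrameProvider()

    // Requires the main-camera-access entitlement and the user's consent.
    try await session.run([cameraProvider])

    // Pick a supported format for the main (world-facing) camera.
    guard let format = CameraVideoFormat
            .supportedVideoFormats(for: .main, cameraPositions: [.left])
            .first,
          let updates = cameraProvider.cameraFrameUpdates(for: format)
    else { return }

    // Each frame could be forwarded to a remote assistant or an
    // on-device scene describer for live visual interpretation.
    for await frame in updates {
        if let sample = frame.sample(for: .left) {
            _ = sample.pixelBuffer  // CVPixelBuffer for downstream processing
        }
    }
}
```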

Additional updates

  • Background Sounds become easier to personalize with new EQ settings, the option to stop automatically after a period of time, and new actions for automations in Shortcuts.
  • For users at risk of losing their ability to speak, Personal Voice becomes faster, easier, and more powerful than ever, leveraging advances in on-device machine learning and artificial intelligence to create a smoother, more natural-sounding voice in less than a minute, using only 10 recorded phrases. Personal Voice will also add support for Spanish (Mexico)
  • Vehicle Motion Cues, which can help reduce motion sickness when riding in a moving vehicle, comes to Mac, along with new ways to customize the animated onscreen dots on iPhone, iPad, and Mac
  • Eye Tracking users on iPhone and iPad will now have the option to use a switch or dwell to make selections. Keyboard typing when using Eye Tracking or Switch Control is now easier on iPhone, iPad, and Apple Vision Pro, with improvements including a new keyboard dwell timer, reduced steps when typing with switches, and enabling QuickPath for iPhone and Vision Pro
  • With Head Tracking, users will be able to more easily control iPhone and iPad with head movements, similar to Eye Tracking
  • For users with severe mobility disabilities, iOS, iPadOS, and visionOS will add a new protocol to support Switch Control for Brain Computer Interfaces (BCIs), an emerging technology that allows users to control their device without physical movement
  • Assistive Access adds a new custom Apple TV app with a simplified media player. Developers will also get support in creating tailored experiences for users with intellectual and developmental disabilities using the Assistive Access API (see the sketch after this list)
  • Music Haptics on iPhone becomes more customizable with the option to experience haptics for a whole song or for vocals only, as well as the option to adjust the overall intensity of taps, textures, and vibrations
  • Sound Recognition adds Name Recognition, a new way for users who are deaf or hard of hearing to know when their name is being called
  • Voice Control introduces a new programming mode in Xcode for software developers with limited mobility. Voice Control also adds vocabulary syncing across devices, and will expand language support to include Korean, Arabic (Saudi Arabia), Turkish, Italian, Spanish (Latin America), Mandarin Chinese (Taiwan), English (Singapore), and Russian
  • Live Captions expands language support to include English (India, Australia, UK, Singapore), Mandarin Chinese (Mainland China), Cantonese (Mainland China, Hong Kong), Spanish (Latin America, Spain), French (France, Canada), Japanese, German (Germany), and Korean
  • Updates to CarPlay include support for Large Text. With updates to Sound Recognition in CarPlay, drivers or passengers who are deaf or hard of hearing can now be notified of the sound of a crying baby, in addition to sounds outside the car, such as horns and sirens
  • Share Accessibility Settings is a new way for users to quickly and temporarily share their accessibility settings with another iPhone or iPad
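
Apple doesn't document the Assistive Access API in this announcement either. Purely as an illustration, assuming it exposes a SwiftUI scene for a simplified in-app experience (the AssistiveAccess scene and the views below are assumptions, not confirmed details), a media app might declare a streamlined interface like this:

```swift
import SwiftUI

@main
struct MediaApp: App {
    var body: some Scene {
        // The app's standard experience.
        WindowGroup {
            FullPlayerView()
        }

        // Assumption: a dedicated scene shown when the device is in
        // Assistive Access, offering fewer choices and larger controls.
        AssistiveAccess {
            SimplifiedPlayerView()
        }
    }
}

// Hypothetical simplified player: one clear action, large tap target.
struct SimplifiedPlayerView: View {
    var body: some View {
        Button("Play show") {
            // start playback
        }
        .font(.largeTitle)
    }
}

struct FullPlayerView: View {
    var body: some View {
        Text("Full media player")
    }
}
```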

To celebrate Global Accessibility Awareness Day, Apple Retail is introducing dedicated tables spotlighting accessibility features on a variety of devices in select store locations throughout May. Sessions can be scheduled at all Apple Store locations worldwide through Group Booking or by visiting a nearby store.

There are some new additions to the Apple apps as well:

  • Apple Music shares the story of artist Kiddo K and the power of music haptics for users who are deaf or hard of hearing, unveils updates to its Haptics playlists, and launches a brand-new playlist of ASL interpretations of music videos to sit alongside its Saylists playlists.
  • Apple Fitness+ welcomes Chelsie Hill as a guest in a Dance workout with Fitness+ trainer Ben Allen. The workout is available now in the Fitness+ app
  • Apple TV+ shares a behind-the-scenes look at the making of the new Apple Original film Deaf President Now!, which premieres on Apple TV+ on May 16.
  • Apple Books, Apple Podcasts, Apple TV, and Apple News will spotlight stories of people with disabilities and those who are working to make the world more accessible for everyone.
  • The App Store is sharing a collection of apps and games designed to be accessible to everyone, in addition to featuring the story of Klemens Strasser, a developer guided by a philosophy of making accessible apps and games like The Art of Fauna.
  • The Shortcuts app adds Hold That Thought, a shortcut that prompts users to capture and recall information in a note so interruptions don’t derail their flow. The Accessibility Assistant shortcut has been added to Shortcuts on Apple Vision Pro to help recommend accessibility features based on user preferences
  • New videos on the Apple Support accessibility playlist include features like Eye Tracking, Vocal Shortcuts, and Vehicle Motion Cues, as well as a library of videos to help everyone personalize their iPhone, iPad, Mac, Apple Watch, and Apple Vision Pro to work best for them.
