MacTech.com

Apple announces new accessibility features including Assistive Access, Live Speech, more

Apple has previewed software features for cognitive, vision, hearing, and mobility accessibility, along with tools for individuals who are nonspeaking or at risk of losing their ability to speak. 

The tech giant says these updates draw on advances in hardware and software, include on-device machine learning to ensure user privacy, and expand on Apple’s long-standing commitment to making products for everyone. 

Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives, says Apple works in deep collaboration with community groups representing a broad spectrum of users with disabilities to develop accessibility features that make a real impact on people’s lives. 

Later this year, users with cognitive disabilities will be able to use iPhone and iPad with greater ease and independence via Assistive Access; nonspeaking individuals will be able to type to speak during calls and conversations with Live Speech; and those at risk of losing their ability to speak will be able to use Personal Voice to create a synthesized voice that sounds like them for connecting with family and friends.

For users who are blind or have low vision, Detection Mode in Magnifier offers Point and Speak, which identifies text users point toward and reads it out loud to help them interact with physical objects such as household appliances.

Additional features are also planned.
Article provided with permission from AppleWorld.Today