Yesterday saw Apple announce a series of accessibility improvements, including a new “Assistive Access” interface option, Live Speech, and Personal Voice. People who rely on accessibility features have welcomed the news, although some would like to see greater ambition in at least one area.
Personal Voice was the feature that got the most attention, as it needs only 15 minutes of training to create a synthesized version of a user’s own voice on their iPhone…
Stephen Scott is blind, and is the creator and host of Double Click, a daily radio show about how blind people use technology. The show airs on AMI-audio across Canada at 12 noon ET, and is also available as a podcast.
He said that it can sometimes be difficult for sighted people to understand how revolutionary a simple-sounding technology can be.
Having the ability to identify all the different buttons on my microwave might seem pretty unremarkable to most people, but a lot of the time when you’re blind you’re told only what you need to know and nothing more, so you might not even know that your microwave has a defrost function. With something like the new Point and Speak feature, I can find out for myself the functions of my microwave and other kitchen appliances.
But he says some accessibility features can also benefit everyone.
The beauty of accessibility features is that they often help more people than intended. Assistive Access will be very useful to many who just want a more streamlined experience using their devices.
This year I’m looking forward to the Reality Pro and beyond. Apple is known for its commitment to accessibility across its product lines, so I’ll be interested to see how it implements accessibility in a whole new product category.
Colin Hughes, a former BBC producer and advocate for accessible technology, said that as a quadriplegic, being able to control technology without using his hands is key.
As someone with a severe physical disability who relies on voice to get things done, the Apple accessibility features that make the biggest difference to me are Voice Control and all the hands-free features that Siri offers.
I’m glad to see Apple improving Voice Control by adding voice suggestions for editing text so users can choose the correct word among several that might sound similar, such as “do,” “due,” and “dew.”
In addition, the Voice Control Guide, a feature similar to voice access on Windows where users can learn tips and tricks for using voice commands, will be useful, especially for newcomers to Voice Control.
We’ve previously shared how Apple tech helps him, with a day-in-the-life video showing how HomeKit in particular offers him a great deal of autonomy.
Hughes also welcomed the Personal Voice announcement, even though he doesn’t need the feature yet.
For people with progressive disabilities like mine, it’s heartening to know that new features like Live Speech and Personal Voice will be there when we need them. It’s reassuring to see Apple thinking ahead so comprehensively.
However, he believes that Apple needs to set more ambitious goals for Voice Control.
The new text-editing feature looks great, but it’s a bit like shutting the stable door after the horse has bolted. The company should be using AI to improve dictation accuracy so that less editing is needed in the first place.
Apple must also use sound-isolating microphone technology and AI to block out background noise when dictating on a Mac or iPhone in noisy environments. Voice Control dictation is nowhere near the 98–99 percent accuracy that users achieve with an app like Dragon Professional, which unfortunately is now only available on Windows PCs.
Voice Control also still struggles with proper names and foreign names. Even if you add proper names to Voice Control’s vocabulary, it ignores their capitalization.
Again, more reliable dictation features benefit everyone, not just people with disabilities.
I’d love to hear more from the company about how it’s trying to improve spelling accuracy for everyone.
Hughes also has a proposed next step for Voice Control.
I’d like to see Apple do more for the estimated 250 million people who have non-standard speech and have trouble getting their words understood. Right now, you can’t train Voice Control to recognize words the way you pronounce them, so I’d like to see a dedicated, personalized speech recognition component in Voice Control in the future.
With higher accuracy and personalized speech recognition, Apple will be able to help more people be heard.
Apple has often said that it wants to make its products useful to as many people as possible, and that it does not seek a financial return on its investment in accessibility technology.
If you have a disability, please share your own thoughts on yesterday’s announcement of Apple’s accessibility improvements — and if you don’t, do you see any of them having broader appeal?