Apple is adding new accessibility features to the iPhone, including one that lets you navigate the device with your eyes.
Apple announced several new accessibility features, including one for Siri called Vocal Shortcuts. With it, iPhone and iPad users can assign a custom phrase that triggers the digital assistant to launch a shortcut or handle a complex task. For example, you could set up Siri to turn on Low Power Mode when you say "I'm running out of juice."
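Vocal Shortcuts phrases are configured by the user in Settings rather than in code, but developers can make actions available to trigger this way by exposing App Intents, which surface in the Shortcuts app. Below is a minimal sketch assuming a hypothetical water-tracking app; the intent name, dialog, and phrase are illustrative, not Apple's.

```swift
import AppIntents

// A minimal App Intent (iOS 16+). Anything exposed this way appears in the
// Shortcuts app, so a user could assign it a custom Vocal Shortcuts phrase
// such as "Hydrate me" in Settings > Accessibility.
struct LogWaterIntent: AppIntent {
    static var title: LocalizedStringResource = "Log Water"
    static var description = IntentDescription("Logs one glass of water.")

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // Hypothetical app logic would go here, e.g. writing to the app's store.
        return .result(dialog: "Logged a glass of water.")
    }
}

// Optionally register a default Siri phrase for the intent as well.
struct WaterAppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: LogWaterIntent(),
            phrases: ["Log water in \(.applicationName)"],
            shortTitle: "Log Water",
            systemImageName: "drop"
        )
    }
}
```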
Another new accessibility feature, called "Listen for Atypical Speech," uses on-device machine learning to recognize a user's speech patterns. It is designed for people whose speech is affected by conditions such as cerebral palsy, amyotrophic lateral sclerosis (ALS), or stroke, helping them customize and control their iPhone with their voice.
Mark Hasegawa-Johnson, of the Speech Accessibility Project at the Beckman Institute for Advanced Science and Technology at the University of Illinois Urbana-Champaign, says, "Artificial intelligence has the potential to improve speech recognition for millions of people with atypical speech, so we are thrilled that Apple is bringing these new accessibility features to consumers."
Apple is also bringing Eye Tracking to the iPhone and iPad, a feature that uses AI to let users navigate their devices with just their eyes.