
Apple on Wednesday announced new accessibility features for iPhones and iPads coming later this year. These features offer a glimpse of Apple’s next OS, as they are expected to arrive on all supported iPhones with iOS 18. Notable additions include Eye Tracking and Listen for Atypical Speech. The former lets users navigate their iPhones and iPads with just their eyes, while the latter uses on-device machine learning to recognize users’ speech patterns.
Apple revealed the upcoming Eye Tracking accessibility feature, which is powered by AI and designed for users with physical disabilities. It allows users to navigate their iPhone and iPad with their eyes alone. The feature uses the front camera for setup and calibration, and relies on on-device machine learning so that all the data it needs stays on the device and is not shared with anyone, not even Apple.
People who are deaf or hard of hearing can feel music with a new accessibility feature called Music Haptics. The feature uses the Taptic Engine in the iPhone to play taps, textures, and refined vibrations in sync with the audio of the music that is playing. Apple will also offer Music Haptics as an API so developers can make music more accessible in their own apps.
Your iPhone will soon get the Vocal Shortcuts accessibility feature, which lets iPhone and iPad users assign custom utterances that Siri can understand to launch shortcuts. In other words, you can set specific voice commands that tell Siri to complete complex tasks you have defined. In addition, there is another voice-centric feature called Listen for Atypical Speech, which uses on-device machine learning to recognize a wider range of speech patterns. Also coming to iPhones and iPads, it is designed for people with acquired or progressive conditions that affect their speech.
Apple has also announced a useful feature for people prone to motion sickness. The new Vehicle Motion Cues feature reduces motion sickness by displaying small dots on the screen that move in the opposite direction of the vehicle. So if your car takes a right turn, the dots on the screen shift toward the left, making it more comfortable to view the screen while in motion.
CarPlay is getting three new accessibility features: Voice Control, Color Filters, and Sound Recognition. Voice Control lets users navigate CarPlay and control apps with their voice. Sound Recognition alerts drivers when there is a siren nearby, helping drivers who are deaf or hard of hearing stay vigilant. For colorblind users, Color Filters make the CarPlay interface visually easier to use.
These are the features coming to iPhones and iPads later this year, likely with the next iOS version, iOS 18. Most of them are powered by AI or machine learning, which hints at a heavy AI focus when iOS 18 is unveiled next month at WWDC.
Apart from this, there are some accessibility features coming to visionOS, such as Live Captions for FaceTime. The next version of visionOS will also let users reduce transparency and will add options such as Smart Invert, Dim Flashing Lights, and more.
Beyond these new additions, Apple has also updated its existing accessibility features.
Author: Pranav Sawant