Apple has unveiled a range of new accessibility features, scheduled to ship later this year, that promise to change how people with disabilities interact with iPhones, iPads, and CarPlay. To build these tools, Apple worked with the Speech Accessibility Project at the Beckman Institute for Advanced Science and Technology at the University of Illinois Urbana-Champaign, an initiative it supports alongside companies such as Google and Amazon.

One of the most significant additions is built-in Eye Tracking. The feature eliminates the need for external hardware, a considerable hurdle for some users due to cost or compatibility issues. With Apple’s solution, users can navigate apps, select options, and control their devices entirely through eye movements. The system relies on the front-facing camera and on-device machine learning to interpret where users are looking and translate that intent into actions, giving them greater independence and control: they can text, browse the web, or play games without physically handling the device.

“We believe deeply in the transformative power of innovation to enrich lives,” said Apple CEO Tim Cook. “That’s why for nearly 40 years, Apple has championed inclusive design by embedding accessibility at the core of our hardware and software. We’re continuously pushing the boundaries of technology, and these new features reflect our long-standing commitment to delivering the best possible experience to all of our users.” Eye Tracking, Apple notes, will work across iPadOS and iOS apps. “With Eye Tracking, users can navigate through the elements of an app and use Dwell Control to activate each element, accessing additional functions such as physical buttons, swipes, and other gestures solely with their eyes,” the company adds.
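Apple hasn’t detailed how Dwell Control is implemented, but the underlying idea of dwell-based activation is simple: an on-screen element fires only after the gaze point rests on it for a set interval. The Swift sketch below illustrates that timer logic in isolation; the DwellSelector type, its one-second threshold, and the gazeInsideTarget input are assumptions for illustration, not Apple’s API.

```swift
import Foundation

/// Illustrative dwell-activation logic: an element activates only after the
/// gaze point has stayed inside its bounds for a full dwell interval.
/// The type, threshold, and inputs are assumptions, not Apple's implementation.
struct DwellSelector {
    /// Assumed dwell threshold; the real feature lets users tune timing.
    let dwellInterval: TimeInterval = 1.0
    private var dwellStart: Date?

    /// Feed in every gaze sample; returns true when the element should activate.
    mutating func update(gazeInsideTarget: Bool, now: Date = Date()) -> Bool {
        guard gazeInsideTarget else {
            dwellStart = nil                    // gaze left the element: reset
            return false
        }
        if let start = dwellStart {
            guard now.timeIntervalSince(start) >= dwellInterval else { return false }
            dwellStart = nil                    // fire once, then require re-entry
            return true
        }
        dwellStart = now                        // gaze just entered the element
        return false
    }
}
```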

Apple is further enhancing accessibility for users who rely on voice commands. “Vocal Shortcuts” allows users to assign custom sounds or words (not necessarily coherent phrases) to launch shortcuts and execute complex tasks through Siri. For example, a simple “Ah!” could launch a specific app or initiate a pre-programmed routine. Additionally, “Listen for Atypical Speech” leverages machine learning to recognize a user’s unique speech patterns. The feature is a game-changer for individuals with conditions that affect speech, such as ALS or cerebral palsy, as well as those with acquired speech impairments.
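Apple hasn’t published developer details for Vocal Shortcuts, but conceptually it is a mapping from a recognized trigger utterance to an action. The plain-Swift sketch below shows only that mapping idea; the VocalShortcuts type and its triggers are hypothetical, not Apple’s API.

```swift
/// Hypothetical sketch of a vocal-shortcut table: a recognized trigger
/// utterance maps directly to an action. This mirrors the idea described
/// above; it is not Apple's Vocal Shortcuts API.
struct VocalShortcuts {
    private var actions: [String: () -> Void] = [:]

    /// Register a trigger sound or word, e.g. "Ah!".
    mutating func register(trigger: String, action: @escaping () -> Void) {
        actions[trigger.lowercased()] = action
    }

    /// Call with whatever the speech recognizer heard.
    func handle(recognizedUtterance: String) {
        actions[recognizedUtterance.lowercased()]?()
    }
}

// Example: a short "Ah!" kicks off a pre-programmed routine.
var shortcuts = VocalShortcuts()
shortcuts.register(trigger: "Ah!") { print("Launching the camera routine…") }
shortcuts.handle(recognizedUtterance: "ah!")
```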

For the deaf and hard-of-hearing community, Apple introduces “Music Haptics.” The feature uses the iPhone’s Taptic Engine to translate music into a series of taps, textures, and vibrations that correspond to the audio. Millions of songs on Apple Music will be compatible with the experience, allowing users to feel the rhythm and nuances of music in a new way. Apple is also releasing an API so developers can integrate Music Haptics into their own music apps. Meanwhile, visionOS, the operating system that powers Apple Vision Pro, will gain system-wide “Live Captions,” providing real-time transcriptions that facilitate communication during FaceTime calls and let users follow audio from apps.
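As for the Music Haptics API, Apple hasn’t described its surface yet. Apple does already ship Core Haptics for custom vibration patterns, though, and the sketch below uses it to show the kind of output involved: a short run of transient taps timed like a beat. The playBeatTaps function and its timing values are assumptions for illustration, not the Music Haptics API.

```swift
import CoreHaptics

/// Illustration of haptic "taps" using Core Haptics, Apple's existing framework
/// for custom vibrations. This only demonstrates the kind of output Music
/// Haptics produces; it is not the new API itself.
func playBeatTaps() throws {
    // Haptics are unavailable on some devices (and in the simulator).
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    let engine = try CHHapticEngine()
    try engine.start()

    // Four transient taps, one every half second, like a simple 120 BPM pulse.
    let taps = (0..<4).map { beat in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.6)
            ],
            relativeTime: Double(beat) * 0.5
        )
    }

    let pattern = try CHHapticPattern(events: taps, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```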

In the car, Apple’s “Vehicle Motion Cues” will display animated dots along the edges of the screen that respond to the vehicle’s motion, subtly swaying in the direction of travel. By visually aligning with the vehicle’s movement, the feature aims to reduce the sensory conflict that triggers motion sickness for some users. Drivers will also be able to control apps and navigate CarPlay using voice commands alone, reducing the need for visual attention on the screen and promoting safer driving. “Sound Recognition,” meanwhile, alerts drivers to crucial sounds such as sirens and car horns, enhancing awareness for deaf and hard-of-hearing users, and CarPlay gains “Color Filters” and text size adjustments as well.
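Apple hasn’t said how Vehicle Motion Cues is built, but the underlying idea is to drive on-screen cues from measured motion. The Swift sketch below uses Core Motion, Apple’s existing sensor framework, to turn lateral acceleration into a sway offset for a cue layer; the MotionCueSampler type, update rate, and scaling factor are illustrative assumptions, not Apple’s implementation.

```swift
import CoreMotion

/// Conceptual sketch: sample device motion and convert lateral acceleration
/// into a horizontal offset for on-screen dots that sway with the vehicle.
/// The update rate and scaling are assumptions for illustration only.
final class MotionCueSampler {
    private let motion = CMMotionManager()

    /// Starts delivering a sway offset (in points) for the cue layer.
    func start(onOffset: @escaping (Double) -> Void) {
        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 1.0 / 60.0
        motion.startDeviceMotionUpdates(to: .main) { data, _ in
            guard let data = data else { return }
            // Lateral user acceleration (in g) scaled to a modest on-screen sway.
            onOffset(data.userAcceleration.x * 40.0)
        }
    }

    func stop() {
        motion.stopDeviceMotionUpdates()
    }
}
```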