Eye control technology may seem like something straight out of a science fiction movie, but for millions of iPhone users worldwide, it’s about to become a reality. Apple has confirmed that it is bringing eye tracking to iPhone and iPad. The feature uses artificial intelligence (AI) to let users control their Apple devices with only their eyes.
According to Apple, the eye tracking feature utilizes the front-facing camera for setup and calibration, which takes just seconds. With on-device machine learning, all data used for setup and control remains securely stored on the device and is not shared with Apple. Last week, Apple unveiled several new accessibility features, including the eye-tracking tool, emphasizing its commitment to inclusive design.
Apple CEO Tim Cook expressed his belief in the transformative power of innovation to improve lives, highlighting the company’s longstanding commitment to building accessibility into both hardware and software. Eye tracking requires no additional hardware or accessories and works across iPadOS and iOS apps. Once it is set up, users can navigate an app’s elements with their gaze and use Dwell Control to activate them, reaching additional functions such as physical buttons, swipes, and other gestures with their eyes alone.
Although the feature won’t be available until later in the year, it has already generated significant interest, sparking discussions on platforms like X (formerly Twitter). Some users see it as a boon for those with disabilities, while others humorously comment on the potential for increased laziness in society.
In addition to eye tracking, Apple unveiled other new capabilities, including a function aimed at reducing motion sickness in car passengers. This feature, called Vehicle Motion Cues, adds animated dots to the screen that represent changes in vehicle motion, helping to reduce the sensory conflict that contributes to motion sickness. Another feature, Music Haptics, uses the iPhone’s Taptic Engine to let users who are deaf or hard of hearing experience music through vibrations synchronized with the audio.
Furthermore, Apple announced new speech features to aid customers with speech impairments, enabling them to assign custom utterances that Siri can recognize to launch shortcuts and other tasks. These initiatives underscore Apple’s ongoing effort to make technology more accessible and inclusive for all users.