Apple has unveiled Eye Tracking, a new accessibility feature that lets users with physical disabilities navigate their iPhones and iPads using only their eyes. Powered by artificial intelligence, the feature relies solely on the front-facing camera and takes just seconds to set up and calibrate.
Eye Tracking works across iPadOS and iOS apps without any additional hardware or accessories. The front-facing camera tracks eye movement, and Dwell Control handles selection: by resting their gaze on an on-screen element, users can navigate apps and access functions such as physical buttons, swipes, and other gestures solely with their eyes.
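Apple has not published the internals of Eye Tracking, but the dwell idea is easy to illustrate with ARKit's public face-tracking API. The minimal Swift sketch below reads an estimated gaze point from ARFaceAnchor's lookAtPoint and fires an action once the gaze holds steady for a moment; the class name, thresholds, and callback are hypothetical stand-ins, not Apple's implementation.

```swift
import ARKit
import Foundation

/// Illustrative sketch only: approximates gaze-dwell activation using
/// ARKit's public face-tracking API. Thresholds and names are assumptions.
final class GazeDwellController: NSObject, ARSessionDelegate {
    private let session = ARSession()
    private var dwellStart: Date?
    private var lastGaze: simd_float3?

    private let dwellDuration: TimeInterval = 1.0  // assumed dwell time
    private let movementTolerance: Float = 0.02    // assumed stability radius, in meters

    /// Stand-in for activating whatever element the gaze rests on.
    var onDwellActivate: (() -> Void)?

    func start() {
        // Face tracking requires a TrueDepth-capable front camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        // Estimated gaze point in the face's coordinate space.
        let gaze = face.lookAtPoint

        if let last = lastGaze, simd_distance(last, gaze) < movementTolerance {
            // Gaze is holding steady: fire once the dwell timer elapses.
            if let start = dwellStart, Date().timeIntervalSince(start) >= dwellDuration {
                onDwellActivate?()
                dwellStart = nil  // fire once per dwell
            }
        } else {
            // Gaze moved: restart the dwell timer.
            dwellStart = Date()
        }
        lastGaze = gaze
    }
}
```

In the shipping feature, activation maps onto real system gestures; here the callback merely marks the moment a dwell would trigger a tap.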
All data used to set up and control Eye Tracking is processed with on-device machine learning and remains securely on the device. No data is shared with Apple, preserving user privacy and security.
Eye Tracking will be available later this year with the release of iOS 18 and iPadOS 18, further expanding Apple's accessibility offerings.