Apple recently announced a set of groundbreaking accessibility features that offer a glimpse into the company’s AI-driven plans. One of the standout additions is Eye Tracking, designed for people with physical disabilities, which lets them control their iPhone or iPad using only their eyes.
Setting up Eye Tracking is quick and simple: the front-facing camera handles a brief calibration, and no additional hardware or accessories are required, which puts the feature within reach of a wide range of users.
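Apple has not disclosed how the system-level feature is implemented, but ARKit’s face tracking already exposes gaze estimates from the TrueDepth front camera, which hints at the kind of data the calibration step works with. Here is a minimal sketch (the class name is illustrative, not Apple’s):

```swift
import ARKit

// Illustrative sketch: reading gaze estimates from the TrueDepth camera
// via ARKit. This shows what the front camera can measure; it is not
// how Apple's system-level Eye Tracking feature is necessarily built.
final class GazeReader: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        // Face tracking requires a device with a TrueDepth front camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // lookAtPoint estimates where the eyes converge,
            // expressed in the face anchor's coordinate space.
            let gaze = face.lookAtPoint
            print("Estimated gaze point:", gaze)
        }
    }
}
```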
Eye Tracking works on both iPadOS and iOS. It lets users navigate within apps and, through Dwell Control, activate on-screen elements and perform actions such as button presses, swipes, and other gestures; a rough illustration of how dwell selection behaves follows below.
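Dwell Control itself is a system feature with no public API, but the underlying idea is straightforward: an action fires once the gaze rests on a target for a set duration. The sketch below is purely illustrative, and every name in it is hypothetical:

```swift
import Foundation
import CoreGraphics

// Hypothetical dwell-selection logic: if the gaze point stays inside a
// target's bounds for a set duration, the target counts as "pressed".
// Apple's Dwell Control is a system feature; this only models the concept.
struct DwellDetector {
    let dwellDuration: TimeInterval = 1.0   // seconds the gaze must hold
    private var dwellStart: Date?

    mutating func update(gaze: CGPoint, target: CGRect, now: Date = Date()) -> Bool {
        guard target.contains(gaze) else {
            dwellStart = nil        // gaze left the target; reset the timer
            return false
        }
        if let start = dwellStart {
            if now.timeIntervalSince(start) >= dwellDuration {
                dwellStart = nil    // fire once, then re-arm
                return true         // dwell complete: trigger the action
            }
        } else {
            dwellStart = now        // gaze just entered the target
        }
        return false
    }
}
```

Resetting the timer whenever the gaze leaves the target is what keeps brief glances from triggering accidental activations.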
Another notable addition to Apple’s accessibility arsenal is “Listen for Atypical Speech.” The feature uses on-device machine learning to help Siri understand a broader range of voices and speech patterns.
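Apple’s atypical-speech models are built into the system rather than exposed to developers, but the existing Speech framework already supports fully on-device recognition, which shows where this kind of processing runs. A minimal sketch, assuming a pre-recorded audio file and that speech-recognition permission has been granted:

```swift
import Speech

// Minimal on-device transcription with Apple's Speech framework.
// The atypical-speech models Apple announced are system-level; this
// only demonstrates that recognition can run entirely on the device.
func transcribeOnDevice(fileURL: URL) {
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
          recognizer.supportsOnDeviceRecognition else { return }

    let request = SFSpeechURLRecognitionRequest(url: fileURL)
    request.requiresOnDeviceRecognition = true  // keep audio off the network

    recognizer.recognitionTask(with: request) { result, _ in
        if let result, result.isFinal {
            print(result.bestTranscription.formattedString)
        }
    }
}
```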
Apple emphasized that these features bring together its hardware and software, leveraging Apple silicon, artificial intelligence, and machine learning. The advancements underscore the company’s longstanding commitment to inclusive design and to products that work for all users.
The features are expected to roll out later this year, most likely with the iOS 18 and iPadOS 18 updates in the fall, and they should meaningfully broaden accessibility for Apple device users worldwide.