Apple brings AI-powered eye tracking to iPhone and iPad
Apple is set to release AI-enabled Eye Tracking on iPhone and iPad as part of a new range of accessibility tools announced Wednesday, allowing users to navigate their device using just their eyes.
Powered by artificial intelligence, Eye Tracking gives users a built-in option for navigating iPad and iPhone with just their eyes. Designed for users with physical disabilities, Eye Tracking uses the front-facing camera to set up and calibrate in seconds. With on-device machine learning, all data used to set up and control this feature is kept securely on device and isn’t shared with Apple.
Eye Tracking works across iPadOS and iOS apps, and doesn’t require additional hardware or accessories. With Eye Tracking, users can navigate through the elements of an app and use Dwell Control to activate each element, accessing additional functions such as physical buttons, swipes, and other gestures solely with their eyes.
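Because Eye Tracking navigates an app’s existing elements rather than requiring new developer APIs, apps built from standard controls with clear accessibility labels are the kind of targets Dwell Control can focus and activate. A minimal SwiftUI sketch of such controls follows; the view and its labels are illustrative, not an Apple-specified requirement for Eye Tracking:

```swift
import SwiftUI

// A minimal sketch: standard, labeled controls of the sort assistive
// navigation (including element-by-element traversal) can identify,
// focus, and activate. Names and actions here are illustrative.
struct PlaybackControls: View {
    @State private var isPlaying = false

    var body: some View {
        HStack(spacing: 24) {
            Button {
                isPlaying.toggle()
            } label: {
                Image(systemName: isPlaying ? "pause.fill" : "play.fill")
            }
            // A descriptive label helps assistive features announce
            // and distinguish the control.
            .accessibilityLabel(isPlaying ? "Pause" : "Play")

            Button {
                // Skip-forward action would go here.
            } label: {
                Image(systemName: "forward.fill")
            }
            .accessibilityLabel("Skip forward")
            .accessibilityHint("Skips ahead 30 seconds")
        }
        .padding()
    }
}
```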
MacDailyNews Take: Hopefully, Eye Tracking on iPhone and iPad works at least as well as it does on Apple Vision Pro (whence it came)!