The final accessibility feature announcement from Apple's May 16th press release, Point and Speak, is unfortunately something only users of the more premium Apple mobile devices will be able to take advantage of. This is because only the iPhone Pro and Pro Max models and the iPad Pro models released since 2020 have the LiDAR scanner that Point and Speak needs to work. Rather than a standalone tool, Point and Speak works as an additional feature of the built-in Magnifier, joining other recent enhancements like Door Detection and People Detection. By combining input from the camera and the LiDAR scanner with on-device machine learning, Point and Speak reads aloud whatever text the user points at. This effectively means a blind person can use Point and Speak to operate any hardware device with labelled buttons. See the short video below of it being used to operate a microwave oven.
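Apple hasn't published a developer API for Point and Speak, but the on-device text recognition and speech output it combines resemble capabilities iOS already exposes. Here is a minimal Swift sketch of that read-aloud step, assuming the standard Vision and AVFoundation frameworks; the function name and structure are illustrative, not Apple's actual implementation:

```swift
import Vision
import AVFoundation

// Illustrative sketch only: Point and Speak's real pipeline (which also fuses
// LiDAR depth data to locate the user's fingertip) is not public.
let speechSynthesizer = AVSpeechSynthesizer()

/// Recognises text in a camera frame on-device and speaks it aloud.
func readTextAloud(from frame: CGImage) {
    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
        // Take the best candidate string from each detected text region.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        guard !lines.isEmpty else { return }
        speechSynthesizer.speak(AVSpeechUtterance(string: lines.joined(separator: ", ")))
    }
    request.recognitionLevel = .accurate  // favour accuracy over speed

    let handler = VNImageRequestHandler(cgImage: frame, options: [:])
    try? handler.perform([request])
}
```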
The functionality of Point and Speak is similar to that of expensive dedicated vision support hardware such as OrCam. Unlike OrCam's head-mounted devices, however, Point and Speak runs on a handheld iPhone, so the process of using it might be a bit more clunky… for now. Having seen some of the recent promotional videos for the upcoming Apple Vision Pro headset, it's very easy to imagine Point and Speak (along with other vision support tools) working very well on it. The headset has all the required technology.
As it stands, Point and Speak is a nice new feature for those with compatible devices. What is most exciting about it, however, is the future of vision support technology it hints at.
Read about the other newly announced features in the two previous posts below: