Microsoft’s New App Uses AI to Narrate the World for the Visually Impaired
Microsoft’s research division has launched another iOS app – Seeing AI. The accessibility tool is designed specifically for visually impaired users and, as the name suggests, employs a series of neural networks to narrate the world around them. Seeing AI was originally announced at this year’s BUILD conference and is now available on iOS devices.
Seeing AI fundamentally changes how blind users interact with their environment. It can recognize almost anything, from people to documents, and read the results aloud. It can also detect strangers and estimate their age, emotion, and in some cases what they are doing – for example, “a man sitting on a couch working on a laptop” or “a girl playing frisbee”. The accuracy of these descriptions, however, depends entirely on how clear the image is. The app can scan barcodes and read out product descriptions as well. Seeing AI can also identify US currency, which should be quite helpful since US bills do not vary in texture.
Thanks to Microsoft’s continued work in this field, Seeing AI isn’t limited to describing whatever the camera is pointed at. It can also tell how the user is holding the phone and suggest adjustments for a better view. For instance, it will announce “top edges not visible” while scanning a document, and it plays audio cues to guide the user in aligning a barcode. Seeing AI works with other apps as well, letting you share images and receive audible captions for them.
Most of the app’s functions work without an internet connection. The more advanced ones, however, such as describing an entire scene, require a connection to the cloud. Seeing AI is currently limited to the US and is available for free on iOS devices.