Apple’s Live Text Uses AI to Let You Capture Text from Images
Google Lens for iOS?
- Live Text digitizes the text in your photos and unlocks a series of different operations that you can perform with it on your device.
- It uses on-device intelligence to recognize the text in an image and allows you to take action on the captured text as you deem fit.
- Live Text can come in handy when you want to look up the text (in an image) on the web, copy it to the clipboard and paste it into another app, or call a phone number displayed in an image.
At the ongoing Worldwide Developers Conference (WWDC) 2021, Apple unveiled its latest operating systems, namely iOS 15, iPadOS 15, watchOS 8, and macOS Monterey, which will power the iPhone, iPad, Apple Watch, and Mac, respectively, this fall.
Each of these operating systems promises a host of improvements over its predecessor, along with a few new features, including an array of all-new privacy-focused additions across the board that aim to offer users a more private and secure experience on their devices.
Besides privacy, another intriguing feature that grabbed some eyeballs during the announcement is Live Text, which digitizes the text in your photos and unlocks a series of operations that you can perform with it on your device.
Live Text vs Google Lens
If this sounds familiar, you are probably not alone. Google has offered something similar, Google Lens, for quite a few years now; it uses text and object recognition to detect and extract text from images.
However, what sets the two services apart is that Apple's Live Text is said to capture text passively, in the background, on every photo taken by the iPhone, whereas Google Lens requires active involvement.
Also, unlike Google Lens, Live Text's processing happens entirely on the device (and is hence more secure and private). While Google Lens does have some features that work offline (like translation), it still does some of its processing in the cloud.
How Live Text Works
According to Apple, Live Text uses on-device intelligence to quickly recognize the text in an image, after which it lets users take action on the captured text as they see fit. It also taps the Neural Engine, allowing the Camera app to recognize and copy text right from the viewfinder, without the user having to capture an image first.
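Apple hasn't published Live Text's internals, but developers can already do similar on-device text recognition through Apple's Vision framework, which exposes the same kind of machinery. A minimal sketch (the image name is hypothetical, and error handling is kept deliberately thin):

```swift
import Vision
import UIKit

// Recognize text in an image entirely on-device using the Vision framework.
func recognizeText(in image: UIImage) {
    guard let cgImage = image.cgImage else { return }

    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
        // Take the top candidate string for each detected line of text.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        print(lines.joined(separator: "\n"))
    }
    request.recognitionLevel = .accurate        // favor accuracy over speed
    request.recognitionLanguages = ["en-US"]    // one of Live Text's launch languages

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}

// Usage (hypothetical image in the app bundle):
// if let photo = UIImage(named: "menu-photo") { recognizeText(in: photo) }
```

This only illustrates the recognition step; the "take action" part (copy, look up, call) is what Live Text layers on top at the system level.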
Live Text: Use Cases
A few scenarios where Live Text can come in handy: looking up the text in an image on the web, copying it to the clipboard and pasting it into another app, or calling a phone number displayed in the image.
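The phone-number use case maps onto a pattern iOS apps already use: once text has been extracted from an image, Foundation's NSDataDetector can pick out callable numbers. A rough sketch, with made-up sample text standing in for the recognized output:

```swift
import Foundation

// Sample text as it might come back from image text recognition (made up).
let captured = "Fresh Pizza, order at (555) 123-4567, open until 10pm"

// NSDataDetector finds structured data such as phone numbers in plain text.
let detector = try NSDataDetector(types: NSTextCheckingResult.CheckingType.phoneNumber.rawValue)
let range = NSRange(captured.startIndex..., in: captured)

for match in detector.matches(in: captured, options: [], range: range) {
    if let number = match.phoneNumber {
        print("Found phone number:", number)  // an app could hand this to a tel: URL
    }
}
```

Live Text presumably does something equivalent system-wide, which is how a number in a photo becomes tappable.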
Apple says that Live Text will also work in Spotlight search (on iOS) and will allow users to find text and handwriting in images in the Photos app.
Visual Look Up
Beyond Live Text, iOS 15 will also come with the Visual Look Up feature, which lets you pull up more information about landmarks, art, books, plants and flowers, pet breeds, and more in your surroundings using your iPhone.
Live Text: Availability
Live Text is cross-platform and will work on iPhones, iPads, and Macs. It will arrive with iOS 15 this fall, with the other operating systems expected to follow.
When it launches, Live Text will be available in seven languages: English, Chinese, French, German, Italian, Portuguese, and Spanish.