Google Lens can now select the text seen in a photo. (Photo: Google)
Ahead of Mobile World Congress, Google announced Friday it is bringing its augmented reality and Google Lens smart camera tools to more devices — and giving both new tricks.
Why it matters: Convincing developers to build for AR is, in part, a numbers game. Apple's ARKit already runs on a large share of iPhones, so getting Google's rival ARCore onto more devices is important. Google Lens, meanwhile, is the evolution of the search box, using what the camera sees as the query.
Google is also expanding what Lens, its camera-based search tool, can do within both the Google Photos and Google Assistant apps. Even iPhone users will be able to use Google Lens within the Google Photos app.
- Capture text within a photo — Perhaps Google Lens' coolest new trick will be to convert the words inside a photo into text that you can copy and paste. It will also be able to make sense of things like business cards and add the information to your contacts. Translating text between languages isn't in this version but is planned, Google told Axios.
- Google Lens will also be able to classify the types of plants and animals found within photos, which, according to Google, was a highly requested feature.
On the AR front, ARCore moves out of developer preview and will now work with certain high-end models from Samsung, LG, Asus and OnePlus, in addition to Google's Pixel and Pixel 2 models. Many of this year's flagship phones will also support ARCore, including devices from Samsung, Huawei, LG, Motorola, Asus, Xiaomi, HMD/Nokia, ZTE, Sony Mobile, and Vivo.