Google is extending its machine learning capabilities to your smartphone's camera. At its annual developer conference, Google I/O 2017, the search giant announced a new feature coming soon to Android smartphones called "Google Lens".
Google Lens is essentially what Samsung did with Bixby Vision, or what Amazon did with Firefly on the Fire Phone. It can recognize objects when you point your phone's camera at them. For instance, point it at a flower, and it will tell you which species it belongs to. Point it at a concert poster, and Google Lens will pull up pages for tickets, show timings and more. Point it at a restaurant, and it will reveal details about the place. You get the idea. It will work through Google's artificially intelligent Assistant as well as Google Photos, and it leverages the company's computer vision technology to deliver accurate results.
Samsung launched a similar software feature a few months back on its Galaxy S8 smartphone under the name Bixby Vision. However, Google's approach will certainly bring a more general and coherent experience to the Android landscape. Furthermore, Google's Knowledge Graph and vast databases will allow it to pull up significantly better results from the internet, results that will only get better over time.
Moreover, at I/O, Google also announced that its mobile operating system, Android, has crossed 2 billion monthly active users, which is quite incredible. Android also recently replaced Microsoft's Windows as the most popular operating system. In addition, Google showcased how you will soon be able to remove objects and fix images seamlessly using machine learning on Android.
Stay tuned for more updates from Google I/O.