
Google announces Style Match discoverability feature for Google Lens

Google has announced Style Match, a feature that lets users point their smartphone’s camera at an outfit and receive suggestions for similar items to buy online.

Google Lens Style Match


The feature is part of Google Lens, the company’s camera-powered search engine, which was unveiled last year and is now fully integrated into the native camera of its smartphones.

The news, which was announced at the Google I/O 2018 conference in California on May 8, means any user can discover a specific item or receive suggestions of similar styles, not only in fashion but also in other categories, including accessories and furniture.

Such functionality could give the tech giant the lead in facilitating discovery through mixed realities, particularly because, by being embedded in the phone’s native camera, it doesn’t require the user to learn a new behavior or download a dedicated app they will eventually ditch. So far, brands such as eBay and ASOS have tinkered with image recognition within their own apps, but the ability to trigger the search directly from a smartphone’s main camera could pave the way for mass adoption.
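Google has not published Style Match’s internals, but visual product matching of this kind generally works by converting each photo into an embedding (a fixed-length numeric vector produced by a neural network) and ranking a product catalog by similarity to the query image. A minimal sketch of that retrieval step, using cosine similarity over small hypothetical embedding vectors (the catalog, its contents, and the vector values are all illustrative assumptions):

```python
import numpy as np

def cosine_similarity(query: np.ndarray, catalog: np.ndarray) -> np.ndarray:
    """Cosine similarity between a query vector and each row of a catalog matrix."""
    q = query / np.linalg.norm(query)
    c = catalog / np.linalg.norm(catalog, axis=1, keepdims=True)
    return c @ q

# Hypothetical catalog: each row stands in for the embedding of one product photo.
catalog = np.array([
    [0.9, 0.1, 0.0],   # e.g. a floral dress
    [0.0, 1.0, 0.1],   # e.g. a leather handbag
    [0.8, 0.2, 0.1],   # e.g. a similar floral dress
])

# Hypothetical embedding of the outfit the user pointed their camera at.
query = np.array([0.85, 0.15, 0.05])

scores = cosine_similarity(query, catalog)
ranked = np.argsort(scores)[::-1]  # catalog indices, most- to least-similar
print(ranked)  # → [0 2 1]: both floral dresses outrank the handbag
```

In a production system the embeddings would come from a trained vision model and the ranking would use an approximate nearest-neighbor index rather than a brute-force comparison, but the matching principle is the same.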

Other new features of Android include smart text selection, which lets users ‘copy and paste’ text from the real world, such as recipes and articles, directly onto their smartphone screens. Google Lens also allows users to highlight copy or bring up relevant information, such as looking up a dish on a restaurant menu and receiving an image of the food in question.

To enable this, Google is leveraging its expertise in search to recognize terms and the context in which they appear.

Meanwhile, in the updated Google Maps, users can trigger augmented reality navigation via Street View: holding the phone up in front of them shows their exact position on the map, with giant arrows pointing to where they should walk next.

Beyond augmented reality and image recognition, the company also announced developments to Google Assistant that mean users can have increasingly natural voice interactions. During the keynote, Google used singer John Legend’s voice to demonstrate.

This includes “Continued Conversations”, where the Assistant remembers your most recent questions, and the ability to ask more than one question within the same voice request. The future Assistant will also be able to call places on the user’s behalf, which proves particularly handy when booking appointments by phone with businesses that don’t have an online booking system.

By Bia Bezamat

Bia Bezamat is a fashion journalist by trade and innovation expert with experience spanning fashion, retail, grocery and hospitality. Originally from Brazil, she is interested in how cultural, behavioral and technological shifts influence how consumers and brands interact with each other. As Senior Innovation Editor and Strategist at Current Global, she helps brands understand changing consumer behaviors and the evolving technology landscape.