Google is introducing a new augmented reality feature for its mobile search engine that will let users see 3D renderings of certain product results.
Users will also be able to place these renderings into the real world through their phone cameras, using AR.
According to the tech conglomerate, partners from fashion, tech, automotive and beyond will be making their products available for the mobile search enhancements. These include names such as New Balance, Target Corp, Samsung and Volvo.
The new feature was announced at Google’s developer conference on Tuesday alongside a flurry of other developments such as extended privacy, new smart speaker features and more. The new AR technology feature will be released later this month.
It was demonstrated on stage with the example of shopping for a pair of sneakers. A customer searching for a pair of New Balance shoes, for instance, will come across a visual search result with a “view in 3D” button. When tapped, this transforms the image into a three-dimensional rendering that can be moved by swiping on the phone screen.
Another tap on a “view in your space” button pulls up the user’s phone camera and drops the sneaker into their immediate environment using AR technology. The user can then move closer to the sneaker and see it from different angles by walking around it.
“Say…you’re shopping for a new pair of shoes. With New Balance, you can look at shoes up close, from different angles, again, directly from search,” explained Aparna Chennapragada, vice president of Google Lens & AR on stage. “That way, you get a much better sense for things like, what does the grip look like on the sole, or how they match with the rest of your clothes.”
With the new launch, Google makes it easier for retailers to tap into AR by offering the capability directly through its search engine; the brand itself needs no additional development beyond supplying the 3D rendering.
Recent examples of other brands using AR technology include Puma, which recently launched a sneaker that activates AR content through a dedicated app.
–
How are you thinking about innovation? We’re all about finding you the perfect partners to do so. TheCurrent Global is a consultancy transforming how fashion, beauty and consumer retail brands intersect with technology. We deliver innovative integrations and experiences, powered by a network of top technologies and startups. Get in touch to learn more.
Levi’s has partnered with Pinterest on a personal styling tool that generates a custom inspiration board depending on the user’s taste.
Called “Styled by Levi’s”, the feature lives on a microsite where users select the gender they would like to shop and the five product images they relate to most. They are then prompted to log into their Pinterest accounts to receive personalized, shoppable boards.
Apart from editorial campaigns and product images, the boards also refer customers to the brand’s customization service and chatbot feature, which launched in 2017.
The feature works whether or not the user is logged in to their Pinterest account. However, logged-in users are served a more relevant experience, as the platform also draws on data from their past browsing behavior.
Over the past couple of years, Pinterest has pushed to monetize the behavior of its audience of 250 million with a series of brand partnerships that feature an e-commerce element. For this holiday season, it is introducing Gift Globes, a gift-finding tool where consumers enter information to receive customized gift guides. Participating brands include Macy’s, Lowe’s and Kohl’s.
Forever 21 has introduced an AI-enabled feature that will allow consumers to engage with visual search when browsing online.
The feature, titled “Discover your style”, allows shoppers to search for items by clicking on icons that represent features that they want in an outfit – such as length or fit of a skirt, or the neckline and color of a shirt. For this launch, the fast fashion retailer worked with visual search experts Donde Search, whose recommendation algorithm aims to mimic how shoppers think about products.
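The icon-driven approach described above can be pictured as a simple attribute filter over a catalog. The sketch below is purely illustrative – the attribute names, values and product data are hypothetical assumptions, not Forever 21’s or Donde Search’s actual schema, which layers a recommendation algorithm on top:

```python
# Hypothetical product catalog; real systems hold far richer attribute data.
catalog = [
    {"id": 1, "type": "skirt", "length": "midi", "color": "black"},
    {"id": 2, "type": "skirt", "length": "mini", "color": "red"},
    {"id": 3, "type": "shirt", "neckline": "v-neck", "color": "white"},
]

def icon_search(catalog, **selected):
    """Return products matching every attribute the shopper tapped."""
    return [
        item for item in catalog
        if all(item.get(attr) == value for attr, value in selected.items())
    ]

# A shopper taps the "skirt" and "midi length" icons:
results = icon_search(catalog, type="skirt", length="midi")
```

In practice, the value of a tool like Donde’s lies in ranking near-matches rather than exact filtering, so shoppers still see relevant options when no product satisfies every selection.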
“Visual search technology bridges the gap between the convenience of online shopping and the rich discovery experience of traditional retail by enabling our customers to search for clothing in the same way they think about it—using visuals, not words,” says Alex Ok, president of Forever 21. “Early data shows that this is one of the most important innovations in the e-commerce space in recent years.”
The functionality debuted in the Forever 21 iOS app in May and was initially available for the dresses and tops categories. Within the first month of launch, the brand saw a 20% increase in average purchase value across the two test categories, as well as a lift in sales conversions.
Forever 21’s “Discover your style” feature
“As e-commerce’s share of retail sales continues to grow, it’s more important than ever that retailers use a universal language that both shoppers and merchandisers can understand,” says Liat Zakay, CEO and founder of Donde Search.
There are many benefits to introducing visual search alongside more traditional text-based search, but according to the brand, the functionality also helps retailers remove the local language barriers associated with the latter.
Allowing consumers to search visually also lets them express more subtle likes and dislikes when looking for garments, which is something major brands and retailers have been experimenting with for years. Last year, for instance, ASOS introduced a visual search tool that lets people upload images to surface similar items for sale on its site.
With the launch of a lower-price subscription service, how Rent the Runway’s ‘closet in the cloud’ is changing the face of sustainability [Fashionista]
Digital closet start-ups want to give you the Cher Horowitz experience [Racked]
Target is using Pinterest’s Lens visual search technology
If there was one overarching term at Shoptalk Europe this week, it was artificial intelligence. From machine learning to visual search, natural language processing and more, the role of systems that facilitate smarter and more personalised customer experiences was key.
Keynote talks from Google, Alibaba, Westfield and more all referenced that focus, with the same big statistics about where the space is moving repeated across sessions. By 2020, 85% of customer interactions in retail will be managed by AI, multiple speakers said, citing Gartner, and 30% of all companies will employ AI to augment at least one of their primary sales processes by the same date.
“We’re putting AI front and centre as a driving force to make [smart commerce] happen,” noted eBay’s chief product officer, RJ Pittman. “The curve is steep but the opportunity is extraordinary. So we’re going to start climbing; we’re right at the precipice of a transformational inflection point.”
Other such initiatives were referenced throughout the conference too. Levi’s noted its virtual stylist chatbot, created with Mode.ai, which aims to replicate the experience customers have in store by helping them with the fit and style of jeans to suit them.
Topman’s global digital director, Gareth Rees-John, highlighted his work with a Canadian company called Granify to help optimize the menswear store’s e-commerce conversion rates by serving different messages to shoppers who are at risk of abandoning the site. The notifications use machine learning to address issues likely to retain the individual in question – letting them know an item is low in stock, for example. Topman is seeing an uplift of 3-5% as a result.
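The kind of trigger described above can be sketched as a toy rule. This is an illustration only – the behavioural signals, thresholds and messages here are assumptions for the sake of the example, not Granify’s actual (machine-learned) model:

```python
# Illustrative abandonment-risk rule; a real system would learn these
# weights and thresholds from behavioural data rather than hard-code them.
def exit_risk(session):
    """Toy risk score in [0, 1] from a few hypothetical signals."""
    score = 0.0
    if session["seconds_idle"] > 30:
        score += 0.4  # shopper has gone quiet
    if session["cart_items"] > 0 and not session["checkout_started"]:
        score += 0.4  # items in basket but no move to pay
    if session["price_checks"] > 2:
        score += 0.2  # repeated price comparisons
    return min(score, 1.0)

def retention_message(session, stock_left):
    """Pick a message only when the shopper looks likely to leave."""
    if exit_risk(session) < 0.5:
        return None  # don't interrupt a healthy session
    if stock_left <= 3:
        return f"Only {stock_left} left in stock"
    return "Free returns on this item"

message = retention_message(
    {"seconds_idle": 45, "cart_items": 1,
     "checkout_started": False, "price_checks": 0},
    stock_left=2,
)
```

The design point is the same as the one Rees-John makes: the message is chosen per shopper, at the moment risk is detected, rather than shown to everyone.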
Flash sales site BrandAlley, meanwhile, outlined how it works with marketing automation company Emarsys on persona-based targeting in its email campaigns, which has led to a 16% conversion lift. And AI firm Sentient Technologies showed how serving 256 real-time website design variations to consumers of Swedish flower delivery chain Euroflorist resulted in a 17% increase in conversions.
An underlying thread throughout, however, was how much more work there is to do to reach true personalisation. Rees-John reminded the audience that many retailers are still operating on legacy systems with “jumbled data”, making it hard to move forward fast. His focus, he said, is on “making little changes that have robust business cases”.
Meanwhile, Bruce Macinnes, chairman of BrandAlley, noted that he hopes to move towards personalising the entire customer journey from homepage to checkout. “We have plenty of personalised content along that journey but it’s not fully personalised yet and we believe there is a way to go to using all the data that we have,” he explained.
Charmaine Huet, chief marketing officer of Woolworths South Africa, wants to work towards having millions of different communications plans every day. “78% of our revenue comes from credit cards, so we already know a lot about our customers. Now what we’re really thinking about is how do you really personalise the experience for them and how do you create content that is really personalised and resonates with [each of them] – and this is really difficult, it takes humans and data and AI.”
Vladimir Stankovic, global digital and e-commerce director at Camper, said AI can be seen as the enabler for all this. “It will allow us to get closer to our consumer, to give them what they want.” His big hopes lie in how it can impact discovery: “Natural language processing and visual search are providing new ways to discover product. I believe there is huge value from this technology.”
Visual search companies particularly dominated the exhibit floor, including the likes of Slyce, which works with Tommy Hilfiger, and Fashwell, which works with Zalando. Ted Mann, CEO of the former, said being able to search through your camera lens will become common practice for shoppers down the road, noting new functionalities his team is adding including being able to use visual search to create wishlists and to fill shopping baskets.
In his keynote talk, Tim Kendall, president of Pinterest, likewise said “the future of discovery will be visual”. He pushed the idea that Pinterest is aiming to do to discovery what Google did to search, with visual search at the heart of achieving that.
The company’s Lens tool, which allows customers to find similar items from its database by searching through their cameras, is being heavily integrated into the shopping space. It recently launched a partnership with Target on that basis, starting with a registry experience.
“This Pinterest partnership quite literally helps us shorten the distance from when our guests have an idea to when they’re ready to make a purchase,” said Rick Gomez, chief marketing officer at Target, at launch. “It’s another way we’re making it easy and fun for our guests to explore and find new products.”
Ultimately the goal, said Huet of Woolworths South Africa, is for automation in retail processes to do just this: allow more frictionless shopping, as well as a level of personalised experience so consumers can spend more time doing (and finding) what they really want.
AI, in its various forms, is helping retailers move this forward. “Just look at this conference; AI is already here,” said Pittman of eBay. “I say embrace it. And then go build something great.”
My filter for successful visual search is simple – can you take a photo of someone else’s shoes or jacket when on a busy train and find a direct replica online? Can technology negate the awkwardness of actually speaking to someone during your commute to find out where his or her “must-have” item is from?
Fashion stalker claims aside, the short answer is still no. In the majority of cases, the tech is not yet good enough to pull apart a busy image and identify exactly what that piece is.
It is however getting better at finding similar items. Thanks to artificial intelligence, it can identify shape, colour, print and more – it can serve up relevant options and at least start to inspire discovery.
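Under the hood, “finding similar items” typically means comparing feature vectors extracted from images. The sketch below is a minimal, hypothetical illustration of that idea – the `embed`-style vectors and product names are invented, and production systems use a trained neural network plus an approximate nearest-neighbour index rather than a brute-force scan:

```python
# Minimal similarity search over image embeddings. The vectors below stand
# in for the output of a (hypothetical) image-embedding model that encodes
# shape, colour, print and so on as numbers.
import numpy as np

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def most_similar(query_vec, product_vecs, top_k=3):
    """Rank catalogue items by visual similarity to the query photo."""
    scores = [(pid, cosine(query_vec, vec)) for pid, vec in product_vecs.items()]
    return sorted(scores, key=lambda s: s[1], reverse=True)[:top_k]

products = {
    "red-print-dress": np.array([0.9, 0.1, 0.8]),
    "black-jacket":    np.array([0.1, 0.9, 0.2]),
}
query = np.array([0.85, 0.15, 0.75])  # embedding of the shopper's photo
ranking = most_similar(query, products, top_k=1)
```

This is why “similar, not identical” is the realistic promise: the nearest vectors in the catalogue may share a silhouette and palette without being the exact garment photographed.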
That’s the key theme behind e-commerce site ASOS’s launch of visual search on its native app today.
This is a fashion website carrying some 85,000 products, with 5,000 new ones added every week. One of the many challenges in online retail is balancing that newness against the overwhelming nature of volume, particularly for users increasingly browsing on mobile. It’s for the same reason we’ve also seen Pinterest and eBay playing in the computer vision space recently. It’s about that keyword: “discovery”.
This rollout from ASOS then, aims to enable shoppers to capture fleeting moments – whether that’s someone they pass on the street, a look a friend is wearing or even a screengrab from Instagram or otherwise – and use them to search through the site’s product lines to find similar suggestions.
“The depth of our offering is absolutely one of our strengths. However that range can be challenging to present to customers, especially on a mobile phone,” Richard Jones, head of product and UX at ASOS, explains to me. “If you know what you want, you can quite simply get to what you’re looking for. But what we’re trying to find is more of that discovery use case – if you’re not quite sure what you want, or you’ve seen something that’s inspired you, visual search is designed to kickstart that discovery… It’s about getting as close as possible to giving you something that is visually similar.”
The tool is shown as a camera icon in the search bar of the ASOS app. Tapping on it then invites customers to either take a picture or upload one from their library to have it find similar products.
Jones knows the tech isn’t yet perfect – if anything, the examples out in the market to date have been a “bit clunky” – but with machine learning and big data it’s only going to improve, he suggests.
ASOS’s own version, the tech for which is powered by an external third party the company has opted not to disclose, is built on this notion. “The more [this tech] gets used, the better it gets trained, the data results get better… the smarter it becomes,” he explains.
That also reflects the way the ASOS team is operating – pushing the launch out to market (in the UK only at first) in order to test and iterate accordingly. It’s about getting the tool out there and learning how it’s best used before rolling it out to other geographies.
In its press release, ASOS refers to this as the “build-measure-learn” approach to innovation, a methodology popularised by the Lean Startup movement.
This announcement also supports the company’s wider planned technology investment. It currently has a tech team of 900 employees and plans to hire a further 200 over the next year. It says it’s focusing on its AI-powered recommendation engine, which uses big data and a smart algorithm to learn customers’ preferences over time, as well as on improving site infrastructure to drive agility and speed up innovation for customers.
Zooming in on the mobile experience is particularly key. Today, 80% of ASOS’s UK traffic and nearly 70% of its orders come from a mobile device, with people spending 80 minutes per month, on average, in the ASOS app.
With such mobile-native customers, Jones says it’s about how to now use the underlying technology that is in these devices – the high processing power, the ultra high-definition cameras, the depth perception imagery and more.
“We’re thinking about how do we use these devices in a way that is natural and contextual to how our 20-something customers live their lives. They go everywhere with [their smartphones] – so how can we make sure we give them an experience they are expecting?” he comments.
Further motivation lies in the fact that using the camera as a means to search will become fairly standard in September, when Apple launches iOS 11 with its ARKit development platform. That essentially means all manner of augmented reality uses, visual search included, will be possible directly through the iPhone’s camera lens. Net-a-Porter is another e-commerce player that has referenced using it.
“What we want to do is be able to meet that customer expectation and demand,” Jones adds. The visual search tool will live within the app for now, with the intention of making that broader experience an increasingly personalised one for each shopper down the road.
ASOS’s visual search launches on iOS in the UK today with pending rollout for Android and then international scale thereafter.
Gigi Hadid in Tommy Hilfiger’s LA fashion week show
Despite some connected clothing here and a spot of mixed reality there, there has largely been little technology in action this fashion week season. Not literally, of course – behind the scenes, tech is working harder than ever to push the latest shows out to a widening consumer audience – but the role of innovation has shifted seriously away from big tech hits in recent times.
Over the past few years, tech has been the way to grab attention – we’ve seen everything from drones, holograms, virtual reality, wearables and more making their way down the New York, London, Milan and Paris runways. Who could forget Google Glass at Diane von Furstenberg, virtual reality windows at Topshop or the holographic Polo Ralph Lauren show? And that’s before you think about the likes of Burberry pioneering the way with endless partnerships with tech giants Apple, Google, Facebook and Twitter.
Such a focus wasn’t brand new (the infamous robotic spray-paint scene from Alexander McQueen takes us back to before today’s connected age – nearly 20 years ago, to spring/summer 1999), but it exploded in the social media era, becoming the de facto way to draw headlines, whether for the first live streams or indeed those big-budget campaigns.
But in the centre of all that, in some instances because of it, the very notion of fashion week has changed. Today, the industry is battling with an event series that has become consumer facing while it’s still set up to deliver primarily to a trade (wholesale) model. What appears on the catwalks is, generally speaking, six months ahead of hitting stores. The result is supposed consumer fatigue, greater fast fashion copycatting and more pressure than ever on turnover, margins and more. The big question now is not only whether that should change, but how. Enter the “see-now, buy-now” movement from those able to be more agile in their production timelines, including Burberry, Tommy Hilfiger, Tom Ford, Rebecca Minkoff, Topshop Unique and more.
Burberry’s February 2017 LFW show
But the follow-up question that brings is: if we start selling to consumers now, what does that mean for how things are marketed in real time?
The simple answer, really, is to step away from tech for tech’s sake – the attention-grabbing initiatives without any substance behind them. That attention can still be won in other ways: with set designs, with political statements, with genuinely incredible collections. Today’s focus instead seems to be moving to conversion rates – to selling – and, importantly, away from gimmicks.
It’s still very early days with this consumer-facing movement, and we’re just bedding in with the first iterations of it, let alone have any true measurement to compare. But, as one big British brand told me off the record this London Fashion Week: “If you have a line to push immediately, your efforts and budgets are going to go to that – to directing consumers into how to shop, not in something else that’s merely a brand move.”
What we did see in technology this season was accordingly driven by what would indeed impact the end shopper. There were chatbots once more from the likes of Burberry and Tommy Hilfiger. There was also an immediately shoppable initiative on Instagram from Rebecca Minkoff in partnership with LiketoKnow.it.
“Our customers love when we create unique experiences for them. More than just shopping, they get to be a part of our brand and we get to know them in a more meaningful way,” said Minkoff. “Collaborating with LiketoKnow.it empowers us to take that to a global audience by giving them immediate access to the same content and products that those attending in person are seeing, and that’s a very powerful opportunity.” She also introduced connected handbags to draw in certain shoppers to a unique experience of the show via a digital ticket.
The new connected handbag from Rebecca Minkoff
Tommy Hilfiger then introduced a visual search tool with Slyce – an app called Tommyland Snap:Shop that enabled users to take pictures of the models to pull up the e-commerce link to that item.
Both Minkoff and Hilfiger, who each showed in LA this season, otherwise focused heavily on the idea of full consumer entertainment as well as a significant influencer plug-in. The recognition here is that it’s about getting in front of the right consumers, and spending money to do it.
And this isn’t just a fashion week move. You can see the same with social media. It’s not about gimmicky campaigns anymore – when was the last time you called something on Facebook an innovation? It’s more about integration. It’s about decent budgets and shifting the needle on ROI.
It’s a similar story at retail, with fewer campaigns focused on technology in the store. That doesn’t mean there’s actually less tech in-store – but there’s a redirection towards the sort of tech that matters.
Speaking at the recent Commerce 2020 summit in London, Malcolm Pinkerton, VP of e-commerce and digital retail insights at Kantar Retail, said: “[In recent years] we’ve seen stores flooded with technology hoping it would digitise the experience, but it was hard to do and expensive to maintain. Now we’ve realised we can build solutions around what people bring in with them – mobile.”
Sabinna’s LFW show was shot for mixed reality
Innovation today is happening in that somewhat quieter fashion. Chatbots might still be nascent, but it makes sense they’re being experimented with – forget drones on the catwalk, why not offer personalised access to it through an AI-enabled smartphone experience? It’s for that same reason we’re seeing virtual reality and mixed reality content continuing during the shows – dipping a toe into where the future of interactive (and shoppable) content is moving.
The fact is, technology shouldn’t be a “brand move” anymore. It needs to work for who your customer is, where she is, and when. On that basis, it shifts from a headline, to a standard part of what you do. All year round.
That’s not to say there’s no place for innovation anymore. Far from it. It’s perhaps more necessary today, than ever. In this sort of market, the question increasingly becomes how do you stand out – especially if you don’t have Tommy Hilfiger-size budgets? And even if you do, how is that sustainable? On that basis, it’s about shifting the very fundamental underpinning of the business, not just the press release topline.
It’s about truly keeping ahead of the curve by disrupting the way you’ve always operated. Innovation today isn’t necessarily about the flashiest moves, but the smartest. Innovating in the supply chain, in the personalised customer experience via mobile, even eventually more and more through the fabrications themselves, is where we’ll start seeing real movements.
So is tech going out of fashion? No, but the thing to remember is that innovation is no longer just a marketing play; it’s an entire business mentality.