
ASOS launches visual search tool to aid inspiration and discovery for shoppers

ASOS visual search

My filter for successful visual search is simple – can you take a photo of someone else’s shoes or jacket when on a busy train and find a direct replica online? Can technology negate the awkwardness of actually speaking to someone during your commute to find out where his or her “must-have” item is from?

Fashion stalker claims aside, the short answer is still no. In the majority of cases, the tech is not yet good enough to pull apart a busy image and identify exactly what that piece is.

It is however getting better at finding similar items. Thanks to artificial intelligence, it can identify shape, colour, print and more – it can serve up relevant options and at least start to inspire discovery.
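
ASOS has not disclosed how its tool works under the hood, but at a high level this kind of similarity search typically maps each image to a feature vector and returns the catalogue items whose vectors sit closest to the query. A purely illustrative sketch in Python – using a toy colour-histogram "embedding" where a real system would use a deep neural encoder, and made-up product IDs – might look like this:

```python
# Illustrative sketch only: a real visual search system would use a trained
# neural image encoder; a colour histogram stands in for the feature vector here.
import numpy as np

def embed(image: np.ndarray, bins: int = 8) -> np.ndarray:
    """Toy feature extractor: a normalised per-channel colour histogram."""
    hists = [np.histogram(image[..., c], bins=bins, range=(0, 255))[0]
             for c in range(3)]
    vec = np.concatenate(hists).astype(float)
    return vec / (np.linalg.norm(vec) + 1e-9)

def most_similar(query_image: np.ndarray, catalogue: dict, k: int = 5) -> list:
    """Rank catalogue items by cosine similarity to the query photo."""
    q = embed(query_image)
    scores = {pid: float(embed(img) @ q) for pid, img in catalogue.items()}
    return sorted(scores, key=scores.get, reverse=True)[:k]

# Usage with random stand-in images and hypothetical product IDs
catalogue = {f"product-{i}": np.random.randint(0, 256, (64, 64, 3))
             for i in range(100)}
query_photo = np.random.randint(0, 256, (64, 64, 3))
print(most_similar(query_photo, catalogue))
```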

That’s the key theme behind e-commerce site ASOS’s launch of visual search on its native app today.

This is a fashion website carrying some 85,000 products, with 5,000 new ones added every week. One of many challenges in the online retail space is balancing that newness with the overwhelming nature of such volume, particularly for users increasingly browsing on mobile. It’s for that same reason we’ve also seen Pinterest and eBay recently playing in this computer vision space. It’s about that keyword: “discovery”.

This rollout from ASOS, then, aims to enable shoppers to capture fleeting moments – whether that’s someone they pass on the street, a look a friend is wearing or even a screengrab from Instagram or elsewhere – and use them to search the site’s product lines for similar suggestions.

“The depth of our offering is absolutely one of our strengths. However that range can be challenging to present to customers, especially on a mobile phone,” Richard Jones, head of product and UX at ASOS, explains to me. “If you know what you want, you can quite simply get to what you’re looking for. But what we’re trying to find is more of that discovery use case – if you’re not quite sure what you want, or you’ve seen something that’s inspired you, visual search is designed to kickstart that discovery… It’s about getting as close as possible to giving you something that is visually similar.”

The tool appears as a camera icon in the search bar of the ASOS app. Tapping it invites customers either to take a picture or to upload one from their library, which is then used to find similar products.

Jones knows the tech isn’t yet perfect – if anything, the examples out in the market to date have been a “bit clunky” – but with machine learning and big data, it’s only going to improve, he suggests.

ASOS’s own version, the tech for which is powered by an external third party the company has opted not to disclose, is built on this notion. “The more [this tech] gets used, the better it gets trained, the data results get better… the smarter it becomes,” he explains.

That also reflects the way the ASOS team are operating – pushing the launch out to market (in the UK only at first) in order to test and iterate accordingly. It’s about getting it out there and learning how it’s best used before rolling it out to other geographies.

In its press release, ASOS refers to this as the “build-measure-learn” approach to innovation, the methodology popularised by Eric Ries’s The Lean Startup.

This announcement also supports wider planned technology investment by the company. It currently has a tech team of 900 employees and plans to hire a further 200 over the next year, for instance. It says it’s focusing on its AI-powered recommendation engine, which uses big data and a smart algorithm to learn customers’ preferences over time, as well as on improving site infrastructure to drive agility and speed up innovation for customers.
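
ASOS hasn’t detailed how that recommendation engine works. As an illustration only, “learning preferences over time” can be as simple as nudging a running per-customer profile vector towards the attributes of each item a shopper interacts with, then ranking products against that profile; the feature vectors and SKUs below are hypothetical:

```python
# Hedged illustration of a preference profile that "learns over time".
# The feature vectors and update rule are assumptions, not ASOS's method.
import numpy as np

def update_profile(profile: np.ndarray, item_features: np.ndarray,
                   learning_rate: float = 0.1) -> np.ndarray:
    """Blend the latest interaction into the customer's running preference vector."""
    return (1 - learning_rate) * profile + learning_rate * item_features

def recommend(profile: np.ndarray, products: dict, k: int = 3) -> list:
    """Score products against the profile and return the top k product IDs."""
    scores = {pid: float(profile @ feats) for pid, feats in products.items()}
    return sorted(scores, key=scores.get, reverse=True)[:k]

# Usage: each interaction (view, save, purchase) refines the profile a little.
profile = np.zeros(16)
products = {f"sku-{i}": np.random.rand(16) for i in range(50)}  # hypothetical SKUs
for clicked in ("sku-3", "sku-17", "sku-42"):
    profile = update_profile(profile, products[clicked])
print(recommend(profile, products))
```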

Zooming in on the mobile experience is particularly key. Today 80% of UK traffic for ASOS and nearly 70% of orders come from a mobile device, with people spending 80 minutes per month, on average, in the ASOS app.

With such mobile-native customers, Jones says it’s now about how to use the underlying technology in these devices – the high processing power, the ultra high-definition cameras, the depth-perception imagery and more.

“We’re thinking about how do we use these devices in a way that is natural and contextual to how our 20-something customers live their lives. They go everywhere with [their smartphones] – so how can we make sure we give them an experience they are expecting?” he comments.

Further motivation lies in the fact that using the camera as a means to search is set to become fairly standard in September when Apple launches iOS 11, which includes the ARKit development platform. That essentially means all manner of augmented reality uses will be possible directly through the iPhone’s camera lens, visual search included. Net-a-Porter is another e-commerce player that has referenced using it.

“What we want to do is be able to meet that customer expectation and demand,” Jones adds. The visual search tool will live within the app for now, with the intention of making that broader experience an increasingly personalised one for each shopper down the road.

ASOS’s visual search launches on iOS in the UK today, with an Android rollout to follow and international markets thereafter.

This post first appeared on Forbes


Yoox Net-a-Porter looks to the future of AI and mobile commerce with new tech hub in London

The new Yoox Net-a-Porter tech hub in White City, London

Federico Marchetti, CEO of Yoox Net-a-Porter, calls the group’s new tech hub in White City, west London, its “space shuttle”.

“This is our temple of innovation that’s going to take YNAP into the future,” he explained at the opening this morning.

The 70,000 sq ft space comes as part of an investment of more than €500m in technology and logistics across the group in a bid to double the size of the business by 2020. It houses all of YNAP’s UK tech teams under one roof – a total of 500 employees, in addition to the further 500 based in Bologna, Italy.

The big focus of the work they’re doing today is around artificial intelligence (AI) and the next wave of mobile technologies, the team explained. Demonstrations at the opening, for instance, included an AI-enabled virtual personal stylist tool that could recommend items based on image recognition, personalised preferences and contextual data like location and weather forecast.

Another AI tool in the works can suggest different options for complete outfit looks – taking the professionally styled shots that the e-commerce sites currently show and providing unlimited variations of mix-and-match pieces for users alongside them. These machine learning models and neural networks learn as they go, so they only get better for users over time, the team explained.
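
YNAP hasn’t published how the outfit tool assembles its suggestions, but the underlying idea of unlimited mix-and-match variations is essentially combinatorial: pick one item per garment slot and enumerate (or sample) the combinations, with the learned model then deciding which ones actually work together. A minimal sketch with made-up items:

```python
# Illustration only: enumerating mix-and-match looks from one item per slot.
# A real system would use a learned model to score which combinations work.
from itertools import product

wardrobe = {  # hypothetical catalogue slots and items
    "top":    ["white shirt", "striped tee", "silk blouse"],
    "bottom": ["black trousers", "denim midi skirt"],
    "shoes":  ["leather loafers", "ankle boots"],
}

# One item per slot gives 3 x 2 x 2 = 12 complete looks.
looks = [dict(zip(wardrobe, combo)) for combo in product(*wardrobe.values())]
for look in looks[:3]:
    print(look)
```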

Alex Alexander, CIO at the company, referred to everything they’re doing as being about making the experience more personalised for shoppers. “We’re using our own data in a smarter and more detailed way in order to tailor the customer experience to every individual customer,” he explained.

Marchetti added: “What innovation means for us is not innovation for the sake of it, but innovation for the customer.”

On the mobile side, however, that starts internally. Every employee at the company is being given an iPhone equipped with new apps designed in collaboration with Apple and IBM, in a bid to enable them to think not only mobile-first but eventually mobile-only.

“Our focus on mobile starts with our employees. If we don’t think mobile-first for them, how can we expect to get it right for our customers?” Alexander asked.

The tech team is therefore meeting with every department within the business to understand their mobile needs. The personal shopping team was on hand today, for instance, exploring how they can use mobile as an opportunity to spend more time with their top customers, known as EIPs (extremely important people). The idea is to give them greater tools and capabilities so they can scale their interactions. AI will inform that too.

Yoox Net-a-Porter is prioritising mobile

From a customer perspective, further examples for mobile include leaning heavily on Apple’s iOS 11 update, which is due to roll out in September (though it is available in beta now). Central to that is the camera, which will come with built-in augmented reality capabilities as well as a QR reader.

Users can take pictures of outfits they like and, through an AI algorithm using visual recognition, find similar options to purchase, for instance. Meanwhile, the team will also deploy QR codes in some of the windows of the physical stores it powers so that the looks on display (each tagged with RFID labels) can be brought up on the relevant e-commerce site for immediate purchase.
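
Generating those window codes is straightforward in principle: each styled look just needs a code that encodes the URL of its page on the relevant e-commerce site. A small, hedged example using the open-source `qrcode` Python package (the URL below is made up, not a real YNAP link):

```python
# Assumes `pip install qrcode[pil]`; the look URL is a hypothetical example.
import qrcode

look_url = "https://www.example-store.com/looks/window-look-42"  # hypothetical
img = qrcode.make(look_url)          # build the QR code image
img.save("window_look_qr.png")       # print this and place it in the window
# Scanning it with the iOS 11 camera (or any QR reader) opens the look online.
```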

If the QR code still sounds like a questionable option, YNAP sees both the behaviour of Chinese shoppers and Apple’s integration of the technology as sure signs of its future.

The company also announced a new partnership today with Imperial College London, to support an initiative that teaches local children from underprivileged backgrounds the basics of coding. The project, named Imperial Codelab powered by Yoox Net-a-Porter, is particularly focusing on increasing the number of young girls who have access to such classes. “We know there are not enough women in tech and we want to help that for the future,” Marchetti noted.

This post first appeared on Forbes