L’Oréal has unveiled an AI-enabled digital skin diagnosis tool that uses selfies to assess the user’s skin in order to make skincare recommendations tailored to the individual.
The Skinconsult tool deploys AI technology developed by virtual try-on beauty company Modiface combined with L’Oréal’s own research, which includes 6,000 clinical images of men and women across countries such as France, India and China, as well as 4,000 user selfies in different lighting conditions.
“Our mission is beauty for all,” said Lubomira Rochet, chief digital director of the French group, speaking at a press conference for WWD. Rochet added that she believes services will be the new way for users to discover the group’s brands and products, and that this particular system promotes the “democratization” of skin diagnosis, since all a potential user needs is a smartphone to snap a selfie.
To use the tool, the customer must upload a selfie onto a website, which is then analyzed in terms of areas of strength and improvement using seven different aging variables: under-eye wrinkles, lack of firmness, fine lines, lack of radiance, dark spots, deep wrinkles and pores. The result is a bespoke skincare regime that aims to meet their individual needs.
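In rough outline, a diagnosis like this maps per-variable scores to product suggestions. The sketch below is purely illustrative: the variable names follow the article, but the scoring scale, thresholds and product mapping are assumptions, not L’Oréal’s actual Skinconsult model.

```python
# Illustrative sketch only: the 0-1 scoring scale and the product
# mapping are assumptions, not L'Oreal's actual system.

AGING_VARIABLES = [
    "under_eye_wrinkles", "lack_of_firmness", "fine_lines",
    "lack_of_radiance", "dark_spots", "deep_wrinkles", "pores",
]

# Hypothetical mapping from a weak area to a product category.
PRODUCT_MAP = {
    "under_eye_wrinkles": "eye cream",
    "lack_of_firmness": "firming serum",
    "fine_lines": "retinol cream",
    "lack_of_radiance": "vitamin C serum",
    "dark_spots": "brightening treatment",
    "deep_wrinkles": "anti-wrinkle cream",
    "pores": "pore-refining toner",
}

def recommend(scores: dict, top_n: int = 2) -> list:
    """Given per-variable scores (0 = weakest, 1 = strongest),
    suggest products for the lowest-scoring areas."""
    weakest = sorted(AGING_VARIABLES, key=lambda v: scores[v])[:top_n]
    return [PRODUCT_MAP[v] for v in weakest]
```

A user scoring well everywhere except dark spots and pores, for example, would be steered toward a brightening treatment and a pore-refining toner.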
According to the group’s executives, the system’s diagnosis typically matched the averaged assessment of 12 dermatologists. The bespoke result, however, still encourages users to see a specialist regularly.
The new tool was first introduced in January in Canada under L’Oréal’s Vichy brand, and there are plans to further expand it across the brand’s websites worldwide in the future.
Artificial intelligence promises benefits for both retailers and their customers, from personalised discovery to the surfacing of entire catalogues, writes Andy Narayanan of Sentient Technologies.
What’s the biggest difference between shopping online and shopping at a store? Online gives you the convenience of shopping in your robe, getting products shipped directly to your house, browsing an endless aisle of choice after choice, the ability to price check tons of retailers on the same dress, and a whole lot more. In a whole host of ways, shopping online is just, well, better. But there’s one thing that e-commerce sites have struggled with for years: the personal touch.
For many of us, when we head into our favourite stores, one of the best parts is interacting with a great salesperson. And what exactly makes a great salesperson? They get us. A great salesperson listens and understands what you want. They know which brands run a little small. They can intuit what your style is based on how you’re dressed and the clothes you’re trying on. They can head to the back to find you something they know you’ll love.
This, as we mentioned, is notably missing online. Instead of great salespeople, we have a search bar and little checkboxes to click to browse the aisles. We’re left browsing that so-called endless aisle in hopes of finding something we like. In essence, we’re trading the personal touch for convenience.
But new advances in artificial intelligence (AI) are changing all that. They bring far more personalisation than what you see at your typical fashion retailer and promise massive benefits for both retailers and their customers. Which, of course, is exactly how it should be.
So let’s get back to that online shopping experience for a second. You know the one I’m talking about. After all, most sites, when you get right down to it, have very similar interfaces. You’ve got a search bar, some facets to help narrow your search (those checkboxes on the left with options for brand, colour, size, price, etc.), and a grid of product images. If you don’t like what you see, you can click through the page numbers on the bottom or click into a product detail page to find more. And really, that’s most of what shopping online actually is.
But with AI, it’s different. Because AI can understand the product images themselves, it allows for a whole different kind of shopping.
A smart AI – like the one my company Sentient makes – represents an image along hundreds of feature dimensions. That means it can identify things that are tough to describe, like the placement of a logo, a certain kind of fringe, or the height of a heel in relation to the rest of the shoe. But that’s not what really makes the AI feel personal to users. What does is how the AI reacts to their behaviour as they shop, in the moment.
So say a user starts with a normal search for a red dress. Each time she clicks on a dress to check the price or look at the product detail page, she’s sending the AI a signal. And the signal is simply that she’s interested in the product. What makes things personal is that the AI actively figures out the similarities between the products the shopper is looking at. Is it a particular shade of red? The length of the dress? The scoop of the neckline? And as it’s learning what she wants, the AI can start suggesting dresses that fit her browsing patterns, not based on retailer metadata or purchases she’s made before, but from just the couple of clicks she’s made in the past few minutes.
That means, effectively, that an AI can figure out preference and style for that user. It knows what she’s looking for in a red dress, not just that she’s interested in a broad category of red dresses. In other words, it finds the red dress, not just an endless aisle of red dresses.
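The in-session mechanic described above can be sketched in a few lines: average the feature vectors of clicked products into a running preference vector, then re-rank the catalogue by similarity to it. This is a minimal sketch under stated assumptions; the tiny hand-made feature lists and the class name are illustrative, whereas the real system would work from hundreds of image-derived dimensions.

```python
import math

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

class SessionPersonaliser:
    """Learns a preference vector from clicks within one session.

    Hypothetical illustration of in-session personalisation, not
    Sentient's actual implementation.
    """

    def __init__(self):
        self.preference = None
        self.clicks = 0

    def record_click(self, product_vector):
        # Running mean of the clicked products' feature vectors.
        if self.preference is None:
            self.preference = list(product_vector)
        else:
            self.preference = [
                (p * self.clicks + x) / (self.clicks + 1)
                for p, x in zip(self.preference, product_vector)
            ]
        self.clicks += 1

    def rank(self, catalogue):
        # Items most similar to the learned preference come first.
        if self.preference is None:
            return list(catalogue)
        return sorted(
            catalogue,
            key=lambda item: cosine(item[1], self.preference),
            reverse=True,
        )
```

With features standing in for, say, shade of red, dress length and neckline scoop, two or three clicks are enough to push the closest matches to the top, regardless of how deep in the catalogue they sit.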
For retailers, implementing something like this is actually quite easy. It sits on the front-end of a site (no backend integration is necessary) and the AI can be specifically trained to their catalogue. All it needs, for example, is product images. After that, you can use the AI in your existing user flows, product detail pages, recommendation pages, you name it. It’s really up to retailers as to how they want to leverage it.
And the benefits are tangible. AI can help with important metrics like average order value, add to cart, and more, but one of the more interesting proof points is that AI helps expose the entirety of a retailer’s catalogue. That’s because, instead of using old recommendation systems (stuff like “users like you bought this”) or giving primacy to items that are already popular, an AI can look at the images themselves and recommend products that a user may never have found, because it was buried on page 40 or was from a brand they didn’t recognise or because the manufacturer didn’t give a retailer the right metadata.
In fact, our first customer surfaced a full 92% of their products in the first month after implementation. Which backs up a key thing we should underline here: AI really does know your entire catalogue. And as it learns, it helps a shopper find just what they want.
Other folks are taking different approaches to personalisation, of course. Chatbots are having their moment. Sites like The North Face have implemented a sort of Q&A flow that uses real AI to suggest products. A while back, Victoria’s Secret used a non-AI powered questionnaire to help users find the sizes and styles they liked. But what excites me more is personalisation that adapts and reacts to buyer behaviour in the moment. One that learns style, intent, and preference as users browse. One that gives a shopper access to a retailer’s entire inventory.
AI learns, adapts, and gets the customer. It figures out what they like even when they might have trouble articulating it themselves. AI understands intent and style so that shoppers can stop scrolling through page after page of red dresses and instead, in just a few clicks, find the perfect red dress, just like a great salesperson would do. That means getting the key benefit of brick-and-mortar shopping without having to leave the couch. And that’s the sort of thing that wins you customers for life.
Andy Narayanan is the VP of Intelligent Commerce at Sentient Technologies, the world’s highest-funded AI company, having raised over $140M. Its platform and technology have been in development for over nine years. Comment Counts is a series of opinion pieces from experts within the industry. Do you have something to say? Get in touch via email@example.com.
British haircare brand John Frieda is focusing on personalisation in its latest campaign, using an Instagram algorithm that analyses hair colour and social media expressions to generate custom video stories for its fans.
A collaboration with creative agency Brave, the bespoke “Shades of Me” films aim to show what individuals’ hair colour and Instagram feed say about them.
“Your Instagram feed is a curated, beautiful visual depiction of your unique style and self expression. Colour is a powerful part of this; from the pictures you take and filters you use, down to the locations you take them in – the colours you gravitate toward are what makes you, you,” reads the write-up from the team.
To achieve it, users simply select their hair colour and grant the site access to their Instagram or Facebook photos. The site then highlights the two key colours the user associates with the most and relevant John Frieda products for that lifestyle.
The custom film also picks out keywords that relate to them: “You are a bold, cool, original, warm ombre,” for instance. Or: “You are a deep, refreshing, admirable, rich brunette”. Those words are laid over footage of both their own shots and a selection from over 100 video close-ups of lifestyle, fashion and beauty moments created by the company.
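The campaign’s colour analysis is not public, but picking “key colours” from a set of photos can be sketched simply: quantise each pixel into a coarse colour bucket and count which buckets dominate. The bucket size and function names below are assumptions made for illustration.

```python
from collections import Counter

def bucket(rgb, step=64):
    """Quantise an (r, g, b) pixel into a coarse colour bucket.
    step=64 is an arbitrary illustrative choice (4 levels per channel)."""
    return tuple((c // step) * step for c in rgb)

def key_colours(pixels, n=2):
    """Return the n most common coarse colours across all pixels.
    `pixels` is a flat list of (r, g, b) tuples drawn from the photos."""
    counts = Counter(bucket(p) for p in pixels)
    return [colour for colour, _ in counts.most_common(n)]
```

A feed dominated by reds and blues would yield a red bucket and a blue bucket as its two key colours; a production system would add perceptual colour spaces and filtering, but the counting principle is the same.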
For those in London this festive season, there’s a pop-up shop in Covent Garden worth taking the time to visit. Unmade, as it’s called, is tucked down an unimposing side street off the main piazza. Away from the street entertainers and busy Christmas shoppers, it’s a minimal showcase of a knitwear brand currently considered one of London’s most disruptive start-ups.
Sweaters, scarves and a full-sized industrial knitting machine are on display. You can’t walk away with an item there and then, but you can use iPads to design your own and have it made especially for you thereafter.
And that’s the USP. The name “Unmade” comes from the fact no garment is finished until you, the shopper, come and complete it.
Born through frustration at the fashion industry’s stagnant approach to mass consumption, the brand is about bespoke, personalised knitwear, produced on-demand, yet at an industrial scale. Think of it as a 3D printer for fashion, yet using the same machines that make up the $200bn knitwear market worldwide.
Tommy Hilfiger is opening up its social concierge initiative to its online following this fashion week season. First launched at its spring/summer 2014 show in September, this service enables users to request bespoke assets – pictures through to collection information – in real-time.
The aim is to provide immediate customised access to the collection and the show, in order to enable social media storytelling.
This was only offered to media and influencers physically in attendance in September, but Monday’s show at the Park Avenue Armory in New York will invite any of Tommy Hilfiger’s eight million followers on Facebook, 473,000 on Twitter and 155,000 on Instagram, as well as global media, to participate.
A staff of roughly 100 photographers – up from 30 in September – will be on hand to fulfil the personalised requests. They will be both on- and off-site, receiving requests via email and Twitter, and working to respond as quickly as possible.
“Efficiency is a top priority,” a spokesperson at the company told me. “Blink and the moment is over – media and consumers don’t want to wait to see coverage, and the social concierge facilitates that process.”
While there’s no automation involved, it is perhaps inevitable that many of the requests will be similar or the same, like a backstage make-up shot, a picture of Tommy himself, one of the model opening the show (last season Jourdan Dunn), or a detailed view of one of the pieces – which should ease the load.
An image bank will be created accordingly for the team to draw from throughout the event, but they are also willing to gather more specific assets both backstage and front of house. Fans are actively encouraged to be as creative and original with their requests as they like. Last season saw bespoke deliveries ranging from a personal handwritten message from certain models to an image of Tommy with his thumbs up.
Avery Baker, CMO of the Tommy Hilfiger group, said: “This ‘beat-the-clock’ mentality is an important component of amplifying our brand message in the new digital age of fashion where coverage and commentary are happening in-the-moment before it’s on to the next!”
Tommy Hilfiger is also hosting a runway “Instameet”, inviting 20 local Instagrammers to join onsite on show day and receive a guided tour of the set, including backstage. The initiative is in collaboration with Brian Difeo (@bridif) and Anthony Danielle (@takinyerphoto), both influential New York Instagram users. The hashtags to follow include #tommyfall14 and #nyfwinstameet.
“This song is in tribute to Chloé, Chloé women and celebrating 60 years of all of us, it’s our theme tune in a way. So to all who have supported, designed for, bought, worn, written and talked about, shared, followed, loved and lusted after Chloé, we say thank you – this is for you. Enjoy!” reads the write-up.
This 26th and final letter also sees the archive – created by digital agency Guided Collective – become an “intimate invitational tool” from Monday. Users will be able to type in the name of a friend and send them a bespoke version of the Chloé heritage, relative to the letters that make up their name.
“As [founder Gaby Aghion] once used the alphabet to inspire creativity and fun, we want you to continue the journey and introduce this wonderful story to a friend,” it says.