Categories
business Campaigns e-commerce Editor's pick product Retail technology

How brands can find their own voice in a screenless future

Amazon Alexas and Google Homes have been popping up in households around the world, and there are expected to be 8 billion voice assistants by 2023. So far the technology has mainly been used to run other smart devices in the home, ask novelty questions or set timers, but there is strong potential for fashion and beauty brands to focus on the retail side of the experience.

Voice commerce sales totaled a whopping $2.1 billion last year, and it is predicted that consumers will use the technology for almost a fifth of their total spending by 2021. For brands, this is not only a new opportunity to connect with their customers, but an important new sales channel.

Last year we spoke to Amazon Alexa’s founder, William Tunstall-Pedoe, on the Innovators podcast about how voice tech will impact retail. Although the technology is still in its early stages of development, Tunstall-Pedoe envisioned a future that is all connected: “I think you’ll be surprised in a couple of years if you speak to a device and it doesn’t reply.” He believes the technology will be transformative, with the artificial intelligence behind voice assistants eventually interconnecting everything around us.

As far off as it sounds, this future may arrive a lot sooner than we think.

From creating moments of discovery to enabling better store interactions, we explore three ways that brands and retailers can leverage voice tech to enhance the customer experience.

Gaining traction
Reebok’s limited edition Club C sneakers

One of the biggest challenges retailers and brands face when engaging in voice interactions is how to get their products discovered. The lack of a screen and the limited intelligence of current algorithms mean that shopping on these platforms is generally a linear journey, and unless the customer is looking for a specific brand, surfacing as a suggestion is virtually impossible.

One way retailers can adapt to the technology is by utilizing it in their marketing strategy. Reebok, for example, teamed up with Amazon and Google for the launch of its Swarovski sneakers collaboration. Consumers could win a pair of the limited edition trainers by asking their voice assistant to “open Reebok Sneaker Drop”, which would automatically enter them into the competition. On the day of the launch, 50 lucky winners were announced through the voice channels.
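For context on how a campaign like this hangs together technically: a phrase such as “open Reebok Sneaker Drop” maps to a custom skill’s invocation name, and the skill’s LaunchRequest handler runs the campaign logic. Below is a minimal sketch using Amazon’s ASK SDK for Python; the handler name and the enter_competition helper are hypothetical illustrations, not details of Reebok’s actual build.

```python
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.utils import is_request_type

sb = SkillBuilder()

def enter_competition(user_id: str) -> None:
    """Hypothetical helper: record this user's entry into the draw."""
    # e.g. write the user ID to a database keyed by campaign
    pass

class SneakerDropLaunchHandler(AbstractRequestHandler):
    """Fires when a user invokes the skill, e.g. 'open Reebok Sneaker Drop'."""

    def can_handle(self, handler_input):
        return is_request_type("LaunchRequest")(handler_input)

    def handle(self, handler_input):
        user_id = handler_input.request_envelope.session.user.user_id
        enter_competition(user_id)  # hypothetical campaign logic
        speech = "You're in! Winners will be announced on launch day."
        return handler_input.response_builder.speak(speech).response

sb.add_request_handler(SneakerDropLaunchHandler())
handler = sb.lambda_handler()  # deployable as an AWS Lambda entry point
```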

This specific campaign showed that as the popularity of the drop model starts to lose steam, voice tech could help reignite its spark. The approach is also particularly effective with a younger generation that is not only tech-savvy, but constantly looking to be challenged in order to land exclusive products.

Setting the tone
Mastercard’s sonic branding

Marketers often talk about fighting to cut through the noise, but now brands are literally fighting to get their voices heard. In the near future, owning a clear brand voice that aligns with the brand’s overall identity and DNA is going to be an important tool to have under the belt.

As voice tech gets more sophisticated, brands will start to move away from the generic ‘Alexa’ or ‘Cortana’ voices toward recognizable accents that differentiate them from competitors. Developing the correct tone of voice will be key to building brand loyalty: 72% of consumers believe brands should have a unique voice and personality.

Mastercard has been experimenting with sound architecture by creating its own sonic brand identity which is simple, memorable and adaptable. The distinct melody is played at every touchpoint of the consumer journey, with the intention of helping reinforce the brand’s values and build deeper connections with its customers. This indicates that although brands have long relied on having a purely visual identity, in the future, they are going to have to adapt to an environment that is increasingly audio-friendly (and often screenless).

Enhancing the in-store experience
H&M’s voice activated mirror

68% of consumers say voice assistants free them to multitask and accomplish tasks hands-free in the home, but how could that translate in-store? In a fitting room, for example, a voice assistant could make product recommendations, check for other sizes, or even offer styling tips.

Last year, H&M tested a voice-activated mirror at its NYC flagship, which allowed users to access style advice and discounts and even take selfies. The mirror gained a lot of traction, with 150 interactions per day, and 85% of those who interacted went on to scan a QR code to receive a discount. The mirror was implemented as a standalone feature, but in the future this technology could potentially move into changing rooms, allowing people to experience it privately (and therefore lowering the barrier to entry).

In 2016, Gartner predicted that by next year 30% of web browsing would be screenless. Brands and retailers must therefore keep up with the pace of change, or risk being excluded from this emerging behavior that is increasingly leaning towards audio.

How are you thinking about new technology? The Current Global is a transformation consultancy driving growth within fashion, luxury and retail. Our mission is to solve challenges and facilitate change. We are thinkers and builders delivering innovative solutions and experiences. Each of the approaches referenced above is matched by one of our products and services. Interested in how? Get in touch to learn more.

Categories
Campaigns Editor's pick Retail

Coach pop-up celebrates self-discovery with NY fairground experience

“Life Coach” pop-up

Coach’s newest pop-up, Life Coach, celebrates the label’s roots in New York City with a series of immersive experiences that aim to “heighten your senses, stimulate your soul and wake up all the feels”.

The activation, running from June 12 through June 17 in NYC’s Soho neighbourhood, where the brand was founded in 1941, invites guests to take part in tarot card readings, drawing and carnival games.

Visitors enter the space via a neon storefront filled with psychic symbols and Coach visuals. After checking in, they reach the first room, where they are met with an entirely blank canvas on which they are encouraged to draw.

The next room recreates a typical Coney Island-style fairground scene, including old-fashioned arcade games and photo props, as well as a boardwalk made from pieces salvaged from Coney Island after Hurricane Sandy.

In the third and final room, visitors can walk through a dark forest where they can find white tents that house tarot card readers.

Speaking to the New York Times, Carlos Becil, Coach’s chief marketing officer, said of the concept: “Whether you call it mindfulness, spirituality or self-help, seeking answers is the new pop culture.”

Activities that help consumers through their self-discovery include free sessions with mystics, including tarot card reader the Hoodwitch and astrologers the AstroTwins. The event, which has no Coach product in sight, will keep its concept of self-discovery and elusiveness by introducing surprise guests and events throughout its programming until the pop-up’s last day.

The entire initiative ties into a broader theme we’re seeing in consumer retail, whereby the experience economy is evolving into the transformation economy – a state that is about driving self-improvement and enhancement for consumers through brand activities, rather than mere moments meant to encourage dwell time or social sharing.


Categories
Editor's pick mobile Retail technology

American Express brings shoppable AR feature to Coachella

Coachella

American Express has integrated a shoppable augmented reality feature into the official app of the Coachella music festival, which spans two weekends in Palm Springs this month.

The payment company’s AR experience allows its cardholders to buy select merchandise using an AR camera feature while on the festival grounds.

Within the Coachella app, Amex cardholders can tap a dedicated Amex tab that enables the AR experience, as well as a series of other benefits and rewards. Tapping the “shop” feature and waving their phones triggers an AR image of exclusive merchandise, which can be purchased on the spot.

Other cardholder benefits at the festival include entrance into a club area, access to an Uber priority lane and free Ferris wheel rides, while Platinum card members get access to a dedicated house that hosts exercise classes and music performances.

The AR app from American Express at Coachella

Amex has been increasingly experimenting with AR technology in order to bring tech-enhanced, real-world experiences that blend discovery and commerce to its cardholders.

It recently collaborated with Justin Timberlake on an experience promoting his new album, “Man of the Woods”, within the American Express Music app. The “Outside In” AR camera experience sees Timberlake himself guide users through a Montana setting while he shares details of how his “Breeze Off the Pond” track came together. Users can also shop for exclusive merchandise while partaking in the experience.

Justin Timberlake and American Express

Brands are upping the ante when it comes to providing immersive mobile experiences that actually convert into sales. By its nature, AR technology needs to be deployed in-situ, meaning there is also scope to play with the element of scarcity by making the experience geo-fenced, as seen with the Coachella feature.
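At its simplest, geo-fencing an experience like this means the app only unlocks the AR feature when the device reports coordinates within a set radius of the venue. Here is a minimal sketch in Python; the venue coordinates and radius are rough assumptions for illustration, not values from the Amex implementation.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinates, in kilometres."""
    r = 6371.0  # mean Earth radius, km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Assumed values for illustration: approximate festival coordinates and radius
VENUE_LAT, VENUE_LON = 33.68, -116.24
RADIUS_KM = 1.5

def ar_shop_enabled(user_lat: float, user_lon: float) -> bool:
    """Unlock the shoppable AR camera only inside the geofence."""
    return haversine_km(user_lat, user_lon, VENUE_LAT, VENUE_LON) <= RADIUS_KM
```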

Most recently, Nike teamed up with Snapchat to offer early access to a new shoe at a basketball game that could only be purchased by scanning Snapcodes; meanwhile at SXSW this year, hip activewear brand Outdoor Voices encouraged Austin locals and visitors to go outside by creating an AR experience that surfaced particular products depending on location.

Categories
data e-commerce Editor's pick mobile technology

ASOS launches visual search tool to aid inspiration and discovery for shoppers

ASOS visual search

My filter for successful visual search is simple – can you take a photo of someone else’s shoes or jacket when on a busy train and find a direct replica online? Can technology negate the awkwardness of actually speaking to someone during your commute to find out where his or her “must-have” item is from?

Fashion stalker claims aside, the short answer is still no. In the majority of cases, the tech is not yet good enough to pull apart a busy image and identify exactly what that piece is.

It is, however, getting better at finding similar items. Thanks to artificial intelligence, it can identify shape, colour, print and more – it can serve up relevant options and at least start to inspire discovery.

That’s the key theme behind e-commerce site ASOS’s launch of visual search on its native app today.

This is a fashion website with a huge 85,000 products on it, and 5,000 new ones added every week. One of the many challenges in the online retail space is balancing that newness with the overwhelming nature of volume, particularly for users increasingly browsing on mobile. It’s for that same reason we’ve also seen Pinterest and eBay recently playing in this computer vision space. It’s about that keyword: “discovery”.

This rollout from ASOS, then, aims to enable shoppers to capture fleeting moments – whether that’s someone they pass on the street, a look a friend is wearing or even a screengrab from Instagram – and use them to search through the site’s product lines for similar suggestions.

“The depth of our offering is absolutely one of our strengths. However that range can be challenging to present to customers, especially on a mobile phone,” Richard Jones, head of product and UX at ASOS, explains to me. “If you know what you want, you can quite simply get to what you’re looking for. But what we’re trying to find is more of that discovery use case – if you’re not quite sure what you want, or you’ve seen something that’s inspired you, visual search is designed to kickstart that discovery… It’s about getting as close as possible to giving you something that is visually similar.”

The tool is shown as a camera icon in the search bar of the ASOS app. Tapping on it then invites customers to either take a picture or upload one from their library to have it find similar products.
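For a sense of what “find similar products” involves under the hood: visual search systems typically embed each catalogue image into a feature vector with a convolutional neural network, then rank products by how close they sit to the query photo’s vector. ASOS hasn’t disclosed its provider’s method, so this is only a generic sketch of the idea in Python, using a pretrained torchvision model as the feature extractor:

```python
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

# Pretrained CNN as a generic feature extractor: drop the classification layer
backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()
backbone.eval()

preprocess = T.Compose([
    T.Resize(256), T.CenterCrop(224), T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

@torch.no_grad()
def embed(image_path: str) -> torch.Tensor:
    """Map an image to a unit-length feature vector."""
    x = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    v = backbone(x).squeeze(0)
    return v / v.norm()  # normalised, so a dot product is cosine similarity

def most_similar(query_path: str, catalogue: list, top_k: int = 10) -> list:
    """catalogue: list of (product_id, embedding) pairs built offline."""
    q = embed(query_path)
    scored = sorted(catalogue, key=lambda item: float(q @ item[1]), reverse=True)
    return [product_id for product_id, _ in scored[:top_k]]
```

In production the linear scan above would be swapped for an approximate nearest-neighbour index, so a catalogue of 85,000 items can be queried in milliseconds.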

Jones knows the tech isn’t yet perfect; if anything, the examples out in the market to date have been a “bit clunky”. But with machine learning and big data, it’s only going to improve, he suggests.

ASOS’s own version, the tech for which is powered by an external third party the company has opted not to disclose, is built on this notion. “The more [this tech] gets used, the better it gets trained, the data results get better… the smarter it becomes,” he explains.

That also reflects the way the ASOS team are operating – pushing the launch out to market (in the UK only at first) in order to test and iterate accordingly. It’s about getting it out there and learning how it’s best used before rolling it out to other geographies.

In its press release, ASOS refers to this as the “build-measure-learn” approach to innovation, a methodology drawn from the Lean Startup movement.

This announcement also supports wider planned technology investment by the company. It currently has a tech team of 900 employees and is planning to hire a further 200 over the next year, for instance. It says it’s focusing on its AI-powered recommendation engine, which uses big data and a smart algorithm to learn customers’ preferences over time, as well as on improving site infrastructure to drive agility and speed up innovations for customers.

Zooming in on the mobile experience is particularly key. Today 80% of UK traffic for ASOS and nearly 70% of orders come from a mobile device, with people spending 80 minutes per month, on average, in the ASOS app.

With such mobile-native customers, Jones says it’s about how to use the underlying technology in these devices – the high processing power, the ultra high-definition cameras, the depth perception imagery and more.

“We’re thinking about how do we use these devices in a way that is natural and contextual to how our 20-something customers live their lives. They go everywhere with [their smartphones] – so how can we make sure we give them an experience they are expecting?” he comments.

Further motivation lies in the fact that using the camera as a means to search is going to become fairly default in September, when Apple launches iOS 11, which includes the ARKit development platform. That essentially means all manner of augmented reality uses will be possible directly through the iPhone’s camera lens, visual search included. Net-a-Porter is another e-commerce player that has referenced using it.

“What we want to do is be able to meet that customer expectation and demand,” Jones adds. The visual search tool will live within the app for now, with the intention of making that broader experience an increasingly personalised one for each shopper down the road.

ASOS’s visual search launches on iOS in the UK today with pending rollout for Android and then international scale thereafter.

This post first appeared on Forbes

Categories
Comment e-commerce Editor's pick technology

Comment counts: How AI is making online fashion truly personal

Artificial intelligence promises benefits for both retailers and their customers, from personalised discovery to the surfacing of entire catalogues, writes Andy Narayanan of Sentient Technologies.

Sentient Technologies

What’s the biggest difference between shopping online and shopping at a store? Online gives you the convenience of shopping in your robe, getting products shipped directly to your house, browsing an endless aisle of choice after choice, the ability to price check tons of retailers on the same dress, and a whole lot more. In a whole host of ways, shopping online is just, well, better. But there’s one thing that e-commerce sites have struggled with for years: the personal touch.

For many of us, when we head into our favourite stores, one of the best parts is interacting with a great salesperson. And what exactly makes a great salesperson? They get us. A great salesperson listens and understands what you want. They know which brands run a little small. They can intuit what your style is based on how you’re dressed and the clothes you’re trying on. They can head to the back to find you something they know you’ll love.

This, as we mentioned, is notably missing online. Instead of great salespeople, we have a search bar and little checkboxes to click to browse the aisles. We’re left browsing that so-called endless aisle in hopes of finding something we like. In essence, we’re trading the personal touch for convenience.

But new advances in artificial intelligence (AI) are changing all that. They bring far more personalisation than what you see at your typical fashion retailer and promise massive benefits for both retailers and their customers. Which, of course, is exactly how it should be.

So let’s get back to that online shopping experience for a second. You know the one I’m talking about. After all, most sites, when you get right down to it, have very similar interfaces. You’ve got a search bar, some facets to help narrow your search (those checkboxes on the left with options for brand, colour, size, price, etc.), and a grid of product images. If you don’t like what you see, you can click through the page numbers on the bottom or click into a product detail page to find more. And really, that’s most of what shopping online actually is.

But with AI, it’s different. Because AI can understand the product images themselves, it allows for a whole different kind of shopping.

A smart AI – like the one my company Sentient makes – looks at an image in hundreds of vectors. That means it can identify things that are tough to describe, like the placement of a logo, a certain kind of fringe, the height of a heel in relation to the rest of the shoe, etc. But that’s not what really makes the AI feel personal to users. What does is how the AI reacts to their behaviour as they shop, in the moment.

So say a user starts with a normal search for a red dress. Each time she clicks on a dress to check the price or look at the product detail page, she’s sending the AI a signal. And the signal is simply that she’s interested in the product. What makes things personal is that the AI actively figures out the similarities between the products the shopper is looking at. Is it a particular shade of red? The length of the dress? The scoop of the neckline? As it learns what she wants, the AI can start suggesting dresses that fit her browsing patterns – not based on retailer metadata or purchases she’s made before, but on just the couple of clicks she’s made in the past few minutes.

That means, effectively, that an AI can figure out preference and style for that user. It knows what she’s looking for in a red dress, not just that she’s interested in a broad category of red dresses. In other words, it finds the red dress, not just an endless aisle of red dresses.
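One simple way to picture this in-session mechanism (a generic sketch, not Sentient’s proprietary algorithm): keep a running “taste” vector built from the image embeddings of the items clicked so far, weight recent clicks more heavily, and re-rank the catalogue by similarity to that vector.

```python
import numpy as np

def rerank(catalogue_vecs: dict, clicked_ids: list, alpha: float = 0.7) -> list:
    """Re-order products by similarity to a session 'taste' vector.

    catalogue_vecs: product_id -> unit-normalised image embedding
    clicked_ids: products clicked this session, oldest first
    alpha: decay factor; recent clicks weigh more
    """
    if not clicked_ids:
        return list(catalogue_vecs)  # no signal yet, keep the default order

    # Build the taste vector as a decayed sum of clicked-item embeddings
    taste = np.zeros_like(next(iter(catalogue_vecs.values())))
    for i, pid in enumerate(reversed(clicked_ids)):  # most recent first
        taste += (alpha ** i) * catalogue_vecs[pid]
    taste /= np.linalg.norm(taste)

    # Rank every product by cosine similarity to the taste vector
    return sorted(catalogue_vecs, key=lambda pid: -float(catalogue_vecs[pid] @ taste))
```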

For retailers, implementing something like this is actually quite easy. It sits on the front-end of a site (no backend integration is necessary) and the AI can be specifically trained on their catalogue. All it needs, for example, is product images. After that, retailers can use the AI in existing user flows, product detail pages, recommendation pages, you name it. It’s really up to them how they want to leverage it.

And the benefits are tangible. AI can help with important metrics like average order value and add-to-cart rate, but one of the more interesting proof points is that AI helps expose the entirety of a retailer’s catalogue. That’s because, instead of using old recommendation systems (stuff like “users like you bought this”) or giving primacy to items that are already popular, an AI can look at the images themselves and recommend products that a user may never have found – because they were buried on page 40, or were from a brand they didn’t recognise, or because the manufacturer didn’t give the retailer the right metadata.

In fact, our first customer surfaced a full 92% of their products in the first month after implementation. Which backs up a key thing we should underline here: AI really does know your entire catalogue. And as it learns, it helps a shopper find just what they want.

Other folks are taking different approaches to personalisation, of course. Chatbots are having their moment. Sites like The North Face have implemented a sort of Q&A flow that uses real AI to suggest products. A while back, Victoria’s Secret used a non-AI-powered questionnaire to help users find the sizes and styles they liked. But what excites me more is personalisation that adapts and reacts to buyer behaviour in the moment. One that learns style, intent, and preference as users browse. One that gives a shopper access to a retailer’s entire inventory.

AI learns, adapts, and gets the customer. It figures out what they like even when they might have trouble articulating it themselves. AI understands intent and style so that shoppers can stop scrolling through page after page of red dresses and instead, in just a few clicks, find the perfect red dress, just like a great salesperson would do. That means getting the key benefit of brick-and-mortar shopping without having to leave the couch. And that’s the sort of thing that wins you customers for life.

Andy Narayanan is the VP of Intelligent Commerce at Sentient Technologies, the world’s highest-funded AI company, having raised over $140M. Its platform and technology have been in development for over nine years. Comment Counts is a series of opinion pieces from experts within the industry. Do you have something to say? Get in touch via info@fashionandmash.com.

Categories
social media

Former Topshop, Burberry exec launches Tunepics – an image-based music sharing app

Tunepics on the iPhone

Will.i.am, Kate Bosworth and Jamie Oliver are among the first celebrity names to be using a new music discovery app called Tunepics, while brands including Paul Smith, Chloé and ASOS are also on board.

Ever wanted to share a song with your photograph to help sum up the mood of the scene more than a filter alone can? Now you can. Tunepics – launched in the App Store for the iPhone and iPad today – enables users to pair images with relevant songs thanks to the iTunes API.

“Over 500 million pictures are uploaded to the internet every day, and over 100 million songs are downloaded each week. Together, that’s dynamite,” says the brains behind the new social network, Justin Cooke, former CMO of Topshop, now founder and CEO of innovate7. His aim is to help create the “soundtrack to your life”.

The experience is an intuitive one: you upload an image, place a filter over the top, then search the 35 million songs in the iTunes library by keyword to add one to your shot. The result appears in a feed alongside those from the friends you opt to follow, each one auto-playing a 30-second preview of the track as you scroll over it, as well as offering a ‘download’ button to buy the full version.
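That keyword search maps onto Apple’s public iTunes Search API, which returns track metadata including the 30-second preview URL and a store link. A minimal sketch of such a lookup in Python; the field selection is ours, and exactly how Tunepics wraps the API isn’t public.

```python
import requests

def search_itunes(term: str, limit: int = 5) -> list:
    """Look up tracks on the public iTunes Search API."""
    resp = requests.get(
        "https://itunes.apple.com/search",
        params={"term": term, "media": "music", "limit": limit},
        timeout=10,
    )
    resp.raise_for_status()
    return [
        {
            "track": r.get("trackName"),
            "artist": r.get("artistName"),
            "preview_url": r.get("previewUrl"),  # 30-second audio preview
            "store_url": r.get("trackViewUrl"),  # the 'download'/buy link
        }
        for r in resp.json().get("results", [])
    ]

# e.g. search_itunes("paint it black") returns matching tracks with
# their preview streams and iTunes store links
```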

Posts can also be ‘re-tuned’ to your own followers, and shared via Facebook and Twitter where they will appear as a ‘tunecard’. For the likes of Will.i.am, that of course makes the app an appealing proposition for its potential to help drive record sales. It also provides a revenue stream for innovate7 through affiliate sales from iTunes (there’s no advertising model planned on the platform for now otherwise).

Cooke is particularly excited about the opportunity that lies in music discovery, both for consumers using the app and for young, emerging talent to start gaining recognition in a new way. On that basis, it launches with a specially commissioned soundtrack from British band Ellerby, called Colour Me In.

But the premise of the app, which was built by agency AKQA, otherwise goes further than just being about music sharing and discovery. The aim is to provide multisensory experiences that evoke an emotional response.

“When you hear a picture, it changes everything; it awakens your senses. We want [Tunepics] to be like a cinematic celebration of your life,” said Cooke. “Music is the most powerful way to express the things we see and feel; nothing else comes close.”

To that end, the emotional response that posts receive from followers is also fully visible. Each is accompanied by an ‘emotion wheel’ (the design of which also makes up the app’s logo). This features a spectrum of 16 colours users can choose from, representing different feelings such as happy, moved, jealous and heartbroken.

Said Cooke: “A like doesn’t tell a story on its own anymore. When [Nelson] Mandela passed away, we didn’t want to say that we liked it, but that it moved us. This is all about enabling an emotional experience.”

Which is why this app also makes sense, from the off, for brands. Beyond the initial celebrity appeal, there are also the likes of Paul Smith, All Saints, ASOS, Dazed and Airbnb already on board.

The expectation is that embedding music into their social content will help heighten the moments they want to talk about. An example post from Paul Smith featured a collection of paint pots and the Rolling Stones track Paint it Black. “His response was that he couldn’t imagine life without music. That’s so powerful, and so true,” Cooke explained. In fact, a similar quote from philosopher Nietzsche features on the Tunepics introductory video from the innovate7 team: “Without music, life would be a mistake.”

Clare Waight Keller, creative director of Chloé, said the choice to join Tunepics from day one was an instant decision after a two-minute pitch. “I just loved the added layers of emotion; simply adding music to an image really brings it to life. It’s like a way to capture what was going through your head in that moment.”

She also appreciates the emotion wheel. “[It] will be really interesting. ‘Likes’ have almost become empty gestures now, it takes no real thought to ‘like’ a picture. But to take the time to select the feeling the image inspired in you, shows real engagement. It’s a great way for Chloé to connect with our audience,” she explained.

Brands will also begin to benefit from the data said emotion wheel collates. Mood charts are displayed beneath each tunepic showcasing people’s responses, which suggests valuable consumer insights could be gleaned should the numbers creep high enough. Unlike Instagram, it is also possible to add hyperlinks to every post, which will prove quite the draw for the likes of Paul Smith again, and all those others with e-commerce capabilities.

It may come as no surprise to learn that prior to his role at Topshop, Cooke spent six years helping to lead the charge at Burberry – a brand not only with a longstanding music initiative in Burberry Acoustic, but with an unquestionable focus on emotive content tied to measurable business results.

Topping it all off is the fact that those aforementioned filters are based on the weather – another theme familiar to Burberry fans. Every photograph uploaded can be enhanced with true-to-life overlays of snow, raindrops, sunshine or even a rainbow.

“I’ve always had a fascination with music, colour, images and the weather, and how they influence our mood and emotions. I want people to be able to share the depth behind the moments they experience and to articulate all the ones that they dream of having,” Cooke explained.