Sixth Sense of Retail

What motivates customers to glance, to stop, to stare, to share … to ultimately purchase a brand’s product?

(As first seen on LinkedIn, written by Melissa Gonzalez)

It’s a different answer for each of us, but for all of us there is something that triggers our senses to connect. It may be a flash of color, a burst of light, a soothing sound or a powerful scent. And retailers work hard to design moments and mastermind store layouts, strategically planning speed bumps and moments of discovery to incite engagement and, hopefully, conversions.

Image from Organic Fiber Council Pop-Up

Emotional connections play an integral role in marketing products and building brands. So it’s critical that brands and store planners truly know whether their efforts have merit: Is signage placed where it should be? Does the story in the store incite happiness? Does the color blue convert more customers than yellow? The guessing game of what makes an impact on customers is a thing of the past when the “eyes” are always watching and can help brands make sense of the physical world they have created. Just as Dustin Hoffman could count cards in Rain Man to help Tom Cruise play to win, computer vision can do the same, constantly collecting and analyzing data to make informed predictions about what customers want to see and do.

“Cameras will be everywhere. All inanimate objects will have cameras and this will give them the ability to see. The Internet of Eyes is going to be larger than IoT, and Amazon Echo Look recently validated this,” says Evan Nisselson, General Partner, LDV Capital and organizer of this week’s annual LDV Vision Summit.

Image courtesy of Backchannel

Creating a Clear Use Case

In order to truly capitalize on the power of computer vision, marketers and merchandisers need to begin with a clear point of view on what they are trying to learn (“the hypothesis”) and metrics of success. On a basic level they can ask questions such as: Is this the optimal layout? Do customers interact in one zone of the store more than others? Where do customers spend the most time? Do these patterns correlate with the sales conversion data we gather from our POS system?
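To make that last question concrete, here is a minimal sketch in Python with pandas (not any vendor’s API); the zone names, dwell times and conversion counts are invented, standing in for camera events and a POS export.

    # A minimal sketch: join hypothetical per-zone dwell times from camera
    # events to conversion counts from a POS export, then check correlation.
    import pandas as pd

    # Hypothetical inputs: one row per visit segment seen by the cameras,
    # and one row per zone from the POS system.
    dwell = pd.DataFrame({
        "zone":       ["entry", "denim", "denim", "fitting", "checkout"],
        "dwell_secs": [12, 95, 240, 310, 45],
    })
    sales = pd.DataFrame({
        "zone":        ["denim", "fitting", "checkout"],
        "conversions": [18, 9, 30],
    })

    # Average dwell time per zone, joined to conversions per zone.
    by_zone = (dwell.groupby("zone", as_index=False)["dwell_secs"].mean()
                    .merge(sales, on="zone", how="inner"))

    print(by_zone)
    print("dwell/conversion correlation:",
          by_zone["dwell_secs"].corr(by_zone["conversions"]))

A correlation is only a starting point, but it turns “do customers linger here?” into a number that can be tracked week over week and tested against layout changes.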

A camera has the ability to simultaneously scan and analyze hundreds of points on a person’s face. It can also distinguish between human and non-human objects. Through near-instantaneous facial detection, emotion detection, and demographic and feature detection, computer vision layers context onto who is interacting by adding in the how and the why.
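As an illustration of the detection step only, and not any vendor’s pipeline, the sketch below uses OpenCV’s bundled Haar cascade to find faces in a single camera frame; the file path is a placeholder, and emotion, age and gender models would run on the cropped faces it returns.

    # A minimal sketch of the "is there a face, and where" step using the
    # pre-trained frontal-face detector that ships with OpenCV.
    import cv2

    # Load a frame from an in-store camera (path is a placeholder).
    frame = cv2.imread("store_camera_frame.jpg")
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    # Each hit is a bounding box (x, y, w, h); downstream classifiers would
    # take the cropped face and estimate expression, age range and so on.
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    print(f"{len(faces)} face(s) detected")
    for (x, y, w, h) in faces:
        face_crop = frame[y:y + h, x:x + w]  # hand this crop to an emotion model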

“Contextual relevance is the most important aspect and challenge when leveraging computer vision. There will be so much data, and only a small percentage will be valuable signals to answer questions and increase revenue opportunities. When integrated properly, these high-quality signals will deliver significant revenue,” says Nisselson.

A man who stops and stares at a product may be driven by a very different motivation, one that shapes his dwell time in that area, than a woman who stops and smiles in the very same spot. By compiling and categorizing instance after instance, day after day, brands can establish norms, identify outliers and eventually build very specific categorizations that let them understand each customer on a more personal level.
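A toy example of what “norms and outliers” can mean in practice: with a week of dwell times logged for a single fixture (the numbers below are invented), even a simple two-standard-deviation rule separates routine glances from the shopper who lingered far longer than the norm.

    # A minimal sketch of turning accumulated observations into a norm and
    # flagging outliers: dwell times in seconds logged for one fixture.
    from statistics import mean, stdev

    dwell_log = [34, 41, 29, 37, 45, 33, 210, 38, 31, 40]  # illustrative numbers

    mu, sigma = mean(dwell_log), stdev(dwell_log)
    outliers = [t for t in dwell_log if abs(t - mu) > 2 * sigma]

    print(f"norm: {mu:.0f}s +/- {sigma:.0f}s, outliers: {outliers}")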

Image via IBM

Vision In Action

Through visual sentiment analysis, technology companies are working hard to crack the code on understanding what motivates and influences customer decisions. Affectiva, an MIT Media Lab spinoff that has analyzed over 5 million faces, enables retailers to use facial tracking to generate invaluable emotional insights that inform digital displays and in-store signage. Sensing up to seven human emotions (anger, sadness, disgust, joy, surprise, fear and contempt), up to 20 different facial expressions, age range, ethnicity and gender, its recognition technology analyzes pixels in facial regions to classify expressions and map them to associated emotion emojis.

IBM Watson has been working with major retailers such as The North Face and Macy’s to enhance their shopping experiences with visual recognition and artificial intelligence and enable more personalized service. They create and train their systems with custom image classifiers that match each retailer’s own collections and can associate them with customers and sentiment.

Startup Eversight, which says it enables brands and retailers to deploy promotions that outperform the status quo by 10-25%, helps consumer packaged goods companies monitor the presentation of their merchandise on store shelves and track the results of in-store promotions and visual displays. Via in-store cameras and artificial intelligence, its clients can monitor how products perform on the shelf and use sentiment analysis to optimize product promotions.

Building customer segmentations based on computer vision data and sentiment analysis empowers retailers on a deeper level. It adds a layer of complex thinking to pass/fail decisions. It allows retailers to understand the dynamics of a living-lab store environment. On a basic level it answers questions about traffic patterns and dwell times, but on a more complex level it can drive true personalization. It can empower sales associates to serve as personal concierges to each customer. And it can inform the software behind interactive touch screens to dynamically recognize the person standing in front of the screen down to their age, gender and ethnicity, and serve up the images and information that are most relevant.
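A minimal, rule-based sketch of that last step, with a hypothetical Shopper record standing in for the attributes a vision pipeline would supply; real systems would use far richer segment models, but the shape of the decision is the same.

    # A rule-based sketch of choosing screen content from detected attributes.
    from dataclasses import dataclass

    @dataclass
    class Shopper:
        age_range: str      # e.g. "25-34", as estimated by the vision pipeline
        sentiment: str      # e.g. "joy", "neutral", "frustration"
        dwell_secs: float   # time spent in front of the screen

    def pick_screen_content(shopper: Shopper) -> str:
        # Hypothetical merchandising rules a retailer might configure.
        if shopper.sentiment == "frustration":
            return "offer_assistance_prompt"
        if shopper.dwell_secs > 60:
            return "detailed_product_story"
        return f"hero_campaign_{shopper.age_range}"

    print(pick_screen_content(Shopper(age_range="25-34", sentiment="joy", dwell_secs=75)))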

“Leveraging computer vision and artificial intelligence, a camera will hopefully help proactively shop for me without needing me to physically search online or in stores,” says Nisselson. “Ideally, Amazon Alexa would send me an email saying, ‘Looks like your favorite red pants are wearing out because you wear them all of the time. We noticed a hole on your back pocket and thought you would like to know that we have two of those pants in the same color and size in stock. Would you like me to order you one or two of them?’ … Yes!”

By leveraging computer vision and artificial intelligence, stores can be designed to build emotional connections with consumers in a dynamic way. It takes what’s possible in personalization to a new level. Eventually we will all have our Minority Report Tom Cruise moments, but this time the systems speaking to us will also understand our current sentiment toward what they are delivering, and will be able to adjust and respond with what we really want.