Pretty Soon, Chatbots Will Be Able to Understand How You're Feeling
“It’s like a sixth sense. When a guy walks into the store, I can tell in, like, five seconds.” The veteran clothing salesperson I’m interviewing warms quickly to my questions about his art. “I watch the way he walks, his eyes. I can tell if he’s happy, sad or stressed. I can tell if he needs to talk, or just wants to get on with finding whatever he’s looking for. I know how to respond to make my commission.”
Contrast this with the highly touted chatbot from 1-800 Flowers, which I interacted with recently. It efficiently walked me through my order and, as I was ready to pay, wished me a “fantastic, colorful day!” This would have been friendly and pleasing, had I not spent the previous 15 minutes browsing funeral flower arrangements.
Granted, I was interacting with a Facebook Messenger chatbot, which had no knowledge of my actions on the company website. However, wide-scale rollout of intelligent on-site chatbots is just around the corner. And when this happens, things will get really interesting.
Once we have intelligent chatbots operating organically on major retail, travel or finance sites, the picture changes fundamentally. Having the chatbot from which I ordered funeral flowers solemnly wish me condolences is just the tip of the iceberg. This is a conversational ecommerce no-brainer -- the bot sees what I’m doing on the site and responds according to a script.
But what happens when we take this to the next level? What happens when the chatbot can minutely examine my on-page actions, infer my mindset and respond accordingly?
In other words, what happens when chatbots, in a very real sense, can empathize?
Is digital body language just a buzzword?
Much has been written about “digital body language” -- understanding what customers do online, not just what they say (or write).
Why is digital body language so crucial to online success? Because digital interactions -- just like day-to-day real-world interactions -- are based in very large part on nonverbal communication. When we interact with others in the physical world, we are continuously processing wordless signals such as facial expressions, tone of voice, gestures, body language, eye contact and physical distance. We cannot understand the true meaning of an interaction if we do not have the ability to interpret these nonverbal signals.
Despite the clear importance of digital body language as a tool for ecommerce, it has remained largely a catchall phrase for ex post facto analysis and profile-based personalization. Even efforts by chatbot vendors to humanize interactions fall flat because they have failed to capture, analyze and harness the most meaningful and powerful aspect of human communication: the unspoken.
All this, however, is about to change.
Harnessing digital body language is not science fiction.
Here’s the breakthrough -- and it’s not science fiction, even though it sounds like it. Just as we read nonverbal signals in the offline world, today we can use cutting-edge customer experience technology to infer a customer's mindset in real time.
Using advanced user experience solutions, we can now monitor digital activities in real time -- browsing behavior, click-through rates, hesitation, scrolling and more. This enables forward-thinking retailers to abandon behavioral models based on past actions in favor of tracking, analyzing and responding to current behavior. Now, they can quickly identify each shopper’s psychological needs and more effectively assist them in the decision-making process.
With machine learning, it is possible to develop models that can interpret and classify the mindset of each customer coming to the site. Leveraging data gathered in real time, per shopper session, such algorithms could integrate actions, attributes and contexts to generate a real-time classification of an individual visitor’s intent. Then, based on that knowledge, brands can automatically adapt their offerings. And here’s where chatbots come into the picture. Because if we can quantify mindset and respond to it with page personalization or offer customization, we can teach chatbots to do the same.
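To make the idea concrete, here is a minimal sketch of what such a per-session mindset classifier might look like. Everything in it -- the signal names, the thresholds, the intent labels -- is an illustrative assumption, not any vendor's actual model; a production system would replace the hand-written rules with a model trained on labeled sessions.

```python
from dataclasses import dataclass


@dataclass
class SessionSignals:
    """Hypothetical behavioral signals captured during one shopper session."""
    seconds_on_page: float
    pages_viewed: int
    scroll_speed: float      # average pixels/second; a proxy for skimming
    hesitation_events: int   # e.g., cursor hovers over a button with no click


def classify_mindset(s: SessionSignals) -> str:
    """Map raw session signals to a coarse intent label.

    A rule-based stand-in for a trained classifier, just to show
    the shape of the pipeline: signals in, intent label out.
    """
    if s.hesitation_events >= 3 and s.pages_viewed > 5:
        return "browsing"        # exploring many options; offer to narrow choices
    if s.scroll_speed > 2000:
        return "scanning"        # skimming quickly; keep messaging brief
    if s.seconds_on_page > 60 and s.hesitation_events == 0:
        return "ready_to_buy"    # focused and decisive; streamline checkout
    return "undecided"


# A chatbot could branch its conversation script on the returned label:
print(classify_mindset(SessionSignals(90.0, 2, 300.0, 0)))   # → ready_to_buy
print(classify_mindset(SessionSignals(30.0, 8, 500.0, 4)))   # → browsing
```

The point is the architecture, not the rules: once intent is quantified as a label per session, the same signal that drives page personalization can drive which conversational script the chatbot runs.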
Thus, in my example above, the chatbot that responded to me would not only grasp that I’m shopping for a funeral arrangement and offer condolences. Rather, it would parse my on-site actions -- what page I landed on, where I clicked or moved my mouse, how quickly I scrolled through which pages, how exactly I interacted with the site navigation -- and infer my mindset. It could know, in a split second, that I was just checking the options on the site and respond by offering to help narrow down the wealth of choices. It could tell if I was focused and ready to buy, and guide me as quickly as possible through the process. And it could tell whether I was open to suggestions for wreaths vs. traditional floral arrangements, and suggest popular options.
In short, it could do exactly what my friend the salesperson does with his customers. It could grasp and react to my mindset -- gaining, in a functional sense, the ability to empathize. And this is likely why my friend was not at all pleased to hear about all this.
“Perhaps,” he huffed, mildly affronted, “the next time we go out to dinner, you should let the chatbot pay?”