AI Gets Emotional: How Perceptive Analytics Could Revolutionize Marketing

Perspectives

By Russ Banham

Computers are warming to human feelings, as new image-, voice-, and text-recognition software tools effectively empathize with both customers’ and employees’ emotional states.

In a marketing context, the tools can decipher whether facial expressions like smiles, smirks, arched eyebrows, or frowns suggest emotions that can be served by a particular product or service. In a workplace context, the tools can decode written- and spoken-word patterns suggestive of frustration, disappointment, satisfaction, or happiness to improve employee productivity and work engagement.

This new world of analytics is called artificial emotional intelligence (AEI). It combines emotional intelligence (EI)—the ability to perceive and understand human emotions in a psychological context—with artificial intelligence (AI)—the use of machine learning algorithms to deduce a person’s emotional state through their phraseology, word repetitions, facial expressions, tone of voice, gestures, and even the force of keystrokes. By analyzing this data, the user can formulate and deliver an empathetic response.

In the context of marketing, AEI offers the ability to better understand customer motivations to buy a particular product. “By recognizing, tracking, analyzing, and measuring human emotions using AI, companies are able to access people’s truthful reactions to ads and other forms of promotion,” says Mihkel Jäätma, CEO of RealEyes, which uses webcams, computer vision, and AI to analyze a viewer’s facial expressions when watching video advertisements.

Getting It Right

Marketers have historically segmented consumers using demographic factors like age, gender, occupation, and socioeconomic status. By combining this information with web-based psychographic data—the inferred values, opinions, attitudes, and interests of people within a particular demographic segment—a more detailed picture of the “customer persona” is possible. AEI provides another arrow in the quiver, offering insights into how people feel in the moment.

RealEyes was spun out of Oxford University in 2007 to do just that. The provider of emotion technology solutions is focused on helping advertisers reach people with specific products or services they actually need, freeing the rest of humanity from a barrage of product and service promotions that have little to do with our lives. “Our banner is: We bring a trillion more smiles to the world every year,” Jäätma says, referring to the boredom and frustration viewers feel when bombarded with emails, social media ads, and television commercials that are so “hit or miss” they seem like wasteful exercises. That anger boils over into ad blocking.

“Using image recognition software, we’ve built a way to combine AI with EI to tell marketers if their messages align with different people’s emotional needs,” he says. “We’re now able to interpret millions of facial expressions to decode the particular emotions that may suggest actual interest in a product or service.”

As the old proverb goes, “The eyes are the window to the soul.” Facial expressions are “tells,” a gambling term for cues that reveal what a person is thinking in the moment. Different facial expressions like a half-smile or a furrowed brow betray how someone is feeling. Since the interpretation of facial expressions differs across cultures, AEI offers a way to align specific expressions with their respective human emotions in each culture. “The same facial expression or ‘frame,’ as we call it, might indicate a happy person in Brazil, but a confused person in China,” Jäätma explains.

To get to this level of accuracy, RealEyes conducts AEI tests in which different facial expressions are analyzed by hundreds of people in a particular culture to discern the feelings behind the expression, such as skepticism, boredom, attentiveness, or distractedness. Armed with this data, marketers have the opportunity to test an advertisement before broadcasting it.
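To make this concrete, here is a minimal sketch of how culture-specific labels for the same facial-expression “frame” might be aggregated by majority vote. The data, frame identifiers, and function names are hypothetical illustrations, not RealEyes’ actual pipeline.

```python
from collections import Counter

# Hypothetical annotations: raters in each culture label the same
# facial-expression frame with the emotion they perceive in it.
annotations = {
    ("frame_017", "Brazil"): ["happy", "happy", "amused", "happy"],
    ("frame_017", "China"): ["confused", "confused", "skeptical"],
}

def dominant_emotion(labels, min_agreement=0.5):
    """Return the majority label, or None if raters agree on it less than min_agreement."""
    label, count = Counter(labels).most_common(1)[0]
    return label if count / len(labels) >= min_agreement else None

for (frame, culture), labels in annotations.items():
    print(frame, culture, dominant_emotion(labels))
# frame_017 Brazil happy
# frame_017 China confused
```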

“Attention is a proxy for sales impact,” Jäätma says. “Consumer product companies are able to test their ads to figure out if they’ll work before putting them into the marketplace. The content can be altered to appeal to specific customer personas and segments, improving the overall marketing effort.”

He provided the example of a RealEyes customer, global confectionery manufacturer Mars, Inc. The company wanted to know how customers might respond to two different soundtracks used in a 30-second advertisement for its Twix candy bar. One soundtrack featured the song “Happy Together” by the American 60s rock band The Turtles, and the other featured the song “Nothing Compares 2 U” by Irish singer Sinéad O’Connor.

“The soundtrack choice was particularly important as there was a pause in the middle of the ad when the actor bites into the second part of the Twix and the words ‘Twix, one great bar after another’ appear,” Jäätma says. “We tracked the facial expressions of a test group of 600 viewers, using our proprietary algorithm to measure happiness.”

Which ad performed better? “‘Happy Together,’ which scored 66.6 percent higher, [was] unequivocally the more effective choice to drive purchase intent and brand favorability,” Jäätma says. “For example, at the moment when the actor bites into the second half of the Twix, the happiness emotion trace jumped up significantly with that song.”
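As a back-of-the-envelope illustration of that kind of comparison (the scores below are invented, and this is not RealEyes’ proprietary method), the relative lift between two ad variants could be computed from averaged per-viewer happiness scores at the key moment:

```python
import statistics

# Hypothetical per-viewer happiness scores (0 to 1) at the key moment of each ad cut.
happy_together = [0.62, 0.71, 0.55, 0.68, 0.74]
nothing_compares = [0.38, 0.42, 0.35, 0.44, 0.40]

# Relative lift of one soundtrack over the other on the averaged happiness trace.
lift = (statistics.mean(happy_together) / statistics.mean(nothing_compares) - 1) * 100
print(f"'Happy Together' scored {lift:.1f} percent higher on the happiness trace")
```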

Down the line, he anticipates that RealEyes’ facial recognition tools will be integrated with natural language processing (NLP) software, opening up new applications such as analyzing customer service interactions about a recent purchase on videoconferencing platforms like Zoom and WebEx.

“Facial and voice expressions can be analyzed with unique algorithms to produce a richer set of details on the customer’s emotions,” Jäätma says, explaining that the information can guide more empathetic responses to customers at what is often a time of frustration.

More Humane Work Environments

AEI is also being used to better understand how employees feel about their work and lives. UKG, a leading provider of workforce management and human capital management solutions, has merged EI with AI in a number of areas across its suite, including Employee Voice, an NLP tool that analyzes feedback in employee surveys to discern how respondents feel.

“Employee engagement surveys have typically asked mostly closed-ended, multiple-choice questions like, ‘How do you feel about your work on a scale of one to five?’ because HR departments have limited time and resources to analyze open-ended, free-text responses,” says Christa Degnan Manning, UKG senior director of AI strategy and product management.

She explained that when AI methods like NLP are used to analyze employees’ responses to open-ended questions, emotions and sentiments like frustration, disappointment, enthusiasm, or happiness—as well as the issues underlying them—become detectable. “Employers can really understand what is going on,” Manning says, “[in order to] address the issues that affect employee engagement and overall productivity.”
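A minimal sketch of this kind of analysis, using an off-the-shelf sentiment model from the open-source transformers library (illustrative only; the survey responses are invented, and this is not UKG’s Employee Voice implementation):

```python
from transformers import pipeline

# Hypothetical open-ended survey responses.
responses = [
    "I love the work itself, but constant check-ins make me feel micromanaged.",
    "Flexible hours have made balancing childcare so much easier this year.",
]

# A general-purpose sentiment classifier; a production system would use models
# tuned to workplace language and would also extract the themes behind each sentiment.
classifier = pipeline("sentiment-analysis")

for text, result in zip(responses, classifier(responses)):
    print(f"{result['label']:>8}  ({result['score']:.2f})  {text}")
```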

UKG also offers an AEI-based workforce feedback and coaching tool for performance development. By understanding employees’ concerns, motivations, and truthful feelings about their work and the work environment, leaders can assess whether their EI is in alignment with the organization’s reality.

“If this is not the case,” says Manning, “developmental actions can be taken to ensure better alignment. For example, if the feedback from the tool indicates a high prevalence of criticism about the lack of autonomy in teams, HR or team managers can make adjustments to stop the micromanagement.”

These tools were particularly timely in the past year, since many employees worked remotely from home without in-person supervision. “Managers for the most part lost the ability to pick up on physical cues,” Manning says. “Using NLP algorithms, managers can identify what we call ‘performance themes’ and ‘feedback intentions’ to get a better sense of how people are doing and how teams are performing. This becomes actionable information.”

She provided the example of employees having difficulty integrating remote work with other aspects of their lives at home. The AEI tool surfaces feedback that leaders otherwise might not receive. “Organizational leaders hear these issues in real time and at scale,” Manning says. “They can pivot to change policies like flex time and offer new programs, such as tutoring support or virtual after-school activities.”

Broader Applications

AEI is evident in other industry sectors like healthcare. For example, digital health company CompanionMx has developed a health-monitoring app that can decipher and alert users to potential mental health issues like anxiety and stress. Then there’s BioEssence, a wearable olfactory/cardio-respiratory device created by MIT’s Media Lab that captures chest vibrations indicating subtle physiological changes in heart and breathing rates. Depending on the analysis, the device releases one of three different scents to calm the wearer.

The Lab is also looking into AEI as a way for people on the autism spectrum to pick up on facial expressions and body-language cues when communicating with other people. Wearable devices using image recognition and NLP software can interpret these cues and guide the wearer to respond accordingly. In a similar vein, Not Impossible Labs is developing a wearable tool embedded with both NLP software and image recognition sensors to help people with brain impairments better understand the context of communications with others.

These varied developments suggest that the intersection of AI and EI “will allow for more human-centric decisions and a more humane environment overall,” Manning says.

Jäätma shares this opinion. “The vision we always had is that, by better understanding how people feel in the moment, companies have the opportunity to ‘get personal’ on a really human level.”
