Emerging Tech Adds Emotional Layer To Existing Measurement Framework

This article is part of CMO.com’s March/April series about emerging technology.

Emerging technologies lend themselves well to the age of customer experience, where brands are working hard to better understand the needs and preferences of their current and potential customers.

From voice and facial-recognition tech to artificial intelligence and biometrics, brands can, for the first time, measure not only engagement but also emotion: how a particular experience made someone feel. Indeed, emerging technologies add a new, exciting layer to existing measurement frameworks, according to Dave Dickson, who leads efforts in emerging technologies at Adobe (CMO.com’s parent company).

“With emerging technologies that measure emotional responses, marketers can better quantify consumer reactions,” Dickson said. “Combining that data with artificial intelligence holds particular promise in designing and delivering personalized experiences, tailored in real time to the current needs and desires of the customer.”
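To make that concrete, here is a minimal sketch of what an emotion-aware personalization rule could look like in practice. The emotion labels, variant names, and confidence threshold are illustrative assumptions, not any vendor’s production logic.

```python
# Illustrative sketch only: a toy policy that picks a content variant
# based on a detected emotional state. Emotion labels, variants, and
# the threshold are hypothetical, not any vendor's actual system.

VARIANTS = {
    "frustrated": "simplified_checkout",   # reduce friction when stress is high
    "delighted": "upsell_banner",          # lean in when engagement is positive
    "neutral": "default_experience",
}

def choose_variant(emotion: str, confidence: float, threshold: float = 0.6) -> str:
    """Fall back to the default experience when the emotion signal is weak."""
    if confidence < threshold:
        return VARIANTS["neutral"]
    return VARIANTS.get(emotion, VARIANTS["neutral"])

print(choose_variant("frustrated", 0.82))  # -> simplified_checkout
print(choose_variant("delighted", 0.41))   # -> default_experience (low confidence)
```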

New Metrics To Consider

Dickson cited facial recognition, biometrics, and voice technologies as ones to watch. Facial recognition and biometrics, for example, have the potential to measure how people feel as they experience a product or a marketing campaign. Voice could gauge tonal inflection to measure emotion around particular experiences.
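As an illustration of the voice idea, the sketch below pulls two tonal signals, pitch and loudness, from a recording using the open-source librosa library. The file name, thresholds, and the arousal heuristic are assumptions for the example, not a production model.

```python
# A minimal sketch of extracting tonal features (pitch and loudness)
# from a call recording with librosa. Thresholds are illustrative.
import librosa
import numpy as np

y, sr = librosa.load("customer_call.wav", sr=16000)  # hypothetical file

# Fundamental frequency (pitch) track; unvoiced frames come back as NaN.
f0, voiced_flag, voiced_prob = librosa.pyin(
    y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C7"), sr=sr
)

# Root-mean-square energy as a rough loudness proxy.
rms = librosa.feature.rms(y=y)[0]

pitch_var = np.nanstd(f0)        # high pitch variability often tracks arousal
mean_energy = float(rms.mean())

# Crude illustrative flag: raised pitch variability plus raised energy.
if pitch_var > 40.0 and mean_energy > 0.05:
    print("Possible heightened emotional arousal in this call")
```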

Forrester VP and principal analyst James McQuivey said he’s most excited about the potential of real-time emotion metrics and envisions a time when marketers expand from just click-stream data to also include “feel-stream data.”

“We’re going to see how people’s feelings move through our experience,” he told CMO.com. “It’s also going to show us how little we understand about feelings. Right now, what we want to know is which of the two offers we’ve served you worked best, and we can see in your face, for example, that you’re really happy. But what if you came into the experience with a certain facial configuration prior to even showing up in our app or in front of our camera or talking to our chatbot?”
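One plausible shape for that “feel-stream” idea is a time-based join: each click event gets annotated with the nearest-in-time emotion reading. The sketch below uses pandas; the column names and the two-second tolerance are assumptions for illustration.

```python
# Sketch: join "feel-stream" readings onto click-stream events by timestamp.
import pandas as pd

clicks = pd.DataFrame({
    "ts": pd.to_datetime(["2019-03-01 10:00:01", "2019-03-01 10:00:07"]),
    "event": ["view_offer_a", "view_offer_b"],
})
emotions = pd.DataFrame({
    "ts": pd.to_datetime(["2019-03-01 10:00:02", "2019-03-01 10:00:06"]),
    "emotion": ["neutral", "smile"],
    "score": [0.55, 0.91],
})

# Both frames must be sorted on the join key for merge_asof.
feelstream = pd.merge_asof(
    clicks.sort_values("ts"), emotions.sort_values("ts"),
    on="ts", direction="nearest", tolerance=pd.Timedelta("2s"),
)
print(feelstream)  # each click annotated with the closest emotion reading
```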

Dickson is also bullish on the potential of “affective computing” and believes it holds answers to delivering experiences that resonate with the emotional state of the consumer. Researchers in this area are already exploring facial-expression decoding, electro-dermal measurement, speech analysis (including tone, pitch, and rate), eye tracking, and gesture recognition, among other technologies.

“With these biometrics, brands can get a sense of the emotions—both expressed and latent—that their experiences provoke,” Dickson said. “Although no one metric alone holds the answer, these technologies offer insight into how customers are actually experiencing a brand’s product or service.”
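One illustrative way to act on that “no one metric alone” point is a weighted fusion of normalized signals. The weights and signal names below are assumptions for the sketch, not an established scoring model.

```python
# Sketch: combine per-modality emotion scores into one composite estimate.
# The 0.5/0.3/0.2 weights are arbitrary assumptions for illustration.
import numpy as np

def fuse_emotion_signals(facial: float, eda: float, voice: float,
                         weights=(0.5, 0.3, 0.2)) -> float:
    """Combine per-modality scores (each already scaled to 0..1)
    into a single composite arousal estimate."""
    signals = np.array([facial, eda, voice])
    w = np.array(weights)
    return float(signals @ w / w.sum())

# e.g., strong facial signal, moderate skin conductance, flat voice
print(round(fuse_emotion_signals(0.9, 0.6, 0.2), 2))  # -> 0.67
```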

Of course, the amount of data is only going to grow. And, according to Veritone president Ryan Steelberg, “It’s going to take AI, machine learning, and a lot of horsepower to ingest and decipher what all of this new, emotional data is telling us.”

Brands Measuring Emotion

Biometric technologies, specifically, help companies “fine-tune their storytelling so that it resonates at an individual level with the consumer,” Dickson said.

Lego, for example, used electro-dermal measurement to uncover psychological micro-reactions as customers used its products. Notably, the results revealed that when parents got involved in building Lego sets with their kids, the children’s stress levels dropped significantly, “potentially an opportunity for a parent-child collaborative building set,” Dickson pointed out. “Young users also felt stressed when they had to correct mistakes in building a specialized robotic Lego set, revealing the opportunity for a companion app to provide a series of checkpoints to check progress and accuracy to minimize errors.”
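The kind of analysis behind such electro-dermal studies can be sketched simply: look for stress-like spikes in a skin-conductance trace. The synthetic data, sampling rate, and thresholds below are illustrative only.

```python
# Sketch: detect stress-like spikes in a synthetic skin-conductance trace.
import numpy as np
from scipy.signal import find_peaks

fs = 4  # Hz; electro-dermal activity is commonly sampled at a few hertz
t = np.arange(0, 60, 1 / fs)

# Synthetic trace: slow baseline drift plus two short bursts.
eda = 2.0 + 0.01 * t
eda[80:90] += 0.8    # spike around t=20s (e.g., correcting a build mistake)
eda[180:188] += 0.6  # spike around t=45s

# Peaks that rise clearly above baseline, at least 5 seconds apart.
peaks, props = find_peaks(eda, prominence=0.3, distance=5 * fs)
for idx in peaks:
    print(f"Possible stress response near t={t[idx]:.1f}s "
          f"(conductance {eda[idx]:.2f} µS)")
```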

Video game makers also use biometric feedback regularly, according to Veritone’s Steelberg. They recruit players and have them wear cameras equipped with biometric technology to understand how they interact with the games and where the experience can be improved.

Other brands are combining quantitative data from biometric feedback with machine learning to deliver experiences that delight customers, Dickson pointed out. Disney, for example, used infrared cameras in a test theater to encode facial expressions for 3,179 audience members, measuring their smiles and other facial gestures. Using a neural network on the resulting 16 million data points, the movie studio assessed how the movie elicited an emotional response on a granular level and made changes accordingly.

“While movie studios have long tested films to gauge emotional response, what’s new here is not only the amount of quantitative data, but the machine learning that identifies trends the studio should examine and the ability to iterate on that data quickly to deliver a superior experience to customers,” Dickson said.
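This is not Disney’s actual pipeline, but the general pattern, mapping per-frame facial-expression features to an emotion label with a small neural network, can be sketched with scikit-learn. The synthetic features and the labeling rule are assumptions for the example.

```python
# Sketch: train a small neural network to map facial-expression features
# (synthetic "action unit" intensities) to an emotion label per frame.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Synthetic training data: 4 facial-feature intensities per audience frame.
n = 500
X = rng.random((n, 4))
# Toy labeling rule: strong cheek-raise plus lip-corner pull means "smile".
y = np.where(X[:, 0] + X[:, 1] > 1.1, "smile", "neutral")

model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
model.fit(X, y)

# Score a new frame's features; in practice these would be aggregated
# across the audience and across the running time of the film.
frame = [[0.9, 0.8, 0.1, 0.2]]
print(model.predict(frame))  # -> ['smile']
```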

In addition, Mars Inc. used facial-expression decoding to identify which creative elements in its ads resonated with users. Meanwhile, insurance provider Humana deployed speech analysis in its call centers to cue agents on how to respond to customers based on voice emotions, improving its Net Promoter Score in the process.
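The call-center cueing pattern can be sketched as a rolling check over per-utterance emotion scores produced by an upstream speech model. The window size, threshold, and cue text below are assumed values, not Humana’s actual system.

```python
# Sketch: prompt the agent when negative emotion persists across a call.
from collections import deque
from typing import Optional

class AgentCueing:
    def __init__(self, window: int = 5, threshold: float = 0.7):
        self.recent = deque(maxlen=window)  # last N negativity scores
        self.threshold = threshold

    def on_utterance(self, negativity: float) -> Optional[str]:
        """negativity: 0..1 score from an upstream speech-emotion model."""
        self.recent.append(negativity)
        if (len(self.recent) == self.recent.maxlen
                and sum(self.recent) / len(self.recent) > self.threshold):
            return "Cue: caller sounds frustrated; slow down and acknowledge."
        return None

cueing = AgentCueing()
for score in [0.4, 0.8, 0.9, 0.85, 0.9, 0.95]:
    cue = cueing.on_utterance(score)
    if cue:
        print(cue)
```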

What’s Next?

As these technologies become more widespread, brands will need to consider how they include emotional resonance in their product design and delivery, Adobe’s Dickson suggested. mPath, a startup from the MIT Media Lab that offers an electro-dermal sensor product, encourages emotional prototyping—or “emototyping”—as a new way to design experiences that drive emotional impact for users.

Through a process of emotion sensing, ethnography, and rapid prototyping, brands can design products that connect with customers, according to the company. Other companies, such as Affectiva, offer emotion SDKs and APIs for developers to start incorporating emotional responses in their mobile apps. And firms including Lightwave are able to visualize and make sense of emotion-based data for brands.
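What calling such an emotion service might look like from a developer’s seat is sketched below. The endpoint, parameters, and response shape are hypothetical placeholders, not Affectiva’s (or any other vendor’s) actual API.

```python
# Sketch: send a captured frame to a hypothetical emotion-analysis web API.
# The endpoint, auth header, and response shape are placeholders.
import requests

API_URL = "https://api.example.com/v1/analyze"  # placeholder endpoint

with open("frame.jpg", "rb") as f:  # e.g., a captured webcam frame
    resp = requests.post(
        API_URL,
        files={"image": f},
        headers={"Authorization": "Bearer YOUR_API_KEY"},  # placeholder
        timeout=10,
    )
resp.raise_for_status()

result = resp.json()  # assumed shape: {"emotions": {"joy": 0.84, ...}}
dominant = max(result["emotions"], key=result["emotions"].get)
print(f"Dominant emotion: {dominant}")
```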

“But as brands deploy affective computing in their strategies,” Dickson cautioned, “they need to develop robust privacy policies and practices around this biometric data.”

According to Forrester’s McQuivey, once we get through this first stage of rough emotional measures, the next step will be collecting per-customer emotional measures that capture states subtler than joy, anger, frustration, or fear. By applying deep learning, brands will be able to design and deliver emotionally resonant, individualized experiences more efficiently and cost-effectively, he said.

“We already have more data than we know how to deal with as human beings,” McQuivey added. “But AI is going to be a huge aspect of future measurement because only AI can reliably detect all of this complexity and subtlety, not only in your face and your voice, but how that then relates to the actions that you took.”

