2017-01-23

Artificial intelligence that reads and responds to our emotions is the killer app of the digital economy. It will make customers and employees happier—as long as it learns to respect our boundaries.

When psychologist Dr. Paul Ekman visited the Fore tribe in the highlands of Papua New Guinea in 1967, he probably didn’t imagine that his work would become the foundation for some of the latest developments in artificial intelligence (AI).

After studying the tribe, which was still living in the preliterate state it had been in since the Stone Age, Ekman believed he had found the blueprint for a set of universal human emotions and related expressions that crossed cultures and were present in all humans. A decade later he created the Facial Action Coding System, a comprehensive tool for objectively measuring facial movement. Ekman’s work has been used by the FBI and police departments to identify the seeds of violent behavior in nonverbal expressions of sentiment. He has also developed the online Atlas of Emotions at the behest of the Dalai Lama.

And today his research is being used to teach computer systems to read human feelings.

Facial expressions are just one set of data that’s fueling the rapid advancement of a subset of AI called “affective computing.” Researchers and developers are creating algorithms that try to determine the emotional state of the human on the other side of the machine based on input such as gestures, facial expressions, text, and tone of voice.

More importantly, they’re using machine-learning techniques to develop increasingly emotionally intelligent interfaces that can not only accurately detect a person’s mood but also respond to it appropriately. A number of startups have already amassed databases of millions of human facial reactions and libraries of written communication and are actively hunting for patterns to predict human emotion—and resulting behavior—on a large scale.
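
To make that pattern hunting concrete, here is a minimal sketch of the supervised approach, assuming a toy labeled dataset; real systems train on millions of examples, often on facial data rather than text:

```python
# Minimal sketch: learning to map raw input to emotion labels.
# Real systems train on millions of examples; this toy dataset is invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "I love this, thank you so much!",
    "This is wonderful news",
    "I am furious, this is the third time it has failed",
    "Absolutely unacceptable service",
    "I'm not sure what to do next",
    "Could you explain how this works?",
]
labels = ["happy", "happy", "angry", "angry", "confused", "confused"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

print(model.predict(["why does this keep failing, it is unacceptable"]))  # likely ['angry']
```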

Just as once-novel voice recognition technology is now a ubiquitous part of human–machine relationships, so too could this kind of mood recognition technology soon pervade digital interactions—and help businesses peer into our inner feelings.

“Once you are able to analyze a person’s affective state, you can also respond to it and influence it,” says Stacy Marsella, a professor in Northeastern University’s College of Computer and Information Science with a joint appointment in psychology.

The customer experience is the most obvious sweet spot for affective computing capabilities. Forrester analyzed its customer experience data from 2014 and 2015 and found that emotion was the number-one factor in determining customer loyalty in 17 out of the 18 industries surveyed—far more important than the ease or effectiveness of their interactions with a company. Yet most businesses have focused more on the functional experience their customers have with them than on the emotional one, in large part because, until now, there has been no easy way to assess or address the latter.

But the potential benefits of affective computing go beyond building a better customer service bot. Low-cost, wearable sensors could enable companies to measure how environment and experiences affect employee mood. Organizations could use this knowledge to design more effective work settings and processes to increase productivity and employee satisfaction. Empathy could be built into enterprise software systems to improve the user experience by, for example, sensing when employees become frustrated with a task and offering feedback or suggestions for help.

Indeed, emotion is already big business and is expected to become much bigger. The global affective computing market is estimated to grow from just over US$9.3 billion a year in 2015 to more than $42.5 billion by 2020, according to market research firm Research and Markets. In fact, the firm predicts that affective computing will “revolutionize the way organizations, especially across the retail, healthcare, government and defense, and academia sectors, gather, organize, collaborate, and deliver information.”

Though we are already seeing some novel applications of affective computing in business, it will take time for the technology to reach its full potential. Businesses will have to address two potentially limiting factors in particular: the availability of data that provides accurate emotional cues, and the ethical and data privacy issues that will emerge as companies seek to gather this intimate information about customers and employees.

“These are still very early days for affective computing and social robotics,” says Richard Yonck, executive director for Intelligent Future Consulting and author of the forthcoming book Heart of the Machine: Our Future in a World of Artificial Emotional Intelligence. “But we will see progress that will lead to affective computing moving incrementally into our lives. For example, we’ll see digital personal assistants exhibit increasing emotional awareness in the next two to five years, followed by similar improvements in office and accounting software.”

Driven by Emotion

The goal of artificial intelligence is to make machines more like humans. And humans are driven as much by emotion as by intellect. Indeed, neuroscientific research has revealed that emotions are a crucial component of perception, decision making, learning, and more. Those discoveries led to the birth of affective computing research and development 20 years ago.

Since then, affective computing experts have sought not merely to mimic emotions but to build applications that can adjust to the changing moods of their human counterparts. “One of the key elements of affective computing is being able to make inferences about the emotional states of the person who is interacting with the system—are they frustrated or happy or annoyed?—and then tailoring the response of the system based on those inferences,” says Marsella.

Machines that can connect with humans bring benefits that go far beyond computer-generated compassion. A 2005 study conducted by Stanford University’s Department of Communication in conjunction with Toyota’s Information Technology Center found that matching a car’s voice to the driver’s emotion had a significant impact on driver performance. Happy drivers guided by an enthusiastic voice and upset drivers hearing a subdued voice both had fewer accidents than happy or upset drivers who were listening to a car voice that didn’t match their mood.

Just as artificial intelligence does not exactly replicate human intelligence in a system or device, affective computing systems are not emotional in the same ways that humans are. Rather, through the application of machine learning, Big Data inputs, image recognition, and in some cases robotics, artificially intelligent systems hunt for affective clues, just as a human might capture and correlate any number of sensory inputs: widened eyes, quickened speech, crossed arms, and so on. Additionally, some researchers are looking at physiological signs of mood, such as heart rate or skin changes, that could be monitored through wearable devices.
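
As a rough illustration of how such clues might be combined, here is a minimal sketch that fuses a few normalized cues into a single agitation estimate. The cue names, weights, and thresholds are invented for illustration and not taken from any published system:

```python
# Minimal sketch: fusing weak affective cues into a single arousal estimate.
# Cue names, weights, and thresholds are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Cues:
    eye_openness: float       # 0..1, from an image-recognition model
    speech_rate_ratio: float  # words/sec relative to the speaker's baseline
    arms_crossed: bool        # from pose estimation
    heart_rate_delta: float   # beats/min above the wearer's resting rate

def arousal_score(c: Cues) -> float:
    """Weighted blend of normalized cues; higher means more agitated."""
    score = 0.0
    score += 0.3 * c.eye_openness
    score += 0.3 * max(0.0, c.speech_rate_ratio - 1.0)  # only faster-than-usual counts
    score += 0.2 * (1.0 if c.arms_crossed else 0.0)
    score += 0.2 * min(1.0, c.heart_rate_delta / 30.0)  # cap at +30 bpm
    return score

cues = Cues(eye_openness=0.9, speech_rate_ratio=1.4, arms_crossed=True, heart_rate_delta=18)
print(f"arousal: {arousal_score(cues):.2f}")  # ~0.71 on this made-up input
```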

Indeed, when it comes to data on human emotions, an incredible amount is already available today, says Marsella. “We’ve gotten really good at collecting and analyzing these large amounts of data and, using machine-learning techniques, mapping things like facial or vocal expressions to inferences about the underlying mental state of the person. A lot of researchers have taken a data-driven approach to this and it’s starting to pay off to some extent,” he says.

The Emotion Economy

Many businesses are already putting affective computing to use in customer-facing processes and functions. After all, in an era in which customer experience is the competitive differentiator, empathy may be the killer app for digital business. “Improving the customer experience is among the primary benefits of these technologies,” says Yonck. “Ultimately, tying emotional awareness into the customer experience is going to enhance brand loyalty and improve customer relations.”

Today, companies such as the BBC, Coca-Cola, and Disney are already using the simplest form of affective computing systems—emotion analytics—to assess how consumers react to advertisements, film trailers, and TV shows. They can measure, frame by frame, how a person responds to the content in order to optimize it or determine how best to allocate media spend, for example. Ad agencies are using such analytics to measure response and correlate that to key performance indicators like brand name recall or purchase intent.
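
That frame-by-frame measurement ultimately boils down to aggregating per-frame scores. A minimal sketch, assuming an upstream model has already scored each frame for viewer valence:

```python
# Minimal sketch: finding the weakest stretch of a trailer from per-frame
# valence scores. Assumes an upstream model already produced the scores.
def weakest_window(valence_per_frame: list[float], window: int = 24) -> int:
    """Return the start frame of the lowest-scoring window (e.g. 1s at 24 fps)."""
    best_start, best_mean = 0, float("inf")
    for start in range(len(valence_per_frame) - window + 1):
        mean = sum(valence_per_frame[start:start + window]) / window
        if mean < best_mean:
            best_start, best_mean = start, mean
    return best_start

scores = [0.6, 0.7, 0.5, 0.2, 0.1, 0.15, 0.4, 0.6, 0.8] * 10  # made-up data
print(f"re-cut around frame {weakest_window(scores, window=8)}")
```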

But the big opportunity for businesses lies in enabling their customer-facing applications to respond in real time to how an individual feels. The most obvious and easiest-to-implement scenarios are chatbots and other digital or digitally enhanced customer service and support interactions. Text-sentiment analysis is already standard in most digital assistants. The goal is to improve and expand emotion recognition capabilities so that customers receive the most appropriate response at any point in time, delivering a more meaningful, personalized, or authentic experience.
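
At its simplest, text-sentiment analysis can be as crude as a word-list tally. The sketch below is a toy stand-in for the far richer models real assistants use; the word lists are invented:

```python
# Minimal sketch: lexicon-based sentiment scoring of a chat message.
# The word lists are illustrative stand-ins for a real sentiment model.
POSITIVE = {"great", "thanks", "love", "perfect", "helpful"}
NEGATIVE = {"broken", "useless", "angry", "terrible", "waste"}

def sentiment(message: str) -> float:
    """Crude score in [-1, 1]: positive minus negative hits per word."""
    words = message.lower().split()
    if not words:
        return 0.0
    hits = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return max(-1.0, min(1.0, hits / len(words)))

print(sentiment("this is useless and i am angry"))  # negative
print(sentiment("great help, thanks so much"))      # positive
```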

Emotionally intelligent systems could deal with a range of frequent human reactions in an automated way, or they could monitor interactions between customers and human agents for emotional cues and then prompt certain responses, elevate calls, or alert supervisors to provide help.

“Scripts could branch according to whether the customer was seeking a genuine solution or simply was someone who likes to rant. As a human operator, the way you treat the two is very different, and this should be the case with a bot as well,” says Yonck.
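
Yonck’s branching idea is straightforward to express in code. A minimal sketch, assuming an upstream sentiment score and using a question mark as a hypothetical proxy for solution-seeking:

```python
# Minimal sketch of branching a support script on detected intent.
# Assumption: a question mark in a negative message signals solution-seeking;
# the sentiment score would come from an upstream model.
def route_reply(message: str, sentiment_score: float) -> str:
    if sentiment_score >= 0.0:
        return "Glad to help! Here is the next step..."
    if "?" in message:
        # Negative but asking questions: likely wants an actual fix.
        return "Sorry about the trouble. Let's get this solved: first, try..."
    # Negative with no questions: acknowledge, let them vent, then redirect.
    return "I hear you; that sounds frustrating. I can help whenever you're ready."

print(route_reply("why does this keep failing?", sentiment_score=-0.4))
print(route_reply("i am done with this garbage", sentiment_score=-0.6))
```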

Insurer Humana uses AI software that can detect conversational cues to guide call-center workers through difficult customer calls. The system recognizes that a steady rise in the pitch of a customer’s voice, or instances of an agent and customer talking over one another, are causes for concern. The system also grades each customer call experience from 1 to 10 based on the number and types of alerts it recognizes.
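
The article does not describe how Humana’s system is implemented, but the two cues it names are easy to sketch: a least-squares slope over recent pitch samples can flag a steady rise, and alert counts can roll up into a 1-to-10 grade. The thresholds below are invented for illustration:

```python
# Hypothetical sketch of the two cues described above; Humana's actual
# system is proprietary, and these thresholds are invented.
def pitch_slope(pitches_hz: list[float]) -> float:
    """Least-squares slope of pitch over sample index (Hz per sample)."""
    n = len(pitches_hz)
    mean_x, mean_y = (n - 1) / 2, sum(pitches_hz) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(pitches_hz))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den

def grade_call(pitch_samples: list[float], overlap_events: int) -> int:
    """Roll alerts up into a 1-10 grade; fewer alerts means a better call."""
    alerts = overlap_events
    if pitch_slope(pitch_samples) > 2.0:  # steady rise: >2 Hz per sample
        alerts += 1
    return max(1, 10 - 2 * alerts)

samples = [180, 185, 192, 198, 207, 215]  # customer pitch steadily rising
print(grade_call(samples, overlap_events=2))  # -> 4 with these made-up numbers
```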

The company’s director of consumer experience told The Wall Street Journal that empathy is a competitive advantage in the health insurance industry, whose customers may be calling about such emotionally charged issues as procedures related to their well-being. And there are hard results: Humana says it has seen a 28% improvement in customer satisfaction, a 63% improvement in agent engagement, and a 6% improvement in first-contact resolution.

Affective computing could also be used to induce desired emotions in customers. Researchers at Carnegie Mellon University (CMU) have been working on ways for digital agents to recognize and respond to subtle cues in conversation to build more rapport with their users. They have developed the Socially-Aware Robot Assistant (SARA) with the goal of creating a chatbot that’s more effective at conveying task-related information.

Designed to collaborate with human users, SARA uses several microphones and video cameras to track a person’s nonverbal communications. “Sara is capable of detecting social behaviors in conversation, reasoning about how to respond to the intentions behind those particular behaviors, and generating appropriate social responses,” according to the CMU lab where SARA was built. The hardware SARA requires limits its use in the wild today, but it suggests the advances that could be achieved with the right inputs.

Emotional Awareness in the Organization

More sensitive chatbots and customer service lines are just a start, however. Observers anticipate the rapid development of an ecosystem of hardware, software, and services that build artificial emotional awareness into other aspects of the digital organization.

One recruiting technology company is using affective computing to record and analyze facial expressions and word choice during job applicant interviews. It can then provide its Fortune 500 clients with an additional data point to measure candidates’ levels of engagement, motivation, and empathy. The deep-learning emotion-analytics engine it uses is built on a database of 4 million faces and 75 billion micro-expression data points. It analyzes video, audio, text, and natural language, using emotion-sensing analytics to assess basic human emotions with accuracy rates in the 90% range.

The recruiting company says the system helps its clients rank the best candidates, identify overlooked high-potential candidates, and assess softer traits such as personality, motivation, and ambition. The software can also be used to measure and improve the performance of interviewers and hiring managers, making them more effective and reducing bias.

Employers could monitor employee moods to make organizational adjustments that increase productivity, effectiveness, or satisfaction. For example, Bank of America used sensors to track call-center workers over the course of several weeks and found that those in the most cohesive networks were the most productive. A sense of belonging and collaborative problem solving can spark a positive mood, which has a direct impact on retention and the bottom line. Happy employees are about 12% more productive, according to a study by economists at the University of Warwick.

Affective computing tools are also being developed to help those on the autism spectrum better understand and interact with the socio-emotional world around them, alerting them to other people’s emotions and prompting them to react in appropriate ways. Meanwhile, researchers at North Carolina State University have developed a teaching program that tracks the emotions of students engaged in interactive online learning in order to predict the effectiveness of online tutoring sessions.

The Perils of Collecting Feelings

Whether customers and employees will be comfortable having their emotions logged and broadcast by companies is an open question. Customers may find some uses of affective computing creepy or, worse, predatory. Today, when you buy a car or a house, you interact with another person. Negotiations take place human to human. But what happens when one of the human negotiators has an emotionally aware assistant in their corner?

“Once you begin interacting with a system that can read your subtlest emotional response more rapidly and accurately than any person ever could and then shift its script and strategy instantly, well, we’re probably not going to get as good a deal on that car as we would’ve in the good old days,” says Yonck.

Affective computing vendors are quick to insist that their systems are all opt-in. And other proponents of emotionally aware systems point out that affective state is just another data point among hundreds being collected on individuals by companies around the world.

The biggest limiting factor from a tactical point of view is the availability of the data required to infer a person’s emotional state. In many cases, the signals that would provide the best emotional cues either aren’t collected, can’t be collected, or can’t be used for this purpose.

Indeed, that’s why chatbots and other text-based applications are at the forefront of affective computing development. Text interactions are relatively easy to collect, parse, and interpret, and customers are comfortable with what appears to be benign data collection and analysis to make their customer experience better.

Further, because some emotions are expressed through both vocal inflections and facial expressions, machines may need to learn to track both in real time. It can be difficult to capture the confluence of physical cues that may be relevant to an interaction, such as facial expression, tone of voice, and even posture. And while some affective computing experts believe, as Ekman posits, that there are universal cues to human emotion, the psychological community continues to debate that claim. We humans are, after all, fairly complicated when it comes to emotions. The connection between physical cues and affective state may differ by situation, individual, and culture.

Finally, affective computing systems work best under controlled conditions, says Yonck. Noise, poor lighting, and odd perspectives all remain challenging. “It’s easy for the lay public to read certain headlines and think we’re talking about machines that can actually internalize and experience emotion. We’re still a long way off from that in any true sense, and the ability to work well in the wild is still going to take time,” he says.

Assess the Value

Despite the limitations and ethical issues to be worked out, the emotional machines are coming. Companies that want to take advantage of these emerging capabilities as they digitally transform more aspects of their business can take a number of steps to assess the potential value:

Evaluate the business problem you want to solve. Companies will want to figure out where they might get the most value from affective computing capabilities. Functions that are already being transformed by cognitive computing or other emerging digital technologies may be the best areas to start.

Determine what affective data is available. There may be sources of data already being collected that could be mined effectively, or additional data sources that could be integrated inexpensively. Companies should determine what inferences about mental states they want the system to make and how accurately those inferences can be made using the inputs available. They should also consider what nonemotional data they will need in order to get value from an emotionally aware system.

Scope out the technical challenges. Involve IT and engineering groups to figure out the challenges of integration with existing systems for collection, assimilation, and analysis of large volumes of emotional data.

Consider the level of emotional complexity. A like or dislike may be relatively straightforward to determine or act on. Other emotions may be more difficult to discern or respond to. Context is also key: an emotionally aware machine would need to respond differently to user frustration in an educational setting than to user frustration in a vehicle or on a customer service line, as the sketch below illustrates.
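
A minimal sketch of that context dependence, with invented contexts and responses:

```python
# Minimal sketch: the same detected emotion triggers different behavior
# depending on context. Contexts and responses are illustrative assumptions.
RESPONSES = {
    ("frustrated", "education"): "Offer a hint and slow the lesson down.",
    ("frustrated", "vehicle"):   "Simplify the display; keep the driver focused.",
    ("frustrated", "support"):   "Escalate to a human agent with full context.",
}

def respond(emotion: str, context: str) -> str:
    return RESPONSES.get((emotion, context), "Continue normally.")

print(respond("frustrated", "vehicle"))
```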

The Transformative Power of Empathy

In the race to build the most effective digital organization, the ability to understand and respond to human emotion may actually be the key differentiator. Companies have been using technology to rework and improve their business processes and models for decades, but there has always been a disconnect between human and machine. Those enterprises that effectively integrate emotionally aware and responsive systems into their processes will be able to transform their organizations by creating more intuitive, customized, and empathetic customer interactions and boosting employee happiness, productivity, and retention.


Markus Noga is Vice President of Machine Learning Incubation at SAP.
Chandran Saravana is Senior Director of Advanced Analytics at SAP.
Stephanie Overby is a Boston-based business and technology journalist.

This story originally appeared on The Digitalist.
