Chatbots don't have to be cheerful

Customer service people are trained to be warm, reassuring and cheerful. Training chatbots to do the same, however, may be a waste of time.

A recent study from the Georgia Institute of Technology found that while people in customer service positions are expected to have a positive demeanor, a chatbot doing the same can either have no effect or actually make people's experience worse. Researchers reached these findings over the course of three experiments.

In the first, 155 participants were randomly assigned to one of four scenarios: a human agent with neutral emotion, a human agent with positive emotion, a bot with neutral emotion, or a bot with positive emotion. Participants were then asked about service quality and overall satisfaction. The results indicated that positive emotion improved evaluations when human agents exhibited it but had no effect when bots did.

In the second, 88 people were asked to imagine returning a textbook and were randomly assigned to either an emotion-positive or an emotion-neutral bot. After chatting with the bot, participants rated on a scale whether they were communally (socially) oriented or exchange (transaction) oriented. Communally oriented participants were more likely to appreciate the emotionally positive bot — the only case in which positive affect helped — while for participants who viewed the exchange as merely transactional, the emotionally positive bot made the experience worse.

The final study, involving 177 undergraduate students randomly assigned to emotive or non-emotive bots, explored why this effect occurs. The results showed that, because customers do not expect machines to have emotions, they can react negatively when a bot expresses them.

"It is commonly believed and repeatedly shown that human employees can express positive emotion to improve customers' service evaluations," said Han Zhang, one of the study's authors. "Our findings suggest that the likelihood of AI's expression of positive emotion to benefit or hurt service evaluations depends on the type of relationship that customers expect from the service agent."

Researchers say the studies show that using positive emotion in chatbots is challenging because businesses don't know a customer's biases and expectations going into the interaction. A happy chatbot could lead to an unhappy customer.
