With all the change going on in CX, one piece of infrastructure remains – the call center.
For sure, bots driven by AI are encroaching on the traditional call center. But for the moment, the human call center will remain core to contact between brands and their customers. Self-service will soak up a percentage of the interactions, but it will not account for all of them.
However, training bots and automating call centers will not remove a lingering concern: whether the call centers themselves are compliant.
“Buying my product will solve your problem.”
Given that AI bots can be trained and rules integrated into their behaviors, one would expect proper compliance.
But the issue is that only a tiny percentage of calls are monitored, and that can lead to sub-standard service.
As the recent Royal Commission into the Australian banks heard, it can also result in some highly unethical and non-compliant behavior as desperate call center staff hunt for sales. And bad experiences can destroy a brand, making all its statements about customer commitment meaningless.
One particular recording of a sales agent selling life insurance to an intellectually-disabled customer had the Royal Commissioner, and Australian regulators, aghast.
“Don’t worry, I am not trying to sell but offering advice.”
According to Richard Kimber, the founder of Australian AI startup Daisee, a typical Australian big bank would be dealing with over 10 million phone calls each year. Only 1% of these would be monitored for “quality and training purposes.”
“Business’s problem is that random sampling of phone calls doesn’t work from a quality management perspective because you simply can’t sample enough calls to work out which ones are effective and which ones aren’t,” said Kimber.
“That is the first element of it. The second is that there are numerous regulations that companies have to abide by. They are quite complex [with] numerous conduct and compliance rules which need to be met. Our AI software is one of the very few ways you can ensure that the company is compliant and looking after its customers.”
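Kimber's point about sampling can be made concrete with rough arithmetic. The figures below are illustrative assumptions (the article gives only the 10 million calls and the 1% audit rate; the non-compliance rate is invented for the example): if bad calls are rare, a 1% random audit is expected to surface only 1% of them.

```python
# Illustrative arithmetic, not Daisee's data. The 10M calls and 1% audit
# rate come from the article; the non-compliance rate is an assumption.
TOTAL_CALLS = 10_000_000
SAMPLE_RATE = 0.01   # 1% monitored "for quality and training purposes"
BAD_RATE = 0.001     # assume 1 in 1,000 calls is non-compliant

sampled = TOTAL_CALLS * SAMPLE_RATE      # calls a human reviewer hears
bad_calls = TOTAL_CALLS * BAD_RATE       # non-compliant calls overall
bad_caught = bad_calls * SAMPLE_RATE     # expected bad calls in the sample

print(f"Sampled calls:            {sampled:,.0f}")     # 100,000
print(f"Non-compliant calls:      {bad_calls:,.0f}")   # 10,000
print(f"Expected to be sampled:   {bad_caught:,.0f}")  # 100
```

Under these assumptions, roughly 9,900 of the 10,000 problem calls are never heard by a reviewer, which is the core of the quality-management argument.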
“This recording is monitored by deep AI.”
Daisee stands for “Deep Artificial Intelligence Software for Enterprise Ecosystems.”
The product began at Deakin University in Melbourne and is currently being commercialized and developed by Kimber – a former managing director of Google in Australia – and a team of engineers and data scientists.
Daisee is an overarching software layer that listens to all conversations and, using around 200 different sensors, gives each one a quality score derived from machine learning.
From there, the company can identify high-risk calls that need intervention or rapid remediation.
Currently, companies wait for complaints, which are routed to a complaints department and escalated from there.
“All of these are bad outcomes for the corporate, bad for the brand and bad for everything,” said Kimber.
“And of course there’s a huge amount of wastage in that whole process because all of that stuff takes a lot of time and money and delivers poor customer experience.”
“Bleep! You are listening to a compliant tone of voice.”
Daisee is a compelling case because the product crosses both compliance and customer experience. It shows how inextricably linked the two areas are.
The Daisee AI also monitors the quality of the interaction and the tone of the agent, so the quality of the call itself feeds into the assessment.
It is another example of how AI is being developed to work alongside humans rather than just replacing them outright.
“Your call is audited by humans and bots.”
The call centers of the future will have AI embedded within them, not only to drive better performance from the legions of bots but also to monitor the humans and help improve service delivery.
So, while customers might still be talking to a human, AI will be standing behind them. Given the problems that arise from uneven performance in call centers, that is not a dystopian vision for the customer, but one of reassurance.