Service bots get our lingo
The other day I decided to check out an airline's chatbot to see how it works. I keyed in: "I'm travelling with my 8-year-old son, recommend the best flight without disturbing his sleep". It asked for my source and destination, which I gave as Chennai and Hyderabad. It asked for the date. I typed 'Mon' (for Monday), but it did not recognise what I meant.
When I typed 'tom', it correctly recognised it as tomorrow.
The bot responded that there were no flights that "fully meet" my specified criteria and provided a bunch of options. I then asked for the cheapest flight, and then for non-stop flights. The options it provided were correct – I confirmed that on MakeMyTrip and Google Flights. It automatically filled the passenger list as 1 adult and 1 child for booking. It even clarified the availability of gluten-free food for my fictional son.
Customer support bots have come a long way from the days of clunky tools that only answered FAQs. These sophisticated generative AI-based assistants are much better than their predecessors, though still far from perfect. Sometimes they ask redundant questions even after being given the information explicitly. They struggle to recognise colloquial words and, unlike human agents, have woefully poor memory. But at least you don't have to listen to hold music for 15 minutes.
The leap comes from GenAI's better understanding of conversations, its ability to handle multilingual queries, and its support for interfaces like voice and video. As these systems improve, customer support will become a lot quicker and easier for users, and hugely cheaper for companies, since they won't have to hire thousands of contact centre staff.
Moving to complex functions
Ramprakash Ramamoorthy, director of AI Research at Zoho Corp, says the next goal with GenAI is to make customer assistant bots perform more complex and autonomous functions. They could hand over an issue to a human agent when they figure they can't handle it. They may even be able to process warranty claims, for instance. "As soon as you reach out, the system will be able to cross-check the database for similar issues/ queries where warranty claims are applicable. A human agent will have to call a supervisor to do the same job," he says.
But these are still some time away. Even for basic customer support services, companies are being careful due to concerns around data privacy and hallucination. Nobody, after all, wants to get into a situation like Air Canada did earlier this year, when its chatbot offered a customer a discount it wasn't supposed to.
Chaitanya Chokkareddy, cofounder of Ozonetel, the Hyderabad-based customer experience tech provider, says the focus currently is on automating the most common, repetitive queries. Chatbots, he says, fail on less common tasks.
He says a lot of work is happening with AI in the back-end – data analysis, summarisation, intent analysis and quality control – focusing on the efficiency and productivity of human customer agents and improving companies' margins. "We handle about 20 million calls per day which last about 3 to 5 minutes each, and a lot of that data is currently not accessible to managers to make better decisions on their products and services. Demand for back-end services is higher than customer-facing applications," he says.
Indic language capability is expensive, for now
While AI assistants perform relatively well in English, their performance in Indic languages is not great. Ozonetel is working with a non-profit and 1 lakh student interns to build a Telugu database for tokenisation. But Indian languages, Chaitanya says, cost around 5x to 10x more than English because their scripts tokenise less efficiently than the Roman alphabet. "AI models are charged based on usage, and chatbots using Indic scripts have more tokens and hence higher computational costs. In many cases, the costs are equivalent to a human being and so our customers don't want to automate," he says.
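The token-count gap is easy to see at the byte level: most subword tokenisers start from UTF-8 bytes, and a Telugu character takes 3 bytes where a Roman letter takes 1. The sketch below (byte counts are only a rough proxy for token counts, and the Telugu translation is approximate) illustrates why the same question costs more to process in an Indic script:

```python
# Rough illustration of why Indic-script text is costlier for
# byte-level tokenisers: Telugu characters need 3 UTF-8 bytes each,
# Roman letters need 1. Real tokenisers merge bytes into subwords,
# but the gap persists when the vocabulary was trained mostly on
# English text.

def utf8_bytes(text: str) -> int:
    """Number of UTF-8 bytes, the raw unit a byte-level BPE starts from."""
    return len(text.encode("utf-8"))

english = "What is my order status?"
telugu = "నా ఆర్డర్ స్థితి ఏమిటి?"  # approximate Telugu rendering

print(utf8_bytes(english))  # 24 bytes -> 1 byte per character
print(utf8_bytes(telugu))   # far more bytes for a shorter-looking string
```

Since API pricing scales with token usage, that multiple feeds directly into the 5x–10x cost difference Chaitanya describes.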
Ankush Sabharwal's CoRover.ai also has a local language model called BharatGPT. The bot can respond to voice queries in Indian languages and provide tailored information. It also offers video chatbots with pre-generated videos, text-to-video conversion and lip sync, which can be used for purposes like internal training.
Ankush says redundant workloads should be automated first, and as adoption increases, clearer use cases will emerge. He says problems like hallucination can be reduced with retrieval augmented generation (RAG) and the use of domain-specific small models.
Abhiroop Medhekar, founder & CEO of financing platform Velocity, says they have built text-to-speech and speech-to-text tech for AI-agent voice call automation. They initially used it internally and then started offering it commercially. "Some BFSI applications require an immediate response from customers, but customers frequently don't respond to emails or WhatsApp reminders. The current robotic IVR calls are not effective either. But these GenAI-based voice calls are interactive and provide better results," he says.