08 February 2024

Specialized LLMs create a better future for AI in CX. - guest blog by Jonas Berggren

Everyone involved in designing CX processes knows about ChatGPT. This generative artificial intelligence system released at the end of 2022 by OpenAI has caught the public imagination so much that sometimes it feels as if AI has only been with us for a year.

ChatGPT was a breakthrough. It was the moment AI became appreciated by the general public rather than just computer scientists. Technical teams have been improving AI for decades, but nothing they have developed has become so popular so quickly. If you search Google today for ‘ChatGPT’, it returns over a billion results.

Many companies designing CX systems have started flying the ChatGPT flag. They are boasting about how ChatGPT can improve customer chatbots and how it can be used inside the contact centre to manage administrative tasks.

This is all true, but there is a problem. ChatGPT is built on a Large Language Model (LLM) that is absolutely enormous, trained on a vast body of text. GPT-4 is reported to have around one trillion parameters, compared to around 175 billion in GPT-3.5. Each parameter encodes an individual relationship between words, expressed as numbers the model learns during training.

To simplify this, it’s a bit like training a chatbot on all the knowledge in Wikipedia, every digital book available online, and every website. It’s vast.
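To give a rough sense of that scale, here is a back-of-envelope sketch of the memory needed just to hold the weights of models this size. It assumes roughly 2 bytes per parameter (16-bit weights), which is my assumption for illustration rather than a published figure.

```python
# Back-of-envelope sketch only: approximate memory needed to hold model weights,
# assuming ~2 bytes per parameter (16-bit weights). Real deployments differ.

def weight_memory_gb(num_parameters: float, bytes_per_param: float = 2.0) -> float:
    """Approximate storage for the weights alone, in gigabytes."""
    return num_parameters * bytes_per_param / 1e9

models = {
    "GPT-3.5 (~175 billion parameters)": 175e9,
    "GPT-4 (reported ~1 trillion parameters)": 1e12,
}

for name, params in models.items():
    print(f"{name}: roughly {weight_memory_gb(params):,.0f} GB of weights")
```

Under that assumption, the weights of the larger model alone run to roughly two terabytes, before you count any of the serving infrastructure around them, which is part of why the cost and environmental points below matter.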

This works well when you ask the chatbot general questions. If you ask it to summarize a document in the style of the Financial Times, then it knows what you mean. If you ask about the history of Thor and Odin before the Marvel movies, you will learn all about Norse mythology. If you ask it why London was built on the River Thames, then it will know.

But when your new TV does not connect to the internet, and you need advice on how to set it up correctly for your chosen internet provider, then it is highly likely that ChatGPT was never trained on this specific product-focused information.

So I believe we need to adopt a more Specialized LLM approach to training the AI systems we use to interact with customers. The key benefits of using one for customer service are privacy, security, speed, accuracy, cost savings, and an improved customer experience:

  1. Privacy and confidentiality: Specialized LLMs can be programmed to follow specific guidelines and regulations, ensuring that they provide safe and compliant customer service.
  2. Speed and accuracy: Specialized LLMs are big enough to impact your bottom line, small enough to personalize the customer experience, and fast enough to keep up with the demands of your business.
  3. Knowledge: A general LLM has a very broad knowledge base that can also go into depth on many subjects, but it is unlikely to include the specific products or services a company wants to support using AI. I can ask ChatGPT about the Kings of England, but not about the settings on my Samsung TV specific to my internet provider.
  4. Cost: Although individual users can access services like ChatGPT for free, any company that wants to build a customer service solution on this technology will need to pay for all that infrastructure sitting on the Microsoft Azure Cloud. That huge footprint carries a huge cost in dollars as well as an environmental one.
  5. Environment: The environmental cost of the enormous data centers required to run LLMs is huge, in terms of the energy they consume, the water needed to cool them, and their broader carbon footprint. LLMs should only be used where there is a genuine need for deep and broad knowledge across a wide range of subjects.

So, the real problem is that we have created some fantastic tools, but using an LLM for a customer support chatbot is a bit like using a sledgehammer to crack open a walnut. We have over-engineered the solution: it will be incredibly expensive, and it will not even work as well as a more focused one.

So, if you are thinking about deploying a customer service solution that uses AI, then think carefully about a few questions, such as:

  • What is my use case? What kind of questions will customers want to ask the system?
  • What is the very specific type of knowledge, such as product or domain expertise, that it needs to understand and that is probably not found in an off-the-shelf LLM?
  • How can I tweak and adjust the data once it goes live? If I notice gaps in knowledge, can I quickly add additional training and knowledge? (See the sketch after this list.)
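On that last question, here is a purely illustrative sketch of how a specialized, retrieval-grounded setup can let you add product knowledge while the bot is live, with no retraining. Everything in it is hypothetical: the ProductKnowledgeBase class, the keyword scoring, and the example articles are placeholders, not any real product or vendor API.

```python
# Hypothetical sketch: a small, product-specific knowledge base that a customer
# service bot could draw answers from, and that support staff can extend live
# whenever they spot a gap. Not a real vendor API.

class ProductKnowledgeBase:
    def __init__(self) -> None:
        self.articles: list[tuple[str, str]] = []  # (title, text) pairs

    def add_article(self, title: str, text: str) -> None:
        """Add new product knowledge at any time, with no model retraining."""
        self.articles.append((title, text))

    def best_match(self, question: str) -> tuple[str, str] | None:
        """Naive keyword-overlap scoring; a real system would use embeddings."""
        q_words = set(question.lower().split())
        scored = [
            (len(q_words & set(f"{title} {text}".lower().split())), title, text)
            for title, text in self.articles
        ]
        score, title, text = max(scored, default=(0, "", ""))
        return (title, text) if score > 0 else None


kb = ProductKnowledgeBase()
kb.add_article(
    "Connecting your TV to FictionalISP broadband",
    "Open Settings > Network, choose Wi-Fi, pick the FictionalISP router and enter the key printed on its base.",
)

# A support agent notices a gap in the answers and fills it while the bot is live.
kb.add_article(
    "TV cannot find the Wi-Fi network",
    "Restart the router, keep the TV within range, and check that the router broadcasts on 2.4 GHz.",
)

hit = kb.best_match("My new TV cannot connect to the internet, what should I do?")
if hit:
    print(f"Grounding the bot's answer in: {hit[0]}")
    print(hit[1])
```

The design point is simply that the product knowledge lives outside the model, so closing a gap means adding a document, not retraining a trillion-parameter system.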

When you think about your AI solution like this, it becomes much more obvious that a Specialized LLM will be preferable. It will be cheaper, it will have less of an impact on the environment, and it will have detailed knowledge about your industry domain and products. After all, customers are probably going to be asking this AI about how to get help with your products - not just having a general conversation about world history.

LLMs are generally too broad and too expensive to consider for customer service solutions. Don’t fall into the trap of believing the media hype that ChatGPT can answer any question. If you are tempted to believe it is that powerful, try asking it some detailed support questions about your products today.

Then, ask a CX specialist how much better a Specialized LLM would work.

Written by Jonas Berggren, Head of Business Development NE

Jonas Berggren joined Transcom in 2020 as Head of Business Development Northern Europe. Prior to this, Jonas was the co-founder and a partner of Feedback Lab by Differ. Earlier in his career, Jonas held the position of CEO at Teleperformance Nordic.
