CM.com's Natural Language Understanding consists of three recognition engines working together to provide the best results when matching your customers' questions to your conversational flows.
Language Recognition Engine
To offer a consistently conversational experience, it is crucial to understand the language your customers speak. Conversational AI Cloud provides you with two options to handle language:
Cultures
Language-specific content can be created through cultures, allowing you to define dedicated content for each language your project supports.
Language detection and automatic translations
Language detection and automatic translations enable multilingual support with shared content: the same conversation flow can serve English, French, and Arabic users, with content shared and translated across languages.
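As a rough illustration of the shared-flow idea, the sketch below detects a language and translates the utterance into a base language before any further processing, so one flow serves every language. Everything here (the detector, the translation table, the function names) is invented for the example and is not the Conversational AI Cloud API.

```python
# Hypothetical multilingual routing: detect the language, translate the
# utterance into the project's base language, then run the shared flow.
BASE_LANGUAGE = "en"

# Tiny keyword hints standing in for a real language-detection model.
STOPWORD_HINTS = {
    "en": {"the", "is", "my", "what", "insurance"},
    "fr": {"le", "est", "mon", "quoi", "assurance"},
}

# Tiny lookup table standing in for real machine translation.
TRANSLATIONS_TO_BASE = {
    "mon assurance auto": "my car insurance",
}

def detect_language(utterance: str) -> str:
    """Naive keyword-overlap detection, standing in for a real detector."""
    words = set(utterance.lower().split())
    return max(STOPWORD_HINTS, key=lambda lang: len(words & STOPWORD_HINTS[lang]))

def to_base_language(utterance: str) -> str:
    """Normalize every utterance to the base language so one flow handles all."""
    if detect_language(utterance) == BASE_LANGUAGE:
        return utterance
    return TRANSLATIONS_TO_BASE.get(utterance.lower(), utterance)
```

In a real project the detector and translator would be proper services; the point is only that downstream matching sees a single language.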
Learn more about language recognition.
Machine Learning Intent Classification Engine
Intent classification uses a custom-trained machine learning algorithm to match a user's utterance (read: question) to an intent that's custom-defined for each project environment. Intent classification is perfect for narrowing down your customer's goal: if a customer says “I’d like more information on my car insurance”, the intent is a request for information about the entity car insurance.
Intent classification is the first step in Conversational AI Cloud’s NLU engine. After classifying your customer's utterance, Conversational AI Cloud can identify the best response to their question from one or more articles. We then use our entity-driven, rule-based engine to select the best article for the context of the interaction and conversation.
If the intent classification engine is unable to confidently match an intent to your customer's utterance, Conversational AI Cloud defers to the entity-driven, rule-based recognition engine for further matching.
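The confidence-based deferral described above can be sketched as follows. The classifier, the threshold value, and the rule-based stub are all assumptions made for illustration; they do not reflect CM.com's internal implementation.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class IntentResult:
    intent: Optional[str]
    confidence: float

# Assumed value; real projects tune their own threshold per environment.
CONFIDENCE_THRESHOLD = 0.6

def classify_intent(utterance: str) -> IntentResult:
    """Keyword heuristic standing in for the trained classifier."""
    if "insurance" in utterance.lower():
        return IntentResult("request_information", 0.92)
    return IntentResult(None, 0.10)

def rule_based_match(utterance: str) -> str:
    """Stand-in for the entity-driven rule-based recognition engine."""
    return "opening-hours-article" if "hours" in utterance.lower() else "fallback"

def match(utterance: str) -> str:
    result = classify_intent(utterance)
    if result.intent is not None and result.confidence >= CONFIDENCE_THRESHOLD:
        return result.intent
    # Confidence too low: defer to the rule-based engine for further matching.
    return rule_based_match(utterance)
```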
Entity-Driven Rule-Based Recognition Engine
Entity recognition uses rule-based, custom-defined condition sets to attach meaning to specific parts of a user's question. It pre-processes natural language by normalizing the text before breaking it down into keywords, entities, and stop words. These conditions are then attached to articles within a project's environment to create matching condition sets. When a condition set is met, the article is considered a match and provided to the customer as the best possible answer. Because matching conditions can be as specific as desired, precise contextual answers can be delivered to customers in a controlled manner.
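The steps above (normalize, split into entities/keywords/stop words, then check each article's condition set) can be sketched like this. The entity list, stop words, and articles are invented for the example.

```python
# Hypothetical data a project might define; not CM.com's actual format.
STOP_WORDS = {"i", "d", "like", "more", "on", "my", "the", "a"}
ENTITIES = {"car insurance": "car_insurance", "home insurance": "home_insurance"}

# Each article carries a condition set: every listed entity must be present.
ARTICLES = {
    "car-insurance-info": {"car_insurance"},
    "home-insurance-info": {"home_insurance"},
}

def normalize(text: str) -> str:
    """Lowercase and strip punctuation before tokenizing."""
    return "".join(ch if ch.isalnum() or ch.isspace() else " " for ch in text.lower())

def extract(text: str):
    """Split the normalized text into recognized entities and keywords."""
    text = normalize(text)
    found = {eid for phrase, eid in ENTITIES.items() if phrase in text}
    keywords = [w for w in text.split() if w not in STOP_WORDS]
    return found, keywords

def best_article(utterance: str):
    """Return the first article whose matching conditions are all met."""
    found, _ = extract(utterance)
    for article, conditions in ARTICLES.items():
        if conditions <= found:
            return article
    return None
```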
Intents and entities work together seamlessly once a customer's question is classified into an intent. Whenever an intent is classified, entities are used to determine the best article match within the scope of that intent, providing powerful and intuitive control over the natural language understanding pipeline.
In the event that an intent cannot be classified for the customer's question, entity recognition serves as a secondary recognition engine to find the best possible match for what the customer is asking about.
When both recognition engines are unable to confidently produce an article to match the customer's question, Conversational AI Cloud provides control flows for fallback scenarios.
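The overall ordering (intent classification first, entity recognition second, fallback flow last) boils down to a small control flow, sketched below with stubbed engines. Only the ordering is taken from the text; everything else is illustrative.

```python
def run_pipeline(utterance, intent_engine, entity_engine,
                 fallback_flow="fallback_flow"):
    """Try each recognition engine in order; hand off to the fallback flow
    when neither engine produces a confident article match."""
    article = intent_engine(utterance)
    if article is None:
        # Secondary engine: entity-driven rule-based recognition.
        article = entity_engine(utterance)
    return article if article is not None else fallback_flow

# Example usage with stand-in engines (each returns None on no match):
intent = lambda u: "car-article" if "insurance" in u else None
entity = lambda u: "hours-article" if "hours" in u else None
```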
Learn more about how you can set up entity recognition for your environment.
Context Evaluation Engine
Once an article is selected by Conversational AI Cloud's Multi-Engine NLU, it is time to select the best contextual answer. Throughout a conversation, the evaluation of any interaction can be influenced by factors such as the user's identity, login status, order history, and the tone of the previous conversation. Each of these elements can affect the tone of voice you’d want to use, the relevance of an answer, or the conversational flow you want to trigger for your customer.
Conversational AI Cloud's context evaluation engine confidently determines the context of any given interaction based on the conversation state, additional information provided with the initial request, and any relevant information retrieved from external resources.
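One way to picture context evaluation is as a set of answer variants guarded by required context values, checked most specific first. The context keys and answer texts below are invented for the example and are not part of the product's API.

```python
# Hypothetical answer variants for one article, ordered most specific first.
ANSWER_VARIANTS = [
    # (required context values, answer to give when they all match)
    ({"logged_in": True, "has_open_order": True},
     "I can see your open order; here is its current status."),
    ({"logged_in": True},
     "Welcome back! Here is your policy overview."),
    ({},
     "Please log in so I can show your personal details."),
]

def select_answer(context: dict) -> str:
    """Return the first variant whose required context values all match."""
    for required, answer in ANSWER_VARIANTS:
        if all(context.get(key) == value for key, value in required.items()):
            return answer
    return ANSWER_VARIANTS[-1][1]
```

In practice the context would be assembled from the conversation state, the initial request, and external systems, as described above; the selection itself stays this simple.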
Learn more about context in your project here.