Conversational interfaces are powered primarily by natural language processing (NLP), and a key subset of NLP is natural language understanding (NLU). Although NLU is a subset of NLP, NLP doesn't always involve NLU, and while the two terms are often used interchangeably, they have slightly different meanings. Developers need to understand the difference between natural language processing and natural language understanding so they can build successful conversational applications.
What is NLP?
Natural language processing is a subset of AI that involves programming computers to process massive volumes of language data. It encompasses numerous tasks that break natural language down into smaller elements to understand the relationships between those elements and how they work together. Common tasks include parsing, speech recognition, part-of-speech tagging, and information extraction. NLP focuses largely on converting unstructured text into structured data.
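To make these tasks concrete, here is a minimal, self-contained sketch of two of them, tokenization and part-of-speech tagging. It uses a toy hand-written lexicon purely for illustration; real NLP libraries use statistical models trained on large corpora rather than a lookup table like this.

```python
import re

def tokenize(text):
    # Split raw text into lowercase word tokens, dropping punctuation.
    return re.findall(r"[a-z']+", text.lower())

# Toy part-of-speech lexicon (an illustrative assumption, not how real
# taggers work): maps a handful of known words to coarse POS tags.
POS_LEXICON = {
    "the": "DET", "weather": "NOUN", "is": "VERB",
    "nice": "ADJ", "outside": "ADV",
}

def pos_tag(tokens):
    # Look each token up in the lexicon, defaulting to NOUN.
    return [(tok, POS_LEXICON.get(tok, "NOUN")) for tok in tokens]

tokens = tokenize("The weather is nice outside.")
print(pos_tag(tokens))
# [('the', 'DET'), ('weather', 'NOUN'), ('is', 'VERB'), ('nice', 'ADJ'), ('outside', 'ADV')]
```

Note that nothing here captures what the sentence *means*; the output is purely structural, which is exactly the gap NLU fills.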
What is NLU?
One of the primary goals of NLU is to teach machines to interpret and understand language input by humans: what a body of written text or spoken language means. NLU leverages AI algorithms to recognize attributes of language such as sentiment, semantics, context, and intent, enabling computers to understand the subtleties and variations of language. For example, the questions “what’s the weather like outside?” and “how’s the weather?” are both asking the same thing, and either could be phrased in hundreds of ways. With NLU, computer applications can recognize the many variations in which humans say the same thing.
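The idea of mapping many phrasings to one intent can be sketched with a tiny keyword-overlap classifier. This is a simplification under an obvious assumption: the intent names and keyword sets below are invented for illustration, and a real NLU model learns these associations from labeled example utterances rather than hand-picked keywords.

```python
# Hypothetical intent definitions: each intent is keyed by words that
# signal it. A trained NLU model would learn this from example data.
INTENT_KEYWORDS = {
    "ask_weather": {"weather", "forecast", "rain", "sunny"},
    "greet": {"hello", "hi", "hey"},
}

def classify_intent(utterance):
    words = set(utterance.lower().replace("?", "").replace("'", " ").split())
    # Score each intent by how many of its keywords appear in the utterance.
    scores = {intent: len(words & kw) for intent, kw in INTENT_KEYWORDS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unknown"

print(classify_intent("What's the weather like outside?"))  # ask_weather
print(classify_intent("How's the weather?"))                # ask_weather
```

Both differently worded questions resolve to the same intent, which is the core behavior a contextual assistant needs.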
The Key Difference Between NLP and NLU
When it comes to natural language, what was written or spoken may not be what was meant. In the most basic terms, NLP looks at what was said, and NLU looks at what was meant. People can say identical things in numerous ways, and they may make mistakes when writing or speaking. They may use the wrong words, write fragmented sentences, and misspell or mispronounce words. NLP can analyze text and speech, performing a wide range of tasks that focus primarily on language structure. However, it will not tell you what was meant or intended by specific language. NLU allows computer applications to infer intent from language even when the written or spoken language is flawed.
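One small, concrete piece of handling flawed input is tolerating misspellings. The sketch below uses Python's standard-library `difflib` to snap typo-laden words back to a known vocabulary; the vocabulary itself is a made-up example, and production NLU systems handle noisy input with far more robust, learned representations.

```python
import difflib

# Illustrative vocabulary; a real system would not rely on a fixed word list.
KNOWN_WORDS = ["weather", "outside", "forecast", "today"]

def normalize(word):
    # Snap a possibly misspelled word to the closest known word;
    # difflib's similarity ratio tolerates small typos.
    matches = difflib.get_close_matches(word, KNOWN_WORDS, n=1, cutoff=0.75)
    return matches[0] if matches else word

utterance = "whats the wether liek outsde"
print(" ".join(normalize(w) for w in utterance.split()))
# whats the weather liek outside
```

“wether” and “outsde” are recovered because they are close enough to vocabulary words, while words with no near match pass through unchanged.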
The Difference Between NLP and NLU Matters
If a developer wants to build a simple chatbot that produces a series of programmed responses, they could use NLP along with a few machine learning techniques; a simple chatbot doesn’t necessarily need NLU. However, if a developer wants to build an intelligent contextual assistant capable of having sophisticated, natural-sounding conversations with users, they would need NLU. NLU is the component that allows the contextual assistant to understand the intent of each utterance by a user. Without it, the assistant won’t be able to understand what a user means throughout a conversation. And if the assistant doesn’t understand what the user means, it won’t respond appropriately, or in some cases won’t respond at all.
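A scripted chatbot of the kind described above can be sketched in a few lines. The trigger phrases and replies here are invented for illustration; the point is the brittleness: any rephrasing that isn't an exact trigger falls through to the fallback, which is precisely what NLU avoids.

```python
# A minimal scripted chatbot: exact trigger phrases map to canned replies.
RESPONSES = {
    "hi": "Hello! How can I help?",
    "what are your hours": "We're open 9am-5pm, Monday to Friday.",
}

def reply(message):
    # Normalize trivially (case and trailing punctuation), then do an
    # exact lookup. Without NLU there is no notion of intent, so a
    # rephrased question misses entirely.
    key = message.lower().strip("?!. ")
    return RESPONSES.get(key, "Sorry, I didn't understand that.")

print(reply("What are your hours?"))  # matched trigger: canned answer
print(reply("When do you open?"))     # same meaning, no NLU: fallback
```

The second question means the same thing as the first, but the bot has no way to know that, illustrating why a contextual assistant needs an NLU layer.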
Whether it’s simple chatbots or sophisticated AI assistants, NLP is an integral part of the conversational app building process. And the difference between NLP and NLU is important to remember when building a conversational app because it impacts how well the app interprets what was said and meant by users.
Our open source conversational AI platform includes NLU, and you can customize your pipeline in a modular way to extend the built-in functionality of Rasa’s NLU models. You can learn more about custom NLU components in the developer documentation, and be sure to check out this detailed tutorial.
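As a rough illustration of what such a modular pipeline looks like, here is a sketch of a Rasa `config.yml` using built-in component names, with a hypothetical custom component appended at the end; the custom class path is invented, and the exact component API depends on your Rasa version, so consult the developer documentation before copying this.

```
# config.yml (illustrative sketch)
pipeline:
  - name: WhitespaceTokenizer
  - name: CountVectorsFeaturizer
  - name: DIETClassifier
    epochs: 100
  - name: my_module.SentimentAnalyzer   # hypothetical custom component
```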