I gained an appreciation for the power of Natural Language Understanding (NLU) engines while experimenting with the predictive and classification capabilities of Large Language Models (LLMs).
NLU engines have powered chatbots and customer-facing conversations for years now. While LLMs are mainly recognised for their generative capabilities, NLU engines are incredibly efficient in the area of prediction and classification - which can be trickier to get right. Here are five areas in which NLU engines are exceptionally well optimised…
Entering Data Is Simple
Chatbot development frameworks typically offer a no-code graphical user interface (GUI) for NLU data entry.
This allows users to enter data in a point-and-click fashion, eliminating the need to format data into a JSON or CSV structure for import. It also avoids the formatting errors that hand-crafted import files invite.
Migrating between chatbot frameworks and NLU engines is made easy by the lightweight nature of NLU Engine data exports.
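To make the "lightweight export" point concrete, here is a minimal sketch of the kind of intent data an NLU engine might export. The exact schema varies by framework, so the keys below are illustrative, not any specific vendor's format:

```python
import json

# An illustrative, lightweight intent export: a list of intents,
# each with its example utterances. Real frameworks use similar
# (but not identical) JSON, CSV, or YAML shapes.
nlu_data = {
    "intents": [
        {"intent": "check_balance",
         "examples": ["what is my balance", "show my account balance"]},
        {"intent": "transfer_funds",
         "examples": ["send money to John", "transfer $50 to savings"]},
    ]
}

# Because the export is plain JSON, migrating between frameworks is
# mostly a matter of reshaping keys on the way in.
exported = json.dumps(nlu_data, indent=2)
imported = json.loads(exported)
print(imported["intents"][0]["intent"])  # check_balance
```

Since the payload is just intents and example utterances, a short transformation script is usually all a migration requires.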
Flexibility Of Intents
Classes can be seen as intents; the necessity of assigning text to one or more categories, labels, or intents will always remain, even as the way intents are defined and managed continues to evolve.
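The idea of assigning one or more labels to a single utterance can be illustrated with a toy multi-label classifier. Real NLU engines learn these mappings from training examples; the keyword lookup below (with made-up intents and keywords) only makes the mechanics visible:

```python
# Toy multi-label intent classifier. The intent names and keyword
# sets are invented for illustration; production engines learn them
# from labelled example utterances instead.
INTENT_KEYWORDS = {
    "billing": {"invoice", "charge", "refund"},
    "cancellation": {"cancel", "terminate"},
    "support": {"help", "broken", "error"},
}

def classify(utterance: str) -> list[str]:
    """Return every intent whose keywords appear in the utterance."""
    tokens = set(utterance.lower().split())
    return sorted(intent for intent, keywords in INTENT_KEYWORDS.items()
                  if tokens & keywords)

# A single utterance can legitimately carry multiple intents:
print(classify("please cancel my plan and refund the last charge"))
# ['billing', 'cancellation']
```

The multi-label case matters because real users routinely pack two requests into one sentence, and a single-label model would have to drop one of them.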
Entities Have Developed Over Time
Intents are like verbs; entities are like nouns.
Accurate entity detection matters because it avoids re-prompting the user for information they have already supplied. An NLU engine therefore needs to identify entities from a user's very first, unstructured utterance.
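A minimal sketch of extracting entities from a single unstructured utterance, using regular expressions in place of a trained NER model (engines such as spaCy and Rasa learn these patterns from data rather than hard-coding them):

```python
import re

# Illustrative slot/entity patterns; a trained NER model would
# generalise far beyond what these regexes capture.
PATTERNS = {
    "amount": r"\$\d+(?:\.\d{2})?",
    "date": r"\b(?:today|tomorrow|monday|tuesday|wednesday|thursday|friday)\b",
}

def extract_entities(utterance: str) -> dict:
    """Pull every recognised entity out of one utterance."""
    found = {}
    for entity, pattern in PATTERNS.items():
        match = re.search(pattern, utterance.lower())
        if match:
            found[entity] = match.group()
    return found

# Capturing both slots up front means the bot never has to
# re-prompt for the amount or the date.
print(extract_entities("Transfer $250 to savings tomorrow"))
# {'amount': '$250', 'date': 'tomorrow'}
```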
I recently trained a multi-label model using Google Cloud Vertex AI.
The model training process took 4 hours and 48 minutes to complete on 11,947 training data items.
To reduce training times, chatbot development frameworks like Rasa have implemented incremental training, and other frameworks have likewise cut their training times drastically.
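The core idea behind incremental training is updating an existing model with only the new examples rather than retraining from scratch. The counter-based "model" below is a deliberately simple stand-in that makes the mechanics visible; it is not how Rasa implements fine-tuning internally:

```python
from collections import Counter, defaultdict

class IncrementalIntentModel:
    """Toy intent model that can absorb new examples without
    revisiting the data it has already seen."""

    def __init__(self):
        self.word_counts = defaultdict(Counter)  # intent -> token counts

    def update(self, examples):
        """Incremental step: train on the new batch only."""
        for intent, text in examples:
            self.word_counts[intent].update(text.lower().split())

    def predict(self, text):
        tokens = text.lower().split()
        scores = {intent: sum(counts[t] for t in tokens)
                  for intent, counts in self.word_counts.items()}
        return max(scores, key=scores.get)

model = IncrementalIntentModel()
model.update([("greet", "hello there"), ("bye", "goodbye for now")])
model.update([("greet", "hi hello")])   # later batch: new rows only
print(model.predict("hello friend"))    # greet
```

Because each `update` touches only the incoming batch, training time scales with the new data rather than the full dataset, which is the property incremental training buys you.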
Multiple Installation Options
Natural Language Understanding (NLU) engines are lightweight, often open source, and can easily be installed in any environment, including workstations.
Examples of open-source NLU tools include Rasa, which is easy to configure and train, as well as other accessible options such as spaCy, Snips, and Cisco MindMeld.
With these tools, individuals can explore natural language understanding without having to invest in expensive computing, LLMs, or complicated ML pipelines.
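Because these engines install like ordinary Python packages (for example via `pip install spacy`), checking what is already available locally takes a few lines. The package names below are the common import names; Snips in particular ships as `snips_nlu`:

```python
import importlib.util

# Probe the current environment for common open-source NLU packages.
# find_spec returns None when a top-level package is not installed.
NLU_PACKAGES = ["spacy", "rasa", "snips_nlu", "mindmeld"]

availability = {
    package: importlib.util.find_spec(package) is not None
    for package in NLU_PACKAGES
}

for package, installed in availability.items():
    print(f"{package}: {'installed' if installed else 'not installed'}")
```

Running a probe like this on a laptop underlines the point: no GPUs, hosted LLMs, or ML pipelines are needed just to start experimenting.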
I’m currently the Chief Evangelist @ HumanFirst. I explore and write about all things at the intersection of AI and language: NLU design, evaluation and optimisation, data-centric prompt tuning, and LLM observability, evaluation, and fine-tuning.