Articles

Practical Examples of OpenAI Function Calling

COBUS GREYLING
June 26, 2023 · 6 Min Read

Three use-cases for OpenAI Function Calling with practical code examples.

Introduction

Large Language Models (LLMs), like conversational UIs in general, are efficient at receiving highly unstructured data in the form of conversational natural language.

This unstructured data is then structured, processed, and subsequently unstructured again in the form of natural language output.

OpenAI Function Calling structures output for machine consumption in the form of an API, as opposed to human consumption in the form of unstructured natural language.

Function Calling

With the API call, you can provide functions to gpt-3.5-turbo-0613 and gpt-4-0613, and have the model intelligently generate a JSON object containing arguments, which you can then use to call the function in your code.

The Chat Completions API does not call the function directly; instead, it generates a JSON object of arguments that you can use to call the function in your own code.
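As a sketch of the mechanics, the request below defines a single function using only the Python standard library; the function name `get_current_weather` and its parameters are illustrative, and the assistant message shown is a hypothetical reply rather than actual model output:

```python
import json

# Request payload for the Chat Completions endpoint, 2023-era shape.
payload = {
    "model": "gpt-3.5-turbo-0613",
    "messages": [
        {"role": "user", "content": "What is the weather like in Boston?"}
    ],
    "functions": [
        {
            "name": "get_current_weather",  # illustrative function name
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. Boston, MA",
                    },
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                },
                "required": ["location"],
            },
        }
    ],
    "function_call": "auto",  # let the model decide whether to use the function
}

# A hypothetical assistant message: the model does not call the function,
# it only returns the function name and a JSON string of arguments.
message = {
    "role": "assistant",
    "content": None,
    "function_call": {
        "name": "get_current_weather",
        "arguments": '{"location": "Boston, MA"}',
    },
}

if "function_call" in message:
    args = json.loads(message["function_call"]["arguments"])
    print(args["location"])  # → Boston, MA
```

Note that `arguments` arrives as a JSON string, not an object, so it must be parsed (and, since models can produce invalid JSON, ideally validated) before your code acts on it.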

Read more about OpenAI Function Calling in the OpenAI documentation.

With OpenAI Function Calling, the focus is on the JSON output.

Here are three practical examples of function calling:

1. Create chatbots that answer questions by calling external APIs:

In this example, an email is composed from a few predefined fields, which the model populates from the natural language input.

If the JSON schema is not defined correctly, the OpenAI API raises an error. There is therefore a degree of rigidity in how the JSON structure to be populated must be defined.

The response from the model:
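The original code is not reproduced here, but a minimal sketch of such an email function might look as follows; the function name `send_email`, the `to`/`subject`/`body` fields, and the model reply are all assumptions for illustration:

```python
import json

# Hypothetical schema for the email example; the field names are
# illustrative, as the original function definition is not shown.
email_function = {
    "name": "send_email",
    "description": "Compose an email from the user's request",
    "parameters": {
        "type": "object",
        "properties": {
            "to": {"type": "string", "description": "Recipient name or address"},
            "subject": {"type": "string"},
            "body": {"type": "string"},
        },
        "required": ["to", "subject", "body"],
    },
}

# A hypothetical arguments string the model might return for the input
# "Send Kathy an email asking if she is free for lunch on Friday."
arguments = (
    '{"to": "Kathy", "subject": "Lunch on Friday", '
    '"body": "Hi Kathy, are you free for lunch on Friday?"}'
)
email = json.loads(arguments)
print(email["to"], "-", email["subject"])  # → Kathy - Lunch on Friday
```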

2. Convert unstructured natural language into API calls:

In this example, an order is converted into a JSON structure with date, notes, order type, delivery address, etc.

The response:
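Again as a sketch, a function definition for this order example might look like the following; the parameter names mirror the fields mentioned above, and the arguments string is a hypothetical model reply, not actual output:

```python
import json

# Hypothetical schema for converting an order into an API call; the
# parameter names follow the fields mentioned in the text.
order_function = {
    "name": "create_order",
    "description": "Create an order from a natural language request",
    "parameters": {
        "type": "object",
        "properties": {
            "date": {"type": "string", "description": "Delivery date, ISO 8601"},
            "notes": {"type": "string"},
            "order_type": {"type": "string", "enum": ["delivery", "pickup"]},
            "delivery_address": {"type": "string"},
        },
        "required": ["date", "order_type"],
    },
}

# A hypothetical arguments string the model might return for
# "Deliver two pizzas to 12 Main Street tomorrow, ring the bell twice."
order = json.loads(
    '{"date": "2023-06-27", "notes": "Ring the bell twice", '
    '"order_type": "delivery", "delivery_address": "12 Main Street"}'
)
print(order["order_type"], order["delivery_address"])  # → delivery 12 Main Street
```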

3. Extract structured data from text:

This final example has a list of names and birthdays that needs to be structured into JSON. Something I found interesting is that the model does not iterate through the names and dates.

Only the first name and date are placed in the JSON; the full set of data is not recognised and iterated through.

The result:
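A sketch of what such an extraction schema might look like; the function and field names are assumptions. One possible workaround for the single-item behaviour described above, untested here, is to make the top-level parameter an array of objects so the schema itself requests a list:

```python
# Hypothetical flat schema for the extraction example. With an object
# schema like this, the model returns arguments for a single person,
# matching the behaviour described above: only the first name and
# birthday from the list ends up in the JSON.
extract_function = {
    "name": "extract_person",
    "description": "Extract a person's name and birthday from the text",
    "parameters": {
        "type": "object",
        "properties": {
            "name": {"type": "string"},
            "birthday": {"type": "string", "description": "ISO 8601 date"},
        },
        "required": ["name", "birthday"],
    },
}

# A possible workaround (untested): wrap the fields in an array of
# objects, so the schema explicitly asks for every person in the text.
extract_people_function = {
    "name": "extract_people",
    "description": "Extract all names and birthdays mentioned in the text",
    "parameters": {
        "type": "object",
        "properties": {
            "people": {
                "type": "array",
                "items": {
                    "type": "object",
                    "properties": {
                        "name": {"type": "string"},
                        "birthday": {"type": "string"},
                    },
                },
            }
        },
        "required": ["people"],
    },
}
```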

In Conclusion

There are a few considerations, however:

Programmatically, the chatbot/Conversational UI will have to know that the output from the LLM must be in JSON format. Hence some form of classification or intent recognition will be needed to detect when the output type should be JSON.

A predefined template will have to be defined as input to the completion LLM. As seen in the examples, the JSON template guides the LLM on how to populate the values.

More importantly, as seen in the examples, the parameters to be populated must be well defined. Failure to do so results in the following error:

“We could not parse the JSON body of your request. (HINT: This likely means you aren’t using your HTTP library correctly. The OpenAI API expects a JSON payload, but what was sent was not valid JSON. If you have trouble figuring out how to fix this, please contact us through our help center at help.openai.com.)”

I’m currently the Chief Evangelist @ HumanFirst. I explore and write about all things at the intersection of AI and language; ranging from LLMs, Chatbots, Voicebots, Development Frameworks, Data-Centric latent spaces and more.
