Getting Started with AI Functions and Webhooks
One of the core building blocks in Montag for creating complex functionality with LLMs is the AI Function.
AI Functions are stored prompt and LLM configurations that can be triggered by an API call or called from within a script. Think of them a little like stored procedures in a database: you write the desired functionality once, and you can then call it over and over again, more or less from anywhere, without needing to worry about the specifics of the LLM.
Because they can be triggered by calling an API endpoint, you can start using LLM functionality within your scripts and other automations while ensuring that all interactions are monitored and secured.
A few examples of AI Functions that get used often are:
- Summarizing text into bullet points
- Re-formatting text into a well-known format, such as an error report
- Keyword generation
- Role adoption for analysing a query, for example as a fact-checker
Let’s go ahead and create a new AI Function. We’ll make something nice and simple: a keyword generator.
The behaviour we want from this AI Function is for it to take in a block of content and output no more than 5 keywords from that content.
Step 1: Create a Prompt
First off, we need to create a prompt. Browse to the “Prompts” section in the UI and click “Create”:
Then add the following:
Name: Keyword Generator
Custom Instructions: You are an AI Assistant that helps the user extract keywords from a body of content. I will mention a block of content below, and you will ONLY reply with no more than 5 (five) keywords from that content, separated by commas.
Help Text: I make keywords from content
Number of Context Injections: Set this to 0 as we do not want this function activating the RAG process.
Step 2: Create the AI Function entry
Next we need to create the actual AI Function. This part is very straightforward; we need to set:
- Name: The name of the function. Use `Keyword Generator`.
- Slug: The slug of the function; this is used in the API call. Use `generateKeywords`.
- Conversation Window: Whether this AI function has a conversational memory - this can be played back to the LLM in follow-up calls to simulate a conversation. We want to set this to 0 for one-shot functions like this.
Then, for the LLM Settings section, we need to select:
- LLM Config: `OpenAI` (if you are using the default quickstart)
- Prompt: `Keyword Generator` (the prompt we just created)
You can safely ignore the last section on RAG settings, as we do not want this AI Function to submit its query to the RAG process in Montag.
Step 3: Create an API token to call the function
Finally, we need to issue a token. This is quite simple: in the “Tokens” section of the UI, create a developer key, give it a name and description, and select the AI Developer ACL preset.
Once the token is saved, the actual token will appear in the first section of the Token view on the right.
Now that we have a token and an AI Function, it’s time to test it! The fastest way to test it is to call it as a webhook.
Step 4: Test the AI Function
Call the AI Function using your favourite REST client. In the video we use Hoppscotch, but you can use Postman, Insomnia, or even just curl; your request just needs an “input” field in the body of the main JSON object payload, like so:
```shell
curl --request POST \
  --url http://localhost:8080/api/aifunctions/call/generateKeywords \
  --header 'Authorization: YOUR_API_KEY' \
  --header 'content-type: application/json' \
  --data '{
    "input": "The Tower of London, officially His Majesty'\''s Royal Palace and Fortress of the Tower of London, is a historic castle on the north bank of the River Thames in central London, England. It lies within the London Borough of Tower Hamlets, which is separated from the eastern edge of the square mile of the City of London by the open space known as Tower Hill. It was founded toward the end of 1066 as part of the Norman Conquest. The White Tower, which gives the entire castle its name, was built by William the Conqueror in 1078 and was a resented symbol of oppression, inflicted upon London by the new Norman ruling class. The castle was also used as a prison from 1100 (Ranulf Flambard) until 1952 (Kray twins), although that was not its primary purpose. A grand palace early in its history, it served as a royal residence. As a whole, the Tower is a complex of several buildings set within two concentric rings of defensive walls and a moat. There were several phases of expansion, mainly under kings Richard I, Henry III, and Edward I in the 12th and 13th centuries. The general layout established by the late 13th century remains despite later activity on the site."
  }'
```
Now all we need to do is POST that request off to Montag! You should receive a response something like this:
```json
{
  "id": "c2f18e05-ae40-471f-938e-23e16878adfc",
  "response": "\nTower of London, Norman Conquest, William the Conqueror, royal residence, defensive walls",
  "session_id": "9421094a-f714-46cf-a1d7-ade5ec9db9c3"
}
```
Let me walk through the various sections of the response:
- `id`: The ID of the AI Function call, used for logging.
- `response`: The response from the AI Function; in this case, the 5 keywords extracted from the body copy.
- `session_id`: The ID of the session created for this AI Function call. Play this back in a follow-up request to simulate a conversation with the bot and fill the conversation window.
As you can see, the response is a set of 5 keywords extracted from the body copy, and we can now re-use this request anywhere we need it, or expose it to end users as an API in our API Developer portal so they have easy access to a keyword function.
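The same webhook can of course be called from code rather than a REST client. Below is a minimal Python sketch using only the standard library; it assumes the default quickstart address and the `generateKeywords` slug from the curl example above, and the helper names are our own:

```python
import json
import urllib.request

BASE_URL = "http://localhost:8080"  # assumption: default quickstart address
API_KEY = "YOUR_API_KEY"            # the developer token issued in Step 3


def build_request(slug: str, input_text: str) -> urllib.request.Request:
    """Build the POST request for an AI Function webhook call."""
    payload = json.dumps({"input": input_text}).encode("utf-8")
    return urllib.request.Request(
        url=f"{BASE_URL}/api/aifunctions/call/{slug}",
        data=payload,
        headers={"Authorization": API_KEY, "content-type": "application/json"},
        method="POST",
    )


def parse_keywords(response_body: str) -> list[str]:
    """Split the comma-separated 'response' field into clean keywords."""
    data = json.loads(response_body)
    return [kw.strip() for kw in data["response"].split(",")]


# To actually send the request:
#   with urllib.request.urlopen(build_request("generateKeywords", text)) as r:
#       keywords = parse_keywords(r.read().decode("utf-8"))
```

Parsing is split out so the response-handling logic can be reused (and tested) without a live Montag instance.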
Now it’s worth noting that AI Functions can also be called from a Script. We won’t cover that in this guide, but the interesting thing about using a script in conjunction with AI Functions is that you can call multiple AI Functions to perform different tasks towards a single goal: an AI pipeline.
And yes, scripts can also be called as webhooks.
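To illustrate the pipeline idea, here is a hedged Python sketch. Only `generateKeywords` exists from this guide; the second slug, `summarizeKeywords`, is a hypothetical function invented for the example, and the caller is injected so the sketch stays independent of any particular HTTP client:

```python
def run_pipeline(content: str, call_ai_function) -> str:
    """Chain two AI Functions towards a single goal.

    call_ai_function(slug, input_text) is expected to POST to
    /api/aifunctions/call/<slug> and return the 'response' field.
    """
    # Step 1: the keyword generator built in this guide.
    keywords = call_ai_function("generateKeywords", content)
    # Step 2: a hypothetical second AI Function (not created in this
    # guide) that expands the keywords into a short summary.
    return call_ai_function("summarizeKeywords", keywords)
```

Because each stage is just another webhook call, pipelines like this can grow to any number of stages without the script needing to know anything about the underlying LLMs.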