Creating a prompt tool involves developing a function tailored to a specific task, then making it accessible to LLMs by exposing it as a prompt tool. This allows you to mimic and test an agentic flow.

Create a Prompt Tool

To create a prompt tool:
  1. Go to the left navigation bar.
  2. Click the “Prompt Tools” tab to open the Prompt Tools page.
  3. Click the + button.
  4. Select the tool type: Code, API, or Schema.
  5. Click the “Create” button.
  6. Write your custom function in JavaScript.

Create a Code-Based Tool

  1. Navigate to Prompt Tools: go to the left navigation bar and click the “Prompt Tools” tab.
  2. Create a new tool: on the Prompt Tools page, click the + button.
  3. Select the tool type: select “Code” as the tool type and click “Create”.
  4. Write your function: write your custom JavaScript function in the code editor.

Code editor interface

The interface provides:
  • A code editor for writing your function
  • An input panel on the right for testing
  • A console at the bottom to view outputs

Example: Travel price calculator

Here’s an example of a prompt tool that calculates travel fares between cities:
const pricesMap = {
    London_Barcelona: 3423,
    Barcelona_London: 3500,
    London_Chicago: 3021,
    Chicago_London: 3670,
    London_Madrid: 6375,
    Madrid_London: 6590,
    London_Paris: 5621,
    Paris_London: 5560,
    Barcelona_Chicago: 3000,
    Chicago_Barcelona: 3890,
    Barcelona_Madrid: 4000,
    Madrid_Barcelona: 4321,
    Barcelona_Paris: 2034,
    Paris_Barcelona: 2041,
    Chicago_Madrid: 6987,
    Madrid_Chicago: 6456,
    Chicago_Paris: 3970,
    Paris_Chicago: 3256,
    Madrid_Paris: 4903,
    Paris_Madrid: 4678,
};

function Travel_Prices(st1, st2) {
    const key1 = `${st1}_${st2}`;
    const key2 = `${st2}_${st1}`;

    if (pricesMap[key1] !== undefined) {
        return pricesMap[key1];
    } else if (pricesMap[key2] !== undefined) {
        return pricesMap[key2];
    } else {
        return "Price not found for the given stations";
    }
}

function Total_Travel_Price(cities) {
    if (cities.length < 2) {
        return "At least two cities are required to calculate travel price";
    }

    let total_price = 0;

    for (let i = 0; i < cities.length - 1; i++) {
        const price = Travel_Prices(cities[i], cities[i + 1]);
        if (typeof price === "string") {
            return price; // Return the error message if price not found
        }
        total_price += price;
    }

    return total_price;
}
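You can exercise the tool from the input panel and watch the results in the console. The sketch below is a trimmed copy of the example above (two routes only, to keep it short) with sample calls showing the direct lookup, the reverse-key fallback, and the error path:

```javascript
// Trimmed copy of the travel-price example above, with sample calls.
const pricesMap = {
    London_Barcelona: 3423,
    Barcelona_Chicago: 3000,
};

function Travel_Prices(st1, st2) {
    // Try the route in both directions before giving up.
    if (pricesMap[`${st1}_${st2}`] !== undefined) return pricesMap[`${st1}_${st2}`];
    if (pricesMap[`${st2}_${st1}`] !== undefined) return pricesMap[`${st2}_${st1}`];
    return "Price not found for the given stations";
}

function Total_Travel_Price(cities) {
    if (cities.length < 2) return "At least two cities are required to calculate travel price";
    let total = 0;
    for (let i = 0; i < cities.length - 1; i++) {
        const price = Travel_Prices(cities[i], cities[i + 1]);
        if (typeof price === "string") return price; // propagate "not found"
        total += price;
    }
    return total;
}

console.log(Total_Travel_Price(["London", "Barcelona", "Chicago"])); // 6423
console.log(Travel_Prices("Chicago", "Barcelona"));                  // 3000 (reverse lookup)
console.log(Total_Travel_Price(["London", "Madrid"]));               // "Price not found for the given stations"
```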

Create a Schema-Based Tool

Overview

Schema-based prompt tools provide a structured way to define tools that ensure accurate and schema-compliant outputs. This approach is particularly useful when you need to guarantee that the LLM’s responses follow a specific format.

Creating a Schema Tool

  1. Navigate to Prompt Tools: navigate to the Prompt Tools section and click the + button.
  2. Select the tool type: select Schema as the tool type.
  3. Define your schema: define your schema in the editor. Here’s an example schema for a stock price tool:
Function call schema
{
    "type": "function",
    "function": {
        "name": "get-stock-price",
        "parameters": {
            "type": "object",
            "properties": {
                "stock": {
                    "type": "string"
                }
            }
        },
        "description": "This function returns the stock value"
    }
}
  4. Click the Save button to create your schema-based tool.

Testing Your Schema Tool

After creating your schema-based tool:
  1. Add it to a prompt configuration
  2. Test if the model correctly identifies when to use it
  3. Verify that the outputs match your schema’s structure
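For step 3, a quick structural check can be done by hand. The sketch below is an illustrative helper (not part of Maxim) that compares a model’s tool-call arguments against the property types of the get-stock-price schema above:

```javascript
// The "parameters" portion of the get-stock-price schema above.
const schema = {
    type: "object",
    properties: { stock: { type: "string" } },
};

// Illustrative type check: every known property, if present, must match
// its declared JSON-Schema type. (A real validator would also handle
// "required", nesting, etc.)
function matchesSchema(args, schema) {
    return Object.entries(schema.properties).every(([name, prop]) => {
        const value = args[name];
        return value === undefined || typeof value === prop.type;
    });
}

console.log(matchesSchema({ stock: "AAPL" }, schema)); // true
console.log(matchesSchema({ stock: 42 }, schema));     // false (number, not string)
```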

Create an API-Based Tool

Overview

Maxim allows you to expose external API endpoints as prompt tools. The platform automatically generates function schemas based on the API’s query parameters and payload structure.

Example

Here’s how an API payload gets converted into a function schema:
  1. Original API Payload:
Zipcode API payload
{
    "check": "123333"
}
  2. Generated Schema for LLM:
Payload sent to the model while making requests
{
    "type": "function",
    "function": {
        "parameters": {
            "type": "object",
            "properties": {
                "check": {
                    "type": "string"
                }
            }
        },
        "name": "clt473gri0006yzrl26rz79iu", // This is the ID of the function.
        "description": "This function accepts a zipcode and returns the corresponding location information" // This is the description of the function.
    }
}
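To make the round trip concrete: the model returns arguments matching the generated schema, and those arguments are substituted into the original payload before the endpoint is called. The fillPayload helper below is purely illustrative of this flow, not Maxim’s actual implementation:

```javascript
// Illustrative sketch: merge the LLM's tool-call arguments over the
// original payload template, falling back to the template's values.
function fillPayload(template, args) {
    const payload = {};
    for (const key of Object.keys(template)) {
        payload[key] = args[key] !== undefined ? args[key] : template[key];
    }
    return payload;
}

const toolCallArgs = { check: "90210" }; // arguments chosen by the LLM
console.log(fillPayload({ check: "123333" }, toolCallArgs)); // { check: '90210' }
```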

Define Tool Variables

You can define variables for your prompt tools, which are automatically translated into properties within the function schema exposed to the LLM. The LLM uses these properties to decide the arguments for a tool/function call.

Variable Configuration:

  • Type: variables can be set as either string or number.
  • Description: add a description to help the LLM understand the variable’s purpose.
  • Optionality: mark variables as optional or required. The LLM uses this, along with the user’s prompt, to decide whether to include the variable in its function call.

You can add variables to your prompt tools by following the steps below.

Add variables to Code-Based Tool

  1. Select the code-based tool to which you want to add variables.
  2. Add the variables as function parameters.
  3. Go to the Variables tab at the top.
  4. Add a description for each variable, select its type, and set its optionality.
  5. Your variables now appear as properties in the function schema exposed to the LLM.

Add variables to Schema-Based Tool

  1. Select the schema-based tool to which you want to add variables.
  2. Add the variables to the schema properties.
  3. Add a description and type for each schema property.
  4. Add the variable names to the required array to make them required.

Add variables to API-Based Tool

  1. Select the API-based tool to which you want to add variables.
  2. Add variables to the API payload using the {{variable_name}} syntax.
  3. Go to the Variables tab in the Endpoint editor.
  4. Add a description for each variable, select its type, and set its optionality.
  5. Your variables now appear as properties in the function schema exposed to the LLM.

Evaluate Tool Call Accuracy

You can learn more about evaluating the accuracy of tool calls here.