Learn how to create, configure, and deploy single prompts in Maxim. Explore features like sessions, versioning, variable handling, and SDK integration for efficient prompt engineering and management.
Single prompts in Maxim provide a powerful way to experiment with system and user prompts. This feature allows you to iterate over prompts, test their effectiveness, and ensure they work well before integrating them into more complex workflows for your application.
To create a new Single prompt:
Navigate to the "Prompt" tab on the left side panel.
Click on "Prompt" and choose "Single prompt".
Click the "+" icon.
Choose to create a new prompt directly or create a folder for better organization.
Name your new prompt and optionally assign it to an existing folder.
Once you've created your prompt, you can configure it to suit your specific needs:
Maxim supports a wide range of models across providers, giving you the flexibility to experiment seamlessly with different model architectures.
In the prompt editor, you can add your system and user prompts. The system prompt sets the context or instructions for the AI, while the user prompt represents the input you want the AI to respond to.
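For instance, a simple pairing might look like the following (the content is purely illustrative):

```text
System: You are a concise travel assistant. Answer in at most three sentences.
User: What should I pack for a weekend trip to Lisbon?
```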
Each prompt has a set of parameters that you can configure to control the behavior of the model. The available parameters differ from model to model and are described in each model's documentation; common examples include temperature (randomness of the output), maximum tokens (an upper bound on response length), and top P (nucleus sampling).
Tags
Prompts can also be configured with tags: custom metadata that can be used to identify and retrieve prompts through the Maxim SDK.
Note: Tags are a Maxim-specific feature and apply only to prompts on the platform.
Tags make prompts uniquely identifiable and can be used with the Maxim SDK for targeted prompt retrieval. Once you've assigned tags to a prompt, you can retrieve it by those tags from your application code.
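As a rough sketch of how such a retrieval could look with the Maxim Python SDK, the snippet below filters prompts by tag. The API key, prompt ID, and tag names ("team", "language") are placeholders, and the QueryBuilder method names reflect the SDK's query interface as I understand it; check the SDK reference for the exact signatures.

```python
from maxim import Maxim, Config
from maxim.models import QueryBuilder

# Placeholder credentials and IDs -- replace with your own values.
maxim = Maxim(Config(api_key="YOUR_API_KEY"))

# Build a query matching prompts carrying the hypothetical tags
# "team=search" and "language=en". Method names (and_, tag, build)
# are assumed from the SDK's QueryBuilder interface.
rule = (
    QueryBuilder()
    .and_()
    .tag("team", "search")
    .tag("language", "en")
    .build()
)

# Fetch the matching prompt; the returned object carries the prompt's
# messages and model configuration for use in your application.
prompt = maxim.get_prompt("YOUR_PROMPT_ID", rule)
```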
Variables and context are essential components in creating dynamic and adaptable prompts. Let's explore how to use them effectively in Maxim.
Maxim allows you to include variables in your prompts using double curly braces, {{ }}. This feature is useful for creating flexible prompts that can adapt to different inputs.
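For example, a user prompt template like the one below (the variable name city is purely illustrative) can be reused with whatever value you supply at run time:

```text
What should I pack for a weekend trip to {{city}}?
```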
You can attach your RAG (Retrieval-Augmented Generation) pipeline to prompts using variables. When Maxim encounters a variable in curly braces with a context source attached, it retrieves the context from the RAG pipeline and replaces the variable with the retrieved context.
Static variables
If no context source is attached, Maxim treats the variable as a static value.
Maxim introduces a powerful versioning and session system to help you manage your prompt development process effectively.
Sessions act as a history, saving the entire state of your prompt as you work. This allows you to experiment freely without fear of losing your progress.
Saving sessions
Press Cmd+S (Mac) or Ctrl+S (Windows/Linux) to save your current session.
Versions, as the name suggests, represent different releases of your prompt. Each version has its own system prompt (and optionally a user prompt and description).
When you're ready to test your prompt or deploy it, save it as a version. The first message (typically the system message) is always saved with the version.
For detailed instructions on thoroughly testing and evaluating your prompt, refer to the Testing a Prompt guide.
Once you're satisfied with your prompt's performance, you can deploy it for use in your applications.
When deploying a prompt, you can set deployment variables, which control which prompt version is used in different environments or for specific tenants.
After deploying your prompts, you can access them in your applications using the Maxim SDK.
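As an illustrative sketch (building on the same assumed QueryBuilder interface as the tag example above), the snippet below fetches the prompt version deployed for a hypothetical "Environment" deployment variable; the variable name and value are placeholders for whatever you configured at deploy time.

```python
from maxim import Maxim, Config
from maxim.models import QueryBuilder

maxim = Maxim(Config(api_key="YOUR_API_KEY"))  # placeholder API key

# Ask for the version deployed with the hypothetical deployment variable
# "Environment=prod"; swap in the variables you set during deployment.
rule = (
    QueryBuilder()
    .and_()
    .deployment_var("Environment", "prod")
    .build()
)

# The returned prompt reflects the deployed version's messages and
# model parameters, ready to use in your application.
prompt = maxim.get_prompt("YOUR_PROMPT_ID", rule)
```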
For more advanced usage, refer to the Maxim SDK Documentation.
By leveraging the single prompt feature in Maxim, you can efficiently develop, test, and deploy prompts for your AI applications. The combination of flexible configuration, versioning, and seamless SDK integration makes it easier than ever to manage your prompt engineering workflow.