
Compare Prompts in the playground

Iterating on Prompts as your AI application evolves requires experimenting across models, prompt structures, and more. The comparison playground gives you a side-by-side view of results so you can compare versions and make informed decisions about changes.

Why use Prompt comparison?

Prompt comparison combines multiple single Prompts into one view, streamlining several workflows:

  1. Model comparison: Evaluate the performance of different models on the same Prompt.
  2. Prompt optimization: Compare different versions of a Prompt to identify the most effective formulation.
  3. Cross-model consistency: Ensure consistent outputs across various models for the same Prompt.
  4. Performance benchmarking: Analyze metrics like latency, cost, and token count across different models and Prompts (see the sketch below).
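
To make the benchmarking use case concrete, here is a minimal sketch, written outside Maxim with the OpenAI Python SDK, of what such a side-by-side run looks like in code: the same Prompt is sent to two models, and latency and token counts are recorded. The model names and prompt text are placeholders; the comparison playground performs this kind of run for you without any code.

```python
# Conceptual sketch (not Maxim's API): run one prompt against two models
# and collect latency and token usage for a side-by-side view.
import time
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

PROMPT = "Summarize the key risks of deploying an unmonitored chatbot."
MODELS = ["gpt-4o-mini", "gpt-4o"]  # placeholder models to compare

for model in MODELS:
    start = time.perf_counter()
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": PROMPT}],
    )
    latency = time.perf_counter() - start
    usage = response.usage
    print(f"{model}: {latency:.2f}s, "
          f"{usage.prompt_tokens} prompt + {usage.completion_tokens} completion tokens")
    print(response.choices[0].message.content[:200])
```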

Create a new comparison

Navigate to the Prompt tab on the left side panel.

Click on Prompt and choose Prompt comparison.

Prompt comparison tab in navigation

Click the + icon in the left sidebar.

Choose to create a new Prompt comparison directly or create a folder for better organization.

Dropdown menu for creating a new Prompt comparison

Name your new Prompt comparison and optionally assign it to an existing folder.

Creating a new Prompt comparison in Maxim

Configure Prompts

Choose from your existing Prompts, or select a model directly from the dropdown menu.


Add more Prompts to compare using the "+" icon.

Customize each Prompt independently. This won't affect the base Prompt, so experiment freely.

Run your comparison

You can run the comparison with the Multi input option either enabled or disabled (see the sketch below).

  • If enabled, you provide an input to each entry in the comparison individually.
  • If disabled, the same input is used for all Prompts in the comparison.
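
As a rough illustration in plain Python (not Maxim's API), the two modes differ only in how inputs are mapped to the Prompts being compared. The Prompt names and inputs below are made up.

```python
# Illustrative only: how the "Multi input" toggle changes which input
# each compared Prompt receives.
prompts = ["Prompt A", "Prompt B", "Prompt C"]

# Multi input disabled: one shared input is fanned out to every Prompt.
shared_input = "Translate 'good morning' to French."
inputs_when_disabled = {name: shared_input for name in prompts}

# Multi input enabled: each Prompt gets its own, independently provided input.
inputs_when_enabled = {
    "Prompt A": "Translate 'good morning' to French.",
    "Prompt B": "Translate 'good morning' to German.",
    "Prompt C": "Translate 'good morning' to Japanese.",
}
```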

You can compare up to five different Prompts side by side in a single comparison.

