
Tracing Litellm with Maxim SDK

Before you start logging, you will need to create a Log Repository on the Maxim dashboard. To create one, click Logs in the sidebar, then click the + icon. Use the ID of that repository to push logs.

This guide shows how to add Maxim's tracing to your Litellm workflows. We'll use a simple API that sends user queries to an LLM and returns responses. You'll learn how to quickly start logging and gain insights into your LLM interactions, helping you better monitor and optimize your AI applications.

Flask web server + Litellm
import os
from flask import Flask, jsonify, request
import litellm
from litellm import completion
 
flask_app = Flask(__name__)
 
def search_tool(query: str) -> str:
    # Dummy implementation; replace with a real search backend.
    return "This is a dummy search tool."
 
async def ask_llm(query: str) -> str:
    response = completion(
            model="openai/gpt-4o",
            api_key=os.getenv("OPENAI_API_KEY"),
            messages=[{"content": query, "role": "user"}],
            temperature=0.73,
            tools=[
                {
                    "type": "function",
                    "function": {
                        "name": "search_tool",
                        "description": "This tool helps to retrieve information from the internet.",
                        "parameters": {
                            "type": "object",
                            "properties": {
                                "query": {
                                    "type": "string",
                                    "description": "Query for searching information.",
                                },
                            },
                            "required": ["query"],
                        },
                    },
                }
            ],
        )
    # Return the assistant's text rather than the full response object,
    # so the result is JSON-serializable.
    return response.choices[0].message.content
 
 
@flask_app.post("/chat")
async def handle():
    resp = await ask_llm(request.json["query"])
    return jsonify({"result": resp})
 
if __name__ == "__main__":
    flask_app.run(port=8000)
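The server above declares a search_tool but never invokes it. If the model replies with a tool call instead of plain text, you would dispatch it to the matching local function yourself. A minimal sketch of that dispatch step, where dispatch_tool_call is a hypothetical helper (not part of Litellm) and the dict mirrors the OpenAI-style tool-call shape Litellm returns:

```python
import json


def dispatch_tool_call(tool_call: dict, tools: dict):
    """Look up and invoke the local function named by an LLM tool call.

    `tool_call` mirrors the OpenAI-style shape: the function name plus
    JSON-encoded arguments. `tools` maps tool names to callables.
    """
    name = tool_call["function"]["name"]
    args = json.loads(tool_call["function"]["arguments"] or "{}")
    return tools[name](**args)


def search_tool(query: str) -> str:
    # Stand-in for a real search backend.
    return f"Results for: {query}"


result = dispatch_tool_call(
    {"function": {"name": "search_tool", "arguments": '{"query": "maxim"}'}},
    {"search_tool": search_tool},
)
```

In a real handler you would loop over `response.choices[0].message.tool_calls`, feed each result back to the model as a `tool` role message, and call `completion` again.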

Integrating Maxim SDK

Initialize Maxim logger.

You can generate a Maxim API key from the Maxim dashboard settings.

Initializing Maxim logger
from maxim import Maxim, Config
from maxim.logger import LoggerConfig
from maxim.logger.litellm.tracer import MaximLiteLLMTracer
 
maxim = Maxim(Config(api_key="maxim-api-key"))
logger = maxim.logger(LoggerConfig(id="log-repository-id"))
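Rather than hardcoding credentials, you can read them from the environment. A small sketch; the variable names MAXIM_API_KEY and MAXIM_LOG_REPO_ID are illustrative assumptions, with the hardcoded placeholders kept as fallbacks:

```python
import os

# Assumed variable names; pick whatever convention your deployment uses.
api_key = os.getenv("MAXIM_API_KEY", "maxim-api-key")
repo_id = os.getenv("MAXIM_LOG_REPO_ID", "log-repository-id")

# maxim = Maxim(Config(api_key=api_key))
# logger = maxim.logger(LoggerConfig(id=repo_id))
```

This keeps secrets out of source control and lets you point the same code at different log repositories per environment.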

Create a MaximLiteLLMTracer instance and add it as a Litellm callback. Register the tracer once at startup rather than on every request.

Adding Maxim LiteLLM callback
# Register once, e.g. right after creating the logger.
litellm.callbacks = [MaximLiteLLMTracer(logger)]
 
async def ask_llm(query: str) -> str:
    response = completion(
        ...
    )

This will enable Maxim observability on your Litellm workflow.
