
LangChain

You can use Maxim's LangChain tracer, which is passed as a callback to your LangChain LLM calls. It captures each call in the format the logging repository accepts, so the data flows directly into the respective logging functions.

Install

Installing Maxim Langchain SDK Extension
npm install @maximai/maxim-js-langchain

Import

Importing Maxim Langchain SDK Extension
import { MaximLangchainTracer } from "@maximai/maxim-js-langchain";

Pass as callback

Passing Maxim Langchain Tracer as callback
import { ChatOpenAI } from "@langchain/openai";
import { Maxim } from "@maximai/maxim-js";
import { MaximLangchainTracer } from "@maximai/maxim-js-langchain";

const maxim = new Maxim({ apiKey: "maxim-api-key" });
const logger = await maxim.logger({ id: "log-repository-id" });

const llm = new ChatOpenAI({
	openAIApiKey: openAIKey,
	modelName: "gpt-4o",
	temperature: 0,
	callbacks: [new MaximLangchainTracer(logger)],
});

With this setup, the tracer automatically creates a new trace and adds a generation to it for every LLM call.

Passing an existing traceId

If you want the generation attached to an existing trace, pass its traceId to the LLM call as metadata under the maxim key.

Passing Maxim Langchain Tracer as callback with custom trace-id
import { ChatOpenAI } from "@langchain/openai";
import { Maxim } from "@maximai/maxim-js";
import { MaximLangchainTracer } from "@maximai/maxim-js-langchain";

const maxim = new Maxim({ apiKey: "maxim-api-key" });
const logger = await maxim.logger({ id: "log-repository-id" });

const llm = new ChatOpenAI({
	openAIApiKey: openAIKey,
	modelName: "gpt-4o",
	temperature: 0,
	callbacks: [new MaximLangchainTracer(logger)],
	metadata: { maxim: { traceId: "trace-id" } },
});

This adds the generation to the existing trace instead of creating a new one.
