Ingesting OTLP Traces into Maxim
Maxim provides an OTLP ingestion endpoint that accepts traces in the vendor-neutral, industry-standard OpenTelemetry Protocol (OTLP) format. This lets you get deep insights into your AI systems using your existing OpenTelemetry instrumentation.

Endpoint: `https://api.getmaxim.ai/v1/otel`
Supported Protocols:
| Protocol | Content-Type |
|---|---|
| HTTP + Protobuf (binary) | `application/x-protobuf` or `application/protobuf` |
| HTTP + JSON | `application/json` |
Required Headers:
- `x-maxim-api-key`: Your Maxim API key
- `x-maxim-repo-id`: Your Maxim log repository ID
- `Content-Type`: `application/json`, `application/x-protobuf`, or `application/protobuf`
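As a minimal sketch of the HTTP + JSON protocol, the snippet below builds an OTLP/JSON payload with one span and attaches the headers above, using only the Python standard library. The environment variable names (`MAXIM_API_KEY`, `MAXIM_REPO_ID`) and the service name are illustrative assumptions, not part of any Maxim SDK.

```python
import json
import os
import secrets
import time
import urllib.request

# Hypothetical environment variable names for your credentials.
MAXIM_API_KEY = os.environ.get("MAXIM_API_KEY", "<your-api-key>")
MAXIM_REPO_ID = os.environ.get("MAXIM_REPO_ID", "<your-repo-id>")

def build_otlp_json_payload(span_name: str) -> dict:
    """Build a minimal OTLP/JSON trace payload containing a single span."""
    now_ns = time.time_ns()
    return {
        "resourceSpans": [{
            "resource": {
                "attributes": [{
                    "key": "service.name",
                    "value": {"stringValue": "my-ai-service"},
                }]
            },
            "scopeSpans": [{
                "scope": {"name": "manual-otlp-example"},
                "spans": [{
                    # OTLP/JSON encodes trace and span IDs as hex strings.
                    "traceId": secrets.token_hex(16),
                    "spanId": secrets.token_hex(8),
                    "name": span_name,
                    "kind": 1,  # SPAN_KIND_INTERNAL
                    "startTimeUnixNano": str(now_ns),
                    "endTimeUnixNano": str(now_ns + 1_000_000),
                }],
            }],
        }]
    }

payload = build_otlp_json_payload("llm.generate")
request = urllib.request.Request(
    "https://api.getmaxim.ai/v1/otel",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "x-maxim-api-key": MAXIM_API_KEY,
        "x-maxim-repo-id": MAXIM_REPO_ID,
    },
    method="POST",
)
# urllib.request.urlopen(request) would submit the trace.
```

In production you would normally let an OpenTelemetry SDK exporter produce this payload; the hand-built version is shown only to make the wire format and required headers concrete.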
Forwarding Traces via Data Connectors
Maxim can also act as your central observability hub: send traces once to Maxim, and forward them to your preferred observability platforms while getting AI-powered insights.

Supported Forwarding Destinations:
- New Relic
- Snowflake
- Any OpenTelemetry (OTLP) collector
To set up forwarding, navigate to your log repository, click the top-right menu, and select “Set up data connectors.”
Benefits of Using Maxim as Your Observability Hub
- Single Instrumentation: Instrument once with Maxim, forward to multiple destinations
- Enriched Data: Maxim enhances your traces with AI context before forwarding
- Consistent Format: Normalized data across all your observability tools
- Reduced Overhead: Lower instrumentation maintenance and network traffic
- Centralized Control: Manage all your observability connections in one place
Connecting to an OTLP Collector
Maxim exports traces following OpenTelemetry semantic conventions, ensuring compatibility with platforms like New Relic and other OTLP collectors. Trace forwarding happens asynchronously and doesn’t impact your application performance.
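To receive forwarded traces, your collector needs an OTLP receiver enabled. Below is a minimal sketch of an OpenTelemetry Collector configuration; the ports are the conventional OTLP defaults, and the `debug` exporter is a stand-in you would replace with your backend's exporter.

```yaml
receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317
      http:
        endpoint: 0.0.0.0:4318

exporters:
  debug:            # logs received spans; swap in your backend's exporter
    verbosity: detailed

service:
  pipelines:
    traces:
      receivers: [otlp]
      exporters: [debug]
```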
Best Practices
- Use binary Protobuf (`application/x-protobuf`) for optimal performance and robustness
- Batch traces to reduce network overhead
- Include rich attributes following GenAI semantic conventions
- Secure your headers and avoid exposing credentials
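The batching recommendation above can be sketched in plain Python: instead of one HTTP request per span, accumulate finished spans and ship them in a single OTLP/JSON payload. The `SpanBuffer` class and service name are illustrative inventions, not part of any Maxim SDK; a real setup would use the OpenTelemetry SDK's `BatchSpanProcessor`.

```python
import json
import secrets
import time

class SpanBuffer:
    """Accumulate finished spans and flush them as one OTLP/JSON body.

    Illustrative helper only: in practice the OpenTelemetry SDK's
    BatchSpanProcessor handles batching, retries, and export timing.
    """

    def __init__(self, max_spans: int = 100):
        self.max_spans = max_spans
        self._spans = []

    def add(self, name: str) -> None:
        """Record one finished span (zero-duration, for brevity)."""
        now_ns = time.time_ns()
        self._spans.append({
            "traceId": secrets.token_hex(16),
            "spanId": secrets.token_hex(8),
            "name": name,
            "kind": 1,
            "startTimeUnixNano": str(now_ns),
            "endTimeUnixNano": str(now_ns),
        })

    def flush(self) -> bytes:
        """Serialize all buffered spans into a single request body."""
        payload = {
            "resourceSpans": [{
                "resource": {"attributes": [
                    {"key": "service.name",
                     "value": {"stringValue": "my-ai-service"}},
                ]},
                "scopeSpans": [{
                    "scope": {"name": "batch-example"},
                    "spans": self._spans,
                }],
            }]
        }
        self._spans = []
        return json.dumps(payload).encode("utf-8")

buffer = SpanBuffer()
for name in ("retrieve", "rerank", "generate"):
    buffer.add(name)
body = buffer.flush()  # one POST to /v1/otel instead of three
```

Flushing on a size or time threshold (whichever comes first) keeps both latency and request volume bounded.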