OpenTelemetry (OTel) is a Cloud Native Computing Foundation (CNCF) open-source framework. It provides a standardized way to collect, process, and export telemetry data (traces, metrics, and logs) from your applications. This is vital for monitoring performance, debugging issues, and understanding complex system behavior.

Many popular AI development tools and SDKs, like the Vercel AI SDK, LlamaIndex, OpenLLMetry, and Logfire, utilize OpenTelemetry for observability. Portkey now embraces OTel, allowing you to send telemetry data from any OTel-compatible source directly into Portkey’s observability platform.

The Portkey Advantage: Gateway Intelligence Meets Full-Stack Observability

Portkey’s strength lies in its unique combination of an intelligent LLM Gateway and a powerful Observability backend.

  • Enriched Data from the Gateway: Your LLM calls routed through the Portkey Gateway are automatically enriched with deep contextual information—virtual keys, caching status, retry attempts, prompt versions, and more. This data flows seamlessly into Portkey Observability.

  • Holistic View with OpenTelemetry: By adding an OTel endpoint, Portkey now ingests traces and logs from your entire application stack, not just the LLM calls. Instrument your frontend, backend services, databases, and any other component with OTel, and send that data to Portkey.

This combination provides an unparalleled, end-to-end view of your LLM application’s performance, cost, and behavior. You can correlate application-level events with specific LLM interactions managed by the Portkey Gateway.

How OpenTelemetry Data Flows to Portkey

The following diagram illustrates how telemetry data from your instrumented applications and the Portkey Gateway itself is consolidated within Portkey Observability:

Explanation:

  1. Your Application Code is instrumented using OTel Instrumentation Libraries.
  2. This telemetry data (traces, logs) can be sent to the Portkey OTel Backend Endpoint.
  3. Simultaneously, LLM calls made via the Portkey Gateway generate their own rich, structured telemetry.
  4. All this data is consolidated in the Portkey Observability Stack, giving you a unified view.
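
To make steps 1 and 3 concrete, here is a minimal Python sketch: an application-level span created with the OpenTelemetry SDK wraps an LLM call routed through the Portkey Gateway, here assumed to use the portkey-ai SDK. The virtual key, model name, and attribute values are placeholders, and the exporter configuration that actually ships the span to Portkey is covered in the next section.

from opentelemetry import trace
from portkey_ai import Portkey

# Application-level tracer (exporter setup is shown in the next section)
tracer = trace.get_tracer("my-llm-app")

# LLM calls routed through the Portkey Gateway emit their own enriched telemetry
portkey = Portkey(
    api_key="YOUR_PORTKEY_API_KEY",
    virtual_key="YOUR_VIRTUAL_KEY",  # placeholder provider virtual key
)

with tracer.start_as_current_span("answer-user-question") as span:
    span.set_attribute("user.id", "user-123")  # example application attribute
    completion = portkey.chat.completions.create(
        model="gpt-4o",  # placeholder model
        messages=[{"role": "user", "content": "Summarize our Q3 results."}],
    )
    print(completion.choices[0].message.content)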

Setting Up Portkey as an OpenTelemetry Backend

To send your OpenTelemetry data to Portkey, configure your OTel exporter to point to Portkey’s OTLP endpoint and provide your Portkey API Key for authentication.

Key Environment Variables:

# Portkey's OTLP HTTP Endpoint for traces and logs
OTEL_EXPORTER_OTLP_ENDPOINT="https://api.portkey.ai/v1/otel"
# Your Portkey API Key (ensure it's a Server Key)
OTEL_EXPORTER_OTLP_HEADERS="x-portkey-api-key=YOUR_PORTKEY_API_KEY"

Replace YOUR_PORTKEY_API_KEY with your actual Portkey API Key found in your Portkey Dashboard.
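
If you prefer to configure the exporter in code rather than through environment variables, the sketch below uses the OpenTelemetry Python SDK with the OTLP/HTTP span exporter (from the opentelemetry-exporter-otlp-proto-http package). Note that when the endpoint is set programmatically, the exporter expects the full signal-specific path rather than the base URL:

import os

from opentelemetry import trace
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor

# Export spans to Portkey's OTLP endpoint over HTTP
exporter = OTLPSpanExporter(
    endpoint="https://api.portkey.ai/v1/otel/v1/traces",
    headers={"x-portkey-api-key": os.environ["PORTKEY_API_KEY"]},
)

provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)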

Signal-Specific Endpoints: If your OTel collector or SDK requires signal-specific endpoints, set them explicitly:

For Traces: OTEL_EXPORTER_OTLP_TRACES_ENDPOINT="https://api.portkey.ai/v1/otel/v1/traces"

For Logs: OTEL_EXPORTER_OTLP_LOGS_ENDPOINT="https://api.portkey.ai/v1/otel/v1/logs"

Remember to include the OTEL_EXPORTER_OTLP_HEADERS with your API key for these as well.
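
As an illustration, here is a similar Python sketch for logs, using the OTLP/HTTP log exporter and the OpenTelemetry logging bridge. The logs SDK is still marked experimental in Python, so the underscore-prefixed module paths may differ slightly between opentelemetry-sdk versions:

import logging
import os

from opentelemetry._logs import set_logger_provider
from opentelemetry.exporter.otlp.proto.http._log_exporter import OTLPLogExporter
from opentelemetry.sdk._logs import LoggerProvider, LoggingHandler
from opentelemetry.sdk._logs.export import BatchLogRecordProcessor

# Send application logs to Portkey's signal-specific logs endpoint
exporter = OTLPLogExporter(
    endpoint="https://api.portkey.ai/v1/otel/v1/logs",
    headers={"x-portkey-api-key": os.environ["PORTKEY_API_KEY"]},
)

logger_provider = LoggerProvider()
logger_provider.add_log_record_processor(BatchLogRecordProcessor(exporter))
set_logger_provider(logger_provider)

# Bridge the standard library logger into OpenTelemetry
logging.getLogger().addHandler(LoggingHandler(logger_provider=logger_provider))
logging.getLogger(__name__).warning("Hello from OTel logs")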

Viewing Traces

Once configured, your OpenTelemetry traces appear in the Portkey dashboard, giving you full visibility into your AI application. Navigate to the Logs page to view your traces, filter by various attributes, and drill down into specific requests.

Why Use OpenTelemetry with Portkey?

Portkey’s OTel backend is compatible with any OTel-compliant library, whether it instruments GenAI frameworks or general application components. Sending your telemetry to Portkey gives you:

  • Language Agnostic: Works with any programming language that supports OpenTelemetry, including Python, JavaScript, Java, and Go.

  • Framework Support: Compatible with all major LLM frameworks through their OTel instrumentation.

  • Zero Code Changes: Many libraries offer auto-instrumentation that requires no changes to your application code (see the sketch after this list).

  • Standards-Based: Built on industry-standard protocols, ensuring long-term compatibility.
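
To illustrate the zero-code-changes point, here is a minimal sketch assuming the opentelemetry-instrumentation-requests package (LLM framework instrumentations such as OpenLLMetry follow the same pattern). Once the exporter from the setup section is configured, a single call patches the library and its outgoing requests are traced automatically:

import requests
from opentelemetry.instrumentation.requests import RequestsInstrumentor

# Patch the requests library once at startup; existing application code is unchanged
RequestsInstrumentor().instrument()

# This call now emits a span that is exported to Portkey
requests.get("https://example.com")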

Supported OTel Libraries

Portkey’s OTel backend works with any OTel-compliant instrumentation library, including the GenAI-focused tools mentioned above, such as the Vercel AI SDK, LlamaIndex, OpenLLMetry, and Logfire.

Getting Started

  1. Get your Portkey API key: Sign up for Portkey and grab your API key from the settings page.
  2. Choose an instrumentation library: Pick from our supported integrations based on your stack.
  3. Configure the endpoint: Point your OTel exporter to https://api.portkey.ai/v1/otel with your API key.
  4. Start tracing: Run your application and view traces in the Portkey dashboard.

Next Steps