Simplify AI development with OpenTelemetry-native observability and intelligent gateway routing
OpenLIT allows you to simplify your AI development workflow, especially for Generative AI and LLMs. It streamlines essential tasks like experimenting with LLMs, organizing and versioning prompts, and securely handling API keys. With just one line of code, you can enable OpenTelemetry-native observability, offering full-stack monitoring that includes LLMs, vector databases, and GPUs.
OpenLIT’s automatic instrumentation combined with Portkey’s intelligent gateway creates a comprehensive observability solution: every trace captures model performance, prompt versions, and cost data in real time.
Set up the OpenTelemetry tracer and initialize OpenLIT:
```python
import openlit
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import SimpleSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Create and configure the tracer provider
trace_provider = TracerProvider()
trace_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter()))

# Set the global default tracer provider
trace.set_tracer_provider(trace_provider)

# Create a tracer from the global tracer provider
tracer = trace.get_tracer(__name__)

# Initialize OpenLIT with the custom tracer
# disable_batch=True ensures traces are processed immediately
openlit.init(tracer=tracer, disable_batch=True)
```
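By default, the OTLP/HTTP exporter sends traces to `http://localhost:4318`. If your OpenTelemetry collector or observability backend runs elsewhere, you can redirect the exporter with the standard OpenTelemetry environment variables instead of changing code (the endpoint and token below are placeholders, not real values):

```shell
# Standard OpenTelemetry exporter settings (values are placeholders)
export OTEL_EXPORTER_OTLP_ENDPOINT="http://localhost:4318"

# Add auth headers if your backend requires them, e.g.:
# export OTEL_EXPORTER_OTLP_HEADERS="authorization=Bearer YOUR_TOKEN"
```

These variables are read automatically by `OTLPSpanExporter()` when it is constructed without arguments, as in the snippet above.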
Set up the OpenAI client to use Portkey’s intelligent gateway:
```python
from openai import OpenAI
from portkey_ai import createHeaders

# Create OpenAI client with Portkey's gateway
client = OpenAI(
    api_key="YOUR_OPENAI_API_KEY",  # Or use a dummy value with virtual keys
    base_url="https://api.portkey.ai/v1",
    default_headers=createHeaders(
        api_key="YOUR_PORTKEY_API_KEY",
        virtual_key="YOUR_VIRTUAL_KEY"  # Optional: Use Portkey's secure key management
    )
)
```
Now your LLM calls are automatically traced by OpenLIT and enhanced by Portkey:
```python
# Make calls through Portkey's gateway
# OpenLIT instruments the call, Portkey adds gateway intelligence
response = client.chat.completions.create(
    model="gpt-4",
    messages=[
        {
            "role": "user",
            "content": "Explain the benefits of OpenTelemetry in AI applications"
        }
    ],
    temperature=0.7
)

print(response.choices[0].message.content)

# You now get:
# 1. Automatic tracing from OpenLIT
# 2. Gateway features from Portkey (caching, fallbacks, routing)
# 3. Combined insights in Portkey's dashboard
```
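Streaming responses flow through the same traced client, so OpenLIT instrumentation and Portkey's gateway features still apply. As a minimal sketch (using the `client` configured above; `ask_streaming` is a hypothetical helper for this doc, not part of either SDK):

```python
def ask_streaming(client, prompt: str) -> str:
    """Stream a chat completion through Portkey's gateway.

    OpenLIT traces the call end-to-end; the stream is printed
    incrementally and the full text returned.
    """
    stream = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": prompt}],
        stream=True,
    )
    text = ""
    for chunk in stream:
        delta = chunk.choices[0].delta.content
        if delta:  # some chunks (e.g. role-only or final) carry no content
            text += delta
            print(delta, end="", flush=True)
    return text
```

Call it with the same `client` as above, e.g. `ask_streaming(client, "Summarize OpenTelemetry in one sentence")`.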