LangSmith
Integrate LangSmith observability with Portkey’s AI gateway for comprehensive LLM monitoring and advanced routing capabilities
LangSmith is LangChain’s observability platform that helps you debug, test, evaluate, and monitor your LLM applications. When combined with Portkey, you get the best of both worlds: LangSmith’s detailed observability and Portkey’s advanced AI gateway features.
This integration allows you to:
- Track all LLM requests in LangSmith while routing through Portkey
- Use Portkey’s 1600+ LLMs with LangSmith observability
- Implement advanced features like caching, fallbacks, and load balancing
- Maintain detailed traces and analytics in both platforms
Quick Start Integration
Since Portkey provides an OpenAI-compatible API, integrating with LangSmith is straightforward using LangSmith’s OpenAI wrapper.
Installation
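A minimal install for the examples below. Only the LangSmith and OpenAI Python SDKs are needed, since Portkey is reached through its OpenAI-compatible endpoint:

```bash
pip install -U langsmith openai
```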
Basic Setup
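A minimal sketch: point the OpenAI client at Portkey’s gateway and wrap it with LangSmith’s `wrap_openai` helper. It assumes LangSmith tracing environment variables (such as `LANGSMITH_API_KEY` and `LANGSMITH_TRACING`) are set and uses Portkey’s documented `x-portkey-*` headers; the provider and model values are illustrative.

```python
import os

from langsmith.wrappers import wrap_openai
from openai import OpenAI

# Route the OpenAI client through Portkey's OpenAI-compatible gateway.
portkey_client = OpenAI(
    api_key=os.environ["OPENAI_API_KEY"],  # forwarded to the provider by Portkey
    base_url="https://api.portkey.ai/v1",
    default_headers={
        "x-portkey-api-key": os.environ["PORTKEY_API_KEY"],
        "x-portkey-provider": "openai",
    },
)

# Wrap the client so every call is traced in LangSmith.
client = wrap_openai(portkey_client)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello from LangSmith + Portkey!"}],
)
print(response.choices[0].message.content)
```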
This integration automatically logs requests to both LangSmith and Portkey, giving you observability data in both platforms.
Using Portkey Features with LangSmith
1. LLM Integrations
LLM Integrations in Portkey allow you to securely manage API keys and set usage limits. Use them with LangSmith for better security:
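A sketch assuming the provider key is stored in a Portkey LLM Integration and referenced by its slug (the `openai-production` slug is a placeholder), so no raw provider key appears in application code:

```python
import os

from langsmith.wrappers import wrap_openai
from openai import OpenAI

client = wrap_openai(OpenAI(
    api_key="not-used",  # the real provider key is resolved by Portkey
    base_url="https://api.portkey.ai/v1",
    default_headers={
        "x-portkey-api-key": os.environ["PORTKEY_API_KEY"],
        # References an LLM Integration created in the Portkey dashboard.
        "x-portkey-virtual-key": "openai-production",
    },
))

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Which key am I using?"}],
)
```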
2. Multiple Providers
Switch between 1600+ LLMs across providers while maintaining LangSmith observability:
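A sketch of switching providers by changing only the integration slug; `openai-prod` and `anthropic-prod` are placeholder slugs, and the model name is passed through to the selected provider:

```python
import os

from langsmith.wrappers import wrap_openai
from openai import OpenAI


def portkey_client(virtual_key: str) -> OpenAI:
    """LangSmith-traced client for a given Portkey integration slug."""
    return wrap_openai(OpenAI(
        api_key="not-used",  # provider keys are resolved by Portkey
        base_url="https://api.portkey.ai/v1",
        default_headers={
            "x-portkey-api-key": os.environ["PORTKEY_API_KEY"],
            "x-portkey-virtual-key": virtual_key,
        },
    ))


# Same LangSmith tracing, different upstream providers (placeholder slugs).
openai_client = portkey_client("openai-prod")
anthropic_client = portkey_client("anthropic-prod")

response = anthropic_client.chat.completions.create(
    model="claude-3-5-sonnet-20241022",  # passed through to Anthropic
    messages=[{"role": "user", "content": "Hello via Portkey"}],
)
```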
3. Advanced Routing with Configs
Use Portkey’s config system for advanced features while tracking in LangSmith:
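A sketch assuming the config is saved in the Portkey dashboard and referenced by ID through the `x-portkey-config` header (`pc-langsmith-prod` is a placeholder ID); the gateway applies the routing while LangSmith still records each request:

```python
import os

from langsmith.wrappers import wrap_openai
from openai import OpenAI

client = wrap_openai(OpenAI(
    api_key="not-used",
    base_url="https://api.portkey.ai/v1",
    default_headers={
        "x-portkey-api-key": os.environ["PORTKEY_API_KEY"],
        # A config saved in the Portkey dashboard (placeholder ID).
        "x-portkey-config": "pc-langsmith-prod",
    },
))
```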
Example config for fallback between providers:
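A sketch of such a config, written as a Python dict and passed inline as a JSON string. Field names follow Portkey’s fallback config schema; the virtual key slugs and the override model are placeholders:

```python
import json
import os

from langsmith.wrappers import wrap_openai
from openai import OpenAI

# Fallback strategy: try the primary target first, then the backup.
fallback_config = {
    "strategy": {"mode": "fallback"},
    "targets": [
        {"virtual_key": "openai-prod"},
        {
            "virtual_key": "anthropic-prod",
            "override_params": {"model": "claude-3-5-sonnet-20241022"},
        },
    ],
}

client = wrap_openai(OpenAI(
    api_key="not-used",
    base_url="https://api.portkey.ai/v1",
    default_headers={
        "x-portkey-api-key": os.environ["PORTKEY_API_KEY"],
        # Configs can also be passed inline as a JSON string.
        "x-portkey-config": json.dumps(fallback_config),
    },
))
```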
4. Caching for Cost Optimization
Enable caching to reduce costs while maintaining full observability:
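A sketch assuming a `cache` block in the config (simple exact-match caching with a TTL in seconds, per Portkey’s config schema); the virtual key slug and TTL are placeholders:

```python
import json
import os

from langsmith.wrappers import wrap_openai
from openai import OpenAI

cache_config = {
    "cache": {"mode": "simple", "max_age": 3600},
    "virtual_key": "openai-prod",
}

client = wrap_openai(OpenAI(
    api_key="not-used",
    base_url="https://api.portkey.ai/v1",
    default_headers={
        "x-portkey-api-key": os.environ["PORTKEY_API_KEY"],
        "x-portkey-config": json.dumps(cache_config),
    },
))

# Identical requests after the first are served from Portkey's cache,
# while every call is still traced in LangSmith.
for _ in range(2):
    client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": "What is response caching?"}],
    )
```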
5. Custom Metadata and Tracing
Add custom metadata visible in both LangSmith and Portkey:
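A sketch combining LangSmith’s `traceable` decorator with Portkey’s `x-portkey-metadata` and `x-portkey-trace-id` request headers; the metadata keys and values here are placeholders:

```python
import json
import os
import uuid

from langsmith import traceable
from langsmith.wrappers import wrap_openai
from openai import OpenAI

client = wrap_openai(OpenAI(
    api_key=os.environ["OPENAI_API_KEY"],
    base_url="https://api.portkey.ai/v1",
    default_headers={
        "x-portkey-api-key": os.environ["PORTKEY_API_KEY"],
        "x-portkey-provider": "openai",
    },
))


@traceable(name="support-bot", metadata={"environment": "staging"})  # shown in LangSmith
def answer(question: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": question}],
        # Per-request headers surface the same context in Portkey's logs.
        extra_headers={
            "x-portkey-trace-id": str(uuid.uuid4()),
            "x-portkey-metadata": json.dumps(
                {"environment": "staging", "user": "user-123"}
            ),
        },
    )
    return response.choices[0].message.content


print(answer("How do I add custom metadata?"))
```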
Fallbacks
Automatically switch to backup targets if the primary target fails.
Conditional Routing
Route requests to different targets based on specified conditions.
Load Balancing
Distribute requests across multiple targets based on defined weights (see the config sketch after this list).
Caching
Enable caching of responses to improve performance and reduce costs.
Smart Retries
Automatic retry handling with exponential backoff for failed requests.
Budget Limits
Set and manage budget limits across teams and departments. Control costs with granular budget limits and usage tracking.
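As a rough sketch of how the load balancing and retry features above map to a gateway config (field names follow Portkey’s config schema; the weights, attempt count, and slugs are placeholders), passed inline or saved as a config exactly as in the earlier examples:

```python
# Split traffic 70/30 across two providers and retry transient failures.
loadbalance_config = {
    "strategy": {"mode": "loadbalance"},
    "targets": [
        {"virtual_key": "openai-prod", "weight": 0.7},
        {"virtual_key": "anthropic-prod", "weight": 0.3},
    ],
    "retry": {"attempts": 3},
}
```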
Observability Features
With this integration, you get:
In LangSmith:
- Request/response logging
- Latency tracking
- Token usage analytics
- Cost calculation
- Trace visualization
In Portkey:
- Request logs with provider details
- Advanced analytics across providers
- Cost tracking and budgets
- Performance metrics
- Custom dashboards
- Token usage analytics

Migration Guide
If you’re already using LangSmith with OpenAI, migrating to use Portkey is simple:
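A before/after sketch, assuming an existing `wrap_openai` setup that calls OpenAI directly; only the client construction changes, and existing call sites stay the same (header names follow Portkey’s convention):

```python
import os

from langsmith.wrappers import wrap_openai
from openai import OpenAI

# Before: LangSmith tracing straight to OpenAI.
client = wrap_openai(OpenAI(api_key=os.environ["OPENAI_API_KEY"]))

# After: same wrapper, same calls; only the client now points at Portkey.
client = wrap_openai(OpenAI(
    api_key=os.environ["OPENAI_API_KEY"],  # forwarded to OpenAI by Portkey
    base_url="https://api.portkey.ai/v1",
    default_headers={
        "x-portkey-api-key": os.environ["PORTKEY_API_KEY"],
        "x-portkey-provider": "openai",
    },
))

# Existing call sites need no changes.
client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Migration check"}],
)
```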
Next Steps
- Create LLM Integrations for secure API key management
- Build Configs for advanced routing
- Set up Guardrails for content filtering
- Implement Caching for cost optimization
Resources
For enterprise support and custom features, contact our enterprise team.