LibreChat
Integrate Portkey with LibreChat for enhanced AI capabilities
LibreChat is an “enhanced ChatGPT clone”. You can deploy it internally (or externally) to instantly spin up a Chat UI for any of your AI use cases.
Portkey natively integrates with LibreChat and makes your LibreChat deployments production-grade and reliable with our suite of features:
- Full-stack observability and tracing for all requests
- Interoperability across 250+ LLMs
- 50+ built-in, state-of-the-art guardrails
- Simple & semantic caching to save costs & time
- Conditional request routing with fallbacks, load-balancing, automatic retries, and more
- Continuous improvement based on user feedback
Integrate Portkey with LibreChat
Step 1: Create the `docker-compose-override.yaml` file
Create this file following the instructions here. This file will point to the `librechat.yaml` file where we will configure our Portkey settings (in Step 3).
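For reference, a minimal override that mounts your local `librechat.yaml` into the container might look like the sketch below. It assumes the default `api` service name and container path from LibreChat's standard Docker setup; adjust it to match your own compose files.

```yaml
# Mount the local librechat.yaml into the LibreChat api container
services:
  api:
    volumes:
      - type: bind
        source: ./librechat.yaml
        target: /app/librechat.yaml
```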
Step 2: Configure the `.env` file
Edit your existing `.env` file at the project root (if the file does not exist, copy the `.env.example` file and rename it to `.env`). We will add our Portkey credentials here.
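Below is a minimal sketch of the variables to add. The variable names are examples that the `librechat.yaml` configuration in Step 3 refers to, and the values shown are placeholders; replace them with your own Portkey API key and gateway URL.

```bash
# Portkey credentials referenced from librechat.yaml (placeholder values)
PORTKEY_API_KEY=your_portkey_api_key
PORTKEY_GATEWAY_URL=https://api.portkey.ai/v1
```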
Step 3: Edit the `librechat.yaml` file
Edit this file following the instructions here. You can either pass a Portkey Config (containing your provider/model configurations) or a provider Virtual Key saved on Portkey. LibreChat requires the API key field to be present; since the Portkey integration does not use it, you can pass a dummy string.
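As a reference, a custom endpoint entry pointing at the Portkey gateway might look like the sketch below. The environment variable names, the virtual key header value, and the model names are illustrative, not prescriptive; if you are routing through a saved Portkey Config instead of a Virtual Key, pass an `x-portkey-config` header in place of `x-portkey-virtual-key`.

```yaml
# librechat.yaml — custom endpoint routed through the Portkey gateway (illustrative values)
version: 1.1.4            # use the config version your LibreChat release expects
cache: true
endpoints:
  custom:
    - name: "Portkey"
      apiKey: "dummy"                                   # required by LibreChat, unused by Portkey
      baseURL: "${PORTKEY_GATEWAY_URL}"
      headers:
        x-portkey-api-key: "${PORTKEY_API_KEY}"
        x-portkey-virtual-key: "${PORTKEY_OPENAI_VIRTUAL_KEY}"   # or: x-portkey-config: "${PORTKEY_CONFIG_ID}"
      models:
        default: ["gpt-4o"]                             # example model list
        fetch: true
      titleConvo: true
      titleModel: "current_model"
      modelDisplayLabel: "Portkey"
```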
If you’re a system admin looking to track costs per user on a centralized LibreChat instance, here’s a guide by Tim Manik.
Support for the LibreChat Integration
If you face any issues integrating Portkey with LibreChat, please reach out to the Portkey team on our community forum here.