Open WebUI
Cost tracking, observability, and more for Open WebUI
Open WebUI is the most loved open-source web interface for running LLMs. Integrating Portkey with Open WebUI is as simple as writing about 50 lines of code. Open WebUI has dozens of provider plugins (Anthropic, Vertex AI, and others), but they can go stale and may not be maintained. Portkey is a unified interface for all your LLM providers, so it’s the only plugin you’ll need for model management, cost tracking, observability, metadata logging, and more.
If you’re an IT admin planning to deploy a centralized instance of Open WebUI and track usage and limits, Portkey fits right in.
Common use cases
- Cost tracking: Track usage and limits for all your users and models.
- Observability: Get detailed observability into all your LLM traffic.
- Metadata logging: Log every request and response, tagged by user, model, provider, and other custom metadata.
- Access control: Control access to your Open WebUI instance using Portkey RBAC.
Integrate Portkey with Open WebUI
Portkey follows the OpenAI API specification, so it’s fully compatible with Open WebUI. It supports function calling, streaming, image generation, prompting with documents, and more out of the box.
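For example, you can point the official OpenAI SDK at Portkey’s gateway and everything works unchanged. A minimal sketch, assuming the `portkey-ai` Python package and a placeholder model name:

```python
from openai import OpenAI
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

# Authentication happens through the Portkey headers,
# so the OpenAI api_key is just a placeholder here.
client = OpenAI(
    api_key="not-used",
    base_url=PORTKEY_GATEWAY_URL,
    default_headers=createHeaders(
        api_key="YOUR_PORTKEY_API_KEY",
        virtual_key="YOUR_VIRTUAL_KEY",
    ),
)

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder; use any model your provider supports
    messages=[{"role": "user", "content": "Hello from Portkey!"}],
)
print(response.choices[0].message.content)
```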
There are multiple ways to use Portkey with Open WebUI, but they all rest on the same building block: an Open WebUI pipe.
The following is an example using Portkey Virtual Keys. Feel free to edit the plugin once installed to use advanced features like Configs, Conditional routing, etc.
Step 1: Install Portkey Plugin in Open WebUI
You can read more about Open WebUI plugins here.
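If you want a sense of what such a pipe involves before installing, here is a minimal sketch (not the official plugin; the valve names and model list are placeholders) that forwards chat requests to Portkey’s OpenAI-compatible endpoint:

```python
"""
title: Portkey Pipe (illustrative sketch)
requirements: requests
"""

import requests
from pydantic import BaseModel, Field

PORTKEY_CHAT_URL = "https://api.portkey.ai/v1/chat/completions"


class Pipe:
    class Valves(BaseModel):
        # Valve names here are illustrative; the installed plugin may differ.
        PORTKEY_API_KEY: str = Field(default="", description="Your Portkey API key")
        PORTKEY_VIRTUAL_KEY: str = Field(
            default="", description="Virtual key for your LLM provider"
        )

    def __init__(self):
        self.valves = self.Valves()

    def pipes(self):
        # Each entry shows up as a selectable model in Open WebUI.
        return [{"id": "gpt-4o", "name": "GPT-4o (Portkey)"}]

    def pipe(self, body: dict):
        # Open WebUI prefixes the model id with the function id; strip it.
        model = body.get("model", "").split(".", 1)[-1]
        # Streaming is omitted here for brevity.
        payload = {**body, "model": model, "stream": False}

        response = requests.post(
            PORTKEY_CHAT_URL,
            json=payload,
            headers={
                "x-portkey-api-key": self.valves.PORTKEY_API_KEY,
                "x-portkey-virtual-key": self.valves.PORTKEY_VIRTUAL_KEY,
                "Content-Type": "application/json",
            },
            timeout=60,
        )
        response.raise_for_status()
        return response.json()
```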
Step 2: Create a Portkey Virtual Key in the Portkey Console
Skip this step if you’re using the open source version of Portkey.
Create a virtual key for each of your LLM providers in the Portkey console.
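To sanity-check a virtual key before wiring it into Open WebUI, you can make a one-off request with Portkey’s Python SDK. A minimal sketch (the model name is a placeholder):

```python
from portkey_ai import Portkey

portkey = Portkey(
    api_key="YOUR_PORTKEY_API_KEY",  # from the Portkey console
    virtual_key="YOUR_VIRTUAL_KEY",  # the virtual key created above
)

completion = portkey.chat.completions.create(
    model="gpt-4o",  # placeholder; use a model your provider supports
    messages=[{"role": "user", "content": "Say hello"}],
)
print(completion.choices[0].message.content)
```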
Step 3: Update the Open WebUI Pipe variables
- In the Functions section of your Open WebUI workspace settings, click the Edit button. (Alternatively, you can click the Valves button and enter the values in the UI, but updating the code directly is recommended.) Note that this step can only be done by the Open WebUI workspace admin.
- Input your Portkey API key and the virtual keys for the LLM providers you want to use.
- Add model names in the `pipes` function (see the sketch after this list).
- Save the changes.
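For reference, the parts of the pipe you would typically touch in this step look something like the following (valve names and model ids are illustrative, not the plugin’s exact code):

```python
from pydantic import BaseModel, Field


class Pipe:
    class Valves(BaseModel):
        # Fill these in (or set them via the Valves UI).
        PORTKEY_API_KEY: str = Field(default="", description="Your Portkey API key")
        PORTKEY_VIRTUAL_KEY: str = Field(default="", description="Virtual key for the provider")

    def pipes(self):
        # Add one entry per model to expose in Open WebUI's model picker.
        return [
            {"id": "gpt-4o", "name": "GPT-4o (Portkey)"},
            {"id": "claude-3-5-sonnet", "name": "Claude 3.5 Sonnet (Portkey)"},
        ]
```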
Step 4: Test the integration
Chat with one of the models and see the cost, tokens, and metadata logged in the Portkey console!
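If you also want each request attributed to the Open WebUI user who made it, the pipe can attach Portkey metadata. A sketch of a hypothetical helper, assuming Portkey’s `x-portkey-metadata` header with its reserved `_user` key, and the user dict that Open WebUI passes to a pipe as `__user__`:

```python
import json


def portkey_headers(api_key: str, virtual_key: str, user: dict) -> dict:
    """Build Portkey headers that attribute a request to an Open WebUI user.

    `user` is the dict Open WebUI passes into a pipe as `__user__`.
    """
    return {
        "x-portkey-api-key": api_key,
        "x-portkey-virtual-key": virtual_key,
        # `_user` is Portkey's reserved metadata key for user attribution.
        "x-portkey-metadata": json.dumps({"_user": user.get("email", "unknown")}),
        "Content-Type": "application/json",
    }
```

With this in place, per-user costs and logs can be filtered by that metadata in the Portkey console.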
Support for the Open WebUI Integration
If you face any issues integrating Portkey with Open WebUI, please ping the Portkey team on our community forum here.