Ollama
Portkey provides a robust and secure gateway for integrating various Large Language Models (LLMs) into your applications, including models you host locally through Ollama.
Provider slug: `ollama`
Portkey SDK Integration with Ollama Models
Portkey provides a consistent API to interact with models from various providers. Following is an example that shows how to proxy requests to any Ollama model through Portkey:
1. Expose your Ollama API
Expose your Ollama API using a tunneling service like ngrok or any other method you prefer. You can skip this step if you’re self-hosting the Gateway.
For using Ollama with ngrok, here’s a useful guide.
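If you go with ngrok, the command looks roughly like this (Ollama listens on port 11434 by default; rewriting the Host header is needed because Ollama validates it against localhost):

```sh
# Expose the local Ollama server (default port 11434) through a public ngrok URL.
# The Host header rewrite lets Ollama accept the tunneled requests.
ngrok http 11434 --host-header="localhost:11434"
```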
2. Install the Portkey SDK
Install the Portkey SDK in your application to interact with your Ollama API through Portkey.
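For example, for the Node.js SDK (the sketches below use Node.js; a Python SDK is also available via pip):

```sh
npm install --save portkey-ai
```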
3. Initialize Portkey with Ollama URL
Instantiate the Portkey client by adding your Ollama publicly-exposed URL to the **customHost** property.
For the Ollama integration, you only need to pass the base URL to **customHost** without the version identifier (such as **/v1**). Portkey takes care of the rest!
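Here’s a minimal Node.js sketch; the ngrok URL is a placeholder, so substitute the public URL from step 1:

```ts
import Portkey from 'portkey-ai';

const portkey = new Portkey({
  apiKey: 'PORTKEY_API_KEY',  // your Portkey API key
  provider: 'ollama',         // route requests to your Ollama deployment
  customHost: 'https://1a2b-203-0-113-42.ngrok-free.app', // base URL only, no /v1
});
```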
4. Invoke Chat Completions with Ollama
Use the Portkey SDK to invoke chat completions from your Ollama model, just as you would with any other provider.
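Continuing the sketch from step 3 (the model name assumes you have already pulled it locally, e.g. with `ollama pull llama3`):

```ts
const chatCompletion = await portkey.chat.completions.create({
  model: 'llama3', // any model available in your Ollama instance
  messages: [{ role: 'user', content: 'Say this is a test' }],
});

console.log(chatCompletion.choices[0].message.content);
```

Because the request shape mirrors the OpenAI-style chat completions API, switching from Ollama to a hosted provider only requires changing the client configuration, not the call itself.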
Next Steps
Explore the complete list of features supported in the Portkey SDK.