Portkey provides a robust and secure gateway to facilitate the integration of various Large Language Models (LLMs) into your applications, including your locally hosted models through LocalAI.
Portkey SDK Integration with LocalAI
1. Install the Portkey SDK
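The SDK is published as `portkey-ai` on both npm and PyPI:

```shell
# Node.js
npm install --save portkey-ai

# Python
pip install portkey-ai
```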
2. Initialize Portkey with LocalAI URL
First, ensure that your API is externally accessible. If you're running the API on `http://localhost`, consider using a tool like ngrok to create a public URL. Then, instantiate the Portkey client by adding your LocalAI URL (along with the version identifier) to the `customHost` property, and set the provider name to `openai`.
Note: Don't forget to include the version identifier (e.g., `/v1`) in the `customHost` URL.
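Under the hood, these client settings map onto Portkey's `x-portkey-*` request headers. The sketch below assumes the `x-portkey-provider` and `x-portkey-custom-host` header names and uses a placeholder ngrok URL:

```typescript
// Sketch of the headers sent to Portkey's gateway when routing to a
// self-hosted, OpenAI-compatible server such as LocalAI.
function buildPortkeyHeaders(
  portkeyApiKey: string,
  customHost: string
): Record<string, string> {
  // Guard against the most common mistake: omitting the version identifier.
  if (!/\/v1\/?$/.test(customHost)) {
    throw new Error("customHost must include the version identifier, e.g. /v1");
  }
  return {
    "Content-Type": "application/json",
    "x-portkey-api-key": portkeyApiKey,  // your Portkey API key
    "x-portkey-provider": "openai",      // LocalAI follows the OpenAI spec
    "x-portkey-custom-host": customHost, // e.g. "https://<subdomain>.ngrok.app/v1"
  };
}
```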
Portkey currently supports all endpoints that adhere to the OpenAI specification, so you can access and observe any of your LocalAI models that are exposed through OpenAI-compliant routes.
List of supported endpoints here.
3. Invoke Chat Completions
Use the Portkey SDK to invoke chat completions from your LocalAI model, just as you would with any other provider.
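As a sketch, the same call can be made against the gateway's REST endpoint directly. This assumes `https://api.portkey.ai/v1/chat/completions` as the gateway URL; the model id is a placeholder for whatever model your LocalAI instance serves:

```typescript
// Minimal sketch of a chat-completions call through Portkey's REST gateway,
// targeting an OpenAI-compatible LocalAI server behind customHost.
async function chatWithLocalAI(
  portkeyApiKey: string,
  customHost: string, // must include the version identifier, e.g. ".../v1"
  model: string,      // placeholder: whatever model your LocalAI serves
  userMessage: string
): Promise<string> {
  const response = await fetch("https://api.portkey.ai/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "x-portkey-api-key": portkeyApiKey,
      "x-portkey-provider": "openai",
      "x-portkey-custom-host": customHost,
    },
    body: JSON.stringify({
      model,
      messages: [{ role: "user", content: userMessage }],
    }),
  });
  const data = await response.json();
  return data.choices[0].message.content;
}
```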
Using Virtual Keys
Virtual Keys serve as Portkey’s unified authentication system for all LLM interactions, simplifying the use of multiple providers and Portkey features within your application. For self-hosted LLMs, you can configure custom authentication requirements including authorization keys, bearer tokens, or any other headers needed to access your model:
- Navigate to Virtual Keys in your Portkey dashboard
- Click "Add Key" and enable the "Local/Privately hosted provider" toggle
- Configure your deployment:
  - Select the matching provider API specification (typically `OpenAI`)
  - Enter your model's base URL in the `Custom Host` field
  - Add required authentication headers and their values
- Click "Create" to generate your virtual key
You can now use this virtual key in your requests:
For more information about managing self-hosted LLMs with Portkey, see Bring Your Own LLM.
LocalAI Endpoints Supported
| Endpoint | Resource |
|---|---|
| `/chat/completions` (Chat, Vision, Tools support) | Doc |
| `/images/generations` | Doc |
| `/embeddings` | Doc |
| `/audio/transcriptions` | Doc |
Next Steps
Explore the complete list of features supported in the SDK.