Inference.net
Portkey provides a robust and secure gateway to facilitate the integration of various Large Language Models (LLMs) into your applications, including the models hosted on Inference.net.
Provider slug: `inference-net`
Portkey SDK Integration with Inference.net
Portkey provides a consistent API to interact with models from various providers. To integrate Inference.net with Portkey:
1. Install the Portkey SDK
2. Initialize Portkey with Inference.net Authorization
   - Set the `provider` name as `inference-net`
   - Pass your Inference.net API key with the `Authorization` header
3. Invoke Chat Completions (a sketch of all three steps follows below)
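A minimal sketch of these steps with the Node.js SDK, assuming the standard `portkey-ai` package (install with `npm install portkey-ai`). The API key placeholders and the model ID are illustrative; substitute a model name that Inference.net actually serves.

```typescript
import Portkey from 'portkey-ai';

// Initialize the Portkey client and route requests to Inference.net
const portkey = new Portkey({
  apiKey: 'PORTKEY_API_KEY',                     // your Portkey API key
  provider: 'inference-net',                     // provider slug for Inference.net
  Authorization: 'Bearer INFERENCE_NET_API_KEY', // your Inference.net API key
});

async function main() {
  // Invoke a chat completion through the Portkey gateway
  const chatCompletion = await portkey.chat.completions.create({
    model: 'llama-3.1-8b-instruct', // placeholder: use a model ID supported by Inference.net
    messages: [{ role: 'user', content: 'Say this is a test' }],
  });

  console.log(chatCompletion.choices[0].message.content);
}

main();
```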
Supported Models
Find more information about the models supported by Inference.net here.
Next Steps
The complete list of features supported in the SDK is available at the link below.
SDK
You’ll find more information in the relevant sections.