Provider Slug: predibase
Portkey SDK Integration with Predibase
Using Portkey, you can call your Predibase models through the familiar OpenAI spec and try out your existing pipelines on Predibase fine-tuned models with a 2 LOC change.

1. Install the Portkey SDK
Install the Portkey SDK in your project using npm or pip:
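Both registries host the same portkey-ai package:

```sh
# Node.js projects
npm install --save portkey-ai

# Python projects
pip install portkey-ai
```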
2. Initialize Portkey with the Virtual Key

To use Predibase with Portkey, get your API key from here, then add it to Portkey to create the virtual key.
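A minimal initialization sketch in Python; the two placeholder key values are assumptions you would replace with your own:

```python
from portkey_ai import Portkey

# Authenticate with Portkey and route requests to Predibase via the virtual key
portkey = Portkey(
    api_key="PORTKEY_API_KEY",           # your Portkey API key
    virtual_key="PREDIBASE_VIRTUAL_KEY"  # the virtual key created for Predibase
)
```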
3. Invoke Chat Completions on Predibase Serverless Endpoints

Predibase offers LLMs like Llama 3, Mistral, and Gemma on its serverless infra that you can query instantly.

Sending the Predibase Tenant ID
Predibase expects your account tenant ID along with the API key in each request. With Portkey, you can send your tenant ID with the `user` param while making your request, as in the sketch below.
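A sketch of a serverless chat completion, assuming `llama-3-8b` as the serverless model name and a placeholder tenant ID:

```python
completion = portkey.chat.completions.create(
    model="llama-3-8b",   # a Predibase serverless model
    user="my-tenant-id",  # your Predibase tenant ID
    messages=[{"role": "user", "content": "Say this is a test"}]
)
print(completion.choices[0].message.content)
```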
4. Invoke Predibase Fine-Tuned Models

With Portkey, you can send your fine-tuned model & adapter details directly with the `model` param while making a request.
The format is:
model = <base_model>:<adapter-repo-name/adapter-version-number>
For example, if the base model is `llama-3-8b` and the adapter repo name is `sentiment-analysis`, you can make a request like this:
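A sketch under those assumptions, using adapter version 1 as a placeholder:

```python
completion = portkey.chat.completions.create(
    # <base_model>:<adapter-repo-name/adapter-version-number>
    model="llama-3-8b:sentiment-analysis/1",
    user="my-tenant-id",  # your Predibase tenant ID
    messages=[{"role": "user", "content": "This movie was great!"}]
)
print(completion.choices[0].message.content)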
Routing to Dedicated Deployments
Using Portkey, you can easily route to your dedicated deployments as well. Just pass the dedicated deployment name in the `model` param:
model = "my-dedicated-mistral-deployment-name"
JSON Schema Mode
You can enforce a JSON schema for all Predibase models - just set the `response_format` to `json_object` and pass the relevant schema while making your request. Portkey logs will show your JSON output separately.
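A sketch assuming Predibase reads the JSON schema from a `schema` key inside `response_format`; the schema itself is a made-up example:

```python
completion = portkey.chat.completions.create(
    model="llama-3-8b",
    user="my-tenant-id",
    messages=[{"role": "user", "content": "Classify the sentiment of: 'This movie was great!'"}],
    response_format={
        "type": "json_object",
        "schema": {  # assumption: Predibase enforces this JSON schema on the output
            "type": "object",
            "properties": {
                "sentiment": {"type": "string"},
                "confidence": {"type": "number"}
            },
            "required": ["sentiment", "confidence"]
        }
    }
)
print(completion.choices[0].message.content)  # a JSON string matching the schema
```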