# Batches

Perform batch inference with OpenAI
With Portkey, you can perform OpenAI Batch Inference operations. This is the most efficient way to:

- Test your data with different foundation models
- A/B test across different foundation models
- Run batch inference jobs at scale with different foundation models
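The OpenAI Batch API takes a JSONL input file in which each line is one request. A minimal sketch of preparing such a file for a model comparison, assuming illustrative model names and prompts:

```python
import json

def build_batch_line(custom_id, model, prompt):
    """Build one JSONL line in the OpenAI Batch API input format."""
    return json.dumps({
        "custom_id": custom_id,         # your identifier, echoed back in the output file
        "method": "POST",
        "url": "/v1/chat/completions",  # endpoint each request is replayed against
        "body": {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
    })

# One input file per model lets you compare foundation models on the same prompts.
prompts = ["Summarize: ...", "Translate to French: ..."]
for model in ["gpt-4o-mini", "gpt-4o"]:
    lines = [build_batch_line(f"{model}-{i}", model, p) for i, p in enumerate(prompts)]
    with open(f"batch_input_{model}.jsonl", "w") as f:
        f.write("\n".join(lines))
```

Each generated file can then be uploaded and referenced when starting a batch.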
Portkey supports the following batch operations:

- Start Batch Inference
- List Batch Inferences
- Get Batch Inference
- Cancel Batch Inference
- Get Batch Output
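Each operation above corresponds to a standard OpenAI Batch API endpoint, which the gateway proxies. A sketch of the method/path mapping; the base URL is an assumption for illustration, and `batch_endpoint` is a hypothetical helper, not part of any SDK:

```python
BASE_URL = "https://api.portkey.ai/v1"  # Portkey gateway base URL (assumed)

def batch_endpoint(operation, batch_id=None, output_file_id=None):
    """Return the (HTTP method, path) pair for a batch operation."""
    endpoints = {
        "start":  ("POST", "/batches"),
        "list":   ("GET",  "/batches"),
        "get":    ("GET",  f"/batches/{batch_id}"),
        "cancel": ("POST", f"/batches/{batch_id}/cancel"),
        "output": ("GET",  f"/files/{output_file_id}/content"),
    }
    return endpoints[operation]

# Example: cancelling a running batch by id
method, path = batch_endpoint("cancel", batch_id="batch_abc123")
print(method, BASE_URL + path)
```

Batch output is fetched through the Files API: the completed batch object carries an `output_file_id`, whose content is the JSONL of results.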
Note: This is a Gateway-only feature.