Perform batch inference with Bedrock
To perform batch inference directly with Bedrock, you need to upload files to S3 and transform your data into each model's specific format, which is cumbersome and duplicative.
With Portkey, you upload the file in OpenAI format and Portkey transforms it into the format Bedrock requires on the fly.
This is the most efficient way to:
- Test your data with different foundation models
- Perform A/B testing with different foundation models
- Perform batch inference with different foundation models
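The flow above can be sketched with an OpenAI-compatible client: build a batch input file in the OpenAI JSONL format, upload it through the gateway, and start the batch. The base URL, header names (`x-portkey-api-key`, `x-portkey-provider`), and the Bedrock model ID below are illustrative assumptions, not verified values.

```python
import json
import os

# Build a batch input file in the OpenAI JSONL format. Each line is one
# request; Portkey transforms these into Bedrock's format on the fly.
requests = [
    {
        "custom_id": f"request-{i}",
        "method": "POST",
        "url": "/v1/chat/completions",
        "body": {
            # Example Bedrock model ID (assumption) -- use your own.
            "model": "anthropic.claude-3-sonnet-20240229-v1:0",
            "messages": [{"role": "user", "content": prompt}],
        },
    }
    for i, prompt in enumerate(["Hello!", "Summarize batch inference in one line."])
]

with open("batch_input.jsonl", "w") as f:
    for req in requests:
        f.write(json.dumps(req) + "\n")

# Upload the file and create the batch via the OpenAI-compatible API.
# Guarded by an env var so the script runs without credentials.
if os.environ.get("PORTKEY_API_KEY"):
    from openai import OpenAI

    client = OpenAI(
        api_key="dummy",  # the gateway authenticates via Portkey headers
        base_url="https://api.portkey.ai/v1",  # assumed gateway URL
        default_headers={
            "x-portkey-api-key": os.environ["PORTKEY_API_KEY"],
            "x-portkey-provider": "bedrock",  # assumed header name
        },
    )
    batch_file = client.files.create(
        file=open("batch_input.jsonl", "rb"), purpose="batch"
    )
    batch = client.batches.create(
        input_file_id=batch_file.id,
        endpoint="/v1/chat/completions",
        completion_window="24h",
    )
    print(batch.id, batch.status)
```

Because the input file stays in OpenAI format, the same JSONL can be resubmitted against a different provider by changing only the provider header and model ID.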
Start Batch Inference
List Batch Inferences
Get Batch Inference
Cancel Batch Inference
Get Batch Output
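The operations listed above map onto OpenAI-compatible batch endpoints, so a polling loop over a batch's lifecycle might look like the sketch below. The status values follow the OpenAI Batch API; assuming the gateway reports the same statuses for Bedrock batches is an assumption of this sketch.

```python
import time

# Terminal states in the OpenAI Batch API lifecycle (assumed to match
# what the gateway reports for Bedrock batches).
TERMINAL_STATUSES = {"completed", "failed", "expired", "cancelled"}


def is_terminal(status: str) -> bool:
    """True once a batch can no longer make progress."""
    return status in TERMINAL_STATUSES


def wait_for_batch(client, batch_id: str, poll_seconds: int = 30):
    """Poll a batch until it reaches a terminal state, then return it.

    `client` is an OpenAI-compatible client pointed at the gateway.
    """
    while True:
        batch = client.batches.retrieve(batch_id)  # Get Batch Inference
        if is_terminal(batch.status):
            return batch
        time.sleep(poll_seconds)


# The other operations from the list above, for reference:
#   client.batches.list()                       # List Batch Inferences
#   client.batches.cancel(batch_id)             # Cancel Batch Inference
#   client.files.content(batch.output_file_id)  # Get Batch Output
```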
This is a Gateway-only feature.