Files
Upload files to S3 for Bedrock batch inference
To perform batch inference with Bedrock, you need to upload your input files to S3. Done directly, this process is cumbersome and duplicative because you have to transform your data into each model's specific format.
With Portkey, you can upload the file in OpenAI format, and Portkey will transform it into the format required by Bedrock on the fly!
This is the most efficient way to:
- Test your data with different foundation models
- Perform A/B testing with different foundation models
- Perform batch inference with different foundation models
Uploading Files
This is a Gateway-only feature.
- `YOUR_AWS_S3_BUCKET`: The S3 bucket to upload the file to.
- `YOUR_AWS_S3_OBJECT_KEY`: The object key (file name) to use for the uploaded file in the bucket.
- `YOUR_AWS_BEDROCK_MODEL`: The Bedrock model you want to use. Portkey uses this to transform the file from OpenAI format into that model's Bedrock format.
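Below is a minimal sketch of an upload using the OpenAI Python SDK pointed at the Portkey gateway. The `x-portkey-aws-*` header names are assumptions derived from the placeholders above, not confirmed API names; check Portkey's API reference for the exact headers your gateway version expects.

```python
# Minimal sketch: upload an OpenAI-format JSONL file through the Portkey gateway,
# which transforms it and writes the Bedrock-formatted file to your S3 bucket.
# NOTE: the x-portkey-aws-* header names below are assumptions based on the
# placeholders on this page (YOUR_AWS_S3_BUCKET, etc.).
from openai import OpenAI

client = OpenAI(
    api_key="dummy",  # not used; auth happens via the Portkey headers below
    base_url="https://api.portkey.ai/v1",  # or your self-hosted gateway URL
    default_headers={
        "x-portkey-api-key": "YOUR_PORTKEY_API_KEY",
        "x-portkey-provider": "bedrock",
        # AWS / Bedrock specific settings (assumed header names):
        "x-portkey-aws-s3-bucket": "YOUR_AWS_S3_BUCKET",
        "x-portkey-aws-s3-object-key": "YOUR_AWS_S3_OBJECT_KEY",
        "x-portkey-aws-bedrock-model": "YOUR_AWS_BEDROCK_MODEL",
    },
)

# The local file stays in OpenAI batch format; Portkey converts each line to
# the target Bedrock model's format while uploading to S3.
uploaded = client.files.create(
    file=open("batch_requests.jsonl", "rb"),
    purpose="batch",
)
print(uploaded.id)
```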
Get File Content
This is a Gateway-only feature.
`file_id`: The S3 URI of the file, for example `s3://your-bucket-name/your-file-name.jsonl`.
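A corresponding sketch for reading the file back, assuming the same client setup as in the upload example; the `file_id` is the `s3://` URI of the uploaded file.

```python
# Minimal sketch: retrieve the file content via its S3 URI, reusing the
# client configured in the upload example above.
content = client.files.content(
    file_id="s3://your-bucket-name/your-file-name.jsonl",
)
print(content.text)  # JSONL body of the file
```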
The following endpoints are NOT supported for Bedrock:
- `GET /files`
- `GET /files/{file_id}`