Provider Slug: `bedrock`
Portkey SDK Integration with AWS Bedrock
Portkey provides a consistent API to interact with models from various providers. To integrate Bedrock with Portkey:

1. Install the Portkey SDK
Add the Portkey SDK to your application to interact with Bedrock's API through Portkey's gateway.
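For example, with the Python SDK (this assumes the published `portkey-ai` package):

```sh
pip install portkey-ai
```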
2. Initialize Portkey with the Bedrock Provider

There are two ways to integrate AWS Bedrock with Portkey:

AWS Access Key
Use your AWS Secret Access Key, AWS Access Key ID, and AWS Region to create your AI Provider on Portkey's app.

Integration Guide
AWS Assumed Role

Take your AWS Assumed Role ARN and AWS Region to create your AI Provider on Portkey's app.

Integration Guide
Using Bedrock Provider with AWS STS

If you're using AWS Security Token Service, you can pass your `aws_session_token` along with the AI Provider slug:
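A minimal sketch with the Python SDK, assuming the client accepts an `aws_session_token` keyword and that `@bedrock-provider` stands in for your actual AI Provider slug (all values are placeholders):

```python
from portkey_ai import Portkey

# Assumption: the Bedrock provider created in Portkey's Model Catalog is referenced
# by its slug, and the temporary STS token is forwarded as aws_session_token.
portkey = Portkey(
    api_key="PORTKEY_API_KEY",             # your Portkey API key
    provider="@bedrock-provider",           # hypothetical slug of your Bedrock AI Provider
    aws_session_token="AWS_SESSION_TOKEN",  # temporary credentials from AWS STS
)
```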
Not using Bedrock Provider from Model Catalog?
Check out this example on how you can directly use your AWS details to make a Bedrock request through Portkey.

3. Invoke Chat Completions with AWS Bedrock
Use the Portkey instance to send requests to Bedrock. You can also override the provider slug directly in the API call if needed.
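A minimal sketch, assuming the provider created above and an illustrative Claude model ID (swap in any Bedrock model ID you have access to):

```python
from portkey_ai import Portkey

portkey = Portkey(
    api_key="PORTKEY_API_KEY",
    provider="@bedrock-provider",  # hypothetical slug of your Bedrock AI Provider
)

# Standard OpenAI-compatible chat completion routed to Bedrock
completion = portkey.chat.completions.create(
    model="anthropic.claude-3-sonnet-20240229-v1:0",  # example Bedrock model ID
    messages=[{"role": "user", "content": "Say this is a test"}],
    max_tokens=250,
)
print(completion.choices[0].message.content)
```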
Using the /messages Route with Bedrock Models

Access Bedrock's Claude models through Anthropic's native `/messages` endpoint using Portkey's SDK or Anthropic's SDK.
This route only works with Claude models on Bedrock. For other models, use the standard OpenAI compliant endpoint.
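A rough sketch using plain HTTP against Portkey's gateway; the gateway URL and header names below (`x-portkey-api-key`, `x-portkey-provider`) are assumptions based on Portkey's usual REST conventions, so verify them against the endpoint reference:

```python
import requests

# Assumption: Portkey exposes Anthropic's /v1/messages route on its gateway.
response = requests.post(
    "https://api.portkey.ai/v1/messages",
    headers={
        "x-portkey-api-key": "PORTKEY_API_KEY",
        "x-portkey-provider": "@bedrock-provider",  # hypothetical Bedrock provider slug
        "content-type": "application/json",
    },
    json={
        "model": "anthropic.claude-3-sonnet-20240229-v1:0",  # example Claude model on Bedrock
        "max_tokens": 250,
        "messages": [{"role": "user", "content": "Hello, Claude"}],
    },
)
print(response.json())
```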
Counting Tokens
Portkey also supports the token counting endpoint for Bedrock. Check out the example in this link for more details.
Using Vision Models
Portkey's multimodal Gateway fully supports Bedrock's vision models `anthropic.claude-3-sonnet`, `anthropic.claude-3-haiku`, and `anthropic.claude-3-opus`.
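A minimal sketch of a vision request in the OpenAI-compatible content format (the image URL and model ID are placeholders):

```python
from portkey_ai import Portkey

portkey = Portkey(
    api_key="PORTKEY_API_KEY",
    provider="@bedrock-provider",  # hypothetical Bedrock AI Provider slug
)

# Multimodal message: text plus an image URL, in the OpenAI-compatible format
response = portkey.chat.completions.create(
    model="anthropic.claude-3-sonnet-20240229-v1:0",
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": "What is in this image?"},
            {"type": "image_url", "image_url": {"url": "https://example.com/image.jpg"}},
        ],
    }],
    max_tokens=300,
)
print(response.choices[0].message.content)
```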
For more info, check out this guide:
Vision
Extended Thinking (Reasoning Models) (Beta)
The assistant's thinking response is returned in the `response_chunk.choices[0].delta.content_blocks` array, not the `response.choices[0].message.content` string.

Models like `us.anthropic.claude-3-7-sonnet-20250219-v1:0` support extended thinking.
This is similar to OpenAI's reasoning models, but you also get the model's reasoning as it processes the request.
Note that you will have to set `strict_open_ai_compliance=False` in the headers to use this feature.
Single turn conversation
Multi turn conversation
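A single-turn sketch, assuming the Python SDK accepts `strict_open_ai_compliance` at client construction and forwards the `thinking` parameter to Bedrock (these parameter names are assumptions; the model ID is the one noted above):

```python
from portkey_ai import Portkey

portkey = Portkey(
    api_key="PORTKEY_API_KEY",
    provider="@bedrock-provider",    # hypothetical Bedrock AI Provider slug
    strict_open_ai_compliance=False,  # required to receive extended thinking responses
)

response = portkey.chat.completions.create(
    model="us.anthropic.claude-3-7-sonnet-20250219-v1:0",
    max_tokens=3000,
    # Assumed passthrough of Anthropic's extended thinking configuration
    thinking={"type": "enabled", "budget_tokens": 2030},
    messages=[{"role": "user", "content": "When does a perfect square have an odd number of factors?"}],
)

# Per the note above, the reasoning arrives in content_blocks rather than plain message content
print(response)
```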
Inference Profiles
Inference profiles are a resource in Amazon Bedrock that define a model and one or more Regions to which the inference profile can route model invocation requests. To use inference profiles, your IAM role needs to additionally have the following permissions:
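The exact policy was not preserved in this copy; as a rough sketch, the permission AWS requires for resolving inference profiles is `bedrock:GetInferenceProfile` on the profile ARN, alongside the invoke permissions you already grant. Treat the statement below as an assumption and verify it against your own setup:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["bedrock:GetInferenceProfile"],
      "Resource": "arn:aws:bedrock:*:*:inference-profile/*"
    }
  ]
}
```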
Bedrock Converse API

Portkey uses the AWS Converse API internally for making chat completions requests. If you need to pass additional input fields or parameters like `anthropic_beta`, `top_k`, `frequency_penalty`, etc. that are specific to a model, you can pass them with this key:
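The exact key was not preserved in this copy; the sketch below assumes Portkey forwards the Converse API's `additionalModelRequestFields` object as-is, so verify the field name against the current API reference:

```python
from portkey_ai import Portkey

portkey = Portkey(api_key="PORTKEY_API_KEY", provider="@bedrock-provider")

completion = portkey.chat.completions.create(
    model="anthropic.claude-3-sonnet-20240229-v1:0",
    messages=[{"role": "user", "content": "Hello"}],
    max_tokens=250,
    # Assumed key name, mirroring the Converse API's additionalModelRequestFields
    additionalModelRequestFields={"top_k": 200},
)
```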
Managing AWS Bedrock Prompts
You can manage all prompts to AWS Bedrock in the Prompt Library. All the current models of Anthropic are supported and you can easily start testing different prompts. Once you're ready with your prompt, you can use the `portkey.prompts.completions.create` interface to use the prompt in your application.
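A minimal sketch, assuming a prompt already saved in the Prompt Library (the prompt ID and variable names are placeholders):

```python
from portkey_ai import Portkey

portkey = Portkey(api_key="PORTKEY_API_KEY")

# Render and run a saved prompt by its ID, filling in its template variables
prompt_completion = portkey.prompts.completions.create(
    prompt_id="YOUR_PROMPT_ID",
    variables={"user_input": "Summarize the attached report in three bullet points."},
)
print(prompt_completion)
```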
Making Requests without using Portkey’s Model Catalog
If you do not want to add your AWS details to Portkey vault, you can also directly pass them while instantiating the Portkey client.

Mapping the Bedrock Details
| Node SDK | Python SDK | REST Headers |
|---|---|---|
| awsAccessKeyId | aws_access_key_id | x-portkey-aws-access-key-id |
| awsSecretAccessKey | aws_secret_access_key | x-portkey-aws-secret-access-key |
| awsRegion | aws_region | x-portkey-aws-region |
| awsSessionToken | aws_session_token | x-portkey-aws-session-token |
Example
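A sketch with the Python SDK using the parameter names from the table above (all values are placeholders; exact constructor naming may differ by SDK version):

```python
from portkey_ai import Portkey

# Pass AWS credentials directly instead of referencing a saved provider
portkey = Portkey(
    api_key="PORTKEY_API_KEY",
    provider="bedrock",
    aws_access_key_id="AWS_ACCESS_KEY_ID",
    aws_secret_access_key="AWS_SECRET_ACCESS_KEY",
    aws_region="us-east-1",
    aws_session_token="AWS_SESSION_TOKEN",  # only if using temporary STS credentials
)

completion = portkey.chat.completions.create(
    model="anthropic.claude-3-sonnet-20240229-v1:0",
    messages=[{"role": "user", "content": "Say this is a test"}],
    max_tokens=250,
)
print(completion.choices[0].message.content)
```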
Using AWS PrivateLink for Bedrock [Self Hosted Enterprise]
Though using an assumed role is in itself enough for enterprise security, you can additionally configure AWS PrivateLink for Bedrock to ensure that your requests do not traverse outside your VPC.

- Create a private link between the VPC where you've deployed Portkey and AWS Bedrock (the endpoint is in most cases https://bedrock.{your_region}.amazonaws.com).
- When configuring your integration on Portkey, simply configure the custom host option to point to your VPC endpoint for the private link (see the sketch after this list).
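If you prefer to set this per-client rather than in the integration settings, a rough sketch with the Python SDK follows; the `custom_host` parameter name is an assumption based on Portkey's custom-host convention, and the VPC endpoint URL is a placeholder:

```python
from portkey_ai import Portkey

portkey = Portkey(
    api_key="PORTKEY_API_KEY",
    provider="@bedrock-provider",  # hypothetical Bedrock AI Provider slug
    # Placeholder VPC endpoint for the Bedrock PrivateLink
    custom_host="https://vpce-xxxx.bedrock.us-east-1.vpce.amazonaws.com",
)
```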
Supported Models
List of supported Amazon Bedrock model IDs
How to Find Your AWS Credentials
Navigate here in the AWS Management Console to obtain your AWS Access Key ID and AWS Secret Access Key.

- In the console, you’ll find the ‘Access keys’ section. Click on ‘Create access key’.
- Copy the Secret Access Key once it is generated, and you can view the Access Key ID along with it.

- On the same page under the ‘Access keys’ section, where you created your Secret Access Key, you will also find your Access Key ID.

- And lastly, get your AWS Region from the Home Page of AWS Bedrock as shown in the image below.
