

Amazon Bedrock Mantle is AWS’s OpenAI-compatible inference engine for Bedrock models. The bedrock-mantle provider in Portkey routes requests to bedrock-mantle.{region}.api.aws, giving you access to Bedrock models via:
  • /v1/chat/completions (OpenAI Chat Completions)
  • /v1/responses (OpenAI Responses)
  • /v1/messages (Anthropic-native, via bedrock-mantle.{region}.api.aws/anthropic/v1)
Use bedrock-mantle when you want OpenAI-compatible access to Bedrock. For the classic Bedrock runtime (Converse / InvokeModel), use the bedrock provider.

Quick Start

from portkey_ai import Portkey

portkey = Portkey(api_key="PORTKEY_API_KEY")

response = portkey.chat.completions.create(
    model="@bedrock-mantle-provider/anthropic.claude-sonnet-4-5-20250929-v1:0",
    messages=[{"role": "user", "content": "Say this is a test"}]
)

print(response.choices[0].message.content)
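The model string above combines a Model Catalog provider slug (prefixed with @) with the underlying Bedrock model ID. A minimal sketch of how it decomposes (the slug bedrock-mantle-provider is the placeholder used in the Quick Start, not a fixed name):

```python
def split_model_string(model: str) -> tuple[str, str]:
    """Split a Portkey model string "@<provider-slug>/<model-id>"
    into its provider slug and Bedrock model ID."""
    if not model.startswith("@"):
        raise ValueError("expected a '@provider-slug/model-id' string")
    slug, _, model_id = model[1:].partition("/")
    return slug, model_id

slug, model_id = split_model_string(
    "@bedrock-mantle-provider/anthropic.claude-sonnet-4-5-20250929-v1:0"
)
# slug     -> "bedrock-mantle-provider"
# model_id -> "anthropic.claude-sonnet-4-5-20250929-v1:0"
```

The slug selects the provider you configured in the Model Catalog; everything after the slash is passed through as the Bedrock model ID.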

Add Provider in Model Catalog

Configure a new provider with Bedrock Mantle as the provider type. Choose one of the supported auth modes:
  • API Key — requires apiKey (Amazon Bedrock API key / bearer token)
  • Assumed Role — requires awsRoleArn, awsExternalId, awsRegion
  • Service Role (EKS / IRSA) — requires awsRegion (credentials are resolved from the pod’s service role)
awsRegion determines which regional endpoint is used (e.g., us-east-1 → bedrock-mantle.us-east-1.api.aws).
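That mapping can be sketched as a one-liner (the https:// scheme is an assumption here; the host pattern comes from the provider description above):

```python
def mantle_endpoint(aws_region: str) -> str:
    # awsRegion selects the regional Mantle host,
    # e.g. "us-east-1" -> bedrock-mantle.us-east-1.api.aws
    return f"https://bedrock-mantle.{aws_region}.api.aws"

print(mantle_endpoint("us-east-1"))
# -> https://bedrock-mantle.us-east-1.api.aws
```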

Anthropic Messages API

Requests to /v1/messages or chat completions with anthropic.* models are routed to the Mantle Anthropic base path (/anthropic/v1). Pass the Anthropic version via the anthropic-version header or anthropic_version in the request body (defaults to 2023-06-01).
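A hedged sketch of that routing rule and the version default (the helper name and signature are illustrative, not part of the Portkey SDK):

```python
DEFAULT_ANTHROPIC_VERSION = "2023-06-01"

def resolve_route(path, model, headers=None, body=None):
    """Pick the Mantle base path and Anthropic version for a request."""
    headers = headers or {}
    body = body or {}
    # /v1/messages, or chat completions with an anthropic.* model,
    # go to the Anthropic-native base path; everything else stays on /v1.
    is_anthropic = path == "/v1/messages" or model.startswith("anthropic.")
    base_path = "/anthropic/v1" if is_anthropic else "/v1"
    # Header takes precedence over the body field; fall back to the default.
    version = (
        headers.get("anthropic-version")
        or body.get("anthropic_version")
        or DEFAULT_ANTHROPIC_VERSION
    )
    return base_path, version
```

For example, a /v1/messages request with no version set would resolve to ("/anthropic/v1", "2023-06-01"), while a chat completions call with a non-Anthropic model stays on "/v1".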

Amazon Bedrock Mantle Documentation

Official AWS documentation for Bedrock Mantle endpoints, supported regions, and models.
Last modified on May 4, 2026