Amazon Bedrock Mantle is AWS’s OpenAI-compatible inference engine for Bedrock models. The
bedrock-mantle provider in Portkey routes requests to bedrock-mantle.{region}.api.aws, giving you access to Bedrock models via:
- /v1/chat/completions (OpenAI Chat Completions)
- /v1/responses (OpenAI Responses)
- /v1/messages (Anthropic-native, via bedrock-mantle.{region}.api.aws/anthropic/v1)
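
For example, here is a minimal sketch of calling the Chat Completions route through the Portkey Python SDK. It assumes a Bedrock Mantle provider already configured in the model catalog (see Quick Start below); the provider slug and model ID are placeholders, not values from this page.

```python
# pip install portkey-ai
from portkey_ai import Portkey

# Placeholders: your Portkey API key and the slug of the Bedrock Mantle
# provider created in the model catalog (both names are assumptions).
client = Portkey(
    api_key="PORTKEY_API_KEY",
    provider="@bedrock-mantle-provider",
)

# Portkey forwards this to bedrock-mantle.{region}.api.aws/v1/chat/completions.
completion = client.chat.completions.create(
    model="anthropic.claude-3-5-sonnet-20240620-v1:0",  # hypothetical Bedrock model ID
    messages=[{"role": "user", "content": "Say hello from Bedrock Mantle."}],
)
print(completion.choices[0].message.content)
```
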
Use bedrock-mantle when you want OpenAI-compatible access to Bedrock. For the classic Bedrock runtime (Converse / InvokeModel), use the bedrock provider.

Quick Start
Add Provider in Model Catalog
Configure a new provider with Bedrock Mantle as the provider type. Choose one of the supported auth modes:

| Auth Mode | Required Fields |
|---|---|
| API Key | apiKey (Amazon Bedrock API key / bearer token) |
| Assumed Role | awsRoleArn, awsExternalId, awsRegion |
| Service Role (EKS / IRSA) | awsRegion (credentials are resolved from the pod’s service role) |
awsRegion determines which regional endpoint is used (e.g., us-east-1 → bedrock-mantle.us-east-1.api.aws).
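
To make the table concrete, the sketch below only illustrates the field grouping for each auth mode; the values are hypothetical, and how you supply them (dashboard form or API) depends on your setup.

```python
# Illustrative only: the field grouping for each Bedrock Mantle auth mode
# from the table above. All values are hypothetical placeholders.

api_key_auth = {
    "apiKey": "AMAZON_BEDROCK_API_KEY",  # Bedrock API key / bearer token
}

assumed_role_auth = {
    "awsRoleArn": "arn:aws:iam::123456789012:role/BedrockMantleRole",  # hypothetical ARN
    "awsExternalId": "portkey-external-id",                            # hypothetical value
    "awsRegion": "us-east-1",
}

service_role_auth = {
    # EKS / IRSA: credentials are resolved from the pod's service role,
    # so only the region is required.
    "awsRegion": "us-east-1",
}

# awsRegion selects the regional endpoint,
# e.g. us-east-1 -> bedrock-mantle.us-east-1.api.aws
```
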
Anthropic Messages API
Requests to /v1/messages or chat completions with anthropic.* models are routed to the Mantle Anthropic base path (/anthropic/v1). Pass the Anthropic version via the anthropic-version header or anthropic_version in the request body (defaults to 2023-06-01).
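
As a sketch of the Anthropic-native path, the example below points the Anthropic Python SDK at the Portkey gateway. The gateway base URL, the x-portkey-* header names, the provider slug, and the model ID are assumptions rather than values from this page.

```python
# pip install anthropic
from anthropic import Anthropic

# Sketch only: AWS credentials come from the Bedrock Mantle provider
# configured in the model catalog, so the SDK api_key is a dummy value.
client = Anthropic(
    api_key="not-used",
    base_url="https://api.portkey.ai",  # assumed Portkey gateway base URL
    default_headers={
        "x-portkey-api-key": "PORTKEY_API_KEY",
        "x-portkey-provider": "@bedrock-mantle-provider",
        "anthropic-version": "2023-06-01",  # the default noted above
    },
)

# Routed to bedrock-mantle.{region}.api.aws/anthropic/v1/messages.
message = client.messages.create(
    model="anthropic.claude-3-5-sonnet-20240620-v1:0",  # hypothetical Bedrock model ID
    max_tokens=256,
    messages=[{"role": "user", "content": "Hello via the Anthropic Messages API."}],
)
print(message.content[0].text)
```
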
Related Resources
Amazon Bedrock Mantle Documentation
Official AWS documentation for Bedrock Mantle endpoints, supported regions, and models.

