AWS Bedrock Knowledge Bases enables you to give foundation models access to your company’s private data sources, delivering more relevant, accurate, and customized responses through Retrieval Augmented Generation (RAG).

With Portkey’s integration, you can seamlessly create and connect to AWS Bedrock Knowledge Bases while gaining enterprise features like observability, caching, and reliability - all through a unified API that simplifies authentication.

What is AWS Bedrock Knowledge Bases?

AWS Bedrock Knowledge Bases is a fully managed service that implements the entire RAG workflow - from data ingestion to retrieval and prompt augmentation. It allows you to:

  • Connect to Multiple Data Sources: Automatically fetch data from Amazon S3, Confluence, Salesforce, SharePoint, and more.
  • Managed Vector Storage: Store embeddings in supported vector databases like Amazon Aurora, OpenSearch, Neptune, MongoDB, Pinecone, or Redis.
  • Advanced Retrieval: Use semantic search and filtering for accurate information retrieval.
  • Source Attribution: All retrieved information includes citations to improve transparency and minimize hallucinations.
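To make the retrieval and filtering bullet concrete: in the Retrieve API (used in Step 3 below), filtering is expressed through a `retrievalConfiguration` block in the request body. The field names below follow the Bedrock Retrieve API; the `department` metadata key is a hypothetical example.

```python
# Illustrative Retrieve request body: semantic search capped at 5 results,
# restricted to chunks whose "department" metadata equals "HR".
# ("department" is a hypothetical metadata key for illustration.)
retrieve_body = {
    "retrievalQuery": {"text": "What is our remote work policy?"},
    "retrievalConfiguration": {
        "vectorSearchConfiguration": {
            "numberOfResults": 5,
            "filter": {
                "equals": {"key": "department", "value": "HR"}
            }
        }
    },
}

print(retrieve_body["retrievalConfiguration"]["vectorSearchConfiguration"]["numberOfResults"])
```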

Prerequisites

Before integrating AWS Bedrock Knowledge Bases with Portkey, ensure you have:

  1. AWS Account with Bedrock access enabled.
  2. AWS Credentials with permissions to create and access Bedrock Knowledge Bases, configured in Portkey.
  3. Portkey Account with an API key.
  4. An IAM Role with the necessary permissions for Bedrock to access your data sources.
  5. A Vector Store (like Amazon OpenSearch Serverless) to store the indexed data.

Setup Guide

Step 1: Create an AWS Integration in Portkey

You need to connect your AWS account to Portkey. This allows Portkey to make authenticated requests to AWS on your behalf.

Navigate to Integrations

Open the Integrations section in Portkey's sidebar. This is where you'll connect your LLM providers.

  1. Find Bedrock and click Connect.
  2. In the “Create New Integration” window:
    • Enter a Name for reference (e.g., aws-bedrock-prod).
    • Enter a Slug for the integration (e.g., aws-bedrock-prod).
    • Enter your AWS Access Key ID, Secret Access Key, and Default Region.
  3. Click Next Step.

The Slug you create here will be used as the virtual_key in your Portkey client initialization.

Configure Models

On the model provisioning page:

  • Leave all models selected (or customize).
  • Toggle “Automatically enable new models” if desired.

Click Create Integration to complete the setup.

Step 2: Create a Knowledge Base

To create an AWS Bedrock Knowledge Base, use Portkey's put method. Portkey acts as a proxy, sending a signed request directly to the AWS PUT /knowledgebases/ API endpoint and handling the AWS Signature Version 4 authentication for you.

When initializing the Portkey client, you must provide a custom_host that points to the AWS Bedrock Agent API endpoint for your region.

AWS Bedrock docs for creating a Knowledge Base

from portkey_ai import Portkey

# The virtual_key is the slug of your Bedrock provider in Portkey
# The custom_host is the Bedrock Agent API endpoint for your region
portkey = Portkey(
    api_key="YOUR_PORTKEY_API_KEY",
    virtual_key="your-bedrock-provider-slug",
    custom_host="https://bedrock-agent.us-east-1.amazonaws.com"
)

# The body for the request to create a new Knowledge Base
# Replace placeholders with your actual resource ARNs and names
kb_body = {
   "name": "MyKB",
   "description": "My knowledge base",
   "roleArn": "arn:aws:iam::111122223333:role/service-role/AmazonBedrockExecutionRoleForKnowledgeBase_123",
   "knowledgeBaseConfiguration": {
      "type": "VECTOR",
      "vectorKnowledgeBaseConfiguration": {
         "embeddingModelArn": "arn:aws:bedrock:us-east-1::foundation-model/amazon.titan-embed-text-v2:0",
         "embeddingModelConfiguration": {
            "bedrockEmbeddingModelConfiguration": {
               "dimensions": 1024,
               "embeddingDataType": "BINARY"
            }
         },
         "supplementalDataStorageConfiguration": {
            "storageLocations": [
               {
                  "s3Location": {
                     "uri": "arn:aws:s3:::MyBucket"
                  },
                  "type": "S3"
               }
            ]
         }
      }
   },
   "storageConfiguration": {
      "opensearchServerlessConfiguration": {
         "collectionArn": "arn:aws:aoss:us-east-1:111122223333:collection/abcdefghij1234567890",
         "fieldMapping": {
            "metadataField": "metadata",
            "textField": "text",
            "vectorField": "vector"
         },
         "vectorIndexName": "MyVectorIndex"
      }
   }
}

# Make the PUT request to create the knowledge base
response = portkey.put(
    path='/knowledgebases/',
    body=kb_body
)
print(response.json())
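Creation is asynchronous: the PUT call returns immediately with the new knowledge base in a CREATING state, and it must reach ACTIVE before you sync data sources or query it. A minimal sketch of handling the response, assuming the standard CreateKnowledgeBase response shape (a top-level "knowledgeBase" object):

```python
def extract_kb_id(create_response: dict) -> str:
    """Pull the new knowledge base ID out of a CreateKnowledgeBase response."""
    return create_response["knowledgeBase"]["knowledgeBaseId"]

def is_active(get_response: dict) -> bool:
    """True once the knowledge base has finished provisioning."""
    return get_response["knowledgeBase"]["status"] == "ACTIVE"

# Example with a trimmed response shape:
created = {"knowledgeBase": {"knowledgeBaseId": "KB123EXAMPLE", "status": "CREATING"}}
print(extract_kb_id(created))  # KB123EXAMPLE
print(is_active(created))      # False
```

To check the status, you can poll GET /knowledgebases/{knowledgeBaseId} (for example via Portkey's generic get method, assuming it mirrors put) or watch the knowledge base in the AWS console.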

Step 3: Retrieve from a Knowledge Base

Once your knowledge base is created and has finished syncing, you can query it using Portkey’s post method. This request is sent to the AWS retrieve API endpoint: POST /knowledgebases/{knowledgeBaseId}/retrieve.

AWS Bedrock docs for the Retrieve API

Important: For retrieval, the custom_host URL is different. You must use the Bedrock Agent Runtime endpoint for your region (e.g., https://bedrock-agent-runtime.us-east-1.amazonaws.com).

from portkey_ai import Portkey

# For retrieval, use the Bedrock Agent Runtime URL
portkey = Portkey(
    api_key="YOUR_PORTKEY_API_KEY",
    virtual_key="your-bedrock-provider-slug",
    custom_host="bedrock-agent-runtime.us-east-1.amazonaws.com"
)

# Your Knowledge Base ID from the AWS console
knowledge_base_id = "YOUR_KNOWLEDGE_BASE_ID"
query = "What is our company's remote work policy?"

response = portkey.post(
    url=f"/knowledgebases/{knowledge_base_id}/retrieve",
    retrievalQuery={
        "text": query
    }
)
# The response contains the retrieved chunks
print(response.json())
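The Retrieve response contains a retrievalResults array; each result carries the chunk text under content.text, a relevance score, and a location for source attribution. A small helper (illustrative, written against that response shape) to collect the pieces you typically need:

```python
def extract_chunks(retrieve_response: dict) -> list:
    """Collect text, score, and source location from a Bedrock Retrieve response."""
    chunks = []
    for result in retrieve_response.get("retrievalResults", []):
        chunks.append({
            "text": result["content"]["text"],
            "score": result.get("score"),
            "location": result.get("location"),
        })
    return chunks

# Example with a trimmed response shape:
sample = {
    "retrievalResults": [
        {
            "content": {"text": "Employees may work remotely up to 3 days per week."},
            "score": 0.87,
            "location": {"type": "S3", "s3Location": {"uri": "s3://MyBucket/policy.pdf"}},
        }
    ]
}
print(extract_chunks(sample)[0]["text"])
```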

Next Steps

Now that you can create and retrieve data from your knowledge base, you can build a full RAG application.
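A minimal RAG loop stitches the two calls together: retrieve chunks from the knowledge base, fold them into a prompt, and send that prompt to a Bedrock model through Portkey's chat completions API. This is a sketch, not a definitive implementation: the build_prompt helper and the model ID are illustrative choices, and the retrieve call mirrors Step 3.

```python
def build_prompt(question: str, chunks: list) -> str:
    """Fold retrieved chunks into a grounded prompt for the model."""
    context = "\n\n".join(f"- {c}" for c in chunks)
    return (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}"
    )

def answer_with_kb(question: str, knowledge_base_id: str) -> str:
    """Retrieve chunks from the knowledge base, then generate a grounded answer."""
    # Imported here so the pure helper above can be used without the SDK installed.
    from portkey_ai import Portkey

    # Retrieval client: Bedrock Agent Runtime endpoint, as in Step 3
    retriever = Portkey(
        api_key="YOUR_PORTKEY_API_KEY",
        virtual_key="your-bedrock-provider-slug",
        custom_host="https://bedrock-agent-runtime.us-east-1.amazonaws.com",
    )
    kb_response = retriever.post(
        url=f"/knowledgebases/{knowledge_base_id}/retrieve",
        retrievalQuery={"text": question},
    )
    chunks = [r["content"]["text"] for r in kb_response.json()["retrievalResults"]]

    # Generation client: regular Bedrock inference through Portkey
    llm = Portkey(api_key="YOUR_PORTKEY_API_KEY", virtual_key="your-bedrock-provider-slug")
    completion = llm.chat.completions.create(
        model="anthropic.claude-3-sonnet-20240229-v1:0",  # any chat model enabled in your integration
        messages=[{"role": "user", "content": build_prompt(question, chunks)}],
    )
    return completion.choices[0].message.content
```

Because all traffic flows through Portkey, both the retrieval and generation calls show up in your observability logs and can use caching and reliability configs.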

For enterprise support with your AWS Bedrock integration, contact our enterprise team.