Use Portkey with AWS’s Strands Agents to take your AI Agents to production
Install Dependencies
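A typical install, assuming the `strands-agents` package with its OpenAI extra plus Portkey's SDK (package names are assumptions — adjust to your environment):

```shell
# Install Strands Agents with OpenAI support, plus Portkey's Python SDK.
# Package names are assumptions; verify against your environment's requirements.
pip install -U "strands-agents[openai]" portkey-ai
```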
Replace Your Model Initialization
Use Your Agent Normally
The integration works through the client_args parameter, which passes any arguments directly to the OpenAI client constructor. By setting base_url to Portkey's gateway, all requests route through Portkey while maintaining full compatibility with the OpenAI API.
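As a sketch (the gateway URL and the commented OpenAIModel usage are assumptions based on this pattern; the commented lines require strands-agents to be installed):

```python
# Sketch: point the OpenAI client inside Strands at Portkey's gateway.

PORTKEY_GATEWAY_URL = "https://api.portkey.ai/v1"  # assumed gateway endpoint

def portkey_client_args(portkey_api_key: str) -> dict:
    """Build client_args that are forwarded verbatim to the OpenAI client."""
    return {
        "api_key": portkey_api_key,       # your Portkey key, not the provider key
        "base_url": PORTKEY_GATEWAY_URL,  # reroutes every request through Portkey
    }

# import os
# from strands import Agent
# from strands.models.openai import OpenAIModel
#
# model = OpenAIModel(
#     client_args=portkey_client_args(os.environ["PORTKEY_API_KEY"]),
#     model_id="gpt-4o",
# )
# agent = Agent(model=model)
# agent("Hello!")  # used exactly as before; requests now flow through Portkey
```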
Add Your AI Provider Keys
Create a Configuration
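Configs are defined as JSON in the Portkey dashboard. A hypothetical fallback config, shown here as a Python dict (the virtual_key values are placeholders for your own provider integrations):

```python
# Hypothetical Portkey config: try OpenAI first, fall back to Anthropic.
fallback_config = {
    "strategy": {"mode": "fallback"},
    "targets": [
        {"virtual_key": "openai-virtual-key"},     # placeholder IDs from your
        {"virtual_key": "anthropic-virtual-key"},  # Portkey provider setup
    ],
}
```

Save the config in the dashboard and attach the resulting config ID to your requests.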
Generate Your Portkey API Key
Import Errors
ModuleNotFoundError when importing Portkey components.

Solution: Ensure all dependencies are installed.

Authentication Errors
AuthenticationError when making requests.

Solution: Verify your Portkey API key and provider setup. Test your Portkey API key directly and check for common issues such as a wrong API key format, a misconfigured provider, or a missing config attachment.

Rate Limiting
Model Compatibility
Missing Traces/Logs
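A common cause of missing traces is that requests reach the provider without Portkey's tracing headers. A minimal sketch, assuming Portkey's x-portkey-trace-id and x-portkey-metadata headers, passed to the model via default_headers inside client_args:

```python
# Sketch: headers that make agent runs visible and filterable in Portkey.
import json

def tracing_headers(trace_id: str, metadata: dict) -> dict:
    """Group requests under one trace ID and attach filterable metadata."""
    return {
        "x-portkey-trace-id": trace_id,              # same ID across a workflow
        "x-portkey-metadata": json.dumps(metadata),  # JSON-encoded per Portkey
    }

headers = tracing_headers(
    "support-workflow-001",
    {"agent_name": "researcher", "session_id": "s-42"},
)
# Pass through the model, e.g. client_args={"default_headers": headers, ...}
```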
How does Portkey enhance Strands Agents?
Can I use Portkey with existing Strands Agents applications?
Does Portkey work with all Strands Agents features?
Can I track usage across multiple agents in a workflow?
Yes. You can pass the same trace_id across multiple agents and requests to track the entire workflow. This is especially useful for multi-agent systems where you want to understand the full execution path.

How do I filter logs and traces for specific agent runs?
Attach custom metadata fields such as agent_name, agent_type, or session_id to your requests, then filter on them in the Portkey dashboard to easily find and analyze specific agent executions.

Can I use my own API keys with Portkey?