| Area | Key Highlights |
| --- | --- |
| Platform | • Model Catalog launch • OpenAI background mode • Circuit breaker config |
| Gateway & Providers | • Support for Sutra, o3-pro, Magistral, and Gemini 2.5 models • Vertex AI global endpoints • Anthropic Computer Use tool support • Support for additional Azure OpenAI endpoints • Bedrock Inference Profiles • Prompt caching for tools (Anthropic) |
| Integrations | • OpenAI Agent SDK (TypeScript) • Langroid native support • Strands SDK & ADK integration • Cursor integration • Gemini CLI support |
Vertex AI Global Endpoints
Set region = global in your Vertex AI Virtual Key config to automatically access Google’s distributed infrastructure, with no manual region selection needed.
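As a minimal sketch, the Virtual Key config might look like the following. The field names besides the region value are assumptions for illustration, not Portkey's exact schema:

```python
# Hypothetical Virtual Key config sketch; field names other than the region
# value are assumed for illustration and may differ from Portkey's schema.
virtual_key_config = {
    "provider": "vertex-ai",          # assumed field name
    "vertex_project_id": "my-gcp-project",  # assumed field name
    "region": "global",               # routes via Google's global endpoint, no region pinning
}
```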
OpenAI Background Mode
Reasoning models can take minutes to solve complex problems. With background mode, you can now run long-running tasks on models like o3-pro and o1-pro reliably, without worrying about timeouts or dropped connections. Portkey now supports background mode for OpenAI requests: simply pass background: true as a parameter, and Portkey will handle the rest.
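As a rough sketch, assuming the request shape of OpenAI's Responses API (exact fields may vary on your gateway), a background request and its polling pattern look like this:

```python
# Sketch of a background-mode request body. "background": True asks the API to
# run the task asynchronously instead of holding one long HTTP connection open.
request_body = {
    "model": "o3-pro",
    "input": "Work through this multi-step reasoning problem.",
    "background": True,
}

# A background request returns immediately with a status field; you poll the
# response id until the status reaches a terminal state.
def is_terminal(status: str) -> bool:
    return status in ("completed", "failed", "cancelled")
```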
Anthropic’s Computer Use
Experiment confidently with Anthropic’s new Computer Use tool by adding observability, fallback logic, and cost controls from day one.
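For reference, a computer-use tool declaration on an Anthropic request looks roughly like this; the version string in the tool type is an assumption, so check Anthropic's docs for the current one:

```python
# Declaring Anthropic's computer-use tool. The display dimensions tell the
# model what coordinate space its click and scroll actions operate in.
computer_tool = {
    "type": "computer_20250124",  # assumed tool version; newer versions may exist
    "name": "computer",
    "display_width_px": 1024,
    "display_height_px": 768,
}
```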
Support for additional Azure OpenAI endpoints
Portkey now supports a wider range of Azure OpenAI endpoints including image generation, audio transcription, speech synthesis, file management, and batch operations.
Anthropic prompt caching (tools)
Anthropic tool-based interactions now benefit from prompt caching in Portkey, improving performance and reducing token use for repeated tool calls.
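On the Anthropic side, tool caching is enabled with a cache_control marker on a tool definition (per Anthropic's prompt-caching API; the gateway is assumed to pass it through unchanged). A minimal sketch:

```python
# Adding cache_control to the last tool marks the tool block as a cache
# breakpoint, so repeated calls reuse the cached prefix instead of
# re-processing the full tool definitions each time.
tools = [
    {
        "name": "get_weather",  # illustrative tool, not from the changelog
        "description": "Look up current weather for a location.",
        "input_schema": {
            "type": "object",
            "properties": {"location": {"type": "string"}},
            "required": ["location"],
        },
        "cache_control": {"type": "ephemeral"},
    }
]
```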
Here’s to keeping up the pace!
service_tier flag added