Summary
| Area | Key Highlights |
|---|---|
| Platform | • Sticky load balancing for session-consistent routing |
| Integrations | • AWS Bedrock AgentCore support • OpenCode integration |
| Guardrails | • Block Tools guardrail for controlling tool usage |
| Gateway (Models & Providers) | • GPT-5.2 • Gemini 3 Flash Preview • New provider: OCI |
| Gateway (Enhancements) | • Anthropic models via Azure AI Foundry (`/messages`) • OpenAI `conversation` & `modalities` params + Sora pricing • Gemini / Vertex reasoning & image config support • Azure image editing (`/v1/images/edits`) |
| Community & Events | • Portkey AI Builders Challenge |
How Hedy AI delivers reliable real-time AI coaching for 20,000 users
Hedy AI is a real-time AI coaching assistant designed to help professionals bring their best selves to every conversation. Julian Pscheid, Founder & CEO of Hedy AI, shares how Hedy supports over 20,000 individual users by delivering real-time understanding, intelligent suggestions, and long-term conversational memory during and after meetings.

Platform
Sticky load balancing
Sticky load balancing ensures that requests sharing the same identifier are consistently routed to the same target. This is especially useful for:
- Maintaining conversation context across requests
- Ensuring consistent model behavior during A/B testing
- Session-based or user-specific routing
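Conceptually, sticky routing works by deterministically mapping an identifier to one target, so the same session always lands on the same deployment. A minimal sketch of that idea (the target names here are hypothetical, and the gateway's actual routing logic may differ):

```python
import hashlib

def pick_target(session_id: str, targets: list[str]) -> str:
    """Deterministically map a session identifier to one target so that
    every request carrying the same identifier hits the same deployment."""
    digest = hashlib.sha256(session_id.encode()).hexdigest()
    return targets[int(digest, 16) % len(targets)]

targets = ["target-a", "target-b", "target-c"]  # hypothetical deployments
# The same session identifier always resolves to the same target:
assert pick_target("session-42", targets) == pick_target("session-42", targets)
```

Because the mapping is a pure function of the identifier, no shared routing state is needed across gateway nodes.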
Integrations
AWS Bedrock AgentCore support
Because AgentCore supports OpenAI-compatible frameworks, you can integrate Portkey without modifying your agent code while keeping AgentCore’s runtime, gateway, and memory services intact. With this setup, you get:
- A unified gateway for 1600+ models across providers
- Production telemetry (traces, logs, metrics) for AgentCore invocations
- Reliability controls such as fallbacks, load balancing, and timeouts
- Centralized governance over provider access, spend, and policies using Portkey API keys
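Because the integration is OpenAI-compatible, pointing an agent at Portkey amounts to swapping the base URL and auth header. A hedged sketch of the request shape (the model name and key are placeholders; consult Portkey's docs for the exact headers your setup needs):

```python
import json

PORTKEY_BASE_URL = "https://api.portkey.ai/v1"  # OpenAI-compatible endpoint

def build_chat_request(messages: list[dict], model: str, portkey_api_key: str):
    """Assemble an OpenAI-style chat request routed through Portkey.
    The agent code itself stays unchanged; only the URL and auth differ."""
    url = f"{PORTKEY_BASE_URL}/chat/completions"
    headers = {
        "Content-Type": "application/json",
        "x-portkey-api-key": portkey_api_key,  # governance applies per key
    }
    body = json.dumps({"model": model, "messages": messages})
    return url, headers, body

url, headers, body = build_chat_request(
    [{"role": "user", "content": "hello"}], "gpt-4o", "PORTKEY_API_KEY"
)
```

Telemetry, fallbacks, and spend controls then apply at the gateway layer without touching the AgentCore agent.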
OpenCode integration
OpenCode’s model-agnostic, terminal-first workflow can now be run with Portkey underneath, allowing teams to add access control, budgets, limits, and observability at the platform layer, without changing how developers use OpenCode. See how you can set it up here.

Guardrails
Block Tools
We added a new guardrail that allows you to control which AI tools can be used in requests. Supported tool types include: `function`, `web_search_preview`, `web_search`, `file_search`, `code_interpreter`, `computer_use`, and `mcp`.
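Conceptually, the guardrail inspects the tools a request declares and rejects any type on the blocklist. A minimal sketch of that check (the blocklist below is an example policy, not a default, and the guardrail itself is configured in Portkey rather than in your code):

```python
# Example policy: block sandboxed-execution tool types.
BLOCKED_TOOL_TYPES = {"code_interpreter", "computer_use"}

def check_tools(tools: list[dict]) -> tuple[bool, list[str]]:
    """Return (allowed, violations) for a request's declared tools."""
    violations = [t["type"] for t in tools if t["type"] in BLOCKED_TOOL_TYPES]
    return (not violations, violations)

allowed, violations = check_tools([{"type": "function"}, {"type": "computer_use"}])
# allowed is False; violations == ["computer_use"]
```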
Gateway
New models & providers
- GPT-5.2: Frontier-grade model now available.
- Gemini 3 Flash preview: Google’s latest model with improved reasoning performance.
- OCI: LLM provider for enterprises integrated with Oracle Cloud workloads.
Model & Provider Enhancements
- Azure AI Foundry: access Anthropic models using the native `/messages` endpoint.
- OpenAI: added support for `conversation` & `modalities` parameters and pricing for Sora models.
- Gemini / Vertex AI: added support for `reasoning_effort`, mapping OpenAI-style reasoning levels to Gemini’s `thinkingLevel`.
- Gemini / Vertex AI: added support for `image_config` to control image generation settings such as `aspect_ratio` and `image_size`.
- Azure AI: added support for the `/v1/images/edits` endpoint for image editing workflows.
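The `reasoning_effort` support can be pictured as a small translation step from OpenAI-style parameters to Gemini's field names. The table below is purely illustrative; the gateway's actual mapping may use different level values:

```python
# Illustrative only: the real effort-to-thinkingLevel table lives inside
# the gateway and may differ from these placeholder values.
EFFORT_TO_THINKING_LEVEL = {"low": "low", "medium": "high", "high": "high"}

def to_gemini_thinking(params: dict) -> dict:
    """Translate an OpenAI-style reasoning_effort into a Gemini-style
    thinkingLevel field, omitting it when no mapping applies."""
    effort = params.get("reasoning_effort")
    if effort in EFFORT_TO_THINKING_LEVEL:
        return {"thinkingLevel": EFFORT_TO_THINKING_LEVEL[effort]}
    return {}
```

This kind of per-provider translation is what lets callers keep a single OpenAI-style request shape across Gemini and Vertex AI.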
Community & Events
Portkey AI Builders Challenge
This challenge is designed to identify engineers who can think in systems, debug real problems, and work close to production. Exciting rewards await top-performing participants! If you’d like to participate, please register here.
Resources
- Blog: Understanding MCP authorization
- Blog: AI audit checklist
- Blog: OpenCode: token usage, costs and model access control

