Production MCP, governed & observable
Connect MCP servers to 1600+ LLMs through Portkey’s MCP Gateway with authentication, policies, logging, and cost controls built in.



Powering 3000+ GenAI teams
Introducing the MCP Gateway
MCP Connectors
Connect Portkey’s AI Gateway to any MCP server, no extra clients or orchestration layers. Handle multiple MCP servers in a single request.



MCP Hub
Discover, approve, and govern which MCP tools agents can call. Portkey gives you one place to manage metadata, set permissions, and control access.






Make MCP servers secure, discoverable, and manageable
No more juggling dozens of disconnected servers. Portkey centralizes security, access, and management so teams can focus on building with MCP, not maintaining it.
Run tools from any MCP server using any LLM
Portkey’s model-agnostic Gateway lets you connect MCP tools to 1600+ LLMs, no custom integrations, no extra orchestration required.
Discover MCP servers and tools in a unified hub
Invoke tools seamlessly across models without rewiring
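As a rough illustration, here is a minimal sketch of what a tool-enabled request can look like from the client side, assuming the portkey-ai Python SDK's Responses support mirrors the OpenAI client and accepts an OpenAI-style `mcp` tool block. The model slug, server label, and server URL are illustrative placeholders, not Portkey's exact registration flow.

```python
# pip install portkey-ai
from portkey_ai import Portkey

client = Portkey(api_key="PORTKEY_API_KEY")

# Illustrative only: an OpenAI-style `mcp` tool block pointing at a GitHub MCP server.
# Swap the model slug for any of the 1600+ models reachable through the Gateway.
response = client.responses.create(
    model="@openai/gpt-4o",
    input="List the open issues labelled 'bug' in portkey-ai/gateway",
    tools=[{
        "type": "mcp",
        "server_label": "github",
        "server_url": "https://api.githubcopilot.com/mcp/",  # placeholder URL
        "require_approval": "never",
    }],
)
print(response.output_text)
```

The same request shape stays unchanged if the model slug points at a different provider; the Gateway handles the provider-specific wiring.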
Cloudflare
Secure • Accelerate • Deliver
Figma
Design • Create
Auth0
Authenticate • Protect
Docker
Build • Ship • Run
GitHub
Code • Collaborate • Ship
MongoDB
Store • Query • Scale
Microsoft Azure
Build • Deploy • Manage


Simplify MCP server authentication
With built-in OAuth and unified access control, Portkey lets you connect once and share secure access to all MCP tools.
Consistent authentication across all servers and tools
Eliminate custom auth flows or per-server credentials
Handle multiple MCP servers in a single request
Route tool calls across multiple MCP servers within a single LLM or agent flow, without coordination code or separate pipelines.
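A hedged sketch of that idea with two servers in one call, assuming the same OpenAI-style `mcp` tool block as above; the server labels, URLs, and model slug are placeholders.

```python
from portkey_ai import Portkey

client = Portkey(api_key="PORTKEY_API_KEY")

# Two MCP servers referenced in a single request; labels and URLs are placeholders.
response = client.responses.create(
    model="@anthropic/claude-sonnet-4",  # any Gateway-routable model works the same way
    input="Look up the orders collection schema, then open a GitHub issue summarising it",
    tools=[
        {
            "type": "mcp",
            "server_label": "mongodb",
            "server_url": "https://mcp.internal.example.com/mongodb/",  # placeholder
            "require_approval": "never",
        },
        {
            "type": "mcp",
            "server_label": "github",
            "server_url": "https://api.githubcopilot.com/mcp/",  # placeholder
            "require_approval": "never",
        },
    ],
)
```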
No more unauthorized tool usage, anywhere
Whether you're using your favorite apps or building new AI agents, restrict tool access with granular, enforceable permissions.
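Per-tool permissions are managed centrally in the MCP Hub; as a client-side companion, here is a hedged sketch of narrowing a request to specific tools, assuming the `allowed_tools` field of the OpenAI-style `mcp` block is honoured. The tool names are hypothetical.

```python
# Hypothetical allow-list: only read-style GitHub tools may be called in this request.
tools = [{
    "type": "mcp",
    "server_label": "github",
    "server_url": "https://api.githubcopilot.com/mcp/",  # placeholder
    "allowed_tools": ["list_issues", "get_issue"],        # everything else is blocked
    "require_approval": "never",
}]
```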
Simplify state management for multi-step interactions
Portkey handles state, retries, and context threading so multi-step flows work seamlessly, even across multiple MCP servers.
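For multi-step flows, one plausible client-side shape is to thread context by response id rather than replaying history. This sketch assumes the OpenAI-style `previous_response_id` parameter is passed through the Gateway; the model slug, server URL, and prompts are illustrative.

```python
from portkey_ai import Portkey

client = Portkey(api_key="PORTKEY_API_KEY")
tools = [{"type": "mcp", "server_label": "github",
          "server_url": "https://api.githubcopilot.com/mcp/",  # placeholder
          "require_approval": "never"}]

# Step 1: initial request with MCP tools attached.
first = client.responses.create(
    model="@openai/gpt-4o",
    input="Summarise last night's failed deployment",
    tools=tools,
)

# Step 2: follow-up that threads the earlier context by id instead of resending it;
# previous_response_id follows the OpenAI Responses convention (an assumption here).
follow_up = client.responses.create(
    model="@openai/gpt-4o",
    input="Now open a GitHub issue with that summary",
    tools=tools,
    previous_response_id=first.id,
)
```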
Debug faster with end-to-end observability
Pinpoint exactly where issues occur - from LLM calls to tool executions - with comprehensive logs and traces.
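To make those traces easy to find, requests can be tagged with a trace id and metadata. This sketch assumes the portkey-ai SDK forwards these constructor arguments as the corresponding x-portkey-* headers; the metadata keys and model slug are illustrative.

```python
import uuid

from portkey_ai import Portkey

# Tag every call in this agent run with one trace id plus filterable metadata.
traced = Portkey(
    api_key="PORTKEY_API_KEY",
    trace_id=str(uuid.uuid4()),
    metadata={"env": "staging", "feature": "mcp-agent"},  # illustrative keys
)

resp = traced.chat.completions.create(
    model="@openai/gpt-4o-mini",
    messages=[{"role": "user", "content": "ping"}],
)
```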
Manage spends with usage limits, budgets and cost tracking
Set enforceable limits across teams and environments, and monitor usage and spending across MCP tools and LLMs.
Improve reliability with intelligent routing
Route requests based on latency, health, or traffic, keeping workloads responsive even under scale.
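Routing is driven by gateway configs. Below is a minimal sketch of a weighted load-balance config attached from the client; the provider entries, key placeholders, and weights are illustrative, and configs can equally be saved in Portkey and referenced by id.

```python
from portkey_ai import Portkey

# Illustrative weighted load balancing across two providers; a "fallback" strategy
# with ordered targets follows the same shape.
routing_config = {
    "strategy": {"mode": "loadbalance"},
    "targets": [
        {"provider": "openai", "api_key": "OPENAI_API_KEY", "weight": 0.7},
        {"provider": "anthropic", "api_key": "ANTHROPIC_API_KEY", "weight": 0.3},
    ],
}

balanced = Portkey(api_key="PORTKEY_API_KEY", config=routing_config)
```

Weights bias traffic distribution; switching the strategy mode changes how the same targets are used (for example, ordered failover instead of load balancing).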
How Portkey runs MCP workflows
1. Add servers and manage authentication
Register internal, remote, and third-party MCP servers in the Hub. Handle OAuth 2.1 with client credentials and auth codes—no custom wiring required.
2. Apply access control and rate limits
Define least-privilege permissions for servers, tools, and resources, while enforcing per-team budgets, usage caps, and rate limits.
3. Run your MCP workflow with any LLM
Work with any of 1600+ LLMs through Portkey’s Gateway and invoke the MCP servers and tools of your choice in a single request.
4. Get complete observability
Track every step end-to-end, from model prompt to tool execution, with detailed logs, traces, and cost visibility.
What you can build
A comprehensive map of all available MCP servers, from reference implementations to community contributions.


Enterprise-ready LLMOps platform
Highest standards for security & compliance
Portkey complies with stringent data privacy and security standards so that you can focus on innovation without worrying about data security.
Audited every quarter
100% secure on-prem deployment with full control
Deploy Portkey in your private cloud for enhanced security, control, and 100% data ownership.


The most popular open source AI Gateway
Portkey’s AI Gateway is actively maintained by 50+ contributors worldwide, bringing the cutting edge of AI work into the Gateway.

New Integration: Inference.net (wholesaler of LLM...
Kierra Westervelt • Oct 03, 2024

New Integration: DeepSeek AI on Gateway
Alfonso Workman • Oct 01, 2024

Enhance RAG Retrieval Success by 67% using...
Lincoln Geidt • Sep 30, 2024

OpenAI o1-preview and o1-mini on Portkey
Zain Aminoff • Sep 27, 2024

New Integration: Inference.net (wholesaler...
Omar Calzoni • Sep 23, 2024
Changelog
We’re open-source!
Join 50 other contributors in collaboratively developing Portkey’s open-source AI Gateway and push the frontier of production-ready AI.
Trusted by Fortune 500s & Startups
Portkey is easy to set up, and the ability for developers to share credentials with LLMs is great. Overall, it has significantly sped up our development process.
Patrick L,
Founder and CPO, QA.tech


With 30 million policies a month, managing over 25 GenAI use cases became a pain. Portkey helped with prompt management, tracking costs per use case, and ensuring our keys were used correctly. It gave us the visibility we needed into our AI operations.
Prateek Jogani,
CTO, Qoala

Portkey stood out among AI Gateways we evaluated for several reasons: excellent, dedicated support even during the proof of concept phase, easy-to-use APIs that reduce time spent adapting code for different models, and detailed observability features that give deep insights into traces, errors, and caching.
AI Leader,
Fortune 500 Pharma Company
Portkey is a no-brainer for anyone using AI in their GitHub workflows. It has saved us thousands of dollars by caching tests that don't require reruns, all while maintaining a robust testing and merge platform. This prevents merging PRs that could degrade production performance. Portkey is the best caching solution for our needs.
Kiran Prasad,
Senior ML Engineer, Ario


Well done on creating such an easy-to-use and navigate product. It’s much better than other tools we’ve tried, and we saw immediate value after signing up. Having all LLMs in one place and detailed logs has made a huge difference. The logs give us clear insights into latency and help us identify issues much faster. Whether it's model downtime or unexpected outputs, we can now pinpoint the problem and address it immediately. This level of visibility and efficiency has been a game-changer for our operations.
Oras Al-Kubaisi,
CTO, Figg





Used by ⭐️ 16,000+ developers across the world
Latest guides and resources

Strategic Perspective on the MCP Registry for Enterprise
While the registry enables public discovery, realizing MCP’s value depends on designing for enterprise complexity.

The hidden challenge of MCP adoption in enterprises in 2025
MCP servers are multiplying at an overwhelming pace, and at scale they create an invisible sprawl.

MCP Server Ecosystem
A map of MCP servers, from reference implementations to community contributions.
The fastest way to bring MCP-powered apps to production
Products
© 2025 Portkey, Inc. All rights reserved
HIPAA COMPLIANT
GDPR