MCP Gateway

Secure infrastructure for enterprise MCP adoption

Portkey’s MCP Gateway provides the infrastructure for deploying, governing, and monitoring MCP servers in production.


Powering 3000+ GenAI teams

Introducing the MCP Gateway

MCP Connectors

Connect Portkey’s AI Gateway to any MCP server instantly with built-in authentication, no extra clients or orchestration layers.

MCP Hub

Register, discover, and govern tool access across all internal and external MCP servers from one unified hub.

Make MCP servers secure, discoverable, and manageable

Portkey centralizes authentication, access, and observability of MCP servers, so teams can focus on building with MCP, not maintaining it.

Run tools from any MCP server using any LLM

Connect 1600+ LLMs to any MCP tool through Portkey’s model-agnostic Gateway, no custom integrations or orchestration required.
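As a rough illustration of what a model-agnostic call through the Gateway looks like, here is a minimal sketch using the portkey-ai Python SDK. The virtual key, model, and tool names are placeholders, and the tool schema simply stands in for a tool exposed by an MCP server registered in the Hub; how MCP servers are attached to a request is configured in Portkey and not shown here.

```python
# Minimal sketch: invoking a tool-calling model through Portkey's Gateway.
# Assumes the portkey-ai SDK; virtual key, model, and tool names are placeholders.
from portkey_ai import Portkey

portkey = Portkey(
    api_key="PORTKEY_API_KEY",   # your Portkey API key
    virtual_key="openai-prod",   # placeholder virtual key; swap to route to another provider
)

# Tool schema in the OpenAI-compatible format the Gateway speaks.
# In an MCP setup this would correspond to a tool served by an MCP server
# registered in the Hub (the name here is purely illustrative).
tools = [
    {
        "type": "function",
        "function": {
            "name": "search_tickets",
            "description": "Search support tickets (hypothetical MCP-backed tool)",
            "parameters": {
                "type": "object",
                "properties": {"query": {"type": "string"}},
                "required": ["query"],
            },
        },
    }
]

response = portkey.chat.completions.create(
    model="gpt-4o",  # any supported model
    messages=[{"role": "user", "content": "Find open tickets about login failures"}],
    tools=tools,
)

# The model's tool-call request comes back in the standard response shape.
print(response.choices[0].message)
```

Because the request shape stays the same regardless of provider, switching models, or pulling tools from more than one MCP server into a single flow, is a configuration change rather than a new integration.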

Discover MCP servers and tools in a unified hub

 Invoke tools seamlessly across models without rewiring

Simplify MCP server authentication

With built-in OAuth and access control, Portkey lets you connect once and share secure access to all MCP tools.

Unified authentication across all MCP servers

Eliminate custom auth flows or per-server credentials
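To make "connect once" concrete, the sketch below shows the kind of OAuth 2.1 client-credentials exchange that otherwise has to be implemented and maintained per MCP server; with the Gateway's built-in OAuth, this handshake and token refresh happen inside Portkey. The token endpoint, credentials, and scope are placeholders.

```python
# What "connect once" replaces: a manual OAuth 2.1 client-credentials exchange
# per MCP server. Endpoint, credentials, and scope below are placeholders.
import requests

token_resp = requests.post(
    "https://auth.example-mcp-server.com/oauth/token",  # placeholder token endpoint
    data={
        "grant_type": "client_credentials",
        "client_id": "YOUR_CLIENT_ID",
        "client_secret": "YOUR_CLIENT_SECRET",
        "scope": "tools:invoke",  # illustrative scope
    },
    timeout=10,
)
token_resp.raise_for_status()
access_token = token_resp.json()["access_token"]

# Every subsequent request to that server needs this header, plus refresh logic
# when the token expires; multiply by the number of servers you connect.
headers = {"Authorization": f"Bearer {access_token}"}
```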


Handle multiple MCP servers in one request

Route tool calls across multiple MCP servers within a single LLM or agent flow, without coordination code or separate pipelines.

Enforce RBAC and rate limits

Set rate limits, budgets, and restrict tool access with granular, enforceable permissions.
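Purely as an illustration of what granular, enforceable permissions can look like when expressed as data (this is not Portkey's actual policy schema; every field, server, and tool name below is hypothetical):

```python
# Hypothetical policy shape, for illustration only; not Portkey's schema.
# It captures the kinds of controls described above: which team may call
# which MCP servers and tools, how often, and within what budget.
team_policy = {
    "team": "sf-devops",                                   # placeholder team name
    "allowed_servers": ["github-mcp", "pagerduty-mcp"],    # placeholder server IDs
    "allowed_tools": ["create_issue", "list_incidents"],   # placeholder tool names
    "rate_limit": {"requests_per_minute": 60},
    "monthly_budget_usd": 500,
}
```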


Simplify state management

Portkey handles state, retries, and context threading so multi-step flows work seamlessly, even across multiple MCP servers.


Debug faster with detailed logs and traces

Pinpoint exactly where issues occur, from LLM calls to tool executions, with comprehensive logs and traces.

[Trace detail: embedding call · 1.13s (start 10.03s, end 11.13s) · model gpt-4-0231 · cost $0]
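As a rough sketch of how a request picks up the user and team context that shows up in these logs and traces: attach a trace ID and metadata when calling through the Gateway. The parameter names follow the common portkey-ai SDK pattern but should be treated as assumptions for your SDK version, and all values are placeholders.

```python
# Sketch: tagging a request so it appears in logs/traces with user and team context.
# Assumes the portkey-ai SDK accepts trace_id and metadata on the client;
# values are placeholders.
from portkey_ai import Portkey

portkey = Portkey(
    api_key="PORTKEY_API_KEY",
    virtual_key="openai-prod",
    trace_id="mcp-run-42",  # groups the LLM call and its tool calls in one trace
    metadata={"_user": "jane@acme.com", "team": "sf-devops"},
)

reply = portkey.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Summarize yesterday's deploy failures"}],
)
```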

Get complete MCP visibility

Monitor MCP usage across teams in real time. Every MCP request is logged with user, tool, and model context, ensuring full traceability and compliance visibility.

[Uptime: +99.9% over 15 days, 3 hours]

Improve reliability with intelligent routing

Route requests based on latency, health, or traffic, keeping workloads responsive even under scale.
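For a sense of what routing by traffic or health looks like in practice, here is a hedged sketch of a Gateway routing config that load-balances across two providers and can be attached to the client; the virtual key names and weights are placeholders, and a saved config ID can be referenced instead of an inline dict.

```python
# Sketch: a Gateway routing config that load-balances between two providers.
# Virtual keys and weights are placeholders; the config could also use a
# fallback strategy or be saved in Portkey and referenced by its ID.
from portkey_ai import Portkey

routing_config = {
    "strategy": {"mode": "loadbalance"},
    "targets": [
        {"virtual_key": "openai-prod", "weight": 0.7},
        {"virtual_key": "anthropic-prod", "weight": 0.3},
    ],
}

portkey = Portkey(api_key="PORTKEY_API_KEY", config=routing_config)
```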

How Portkey runs MCP workflows

1. Provision MCP servers

Register internal, remote, and third-party MCP servers in the Hub. Handle OAuth 2.1 with client credentials and auth codes, no custom wiring required.

2. Define permissions and limits

Define least-privilege permissions or share access through Okta, Azure AD, or any IdP. Enforce usage caps, rate limits, and budgets across teams.

3. Run with LLMs

Work with any of 1600+ LLMs through Portkey’s Gateway and invoke the MCP servers and tools of your choice in a single request.

4. Observe and analyze

Track every step end-to-end, from model prompt to tool execution, with detailed logs, traces, and cost visibility.

What you can build

A comprehensive map of all available MCP servers, from reference implementations to community contributions.

 Enterprise-ready LLMOps platform

Highest standards for security & compliance

Portkey complies with stringent data privacy and security standards so that you can focus on innovation without worrying about data security.

Audited every quarter

100% secure on-prem deployment with full control

Deploy Portkey in your private cloud for enhanced security, control, and 100% data ownership.

The most popular open source AI Gateway

Portkey’s AI Gateway is actively maintained by 50+ contributors worldwide, bringing the cutting edge of AI work into the Gateway.


We’re open-source!

Join 50 other contributors in collaboratively developing Portkey’s open-source AI Gateway and push the frontier of production-ready AI.


Trusted by Fortune 500s & Startups

Portkey is easy to set up, and the ability for developers to share credentials with LLMs is great. Overall, it has significantly sped up our development process.

Patrick L,
Founder and CPO, QA.tech

With 30 million policies a month, managing over 25 GenAI use cases became a pain. Portkey helped with prompt management, tracking costs per use case, and ensuring our keys were used correctly. It gave us the visibility we needed into our AI operations.

Prateek Jogani,
CTO, Qoala

Portkey stood out among AI Gateways we evaluated for several reasons: excellent, dedicated support even during the proof of concept phase, easy-to-use APIs that reduce time spent adapting code for different models, and detailed observability features that give deep insights into traces, errors, and caching.

AI Leader,
Fortune 500 Pharma Company

Portkey is a no-brainer for anyone using AI in their GitHub workflows. It has saved us thousands of dollars by caching tests that don't require reruns, all while maintaining a robust testing and merge platform. This prevents merging PRs that could degrade production performance. Portkey is the best caching solution for our needs.

Kiran Prasad,
Senior ML Engineer, Ario

Well done on creating such an easy-to-use and navigate product. It’s much better than other tools we’ve tried, and we saw immediate value after signing up. Having all LLMs in one place and detailed logs has made a huge difference. The logs give us clear insights into latency and help us identify issues much faster. Whether it's model downtime or unexpected outputs, we can now pinpoint the problem and address it immediately. This level of visibility and efficiency has been a game-changer for our operations.

Oras Al-Kubaisi,
CTO, Figg

Used by ⭐️ 16,000+ developers across the world


Latest guides and resources

Strategic Perspective on the MCP Registry for Enterprise

While the registry enables public discovery, realizing MCP’s value depends on designing for enterprise complexity.

The hidden challenge of MCP adoption in enterprises in 2025

MCP servers are multiplying at an overwhelming pace, and at scale they create an invisible sprawl.

MCP Server Ecosystem

A map of MCP servers, from reference implementations to community contributions.


The fastest way to bring MCP-powered apps to production