The easiest way to manage GenAI across your university
Production stack for Gen AI builders
Portkey helps research and education institutions deploy GenAI responsibly by routing usage through 1,600+ models, enforcing guardrails, managing budgets, and giving IT full visibility into every interaction.
Portkey equips AI teams with everything they need to go to production - Gateway, Observability, Guardrails, Governance, and Prompt Management, all in one platform.


Enabling 3000+ leading teams to build the future of GenAI

Centralized governance for GenAI in education
Share model access across classrooms, researchers, and students without exposing API keys, with full visibility and policy enforcement.
Tap into 1,600+ LLMs through one API
Use the AI Gateway to access 1,600+ LLMs including GPT-4, Claude, Gemini, Mistral, and open-source models, with Day 0 support for new releases.
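
For developers, access goes through Portkey's OpenAI-compatible gateway rather than each provider's own API, so one client covers every model. A minimal sketch using the Portkey Python SDK; the API key and provider integration (virtual key) names below are placeholders, not real values:

    from portkey_ai import Portkey

    # One Portkey API key replaces raw provider keys for the caller; the
    # virtual key points at a provider integration managed centrally by IT.
    client = Portkey(
        api_key="PORTKEY_API_KEY",         # placeholder
        virtual_key="openai-campus-demo",  # placeholder provider integration
    )

    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": "Explain tokenization to a first-year student."}],
    )
    print(response.choices[0].message.content)

Swapping providers then becomes a one-line change to the virtual key or model name rather than a new SDK and credential set.
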
Simplify credential management across teams
Portkey abstracts raw API keys and scattered environment variables into governed Provider Integrations and Models so teams can access Gen AI tools without ever touching credentials.
Govern access and usage with roles and limits
Apply role-based access control, set department-level budgets, and configure rate limits to align with institutional policies and prevent overuse.
Enforce safe and consistent GenAI usage
Set institution-wide guardrails on prompts and completions to block unsafe or non-compliant outputs and standardize usage policies.
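
In practice, guardrails are bundled into a Portkey config that IT defines once and attaches to API keys or individual requests. A sketch of referencing a saved config by ID, assuming the config was created in the Portkey dashboard; "pc-campus-policy" is a hypothetical name:

    from portkey_ai import Portkey

    # The referenced config carries the institution's guardrails (plus any
    # fallback or caching rules); callers only pass its ID.
    client = Portkey(
        api_key="PORTKEY_API_KEY",  # placeholder
        config="pc-campus-policy",  # hypothetical config ID
    )
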
Get complete visibility with detailed logging
Track every GenAI request across departments, tools, and users with detailed logs for prompts, responses, latency, cost, and errors.
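
Cost and usage attribution comes from tagging requests with metadata, which then appears as filters in Portkey's logs and analytics. A sketch, assuming metadata can be attached at the client level as the Python SDK allows; the field names here are illustrative, not required:

    from portkey_ai import Portkey

    # Metadata rides along with every request from this client and becomes
    # filterable in logs (e.g. spend per department or per course).
    client = Portkey(
        api_key="PORTKEY_API_KEY",         # placeholder
        virtual_key="openai-campus-demo",  # placeholder
        metadata={"_user": "student-42", "department": "cs", "course": "CS101"},
    )
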
Plug into tools like Claude Code, LibreChat, Open WebUI, and more
Integrate Portkey with your favorite GenAI apps in minutes and layer on enterprise-grade controls, guardrails, and governance.
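
Because the gateway speaks the OpenAI API, most of these tools only need two settings changed: the base URL and the API key. The same pattern in Python, mirroring what an OpenAI-compatible app does internally (values are placeholders, and this assumes a default provider or config is attached to the Portkey API key):

    from openai import OpenAI

    client = OpenAI(
        api_key="PORTKEY_API_KEY",             # placeholder Portkey key
        base_url="https://api.portkey.ai/v1",  # Portkey's OpenAI-compatible endpoint
    )
    # Existing OpenAI-style calls now flow through the gateway and pick up
    # the guardrails, budgets, and logging configured by IT.
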




Built for Higher Education, Backed by Internet2
Portkey is an official AI gateway provider for Internet2, offering universities a trusted, peer-reviewed path to adopting GenAI. Institutions benefit from shared governance, pre-negotiated terms, and streamlined procurement, backed by participation from NYU, Lehigh, Bowdoin, Cornell, Harvard, Princeton, and UC Berkeley.
Benefits include:
✓ Pre-negotiated terms for faster, simplified procurement
✓ Peer-reviewed service evaluations to ensure quality and reliability
✓ Shared governance frameworks that align with institutional priorities
Enterprise-ready LLMOps platform
Highest standards for security & compliance
Portkey complies with stringent data privacy and security standards so that you can focus on innovation without worrying about data security.
Audited every quarter
100% secure on-prem deployment with full control
Deploy Portkey in your private cloud for enhanced security, control, and 100% data ownership.

The most popular open source AI Gateway
Portkey’s AI Gateway is actively maintained by 50+ contributors worldwide, bringing the cutting edge of AI work into the Gateway.
Portkey provides end-to-end support for Azure
Portkey plugs into your entire Azure ecosystem, so you can build and scale without leaving your existing infrastructure.

New Integration: Inference.net (wholesaler of LLM...
Kierra Westervelt • Oct 03, 2024

New Integration: DeepSeek AI on Gateway
Alfonso Workman • Oct 01, 2024

Enhance RAG Retrieval Success by 67% using...
Lincoln Geidt • Sep 30, 2024

OpenAI o1-preview and o1-mini on Portkey
Zain Aminoff • Sep 27, 2024

New Integration: Inference.net (wholesaler...
Omar Calzoni • Sep 23, 2024
Changelog
We’re open-source!
Join 50 other contributors in collaboratively developing Portkey’s open-source AI Gateway and push the frontier of production-ready AI.
Trusted by Fortune 500s & Startups
Portkey is easy to set up, and the ability for developers to share credentials with LLMs is great. Overall, it has significantly sped up our development process.
Patrick L,
Founder and CPO, QA.tech


With 30 million policies a month, managing over 25 GenAI use cases became a pain. Portkey helped with prompt management, tracking costs per use case, and ensuring our keys were used correctly. It gave us the visibility we needed into our AI operations.
Prateek Jogani,
CTO, Qoala

Portkey stood out among AI Gateways we evaluated for several reasons: excellent, dedicated support even during the proof of concept phase, easy-to-use APIs that reduce time spent adapting code for different models, and detailed observability features that give deep insights into traces, errors, and caching.
AI Leader,
Fortune 500 Pharma Company
Portkey is a no-brainer for anyone using AI in their GitHub workflows. It has saved us thousands of dollars by caching tests that don't require reruns, all while maintaining a robust testing and merge platform. This prevents merging PRs that could degrade production performance. Portkey is the best caching solution for our needs.
Kiran Prasad,
Senior ML Engineer, Ario


Well done on creating such an easy-to-use and navigate product. It’s much better than other tools we’ve tried, and we saw immediate value after signing up. Having all LLMs in one place and detailed logs has made a huge difference. The logs give us clear insights into latency and help us identify issues much faster. Whether it's model downtime or unexpected outputs, we can now pinpoint the problem and address it immediately. This level of visibility and efficiency has been a game-changer for our operations.
Oras Al-Kubaisi,
CTO, Figg





Used by ⭐️ 16,000+ developers across the world
Latest guides and resources

Bringing GenAI to the classroom
Discover how top universities like Harvard and Princeton are scaling GenAI access...

Centralized governance for GenAI
Instead of managing separate subscriptions, contracts, and integrations...

Catching Up on the Cloud: Spring 2025
Around this time last year, I discussed cloud journey pathways and highlighted...
🎉 Introducing Model Catalog. Control every LLM your team uses. Learn More
The last platform you’ll need in your AI stack
© 2025 Portkey, Inc. All rights reserved
HIPAA Compliant | GDPR