We have open-sourced our battle-tested AI Gateway to the community. It connects to 250+ LLMs through a unified interface and a single endpoint, and lets you set up fallbacks, load balancing, retries, and more with minimal effort.
This gateway is in production at Portkey, processing billions of tokens every day.
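To make the single-endpoint pattern concrete, here is a minimal sketch of sending an OpenAI-style chat completion through a locally running gateway while attaching a fallback-and-retry routing config as a request header. The local URL and port, the header name, and the config field names shown here are assumptions for illustration; check the gateway's documentation for the exact values.

```python
import json
import os

import requests

# Assumed local gateway address; the default port may differ in your setup.
GATEWAY_URL = "http://localhost:8787/v1/chat/completions"

# Illustrative routing config: try OpenAI first, fall back to Anthropic,
# retrying up to 3 times. Field names are assumptions, not the canonical schema.
routing_config = {
    "strategy": {"mode": "fallback"},
    "retry": {"attempts": 3},
    "targets": [
        {"provider": "openai", "api_key": os.environ.get("OPENAI_API_KEY")},
        {"provider": "anthropic", "api_key": os.environ.get("ANTHROPIC_API_KEY")},
    ],
}

response = requests.post(
    GATEWAY_URL,
    headers={
        "Content-Type": "application/json",
        # Hypothetical header carrying the routing config as JSON.
        "x-portkey-config": json.dumps(routing_config),
    },
    json={
        "model": "gpt-4o",
        "messages": [{"role": "user", "content": "Hello from the gateway!"}],
    },
    timeout=30,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```

Because the gateway exposes an OpenAI-compatible endpoint, any OpenAI-style client pointed at the gateway URL can be used in place of the raw HTTP call above.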
Open-source pricing database for 2,300+ LLMs across 35+ providers. Powers cost tracking across the Portkey ecosystem.
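As a rough illustration of what cost tracking involves, the sketch below derives a per-request cost from per-million-token prices and a response's token usage. The model name and prices are placeholders, not values from the actual pricing database.

```python
# Placeholder per-million-token prices (USD); real values live in the
# open-source pricing database, not here.
PRICES = {
    "example-model": {"input_per_million": 2.50, "output_per_million": 10.00},
}

def request_cost(model: str, prompt_tokens: int, completion_tokens: int) -> float:
    """Compute the USD cost of a single request from its token usage."""
    p = PRICES[model]
    return (
        prompt_tokens * p["input_per_million"] / 1_000_000
        + completion_tokens * p["output_per_million"] / 1_000_000
    )

# Example: 1,200 prompt tokens and 350 completion tokens -> $0.006500.
print(f"${request_cost('example-model', 1200, 350):.6f}")
```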
A community resource for AI builders to find GPU credits, grants, AI accelerators, or investments, all in a single place. It is continuously updated and occasionally features exclusive deals.
Access the data here.
We collaborate with the community to dive deep into how LLMs and their inference providers perform at scale, and publish gateway reports. We track latencies, uptime, and cost changes, and how they fluctuate across dimensions like time of day, region, and token length.
Insights from analyzing 2 trillion+ tokens across 90+ regions and 650+ teams in production. The report contains:
- Trends shaping AI adoption and LLM provider growth.
- Benchmarks to optimize speed, cost and reliability.
- Strategies to scale production-grade AI systems.
The report is available here.
Collaborations
Portkey supports various open-source projects with additional production capabilities through custom integrations.
Check out our expanding ecosystem of integrations.