Portkey Docs
AI Gateway

The world’s fastest AI Gateway with advanced routing & integrated Guardrails.

Features

  • Universal API: Use any of the supported models with a single API (REST and SDKs)
  • Cache (Simple & Semantic): Save costs and reduce latency with simple or semantic response caching
  • Fallbacks: Fall back between providers and models for resilience
  • Conditional Routing: Route to different targets based on custom conditional checks
  • Multimodality: Use vision, audio, image-generation, and other models
  • Automatic Retries: Set up automatic retry strategies
  • Load Balancing: Load balance across multiple API keys to counter rate limits
  • Canary Testing: Canary test new models in production
  • Virtual Keys: Manage AI provider keys and auth in a secure vault
  • Request Timeouts: Gracefully handle unresponsive LLM requests
  • Budget Limits: Set usage limits based on cost incurred or tokens used
  • Rate Limits: Set per-minute, hourly, or daily rate limits on requests or tokens
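As an illustration of how several of these features combine, a load-balancing strategy with automatic retries can be described declaratively. This is a sketch, not a complete reference: the field names follow the gateway's config schema, and the virtual key names are placeholders.

```json
{
  "strategy": { "mode": "loadbalance" },
  "targets": [
    { "virtual_key": "openai-key-a", "weight": 0.7 },
    { "virtual_key": "openai-key-b", "weight": 0.3 }
  ],
  "retry": { "attempts": 3 }
}
```

Here roughly 70% of traffic goes to the first key and 30% to the second, and failed requests are retried up to three times.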

Using the Gateway

The various gateway strategies are implemented using Gateway configs. You can read more about configs below.

Configs
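For example, a config that falls back from a primary provider to a backup and enables semantic caching might look like the following sketch (the virtual key names are placeholders; see the Configs page for the full schema):

```json
{
  "strategy": { "mode": "fallback" },
  "targets": [
    { "virtual_key": "openai-prod" },
    { "virtual_key": "anthropic-backup" }
  ],
  "cache": { "mode": "semantic", "max_age": 3600 }
}
```

A config like this is attached to requests, so routing behavior can change without touching application code.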

Open Source

We’ve open sourced our battle-tested AI gateway to the community. You can run it locally with a single command:

npx @portkey-ai/gateway

Contributions are welcome on the GitHub repository.

While you’re here, why not give us a star? It helps us a lot!

You can also self-host the gateway and then connect it to Portkey. Please reach out on [email protected] and we’ll help you set this up!
