Portkey takes about two minutes to integrate. From then on, it monitors all of your LLM requests and makes your app more resilient, secure, performant, and accurate. Here’s a product walkthrough (3 mins):

Integrate in 3 Lines of Code

from portkey_ai import Portkey

# Construct a client with your Portkey API key and a virtual key
# that maps to your LLM provider's credentials.
portkey = Portkey(
    api_key="YOUR_PORTKEY_API_KEY",
    virtual_key="YOUR_VIRTUAL_KEY"
)

# Requests use the familiar OpenAI-style chat completions interface.
chat_complete = portkey.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"}
    ]
)
print(chat_complete.choices[0].message.content)
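Beyond plain requests, the gateway can add retries and caching through a config. A minimal sketch of such a config as a plain dict; the `retry` and `cache` keys follow Portkey's gateway config format as we understand it, and the values here are purely illustrative, so check the docs for the exact schema:

```python
import json

# Illustrative gateway config: retry transient failures and cache responses.
# Key names are assumptions based on Portkey's documented config format.
gateway_config = {
    "retry": {"attempts": 3},     # retry a failed request up to 3 times
    "cache": {"mode": "simple"},  # exact-match response caching
}

# The serialized config can then be attached to the client, e.g.:
# portkey = Portkey(api_key="...", virtual_key="...", config=json.dumps(gateway_config))
print(json.dumps(gateway_config))
```

Keeping routing behavior in a config like this, rather than in application code, is what lets the gateway change retry or caching policy without redeploying your app.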
While you’re here, why not give us a star? It helps us a lot!

FAQs

Will Portkey add latency to my API requests?
Portkey is hosted on edge workers throughout the world, ensuring minimal latency. Our benchmarks estimate a total latency addition of 20-40 ms compared to direct API calls. This slight increase is often offset by the gains from our caching and routing optimizations.
Is my data secure?
Portkey AI is ISO 27001 and SOC 2 certified, and GDPR & HIPAA compliant. We follow best practices for service security, data storage, and retrieval. All data is encrypted in transit and at rest using industry-standard AES-256 encryption. For enhanced security, we offer:
  1. On request, a mode that does NOT store any of your request and response bodies in Portkey's datastores or logs.
  2. For enterprises, managed hosting to deploy Portkey inside your private cloud.
For more information on these options, contact us at [email protected].
Can Portkey handle my scale?
Portkey is built on scalable edge infrastructure and can handle millions of requests per minute with very high concurrency. We currently serve over 25M requests daily with 99.99% uptime, and our architecture absorbs sudden spikes in traffic without performance degradation. View our Status Page.
Do you impose timeouts on requests?
We do NOT impose any explicit timeout on our free or paid plans currently. While we don't time out requests on our end, we recommend setting client-side timeouts appropriate for your use case to handle potential network issues or upstream API delays.
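One generic way to add a client-side timeout, sketched here with Python's standard library; `call_llm` is a hypothetical stand-in for your actual request function, not part of the Portkey SDK:

```python
from concurrent.futures import ThreadPoolExecutor, TimeoutError

def call_llm():
    # Hypothetical stand-in for your actual LLM request.
    return "response"

def call_with_timeout(fn, timeout_s=30.0):
    """Run fn in a worker thread, raising TimeoutError after timeout_s seconds."""
    with ThreadPoolExecutor(max_workers=1) as pool:
        future = pool.submit(fn)
        return future.result(timeout=timeout_s)

print(call_with_timeout(call_llm))
```

Many HTTP clients and SDKs also accept a timeout setting directly, which is usually simpler than a wrapper like this; the sketch above just shows the general idea.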
Do you support SSO?
Yes! We support SSO with any custom OIDC provider.
How much does Portkey cost?
Portkey’s Gateway is open source and free to use. On the managed version, Portkey offers a free plan with 10k requests per month, plus paid plans with higher limits and additional features.
How can I reach support?
We’re available any time on Discord, or at our support email - [email protected]