Make Your First Request

Integrate Portkey and analyze your first LLM call in 2 minutes!

1. Get your Portkey API Key

Create or log in to your Portkey account. Grab your account's API key from the menu under your profile icon.

2. Integrate Portkey

Portkey offers a variety of integration options, including SDKs, REST APIs, and native connections with platforms like OpenAI, Langchain, and LlamaIndex, among others.

Through the OpenAI SDK

If you're using the OpenAI SDK, import the Portkey SDK and configure it within your OpenAI client object:
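A minimal sketch of this setup, assuming the portkey-ai Python package (which provides a createHeaders helper and a PORTKEY_GATEWAY_URL constant) alongside the official openai package; the API key placeholders and the model name below are only illustrative:

```python
# Route OpenAI SDK calls through Portkey's gateway (sketch; keys are placeholders).
from openai import OpenAI
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

client = OpenAI(
    api_key="OPENAI_API_KEY",            # your OpenAI key
    base_url=PORTKEY_GATEWAY_URL,        # send requests via Portkey's gateway
    default_headers=createHeaders(
        provider="openai",
        api_key="PORTKEY_API_KEY",       # your Portkey API key
    ),
)

chat_completion = client.chat.completions.create(
    model="gpt-4o",                      # placeholder model name
    messages=[{"role": "user", "content": "Say this is a test"}],
)
print(chat_completion.choices[0].message.content)
```

Calls made this way show up in your Portkey dashboard just like any other request.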

For the full walkthrough, see the OpenAI integration guide.

Portkey SDK

You can also use the Portkey SDK or the REST APIs directly to make chat completion calls. This is a more versatile way to make LLM calls across any provider:
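A minimal sketch with the Portkey Python SDK, assuming you've installed the portkey-ai package and created a virtual key for your provider in Portkey; the key placeholders and the model name are illustrative:

```python
# Make a chat completion call with the Portkey SDK directly (sketch; keys are placeholders).
from portkey_ai import Portkey

portkey = Portkey(
    api_key="PORTKEY_API_KEY",           # your Portkey API key
    virtual_key="PROVIDER_VIRTUAL_KEY",  # a virtual key for your LLM provider
)

chat_completion = portkey.chat.completions.create(
    model="gpt-4o",                      # placeholder model name
    messages=[{"role": "user", "content": "Say this is a test"}],
)
print(chat_completion.choices[0].message.content)
```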

For the full walkthrough, see the Portkey SDK Client page.

Once the integration is ready, you can see your requests reflected on your Portkey dashboard.

Other Integration Guides

Azure OpenAI

Anthropic

Langchain

LlamaIndex

Ollama

Others

3. Next Steps

Now that you're up and running with Portkey, you can dive into the various Portkey features to learn about all of the supported functionality.


FAQs

Will Portkey increase the latency of my API requests?

Portkey is hosted on edge workers throughout the world, which keeps round-trip latency to a minimum. Our benchmarks estimate a total added latency of 20-40ms.


Is my data secure?

Portkey is ISO 27001 and SOC 2 certified, and we're GDPR compliant. These certifications reflect the best practices we follow for securing our services and for data storage and retrieval. All your data is encrypted in transit and at rest.

If you're still worried about your data passing through Portkey, we recommend one of the options below:

  1. On request, we can enable a feature that does NOT store any of your request and response body objects in the Portkey datastores or our logs.

  2. For enterprises, we offer managed hosting to deploy Portkey inside private clouds.

If you'd like to discuss these options, feel free to drop us a note at [email protected]

Will Portkey scale if my app explodes?

Portkey has been tested to handle millions of requests per second. We serve over 10M requests every day with 99.99% uptime. We're built on top of scalable infrastructure and can handle huge loads without breaking a sweat.

View our Status Page

Does Portkey impose timeouts on requests?

We currently do not impose any explicit timeouts on our free or paid plans. In the past, some users have experienced timeouts from other frameworks they were using, but Portkey does not time out requests on our end.

Do you support sign ups from non-google/gmail accounts?

Yes! We support registrations with Microsoft accounts - this is currently in beta. Please reach out at [email protected] or ask on Discord for access to Microsoft login.

Where can I reach you?

We're available all the time on Discord, or via our support email - [email protected]
