Portkey is the Control Panel for AI apps. With its popular AI Gateway and Observability Suite, hundreds of teams ship reliable, cost-efficient, and fast apps.

Quickstart

Since Portkey is fully compatible with the OpenAI signature, you can connect to the Portkey AI Gateway through the OpenAI client.

  • Set the base_url to PORTKEY_GATEWAY_URL
  • Add default_headers using the createHeaders helper method to supply the headers Portkey needs.

Install the OpenAI and Portkey SDKs

pip install -qU portkey-ai openai

Create the client

import os
from openai import OpenAI
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

client = OpenAI(
  api_key=os.environ.get("OPENAI_API_KEY"),
  base_url=PORTKEY_GATEWAY_URL, # 👈 or 'http://localhost:8787/v1'
  default_headers=createHeaders(
    provider="openai", # 👈 or 'anthropic', 'together-ai', 'stability-ai', etc.
    api_key=os.environ.get("PORTKEY_API_KEY") # 👈 skip when self-hosting
  )
)

Examples

OpenAI Chat Completion

Provider: openai

Model being tested here: gpt-4o-mini

import os
from openai import OpenAI
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

client = OpenAI(
  api_key=os.environ.get("OPENAI_API_KEY"),
  base_url=PORTKEY_GATEWAY_URL, # 👈 or 'http://localhost:8787/v1'
  default_headers=createHeaders(
    provider="openai",
    api_key=os.environ.get("PORTKEY_API_KEY") # 👈 skip when self-hosting
  )
)

completion = client.chat.completions.create(
  model="gpt-4o-mini",
  messages=[{"role": "user", "content": "What is a fractal?"}],
)
print(completion.choices[0].message.content)
A fractal is a complex geometric shape that can be split into parts, each of which is a reduced-scale copy of the whole. Fractals are typically self-similar and independent of scale, meaning they look similar at any zoom level. They often appear in nature, in things like snowflakes, coastlines, and fern leaves. The term "fractal" was coined by mathematician Benoit Mandelbrot in 1975.

Anthropic

Provider: anthropic

Model being tested here: claude-3-5-sonnet-20240620


import os
from openai import OpenAI
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

client = OpenAI(
  api_key=os.environ.get("ANTHROPIC_API_KEY"), # your Anthropic API key
  base_url=PORTKEY_GATEWAY_URL, # 👈 or 'http://localhost:8787/v1'
  default_headers=createHeaders(
    provider="anthropic",
    api_key=os.environ.get("PORTKEY_API_KEY") # 👈 skip when self-hosting
  )
)

completion = client.chat.completions.create(
  model="claude-3-5-sonnet-20240620",
  messages=[{"role": "user", "content": "What is a fractal?"}],
  max_tokens=250
)
print(completion.choices[0].message.content)
A fractal is a complex geometric shape that can be split into parts, each of which is a reduced-scale copy of the whole. Fractals are typically self-similar and independent of scale, meaning they look similar at any zoom level. They often appear in nature, in things like snowflakes, coastlines, and fern leaves. The term "fractal" was coined by mathematician Benoit Mandelbrot in 1975.

Mistral AI

Provider: mistral-ai

Model being tested here: mistral-medium

import os
from openai import OpenAI
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

client = OpenAI(
  api_key=os.environ.get("MISTRAL_API_KEY"), # your Mistral AI API key
  base_url=PORTKEY_GATEWAY_URL, # 👈 or 'http://localhost:8787/v1'
  default_headers=createHeaders(
    provider="mistral-ai",
    api_key=os.environ.get("PORTKEY_API_KEY") # 👈 skip when self-hosting
  )
)

completion = client.chat.completions.create(
  model="mistral-medium",
  messages=[{"role": "user", "content": "What is a fractal?"}],
)
print(completion.choices[0].message.content)
A fractal is a complex geometric shape that can be spl

Together AI

Provider: together-ai

Model being tested here: togethercomputer/llama-2-70b-chat

import os
from openai import OpenAI
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

client = OpenAI(
  api_key=os.environ.get("TOGETHER_API_KEY"), # your Together AI API key
  base_url=PORTKEY_GATEWAY_URL, # 👈 or 'http://localhost:8787/v1'
  default_headers=createHeaders(
    provider="together-ai",
    api_key=os.environ.get("PORTKEY_API_KEY") # 👈 skip when self-hosting
  )
)

completion = client.chat.completions.create(
  model="togethercomputer/llama-2-70b-chat",
  messages=[{"role": "user", "content": "What is a fractal?"}],
)
print(completion.choices[0].message.content)
A fractal is a complex geometric shape that can be spl

Portkey Supports Other Providers

Portkey supports 30+ providers and all the models within those providers. To use a different provider and model with the OpenAI SDK, you just need to change the provider slug and model name in your code, along with that provider's auth key. It's that easy!

If you want to see all the providers Portkey works with, check out the list of providers.
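
For example, here is a minimal sketch of routing the same request to another provider through the Node SDK. The provider slug (groq) and model name below are illustrative; swap in values from Portkey's provider list and your own keys.

import OpenAI from 'openai';
import { PORTKEY_GATEWAY_URL, createHeaders } from 'portkey-ai'

// Same OpenAI SDK, different provider: only the provider slug, the model
// name, and the auth key change. The slug and model here are illustrative.
const client = new OpenAI({
  apiKey: process.env.GROQ_API_KEY, // the target provider's own API key
  baseURL: PORTKEY_GATEWAY_URL,
  defaultHeaders: createHeaders({
    provider: "groq",
    apiKey: process.env.PORTKEY_API_KEY
  })
});

const completion = await client.chat.completions.create({
  model: "llama-3.1-8b-instant", // a model served by that provider
  messages: [{ role: "user", content: "What is a fractal?" }]
});
console.log(completion.choices[0].message.content);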

OpenAI Embeddings


import os
import pandas as pd  # used for the DataFrame steps below
from openai import OpenAI
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

client = OpenAI(
  api_key=os.environ.get("OPENAI_API_KEY"),
  base_url=PORTKEY_GATEWAY_URL, # 👈 or 'http://localhost:8787/v1'
  default_headers=createHeaders(
    provider="openai",
    api_key=os.environ.get("PORTKEY_API_KEY") # 👈 skip when self-hosting
  )
)

def get_embedding(text, model="text-embedding-3-small"):
   text = text.replace("\n", " ")
   return client.embeddings.create(input=[text], model=model).data[0].embedding

# Assumes a pandas DataFrame `df` with a text column named 'combined'
df['ada_embedding'] = df.combined.apply(lambda x: get_embedding(x, model='text-embedding-3-small'))
df.to_csv('output/embedded_1k_reviews.csv', index=False)

OpenAI Function Calling


import OpenAI from 'openai'; // We're using the v4 SDK
import { PORTKEY_GATEWAY_URL, createHeaders } from 'portkey-ai'

const openai = new OpenAI({
  apiKey: 'OPENAI_API_KEY', // defaults to process.env["OPENAI_API_KEY"],
  baseURL: PORTKEY_GATEWAY_URL,
  defaultHeaders: createHeaders({
    provider: "openai",
    apiKey: "PORTKEY_API_KEY" // defaults to process.env["PORTKEY_API_KEY"]
  })
});

// Generate a chat completion with tool (function) calling
async function getChatCompletionFunctions(){
  const messages = [{"role": "user", "content": "What is the weather like in Boston today?"}];
  const tools = [
      {
        "type": "function",
        "function": {
          "name": "get_current_weather",
          "description": "Get the current weather in a given location",
          "parameters": {
            "type": "object",
            "properties": {
              "location": {
                "type": "string",
                "description": "The city and state, e.g. San Francisco, CA",
              },
              "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["location"],
          },
        }
      }
  ];

  const response = await openai.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages: messages,
    tools: tools,
    tool_choice: "auto",
  });

  console.log(response)

}
await getChatCompletionFunctions();
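
If the model decides to call the tool, the call shows up on response.choices[0].message.tool_calls with its arguments serialized as a JSON string. Below is a minimal sketch of reading it; the local weather lookup is a hypothetical stand-in.

// Sketch: inspect tool calls on the `response` object returned above
function handleToolCalls(response) {
  const toolCalls = response.choices[0].message.tool_calls ?? [];
  for (const toolCall of toolCalls) {
    if (toolCall.function.name === "get_current_weather") {
      const args = JSON.parse(toolCall.function.arguments); // e.g. { location: "Boston, MA" }
      console.log(`Would look up weather for ${args.location} in ${args.unit ?? "celsius"}`);
    }
  }
}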

OpenAI Chat-Vision

import OpenAI from 'openai'; // We're using the v4 SDK
import { PORTKEY_GATEWAY_URL, createHeaders } from 'portkey-ai'

const openai = new OpenAI({
  apiKey: 'OPENAI_API_KEY', // defaults to process.env["OPENAI_API_KEY"],
  baseURL: PORTKEY_GATEWAY_URL,
  defaultHeaders: createHeaders({
    provider: "openai",
    apiKey: "PORTKEY_API_KEY" // defaults to process.env["PORTKEY_API_KEY"]
  })
});

// Generate a chat completion with an image input
async function getChatCompletionVision(){
  const response = await openai.chat.completions.create({
    model: "gpt-4-vision-preview",
    messages: [
      {
        role: "user",
        content: [
          { type: "text", text: "What’s in this image?" },
          {
            type: "image_url",
            image_url:
              "https://upload.wikimedia.org/wikipedia/commons/thumb/d/dd/Gfp-wisconsin-madison-the-nature-boardwalk.jpg/2560px-Gfp-wisconsin-madison-the-nature-boardwalk.jpg",
          },
        ],
      },
    ],
  });

  console.log(response)

}
await getChatCompletionVision();

Images

import OpenAI from 'openai'; // We're using the v4 SDK
import { PORTKEY_GATEWAY_URL, createHeaders } from 'portkey-ai'

const openai = new OpenAI({
  apiKey: 'OPENAI_API_KEY', // defaults to process.env["OPENAI_API_KEY"],
  baseURL: PORTKEY_GATEWAY_URL,
  defaultHeaders: createHeaders({
    provider: "openai",
    apiKey: "PORTKEY_API_KEY" // defaults to process.env["PORTKEY_API_KEY"]
  })
});

async function main() {
  const image = await openai.images.generate({
    model: "dall-e-3",
    prompt: "Lucy in the sky with diamonds"
  });

  console.log(image.data);
}

main();

OpenAI Audio

Here are examples of Transcription and Translation (speech-to-text); a Text-to-Speech sketch follows after them.

import fs from "fs";
import OpenAI from "openai";
import { PORTKEY_GATEWAY_URL, createHeaders } from 'portkey-ai'

const openai = new OpenAI({
  baseURL: PORTKEY_GATEWAY_URL,
  defaultHeaders: createHeaders({
    apiKey: "PORTKEY_API_KEY",
    virtualKey: "OPENAI_VIRTUAL_KEY"
  })
});

// Transcription

async function transcribe() {
  const transcription = await openai.audio.transcriptions.create({
    file: fs.createReadStream("/path/to/file.mp3"),
    model: "whisper-1",
  });

  console.log(transcription.text);
}
transcribe();

// Translation

async function translate() {
    const translation = await openai.audio.translations.create({
        file: fs.createReadStream("/path/to/file.mp3"),
        model: "whisper-1",
    });
    console.log(translation.text);
}
translate();
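
The same client can also synthesize speech. Below is a minimal Text-to-Speech sketch; the model (tts-1) and voice (alloy) are standard OpenAI values, so adjust them to your setup.

// Text-to-Speech: synthesize audio from text and write it to an MP3 file
async function speak() {
  const mp3 = await openai.audio.speech.create({
    model: "tts-1",
    voice: "alloy",
    input: "Portkey is the control panel for AI apps.",
  });
  const buffer = Buffer.from(await mp3.arrayBuffer());
  await fs.promises.writeFile("speech.mp3", buffer);
}
speak();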

OpenAI Batch - Create Batch

import OpenAI from 'openai';
import { PORTKEY_GATEWAY_URL, createHeaders } from 'portkey-ai'

const client = new OpenAI({
  baseURL: PORTKEY_GATEWAY_URL,
  defaultHeaders: createHeaders({
    apiKey: "PORTKEY_API_KEY",
    virtualKey: "PROVIDER_VIRTUAL_KEY"
  })
});

async function main() {
  const batch = await client.batches.create({
    input_file_id: "file-abc123",
    endpoint: "/v1/chat/completions",
    completion_window: "24h"
  });

  console.log(batch);
}

main();

Files - Upload File

import fs from "fs";
import OpenAI from 'openai';
import { PORTKEY_GATEWAY_URL, createHeaders } from 'portkey-ai'

const client = new OpenAI({
  baseURL: PORTKEY_GATEWAY_URL,
  defaultHeaders: createHeaders({
    apiKey: "PORTKEY_API_KEY",
    virtualKey: "PROVIDER_VIRTUAL_KEY"
  })
});

async function main() {
  const file = await client.files.create({
    file: fs.createReadStream("mydata.jsonl"),
    purpose: "batch",
  });

  console.log(file);
}

main();