The new Model Catalog makes discovering and using AI models in Portkey easier and more flexible than ever before. This guide shows you how to find available models and how to call them in your code.

Inside your workspace, the “Virtual Keys” item in the sidebar has been replaced by the Model Catalog. Clicking it opens two tabs: “AI Providers” and “Models”.

The Models tab is your new “Model Garden”: a central gallery that lists every model your admins have given you access to, across all providers (OpenAI, Anthropic, Google, etc.).

  • Discover Models: Browse or search for any model available to you.

  • See Providers: Click on a model to see which provider(s) you can use to access it. For example, claude-3-opus might be available via both AWS Bedrock and Anthropic directly.

  • Get Code Snippets: The best part! Click a model, select your provider and language, and Portkey will generate the exact code snippet you need to start making calls.

The New Way to Call Models: Simple and Powerful

With the Model Catalog, you no longer need to rely on your admin to bind a specific virtual key to your API key. You can now choose your provider and model directly in your request.

The model parameter now accepts a new format: @{provider_slug}/{model_slug} (a short sketch after the list below shows how the two parts combine).

  • provider_slug: The unique slug for the provider (e.g., openai-prod). You can find this in your Model Garden.
  • model_slug: The name of the model you want to use (e.g., gpt-4o).
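For clarity, here is a tiny sketch of how the two slugs combine into the string you pass as model. The model_ref helper is purely illustrative and not part of the Portkey SDK:

# Hypothetical helper: builds the "@{provider_slug}/{model_slug}" string
def model_ref(provider_slug: str, model_slug: str) -> str:
    return f"@{provider_slug}/{model_slug}"

print(model_ref("openai-prod", "gpt-4o"))  # -> @openai-prod/gpt-4o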

Example: Switching Between Providers on the Fly

Imagine your admin has given you access to OpenAI via a provider with the slug openai-main and to Anthropic on Bedrock via bedrock-us-east-1. With a single Portkey API key, you can do this:

# main.py
from portkey_ai import Portkey

# Your single Portkey API Key is all you need
client = Portkey()  # or Portkey(api_key="...") if you prefer to pass the key explicitly

# Make a call to GPT-4o
response_openai = client.chat.completions.create(
  model="@openai-main/gpt-4o",
  messages=[{"role": "user", "content": "Explain the theory of relativity in simple terms."}]
)
print(response_openai.choices[0].message.content)

# In the same script, switch to Claude 3 Sonnet
response_claude = client.chat.completions.create(
  model="@bedrock-us-east-1/anthropic.claude-3-sonnet-20240229-v1:0",
  messages=[{"role": "user", "content": "Write a short poem about coding."}]
)
print(response_claude.choices[0].message.content)

What About the Old Way?

It still works! If your code passes a virtual_key in the header or in a Portkey Config, it will continue to function without any changes.
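For reference, here is a minimal sketch of the legacy pattern, assuming a virtual key slug of openai-vk-prod (hypothetical) that your admin set up earlier:

# legacy_main.py
from portkey_ai import Portkey

# Legacy pattern: the virtual key is bound when the client is created
client = Portkey(
    api_key="YOUR_PORTKEY_API_KEY",
    virtual_key="openai-vk-prod"  # hypothetical virtual key slug
)

# The model string is just the model name; the provider comes from the virtual key
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Say hello."}]
)
print(response.choices[0].message.content)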

The new model parameter format is an enhancement for flexibility and ease of use, allowing you to access the full power of your workspace’s Model Garden with minimal friction.