Prerequisites

Setting up your environment

First, let’s set up our Python environment with the necessary libraries:

!pip install portkey-ai supabase

Preparing your database

  1. Create a Supabase account
  2. Enable pgvector, an extension for PostgreSQL that allows you to both store and query vector embeddings within your database. Let’s try it out.

First we’ll enable the Vector extension. In Supabase, this can be done from the web portal through Database → Extensions. You can also do this in SQL by running:

create extension vector;
  3. Next, let’s create a table to store our documents and their embeddings:

create table documents (
  id bigserial primary key,
  content text,
  embedding vector(1536)
);

pgvector introduces a new data type called vector. In the code above, we create a column named embedding with the vector data type. The size of the vector defines how many dimensions the vector holds. OpenAI’s text-embedding-ada-002 model outputs 1536 dimensions, so we will use that for our vector size. We also create a text column named content to store the original document text that produced this embedding. Depending on your use case, you might just store a reference (URL or foreign key) to a document here instead.

Configuring Supabase and Portkey

Next, we’ll import the required libraries and set up our Supabase and Portkey clients:

from portkey_ai import Portkey
from supabase import create_client, Client

# Supabase setup
supabase_url = "YOUR_SUPABASE_PROJECT_URL"
supabase_key = "YOUR_SUPABASE_API_KEY"
supabase: Client = create_client(supabase_url, supabase_key)

# Portkey setup
portkey_client = Portkey(
    api_key="YOUR_PORTKEY_API_KEY",
    provider="openai",
    virtual_key="YOUR_OPENAI_VIRTUAL_KEY",
)

Replace the placeholder values with your actual Supabase and Portkey credentials.
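If you prefer not to hardcode credentials, one option is to load them from environment variables. The variable names below (SUPABASE_URL, SUPABASE_KEY, PORTKEY_API_KEY, OPENAI_VIRTUAL_KEY) are just an illustrative convention, not something the SDKs require:

import os

# Assumed environment variable names -- adjust to whatever you use
supabase_url = os.environ["SUPABASE_URL"]
supabase_key = os.environ["SUPABASE_KEY"]
supabase: Client = create_client(supabase_url, supabase_key)

portkey_client = Portkey(
    api_key=os.environ["PORTKEY_API_KEY"],
    provider="openai",
    virtual_key=os.environ["OPENAI_VIRTUAL_KEY"],
)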

Generating and storing embeddings

Now, let’s create a function to generate embeddings using Portkey and OpenAI, and store them in Supabase:

def generate_and_store_embedding(text: str):
    # Generate an embedding for the text through Portkey
    embedding_response = portkey_client.embeddings.create(
        model="text-embedding-ada-002",
        input=text,
        encoding_format="float"
    )
    embedding = embedding_response.data[0].embedding

    # Store the original text and its embedding in Supabase
    result = supabase.table('documents').insert({
        "content": text,
        "embedding": embedding
    }).execute()
    return result



This function takes a text input, generates an embedding through Portkey, and then stores both the original text and its embedding in the Supabase documents table.
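As a quick sanity check, you can call the helper directly; the sample sentence is the one used earlier, and supabase-py returns the inserted rows on result.data:

# Example usage: store one document and inspect the inserted row
result = generate_and_store_embedding("The food was delicious and the waiter...")
print(result.data)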

Portkey supports 250+ models; you can switch to any of them simply by changing the provider and virtual_key.

Here’s an example of how to use Cohere with Portkey:

client = Portkey(
    api_key="YOUR_PORTKEY_API_KEY", # defaults to os.environ.get("PORTKEY_API_KEY")
    provider="cohere",
    virtual_key="YOUR_COHERE_VIRTUAL_KEY",
)

embeddings = client.embeddings.create(
  model="embed-english-v3.0",
  input_type="search_query",
  input="The food was delicious and the waiter...",
  encoding_format="float"
)
  • Note: Cohere’s embed-english-v3.0 model outputs 1024-dimensional embeddings, so you will need to create a new table with vector(1024) instead of vector(1536) to store them.
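For example, a separate table for Cohere embeddings could look like this (the table name documents_cohere is just an illustrative choice):

create table documents_cohere (
  id bigserial primary key,
  content text,
  embedding vector(1024)
);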