Supabase provides an open source toolkit for developing AI applications using Postgres and pgvector. With the Portkey integration, you can seamlessly generate embeddings using AI models like OpenAI's and store them in Supabase, enabling efficient data retrieval. Portkey's unified API supports over 250 models, making AI management more streamlined and secure.

Prerequisites

  1. Supabase project URL and API key
  2. Portkey API key

Setting up your environment

First, let’s set up our Python environment with the necessary libraries:

pip install portkey-ai supabase

Preparing your database

1. Create a Supabase Project

Go to Supabase and create a new project.

2. Enable the pgvector extension

pgvector is a PostgreSQL extension that lets you both store and query vector embeddings within your database. You can enable it from the Supabase dashboard under Database → Extensions, or by running the following SQL:

create extension vector;

3. Create a table to store documents and their embeddings

pgvector introduces a new data type called vector. In the SQL below, we create a column named embedding with the vector data type. The size of the vector defines how many dimensions it holds. OpenAI's text-embedding-ada-002 model outputs 1536 dimensions, so we use that for our vector size. We also create a text column named content to store the original document text that produced the embedding. Depending on your use case, you might instead store just a reference (a URL or foreign key) to the document here.

create table documents (
  id bigserial primary key,
  content text,
  embedding vector(1536)
);
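Once rows are stored, pgvector can rank them by distance to a query embedding, which is what powers retrieval later on. As a minimal sketch (the `<=>` operator computes cosine distance; the short bracketed vector is a placeholder, a real query would pass a full 1536-dimension embedding):

```sql
-- Return the 5 documents closest to a query embedding.
-- The vector literal here is a placeholder; in practice you would
-- interpolate an embedding generated the same way as the stored rows.
select content
from documents
order by embedding <=> '[0.1, 0.2, 0.3]'
limit 5;
```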

Configuring Supabase and Portkey

Now, let’s import the required libraries and set up our Supabase and Portkey clients:

from portkey_ai import Portkey
from supabase import create_client, Client

# Supabase setup
supabase_url = "YOUR_SUPABASE_PROJECT_URL"
supabase_key = "YOUR_SUPABASE_API_KEY"
supabase: Client = create_client(supabase_url, supabase_key)

# Portkey setup
portkey_client = Portkey(
    api_key="YOUR_PORTKEY_API_KEY",
    provider="openai",
    virtual_key="YOUR_OPENAI_VIRTUAL_KEY",
)

Replace the placeholder values with your actual Supabase and Portkey credentials.

Generating and storing embeddings

Let’s create a function to generate embeddings using Portkey and OpenAI, and store them in Supabase:

def generate_and_store_embedding(text: str):
    # Generate the embedding through Portkey
    embedding_response = portkey_client.embeddings.create(
        model="text-embedding-ada-002",
        input=text,
        encoding_format="float"
    )
    embedding = embedding_response.data[0].embedding

    # Store the original text and its embedding in Supabase
    result = supabase.table('documents').insert({
        "content": text,
        "embedding": embedding
    }).execute()
    return result

generate_and_store_embedding("The food was delicious and the waiter...")

This function takes a text input, generates an embedding through Portkey, and stores both the original text and its embedding in the Supabase documents table.

Portkey supports 250+ models; you can switch to any of them simply by changing the provider and virtual_key.

Here's an example of how to use Cohere with Portkey:

client = Portkey(
    api_key="YOUR_PORTKEY_API_KEY", # defaults to os.environ.get("PORTKEY_API_KEY")
    provider="cohere",
    virtual_key="YOUR_COHERE_VIRTUAL_KEY",
)

embeddings = client.embeddings.create(
  model="embed-english-v3.0",
  input_type="search_query",
  input="The food was delicious and the waiter...",
  encoding_format="float"
)

Note that Cohere's embed-english-v3.0 model outputs 1024 dimensions, so you will need a table with vector(1024) rather than vector(1536).
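As a sketch, a matching table for Cohere embeddings could look like this (the table name documents_cohere is an illustrative assumption, not part of the integration):

```sql
-- 1024 dimensions to match the output of Cohere's embed-english-v3.0
create table documents_cohere (
  id bigserial primary key,
  content text,
  embedding vector(1024)
);
```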