Portkey provides a robust and secure gateway for integrating various Large Language Models (LLMs), including Nomic, into your applications.

Nomic has become especially popular for its embedding models, and it is now available through Portkey’s AI gateway as well.

With Portkey, you can take advantage of features like fast AI gateway access, observability, prompt management, and more, all while ensuring the secure management of your LLM API keys through a virtual key system.

Provider Slug: nomic

Portkey SDK Integration with Nomic

Portkey provides a consistent API to interact with embedding models from various providers. To integrate Nomic with Portkey:

1. Integrate Nomic in your Portkey account

Head over to the Integrations tab and connect Nomic with your API key. Portkey will then use this integration to make API requests to Nomic without exposing the protected API key in your application. Grab your Nomic API key from here.

Connect Nomic in Portkey

2. Install the Portkey SDK and Initialize with Nomic

Add the Portkey SDK to your application to interact with Nomic’s API through Portkey’s gateway.

import Portkey from 'portkey-ai'

const portkey = new Portkey({
    apiKey: "PORTKEY_API_KEY", // defaults to process.env["PORTKEY_API_KEY"]
    provider:"@nomic" // Your Nomic provider slug from Portkey
})

3. Invoke the Embeddings API with Nomic

Use the Portkey instance to send embedding requests to Nomic. You can also override the virtual key directly in the API call if needed.

const embeddings = await portkey.embeddings.create({
    input: "create a vector representation of this sentence",
    model: "nomic-embed-text-v1.5",
});

console.log(embeddings);
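The response contains one embedding vector per input (the examples here assume the OpenAI-compatible response shape, where vectors live in `data[i].embedding`). A common next step is comparing two such vectors with cosine similarity; below is a minimal, self-contained sketch that works on any pair of equal-length numeric arrays:

```javascript
// Cosine similarity between two embedding vectors.
// The toy arrays below stand in for real Nomic embeddings, which are
// assumed to arrive as numeric arrays in response.data[i].embedding.
function cosineSimilarity(a, b) {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];   // accumulate the dot product
    normA += a[i] * a[i]; // squared magnitude of a
    normB += b[i] * b[i]; // squared magnitude of b
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

console.log(cosineSimilarity([1, 0, 1], [1, 0, 1])); // identical vectors → 1
console.log(cosineSimilarity([1, 0], [0, 1]));       // orthogonal vectors → 0
```

A score near 1 indicates semantically similar inputs, which is the typical basis for embedding-backed search and retrieval.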

Next Steps

The complete list of features supported in the SDK is available at the link below.

SDK

You’ll find more information in the relevant sections:

  1. API Reference for Embeddings
  2. Add metadata to your requests
  3. Add gateway configs to your Nomic requests
  4. Tracing Nomic requests