Nomic

Portkey provides a robust and secure gateway for integrating various Large Language Models (LLMs), including Nomic, into your applications.

Nomic has become especially popular for its high-quality embeddings, and its models are now available through Portkey's AI gateway as well.

With Portkey, you can take advantage of features like fast AI gateway access, observability, prompt management, and more, all while ensuring the secure management of your LLM API keys through a virtual key system.

Portkey SDK Integration with Nomic

Portkey provides a consistent API to interact with embedding models from various providers. To integrate Nomic with Portkey:

1. Create a Virtual Key for Nomic in your Portkey account

Head over to the Virtual Keys tab in your Portkey account and create one for Nomic. This key will then be used to make API requests to Nomic without exposing your actual Nomic API key.

2. Install the Portkey SDK and Initialize with this Virtual Key

Add the Portkey SDK to your application to interact with Nomic's API through Portkey's gateway. Set up Portkey with your virtual key as part of the initialization configuration.

import Portkey from 'portkey-ai'
 
const portkey = new Portkey({
    apiKey: "PORTKEY_API_KEY", // defaults to process.env["PORTKEY_API_KEY"]
    virtualKey: "VIRTUAL_KEY" // Your Nomic Virtual Key
})

3. Invoke the Embeddings API with Nomic

Use the Portkey instance to send embedding requests to Nomic. You can also override the virtual key directly in the API call if needed (see the sketch after the example below).

const embeddings = await portkey.embeddings.create({
    input: "create vector representation on this sentence",
    model: "nomic-embed-text-v1.5",
});

console.log(embeddings);
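
If you want to target a different Nomic virtual key for a particular request, one straightforward approach, sketched below, is to create a client scoped to that key and use it for the call. Note that NOMIC_VIRTUAL_KEY_2 is a placeholder for a second virtual key from your Portkey account; since the gateway returns an OpenAI-compatible embeddings response, the vector itself is expected under data[0].embedding.

import Portkey from 'portkey-ai'

// Placeholder: a second Nomic virtual key created in your Portkey account
const portkeyAlt = new Portkey({
    apiKey: "PORTKEY_API_KEY",
    virtualKey: "NOMIC_VIRTUAL_KEY_2"
})

const response = await portkeyAlt.embeddings.create({
    input: "create a vector representation of this sentence",
    model: "nomic-embed-text-v1.5",
});

// The gateway normalizes responses to the OpenAI embeddings format,
// so the embedding vector should be available here:
console.log(response.data[0].embedding.length);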

Next Steps

The complete list of features supported in the SDK is available at the link below.

Portkey SDK Client
