Portkey provides a robust and secure gateway for integrating various Large Language Models (LLMs) into your applications, including models hosted on Nscale.

Provider Slug: nscale

Portkey SDK Integration with Nscale

Portkey provides a consistent API to interact with models from various providers. To integrate Nscale with Portkey:

1. Install the Portkey SDK

npm install --save portkey-ai

2. Initialize Portkey with the Virtual Key

To use Nscale with a virtual key, first get your Nscale API key, then add it to Portkey to create the virtual key.

import Portkey from 'portkey-ai'

const portkey = new Portkey({
  apiKey: "PORTKEY_API_KEY", // defaults to process.env["PORTKEY_API_KEY"]
  virtualKey: "VIRTUAL_KEY" // Your Nscale Virtual Key
})

3. Invoke Chat Completions

const chatCompletion = await portkey.chat.completions.create({
  messages: [{ role: 'user', content: 'Say this is a test' }],
  model: 'meta-llama/Llama-4-Scout-17B-16E-Instruct',
});

console.log(chatCompletion.choices);

4. Invoke Image Generation

const response = await portkey.images.generations.create({
  prompt: "A beautiful sunset over mountains",
  model: "stabilityai/stable-diffusion-xl-base-1.0",
  n: 1,
  size: "1024x1024"
});

console.log(response.data[0].url);

Supported Models


Next Steps

The complete list of features supported in the SDK is available at the link below.

SDK

You’ll find more information in the relevant sections:

  1. Add metadata to your requests
  2. Add gateway configs to your Nscale requests
  3. Tracing Nscale requests
  4. Setup a fallback from OpenAI to Nscale