Portkey provides a robust and secure gateway for integrating various Large Language Models (LLMs), including the Lepton AI APIs, into your applications.
With Portkey, you get features like a fast AI gateway, observability, prompt management, and more, while keeping your LLM API keys secure through its virtual key system.
```js
import Portkey from 'portkey-ai'

const portkey = new Portkey({
    apiKey: "PORTKEY_API_KEY", // defaults to process.env["PORTKEY_API_KEY"]
    provider: "@PROVIDER" // Your Lepton AI Virtual Key
})
```
```python
from portkey_ai import Portkey

portkey = Portkey(
    api_key="PORTKEY_API_KEY",  # Replace with your Portkey API key
    provider="@PROVIDER"        # Replace with your virtual key for Lepton
)
```
Lepton AI supports streaming responses to provide real-time generation:
```js
const stream = await portkey.chat.completions.create({
    messages: [{ role: 'user', content: 'Write a story about a robot' }],
    model: 'llama-3-8b-sft-v1',
    stream: true,
});

for await (const chunk of stream) {
    process.stdout.write(chunk.choices[0]?.delta?.content || '');
}
```
```python
stream = portkey.chat.completions.create(
    messages=[{"role": "user", "content": "Write a story about a robot"}],
    model="llama-3-8b-sft-v1",
    stream=True
)

for chunk in stream:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")
```
You can manage all of your Lepton AI prompts in the Prompt Library. All current Lepton AI models are supported, and you can easily start testing different prompts.
Once you're ready with your prompt, use the `portkey.prompts.completions.create` interface to run the prompt in your application.