Portkey supports xAI’s chat completions, completions, and embeddings APIs.
Integrate
Just paste your xAI API key into Portkey to create your Virtual Key.
Sample Request
Portkey is a drop-in replacement for xAI. You can make requests using the official Portkey SDK.
Popular libraries & agent frameworks like LangChain, CrewAI, AutoGen, etc. are also supported.
import Portkey from 'portkey-ai';

const client = new Portkey({
  apiKey: 'PORTKEY_API_KEY',
  virtualKey: 'PROVIDER_VIRTUAL_KEY'
});

async function main() {
  const response = await client.chat.completions.create({
    messages: [{ role: "user", content: "Bob the builder.." }],
    model: "grok-beta",
  });
  console.log(response.choices[0].message.content);
}

main();
Local Setup
If you do not want to use Portkey’s hosted API, you can also run Portkey locally:
Portkey runs on our popular open source Gateway. You can spin it up locally to make requests without sending them to the Portkey API.
npx @portkey-ai/gateway
Your Gateway is running on http://localhost:8080/v1 🚀
Then, just change the baseURL to the local Gateway URL, and make requests:
import Portkey from 'portkey-ai';

const client = new Portkey({
  baseURL: 'http://localhost:8080/v1',
  apiKey: 'PORTKEY_API_KEY',
  virtualKey: 'PROVIDER_VIRTUAL_KEY'
});

async function main() {
  const response = await client.chat.completions.create({
    messages: [{ role: "user", content: "Bob the builder.." }],
    model: "grok-beta",
  });
  console.log(response.choices[0].message.content);
}

main();
Portkey’s data & control planes can be fully deployed on-prem with the Enterprise license.
Integration Overview
xAI Endpoints & Capabilities
Portkey works with all of xAI’s endpoints and supports all xAI capabilities like function calling and image understanding. Find examples for each below:
import Portkey from 'portkey-ai';

const portkey = new Portkey({
  apiKey: 'PORTKEY_API_KEY',
  virtualKey: 'PROVIDER_VIRTUAL_KEY'
});

let tools = [{
  type: "function",
  function: {
    name: "getWeather",
    description: "Get the current weather",
    parameters: {
      type: "object",
      properties: {
        location: { type: "string", description: "City and state" },
        unit: { type: "string", enum: ["celsius", "fahrenheit"] }
      },
      required: ["location"]
    }
  }
}];

let response = await portkey.chat.completions.create({
  model: "grok-beta",
  messages: [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: "What's the weather like in Delhi - respond in JSON" }
  ],
  tools,
  tool_choice: "auto",
});
console.log(response.choices[0].finish_reason);
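When the model decides to call the tool, `finish_reason` is `tool_calls` and the assistant message carries the function name and JSON-encoded arguments. As a rough sketch of the follow-up step (the `get_weather` handler and the simulated `message` below are hypothetical stand-ins, not part of the xAI or Portkey APIs), your app would execute the function and send the result back as a `tool` message:

```python
import json

# Hypothetical local implementation of the "getWeather" tool declared above.
def get_weather(location, unit="celsius"):
    # A real app would call a weather API here; this is a stub for illustration.
    return {"location": location, "temperature": 30, "unit": unit}

def handle_tool_calls(message):
    """Dispatch each tool call in an assistant message and collect the
    results to send back to the model as `tool` role messages."""
    results = []
    for call in message.get("tool_calls", []):
        if call["function"]["name"] == "getWeather":
            args = json.loads(call["function"]["arguments"])
            result = get_weather(**args)
            results.append({
                "role": "tool",
                "tool_call_id": call["id"],
                "content": json.dumps(result),
            })
    return results

# Simulated assistant message, shaped like an OpenAI-compatible response
message = {
    "tool_calls": [{
        "id": "call_1",
        "function": {
            "name": "getWeather",
            "arguments": '{"location": "Delhi, India", "unit": "celsius"}',
        },
    }]
}
print(handle_tool_calls(message))
```

Appending these `tool` messages to the conversation and calling `chat.completions.create` again lets the model produce its final answer from the tool output.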
Process images alongside text using xAI’s vision capabilities:
from portkey_ai import Portkey

portkey = Portkey(
    api_key="PORTKEY_API_KEY",
    virtual_key="PROVIDER_VIRTUAL_KEY"
)

response = portkey.chat.completions.create(
    model="grok-beta",
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What's in this image?"},
                {
                    "type": "image_url",
                    "image_url": {
                        "url": "https://upload.wikimedia.org/wikipedia/commons/thumb/d/dd/Gfp-wisconsin-madison-the-nature-boardwalk.jpg/2560px-Gfp-wisconsin-madison-the-nature-boardwalk.jpg"
                    },
                },
            ],
        }
    ],
    max_tokens=300,
)

print(response)
Generate embeddings for text using xAI’s embedding models:
Coming Soon!
Portkey Features
Here’s a simplified version of how to use Portkey’s Gateway Configuration:
Create a Gateway Configuration
You can create a Gateway configuration using the Portkey Config Dashboard or by writing a JSON configuration in your code. In this example, requests are routed based on the user’s subscription plan (paid or free).
config = {
    "strategy": {
        "mode": "conditional",
        "conditions": [
            {
                "query": { "metadata.user_plan": { "$eq": "paid" } },
                "then": "grok-beta"
            },
            {
                "query": { "metadata.user_plan": { "$eq": "free" } },
                "then": "grok-2-1212"
            }
        ],
        "default": "grok-beta"
    },
    "targets": [
        {
            "name": "grok-beta",
            "virtual_key": "xx"
        },
        {
            "name": "grok-2-1212",
            "virtual_key": "yy"
        }
    ]
}
Process Requests
When a user makes a request, it will pass through Portkey’s AI Gateway. Based on the configuration, the Gateway routes the request according to the user’s metadata.
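The routing decision can be modeled locally to see how the config above resolves. This is only an illustrative sketch (the `route` helper is hypothetical; the real evaluation happens inside the Gateway):

```python
def route(config, metadata):
    """Pick a target name by evaluating the conditional strategy against
    request metadata -- a simplified model of what the Gateway does."""
    for cond in config["strategy"]["conditions"]:
        query = cond["query"]
        # Every "$eq" clause in the query must match the request metadata.
        if all(metadata.get(key.removeprefix("metadata.")) == rule["$eq"]
               for key, rule in query.items()):
            return cond["then"]
    return config["strategy"]["default"]

config = {
    "strategy": {
        "mode": "conditional",
        "conditions": [
            {"query": {"metadata.user_plan": {"$eq": "paid"}}, "then": "grok-beta"},
            {"query": {"metadata.user_plan": {"$eq": "free"}}, "then": "grok-2-1212"},
        ],
        "default": "grok-beta",
    },
}

print(route(config, {"user_plan": "free"}))   # grok-2-1212
print(route(config, {"user_plan": "paid"}))   # grok-beta
print(route(config, {}))                      # no match -> default: grok-beta
```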
Set Up the Portkey Client
Pass the Gateway configuration to your Portkey client. You can either use the config object or the Config ID from Portkey’s hosted version.
from portkey_ai import Portkey

portkey = Portkey(
    api_key="PORTKEY_API_KEY",
    virtual_key="VIRTUAL_KEY",
    config=config
)
That’s it! Portkey seamlessly allows you to make your AI app more robust using built-in gateway features. Learn more about advanced gateway features:
Load Balancing
Distribute requests across multiple targets based on defined weights.
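A minimal load-balancing config could look like the sketch below; the virtual keys and weights are illustrative placeholders, not values from this page:

```json
{
  "strategy": { "mode": "loadbalance" },
  "targets": [
    { "virtual_key": "xai-key-1", "weight": 0.7 },
    { "virtual_key": "xai-key-2", "weight": 0.3 }
  ]
}
```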
Fallbacks
Automatically switch to backup targets if the primary target fails.
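A fallback config is a sketch of the same shape, with target order determining priority (virtual keys below are placeholders):

```json
{
  "strategy": { "mode": "fallback" },
  "targets": [
    { "virtual_key": "xai-key-primary" },
    { "virtual_key": "openai-key-backup" }
  ]
}
```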
Conditional Routing
Route requests to different targets based on specified conditions.
Caching
Enable caching of responses to improve performance and reduce costs.
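As a sketch, caching can be switched on per config with a `cache` block (the key and TTL below are placeholder values):

```json
{
  "virtual_key": "xai-xxx",
  "cache": { "mode": "simple", "max_age": 3600 }
}
```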
Portkey’s AI Gateway enables you to enforce input/output checks on requests by applying custom hooks before and after processing. Protect your users’ and your company’s data with PII guardrails and many more checks available in Portkey Guardrails:
{
  "virtual_key": "xai-xxx",
  "before_request_hooks": [{
    "id": "input-guardrail-id-xx"
  }],
  "after_request_hooks": [{
    "id": "output-guardrail-id-xx"
  }]
}
Learn More About Guardrails
Explore Portkey’s guardrail features to enhance the security and reliability of your AI applications.
Appendix
FAQs
You can sign up for xAI here and grab your API key.
xAI typically gives some amount of free credits without you having to add your credit card. Reach out to their support team if you’d like additional free credits.
You can find the current rate limits imposed by xAI on their console. Use Portkey’s load balancer to work around xAI’s rate limits.