Give your AI coding assistant the knowledge to work with Portkey SDKs by installing our official skills. This enables AI agents to understand Portkey’s APIs, patterns, and best practices when helping you write code.
Quick Start
Run this command in your project directory:
```shell
npx add-skill portkey-ai/skills
```
This installs Portkey SDK skills, which teach your AI assistant how to use the SDKs effectively.
Install a Specific Skill
Python SDK:

```shell
npx add-skill portkey-ai/skills --skill portkey-python-sdk
```

TypeScript SDK:

```shell
npx add-skill portkey-ai/skills --skill portkey-typescript-sdk
```
Supported AI Coding Assistants
The skills work with any AI coding assistant that supports the Agent Skills format:
| Assistant | Status |
| --- | --- |
| Claude Code | Supported |
| Cursor | Supported |
| OpenCode | Supported |
| GitHub Copilot | Supported |
| Codex | Supported |
| Amp | Supported |
| Roo Code | Supported |
What the Skills Provide
Once installed, your AI coding assistant will have knowledge of:
- SDK Installation & Setup — How to install and configure Portkey SDKs in Python and TypeScript projects
- Chat Completions — Making LLM calls with full streaming support
- Observability & Tracing — Adding trace IDs, metadata, and custom tags for debugging
- Caching — Semantic and simple caching to reduce costs and latency
- Fallbacks — Automatic failover when a provider fails
- Load Balancing — Distributing traffic across multiple providers or API keys
- Multi-Provider Routing — Routing requests to 250+ LLMs through a unified API
- Error Handling — Proper retry logic and error handling patterns
- Framework Integrations — Working with LangChain, LlamaIndex, Strands, and Google ADK
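As one illustration of the streaming item above, here is a minimal sketch assuming Portkey's OpenAI-compatible API (an OpenAI-style `stream=True` flag and delta-format chunks; verify the exact shape against the SDK docs):

```python
# Hedged sketch: streaming a chat completion through Portkey.
# Assumes OpenAI-compatible `stream=True` and delta chunks.
stream_params = {
    "model": "@openai/gpt-4o",
    "messages": [{"role": "user", "content": "Tell me a story"}],
    "stream": True,
}

def print_stream(client):
    # Each chunk mirrors OpenAI's streaming delta format.
    for chunk in client.chat.completions.create(**stream_params):
        delta = chunk.choices[0].delta
        if getattr(delta, "content", None):
            print(delta.content, end="", flush=True)
```

The same parameters work with a `Portkey` client constructed as in the examples below; only the iteration over chunks differs from a non-streaming call.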
Example Usage
After installing the skills, your AI assistant can help you with tasks like:
“Help me set up Portkey with OpenAI fallback to Anthropic”
The assistant will know to use:
Python:

```python
from portkey_ai import Portkey

client = Portkey(
    api_key="YOUR_PORTKEY_API_KEY",
    config={
        "strategy": {"mode": "fallback"},
        "targets": [
            {"override_params": {"model": "@openai/gpt-4o"}},
            {"override_params": {"model": "@anthropic/claude-sonnet-4-20250514"}}
        ]
    }
)

response = client.chat.completions.create(
    messages=[{"role": "user", "content": "Hello!"}]
)
```
TypeScript:

```typescript
import Portkey from 'portkey-ai';

const client = new Portkey({
  apiKey: 'YOUR_PORTKEY_API_KEY',
  config: {
    strategy: { mode: 'fallback' },
    targets: [
      { overrideParams: { model: '@openai/gpt-4o' } },
      { overrideParams: { model: '@anthropic/claude-sonnet-4-20250514' } }
    ]
  }
});

const response = await client.chat.completions.create({
  messages: [{ role: 'user', content: 'Hello!' }]
});
```
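Load balancing, also listed above, uses the same config shape as this fallback example. A minimal sketch, assuming a `loadbalance` strategy mode and per-target `weight` keys; treat those names as assumptions and confirm them in the Portkey config reference:

```python
# Hedged sketch: weighted load balancing across two providers.
# Hypothetical split: 70% of traffic to GPT-4o, 30% to Claude.
loadbalance_config = {
    "strategy": {"mode": "loadbalance"},
    "targets": [
        {"override_params": {"model": "@openai/gpt-4o"}, "weight": 0.7},
        {"override_params": {"model": "@anthropic/claude-sonnet-4-20250514"}, "weight": 0.3},
    ],
}
# Passed just like the fallback config:
# client = Portkey(api_key="YOUR_PORTKEY_API_KEY", config=loadbalance_config)
```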
“Add request tracing to my Portkey calls”
The assistant understands the observability API:
Python:

```python
response = client.with_options(
    trace_id="unique-trace-id",
    metadata={
        "user_id": "user-123",
        "session_id": "session-456",
        "environment": "production"
    }
).chat.completions.create(
    model="@openai/gpt-4o",
    messages=[{"role": "user", "content": "Summarize this document..."}]
)
```
TypeScript:

```typescript
const response = await client.chat.completions.create({
  model: '@openai/gpt-4o',
  messages: [{ role: 'user', content: 'Summarize this document...' }]
}, {
  traceId: 'unique-trace-id',
  metadata: {
    userId: 'user-123',
    sessionId: 'session-456',
    environment: 'production'
  }
});
```
“Enable semantic caching for my LLM requests”
```python
client = Portkey(
    api_key="YOUR_PORTKEY_API_KEY",
    config={
        "cache": {
            "mode": "semantic",  # or "simple" for exact match
            "max_age": 3600      # TTL in seconds
        }
    }
)

# Similar queries return cached responses
response = client.chat.completions.create(
    model="@openai/gpt-4o",
    messages=[{"role": "user", "content": "What is the capital of France?"}]
)
```
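Retries (the Error Handling item above) are configured the same way. A sketch, assuming Portkey's gateway config accepts a `retry` block with `attempts` and `on_status_codes`; the exact key names are an assumption to confirm in the config reference:

```python
# Hedged sketch: gateway-level retries. `attempts` and `on_status_codes`
# are assumed key names; verify against Portkey's config schema.
retry_config = {
    "retry": {
        "attempts": 3,  # retry a failed request up to 3 times
        "on_status_codes": [429, 500, 502, 503, 504],  # rate limits + server errors
    }
}
# client = Portkey(api_key="YOUR_PORTKEY_API_KEY", config=retry_config)
```

Gateway-level retries compose with fallbacks: each target can be retried before the strategy moves on to the next one.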
Installation Options
List Available Skills
```shell
npx add-skill portkey-ai/skills --list
```
Project-Level Installation (Default)
Installs skills to your current project directory:
```shell
npx add-skill portkey-ai/skills
```
Global Installation
Installs skills globally so they’re available across all your projects:
```shell
npx add-skill portkey-ai/skills --global
```
Target Specific Agent
Install for a specific AI coding assistant only:
```shell
npx add-skill portkey-ai/skills --agent cursor
npx add-skill portkey-ai/skills --agent claude
```
Manual Installation
If you prefer to install manually, copy the skill files to your project’s skills directory:
```shell
# Claude Code
mkdir -p .claude/skills/portkey-python-sdk
curl -o .claude/skills/portkey-python-sdk/SKILL.md \
  https://raw.githubusercontent.com/portkey-ai/skills/main/skills/portkey-python-sdk/SKILL.md

# Cursor
mkdir -p .cursor/skills/portkey-python-sdk
curl -o .cursor/skills/portkey-python-sdk/SKILL.md \
  https://raw.githubusercontent.com/portkey-ai/skills/main/skills/portkey-python-sdk/SKILL.md
```
Updating the Skills
To get the latest SDK documentation, re-run the install command:
```shell
npx add-skill portkey-ai/skills
```
The skills repository is kept up to date as new SDK versions are released, so re-running the command always pulls the latest content.
Repository
The skill source is available at: github.com/portkey-ai/skills
Contributions and feedback are welcome.
- Python SDK Docs: full Python SDK documentation
- TypeScript SDK Docs: full TypeScript SDK documentation
- Skills Repository: view the skills source on GitHub
- Discord Community: get help from the community