Replicate
Replicate is a platform for building and running machine learning models.
Replicate does not have a standardized JSON body format for its inference API, so it is not possible to use Portkey's unified API to interact with Replicate. Instead, Portkey provides a proxy to Replicate, letting you manage auth on Portkey and log all Replicate requests.
Portkey SDK Integration with Replicate
To integrate Replicate with Portkey:
1. Install the Portkey SDK
Add the Portkey SDK to your application to interact with Replicate through Portkey’s gateway.
npm install --save portkey-ai
pip install portkey-ai
2. Create a New Integration
To use Replicate with Portkey, get your Replicate API key from your Replicate account, then add it to Portkey to create your Replicate virtual key.
import Portkey from 'portkey-ai'

const portkey = new Portkey({
    apiKey: "PORTKEY_API_KEY", // defaults to process.env["PORTKEY_API_KEY"]
    provider: "@PROVIDER" // Your Replicate virtual key
})
from portkey_ai import Portkey

portkey = Portkey(
    api_key="PORTKEY_API_KEY",  # Replace with your Portkey API key
    provider="@PROVIDER"        # Replace with your virtual key for Replicate
)
from openai import OpenAI
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

client = OpenAI(
    api_key="REPLICATE_API_KEY",
    base_url=PORTKEY_GATEWAY_URL,
    default_headers=createHeaders(
        api_key="PORTKEY_API_KEY",
        provider="replicate"
    )
)
import OpenAI from "openai";
import { PORTKEY_GATEWAY_URL, createHeaders } from "portkey-ai";

const client = new OpenAI({
    apiKey: "REPLICATE_API_KEY",
    baseURL: PORTKEY_GATEWAY_URL,
    defaultHeaders: createHeaders({
        provider: "replicate",
        apiKey: "PORTKEY_API_KEY",
    }),
});
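If you prefer not to hard-code credentials, the same client can be initialized from environment variables. A minimal sketch, assuming the variable names PORTKEY_API_KEY and REPLICATE_PROVIDER (both illustrative, not required by the SDK):

import os

from portkey_ai import Portkey

# Read credentials from the environment instead of hard-coding them.
# PORTKEY_API_KEY and REPLICATE_PROVIDER are illustrative variable names.
portkey = Portkey(
    api_key=os.environ["PORTKEY_API_KEY"],      # your Portkey API key
    provider=os.environ["REPLICATE_PROVIDER"],  # e.g. "@your-replicate-virtual-key"
)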
3. Use the Portkey SDK to interact with Replicate
from portkey_ai import Portkey

portkey = Portkey(
    api_key="PORTKEY_API_KEY",  # Replace with your Portkey API key
    provider="@REPLICATE_PROVIDER",
)

response = portkey.post(
    url="predictions",  # Replace with the endpoint you want to call
)

print(response)
import Portkey from 'portkey-ai';

// Initialize the Portkey client
const portkey = new Portkey({
    apiKey: "PORTKEY_API_KEY", // Replace with your Portkey API key
    provider: "@REPLICATE_PROVIDER", // Add your Replicate virtual key
});

const response = await portkey.post(
    "predictions" // Replace with the endpoint you want to call
);

console.log(response);
import OpenAI from 'openai'; // We're using the v4 SDK
import { PORTKEY_GATEWAY_URL, createHeaders } from 'portkey-ai'

const openai = new OpenAI({
    apiKey: 'REPLICATE_API_KEY', // defaults to process.env["OPENAI_API_KEY"]
    baseURL: PORTKEY_GATEWAY_URL,
    defaultHeaders: createHeaders({
        provider: "@REPLICATE_PROVIDER",
        apiKey: "PORTKEY_API_KEY", // defaults to process.env["PORTKEY_API_KEY"]
    })
});

const response = await openai.post(
    "/predictions" // Replace with the endpoint you want to call
);

console.log(response);
from openai import OpenAI
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

openai = OpenAI(
    api_key='REPLICATE_API_KEY',
    base_url=PORTKEY_GATEWAY_URL,
    default_headers=createHeaders(
        provider="replicate",
        api_key="PORTKEY_API_KEY"
    )
)

response = openai.post(
    url="predictions",  # Replace with the endpoint you want to call
)

print(response)
curl --location --request POST 'https://api.portkey.ai/v1/predictions' \
--header 'x-portkey-provider: REPLICATE_PROVIDER' \
--header 'x-portkey-api-key: PORTKEY_API_KEY'
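The calls above hit the predictions endpoint without a request body. Replicate's predictions API expects a JSON payload containing a model version and an input object, which you pass straight through the Portkey proxy. A minimal sketch with the Python SDK follows; it assumes that extra keyword arguments to the generic post method are forwarded as the JSON body, and the version hash and prompt are placeholders for your own model's values:

from portkey_ai import Portkey

portkey = Portkey(
    api_key="PORTKEY_API_KEY",       # Replace with your Portkey API key
    provider="@REPLICATE_PROVIDER",  # Your Replicate virtual key
)

# Assumption: extra keyword arguments are forwarded as the JSON request body.
# The body shape (version + input) follows Replicate's public predictions API;
# replace the version hash and prompt with values for your model.
response = portkey.post(
    url="predictions",
    version="REPLICATE_MODEL_VERSION",  # placeholder model version hash
    input={"prompt": "a photo of an astronaut riding a horse"},
)

print(response)

The same payload can be sent with the cURL command above by adding a Content-Type: application/json header and a --data body.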