Use Portkey with LangGraph to take your AI agent workflows to production
Install the required packages
pip install langchain_community
pip install "langgraph[checkpoint]"
Generate API Key
Configure LangChain with Portkey
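A minimal sketch of this setup, assuming the portkey-ai and langchain-openai packages are installed; the key placeholders, virtual key, and model name below are illustrative:

```python
from langchain_openai import ChatOpenAI
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

# Route all LangChain calls through Portkey's gateway
llm = ChatOpenAI(
    api_key="dummy",  # the real provider key is managed inside Portkey
    base_url=PORTKEY_GATEWAY_URL,
    default_headers=createHeaders(
        api_key="YOUR_PORTKEY_API_KEY",
        virtual_key="YOUR_VIRTUAL_KEY",  # or reference a saved config instead
    ),
    model="gpt-4o",
)
```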
Define the functions your agent can call using LangChain's @tool decorator. Here's how to create a simple multiplication tool:
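A minimal sketch of such a tool, assuming langchain-core is installed (the function name and body are illustrative):

```python
from langchain_core.tools import tool

@tool
def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the result."""
    return a * b
```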
The thread_id in the config allows you to maintain separate conversation threads for different users or contexts.
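For example, here is a sketch of invoking a checkpointed agent with a thread_id; it reuses the llm and multiply objects defined above, and the thread value is a placeholder:

```python
from langgraph.checkpoint.memory import MemorySaver
from langgraph.prebuilt import create_react_agent

# `llm` is the Portkey-configured model and `multiply` the tool defined above
agent = create_react_agent(llm, [multiply], checkpointer=MemorySaver())

# Each thread_id keeps an independent conversation history
config = {"configurable": {"thread_id": "user-42"}}
agent.invoke({"messages": [("user", "What is 6 times 7?")]}, config=config)
```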
Manage prompts in Portkey's Prompt Library
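One way to pull a stored template into your workflow is the Portkey SDK's prompt render call. A sketch, assuming the portkey-ai SDK and a prompt template that already exists in your Prompt Library (the prompt ID and variables are placeholders):

```python
from portkey_ai import Portkey

portkey = Portkey(api_key="YOUR_PORTKEY_API_KEY")

# Fetch the versioned template with the variables substituted
rendered = portkey.prompts.render(
    prompt_id="YOUR_PROMPT_ID",
    variables={"topic": "LangGraph"},
)
```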
The _user field in Portkey metadata is specifically designed for user tracking.
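A sketch of attaching this metadata through the Portkey headers on your LangChain model; the metadata values are placeholders:

```python
from portkey_ai import createHeaders

headers = createHeaders(
    api_key="YOUR_PORTKEY_API_KEY",
    metadata={
        "_user": "user-123",         # reserved key used for user-level analytics
        "environment": "production",  # any additional custom keys are allowed
    },
)
# Pass `headers` as default_headers when constructing ChatOpenAI, as shown above
```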
Filter analytics by user
Create Integration
Create Config
Configure Portkey API Key
Step 2: Connect to LangGraph
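A sketch of that connection, assuming the config and API key created in the previous steps; all IDs are placeholders:

```python
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

llm = ChatOpenAI(
    api_key="dummy",  # provider credentials stay inside Portkey
    base_url=PORTKEY_GATEWAY_URL,
    default_headers=createHeaders(
        api_key="YOUR_PORTKEY_API_KEY",
        config="YOUR_CONFIG_ID",  # config created in the step above
    ),
)

# Any LangGraph graph or prebuilt agent can now use this governed model
agent = create_react_agent(llm, tools=[])
```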
Step 1: Implement Budget Controls & Rate Limits
Step 2: Define Model Access Rules
Step 3: Implement Access Controls
Step 4: Deploy & Monitor
How does Portkey enhance LangGraph?
Can I use Portkey with existing LangGraph applications?
Does Portkey work with all LangGraph features?
How do I filter logs and traces for specific graph runs?
Add custom metadata such as graph_id, workflow_type, or session_id to your Portkey-enabled LLM calls, then filter logs and traces by those fields to easily find and analyze specific graph executions.
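For example, a sketch of tagging one graph run so its calls can be filtered later; the trace and metadata values are placeholders:

```python
from portkey_ai import createHeaders

# Give every LLM call in this graph run the same trace and metadata tags
headers = createHeaders(
    api_key="YOUR_PORTKEY_API_KEY",
    trace_id="run-2024-07-15-001",
    metadata={"graph_id": "support_agent", "workflow_type": "rag"},
)
# Use these headers as default_headers on the model for this run, then filter
# by graph_id or trace_id in the Portkey dashboard
```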
Can I use LangGraph's memory features with Portkey?
Yes. You can use LangGraph's MemorySaver checkpointer with Portkey-enabled LLMs. All the memory and state management features work seamlessly with Portkey.