Portkey is a control panel for your Vercel AI app. It makes your LLM integrations prod-ready, reliable, fast, and cost-efficient.
To get started, add the `ai` and `portkey-ai` packages as dependencies.
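The two packages above can be added with your package manager of choice; for example, with npm:

```sh
npm install ai portkey-ai
```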
Next, add your API keys to your project's `.env` file.
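For example, a `.env` sketch with placeholder values (the variable names here are assumptions; use whichever names your route handler reads):

```sh
# Placeholder keys; replace with your real values.
PORTKEY_API_KEY=pk-...
OPENAI_API_KEY=sk-...
```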
Create a Route Handler at `app/api/chat/route.ts` that accepts a POST request with a `messages` array of strings and calls your chat model (GPT-4 or `gpt-3.5-turbo`). Portkey forwards the request to the model's `/chat/completions` endpoint, and the response is streamed to your Next.js app.
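A minimal sketch of such a handler, assuming the classic `ai` SDK stream helpers (`OpenAIStream`, `StreamingTextResponse`) and the `portkey-ai` client; exact APIs and option names may differ by SDK version:

```ts
// app/api/chat/route.ts: a sketch, not a drop-in implementation.
import Portkey from 'portkey-ai';
import { OpenAIStream, StreamingTextResponse } from 'ai';

// Portkey client; the environment variable name is an assumption.
const portkey = new Portkey({ apiKey: process.env.PORTKEY_API_KEY });

export const runtime = 'edge';

export async function POST(req: Request) {
  const { messages } = await req.json();

  // OpenAI-compatible chat completion, streamed back to the client.
  const response = await portkey.chat.completions.create({
    model: 'gpt-3.5-turbo',
    stream: true,
    messages,
  });

  const stream = OpenAIStream(response as any);
  return new StreamingTextResponse(stream);
}
```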
Next, call the `useChat` hook from your client component. The `useChat` hook will, by default, POST to the Route Handler we created earlier (`/api/chat`). You can override this default by passing an `api` prop to the hook: `useChat({ api: '...' })`.
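On the client, a minimal chat component wired to the hook might look like this (a sketch against the `ai/react` export; adjust to your SDK version):

```tsx
'use client';

import { useChat } from 'ai/react';

export default function Chat() {
  // Defaults to POSTing to /api/chat; pass { api: '...' } to override.
  const { messages, input, handleInputChange, handleSubmit } = useChat();

  return (
    <div>
      {messages.map((m) => (
        <div key={m.id}>
          {m.role}: {m.content}
        </div>
      ))}
      <form onSubmit={handleSubmit}>
        <input value={input} onChange={handleInputChange} />
      </form>
    </div>
  );
}
```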
*(Screenshot: Portkey's rolling logs and cache screens.)*
You can send any `{"key":"value"}` pairs inside the `metadata` header. Portkey segments the requests based on this metadata to give you granular insights.
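As a sketch, the header value is just serialized JSON; the helper name below is hypothetical, and the exact header name (e.g. `x-portkey-metadata`) should be checked against Portkey's docs:

```typescript
// Hypothetical helper: serialize key/value metadata for Portkey's metadata header.
function metadataHeader(pairs: Record<string, string>): string {
  // Portkey expects a JSON object of string pairs, e.g. {"_user":"user-123"}.
  return JSON.stringify(pairs);
}

console.log(metadataHeader({ _user: "user-123", environment: "production" }));
// → {"_user":"user-123","environment":"production"}
```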
To enable caching, set the `cache` `mode` to `semantic` in your Gateway Config:
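A minimal Gateway Config sketch (the field names follow Portkey's JSON config convention; verify against the current docs):

```json
{
  "cache": {
    "mode": "semantic"
  }
}
```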
You can also set the `max-age` of the cache and force-refresh a cache. See the docs for more information.
*(Screenshot: the Vercel app prompt.)*