The Tracing capabilities in Portkey empower you to monitor the lifecycle of your LLM requests in a unified, chronological view. By sending a `trace ID` with your requests, all related LLM calls are grouped together in the Traces View, appearing as "spans" within that trace.
A "span" is a subgrouping of LLM calls. Depending on how you instrument your requests, it can refer to a nested group within your trace or to a single LLM call.
Each span has a `spanId` and an optional `spanName`. Child spans link to a parent via the `parentSpanId`. Parentless spans become root nodes.
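To make the parent/child relationship concrete, here is an illustrative sketch (not Portkey code) of how spans carrying `parentSpanId` references resolve into a trace tree. Portkey performs this grouping server-side from the IDs you send; the span names and IDs below are made up.

```python
# Four hypothetical spans belonging to one trace.
spans = [
    {"span_id": "root-1", "parent_span_id": None, "span_name": "handle-request"},
    {"span_id": "s-1", "parent_span_id": "root-1", "span_name": "retrieve-context"},
    {"span_id": "s-2", "parent_span_id": "root-1", "span_name": "generate-answer"},
    {"span_id": "s-3", "parent_span_id": "s-2", "span_name": "llm-call"},
]

def build_trace_tree(spans):
    """Group spans by parent ID; spans without a parent become root nodes."""
    roots, children = [], {}
    for span in spans:
        parent = span["parent_span_id"]
        if parent is None:
            roots.append(span)
        else:
            children.setdefault(parent, []).append(span)
    return roots, children

roots, children = build_trace_tree(spans)
# roots holds "handle-request"; "retrieve-context" and "generate-answer"
# are its children, and "llm-call" nests one level deeper under "s-2".
```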
| Key (Node) | Key (Python) | Expected Value | Required? |
| --- | --- | --- | --- |
| `traceId` | `trace_id` | Unique string | YES |
| `spanId` | `span_id` | Unique string | NO |
| `spanName` | `span_name` | String | NO |
| `parentSpanId` | `parent_span_id` | Unique string | NO |
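As a concrete reading of the table, here is a sketch of the Python-style fields for a single request. All values are placeholders; per the table, only the trace ID is required.

```python
trace_fields = {
    "trace_id": "trace-abc-123",   # YES: unique string grouping related calls
    "span_id": "span-001",         # NO: unique string identifying this span
    "span_name": "summarize-doc",  # NO: human-readable label for the span
    "parent_span_id": None,        # NO: None/omitted makes this a root span
}

# Drop unset optional keys before sending:
payload_fields = {k: v for k, v in trace_fields.items() if v is not None}
```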
To build the trace tree, pass the relevant values while making your request (or while instantiating your client). Based on these values, Portkey will instrument your requests and show the exact trace with its spans in the "Traces" view on the Logs page.
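One common way to attach these values per request is via HTTP headers. Below is a minimal sketch assuming Portkey's `x-portkey-*` header convention; treat the exact span header names as assumptions and confirm them against the Portkey reference before relying on them.

```python
def portkey_trace_headers(trace_id, span_id=None, span_name=None,
                          parent_span_id=None):
    """Build trace headers for one request; only the trace ID is required.

    Header names are assumed from Portkey's "x-portkey-*" convention.
    """
    headers = {"x-portkey-trace-id": trace_id}
    if span_id is not None:
        headers["x-portkey-span-id"] = span_id
    if span_name is not None:
        headers["x-portkey-span-name"] = span_name
    if parent_span_id is not None:
        headers["x-portkey-parent-span-id"] = parent_span_id
    return headers

headers = portkey_trace_headers("trace-abc-123", span_id="span-001")
```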
If you are using LangChain, pass these values in the metadata of your `ChatOpenAI` instance or `OpenAI` llm class. `traceId`, `spanId`, etc. will become part of the metadata object in your log, and Portkey will instrument your requests to take those values into account.
The logger endpoint supports inserting a single log as well as a log array, and helps you build traces of any depth or complexity. For more, check here:
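As a sketch of the batched case, here is a hypothetical two-entry log array sharing one trace, serialized with the standard library. Field names beyond those in the table above (and the endpoint itself) are assumptions, so check the linked logger reference for the actual request schema.

```python
import json

# Two hypothetical log entries in one trace; the second is a child span.
log_batch = [
    {"trace_id": "trace-abc-123", "span_id": "span-001",
     "span_name": "parent-step"},
    {"trace_id": "trace-abc-123", "span_id": "span-002",
     "parent_span_id": "span-001", "span_name": "child-step"},
]

# The logger endpoint accepts a single log object or an array like this one.
body = json.dumps(log_batch)
```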