Portkey exposes OpenAI’s Batch API through one consistent endpoint, so you can run large, asynchronous jobs at 50% lower cost. Use batches for work that can run offline, such as nightly evals, A/B tests, or bulk embeddings.

Create Batch Job

Upload your .jsonl input file first (see the Files API for details), then create a batch job with the following code.
from portkey_ai import Portkey

# Initialize the Portkey client
portkey = Portkey(
    api_key="PORTKEY_API_KEY",  # Replace with your Portkey API key
    provider="@PROVIDER"   
)

start_batch_response = portkey.batches.create(
  input_file_id="file_id", # file id of the input file
  endpoint="/v1/chat/completions",
  completion_window="24h",
  metadata={} # metadata for the batch
)

print(start_batch_response)
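The file referenced by input_file_id is a .jsonl where each line is one request. A minimal sketch of producing such a file before uploading it via the Files API (the model name and prompts are illustrative; the line shape follows the OpenAI Batch input format):

```python
import json

# One request per line: a unique custom_id plus the request body
# for the endpoint named in batches.create (here /v1/chat/completions).
def batch_line(custom_id: str, prompt: str, model: str = "gpt-4o-mini") -> str:
    return json.dumps({
        "custom_id": custom_id,
        "method": "POST",
        "url": "/v1/chat/completions",
        "body": {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
    })

with open("batch_input.jsonl", "w") as f:
    for i, prompt in enumerate(["Summarize doc A", "Summarize doc B"]):
        f.write(batch_line(f"request-{i}", prompt) + "\n")
```

The custom_id lets you match each response in the output file back to its request, since results are not guaranteed to arrive in input order.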

List Batch Jobs

from portkey_ai import Portkey

# Initialize the Portkey client
portkey = Portkey(
    api_key="PORTKEY_API_KEY",  # Replace with your Portkey API key
    provider="@PROVIDER"   
)

batches = portkey.batches.list()

print(batches)

Get Batch Job Details

from portkey_ai import Portkey


# Initialize the Portkey client
portkey = Portkey(
    api_key="PORTKEY_API_KEY",  # Replace with your Portkey API key
    provider="@PROVIDER"   
)

batch = portkey.batches.retrieve(batch_id="batch_id")

print(batch)
The status of a given Batch object can be any of the following:
validating: the input file is being validated before the batch can begin
failed: the input file has failed the validation process
in_progress: the input file was successfully validated and the batch is currently being run
finalizing: the batch has completed and the results are being prepared
completed: the batch has been completed and the results are ready
expired: the batch was not able to be completed within the 24-hour time window
cancelling: the batch is being cancelled (may take up to 10 minutes)
cancelled: the batch was cancelled
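Because a batch runs asynchronously, a common pattern is to poll batches.retrieve until the status reaches one of the terminal values above. A minimal sketch (the polling interval is arbitrary):

```python
import time

# Statuses after which a batch will not progress further
TERMINAL_STATUSES = {"completed", "failed", "expired", "cancelled"}

def wait_for_batch(portkey, batch_id: str, interval_s: float = 30.0):
    """Poll batches.retrieve until the job reaches a terminal status,
    then return the final Batch object."""
    while True:
        batch = portkey.batches.retrieve(batch_id=batch_id)
        if batch.status in TERMINAL_STATUSES:
            return batch
        time.sleep(interval_s)
```

For long-running jobs (the completion window is 24 hours), a scheduled check is usually a better fit than a blocking loop.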

Get Batch Output

curl --location 'https://api.portkey.ai/v1/batches/<batch_id>/output' \
--header 'x-portkey-api-key: <portkey_api_key>' \
--header 'x-portkey-provider: @provider'
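The same endpoint can be called from Python with the standard library; a sketch assuming only the URL and headers shown in the curl command (the helper names are mine):

```python
from urllib.request import Request, urlopen

PORTKEY_BASE = "https://api.portkey.ai/v1"

def output_url(batch_id: str) -> str:
    # Endpoint from the curl example above
    return f"{PORTKEY_BASE}/batches/{batch_id}/output"

def download_batch_output(batch_id: str, api_key: str, provider: str) -> bytes:
    # Same headers as the curl command
    req = Request(
        output_url(batch_id),
        headers={
            "x-portkey-api-key": api_key,
            "x-portkey-provider": provider,
        },
    )
    with urlopen(req) as resp:
        return resp.read()  # JSONL: one response object per line
```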

Cancel Batch Job

from portkey_ai import Portkey

# Initialize the Portkey client
portkey = Portkey(
    api_key="PORTKEY_API_KEY",  # Replace with your Portkey API key
    provider="@PROVIDER"   
)

cancel_batch_response = portkey.batches.cancel(batch_id="batch_id")

print(cancel_batch_response)