
Batch Types

The Batch type represents a batch job returned by the Batch API functions.

Re-exported from openai.types.Batch.

Import: from any_llm.types.batch import Batch

| Field | Type | Description |
| --- | --- | --- |
| `id` | `str` | Unique batch identifier. |
| `object` | `str` | Always `"batch"`. |
| `endpoint` | `str` | The API endpoint used for all requests in the batch. |
| `input_file_id` | `str` | ID of the uploaded input file. |
| `completion_window` | `str` | Time frame for batch processing (e.g., `"24h"`). |
| `status` | `str` | Current status: `"validating"`, `"in_progress"`, `"finalizing"`, `"completed"`, `"failed"`, `"expired"`, `"cancelling"`, or `"cancelled"`. |
| `output_file_id` | `str \| None` | ID of the output file (available when status is `"completed"`). |
| `error_file_id` | `str \| None` | ID of the error file (if any requests failed). |
| `created_at` | `int` | Unix timestamp of batch creation. |
| `in_progress_at` | `int \| None` | Unix timestamp of when processing started. |
| `expires_at` | `int \| None` | Unix timestamp of when the batch expires. |
| `finalizing_at` | `int \| None` | Unix timestamp of when finalization started. |
| `completed_at` | `int \| None` | Unix timestamp of completion. |
| `failed_at` | `int \| None` | Unix timestamp of failure. |
| `expired_at` | `int \| None` | Unix timestamp of expiration. |
| `cancelling_at` | `int \| None` | Unix timestamp of cancellation request. |
| `cancelled_at` | `int \| None` | Unix timestamp of cancellation completion. |
| `request_counts` | `BatchRequestCounts \| None` | Counts of total, completed, and failed requests. |
| `metadata` | `dict[str, str] \| None` | Custom metadata attached to the batch. |

BatchRequestCounts

The BatchRequestCounts type reports how many requests in a batch have finished.

Re-exported from openai.types.batch_request_counts.BatchRequestCounts.

Import: from any_llm.types.batch import BatchRequestCounts

| Field | Type | Description |
| --- | --- | --- |
| `total` | `int` | Total number of requests in the batch. |
| `completed` | `int` | Number of completed requests. |
| `failed` | `int` | Number of failed requests. |
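These three counters are enough to build a progress readout while polling. A small sketch (the counts here are illustrative, and `progress_line` is a hypothetical helper):

```python
def progress_line(total: int, completed: int, failed: int) -> str:
    """Render a one-line progress summary from BatchRequestCounts fields."""
    if total == 0:
        return "no requests"
    pct = 100 * completed // total
    return f"{completed}/{total} completed ({pct}%), {failed} failed"

print(progress_line(100, 75, 3))  # -> 75/100 completed (75%), 3 failed
```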
Example: create a batch, then poll until it reaches a terminal status.

```python
import time

from any_llm import create_batch, retrieve_batch

batch = create_batch(
    provider="openai",
    input_file_path="requests.jsonl",
    endpoint="/v1/chat/completions",
)
print(f"Batch ID: {batch.id}")
print(f"Status: {batch.status}")

# Poll for completion
while batch.status not in ("completed", "failed", "expired", "cancelled"):
    time.sleep(30)
    batch = retrieve_batch("openai", batch.id)
    print(f"Status: {batch.status}")
    if batch.request_counts:
        print(f"  Completed: {batch.request_counts.completed}/{batch.request_counts.total}")

if batch.status == "completed":
    print(f"Output file: {batch.output_file_id}")
```
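The `input_file_path` above points to a JSONL file with one request object per line, in the OpenAI Batch input format. A minimal sketch of writing one (the model name, prompts, and filename are illustrative):

```python
import json

# Each line pairs a caller-chosen custom_id with a request body
# for the batch endpoint named in create_batch().
requests = [
    {
        "custom_id": f"request-{i}",
        "method": "POST",
        "url": "/v1/chat/completions",
        "body": {
            "model": "gpt-4o-mini",
            "messages": [{"role": "user", "content": prompt}],
        },
    }
    for i, prompt in enumerate(["Hello!", "Summarize batching in one line."])
]

with open("requests.jsonl", "w") as f:
    for req in requests:
        f.write(json.dumps(req) + "\n")
```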