
Quickstart

Prepare the batch file

batch-file.jsonl
{"custom_id": "task-0", "method": "POST", "url": "/chat/completions", "body": {"model": "gpt-4o", "messages": [{"role": "system", "content": "You are an AI assistant that helps people find information."}, {"role": "user", "content": "When was Microsoft founded?"}]}}
{"custom_id": "task-1", "method": "POST", "url": "/chat/completions", "body": {"model": "gpt-4o", "messages": [{"role": "system", "content": "You are an AI assistant that helps people find information."}, {"role": "user", "content": "When was the first XBOX released?"}]}}

Create a batch object

from langbatch import chat_completion_batch

batch = chat_completion_batch("path/to/batch-file.jsonl", provider="openai")

Start the batch job

Creating a batch object does not start the batch job; you need to start it explicitly:

batch.start()

Check the status of the batch job

To check the status of the batch job, use the get_status method:

status = batch.get_status()
print(status)
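Because batch jobs run asynchronously, you will typically poll until the job reaches a final state. A sketch of a generic polling helper (of the terminal status names below, only "completed" appears in this guide; the others are assumptions, not confirmed LangBatch values):

```python
import time

def wait_for_batch(get_status, poll_interval=60,
                   terminal=("completed", "failed", "expired", "cancelled")):
    """Poll get_status() until it returns a terminal status, then return it.

    The set of terminal statuses is an assumption; adjust it to the
    statuses your provider actually reports.
    """
    while True:
        status = get_status()
        if status in terminal:
            return status
        time.sleep(poll_interval)

# Usage, assuming `batch` was created and started as shown above:
# final_status = wait_for_batch(batch.get_status)
```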

Get the results of the batch job

Once the batch job has completed successfully, you can get the results using the get_results method:

if batch.get_status() == "completed":
    successful_results, unsuccessful_results = batch.get_results()
    for result in successful_results:
        print(f"Custom ID: {result['custom_id']}")
        print(f"Content: {result['choices'][0]['message']['content']}")
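Rather than printing each result, you may want to collect them into a lookup by custom_id. A small helper, assuming each successful result has the shape used in the loop above (the helper name is illustrative):

```python
def results_to_dict(successful_results):
    """Map each result's custom_id to the first choice's message content.

    Assumes each result dict has the "custom_id" and "choices" keys
    shown in the quickstart loop above.
    """
    return {
        result["custom_id"]: result["choices"][0]["message"]["content"]
        for result in successful_results
    }
```

The unsuccessful results returned alongside can be inspected separately, for example to build a new batch file that retries only the failed tasks.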
Tip

Learn more about batch actions on the Batch page.

Data Path

By default, LangBatch saves batch-related files to the langbatch_data directory in the current working directory. You can change this by setting the LANGBATCH_DATA_PATH environment variable:

import os
os.environ["LANGBATCH_DATA_PATH"] = "/path/to/your/data"