# Running a Batch with LangBatch
LangBatch provides a simple interface to run batch jobs.
```python
from langbatch import OpenAIChatCompletionBatch

# Create a batch object from a JSONL file of requests
batch = OpenAIChatCompletionBatch("path/to/file.jsonl")

# Start the batch job
batch.start()
```
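The JSONL input file holds one request per line. A plausible request line, assuming LangBatch follows the OpenAI Batch API request format (the `custom_id`, model, and message values here are illustrative, not taken from LangBatch's docs):

```json
{"custom_id": "request-1", "method": "POST", "url": "/v1/chat/completions", "body": {"model": "gpt-4o-mini", "messages": [{"role": "user", "content": "Say hello."}]}}
```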
To check the status of the batch job, use the `get_status` method.
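Batch jobs can take a while to finish, so a small polling helper is a common pattern. A minimal sketch; `wait_until_complete` is a hypothetical helper, and the terminal status strings below are assumptions rather than confirmed LangBatch values:

```python
import time

def wait_until_complete(batch, poll_interval=30.0):
    """Poll batch.get_status() until the job reaches a terminal state."""
    # Terminal status names are assumptions; check your provider's docs.
    terminal = {"completed", "failed", "expired", "cancelled"}
    while True:
        status = batch.get_status()
        if status in terminal:
            return status
        time.sleep(poll_interval)
```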
To get the results of the batch job, use the `get_results` method:
```python
successful_results, unsuccessful_results = batch.get_results()

for result in successful_results:
    print(f"Custom ID: {result['custom_id']}")
    print(f"Content: {result['choices'][0]['message']['content']}")
```
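For downstream processing it is often handy to index successful results by their `custom_id`. A small helper using only the fields shown above; `results_by_id` is a hypothetical name, not part of LangBatch:

```python
def results_by_id(successful_results):
    # Map each custom_id to the first choice's message content.
    return {
        r["custom_id"]: r["choices"][0]["message"]["content"]
        for r in successful_results
    }
```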
> **Note:** You can perform the same actions with other providers and models. For example, use the `AnthropicChatCompletionBatch` class to run batches with Anthropic models. Check out the Providers section to learn more.