I built a unified Python library for AI batch requests (50% cost savings)

funerr · 7/22/2025, 6:17:20 PM · github.com ↗

Comments (4)

funerr · 22h ago
I needed a Python library to handle complex batch requests to LLMs (Anthropic & OpenAI) and couldn't find a good one, so I built one.

Batch requests take up to 24h to complete but cut costs by ~50%. Features include structured outputs, automatic cost tracking, resuming state after interruptions, and citation support (Anthropic only for now).

It's open-source, feedback/contributions welcome!

GitHub: https://github.com/agamm/batchata
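
For anyone who hasn't used the provider batch endpoints before, here's a rough sketch of the raw OpenAI Batch API flow that a library like this wraps (this is not batchata's own API; the file name, model, and prompts are just placeholders):

    # Sketch of the raw OpenAI Batch API flow (placeholders throughout).
    import json
    from openai import OpenAI

    client = OpenAI()

    # 1. Write one JSON line per request; custom_id lets you match results later.
    requests = [
        {
            "custom_id": f"doc-{i}",
            "method": "POST",
            "url": "/v1/chat/completions",
            "body": {
                "model": "gpt-4o-mini",
                "messages": [{"role": "user", "content": f"Summarize document {i}"}],
            },
        }
        for i in range(3)
    ]
    with open("batch_input.jsonl", "w") as f:
        f.write("\n".join(json.dumps(r) for r in requests))

    # 2. Upload the file and create the batch (completes within 24h, billed at ~50% off).
    input_file = client.files.create(file=open("batch_input.jsonl", "rb"), purpose="batch")
    batch = client.batches.create(
        input_file_id=input_file.id,
        endpoint="/v1/chat/completions",
        completion_window="24h",
    )
    print(batch.id, batch.status)

Anthropic's Message Batches API follows the same submit-then-wait shape, which is presumably what the library abstracts over.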

tomgs · 21h ago
Neat! What’s the use case exactly? Kinda hard to figure out from skimming
funerr · 21h ago
When you have LLM requests you don't mind waiting for (up to 24h), you can save ~50% on costs. Great for document processing, image classification at scale, anything where you don't need an immediate result from the LLM provider and cost matters. Rough sketch of the wait-and-collect flow below.
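
To picture the waiting part: on OpenAI's side the raw flow is just poll-until-done, then download the output file (again a rough sketch with a placeholder batch id, not batchata's API):

    # Poll a submitted batch and fetch results once it's done (rough sketch).
    import json
    import time
    from openai import OpenAI

    client = OpenAI()
    batch_id = "batch_..."  # placeholder: the id returned when the batch was created

    while True:
        batch = client.batches.retrieve(batch_id)
        if batch.status in ("completed", "failed", "expired", "cancelled"):
            break
        time.sleep(60)  # could be hours; nothing is lost if you stop and poll again later

    if batch.status == "completed":
        output = client.files.content(batch.output_file_id)
        for line in output.text.splitlines():
            result = json.loads(line)
            print(result["custom_id"], result["response"]["status_code"])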
tomgs · 21h ago
Concrete use cases where 50 percent is actually a thing?