Ask HN: Can LLMs do batch classification?

2 points by iknownthing | 2 comments | 6/26/2025, 1:16:29 AM
I wrote a prompt that did batch classification - the prompt contained instructions on how to classify text plus 10 input examples to classify, and it was to return a JSON string with the classifications. It kind of worked, but then I realized the classification of an individual input was significantly affected by which other 9 inputs it shared the prompt with. In other words, the classifications were not at all independent. With traditional ML you can do batch classification trivially, with each input in the batch predicted independently. So is this just a limitation of LLMs? Do you have to classify inputs one LLM call at a time?
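The one-call-per-input approach described above can be sketched as follows. This is a minimal illustration, not a real API client: `fake_llm` is a hypothetical stub standing in for an actual LLM call, and the prompt wording is made up. The point is that each input gets its own prompt, so no input can leak context into another's classification.

```python
import json

def build_prompt(text):
    # One input per prompt: the classification of `text` cannot be
    # influenced by the other 9 inputs, unlike a 10-at-a-time batch prompt.
    return (
        "Classify the following text as POSITIVE or NEGATIVE.\n"
        'Return JSON: {"label": "..."}\n\n'
        f"Text: {text}"
    )

def fake_llm(prompt):
    # Hypothetical stand-in for a real LLM API call; returns a JSON string
    # in the format the prompt asks for.
    label = "POSITIVE" if "good" in prompt else "NEGATIVE"
    return json.dumps({"label": label})

def classify_independently(texts, llm=fake_llm):
    # N inputs -> N separate calls, each fully independent.
    return [json.loads(llm(build_prompt(t)))["label"] for t in texts]
```

The trade-off is N calls instead of one, which is where batching at the serving layer (rather than in the prompt) comes in.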

Comments (2)

magicalhippo · 2h ago
> You have to classify inputs one LLM call at a time?

Yes, but it's possible to batch the calls when feeding the data through the neural network, so LLM libraries might support that.

See for example this[1] article which gives a brief overview of batching calls using vLLM.

[1]: https://medium.com/ubiops-tech/how-to-optimize-inference-spe...
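A rough sketch of that idea: each input keeps its own prompt (so classifications stay independent), and throughput comes from issuing the calls concurrently and letting the serving layer (e.g. vLLM's continuous batching) pack them through the model together. The `fake_llm_call` stub below is hypothetical; in practice it would be an HTTP request to an inference server.

```python
import json
from concurrent.futures import ThreadPoolExecutor

def fake_llm_call(prompt):
    # Hypothetical stand-in for one request to an LLM serving endpoint.
    label = "POSITIVE" if "good" in prompt else "NEGATIVE"
    return json.dumps({"label": label})

def classify_concurrently(texts, max_workers=8):
    # One prompt per input; the calls are merely issued in parallel,
    # so batching happens at the server, not inside a shared prompt.
    prompts = [
        f'Classify as POSITIVE or NEGATIVE. Return JSON: {{"label": "..."}}\n{t}'
        for t in texts
    ]
    with ThreadPoolExecutor(max_workers=max_workers) as ex:
        # ex.map preserves input order, so results line up with texts.
        return [json.loads(r)["label"] for r in ex.map(fake_llm_call, prompts)]
```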

al_borland · 10h ago
I tried using an LLM to give me various information for a given input. I gave it the parameters of what I was looking for, then told it what data I would provide in each subsequent message for it to process independently based on the initial directive. After 4 or 5 inputs I noticed an error, and things quickly went downhill: context from earlier entries was mixing in with the current one, and it got more and more confused as I tried to point it back in the right direction. I ended up giving up and doing it all manually.

It sounds like we may have hit similar limits, using slightly different means to get there.