Show HN: Selfhostllm.org – Plan GPU capacity for self-hosting LLMs
4 points by erans on 8/8/2025, 5:49:49 PM | selfhostllm.org | 2 comments
A simple calculator that estimates how many concurrent requests your GPU can handle for a given LLM, with shareable results.
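A back-of-envelope version of such a concurrency estimate can be sketched as follows. This is an assumption about the general approach (free VRAM after weights divided by per-request KV-cache size), not the site's actual formula; the function name and the example numbers (FP16 weights, a Llama-3-8B-class shape on a 24 GB GPU) are hypothetical, and the sketch ignores refinements like grouped-query attention and quantization:

```python
# Rough concurrency estimate: how many requests fit in leftover VRAM.
# All parameters and defaults here are illustrative assumptions.

def max_concurrent_requests(vram_gb, model_params_b, context_len,
                            n_layers, hidden_dim, bytes_per_param=2,
                            overhead_gb=2.0):
    # Model weights: parameter count (in billions) times bytes per param.
    weights_gb = model_params_b * bytes_per_param
    # KV cache per request: 2 tensors (K and V) per layer, one vector of
    # size hidden_dim per token of context, at bytes_per_param precision.
    kv_per_request_gb = (2 * n_layers * context_len * hidden_dim
                         * bytes_per_param) / 1e9
    free_gb = vram_gb - weights_gb - overhead_gb
    if free_gb <= 0:
        return 0
    return int(free_gb // kv_per_request_gb)

# Hypothetical example: 8B-param model, 4096 context, on a 24 GB GPU.
print(max_concurrent_requests(vram_gb=24, model_params_b=8,
                              context_len=4096, n_layers=32,
                              hidden_dim=4096))  # → 2
```

With 16 GB of weights and ~2 GB of overhead, each ~2.1 GB KV cache leaves room for only a couple of concurrent 4096-token requests on a 24 GB card, which is the kind of answer a capacity calculator like this surfaces.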
Comments (2)
atmanactive · 48m ago
Very useful, thanks. I'm missing a reset button though.
erans · 7h ago
I also added a Mac version: https://selfhostllm.org/mac/, so you can see which models your Mac can run and get an estimated tokens/sec.
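A tokens/sec estimate of this kind is often derived from the fact that single-stream decoding is memory-bandwidth-bound: each generated token reads every weight once, so throughput is roughly bandwidth divided by model size. This sketch is an assumption about that general method, not the site's implementation, and the bandwidth/model-size figures are illustrative:

```python
# Rough decode-speed estimate for unified-memory machines like Apple
# Silicon: tokens/sec ≈ memory bandwidth / bytes read per token.
# Function name and example figures are hypothetical assumptions.

def est_tokens_per_sec(bandwidth_gb_s, model_size_gb):
    # Memory-bound decode: one full pass over the weights per token.
    return bandwidth_gb_s / model_size_gb

# e.g. ~400 GB/s of memory bandwidth with a ~4 GB 4-bit-quantized model
print(round(est_tokens_per_sec(400, 4)))  # → 100
```

Real throughput lands below this ceiling (KV-cache reads, compute overhead), so such calculators typically treat the number as an upper bound.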