Arbitraging Down LLM Inference to the Cost of Electricity

6 points by ycombyourhair | 8/28/2025, 6:59:57 PM | inference.net ↗

Comments (4)

srbhr · 12h ago
> Someone with cheap electricity can serve inference profitably at prices that would bankrupt the current small set of centralized providers.

Strong claim. There's a similar article on HN about China eating the world. Maybe, then, they're the ones who can take over the inference cloud?

skeptrune · 13h ago
Why is there nowhere on the site where I can figure out how much I would get paid by adding my GPU to the network?
srbhr · 13h ago
That's exactly what I was thinking.
nick779 · 13h ago
Frontier labs are still making crazy margins. OSS models are impossible to host serverlessly at a profit, even if you're Together.