Bayes, Bits and Brains

96 points by cgadski | 4 comments | 8/28/2025, 1:51:05 PM | bayesbitsbrains.github.io

Comments (4)

AdieuToLogic · 2d ago
Sometimes it is best to remember that you are a person and are not in competition with algorithms.

The author labels the first riddle thusly:

  Intelligence test
and reinforces this with the first sentence:

  Test your intelligence with the following widget!
If trying to solve it is fun and the faux implication is immaterial, awesome. However, if the expressed characterization of:

  Don't feel bad if a machine beats you ...
is bothersome, remember that all of this is just one person's way of shaping your experience so that continued engagement is likely.
JoshCole · 2d ago
> We are now getting an equivalent definition of what neural nets are being trained for! LLMs are trained to compress the internet as much as possible!

Nice payoff. Others have also called out the relationship to compression (https://www.youtube.com/watch?v=AKMuA_TVz3A).
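
For intuition, here is a minimal sketch (my own, not from the article) of the identity behind that claim: with an ideal entropy coder, a symbol the model assigns probability p costs -log2(p) bits, so a model's cross-entropy in bits per character is exactly its compression rate on that text. The `bits_to_encode` helper, the `model(prefix)` interface, and the uniform toy model below are placeholders standing in for an LLM's next-token softmax.

  import math

  def bits_to_encode(text, model):
      # Total bits an ideal entropy coder needs to encode `text`, where
      # model(prefix) returns a dict of next-character probabilities
      # (a stand-in for an LLM's softmax over its vocabulary).
      total = 0.0
      for i, ch in enumerate(text):
          p = model(text[:i]).get(ch, 1e-12)  # prob. of the char that actually came next
          total += -math.log2(p)              # ideal code length for that char
      return total

  # Toy model: uniform over 27 symbols, i.e. it has learned nothing.
  uniform = lambda prefix: {c: 1 / 27 for c in "abcdefghijklmnopqrstuvwxyz "}

  msg = "hello world"
  print(bits_to_encode(msg, uniform))             # 11 * log2(27) ≈ 52.3 bits
  print(bits_to_encode(msg, uniform) / len(msg))  # bits per char = cross-entropy loss

A better predictive model assigns higher probability to what actually comes next, so the same sum shrinks; minimizing training loss and minimizing encoded size are the same objective.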

CuriouslyC · 2d ago
Framing it as compression is reductive (intentionally so). Yes, compression is a proxy measure of Kolmogorov complexity, but it's more accurate to say you're mapping the conditional probability distribution: the model is a stochastic machine that produces samples from a distribution, not a literal compressed representation of anything (you have to do work to extract the data, and it isn't recovered exactly in all cases).
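
To make "you have to do work to extract this stuff" concrete: the learned distribution only becomes a literal compressed representation when paired with an entropy coder and the resulting bitstream. Below is a minimal arithmetic-coding sketch over a toy three-symbol model (my own example, not from the comment or the article); the fixed `PROBS` table and the `$` end marker are assumptions, and a real setup would condition the probabilities on the prefix the way an LLM does.

  from fractions import Fraction

  # Toy model: fixed probabilities regardless of context; '$' ends the message.
  ALPHABET = "ab$"
  PROBS = {"a": Fraction(1, 2), "b": Fraction(1, 4), "$": Fraction(1, 4)}

  def cum(sym):
      # Cumulative interval [lo, hi) that `sym` occupies within [0, 1).
      lo = Fraction(0)
      for s in ALPHABET:
          if s == sym:
              return lo, lo + PROBS[s]
          lo += PROBS[s]

  def encode(msg):
      # Shrink [low, high) once per symbol; any number in the final
      # interval identifies the whole message.
      low, high = Fraction(0), Fraction(1)
      for sym in msg + "$":
          span = high - low
          lo, hi = cum(sym)
          low, high = low + span * lo, low + span * hi
      return (low + high) / 2  # one rational number = the compressed message

  def decode(code):
      # Replay the same interval arithmetic to recover the message exactly.
      low, high = Fraction(0), Fraction(1)
      out = []
      while True:
          span = high - low
          for sym in ALPHABET:
              lo, hi = cum(sym)
              if low + span * lo <= code < low + span * hi:
                  if sym == "$":
                      return "".join(out)
                  out.append(sym)
                  low, high = low + span * lo, low + span * hi
                  break

  msg = "abba"
  code = encode(msg)
  assert decode(code) == msg  # exact recovery needs the model *and* the code

The point is that the model alone only lets you sample; exact reconstruction requires keeping the coder's output alongside it.
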
_0ffh · 2d ago
The relationship has been thought about for a long time. In 2006 it even led to the creation of the Hutter Prize, with around €38k paid out so far.