Reasoning models don't always say what they think

Posted by Bluestein · 5/2/2025, 1:19:50 PM · anthropic.com

Comments (2)

duxup · 11h ago
I see phrases like "thinking" and "intelligence".

I'm not up on the latest in AI, but aren't LLMs still just doing a sort of predictive "word math" to come up with a string of words as an answer?

The phrases here imply (to me at least) that there is more going on than that... is there?
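[For readers unfamiliar with the "word math" the comment refers to: a minimal, hypothetical sketch of next-token prediction is below. The lookup table, function names, and toy vocabulary are invented for illustration; a real LLM replaces the table with a neural network that scores every token in a large vocabulary given the context, and "reasoning" models generate their chain-of-thought tokens the same way.]

```python
import random

# Toy "model": for each context word, a probability distribution over
# possible next words. (Purely illustrative; not from the article.)
NEXT_WORD_PROBS = {
    "<start>": {"the": 0.6, "a": 0.4},
    "the":     {"cat": 0.5, "dog": 0.5},
    "a":       {"cat": 0.5, "dog": 0.5},
    "cat":     {"sat": 0.7, "ran": 0.3},
    "dog":     {"sat": 0.3, "ran": 0.7},
    "sat":     {"<end>": 1.0},
    "ran":     {"<end>": 1.0},
}

def sample_next(word: str) -> str:
    """Sample the next word from the model's distribution for this context."""
    candidates = NEXT_WORD_PROBS[word]
    return random.choices(list(candidates), weights=list(candidates.values()))[0]

def generate() -> str:
    """Repeatedly predict the next word until the end token is produced."""
    word, output = "<start>", []
    while (word := sample_next(word)) != "<end>":
        output.append(word)
    return " ".join(output)

print(generate())  # e.g. "the cat sat"
```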

cratermoon · 11h ago
> is there?

Nope.