"If you covered a backhoe with skin, made its bucket look like a hand, painted eyes on its chassis, and made it play a sound like “hnngghhh!” whenever it lifted something heavy, then we’d start wondering whether there’s a ghost inside the machine."
Great point, and I think this is roughly what has happened with LLMs by squishing them into chat mode. Natively they just continue the preceding text, which is much less likely to be anthropomorphised. Image generators don't get anthropomorphised nearly as much because they just output an image. But since we're used to treating chat as if there's a person on the other end, and the LLM refers to itself in the first person and so on, people act as if it can think.