AI Hallucination Cases Database

14 points by Tomte · 4 comments · 5/25/2025, 4:05:10 PM · damiencharlotin.com

Comments (4)

irrational · 16m ago
I still think confabulation is a better term for what LLMs do than hallucination.

Hallucination - A hallucination is a false perception where a person senses something that isn't actually there, affecting any of the five senses: sight, sound, smell, touch, or taste. These experiences can seem very real to the person experiencing them, even though they are not based on external stimuli.

Confabulation - Confabulation is a memory error consisting of the production of fabricated, distorted, or misinterpreted memories about oneself or the world. It is generally associated with certain types of brain damage or a specific subset of dementias.

bluefirebrand · 5m ago
You're not wrong in a strict sense, but you have to remember that most people aren't that strict about language.

I would bet that most people define the words like this:

Hallucination - something that isn't real

Confabulation - a word that they have never heard of

Flemlo · 6m ago
So what's the number of cases where it was wrong but no one checked?
anshumankmr · 8m ago
Can we submit ChatGPT convo histories??