I've found Google's AI overview to be so bad it's probably spreading misinformation. It has no ability to assess the correctness of its results. I can only assume it's enabled so that Google can collect training data from user feedback on the feature.
For instance, I asked Google how many Apple shares one person owned. It plucked a number from an article, but that figure referred to a deal Apple made; the person just happened to be involved in the deal.
kristianp · 11h ago
The search used was "last airbus fatal crash".
bryant · 10h ago
The query was exactly as you described. Gemini returned the details of the Air India Dreamliner crash, including the number of people aboard, the date of the flight, and the location of the flight and crash - yet Gemini somehow managed to hallucinate the wrong plane, an Airbus A330-243.
Air India doesn't even have any A330s, so it's not even immediately obvious how the hallucination happened. It just straight up included the wrong plane.
This specific failure seems more egregious than most.
Nckpz · 10h ago
Just reproduced the issue with Bing's AI result. I find it kind of hilarious that in its sources, the first one listed is an article with the headline: "How Is Airbus Not Suing Google?"
clipsy · 10h ago
And the output was a Boeing crash.
orionblastar · 11h ago
This is concerning. AI has to be accurate for it to work well; it can't hallucinate. Until this problem is fixed, we still need people to do the work.