If You Could Fix One Thing About AI Search, What Would It Be?

1 point by zyruh · 8/13/2025, 8:56:21 PM · 7 comments
I'm building an AI search tool (Zyruh) and I’m curious about what would truly motivate someone to leave their current AI search solution—whether that’s Google, ChatGPT, Perplexity, Kagi, or something else.

Not small “nice-to-have” tweaks, but that one killer feature or improvement that would make you say, “Yep, I’m switching.”

Examples could be:

100% transparent, clickable sources for every claim

No bias—presenting multiple perspectives side-by-side

Persistent long-term memory that keeps context between sessions

Something else entirely?

If you could wave a magic wand and fix the thing that frustrates you most in AI search today… what would it be?

Comments (7)

anenefan · 50m ago
I see some of the results A.I. assist throws up - getting a no-brainer right should not be seen as a major win.

I wouldn't use any of the A.I. search tools because of their numerous shortcomings, but I'll list a couple of thoughts on where they could be better.

A.I. search needs to ignore advertising spiel - i.e. work out what it is and where it is (perhaps better put as becoming more resistant to poisoning).

A.I. search needs to identify and discard common BS that is trotted out by unwitting human parrots.

A.I. search should offer an option for the user to input a large number of sites, or sets of sites, they don't feel offer anything whatsoever - though not to the point that it only echoes what the user wants to hear. But it is easy to say most woo-woo kook areas are not greatly reputable as far as facts and real science in the context of the real world go. Personally (given what I'm generally searching for), I would like to avoid results being based on facts found in social media areas.
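
A minimal sketch of what that kind of user-defined exclusion list could look like, assuming a hypothetical post-retrieval filter step (all names, fields, and domains below are illustrative, not taken from any existing tool):

```python
from urllib.parse import urlparse

# Hypothetical user-configured exclusion sets; purely illustrative.
BLOCKED_DOMAINS = {"example-woo.com", "contentfarm.example"}
BLOCKED_CATEGORIES = {"social_media"}  # e.g. drop results sourced from social media

def is_allowed(result: dict) -> bool:
    """Keep a search result only if its domain and category pass the user's filters."""
    domain = urlparse(result["url"]).netloc.lower().removeprefix("www.")
    if domain in BLOCKED_DOMAINS:
        return False
    if result.get("category") in BLOCKED_CATEGORIES:
        return False
    return True

def filter_results(results: list[dict]) -> list[dict]:
    """Apply the user's exclusion lists after retrieval, before any answer is composed."""
    return [r for r in results if is_allowed(r)]

# Example usage with made-up results:
results = [
    {"url": "https://www.example-woo.com/crystals", "category": "blog"},
    {"url": "https://archive.org/details/some-paper", "category": "reference"},
]
print(filter_results(results))  # only the archive.org result survives
```

Filtering after retrieval but before the answer is composed would keep the exclusions under the user's control, without the tool simply echoing back what the user already believes.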

MongooseStudios · 1d ago
Replace it with search that actually worked again.
zyruh · 1d ago
Thank you for your comment.
WarOnPrivacy · 1d ago
I'd fix AI's inherent untrustworthiness.

I'd make AI unable to resist presenting the knowledge it has; I'd fix AI so it would fully answer the question I asked.

I'd make precursory fluff and post-content review and explanation opt-in only.

I'd fix AI's gaslighting-like inability to correctly parse an inquiry.

I'd fix AI's inability to learn from the mistake it just made.

I'd fix the near-ubiquitous bad choice to put unasked-for AI up front. If it's not requested, it's not wanted.

zyruh · 1d ago
This is great - thank you!
k310 · 1d ago
I don't use AI search because I most often want to visit sites where material originates, not some nonlinear combination of sources. Authenticity matters.

And LLMs are easily polluted with disinformation and web slop, not to mention algorithmic bias, intentional or unintentional.

zyruh · 1d ago
Yes, and this is exactly why I'm creating a tool to address the slop and inaccuracy issues.