Accounts peddling child abuse content flood X hashtags as Thorn cuts ties

6 points · riffraff · 6/20/2025, 8:50:15 PM · nbcnews.com

Comments (3)

833 · 7h ago
CSAM is an impossible problem to solve, especially when it's text-based solicitation.

But it's concerning that they seem not to have integrated proper hashing solutions until now:

> We are proud to provide an important update on our continuous work detecting Child Sexual Abuse Material (CSAM) content, announcing today that we have launched additional CSAM hash matching efforts.

> This system allows X to hash and match media content quickly and securely,

The existing hashing tools are perfectly fit for purpose, but if the CSAM isn't already known (and it isn't, because it's either new or AI-generated), then no amount of hashing will detect it.

Not sure why X developed something new instead of using PhotoDNA, if it all still uses the same hash databases!
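The mechanics are simple enough to sketch. Here's a minimal Python illustration of the point, where `known_hashes` is a hypothetical stand-in for an industry hash database and SHA-256 stands in for a perceptual hash (real systems like PhotoDNA use perceptual hashing precisely so that matches survive resizing and re-encoding, which a cryptographic hash would not):

```python
import hashlib

def media_hash(data: bytes) -> str:
    # Stand-in for a perceptual hash; a cryptographic hash only
    # matches byte-identical files, which is the weaker case.
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of hashes of previously identified material.
known_hashes = {media_hash(b"previously-identified image bytes")}

def is_known(media: bytes) -> bool:
    # Match uploaded media against the known-hash set: O(1) per lookup.
    return media_hash(media) in known_hashes

print(is_known(b"previously-identified image bytes"))  # True: known content
print(is_known(b"novel or AI-generated image bytes"))  # False: never seen
# before, so no hash database can flag it -- the limitation noted above.
```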

chiph2o · 7h ago
I was part of the Google+ moderation team in 2015.

I learned that this is one of the major legal issues in social media, and we could not find any reasonable technological path to address it.

I've always wondered how Facebook keeps this under control, and whether AI has made this problem approachable today.

833 · 7h ago
AI is still not useful for detection.

Facebook makes one report to NCMEC for every 134 users, each year: over 22 million reports. Each report can contain many pieces of CSAM.
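A quick sanity check on those figures (the report count and ratio are the ones above; the user base is implied, not reported):

```python
reports_per_year = 22_000_000   # NCMEC reports attributed to Facebook
users_per_report = 134          # one report per 134 users

implied_users = reports_per_year * users_per_report
print(f"{implied_users:,}")     # 2,948,000,000 -- roughly in line with
                                # Facebook's reported monthly active users
```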

The situation is dire.