The Future of Forums Is Lies, I Guess

35 points · zdw · 11 comments · 7/7/2025, 5:46:24 PM · aphyr.com

Comments (11)

crabmusket · 5h ago
Why should "we" not legislate that any AI systems must identify themselves as such when asked? There could even be a specified way to ask this question so it can be recognised by simple NLP techniques and avoid the black box processing of the model itself. This could carry legal weight.

That way, humans could impersonate AIs, but AIs would be legally encouraged, shall we say, not to impersonate humans.

"It could never be enforced" or "there will be bad actors who don't comply" are useful and valid discussions to have, but I think they're separate from the question of whether this would be a worthwhile regulatory concept to explore.
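A "specified way to ask" could be as simple as a fixed phrasing matched by a plain pattern, so no model black box is involved in recognizing it. A minimal sketch, assuming a hypothetical `DISCLOSE:` convention (the token and wording here are invented for illustration, not anything standardized):

```python
import re

# Hypothetical standardized disclosure query, e.g. "DISCLOSE: are you an AI?"
# Matched with an ordinary regex, so recognition doesn't depend on the
# model interpreting the question. The exact phrasing is an assumption.
DISCLOSURE_QUERY = re.compile(
    r"^\s*DISCLOSE:\s*are you an ai\??\s*$",
    re.IGNORECASE,
)

def is_disclosure_query(message: str) -> bool:
    """Return True if the message is the standardized disclosure question."""
    return DISCLOSURE_QUERY.match(message) is not None

print(is_disclosure_query("DISCLOSE: are you an AI?"))  # True
print(is_disclosure_query("hey, nice weather today"))   # False
```

The point of fixing the phrasing is exactly this: a forum, client, or auditor can detect the question deterministically, and the legal weight attaches to how the system responds to it.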

alganet · 12h ago
We need to normalize behaviors that are commonly attributed to paranoia.

It is ok to ask a lot of questions, it is ok to be skeptical of friendly interactions, it is ok to be suspicious. These behaviors are not social anxiety, not psychosis, not anti-social. They are, in fact, desirable human traits that contribute to the larger group.

There is no automated detection, no magic way of keeping these new threats away. They work by exploiting humans in vulnerable states. We need kind humans who are less vulnerable to those things.

jaredcwhite · 8h ago
Are you real?

Are you a human?

Is that real text you typed out?

Does anything you're saying have any meaning?

----

That is essentially what you are asking for. Every single online interaction immediately viewed as entirely suspect, with people having to go out of their way to prove they are…people.

Well perhaps you're right that this is where online culture is headed, but we don't have to like it. I hate it. I hate it so bad.

alganet · 6h ago
You don't need to be the paranoid one; you just need to accept that some people will be paranoid, that this is a good thing, and that you should listen to them. You don't have to like them or obey them.

The other option is trying to make your bubble of protection and trust, where everyone is happy and friendly. Good luck with that.

anitil · 5h ago
I'm not sure what the solution is here - some forums put people in a 'probationary' state for a while where they either can't post or have extra scrutiny. There's some spoiling of the commons going on here that I can't quite put my finger on.

Separately, why are companies using this? Surely it's counterproductive to their marketing efforts? Or am I wrong, and any attention is good attention?

lavelganzu · 7h ago
Money is an imperfect but real solution. The simple thing is to charge a small sign-up fee. Obviously this dramatically increases the barrier to entry for real humans. But it should cut the spam even more sharply.

alganet · 6h ago
It's worse: it creates a false sense of security while allowing people with vast resources to spam and scam freely.

We need smarter humans, it's the only way.

praptak · 12h ago
I don't believe a purely technical solution exists. This needs to get political, ideally making it a crime to use technology in this way. The scope is much broader and more dangerous than niche forums. This shit has the potential to kill the ability of societies to discuss policy in a meaningful way.

burnt-resistor · 9h ago
This will likely lead to requirements for identity verification and a small bond as collateral for the privilege of online participation in a particular forum. Idealistic, unenforceable laws won't help.

chatmasta · 10h ago
> Unavailable Due to the UK Online Safety Act

https://archive.is/y9JyC