Amazed that social media's engineering of society isn't getting more press. They are all doing it.
I noticed this first on X, during the FarageRiots, when an Asian woman asked how many people felt safe. The volume of violently racist replies was insane. As an Asian man it made me feel very scared about society. I felt outnumbered. It wasn't reflective of society - as it turned out, there was a mass demonstration of racial unity by the vast majority of Britain. But not on X.
On YouTube I noticed it silently deleting my comments. Nothing violent - literally a comment saying I was concerned about NHS privatisation and a takeover by US finance. I watched the same comment get removed again and again. No notice. No reason. No appeal. Taken down. Invisible. A quick Google shows lots of people experiencing the same thing.
And it got me thinking - wow - imagine shaping public sentiment at scale. Making opinions that aren't convenient to the people who own social media disappear. That creates helplessness. It shapes elections.
The rage bait we see now pulls in attention, shapes conversations and defines the Overton window.
I noticed a post from Theo (T3) on X an hour or so ago that was critical of OpenAI, and the first comment was calling him an OpenAI shill. There certainly seem to be plausible incentives on X to fuel anti-competitor sentiment and amplify useful sentiment.
This article on Meta validates the patterns I've seen. It's deeply concerning. We are in an era where society is micro-shaped by social media owners and their agendas.
This issue needs to be addressed. We need regulation, transparent recommendation algorithms and clear limits on targeting users.
And then there is the toxic nature of social media's engineered addiction. A sidebar, I know, but it has to be said.
We need much more regulation, and more decentralised ownership of social media companies, to protect democracy.
tojumpship · 1h ago
I wholly agree with your point on X. The comeback of racism is one of the most dangerous social phenomena in today's world. Besides sowing insurmountable amounts of hatred, it also brings xenophobia, misogyny/misandry and the like along with it as the forerunning discriminatory practice in our world.
zingababba · 1h ago
It's pretty bad. I've also been very interested in the non-organic way certain topics get introduced. It's often chains of non-organic posts/replies that seed topics: opinion 1 gets proposed, then someone else comes in and makes some obvious fallacy in a counter-argument, then another post responds calling them out in some inflammatory way. This kicks off a cycle of user engagement either defending or attacking one of the participants. However, the entire initial chain of 3-4 back-and-forths is all bots, subtly guiding topics - roughly the pattern sketched below.
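A purely illustrative sketch of that seeding chain (the accounts and wording are invented, not captured bot traffic):

```python
# Hypothetical example of a 3-4 step non-organic seeding chain; the
# personas and messages are made up purely to show the pattern.
from dataclasses import dataclass

@dataclass
class SeedPost:
    account: str  # bot persona posting this step
    purpose: str  # what the post is designed to trigger

seeding_chain = [
    SeedPost("bot_a", "introduce the target topic framed as a reasonable opinion"),
    SeedPost("bot_b", "reply with a deliberately weak, fallacious counter-argument"),
    SeedPost("bot_c", "call out the fallacy in an inflammatory, personal way"),
    SeedPost("bot_a", "escalate and invite onlookers to pick a side"),
]

# By the time real users arrive, the framing, the strawman and the outrage
# are already in place; genuine engagement only amplifies the seeded topic.
for i, post in enumerate(seeding_chain, start=1):
    print(f"{i}. {post.account}: {post.purpose}")
```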
dileeparanawake · 28m ago
Wow, that's insane - I didn't realise that was happening with non-organic posts.
From a behavioural POV it seems like an obvious play. These companies and their owners have huge gains to make via social engineering.
There is very little transparency, accountability or regulation.
The thing that worries me is the unobvious… most people know about Instagram and increased suicide rates. What shocked me was finding out that Instagram did things like watch for people removing photos of themselves, identify that insecurity behaviour, and use it to position beauty products to young girls. It seems so unethical and predatory. Not to mention the impact on public mental health when applied at scale.
Another crazy stat was something like average screen time of 4 hrs/day, and average attention spans dropping from ~180s to something like ~90s.
The impact in so many areas is so bad. Blows my mind there is such a lack of regulation.
I'm thinking AI has the potential, at scale, to social-engineer without the need to bother creating content / making bots.
morkalork · 1h ago
So the premise is that bad actors co-opt the target's hashtags, mass upload ban-able content with those tags, then mass-report that same content and as a casualty the tag gets caught up the next time content moderation models are trained? If the sources aren't just making this up, then it's pretty damning on Facebook's part. Any intern-level data scientist could inspect the training results and see the innocent tags being negatively weighted and flagged.
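Roughly the kind of check that would surface it - a toy sketch, with made-up tags and a stand-in model, not Facebook's actual pipeline:

```python
# Toy illustration of the poisoning described above: if moderation models are
# retrained on mass-reported content, an innocent hashtag that attackers
# flooded with ban-able posts can end up weighted toward removal.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Posts represented only by their hashtags; "#communitytag" is the innocent
# co-opted tag, "#scamspam" marks the attackers' ban-able uploads.
posts = [
    "#cats #communitytag",      # genuine, kept
    "#cats",                    # genuine, kept
    "#communitytag #scamspam",  # attacker upload, mass-reported, removed
    "#communitytag #scamspam",  # attacker upload, mass-reported, removed
    "#communitytag",            # attacker upload, mass-reported, removed
]
labels = [0, 0, 1, 1, 1]        # 1 = removed after reports

vec = CountVectorizer(token_pattern=r"#\w+")
X = vec.fit_transform(posts)
clf = LogisticRegression().fit(X, labels)

# Inspecting per-tag weights shows the innocent tag now pushing posts toward
# the "remove" class - exactly what a quick training audit would catch.
for tag, weight in zip(vec.get_feature_names_out(), clf.coef_[0]):
    print(f"{tag:>15}  weight toward removal: {weight:+.2f}")
```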
codyb · 2h ago
Social media's poison. Delete your accounts. Life is better without it.
sunrunner · 2h ago
Does Hacker News count? ;)
tojumpship · 1h ago
You will find many arguing they have abandoned all online presence, urging people to leave social media for a better life like they have. Yet they have accounts on these pseudo-social sites - Reddit, HN... - and actively participate. I don't think restricting your own access to a swath of information is a good idea, no matter what that info is. Moving your monthly online discussion habit to Reddit of all places... Humans are social, and with the internet's presence in every facet of life, it will be harder to run from it.
I don't think simply being absent is logical. Sensing ideological currents and bias, and being primarily a lurker rather than a participant, will have the same mental benefits without limiting your knowledge.
christianqchung · 2h ago
Not in the same way doomscrolling Instagram Reels, Tiktok, and Youtube shorts, no.