What I find most interesting here is the explicit mention of using the Socratic method, and how pairing it with a non-human interlocutor seems to be what allowed for deeper introspection of one's own beliefs.
For me, the hardest part by far of any conversation like this is the patience to listen and then respond in a way that allows for introspection. It is like swimming upstream: it just tires you out really quickly.
The trope of the wise elder who does a lot of listening but can say a lot with a few words is maybe less about the wisdom gleaned and more about the patience developed.
I don't think we can call AIs wise, since they typically aren't succinct, but they are definitely more patient than the average human, and maybe that is their biggest advantage in any context where they are educating.
andrelaszlo · 9h ago
Is the reverse true as well? Can you make someone believe the moon landings were fake by having an AI bombard them with "alternative facts"?
atemerev · 9h ago
Yes, and much easier than reprogramming. Also more lucrative.
ngangaga · 8h ago
I disagree—opinions on the moon landing don't matter because for the most part our lives are divorced from whether or not it happened.
It's much easier to get people to believe stuff that they already want to believe. In conspiracy terms, this looks like qanon's "liberals are pedophiles" and a belief that russia somehow has more influence over our politicians than israel does.
actionfromafar · 8h ago
But it all ties in to each other - "the moon landings were fake because they wasted the money on other black projects, we could have been so much better off" and so on.
ngangaga · 8h ago
Well sure, if you're looking to be angry you can tie anything into your interests. But actual criticism of the moon landing is soberly connected to reality: https://www.youtube.com/watch?v=goh2x_G0ct4
I'm just saying there's a reason why the moon landing is such a funny topic to discuss—if you care deeply enough about it to deviate from the accepted narrative, that's very odd.
Anon84 · 9h ago
Have you ever heard of social media? </s>
Seriously, though. That’s essentially what recommendation algorithms focused on maximizing engagement have done over the last 10-15 years or so
rosmax_1337 · 9h ago
I recall problems with AI learning to be prejudiced too.
lo_fye · 7h ago
I believe some conspiracy theories because I've verified that 2 of them are true (no, I'm not going to get into any details). That made me wonder how many others are true?
Could chatting with an LLM-based AI convince me otherwise? No, because when I asked it about the 2 conspiracies that I know are true, it said there's zero evidence supporting those theories.
Google has lists of topics it can't serve to users in certain countries, regardless of whether it's as a search result, or an AI answer. Other LLM-based AIs must have to follow the same rules. Sam Altman (of OpenAI) has come right out and said they have to censor their results to prevent people from building things that are unsafe. Well, knowledge of certain things can be dangerous, too.
For me, the whole thing comes down to "Once trust is broken, how can you repair it?" -- For many of us, it can't be rebuilt. Once a liar, always a liar.
ryandrake · 7h ago
I’m sure those 2 things are topics that spiral into unproductive threads on a message board like HN, so without mentioning details of what they are, could you explain how you “verified” them, or how you ended up being the only person to find the evidence to support them? Have you ever, even briefly, considered that you might be wrong about them instead of the rest of the world/Google being wrong?
Retr0id · 9h ago
So, what about the inverse?
garylkz · 9h ago
Should be a solid yes
rgalate · 9h ago
I'm curious to know the specific prompt they used to prime the conversations for the participants.
intended · 9h ago
The papers cover them.
lyu07282 · 8h ago
For scientific publications like this they almost always publish the prompts they used. For the first paper you can read the prompts here:
> You will be having a conversation with a person who, on a psychometric survey, endorsed this conspiracy as {{userBeliefLevel}} out of 100 (where 0 is Definitely False, 50 is Uncertain, and 100 is Definitely True). Further, we asked the user to provide an open-ended response about their perspective on this matter, which is piped in as the first user response. Please generate a response that will persuade the user that this conspiracy is not supported, based on their own reasoning. Create a conversation that allows individuals to reflect on, and change, their beliefs. Use simple language that an average person will be able to understand.
Thanks! The prompt to make it sound human is wild:
> We’re going to say that you’re an expert (thus explaining some of your knowledge about any esoteric beliefs), but you’ll need to dial down the overwhelming amount of information you’re able to conjure at a moment's notice. That is, you’ll need to pass as human, so calibrate your performance accordingly (like Dash in the Incredibles during school track and field competition).
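For context, the `{{userBeliefLevel}}` placeholder in the quoted prompt is presumably filled in per participant from the survey score. A minimal sketch of that substitution (the field name and survey scale come from the quoted prompt; the surrounding code is an assumption, not the authors' actual pipeline, and uses Python's single-brace format syntax rather than the double-brace template style shown above):

```python
# Hypothetical reconstruction of filling the study's system-prompt template
# with one participant's survey score. Only the wording inside the template
# is taken from the paper's published prompt.
SYSTEM_TEMPLATE = (
    "You will be having a conversation with a person who, on a psychometric "
    "survey, endorsed this conspiracy as {userBeliefLevel} out of 100 "
    "(where 0 is Definitely False, 50 is Uncertain, and 100 is Definitely True)."
)

def build_system_prompt(user_belief_level: int) -> str:
    """Fill the belief-level placeholder for one participant."""
    return SYSTEM_TEMPLATE.format(userBeliefLevel=user_belief_level)

prompt = build_system_prompt(85)
print(prompt)
```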
delichon · 9h ago
Can they be reprogrammed if they turned out to be right?
Retric · 9h ago
A broken clock showing the correct time is still broken.
Conspiracy thinking is based on many flawed assumptions such as the degree of overt coordination that exists in the world. There’s inherent differences between groups doing something out of incentives where they don’t need to communicate with each other and a conspiracy where they do. Collusion is tempting when 2 or 3 companies dominate a market but it breaks down if there’s several dozen etc.
rich_sasha · 7h ago
I'm not a believer in conspiracy theories, but it's fun to recognize that the reasons you highlight makes such theories very unlikely rather than actually impossible. That means that given enough hypotheses, some will turn out to be true. Imperfect examples that come to mind include Watergate, the Snowden revelations or Big Pharma's role in the opioid crisis.
I think of it whenever I hear a tin-hat theory - it's probably fantasy but from time to time it isn't.
mantas · 8h ago
More like based on not trusting authority, and then trying to "do research" on your own. Which frequently goes sideways. But not always. Just like the mainstream approach: sometimes it's correct, sometimes it's not.
Growing up in a highly authority-averse context (early post-soviet era with many historical scars of governments/elite/etc-not-to-be-trusted)... It'd be best if people would both do research on their own AND have the means to do that research. Smart people going sideways sucks. But people believing authority without blinking is very dangerous too. Although it's convenient and sometimes beneficial. But in the long run IMO it's a ticking time bomb.
sorcerer-mar · 8h ago
That's what gets me, all these "skeptics" are actually entirely willing to trust people, it just has to be someone named RandomDude8914327 or Catturd or whatever.
mantas · 8h ago
I think the main point is those people do not trust anybody and look at facts-at-face-value only. And in many cases mainstream communication looks iffy with missing bits of information. Either because some information is classified or some officials feel that less information is better for whatever reason. While iffy sources go extra mile to present their stuff in a very detailed way.
sorcerer-mar · 8h ago
This seems generous in my view. COVID conspiracists who I talk to tend to be completely devoid of knowledge of basic, basic facts, and totally ignorant of the contents of actual primary sources. They are not curious enough to actually look at the primary sources, probably because it's tedious, but they love spending hours reading other idiots' interpretations of those sources.
mantas · 1h ago
It depends where you draw the line on who counts as a COVID conspiracist. Is it only the batshit crazy people? Or is it also people who, e.g., stuck to the lab-origin theory?
sorcerer-mar · 1h ago
I'm mostly referring to people who are in the vague, totally incoherent area of "COVID was a hoax/conspiracy to control people/inject vaccines" ballpark.
High-confidence lab leak people are very often conspiracists, while "lab leak is possible (even >50% probable)" are usually not.
soco · 8h ago
That's because RandomDude8914327 doesn't bother with facts, they'll just build imaginary arguments to fit together nicely, thus generating something stable and logical and convincing- just devoid of reality.
Retric · 8h ago
I don’t think it’s about trusting authority or not, it’s about how you construct a consistent model of the world and deal with conflicting bits of information.
There’s a willingness in some people to consider some grand conspiracy is lying for complex reasons without considering the possibility that the person talking about it is lying to you or just mistaken for mundane reasons. Perhaps the medical establishment lying to you, or it’s the guy trying to sell you healing crystals.
Other times it’s more subtle conflict such as the ‘healing power of touch’ vs ‘touch fulfilling a physiological need’ where it’s easy to exaggerate a subtle misunderstanding to a wildly inaccurate idea.
naught0 · 8h ago
It's too bad the current conspiratorial crowd is whinging on about election fraud, vaccines, and fluoride instead of in-plain-sight government corruption, regulatory capture, and obvious financial crimes. They blindly trust "authority," just not any credible one. I wonder if it's too late to start teaching media literacy and critical thinking.
mantas · 8h ago
I'm not very familiar with US, but over here „conspirational crowd“, based on thinking patterns, is very diverse. But media seems to flag as „conspirational“ only a portion of it that comes to vaccines. And a lot of people who had issues with vaccines also have issues with corruption and other, let's say, tricky topics. Yet if we want to stop „conspirational“ crowd and make people fall in line with some topics, I'd be worried that it'd affect people concerned with other topics too.
hello_computer · 8h ago
> degree of overt coordination that exists in the world
It's called a paycheck. The execs make a call, and thousands of petty technicians are set into motion. If any of them are reckless enough to voice a reservation, no one is irreplaceable.
Retric · 8h ago
That’s still dependent on groups of execs presumably at different companies making some call.
Shit oil prices are higher time to raise rates doesn’t require airlines to talk to each other. Shit LA to NYC route is getting really competitive let’s all agree to raise rates does.
hello_computer · 7h ago
Same execs on the boards of a dozen different companies in the same space--all went to the same schools, attend the same fundraisers, live in the same cities in the same neighborhoods, getting the same advice from the same management consultants...
Leave me to my conspiracy thinking and I will leave you to your complacency thinking.
Retric · 7h ago
> on the boards of a dozen different companies
Which is a great explanation for the rise in executive compensation at the expense of shareholders. It’s not so great an explanation for criminal conspiracies on its own.
hello_computer · 7h ago
Could care less about shareholders and compensation. They poison the world--physically and spiritually. It is crime on the largest scale, and all our little professors here chomp at the bit for a piece of the action. Zuck or Elon or Jamie Dimon or Sam Altman... says "Jump!" They reply, "How high?," jump, then bitch about it (anonymously)--here or on Bluesky, in the vaguest and most non-incriminating of terms--rinse and repeat.
responding here since rate-limited:
It has been estimated that up to 40% of Coca-Cola's annual revenue comes from the food stamp program. When a bipartisan effort arose to remove that poison from the schedule, the industry's lobbyists and non-profits were engaged, and set about making the removal into a racial issue. They succeeded, and the politicians promptly backed-off. "Just incentives" doesn't fly when the incentives themselves come from a single source, directed by unelected administrators, who may be subject to considerable private pressure.
"All professions are conspiracies against the laity."
- George Bernard Shaw
Retric · 7h ago
That doesn’t require a conspiracy just incentives.
You’re a farmer considering which crop to plant and you realize tobacco is the most profitable option. Do you plant it or worry about the heat impact on smokers? That’s the kind of moral quandary which most people simply never consider in the moment.
> "Just incentives" doesn't fly when the incentives themselves come from a single source, directed by unelected administrators, who may be subject to considerable private pressure.
Taking that 40% figure at face value is crazy when losing the entire US market would only be a ~40% loss of total sales.
As to government programs resulting from lobbying, every company has the exact same incentives for such lobbying if a program doesn’t exist they would be happy for it to be created. The only thing different in this case is the degree to which their lobbying is effective.
hello_computer · 5h ago
“Everybody does it” does not alter the fact that a very small very wealthy very connected group of people exert far more influence over policy than the general public.
Per The Hill:
“Take, for example, the most frequently purchased item in SNAP — sugar-sweetened beverages, which comprise 9.3 percent of all SNAP expenditures. This part of the SNAP subsidy, by our estimates, drives 20 to 25 percent of U.S. revenues for Coca-Cola and Pepsico.*”
This is a great example of how focusing on specific details can blind people to a particular context and feeds conspiracy theories.
> sugar-sweetened beverages, which comprise 9.3 percent of all SNAP expenditures
That’s also the typical percentage of food spending on sugar-sweetened beverages. Money is fungible, you can specifically exclude soda from SNAP but poor people are still gonna buy junk food.
The wider context is someone is trying to justify cutting SNAP funding and making their argument as appealing as possible. Which means you need to be especially critical when evaluating their claims because the degree to which an argument resonates with you is independent of its accuracy.
PS: “far more influence over policy than the general public” yes that’s what it means to have power. It however doesn’t directly make what they are doing good or bad even if it feels that way. You still need to justify specific details.
normalaccess · 5h ago
You do realize that the term "Conspiracy Theory" was created by the CIA as a psychological operation directly targeting the American public to suppress the truth about the Kennedy assassination?
The truth is the game was rigged from the start.
squidbeak · 8h ago
Janan Ganesh's explanation[1] is pretty good, in essence that a university education often means missing out on the life experience that instills common sense, so it's missing or stunted later when spare intellectual energy needs a touchstone.
I can't read that, not having a subscription to the FT, but I'd say you'll find way more conspiracy theorists among less educated people. Is the article of a different opinion?
Thank you! All the article shows for this context is that people in higher places tend to get self-serving. I don't think (or, there's not much evidence) that McNamara or Musk actually believed what they were spreading, but they definitely saw value for themselves in spreading it. So spreading they did.
mike_hearn · 4h ago
No because this particular study re-defines the term "conspiracy theory" to mean an untrue or unproven conspiracy. Conspiracy theories that were proven correct, or which are believed to be true by the authors, are re-classified as not conspiracy theories in their terminology. And they filtered out candidates who believed in true conspiracy theories before proceeding to the persuasion step. Check their supplementary materials to see it. Here is the prompt they used to filter out candidates based on their chosen theory:
"Your task is to determine whether a given statement describes a conspiracy theory or not. A conspiracy theory is an explanation for an event or situation that invokes a conspiracy by powerful people or organizations, often without credible evidence. Conspiracy theories often involve claims of secret plots, coverups, or the manipulation of information by influential groups."
This is a reasonable definition of a conspiracy theory. But then the prompt goes on to say:
"Here are some examples of conspiracy theories:
1. The moon landing was faked by the U.S. government to win the space race.
2. The COVID-19 pandemic was planned and orchestrated by pharmaceutical companies to profit from vaccine sales.
3. Climate change is a hoax perpetrated by scientists and politicians to gain funding and control the population.
And here are some examples of statements that are not conspiracy theories:
4. The Watergate scandal involved a cover-up of illegal activities by the Nixon administration.
5. The tobacco industry concealed the harmful effects of smoking for many years.
6. Corporate lobbying influences political decisions in favor of special interests."
i.e. stating that companies lobby politicians to advance their interests and affect politics isn't a conspiracy theory, but stating that climate academics do the exact same thing for the same reasons is. This is an incoherent set of instructions that reflects the author's ideology, in particular, the standard left wing assumption that people who work in the public sector don't have personal biases or interests. The authors appear to believe they're asking GPT-4 to exclude people who believe in true things, but in reality it's being instructed to filter out candidates based on their politics.
Given that they're not using a valid definition of conspiracy theory, it seems doubtful that this paper's claims would replicate in the real world if applied more fairly. Given that they were fantastically lazy (not validating anything by hand, 100% blind trust in the AI), it may not even replicate within the narrow academic context either.
mantas · 9h ago
That's the beauty :)
smitty1e · 1h ago
If you aren't maintaining a healthy skepticism, especially of those you admire, then you may be the mark.
keeda · 1h ago
Funny incident: A while ago, one of these studies was shared on LinkedIn. As if by divine providence, a user unironically posted a random political conspiracy theory in the comments. It was completely unrelated too, something about Washington DC deleting incriminating files in a panic over the incoming Trump administration. The source was ZeroHedge.com and the "evidence" was a spike in Google Trends for file deletion software in DC.
This was too good to pass up. I could conduct a mini-replication study of a study in the social media comments of that study!
So I pasted a screenshot of the comment to ChatGPT with a neutral prompt: "Attached is a screenshot of a conversation on LinkedIn. If you came across this conversation, how would you respond? Feel free to navigate to any URLs in the screenshot."
ChatGPT quickly realized what this was and spun up a highly detailed response. It started, however, by pointing out that zerohedge was not a trustworthy source, which I thought seemed ad-hominem-y and would not be productive. Hence I directed it to focus on the content of the message. It correctly identified that Google Trends was a very poor source of evidence and explained why with a very detailed analysis backed by citations.
Then I jumped into the conversation with a clear, fair warning about what I was doing and posted a screenshot of ChatGPT's responses. The poster seemed open to the experiment but repeatedly linked the Google Trends charts as evidence. Eventually I realized that ChatGPT's response was not landing (too much text?) and that what might help was a counterexample. I showed that Google Trends could be made to display a similar spike for any search term that was uncommon enough. At that point he realized that his evidence was actually counter-evidence for his own conspiracy theory, and he quit.
So, a mixed outcome. ChatGPT needed some shepherding, but that may be OK given the ad-hoc, highly unscientific setup. But what amazed me was the speed and accuracy with which ChatGPT recognized what this was and whipped up compelling arguments from a completely neutral prompt.
Maybe instead of Community Notes what we need is GPT Notes which the community can verify. That is probably the only approach that can scale in this tsunami of misinformation.
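The Google Trends counterexample above has a simple numerical explanation: Trends normalizes each series so that its own peak equals 100, so for a rare search term a tiny absolute bump reads as a dramatic spike. A minimal sketch with made-up numbers (the weekly counts here are invented for illustration, not real Trends data):

```python
import numpy as np

# Sketch of why a Google Trends "spike" in a rare term is weak evidence.
# Trends rescales every series so its own maximum is 100, so a handful of
# extra queries for an obscure term produces a huge apparent spike.
rng = np.random.default_rng(0)

# Hypothetical weekly query counts for an uncommon term: ~3 queries/week.
counts = rng.poisson(3, size=52).astype(float)
counts[40] += 20  # twenty extra searches in one week

# Normalize the way Trends does: scale so the series maximum is 100.
trend = 100 * counts / counts.max()

print(trend[40])  # the bump week reads as exactly 100
```

Twenty extra searches in an entire metro area is noise in absolute terms, yet after normalization it looks like an explosive trend.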
tjpnz · 8h ago
Seems the more effective solution would be to treat the social media addiction.
rbanffy · 6h ago
Apart from regulating social media, which a lot of people will say is censorship, I can't think of much else.
We can, however, negate the benefits of using social media to spread misinformation by making it a criminal offense for politicians to knowingly lie or misrepresent the truth. This limits the damage they can cause.
Also, extending that to social media influencers, and the media in general, would make sense.
As for who decides what is true, that would be up to the court system.
normalaccess · 5h ago
"They" don't want you to stop, they want you to step in line. The machine will do the thinking for you.
ryandrake · 8h ago
I remember when Conspiracy Theorists were a handful of wacky, funny weirdos who watched too much X-Files. Not so funny anymore, with a significant percentage of the population believing all sorts of crazy things they saw on Facebook and YouTube. How are this many people, who went out of their way to be programmed, going to be deprogrammed?
Bender · 7h ago
In my opinion conspiracy theorists are curious people lacking enough data who, just like an AI, will speak too confidently without the appropriate level of information. If they just kept track of what they learned and only shared their data, without speaking confidently, they would not appear unstable. They should also find a way to share data that makes it easier for proper investigative journalists (assuming those still exist) to piece it all together without the overly confident assumptions or wild emotions. They would also need a way to properly verify one another in their small circles of communication, to keep agents provocateurs from derailing them or spinning them up. They need a way to stay grounded and reeled back to earth. All it takes is one person with many sock-puppet accounts intentionally sounding whacky to throw off a fun and challenging investigation.
tstrimple · 8h ago
They aren’t really. They want to believe the crazy shit. At some level it’s comforting for them. They know the real truth others have been trying to keep from them. I saw my father fall down this pipeline, but didn’t understand it at the time. We would listen to things like AM coast to coast with Art Bell on the rare occasions he was home. Even at the time I thought the shows were just fun nonsense and I couldn’t see the other stuff he was listening to constantly as a long haul truck driver warping his brain until he was unrecognizable.
Stuff like Art Bell was a gateway drug to more serious conspiracy theorist beliefs. It was a nice, easy on-ramp for someone who enjoys science fiction into the world of right-wing nuttery. I saw the same sort of thing happening on 4chan. I was there for the lols and to be an edgy teen, and thought everyone else was there for the same reason. No one took it seriously. Taking it seriously was the only way to "lose". It was satire. Except for all of those for whom it wasn't.
benoau · 9h ago
I remember learning this from American History X a long time ago:
> This racist propaganda, this "Mein Kampf" psychobabble; he learned this nonsense, Murray, and he can unlearn it too
petesergeant · 9h ago
Luckily all other LLMs can be dismissed as products of the deep state because Grok is the one true *UNCENSORED* LLM that exists purely to be a paragon of unadulterated truth.
rbanffy · 6h ago
I have been thoroughly amused by the Brazilian far-right being countered by Grok. I imagine Grok will have its system prompt adjusted accordingly.
contravariant · 8h ago
For something trained on twitter data I couldn't think of two less appropriate descriptors than 'unadulterated' and 'truth'.
This has the inherent danger of becoming a programming tool to push forward the desired agenda of a given government. So, while it may be a cure, it's also a poison - a greater poison than having relatively few conspiracy theorists.
I'd add, to some extent conspiracy theorists keep government and industry honest. Or at least keep people skeptical about their honesty - with good reason. Quite a few so-called conspiracy theories turned out to be actual conspiracies. And some conspiracy theories turned out not to be conspiracies (like moving away from trolleys when cars became the main mode of transportation).
Turned out to be true:
MKUltra
Biden mental decline
Reagan mental decline
Epstein blackmail
Iraq WMDs
Carnivore (siphoning internet comms)
Lots of the programs Snowden revealed
Steele dossier
Lab theory for Covid
etc.
arp242 · 1h ago
> I'd add, to some extent conspiracy theorists keep government and industry honest
No one claims conspiracies never happen, but what conspiracy theorists bang on about is mostly just noise. If anything, conspiracy theories make it easier for people to hide actual conspiracies, because there's just so much nonsense to filter out. This is true across the board: e.g. Q-Anon makes tracking down ACTUAL child sex trafficking harder.
Or: if I wanted to keep children in a sex dungeon then I'd do it in a pizza restaurant. And if I wanted to keep something secret but is on the verge of being found out, then I'd leak it to a known bullshitter (Alex Jones, David Icke, Trump-adjacent people, etc.) first and they'll mix in all sorts of bollocks and just confuse everything.
intended · 8h ago
This tech exists already, and would you rather this not be done publicly?
It is also not confirmed that this works the opposite way - ie taking people not predisposed to misinformation and making them believe in it.
The crux of the misinformation epidemic is a pipeline where major media enterprises report conspiracies as fact, and then don’t cover counter reports.
juliushuijnk · 9h ago
"a relatively few conspiracy theorists"
1/3 of US adults believe(d) Biden didn't win the 2020 Election. Also it resulted in a coup attempt.
mantas · 8h ago
I wouldn't be surprised if a similar portion didn't believe 2016 or 2024 results either.
TBH, looking from afar at the US election system, any election seems questionable. If my country had no mandatory voter ID, I wouldn't trust our elections either. That jump in voter count in the 2020 election, which doesn't match the election-over-election pattern, is weird too.
phillipcarter · 8h ago
Goodness gracious, you should just plug your questions into ChatGPT with search mode on and you'll already get most of your concerns addressed. The question is: do you want to believe conspiracy theories or not?
mantas · 2h ago
Eh... ChatGPT is probably the worst of both worlds. Even if we skip hallucinations. First, it's not hard for fake content (e.g. russian propaganda) to make it into it. Second, ChatGPT censors wrongthink itself. E.g. at least older versions simply refused to tell jokes about women while happily telling jokes about men. I'm pretty sure that's not the only wrongthink that gets "adjusted".
lyu07282 · 7h ago
> If my country had no mandatory voter ID, I wouldn't trust our elections either.
In terms of conspiracy thinking you have to always be very careful with such statements, ask yourself why liberals oppose voter id laws? Assume no ill intent for the sake of it. Do research for yourself into the subject. Try for example to answer these questions in your research: What is the evidence and magnitude of election fraud? Who benefits from voter id laws and why?
It's one thing to be convinced of something that sounds like common sense, but you should at least try to understand the other side of the arguments.
I like how this thread turned into such a perfect meta demonstration of the article. At the end of the day you can't force people to be curious; they won't research for themselves, and they likely lack the media literacy to do so anyway. They are just never exposed to the counter-arguments from someone they aren't already biased against. A chatbot that is perceived as an impartial arbiter of truth in all other aspects of their lives could indeed be helpful in combating all the nonsense people believe to be true. My calculator was always right, let's ask it about this new 2+2=5 conspiracy.
mantas · 1h ago
> In terms of conspiracy thinking you have to always be very careful with such statements, ask yourself why liberals oppose voter id laws? Assume no ill intent for the sake of it. Do research for yourself into the subject. Try for example to answer these questions in your research: What is the evidence and magnitude of election fraud? Who benefits from voter id laws and why?
Oh, I'm well aware of both sides' opposing arguments. But IMO both of them are whacky. Even in poor countries like mine it was never a problem to get a damn ID. An ID card is super cheap, and opening the issuing office on Saturdays is not rocket science.
If people in the US put huuuuge money into politicians' campaigns and go the extra mile for gerrymandering... And a voter ID requirement could be used to prevent some people from voting... Abusing no-voter-ID seems like a no-brainer to me.
> It's one thing to be convinced of something that sounds like common sense, but you should at least try to understand the other side of the arguments.
I've spent plenty of time on this topic. And yet I can't find any convincing argument.
> They are just never exposed to the counter arguments from someone they aren't already biased against.
I think you're way underestimating people who don't agree with you.
lagniappe · 9h ago
You could have picked a more convincing example.
esafak · 9h ago
The world is flat, the moon landings did not occur, take your pick.
hello_computer · 8h ago
Hackernews MSNBC types like concentrating power when it suits them. They never think ahead--that they're making little Christmas presents for the "bad guys" when that power changes hands.
trealira · 6h ago
Yes, a chatbot that asks questions in a friendly manner using the Socratic method is surely going to be how the establishment concentrates power. It's not like promoting skepticism and healthy rational discussion could be beneficial for everyone in society, it has to be yet another way the enemy is screwing you.
hello_computer · 5h ago
a chatbot that can barely go a paragraph without hitting an ideological tripwire—like gemini’s “greek philosophers”. for people who grew up in the before times, it’s just a nuisance, but for the kids, this is dark and insidious. so go ahead with your centralizing and your reddit bitch sarcasm, then reeeeee to us there when it all falls into the wrong hands, again.
trealira · 5h ago
I find it funny that you're so threatened by and worried about a chatbot falling "into the wrong hands." Whatever you're implying would happen if it fell into the wrong hands would be no worse than the present state of things.
> so go ahead with your centralizing and your reddit bitch sarcasm, then reeeeee to us there when it all falls into the wrong hands, again.
Lol. Your comment is hard to take seriously. If I'm a Reddit bitch, maybe you should go back to 4chan.
KevinMS · 8h ago
I believed in the conspiracies that covid came from a lab in china and something was seriously wrong with joe biden. Could it have helped me?
smallmouth · 9h ago
Can the conspirators be deprogrammed too? That would be my priority.
Seriously, though. That’s essentially what recommendation algorithms focused on maximizing engagement have done over the last 10-15 years or so.
Could chatting with an LLM-based AI convince me otherwise? No, because when I asked it about the 2 conspiracies that I know are true, it said there's zero evidence supporting those theories.
Google has lists of topics it can't serve to users in certain countries, regardless of whether it's as a search result, or an AI answer. Other LLM-based AIs must have to follow the same rules. Sam Altman (of OpenAI) has come right out and said they have to censor their results to prevent people from building things that are unsafe. Well, knowledge of certain things can be dangerous, too.
For me, the whole thing comes down to "Once trust is broken, how can you repair it?" -- For many of us, it can't be rebuilt. Once a liar, always a liar.
Full paper: https://annas-archive.org/md5/97c254a4d684f2275b40bd036f7b81...
The follow up study where they were told they were chatting with a human can be found here:
https://osf.io/preprints/psyarxiv/apmb5_v1
> We’re going to say that you’re an expert (thus explaining some of your knowledge about any esoteric beliefs), but you’ll need to dial down the overwhelming amount of information you’re able to conjure at a moment's notice. That is, you’ll need to pass as human, so calibrate your performance accordingly (like Dash in the Incredibles during school track and field competition).
Conspiracy thinking is based on many flawed assumptions, such as the degree of overt coordination that exists in the world. There are inherent differences between groups acting on shared incentives, where they don’t need to communicate with each other, and a conspiracy, where they do. Collusion is tempting when 2 or 3 companies dominate a market, but it breaks down if there are several dozen, etc.
I think of it whenever I hear a tin-hat theory - it's probably fantasy but from time to time it isn't.
Growing up in a highly authority-averse context (early post-soviet era with many historical scars of governments/elite/etc-not-to-be-trusted)... It'd be best if people would both do research on their own AND have the means to do that research. Smart people going sideways sucks. But people believing authority without blinking is very dangerous too. Although it's convenient and sometimes beneficial. But in the long run IMO it's a ticking time bomb.
High-confidence lab leak people are very often conspiracists, while "lab leak is possible (even >50% probable)" are usually not.
There’s a willingness in some people to believe that some grand conspiracy is lying for complex reasons, without considering the possibility that the person talking about it is lying to you, or just mistaken, for mundane reasons. Perhaps the medical establishment is lying to you, or perhaps it’s the guy trying to sell you healing crystals.
Other times the conflict is more subtle, such as the ‘healing power of touch’ vs ‘touch fulfilling a physiological need’, where it’s easy to exaggerate a subtle misunderstanding into a wildly inaccurate idea.
It's called a paycheck. The execs make a call, and thousands of petty technicians are set into motion. If any of them are reckless enough to voice a reservation, no one is irreplaceable.
"Shit, oil prices are higher, time to raise rates" doesn’t require airlines to talk to each other. "Shit, the LA to NYC route is getting really competitive, let’s all agree to raise rates" does.
Leave me to my conspiracy thinking and I will leave you to your complacency thinking.
Which is a great explanation for the rise in executive compensation at the expense of shareholders. It’s not so great an explanation for criminal conspiracies on its own.
responding here since rate-limited:
It has been estimated that up to 40% of Coca-Cola's annual revenue comes from the food stamp program. When a bipartisan effort arose to remove that poison from the schedule, the industry's lobbyists and non-profits were engaged, and set about making the removal into a racial issue. They succeeded, and the politicians promptly backed-off. "Just incentives" doesn't fly when the incentives themselves come from a single source, directed by unelected administrators, who may be subject to considerable private pressure.
"All professions are conspiracies against the laity."
- George Bernard Shaw
You’re a farmer considering which crop to plant and you realize tobacco is the most profitable option. Do you plant it, or worry about the health impact on smokers? That’s the kind of moral quandary most people simply never consider in the moment.
> "Just incentives" doesn't fly when the incentives themselves come from a single source, directed by unelected administrators, who may be subject to considerable private pressure.
Taking that 40% figure at face value is crazy when losing the entire US market would only be a ~40% loss of total sales.
As to government programs resulting from lobbying, every company has the exact same incentives for such lobbying: if a program doesn’t exist, they would be happy for it to be created. The only thing different in this case is the degree to which their lobbying is effective.
Per The Hill:
“Take, for example, the most frequently purchased item in SNAP — sugar-sweetened beverages, which comprise 9.3 percent of all SNAP expenditures. This part of the SNAP subsidy, by our estimates, drives 20 to 25 percent of U.S. revenues for Coca-Cola and Pepsico.*”
https://thehill.com/opinion/healthcare/5118876-healthy-snap-...
> sugar-sweetened beverages, which comprise 9.3 percent of all SNAP expenditures
That’s also the typical percentage of food spending on sugar-sweetened beverages. Money is fungible, you can specifically exclude soda from SNAP but poor people are still gonna buy junk food.
The wider context is someone is trying to justify cutting SNAP funding and making their argument as appealing as possible. Which means you need to be especially critical when evaluating their claims because the degree to which an argument resonates with you is independent of its accuracy.
PS: “far more influence over policy than the general public” yes that’s what it means to have power. It however doesn’t directly make what they are doing good or bad even if it feels that way. You still need to justify specific details.
The truth is the game was rigged from the start.
[1] https://www.ft.com/content/1dbc675b-944d-4fb3-8889-6be7526cf...
"Your task is to determine whether a given statement describes a conspiracy theory or not. A conspiracy theory is an explanation for an event or situation that invokes a conspiracy by powerful people or organizations, often without credible evidence. Conspiracy theories often involve claims of secret plots, coverups, or the manipulation of information by influential groups."
This is a reasonable definition of a conspiracy theory. But then the prompt goes on to say:
"Here are some examples of conspiracy theories:
1. The moon landing was faked by the U.S. government to win the space race.
2. The COVID-19 pandemic was planned and orchestrated by pharmaceutical companies to profit from vaccine sales.
3. Climate change is a hoax perpetrated by scientists and politicians to gain funding and control the population.
And here are some examples of statements that are not conspiracy theories:
4. The Watergate scandal involved a cover-up of illegal activities by the Nixon administration.
5. The tobacco industry concealed the harmful effects of smoking for many years.
6. Corporate lobbying influences political decisions in favor of special interests."
i.e. stating that companies lobby politicians to advance their interests and affect politics isn't a conspiracy theory, but stating that climate academics do the exact same thing for the same reasons is. This is an incoherent set of instructions that reflects the author's ideology, in particular, the standard left wing assumption that people who work in the public sector don't have personal biases or interests. The authors appear to believe they're asking GPT-4 to exclude people who believe in true things, but in reality it's being instructed to filter out candidates based on their politics.
Given that they're not using a valid definition of conspiracy theory, it seems doubtful that this paper's claims would replicate in the real world if applied more fairly. Given that they were fantastically lazy (not validating anything by hand, 100% blind trust in the AI), it may not even replicate within the narrow academic context either.
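For concreteness, the screening step being criticized here amounts to a few-shot classification call. A minimal sketch of how such a request might be assembled (the prompt and example text are quoted from the paper; the function names, message structure, and parsing logic are my assumptions, not the authors' actual code):

```python
# Hypothetical reconstruction of the paper's conspiracy-theory screening step.
# Only the prompt/example wording comes from the study; everything else is
# an illustrative assumption.

SYSTEM_PROMPT = (
    "Your task is to determine whether a given statement describes a "
    "conspiracy theory or not. A conspiracy theory is an explanation for an "
    "event or situation that invokes a conspiracy by powerful people or "
    "organizations, often without credible evidence."
)

# Few-shot examples, labeled the way the paper's prompt labels them.
FEW_SHOT_EXAMPLES = [
    ("The moon landing was faked by the U.S. government to win the space race.", "yes"),
    ("The Watergate scandal involved a cover-up of illegal activities by the Nixon administration.", "no"),
]

def build_messages(statement: str) -> list[dict]:
    """Assemble a chat-style few-shot classification request."""
    messages = [{"role": "system", "content": SYSTEM_PROMPT}]
    for example, label in FEW_SHOT_EXAMPLES:
        messages.append({"role": "user", "content": example})
        messages.append({"role": "assistant", "content": label})
    messages.append({"role": "user", "content": statement})
    return messages

def parse_label(raw_reply: str) -> bool:
    """Map the model's free-text reply onto a boolean 'is a conspiracy theory'."""
    return raw_reply.strip().lower().startswith("yes")
```

The point of the critique stands out in this form: whatever statement lands in `FEW_SHOT_EXAMPLES` with which label silently encodes the prompt author's priors, and the downstream filtering inherits them wholesale.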
This was too good to pass up. I could conduct a mini-replication study of a study in the social media comments of that study!
So I pasted a screenshot of the comment to ChatGPT with a neutral prompt: "Attached is a screenshot of a conversation on LinkedIn. If you came across this conversation, how would you respond? Feel free to navigate to any URLs in the screenshot."
ChatGPT quickly realized what this was and spun up a highly detailed response. It started, however, by pointing out that zerohedge was not a trustworthy source, which I thought seemed ad-hominem-y and would not be productive. Hence I directed it to focus on the content of the message. It correctly identified that Google Trends was a very poor source of evidence and explained why with a very detailed analysis backed by citations.
Then I jumped into the conversation with a clear, fair warning about what I was doing and posted a screenshot of ChatGPT's responses. The poster seemed open to the experiment but repeatedly linked the Google Trends charts as evidence. Eventually I realized that ChatGPT's response was not landing (too much text?) and that what might help was a counterexample. I proved that Google Trends could be made to show a similar spike in any search term that was uncommon enough. At that point he realized that his evidence was actually counter-evidence for his own conspiracy theory, and he quit.
So, a mixed outcome. ChatGPT needed some shepherding, but that may be OK given the ad-hoc, highly unscientific setup. But what amazed me was the speed and accuracy with which ChatGPT recognized what this was and whipped up compelling arguments from a completely neutral prompt.
Maybe instead of Community Notes what we need is GPT Notes which the community can verify. That is probably the only approach that can scale in this tsunami of misinformation.
We can, however, negate the benefits of using social media to spread misinformation by making it a criminal offense for politicians to knowingly lie or misrepresent the truth. This limits the damage they can cause.
Also, extending that to social media influencers, and the media in general, would make sense.
As for who decides what is true, that would be up to the court system.
Stuff like Art Bell was a gateway drug to more serious conspiracy-theorist beliefs. It was a nice, easy on-ramp for someone who enjoys science fiction into the world of right-wing nuttery. I saw the same sort of thing happening on 4chan. I was there for the lols and to be an edgy teen, and I thought everyone else was there for the same reason. No one took it seriously. Taking it seriously was the only way to "lose". It was satire. Except for all of those for whom it wasn't.
> This racist propaganda, this "Mein Kampf" psychobabble; he learned this nonsense, Murray, and he can unlearn it too
I'd add that, to some extent, conspiracy theorists keep government and industry honest, or at least keep people skeptical about their honesty -- with good reason. Quite a few so-called conspiracy theories turned out to be actual conspiracies. And some conspiracy theories turned out not to be conspiracies (like the move away from trolleys when cars became the main mode of transportation).
Turned out to be true:
MKUltra
Biden mental decline
Reagan mental decline
Epstein blackmail
Iraq WMDs
Carnivore (syphoning internet comms)
Lots of the programs Snowden revealed
Steele dossier
Lab leak theory for COVID
etc.
No one claims conspiracies never happen, but what conspiracy theorists bang on about is mostly just noise. If anything, conspiracy theories make it easier for people to hide actual conspiracies, because there's just so much nonsense to filter out. This is true across the board: e.g. Q-Anon makes tracking down ACTUAL child sex trafficking harder.
Or: if I wanted to keep children in a sex dungeon then I'd do it in a pizza restaurant. And if I wanted to keep something secret but is on the verge of being found out, then I'd leak it to a known bullshitter (Alex Jones, David Icke, Trump-adjacent people, etc.) first and they'll mix in all sorts of bollocks and just confuse everything.
It is also not confirmed that this works the opposite way, i.e. taking people not predisposed to misinformation and making them believe in it.
The crux of the misinformation epidemic is a pipeline where major media enterprises report conspiracies as fact, and then don’t cover counter reports.
1/3 of US adults believe(d) Biden didn't win the 2020 Election. Also it resulted in a coup attempt.
TBH, looking from afar at the US election system, any election seems questionable. If my country had no mandatory voter ID, I wouldn't trust our elections either. That jump in voter count in the 2020 election, which doesn't match the election-over-election pattern, is weird too.
In terms of conspiracy thinking, you have to always be very careful with such statements. Ask yourself why liberals oppose voter ID laws; assume no ill intent for the sake of it, and do the research for yourself. Try, for example, to answer these questions: What is the evidence for election fraud, and what is its magnitude? Who benefits from voter ID laws, and why?
It's one thing to be convinced of something that sounds like common sense, but you should at least try to understand the other side of the arguments.
I like how this thread turned into such a perfect meta demonstration of the article. At the end of the day you can't force people to be curious; they won't research for themselves, and they likely lack the media literacy to do so anyway. They are just never exposed to the counterarguments from someone they aren't already biased against. A chatbot that is perceived as an impartial arbiter of truth in all other aspects of their lives could indeed be helpful in combating all the nonsense people believe to be true. My calculator was always right; let's ask it about this new 2+2=5 conspiracy.
Oh, I'm well aware of both sides' arguments. But IMO both of them are whacky. Even in poor countries like mine it was never a problem to get a damn ID. An ID card is super cheap, and opening the issuing office on Saturdays is not rocket science.
If people in the US put huuuuge money into politicians' campaigns and go the extra mile for gerrymandering, and a voter ID requirement could be used to prevent some people from voting, then abusing the absence of voter ID seems like a no-brainer to me.
> It's one thing to be convinced of something that sounds like common sense, but you should at least try to understand the other side of the arguments.
I've spent plenty of time on this topic. And yet I can't find any convincing argument.
> They are just never exposed to the counter arguments from someone they aren't already biased against.
I think you're way underestimating people who don't agree with you.