One of a few issues I have with groups like these is that they often confidently and aggressively spew a set of beliefs that on their face logically follow from one another, until you realize they are built on a set of axioms that are either entirely untested or outright nonsense. This is common everywhere, but it feels especially pronounced in communities like this. It also involves quite a bit of navel-gazing that makes me feel a little sick to participate in.
The smartest people I have ever known have been profoundly unsure of their beliefs and what they know. I immediately become suspicious of anyone who is very certain of something, especially if they derived it on their own.
jl6 · 38m ago
I don’t think it’s just (or even particularly) bad axioms, I think it’s that people tend to build up “logical” conclusions where they think each step is a watertight necessity that follows inevitably from its antecedents, but actually each step is a little bit leaky, leading to runaway growth in false confidence.
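To put toy numbers on the leak (a back-of-the-envelope sketch; the 95% figure is just an assumption for illustration): even if each inferential step independently holds with 95% confidence, a long chain decays fast.

    # Toy model: probability an N-step argument survives if each step
    # independently holds with probability p.
    p = 0.95
    for n in (1, 5, 10, 20):
        print(f"{n:2d} steps: {p ** n:.0%}")  # 95%, 77%, 60%, 36%

Twenty "watertight" steps leave you with roughly one-in-three odds, yet the conclusion tends to be held with the confidence of a single step.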
Not that non-rationalists are any better at reasoning, but non-rationalists do at least benefit from some intellectual humility.
danaris · 27m ago
> I think it’s that people tend to build up “logical” conclusions where they think each step is a watertight necessity that follows inevitably from its antecedents, but actually each step is a little bit leaky, leading to runaway growth in false confidence.
Yeah, this is a pattern I've seen a lot of recently—especially in discussions about LLMs and the supposed inevitability of AGI (and the Singularity). This is a good description of it.
lordnacho · 17m ago
> I immediately become suspicious of anyone who is very certain of something
Me too, in almost every area of life. There's a reason it's called a con man (short for confidence man): they exploit your natural sense that confidence is connected to correctness.
But also, even when it isn't about conning you, how do people become certain of something? They ignored the evidence against whatever they are certain of.
People who actually know what they're talking about will always restrict the context and hedge their bets. Their explanations are tentative, filled with ifs and buts. They rarely say anything sweeping.
ar-nelson · 23m ago
I find Yudkowsky-style rationalists morbidly fascinating in the same way as Scientologists and other cults. Probably because they seem to genuinely believe they're living in a sci-fi story. I read a lot of their stuff, probably too much, even though I find it mostly ridiculous.
The biggest nonsense axiom I see in the AI-cult rationalist world is recursive self-improvement. It's the classic reason superintelligence takeoff happens in sci-fi: once AI reaches some threshold of intelligence, it's supposed to figure out how to edit its own mind, do that better and faster than humans, and exponentially leap into superintelligence. The entire "AI 2027" scenario is built on this assumption; it assumes that soon LLMs will gain the capability of assisting humans on AI research, and AI capabilities will explode from there.
But AI being capable of researching or improving itself is not obvious; there's so many assumptions built into it!
- What if "increasing intelligence", which is a very vague goal, has diminishing returns, making recursive self-improvement incredibly slow?
- Speaking of which, LLMs already seem to have hit a wall of diminishing returns; it seems unlikely they'll be able to assist cutting-edge AI research with anything other than boilerplate coding speed improvements.
- What if there are several paths to different kinds of intelligence with their own local maxima, in which the AI can easily get stuck after optimizing itself into the wrong type of intelligence?
- Once AI realizes it can edit itself to be more intelligent, it can also edit its own goals. Why wouldn't it wirehead itself? (short-circuit its reward pathway so it always feels like it's accomplished its goal)
Knowing Yudkowsky, I'm sure there's a long blog post somewhere where all of these are addressed with several million rambling words of theory, but I don't think any amount of doing philosophy in a vacuum without concrete evidence could convince me that fast-takeoff superintelligence is possible.
ambicapter · 18m ago
One of the only idioms that I don't mind living my life by is, "Follow the truth-seeker, but beware those who've found it".
ctoth · 2h ago
> I immediately become suspicious of anyone who is very certain of something, especially if they derived it on their own.
Are you certain about this?
adrianN · 23m ago
Your own state of mind is one of the easiest things to be fairly certain about.
ants_everywhere · 20m ago
The fact that this is false is one of the oldest findings of research psychology
lazide · 15m ago
said no one familiar with their own mind, ever!
teddyh · 23m ago
All I know is that I know nothing.
idontwantthis · 40m ago
Suspicious implies uncertain. It’s not immediate rejection.
JohnMakin · 1h ago
no
bobson381 · 2h ago
There should be an extremist cult of people who are certain only that uncertainty is the only certain thing
hungmung · 40m ago
What makes you so certain there isn't? A group that has a deep understanding fnord of uncertainty would probably like to work behind the scenes to achieve their goals.
More people should read Sextus Empiricus, as he's basically the O.G. Pyrrhonist skeptic and goes pretty hard on this very train of thought.
bobson381 · 1h ago
Cool. Any specific recs or places to start with him?
rpcope1 · 1h ago
Probably the Hackett book, "Sextus Empiricus: Selections from the Major Writings on Scepticism"
bobson381 · 1h ago
Thanks!
saltcured · 1h ago
There would be, except we're all very much on the fence about whether it is the right cult for us.
arwhatever · 1h ago
“Oh, that must be exhausting.”
ameliaquining · 2h ago
Like Robert Anton Wilson if he were way less chill, perhaps.
VonGuard · 2m ago
This is actually a known pattern in tech, going back to Engelbart and SRI. While not 1-to-1, you could say that the folks who left SRI for Xerox PARC did so because Engelbart and his crew became obsessed with EST: https://en.wikipedia.org/wiki/Erhard_Seminars_Training
EST-type training still exists today. You don't eat until the end of the whole weekend, or maybe you get rice and little else. Everyone is told to insult you on day one until you cry. Then on day two, still having not eaten, they build you up and tell you how great you are and have a group hug. Then they ask you: don't you feel great? Isn't this a good feeling? Don't you want your loved ones to have this feeling? Still having not eaten, you're then encouraged to pay for your family and friends to do the training, without their knowledge or consent.
A friend of mine did this training after his brother paid for his mom to do it, and she paid for him to do it. Let's just say that, though they felt it changed their lives at the time, their lives in no way shape or form changed. Two are in quite a bad place, in fact...
Anyway, point is, the people who invented everything we are using right now were also susceptible to cult-like groups with silly ideas and shady intentions.
kazinator · 9m ago
[delayed]
bobson381 · 2h ago
I keep thinking about the first Avengers movie, when Loki is standing above everyone going "See, is this not your natural state?". There's some perverse security in not getting a choice, and these rationalist frameworks, based in logic, can lead in all kinds of crazy arbitrary directions - powered by nothing more than a refusal to suffer any kind of ambiguity.
jacquesm · 8m ago
They mostly seem to lean that way because it gives them carte blanche to do as they please. It is just a modern version of 'god has led my hand'.
csours · 34m ago
Humans are not chickens, but we sure do seem to love having a pecking order.
lazide · 10m ago
Making good decisions is hard, and being accountable to the results of them is not fun. Easier to outsource if you can.
meroes · 2h ago
It grew out of many different threads: different websites, communities, etc., all around the same time. I noticed it contemporaneously in the philosophy world, where Nick Bostrom's simulation argument was boosted more than it deserved (like everyone was just accepting it at the lay level). Looking back I see it also developed from LessWrong and other sites, but at the time I was wondering what was going on with simulations taking over philosophy talk. Now I see how it all coalesced.
All of it has the appearance of sounding so smart, and a few sites were genuine. But it got taken over.
6177c40f · 11m ago
To be clear, this article isn't calling rationalism a cult, it's about cults that have some sort of association with rationalism (social connection and/or ideology derived from rationalist concepts), e.g. the Zizians.
potatolicious · 1h ago
Yeah, a lot of the comments here are really just addressing cults writ large, as opposed to why this one was particularly successful.
A significant part of this is the intersection of the cult with money and status - this stuff really took off once prominent SV personalities became associated with it, and got turbocharged when it started intersecting with the angel/incubator/VC scene, when there was implicit money involved.
It's unusually successful because -- for a time at least -- there was status (and maybe money) in carrying water for it.
jacquesm · 7m ago
PayPal will be traced as the root cause of many of our future troubles.
mlinhares · 2h ago
> One is Black Lotus, a Burning Man camp led by alleged rapist Brent Dill, which developed a metaphysical system based on the tabletop roleplaying game Mage the Ascension.
What the actual f. This is such an insane thing to read and understand what it means that i might need to go and sit in silence for the rest of the day.
How did we get to this place with people going completely nuts like this?
egypturnash · 4m ago
I've always been under the impression that M:tA's rules of How Magic Works are inspired by actual mystical beliefs that people have practiced for centuries. It's probably about as much of a manual for mystical development as the GURPS Cyberpunk rulebook was for cybercrime, but it's pointing at something that already exists and saying "this is a thing we are going to tell an exaggerated story about".
Came to ask a similar question, but also: has it always been like this? Is the difference just that these fringe people/groups had no visibility before the internet?
It's nuts.
reactordev · 2h ago
It’s always been like this, have you read the Bible? Or the Koran? It’s insane. Ours is just our flavor of crazy. Every generation has some. When you dig at it, there’s always a religion.
mlinhares · 2h ago
Mage is a game for teenagers; it doesn't try to be anything other than a game where you roll dice to do stuff.
reactordev · 2h ago
Mage yea, but the cult? Where do you roll for crazy? Is it a save against perception? Constitution? Or intelligence check?
I know the church of Scientology wants you to crit that roll of tithing.
mlinhares · 2h ago
> I know the church of Scientology wants you to crit that roll of tithing.
I shouldn't LOL at this but I must. We're all gonna die in these terrible times but at least we'll LOL at the madness and stupidity of it all.
reactordev · 1h ago
Like all tragedies, there’s comedy there somewhere. Sometimes you have to be it.
zzzeek · 39m ago
yeah, people should understand what Scientology is based on: the E-Meter, which is some kind of cheap-shit Radio Shack lie detector thing. I'm quite sure LLMs would do very well if given the task of spitting out new cult doctrines, and I would gather we are mere years away from cults based on LLM-generated content (if not already).
bitwize · 12m ago
Terry Davis, a cult of one, believed God spoke to him through his computer's RNG. So... yeah.
saghm · 1h ago
Without speaking for religions I'm not familiar with, I grew up Catholic, and one of the most important Catholic beliefs is that during Mass, the bread (i.e. "communion" wafers) and wine quite literally transform into the body and blood of Jesus, and that eating and drinking it is a necessary ritual to get into heaven[1], which was a source of controversy even back as far as the Protestant Reformation, with some sects retaining that doctrine and others abandoning it. In a lot of ways, what's considered "normal" or "crazy" in a religion comes to what you're familiar with.
For those not familiar enough with the Bible to know what to look for to find the wild stuff, look up the story of Elisha summoning bears out of the forest to maul children for calling him bald, or the last two chapters of Daniel (which I think are only in the Catholic Bible) where he literally blows up a dragon by feeding it a cake.
The "bears" story reads a lot more sensibly if you translated it correctly as "a gang of thugs tries to bully Elisha into killing himself." Still reliant on the supernatural, but what do you expect from such a book?
cjameskeller · 18m ago
To be fair, the description of the dragon incident is pretty mundane, and all he does is prove that the large reptile they had previously been feeding (& worshiping) could be killed:
"Then Daniel took pitch, and fat, and hair, and did seethe them together, and made lumps thereof: this he put in the dragon's mouth, and so the dragon burst in sunder: and Daniel said, Lo, these are the gods ye worship."
tialaramex · 1h ago
Yeah "Transubstantiation" is another technical term people might want to look at in this topic. The art piece "An Oak Tree" is a comment on these ideas. It's a glass of water. But, the artist's notes for this work insist it is an oak tree.
petralithic · 3m ago
Someone else who knows "An Oak Tree"! It is one of my favorite pieces because it takes as primary not reality itself, but the belief of what reality could be.
robertlagrant · 1h ago
Yes, Catholicism has definitely accumulated some cruft :)
startupsfail · 2h ago
It used to always be religion. But now the downsides are well understood, and alternatives that can fill the same need (social activities) - like gathering with your neighbors, singing, performing arts, clubs, parks, and parties - are available and great.
reactordev · 2h ago
I can see that. There’s definitely a reason they keep pumping out Call of Duty’s and Madden’s.
Mountain_Skies · 42m ago
Religions have multitudes of problems, but suicide rates amongst atheists are higher than you'd think they would be. It seems like for many, rejection of organized religion leads to adoption of ad hoc quasi-religions with no mooring to them, leaving the person who is seeking a solid belief system drifting until they find a cult, give in to madness that causes self-harm, or adopt their own system of belief that they then need to vigorously protect from other beliefs.
Some percentage of the population has a lesser need for a belief system (supernatural, ad hoc, or anything else) but in general, most humans appear to be hardcoded for this need and the overlap doesn't align strictly with atheism. For the atheist with a deep need for something to believe in, the results can be ugly. Though far from perfect, organized religions tend to weed out their most destructive beliefs or end up getting squashed by adherents of other belief systems that are less internally destructive.
jacquesm · 6m ago
It's no more crazy than a virgin conception. And yet, here we are. A good chunk of the planet believes that drivel, but they'd throw their own daughters out of the house if they made the same claim.
rglover · 2h ago
> Came to ask a similar question, but also has it always been like this?
Crazy people have always existed (especially cults), but I'd argue recruitment numbers are through the roof thanks to technology and a failing economic environment (instability makes people rationalize crazy behavior).
It's not that those groups didn't have visibility before, it's just easier for the people who share the same...interests...to cloister together on an international scale.
glenstein · 2h ago
I personally (for better or worse) became familiar with Ayn Rand as a teenager, and I think Objectivism as a kind of extended Ayn Rand social circle and set of organizations has faced the charge of cultish-ness, and that dates back to, I want to say, the 70s and 80s at least. I know Rand wrote much earlier than that, but I think the social and organizational dynamics unfolded rather late in her career.
ryandrake · 2h ago
“There are two novels that can change a bookish fourteen-year old’s life: The Lord of the Rings and Atlas Shrugged. One is a childish fantasy that often engenders a lifelong obsession with its unbelievable heroes, leading to an emotionally stunted, socially crippled adulthood, unable to deal with the real world. The other, of course, involves orcs."
Her books were very popular with the gifted kids I hung out with in the late 80s. Cool kids would carry around hardback copies of Atlas Shrugged, impressive by the sheer physical size and art deco cover. How did that trend begin?
cogman10 · 1h ago
I think it's pretty similar dynamics. It's unquestioned premises (dogma) which are supposed to be accepted simply because this is "objectivism" or "rationalism".
Very similar to my childhood religion. "We have figured everything out and everyone else is wrong for not figuring things out".
Rationalism seems like a giant castle built on sand. They just keep accruing premises without ever going backwards to see if those premises make sense. A good example of this is their notions of "information hazards".
hexis · 2h ago
Albert Ellis wrote a book, "Is Objectivism a Religion" as far back as 1968. Murray Rothbard wrote "Mozart Was a Red", a play satirizing Rand's circle, in the early 60's. Ayn Rand was calling her own circle of friends, in "jest", "The Collective" in the 50's. The dynamics were there from almost the beginning.
There are at least a dozen I can think of, including the 'drink the Kool-Aid' Jonestown massacre.
People be crazy, yo.
SirFatty · 2h ago
Of course: Jim Jones, L. Ron Hubbard, David Koresh. I realize there have always been people that are cuckoo for Cocoa Puffs. But so many as there appear to be now?
tuesdaynight · 1h ago
The internet made it possible to know global news all the time. I think there have always been a lot of people with very crazy and extremist views, but we only knew about the ones closer to us. Now it's possible to know about crazy people from the other side of the planet, so it looks like there's a lot more of them than before.
lazide · 1h ago
Yup. Like previously, westerners could have gone their whole lives with no clue the Hindutva existed [https://en.m.wikipedia.org/wiki/Hindutva] - Hindu Nazis, basically. Which if you know Hinduism at all, is a bit like saying Buddhist Nazis. Say what?
Just a note that the Heaven's Gate website is still up. It's a wonderful snapshot of 90s web design. https://www.heavensgate.com/
ants_everywhere · 18m ago
what a wild set of SEO keywords
> Heaven's Gate Heaven's Gate Heaven's Gate Heaven's Gate Heaven's Gate Heaven's Gate Heaven's Gate Heaven's Gate ufo ufo ufo ufo ufo ufo ufo ufo ufo ufo ufo ufo space alien space alien space alien space alien space alien space alien space alien space alien space alien space alien space alien space alien extraterrestrial extraterrestrial extraterrestrial extraterrestrial extraterrestrial extraterrestrial extraterrestrial extraterrestrial extraterrestrial extraterrestrial extraterrestrial extraterrestrial extraterrestrial extraterrestrial misinformation misinformation misinformation misinformation misinformation misinformation misinformation misinformation misinformation misinformation misinformation misinformation freedom freedom freedom freedom freedom freedom freedom freedom freedom freedom freedom freedom second coming second coming second coming second coming second coming second coming second coming second coming second coming second coming angels angels angels angels angels angels angels angels angels angels end end times times end times end times end times end times end times end times end times end times end times Key Words: (for search engines) 144,000, Abductees, Agnostic, Alien, Allah, Alternative, Angels, Antichrist, Apocalypse, Armageddon, Ascension, Atheist, Awakening, Away Team, Beyond Human, Blasphemy, Boddhisattva, Book of Revelation, Buddha, Channeling, Children of God, Christ, Christ's Teachings, Consciousness, Contactees, Corruption, Creation, Death, Discarnate, Discarnates, Disciple, Disciples, Disinformation, Dying, Ecumenical, End of the Age, End of the World, Eternal Life, Eunuch, Evolution, Evolutionary, Extraterrestrial, Freedom, Fulfilling Prophecy, Genderless, Glorified Body, God, God's Children, God's Chosen, God's Heaven, God's Laws, God's Son, Guru, Harvest Time, He's Back, Heaven, Heaven's Gate, Heavenly Kingdom, Higher Consciousness, His Church, Human Metamorphosis, Human Spirit, Implant, Incarnation, Interfaith, Jesus, Jesus' Return, Jesus' Teaching, Kingdom of God, Kingdom of Heaven, Krishna Consciousness, Lamb of God, Last Days, Level Above Human, Life After Death, Luciferian, Luciferians, Meditation, Members of the Next Level, Messiah, Metamorphosis, Metaphysical, Millennium, Misinformation, Mothership, Mystic, Next Level, Non Perishable, Non Temporal, Older Member, Our Lords Return, Out of Body Experience, Overcomers, Overcoming, Past Lives, Prophecy, Prophecy Fulfillment, Rapture, Reactive Mind, Recycling the Planet, Reincarnation, Religion, Resurrection, Revelations, Saved, Second Coming, Soul, Space Alien, Spacecraft, Spirit, Spirit Filled, Spirit Guide, Spiritual, Spiritual Awakening, Star People, Super Natural, Telepathy, The Remnant, The Two, Theosophy, Ti and Do, Truth, Two Witnesses, UFO, Virginity, Walk-ins, Yahweh, Yeshua, Yoda, Yoga,
JTbane · 2h ago
I don't know how you can call yourself a "rationalist" and base your worldview on a fantasy game.
ponector · 1h ago
In my experience, religious people are perfectly fine with a contradictory worldview.
Christians, for example, are very flexible in following the Ten Commandments; always have been.
BalinKing · 37m ago
That example isn’t a contradictory worldview though, just “people being people, and therefore failing to be as good as the ideal they claim to strive for.”
reactordev · 2h ago
Rationalizing the fantasy. Like LARPing. Only you lack weapons, armor, magic missiles…
hungryhobbit · 2h ago
Mage is an interesting game though: it's fantasy, but not "swords and dragons" fantasy. It's set in the real world, and the "magic" is just the "mage" shifting probabilities so that unlikely (but possible) things occur.
Such a setting would seem like the perfect backdrop for a cult that claims "we have the power to subtly influence reality and make improbable things (ie. "magic") occur".
vannevar · 1h ago
"Rationalist" in this context does not mean "rational person," but rather "person who rationalizes."
empath75 · 1h ago
Most "rationalists" throughout history have been very deeply religious people. Secular enlightenment-era rationalism is not the only direction you can go with it. It depends very much, as others have said, on what your axioms are.
But, fwiw, that particular role-playing game was very much based on trendy at the time occult beliefs in things like chaos magic, so it's not completely off the wall.
Narcissists tend to believe that they are always right, no matter what the topic is or how knowledgeable they are. This makes them speak with confidence and conviction.
Some people are very drawn to confident people.
If the cult leader has other mental health issues, it can/will seep into their rhetoric. Combine that with unwavering support from loyal followers that will take everything they say as gospel...
That's about it.
TheOtherHobbes · 2h ago
That's pretty much it. The beliefs are just a cover story.
Outside of those, the cult dynamics are cut-paste, and always involve an entitled narcissistic cult leader acquiring as much attention/praise, sex, money, and power as possible from the abuse and exploitation of followers.
Most religion works like this. Most alternative spirituality works like this. Most finance works like this. Most corporate culture works like this. Most politics works like this.
Most science works like this. (It shouldn't, but the number of abused and exploited PhD students and post-docs is very much not zero.)
The only variables are the differing proportions of attention/praise, sex, money, and power available to leaders, and the amount of abuse that can be delivered to those lower down and/or outside the hierarchy.
The hierarchy and the realities of exploitation and abuse are a constant.
If you removed this dynamic from contemporary culture there wouldn't be a lot left.
Fortunately quite a lot of good things happen in spite of it. But a lot more would happen if it wasn't foundational.
vannevar · 1h ago
Yes. The cult's "beliefs" really boil down to one belief: the infallibility of the leader. Much of the attraction is in the simplicity.
namuol · 2h ago
> How did we get to this place with people going completely nuts like this?
Ayahuasca?
Nihilartikel · 2h ago
I'm entertaining sending my kiddo to a Waldorf School, because it genuinely seems pretty good.
But looking into the underlying Western Esoteric Spirit Science, 'Anthroposophy' (because Theosophy wouldn't let him get weird enough) by Rudolf Steiner, has been quite a ride. The point being that humans have a pretty endless capacity to go ALL IN on REALLY WEIRD shit, as long as it promises to fix their lives if they do everything they're told. Naturally, if their lives aren't fixed, then they did it wrong or have karmic debt to pay down, so YMMV.
In any case, I'm considering the latent woo-cult atmosphere as a test of the skeptical inoculation that I've tried to raise my child with.
BryantD · 1h ago
I went to a Waldorf school and I’d recommend being really wary. The woo is sort of background noise, and if you’ve raised your kid well they’ll be fine. But the quality of the academics may not be good at all. For example, when I was ready for calculus my school didn’t have anyone who knew how to teach it so they stuck me and the other bright kid in a classroom with a textbook and told us to figure it out. As a side effect of not being challenged, I didn’t have good study habits going into college, which hurt me a lot.
If you’re talking about grade school, interview whoever is gonna be your kids teacher for the next X years and make sure they seem sane. If you’re talking about high school, give a really critical look at the class schedule.
Waldorf schools can vary a lot in this regard so you may not encounter the same problems I did, but it’s good to be cautious.
rglover · 2h ago
I came to comments first. Thank you for sharing this quote. Gave me a solid chuckle.
I think people are going nuts because we've drifted from the dock of a stable civilization. Institutions are a mess. Economy is a mess. Combine all of that together with the advent of social media making the creation of echo chambers (and the inevitable narcissism of "leaders" in those echo chambers) effortless and ~15 years later, we have this.
AnimalMuppet · 1h ago
From false premises, you can logically and rationally reach really wrong conclusions. If you have too much pride in your rationality, you may not be willing to say "I seem to have reached a really insane conclusion; maybe my premises are wrong." That is, the more you pride yourself on your rationalism, the more prone you may be to accepting a conclusion that is bogus because the premises are wrong.
DangitBobby · 27m ago
Then again, most people tend to form really bogus beliefs without bothering to establish any premises. They may not even be internally consistent or align meaningfully with reality. I imagine having premises and thinking it through has a better track record of reaching conclusions consistent with reality.
davorak · 2h ago
Makes me think of that saying that great artists steal, so repurposed for cult founders: "Good cult founders copy, great cult founders steal"
I do not think this cult dogma is any more out there than other cult dogma I have heard, but the above quote makes me think it is easier to found cults in the modern day in some ways, since you can steal complex world-building from numerous sources rather than building it yourself and keeping everything straight.
linohh · 2h ago
Running a cult is a somewhat reliable source of narcissistic supply. The internet tells you how to do it. So an increasing number of people do it.
No comments yet
pstuart · 2h ago
People are wired to worship, and want somebody in charge telling them what to do.
I'm a staunch atheist and I feel the pull all the time.
optimalsolver · 2h ago
astronauts_meme.jpg
eli_gottlieb · 1h ago
Who the fuck bases a Black Lotus cult on Mage: the Ascension rather than Magic: the Gathering? Is this just a mistake by the journalist?
piva00 · 2h ago
I've met a fair share of people in the burner community, the vast majority I met are lovely folks who really enjoy the process of bringing some weird big idea into reality, working hard on the builds, learning stuff, and having a good time with others for months to showcase their creations at some event.
On the other hand, there's a whole other side: a few nutjobs who really behave like cult leaders. They believe their own bullshit and over time manage to find a lot of "followers" in this community; since one of the foundational aspects is radical acceptance, it becomes very easy to be nutty and not be questioned (unless you do something egregious).
greenavocado · 2h ago
Humans are compelled to find agency and narrative in chaos. Evolution favored those who assumed the rustle was a predator, not the wind. In a post-Enlightenment world where traditional religion often fails (or is rejected), this drive doesn't vanish. We don't stop seeking meaning. We seek new frameworks. Our survival depended on group cohesion. Ostracism meant death. Cults exploit this primal terror. Burning Man's temporary city intensifies this: extreme environment, sensory overload, forced vulnerability. A camp like Black Lotus offers immediate, intense belonging. A tribe with shared secrets (the "Ascension" framework), rituals, and an "us vs. the sleepers" mentality. This isn't just social; it's neurochemical. Oxytocin (bonding) and cortisol (stress from the environment) flood the system, creating powerful, addictive bonds that override critical thought.
Human brains are lazy Bayesian engines. In uncertainty, we grasp for simple, all-explaining models (heuristics). Mage provides this: a complete ontology where magic equals psychology/quantum woo, reality is malleable, and the camp leaders are the enlightened "tradition." This offers relief from the exhausting ambiguity of real life. Dill didn't invent this; he plugged into the ancient human craving for a map that makes the world feel navigable and controllable. The "rationalist" veneer is pure camouflage. It feels like critical thinking but is actually pseudo-intellectual cargo culting. This isn't Burning Man's fault. It's the latest step of a 2,500-year-old playbook. The Gnostics and the Hermeticists provided ancient frameworks where secret knowledge ("gnosis") granted power over reality, accessible only through a guru. Mage directly borrows from this lineage (The Technocracy, The Traditions). Dill positioned himself as the modern "Ascended Master" dispensing this gnosis.
The 20th century cults Synanon, EST, Moonies, NXIVM all followed similar patterns, starting with isolation. Burning Man's temporary city is the perfect isolation chamber. It's physically remote, temporally bounded (a "liminal space"), fostering dependence on the camp. Initial overwhelming acceptance and belonging (the "Burning Man hug"), then slowly increasing demands (time, money, emotional disclosure, sexual access), framed as "spiritual growth" or "breaking through barriers" (directly lifted from Mage's "Paradigm Shifts" and "Quintessence"). Control language ("sleeper," "muggle," "Awakened"), redefining reality ("that rape wasn't really rape, it was a necessary 'Paradox' to break your illusions"), demanding confession of "sins" (past traumas, doubts), creating dependency on the leader for "truth."
Burning Man attracts people seeking transformation, often carrying unresolved pain. Cults prey on this vulnerability. Dill allegedly targeted individuals with trauma histories. Trauma creates cognitive dissonance and a desperate need for resolution. The cult's narrative (Mage's framework + Dill's interpretation) offers a simple explanation for their pain ("you're unAwakened," "you have Paradox blocking you") and a path out ("submit to me, undergo these rituals"). This isn't therapy; it's trauma bonding weaponized. The alleged rape wasn't an aberration; it was likely part of the control mechanism. It's a "shock" to induce dependency and reframe the victim's reality ("this pain is necessary enlightenment"). People are adrift in ontological insecurity (fear about the fundamental nature of reality and self). Mage offers a new grand narrative with clear heroes (Awakened), villains (sleepers, Technocracy), and a path (Ascension).
TimorousBestie · 2h ago
Mage: The Ascension is basically a delusions of grandeur simulator, so I can see how an already unstable personality might get attached to it and become more unstable.
mlinhares · 2h ago
I don't know, I'd understand something like Wraith (where I did see people developing issues; the shadow mechanic is such a terrible thing), but Mage is so, like, straightforward?
Use your mind to control reality, reality fights back with Paradox. It's cool for a teenager, but read a bit more fantasy and you'll definitely find cooler stuff. But I guess for you to join a cult your mind must stay a teen mind forever.
WJW · 2h ago
I didn't originally write this, but can't find the original place I read it anymore. I think it makes a lot of sense to repost it here:
All of the World Of Darkness and Chronicles Of Darkness games are basically about coming of age/puberty. Like X-Men but for Goth-Nerds instead of Geek-Nerds.
In Vampire, your body is going through weird changes and you are starting to develop, physically and/or mentally, while realising that the world is run by a bunch of old, evil fools who still expect you to toe the line and stay in your place, but you are starting to wonder if the world wouldn't be better if your generation overthrew them and took over running the world, doing it the right way. And there are all these bad elements trying to convince you that you should do just that, but for the sake of mindless violence and raucous partying.
Teenager - the rpg.
In Werewolf, your body is going through weird changes and you are starting to develop, physically and mentally, while realising that you are not a part of the "normal" crowd that the rest of Humanity belongs to. You are different and they just can't handle that whenever it gets revealed. Luckily, there are small communities of people like you out there who take you in and show you how use the power of your "true" self. Of course, even among this community, there are different types of other.
LGBT Teenager - the RPG
In Mage, you have begun to take an interest in the real world, and you think you know what the world is really like. The people all around you are just sleep-walking through life, because they don't really get it. This understanding sets you against the people who run the world: the governments and the corporations, trying to stop these sleeper from waking up to the truth and rejecting their comforting lies. You have found some other people who saw through them, and you think they've got a lot of things wrong, but at least they're awake to the lies!
Rebellious Teenager - the RPG
abullinan · 2h ago
“ The people all around you are just sleep-walking through life, because they don't really get it.”
Twist: we’re sleepwalking through life because we really DO get it.
(Source: I’m 56)
mlinhares · 2h ago
This tracks, but I'd say Werewolf goes beyond LGBT folks; the violence there also fits boys' aggressive play, and the saving-the-world theme resonated a lot with the basic "I want to be important/a hero" thing. It's my favorite of all the World of Darkness books; I regret not getting the Kickstarter edition :(
reactordev · 2h ago
I think I read it too, it’s called Twilight. /s
I had friends who were into Vampire growing up. I hadn’t heard of Werewolf until after the aforementioned book came out and people started going nuts for it. I mentioned to my wife at the time that there was this game called “Vampire” and told her about it and she just laughed, pointed to the book, and said “this is so much better”. :shrug:
Rewind back and there were the Star Wars kids. Fast forward and there are the Harry Potter kids/adults. Each generation has their own “thing”. During that time, it was Quake MSDOS and Vampire. Oh and we started Senior Assassinations. 90s super soakers were the real deal.
wavefunction · 2h ago
How many adults actually abandon juvenilia as they age? Not the majority in my observation, and it's not always a bad thing when it's only applied to subjects like pop culture. Applied juvenilia in response to serious subjects is a more serious issue.
mlinhares · 2h ago
There has to be a cult of people that believe they're vampires, respecting the Masquerade and serving some antediluvian somewhere; Vampire was much more mainstream than Mage.
DonHopkins · 2h ago
There are post-pubescent males who haven't abandoned Atlas Shrugged posting to this very web site!
gedy · 2h ago
Paraphrasing someone I don't recall - when people believe in nothing, they'll believe anything.
collingreen · 1h ago
And therefore you should believe in me and my low low 10% tithe! That's the only way to not get tricked into believing something wrong so don't delay!
j_m_b · 2h ago
> One way that thinking for yourself goes wrong is that you realize your society is wrong about something, don’t realize that you can’t outperform it, and wind up even wronger.
many such cases
NoGravitas · 1h ago
Capital-R Rationalism also encourages you to think you can outperform it, by being smart and reasoning from first principles. That was the idea behind MetaMed, founded by LessWronger Michael Vassar - that being trained in rationalism made you better at medical research and consulting than medical school or clinical experience. Fortunately they went out of business before racking up a body count.
rpcope1 · 1h ago
One lesson I've learned and seen a lot in my life is that understanding that something is wrong, or what's wrong about it, and being able to come up with a better solution are distinct skills, and the latter is often much harder. Those best able to describe the problem often don't overlap much with those who can figure out how to solve it, even though they think they can.
shadowgovt · 2h ago
It is an unfortunate reality of our existence that sometimes Chesterton actually did build that fence for a good reason, a good reason that's still here.
(One of my favorite TED talks was about a failed experiment in introducing traditional Western agriculture to a people in Zambia. It turns out when you concentrate too much food in one place, the hippos come and eat it all and people can't actually out-fight hippos in large numbers. In hindsight, the people running the program should have asked how likely it was that folks in a region that had exposure to other people's agriculture for thousands of years, hadn't ever, you know... tried it. https://www.ted.com/talks/ernesto_sirolli_want_to_help_someo...)
bobson381 · 2h ago
You sound like you'd like the book Seeing like a State.
HDThoreaun · 36m ago
Why didnt they kill the hippos like we killed the buffalo?
im3w1l · 2h ago
Shoot the hippos to death for even more food. If it doesn't seem to work it's just a matter of having more and bigger guns.
ljlolel · 2h ago
TEDx
quantummagic · 2h ago
It's almost the defining characteristic of our time.
biophysboy · 2h ago
> “There’s this belief [among rationalists],” she said, “that society has these really bad behaviors, like developing self-improving AI, or that mainstream epistemology is really bad–not just religion, but also normal ‘trust-the-experts’ science. That can lead to the idea that we should figure it out ourselves. And what can show up is that some people aren't actually smart enough to form very good conclusions once they start thinking for themselves.”
I see this arrogant attitude all the time on HN: reflexive distrust of the "mainstream media" and "scientific experts". Critical thinking is a very healthy idea, but it's dangerous when people use it as a license to categorically reject sources. It's even worse when extremely powerful people do this; they can reduce an enormous sub-network of thought into a single node for many, many people.
So, my answer for "Why Are There So Many Rationalist Cults?" is the same reason all cults exist: humans like to feel like they're in on the secret. We like to be in secret clubs.
ameliaquining · 2h ago
Sure, but that doesn't say anything about why one particular social scene would spawn a bunch of cults while others do not, which is the question that the article is trying to answer.
biophysboy · 1h ago
Maybe I was too vague. My argument is that cults need a secret. The secret of the rationalist community is "nobody is rational except for us". Then the rituals would be endless probability/math/logic arguments about sci-fi futures.
dfabulich · 2h ago
The whole game of Rationalism is that you should ignore gut intuitions and cultural norms that you can't justify with rational arguments.
Well, it turns out that intuition and long-lived cultural norms often have rational justifications, but individuals may not know what they are, and norms/intuitions provide useful antibodies against narcissist would-be cult leaders.
Can you find the "rational" justification not to isolate yourself from non-Rationalists, not to live with them in a polycule, and not to take a bunch of psychedelic drugs with them? If you can't solve that puzzle, you're in danger of letting the group take advantage of you.
StevenWaterman · 1h ago
Yeah, I think this is exactly it. If something sounds extremely stupid, or if everyone around you says it's extremely stupid, it probably is. If you can't justify it, it's probably because you have failed to find the reason it's stupid, not because it's actually genius.
And the crazy thing is, none of that is fundamentally opposed to rationalism. You can be a rationalist who ascribes value to gut instinct and societal norms. Those are the product of millions of years of pre-training.
I have spent a fair bit of time thinking about the meaning of life. And my conclusions have been pretty crazy. But they sound insane, so until I figure out why they sound insane, I'm not acting on those conclusions. And I'm definitely not surrounding myself with people who take those conclusions seriously.
kelseyfrog · 1h ago
> The whole game of Rationalism is that you should ignore gut intuitions and cultural norms that you can't justify with rational arguments.
Specifically, rationalism spends a lot of time on priors, but a sneaky thing happens that I call the 'double update'.
Bayesian updating works when you update your genuine prior belief with new evidence. No one disagrees with this; sometimes it's easy to do and sometimes it's difficult.
What Rationalists often end up doing is relaxing their priors - intuition, personal experience, cultural norms - and then updating. They think of this as one update, but it is really two. The first update, relaxing the priors, isn't associated with evidence; it's part of the community norms. There is an implicit belief that by relaxing one's priors you become more open to reality. The real result, though, is that it sends people wildly off course. Case in point: all the cults.
Consider the pre-tipped scale. You suspect the scale reads a little low, so before weighing you tilt it slightly to "correct" for that bias. Then you pour in flour until the dial says you've hit the target weight. You’ve followed the numbers exactly, but because you started from a tipped scale, you've ended up with twice the flour the recipe called for.
If you're going to correct for bias by relaxing your priors, that correction should itself be driven by evidence, not done just because everyone around you is doing it.
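Here is a minimal numeric sketch of the double update (my own toy Beta-Bernoulli model, not anyone's actual reasoning): the move from the genuine prior to the relaxed one happens before any data arrives.

    # Toy model: estimating a coin's bias. "Genuine" prior: strong
    # belief the coin is fair, Beta(50, 50). "Relaxed" prior: the same
    # belief watered down to Beta(2, 2) before seeing any evidence.
    def posterior_mean(prior_a, prior_b, heads, tails):
        # standard conjugate Beta + Bernoulli update
        return (prior_a + heads) / (prior_a + prior_b + heads + tails)

    evidence = dict(heads=9, tails=1)
    print(posterior_mean(50, 50, **evidence))  # genuine prior -> ~0.54
    print(posterior_mean(2, 2, **evidence))    # relaxed prior -> ~0.79

The relaxation from Beta(50, 50) to Beta(2, 2) is the first, evidence-free update; the coin flips are the second. Only the second is Bayesian.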
empath75 · 1h ago
> The whole game of Rationalism is that you should ignore gut intuitions and cultural norms that you can't justify with rational arguments.
The game as it is _actually_ played is that you use rationalist arguments to justify your pre-existing gut intuitions and personal biases.
NoGravitas · 1h ago
Or worse - to justify the gut intuitions and personal biases of your cult leader.
nathan_compton · 2h ago
Thinking too hard about anything will drive you insane, but I think the real issue here is that rationalists simply overestimate both the power of rational thought and their ability to do it. If you think of people who tend to make that kind of mistake, you can see how you get a lot of crazy groups.
I guess I'm a radical skeptic, secular humanist, utilitarianish sort of guy, but I'm not dumb enough to think throwing around the words "bayesian prior" and "posterior distribution" makes actually figuring out how something works, or predicting the outcome of an intervention, easy or certain. I've had a lot of life at this point and gotten to some level of mastery at a few things, and my main conclusion is that most of the time it's just hard to know stuff, and that the single most common cognitive mistake people make is too much certainty.
rpcope1 · 37m ago
Even the real progenitors of a lot of this sort of thought, like E.T. Jaynes, espoused significantly more skepticism than I've ever seen from a "rationalist". I would even imagine that if you asked most rationalists who E.T. Jaynes was (assuming they weren't well versed in statistical mechanics), they'd have no idea who he was or why his work was important to applying "Bayesianism".
nyeah · 2h ago
I'm lucky enough work in a pretty rational place (small "r"). We're normally data-limited. Being "more rational" would mean taking/finding more of the right data, talking to the right people, reading the right stuff. Not just thinking harder and harder about what we already know.
There's a point where more passive thinking stops adding value and starts subtracting sanity. It's pretty easy to get to that point. We've all done it.
naasking · 2h ago
> We're normally data-limited.
This is a common sentiment but is probably not entirely true. A great example is cosmology: yes, more data would make some work easier, but astrophysicists and cosmologists have shown that you can gather and combine existing data and look at it in novel ways to produce unexpected results, like placing bounds that include or exclude various theories.
I think a philosophy that encourages more analysis rather than sitting back on our laurels with an excuse that we need more data is good, as long as it's done transparently and honestly.
spott · 24m ago
This depends on what you are trying to figure out.
If you are talking about cosmology? Yea, you can look at existing data in new ways, cause you probably have enough data to do that safely.
If you are looking at human psychology? Looking at existing data in new ways is essentially p-hacking. And you probably won’t ever have enough data to define a “universal theory of the human mind”.
nyeah · 1h ago
I suspect you didn't read some parts of my comment. I didn't say everyone in the world is always data-limited, I said we normally are where I work. I didn't recommend "sitting back on our laurels." I made very specific recommendations.
The qualifier "normally" already covers "not entirely true". Of course it's not entirely true. It's mostly true for us now. (In fact twenty years ago we used more numerical models than we do now, because we were facing more unsolved problems where the solution was pretty well knowable just by doing more complicated calculations, but without taking more data. Back then, when people started taking lots of data, it was often a total waste of time. But right now, most of those problems seem to be solved. We're facing different problems that seem much harder to model, so we rely more on data. This stage won't be permanent either.)
It's not a sentiment, it's a reality that we have to deal with.
naasking · 43m ago
> It's not a sentiment, it's a reality that we have to deal with.
And I think you missed the main point of my reply: people often think we need more data, but cleverness and ingenuity can often find a way to make meaningful progress with existing data. Obviously I can't make any definitive judgment about your specific case, but I'm skeptical of any claim that a genius like Einstein, analyzing your problem, could get no further than you have.
sunshowers · 1h ago
I don't disagree, but to steelman the case for (neo)rationalism: one of its fundamental contributions is that Bayes' theorem is extraordinarily important as a guide to reality, perhaps at the same level as the second law of thermodynamics; and that it is dramatically undervalued by larger society. I think that is all basically correct.
(I call it neorationalism because it is philosophically unrelated to the more traditional rationalism of Spinoza and Descartes.)
matthewdgreen · 1h ago
I don't understand what "Bayes' theorem is a good way to process new data" (something that is not at all a contribution of neorationalism) has to do with "human beings are capable of using this process effectively at a conscious level to get to better mental models of the world." I think the rationalist community has a thing called "motte and bailey" that would apply here.
rpcope1 · 34m ago
Where Bayes' theorem applies in unconventional ways is not remotely novel territory for "rationalism" (except maybe in their strange, busted, hand-wavy circle-jerk "thought experiments"). This has been the domain of statistical mechanics since long before Yudkowsky and other cult leaders could even mouth "update your priors".
sunshowers · 26m ago
I don't know, most of science still runs on frequentist statistics. Juries convict all the time on evidence that would never withstand a Bayesian analysis. The prosecutor's fallacy is real.
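To make the prosecutor's fallacy concrete (a toy calculation; all numbers are made up for illustration):

    # The fallacy conflates P(match | innocent) with P(innocent | match).
    population = 1_000_000       # plausible suspects, exactly one guilty
    p_match_innocent = 1e-4      # false-positive rate of the forensic test
    p_guilty = 1 / population    # flat prior over the population

    # Bayes' theorem, assuming the true culprit always matches
    p_match = p_guilty + p_match_innocent * (1 - p_guilty)
    print(p_guilty / p_match)    # ~0.0099: about a 1% chance of guilt

A "1 in 10,000" match sounds damning, but with a million plausible suspects you expect about 100 innocent matches, so the match alone leaves the defendant around 99% likely to be innocent.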
throw4847285 · 2h ago
People find academic philosophy impenetrable and pretentious, but it has two major advantages over rationalist cargo cults.
The first is diffusion of power. Social media is powered by charisma, and while it is certainly true that personality-based cults are nothing new, the internet makes it way easier to form one. Contrast that with academic philosophy. People can have their own little fiefdoms, and there is certainly abuse of power, but rarely concentrated in such a way that you see within rationalist communities.
The second (and more idealistic) is that the discipline of Philosophy is rooted in the Platonic/Socratic notion that "I know that I know nothing." People in academic philosophy are on the whole happy to provide a gloss on a gloss on some important thinker, or some kind of incremental improvement over somebody else's theory. This makes it extremely boring, and yet, not nearly as susceptible to delusions of grandeur. True skepticism has to start with questioning one's self, but everybody seems to skip that part and go right to questioning everybody else.
Rationalists have basically reinvented academic philosophy from the ground up with none of the rigor, self-discipline, or joy. They mostly seem to dedicate their time to providing post-hoc justifications for the most banal unquestioned assumptions of their subset of contemporary society.
NoGravitas · 1h ago
> Rationalists have basically reinvented academic philosophy from the ground up with none of the rigor, self-discipline, or joy.
Taking academic philosophy seriously, at least as an historical phenomenon, would require being educated in the humanities, which is unpopular and low-status among Rationalists.
alphazard · 2h ago
The terminology here is worth noting. Is a Rationalist Cult a cult that practices Rationalism according to third parties, or is it a cult that says they are Rationalist?
Clearly all of these groups that believe in demons or realities dictated by tabletop games are not what third parties would call Rationalist. They might call themselves that.
There are some pretty simple tests that can out these groups as not rational. None of these people have ever seen a demon, so world models including demons have never predicted any of their sense data. I doubt these people would be willing to make any bets about when or if a demon will show up. Many of us would be glad to make a market concerning predictions made by tabletop games about physical phenomena.
glenstein · 2h ago
Yeah, I would say the groups in question are notionally, aspirationally rational and I would hate for the takeaway to be disengagement from principles of critical thinking and skeptical thinking writ large.
Which, to me, raises the fascinating question of what does a "good" version look like, of groups and group dynamics centered around a shared interest in best practices associated with critical thinking?
At a first impression, I think maybe these virtues (which are real!) disappear into the background of other, more applied specializations, whether professions, hobbies, backyard family barbecues.
alphazard · 2h ago
It would seem like the quintessential Rationalist institution to congregate around is the prediction market. Status in the community has to be derived from a history of making good bets (PnL as a %, not in absolute terms). And the sense of community would come from (measurably) more rational people teaching (measurably) less rational people how to be more rational.
handoflixue · 1h ago
The founder of LessWrong / The Rationalist movement would absolutely agree with you here, and has written numerous fanfics about a hypothetical alien society ("Dath Ilan") where those are fairly central.
ameliaquining · 2h ago
The article is talking about cults that arose out of the rationalist social milieu, which is a separate question from whether the cult's beliefs qualify as "rationalist" in some sense (a question that usually has no objective answer anyway).
Barrin92 · 2h ago
>so world models including demons have never predicted any of their sense data.
There's a reason they call themselves "rationalists" instead of empiricists or positivists. They perfectly inverted Hume ("reason is, and ought only to be the slave of the passions")
These kinds of harebrained views aren't an accident but a product of rationalism. The idea that intellect is quasi infinite and that the world can be mirrored in the mind is not running contradictory to, but just the most extreme form of rationalism taken to its conclusion, and of course deeply religious, hence the constant fantasies about AI divinities and singularities.
gadders · 2h ago
They are literally the "ackchyually" meme made flesh.
noqc · 2h ago
Perhaps I will get downvoted to death again for saying so, but the obvious answer is because the name "rationalist" is structurally indistinguishable from the name "scientology" or "the illuminati". You attract people who are desperate for an authority to appeal to, but for whatever reason are no longer affiliated with the church of their youth. Even a rationalist movement which held nothing as dogma would attract people seeking dogma, and dogma would form.
The article begins by saying the rationalist community was "drawn together by AI researcher Eliezer Yudkowsky’s blog post series The Sequences". Obviously the article intends to make the case that this is a cult, but it's already done with the argument at this point.
o11c · 20m ago
> for whatever reason are no longer affiliated with the church of their youth.
This is the Internet, you're allowed to say "they are obsessed with unlimited drugs and weird sex things, far beyond what even the generally liberal society tolerates".
I'm increasingly convinced that every other part of "Rationalism" is just distraction or justification for those; certainly there's a conscious decision to minimize talking about this part on the Internet.
handoflixue · 1h ago
> Obviously the article intends to make the case that this is a cult
The author is a self-identified rationalist. This is explicitly established in the second sentence of the article. Given that, why in the world would you think they're trying to claim the whole movement is a cult?
Obviously you and I have very different definitions of "obvious"
noqc · 1h ago
When I read the article in its entirety, I was pretty disappointed in its top-level introspection.
It seems to not be true, but I still maintain that it was obvious. Sometimes people don't pick the low-hanging fruit.
mcv · 2h ago
In fact, I'd go a step further and note the similarity with organized religion. People have a tendency to organize and dogmatize everything. The problem with religion is rarely the core ideas, but always the desire to use it as a basis for authority, to turn it dogmatic and ultimately form a power structure.
And I say this as a Christian. I often think that becoming a state religion was the worst thing that ever happened to Christianity, or any religion, because then it unavoidably becomes a tool for power and authority.
And doing the same with other ideas or ideologies is no different. Look at what happened to communism, capitalism, or almost any other secular idea you can think of: the moment it becomes established, accepted, and official, the corruption sets in.
johnisgood · 2h ago
I do not see any reason for you to get downvoted.
I agree that the term "rationalist" would appeal to many people, and the obvious need to belong to a group plays a huge role.
noqc · 1h ago
There are a lot of rationalists in this community. Pointing out that the entire thing is a cult attracts downvotes from people who wish to, for instance, avoid being identified with the offshoots.
6177c40f · 4m ago
No, the downvotes are because rationalism isn't a cult and people take offense to being blatantly insulted. This article is about cults that are rationalism-adjacent, it's not claiming that rationalism is itself a cult.
AIPedant · 2h ago
I think I found the problem!
> The rationalist community was drawn together by AI researcher Eliezer Yudkowsky’s blog post series The Sequences, a set of essays about how to think more rationally
I actually don't mind Yudkowsky as an individual - I think he is almost always wrong and undeservedly arrogant, but mostly sincere. Yet treating him as an AI researcher and serious philosopher (as opposed to a sci-fi essayist and self-help writer) is the kind of slippery foundation that less scrupulous people can build cults from. (See also Maharishi Mahesh Yogi and related trends - often it is just a bit of spiritual goofiness, as with David Lynch; sometimes you get a Charles Manson.)
fulafel · 1h ago
How has he fared in the fields of philosophy and AI research in terms of peer review? Is there some kind of roundup or survey on this?
polytely · 2h ago
Don't forget the biggest sci-fi guy turned cult leader of all: L. Ron Hubbard.
AIPedant · 2h ago
I don't think Yudkowsky is at all like L. Ron Hubbard. Hubbard was insane and pure evil. Yudkowsky seems like a decent and basically reasonable guy; he's just kind of a blowhard and he's wrong about the science.
L. Ron Hubbard is more like the Zizians.
pingou · 1h ago
I don't have a horse in this race, but could you provide a few examples where he was wrong?
bglazer · 1h ago
Here's one: Yudkowsky has been confidently asserting (for years) that AI will drive humanity extinct because it will learn how to make nanomachines using "strong" covalent bonds rather than the "weak" van der Waals forces used by biological systems like proteins. I'm certain that knowledgeable biologists/physicists have tried to explain to him why this belief is basically nonsense, but he just keeps repeating it. Heck, there's even a LessWrong post that lays it out quite well [1]. This points to a general disregard for detailed knowledge of existing things and a preference for "first principles" beliefs, no matter how wrong they are.
[1]: https://www.lesswrong.com/posts/8viKzSrYhb6EFk6wg/why-yudkow...
> And what can show up is that some people aren't actually smart enough to form very good conclusions once they start thinking for themselves.
It's mostly just people who aren't very experienced talking about and dealing honestly with their emotions, no?
I mean, suppose someone is busy achieving and, at the same time, proficient in balancing work with emotional life, dealing head-on with interpersonal conflicts, facing change, feeling and acknowledging hurt, knowing their emotional hangups, perhaps seeing a therapist, perhaps even occasionally putting personal needs ahead of career... :)
Tell that person they can get a marginal (or even substantial) improvement from some rationalist cult practice. Their first question is going to be, "What's the catch?" Because at the very least they'll suspect that adjusting their work/life balance will bring a sizeable amount of stress and consequent decrease in their emotional well-being. And if the pitch is that this rationalist practice works equally well at improving emotional well-being, that smells to them. They already know they didn't logic themselves into their current set of emotional issues, and they are highly unlikely to logic themselves out of them. So there's not much value here to offset the creepy vibes of the pitch. (And again-- being in touch with your emotions means quicker and deeper awareness of creepy vibes!)
Now, take a person whose unexplored emotional well-being tacitly depends on achievement. Even a marginal improvement in achievement could bring perceptible positive changes in their holistic selves! And you can step through a well-specified, logical process to achieve change? Sign HN up!
lenerdenator · 2h ago
Because humans like people who promise answers.
andy99 · 2h ago
Boring as it is, this is the answer. It's just more religion.
Church, cult, cult, church. So we'll get bored someplace else every Sunday. Does this really change our everyday lives?
optimalsolver · 2h ago
Funnily enough, the actress who voiced this line is a Scientologist:
I think they were making fun of the "Moonies", so she was probably able to rationalize it. Pretty sure Isaac Hayes quit South Park over their making fun of Scientologists.
ZeroGravitas · 2h ago
I read recently that he suffered a serious medical event around that time and it was actually cult members speaking on his behalf that withdrew him from the show.
I think it was a relative of his claiming this.
wiredfool · 2h ago
It's really worth reading up on the techniques from Large Group Awareness Training so that you can recognize them when they pop up.
Once you see them listed (social pressure, sleep deprivation, control of drinking/bathroom, control of language/terminology, long exhausting activities, financial buy-in, etc.) and see where they've been used in cults and other cult-adjacent things, it's a little bit of a warning signal when you run across them IRL.
saasapologist · 2h ago
I think we've strayed too far from the Aristotelian dynamics of the self.
Outside of sexuality and the proclivities of their leaders, emphasis on physical domination of the self is lacking. The brain runs wild, the spirit remains aimless.
In the Bay, the difference between the somewhat well-adjusted "rationalists" and those very much "in the mush" is whether or not someone tells you they're in SF or "on the Berkeley side of things"
optimalsolver · 2h ago
Pertinent Twitter comment:
"Rationalism is such an insane name for a school of thought. Like calling your ideology correctism or winsargumentism"
IIUC the name in its current sense was sort of an accident. Yudkowsky originally used the term to mean "someone who succeeds at thinking and acting rationally" (so "correctism" or "winsargumentism" would have been about equally good), and then talked about the idea of "aspiring rationalists" as a community narrowly focused on developing a sort of engineering discipline that would study the scientific principles of how to be right in full generality and put them into practice. Then the community grew and mutated into a broader social milieu that was only sort of about that, and people needed a name for it, and "rationalists" was already there, so that became the name through common usage. It definitely has certain awkwardnesses.
hn_throwaway_99 · 2h ago
To be honest I don't understand that objection. If you strip it from all its culty sociological effects, one of the original ideas of rationalism was to try to use logical reasoning and statistical techniques to explicitly avoid the pitfalls of known cognitive biases. Given that foundational tenet, "rationalism" seems like an extremely appropriate moniker.
I fully accept that the rationalist community may have morphed into something far beyond that original tenet, but I think rationalism just describes the approach, not that it's the "one true philosophy".
ameliaquining · 2h ago
That it refers to a different but confusingly related concept in philosophy is a real downside of the name.
nyeah · 2h ago
I'm going to start a group called "Mentally Healthy People". We use data, logical thinking, and informal peer review. If you disagree with us, our first question will be "what's wrong with mental health?"
handoflixue · 1h ago
So... Psychiatry? Do you think psychiatrists are particularly prone to starting cults? Do you think learning about psychiatry makes you at risk for cult-like behavior?
nyeah · 1h ago
No. I have no beef with psychology or psychiatry. They're doing good work as far as I can tell. I am poking fun at people who take "rationality" and turn it into a brand name.
handoflixue · 1h ago
Why is "you can work to avoid cognitive biases" more ridiculous than "you can work to improve your mental health"?
nyeah · 1h ago
I'm feeling a little frustrated by the derail. My complaint is about some small group claiming to have a monopoly on a normal human faculty, in this case rationality. The small group might well go on to claim that people outside the group lack rationality. That would be absurd. The mental health profession does not claim to be immune from mental illness itself, it does not claim that people outside its circle are mentally unhealthy, and it does not claim that its particular treatment is necessary for mental health.
I guess it's possible you might be doing some deep ironic thing by providing a seemingly sincere example of what I'm complaining about. If so it was over my head but in that case I withdraw "derail"!
glenstein · 1h ago
Right and to your point, I would say you can distinguish (1) "objective" in the sense of relying on mind-independent data from (2) absolute knowledge, which treats subjects like closed conversations. And you can make similar caveats for "rational".
You can be rational and objective about a given topic without it meaning that the conversation is closed, or that all knowledge has been found. So I'm certainly not a fan of cult dynamics, but I think it's easy to throw an unfair charge at these groups, that their interest in the topic necessitates an absolutist disposition.
wiredfool · 2h ago
Objectivism?
nyeah · 2h ago
Great names! Are you using them, or are they available?
/s
animal_spirits · 2h ago
> If someone is in a group that is heading towards dysfunctionality, try to maintain your relationship with them; don’t attack them or make them defend the group. Let them have normal conversations with you.
This is such an important skill we should all have. I learned this best from watching the documentary Behind the Curve, about flat earthers, and have applied it to my best friend diving into the Tartarian conspiracy theory.
keybored · 2h ago
Cue all the surface-level “tribalism/loneliness/hooman nature” comments instead of the simple analysis that Rationalism (this kind) is severely brain-broken and irredeemable and will just foster even worse outcomes in a group setting. It’s a bit too close to home (ideologically) to get a somewhat detached analysis.
digbybk · 1h ago
When I was looking for a group in my area to meditate with, it was tough finding one that didn't appear to be a cult. And yet I think Buddhist meditation is the best tool for personal growth humanity has ever devised. Maybe the proliferation of cults is a sign that Yudkowsky was on to something.
ivm · 21m ago
None of them are practicing Buddhist meditation though, same for the "personal growth" oriented meditation styles.
Buddhist meditation exists only in the context of the Four Noble Truths and the rest of the Buddha's Dhamma. Throwing them away means it stops being Buddhist.
digbybk · 9m ago
I disagree, but we'd be arguing semantics. In any case, the point still stands: you can just as easily argue that these rationalist offshoots aren't really Rationalist.
bubblyworld · 2h ago
What is the base rate here? Hard to know the scope of the problem without knowing how many non-rationalists (is that even a coherent group of people?) end up forming weird cults, as a comparison. My impression is that crazy beliefs are common amongst everybody.
A much simpler theory is that rationalists are mostly normal people, and normal people tend to form cults.
glenstein · 2h ago
I was wondering about this too. You could also say it's a Sturgeon's law question.
They do note at the beginning of the article that many, if not most such groups have reasonably normal dynamics, for what it's worth. But I think there's a legitimate question of whether we ought to expect groups centered on rational thinking to be better able to escape group dynamics we associate with irrationality.
rkapsoro · 2h ago
Something like 15 years ago I once went to a Less Wrong/Overcoming Bias meetup in my town after being a reader of Yudkowsky's blog for some years. I was like, Bayesian Conspiracy, cool, right?
The group was weird and involved quite a lot of creepy oversharing. I didn't return.
cjs_ac · 2h ago
Rationalism is the belief that reason is the primary path to knowledge, as opposed to, say, the observation that is championed by empiricism. It's a belief system that prioritises imposing its tenets on reality rather than asking reality what reality's tenets are. From the outset, it's inherently cult-like.
handoflixue · 2h ago
Rationalists, in this case, refers specifically to the community clustered around LessWrong, which explicitly and repeatedly emphasizes points like "you can't claim to have a well grounded belief if you don't actually have empirical evidence for it" (https://www.lesswrong.com/w/evidence for a quick overview of some of the basic posts on that topic)
To quote one of the core foundational articles: "Before you try mapping an unseen territory, pour some water into a cup at room temperature and wait until it spontaneously freezes before proceeding. That way you can be sure the general trick—ignoring infinitesimally tiny probabilities of success—is working properly." (https://www.lesswrong.com/posts/eY45uCCX7DdwJ4Jha/no-one-can...)
One can argue how well the community absorbs the lesson, but this certainly seems to be a much higher standard than average.
Ifkaluva · 2h ago
That is the definition of “rationalism” as proposed by philosophers like Descartes and Kant, but I don’t think that is an accurate representation of the type of “rationalism” this article describes.
This article describes "rationalism" as laid out in LessWrong and the Sequences by Eliezer Yudkowsky. A good amount of it is based on empirical findings from psychology and behavioral science. It's called "rationalism" because it seeks to correct common reasoning heuristics that purportedly lead to incorrect conclusions, not because it stands in contrast to empiricism.
glenstein · 2h ago
Agreed, I appreciate that there's a conceptual distinction between the philosophical versions of rationalism and empiricism, but what's being talked about here is a conception that (again, at least notionally) is interested in and compatible with both.
I am pretty sure many of the LessWrong posts are about how to understand the meaning of different types of data and are very much about examining, developing, criticizing a rich variety of empirical attitudes.
FergusArgyll · 2h ago
I was going to write a similar comment to the OP's, so permit me to defend it:
Many of their "beliefs" - super-duper intelligence, doom - are clearly not believed by the market; observing the market is a kind of empiricism, and it's completely discounted by the LW-ers.
gethly · 2h ago
But you cannot have reason without substantial proof of how things behave, gained by observing them in the first place. Reason is simply a logical approach to yes-and-no questions where you factually know, from observation of past events, how things work. You can then simulate an outcome by applying reasoning to a situation you have not yet observed and come to a logical conclusion, given the set of rules and presumptions.
dkarl · 2h ago
Isn't this entirely to be expected? The people who dominate groups like these are the ones who put the most time and effort into them, and no sane person who appreciates both the value and the limitations of rational thinking is going to see as much value in a rationalist group, or devote as much time to it, as the kind of people who are attracted to the cultish aspect of achieving truth and power through pure thought. There's far more value there if you're looking to indulge in, or exploit, a cult-like spiral into shared fantasy than if you're just looking to sharpen your logical reasoning.
scythe · 17m ago
One of the hallmarks of cults — if not a necessary element — is that they tend to separate their members from the outside society. Rationalism doesn't directly encourage this, but it does facilitate it in a couple of ways:
- Idiosyncratic language used to describe ordinary things ("lightcone" instead of "future", "prior" instead of "belief" or "prejudice", etc)
- Disdain for competing belief systems
- Insistence on a certain shared interpretation of things most people don't care about (the "many-worlds interpretation" of quantum uncertainty, self-improving artificial intelligence, veganism, etc)
- I'm pretty sure polyamory makes the list somehow, just because it isn't how the vast majority of people want to date. In principle it's a private lifestyle choice, but it's obviously a community value here.
So this creates an opportunity for cult-like dynamics to occur where people adjust themselves according to their interactions within the community but not interactions outside the community. And this could seem — to the members — like the beliefs themselves are the problem, but from a sociological perspective, it might really be the inflexible way they diverge from mainstream society.
Isamu · 2h ago
So I like Steven Pinker’s book Rationality, to me it seems quite straightforward.
But I have never been able to get into the Rationalist stuff, to me it’s all very meandering and peripheral and focused on… I don’t know what.
Is it just me?
ameliaquining · 2h ago
Depends very much on what you're hoping to get out of it. There isn't really one "rationalist" thing at this point, it's now a whole bunch of adjacent social groups with overlapping-but-distinct goals and interests.
handoflixue · 1h ago
https://www.lesswrong.com/highlights this is the ostensible "Core Highlights", curated by major members of the community, and I believe Eliezer would endorse it.
If you don't get anything out of reading the list itself, then you're probably not going to get anything out of the rest of the community either.
If you poke around and find a few neat ideas there, you'll probably find a few other neat ideas.
For some people, though, this is "wait, holy shit, you can just DO that? And it WORKS?", in which case probably read all of this but then also go find a few other sources to counter-balance it.
(In particular, probably 90% of the useful insights already exist elsewhere in philosophy, and often more rigorously discussed - LessWrong will teach you the skeleton, the general sense of "what rationality can do", but you need to go elsewhere if you want to actually build up the muscles)
zzzeek · 2h ago
because humans are biological creatures iterating through complex chemical processes that are attempting to allow a large organism to survive and reproduce within the specific ecosystem provided by the Earth in the present day. "Rational reasoning" is a quaint side effect that sometimes is emergent from the nervous system of these organisms, but it's nothing more than that. It's normal that the surviving/reproducing organism's emergent side effect of "rational thought", when it is particularly intense, will self-refer to the organism and act as though it has some kind of dominion over the organism itself, but this is, like the rationalism itself, just an emergent effect that is accidental and transient. Same as if you see a cloud that looks like an elephant (it's still just a cloud).
thrance · 2h ago
Reminds me somewhat of the Culte de la Raison (Cult of Reason) birthed by the French Revolution. It didn't last long.
Why are there so many cults? People want to feel like they belong to something, and in a world in the midst of a loneliness and isolation epidemic the market conditions are ideal for cults.
FuriouslyAdrift · 2h ago
Because we are currently living in an age of narcissism, and tribalism/identitarianism is the societal version of narcissism.
khazhoux · 10m ago
> Because we are currently living in an age of narcissism and tribalism
I've been saying this since at least 1200 BC!
ameliaquining · 2h ago
The question the article is asking is "why did so many cults come out of this particular social milieu", not "why are there a lot of cults in the whole world".
iwontberude · 2h ago
Your profile says that you want to keep your identity small, but you have like over 30 thousand comments spelling out exactly who you are and how you think. Why not shard accounts? Anyways. Just a random thought.
keybored · 2h ago
[deleted]
shadowgovt · 2h ago
"SC identity?"
shadowgovt · 2h ago
The book Imagined Communities (Benedict Anderson) touches on this, making the case that in modern times, "nation" has replaced the cultural narrative purpose previously held by "tribe," "village," "royal subject," or "religion."
The shared thread among these is (in ever widening circles) a story people tell themselves to justify precisely why, for example, the actions of someone you'll never meet in Tulsa, OK have any bearing whatsoever on the fate of you, a person in Lincoln, NE.
One can see how this leaves an individual in a tenuous place if one doesn't feel particularly connected to nationhood (one can also see how being too connected to nationhood, in an exclusionary way, can also have deleterious consequences, and how not unlike differing forms of Christianity, differing concepts on what the 'soul' of a nation is can foment internal strife).
(To be clear: those fates are intertwined to some extent; the world we live in grows ever smaller due to the power of up-scaled influence of action granted by technology. But "nation" is a sort of fiction we tell ourselves to fit all that complexity into the slippery meat between human ears).
mindslight · 2h ago
Also, who would want to join an "irrationalist cult"?
shadowgovt · 2h ago
Hey now, the Discordians have an ancient and respectable tradition. ;)
NoGravitas · 1h ago
Five tons of flax!
iwontberude · 2h ago
They watched too much eXistenZ
the_third_wave · 2h ago
God is dead! God remains dead! And we have killed him! How shall we comfort ourselves, the murderers of all murderers? What was holiest and mightiest of all that the world has yet owned has bled to death under our knives.
The average teenager who reads Nietzsche's proclamation on the death of God thinks of it as an accomplishment: finally we got rid of those thousands-of-years-old and thereby severely outdated ideas and rules. Somewhere along the march to maturity they may start to wonder whether what replaced those old rules and ideas was a good replacement, but most of them never come to the realisation that there were rebellious teenagers during all those centuries when the idea of a supreme being, to which or whom even the mightiest had to answer, still held sway. Nietzsche saw the peril in letting go of that cultural safety valve and warned of what might come next.
We are currently living in the world he warned us about and for that I, atheist as I am, am partly responsible. The question to be answered here is whether it is possible to regain the benefits of the old order without getting back the obvious excesses, the abuse, the sanctimoniousness and all the other abuses of power and privilege which were responsible for turning people away from that path.
amiga386 · 2h ago
See also Rational Magic: Why a Silicon Valley culture that was once obsessed with reason is going woo (2023)
Quite possibly, places like Reddit and Hacker News are training for the required level of intellectual smugness, and certitude that you can dismiss every annoying argument with a logical fallacy.
That sounds smug of me, but I’m actually serious. One of their defects is that once you memorize all the fallacies (“Appeal to authority,” “Ad hominem,”) you can easily reach the point where you more easily recognize the fallacies in everyone else’s arguments than in your own. You more easily doubt other people’s cited authorities than your own. You slap “appeal to authority” on a disliked opinion, while citing an authority next week for your own. It’s a fast path from there to perceived intellectual superiority, and an even faster path from there into delusion. Rational delusion.
sunshowers · 1h ago
While deployment of logical fallacies to win arguments is annoying at best, the far bigger problem is that people make those fallacies in the first place — such as not considering base rates.
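A minimal sketch of the base-rate trap, with invented numbers: a "90% accurate" signal about a rare condition still leaves the claim probably false, because the prior dominates.

    # Invented numbers: a signal that's right 90% of the time about a
    # condition with a 1% base rate.
    p_cond = 0.01            # prior (base rate)
    p_pos_given_cond = 0.90  # true positive rate
    p_pos_given_not = 0.10   # false positive rate

    # Total probability of seeing a positive signal, then Bayes' theorem.
    p_pos = p_pos_given_cond * p_cond + p_pos_given_not * (1 - p_cond)
    posterior = p_pos_given_cond * p_cond / p_pos

    print(f"P(condition | positive signal) = {posterior:.1%}")  # ~8.3%

The intuitive answer is 90%; the correct one is under 10%. That gap is the whole fallacy.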
shadowgovt · 2h ago
It's generally worth remembering that some of the fallacies are actually structural, and some are rhetorical.
A contradiction creates a structural fallacy; if you find one, it's a fair belief that at least one of the supporting claims is false. In contrast, appeal to authority is probabilistic: we don't know, given the current context, if the authority is right, so they might be wrong... But we don't have time to read the universe into this situation so an appeal to authority is better than nothing.
... and this observation should be coupled with the observation that the school of rhetoric wasn't teaching a method for finding truth; it was teaching a method for beating an opponent in a legal argument. "Appeal to authority is a logical fallacy" is a great sword to bring to bear if your goal is to turn off the audience's ability to ask whether we should give the word of the environmental scientist and the washed-up TV actor equal weight on the topic of environmental science...
gjsman-1000 · 2h ago
… however, even that is up for debate. Maybe the TV actor in your example is Al Gore filming An Inconvenient Truth, and the environmental scientist is in the minority which isn’t so afraid of climate change. Fast forward to 2025: the scientist’s minority position was wrong, while Al Gore’s documentary was legally ruled to have 9 major errors; so you’d have been wrong on both sides, with the TV actor being closer.
incomingpain · 2h ago
We live in an irrational time. It's unclear if irrationality was simply underreported in history or if social changes in the last ~50-75 years have had breaking consequences.
People are trying to make sense of this. For examples.
The Canadian government heavily subsidizes junk food, then spends heavily on healthcare because of the resulting illnesses. It restricts and limits healthy food through supply management and promotes a “food pyramid” favoring domestic unhealthy food. Meanwhile, it spends billions marketing healthy living, yet fines people up to $25,000 for hiking in forests and zones cities so driving is nearly mandatory.
Government is an easy target for irrational behaviours.
codr7 · 2h ago
There's nothing irrational about it, this is how you maximize power and profit at any and all costs.
incomingpain · 2h ago
I completely get that point of view; and yes if that's the goal, it's completely rational.
But from a societal cohesion or perhaps even an ethical point of view it's just pure irrationality.
When typing the post, I was thinking of the different levels of government and the changing ideologies of politicians leaving inconsistent governance.
watwut · 2h ago
Scientology has been here since 1953 and it has a similarly bonkers set of beliefs. And it is huge.
Your rant about government or not being allowed to hike in some places in Canada is unrelated to the issue.
os2warpman · 1h ago
Rationalists are, to a man (and they’re almost all men), arrogant dickheads, and arrogant dickheads do not see what they’re doing as “a cult” but as “the right and proper way of things, because I am right and logical and rational and everyone else isn’t”.
Are you certain about this?
https://archive.org/details/goblinsoflabyrin0000frou/page/10...
EST-type training still exists today. You don't eat until the end of the whole weekend, or maybe you get rice and little else. Everyone is told to insult you day one until you cry. Then day two, still having not eaten, they build you up and tell you how great you are, and have a group hug. Then they ask you how great you feel. Isn't this a good feeling? Don't you want your loved ones to have this feeling? Still having not eaten, you're then encouraged to pay for your family and friends to do the training, without their knowledge or consent.
A friend of mine did this training after his brother paid for his mom to do it, and she paid for him to do it. Let's just say that, though they felt it changed their lives at the time, their lives in no way, shape, or form changed. Two are in quite a bad place, in fact...
Anyway, point is, the people who invented everything we are using right now were also susceptible to cult-like groups with silly ideas and shady intentions.
All of it has the appearance of sounding so smart, and a few sites were genuine. But it got taken over.
A significant part of this is the intersection of the cult with money and status - this stuff really took off once prominent SV personalities became associated with it, and got turbocharged when it started intersecting with the angel/incubator/VC scene, when there was implicit money involved.
It's unusually successful because -- for a time at least -- there was status (and maybe money) in carrying water for it.
What the actual f. This is such an insane thing to read and take in that I might need to go and sit in silence for the rest of the day.
How did we get to this place with people going completely nuts like this?
See for example "Reality Distortion Field": https://en.wikipedia.org/wiki/Reality_distortion_field
It's nuts.
I know the Church of Scientology wants you to crit that roll of tithing.
I shouldn't LOL at this but I must. We're all gonna die in these terrible times but at least we'll LOL at the madness and stupidity of it all.
For those not familiar enough with the Bible to know what to look for to find the wild stuff, look up the story of Elisha summoning bears out of the forest to maul children for calling him bald, or the last two chapters of Daniel (which I think are only in the Catholic Bible) where he literally blows up a dragon by feeding it a cake.
[1]: https://en.wikipedia.org/wiki/Real_presence_of_Christ_in_the...
"Then Daniel took pitch, and fat, and hair, and did seethe them together, and made lumps thereof: this he put in the dragon's mouth, and so the dragon burst in sunder: and Daniel said, Lo, these are the gods ye worship."
Some percentage of the population has a lesser need for a belief system (supernatural, ad hoc, or anything else) but in general, most humans appear to be hardcoded for this need and the overlap doesn't align strictly with atheism. For the atheist with a deep need for something to believe in, the results can be ugly. Though far from perfect, organized religions tend to weed out their most destructive beliefs or end up getting squashed by adherents of other belief systems that are less internally destructive.
Crazy people have always existed (especially cults), but I'd argue recruitment numbers are through the roof thanks to technology and a failing economic environment (instability makes people rationalize crazy behavior).
It's not that those groups didn't have visibility before, it's just easier for the people who share the same...interests...to cloister together on an international scale.
https://www.goodreads.com/quotes/366635-there-are-two-novels...
Very similar to my childhood religion. "We have figured everything out and everyone else is wrong for not figuring things out".
Rationalism seems like a giant castle built on sand. They just keep accruing premises without ever going backwards to see if those premises make sense. A good example of this is their notions of "information hazards".
There are at least a dozen I can think of, including the ‘drink the Kool-Aid’ Jonestown massacre.
People be crazy, yo.
Which actually kinda existed/exists too? [https://en.m.wikipedia.org/wiki/Nichirenism], right down to an attempted coup and a bunch of assassinations [https://en.m.wikipedia.org/wiki/League_of_Blood_Incident].
Now you know. People be whack.
> Heaven's Gate Heaven's Gate Heaven's Gate Heaven's Gate Heaven's Gate Heaven's Gate Heaven's Gate Heaven's Gate ufo ufo ufo ufo ufo ufo ufo ufo ufo ufo ufo ufo space alien space alien space alien space alien space alien space alien space alien space alien space alien space alien space alien space alien extraterrestrial extraterrestrial extraterrestrial extraterrestrial extraterrestrial extraterrestrial extraterrestrial extraterrestrial extraterrestrial extraterrestrial extraterrestrial extraterrestrial extraterrestrial extraterrestrial misinformation misinformation misinformation misinformation misinformation misinformation misinformation misinformation misinformation misinformation misinformation misinformation freedom freedom freedom freedom freedom freedom freedom freedom freedom freedom freedom freedom second coming second coming second coming second coming second coming second coming second coming second coming second coming second coming angels angels angels angels angels angels angels angels angels angels end end times times end times end times end times end times end times end times end times end times end times Key Words: (for search engines) 144,000, Abductees, Agnostic, Alien, Allah, Alternative, Angels, Antichrist, Apocalypse, Armageddon, Ascension, Atheist, Awakening, Away Team, Beyond Human, Blasphemy, Boddhisattva, Book of Revelation, Buddha, Channeling, Children of God, Christ, Christ's Teachings, Consciousness, Contactees, Corruption, Creation, Death, Discarnate, Discarnates, Disciple, Disciples, Disinformation, Dying, Ecumenical, End of the Age, End of the World, Eternal Life, Eunuch, Evolution, Evolutionary, Extraterrestrial, Freedom, Fulfilling Prophecy, Genderless, Glorified Body, God, God's Children, God's Chosen, God's Heaven, God's Laws, God's Son, Guru, Harvest Time, He's Back, Heaven, Heaven's Gate, Heavenly Kingdom, Higher Consciousness, His Church, Human Metamorphosis, Human Spirit, Implant, Incarnation, Interfaith, Jesus, Jesus' Return, Jesus' Teaching, Kingdom of God, Kingdom of Heaven, Krishna Consciousness, Lamb of God, Last Days, Level Above Human, Life After Death, Luciferian, Luciferians, Meditation, Members of the Next Level, Messiah, Metamorphosis, Metaphysical, Millennium, Misinformation, Mothership, Mystic, Next Level, Non Perishable, Non Temporal, Older Member, Our Lords Return, Out of Body Experience, Overcomers, Overcoming, Past Lives, Prophecy, Prophecy Fulfillment, Rapture, Reactive Mind, Recycling the Planet, Reincarnation, Religion, Resurrection, Revelations, Saved, Second Coming, Soul, Space Alien, Spacecraft, Spirit, Spirit Filled, Spirit Guide, Spiritual, Spiritual Awakening, Star People, Super Natural, Telepathy, The Remnant, The Two, Theosophy, Ti and Do, Truth, Two Witnesses, UFO, Virginity, Walk-ins, Yahweh, Yeshua, Yoda, Yoga,
Like, Christians have always been very flexible about following the Ten Commandments.
Such a setting would seem like the perfect backdrop for a cult that claims "we have the power to subtly influence reality and make improbable things (ie. "magic") occur".
But, fwiw, that particular role-playing game was very much based on occult beliefs trendy at the time, like chaos magic, so it's not completely off the wall.
Narcissists tend to believe that they are always right, no matter what the topic is, or how knowledgeable they are. This makes them speak with confidence and conviction.
Some people are very drawn to confident people.
If the cult leader has other mental health issues, it can/will seep into their rhetoric. Combine that with unwavering support from loyal followers that will take everything they say as gospel...
That's about it.
Outside of those, the cult dynamics are cut-paste, and always involve an entitled narcissistic cult leader acquiring as much attention/praise, sex, money, and power as possible from the abuse and exploitation of followers.
Most religion works like this. Most alternative spirituality works like this. Most finance works like this. Most corporate culture works like this. Most politics works like this.
Most science works like this. (It shouldn't, but the number of abused and exploited PhD students and post-docs is very much not zero.)
The only variables are the differing proportions of attention/praise, sex, money, and power available to leaders, and the amount of abuse that can be delivered to those lower down and/or outside the hierarchy.
The hierarchy and the realities of exploitation and abuse are a constant.
If you removed this dynamic from contemporary culture there wouldn't be a lot left.
Fortunately quite a lot of good things happen in spite of it. But a lot more would happen if it wasn't foundational.
Ayahuasca?
But looking into the underlying Western Esoteric Spirit Science, 'Anthroposophy' (because Theosophy wouldn't let him get weird enough) by Rudolf Steiner, has been quite a ride. The point being that humans have a pretty endless capacity to go ALL IN on REALLY WEIRD shit, as long as it promises to fix their lives if they do everything they're told. Naturally, if their lives aren't fixed, then they did it wrong or have karmic debt to pay down, so YMMV.
In any case, I'm considering the latent woo-cult atmosphere as a test of the skeptical inoculation that I've tried to raise my child with.
If you’re talking about grade school, interview whoever is gonna be your kid’s teacher for the next X years and make sure they seem sane. If you’re talking about high school, give a really critical look at the class schedule.
Waldorf schools can vary a lot in this regard so you may not encounter the same problems I did, but it’s good to be cautious.
I think people are going nuts because we've drifted from the dock of a stable civilization. Institutions are a mess. Economy is a mess. Combine all of that together with the advent of social media making the creation of echo chambers (and the inevitable narcissism of "leaders" in those echo chambers) effortless and ~15 years later, we have this.
I do not think this cult dogma is any more out there than other cult dogma I have heard, but the above quote makes me think it is easier to found cults in the modern day in some ways, since you can steal complex world-building from numerous sources rather than building it yourself and keeping everything straight.
I'm a staunch atheist and I feel the pull all the time.
On the other hand, there's a whole other side: a few nutjobs who really behave like cult leaders. They believe their own bullshit and over time manage to find a lot of "followers" in this community; since one of the foundational aspects is radical acceptance, it becomes very easy to be nutty and not be questioned (unless you do something egregious).
Human brains are lazy Bayesian engines. In uncertainty, we grasp for simple, all-explaining models (heuristics). Mage provides this: a complete ontology where magic equals psychology/quantum woo, reality is malleable, and the camp leaders are the enlightened "tradition." This offers relief from the exhausting ambiguity of real life. Dill didn't invent this; he plugged into the ancient human craving for a map that makes the world feel navigable and controllable. The "rationalist" veneer is pure camouflage. It feels like critical thinking but is actually pseudo-intellectual cargo culting. This isn't Burning Man's fault. It's the latest step of a 2,500-year-old playbook. The Gnostics and the Hermeticists provided ancient frameworks where secret knowledge ("gnosis") granted power over reality, accessible only through a guru. Mage directly borrows from this lineage (The Technocracy, The Traditions). Dill positioned himself as the modern "Ascended Master" dispensing this gnosis.
The 20th century cults Synanon, EST, Moonies, NXIVM all followed similar patterns, starting with isolation. Burning Man's temporary city is the perfect isolation chamber. It's physically remote, temporally bounded (a "liminal space"), fostering dependence on the camp. Initial overwhelming acceptance and belonging (the "Burning Man hug"), then slowly increasing demands (time, money, emotional disclosure, sexual access), framed as "spiritual growth" or "breaking through barriers" (directly lifted from Mage's "Paradigm Shifts" and "Quintessence"). Control language ("sleeper," "muggle," "Awakened"), redefining reality ("that rape wasn't really rape, it was a necessary 'Paradox' to break your illusions"), demanding confession of "sins" (past traumas, doubts), creating dependency on the leader for "truth."
Burning Man attracts people seeking transformation, often carrying unresolved pain. Cults prey on this vulnerability. Dill allegedly targeted individuals with trauma histories. Trauma creates cognitive dissonance and a desperate need for resolution. The cult's narrative (Mage's framework + Dill's interpretation) offers a simple explanation for their pain ("you're unAwakened," "you have Paradox blocking you") and a path out ("submit to me, undergo these rituals"). This isn't therapy; it's trauma bonding weaponized. The alleged rape wasn't an aberration; it was likely part of the control mechanism. It's a "shock" to induce dependency and reframe the victim's reality ("this pain is necessary enlightenment"). People are adrift in ontological insecurity (fear about the fundamental nature of reality and self). Mage offers a new grand narrative with clear heroes (Awakened), villains (sleepers, Technocracy), and a path (Ascension).
Use your mind to control reality, reality fights back with paradox; it's cool for a teenager, but read a bit more fantasy and you'll definitely find cooler stuff. But I guess for you to join a cult your mind must stay a teen mind forever.
All of the World Of Darkness and Chronicles Of Darkness games are basically about coming of age/puberty. Like X-Men but for Goth-Nerds instead of Geek-Nerds.
In Vampire, your body is going through weird changes and you are starting to develop, physically and/or mentally, while realising that the world is run by a bunch of old, evil fools who still expect you to toe the line and stay in your place, but you are starting to wonder if the world wouldn't be better if your generation overthrew them and took over running the world, doing it the right way. And there are all these bad elements trying to convince you that you should do just that, but for the sake of mindless violence and raucous partying. Teenager - the RPG.
In Werewolf, your body is going through weird changes and you are starting to develop, physically and mentally, while realising that you are not a part of the "normal" crowd that the rest of Humanity belongs to. You are different and they just can't handle that whenever it gets revealed. Luckily, there are small communities of people like you out there who take you in and show you how use the power of your "true" self. Of course, even among this community, there are different types of other. LGBT Teenager - the RPG
In Mage, you have begun to take an interest in the real world, and you think you know what the world is really like. The people all around you are just sleep-walking through life, because they don't really get it. This understanding sets you against the people who run the world: the governments and the corporations, trying to stop these sleeper from waking up to the truth and rejecting their comforting lies. You have found some other people who saw through them, and you think they've got a lot of things wrong, but at least they're awake to the lies! Rebellious Teenager - the RPG
Twist: we’re sleepwalking through life because we really DO get it.
(Source: I’m 56)
I had friends who were into Vampire growing up. I hadn’t heard of Werewolf until after the aforementioned book came out and people started going nuts for it. I mentioned to my wife at the time that there was this game called “Vampire” and told her about it and she just laughed, pointed to the book, and said “this is so much better”. :shrug:
Rewind back and there were the Star Wars kids. Fast forward and there are the Harry Potter kids/adults. Each generation has their own “thing”. During that time, it was Quake MSDOS and Vampire. Oh and we started Senior Assassinations. 90s super soakers were the real deal.
many such cases
(One of my favorite TED talks was about a failed experiment in introducing traditional Western agriculture to a people in Zambia. It turns out when you concentrate too much food in one place, the hippos come and eat it all and people can't actually out-fight hippos in large numbers. In hindsight, the people running the program should have asked how likely it was that folks in a region that had exposure to other people's agriculture for thousands of years, hadn't ever, you know... tried it. https://www.ted.com/talks/ernesto_sirolli_want_to_help_someo...)
I see this arrogant attitude all the time on HN: reflexive distrust of the "mainstream media" and "scientific experts". Critical thinking is a very healthy idea, but it's dangerous when people use it as a license to categorically reject sources. It's even worse when extremely powerful people do this; they can reduce an enormous sub-network of thought into a single node for many, many people.
So, my answer for "Why Are There So Many Rationalist Cults?" is the same reason all cults exist: humans like to feel like they're in on the secret. We like to be in secret clubs.
Well, it turns out that intuition and long-lived cultural norms often have rational justifications, but individuals may not know what they are, and norms/intuitions provide useful antibodies against narcissist would-be cult leaders.
Can you find the "rational" justification not to isolate yourself from non-Rationalists, not to live with them in a polycule, and not to take a bunch of psychedelic drugs with them? If you can't solve that puzzle, you're in danger of letting the group take advantage of you.
And the crazy thing is, none of that is fundamentally opposed to rationalism. You can be a rationalist who ascribes value to gut instinct and societal norms. Those are the product of millions of years of pre-training.
I have spent a fair bit of time thinking about the meaning of life. And my conclusions have been pretty crazy. But they sound insane, so until I figure out why they sound insane, I'm not acting on those conclusions. And I'm definitely not surrounding myself with people who take those conclusions seriously.
Specifically, rationalism spends a lot of time talking about priors, but a sneaky thing happens that I call the 'double update'.
Bayesian updating works when you update your genuine prior belief with new evidence. No one disagrees with this, and sometimes it's easy and sometimes it's difficult to do.
What Rationalists often end up doing is relaxing their priors - intuition, personal experience, cultural norms - and then updating. They think of this as one update, but it is really two. The first update, relaxing priors, isn't associated with evidence; it's part of the community norms. There is an implicit belief that by relaxing one's priors you're more open to reality. The real result, though, is that it sends people wildly off course. Case in point: all the cults.
Consider the pre-tipped scale. You suspect the scale reads a little low, so before weighing you tilt it slightly to "correct" for that bias. Then you pour in flour until the dial says you've hit the target weight. You’ve followed the numbers exactly, but because you started from a tipped scale, you've ended up with twice the flour the recipe called for.
Trying to correct for bias by relaxing priors is an update made without evidence, done just because everyone else is doing it. Updates should be driven by evidence alone.
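A minimal numerical sketch of the double update (all numbers invented): relaxing the prior toward 50/50 before applying the very same evidence lands you somewhere a single honest update never would.

    def update(prior, likelihood_ratio):
        """One Bayesian update: move P(H) by the ratio P(E|H)/P(E|~H)."""
        odds = prior / (1 - prior) * likelihood_ratio
        return odds / (1 + odds)

    prior = 0.01  # genuine prior: the claim is probably false
    lr = 4.0      # modestly supportive evidence

    honest = update(prior, lr)   # one update, on evidence
    relaxed = update(0.5, lr)    # prior first "relaxed" to 50/50, no evidence

    print(f"single update: {honest:.1%}")   # ~3.9%
    print(f"double update: {relaxed:.1%}")  # 80.0%

Same evidence, wildly different conclusions; the difference is entirely the evidence-free first move.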
The game as it is _actually_ played is that you use rationalist arguments to justify your pre-existing gut intuitions and personal biases.
I guess I'm a radical skeptic, secular humanist, utilitarianish sort of guy, but I'm not dumb enough to think throwing around the words "bayesian prior" and "posterior distribution" makes actually figuring out how something works, or predicting the outcome of an intervention, easy or certain. I've had a lot of life at this point and gotten to some level of mastery at a few things, and my main conclusion is that most of the time it's just hard to know stuff, and that the single most common cognitive mistake people make is too much certainty.
There's a point where more passive thinking stops adding value and starts subtracting sanity. It's pretty easy to get to that point. We've all done it.
This is a common sentiment but is probably not entirely true. A great example is cosmology. Yes, more data would make some work easier, but astrophysicists and cosmologists have shown that you can gather and combine existing data and look at it in novel new ways to produce unexpected results, like place bounds that can include/exclude various theories.
I think a philosophy that encourages more analysis rather than sitting back on our laurels with an excuse that we need more data is good, as long as it's done transparently and honestly.
If you are talking about cosmology? Yea, you can look at existing data in new ways, cause you probably have enough data to do that safely.
If you are looking at human psychology? Looking at existing data in new ways is essentially p-hacking. And you probably won’t ever have enough data to define a “universal theory of the human mind”.
The qualifier "normally" already covers "not entirely true". Of course it's not entirely true. It's mostly true for us now. (In fact twenty years ago we used more numerical models than we do now, because we were facing more unsolved problems where the solution was pretty well knowable just by doing more complicated calculations, but without taking more data. Back then, when people started taking lots of data, it was often a total waste of time. But right now, most of those problems seem to be solved. We're facing different problems that seem much harder to model, so we rely more on data. This stage won't be permanent either.)
It's not a sentiment, it's a reality that we have to deal with.
And I think you missed the main point of my reply: that people often think we need more data, but cleverness and ingenuity can often find a way to make meaningful progress with existing data. Obviously I can't make any definitive judgment about your specific case, but I'm skeptical of any claim that it's out of the realm of possibility that some genius like Einstein analyzed your problem could get no further than you have.
(I call it neorationalism because it is philosophically unrelated to the more traditional rationalism of Spinoza and Descartes.)
The first is diffusion of power. Social media is powered by charisma, and while it is certainly true that personality-based cults are nothing new, the internet makes it way easier to form one. Contrast that with academic philosophy. People can have their own little fiefdoms, and there is certainly abuse of power, but rarely concentrated in such a way that you see within rationalist communities.
The second (and more idealistic) is that the discipline of Philosophy is rooted in the Platonic/Socratic notion that "I know that I know nothing." People in academic philosophy are on the whole happy to provide a gloss on a gloss on some important thinker, or some kind of incremental improvement over somebody else's theory. This makes it extremely boring, and yet, not nearly as susceptible to delusions of grandeur. True skepticism has to start with questioning one's self, but everybody seems to skip that part and go right to questioning everybody else.
Rationalists have basically reinvented academic philosophy from the ground up with none of the rigor, self-discipline, or joy. They mostly seem to dedicate their time to providing post-hoc justifications for the most banal unquestioned assumptions of their subset of contemporary society.
Taking academic philosophy seriously, at least as an historical phenomenon, would require being educated in the humanities, which is unpopular and low-status among Rationalists.
Clearly all of these groups that believe in demons or realities dictated by tabletop games are not what third parties would call Rationalist. They might call themselves that.
There are some pretty simple tests that can out these groups as not rational. None of these people have ever seen a demon, so world models including demons have never predicted any of their sense data. I doubt these people would be willing to make any bets about when or if a demon will show up. Many of us would be glad to make a market concerning predictions made by tabletop games about physical phenomena.
Which, to me, raises the fascinating question of what does a "good" version look like, of groups and group dynamics centered around a shared interest in best practices associated with critical thinking?
At a first impression, I think maybe these virtues (which are real!) disappear into the background of other, more applied specializations, whether professions, hobbies, backyard family barbecues.
There's a reason they call themselves "rationalists" instead of empiricists or positivists. They perfectly inverted Hume ("reason is, and ought only to be the slave of the passions")
These kinds of harebrained views aren't an accident but a product of rationalism. The idea that the intellect is quasi-infinite and that the world can be mirrored in the mind doesn't contradict rationalism; it is the most extreme form of rationalism, taken to its conclusion, and of course deeply religious, hence the constant fantasies about AI divinities and singularities.
The article begins by saying the rationalist community was "drawn together by AI researcher Eliezer Yudkowsky’s blog post series The Sequences". Obviously the article intends to make the case that this is a cult, but it's already done with the argument at this point.
This is the Internet, you're allowed to say "they are obsessed with unlimited drugs and weird sex things, far beyond what even the generally liberal society tolerates".
I'm increasingly convinced that every other part of "Rationalism" is just distraction or justification for those; certainly there's a conscious decision to minimize talking about this part on the Internet.
The author is a self-identified rationalist. This is explicitly established in the second sentence of the article. Given that, why in the world would you think they're trying to claim the whole movement is a cult?
Obviously you and I have very different definitions of "obvious"
It seems to not be true, but I still maintain that it was obvious. Sometimes people don't pick the low-hanging fruit.
And I say this as a Christian. I often think that becoming a state religion was the worst thing that ever happened to Christianity, or any religion, because then it unavoidably becomes a tool for power and authority.
And doing the same with other ideas or ideologies is no different. Look at what happened to communism, capitalism, or almost any other secular idea you can think of: the moment it becomes established, accepted, and official, the corruption sets in.
I agree that the term "rationalist" would appeal to many people, and the obvious need to belong to a group plays a huge role.
L. Ron Hubbard is more like the Zizians.
[1] https://www.lesswrong.com/posts/8viKzSrYhb6EFk6wg/why-yudkow...
It's mostly just people who aren't very experienced talking about and dealing honestly with their emotions, no?
I mean, suppose someone is busy achieving and, at the same time, proficient in balancing work with emotional life, dealing head-on with interpersonal conflicts, facing change, feeling and acknowledging hurt, knowing their emotional hangups, perhaps seeing a therapist, perhaps even occasionally putting personal needs ahead of career... :)
Tell that person they can get a marginal (or even substantial) improvement from some rationalist cult practice. Their first question is going to be, "What's the catch?" Because at the very least they'll suspect that adjusting their work/life balance will bring a sizeable amount of stress and consequent decrease in their emotional well-being. And if the pitch is that this rationalist practice works equally well at improving emotional well-being, that smells to them. They already know they didn't logic themselves into their current set of emotional issues, and they are highly unlikely to logic themselves out of them. So there's not much value here to offset the creepy vibes of the pitch. (And again-- being in touch with your emotions means quicker and deeper awareness of creepy vibes!)
Now, take a person whose unexplored emotional well-being tacitly depends on achievement. Even a marginal improvement in achievement could bring perceptible positive changes in their holistic selves! And you can step through a well-specified, logical process to achieve change? Sign HN up!
https://en.wikipedia.org/wiki/Nancy_Cartwright#Personal_life
I think it was a relative of his claiming this.
Once you see them listed (social pressure, sleep deprivation, control of drinking/bathroom, control of language/terminology, long exhausting activities, financial buy-in, etc.) and see where they've been used in cults and other cult-adjacent things, it's a little bit of a warning signal when you run across them IRL.
Outside of sexuality and the proclivities of their leaders, emphasis on physical domination of the self is lacking. The brain runs wild, the spirit remains aimless.
In the Bay, the difference between the somewhat well-adjusted "rationalists" and those very much "in the mush" is whether or not someone tells you they're in SF or "on the Berkeley side of things"
"Rationalism is such an insane name for a school of thought. Like calling your ideology correctism or winsargumentism"
https://x.com/growing_daniel/status/1893554844725616666
I fully accept that the rationalist community may have morphed into something far beyond that original tenet, but I think rationalism just describes the approach, not that it's the "one true philosophy".
I guess it's possible you might be doing some deep ironic thing by providing a seemingly sincere example of what I'm complaining about. If so it was over my head but in that case I withdraw "derail"!
You can be rational and objective about a given topic without it meaning that the conversation is closed, or that all knowledge has been found. So I'm certainly not a fan of cult dynamics, but I think it's easy to throw an unfair charge at these groups, that their interest in the topic necessitates an absolutist disposition.
This is such an important skill we should all have. I learned this best from watching the documentary Behind the Curve, about flat earthers, and have applied it to my best friend diving into the Tartarian conspiracy theory.
Buddhist meditation exists only in the context of the Four Noble Truths and the rest of the Buddha's Dhamma. Throwing them away means it stops being Buddhist.
A much simpler theory is that rationalists are mostly normal people, and normal people tend to form cults.
They do note at the beginning of the article that many, if not most such groups have reasonably normal dynamics, for what it's worth. But I think there's a legitimate question of whether we ought to expect groups centered on rational thinking to be better able to escape group dynamics we associate with irrationality.
The group was weird and involved quite a lot of creepy oversharing. I didn't return.
To quote one of the core foundational articles: "Before you try mapping an unseen territory, pour some water into a cup at room temperature and wait until it spontaneously freezes before proceeding. That way you can be sure the general trick—ignoring infinitesimally tiny probabilities of success—is working properly." (https://www.lesswrong.com/posts/eY45uCCX7DdwJ4Jha/no-one-can...)
One can argue how well the community absorbs the lesson, but this certainly seems to be a much higher standard than average.
This article describes the "rationalism" of LessWrong and Eliezer Yudkowsky's Sequences. A good amount of it is based on empirical findings from psychology and behavioral science. It's called "rationalism" because it seeks to correct common reasoning heuristics that purportedly lead to incorrect conclusions, not because it stands in contrast to empiricism.
I am pretty sure many of the LessWrong posts are about how to understand the meaning of different types of data, and are very much about examining, developing, and criticizing a rich variety of empirical attitudes.
Many of their "beliefs" - Super-duper intelligence, doom - are clearly not believed by the market; Observing the market is a kind of empiricism and it's completely discounted by the lw-ers
- Idiosyncratic language used to describe ordinary things ("lightcone" instead of "future", "prior" instead of "belief" or "prejudice", etc)
- Disdain for competing belief systems
- Insistence on a certain shared interpretation of things most people don't care about (the "many-worlds interpretation" of quantum mechanics, self-improving artificial intelligence, veganism, etc)
- I'm pretty sure polyamory makes the list somehow, just because it isn't how the vast majority of people want to date. In principle it's a private lifestyle choice, but it's obviously a community value here.
So this creates an opportunity for cult-like dynamics to occur where people adjust themselves according to their interactions within the community but not interactions outside the community. And this could seem — to the members — like the beliefs themselves are the problem, but from a sociological perspective, it might really be the inflexible way they diverge from mainstream society.
But I have never been able to get into the Rationalist stuff, to me it’s all very meandering and peripheral and focused on… I don’t know what.
Is it just me?
If you don't get anything out of reading the list itself, then you're probably not going to get anything out of the rest of the community either.
If you poke around and find a few neat ideas there, you'll probably find a few other neat ideas.
For some people, though, this is "wait, holy shit, you can just DO that? And it WORKS?", in which case probably read all of this but then also go find a few other sources to counter-balance it.
(In particular, probably 90% of the useful insights already exist elsewhere in philosophy, and often more rigorously discussed - LessWrong will teach you the skeleton, the general sense of "what rationality can do", but you need to go elsewhere if you want to actually build up the muscles)
https://en.wikipedia.org/wiki/Cult_of_Reason
I've been saying this since at least 1200 BC!
The shared thread among these is (in ever-widening circles) a story people tell themselves to justify precisely why, for example, the actions of someone you'll never meet in Tulsa, OK, have any bearing whatsoever on the fate of you, a person in Lincoln, NE.
One can see how this leaves an individual in a tenuous place if one doesn't feel particularly connected to nationhood. (One can also see how being too connected to nationhood, in an exclusionary way, can have deleterious consequences, and how, not unlike differing forms of Christianity, differing conceptions of what the "soul" of a nation is can foment internal strife.)
(To be clear: those fates are intertwined to some extent; the world we live in grows ever smaller due to the power of up-scaled influence of action granted by technology. But "nation" is a sort of fiction we tell ourselves to fit all that complexity into the slippery meat between human ears).
The average teenager who reads Nietzsche's proclamation of the death of God thinks of it as an accomplishment: finally we got rid of those thousands-of-years-old, and thereby severely outdated, ideas and rules. Somewhere along the march to maturity they may start to wonder whether what replaced those old rules and ideas was a good replacement, but most of them never come to the realization that there were rebellious teenagers in all those centuries when the idea of a supreme being, to whom even the mightiest had to answer, still held sway. Nietzsche saw the peril in letting go of that cultural safety valve and warned of what might come next.
We are currently living in the world he warned us about, and for that I, atheist that I am, am partly responsible. The question to be answered here is whether it is possible to regain the benefits of the old order without bringing back the obvious excesses: the abuse, the sanctimoniousness, and all the other corruptions of power and privilege that were responsible for turning people away from that path.
https://www.thenewatlantis.com/publications/rational-magic
and its discussion on HN: https://news.ycombinator.com/item?id=35961817
Because empathy is hard.
Quite possibly, places like Reddit and Hacker News are training grounds for the required level of intellectual smugness, and for the certitude that you can dismiss every annoying argument with a logical fallacy.
That sounds smug of me, but I'm actually serious. One of their defects is that once you memorize all the fallacies ("appeal to authority," "ad hominem"), you can easily reach the point where you recognize the fallacies in everyone else's arguments more readily than in your own. You more easily doubt other people's cited authorities than your own. You slap "appeal to authority" on a disliked opinion, while citing an authority next week for your own. It's a fast path from there to perceived intellectual superiority, and an even faster path from there into delusion. Rational delusion.
A contradiction creates a structural fallacy; if you find one, it's fair to conclude that at least one of the supporting claims is false. An appeal to authority, in contrast, is probabilistic: we don't know, in the current context, whether the authority is right, so they might be wrong. But we don't have time to read the whole universe into the situation, so an appeal to authority is better than nothing.
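Since the comment above leans on the deductive/probabilistic distinction, here is a minimal sketch of the probabilistic half (all numbers are hypothetical): treating an authority's assertion as evidence and updating with Bayes' rule, rather than as either a deductive proof or a fallacy to be dismissed outright.

    # Minimal sketch: an appeal to authority as a Bayesian update, not a
    # deductive step. All probabilities below are hypothetical illustrations.

    def posterior(prior: float, p_assert_if_true: float, p_assert_if_false: float) -> float:
        # P(claim | authority asserts it), via Bayes' rule.
        evidence = prior * p_assert_if_true + (1 - prior) * p_assert_if_false
        return prior * p_assert_if_true / evidence

    # Hypothetical: a 50/50 prior on the claim, and an authority who asserts
    # true claims in their field 90% of the time and false ones 20% of the time.
    print(posterior(0.5, 0.9, 0.2))  # ~0.82: stronger belief, but not certainty

The update moves belief without closing the question, which is exactly the sense in which an appeal to authority is "better than nothing" rather than either airtight or worthless.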
... and this observation should be coupled with the observation that the school of rhetoric wasn't teaching a method for finding truth; it was teaching a method for beating an opponent in a legal argument. "Appeal to authority is a logical fallacy" is a great sword to bring to bear if your goal is to turn off the audience's ability to ask whether we should give the word of the environmental scientist and the washed-up TV actor equal weight on the topic of environmental science...
People are trying to make sense of this. For example:
The Canadian government heavily subsidizes junk food, then spends heavily on healthcare because of the resulting illnesses. It restricts and limits healthy food through supply management and promotes a "food pyramid" favoring domestic unhealthy food. Meanwhile, it spends billions marketing healthy living, yet fines people up to $25,000 for hiking in forests and zones cities so driving is nearly mandatory.
Government is an easy target for irrational behaviours.
But from a societal cohesion or perhaps even an ethical point of view it's just pure irrationality.
When typing the post, I was thinking of different levels of government, and of the changing ideologies of politicians leaving behind inconsistent governance.
Your rant about government or not being allowed to hike in some places in Canada is unrelated to the issue.