[HN Gopher] Why are there so many rationalist cults?
       ___________________________________________________________________
        
       Why are there so many rationalist cults?
        
       Author : glenstein
       Score  : 332 points
       Date   : 2025-08-12 14:56 UTC (8 hours ago)
        
 (HTM) web link (asteriskmag.com)
 (TXT) w3m dump (asteriskmag.com)
        
       | api wrote:
       | Why are there so many cults? People want to feel like they belong
       | to something, and in a world in the midst of a loneliness and
       | isolation epidemic the market conditions are ideal for cults.
        
         | iwontberude wrote:
         | Your profile says that you want to keep your identity small,
         | but you have like over 30 thousand comments spelling out
         | exactly who you are and how you think. Why not shard accounts?
         | Anyways. Just a random thought.
        
           | keybored wrote:
           | [deleted]
        
             | shadowgovt wrote:
             | "SC identity?"
        
         | mindslight wrote:
         | Also, who would want to join an "irrationalist cult" ?
        
           | shadowgovt wrote:
           | Hey now, the Discordians have an ancient and respectable
           | tradition. ;)
        
             | NoGravitas wrote:
             | Five tons of flax!
        
         | FuriouslyAdrift wrote:
         | Because we are currently living in an age of narcissism and
          | tribalism. Identitarianism is the societal version of
         | narcissism.
        
           | khazhoux wrote:
           | > Because we are currently living in an age of narcissism and
           | tribalism
           | 
           | I've been saying this since at least 1200 BC!
        
         | shadowgovt wrote:
         | The book _Imagined Communities_ (Benedict Anderson) touches on
         | this, making the case that in modern times,  "nation" has
         | replaced the cultural narrative purpose previously held by
         | "tribe," "village," "royal subject," or "religion."
         | 
         | The shared thread among these is (in ever widening circles) a
         | story people tell themselves to justify precisely why, for
         | example, the actions of someone you'll never meet in Tulsa, OK
         | have any bearing whatsoever on the fate of you, a person in
         | Lincoln, NE.
         | 
         | One can see how this leaves an individual in a tenuous place if
         | one doesn't feel particularly connected to nationhood (one can
         | also see how being _too_ connected to nationhood, in an
         | exclusionary way, can also have deleterious consequences, and
         | how not unlike differing forms of Christianity, differing
         | concepts on what the  'soul' of a nation is can foment internal
         | strife).
         | 
         | (To be clear: those fates _are_ intertwined to some extent; the
         | world we live in grows ever smaller due to the power of up-
         | scaled influence of action granted by technology. But  "nation"
         | is a sort of fiction we tell ourselves to fit all that
         | complexity into the slippery meat between human ears).
        
         | ameliaquining wrote:
         | The question the article is asking is "why did so many cults
         | come out of this particular social milieu", not "why are there
         | a lot of cults in the whole world".
        
       | optimalsolver wrote:
       | Pertinent Twitter comment:
       | 
       | "Rationalism is such an insane name for a school of thought. Like
       | calling your ideology correctism or winsargumentism"
       | 
       | https://x.com/growing_daniel/status/1893554844725616666
        
         | nyeah wrote:
         | Great names! Are you using them, or are they available? /s
        
         | wiredfool wrote:
         | Objectivisim?
        
         | hn_throwaway_99 wrote:
         | To be honest I don't understand that objection. If you strip it
          | of all its culty sociological effects, one of the original
         | ideas of rationalism was to try to use logical reasoning and
         | statistical techniques to explicitly avoid the pitfalls of
         | known cognitive biases. Given that foundational tenet,
         | "rationalism" seems like an extremely appropriate moniker.
         | 
         | I fully accept that the rationalist community may have morphed
         | into something far beyond that original tenet, but I think
         | rationalism just describes the approach, not that it's the "one
         | true philosophy".
        
           | ameliaquining wrote:
           | That it refers to a different but confusingly related concept
           | in philosophy is a real downside of the name.
        
           | nyeah wrote:
           | I'm going to start a group called "Mentally Healthy People".
           | We use data, logical thinking, and informal peer review. If
           | you disagree with us, our first question will be "what's
           | wrong with mental health?"
        
             | handoflixue wrote:
             | So... Psychiatry? Do you think psychiatrists are
             | particularly prone to starting cults? Do you think learning
             | about psychiatry makes you at risk for cult-like behavior?
        
               | nyeah wrote:
               | No. I have no beef with psychology or psychiatry. They're
               | doing good work as far as I can tell. I am poking fun at
               | people who take "rationality" and turn it into a brand
               | name.
        
               | handoflixue wrote:
               | Why is "you can work to avoid cognitive biases" more
               | ridiculous than "you can work to improve your mental
               | health"?
        
               | nyeah wrote:
               | I'm feeling a little frustrated by the derail. My
               | complaint is about some small group claiming to have a
               | monopoly on a normal human faculty, in this case
               | rationality. The small group might well go on to claim
               | that people outside the group lack rationality. That
                | would be absurd. Mental health professionals do not
                | claim to be immune from mental illness themselves; they
               | do not claim that people outside their circle are
               | mentally unhealthy, and they do not claim that their
               | particular treatment is necessary for mental health.
               | 
               | I guess it's possible you might be doing some deep ironic
               | thing by providing a seemingly sincere example of what
               | I'm complaining about. If so it was over my head but in
               | that case I withdraw "derail"!
        
           | glenstein wrote:
           | Right and to your point, I would say you can distinguish (1)
           | "objective" in the sense of relying on mind-independent data
           | from (2) absolute knowledge, which treats subjects like
           | closed conversations. And you can make similar caveats for
           | "rational".
           | 
           | You can be rational and objective about a given topic without
           | it meaning that the conversation is closed, or that all
           | knowledge has been found. So I'm certainly not a fan of cult
           | dynamics, but I think it's easy to throw an unfair charge at
           | these groups, that their interest in the topic necessitates
           | an absolutist disposition.
        
         | ameliaquining wrote:
         | IIUC the name in its current sense was sort of an accident.
         | Yudkowsky originally used the term to mean "someone who
         | succeeds at thinking and acting rationally" (so "correctism" or
         | "winsargumentism" would have been about equally good), and then
         | talked about the idea of "aspiring rationalists" as a community
         | narrowly focused on developing a sort of engineering discipline
         | that would study the scientific principles of how to be right
         | in full generality and put them into practice. Then the
         | community grew and mutated into a broader social milieu that
         | was only sort of about that, and people needed a name for it,
         | and "rationalists" was already there, so that became the name
         | through common usage. It definitely has certain awkwardnesses.
        
         | SilverElfin wrote:
         | What do you make of the word "science" then?
        
       | mlinhares wrote:
       | > One is Black Lotus, a Burning Man camp led by alleged rapist
       | Brent Dill, which developed a metaphysical system based on the
       | tabletop roleplaying game Mage the Ascension.
       | 
       | What the actual f. This is such an insane thing to read and
       | understand what it means that i might need to go and sit in
       | silence for the rest of the day.
       | 
       | How did we get to this place with people going completely nuts
       | like this?
        
         | optimalsolver wrote:
         | astronauts_meme.jpg
        
         | linohh wrote:
         | Running a cult is a somewhat reliable source of narcissistic
         | supply. The internet tells you how to do it. So an increasing
         | number of people do it.
        
         | TimorousBestie wrote:
         | Mage: The Ascension is basically a delusions of grandeur
         | simulator, so I can see how an already unstable personality
         | might get attached to it and become more unstable.
        
           | mlinhares wrote:
            | I don't know, i'd understand something like Wraith (which I
            | did see people develop issues with; the shadow mechanic is
            | such a terrible thing) but Mage is so, like, straightforward?
           | 
           | Use your mind to control reality, reality fights back with
            | paradox, it's cool for a teenager but you read a bit more
           | fantasy and you'll definitely find cooler stuff. But i guess
           | for you to join a cult your mind must stay a teen mind
           | forever.
        
             | wavefunction wrote:
              | How many adults actually abandon juvenilia as they age? Not
             | the majority in my observation, and it's not always a bad
             | thing when it's only applied to subjects like pop culture.
              | Applied juvenilia in response to serious subjects is a more
             | serious issue.
        
               | mlinhares wrote:
               | There has to be a cult of people that believe they're
               | vampires, respecting the masquerade and serving some
                | antediluvian somewhere, vampire was much more mainstream
               | than mage.
        
               | DonHopkins wrote:
               | There are post-pubescent males who haven't abandoned
               | Atlas Shrugged posting to this very web site!
        
             | WJW wrote:
             | I didn't originally write this, but can't find the original
             | place I read it anymore. I think it makes a lot of sense to
             | repost it here:
             | 
             | All of the World Of Darkness and Chronicles Of Darkness
             | games are basically about coming of age/puberty. Like X-Men
             | but for Goth-Nerds instead of Geek-Nerds.
             | 
             | In Vampire, your body is going through weird changes and
             | you are starting to develop, physically and/or mentally,
             | while realising that the world is run by a bunch of old,
             | evil fools who still expect you to toe the line and stay in
             | your place, but you are starting to wonder if the world
             | wouldn't be better if your generation overthrew them and
             | took over running the world, doing it the right way. And
             | there are all these bad elements trying to convince you
             | that you should do just that, but for the sake of mindless
             | violence and raucous partying. Teenager - the rpg.
             | 
             | In Werewolf, your body is going through weird changes and
             | you are starting to develop, physically and mentally, while
             | realising that you are not a part of the "normal" crowd
             | that the rest of Humanity belongs to. You are different and
             | they just can't handle that whenever it gets revealed.
             | Luckily, there are small communities of people like you out
              | there who take you in and show you how to use the power of
             | your "true" self. Of course, even among this community,
             | there are different types of other. LGBT Teenager - the RPG
             | 
             | In Mage, you have begun to take an interest in the real
             | world, and you think you know what the world is really
             | like. The people all around you are just sleep-walking
             | through life, because they don't really get it. This
             | understanding sets you against the people who run the
             | world: the governments and the corporations, trying to stop
              | these sleepers from waking up to the truth and rejecting
             | their comforting lies. You have found some other people who
             | saw through them, and you think they've got a lot of things
             | wrong, but at least they're awake to the lies! Rebellious
             | Teenager - the RPG
        
               | reactordev wrote:
               | I think I read it too, it's called Twilight. /s
               | 
               | I had friends who were into Vampire growing up. I hadn't
               | heard of Werewolf until after the aforementioned book
               | came out and people started going nuts for it. I
               | mentioned to my wife at the time that there was this game
               | called "Vampire" and told her about it and she just
               | laughed, pointed to the book, and said "this is so much
               | better". :shrug:
               | 
               | Rewind back and there were the Star Wars kids. Fast
               | forward and there are the Harry Potter kids/adults. Each
               | generation has their own "thing". During that time, it
               | was Quake MSDOS and Vampire. Oh and we started Senior
               | Assassinations. 90s super soakers were the real deal.
        
               | abullinan wrote:
               | " The people all around you are just sleep-walking
               | through life, because they don't really get it."
               | 
               | Twist: we're sleepwalking through life because we really
               | DO get it.
               | 
               | (Source: I'm 56)
        
               | mlinhares wrote:
                | This tracks, but I'd say Werewolf goes beyond LGBT folks;
                | the violence there also fits boys' aggressive play
                | and the saving-the-world theme resonated a lot with the
                | basic "i want to be important/hero" thing. It's my
                | favorite of all the World of Darkness books, i regret not
                | getting the kickstarter edition :(
        
               | michaeldoron wrote:
               | Yeah, I would say Werewolf is more like Social Activist:
               | The Rage simulator than LGBT teenager
        
         | JTbane wrote:
         | I don't know how you can call yourself a "rationalist" and base
         | your worldview on a fantasy game.
        
           | reactordev wrote:
           | Rationalizing the fantasy. Like LARPing. Only you lack
           | weapons, armor, magic missiles...
        
           | hungryhobbit wrote:
           | Mage is an interesting game though: it's fantasy, but not
           | "swords and dragons" fantasy. It's set in the real world, and
           | the "magic" is just the "mage" shifting probabilities so that
           | unlikely (but possible) things occur.
           | 
           | Such a setting would seem like the _perfect_ backdrop for a
           | cult that claims  "we have the power to subtly influence
           | reality and make improbable things (ie. "magic") occur".
        
           | empath75 wrote:
           | Most "rationalists" throughout history have been very deeply
           | religious people. Secular enlightenment-era rationalism is
           | not the only direction you can go with it. It depends very
           | much, as others have said, on what your axioms are.
           | 
           | But, fwiw, that particular role-playing game was very much
            | based on occult beliefs that were trendy at the time, like
            | chaos magic, so it's not completely off the wall.
        
           | vannevar wrote:
           | "Rationalist" in this context does not mean "rational
           | person," but rather "person who rationalizes."
        
           | ponector wrote:
            | In my experience, religious people are perfectly fine with a
            | contradictory worldview.
            | 
            | Like Christians are very flexible in following the 10
            | commandments, always have been.
        
             | BalinKing wrote:
             | That example isn't a contradictory worldview though, just
             | "people being people, and therefore failing to be as good
             | as the ideal they claim to strive for."
        
             | scns wrote:
             | Being fine with cognitive dissonance would be a
             | prerequisite for holding religious beliefs i'd say.
        
           | prepend wrote:
           | I mean, is it a really good game?
           | 
            | I've never played, but now I'm kind of interested.
        
             | nemomarx wrote:
             | It's reportedly alright - the resolution mechanic seems a
             | little fiddly with varying pools of dice for everything.
             | The lore is pretty interesting though and I think a lot of
             | the point of that series of games was reading up on that.
        
         | SirFatty wrote:
          | Came to ask a similar question, but also: has it always been
          | like this? Is the difference just that these fringe
          | people/groups had no visibility before the internet?
         | 
         | It's nuts.
        
           | lazide wrote:
            | Have you heard of Heaven's Gate? [https://en.m.wikipedia.org/w
           | iki/Heaven%27s_Gate_(religious_g...].
           | 
           | There are at least a dozen I can think of, including the
           | 'drink the koolaid' Jonestown massacre.
           | 
           | People be crazy, yo.
        
             | SirFatty wrote:
                | Of course, Jim Jones and L. Ron Hubbard, David Koresh. I
                | realize there have always been people that are cuckoo for
                | Cocoa Puffs. But as many as there appear to be now?
        
               | tuesdaynight wrote:
               | Internet made possible to know global news all the time.
               | I think that there have always been a lot of people with
               | very crazy and extremist views, but we only knew about
               | the ones closer to us. Now it's possible to know about
               | crazy people from the other side of the planet, so it
               | looks like there's a lot more of them than before.
        
               | lazide wrote:
               | Yup. Like previously, westerners could have gone their
               | whole lives with no clue the Hindutva existed
               | [https://en.m.wikipedia.org/wiki/Hindutva] - Hindu Nazis,
               | basically. Which if you know Hinduism at all, is a bit
               | like saying Buddhist Nazis. Say what?
               | 
                | Which actually kinda existed/exists too?
               | [https://en.m.wikipedia.org/wiki/Nichirenism], right down
               | to an attempted coup and a bunch of assassinations [https
               | ://en.m.wikipedia.org/wiki/League_of_Blood_Incident].
               | 
               | Now you know. People be whack.
        
             | geoffeg wrote:
             | Just a note that the Heaven's Gate website is still up.
             | It's a wonderful snapshot of 90s web design.
             | https://www.heavensgate.com/
        
               | ants_everywhere wrote:
               | what a wild set of SEO keywords
               | 
               | > Heaven's Gate Heaven's Gate Heaven's Gate Heaven's Gate
               | Heaven's Gate Heaven's Gate Heaven's Gate Heaven's Gate
               | ufo ufo ufo ufo ufo ufo ufo ufo ufo ufo ufo ufo space
               | alien space alien space alien space alien space alien
               | space alien space alien space alien space alien space
               | alien space alien space alien extraterrestrial
               | extraterrestrial extraterrestrial extraterrestrial
               | extraterrestrial extraterrestrial extraterrestrial
               | extraterrestrial extraterrestrial extraterrestrial
               | extraterrestrial extraterrestrial extraterrestrial
               | extraterrestrial misinformation misinformation
               | misinformation misinformation misinformation
               | misinformation misinformation misinformation
               | misinformation misinformation misinformation
               | misinformation freedom freedom freedom freedom freedom
               | freedom freedom freedom freedom freedom freedom freedom
               | second coming second coming second coming second coming
               | second coming second coming second coming second coming
               | second coming second coming angels angels angels angels
               | angels angels angels angels angels angels end end times
               | times end times end times end times end times end times
               | end times end times end times end times Key Words: (for
               | search engines) 144,000, Abductees, Agnostic, Alien,
               | Allah, Alternative, Angels, Antichrist, Apocalypse,
               | Armageddon, Ascension, Atheist, Awakening, Away Team,
               | Beyond Human, Blasphemy, Boddhisattva, Book of
               | Revelation, Buddha, Channeling, Children of God, Christ,
               | Christ's Teachings, Consciousness, Contactees,
               | Corruption, Creation, Death, Discarnate, Discarnates,
               | Disciple, Disciples, Disinformation, Dying, Ecumenical,
               | End of the Age, End of the World, Eternal Life, Eunuch,
               | Evolution, Evolutionary, Extraterrestrial, Freedom,
               | Fulfilling Prophecy, Genderless, Glorified Body, God,
               | God's Children, God's Chosen, God's Heaven, God's Laws,
               | God's Son, Guru, Harvest Time, He's Back, Heaven,
               | Heaven's Gate, Heavenly Kingdom, Higher Consciousness,
               | His Church, Human Metamorphosis, Human Spirit, Implant,
               | Incarnation, Interfaith, Jesus, Jesus' Return, Jesus'
               | Teaching, Kingdom of God, Kingdom of Heaven, Krishna
               | Consciousness, Lamb of God, Last Days, Level Above Human,
               | Life After Death, Luciferian, Luciferians, Meditation,
               | Members of the Next Level, Messiah, Metamorphosis,
               | Metaphysical, Millennium, Misinformation, Mothership,
               | Mystic, Next Level, Non Perishable, Non Temporal, Older
               | Member, Our Lords Return, Out of Body Experience,
               | Overcomers, Overcoming, Past Lives, Prophecy, Prophecy
               | Fulfillment, Rapture, Reactive Mind, Recycling the
               | Planet, Reincarnation, Religion, Resurrection,
               | Revelations, Saved, Second Coming, Soul, Space Alien,
               | Spacecraft, Spirit, Spirit Filled, Spirit Guide,
               | Spiritual, Spiritual Awakening, Star People, Super
               | Natural, Telepathy, The Remnant, The Two, Theosophy, Ti
               | and Do, Truth, Two Witnesses, UFO, Virginity, Walk-ins,
               | Yahweh, Yeshua, Yoda, Yoga,
        
               | lazide wrote:
               | It's the aliens to yoga ratio that really gets me. Yogis
               | got really shortchanged here.
        
               | jameslk wrote:
               | I was curious who's keeping that website alive, and
               | allegedly it's two former members of the cult: Mark and
               | Sarah King
               | 
               | https://www.vice.com/en/article/a-suicide-cults-
               | surviving-me...
        
           | reactordev wrote:
           | It's always been like this, have you read the Bible? Or the
           | Koran? It's insane. Ours is just our flavor of crazy. Every
           | generation has some. When you dig at it, there's always a
           | religion.
        
             | mlinhares wrote:
             | Mage is a game for teenagers, it doesn't try to be anything
              | else other than a game where you roll dice to do stuff.
        
               | reactordev wrote:
               | Mage yea, but the cult? Where do you roll for crazy? Is
               | it a save against perception? Constitution? Or
               | intelligence check?
               | 
               | I know the church of Scientology wants you to crit that
               | roll of tithing.
        
               | mlinhares wrote:
               | > I know the church of Scientology wants you to crit that
               | roll of tithing.
               | 
               | I shouldn't LOL at this but I must. We're all gonna die
               | in these terrible times but at least we'll LOL at the
               | madness and stupidity of it all.
        
               | reactordev wrote:
               | Like all tragedies, there's comedy there somewhere.
               | Sometimes you have to be it.
        
               | zzzeek wrote:
                | yeah, people should understand, what is Scientology based
                | on? The E-Meter, which is some kind of cheap shit Radio
                | Shack lie detector thing. I'm quite sure LLMs would do
                | very well if given the task to spit out new cult
                | doctrines, and I would gather we are mere years away
                | from cults based on LLM-generated content (if not
                | already).
        
               | bitwize wrote:
               | Terry Davis, a cult of one, believed God spoke to him
               | through his computer's RNG. So... yeah.
        
               | reactordev wrote:
                | If only he had installed Dwarf Fortress, where he could
                | become one.
        
               | notahacker wrote:
               | tbf Helter Skelter was a song about a fairground ride
               | that didn't really pretend to be much more than an excuse
               | for Paul McCartney to write something loud, but that
               | didn't stop a sufficiently manipulative individual
               | turning it into a reason why the Family should murder
               | people. And he didn't even need the internet to help him
               | find followers.
        
             | startupsfail wrote:
              | It used to always be religion. But now the downsides are
              | well understood. And alternatives that can fill the same
              | need (social activities) - like gathering with your
              | neighbors, singing, performing arts, clubs, parks and
              | parties - are available and great.
        
               | reactordev wrote:
               | I can see that. There's definitely a reason they keep
               | pumping out Call of Duty's and Madden's.
        
               | Mountain_Skies wrote:
                | Religions have multitudes of problems, but suicide rates
                | amongst atheists are higher than you'd think they would be.
               | It seems like for many, rejection of organized religion
               | leads to adoption of ad hoc quasi-religions with no
               | mooring to them, leaving the person who is seeking a
               | solid belief system drifting until they find a cult, give
               | in to madness that causes self-harm, or adopt their own
               | system of belief that they then need to vigorously
               | protect from other beliefs.
               | 
               | Some percentage of the population has a lesser need for a
               | belief system (supernatural, ad hoc, or anything else)
               | but in general, most humans appear to be hardcoded for
               | this need and the overlap doesn't align strictly with
               | atheism. For the atheist with a deep need for something
               | to believe in, the results can be ugly. Though far from
               | perfect, organized religions tend to weed out their most
               | destructive beliefs or end up getting squashed by
               | adherents of other belief systems that are less
               | internally destructive.
        
               | reactordev wrote:
               | Nothing to do with religion and everything to do with
               | support networks that Churches and those Groups provide.
               | Synagogue, Church, Camp, Retreat, a place of belonging.
               | 
               | Atheists tend to not have those consistently and must
               | build their own.
        
             | saghm wrote:
             | Without speaking for religions I'm not familiar with, I
             | grew up Catholic, and one of the most important Catholic
             | beliefs is that during Mass, the bread (i.e. "communion"
             | wafers) and wine quite literally transform into the body
             | and blood of Jesus, and that eating and drinking it is a
             | necessary ritual to get into heaven[1], which was a source
             | of controversy even back as far as the Protestant
             | Reformation, with some sects retaining that doctrine and
             | others abandoning it. In a lot of ways, what's considered
             | "normal" or "crazy" in a religion comes to what you're
             | familiar with.
             | 
             | For those not familiar with the bible enough to know what
             | to look for to find the wild stuff, look up the story of
              | Elisha summoning bears out of the forest to maul children
             | for calling him bald, or the last two chapters of Daniel
             | (which I think are only in the Catholic bible) where he
             | literally blows up a dragon by feeding it a cake.
             | 
             | [1]: https://en.wikipedia.org/wiki/Real_presence_of_Christ_
             | in_the...
        
               | robertlagrant wrote:
               | Yes, Catholicism has definitely accumulated some cruft :)
        
               | tialaramex wrote:
               | Yeah "Transubstantiation" is another technical term
               | people might want to look at in this topic. The art piece
               | "An Oak Tree" is a comment on these ideas. It's a glass
               | of water. But, the artist's notes for this work insist it
               | is an oak tree.
        
               | petralithic wrote:
               | Someone else who knows "An Oak Tree"! It is one of my
                | favorite pieces because it wants not reality itself, but
                | the _belief_ of what reality could be, to be the primary
                | way to see the world.
        
               | scns wrote:
                | Interesting you bring art into the discussion. I've often
                | thought that some "artists" have a lot in common with
                | cult leaders. My definition of art would be that it is
                | immediately understood, zero explanation needed.
        
               | o11c wrote:
               | The "bears" story reads a lot more sensibly if you
                | translate it correctly as "a gang of thugs tries to
               | bully Elisha into killing himself." Still reliant on the
               | supernatural, but what do you expect from such a book?
        
               | michaeldoron wrote:
               | Where do you see that in the text? I am looking at the
               | Hebrew script, and the text only reads that as Elisha
               | went up a path, young lads left the city and mocked him
               | by saying "get up baldy", and he turned to them and
               | cursed them to be killed by two she bears. I don't think
               | saying "get up baldy" to a guy walking up a hill
               | constitutes bullying him into killing himself.
        
               | reactordev wrote:
               | Never underestimate the power of words. Kids have
               | unalived themselves over it.
               | 
               | I think the true meaning has been lost to time. The
               | Hebrew text has been translated and rewritten so many
               | times it's a children's book. The original texts of the
               | Dead Sea scrolls are bits and pieces of that long lost
               | story. All we have left are the transliterations of
               | transliterations.
        
               | o11c wrote:
               | It's called context. The beginning of the chapter is
               | Elijah (Elisha's master) being removed from Earth and
               | going up (using the exact same Hebrew word) to Heaven.
               | Considering that the thugs are clearly not pious people,
               | "remove yourself from the world, like your master did"
               | has only one viable interpretation.
               | 
               | As for my choice of the word "thugs" ("mob" would be
               | another good word), that is necessary to preserve the
                | connotation. Remember, there were 42 of them _punished_,
               | possibly more escaped - this is a threatening crowd size
               | (remember the duck/horse meme?). Their claimed youth does
               | imply "not an established veteran of the major annual
               | wars", but that's not the same as "not acquainted with
               | violence".
        
               | cjameskeller wrote:
               | To be fair, the description of the dragon incident is
               | pretty mundane, and all he does is prove that the large
               | reptile they had previously been feeding (& worshiping)
               | could be killed:
               | 
               | "Then Daniel took pitch, and fat, and hair, and did
               | seethe them together, and made lumps thereof: this he put
               | in the dragon's mouth, and so the dragon burst in sunder:
               | and Daniel said, Lo, these are the gods ye worship."
        
               | saghm wrote:
               | I don't think it's mundane to cause something to "burst
               | in sunder" by putting some pitch, fat, and hair in its
               | mouth.
        
               | neaden wrote:
               | The story is pretty clearly meant to indicate that the
               | Babylonians were worshiping an animal though. The
               | theology of the book of Daniel emphasises that the Gods
               | of the Babylonians don't exist, this story happens around
               | the same time Daniel proves the priests had a secret
               | passage they were using to get the food offered to Bel
               | and eat it at night while pretending that Bel was eating
               | it. Or when Daniel talks to King Belshazzar and says "You
               | have praised the gods of silver and gold, of bronze,
               | iron, wood, and stone, which do not see or hear or know,
               | but the God in whose power is your very breath and to
               | whom belong all your ways, you have not honored". This is
               | not to argue for the historical accuracy of the stories,
               | just that the point is that Daniel is acting as a
               | debunker of the Babylonian beliefs in these stories while
               | asserting the supremacy of the Israelite beliefs.
        
               | genghisjahn wrote:
               | I've recently started attending an Episcopal church. We
               | have some people who lean heavy on transubstantiation,
               | but our priest says, "look, something holy happens during
               | communion, exactly what, we don't know."
               | 
               | See also: https://www.episcopalchurch.org/glossary/real-
               | presence/?
               | 
               | "Belief in the real presence does not imply a claim to
               | know how Christ is present in the eucharistic elements.
               | Belief in the real presence does not imply belief that
               | the consecrated eucharistic elements cease to be bread
               | and wine."
        
               | reactordev wrote:
               | Same could be said for bowel movements too though.
               | 
               | There's a fine line between suspension of disbelief and
               | righteousness. All it takes is for one to believe their
               | own delusion.
        
           | glenstein wrote:
           | I personally (for better or worse) became familiar with Ayn
           | Rand as a teenager, and I think Objectivism as a kind of
           | extended Ayn Rand social circle and set of organizations has
           | faced the charge of cultish-ness, and that dates back to, I
           | want to say, the 70s and 80s at least. I know Rand wrote much
           | earlier than that, but I think the social and organizational
           | dynamics unfolded rather late in her career.
        
             | hexis wrote:
              | Albert Ellis wrote a book, "Is Objectivism a Religion?" as
             | far back as 1968. Murray Rothbard wrote "Mozart Was a Red",
             | a play satirizing Rand's circle, in the early 60's. Ayn
             | Rand was calling her own circle of friends, in "jest", "The
             | Collective" in the 50's. The dynamics were there from
             | almost the beginning.
        
             | ryandrake wrote:
             | "There are two novels that can change a bookish fourteen-
             | year old's life: The Lord of the Rings and Atlas Shrugged.
             | One is a childish fantasy that often engenders a lifelong
             | obsession with its unbelievable heroes, leading to an
             | emotionally stunted, socially crippled adulthood, unable to
             | deal with the real world. The other, of course, involves
             | orcs."
             | 
             | https://www.goodreads.com/quotes/366635-there-are-two-
             | novels...
        
             | cogman10 wrote:
             | I think it's pretty similar dynamics. It's unquestioned
             | premises (dogma) which are supposed to be accepted simply
             | because this is "objectivism" or "rationalism".
             | 
             | Very similar to my childhood religion. "We have figured
             | everything out and everyone else is wrong for not figuring
             | things out".
             | 
             | Rationalism seems like a giant castle built on sand. They
             | just keep accruing premises without ever going backwards to
             | see if those premises make sense. A good example of this is
             | their notions of "information hazards".
        
             | afpx wrote:
             | Her books were very popular with the gifted kids I hung out
             | with in the late 80s. Cool kids would carry around hardback
             | copies of Atlas Shrugged, impressive by the sheer physical
             | size and art deco cover. How did that trend begin?
        
               | prepend wrote:
               | People reading the book and being into it and telling
               | other people.
               | 
               | It's also a hard book to read so it may be smart kids
               | trying to signal being smart.
        
               | jacquesm wrote:
               | The only thing that makes it hard to read is the
               | incessant soap-boxing by random characters. I have a rule
               | that if I start a book I finish it but that one had me
               | tempted.
        
               | mikestew wrote:
               | I'm convinced that even Rand's editor didn't finish the
               | book. That is why Galt's soliloquy is ninety friggin'
               | pages long. (When in reality, three minutes in and people
               | would be unplugging their radios.)
        
               | meheleventyone wrote:
               | It's hard to read because it's tedious not because you
               | need to be smart though.
        
               | notahacker wrote:
               | tbf you have to have read it to know that!
               | 
               | I can't help but think it's probably the "favourite book"
               | of a lot of people who haven't finished it though,
               | possibly to a greater extent than any other secular tome
               | (at least LOTR's lightweight fans watched the movies!).
               | 
                | I mean, _if you've only read the blurb on the back_ it's
               | the perfect book to signal your belief in free markets,
               | conservative values and the American Dream: what could be
               | more a more strident defence of your views than a book
               | about capitalists going on strike to prove how much the
               | world really needs them?! If you read the first few
               | pages, it's satisfyingly pro-industry and contemptuous of
               | liberal archetypes. If you trudge through the whole
               | thing, it's not only tedious and _odd_ but contains whole
               | subplots devoted to dumping on core conservative values
               | (religion bad, military bad, marriage vows not that
               | important really, and a rather jaded take on actually
               | extant capitalism) in between the philosopher pirates and
               | jarring absence of private transport, and the resolution
               | is an odd combination of a handful of geniuses running
               | away to form a commune and the world being saved by a
               | multi-hour speech about philosophy which has surprisingly
               | little to say on market economics...
        
               | mikestew wrote:
                | _at least LOTR's lightweight fans watched the movies!_
               | 
               | Oh, there's movies for lazy Rand fans, too.
               | 
               | https://www.imdb.com/title/tt0480239/
               | 
               | More of a Fountainhead fan, are you? Do ya like Gary
               | Cooper and Patricia Neal?
               | 
               | https://www.imdb.com/title/tt0041386/?ref_=ext_shr_lnk
        
               | notahacker wrote:
               | > Oh, there's movies for lazy Rand fans, too.
               | 
               | tbf that comment was about 50% a joke about their poor
               | performance at the box office :D
        
               | mikestew wrote:
               | Rereading your comment, that's my _woosh_ moment for the
               | day, I guess. :-)
               | 
               | Though a Gary Cooper _The Fountainhead_ does tempt me on
                | occasion. (Unlike _Atlas Shrugged_, _The Fountainhead_
               | wasn't horrible, but still some pretty poor writing.)
        
               | CalChris wrote:
               | _Fountainhead_ is written at the 7th grade reading level.
                | Its Lexile level is 780L. It's long and that's about it.
               | By comparison, _1984_ is 1090L.
        
               | jacquesm wrote:
               | By setting up the misfits in a revenge of the nerds
               | scenario?
               | 
               | Ira Levin did a much better job of it and showed what it
               | would lead to but his 'This Perfect Day' did not -
               | predictably - get the same kind of reception as Atlas
               | Shrugged did.
        
               | spacechild1 wrote:
                | What's funny is that Robert Anton Wilson and Robert Shea
               | already took the piss out of Ayn Rand in Illuminatus!
               | (1969-1971).
        
           | rglover wrote:
           | > Came to ask a similar question, but also has it always been
           | like this?
           | 
           | Crazy people have always existed (especially cults), but I'd
           | argue recruitment numbers are through the roof thanks to
           | technology and a failing economic environment (instability
           | makes people rationalize crazy behavior).
           | 
           | It's not that those groups didn't have visibility before,
           | it's just easier for the people who share the
           | same...interests...to cloister together on an international
           | scale.
        
           | jacquesm wrote:
           | It's no more crazy than a virgin conception. And yet, here we
           | are. A good chunk of the planet believes that drivel, but
           | they'd throw their own daughters out of the house if they
           | made the same claim.
        
         | davorak wrote:
         | Makes me think of that saying that great artists steal, so
         | repurposed for cult founders: "Good cult founders copy, great
         | cult founders steal"
         | 
         | I do not think this cult dogma is any more out there than other
         | cult dogma I have heard, but the above quote makes me think it
          | is easier to found cults in the modern day in some ways, since
          | you can steal others' complex world-building from numerous
          | sources rather than building it yourself and keeping everything
          | straight.
        
         | TrackerFF wrote:
         | Cult leaders tend to be narcissists.
         | 
         | Narcissists tend to believe that they are always right, no
          | matter what the topic is, or how knowledgeable they are. This
         | makes them speak with confidence and conviction.
         | 
         | Some people are very drawn to confident people.
         | 
         | If the cult leader has other mental health issues, it can/will
         | seep into their rhetoric. Combine that with unwavering support
         | from loyal followers that will take everything they say as
         | gospel...
         | 
         | That's about it.
        
           | TheOtherHobbes wrote:
           | That's pretty much it. _The beliefs are just a cover story._
           | 
           | Outside of those, the cult dynamics are cut-paste, and always
           | involve an entitled narcissistic cult leader acquiring as
           | much attention/praise, sex, money, and power as possible from
           | the abuse and exploitation of followers.
           | 
           | Most religion works like this. Most alternative spirituality
            | works like this. Most finance works like this. Most corporate
           | culture works like this. Most politics works like this.
           | 
           | Most science works like this. (It shouldn't, but the number
           | of abused and exploited PhD students and post-docs is very
           | much not zero.)
           | 
           | The only variables are the differing proportions of
           | attention/praise, sex, money, and power available to leaders,
           | and the amount of abuse that can be delivered to those lower
           | down and/or outside the hierarchy.
           | 
           | The hierarchy and the realities of exploitation and abuse are
           | a constant.
           | 
           | If you removed this dynamic from contemporary culture there
           | wouldn't be a lot left.
           | 
           | Fortunately quite a lot of good things happen in spite of it.
           | But a lot more would happen if it wasn't foundational.
        
           | vannevar wrote:
           | Yes. The cult's "beliefs" really boil down to one belief: the
           | infallibility of the leader. Much of the attraction is in the
           | simplicity.
        
           | patrickmay wrote:
           | If what you say is true, we're very lucky no one like that
           | with a massive following has ever gotten into politics in the
           | United States. It would be an ongoing disaster!
        
         | piva00 wrote:
          | I've met a fair share of people in the burner community; the
          | vast majority I met are lovely folks who really enjoy the
         | process of bringing some weird big idea into reality, working
         | hard on the builds, learning stuff, and having a good time with
         | others for months to showcase their creations at some event.
         | 
          | On the other hand, there's a whole other side: a few nutjobs
          | who really behave like cult leaders. They believe their own
          | bullshit and over time manage to find a lot of "followers" in
          | this community; since one of the foundational aspects is
          | radical acceptance, it becomes very easy to be nutty and not
          | questioned (unless you do something egregious).
        
         | greenavocado wrote:
         | Humans are compelled to find agency and narrative in chaos.
         | Evolution favored those who assumed the rustle was a predator,
         | not the wind. In a post-Enlightenment world where traditional
         | religion often fails (or is rejected), this drive doesn't
         | vanish. We don't stop seeking meaning. We seek new frameworks.
         | Our survival depended on group cohesion. Ostracism meant death.
         | Cults exploit this primal terror. Burning Man's temporary city
         | intensifies this: extreme environment, sensory overload, forced
         | vulnerability. A camp like Black Lotus offers immediate,
         | intense belonging. A tribe with shared secrets (the "Ascension"
         | framework), rituals, and an "us vs. the sleepers" mentality.
         | This isn't just social; it's neurochemical. Oxytocin (bonding)
         | and cortisol (stress from the environment) flood the system,
         | creating powerful, addictive bonds that override critical
         | thought.
         | 
         | Human brains are lazy Bayesian engines. In uncertainty, we
         | grasp for simple, all-explaining models (heuristics). Mage
         | provides this: a complete ontology where magic equals
         | psychology/quantum woo, reality is malleable, and the camp
         | leaders are the enlightened "tradition." This offers relief
         | from the exhausting ambiguity of real life. Dill didn't invent
         | this; he plugged into the ancient human craving for a map that
         | makes the world feel navigable and controllable. The
         | "rationalist" veneer is pure camouflage. It feels like critical
         | thinking but is actually pseudo-intellectual cargo culting.
         | This isn't Burning Man's fault. It's the latest step of a
         | 2,500-year-old playbook. The Gnostics and the Hermeticists
         | provided ancient frameworks where secret knowledge ("gnosis")
         | granted power over reality, accessible only through a guru.
         | Mage directly borrows from this lineage (The Technocracy, The
         | Traditions). Dill positioned himself as the modern "Ascended
         | Master" dispensing this gnosis.
         | 
          | The 20th century cults - Synanon, EST, the Moonies, NXIVM - all
          | followed similar patterns, starting with isolation. Burning
         | Man's temporary city is the perfect isolation chamber. It's
         | physically remote, temporally bounded (a "liminal space"),
         | fostering dependence on the camp. Initial overwhelming
         | acceptance and belonging (the "Burning Man hug"), then slowly
         | increasing demands (time, money, emotional disclosure, sexual
         | access), framed as "spiritual growth" or "breaking through
         | barriers" (directly lifted from Mage's "Paradigm Shifts" and
         | "Quintessence"). Control language ("sleeper," "muggle,"
         | "Awakened"), redefining reality ("that rape wasn't really rape,
         | it was a necessary 'Paradox' to break your illusions"),
         | demanding confession of "sins" (past traumas, doubts), creating
         | dependency on the leader for "truth."
         | 
         | Burning Man attracts people seeking transformation, often
         | carrying unresolved pain. Cults prey on this vulnerability.
         | Dill allegedly targeted individuals with trauma histories.
         | Trauma creates cognitive dissonance and a desperate need for
         | resolution. The cult's narrative (Mage's framework + Dill's
         | interpretation) offers a simple explanation for their pain
         | ("you're unAwakened," "you have Paradox blocking you") and a
         | path out ("submit to me, undergo these rituals"). This isn't
         | therapy; it's trauma bonding weaponized. The alleged rape
         | wasn't an aberration; it was likely part of the control
         | mechanism. It's a "shock" to induce dependency and reframe the
         | victim's reality ("this pain is necessary enlightenment").
         | People are adrift in ontological insecurity (fear about the
         | fundamental nature of reality and self). Mage offers a new
         | grand narrative with clear heroes (Awakened), villains
         | (sleepers, Technocracy), and a path (Ascension).
        
           | photios wrote:
           | Gnosticism... generating dumb cults that seem smart on the
           | outside for 2+ thousand years. Likely to keep it up for 2k
           | more.
        
         | gedy wrote:
         | Paraphrasing someone I don't recall - when people believe in
         | nothing, they'll believe anything.
        
           | collingreen wrote:
           | And therefore you should believe in me and my low low 10%
           | tithe! That's the only way to not get tricked into believing
           | something wrong so don't delay!
        
         | pstuart wrote:
         | People are wired to worship, and want somebody in charge
         | telling them what to do.
         | 
         | I'm a staunch atheist and I feel the pull all the time.
        
         | Nihilartikel wrote:
         | I'm entertaining sending my kiddo to a Waldorf School, because
         | it genuinely seems pretty good.
         | 
         | But looking into the underlying Western Esoteric Spirit
         | Science, 'Anthroposophy' (because Theosophy wouldn't let him
          | get weird enough) by Rudolf Steiner, has been quite a ride.
          | The point being that... humans have a pretty endless capacity to
         | go ALL IN on REALLY WEIRD shit, as long as it promises to fix
         | their lives if they do everything they're told. Naturally if
         | their lives aren't fixed, then they did it wrong or have karmic
         | debt to pay down, so YMMV.
         | 
         | In any case, I'm considering the latent woo-cult atmosphere as
         | a test of the skeptical inoculation that I've tried to raise my
         | child with.
        
           | BryantD wrote:
           | I went to a Waldorf school and I'd recommend being really
           | wary. The woo is sort of background noise, and if you've
           | raised your kid well they'll be fine. But the quality of the
           | academics may not be good at all. For example, when I was
           | ready for calculus my school didn't have anyone who knew how
           | to teach it so they stuck me and the other bright kid in a
           | classroom with a textbook and told us to figure it out. As a
           | side effect of not being challenged, I didn't have good study
           | habits going into college, which hurt me a lot.
           | 
           | If you're talking about grade school, interview whoever is
            | gonna be your kid's teacher for the next X years and make sure
           | they seem sane. If you're talking about high school, give a
           | really critical look at the class schedule.
           | 
           | Waldorf schools can vary a lot in this regard so you may not
           | encounter the same problems I did, but it's good to be
           | cautious.
        
           | linohh wrote:
           | Don't do it. It's a place that enables child abuse with its
           | culture. These people are serious wackos and you should not
           | give your kid into their hands. A lot of people come out of
           | that Steiner Shitbox traumatized for decades if not for life.
           | They should not be allowed to run schools to begin with.
           | Checking a lot of boxes from antivax to whatever the fuck
           | their lore has to offer starting with a z.
        
         | namuol wrote:
         | > How did we get to this place with people going completely
         | nuts like this?
         | 
         | Ayahuasca?
        
           | yamazakiwi wrote:
           | Nah I did Ayahuasca and I'm an empathetic person who most
           | would consider normal or at least well-adjusted. If it's drug
           | related it would most definitely be something else.
           | 
           | I'm inclined to believe your upbringing plays a much larger
           | role.
        
         | rglover wrote:
         | I came to comments first. Thank you for sharing this quote.
         | Gave me a solid chuckle.
         | 
         | I think people are going nuts because we've drifted from the
         | dock of a stable civilization. Institutions are a mess. Economy
         | is a mess. Combine all of that together with the advent of
         | social media making the creation of echo chambers (and the
         | inevitable narcissism of "leaders" in those echo chambers)
         | _effortless_ and ~15 years later, we have this.
        
           | staunton wrote:
           | People have been going nuts all throughout recorded history,
           | that's really nothing new.
           | 
           | The only scary thing is that they have ever more power to
           | change the world and influence others without being forced to
           | grapple with that responsibility...
        
         | eli_gottlieb wrote:
         | Who the fuck bases a _Black Lotus_ cult on Mage: the Ascension
         | rather than Magic: the Gathering? Is this just a mistake by the
         | journalist?
        
           | kiitos wrote:
           | i regret that i have but one upvote to give
        
         | AnimalMuppet wrote:
         | From false premises, you can logically and rationally reach
         | _really_ wrong conclusions. If you have too much pride in your
         | rationality, you may not be willing to say  "I seem to have
         | reached a really insane conclusion, maybe my premises are
         | wrong". That is, the more you pride yourself on your
         | rationalism, the more prone you may be to accepting a bogus
         | conclusion if it is bogus because the premises are wrong.
        
           | DangitBobby wrote:
           | Then again, most people tend to form really bogus beliefs
           | without bothering to establish any premises. They may not
           | even be internally consistent or align meaningfully with
           | reality. I imagine having premises and thinking it through
           | has a better track record of reaching conclusions consistent
           | with reality.
        
         | bitwize wrote:
         | It's been like this a while. Have you heard the tale of the
         | Final Fantasy House?: http://www.demon-sushi.com/warning/
         | 
         | https://www.vice.com/en/article/the-tale-of-the-final-fantas...
        
         | egypturnash wrote:
         | I've always been under the impression that M:tA's rules of How
         | Magic Works are inspired by actual mystical beliefs that people
         | have practiced for centuries. It's probably about as much of a
          | manual for mystical development as the GURPS Cyberpunk
         | rulebook was for cybercrime but it's pointing at something that
         | already exists and saying "this is a thing we are going to tell
         | an exaggerated story about".
         | 
         | See for example "Reality Distortion Field":
         | https://en.wikipedia.org/wiki/Reality_distortion_field
        
         | GeoAtreides wrote:
         | >How did we get to this place with people going completely nuts
         | like this?
         | 
         | God died and it's been rough going since then.
        
       | thrance wrote:
       | Reminds me somewhat of the _Culte de la Raison_ (Cult of Reason)
        | birthed by the French Revolution. It didn't last long.
       | 
       | https://en.wikipedia.org/wiki/Cult_of_Reason
        
       | amiga386 wrote:
       | See also _Rational Magic: Why a Silicon Valley culture that was
       | once obsessed with reason is going woo_ (2023)
       | 
       | https://www.thenewatlantis.com/publications/rational-magic
       | 
       | and its discussion on HN:
       | https://news.ycombinator.com/item?id=35961817
        
       | gjsman-1000 wrote:
       | It's especially popular in Silicon Valley.
       | 
        | Quite possibly, places like Reddit and Hacker News are training
        | grounds for the required level of intellectual smugness, and the
        | certitude that you can dismiss every annoying argument with a
        | logical fallacy.
        | 
        | That sounds smug of me, but I'm actually serious. One of their
        | defects is that once you memorize all the fallacies ("Appeal to
        | authority," "Ad hominem,") you can reach the point where you
        | recognize the fallacies in everyone else's arguments more easily
        | than in your own. You doubt other people's cited authorities more
        | readily than your own. You slap "appeal to authority" against a
        | disliked opinion, while citing an authority next week for your
        | own. It's a fast path from there to perceived intellectual
        | superiority, and an even faster path from there into delusion.
        | _Rational_ delusion.
        
         | shadowgovt wrote:
         | It's generally worth remembering that some of the fallacies are
         | actually structural, and some are rhetorical.
         | 
         | A contradiction creates a structural fallacy; if you find one,
         | it's a fair belief that at least one of the supporting claims
         | is false. In contrast, appeal to authority is probabilistic: we
         | don't _know_ , given the _current context,_ if the authority is
         | right, so they _might_ be wrong... But we don 't have time to
         | read the universe into this situation so an appeal to authority
         | is better than nothing.
         | 
         | ... and this observation should be coupled with the observation
         | that the school of rhetoric wasn't teaching a method for
         | finding truth; it was teaching a method for beating an opponent
         | in a legal argument. "Appeal to authority is a logical fallacy"
         | is a great sword to bring to bear if your goal is to turn off
         | the audience's ability to ask whether we should give the word
         | of the environmental scientist and the washed-up TV actor equal
         | weight on the topic of environmental science...
        
           | gjsman-1000 wrote:
            | ... however, even that is up for debate. Maybe the TV actor
            | in your own example is Al Gore filming _An Inconvenient
            | Truth_, and the environmental scientist is in the minority
            | that isn't so afraid of climate change. Fast forward to
            | 2025: the scientist's minority position was wrong, while Al
            | Gore's documentary was legally ruled to have 9 major errors;
            | so there was stupidity on both sides, with the TV actor
            | being closer.
        
             | shadowgovt wrote:
             | True, but this is where the Boolean nature of traditional
             | logic can really trip up a person trying to operate in the
             | real world.
             | 
             | These "maybes" are on the table. They are _probably_ not
             | the case.
             | 
             | (You end up with a spread of likelihoods and have to decide
             | what to do with them. And law _hates_ a spread of
             | likelihoods and _hates_ decision-by-coinflips, so one can
             | see how rhetorical traditions grounded in legal persuasion
              | tend towards encouraging Boolean outcomes; you can't find
             | someone "a little guilty," at least not in the Western
             | tradition of justice).
        
         | sunshowers wrote:
         | While deployment of logical fallacies to win arguments is
         | annoying at best, the far bigger problem is that people make
         | those fallacies in the first place -- such as not considering
         | base rates.
        
       | bobson381 wrote:
       | I keep thinking about the first Avengers movie, when Loki is
       | standing above everyone going "See, is this not your natural
       | state?". There's some perverse security in not getting a choice,
       | and these rationalist frameworks, based in logic, can lead in all
       | kinds of crazy arbitrary directions - powered by nothing more
       | than a refusal to suffer any kind of ambiguity.
        
         | csours wrote:
         | Humans are not chickens, but we sure do seem to love having a
         | pecking order.
        
           | lazide wrote:
            | Making good decisions is hard, and being accountable for the
           | results of them is not fun. Easier to outsource if you can.
        
           | snarf21 wrote:
            | I think it is simpler than that: we _love_ tribalism. A long
           | time ago being part of a tribe had such huge benefits over
           | going it alone that it was always worth any tradeoffs. We
           | have a much better ability to go it alone now but we still
           | love to belong to a group. Too often we pick a group based on
            | a single shared belief and don't recognize all the baggage
           | that comes along. Life is also too complicated today. It is
           | difficult for someone to be knowledgeable in one topic let
           | alone the 1000s that make up our society.
        
             | csours wrote:
             | maybe the real innie/outie is the in-group/out-group. no
             | spoilers, i haven't finished that show yet
        
         | jacquesm wrote:
         | They mostly seem to lean that way because it gives them carte
         | blanche to do as they please. It is just a modern version of
         | 'god has led my hand'.
        
           | notahacker wrote:
           | I agree with the religion comparison (the "rational"
           | conclusions of rationalism tend towards millenarianism with a
           | scifi flavour), but the people going furthest down that
           | rabbit hole often aren't doing what they please: on the
           | contrary they're spending disproportionate amounts of time
           | worrying about armageddon and optimising for stuff other
           | people simply don't care about, or in the case of the
           | explicit cults being actively exploited. Seems like the
           | typical in-too-deep rationalist gets seduced by the idea that
            | others who scoff at their choices just _aren't as smart and
           | rational as them_, as part of a package deal which treats
           | everything from their scifi interests to their on-the-
           | spectrum approach to analysing every interaction from first
           | principles as great insights...
        
       | nathan_compton wrote:
       | Thinking too hard about anything will drive you insane but I
       | think the real issue here is that rationalists simply _over-
       | estimate_ both the power of rational thought _and_ their ability
       | to do it. If you think of people who tend to make that kind of
       | mistake you can see how you get a lot of crazy groups.
       | 
       | I guess I'm a radical skeptic, secular humanist, utilitarianish
       | sort of guy, but I'm not dumb enough to think throwing around the
       | words "bayesian prior" and "posterior distribution" makes
       | actually figuring out how something works or predicting the
       | outcome of an intervention easy or certain. I've had a lot of
       | life at this point and gotten to some level of mastery at a few
        | things and my main conclusion is that most of the time it's just
       | hard to know stuff and that the single most common cognitive
       | mistake people make is too much certainty.
        
         | nyeah wrote:
          | I'm lucky enough to work in a pretty rational place (small "r").
         | We're normally data-limited. Being "more rational" would mean
         | taking/finding more of the right data, talking to the right
         | people, reading the right stuff. Not just thinking harder and
         | harder about what we already know.
         | 
         | There's a point where more passive thinking stops adding value
         | and starts subtracting sanity. It's pretty easy to get to that
         | point. We've all done it.
        
           | naasking wrote:
           | > We're normally data-limited.
           | 
           | This is a common sentiment but is probably not entirely true.
           | A great example is cosmology. Yes, more data would make some
           | work easier, but astrophysicists and cosmologists have shown
           | that you can gather and combine existing data and look at it
            | in novel ways to produce unexpected results, like placing
            | bounds that can include/exclude various theories.
           | 
           | I think a philosophy that encourages more analysis rather
           | than sitting back on our laurels with an excuse that we need
           | more data is good, as long as it's done transparently and
           | honestly.
        
             | nyeah wrote:
             | I suspect you didn't read some parts of my comment. I
             | didn't say everyone in the world is always data-limited, I
             | said we normally are where I work. I didn't recommend
             | "sitting back on our laurels." I made very specific
             | recommendations.
             | 
             | The qualifier "normally" already covers "not entirely
             | true". Of course it's not entirely true. It's mostly true
             | for us now. (In fact twenty years ago we used more
             | numerical models than we do now, because we were facing
             | more unsolved problems where the solution was pretty well
             | knowable just by doing more complicated calculations, but
             | without taking more data. Back then, when people started
             | taking lots of data, it was often a total waste of time.
             | But right now, most of those problems seem to be solved.
             | We're facing different problems that seem much harder to
             | model, so we rely more on data. This stage won't be
             | permanent either.)
             | 
             | It's not a sentiment, it's a reality that we have to deal
             | with.
        
               | naasking wrote:
               | > It's not a sentiment, it's a reality that we have to
               | deal with.
               | 
               | And I think you missed the main point of my reply: that
               | people often think we need more data, but cleverness and
               | ingenuity can often find a way to make meaningful
               | progress with existing data. Obviously I can't make any
               | definitive judgment about your specific case, but I'm
                | skeptical of any claim that it's out of the realm of
                | possibility that some genius like Einstein, analyzing
                | your problem, could get further than you have.
        
               | nyeah wrote:
               | Apparently you will not be told what I'm saying.
               | 
               | I read your point and answered it twice. Your latest
               | response seems to indicate that you're ignoring those
               | responses. For example you seem to suggest that I'm
               | "claim[ing] that it's out of the realm of possibility"
               | for "Einstein" to make progress on our work without
               | taking more data. But anyone can hit "parent" a few times
               | and see what I actually claimed. I claimed "mostly" and
               | "for us where I work". I took the time to repeat that for
               | you. That time seems wasted now.
               | 
               | Perhaps you view "getting more data" as an extremely
               | unpleasant activity, to be avoided at all costs? You may
               | be an astronomer, for example. Or maybe you see taking
               | more data before thinking as some kind of admission of
               | defeat? We don't use that kind of metric. For us it's a
               | question of the cheapest and fastest way to solve each
               | problem.
               | 
                | If modeling is slower and more expensive than measuring,
               | we measure. If not, we model. You do you.
        
             | spott wrote:
             | This depends on what you are trying to figure out.
             | 
             | If you are talking about cosmology? Yea, you can look at
             | existing data in new ways, cause you probably have enough
             | data to do that safely.
             | 
             | If you are looking at human psychology? Looking at existing
             | data in new ways is essentially p-hacking. And you probably
             | won't ever have enough data to define a "universal theory
             | of the human mind".
        
         | throw4847285 wrote:
         | People find academic philosophy impenetrable and pretentious,
         | but it has two major advantages over rationalist cargo cults.
         | 
         | The first is diffusion of power. Social media is powered by
         | charisma, and while it is certainly true that personality-based
         | cults are nothing new, the internet makes it way easier to form
         | one. Contrast that with academic philosophy. People can have
         | their own little fiefdoms, and there is certainly abuse of
         | power, but rarely concentrated in such a way that you see
         | within rationalist communities.
         | 
         | The second (and more idealistic) is that the discipline of
         | Philosophy is rooted in the Platonic/Socratic notion that "I
         | know that I know nothing." People in academic philosophy are on
         | the whole happy to provide a gloss on a gloss on some important
         | thinker, or some kind of incremental improvement over somebody
         | else's theory. This makes it extremely boring, and yet, not
         | nearly as susceptible to delusions of grandeur. True skepticism
         | has to start with questioning one's self, but everybody seems
         | to skip that part and go right to questioning everybody else.
         | 
         | Rationalists have basically reinvented academic philosophy from
         | the ground up with none of the rigor, self-discipline, or joy.
         | They mostly seem to dedicate their time to providing post-hoc
         | justifications for the most banal unquestioned assumptions of
         | their subset of contemporary society.
        
           | NoGravitas wrote:
           | > Rationalists have basically reinvented academic philosophy
           | from the ground up with none of the rigor, self-discipline,
           | or joy.
           | 
           | Taking academic philosophy seriously, at least as an
           | historical phenomenon, would require being educated in the
           | humanities, which is unpopular and low-status among
           | Rationalists.
        
           | wizzwizz4 wrote:
            | > _True skepticism has to start with questioning one's self,
           | but everybody seems to skip that part and go right to
           | questioning everybody else._
           | 
           | Nuh-uh! Eliezer Yudkowsky wrote that his mother made this
           | mistake, so he's made sure to say things in the right order
           | for the reader not to make this mistake. Therefore, _true_
           | Rationalists(tm) are immune to this mistake.
           | https://www.readthesequences.com/Knowing-About-Biases-Can-
           | Hu...
        
         | sunshowers wrote:
         | I don't disagree, but to steelman the case for
         | (neo)rationalism: one of its fundamental contributions is that
         | Bayes' theorem is extraordinarily important as a guide to
         | reality, perhaps at the same level as the second law of
         | thermodynamics; and that it is dramatically undervalued by
         | larger society. I think that is all basically correct.
         | 
         | (I call it neorationalism because it is philosophically
         | unrelated to the more traditional rationalism of Spinoza and
         | Descartes.)
        
           | matthewdgreen wrote:
           | I don't understand what "Bayes' theorem is a good way to
           | process new data" (something that is _not at all_ a
           | contribution of neorationalism) has to do with  "human beings
           | are capable of using this process effectively at a conscious
           | level to get to better mental models of the world." I think
           | the rationalist community has a thing called "motte and
           | bailey" that would apply here.
        
           | rpcope1 wrote:
            | That Bayes' theorem applies in unconventional ways is not
            | remotely novel to "rationalism" (except maybe in their
            | strange, busted, hand-wavy circle-jerk "thought
            | experiments"). This has been the domain of statistical
            | mechanics since long before Yudkowsky and other cult leaders
            | could even mouth "update your priors".
        
             | sunshowers wrote:
             | I don't know, most of science still runs on frequentist
             | statistics. Juries convict all the time on evidence that
             | would never withstand a Bayesian analysis. The prosecutor's
             | fallacy is real.
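              | 
              | (To make the base-rate point concrete, a minimal sketch in
              | Python with invented numbers; the match statistics and the
              | prior below are assumptions for illustration, not figures
              | from any real case.)
              | 
              |     # Prosecutor's fallacy sketch: a 1-in-10,000 chance
              |     # of a coincidental match is not a 99.99% chance of
              |     # guilt once the base rate is taken into account.
              |     p_match_given_innocent = 1 / 10_000
              |     p_match_given_guilty = 1.0
              |     prior_guilty = 1 / 100_000   # base rate (assumed)
              |     p_match = (p_match_given_guilty * prior_guilty
              |                + p_match_given_innocent * (1 - prior_guilty))
              |     posterior = p_match_given_guilty * prior_guilty / p_match
              |     print(f"P(guilty | match) = {posterior:.1%}")  # ~9.1%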
        
               | ImaCake wrote:
               | Most science runs on BS with a cursory amount of
               | statistics slapped on top so everyone can feel better
               | about it. Weirdly enough, science still works despite not
               | being rational. Rationalists seem to think science is
               | logical when in reality it works for largely the same
               | reasons the free-market does; throw shit at the wall and
               | maybe support some of the stuff that works.
        
           | copularent wrote:
            | As if these neorationalists are building a model and Markov
            | chain Monte Carlo sampling their life decisions.
           | 
           | That is the bullshit part.
        
         | rpcope1 wrote:
         | Even the real progenitors of a lot of this sort of thought,
          | like E.T. Jaynes, espoused significantly more skepticism than
         | I've ever seen a "rationalist" use. I would even imagine if you
         | asked almost all rationalists who E.T. Jaynes was (if they
         | weren't well versed in statistical mechanics) they'd have no
         | idea who he was or why his work was important to applying
         | "Bayesianism".
        
         | randcraw wrote:
         | The second-most common cognitive mistake we make has to be the
         | failure to validate what we think we know -- is it actually
         | true? The crux of being right isn't reasoning. It's avoiding
         | dumb blunders based on falsehoods, both honest and dishonest.
         | In today's political and media climate, I'd say dishonest
         | falsehoods are a far greater cause for being wrong than
         | irrationality.
        
       | incomingpain wrote:
        | We live in an irrational time. It's unclear whether this was
        | simply under-reported in history or whether social changes in
        | the last ~50-75 years have had breaking consequences.
        | 
        | People are trying to make sense of this. For example:
       | 
       | The Canadian government heavily subsidizes junk food, then spends
       | heavily on healthcare because of the resulting illnesses. It
        | restricts and limits healthy food through supply management and
       | promotes a "food pyramid" favoring domestic unhealthy food.
       | Meanwhile, it spends billions marketing healthy living, yet fines
       | people up to $25,000 for hiking in forests and zones cities so
       | driving is nearly mandatory.
       | 
       | Government is an easy target for irrational behaviours.
        
         | watwut wrote:
          | Scientology has been here since 1953 and it has a similarly
          | bonkers set of beliefs. And it is huge.
         | 
         | Your rant about government or not being allowed to hike in some
         | places in Canada is unrelated to the issue.
        
         | codr7 wrote:
         | There's nothing irrational about it, this is how you maximize
         | power and profit at any and all costs.
        
           | incomingpain wrote:
           | I completely get that point of view; and yes if that's the
           | goal, it's completely rational.
           | 
           | But from a societal cohesion or perhaps even an ethical point
           | of view it's just pure irrationality.
           | 
            | When typing the post, I was thinking of different levels of
            | government and the changing ideologies of politicians
            | leaving inconsistent governance.
        
             | codr7 wrote:
             | I couldn't agree more, but we've long since given up our
             | power collectively in hope of escaping responsibility.
        
       | noqc wrote:
       | Perhaps I will get downvoted to death again for saying so, but
       | the obvious answer is because the name "rationalist" is
       | structurally indistinguishable from the name "scientology" or
       | "the illuminati". You attract people who are desperate for an
       | authority to appeal to, but for whatever reason are no longer
       | affiliated with the church of their youth. Even a rationalist
       | movement which held nothing as dogma would attract people seeking
       | dogma, and dogma would form.
       | 
       | The article begins by saying the rationalist community was "drawn
       | together by AI researcher Eliezer Yudkowsky's blog post series
       | The Sequences". Obviously the article intends to make the case
       | that this is a cult, but it's already done with the argument at
       | this point.
        
         | johnisgood wrote:
          | I do not see any reason for you to get down-voted.
         | 
         | I agree that the term "rationalist" would appeal to many
         | people, and the obvious need to belong to a group plays a huge
         | role.
        
           | noqc wrote:
           | There are a lot of rationalists in this community. Pointing
           | out that the entire thing is a cult attracts downvotes from
           | people who wish to, for instance, avoid being identified with
           | the offshoots.
        
             | 6177c40f wrote:
             | No, the downvotes are because rationalism isn't a cult and
             | people take offense to being blatantly insulted. This
             | article is about cults that are rationalism-adjacent, it's
             | not claiming that rationalism is itself a cult.
        
               | noqc wrote:
               | That's almost word for word what I said...
        
         | mcv wrote:
         | In fact, I'd go a step further and note the similarity with
         | organized religion. People have a tendency to organize and
         | dogmatize everything. The problem with religion is rarely the
         | core ideas, but always the desire to use it as a basis for
         | authority, to turn it dogmatic and ultimately form a power
         | structure.
         | 
         | And I say this as a Christian. I often think that becoming a
         | state religion was the worst thing that ever happened to
         | Christianity, or any religion, because then it unavoidably
         | becomes a tool for power and authority.
         | 
         | And doing the same with other ideas or ideologies is no
         | different. Look at what happened to communism, capitalism, or
         | almost any other secular idea you can think of: the moment it
         | becomes established, accepted, and official, the corruption
         | sets in.
        
         | handoflixue wrote:
         | > Obviously the article intends to make the case that this is a
         | cult
         | 
         | The author is a self-identified rationalist. This is explicitly
         | established in the second sentence of the article. Given that,
         | why in the world would you think they're trying to claim the
         | whole movement is a cult?
         | 
         | Obviously you and I have very different definitions of
         | "obvious"
        
           | noqc wrote:
           | When I read the article in its entirety, I was pretty
           | disappointed in its top-level introspection.
           | 
           | It seems to not be true, but I still maintain that it was
           | obvious. Sometimes people don't pick the low-hanging fruit.
        
         | o11c wrote:
         | > for whatever reason are no longer affiliated with the church
         | of their youth.
         | 
         | This is the Internet, you're allowed to say "they are obsessed
         | with unlimited drugs and weird sex things, far beyond what even
         | the generally liberal society tolerates".
         | 
         | I'm increasingly convinced that _every_ other part of
         | "Rationalism" is just distraction or justification for those;
         | certainly there's a conscious decision to minimize talking
         | about this part on the Internet.
        
           | twic wrote:
           | I strongly suspect there is heterogeneity here. An outer
           | party of "genuine" rationalists who believe that learning to
           | be a spreadsheet or whatever is going to let them save
           | humanity, and an inner party who use the community to conceal
           | some absolute shenanigans.
        
       | cjs_ac wrote:
       | Rationalism is the belief that reason is the primary path to
       | knowledge, as opposed to, say, the observation that is championed
       | by empiricism. It's a belief system that prioritises imposing its
       | tenets on reality rather than asking reality what reality's
       | tenets are. From the outset, it's inherently cult-like.
        
         | Ifkaluva wrote:
         | That is the definition of "rationalism" as proposed by
         | philosophers like Descartes and Kant, but I don't think that is
         | an accurate representation of the type of "rationalism" this
         | article describes.
         | 
         | This article describes "rationalism" as described in LessWrong
         | and the sequences by Eliezer Yudkowsky. A good amount of it
         | based on empirical findings from psychology behavior science.
         | It's called "rationalism" because it seeks to correct common
         | reasoning heuristics that are purported to lead to incorrect
         | reasoning, not in contrast to empiricism.
        
           | FergusArgyll wrote:
           | I was going to write a similar comment as op, so permit me to
           | defend it:
           | 
           | Many of their "beliefs" - Super-duper intelligence, doom -
           | are clearly not believed by the market; Observing the market
           | is a kind of empiricism and it's completely discounted by the
           | lw-ers
        
           | glenstein wrote:
           | Agreed, I appreciate that there's a conceptual distinction
           | between the philosophical versions of rationalism and
           | empiricism, but what's being talked about here is a
           | conception that (again, at least notionally) is interested in
           | and compatible with both.
           | 
           | I am pretty sure many of the LessWrong posts are about how to
           | understand the meaning of different types of data and are
           | very much about examining, developing, criticizing a rich
           | variety of empirical attitudes.
        
         | gethly wrote:
         | But you cannot have reason without substantial proof of how
         | things behave by observing them in the first place. Reason is
         | simply a logical approach to yes and no questions where you
         | factually know, from observation of past events, how things
          | work. And therefore you can simulate an outcome by applying
          | reasoning to a situation that you have not yet observed and
          | come to a logical conclusion, given the set of rules and
          | presumptions.
        
         | handoflixue wrote:
         | Rationalists, in this case, refers specifically to the
         | community clustered around LessWrong, which explicitly and
         | repeatedly emphasizes points like "you can't claim to have a
         | well grounded belief if you don't actually have empirical
         | evidence for it" (https://www.lesswrong.com/w/evidence for a
         | quick overview of some of the basic posts on that topic)
         | 
         | To quote one of the core foundational articles: "Before you try
         | mapping an unseen territory, pour some water into a cup at room
         | temperature and wait until it spontaneously freezes before
         | proceeding. That way you can be sure the general trick--
         | ignoring infinitesimally tiny probabilities of success--is
         | working properly."
         | (https://www.lesswrong.com/posts/eY45uCCX7DdwJ4Jha/no-one-
         | can...)
         | 
         | One can argue how well the community absorbs the lesson, but
         | this certainly seems to be a much higher standard than average.
        
       | AIPedant wrote:
       | I think I found the problem!                 The rationalist
       | community was drawn together by AI researcher Eliezer Yudkowsky's
       | blog post series The Sequences, a set of essays about how to
       | think more rationally
       | 
        | I actually don't mind Yudkowsky as an individual - I think he is
       | almost always wrong and undeservedly arrogant, but mostly
       | sincere. Yet treating him as an AI researcher and serious
       | philosopher (as opposed to a sci-fi essayist and self-help
       | writer) is the kind of slippery foundation that less scrupulous
       | people can build cults from. (See also Maharishi Mahesh Yogi and
       | related trends - often it is just a bit of spiritual goofiness as
       | with David Lynch, sometimes you get a Charles Manson.)
        
         | polytely wrote:
          | Don't forget the biggest scifi guy turned cult leader of all,
          | L. Ron Hubbard.
        
           | AIPedant wrote:
            | I don't think Yudkowsky is at all like L. Ron Hubbard.
            | Hubbard was insane and pure evil. Yudkowsky seems like a
           | decent and basically reasonable guy, he's just kind of a
           | blowhard and he's wrong about the science.
           | 
           | L. Ron Hubbard is more like the Zizians.
        
             | pingou wrote:
             | I don't have a horse in the battle but could you provide a
             | few examples where he was wrong?
        
               | bglazer wrote:
               | Here's one: Yudkowsky has been confidently asserting (for
                | years) that AI will drive humanity extinct because it will
               | learn how to make nanomachines using "strong" covalent
               | bonds rather than the "weak" van der Waals forces used by
               | biological systems like proteins. I'm certain that
               | knowledgeable biologists/physicists have tried to explain
               | to him why this belief is basically nonsense, but he just
               | keeps repeating it. Heck there's even a LessWrong post
               | that lays it out quite well [1]. This points to a general
               | disregard for detailed knowledge of existing things and a
               | preference for "first principles" beliefs, no matter how
               | wrong they are.
               | 
               | [1]
               | https://www.lesswrong.com/posts/8viKzSrYhb6EFk6wg/why-
               | yudkow...
        
         | fulafel wrote:
         | How has he fared in the fields of philosophy and AI research in
         | terms of peer review, is there some kind of roundup or survey
          | about this?
        
       | iwontberude wrote:
       | They watched too much eXistenZ
        
       | lenerdenator wrote:
       | Because humans like people who promise answers.
        
         | andy99 wrote:
         | Boring as it is, this is the answer. It's just more religion.
         | Church, cult, cult, church. So we'll get bored someplace else
         | every Sunday. Does this really change our everyday lives?
        
           | optimalsolver wrote:
           | Funnily enough, the actress who voiced this line is a
           | Scientologist:
           | 
           | https://en.wikipedia.org/wiki/Nancy_Cartwright#Personal_life
        
             | andy99 wrote:
             | I think they were making fun of the "Moonies" so she was
             | probably able to rationalize it. Pretty sure Isaac Hayes
             | quit South Park over their making fun of scientologists.
        
               | ZeroGravitas wrote:
               | I read recently that he suffered a serious medical event
                | around that time and it was actually cult members
               | speaking on his behalf that withdrew him from the show.
               | 
               | I think it was a relative of his claiming this.
        
       | saasapologist wrote:
       | I think we've strayed too far from the Aristotelian dynamics of
       | the self.
       | 
       | Outside of sexuality and the proclivities of their leaders,
       | emphasis on physical domination of the self is lacking. The brain
       | runs wild, the spirit remains aimless.
       | 
       | In the Bay, the difference between the somewhat well-adjusted
       | "rationalists" and those very much "in the mush" is whether or
       | not someone tells you they're in SF or "on the Berkeley side of
       | things"
        
       | j_m_b wrote:
       | > One way that thinking for yourself goes wrong is that you
       | realize your society is wrong about something, don't realize that
       | you can't outperform it, and wind up even wronger.
       | 
       | many such cases
        
         | quantummagic wrote:
         | It's almost the defining characteristic of our time.
        
           | teiferer wrote:
           | Tell-tale slogan: "Let's derive from first principles"
        
         | shadowgovt wrote:
         | It is an unfortunate reality of our existence that sometimes
         | Chesterton actually _did_ build that fence for a good reason, a
          | good reason that's still here.
         | 
         | (One of my favorite TED talks was about a failed experiment in
         | introducing traditional Western agriculture to a people in
         | Zambia. It turns out when you concentrate too much food in one
         | place, the hippos come and eat it all and people can't actually
         | out-fight hippos in large numbers. In hindsight, the people
         | running the program should have asked how likely it was that
         | folks in a region that had exposure to other people's
          | agriculture for thousands of years, hadn't ever, you know...
         | _tried it_. https://www.ted.com/talks/ernesto_sirolli_want_to_h
         | elp_someo...)
        
           | ljlolel wrote:
           | TEDx
        
           | bobson381 wrote:
           | You sound like you'd like the book Seeing like a State.
        
           | im3w1l wrote:
           | Shoot the hippos to death for even more food. If it doesn't
           | seem to work it's just a matter of having more and bigger
           | guns.
        
           | HDThoreaun wrote:
           | Why didnt they kill the hippos like we killed the buffalo?
        
             | lesuorac wrote:
             | Hippos are more dangerous than emus.
             | 
             | https://en.wikipedia.org/wiki/Emu_War
        
               | HDThoreaun wrote:
                | My understanding of the emu war is that they weren't
                | dangerous so much as quick to multiply. The army couldn't
                | whack the moles fast enough. Hippos don't strike me as
                | animals that can go underground when threatened.
        
         | NoGravitas wrote:
         | Capital-R Rationalism also encourages you to think you _can_
         | outperform it, by being smart and reasoning from first
         | principles. That was the idea behind MetaMed, founded by
         | LessWronger Michael Vassar - that being trained in rationalism
         | made you better at medical research and consulting than medical
         | school or clinical experience. Fortunately they went out of
         | business before racking up a body count.
        
         | rpcope1 wrote:
         | One lesson I've learned and seen a lot in my life is that
         | understanding that something is wrong or what's wrong about it,
         | and being able to come up with a better solution are distinct,
          | and the latter is often much harder. It often seems that those
          | who are best able to describe the problem don't overlap much
          | with those who can figure out how to solve it, even though
          | they think they can.
        
         | kiitos wrote:
         | indeed
         | 
         | see: bitcoin
        
       | numbsafari wrote:
       | Why are so many cults founded on fear or hate?
       | 
       | Because empathy is hard.
        
       | meroes wrote:
       | It grew out of many different threads: different websites,
       | communities, etc all around the same time. I noticed it
       | contemporaneously in the philosophy world where Nick Bostrom's
       | Simulation argument was boosted more than it deserved (like
       | everyone was just accepting it at the lay-level). Looking back I
       | see it also developed from less wrong and other sites, but I was
       | wondering what was going on with simulations taking over
       | philosophy talk. Now I see how it all coalesced.
       | 
       | All of it has the appearance of sounding so smart, and a few
       | sites were genuine. But it got taken over.
        
         | potatolicious wrote:
         | Yeah, a lot of the comments here are really just addressing
          | cults writ large, as opposed to why this one was particularly
         | successful.
         | 
         | A significant part of this is the intersection of the cult with
         | money and status - this stuff really took off once prominent SV
         | personalities became associated with it, and got turbocharged
         | when it started intersecting with the angel/incubator/VC scene,
         | when there was implicit money involved.
         | 
         | It's unusually successful because -- for a time at least --
         | there was status (and maybe money) in carrying water for it.
        
           | jacquesm wrote:
           | Paypal will be traced as the root cause of many of our future
           | troubles.
        
             | varjag wrote:
             | Wish I could upvote this twice. It's like intersectionality
             | for evil.
        
         | 6177c40f wrote:
         | To be clear, this article isn't calling rationalism a cult,
         | it's about cults that have some sort of association with
         | rationalism (social connection and/or ideology derived from
         | rationalist concepts), e.g. the Zizians.
        
           | throwanem wrote:
           | This article attempts to establish disjoint categories "good
           | rationalist" and "cultist." Its authorship, and its
           | appearance in the cope publication of the "please take us
           | seriously" rationalist faction, speak volumes of how well it
           | is likely to succeed in that project.
        
             | ImaCake wrote:
             | Not sure why you got down voted for this. The opening
             | paragraph of the article reads as suspicious to the
             | observant outsider:
             | 
             | >The rationalist community was drawn together by AI
             | researcher Eliezer Yudkowsky's blog post series The
             | Sequences, a set of essays about how to think more
             | rationally.
             | 
             | Anyone who had just read a lot about Scientology would read
             | that and have alarm bells ringing.
        
       | alphazard wrote:
       | The terminology here is worth noting. Is a Rationalist Cult a
       | cult that practices Rationalism according to third parties, or is
       | it a cult that says they are Rationalist?
       | 
       | Clearly all of these groups that believe in demons or realities
       | dictated by tabletop games are not what third parties would call
       | Rationalist. They might call themselves that.
       | 
       | There are some pretty simple tests that can out these groups as
       | not rational. None of these people have ever seen a demon, so
       | world models including demons have never predicted any of their
       | sense data. I doubt these people would be willing to make any
       | bets about when or if a demon will show up. Many of us would be
       | _glad_ to make a market concerning predictions made by tabletop
        | games about physical phenomena.
        
         | ameliaquining wrote:
         | The article is talking about cults that arose out of the
         | rationalist social milieu, which is a separate question from
         | whether the cult's beliefs qualify as "rationalist" in some
         | sense (a question that usually has no objective answer anyway).
        
         | glenstein wrote:
         | Yeah, I would say the groups in question are notionally,
         | aspirationally rational and I would hate for the takeaway to be
         | disengagement from principles of critical thinking and
         | skeptical thinking writ large.
         | 
          | Which, to me, raises the fascinating question: what does a
          | "good" version look like, for groups and group dynamics
          | centered around a shared interest in best practices associated
          | with critical thinking?
         | 
         | At a first impression, I think maybe these virtues (which are
         | real!) disappear into the background of other, more applied
          | specializations, whether professions, hobbies, or backyard
          | family barbecues.
        
           | alphazard wrote:
           | It would seem like the quintessential Rationalist institution
           | to congregate around is the prediction market. Status in the
           | community has to be derived from a history of making good
           | bets (PnL as a %, not in absolute terms). And the sense of
           | community would come from (measurably) more rational people
           | teaching (measurably) less rational people how to be more
           | rational.
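            | 
            | (As one illustration of what "measurably" could mean, a
            | minimal sketch with invented forecasts: besides PnL, a
            | standard way to score probabilistic predictions is the
            | Brier score, where lower is better.)
            | 
            |     # Brier score: mean squared error between stated
            |     # probabilities and outcomes; forecasts are invented.
            |     def brier(forecasts, outcomes):
            |         pairs = zip(forecasts, outcomes)
            |         return sum((p - o) ** 2 for p, o in pairs) / len(outcomes)
            | 
            |     outcomes = [1, 0, 1, 1, 0]
            |     calibrated = [0.8, 0.1, 0.7, 0.9, 0.2]
            |     overconfident = [1.0, 0.0, 1.0, 0.0, 1.0]
            |     print(brier(calibrated, outcomes))     # 0.038
            |     print(brier(overconfident, outcomes))  # 0.4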
        
             | handoflixue wrote:
             | The founder of LessWrong / The Rationalist movement would
             | absolutely agree with you here, and has written numerous
             | fanfics about a hypothetical alien society ("Dath Ilan")
             | where those are fairly central.
        
         | Barrin92 wrote:
         | >so world models including demons have never predicted any of
         | their sense data.
         | 
         | There's a reason they call themselves "rationalists" instead of
         | empiricists or positivists. They perfectly inverted Hume
         | ("reason is, and ought only to be the slave of the passions")
         | 
          | These kinds of harebrained views aren't an accident but a
          | product of rationalism. The idea that intellect is
          | quasi-infinite and that the world can be mirrored in the mind
          | doesn't run contrary to rationalism; it is just the most
          | extreme form of rationalism taken to its conclusion, and of
          | course deeply religious, hence the constant fantasies about AI
          | divinities and singularities.
        
       | wiredfool wrote:
       | It's really worth reading up on the techniques from Large Group
       | Awareness Training so that you can recognize them when they pop
       | up.
       | 
       | Once you see them listed (social pressure, sleep deprivation,
       | control of drinking/bathroom, control of language/terminology,
       | long exhausting activities, financial buy in, etc) and see where
       | they've been used in cults and other cult adjacent things it's a
       | little bit of a warning signal when you run across them IRL.
        
         | derektank wrote:
         | Related, the BITE model of authoritarian control is also a
         | useful framework for identifying malignant group behavior. It's
         | amazing how consistent these are across groups and cultures,
         | from Mao's inner circle to NXIVM and on.
         | 
         | https://freedomofmind.com/cult-mind-control/bite-model-pdf-d...
        
       | keybored wrote:
       | Cue all the surface-level "tribalism/loneliness/hooman nature"
       | comments instead of the simple analysis that Rationalism (this
       | kind) is severely brain-broken and irredeemable and will just
       | foster even worse outcomes in a group setting. It's a bit too
       | close to home (ideologically) to get a somewhat detached
       | analysis.
        
       | bubblyworld wrote:
       | What is the base rate here? Hard to know the scope of the problem
       | without knowing how many non-rationalists (is that even a
       | coherent group of people?) end up forming weird cults, as a
       | comparison. My impression is that crazy beliefs are common
       | amongst _everybody_.
       | 
       | A much simpler theory is that rationalists are mostly normal
       | people, and normal people tend to form cults.
        
         | glenstein wrote:
         | I was wondering about this too. You could also say it's a
          | Sturgeon's law question.
         | 
         | They do note at the beginning of the article that many, if not
          | most, such groups have reasonably normal dynamics, for what it's
         | worth. But I think there's a legitimate question of whether we
         | ought to expect groups centered on rational thinking to be
         | better able to escape group dynamics we associate with
         | irrationality.
        
       | zzzeek wrote:
       | because humans are biological creatures iterating through complex
       | chemical processes that are attempting to allow a large organism
       | to survive and reproduce within the specific ecosystem provided
       | by the Earth in the present day. "Rational reasoning" is a quaint
       | side effect that sometimes is emergent from the nervous system of
       | these organisms, but it's nothing more than that. It's normal
       | that the surviving/reproducing organism's emergent side effect of
       | "rational thought", when it is particularly intense, will self-
       | refer to the organism and act as though it has some kind of
       | dominion over the organism itself, but this is, like the
       | rationalism itself, just an emergent effect that is accidental
       | and transient. Same as if you see a cloud that looks like an
       | elephant (it's still just a cloud).
        
       | rkapsoro wrote:
       | Something like 15 years ago I once went to a Less
       | Wrong/Overcoming Bias meetup in my town after being a reader of
       | Yudkowsky's blog for some years. I was like, Bayesian Conspiracy,
       | cool, right?
       | 
       | The group was weird and involved quite a lot of creepy
       | oversharing. I didn't return.
        
       | dfabulich wrote:
       | The whole game of Rationalism is that you should ignore gut
       | intuitions and cultural norms that you can't justify with
       | rational arguments.
       | 
       | Well, it turns out that intuition and long-lived cultural norms
       | often have rational justifications, but individuals may not know
       | what they are, and norms/intuitions provide useful antibodies
        | against narcissistic would-be cult leaders.
       | 
       | Can you find the "rational" justification not to isolate yourself
       | from non-Rationalists, not to live with them in a polycule, and
       | not to take a bunch of psychedelic drugs with them? If you can't
       | solve that puzzle, you're in danger of letting the group take
       | advantage of you.
        
         | StevenWaterman wrote:
         | Yeah, I think this is exactly it. If something sounds extremely
         | stupid, or if everyone around you says it's extremely stupid,
         | it probably is. If you can't justify it, it's probably because
         | you have failed to find the reason it's stupid, not because
         | it's actually genius.
         | 
         | And the crazy thing is, none of that is fundamentally opposed
         | to rationalism. You can be a rationalist who ascribes value to
         | gut instinct and societal norms. Those are the product of
         | millions of years of pre-training.
         | 
         | I have spent a fair bit of time thinking about the meaning of
         | life. And my conclusions have been pretty crazy. But they sound
         | insane, so until I figure out why they sound insane, I'm not
         | acting on those conclusions. And I'm definitely not surrounding
         | myself with people who take those conclusions seriously.
        
         | empath75 wrote:
         | > The whole game of Rationalism is that you should ignore gut
         | intuitions and cultural norms that you can't justify with
         | rational arguments.
         | 
         | The game as it is _actually_ played is that you use rationalist
         | arguments to justify your pre-existing gut intuitions and
         | personal biases.
        
           | NoGravitas wrote:
           | Or worse - to justify the gut intuitions and personal biases
           | of your cult leader.
        
           | xbar wrote:
           | Which is to say, Rationalism is easily abused to justify any
           | behavior contrary to its own tenets, just like any other
           | -ism.
        
           | copularent wrote:
            | Exactly. Humans are rationalizers. They operate on
            | pre-existing gut intuitions and biases, then invent
            | after-the-fact, rational-sounding justifications.
           | 
           | I guess Pareto wasn't on the reading list for these
           | intellectual frauds.
           | 
           | Those are actually the priors being updated lol.
        
         | kelseyfrog wrote:
         | > The whole game of Rationalism is that you should ignore gut
         | intuitions and cultural norms that you can't justify with
         | rational arguments.
         | 
         | Specifically, rationalism spends a lot of time talking about
         | priors, but a sneaky thing happens that I call the 'double
         | update'.
         | 
         | Bayesian updating works when you update your genuine prior
         | belief with new evidence. No one disagrees with this, and
         | sometimes it's easy and sometimes it's difficult to do.
         | 
         | What Rationalists often end up doing is relaxing their priors -
         | intuition, personal experience, cultural norms - and then
         | updating. They often think of this as one update, but what it
         | is is two. The first update, relaxing priors, isn't associated
         | with evidence. It's part of the community norms. There is an
         | implicit belief that by relaxing one's priors you're more open
         | to reality. The real result though, is that it sends people
         | wildly off course. Case in point: all the cults.
         | 
         | Consider the pre-tipped scale. You suspect the scale reads a
         | little low, so before weighing you tilt it slightly to
         | "correct" for that bias. Then you pour in flour until the dial
         | says you've hit the target weight. You've followed the numbers
         | exactly, but because you started from a tipped scale, you've
         | ended up with twice the flour the recipe called for.
         | 
         | Trying to correct for bias by relaxing priors _is_ an update,
         | just not one grounded in evidence; it happens because everyone
         | else is doing it.
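         | 
         | A rough sketch in Python (the numbers are made up, purely to
         | illustrate the mechanism) of how "relax the prior, then
         | update" overshoots a single honest update:
         | 
         |     def posterior(prior, p_e_h, p_e_not_h):
         |         # one Bayesian update: P(H|E)
         |         num = p_e_h * prior
         |         return num / (num + p_e_not_h * (1 - prior))
         | 
         |     genuine_prior = 0.05    # what you actually believe
         |     relaxed_prior = 0.50    # "open" prior adopted by norm
         |     evidence = (0.8, 0.2)   # P(E|H), P(E|not H)
         | 
         |     print(posterior(genuine_prior, *evidence))  # ~0.17
         |     print(posterior(relaxed_prior, *evidence))  # ~0.80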
        
           | ewoodrich wrote:
           | Thanks, that's a fantastic description of a phenomenon I've
           | observed but couldn't quite put my finger on.
        
           | windowshopping wrote:
           | > Consider the pre-tipped scale. You suspect the scale reads
           | a little low, so before weighing you tilt it slightly to
           | "correct" for that bias. Then you pour in flour until the
           | dial says you've hit the target weight. You've followed the
           | numbers exactly, but because you started from a tipped scale,
           | you've ended up with twice the flour the recipe called for.
           | 
           | I'm not following this example at all. If you've zero'd out
           | the scale by tilting, why would adding flour until it reads
           | 1g lead to 2g of flour?
        
         | twic wrote:
         | From another piece about the Zizians [1]:
         | 
         | > The ability to dismiss an argument with a "that sounds nuts,"
         | without needing recourse to a point-by-point rebuttal, is
         | anathema to the rationalist project. But it's a pretty
         | important skill to have if you want to avoid joining cults.
         | 
         | [1] https://maxread.substack.com/p/the-zizians-and-the-
         | rationali...
        
       | JohnMakin wrote:
       | One of a few issues I have with groups like these, is that they
       | often confidently and aggressively spew a set of beliefs that on
       | their face logically follow from one another, until you realize
       | they are built on a set of axioms that are either entirely
       | untested or outright nonsense. This is common everywhere, but it
       | feels especially pronounced in communities like this. It also
       | involves quite a bit of navel gazing that makes me feel a little
       | sick to participate in.
       | 
       | The smartest people I have ever known have been profoundly unsure
       | of their beliefs and what they know. I immediately become
       | suspicious of anyone who is very certain of something, especially
       | if they derived it on their own.
        
         | bobson381 wrote:
         | There should be an extremist cult of people who are _certain
         | only that uncertainty is the only certain thing_
        
           | ameliaquining wrote:
           | Like Robert Anton Wilson if he were way less chill, perhaps.
        
           | rpcope1 wrote:
           | More people should read Sextus Empiricus as he's basically
           | the O.G. Pyrrhonist skeptic and goes pretty hard on this very
           | train of thought.
        
             | bobson381 wrote:
             | Cool. Any specific recs or places to start with him?
        
               | rpcope1 wrote:
               | Probably the Hackett book, "Sextus Empiricus: Selections
               | from the Major Writings on Scepticism"
        
               | bobson381 wrote:
               | Thanks!
        
             | Telemakhos wrote:
             | If I remember my Gellius, it was the Academic Skeptics who
             | claimed that the only certainty was uncertainty; the
             | Pyrrhonists, in opposition, denied that one could be
             | certain about the certainty of uncertainty.
        
           | arwhatever wrote:
           | "Oh, that must be exhausting."
        
           | saltcured wrote:
           | There would be, except we're all very much on the fence about
           | whether it is the right cult for us.
        
           | hungmung wrote:
           | What makes you so certain there isn't? A group that has a
           | deep understanding fnord of uncertainty would probably like
           | to work behind the scenes to achieve their goals.
        
             | dcminter wrote:
             | One might even call them illuminati? :D
        
             | cwmoore wrote:
             | The Fnords do keep a lower profile.
        
           | card_zero wrote:
           | The Snatter Goblins?
           | 
           | https://archive.org/details/goblinsoflabyrin0000frou/page/10.
           | ..
        
           | pancakemouse wrote:
           | My favourite bumper sticker, "Militant Agnostic. I don't
           | know, and neither do you."
        
             | bobson381 wrote:
             | I heard about this the other day! I think I need one.
        
           | jazzyjackson wrote:
           | A Wonderful Phrase by Gandhi
           | 
           |     I do dimly perceive
           |     that while everything around me is ever-changing,
           |     ever-dying there is, underlying all that change,
           |     a living power that is changeless,
           |     that holds all together,
           |     that creates, dissolves, and recreates
        
           | tim333 wrote:
           | Socrates was fairly close to that.
        
             | freedomben wrote:
             | My thought as well! I can't remember names at the moment,
             | but there were some cults that spun off from Socrates.
             | Unfortunately they also adopted his practice of never
             | writing anything down, so we don't know a whole lot about
             | them
        
           | JTbane wrote:
           | "I have no strong feelings one way or the other." _thunderous
           | applause_
        
           | tomjakubowski wrote:
           | https://realworldrisk.com/
        
           | mapontosevenths wrote:
           | There already is, they're called "Politicians."
        
         | ctoth wrote:
         | > I immediately become suspicious of anyone who is very certain
         | of something, especially if they derived it on their own.
         | 
         | Are you certain about this?
        
           | JohnMakin wrote:
           | no
        
           | idontwantthis wrote:
           | Suspicious implies uncertain. It's not immediate rejection.
        
           | teddyh wrote:
           | All I know is that I know nothing.
        
             | p1esk wrote:
             | How do you know?
        
           | adrianN wrote:
           | Your own state of mind is one of the easiest things to be
           | fairly certain about.
        
             | ants_everywhere wrote:
             | The fact that this is false is one of the oldest findings
             | of research psychology
        
               | PaulHoule wrote:
               | Marvin Minsky wrote forcefully [1] about this in _The
               | Society of Mind_ and went so far as to say that trying to
               | observe yourself (e.g. meditation) might be harmful.
               | 
               | Freud of course discovered a certain world of the
               | unconscious, but untrained [2] you would certainly
               | struggle to explain how you know sentence S is
               | grammatical and S' is not, or what it is you do when you
               | walk.
               | 
               | If you did meditation or psychoanalysis or some other
               | practice to understand yourself better it would take
               | years.
               | 
               | [1] whether or not it is true.
               | 
               | [2] the "scientific" explanation you'd have if you're
               | trained may or may not be true since it can't be used to
               | program a computer to do it
        
             | lazide wrote:
             | said no one familiar with their own mind, ever!
        
           | tshaddox wrote:
           | Well you could be a critical rationalist and do away with the
           | notion of "certainty" or any sort of justification or
           | privileged source of knowledge (including "rationality").
        
           | at-fates-hands wrote:
           | Isaac Newton would like to have a word.
        
             | elictronic wrote:
             | I am not a big fan of alchemy, thank you though.
        
         | jl6 wrote:
         | I don't think it's just (or even particularly) bad axioms, I
         | think it's that people tend to build up "logical" conclusions
         | where they think each step is a watertight necessity that
         | follows inevitably from its antecedents, but actually each step
         | is a little bit leaky, leading to runaway growth in false
         | confidence.
         | 
         | Not that non-rationalists are any better at reasoning, but non-
         | rationalists do at least benefit from some intellectual
         | humility.
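         | 
         | A rough illustration of the leak, with made-up numbers: ten
         | steps each held at 90% confidence compound to about 0.9^10 ~
         | 0.35, so the "watertight" chain is more likely wrong than
         | right even though no single step feels risky.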
        
           | danaris wrote:
           | > I think it's that people tend to build up "logical"
           | conclusions where they think each step is a watertight
           | necessity that follows inevitably from its antecedents, but
           | actually each step is a little bit leaky, leading to runaway
           | growth in false confidence.
           | 
           | Yeah, this is a pattern I've seen a lot of recently--
           | especially in discussions about LLMs and the supposed
           | inevitability of AGI (and the Singularity). This is a good
           | description of it.
        
             | kergonath wrote:
             | Another annoying one is the simulation theory group. They
             | know just enough about Physics to build sophisticated
             | mental constructs without understanding how flimsy the
             | foundations are or how their logical steps are actually
             | unproven hypotheses.
        
               | JohnMakin wrote:
               | Agreed. This one is especially annoying to me and dear to
               | my heart, because I enjoy discussing the philosophy
               | behind this, but it devolves into weird discussions and
               | conclusions fairly quickly without much effort at all. I
               | particularly enjoy the tenets of certain sects of
               | buddhism and how they view these things, but you'll get a
               | lot of people that are doing a really pseudo-intellectual
               | version of the Matrix where they are the main character.
        
           | kergonath wrote:
           | > I don't think it's just (or even particularly) bad axioms,
           | I think it's that people tend to build up "logical"
           | conclusions where they think each step is a watertight
           | necessity that follows inevitably from its antecedents, but
           | actually each step is a little bit leaky, leading to runaway
           | growth in false confidence.
           | 
           | I really like your way of putting it. It's a fundamental
           | fallacy to assume certainty when trying to predict the
           | future. Because, as you say, uncertainty compounds over time,
           | all prediction models are chaotic. It's usually associated
           | with some form of Dunning-Kruger, where people know just
           | enough to have ideas but not enough to understand where they
           | might fail (thus vastly underestimating uncertainty at each
           | step), or just lacking imagination.
        
             | ramenbytes wrote:
             | Deep Space 9 had an episode dealing with something similar.
             | Superintelligent beings determine that a situation is
             | hopeless and act accordingly. The normal beings take issue
             | with the actions of the Superintelligents. The normal
             | beings turn out to be right.
        
           | BeFlatXIII wrote:
           | > Not that non-rationalists are any better at reasoning, but
           | non-rationalists do at least benefit from some intellectual
           | humility.
           | 
           | Non-rationalists are forced to use their physical senses more
           | often because they can't follow the chain of logic as far.
           | This is to their advantage. Empiricism > rationalism.
        
             | om8 wrote:
             | Good rationalism includes empiricism though
        
             | whatevertrevor wrote:
             | That conclusion presupposes that rationality and empiricism
             | are at odds or mutually incompatible somehow. Any rational
             | position worth listening to, about any testable hypothesis,
             | is hand in hand with empirical thinking.
        
               | guerrilla wrote:
               | In traditional philosophy, rationalism and empiricism are
               | at odds; they are essentially diametrically opposed.
               | Rationalism prioritizes _a priori_ reasoning while
               | empiricism prioritizes _a posteriori_ reasoning. You can
               | prioritize both equally but that is neither rationalism
               | nor empiricism in the traditional terminology. The
               | current rationalist movement has no relation to that
                | original rationalist movement, so the words don't
               | actually mean the same thing. In fact, the majority of
               | participants in the current movement seem ignorant of the
               | historical dispute and its implications, hence the misuse
               | of the word.
        
               | BlueTemplar wrote:
               | Yeah, Stanford has a good recap :
               | 
               | https://plato.stanford.edu/entries/rationalism-
               | empiricism/
               | 
               | (Note also how the context is French vs British, and the
               | French basically lost with Napoleon, so the current
               | "rationalists" seem to be more likely to be heirs to
               | empiricism instead.)
        
               | whatevertrevor wrote:
               | Thank you for clarifying.
               | 
               | That does compute with what I thought the "Rationalist"
               | movement as covered by the article was about. I didn't
               | peg them as pure _a priori_ thinkers as you put it. I
               | suppose my comment still holds, assuming the rationalist
               | in this context refers to the version of  "Rationalism"
               | being discussed in the article as opposed to the
               | traditional one.
        
           | analog31 wrote:
           | Perhaps part of being rational, as opposed to rationalist, is
           | having a sense of when to override the conclusions of
           | seemingly logical arguments.
        
             | 1attice wrote:
             | In philosophy grad school, we described this as 'being
             | reasonable' as opposed to 'being rational'.
             | 
             | That said, big-R Rationalism (the Lesswrong/Yudkowsky/Ziz
             | social phenomenon) has very little in common with what
             | we've standardly called 'rationalism'; trained philosophers
             | tend to wince a little bit when we come into contact with
             | these groups (who are nevertheless chockablock with
             | fascinating personalities and compelling aesthetics.)
             | 
             | From my perspective (and I have only glancing contact,)
             | these mostly seem to be _cults of consequentialism_, an
             | epithet I'd also use for Effective Altruists.
             | 
             | Consequentialism has been making young people say and do
             | daft things for hundreds of years -- Dostoevsky's _Crime
             | and Punishment_ being the best character sketch I can think
             | of.
             | 
             | While there are plenty of non-religious (and thus, small-r
             | rationalist) alternatives to consequentialism, none of them
             | seem to make it past the threshold in these communities.
             | 
             | The other codesmell these big-R rationalist groups have for
             | me, and that which this article correctly flags, is their
             | weaponization of psychology -- while I don't necessarily
             | doubt the findings of sociology, psychology, etc, I wonder
             | if they necessarily furnish useful tools for personal
             | improvement. For example, memorizing a list of biases that
             | people can potentially have is like numbering the stars in
             | the sky; to me, it seems like this is a cargo-cultish
             | transposition of the act of finding _fallacies in
             | arguments_ into the domain of finding _faults in persons_.
             | 
             | And that's a relatively mild use of psychology. I simply
             | can't imagine how annoying it would be to live in a
             | household where everyone had memorized everything from
             | connection theory to attachment theory to narrative therapy
             | and routinely deployed hot takes on one another.
             | 
             | In actual philosophical discussion, back at the academy,
             | psychologizing was considered 'below the belt', and would
             | result in an intervention by the ref. Sometimes this was
             | explicitly associated with something we called 'the
             | Principle of Charity', which is that, out of an abundance
             | of epistemic caution, you commit to always interpreting the
             | motives and interests of your interlocutor in the kindest
             | light possible, whether in 'steel manning' their arguments,
             | or turning a strategically blind eye to bad behaviour in
             | conversation.
             | 
             | The Principle of Charity is probably the most enduring
             | lesson I took from my decade-long sojourn among the
             | philosophers, and mutual psychological dissection is
             | anathema to it.
        
               | rendx wrote:
               | > to me, it seems like this is a cargo-cultish
               | transposition of the act of finding _fallacies in
               | arguments_ into the domain of finding _faults in
               | persons_.
               | 
               | Well put, thanks!
        
           | MajimasEyepatch wrote:
           | I feel this way about some of the more extreme effective
           | altruists. There is no room for uncertainty or recognition of
           | the way that errors compound.
           | 
           | - "We should focus our charitable endeavors on the problems
           | that are most impactful, like eradicating preventable
           | diseases in poor countries." Cool, I'm on board.
           | 
           | - "I should do the job that makes the absolute most amount of
           | money possible, like starting a crypto exchange, so that I
           | can use my vast wealth in the most effective way." Maybe? If
           | you like crypto, go for it, I guess, but I don't think that's
           | the only way to live, and I'm not frankly willing to trust
           | the infallibility and incorruptibility of these so-called
           | geniuses.
           | 
           | - "There are many billions more people who will be born in
           | the future than those people who are alive today. Therefore,
           | we should focus on long-term problems over short-term ones
           | because the long-term ones will affect far more people."
           | Long-term problems are obviously important, but the further
           | we get into the future, the less certain we can be about our
           | projections. We're not even good at seeing five years into
           | the future. We should have very little faith in some
           | billionaire tech bro insisting that their projections about
           | the 22nd century are correct (especially when those
           | projections just so happen to show that the best thing you
           | can do in the present is buy the products that said tech bro
           | is selling).
        
             | xg15 wrote:
             | The "longtermism" idea never made sense to me: So we should
             | sacrifice the present to save the future. Alright. But then
             | those future descendants would also have to sacrifice
             | _their_ present to save _their_ future, etc. So by that
             | logic, there could never be a time that was not full of
             | misery. So then why do all of that stuff?
        
               | twic wrote:
               | At some point in the future, there won't be more people
               | who will live in the future than live in the present, at
               | which point you are allowed to improve conditions today.
               | Of course, by that point the human race is nearly
               | finished, but hey.
               | 
               | That said, if they really thought hard about this
               | problem, they would have come to a different conclusion:
               | 
               | https://theconversation.com/solve-suffering-by-blowing-
               | up-th...
        
               | xg15 wrote:
               | Some time after we've colonized half the observable
               | universe. Got it.
        
               | vharuck wrote:
               | Zeno's poverty
        
               | rawgabbit wrote:
               | To me it is a disguised way of saying the ends justify the
               | means. Sure, we murder a few people today but think of
               | the utopian paradise we are building for the future.
        
               | vlowther wrote:
               | "I came up with a step-by-step plan to achieve World
               | Peace, and now I am on a government watchlist!"
        
               | to11mtm wrote:
               | Well, there's a balance to be had. Do the most good you
               | can while still being able to survive the rat race.
               | 
               | However, people are bad at that.
               | 
               | I'll give an interesting example.
               | 
               | Hybrid Cars. Modern proper HEVs[0] usually benefit
               | their owners, both by virtue of better fuel economy as
               | well as in most cases being overall more reliable than a
               | normal car.
               | 
               | And, they are better on CO2 emissions and lower our oil
               | consumption.
               | 
               | And yet most carmakers as well as consumers have been
               | very slow to adopt. On the consumer side we are finally
               | to where we can have hybrid trucks that can get 36-40MPG,
               | capable of towing 4000 pounds or hauling over 1000 pounds
               | in the bed [1]; we have hybrid minivans capable of 35MPG
               | for transporting groups of people, we have hybrid sedans
               | getting 50+ and Small SUVs getting 35-40+MPG for people
               | who need a more normal 'people' car. And while they are
               | selling better it's insane that it took as long as it has
               | to get here.
               | 
               | The main 'misery' you experience at that point, is that
               | you're driving the same car as a lot of other people and
               | it's not as exciting [2] as something with more power
               | than most people know what to do with.
               | 
               | And hell, as they say in investing, sometimes the market
               | can be irrational longer than you can stay solvent. E.g.
               | was it truly worth it to Hydro-Quebec to sit on LiFePO4
               | patents the way they did vs just figuring out licensing
               | terms that got them a little bit of money to then
               | properly accelerate adoption of Hybrids/EVs/etc?
               | 
               | [0] - By this I mean Something like Toyota's HSD style
               | setup used by Ford and Subaru, or Honda or Hyundai/Kia's
               | setup where there's still a more normal transmission
               | involved.
               | 
               | [1] - Ford advertises up to 1500 pounds, but I feel like
               | the GVWR allows for a 25 pound driver at that point.
               | 
               | [2] - I feel like there's ways to make an exciting
               | hybrid, but until there's a critical mass or Stellantis
               | gets their act together, it won't happen...
        
               | BlueTemplar wrote:
               | Not that these technologies don't have anything to bring,
               | but any discussion that still presupposes that
               | cars/trucks(/planes) (as we know them) still have a
               | future is (mostly) a waste of time.
               | 
               | P.S.: The article mentions the "normal error-checking
               | processes of society"... but what makes them so sure
               | cults aren't part of them ?
               | 
               | It's not like society is particularly good about it
               | either, immune from groupthink (see the issue above) -
               | and who do you think is more likely to kick-start a
               | strong enough alternative ?
               | 
               | (Or they are just sad about all the failures ? But it's
               | questionable that the "process" can work (with all its
               | vivacity) without the "failures"...)
        
             | human_person wrote:
             | "I should do the job that makes the absolute most amount of
             | money possible, like starting a crypto exchange, so that I
             | can use my vast wealth in the most effective way."
             | 
             | Has always really bothered me because it assumes that there
             | are no negative impacts of the work you did to get the
             | money. If you do a million dollars worth of damage to the
             | world and earn 100k (or a billion dollars worth of damage
             | to earn a million dollars), even if you spend all of the
             | money you earned on making the world a better place, you
             | aren't even going to fix 10% of the damage you caused (and
             | that's ignoring the fact that it's usually easier/cheaper to
             | break things than to fix them).
        
               | to11mtm wrote:
               | > If you do a million dollars worth of damage to the
               | world and earn 100k (or a billion dollars worth of damage
               | to earn a million dollars), even if you spend all of the
               | money you earned on making the world a better place, you
               | arent even going to fix 10% of the damage you caused (and
               | thats ignoring the fact that its usually easier/cheaper
               | to break things than to fix them).
               | 
               | You kinda summed up a lot of the world post industrial
               | revolution there, at least as far as stuff like toxic
               | waste (Superfund, anyone?) and stuff like climate change,
               | I mean for goodness sake let's just think about TEL and
               | how they _knew_ Ethanol could work but it just wasn't
               | 'patentable'. [0] Or the "We don't even know the dollar
               | amount because we don't have a workable solution" problem
               | of PFAS.
               | 
               | [0] - I still find it shameful that a university is named
               | after the man who enabled this to happen.
        
           | abtinf wrote:
           | > non-rationalists do at least benefit from some intellectual
           | humility
           | 
           | The Islamists who took out the World Trade Center don't
           | strike me as particularly intellectually humble.
           | 
           | If you reject reason, you are only left with force.
        
             | morleytj wrote:
             | I now feel the need to comment that this thread does
             | illustrate an issue I have with the naming of the
             | philosophical/internet community of rationalism.
             | 
             | One can very clearly be a rational individual or an
             | individual who practices reason and not associate with the
             | internet community of rationalism. The median member of the
             | group defined as "not being part of the internet-organized
             | movement of rationalism and not reading lesswrong posts" is
             | not "religious extremist striking the world trade center
             | and committing an atrocious act of terrorism", it's "random
             | person on the street."
             | 
             | And to preempt a specific response some may make to this,
             | yes, the thread here is talking about rationalism as
             | discussed in the blog post above as organized around
             | Yudkowsky or slate star codex, and not the rationalist
             | movement of like, Spinoza and company. Very different
             | things philosophically.
        
             | prisenco wrote:
             | Are you so sure the 9/11 hijackers rejected reason?
             | 
             |  _Why Are So Many Terrorists Engineers?_
             | 
             | https://archive.is/XA4zb
             | 
             | Self-described rationalists can and often do rationalize
             | acts and beliefs that seem baldly irrational to others.
        
             | montefischer wrote:
             | Islamic fundamentalism and cult rationalism are both
             | involved in a "total commitment", "all or nothing" type of
             | thinking. The former is totally committed to a particular
             | literal reading of scripture, the latter, to logical
             | deduction from a set of chosen premises. Both modes of
             | thinking have produced violent outcomes in the past.
             | 
             | Skepticism, in which no premise or truth claim is regarded
             | as above dispute (or, that it is always permissible and
             | even praiseworthy to suspend one's judgment on a matter),
             | is the better comparison with rationalism-fundamentalism.
             | It is interesting that skepticism today is often associated
             | with agnostic or atheist religious beliefs, but I consider
             | many religious thinkers in history to have been skeptics
             | par excellence when judged by the standard of their own
             | time. E.g. William Ockham (of Ockham's razor) was a 14C
             | Franciscan friar (and a fascinating figure) who denied
             | papal infallibility. I count Martin Luther as belonging to
             | the history of skepticism as well, along with much of the
             | humanist movement that returned to the original Greek
             | sources for the Bible, away from the Latin Vulgate
             | translation by Jerome.
             | 
             | The history of ideas is fun to read about. I am hardly an
             | expert, but you may be interested by the history of
             | Aristotelian rationalism, which gained prominence in the
             | medieval west largely through the works of Averroes, a 12C
             | Muslim philosopher who heavily favored Aristotle. In 13C,
             | Thomas Aquinas wrote a definitive Catholic systematic
             | theology, rejecting Averroes but embracing Aristotle. To
             | this day, Catholic theology is still essentially
             | Aristotelian.
        
               | throwway120385 wrote:
               | The only absolute above questioning is that there are no
               | absolutes.
        
           | tibbar wrote:
           | Yet I think most people err in the other direction. They
           | 'know' the basics of health, of discipline, of charity, but
           | have a hard time following through. 'Take a simple idea, and
           | take it seriously': a favorite aphorism of Charlie Munger.
           | Most of the good things in my life have come from trying to
           | follow through the real implications of a theoretical belief.
        
             | bearl wrote:
             | And "always invert"! A related mungerism.
        
               | more_corn wrote:
               | I always get weird looks when I talk about killing as
               | many pilots as possible. I need a new example of the
               | always invert model of problem solving.
        
           | godelski wrote:
           | > I don't think it's just (or even particularly) bad axioms
           | 
           | IME most people aren't very good at building axioms. I hear a
           | lot of people say "from first principles" and it is a pretty
           | good indication that they will not be. First principles
           | require a lot of effort to create. They require iteration.
           | They require a lot of nuance, care, and precision. And of
           | course they do! They are the foundation of everything else
           | that is about to come. This is why I find it so odd when
           | people say "let's work from first principles" and then just
           | state something matter-of-factly and follow from there. If you
           | you want to really do this you start simple, attack your own
           | assumptions, reform, build, attack, and repeat.
           | 
           | This is how you reduce the leakiness, but I think it is
           | categorically the same problem as the bad axioms. It is hard
           | to challenge yourself and we often don't like being wrong. It
           | is also really unfortunate that small mistakes can be a
           | critical flaw. There's definitely an imbalance.
           | 
           | >> The smartest people I have ever known have been profoundly
           | unsure of their beliefs and what they know.
           | 
           | This is why the OP is seeing this behavior. Because the
           | smartest people you'll meet are constantly challenging their
           | own ideas. They know they are wrong to at least some degree.
           | You'll sometimes find them talking with a bit of authority at
           | first but a key part is watching how they deal with
           | challenging of assumptions. Ask them what would cause them to
           | change their minds. Ask them about nuances and details. They
           | won't always dig into those cans of worms but they will be
           | aware of them and maybe nervous or excited about going down
           | that road (or do they just outright dismiss it?). They
           | understand that accuracy is proportional to computation, and
           | you have exponentially increasing computation as you converge
           | on accuracy. These are strong indications, since they suggest
           | whether they care more about the right answer or about being
           | right. You
           | also don't have to be very smart to detect this.
        
           | dan_quixote wrote:
           | As a former mechanical engineer, I visualize this phenomenon
           | like a "tolerance stackup". Effectively meaning that for each
           | part you add to the chain, you accumulate error. If you're
           | not damn careful, your assembly of parts (or conclusions)
           | will fail to measure up to expectations.
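           | 
           | A minimal sketch of that stackup idea (tolerances are made
           | up, just to show the accumulation):
           | 
           |     parts = [0.1] * 5   # +/- mm tolerance per part
           | 
           |     worst_case = sum(parts)                    # 0.5 mm
           |     rss = sum(t ** 2 for t in parts) ** 0.5    # ~0.22 mm
           | 
           |     # Either way the assembly is much sloppier than any one
           |     # part -- same story for a chain of "pretty sure" steps.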
        
             | godelski wrote:
             | I like this approach. Also having dipped my toes in the
             | engineering world (professionally) I think it naturally
             | follows that you should be constantly rechecking your
             | designs. Those tolerances were fine to begin with, but are
             | they now that things have changed? It also makes you think
             | about failure modes. What can make this all come down, and
             | if it does, what way will it fail? Which is really useful
             | because you can then leverage this to design things to fail
             | in certain ways, and now you've got a testable hypothesis. It
             | won't create proof, but it at least helps in finding flaws.
        
             | to11mtm wrote:
             | I like this analogy.
             | 
             | I think of a bike's shifting systems; better shifters,
             | better housings, better derailleur, or better
             | chainrings/cogs can each 'improve' things.
             | 
             | I suppose where that becomes relevant to here, is that you
             | can have very fancy parts on various ends but if there's a
             | piece in the middle that's wrong you're still gonna get
             | shit results.
        
               | dylan604 wrote:
               | You're only as strong as the weakest link.
               | 
               | Your SCSI devices are only as fast as the slowest device
               | in the chain.
               | 
               | I don't need to be faster than the bear, I only have to
               | be faster than you.
        
             | robocat wrote:
             | I saw an article recently that talked about stringing
             | likely inferences together but ending up with an unreliable
             | outcome because enough 0.9 probabilities one after the
             | other lead to an unlikely conclusion.
             | 
             | Edit: Couldn't find the article, but AI referenced Bayesian
             | "Chain of reasoning fallacy".
        
               | godelski wrote:
               | I think you have this oversimplified. Stringing together
               | inferences can take us in either direction. It really
               | depends on how things are being done and this isn't
               | always so obvious or simple. But just to show both
               | directions I'll give two simple examples (real world
               | holds many more complexities)
               | 
               | It is all about what is being modeled and how the
               | inferences string together. If these are being
               | multiplied, then yes, this is going to decrease as xy <
               | x and xy < y for every x,y < 1.
               | 
               | But a good counter example is the classic Bayesian
               | Inference example[0]. Suppose you have a test that
               | detects vampirism with 95% accuracy (Pr(+|vampire) =
               | 0.95) and has a false positive rate of 1% (Pr(+|mortal) =
               | 0.01). But vampirism is rare, affecting only 0.1% of the
               | population. This ends up meaning a positive test only
               | gives us an 8.7% likelihood of a subject being a vampire
               | (Pr(vampire|+)). The solution here is that we repeat the
               | testing. On our second test Pr(vampire) changes from
               | 0.001 to 0.087 and Pr(vampire|+) goes to 89% and a third
               | getting us to about 99%.
               | 
               | [0] Our equation is
               | 
               |     Pr(vampire|+) = Pr(+|vampire) Pr(vampire) / Pr(+)
               | 
               | And the crux is
               | 
               |     Pr(+) = Pr(+|vampire) Pr(vampire)
               |           + Pr(+|mortal) (1 - Pr(vampire))
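               | 
               | A small sketch of that repeated-test arithmetic (same
               | numbers as above, in Python):
               | 
               |     def update(prior, p_pos_v=0.95, p_pos_m=0.01):
               |         p_pos = p_pos_v * prior + p_pos_m * (1 - prior)
               |         return p_pos_v * prior / p_pos
               | 
               |     p = 0.001
               |     for _ in range(3):
               |         p = update(p)
               |         print(round(p, 3))   # 0.087, 0.9, 0.999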
        
               | p1necone wrote:
               | Worth noting that solution only works if the false
               | positives are totally random, which is probably not true
               | of many real world cases and would be pretty hard to work
               | out.
        
               | godelski wrote:
               | Definitely. Real world adds lots of complexities and
               | nuances, but I was just trying to make the point that it
               | matters how those inferences compound. That we can't just
               | conclude that compounding inferences decreases likelihood.
        
             | guerrilla wrote:
             | This is what I hate about real life electronics. Everything
             | is nice on paper, but physics sucks.
        
               | godelski wrote:
               | > Everything is nice on paper
               | 
               | I think the reason this is true is mostly because of how
               | people do things "on paper". We can get much more
               | accurate with "on paper" modeling, but the amount of work
               | increases very fast. So it tends to be much easier to
               | just calculate things as if they are spherical chickens
               | in a vacuum and account for error than it is to calculate
               | including things like geometry, drag, resistance, and all
               | that other fun jazz (which you still will also need to
               | account for error/uncertainty though this now can be
               | smaller).
               | 
               | Which I think, at the end of the day, means the important
               | lesson is more about how simple explanations can be good
               | approximations that get us most of the way there but the
               | details and nuances shouldn't be so easily dismissed.
               | With this framing we can choose how we pick our battles.
               | Is it cheaper/easier/faster to run a very accurate sim or
               | cheaper/easier/faster to iterate in physical space?
        
             | ctkhn wrote:
             | Basically the same as how dead reckoning your location
             | works worse the longer you've been traveling?
        
               | toasterlovin wrote:
               | Dead reckoning is a great analogy for coming to
               | conclusions based on reason alone. Always useful to check
               | in with reality.
        
           | guerrilla wrote:
           | > I don't think it's just (or even particularly) bad axioms,
           | I think it's that people tend to build up "logical"
           | conclusions where they think each step is a watertight
           | necessity that follows inevitably from its antecedents, but
           | actually each step is a little bit leaky, leading to runaway
           | growth in false confidence.
           | 
           | This is what you get when you naively re-invent philosophy
           | from the ground up while ignoring literally 2500 years of
           | actual debugging of such arguments by the smartest people who
           | ever lived.
           | 
           | You can't diverge from and improve on what everyone else did
           | AND be almost entirely ignorant of it, let alone have no
           | training whatsoever in it. This extreme arrogance I would say
           | is the root of the problem.
        
         | ar-nelson wrote:
         | I find Yudkowsky-style rationalists morbidly fascinating in the
         | same way as Scientologists and other cults. Probably because
         | they seem to genuinely believe they're living in a sci-fi
         | story. I read a lot of their stuff, probably too much, even
         | though I find it mostly ridiculous.
         | 
         | The biggest nonsense axiom I see in the AI-cult rationalist
         | world is recursive self-improvement. It's the classic reason
         | superintelligence takeoff happens in sci-fi: once AI reaches
         | some threshold of intelligence, it's supposed to figure out how
         | to edit its own mind, do that better and faster than humans,
         | and exponentially leap into superintelligence. The entire "AI
         | 2027" scenario is built on this assumption; it assumes that
         | soon LLMs will gain the capability of assisting humans on AI
         | research, and AI capabilities will explode from there.
         | 
         | But AI being capable of researching or improving itself is not
         | obvious; there's so many assumptions built into it!
         | 
         | - What if "increasing intelligence", which is a very vague
         | goal, has diminishing returns, making recursive self-
         | improvement incredibly slow?
         | 
         | - Speaking of which, LLMs already seem to have hit a wall of
         | diminishing returns; it seems unlikely they'll be able to
         | assist cutting-edge AI research with anything other than
         | boilerplate coding speed improvements.
         | 
         | - What if there are several paths to different kinds of
         | intelligence with their own local maxima, in which the AI can
         | easily get stuck after optimizing itself into the wrong type of
         | intelligence?
         | 
         | - Once AI realizes it can edit itself to be more intelligent,
         | it can also edit its own goals. Why wouldn't it wirehead
         | itself? (short-circuit its reward pathway so it always feels
         | like it's accomplished its goal)
         | 
         | Knowing Yudkowsky I'm sure there's a long blog post somewhere
         | where all of these are addressed with several million rambling
         | words of theory, but I don't think any amount of doing
         | philosophy in a vacuum without concrete evidence could convince
         | me that fast-takeoff superintelligence is possible.
        
           | JKCalhoun wrote:
           | An interesting point you make there -- one would assume that
           | if recursive self-improvement were a thing, _Nature_ would
           | have already led humans into that "hall of mirrors".
        
             | marcosdumay wrote:
             | Well, arguably that's exactly where we are, but machines
             | can evolve faster.
             | 
             | And that's an entire new angle that the cultists are
             | ignoring... because superintelligence may just not be very
             | valuable.
             | 
             | And we don't need superintelligence for smart machines to
             | be a problem anyway. We don't need even AGI. IMO, there's
             | no reason to focus on that.
        
               | derefr wrote:
               | > Well, arguably that's exactly where we are
               | 
               | Yep; from the perspective of evolution (and more
               | specifically, those animal species that only gain
               | capability generationally by evolutionary adaptation of
               | instinct), _humans_ are the recursively
               | self-(fitness-)improving accident.
               | 
               | Our species-aggregate capacity to compete for resources
               | within the biosphere went superlinear in the middle of
               | the previous century; and we've had to actively hit the
               | brakes on how much of everything we take since then,
               | handicapping ourselves. (With things like epidemic
               | obesity and global climate change being the result of
               | us not hitting those brakes quite hard enough.)
               | 
               | Insofar as a "singularity" can be defined on a per-agent
               | basis, as the moment when something begins to change too
               | rapidly for the given agent to ever hope to catch up with
               | / react to new conditions -- and so the agent goes from
               | being a "player at the table" to a passive observer of
               | what's now unfolding around them... then, from the rest
               | of our biosphere's perspective, they've 100% already
               | witnessed the "human singularity."
               | 
               | No living thing on Earth besides humans now has any
               | comprehension of how the world has been or will be
               | reshaped by human activity; nor can ever hope to do
               | anything to push back against such reshaping. Every
               | living thing on Earth other than humans, will only
               | survive into the human future, if we humans either decide
               | that it should survive, and act to preserve it; or if we
               | humans just ignore the thing, and then just-so-happen to
               | never accidentally do anything to wipe it from existence
               | without even noticing.
        
             | twic wrote:
             | There's a variant of this that argues that humans are
             | already as intelligent as it's possible to be. Because if
             | it's possible to be more intelligent, why aren't we? And a
             | slightly more reasonable variant that argues that we're
             | already as intelligent as it's useful to be.
        
               | danaris wrote:
               | While I'm deeply and fundamentally skeptical of the
               | recursive self-improvement/singularity hypothesis, I also
               | don't really buy this.
               | 
               | There are some pretty obvious ways we could improve human
               | cognition if we had the ability to reliably edit or
               | augment it. Better storage & recall. Lower
               | distractibility. More working memory capacity. Hell, even
               | extra hands for writing on more blackboards or putting up
               | more conspiracy theory strings at a time!
               | 
               | I suppose it _might_ be possible that, given the
               | fundamental design and structure of the human brain, none
               | of these things can be improved any further without
               | catastrophic side effects--but since the only  "designer"
               | of its structure is evolution, I think that's extremely
               | unlikely.
        
               | JKCalhoun wrote:
               | Some of your suggestions, if you don't mind my saying,
               | seem like only modest improvements -- akin to Henry
               | Ford's quote "If I had asked people what they wanted,
               | they would have said a faster horse."
               | 
               | To your point though, an electronic machine is a
               | different host altogether with different strengths and
               | weaknesses.
        
               | danaris wrote:
               | Well, twic's comment didn't say anything about
               | revolutionary improvements, just "maybe we're as smart as
               | we can be".
        
               | lukan wrote:
               | "Because if it's possible to be more intelligent, why
               | aren't we?"
               | 
               | Because deep abstract thoughts about the nature of the
               | universe and elaborate deep thinking were maybe not as
               | useful while we were chasing lions and buffaloes with a
               | spear?
               | 
               | We just had to be smarter than them. Which included
               | finding out that tools were great. Learning about the
               | habits of the prey and optimizing hunting success. Those
               | who were smarter in that capacity had a greater chance of
               | reproducing. Those who just excelled at thinking likely
               | did not live that long.
        
               | tshaddox wrote:
               | Is it just dumb luck that we're able to create knowledge
               | about black holes, quarks, and lots of things in between
               | which presumably had zero evolutionary benefit before a
               | handful of generations ago?
        
               | lukan wrote:
               | Evolution rewarded us for developing general
               | intelligence. But with a very immediate practical focus
               | and not too much specialisation.
        
               | bee_rider wrote:
               | Basically yes it is luck, in the sense that evolution is
               | just randomness with a filter of death applied, so
               | whatever brains we happen to have are just luck.
               | 
               | The brains we did end up with are really bad at creating
               | that sort of knowledge. Almost none of us can. But we're
               | good at communicating, coming up with simplified models
               | of things, and seeing how ideas interact.
               | 
               | We're not universe-understanders, we're behavior modelers
               | and concept explainers.
        
               | godelski wrote:
               | I don't think the logic follows here. Nor does it match
               | evidence.
               | 
               | The premise is ignorant of time. It is also ignorant of
               | the fact that we know there's a lot of things we don't
               | know. That's all before we consider other factors like if
               | there are limits and physical barriers or many other
               | things.
        
             | Terr_ wrote:
             | I often like to point out that Earth was already consumed
             | by Grey Goo, and today we are hive-minds in titanic mobile
             | megastructure-swarms of trillions of the most complex
             | nanobots in existence (that we know of), inheritors of
             | tactics and capabilities from a zillion years of physical
             | and algorithmic warfare.
             | 
             | As we imagine the ascension of AI/robots, it may _seem_
             | like we 're being humble about ourselves... But I think
             | it's actually the reverse: It's a kind of hubris elevating
             | our ability to create over the vast amount we've
             | _inherited_.
        
           | tim333 wrote:
           | I've pondered recursive self-improvement. I'm fairly sure it
           | will be a thing - we're at a point already where people could
           | try telling Claude or some such to have a go, even if not
           | quite at a point it would work. But I imagine take off would
           | be very gradual. It would be constrained by available
           | computing resources and probably only about as good as
           | current human researchers, and so still take ages to get
           | anywhere.
        
             | tempfile wrote:
             | I honestly am not trying to be rude when I say this, but
             | this is exactly the sort of speculation I find problematic
             | and that I think most people in this thread are complaining
             | about. Being able to tell Claude to have a go has no
             | relation at all to whether it may ever succeed, and you
             | don't actually address any of the legitimate concerns the
             | comment you're replying to points out. There really isn't
             | anything in this comment but vibes.
        
               | doubleunplussed wrote:
               | On the other hand, I'm baffled to encounter recursive
               | self-improvement being discussed as something not only
               | weird to expect, but as damning evidence of sloppy
               | thinking by those who speculate about it.
               | 
               | We have an existence proof for intelligence that can
               | improve AI: humans.
               | 
               | If AI ever gets to human-level intelligence, it would be
                | quite strange if it _couldn't_ improve itself.
               | 
               | Are people really that sceptical that AI will get to
               | human level intelligence?
               | 
                | Is that an insane belief worthy of being a primary
               | example of a community not thinking clearly?
               | 
               | Come on! There is a good chance AI will recursively self-
                | improve! Those pooh-poohing this idea are the ones not
               | thinking clearly.
        
               | tim333 wrote:
                | I don't think it's vibes; rather, it's my thinking about
                | the problem.
               | 
               | If you look at the "legitimate concerns" none are really
               | deal breakers:
               | 
               | >What if "increasing intelligence", which is a very vague
               | goal, has diminishing returns, making recursive self-
               | improvement incredibly slow?
               | 
                | I'm willing to believe it will be slow, though maybe it
                | won't
               | 
               | >LLMs already seem to have hit a wall of diminishing
               | returns
               | 
               | Who cares - there will be other algorithms
               | 
               | >What if there are several paths to different kinds of
               | intelligence with their own local maxima
               | 
               | well maybe, maybe not
               | 
               | >Once AI realizes it can edit itself to be more
               | intelligent, it can also edit its own goals. Why wouldn't
               | it wirehead itself?
               | 
               | well - you can make another one if the first does that
               | 
               | Those are all potential difficulties with self
               | improvement, not reasons it will never happen. I'm happy
               | to say it's not happening right now but do you have any
               | solid arguments that it won't happen in the next century?
               | 
               | To me the arguments against sound like people in the
               | 1800s discussing powered flight and saying it'll never
               | happen because steam engine development has slowed.
        
           | PaulHoule wrote:
           | Yeah, to compare Yudkowsky to Hubbard I've read accounts of
           | people who read _Dianetics_ or _Science of Survival_ and
            | thought "this is genius!" and I'm scratching my head and
           | it's like they never read Freud or Horney or Beck or Berne or
           | Burns or Rogers or Kohut, really any clinical psychology at
           | all, even anything in the better 70% of pop psychology. Like
           | Hubbard, Yudkowsky is unreadable, rambling [1] and
           | inarticulate -- how anybody falls for it boggles my mind [2],
            | but hey, people fell for Carlos Castaneda who never used a
           | word of the Yaqui language or mentioned any plant that grows
           | in the desert in Mexico but has Don Juan give lectures about
            | Kant's _Critique of Pure Reason_ [3] that Castaneda would
           | have heard in school and you would have heard in school too
           | if you went to school or would have read if you read a lot.
           | 
           | I can see how it appeals to people like Aella who wash into
           | San Francisco without exposure to education [4] or philosophy
           | or computer science or any topics germane to the content of
           | _Sequences_ -- not like it means you are stupid but, like
            | _Dianetics_, _Sequences_ wouldn't be appealing if you were
            | at all well read. How people at frickin' Oxford or Stanford
            | fall for it is beyond me, however.
           | 
           | [1] some might even say a hypnotic communication pattern
           | inspired by Milton Erickson
           | 
            | [2] you think people would dismiss _Sequences_ because it's
            | a frickin' Harry Potter fanfic, but I think it's like the 419
            | scam email riddled with typos meant to drive the critical
            | thinker away and, ironically in the case of _Sequences_,
            | keep the person who wants to cosplay as a critical thinker.
           | 
           | [3] minus any direct mention of Kant
           | 
           | [4] thus many of the marginalized, neurodivergent,
           | transgender who left Bumfuck, AK because they couldn't live
           | at home and went to San Francisco to escape persecution as
           | opposed to seek opportunity
        
             | nemomarx wrote:
              | I thought the Sequences were the blog posts and the fanfic
              | was kept separate, to nitpick
        
           | ufmace wrote:
            | I agree. There's also the point of hardware dependence.
           | 
           | From all we've seen, the practical ability of AI/LLMs seems
           | to be strongly dependent on how much hardware you throw at
           | it. Seems pretty reasonable to me - I'm skeptical that
           | there's that much out there in gains from more clever code,
           | algorithms, etc on the same amount of physical hardware.
           | Maybe you can get 10% or 50% better or so, but I don't think
           | you're going to get runaway exponential improvement on a
           | static collection of hardware.
           | 
           | Maybe they could design better hardware themselves? Maybe,
           | but then the process of improvement is still gated behind how
           | fast we can physically build next-generation hardware,
           | perfect the tools and techniques needed to make it, deploy
           | with power and cooling and datalinks and all of that other
           | tedious physical stuff.
        
           | morleytj wrote:
           | The built in assumptions are always interesting to me,
           | especially as it relates to intelligence. I find many of them
           | (though not all), are organized around a series of
           | fundamental beliefs that are very rarely challenged within
           | these communities. I should initially mention that I don't
           | think everyone in these communities believes these things, of
           | course, but I think there's often a default set of
           | assumptions going into conversations in these spaces that
           | holds these axioms. These beliefs more or less seem to be as
           | follows:
           | 
           | 1) They believe that there exists a singular factor to
           | intelligence in humans which largely explains capability in
           | every domain (a super g factor, effectively).
           | 
           | 2) They believe that this factor is innate, highly
           | biologically regulated, and a static factor about a
            | person (someone who is high IQ in their minds must have been a
           | high achieving child, must be very capable as an adult, these
           | are the baseline assumptions). There is potentially belief
           | that this can be shifted in certain directions, but broadly
           | there is an assumption that you either have it or you don't,
           | there is no feeling of it as something that could be taught
           | or developed without pharmaceutical intervention or some
           | other method.
           | 
           | 3) There is also broadly a belief that this factor is at
           | least fairly accurately measured by modern psychometric IQ
           | tests and educational achievement, and that this factor is a
           | continuous measurement with no bounds on it (You can always
           | be smarter in some way, there is no max smartness in this
           | worldview).
           | 
           | These are things that certainly could be true, and perhaps I
           | haven't read enough into the supporting evidence for them but
           | broadly I don't see enough evidence to have them as core
           | axioms the way many people in the community do.
           | 
           | More to your point though, when you think of the world from
           | those sorts of axioms above, you can see why an obsession
           | would develop with the concept of a certain type of
           | intelligence being recursively improving. A person who has
           | become convinced of their moral placement within a societal
           | hierarchy based on their innate intellectual capability has
           | to grapple with the fact that there could be artificial
           | systems which score higher on the IQ tests than them, and if
           | those IQ tests are valid measurements of this super
           | intelligence factor in their view, then it means that the
           | artificial system has a higher "ranking" than them.
           | 
           | Additionally, in the mind of someone who has internalized
           | these axioms, there is no vagueness about increasing
           | intelligence! For them, intelligence is the animating factor
           | behind all capability, it has a central place in their mind
           | as who they are and the explanatory factor behind all
           | outcomes. There is no real distinction between capability in
           | one domain or another mentally in this model, there is just
           | how powerful a given brain is. Having the singular factor of
           | intelligence in this mental model means being able to solve
           | more difficult problems, and lack of intelligence is the only
           | barrier between those problems being solved vs unsolved. For
           | example, there's a common belief among certain groups among
           | the online tech world that all governmental issues would be
           | solved if we just had enough "high-IQ people" in charge of
           | things irrespective of their lack of domain expertise. I
           | don't think this has been particularly well borne out by
           | recent experiments, however. This also touches on what you
           | mentioned in terms of an AI system potentially maximizing the
           | "wrong types of intelligence", where there isn't a space in
           | this worldview for a wrong type of intelligence.
        
           | godelski wrote:
           | > The biggest nonsense axiom I see in the AI-cult rationalist
           | world is recursive self-improvement.
           | 
           | This is also the weirdest thing and I don't think they even
           | know the assumption they are making. It makes the assumption
            | that there is infinite knowledge to be had. It also ignores
            | that we have exceptionally strong indications that gains in
            | accuracy (truth, knowledge, whatever you want to call it)
            | come with exponential growth in complexity. These
           | may be wrong assumptions, but we at least have evidence for
           | them, and much more for the latter. So if objective truth
           | exists, then that intelligence gap is very very different.
           | One way they could be right there is for this to be an
           | S-curve and for us humans to be at the very bottom there.
           | That seems unlikely, though very possible. But they always
            | treat this as linear or exponential, as if our relationship
            | to the AI will be like an ant trying to understand us.
           | 
           | The other weird assumption I hear is about how it'll just
           | kill us all. The vast majority of smart people I know are
            | very peaceful. They aren't even seeking power or wealth.
           | They're too busy thinking about things and trying to figure
           | everything out. They're much happier in front of a chalk
           | board than sitting on a yacht. And humans ourselves are
            | incredibly compassionate towards other creatures. Maybe we
            | learned this because coalitions are an incredibly powerful
           | thing, but truth is that if I could talk to an ant I'd choose
           | that over laying traps. Really that would be so much easier
           | too! I'd even rather dig a small hole to get them started
           | somewhere else than drive down to the store and do all that.
           | A few shovels in the ground is less work and I'd ask them to
           | not come back and tell others.
           | 
           | Granted, none of this is absolutely certain. It'd be naive to
           | assume that we know! But it seems like these cults are
           | operating on the premise that they do know and that these
           | outcomes are certain. It seems to just be preying on fear and
           | uncertainty. Hell, even Altman does this, ignoring risk and
           | concern of existing systems by shifting focus to "an even
           | greater risk" that he himself is working towards (You can't
           | simultaneously maximize speed and safety). Which, weirdly
            | enough, might fulfill their own prophecies. The AI doesn't
           | have to become sentient but if it is trained on lots of
           | writings about how AI turns evil and destroys everyone then
           | isn't that going to make a dumb AI that can't tell fact from
           | fiction more likely to just do those things?
        
             | empiricus wrote:
              | So many things make no sense in this comment that I feel
              | like there's a 20% chance this is a mid-quality GPT. And so
              | much interpolation effort, but starting from hearsay
              | instead of primary sources. Then the threads stop just
              | before seeing the contradiction with the other threads. I
              | imagine this is how we all reason most of the time, just
              | based on vibes :(
        
               | godelski wrote:
               | Sure, I wrote a lot and it's a bit scattered. You're
               | welcome to point to something specific but so far you
               | haven't. Ironically, you're committing the error you're
               | accusing me of.
               | 
               | I'm also not exactly sure what you mean because the only
               | claim I've made is that they've made assumptions where
               | there are other possible, and likely, alternatives. It's
               | much easier to prove something wrong than prove it right
               | (or in our case, evidence, since no one is proving
               | anything).
               | 
               | So the first part I'm saying we have to consider two
               | scenarios. Either intelligence is bounded or unbounded. I
               | think this is a fair assumption, do you disagree?
               | 
               | In an unbounded case, their scenario can happen. So I
               | don't address that. But if you want me to, sure. It's
                | because I have no reason to believe information is
                | unbounded when everything around me suggests that it is
                | bounded.
               | Maybe start with the Bekenstein bound. Sure, it doesn't
               | prove information is bounded but you'd then need to
               | convince me that an entity not subject to our universe
               | and our laws of physics is going to care about us and be
               | malicious. Hell, that entity wouldn't even subject to
               | time and we're still living.
               | 
               | In a bounded case it _can_ happen but we need to
                | understand what conditions that requires. There are a
                | lot of functions but I went with an S-curve for
                | simplicity and
               | familiarity. It'll serve fine (we're on HN man...) for
               | any monotonically increasing case (or if it tends that
               | way).
               | 
               | So think about it. Change the function if you want, but
                | _if intelligence is bounded_ then, if we're x times more
                | intelligent than ants, where on the graph do we need to
                | be for another thing to be x times more intelligent than
                | us? There are not a lot of opportunities for that to
                | even happen. It requires our intelligence (on that
                | hypothetical scale) to be pretty similar to an ant's.
                | What cannot happen is for the ant to be in the tail of
                | that function and us to be past the inflection point
                | (halfway). There just isn't enough space on that y-axis
                | for anything to be x times more intelligent than us.
               | 
               | Yeah, I'll admit that this is a very naive model but
               | again, we're not trying to say what's right but instead
               | just say there's good reason to believe their assumption
               | is false. Adding more complexity to this model doesn't
               | make their case stronger, it makes it weaker.
               | 
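                | To make that concrete, here's a toy sketch (purely
                | illustrative: the logistic curve and the ant/human
                | placements are assumptions, not measurements):
                | 
                |   import math
                | 
                |   def intelligence(x, ceiling=100.0):
                |       # toy S-curve: capability saturates at ceiling
                |       return ceiling / (1.0 + math.exp(-x))
                | 
                |   ant = intelligence(-4)   # hypothetical placement
                |   human = intelligence(2)  # hypothetical placement
                |   ratio = human / ant      # how far above ants we sit
                |   # something the same ratio above us would have to
                |   # exceed the ceiling, which a bounded curve forbids:
                |   print(round(ratio, 1), human * ratio <= 100.0)
                | 
                | With those made-up numbers the ratio is already ~49x,
                | and another 49x jump doesn't fit under the ceiling.
                | 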
               | The second part I can make much easier to understand.
               | 
               | Yes, there's bad smart people, but look at the smartest
               | people in history. Did they seek power or wish to harm?
               | Most of the great scientists did not. A lot of them were
               | actually quite poor and many even died fighting
               | persecution.
               | 
               | So we can't conclude that greater intelligence results in
               | greater malice. This isn't hearsay, I'm just saying
                | Newton wasn't a homicidal maniac. I know, bold claim...
                | 
                | > starting from hearsay
               | 
               | I don't think this word means what you think it means.
               | Just because I didn't link sources doesn't make it a
               | rumor. You can validate them and I gave you enough
               | information to do so. You now have more. Ask gpt for
               | links, I don't care, but people should stop worshipping
               | Yud
        
           | tshaddox wrote:
           | > - What if there are several paths to different kinds of
           | intelligence with their own local maxima, in which the AI can
           | easily get stuck after optimizing itself into the wrong type
           | of intelligence?
           | 
           | I think what's more plausible is that there is _general_
            | intelligence, and humans have that, and it's general in the
           | same sense that Turing machines are general, meaning that
           | there is no "higher form" of intelligence that has strictly
           | greater capability. Computation speed, memory capacity, etc.
           | can obviously increase, but those are available to biological
           | general intelligences just like they would be available to
           | electronic general intelligences.
        
         | ambicapter wrote:
         | One of the only idioms that I don't mind living my life by is,
         | "Follow the truth-seeker, but beware those who've found it".
        
           | JKCalhoun wrote:
           | Interesting. I can't say I've done much _following_ though --
           | not that I am aware of anyway. Maybe I just had no _leaders_
           | growing up.
        
         | lordnacho wrote:
         | > I immediately become suspicious of anyone who is very certain
         | of something
         | 
         | Me too, in almost every area of life. There's a reason it's
         | called a conman: they are tricking your natural sense that
         | confidence is connected to correctness.
         | 
         | But also, even when it isn't about conning you, how do people
          | become certain of something? They ignore the evidence against
         | whatever they are certain of.
         | 
         | People who actually know what they're talking about will always
          | restrict the context and hedge their bets. Their explanations
         | are tentative, filled with ifs and buts. They rarely say
         | anything sweeping.
        
           | dcminter wrote:
           | In the term "conman" the confidence in question is that of
           | the mark, not the perpetrator.
        
             | sdwr wrote:
             | Isn't confidence referring to the alternate definition of
             | trust, as in "taking you into his confidence"?
        
               | godelski wrote:
               | I think if you used that definition you could equally say
               | "it is the mark that is taking the conman into [the
               | mark's] confidence"
        
         | JKCalhoun wrote:
         | You're describing the impressions I had of MENSA back in the
         | 70's.
        
         | jpiburn wrote:
         | "Cherish those who seek the truth but beware of those who find
         | it" - Voltaire
        
           | paviva wrote:
            | Most likely Gide ("Croyez ceux qui cherchent la vérité,
           | doutez de ceux qui la trouvent", "Believe those who seek
           | Truth, doubt those who find it") and not Voltaire ;)
           | 
           | Voltaire was generally more subtle: "un bon mot ne prouve
           | rien", a witty saying proves nothing, as he'd say.
        
         | inasio wrote:
         | Saw once a discussion that people should not have kids as it's
         | by far the highest increase in your carbon footprint in your
         | lifetime (>10x than going vegan, etc) be driven all the way to
         | advocating genocide as a way of carbon footprint minimization
        
           | derektank wrote:
           | Setting aside the reductio ad absurdum of genocide, this is
           | an unfortunately common viewpoint. People really need to take
           | into account the chances their child might wind up working on
           | science or technology which reduces global CO2 emissions or
           | even captures CO2. This reasoning can be applied to all sorts
           | of naive "more people bad" arguments. I can't imagine where
           | the world would be if Norman Borlaug's parents had decided to
           | never have kids out of concern for global food insecurity.
        
             | freedomben wrote:
             | It also entirely subjugates the economic realities that we
             | (at least currently) live in to the future health of the
             | planet. I care a great deal about the Earth and our
             | environment, but the more I've learned about stuff the more
             | I've realized that anyone advocating for focusing on one
             | without considering the impact on the other is primarily
             | following a religion
        
               | mapontosevenths wrote:
               | > It also entirely subjugates the economic realities that
               | we...
               | 
                | To play devil's advocate, you could be seen as trying to
                | subjugate the world's health to your own economic well-
               | being, and far fewer people are concerned with your tax
               | bracket than there are people on earth. In a pure
                | democracy, I'm fairly certain the planet's well-being
               | would be deemed more important than the economy of
               | whatever nation you live in.
               | 
               | > advocating for focusing on one... is primarily
               | following a religion
               | 
               | Maybe, but they could also just be doing the risk
                | calculus a bit differently. If you are a many-step
                | thinker, the long-term fecundity of our species might
                | feel more important than any level of short-term
                | financial motivation.
        
             | freejazz wrote:
             | Insane to call "more people bad" naive but then actually
             | try and account for what would otherwise best be described
             | as hope.
        
             | mapontosevenths wrote:
             | > this is an unfortunately common viewpoint
             | 
             | Not everyone believes that the purpose of life is to make
             | more life, or that having been born onto team human
             | automatically qualifies team human as the best team. It's
             | not necessarily unfortunate.
             | 
             | I am not a rationalist, but rationally that whole "the
              | meaning of life is human fecundity" shtick is
              | after-school-special tautological nonsense, and that seems
              | to be the
             | assumption buried in your statement. Try defining what you
             | mean without causing yourself some sort of recursion
             | headache.
             | 
             | > their child might wind up..
             | 
             | They might also grow up to be a normal human being, which
             | is far more likely.
             | 
             | > if Norman Borlaug's parents had decided to never have
             | kids
             | 
             | Again, this would only have mattered if you consider the
             | well being of human beings to be the greatest possible
             | good. Some people have other definitions, or are operating
             | on much longer timescales.
        
           | throw0101a wrote:
           | > _Saw once a discussion that people should not have kids as
            | it's by far the highest increase in your carbon footprint in
           | your lifetime (>10x than going vegan, etc) be driven all the
           | way to advocating genocide as a way of carbon footprint
           | minimization_
           | 
           | The opening scene of _Utopia_ (UK) s2e6 goes over this:
           | 
           | > _" Why did you have him then? Nothing uses carbon like a
           | first-world human, yet you created one: why would you do
           | that?"_
           | 
           | * https://www.youtube.com/watch?v=rcx-nf3kH_M
        
         | uoaei wrote:
         | This is why it's important to emphasize that _rationality is
         | not a good goal to have_. Rationality is nothing more than
         | applied logic, which takes axioms as given and deduces
         | conclusions from there.
         | 
         |  _Reasoning_ is the appropriate target because it is a self-
         | critical, self-correcting method that continually re-evaluates
         | axioms and methods to express intentions.
        
         | amanaplanacanal wrote:
         | It's very tempting to try to reason things through from first
         | principles. I do it myself, a lot. It's one of the draws of
         | libertarianism, which I've been drawn to for a long time.
         | 
         | But the world is way more complex than the models we used to
         | derive those "first principles".
        
         | zaphar wrote:
         | The distinction between them and religion is that religion is
         | free to say that those axioms are a matter of faith and treat
         | them as such. Rationalists are not as free to do so.
        
         | EGreg wrote:
         | There are certain things I am sure of even though I derived
         | them on my own.
         | 
          | But I constantly battle-tested them against other smart
          | people's views, and only after I ran out of people to bring me
          | new rational objections did I become sure.
         | 
         | Now I can battle test them against LLMs.
         | 
          | On a lesser level of confidence, I have also found that a lot
          | of the time the people who disagreed with what I thought had
          | to be the case later came to regret it, because their
          | strategies ended up in failure and they told me they regretted
          | not taking
         | my recommendation. But that is on an individual level. I have
         | gotten pretty good at seeing systemic problems, architecting
         | systemic solutions, and realizing what it would take to get
         | them adopted to at least a critical mass. Usually, they fly in
         | the face of what happens normally in society. People don't see
         | how their strategies and lives are shaped by the technology and
         | social norms around them.
         | 
         | Here, I will share three examples:
         | 
         | Public Health: https://www.laweekly.com/restoring-healthy-
         | communities/
         | 
         | Economic and Governmental: https://magarshak.com/blog/?p=362
         | 
         | Wars & Destruction: https://magarshak.com/blog/?p=424
         | 
         | For that last one, I am often proven somewhat wrong by right-
         | wing war hawks, because my left-leaning anti-war stance is
         | about avoiding inflicting large scale misery on populations,
         | but the war hawks go through with it anyway and wind up
         | defeating their geopolitical enemies and gaining ground as the
         | conflict fades into history.
        
           | projektfu wrote:
           | "genetically engineers high fructose corn syrup into
           | everything"
           | 
            | This phrase is nonsense, because HFCS is made by a chemical
            | process applied to normal corn after the harvest. The corn
            | may be a
           | GMO but it certainly doesn't have to be.
        
             | EGreg wrote:
             | Agreed, that was phrased wrong. The fruits across the board
             | have been genetically engineered to be extremely sweet
             | (fructose, not the syrup):
             | https://weather.com/news/news/2018-10-03-fruit-so-sweet-
             | zoo-...
             | 
              | Meanwhile their nutritional quality has gone down
              | tremendously, for vegetables too:
             | https://pmc.ncbi.nlm.nih.gov/articles/PMC10969708/
        
         | gen220 wrote:
         | Strongly recommend this profile in the NYer on Curtis Yarvin
         | (who also uses "rationalism" to justify their beliefs) [0]. The
         | section towards the end that reports on his meeting one of his
         | supposed ideological heroes for an extended period of time is
         | particularly illuminating.
         | 
          | I feel like the internet has led to an explosion of such
         | groups because it abstracts the "ideas" away from the "people".
         | I suspect if most people were in a room or spent an extended
         | amount of time around any of these self-professed, hyper-online
         | rationalists, they would immediately disregard any theories
         | they were able to cook up, no matter how clever or
         | persuasively-argued they might be in their written down form.
         | 
         | [0]: https://www.newyorker.com/magazine/2025/06/09/curtis-
         | yarvin-...
        
           | trawy081225 wrote:
            | > I feel like the internet has led to an explosion of such
            | groups because it abstracts the "ideas" away from the
           | "people". I suspect if most people were in a room or spent an
           | extended amount of time around any of these self-professed,
           | hyper-online rationalists, they would immediately disregard
           | any theories they were able to cook up, no matter how clever
           | or persuasively-argued they might be in their written down
           | form.
           | 
           | Likely the opposite. The internet has led to people being
           | able to see the man behind the curtain, and realize how
           | flawed the individuals pushing these ideas are. Whereas many
           | intellectuals from 50 years back were just as bad if not
           | worse, but able to maintain a false aura of intelligence by
           | cutting themselves off from the masses.
        
         | ratelimitsteve wrote:
          | Are you familiar with the Ship of Theseus as an argumentation
         | fallacy? Innuendo Studios did a great video on it and I think
         | that a lot of what you're talking about breaks down to this.
         | Tldr - it's a fallacy of substitution, small details of an
         | argument get replaced by things that are (or feel like) logical
         | equivalents until you end up saying something entirely
         | different but are arguing as though you said the original
         | thing. In this video the example is "senator doxxes a political
         | opponent" but on looking "senator" turns out to mean "a
         | contractor working for the senator" and "doxxes a political
         | opponent" turns out to mean "liked a tweet that had that
         | opponent's name in it in a way that could draw attention to
         | it".
         | 
         | Each change is arguably equivalent and it seems logical that if
         | x = y then you could put y anywhere you have x, but after all
         | of the changes are applied the argument that emerges is
         | definitely different from the one before all the substitutions
          | are made. It feels like communities that pride themselves on
          | being extra rational are subject to this because it has all
          | the trappings of rationalism but enables squishy, feely
          | arguments.
        
         | GeoAtreides wrote:
         | Epistemological skepticism sure is a belief. A strong belief on
         | your side?
         | 
         | I am profoundly sure, I am certain I exist and that a reality
         | outside myself exists. Worse, I strongly believe knowing this
         | external reality is possible, desirable and accurate.
         | 
         | How suspicious does that make me?
        
         | antisthenes wrote:
         | It's crazy to read this, because by writing what you wrote you
         | basically show that you don't understand what an axiom is.
         | 
         | You need to review the definition of the word.
         | 
         | > The smartest people I have ever known have been profoundly
         | unsure of their beliefs and what they know.
         | 
         | The smartest people are unsure about their higher level
         | beliefs, but I can assure you that they almost certainly don't
         | re-evaluate "axioms" as you put it on a daily or weekly basis.
         | Not that it matters, as we almost certainly can't verify who
         | these people are based on an internet comment.
         | 
         | > I immediately become suspicious of anyone who is very certain
         | of something, especially if they derived it on their own.
         | 
         | That's only your problem, not anyone else's. If you think
          | people can't arrive at a tangible and useful approximation of
         | truth, then you are simply delusional.
        
           | mapontosevenths wrote:
            | > If you think people can't arrive at a tangible and useful
           | approximation of truth, then you are simply delusional
           | 
           | Logic is only a map, not the territory. It is a new toy,
           | still bright and shining from the box in terms of human
           | history. Before logic there were other ways of thinking, and
           | new ones will come after. Yet, Voltaire's bastards are always
           | certain they're right, despite being right far less often
           | than they believe.
           | 
           | Can people arrive at tangible and useful conclusions?
           | Certainly, but they can only ever find capital "T" Truth in a
           | _very_ limited sense. Logic, like many other models of the
           | universe, is only useful until you change your frame of
           | reference or the scale at which you think. Then those laws
           | suddenly become only approximations, or even irrelevant.
        
         | SLWW wrote:
          | A logical argument is only as good as its presuppositions. To
         | first lay siege to your own assumptions before reasoning about
         | them tends towards a more beneficial outcome.
         | 
         | Another issue with "thinkers" is that many are cowards; whether
         | they realize it or not a lot of presuppositions are built on a
         | "safe" framework, placing little to no responsibility on the
         | thinker.
         | 
         | > The smartest people I have ever known have been profoundly
         | unsure of their beliefs and what they know. I immediately
         | become suspicious of anyone who is very certain of something,
         | especially if they derived it on their own.
         | 
         | This is where I depart from you. If I say it's anti-
         | intellectual I would only be partially correct, but it's worse
         | than that imo. You might be coming across "smart people" who
         | claim to know nothing "for sure", which in itself is a self-
         | defeating argument. How can you claim that nothing is truly
         | knowable as if you truly know that nothing is knowable? I'm
         | taking these claims to their logical extremes btw, avoiding the
         | granular argumentation surrounding the different shades and
         | levels of doubt; I know that leaves vulnerabilities in my
         | argument, but why argue with those who know that they can't
         | know much of anything as if they know what they are talking
         | about to begin with? They are so defeatist in their own
         | thoughts, it's comical. You say, "profoundly unsure", which
         | reads similarly to me as "can't really ever know" which is a
         | sure truth claim, not a relative claim or a comparative as many
         | would say, which is a sad attempt to side-step the absolute
         | reality of their statement.
         | 
         | I know that I exist, regardless of how I get here I know that I
         | do, there is a ridiculous amount of rhetoric surrounding that
         | claim that I will not argue for here, this is my
         | presupposition. So with that I make an ontological claim, a
         | truth claim, concerning my existence; this claim is one that I
         | must be sure of to operate at any base level. I also believe I
         | am me and not you, or any other. Therefore I believe in one
         | absolute, that "I am me". As such I can claim that an absolute
         | exists, and if absolutes exist, then within the right framework
         | you must also be an absolute to me, and so on and so forth;
         | what I do not see in nature is an existence, or notion of, the
          | relative on its own, as at every relative comparison there is
         | an absolute holding up the comparison. One simple example is
         | heat. Hot is relative, yet it also is objective; some heat can
         | burn you, other heat can burn you over a very long time, some
         | heat will never burn. When something is "too hot" that is a
         | comparative claim, stating that there is another "hot" which is
         | just "hot" or not "hot enough", the absolute still remains
         | which is heat. Relativistic thought is a game of comparisons
         | and relations, not making absolute claims; the only absolute
         | claim is that there is no absolute claim to the relativist. The
         | reason I am talking about relativists is that they are the
         | logical, or illogical, conclusion of the extremes of
          | doubt/disbelief I previously mentioned.
         | 
         | If you know nothing you are not wise, you are lazy and ill-
         | prepared, we know the earth is round, we know that gravity
         | exists, we are aware of the atomic, we are aware of our
          | existence, we are aware that the sun shines its light upon us,
         | we are sure of many things that took debate among smart people
          | many many years ago to arrive at these sure conclusions. There
          | was a time when many things we accept were "not known" but
         | were observed with enough time and effort by brilliant people.
         | That's why we have scientists, teachers, philosophers and
         | journalists. I encourage you that the next time you find a
         | "smart" person who is unsure of their beliefs, you should
         | kindly encourage them to be less lazy and challenge their
         | absolutes, if they deny the absolute could be found then you
         | aren't dealing with a "smart" person, you are dealing with a
         | useful idiot who spent too much time watching skeptics blather
         | on about meaningless topics until their brains eventually fell
         | out. In every relative claim there must be an absolute or it
          | fails to function in any logical framework. You can, with
          | enough thought, good data, and enough time to let things
          | steep, find
         | the (or an) absolute and make a sure claim. You might be proven
         | wrong later, but that should be an indicator to you that you
         | should improve (or a warning you are being taken advantage of
         | by a sophist), and that the truth is out there, not to
         | sequester yourself away in this comfortable, unsure hell that
         | many live in till they die.
         | 
         | The beauty of absolute truth is that you can believe absolutes
         | without understanding the entirety of the absolute. I know
         | gravity exists but I don't know fully how it works. Yet I can
         | be absolutely certain it acts upon me, even if I only
         | understand a part of it. People should know what they know and
          | study it until they do, and not make sure claims outside of
          | what they do know until they have the prerequisite absolute
         | claims to support the broader claims with the surety of the
         | weakest of their presuppositions.
         | 
         | Apologies for grammar, length and how schizo my thought process
         | appears; I don't think linearly and it takes a goofy amount of
         | effort to try to collate my thoughts in a sensible manner.
        
         | positron26 wrote:
         | Any theory of everything will often have a little perpetual
         | motion machine at the nexus. These can be fascinating to the
         | mind.
         | 
         | Pressing through uncertainty either requires a healthy appetite
         | for risk or an engine of delusion. A person who struggles to
         | get out of their comfort zone will seek enablement through such
         | a device.
         | 
         | Appreciation of risk-reward will throttle trips into the
         | unknown. A person using a crutch to justify everything will
         | careen hyperbolically into more chaotic and erratic behaviors
         | hoping to find that the device is still working, seeking the
         | thrill of enablement again.
         | 
         | The extremism comes from where once the user learned to say
         | hello to a stranger, their comfort zone has expanded to an area
         | that their experience with risk-reward is underdeveloped. They
         | don't look at the external world to appreciate what might
         | happen. They try to morph situations into some confirmation of
         | the crutch and the inferiority of confounding ideas.
         | 
         | "No, the world isn't right. They are just weak and the unspoken
         | rules [in the user's mind] are meant to benefit them." This
         | should always resonate because nobody will stand up for you
         | like you have a responsibility to.
         | 
         | A study of uncertainty and the limitations of axioms, the
         | inability of any sufficiently expressive formalism to be both
         | complete and consistent, these are the ideas that are antidotes
         | to such things. We do have to leave the rails from time to
         | time, but where we arrive will be another set of rails and will
         | look and behave like rails, so a bit of uncertainty is
         | necessary, but it's not some magic hat that never runs out of
         | rabbits.
         | 
         | Another psychology that will come into play from those who have
         | left their comfort zone is the inability to revert. It is a
          | harmful tendency to presume all humans are fixed quantities.
          | Once a behavior exists, the person is said to be revealed, not
         | changed. The proper response is to set boundaries and be ready
         | to tie off the garbage bag and move on if someone shows remorse
         | and desire to revert or transform. Otherwise every relationship
         | only gets worse. If instead you can never go back, extreme
          | behavior is a ratchet. Every mistake becomes the person.
        
         | Animats wrote:
         | Many arguments arise over the valuation of future money. See
         | "discount function" [1] At one extreme are the rational
         | altruists, who rate that near 1.0, and the "drill, baby, drill"
         | people, who are much closer to 0.
         | 
         | The discount function really should have a noise term, because
         | predictions about the future are noisy, and the noise increases
         | with the distance into the future. If you don't consider that,
         | you solve the wrong problem. There's a classic Roman concern
         | about running out of space for cemeteries. Running out of
         | energy, or overpopulation, turned out to be problems where the
         | projections assumed less noise than actually happened.
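        | 
        | A rough sketch of the idea (purely illustrative; the
        | horizon-dependent penalty is just one way to model the noise,
        | not a standard formulation):
        | 
        |   import math
        | 
        |   def discount(t, rate=0.03):
        |       # plain exponential discounting of a payoff t years out
        |       return math.exp(-rate * t)
        | 
        |   def noisy_discount(t, rate=0.03, noise=0.02):
        |       # effective rate grows with horizon to reflect the
        |       # widening uncertainty of long-range projections
        |       return math.exp(-(rate + noise * t) * t)
        | 
        |   for t in (1, 10, 50):
        |       print(t, round(discount(t), 3), round(noisy_discount(t), 3))
        | 
        | With the noise term the weight on 50-year projections collapses
        | toward zero; ignoring it is how you end up solving the wrong
        | problem.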
         | 
         | [1] https://en.wikipedia.org/wiki/Discount_function
        
       | animal_spirits wrote:
       | > If someone is in a group that is heading towards
       | dysfunctionality, try to maintain your relationship with them;
       | don't attack them or make them defend the group. Let them have
       | normal conversations with you.
       | 
       | This is such an important skill we should all have. I learned
       | this best from watching the documentary Behind the Curve, about
       | flat earthers, and have applied it to my best friend diving into
       | the Tartarian conspiracy theory.
        
       | dkarl wrote:
       | Isn't this entirely to be expected? The people who dominate
       | groups like these are the ones who put the most time and effort
       | into them, and no sane person who appreciates both the value and
       | the limitations of rational thinking is going to see as much
       | value in a rationalist group, and devote as much time to it, as
       | the kind of people who are attracted to the cultish aspect of
       | achieving truth and power through pure thought. There's way more
       | value there if you're looking to indulge in, or exploit, a cult-
       | like spiral into shared fantasy than if you're just looking to
       | sharpen your logical reasoning.
        
       | gadders wrote:
       | They are literally the "ackchyually" meme made flesh.
        
       | Isamu wrote:
       | So I like Steven Pinker's book Rationality, to me it seems quite
       | straightforward.
       | 
       | But I have never been able to get into the Rationalist stuff, to
       | me it's all very meandering and peripheral and focused on... I
       | don't know what.
       | 
       | Is it just me?
        
         | ameliaquining wrote:
         | Depends very much on what you're hoping to get out of it. There
         | isn't really one "rationalist" thing at this point, it's now a
         | whole bunch of adjacent social groups with overlapping-but-
         | distinct goals and interests.
        
         | handoflixue wrote:
         | https://www.lesswrong.com/highlights this is the ostensible
         | "Core Highlights", curated by major members of the community,
         | and I believe Eliezer would endorse it.
         | 
         | If you don't get anything out of reading the list itself, then
         | you're probably not going to get anything out of the rest of
         | the community either.
         | 
         | If you poke around and find a few neat ideas there, you'll
         | probably find a few other neat ideas.
         | 
         | For some people, though, this is "wait, holy shit, you can just
         | DO that? And it WORKS?", in which case probably read all of
         | this but then also go find a few other sources to counter-
         | balance it.
         | 
         | (In particular, probably 90% of the useful insights already
         | exist elsewhere in philosophy, and often more rigorously
         | discussed - LessWrong will teach you the skeleton, the general
         | sense of "what rationality can do", but you need to go
         | elsewhere if you want to actually build up the muscles)
        
       | biophysboy wrote:
       | > "There's this belief [among rationalists]," she said, "that
       | society has these really bad behaviors, like developing self-
        | improving AI, or that mainstream epistemology is really bad --
        | not just religion, but also normal 'trust-the-experts' science.
        | That can lead to the idea that we should figure it out
        | ourselves. And
       | what can show up is that some people aren't actually smart enough
       | to form very good conclusions once they start thinking for
       | themselves."
       | 
       | I see this arrogant attitude all the time on HN: reflexive
       | distrust of the "mainstream media" and "scientific experts".
        | Critical thinking is a very healthy idea, but it's dangerous
        | when people use it as a license to categorically reject
        | sources. It's
       | even worse when extremely powerful people do this; they can
       | reduce an enormous sub-network of thought into a single node for
       | many many people.
       | 
       | So, my answer for "Why Are There So Many Rationalist Cults?" is
       | the same reason all cults exist: humans like to feel like they're
       | in on the secret. We like to be in secret clubs.
        
         | ameliaquining wrote:
         | Sure, but that doesn't say anything about why one particular
         | social scene would spawn a bunch of cults while others do not,
         | which is the question that the article is trying to answer.
        
           | biophysboy wrote:
           | Maybe I was too vague. My argument is that cults need a
           | secret. The secret of the rationalist community is "nobody is
           | rational except for us". Then the rituals would be endless
           | probability/math/logic arguments about sci-fi futures.
        
             | saalweachter wrote:
             | I think the promise of secret knowledge is important, but I
             | think cults also need a second thing: "That thing you fear?
             | You're right to fear it, and only we can protect you from
             | it. If you don't do what we say, it's going to be so much
             | worse than it is now, but if you do, everything will be
             | good and perfect."
             | 
             | In the rationalist cults, you typically have the fear of
             | death and non-existence, coupled with the promise of AGI,
             | the Singularity and immortality, weighed against the AI
             | Apocalypse.
        
               | biophysboy wrote:
               | I guess I'd say protection promises like this are a form
               | of "secret knowledge". At the same time, so many cults
                | have this protection racket that you might be on to
                | something.
        
       | the_third_wave wrote:
        | _Gott ist tot! Gott bleibt tot! Und wir haben ihn getötet! Wie
        | trösten wir uns, die Mörder aller Mörder? Das Heiligste und
        | Mächtigste, was die Welt bisher besaß, es ist unter unseren
        | Messern verblutet._ ("God is dead! God remains dead! And we
        | have killed him! How shall we console ourselves, the murderers
        | of all murderers? The holiest and mightiest that the world has
        | yet possessed has bled to death under our knives.")
       | 
        | The average teenager who reads Nietzsche's proclamation on the
       | death of God thinks of this as an accomplishment, finally we got
       | rid of those thousands of years old and thereby severely outdated
       | ideas and rules. Somewhere along the march to maturity they may
       | start to wonder whether that which has replaced those old rules
       | and ideas were good replacements but most of them never come to
       | the realisation that there were rebellious teenagers during all
       | those centuries when the idea of a supreme being to which or whom
       | even the mightiest were to answer to still held sway. Nietzsche
        | saw the peril in letting go of that cultural safety valve and
        | warned of what might come next.
       | 
       | We are currently living in the world he warned us about and for
       | that I, atheist as I am, am partly responsible. The question to
       | be answered here is whether it is possible to regain the benefits
       | of the old order without getting back the obvious excesses, the
       | abuse, the sanctimoniousness and all the other abuses of power
       | and privilege which were responsible for turning people away from
       | that path.
        
       | digbybk wrote:
       | When I was looking for a group in my area to meditate with, it
       | was tough finding one that didn't appear to be a cult. And yet I
       | think Buddhist meditation is the best tool for personal growth
       | humanity has ever devised. Maybe the proliferation of cults is a
       | sign that Yudkowsky was on to something.
        
         | ivm wrote:
         | None of them are practicing _Buddhist_ meditation though, same
          | for the "personal growth" oriented meditation styles.
         | 
         | Buddhist meditation exists only in the context of the Four
         | Noble Truths and the rest of the Buddha's Dhamma. Throwing them
         | away means it stops being Buddhist.
        
           | digbybk wrote:
           | I disagree, but we'd be arguing semantics. In any case, the
           | point still stands: you can just as easily argue that these
           | rationalist offshoots aren't really Rationalist.
        
             | ivm wrote:
             | I'm not familiar enough with their definitions to argue
             | about them, but meditations techniques predate Buddhism. In
             | fact, the Buddha himself learned them from two teachers
             | before developing his own path. Also, the style of
             | meditation taught nowadays (accepting non-reactive
             | awareness) is not how it's described in the Pali Canon.
             | 
             | This isn't just a "must come from the Champagne region of
             | France, otherwise it's sparkling wine" bickering, but
             | actual widespread misconceptions of what counts as
             | Buddhism. Many ideas floating in Western discourse are
             | basically German Romanticism wrapped in Orientalist
              | packaging, matching neither Theravada nor Mahayana
             | teachings (for example, see the Fake Buddha Quotes
             | project).
             | 
             | So the semantics are extremely important when it comes to
             | spiritual matters. Flip one or two words and the whole
             | metaphysical model goes in a completely different
             | direction. Even translations add distortions, so there's no
             | room to be careless.
        
       | os2warpman wrote:
        | Rationalists are, to a man (and they're almost all men), arrogant
       | dickheads and arrogant dickheads do not see what they're doing to
       | be "a cult" but "the right and proper way of things because I am
       | right and logical and rational and everyone else isn't".
        
         | IX-103 wrote:
          | That's an unnecessary caricature. I have met many
         | rationalists of both genders and found most of them quite
         | pleasant. But it seems that the proportion of "arrogant
         | dickheads" unfortunately matches that of the general
         | population. Whether it's "irrational people" or "liberal
         | elites" these assholes always seem to find someone to look down
         | on.
        
       | jancsika wrote:
       | > And what can show up is that some people aren't actually smart
       | enough to form very good conclusions once they start thinking for
       | themselves.
       | 
       | It's mostly just people who aren't very experienced talking about
       | and dealing honestly with their emotions, no?
       | 
       | I mean, suppose someone is busy achieving and, at the same time,
       | proficient in balancing work with emotional life, dealing head-on
       | with interpersonal conflicts, facing change, feeling and
       | acknowledging hurt, knowing their emotional hangups, perhaps
       | seeing a therapist, perhaps even occasionally putting personal
       | needs ahead of career... :)
       | 
       | Tell that person they can get a marginal (or even substantial)
       | improvement from some rationalist cult practice. Their first
       | question is going to be, "What's the catch?" Because at the very
       | least they'll suspect that adjusting their work/life balance will
       | bring a sizeable amount of stress and consequent decrease in
       | their emotional well-being. And if the pitch is that this
       | rationalist practice works equally well at improving emotional
       | well-being, that smells to them. They already know they didn't
       | logic themselves into their current set of emotional issues, and
       | they are highly unlikely to logic themselves out of them. So
       | there's not much value here to offset the creepy vibes of the
       | pitch. (And again-- being in touch with your emotions means
       | quicker and deeper awareness of creepy vibes!)
       | 
       | Now, take a person whose unexplored emotional well-being _tacitly
       | depends_ on achievement. Even a marginal improvement in
       | achievement could bring perceptible positive changes in their
       | holistic selves! And you can step through a well-specified,
       | logical process to achieve change? Sign HN up!
        
       | scythe wrote:
       | One of the hallmarks of cults -- if not a necessary element -- is
       | that they tend to separate their members from the outside
       | society. Rationalism doesn't directly encourage this, but it does
       | facilitate it in a couple of ways:
       | 
       | - Idiosyncratic language used to describe ordinary things
       | ("lightcone" instead of "future", "prior" instead of "belief" or
       | "prejudice", etc)
       | 
       | - Disdain for competing belief systems
       | 
       | - Insistence on a certain shared interpretation of things most
       | people don't care about (the "many-worlds interpretation" of
       | quantum uncertainty, self-improving artificial intelligence,
       | veganism, etc)
       | 
       | - I'm pretty sure polyamory makes the list somehow, just because
       | it isn't how the vast majority of people want to date. In
       | principle it's a private lifestyle choice, but it's obviously a
       | community value here.
       | 
       | So this creates an opportunity for cult-like dynamics to occur
       | where people adjust themselves according to their interactions
       | within the community but not interactions outside the community.
       | And this could seem -- _to the members_ -- like the beliefs
       | themselves are the problem, but from a sociological perspective,
       | it might really be the inflexible way they diverge from
       | mainstream society.
        
       | kazinator wrote:
       | The only way you can hope to get a gathering of nothing but
       | paragons of critical thinking and skepticism is if the gathering
       | has an entrance exam in critical thinking and skepticism (and a
       | pretty tough one, if they are to be paragons). Or else, it's
       | invitation-only.
        
       | VonGuard wrote:
       | This is actually a known pattern in tech, going back to Engelbart
       | and SRI. While not 1-to-1, you could say that the folks who left
       | SRI for Xerox PARC did so because Engelbart and his crew became
       | obsessed with EST:
       | https://en.wikipedia.org/wiki/Erhard_Seminars_Training
       | 
       | EST-type training still exists today. You don't eat until the end
       | of the whole weekend, or maybe you get rice and little else.
       | Everyone is told to insult you day one until you cry. Then day
       | two, still having not eaten, they build you up and tell you how
       | great you are and have a group hug. Then they ask you how great
       | you feel. Isn't this a good feeling? Don't you want your loved
       | ones to have this feeling? Still having not eaten, you're then
       | encouraged to pay for your family and friends to do the training,
       | without their knowledge or consent.
       | 
       | A friend of mine did this training after his brother paid for his
       | mom to do it, and she paid for him to do it. Let's just say that,
       | though they felt it changed their lives at the time, their lives
        | in no way, shape, or form changed. Two are in quite a bad place, in
       | fact...
       | 
       | Anyway, point is, the people who invented everything we are using
       | right now were also susceptible to cult-like groups with silly
       | ideas and shady intentions.
        
         | nobody9999 wrote:
         | >EST-type training still exists today
         | 
         | It's called the "Landmark"[0] now.
         | 
         | Several of my family members got sucked into that back in the
         | early 80s and quite a few folks I knew socially as well.
         | 
         | I was quite skeptical, _especially_ because of the cult-like
          | fanaticism of its adherents. They would go on for as long as
          | you'd let them (often needing to just walk away to get them to
          | stop) trying to get you to join.
         | 
         | The goal appears to be to obtain as much legal tender as can be
         | pried from those who are willing to part with it. Hard sell,
         | abusive and deceptive tactics are encouraged -- because it's
          | _so important_ for those who haven't "gotten it" to do so,
         | justifying just about anything. But if you don't pay -- you get
         | bupkis.
         | 
         | It's a scam, and an abusive one at that.
         | 
         | [0] https://en.wikipedia.org/wiki/Landmark_Worldwide
        
       | Mizza wrote:
       | It's amphetamine. All of these people are constantly tweaking.
       | They're annoying people to begin with, but they're all constantly
        | yakked up and won't stop babbling. It's really obvious; I don't
        | know why it isn't highlighted more in all these post-Ziz
        | articles.
        
         | Muromec wrote:
         | How do you know?
        
           | ajkjk wrote:
           | Presumably they mean Adderall. Plausible theory tbh. Although
           | it's just a factor not an explanation.
        
           | tbrake wrote:
            | Having known dozens of friends, family members, roommates,
            | coworkers, etc., both before and after they started on them.
            | The two biggest telltale signs:
           | 
            | 1. A tendency to produce - out of no necessity whatsoever,
            | mind - walls of text. Walls of speech will happen too, but
            | not everyone rambles.
           | 
           | 2. Obnoxiously confident that they're fundamentally correct
           | about whatever position they happen to be holding during a
           | conversation with you. No matter how subjective or
           | inconsequential. Even if they end up changing it an hour
           | later. Challenging them on it gets you more of #1.
        
             | Muromec wrote:
              | I mean, I know the effects of adderall/ritalin and it's
              | plausible; what I'm asking is whether the GP knows that for
              | a fact or is deducing it from what is known.
        
             | MinimalAction wrote:
              | Pretty much spot on! It is frustrating to talk with these
              | people when they never admit they are wrong. They find new
              | levels of abstraction to deal with your simpler
              | counterarguments, and it is a never-ending deal unless you
              | admit they were right.
        
             | TheAceOfHearts wrote:
             | Many people like to write in order to develop and explore
             | their understanding of a topic. Writing lets you spend a
             | lot of time playing around with whatever idea you're trying
             | to understand, and sharing this writing invites others to
             | challenge your assumptions.
             | 
             | When you're uncertain about a topic, you can explore it by
             | writing a lot about said topic. Ideally, when you've
             | finished exploring and studying a topic, you should be able
             | to write a much more condensed / synthesized version.
        
             | Henchman21 wrote:
             | I call this "diarrhea of the mind". It's what happens when
             | you hear a steady stream of bullshit from someone's mouth.
             | It definitely tracks with substance abuse of "uppers", aka
             | meth, blow, hell even caffeine!
        
         | throwanem wrote:
         | Who's writing them?
        
         | samdoesnothing wrote:
         | Yeah it's pretty obvious and not surprising. What do people
         | expect when a bunch of socially inept nerds with weird
         | unchallenged world views start doing uppers? lol
        
       | JKCalhoun wrote:
       | > Many of them also expect that, without heroic effort, AGI
       | development will lead to human extinction.
       | 
       | Odd to me. Not biological warfare? Global warming? All-out
       | nuclear war?
       | 
       | I guess _The Terminator_ was a formative experience for them.
       | (For me perhaps it was _The Andromeda Strain_.)
        
         | mitthrowaway2 wrote:
         | These aren't mutually exclusive. Even in _The Terminator_ ,
          | Skynet's method of choice is nuclear war. Yudkowsky frequently
          | expresses concern that a malevolent AI might synthesize a
         | bioweapon. I personally worry that destroying the ozone layer
         | might be an easy opening volley. Either way, I don't want a
         | really smart computer spending its time figuring out plans to
         | end the human species, because I think there are too many ways
         | to be successful.
        
         | myaccountonhn wrote:
          | That's what was so strange about the EA and rationalist
          | movements: a highly theoretical model that AGI could wipe us
          | all out versus the very real issue of global warming, and yet
          | pretty much all the emphasis was on AGI.
        
         | eschaton wrote:
         | It makes a lot of sense when you realize that for many of the
         | "leaders" in this community like Yudkowsky, their understanding
         | of science (what it is, how it works, and its potential) comes
         | entirely from reading science fiction and playing video games.
         | 
         | Sad because Eli's dad was actually a real and well-credentialed
         | researcher at Bell Labs. Too bad he let his son quit school at
         | an early age to be an autodidact.
        
         | bell-cot wrote:
         | My interpretation: When they say "will lead to human
         | extinction", they are trying to vocalize their existential
         | terror that an AGI would render them and their fellow
         | rationalist cultists permanently irrelevant - by being
         | obviously superior to them, by the only metric that really
         | matters to them.
        
         | DonsDiscountGas wrote:
         | Check out "the precipice" by Tony Ord. Biological warfare and
         | global warming are unlikely to lead to total human extinction
         | (though both present large risks of massive harm).
         | 
         | Part of the argument is that we've had nuclear weapons for a
         | long time but no apocalypse so the annual risk can't be larger
         | than 1%, whereas we've never created AI so it might be
         | substantially larger. Not a rock solid argument obviously, but
         | we're dealing with a lot of unknowns.
         | 
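         | One way to make that back-of-envelope reasoning concrete (a
         | rough sketch of my own, not from the book): if the annual risk
         | were an independent p, the chance of roughly 80 apocalypse-free
         | years is (1 - p)^80, so a long quiet stretch only loosely
         | bounds p.
         | 
         |     # toy calculation, purely illustrative
         |     for p in (0.001, 0.01, 0.05):
         |         survival = (1 - p) ** 80
         |         print(f"annual risk {p}: P(no apocalypse in 80 yrs)"
         |               f" = {survival:.2f}")
         | 
         | At p = 1%, eighty quiet years still come up roughly 45% of the
         | time.
         | 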
         | A better argument is that most of those other risks are not
          | neglected; plenty of smart people are working against nuclear
          | war.
         | Whereas (up until a few years ago) very few people considered
         | AI a real threat, so the marginal benefit of a new person
         | working on it should be bigger.
        
       | 1970-01-01 wrote:
       | I find it ironic that the question is asked unempirically. Where
       | is the data stating there are many more than before? Start there,
       | then go down the rabbit hole. Otherwise, you're concluding on
       | something that may not be true, and trying to rationalize the
       | answer, just as a cultist does.
        
         | arduanika wrote:
         | Oh come on.
         | 
         | Anyone who's ever seen the sky knows it's blue. Anyone who's
         | spent much time around rationalism knows the premise of this
          | article is real. It would make zero sense to ban talking about
          | a serious and obvious problem in their community until some
          | double-blind, peer-reviewed data can be gathered.
         | 
         | It would be what they call an "isolated demand for rigor".
        
       | akomtu wrote:
       | It's a religion of an overdeveloped mind that hides from
       | everything it cannot understand. It's an anti-religion, in a
       | sense, that puts your mind on the pedestal.
       | 
       | Note the common pattern in major religions: they tell you that
       | thoughts and emotions obscure the light of intuition, like clouds
        | obscure sunlight. Rationalism is the opposite: it denies the very
        | idea of intuition, or anything above the sphere of thoughts, and
        | tells you to create as many thoughts as possible.
       | 
       | Rationalists deny anything spiritual, good or evil, because they
       | don't have evidence to think otherwise. They remain in this state
       | of neutral nihilism until someone bigger than them sneaks into
       | their ranks and casually introduces them to evil with some
       | undeniable evidence. Their minds quickly pass the denial-anger-
       | acceptance stages and being faithful to their rationalist
       | doctrine they update their beliefs with what they know. From that
        | point they are a cult. That's the story of Scientology, which has
        | too many parallels with Rationalism.
        
       | skrebbel wrote:
       | This article is beautifully written, and it's full of proper
       | original research. I'm sad that most comments so far are knee-
       | jerk "lol rationalists" type responses. I haven't seen any
       | comment yet that isn't already addressed in much more colour and
       | nuance in the article itself.
        
         | mm263 wrote:
          | As a place for meaningful discussion, HN has fallen to Reddit's
          | level. Very little substance, and it has become acceptable to
          | circlejerk about how bad your enemies are.
        
           | teh_klev wrote:
           | I have a link for you:
           | 
           | https://news.ycombinator.com/newsguidelines.html
           | 
           | Scroll to the bottom of the page.
        
         | knallfrosch wrote:
         | I think it's perfectly fine to read these articles, think
         | "definitely a cult" and ignore whether they believe in
         | spaceships, or demons, or AGI.
         | 
         | The key takeaway from the article is that if you have a group
         | leader who cuts you off from other people, that's a red flag -
         | not really a novel, or unique, or situational insight.
        
         | meowface wrote:
         | Asterisk is basically "rationalist magazine" and the author is
         | a well-known rationalist blogger, so it's not a surprise that
         | this is basically the only fair look into this phenomenon -
         | compared to the typical outside view that rationalism itself is
         | a cult and Eliezer Yudkowsky is a cult leader, both of which I
         | consider absurd notions.
        
       | SpaceManNabs wrote:
       | Cause they all read gwern and all eugenics leads into cults
        | because conspiracy-adjacent garbo always does.
        
       | dav_Oz wrote:
       | For me this was largely shaped by the westering _old Europe_
       | creaking and breaking (after two world wars) under its heavy load
       | of philosophical/metaphysical inheritance (which at this point in
       | time can be considered effectively Americanized).
       | 
       | It is still fascinating to trace back the divergent developments
       | like american-flavoured christian sects or philosophical schools
       | of "pragmatism", "rationalism" etc. which get super-charged by
       | technological disruptions.
       | 
       | In my youth I was heavily influenced by the so-called _Bildung_
       | which can be functionally thought of as a form of _ersatz
       | religion_ and is maybe better exemplified in the literary
       | tradition of the _Bildungsroman_.
       | 
       | I've grappled with and wildly fantasized about all sorts of
       | things, experimented mindlessly with all kinds of modes of
       | thinking and consciousness amidst my coming-of-age, in hindsight
       | without this particular frame of _Bildung_ left by myself I would
       | have been left utterly confused and maybe at some point acted out
       | on it. Engaging with books like _Der Zauberberg_ by Thomas Mann
       | or _Der Mann ohne Eigenschaften_ by Robert Musil calmed my
       | apparent madness; instead of the vastness of the unconscious
       | breaking the dam of my forming social self, over time I was
       | guided to develop my own way of slowly operating it, without
       | completely blowing myself up into a messiah or finding myself
       | eternally trapped in the futility and hopelessness of existence.
       | 
       | Borrowing from my background, one effective vaccination against
       | the rationalist sects described here, which spontaneously came to
       | mind, is Schopenhauer's _Die Welt als Wille und Vorstellung_,
       | which can be read as a radical continuation of Kant's Critique of
       | Pure Reason, itself an attempt to stress-test the _ratio_. [To
       | demonstrate the breadth of _Bildung_ in even something like the
       | physical sciences: e.g. Einstein was familiar with Kant's _a
       | priori_ framework of space and time, and Heisenberg's
       | autobiographical book _Der Teil und das Ganze_ was motivated by:
       | "I wanted to show that science is done by people, and the most
       | wonderful ideas come from dialog".]
       | 
       | Schopenhauer arrives at the realization because of the groundwork
       | done by Kant (which he heavily acknowledges): that there can't
       | even exist a rational basis for rationality itself, that it is
       | simply an exquisitely disguised tool in the service of the more
       | fundamental _will_, i.e. by definition an irrational force.
       | 
       | Funny little thought experiment but what consequences does this
       | have? Well, if you are declaring the _ratio_ as your _ultima
       | ratio_ you are just fooling yourself in order to be able to
       | _rationalize_ anything you _want_. Once internalized,
       | Schopenhauer's insight gets you overwhelmed by _Mitleid_ for
       | every conscious
       | being, inoculating you against the excesses of your own ratio. It
       | instantly hit me with the same force as MDMA but several years
       | before.
        
       | jameslk wrote:
        | Over-rationalizing is paperclip maximizing
        
       | idontwantthis wrote:
       | Does anyone else feel that "rationality" is the same as clinical
       | anxiety?
       | 
       | I'm hyper rational when I don't take my meds. I'm also insane.
       | But all of my thoughts and actions follow a carefully thought out
       | sequence.
        
       | rogerkirkness wrote:
        | Because they have serious emotional maturity issues, leading them
        | to lobotomize the normal human emotional side of their identity
        | and experience of life.
        
       | AndrewKemendo wrote:
        | I was on LW when it emerged from the OB blog, and back then it
        | was an interesting and engaging group, though even then there
        | were like 5 "major" contributors - most of whom had no coherent
        | academic or commercial success.
        | 
        | As soon as those "sequences" were being developed, it was clearly
        | turning into a cult around EY, which I never understood and still
        | don't.
       | 
       | This article did a good job of covering the history since and was
       | really well written.
       | 
       | Water finds its own level
        
       | thedudeabides5 wrote:
       | Purity Spirals + Cheap Talk = irrational rationalists
        
         | thedudeabides5 wrote:
         | _Eliezer Yudkowsky, shows little interest in running one. He
         | has consistently been distant from and uninvolved in
         | rationalist community-building efforts, from Benton House (the
         | first rationalist group house) to today's Lightcone
         | Infrastructure (which hosts LessWrong, an online forum, and
         | Lighthaven, a conference center). He surrounds himself with
         | people who disagree with him, discourages social isolation._
         | 
          | Ummm, EY literally had a semi-permanent office in Lighthaven
          | (at least until recently) and routinely blocks people on
          | Twitter as a matter of course.
        
         | throw0101a wrote:
         | > _Purity Spirals_
         | 
         | This is an interesting idea (phenomenon?):
         | 
         | > _A purity spiral is a theory which argues for the existence
         | of a form of groupthink in which it becomes more beneficial to
         | hold certain views than to not hold them, and more extreme
         | views are rewarded while expressing doubt, nuance, or
         | moderation is punished (a process sometimes called "moral
         | outbidding").[1] It is argued that this feedback loop leads to
         | members competing to demonstrate the zealotry or purity of
         | their views.[2][3]_
         | 
         | * https://en.wikipedia.org/wiki/Purity_spiral
        
       | gwbas1c wrote:
       | Many years ago I met Eliezer Yudkowsky. He handed me a pamphlet
       | extolling the virtues of rationality. The whole thing came across
       | _as a joke, as a parody of evangelizing._ We both laughed.
       | 
       | I glanced at it once or twice and shoved it into a bookshelf. I
       | wish I kept it, because I never thought so much would happen
       | around him.
        
         | yubblegum wrote:
          | IMO these people are promoted. You look at their backgrounds
          | and there is nothing that justifies their perches. Eliezer
          | Yudkowsky is (IIRC) a Thiel baby, isn't he?
        
         | quickthrowman wrote:
         | I only know Eliezer Yudkowsky from his Harry Potter fanfiction,
         | most notably _Harry Potter and the Methods of Rationality_.
         | 
         | Is he known publicly for some other reason?
        
           | meowface wrote:
           | He's considered the father of rationalism and the father of
           | AI doomerism. He wrote this famous article in Time magazine a
           | few years ago: https://time.com/6266923/ai-eliezer-yudkowsky-
           | open-letter-no...
           | 
           | His book _If Anyone Builds It, Everyone Dies_ comes out in a
           | month: https://www.amazon.com/Anyone-Builds-Everyone-Dies-
           | Superhuma...
           | 
           | You can find more info here:
           | https://en.wikipedia.org/wiki/Eliezer_Yudkowsky
        
         | yahoozoo wrote:
         | > early life on Eliezer's Wikipedia
         | 
         | Every single time
        
           | meowface wrote:
           | [delayed]
        
       | Liftyee wrote:
       | Finally, something that properly articulates my unease when
       | encountering so-called "rationalists" (especially the ones that
       | talk about being "agentic", etc.). For some reason, even though I
       | like logical reasoning, they always rubbed me the wrong way -
       | probably just a clash between their behavior and my personal
       | values (mainly humility).
        
       | psunavy03 wrote:
       | A problem with this whole mindset is that humans, all of us, are
       | only quasi-rational beings. We all use System 1 ("The Elephant")
       | and System 2 ("The Rider") thinking instinctively. So if you end
       | up in deep denial about your own capacity for irrationality, I
       | guess it stands to reason you could end up getting led down some
       | deep dark rabbit holes.
        
         | Muromec wrote:
         | Wasn't the "fast&slow" thingy debunked as another piece of
         | popscience?
        
           | psunavy03 wrote:
           | The point remains. People are not 100 percent rational
           | beings, never have been, never will be, and it's dangerous to
           | assume that this could ever be the case. Just like any number
           | of failed utopian political movements in history that assumed
           | people could ultimately be molded and perfected.
        
             | simpaticoder wrote:
             | Those of us who accept this limitation can often fail to
             | grasp how much others perceive it as a profound attack on
             | the self. To me, it is a basic humility - that no matter
             | how much I learn, I cannot really transcend the time and
             | place of my birth, the biology of my body, the quirks of my
             | culture. Rationality, though, promises that transcendence,
             | at least to some people. And look at all the trouble such
             | delusion has caused, for example "presentism". Science
             | fiction often introduces a hidden coordinate system, one of
             | language and predicate, upon which reason can operate, but
             | system itself did not come from reason, but rather a
             | storyteller's aesthetic.
        
           | navane wrote:
           | I think duality gets debunked every couple of hundred years
        
         | aaronbaugher wrote:
         | Some of the most irrational people I've met were those who
         | claimed to make all their decisions rationally, based on facts
         | and logic. They're just very good at rationalizing, and since
         | they've pre-defined their beliefs as rational, they never have
         | to examine where else they might come from. The rest of us at
         | least have a chance of thinking, "Wait, am I fooling myself
         | here?"
        
       | vehemenz wrote:
       | I get the impression that these people desperately want to study
       | philosophy but for some reason can't be bothered to get formal
       | training because it would be too humbling for them. I call it
       | "small fishbowl syndrome," but maybe there's a better term for
       | it.
        
         | 1attice wrote:
         | My thoughts exactly! I'm a survivor of ten years in the
         | academic philosophy trenches and it just sounds to me like what
         | would happen if you left a planeload of undergraduates on a
         | _Survivor_ island with an infinite supply of pizza pockets and
         | adderall
        
         | username332211 wrote:
         | The reason why people can't be bothered to get formal training
         | is that modern philosophy doesn't seem that useful.
         | 
         | It was a while ago, but take the infamous story of the 2006
          | rape case at Duke University. If you check out coverage of that
         | case, you get the impression every member of faculty that
         | joined in the hysteria was from some humanities department,
         | including philosophy. And quite a few of them refused to change
         | their mind even as the prosecuting attorney was being charged
         | with misconduct. Compare that to Socrates' behavior during the
         | trial of the admirals in 406 BC.
         | 
          | Meanwhile, whatever meager resistance that group faced seems
          | to have come from economists, natural scientists, or legal
          | scholars.
         | 
         | I wouldn't blame people for refusing to study in a humanities
         | department where they can't tell right from wrong.
        
           | freejazz wrote:
           | >The reason why people can't be bothered to get formal
           | training is that modern philosophy doesn't seem that useful.
           | 
           | But rationalism is?
        
             | NoMoreNicksLeft wrote:
             | Yeh, probably.
             | 
             | Imagine that you're living in a big scary world, and
             | there's someone there telling you that being scared isn't
             | particularly useful, that if you slow down and think about
             | the things happening to you, most of your worries will
             | become tractable and some will even disappear. It probably
             | works at first. Then they sic Roko's Basilisk on you, and
             | you're a gibbering lunatic 2 weeks later...
        
             | username332211 wrote:
              | Nature abhors a vacuum. After the October revolution, the
             | genuine study of humanities was extinguished in Russia and
             | replaced with the mindless repetition of rather inane
             | doctrines. But people with awakened and open minds would
             | always ask questions and seek answers.
             | 
             | Those would, of course, be people with no formal training
             | in history or philosophy (as the study of history where you
             | aren't allowed to question Marxist doctrine would be self-
             | evidently useless). Their training would be in the natural
             | sciences or mathematics. And without knowing how to
             | properly reason about history or philosophy, they may reach
             | fairly kooky conclusions.
             | 
              | Hence why Rationalism can be thought of as the same class
              | of phenomena as Fomenko's chronology (or if you want to be
             | slightly more generous, Shafarevich's philosophical
             | tracts).
        
           | fellowniusmonk wrote:
           | Philosophy is interesting in how it informs computer science
           | and vice-versa.
           | 
            | Mereological nihilism and weak emergence are interesting and
            | help protect against many forms of obsessive type- and
            | function-level cargo culting.
           | 
            | But then in some areas philosophy is woefully behind, and you
            | have philosophers pooh-poohing intuitionism when any software
            | engineer working on a sufficiently federated or real-world
            | sensor/control system borrows constructivism into their
            | classical language to not kill people (Agda is interesting,
            | of course). Intermediate logic is clearly empirically true.
           | 
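           | A minimal sketch of what "borrowing constructivism" might
           | look like in practice (a toy example under that reading, not
           | anyone's production code): instead of assuming every
           | proposition about the world is decidably true or false, the
           | control loop carries an explicit Unknown and refuses to act
           | on it.
           | 
           |     # Toy illustration: three-valued sensor logic.
           |     from enum import Enum
           | 
           |     class Tri(Enum):
           |         TRUE = 1
           |         FALSE = 0
           |         UNKNOWN = -1
           | 
           |     def door_is_clear(reading):
           |         # None models a stale or offline sensor
           |         if reading is None:
           |             return Tri.UNKNOWN
           |         return Tri.TRUE if reading > 0.5 else Tri.FALSE
           | 
           |     def command(clear):
           |         # classically, "not FALSE" would imply TRUE;
           |         # here UNKNOWN blocks the action instead
           |         return "close" if clear is Tri.TRUE else "hold"
           | 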
           | It's interesting that people don't understand the non-
           | physicality of the abstract and you have people serving the
           | abstract instead of the abstract being used to serve people.
           | People confusing the map for the terrain is such a deeply
           | insidious issue.
           | 
           | I mean all the lightcone stuff, like, you can't predict ex
            | ante which agents will be keystones in beneficial causal
            | chains, so it's such a waste of energy to spin your wheels on.
        
           | djeastm wrote:
           | Modern philosophy isn't useful because some philosophy
           | faculty at Duke were wrong about a rape case? Is that the
           | argument being made here?
        
         | samdoesnothing wrote:
         | Why would they need formal training? Can't they just read
         | Plato, Socrates, etc, and classical lit like Dostoevsky, Camus,
         | Kafka etc? That would be far better than whatever they're doing
         | now.
        
           | guerrilla wrote:
           | I'm someone who has read all of that and much more, including
           | intense study of SEP and some contemporary papers and
           | textbooks, and I would say that I am absolutely not qualified
           | to produce philosophy of the quality output by analytic
           | philosophy over the last century. I can understand a lot of
           | it, and yes, this is better than being completely ignorant of
           | the last 2500 years of philosophy as most rationalists seem
           | to be, but doing only what I have done would not sufficiently
           | prepare them to work on the projects that they want to work
           | on. They (and I) do not have the proper training in logic or
           | research methods, let alone the experience that comes from
           | guided research in the field as it is today. What we all lack
           | especially is the epistemological reinforcement that comes
           | from being checked by a community of our peers. I'm not
           | saying it can't be done alone, I'm just saying that what
           | you're suggesting isn't enough and I can tell you because I'm
           | quite beyond that and I know that I cannot produce the
           | quality of work that you'll find in SEP today.
        
           | giraffe_lady wrote:
           | This is like saying someone who wants to build a specialized
            | computer for a novel use should read the Turing paper and get
            | to it. A _lot_ of development has happened in the field in
            | the last couple hundred years.
        
       | caycep wrote:
        | Granted, from what little I've read on the outside, the
        | "rational" part just seems to be mostly the writing style - this
       | sort of dispassionate, eloquently worded prose that makes weird
       | ideas seem more "rational" and logical than they really are.
        
         | knallfrosch wrote:
          | Yes, they're not rational at all. They're just a San
          | Francisco/Bay Area cult who use that word.
        
       | mordnis wrote:
       | I really like your suggestions, even for non-rationalists.
        
       | Atlas667 wrote:
       | Narcissism and Elitism justified by material wealth.
       | 
       | What else?
       | 
        | Rationalism isn't any more "correct" or "proper" a way of
        | thinking than what Christianity and Buddhism claim to espouse.
        
       | jmull wrote:
       | I think rationalist cults work exactly the same as religious
       | cults. They promise to have all the answers, to attract the
       | vulnerable. The answers are convoluted and inscrutable, so a
        | leader/prophet interprets them. And doom is nigh, providing
       | motivation and fear to hold things together.
       | 
       | It's the same wolf in another sheep's clothing.
       | 
       | And people who wouldn't join a religious cult -- e.g. because
       | religious cults are too easy to recognize since we're all
       | familiar with them, or because religions hate anything unusual
       | about gender -- can join a rationalist cult instead.
        
       | mathattack wrote:
       | On a recent Mindscape podcast Sean Carroll mentioned that
       | rationalists are rational about everything except accusations
       | that they're not being rational.
        
         | doubleunplussed wrote:
         | I mean you have to admit that that's a bit of a kafkatrap
        
       | Jtsummers wrote:
       | > Many of them also expect that, without heroic effort, AGI
       | development will lead to human extinction.
       | 
       | > These beliefs can make it difficult to care about much of
       | anything else: what good is it to be a nurse or a notary or a
       | novelist, if humanity is about to go extinct?
       | 
       | Replace AGI causing extinction with the Rapture and you get a lot
       | of US Christian fundamentalists. They often reject addressing
       | problems in the environment, economy, society, etc. because the
       | Rapture will happen any moment now. Some people just end up stuck
       | in a belief about something catastrophic (in the case of the
       | Rapture, catastrophic for those left behind but not those
       | raptured) and they can't get it out of their head. For
       | individuals who've dealt with anxiety disorder, catastrophizing
       | is something you learn to deal with (and hopefully stop doing),
       | but these folks find a community that _reinforces_ the belief
       | about the pending catastrophe(s) and so they never get out of the
       | doom loop.
        
         | tines wrote:
         | The Rapture isn't doom for the people who believe in it though
         | (except in the lost sense of the word), whereas the AI
         | Apocalypse is, so I'd put it in a different category. And even
         | in that category, I'd say that's a pretty small number of
         | Christians, fundamentalist or no, who abandon earthly
         | occupations for that reason.
        
           | Jtsummers wrote:
           | Yes, I removed a parenthetical "(or euphoria loop for the
           | Rapture believers who know they'll be saved)". But I removed
           | it because not all who believe in the Rapture believe _they_
           | will be saved (or have such high confidence) and, for them,
           | it is a doom loop.
           | 
           | Both communities, though, end up reinforcing the belief
           | amongst their members and tend towards increasing isolation
           | from the rest of the world (leading to cultish behavior, if
           | not forming a cult in the conventional sense), and a
           | disregard for the here and now in favor of focusing on this
           | impending world changing (destroying or saving) event.
        
           | JohnMakin wrote:
           | I don't mean to well ackshually you here, but there are
           | several different theological beliefs around the Rapture,
           | some of which believe Christians will remain during the
           | theoretical "end times." The megachurch/cinema version of
           | this very much believes they won't, but, this is not the only
           | view, either in modern times or historically. Some believe
           | it's already happened, even. It's a very good analogy.
        
         | taberiand wrote:
         | Replace AGI with Climate Change and you've got an entirely
         | reasonable set of beliefs.
        
           | NoMoreNicksLeft wrote:
           | You have a very popular set of beliefs.
        
           | ImaCake wrote:
           | You can treat climate change as your personal Ragnarok, but
            | it's also possible to take a more sober view that climate
           | change is just bad without it being apocalyptic.
        
           | psunavy03 wrote:
           | You can believe climate change is a serious problem without
           | believing it is necessarily an extinction-level event. It is
           | entirely possible that in the worst case, the human race will
           | just continue into a world which sucks more than it
           | necessarily has to, with less quality of life and maybe
           | lifespan.
        
             | taberiand wrote:
             | I never said I held the belief, just that it's reasonable
        
         | taurath wrote:
         | Raised to huddle close and expect the imminent utter demise of
         | the earth and being dragged to the depths of hell if I so much
         | as said a bad word I heard on TV, I have to keep an extremely
         | tight handle on my anxiety in this day and age.
         | 
         | It's not from a rational basis, but from being bombarded with
         | fear from every rectangle in my house, and the houses of my
         | entire community
        
         | joe_the_user wrote:
         | A lot of people also believe that global warming will cause
        | terrible problems. I think that's a plausible belief, but if you
        | combine the people believing one or another of these things,
        | you've got a lot of the US.
         | 
        | Which is to say that I don't think it's just dooming that is
        | going on. In particular, the belief in AGI doom has a lot of
        | plausible arguments in its favor. I happen not to believe in it,
        | but as a belief system it is more similar to a belief in global
        | warming than to a belief in the Rapture.
        
       | pavlov wrote:
       | A very interesting read.
       | 
       | My idea of these self-proclaimed rationalists was fifteen years
       | out of date. I thought they're people who write wordy fan
       | fiction, but turns out they've reached the point of having
       | subgroups that kill people and exorcise demons.
       | 
       | This must be how people who had read one Hubbard pulp novel in
        | the 1950s felt decades later when they found out he was now
        | running a full-blown religion.
       | 
       | The article seems to try very hard to find something positive to
       | say about these groups, and comes up with:
       | 
       |  _"Rationalists came to correct views about the COVID-19 pandemic
       | while many others were saying masks didn't work and only
       | hypochondriacs worried about covid; rationalists were some of the
       | first people to warn about the threat of artificial
       | intelligence."_
       | 
       | There's nothing very unique about agreeing with the WHO, or
       | thinking that building Skynet might be bad... (The rationalist
       | Moses/Hubbard was 12 when that movie came out -- the most
       | impressionable age.) In the wider picture painted by the article,
       | these presumed successes sound more like a case of a stopped
       | clock being right twice a day.
        
       | xpe wrote:
       | > The Sequences make certain implicit promises. ...
       | 
       | Some meta-commentary first... How would one go about testing if
       | this is true? If true, then such "promises" are not written down
       | -- they are implied. So one would need to ask at least two
       | questions: 1. Did the author intend to make these implicit
       | promises? 2. What portion of readers perceive them as such?
       | 
       | > ... There is an art of thinking better ...
       | 
       | First, this isn't _implicit_ in the Sequences; it is stated
       | directly. In any case, the quote does not constitute a promise:
       | so far, it is a claim. And yes, rationalists do think there are
       | better and worse ways of thinking, in the sense of "what are more
       | effective ways of thinking that will help me accomplish my
       | goals?"
       | 
       | > ..., and we've figured it out.
       | 
       | Codswallop. This is not a message of the rationality movement --
       | quite the opposite. We share what we've learned and why we
       | believe it to be true, but we don't claim we've figured it all
       | out. It is better to remain curious.
       | 
       | > If you learn it, you can solve all your problems...
       | 
       | Bollocks. This is not claimed implicitly or explicitly. Besides,
       | some problems are intractable.
       | 
       | > ... become brilliant and hardworking and successful and happy
       | ...
       | 
       | Rubbish.
       | 
       | > ..., and be one of the small elite shaping not only society but
       | the entire future of humanity.
       | 
       | Bunk.
       | 
       | For those who haven't read it, I'll offer a relevant extended
       | quote from Yudkowsky's 2009 "Go Forth and Create the Art!" [1],
       | the last post of the Sequences:
       | 
       | ## Excerpt from Go Forth and Create the Art
       | 
       | But those small pieces of rationality that I've set out... I
       | hope... just maybe...
       | 
       | I suspect--you could even call it a guess--that there is a
       | barrier to getting started, in this matter of rationality. Where
       | by default, in the beginning, you don't have enough to build on.
       | Indeed so little that you don't have a clue that more exists,
       | that there is an Art to be found. And if you do begin to sense
       | that more is possible--then you may just instantaneously go
       | wrong. As David Stove observes--I'm not going to link it, because
       | it deserves its own post--most "great thinkers" in philosophy,
       | e.g. Hegel, are properly objects of pity. That's what happens by
       | default to anyone who sets out to develop the art of thinking;
       | they develop fake answers.
       | 
       | When you try to develop part of the human art of thinking... then
       | you are doing something not too dissimilar to what I was doing
       | over in Artificial Intelligence. You will be tempted by fake
       | explanations of the mind, fake accounts of causality, mysterious
       | holy words, and the amazing idea that solves everything.
       | 
       | It's not that the particular, epistemic, fake-detecting methods
       | that I use, are so good for every particular problem; but they
       | seem like they might be helpful for discriminating good and bad
       | systems of thinking.
       | 
       | I hope that someone who learns the part of the Art that I've set
       | down here, will not instantaneously and automatically go wrong,
       | if they start asking themselves, "How should people think, in
       | order to solve new problem X that I'm working on?" They will not
       | immediately run away; they will not just make stuff up at random;
       | they may be moved to consult the literature in experimental
       | psychology; they will not automatically go into an affective
       | death spiral around their Brilliant Idea; they will have some
       | idea of what distinguishes a fake explanation from a real one.
       | They will get a saving throw.
       | 
       | It's this sort of barrier, perhaps, which prevents people from
       | beginning to develop an art of rationality, if they are not
       | already rational.
       | 
       | And so instead they... go off and invent Freudian psychoanalysis.
       | Or a new religion. Or something. That's what happens by default,
       | when people start thinking about thinking.
       | 
       | I hope that the part of the Art I have set down, as incomplete as
       | it may be, can surpass that preliminary barrier--give people a
       | base to build on; give them an idea that an Art exists, and
       | somewhat of how it ought to be developed; and give them at least
       | a saving throw before they instantaneously go astray.
       | 
       | That's my dream--that this highly specialized-seeming art of
       | answering confused questions, may be some of what is needed, in
       | the very beginning, to go and complete the rest.
       | 
       | [1]: https://www.lesswrong.com/posts/aFEsqd6ofwnkNqaXo/go-
       | forth-a...
        
       | IX-103 wrote:
       | Is it really that surprising that a group of humans who think
       | they have some special understanding of reality compared to
       | others tend to separate and isolate themselves until they fall
       | into an unguided self-reinforcing cycle?
       | 
       | I'd have thought that would be obvious since it's the history of
       | many religions (which seem to just be cults that survived the
       | bottleneck effect to grow until they reached a sustainable
       | population).
       | 
       | In other words, humans are wired for tribalism, so don't be
       | surprised when they start forming tribes...
        
       | guerrilla wrote:
       | > One way that thinking for yourself goes wrong is that you
       | realize your society is wrong about something, don't realize that
       | you can't outperform it, and wind up even wronger.
       | 
       | I've been there myself.
       | 
       | > And without the steadying influence of some kind of external
       | goal you either achieve or don't achieve, your beliefs can get
       | arbitrarily disconnected from reality -- which is very dangerous
       | if you're going to act on them.
       | 
        | I think this and the two paragraphs preceding it are excellent
        | arguments for philosophical pragmatism and
       | empiricism. It's strange to me that the community would not have
       | already converged on that after all their obsessions with
       | decision theory.
       | 
       | > The Zizians and researchers at Leverage Research both felt like
       | heroes, like some of the most important people who had ever
       | lived. Of course, these groups couldn't conjure up a literal Dark
       | Lord to fight. But they could imbue everything with a profound
       | sense of meaning. All the minor details of their lives felt like
       | they had the fate of humanity or all sentient life as the stakes.
       | Even the guilt and martyrdom could be perversely appealing: you
       | could know that you're the kind of person who would sacrifice
       | everything for your beliefs.
       | 
       | This helps me understand what people mean by "meaning". A sense
       | that their life and actions actually matter. I've always
       | struggled to understand this issue but this helps make it
       | concrete, the kind of thing people must be looking for.
       | 
       | > One of my interviewees speculated that rationalists aren't
       | actually any more dysfunctional than anywhere else; we're just
       | more interestingly dysfunctional.
       | 
       | "We're"? The author is a rationalist too? That would definitely
       | explain why this article is so damned long. Why are rationalists
       | not able to write less? It sounds like a joke but this is
       | seriously a thing. [EDIT: Various people further down in the
       | comments are saying it's amphetamines and yes, I should have
       | known that from my own experience. That's exactly what it is.]
       | 
       | > Consider talking about "ethical injunctions:" things you
       | shouldn't do even if you have a really good argument that you
       | should do them. (Like murder.)
       | 
       | This kind of defeats the purpose, doesn't it? Also, this is
       | nowhere justified in the article, just added on as the very last
       | sentence.
        
       | duckmysick wrote:
       | > And yet, the rationalist community has hosted perhaps half a
       | dozen small groups with very strange beliefs (including two
       | separate groups that wound up interacting with demons). Some --
       | which I won't name in this article for privacy reasons -- seem to
       | have caused no harm but bad takes.
       | 
       | So there's six questionable (but harmless) groups and then later
       | the article names three of them as more serious. Doesn't seem
       | like "many" to me.
       | 
       | I wonder what percentage of all cults are the rationalist ones.
        
       | hax0ron3 wrote:
       | The premise of the article might just be nonsense.
       | 
       | How many rationalists are there in the world? Of course it
        | depends on what you mean by rationalist, but I'd guess that there
        | are, at the very least, several tens of thousands of people in
        | the world who either consider themselves rationalists or are
       | involved with the rationalist community.
       | 
       | With such numbers, is it surprising that there would be half a
       | dozen or so small cults?
       | 
       | There are certainly some cult-like aspects to certain parts of
       | the rationalist community, and I think that those are interesting
       | to explore, but come on, this article doesn't even bother to
       | establish that its title is justified.
       | 
       | To the extent that rationalism does have some cult-like aspects,
       | I think a lot of it is because it tends to attract smart people
       | who are deficient in the ability to use avenues other than
       | abstract thinking to comprehend reality and who enjoy making
       | loosely justified imaginative leaps of thought while
       | overestimating their own abilities to model reality. The fact
       | that a huge fraction of rationalists are sci-fi fans is not a
       | coincidence.
       | 
       | But again, one should first establish that there is anything
       | actually unusual about the number of cults in the rationalist
       | community. Otherwise this is rather silly.
        
       | pizzadog wrote:
       | I have a lot of experience with rationalists. What I will say is:
       | 
       | 1) If you have a criticism about them or their stupid name or how
       | "'all I know is that I know nothing' how smug of them to say
       | they're truly wise," rest assured they have been self
       | flagellating over these criticisms 100x longer than you've been
       | aware of their group. That doesn't mean they succeeded at
       | addressing the criticisms, of course, but I can tell you that
       | they are self aware. Especially about the stupid name.
       | 
       | 2) They are actually well read. They are not sheltered and
       | confused. They are out there doing weird shit together all the
       | time. The kind of off-the-wall life experiences you find in this
       | community will leave you wide eyed.
       | 
       | 3) They are genuinely concerned with doing good. You might know
       | about some of the weird, scary, or cringe rationalist groups. You
       | probably haven't heard about the ones that are succeeding at
       | doing cool stuff because people don't gossip about charitable
       | successes.
       | 
       | In my experience, where they go astray is when they trick
       | themselves into working beyond their means. The basic underlying
       | idea behind most rationalist projects is something like "think
       | about the way people suffer everyday. How can we think about
       | these problems in a new way? How can we find an answer that
       | actually leaves everyone happy?" A cynic (or a realist, depending
       | on your perspective) might say that there are many problems that
       | fundamentally will leave some group unhappy. The overconfident
       | rationalist will challenge that cynical/realist perspective until
       | they burn themselves out, and in many cases they will attract a
       | whole group of people who burn out alongside them. To consider an
       | extreme case, the Zizians squared this circle by deciding that
       | the majority of human beings didn't have souls and so "leaving
       | everyone happy" was as simple as ignoring the unsouled masses. In
       | less extreme cases this presents itself as hopeless idealism, or
       | a chain of logic that becomes so divorced from normal
       | socialization that it appears to be opaque. "This thought
       | experiment could hypothetically create 9 quintillion cubic units
       | of Pain to exist, so I need to devote my entire existence towards
       | preventing it, because even a 1% chance of that happening is
       | horrible. If you aren't doing the same thing then you are now
       | morally culpable for 9 quintillion cubic units of Pain. You are
       | evil."
       | 
       | Most rationalists are weird but settle into a happy place far
       | from those fringes where they have a diet of "plants and
       | specifically animals without brains that cannot experience pain"
       | and they make $300k annually and donate $200k of it to charitable
       | causes. The super weird ones are annoying to talk to and nobody
       | really likes them.
        
       | gwbas1c wrote:
       | One thing I'm having trouble with: The article assumes the reader
       | knows some history about the rationalists.
       | 
       | I listened to a podcast that covered some of these topics, so I'm
       | not lost; but I think someone who's new to this topic will be
        | very, very confused.
        
         | clueless wrote:
         | I'm curious, what was the podcast episode?
        
       | kanzure wrote:
       | Here are some other anti-lesswrong materials to consider:
       | 
       | https://aiascendant.com/p/extropias-children-chapter-1-the-w...
       | 
       | https://davidgerard.co.uk/blockchain/2023/02/06/ineffective-...
       | 
       | https://www.bloomberg.com/news/features/2023-03-07/effective...
       | 
       | https://www.vox.com/future-perfect/23458282/effective-altrui...
       | 
       | https://qchu.substack.com/p/eliezer
       | 
       | https://x.com/kanzure/status/1726251316513841539
        
       | antithesizer wrote:
       | Little on offer but cults these days. Take your pick. You
       | probably already did long ago and now your own cult is the only
       | one you'll never clock as such.
        
       ___________________________________________________________________
       (page generated 2025-08-12 23:00 UTC)