[HN Gopher] Effective altruism has a sexual harassment problem, ...
       ___________________________________________________________________
        
       Effective altruism has a sexual harassment problem, women say
        
       Author : s17n
       Score  : 75 points
       Date   : 2023-02-03 19:10 UTC (3 hours ago)
        
 (HTM) web link (time.com)
 (TXT) w3m dump (time.com)
        
       | wendyshu wrote:
       | A lot of people in EA have poor social skills and a lot of people
       | in the community become friends and lovers. That probably
       | explains most of this phenomenon.
        
         | s17n wrote:
         | You seem to be implying that "poor social skills" are a valid
         | excuse for sexual misconduct?
        
           | amanaplanacanal wrote:
           | An explanation is not necessarily an excuse.
        
       | empathy_m wrote:
       | Is EA still a thing? I thought the post-rationalist and e/acc
       | communities were now the forefront of human civilization in the
       | Bay Area.
        
       | WeylandYutani wrote:
       | Tech bros gonna tech bro. This is the nerd equivalent of the
       | high school football team.
        
       | ChrisMarshallNY wrote:
       | Sounds a lot like the "Peace & Love" movement of the sixties
       | (hippies).
       | 
       | A _lot_ of what went on, in those days, would be considered rape,
       | slavery, various types of coercion and larceny, etc., these days.
        
       | BWStearns wrote:
       | Least surprising thing ever when the movement's logic resembles
       | the philosophical version of pickup artistry (and I'd bet good
       | money that it's a thick middle of the Venn diagram) in terms of
       | justifying some shitty behavior with rhetorical sleights of hand.
        
       | ergonaught wrote:
       | I wonder how many times we have to see that generalizing (outside
       | of an immediate emergency scenario) is frequently harmful and
       | always idiotic before we quit doing it.
       | 
       | Let's ChatGPT this thing and see if it can do better than humans.
       | 
       | Prompt: "Is it true that Effective Altruism has a toxic culture
       | of sexual harassment and abuse?"
       | 
       | Response: "The effective altruism community, like any other
       | community, may have individuals who engage in harmful behaviors,
       | including sexual harassment and abuse. However, it is not
       | accurate to make blanket statements about an entire community
       | based on the actions of a few individuals. It is important for
       | all communities to take the issue of sexual harassment and abuse
       | seriously and to have systems in place for addressing these
       | issues when they arise."
       | 
       | Bummer. LLMs win.
        
         | supermet wrote:
         | Hate speech for me, but not for thee.
         | 
         | See if you can have the LLM rewrite this article, but for BLM,
         | freemasonry, or grooming Pakistani immigrants.
        
       | lbwtaylor wrote:
       | The article links to this EA community post on 'interpersonal
       | harm' in the community, which I found interesting.
       | https://forum.effectivealtruism.org/posts/NbkxLDECvdGuB95gW/...
       | 
       | One item caught my eye: >>There are also cases where I find a
       | report to be alarming and would like to take action, but the
       | person reporting does not want it known that they spoke up. In
       | these cases, sometimes there's very little that can be done
       | without breaking their confidentiality.
       | 
       | Yikes. That would not meet reporting standards at most
       | organizations I've been a part of. There are a lot of hard
       | lessons behind requiring mandatory investigation of credible
       | claims.
        
         | The_Colonel wrote:
         | If you don't respect the desired confidentiality, you'll get
         | fewer people confiding in you.
         | 
         | There are similar practices regarding rape - victims can, for
         | example, have the biological evidence secured without
         | automatically triggering a criminal investigation. The logic
         | being that this (critical) step should be as risk-free /
         | barrier-free as possible.
         | 
         | > There are a lot of hard lessons behind requiring mandatory
         | investigation of credible claims.
         | 
         | Worth keeping in mind that companies are always primarily
         | trying to cover their backs and these policies often reflect
         | that.
        
           | lbwtaylor wrote:
           | >> There are a lot of hard lessons behind requiring mandatory
           | investigation of credible claims.
           | 
           | >Worth keeping in mind that companies are always primarily
           | trying to cover their backs and these policies often reflect
           | that.
           | 
           | Your comment was a fair one when it comes to corporations,
           | but I was thinking of churches and community groups more like
           | EA than corporations. Bad actors thrive in secrecy,
           | particularly when they can use their power to create
           | repercussions for people reporting bad acts, creating a
           | culture of silence. Mandatory investigation is one of the few
           | effective ways to resolve that. That's why it's the policy
           | and/or law in many circumstances.
           | 
           | If my group had the same policy as EA, I would be very
           | uncomfortable with it.
        
       | somecompanyguy wrote:
       | good
        
       | s17n wrote:
       | (Original title was too long for HN, did my best)
        
       | zozbot234 wrote:
       | This is not about effective altruism as in the generic practice
       | of high-impact giving, it relates to a highly specific subculture
       | of EA proponents. "Polycules", 'nuff said. These are not
       | appropriate topics in a professional discussion among strangers.
        
         | pcthrowaway wrote:
         | It's a mission-focused group, and such groups usually include
         | some amount of socializing. It's not the same as a workplace
         | (although people frequently mention their partners in a
         | workplace also)
         | 
         | There are certainly polyamorous people who behave poorly, but
         | that doesn't mean people with multiple lovers should be held to
         | different standards than monogamous people are, just because
         | their romantic orientation places them in the minority.
        
       | cat_plus_plus wrote:
       | We as humans can choose to elevate our sexuality as the ultimate
       | union with a soulmate and a dedication of our talents to bringing
       | up the next generation. Or we can behave like rats in the gutter.
       | If women choose to associate themselves with the latter group,
       | they shouldn't be shocked when they are disrespected. If they
       | join people who are actually doing altruism, like tutoring kids or
       | helping army vets, they are likely to meet some decent guys.
        
       | cwkoss wrote:
       | "Effective Altruism" is one of those things like "All Lives
       | Matter" where what it says on the box is not reflective of the
       | way that people who identify with the ideology practice it.
        
         | prox wrote:
         | So it is doublespeak?
        
       | mise_en_place wrote:
       | Of course the "utopian" types happen to be the most degenerate.
       | Ironic if you think about it. There is something seriously wrong
       | with our culture when men and women can no longer pair up
       | efficiently together and have to resort to these shenanigans in
       | order to get laid. It is a symptom of how sick our society has
       | become.
        
       | eddsh1994 wrote:
       | You can see the EA forums response to this post here:
       | https://forum.effectivealtruism.org/posts/JCyX29F77Jak5gbwq/...
        
       | haunter wrote:
       | >Thousands have signed a pledge to tithe at least 10% of their
       | income to high-impact charities. From college campuses to Silicon
       | Valley startups, adherents are drawn to the moral clarity of a
       | philosophy dedicated to using data and reason to shape a better
       | future for humanity. Effective altruism has become something of a
       | secular religion for the young and elite.
        
       | theragra wrote:
       | If there are fewer cases of abuse than among the general
       | populace, it is weird to blame the movement. From the article, it
       | is hard to conclude that EA is somehow worse than most of the
       | USA.
       | 
       | Also, arguments for polyamory are just that, arguments. You
       | certainly can press someone into it, but from the article, the
       | impression is that it is more like persuasion.
       | 
       | Regarding cult dynamics - any tight knit community feels like
       | this. Be it psychedelics users, health nuts, athletes etc etc.
       | All of these will have takes that outsiders will consider unusual
       | and weird.
        
         | pdonis wrote:
         | _> arguments for polyamory are just that, arguments_
         | 
         | Which have nothing whatever to do with effective altruism. So
         | it's perfectly reasonable for a person who came to a gathering
         | expecting to talk about effective altruism, to be
         | uncomfortable, to say the least, when she finds herself getting
         | proselytized about polyamory.
        
           | olliecornelia wrote:
           | It's never who you want to be polyamorous who's polyamorous.
        
         | ketzo wrote:
         | > but from the article, the impression is that it is more like
         | persuasion.
         | 
         | Okay -- but if you showed up to a tech conference, and someone
         | in the hallway was trying to "persuade" you to join a
         | threesome, would you feel that was appropriate for the setting?
         | The issue is that so much of this relationship-and-sex talk is
         | happening to people who didn't think they had signed up for it.
         | That's where you start verging on abuse.
         | 
         | I think part of the issue is that EA is:
         | 
         | - kind of life-defining by nature
         | 
         | - filled with people who seek self-improvement
         | 
         | - filled with people who are excellent persuaders
         | 
         | With that mix, it is uncomfortably easy to start "mixing
         | business with pleasure," so to speak. People in that
         | environment think that they live a really interesting and good
         | life, and want to convince others of that fact.
         | 
         | That's why people are blaming the movement, I think.
        
         | andrewflnr wrote:
         | > If there are less cases of abuse than among generic populace
         | 
         | Where are you getting this?
        
         | zozbot234 wrote:
         | Arguments for polyamory are also regarded as grossly
         | unprofessional in any environment that's focused on a specific
         | goal. Most people don't want to be goaded by strangers into
         | "arguing" about their relationship status. It's abusive.
        
       | jmeister wrote:
       | Does Time's readership have any clue about Effective Altruism?
        
         | 1shooner wrote:
         | I think this may be a bit of an HN bubble. I can't imagine most
         | people have a need for a social or philosophical framework for
         | their philanthropic identity.
        
         | eddsh1994 wrote:
         | After SBF and FTX, yes
        
         | TacticalCoder wrote:
         | Well, there's this high-profile figure in the EA community who
         | happened to have his name tied to FTX, not just for being SBF's
         | guru but also for being on the FTX payroll in FTX's early days.
         | 
         | But not only that: it gets better, for he happens to have bought
         | a mansion in the UK, supposedly for EA, with $15m of stolen
         | money SBF gave him (a mansion which may be clawed back; I sure
         | do hope it is).
         | 
         | The dude is a teacher at some fancy uni in the UK.
         | 
         | Tells me all I need to know about EA and the kind of people
         | higher up the echelons there. It's obviously highly
         | manipulative, and I'm not surprised at all to see manipulative
         | gurus misappropriating money and using their manipulative
         | tactics to prey on women.
         | 
         | P.S.: has he altruistically given back the mansion acquired
         | with stolen money?
        
       | supermet wrote:
       | [dead]
        
       | freejazz wrote:
       | Effective Altruism has a "make me think whoever is involved in
       | this isn't a grifty scammer weirdo" problem
        
       | willcipriano wrote:
       | Back in my day effective altruism was mainly about finding
       | charities that aren't essentially scams (way harder than it
       | looks). The scene has apparently moved on to other things since
       | I last followed it, a decade or so ago.
        
         | ketzo wrote:
         | This is a super-simplified summary, but I think it's generally
         | accurate.
         | 
         | The basic thesis of EA is "it is your duty to improve/save as
         | many human lives as you can."
         | 
         | At some point, a lot of EAs realized that there were many more
         | _future_ humans than _present_ humans.
         | 
         | Once you widen your scope in that way, you start realizing that
         | long-term, catastrophic risks (climate change, nuclear
         | disaster, misaligned AI) would affect a _lot_ more human lives
         | -- billions or trillions more -- than basically anything we
         | could do today.
         | 
         | So the logic becomes -- why would I spend time/money on
         | mosquito nets when we need to be securing the literal future of
         | the human race?
        
           | bena wrote:
           | Because we do that with mosquito nets.
           | 
           | EA seems like a way to achieve nothing while looking like
           | you're doing everything. No one expects you to fly to Mars
           | tomorrow. And that's true every single day. It's true today.
           | It'll be true tomorrow. It was true yesterday. It was true 10
           | years ago. It will be true 10 years from now.
           | 
           | So if no one really expects you to fully achieve your goal,
           | all you have to do is kinda look like you're trying and that
           | will be good enough for most people.
           | 
           | EA takes a good, hard look at all these good intentions and
           | says, "Fuck, this would make a baller ass road".
           | 
           | However, if we solve malaria, that's another thing not
           | killing us. Another problem checked off. Like polio. Or
           | smallpox. Colonize Mars? Fucking how? We can't even get the
           | environment on Earth under control. How the living fuck are
           | we going to create an environment on another fucking planet,
           | much less even get there?
           | 
           | So how about we figure out a way to get the garbage out of
           | the ocean. Or how to scrub CO2 from the air. How to
           | manufacture and produce without so many polluting side
           | effects. We keep doing all these smaller things. Put in the
           | work, and one day we will save all those trillions of
           | potential lives. But it requires putting in the work.
           | 
           | Edit: Not saying you believe it. But presenting the counter-
           | argument to EA.
        
         | wendyshu wrote:
         | I guess GiveWell still does that, but I'm not really sure what
         | everyone else in the movement does.
        
           | amanaplanacanal wrote:
           | Givewell is certainly my go to.
        
             | willcipriano wrote:
             | Looks good I'll check it out.
        
       | sonjat wrote:
       | It's unclear if the issue is EA or how to handle misbehavior in
       | organizations without formal structure or hierarchy. It isn't
       | like a workplace, with reasonably well-defined boundaries, but
       | something more akin to religion, where its influence bleeds over
       | heavily into many aspects of one's life. As such, it is probably
       | both more devastating when one is the victim of misconduct and
       | also more difficult to police such misconduct. I am not really
       | sure what the answer here is. "Believe all women" is a great
       | slogan, but I am not a fan of a "guilty until proven innocent"
       | standard (and I say this as a woman). OTOH, this isn't a criminal
       | procedure and as such, one shouldn't have to prove beyond a
       | reasonable doubt that someone is preying on others to enforce
       | some level of punishment. It's a tough problem.
        
         | Y_Y wrote:
         | You should be able to punish people even though there's
         | reasonable doubt that they are culpable? Are you arguing for a
         | "balance of probabilities" standard? Or that it's worth
         | punishing some innocents so that the guilty are also punished?
        
           | sonjat wrote:
           | I think I am arguing for a "balance of probabilities". If (to
           | spout off a random hypothetical) the punishment is something
           | like banning someone from EA conferences, then there
           | definitely needs to be evidence of their misconduct, but that
           | level of evidence doesn't need to be the same as if they are
           | looking at a criminal conviction. The point is that balancing
           | the need to protect the victim while not punishing the
           | innocent is a difficult issue outside the criminal courtroom.
        
       | xchip wrote:
       | This is a free article, and for the same reason that photography
       | books usually have a naked lady on the cover, the story in these
       | free articles is also always about sex or something that appeals
       | to your instincts.
       | 
       | Chances are this story was exaggerated to some degree to grab
       | your attention so that you subscribe.
       | 
       | If something is free, you are the product.
        
         | [deleted]
        
         | xhkkffbf wrote:
         | I waded through many, many paragraphs to discover that some of
         | the crimes included people making comments that made others
         | feel uncomfortable. I was kind of expecting more.
         | 
         | So, yes, it looks like I was the product here. It made me feel
         | uncomfortable!
        
       | raincom wrote:
       | You see the same set of problems in any movement of super-wealthy
       | men, especially if you are a woman.
        
       | supersour wrote:
       | > EA is diffuse and deliberately amorphous; anybody who wants to
       | can call themselves an EA... But with no official leadership
       | structure, no roster of who is and isn't in the movement, and no
       | formal process for dealing with complaints, Wise argues, it's
       | hard to gauge how common such issues are within EA compared to
       | broader society.
       | 
       | This passage reminded me of this article:
       | https://www.jofreeman.com/joreen/tyranny.htm
       | 
       | Moral of the story: be wary of groups with low accountability
       | and vague power structures. In a vacuum, power structures will
       | always emerge, so it's generally better for them to exist in the
       | light than in the dark.
        
         | Sakos wrote:
         | I think it's bizarre that EA seems to be a movement with power
         | structures. I always just thought EA was a philosophy and based
         | on that I felt it was an interesting idea. I don't have to
         | worry about sexual harassment when I'm considering Plato or
         | Stoicism. Why is it a thing with EA?
        
           | LarryMullins wrote:
           | The EA "philosophy" is strongly tied up in libertarian
           | utilitarian ways of thinking, and such people are able to
           | talk themselves into believing that it's rational to defer to
           | people smarter/richer than themselves. They get money,
           | intelligence and virtue all mixed up and confused with each
           | other. Being intelligent gets you money, money buys virtue,
           | those who are smartest will become richest and those who are
           | richest will be able to buy the most virtue.
           | 
           | Power structures emerge naturally from this.
        
           | luckylion wrote:
           | Something can describe both a philosophy and a movement. The
           | movement always has hierarchies and power structures. The
           | philosophy doesn't, but then again, it's often presented by
           | the movement so the lines get blurry.
        
             | Sakos wrote:
             | This is insightful, thanks.
        
           | lbwtaylor wrote:
           | When there is a lot of money moving around, it seems
           | inevitable that power structures will form around it.
        
             | gruez wrote:
             | That might be true, but there isn't exactly an "Effective
             | Altruism Foundation" that all the donations funnel through.
             | You basically have a bunch of random people that are
             | donating to charities that are well regarded, with a few
             | philanthropists here and there setting up foundations. You
             | might be able to hijack organizations like GiveWell (i.e.
             | organizations that tell people which charities they should
             | donate to), but trying to monetize that is tricky. At the
             | end of the day you don't really control anything, because
             | you're still reliant on individuals following your advice.
             | So if you try to funnel money to your own foundation (for
             | embezzling purposes), you will easily burn any goodwill
             | that you've built up.
        
         | ketzo wrote:
         | I see that essay linked every six months or so, and I swear
         | every time I read it, a new element of it rings true to me.
         | Really timeless, invaluable writing on the way groups of humans
         | work.
        
       | jfengel wrote:
       | The worst thing about being smart is how easy it is to talk
       | yourself into believing just about anything. After all, you make
       | really good arguments.
       | 
       | EA appeals to exactly that kind of really-smart-person who is
       | perfectly capable of convincing themselves that they're always
       | right about everything. And from there, you can justify all kinds
       | of terrible things.
       | 
       | Once that happens, it can easily spiral out from there. People
       | who know perfectly well they're misbehaving will claim that they
       | aren't, using the same arguments. It won't hold water, but now
       | we're swamped, and the entire thing crumbles.
       | 
       | I'd love to believe in effective altruism. I already know that my
       | money is more effective in the hands of a food bank than giving
       | people food myself. I'd love to think that could scale. It would
       | be great to have smarter, better-informed people vetting things.
       | But I don't have any reason to trust them -- in part because I
       | know too many of the type of people who get involved and aren't
       | trustworthy.
        
         | dfgheaoinbt6t wrote:
         | Effective altruism is, I think, an ideology perfectly suited to
         | ensnare a certain kind of person.
         | 
         | Conventional wisdom would say that wielding wealth and power
         | like effective altruism demands requires humility, compassion,
         | and maturity. It requires wisdom. Effective altruism can seem
         | to remove the need for that. Doing good is about calculation,
         | not compassion! Interpersonal failings don't matter if someone
         | is really good with C++. One needn't care about the feelings of
         | others if there are more _efficient_ ways to use the time.
         | 
         | Effective altruism calls on the rich and capable to recognize
         | their own power to help those who are poor and helpless.
         | However, it is easy for pity to turn to contempt and for
         | clarity of purpose to turn to arrogance. The poor, hungry, and
         | sick of the world need the effective altruist for a savior. The
         | effective altruist is _better_ than the rest because they are
         | making the world a better place.
         | 
         | An effective altruist may confuse immaturity with wisdom and
         | greed with generosity.
         | 
         | This is not meant to be a diatribe. I find much of effective
         | altruism obviously true and find my exposure to it has made me
         | a better person. If pressed, I would probably call myself an
         | effective altruist. Still, it is greatly concerning that people
         | like Elon Musk or Sam Bankman-Fried can be associated with
         | effective altruism without any real hypocrisy.
        
         | ben_w wrote:
         | > EA appeals to exactly that kind of really-smart-person who is
         | perfectly capable of convincing themselves that they're always
         | right about everything. And from there, you can justify all
         | kinds of terrible things.
         | 
         | Yup.
         | 
         | Which is super-ironic given the association with big-R
         | Rationality, Less Wrong, Overcoming Bias, all of which quote
         | Feynman saying "The first principle is that you must not fool
         | yourself, and you are the easiest person to fool."
         | 
         | Now I have the mental image of the scene in _The Life of Brian_
         | where the crowd mindlessly parrots Brian's call for them to
         | think for themselves.
        
         | gruez wrote:
         | Your point seems superficially valid, but where do we go from
         | there?
         | 
         | >The worst thing about being smart is how easy it is to talk
         | yourself into believing just about anything. After all, you
         | make really good arguments.
         | 
         | >EA appeals to exactly that kind of really-smart-person who is
         | perfectly capable of convincing themselves that they're always
         | right about everything. And from there, you can justify all
         | kinds of terrible things.
         | 
         | Should we _not_ talk ourselves into believing stuff?
         | Should smart people specifically avoid changing their beliefs
         | out of fear of  "justify all kinds of terrible things"?
         | 
         | >I'd love to believe in effective altruism. I already know that
         | my money is more effective in the hands of a food bank than
         | giving people food myself. I'd love to think that could scale.
         | It would be great to have smarter, better-informed people
         | vetting things. But I don't have any reason to trust them -- in
         | part because I know too many of the type of people who get
         | involved and aren't trustworthy.
         | 
         | So you don't trust donating money to food banks or malaria nets
         | because you "don't have any reason to trust them", then what?
         | Don't donate any money at all? Give up trying to maximize
         | impact and donate to whatever you feel like donating to?
        
           | AnIdiotOnTheNet wrote:
           | > Should we not talk ourselves into believing stuff?
           | Should smart people specifically avoid changing their beliefs
           | out of fear of "justify all kinds of terrible things"?
           | 
           | It's simple really: just be skeptical of your own reasoning
           | because you're aware of your own biases and fallibility. Be a
           | good scientist and be open to being wrong.
           | 
           | > So you don't trust donating money to food banks or malaria
           | nets because "don't have any reason to trust them", then
           | what?
           | 
           | No, they don't trust that you can scale the concept of "food
           | banks are more effective than I am" to any kind of
           | maximization. You can still donate to worthy causes and
           | effective organizations.
           | 
           | > Don't donate any money at all? Give up trying to maximize
           | impact and donate to whatever you feel like donating to?
           | 
           | Yeah, basically. Giving is more helpful than not giving, so
           | even a non-maximalist approach is better than nothing.
           | Perfect is the enemy of good, aim for good.
        
             | gruez wrote:
             | >It's simple really: just be skeptical of your own
             | reasoning because you're aware of your own biases and
             | fallibility. Be a good scientist and be open to being
             | wrong.
             | 
             | This just seems like generic advice to me which is
             | theoretically applicable to everyone. Is there any evidence
             | of effective altruists not doing that, or of this being
             | specifically a problem with "really-smart-person"s?
             | 
             | >No, they don't trust that you can scale the concept of
             | "food banks are more effective than I am" to any kind of
             | maximization. You can still donate to worthy causes and
             | effective organizations.
             | 
             | I'm not quite understanding what you're arguing for here.
             | Are you saying that you disagree with effective altruists'
             | assessment that you should be funding malaria nets in
             | Africa or whatever (i.e. what they want you to do), rather
             | than donating to local food banks (ie. what you want to
             | do)?
             | 
             | >Yeah, basically. Giving is more helpful than not giving,
             | so even a non-maximalist approach is better than nothing.
             | Perfect is the enemy of good, aim for good.
             | 
             | To be clear, you're arguing for donating to whatever your
             | gut tells you, rather than trying to maximize benefit?
        
           | tablespoon wrote:
           | >> EA appeals to exactly that kind of really-smart-person who
           | is perfectly capable of convincing themselves that they're
           | always right about everything. And from there, you can
           | justify all kinds of terrible things.
           | 
           | > Should we not talk ourselves into believing into stuff?
           | Should smart people specifically avoid changing their beliefs
           | out of fear of "justify all kinds of terrible things"?
           | 
           | The GP is talking about self-deception. And yes, we should
           | not deceive ourselves.
        
             | gruez wrote:
             | Okay, but how does this translate into actionable advice?
             | Nobody sets out to intentionally deceive themselves.
             | Telling people that "we should not deceive ourselves" is
             | basically as helpful as "don't be wrong".
        
               | mistermann wrote:
               | How about this: in swinger communities, they have safe
               | words that they can use to transcend the playtime aspect
               | of reality - how about we develop something similar for
               | internet arguments, a term that is mutually agreed upon
               | _in advance_, and when uttered by a participant it kicks
               | off a _well documented (and agreed upon)_ protocol where
               | all participants downgrade System 1 thinking to zero and
               | upgrade System 2 thinking to 11...and, all participants
               | carefully monitor each other to ensure that people are
               | executing the agreed upon plan successfully?
               | 
               | This general approach works quite well (with practice) in
               | many other domains, maybe it would also work for
               | arguments/beliefs.
        
               | gruez wrote:
               | Wouldn't this devolve into name calling almost
               | immediately? On internet arguments it's already implied
               | that you're bringing forth logical points and not just
               | spouting off what you feel in the heat of the moment.
               | Invoking the safe word is basically a thinly veiled
               | attempt at calling the other party irrational and
               | emotional.
        
               | dkqmduems wrote:
               | It's called science.
        
               | gruez wrote:
               | That's the same problem as before. Outside of _maybe_
               | fundamentalist religious people who think their religious
               | text is the final word on everything, everybody agrees
               | that  "science" is the best way of finding out the truth.
               | The trouble is that they disagree on what counts as
               | science (ie. which scientists/institutions/studies to
               | trust). When the disagreement is at that level, casually
               | invoking "science" misses the point entirely.
        
             | TigeriusKirk wrote:
             | I've never met anyone who didn't deceive themselves in
             | significant ways.
        
         | PragmaticPulp wrote:
         | > EA appeals to exactly that kind of really-smart-person who is
         | perfectly capable of convincing themselves that they're always
         | right about everything. And from there, you can justify all
         | kinds of terrible things.
         | 
         | I came to the same conclusion after a group of my friends got
         | involved with the local rationalist and EA community, though
         | for a different reason: Their drug habits.
         | 
         | They believed themselves to have a better grasp on human nature
         | and behavior than the average person, and therefore believed
         | they were better at controlling themselves. They also had a
         | deep contrarian bias, which turned into a belief that drugs
         | weren't actually as bad as the system wanted us to believe.
         | 
         | Combine these two factors and they convinced themselves that
         | they could harness recreational opioid use to improve their
         | lives, but avoid the negative consequences that "normies"
         | suffered by doing it wrong. I remember being at a party where
         | several of them were explaining that they were on opioids
         | _right now_ and tried to use the fact that nothing terrible was
         | happening as proof that they were performing rational drug use.
         | 
         | Long story short, the realities of recreational opioid use
         | caught up with them and they were blind to the warning signs
         | due to their hubris. I intentionally drifted away from that
         | group around that time, so I don't know what happened to them.
         | 
         | I will never forget how confident they were that addiction is
         | something that only happens to other people, not rationalists
         | like them.
        
           | telotortium wrote:
           | No, they should have listened to their parents! Drug
           | positivity is supposed to be reserved for cannabis and
           | hallucinogens!
        
           | teraflop wrote:
           | I'm reminded of a fascinating series of Reddit threads,
           | starting back in 2009, from somebody who convinced himself he
           | "could handle anything once" and decided to try heroin, only
           | to rapidly spiral out of control:
           | 
           | https://www.reddit.com/r/BestofRedditorUpdates/comments/wef6.
           | ..
        
             | scarmig wrote:
             | The issue with Reddit is that the story is as likely as not
             | to be fake. Particularly here, I don't think people are at
             | risk of serious withdrawal after only two weeks of heroin
             | use.
             | 
             | Though heroin use is obviously one of the dumbest things
             | anyone can do.
        
             | pavel_lishin wrote:
             | I'm only on the second update, and it seems like this guy
             | speed-ran addiction:
             | 
             | > I can't stop crying. Fuck heroin. Fuck my life. I guess I
             | don't need to say that since heroin pretty much fucked my
             | life for me in under two weeks, I just want to die.
        
           | DaveExeter wrote:
           | >recreational opioid use
           | 
           | Horse? Were they shooting horse?
        
         | JohnFen wrote:
         | Effective Altruism is just a modern iteration of a thing that's
         | been around for a very long time. The fundamental idea is
         | sound. However, in practice, it all-too-easily devolves into
         | something really terrible. Especially once people start down
         | the path of thinking the needs of today aren't as important as
         | the needs of a hypothetical future population.
         | 
         | Personally, I started "tithing" when my first business was a
         | success. In part because it's good to help the less fortunate,
         | but also as an ethical stance. Having a business drove home
         | that no business can be successful without the support of the
         | community it starts in, so it's only right to share in the
         | rewards.
         | 
         | So, I give 10% back. I have rules about it:
         | 
         | I always give to a local group who directly helps people and
         | who is typically overlooked for charitable giving. I get to
         | know the group pretty well first.
         | 
         | I never give to any group that won't keep my identity a secret.
         | 
         | I never give to any group that asks me for money.
         | 
         | I don't always give in the form of money. Sometimes, it's in
         | the form of my time and effort, or in material goods, etc.
         | 
         | I don't give to "umbrella" groups whose purpose is fundraising
         | for a collection of other groups. This isn't because I have a
         | problem with them, but because they're not the ones who
         | struggle the most to get donations.
        
           | pdonis wrote:
           | _> I get to know the group pretty well first._
           | 
           | I think this is a very, very, very important step. What's
           | more, it's a step that I don't think can be outsourced to
           | someone else, which is why I'm skeptical about claims by,
           | among others, the Effective Altruism movement, to be able to
           | do this kind of thing on your behalf.
        
             | gruez wrote:
             | >What's more, it's a step that I don't think can be
             | outsourced to someone else, which is why I'm skeptical
             | about claims by, among others, the Effective Altruism
             | movement, to be able to do this kind of thing on your
             | behalf.
             | 
             | Why can't this be done? Society in general outsources due
             | diligence to third parties all the time. Banks outsource
             | credit worthiness assessments to credit bureaus. Passive
             | investors outsource price discovery to other market
             | participants. Online shoppers outsource quality control to
             | reviewers. I agree that there's no substitute for doing it
             | yourself, but it's simply not realistic in many cases to do
             | the due diligence yourself. Even if you do it yourself,
             | there's no guarantee that you'll do a better job than the
             | professionals.
        
           | dfgheaoinbt6t wrote:
           | >Especially once people start down the path of thinking the
           | needs of today aren't as important as the needs of a
           | hypothetical future population.
           | 
           | It's not that that bothers me so much as the fact that many
           | effective altruists do it _so badly_. We need to be concerned
           | with the future. That is the only reason to maintain roads
           | and bridges, to prevent pollution, or to conserve resources
           | like water in aquifers and helium. But effective altruists
           | are as likely to talk about colonizing Mars as they are to
           | talk about global warming.
           | 
           | Effective altruism is supposedly about making evidence-based
           | decisions. We have no idea how likely "existential risks"
           | are. We have no idea what, if anything, can be done about
           | them. We cannot predict a year into the future, let alone
           | millennia. So-called longtermism is nothing more than
           | guesswork.
        
             | gruez wrote:
             | >It's not that that bothers me so much as the fact that
             | many effective altruists do it so badly. [...] But
             | effective altruists are as likely to talk about colonizing
             | Mars as they are to talk about global warming.
             | 
             | Are they doing it badly, or are you not understanding their
             | arguments? AFAIK effective altruists want to colonize Mars
             | on x-risk grounds, which would explain why they want to
             | prioritize that over global warming, even though the latter
             | is happening right now. AFAIK they think that global
             | warming is bad, but isn't an existential risk, whereas
             | colonizing Mars will mitigate many existential risks.
        
               | freejazz wrote:
               | >Are they doing it badly, or are you not understanding
               | their arguments?
               | 
               | Do YOU not understand their arguments? They are facially
               | stupid. The notion that we should be colonizing mars
               | because of global warming is the stupidest thing I've
               | ever read or heard.
        
               | gruez wrote:
               | >The notion that we should be colonizing mars because of
               | global warming is the stupidest thing I've ever read or
               | heard.
               | 
               | Yeah, because that's a strawman you imagined in your
               | head. I'm not sure what gave you the impression that the
               | two were related (other than that they're competing
               | options) based on my previous comment.
        
               | freejazz wrote:
               | Someone posted it upthread. You can replace any
               | catastrophic event with global warming and it's just as
               | facially stupid. Literally like the thought process of a
               | child. It's completely divorced from reality.
        
               | dfgheaoinbt6t wrote:
               | I understand their arguments just fine. I just don't
               | think they make any sense.
               | 
               | Ought implies can. We _cannot_ predict the far future of
               | humanity. We _cannot_ colonize other planets in the
               | foreseeable future. We _cannot_ plan how to handle future
               | technology that we aren't yet sure is even possible.
               | 
               | The things we actually can predict and control, like
               | global warming and natural disasters and pandemics, are
               | handled with regular old public policy. Longtermism,
               | almost by definition, refers to things we can neither
               | predict nor control.
        
               | yamtaddle wrote:
               | I've yet to see an argument for colonizing Mars for this
               | purpose, that wouldn't be a better argument if the goal
               | were instead "build robust, distributed bunkers on earth,
               | and pay families to live in them part-time so there's
               | always someone there".
               | 
               | Cheaper, and more effective.
               | 
               | Most plausible post-apocalyptic Earths would be far
               | easier to live on than Mars.
               | 
               | The remaining threats that wouldn't also be pretty likely
               | to take out Mars at the same time, would be something
               | like a whole-crust-liquifying impact, which we'd have a
               | pretty good chance of spotting well in advance, and we
               | could put some of the savings into getting better at
               | that.
               | 
               | I think a bunch of smart people are also just romantics
               | when it comes to space shit, and that's why they won't
               | shut up about Mars, not because it's actually a good
               | idea.
               | 
               | Hell, building orbital habs is probably a better idea
               | than colonizing Mars, for those purposes, if we _must_ do
               | space shit.
        
               | walleeee wrote:
               | > Most plausible post-apocalyptic Earths would be far
               | easier to live on than Mars.
               | 
               | thank you
               | 
               | how are we supposed to build a second home on a dead,
               | ruthlessly hostile planet until we demonstrate ourselves
               | capable of stabilizing the biosphere and building a
               | sustainable long-term civilization here
               | 
               | Earth is easy mode compared to Mars
        
               | WalterBright wrote:
               | > how are we supposed to build a second home on a dead,
               | ruthlessly hostile planet until we demonstrate ourselves
               | capable of stabilizing the biosphere and building a
               | sustainable long-term civilization here
               | 
               | Because we can afford to make big mistakes in
               | terraforming a dead, ruthlessly hostile planet.
        
               | LarryMullins wrote:
               | A few microscopic fungal-like spore things could throw a
               | wrench in that. Now the planet is a nature reserve.
        
               | yamtaddle wrote:
               | Right--living on Mars is like living on Earth if it
               | ambient surface radiation levels were significantly
               | higher, nothing would grow in the soil anywhere without a
               | ton of preparation, and you couldn't leave your house
               | without a pressure suit. And there's no surface water.
               | And the gravity's fucked up. And the temperatures suck
               | for basically anything life-related. And none of the
               | geological and chemical processes that keep our biosphere
               | viable existed, at all.
               | 
               | So... like Earth if several apocalypses happened at once,
               | including a few nigh-impossible ones. Except it starts
               | that way. And it's actually even worse than that. Sure,
               | maybe we could slam some comets into it and do a ton of
               | other sci-fi shit over a few centuries and it'd
               | eventually get better, sorta, a little--but c'mon,
               | seriously?
        
               | zozbot234 wrote:
               | Why Mars though? Why not colonize the Gobi Desert first?
        
               | thaumasiotes wrote:
               | Presumably because the goal is to survive something bad
               | that happens to Earth. If you're on Mars (and self-
               | sustaining...), that's no big deal. If you're in the Gobi
               | Desert, you're going to be the _first_ people to get
               | wiped out by whatever happens to Earth.
        
               | travisjungroth wrote:
               | x-risk is existential risk, as in humans get wiped out.
               | Some big ones are meteor impact, nuclear war and disease.
               | The risk of those things ending all of humanity is
               | greatly reduced with a second planet. They're not reduced
               | with a desert colony.
        
               | Dylan16807 wrote:
               | > The risk of those things ending all of humanity is
               | greatly reduced with a second planet.
               | 
               | I can imagine a situation where that's true. But right
               | now, for almost any situation, a series of super-bunkers
               | is orders of magnitude cheaper and more effective. A lot
               | of ridiculously destructive things can happen to Earth
               | and it will still be a better place to live than Mars.
        
               | [deleted]
        
               | goatlover wrote:
               | None of those things would make Earth less hospitable
               | than Mars. A desert colony would still be better off than
               | trying to survive on Mars, particularly once Earth's
               | resources are cut off. Mars is far more hostile than
               | anything likely to happen to Earth over the next hundred
               | million years.
        
             | JohnFen wrote:
             | Oh, I agree! I didn't mean to imply that being concerned
             | with the future isn't critically important. It is. I like
             | how you put it better -- it's that they do it so badly.
        
             | mistermann wrote:
             | > It's not that that bothers me so much as the fact that
             | many effective altruists do it so badly.
             | 
             | I feel the same about Rationalists and rationality. They
             | even had an excellent approach with their motto: "We're
             | only _aspiring_ rationalists", but when you remind them of
             | that motto in the process of them being not actually
             | rational, it has no effect.
             | 
             | There's got to be a way to ~solve something that is so in
             | your face, _like right there in an argument, the very
             | essence_, but it is a very tricky phenomenon, it always
             | finds a way to slip out of any corner you back it into.
        
           | HWR_14 wrote:
           | > I never give to any group that asks me for money.
           | 
           | I get how this would keep you personally from being annoyed,
           | but it seems to incentivize worse outcomes. "Let's collect
           | all the money we can, we never know if we'll get more. Let's
           | grow that reserve" vs. "In a bad month we can get our usual
           | donors/JohnFen to give us his annual donation a little
           | early".
        
           | tptacek wrote:
           | _I never give to any group that asks me for money._
           | 
           | Far be it from me to second-guess anybody's giving (motes and
           | beams and all that) but this rules out many of the most
           | effective aid organizations, all of which are absolutely off-
           | the-charts obnoxious about fundraising --- because it works.
        
             | LarryMullins wrote:
             | If it works for those orgs, then those orgs don't need his
             | money anyway.
        
             | invalidOrTaken wrote:
             | > because it works
             | 
             | apparently not!
        
             | zozbot234 wrote:
             | > many of the most effective aid organizations, all of
             | which are absolutely off-the-charts obnoxious about
             | fundraising
             | 
             | This doesn't seem to jibe much with what's reported by
             | charity evaluators like GiveWell, or with what kinds of
             | charitable organizations get grants from more traditional
             | but still high-impact philanthropies like the B&MGF.
             | 
             | It's quite plausible that too much emphasis on fund raising
             | among the general public distorts incentives within these
             | charities and makes them less likely to be highly effective
             | on average. If so, we're better off when the job of
             | publicly raising charitable donations is spun off to
             | separate organizations, such as GiveWell or more generally
             | the EA movement itself.
        
               | tptacek wrote:
               | Fundraising expenses are a huge problem with large
               | charities, but it doesn't follow that fundraising
               | _annoyingness_ is a huge problem. It's not a customer
               | service problem with donors; it's a "using too much of
               | proceeds on fundraising" problem.
        
               | scarmig wrote:
               | If an organization believes a marginal dollar of money on
               | their programs is the best way to improve the world, then
               | spending $10 to get $11 in donations allows them to spend
               | an extra dollar on it. It's rational and even morally
               | required. (The only potential negative being the extent
               | to which winning a contribution pulls funding away from
               | other causes.)
               | 
               | More generally, people overly emphasize low
               | administrative expenses as a sign of quality. You need
               | overhead to effectively administrate and evaluate
               | programs.
        
           | somecompanyguy wrote:
           | true. and those of us that are not psychopaths don't even have
           | to think about it. it just happens.
        
           | tablespoon wrote:
           | > Effective Altruism is just a modern iteration of a thing
           | that's been around for a very long time.
           | 
           | Which you think is what, exactly? I'm under the impression
           | that thing is warmed-over utilitarianism.
           | 
           | > The fundamental idea is sound.
           | 
           | I do not believe utilitarianism is sound, because its logic
           | can be easily used to justify some obviously horrible things.
           | However the framework appeals very strongly to "rationalist"
           | type people.
        
             | michaelt wrote:
             | The "10% of lifetime income to charity" pledge is pretty
             | close to Christian tithing, Islamic zakat, and suchlike.
             | Those traditions also claim to be spending donations to
             | help the poorest people in society, and with low waste.
             | 
             | Of course, EA has a bunch of other weird stuff like AI
             | safety, which isn't an idea that's been around for
             | millennia.
        
               | czzr wrote:
               | Well, actually, on AI safety:
               | https://en.m.wikipedia.org/wiki/Golem
        
             | kiba wrote:
             | _I do not believe utilitarianism is sound, because its
             | logic can be easily used to justify some obviously horrible
             | things. However the framework appeals very strongly to
             | "rationalist" type people. _
             | 
             | If it sounds horrible, then it probably is?
             | 
             | The logical chain of hurting a human leading to helping two
             | humans doesn't sound like something that is moral or
             | dependable.
             | 
             | Giving to charities that focus on the most severe and
             | urgent problems of humanity is a very straightforward idea.
             | 
             | However, not all charities are focused on the most urgent
             | problems. For example, a local charity I frequent does
             | improv and comedy theater, hardly 'urgent' needs. What
             | people don't like to hear is that they could donate money
             | to a third world NGO providing vaccines, or fighting
             | corruption in third world countries, instead of to their
             | local church or community theater.
             | 
             | Don't get me wrong, community theaters/churches/etc are
             | good things. They just aren't saving lives.
        
             | gruez wrote:
             | >I do not believe utilitarianism is sound, because its
             | logic can be easily used to justify some obviously horrible
             | things.
             | 
             | Right, but that doesn't mean that we shouldn't care about
             | consequences at all. There's a pretty big gap between
             | "given that we have scarce resources, we should maximize
             | the impact of our use of them" and "committing this atrocity
             | is fine because the utility calculations work out".
        
         | bena wrote:
         | I wouldn't say "really-smart-person" but rather a reasonably
         | smart person who believes they are way more intelligent than
         | they actually are. People who mistake competence in one area
         | for expertise in all areas.
        
       | kazinator wrote:
       | Effective altruism is a cult rooted in pseudo-intellectual
       | garbage ... that, we are in no way surprised to learn, has a
       | sexual harassment problem too.
        
       | stuaxo wrote:
       | Well, this is unsurprising.
        
       ___________________________________________________________________
       (page generated 2023-02-03 23:01 UTC)