[HN Gopher] Future of Humanity Institute shuts down
___________________________________________________________________
Future of Humanity Institute shuts down
Author : rdl
Score : 172 points
Date : 2024-04-17 15:14 UTC (7 hours ago)
(HTM) web link (www.futureofhumanityinstitute.org)
(TXT) w3m dump (www.futureofhumanityinstitute.org)
| mikece wrote:
| Possibly because the future of humanity itself is in question?
|
| "We're not going to make it, are we? Humans, I mean." --John
| Connor
| JohnKemeny wrote:
| Or because Nick Bostrom is a hack. Together with Max Tegmark _.
| Not to mention Ray Kurzweil.
|
| _ Initially wrote wrong surname.
| patrulo wrote:
| elaborate?
| busterarm wrote:
| academia is a vicious nest of vipers.
| FabHK wrote:
| You mean Max Tegmark?
|
| I mean, some of their ideas are way out there, and obviously
| if not tempered can lead to bad consequences (e.g. SBF/FTX).
| But a hack?
| observationist wrote:
| Laying SBF at the feet of effective altruism is more than a
| little silly. It'd be similar to associating the Democratic
| Party with Bernie Madoff, or Epstein with any of
| the universities and foundations he donated to.
|
| One of the things criminals do to hide their activity,
| assuage their guilt, or put on a good show is associate
| with legitimate organizations. Social organizations
| typically don't and probably shouldn't do intrusive
| investigations into the lives and activities of their
| members sufficient to uncover their crimes.
|
| When such organizations learn of any benefits accrued
| through the illegal activities of members, there are
| obviously moral and sometimes legal requirements to
| disassociate, disavow, and return ill-gotten funds.
|
| Tegmark and Bostrom should by no means be tainted by
| association with EA. Their ideas and work on AI alignment
| and safety are excellent, and they ask the questions that
| should be answered as AI starts to reach human levels of
| competence and beyond.
| johngossman wrote:
| https://forum.effectivealtruism.org/posts/qFEwQbetaaSpvHm
| 9e/...
|
| Not so silly that the EA community didn't talk about what
| went wrong.
| observationist wrote:
| EAs tend to be a bunch of intellectual windbags, by and
| large, and overestimate their impact on the world at
| every level at which it's possible to do so. Just because they
| collectively gasped and claimed responsibility doesn't
| mean their interpretation has anything to do with
| reality.
|
| The reality is SBF was a con man, and however complex his
| motivations and personality and psychological issues,
| whatever his ultimate intent, he willfully scammed a lot
| of money from a lot of people. EA might have been an
| influence, but SBF is a human with complex agency whose
| actions can't and shouldn't be reduced to membership or
| association with a community.
|
| Going from "here's a set of good ideas about how to
| effectively give to charity, since we see a lot of
| corruption and inefficiency in charities" to "we have a
| moral duty to ensure that we only associate with good
| people" is how you go from a good idea to a pompous
| internet cult.
| arduanika wrote:
| You're correct that reputation laundering is a thing, but
| in the FTX case, the facts do not match that pattern.
| These were true believers.
| helboi4 wrote:
| I really did not like Max Tegmark's AI book. It felt like bad
| science fiction.
| esafak wrote:
| I have not read any paper by Tegmark that seems hacky:
| https://arxiv.org/search/cs?searchtype=author&query=Tegmark,...
|
| Could you point to some?
| goatlover wrote:
| Would you also say Sean Carroll is a hack for his strong
| support of the MWI?
| patrulo wrote:
| Curious as to why they didn't explore independence, moving from
| Oxford to a separate non-profit org. Getting funded doesn't seem
| impossible given the amount of high-profile e/acc advocates.
| rdl wrote:
| I think they're the enemies of e/acc, and their EA background
| (due to FTX, SBF, etc.) taints them a bit when it comes to
| fundraising.
| CorruptedArc wrote:
| Not to mention the "big picture" EAs did damage when put in
| charge of the money FTX was doling out. They had people who'd
| been funding organizations that would feed, clothe, and
| medicate people move their funding to "existential problems"
| like AI and climate science. While I won't argue the value of
| those, vague theoreticals are arguably more commonly
| tackled than those on the ground trying to help with
| active issues.
| constantcrying wrote:
| I think you are confusing EA and e/acc. EA was about
| billionaires defrauding the population who entrusted them with
| their money to give away to whatever cause was deemed best by
| the current "scientific" consensus in EA. Causes can include
| things like the suffering of fish or AI ending the known
| universe.
|
| e/acc is about doing capitalism so hard that AI will solve
| every problem.
| esafak wrote:
| That's not true. sbf was about that, but EA is not. Are there
| other cases of EA fraud?
| vkou wrote:
| He was doing fraud, but the rest are mentally masturbating
| about problems that are about as relevant to the world as
| the number of angels that can fit on the head of a pin.
| Emma_Goldman wrote:
| Well, that's the point in question. For its critics, the
| whole enterprise is a misguided sham. See Leif Wenar's
| stimulating recent piece, 'The Deaths of Effective
| Altruism', which sees SBF less as aberration than as symptom.
|
| https://www.wired.com/story/deaths-of-effective-altruism/
|
| I think EA was naive from the start, both in its
| unreflective moral realism, and its reduction of
| complicated questions of social theory down to back-of-the-
| envelope utility calculations based on threadbare empirics.
| 77pt77 wrote:
| > "which charity saves the most lives?"
|
| > "None of them," said a young Australian woman to
| laughter. Out came story after story of the daily
| frustrations of their jobs. Corrupt local officials,
| clueless charity bosses, the daily grind of cajoling poor
| people to try something new without pissing them off.
|
| Never fails to disappoint...
| JohnFen wrote:
| > For its critics, the whole enterprise is a misguided
| sham.
|
| I'm a critic, but I don't think it's a sham. I do think
| it's very misguided and I very strongly suspect it has
| evolved into a cult, but it's not a sham.
| constantcrying wrote:
| >Are there other cases of EA fraud?
|
| You mean besides the legions of pseudo-philosopher techbros
| pontificating about how they can save the world and
| organize sex parties while doing so?
| kaashif wrote:
| It's okay if you don't like what they're doing and think
| it's stupid, but that doesn't make it fraud.
|
| Is having sex parties now considered fraud? I need to
| speak to my lawyer.
| karma_pharmer wrote:
| For a nonprofit research group with no patents/IP/etc, simply
| starting over with the same people is actually a better
| strategy than trying to "move". Nothing is lost (no capital,
| equipment, etc) and you don't have to deal with entanglements
| with the previous institution.
| gizajob wrote:
| Because without institutional funding, one's sci-fi has to
| appeal to a mass audience to pay its way.
| sinuhe69 wrote:
| What an irony! The Bostrom Institute has to close in the midst of
| the spectacular rise of AI, and the all-important questions of AI
| risks and human value alignment are more hotly debated than ever.
|
| I hope that Bostrom and his colleagues will continue and deepen
| their research, because humanity needs such insights now more
| than ever.
| EarthAmbassador wrote:
| How does one research and generate insights for what does not
| exist yet? Is there a framework? At SXSW this year, there were
| a couple of talks about forecasting, as a tool for futurism,
| but where ethics are concerned, I've not seen a good template.
| Maybe all of this is obvious, yet I'm curious so I'm asking.
| skeeter2020 wrote:
| I'm no expert and struggled to read Bostrom's
| Superintelligence, but my interpretation was it started with a
| lot of broad, hand-wavy factors & historical interpretations,
| layered on some pretty tenuous projections presented as
| inevitable, and wanked off with deep thought experiments. I get
| that this sort of open-ended exploration has value, but I'm not
| sold on its "prioritized value" when compared with other areas
| and directions.
| christkv wrote:
| It was literally under the department of intellectual
| wankery. Them shutting it down is the pot calling the
| kettle....
| karma_pharmer wrote:
| I just finished reading it.
|
| It's extremely repetitive; he could have written a book
| one-third of its length without leaving anything out.
| There's some good stuff in there, but I was sort of annoyed
| that the author forces you to read everything three times.
| Kinda disrespectful to his readers' available free time.
| mitthrowaway2 wrote:
| > How does one research and generate insights for what does
| not exist yet?
|
| It's Zeno's paradox of research: we cannot think about what
| does not already exist, therefore nothing new can ever be
| brought into existence!
| pfdietz wrote:
| I call this "nothing can ever happen for the first time".
| One often hears it in passive-aggressive arguments against
| renewable energy.
| robertlagrant wrote:
| Never heard that in that context. Can you cite an
| example?
| pfdietz wrote:
| I constantly see the argument "no country has ever been
| powered by solar and wind", implying that it therefore
| can't be done, and that one should use nuclear instead
| (never mind the same is true of nuclear; not even France
| is fully nuclear powered).
| nwiswell wrote:
| > never mind the same is true of nuclear; not even France
| is fully nuclear powered
|
| This seems somewhat bad-faith: nuclear does supply the
| _majority_ of France 's power, and since nuclear is a
| "base load" part of the mix, it would be inefficient to
| get 100% of power from nuclear rather than a peaking-
| friendly mix that includes e.g. hydro and gas.
| pfdietz wrote:
| Focusing only on the grid is bad faith; if one includes
| non-grid energy use, France doesn't even get 50% of its
| energy from nuclear.
| kragen wrote:
| it seems plausible that all new things are brought into
| existence unintentionally. that is, you tried to create
| something you imagined, but what you actually created was
| something else. certainly there are many examples of
| thinking about renewable energy that were completely
| wrong because they were based on presumptions that no
| longer hold, or in some cases were already wrong but in a
| nonobvious way
|
| as my wife points out, we can't even imagine any of the
| things that do actually exist, only drastic
| simplifications thereof
| mise_en_place wrote:
| Is FHI even needed anymore? His ideas clearly won and took hold
| of a large group of people. AI doomerism is the default stance
| most people have on AI. e/acc makes up a tiny fraction of
| popular discourse today.
| beepbooptheory wrote:
| Geez if this is what it all looks like for AI to be unpopular
| I can't even imagine what it's going to look like when
| everyone is _actually_ into AI.
| Maro wrote:
| This 2023 incident around founder Nick Bostrom may have something
| to do with it:
|
| https://en.wikipedia.org/wiki/Nick_Bostrom#1996_email_contro...
|
| From the wikipedia article:
|
| > In January 2023, Bostrom issued an apology for a 1996 email
| where he had stated that he thought "Blacks are more stupid than
| whites", and where he also used the word "niggers" in a
| description of how he thought this statement might be perceived
| by others. The apology, posted on his website, stated that "the
| invocation of a racial slur was repulsive" and that he
| "completely repudiate[d] this disgusting email". In his apology,
| he wrote "I think it is deeply unfair that unequal access to
| education, nutrients and basic healthcare leads to inequality in
| social outcomes, including sometimes disparities in skills and
| cognitive capacity."
|
| Edit: also adding the second paragraph from Wikipedia to avoid
| accusations of smearing:
|
| > In January 2023, Oxford University told The Daily Beast, "The
| University and Faculty of Philosophy is currently investigating
| the matter but condemns in the strongest terms possible the views
| this particular academic expressed in his communications." In
| August 2023, the investigation concluded (according to a letter
| Bostrom posted on his website) that "we do not consider [Bostrom]
| to be a racist or that [he holds] racist views, and we consider
| that the apology [he] posted in January 2023 was sincere."
|
| Another member of this institute was effective altruist William
| MacAskill, who has appeared in media in 2023 in this context:
| "Sam Bankman-Fried and Will MacAskill weren't just philosophical
| allies. They were old friends."
|
| https://time.com/6262810/sam-bankman-fried-effective-altruis...
|
| From the OP:
|
| > In late 2023, the Faculty of Philosophy decided that the
| contracts of the remaining FHI staff would not be renewed. On 16
| April 2024, the Institute was closed down.
|
| I'm just speculating, but it stands to reason that the Institute
| became a PR net negative for Oxford (or this specific Faculty).
| busterarm wrote:
| FHI/Bostrom's relationship with the university was damaged for
| years before the apology email.
|
| They'd already frozen funding and hiring years earlier which
| means their issues go back even further than that.
| malfist wrote:
| Wow, that's disgusting
| letmeinhere wrote:
| They may also be insolvent due to FTX bankruptcy clawing back
| grants.
| helboi4 wrote:
| Even the apology is sort of terrible. I'm not sure that I ever
| want it implied that black people have lower "cognitive
| capacity", whether or not it's said with recognition of social
| factors outside of our control. Social outcomes =/= cognitive
| capacity. And the causal factor here is poverty > race.
| exo-pla-net wrote:
| This is a smear taken out of context. What Bostrom actually had
| to say is both accurate and mild:
|
| > "I have always liked the uncompromisingly objective way of
| thinking and speaking: the more counterintuitive and repugnant
| a formulation, the more it appeals to me given that it is
| logically correct," the quoted excerpt begins. "Take for
| example the following sentence: Blacks are more stupid than
| whites. I like that sentence and I think it is true.
|
| "But recently I have begun to believe that I won't have much
| success with most people if I speak like that. They would think
| that I were a 'racist': that I _disliked_ black people and
| thought it is fair if blacks are treated badly. I don't. It's
| just that based on what I have read, I think it is probable
| that black people have a lower average IQ than mankind in
| general, and I think that IQ is highly correlated with what we
| normally mean by 'smart' and 'stupid'. I may be wrong about the
| facts, but that is what the sentence means for me. For most
| people, however, the sentence seems to be synonymous with: 'I
| hate those bloody n------!!!!"
| busterarm wrote:
| It also leaves out that the university investigated and came
| to the seemingly-rare conclusion that the controversial
| statement didn't indicate that he's a racist.
| myko wrote:
| I don't think this helps his case, but I do appreciate seeing
| the email in question
| helboi4 wrote:
| Every time race comes up on HackerNews I am shocked at how
| horrifyingly racist (some) users of this site are. Not only
| did a user somehow think that this context would exonerate
| this very racist man, both you and I are getting
| immediately downvoted for disagreeing. There was a post
| last week or so that was so full of racist comments it just
| got taken down. I wonder what on earth brings together
| HackerNews and racism like this.
| empath-nirvana wrote:
| It's because the people who run the site tolerate it and
| ban/warn people for making comments like yours.
| zo1 wrote:
| You know, topics like this are not always black and
| white. There is a full range of nuance and discussion.
|
| I'd also wager that the downvotes here are because this
| kind of flame-bait comment is not appropriate for HN,
| or, if it is appropriate, some might not think it's
| contributing to the discussion anyway.
|
| Me, I think the refusal by some to admit (or accept) that
| the full-context post adds to the discussion and to
| instead double-down and cry more racism is definitely not
| constructive.
|
| I'm honestly getting tired of these "race card" low-blows
| and one-sided thinking shutting down conversation.
| helboi4 wrote:
| I'm not sure what you think about my comments is flame
| bait. I'm having a discussion about whether or not these
| comments are racist, since someone brought them up.
|
| This is not pulling out the "race card". I made
| absolutely no "low blows". Saying that it's racist to call
| black people lower in IQ than whites is not an
| unreasonable thing to say. I have no problem with anyone
| bringing up context. I just think implying that anything
| about the statement is "accurate" is racist. I'm not sure
| what is so controversial about stating that saying white
| people are inherently smarter than other races is a white
| supremacist talking point.
|
| And it's not just me; anyone saying absolutely anything in
| support of the idea that what this guy said was bad is
| being downvoted, no matter how restrained their comment.
|
| I would also much rather people replied to me rather than
| just downvote. That would be a discussion.
| daveguy wrote:
| > I'd also wager that the downvotes here are because this
| flame-bait kind of comments are not appropriate for HN,
| or if they are appropriate then some might not think it's
| contributing to the discussion anyways.
|
| It's odd to me that calling racism racism when a parent
| called it not racism is either non-contributing or
| inflammatory. Seems to me it is warranted.
| exo-pla-net wrote:
| An assertion without rationale is noise or worse.
|
| "That's racist; cancel them!" falls in the latter
| category. It's the mindless baying of the rabble. You
| don't _engage_ with the rabble, as there 's no fixing
| stupid. You just hope they shut up, so that you and the
| other adults can think, and you hope that the rabble
| burns down someone else's house.
|
| Analogously, there's not much to be gained from engaging
| with someone shouting "Allahu Akbar; death to infidels!"
| That's drone purview.
|
| I hope that helps.
| daveguy wrote:
| Conflating a response of "yes it is" (when someone claims
| something is not racist and it clearly is) with "that's
| racist, cancel them!" seems disingenuous. As disingenuous
| as conflating it with "Allahu Akbar; death to infidels".
| Correctly identifying a statement as racist when someone
| else said it wasn't is about as far as you can get from
| "death to infidels!" Do you see them as equivalent?
| cauch wrote:
| But the author himself says that his sentence is
| repugnant.
|
| Are you saying that the author has no rationale for saying
| that?
|
| It really looks like nowadays we cannot say "that looks
| racist" without being accused of being the great satan who
| wants to cancel everyone. If that kind of ad hominem isn't
| itself baseless, noisy, and inflammatory, I don't know what
| is.
|
| You can, if you want, argue that in your view this
| statement was fine and not racist. But you don't do just
| that; you also say that people who disagree with you
| deserve to be down-voted and that the forum would be better
| if their voices were not there at all. It's difficult not
| to see in that exactly a justification for "canceling" an
| opinion you just don't like.
| exo-pla-net wrote:
| > Are you saying that the author has no rational to say
| that?
|
| Sure, instrumental rationale: PR.
|
| And, because I believe that Bostrom says what he means,
| Bostrom probably _does_ think what he said was repugnant,
| but probably not in a way that you would find satisfying.
| Bostrom probably thinks that speaking truthfully about
| vulnerable people, in a manner that could distress them
| (e.g. owing to their misunderstanding of the truthful
| words, or in a "truth hurts" sort of way), is morally
| repugnant. Better to spare them suffering. If I am
| correct, I disagree with Bostrom. Having to cater to
| delicate and low-IQ sensibilities is a wrench in the
| wheels of intellectual discourse, as well as a dystopian
| blow to personal expression. Don't let the scolds win.
|
| > It really looks like nowadays we cannot say "that looks
| racist"
|
| You don't have a license to denigration. Think very
| carefully, and consider the possibility that you are
| wrong, before you cast stones.
|
| > You can, if you want, defend that according to you this
| statement was fine and not racist.
|
| But what I quoted contained my rationale? If it ain't
| good enough for you, the onus is on _you_ to prove
| that Bostrom is, in fact, a witch. The ball is in your
| court.
|
| > Difficult to not see there exactly a "cancelation" of
| an opinion you just don't like.
|
| Any opinion at all, and _especially_ opinions that differ
| from my own, I 'd welcome at the table, as long as said
| opinion is articulated and epistemically rationalized by
| someone who is smart and who has given it careful
| thought. If you're not capable of that, then yes, your
| silence would improve the forum.
| notahacker wrote:
| A statement that Race X is "more stupid" than Race Y is
| almost _tautologically_ racist.
|
| The idea that Nick denigrating an entire race as 'stupid'
| is "accurate and mild", whereas any suggestion that the
| statement contains racism requires a "license to
| denigrate" is truly through the looking glass...
| exo-pla-net wrote:
| He's not saying the race is stupid. He is saying that it
| is _more stupid_ , an operant he expounds on, revealing
| his underlying meaning as both accurate and mild. A
| factual statement is not and is never denigration.
|
| If you and you specifically were the sole member of a
| race, for instance, his operant would rank your race
| below that of Black. This would be an observation, not a
| denigration.
|
| But, if you are not Black, you are the recipient of
| favorable averaging. Your race would be less stupid than
| Black, _despite_ you.
|
| I hope that helps.
| notahacker wrote:
| > He's not saying the race is stupid. He is saying that
| it is _more stupid_
|
| I think the fact that you're reduced to asserting that
| it's logically possible to assert that a group is "more
| stupid" without asserting that they are in any way stupid
| pretty neatly demonstrates my point about comparisons
| between races with disparaging adjectives being almost
| _tautologically_ racist.
|
| (The second half of your post is even more pointless to
| engage with. :)
| cauch wrote:
| You pretend that you welcome any opinion, especially
| opinions that differ from your own. Yet you were very quick
| to invent unfounded hypotheses to cast opinions different
| from yours as "not smart and therefore discardable".
|
| Your "PR" hypothesis, or "catering to delicate and low-IQ
| sensibilities", falls flat, as the author has demonstrated
| both before and after that he does not want to play this PR
| game. That is exactly the point he makes in the original
| statements and the point he makes in his apology: "I do
| think that provocative communication styles have a place".
| He also explains that he apologized 24h after sending that
| message, when he had no idea he would one day need any kind
| of PR consideration, and at a time when he was under no
| pressure to apologize at all.
|
| So, no, I call bullshit: he gives the proof himself, by
| explaining that, 24h after saying that, he realised his
| words went further than his thoughts. Without any need for
| PR, without any pressure pushing him to do so. (And again,
| if it is a lie, it's a stupid one, since anyone can check,
| and a totally useless one, because he would not need to
| invent it if he just wanted to do some PR clean-up.)
|
| The funny part is that I think the quote is indeed racist
| but the guy is not; he is just one of these edgelords who
| provoke to feel smart (based on what he himself says when
| he explains that he is biased towards provocative ideas).
| But now you are painting him as a smart guy for defending
| something that he himself explained is in fact not smart
| and not his opinion at all. It feels like some silence
| would have improved the forum and also spared some people
| from looking pretty stupid ...
| anigbrowl wrote:
| _An assertion without rationale is noise or worse._
|
| This could equally be applied to statements like 'blacks
| are more stupid than whites'. Rather than anyone calling
| for Bostrom to be cancelled, most of the people posting
| here just wonder how a clever and academically successful
| person like Bostrom could have been oblivious to the
| factual and historical problems of such a broad
| generalization. One could equally wonder why he picked a
| racial trope as his controversial example, as opposed to
| challenging the conventional wisdom on nuclear weapons,
| or economics, or the superiority of rugby to association
| football, or the correct pronunciation of 'gif'.
| itronitron wrote:
| Some words of wisdom passed on to me from a very
| knowledgeable person who was in turn given this knowledge
| when starting their career.
|
| _" This organization (group of people) represents a
| random sample of the population. Traits that occur within
| individuals in the population will therefore occur within
| some individuals in this organization (group of
| people.)"_
| chkaloon wrote:
| Bryan Caplan should take a clue from his bit of
| introspection.
| Maro wrote:
| I have no agenda for or against Nick Bostrom. It's a full
| paragraph quote from the Wikipedia article, with a link.
| gwern wrote:
| Are you trying to imply that Wikipedia articles by
| definition have no agenda and no one quoting a Wikipedia
| article can have an agenda either?
| exo-pla-net wrote:
| gwern! FWIW, I read what you have to say in the same,
| careful way that I read Bostrom. You're a treasure.
| helboi4 wrote:
| That's... still racist? Why does he think black people have a
| lower IQ? Nigerian immigrants to the US are some of the most
| successful immigrants. Like... black people just do not have
| lower IQs and to say so is considered very dangerous rhetoric
| for a reason. The reason we even have pervasive belief that
| black people are stupider is because it was convenient
| rhetoric for the colonial powers pillaging Africa and
| treating black people as subhuman cattle. It's not a claim
| based on fact, nor is it a benign thing to say.
|
| ...IQ tests are also wildly flawed measures of intelligence
| anyway, but let's not even get into that.
| frozenseven wrote:
| >Why does he think black people have a lower IQ?
|
| Because of every study that's ever been done on this topic?
| Pointing this out isn't racist.
|
| >Nigerian immigrants to the US are some of the most
| successful immigrants.
|
| Typically those immigrants come from among the smartest few
| %.
| neffy wrote:
| It of course depends on how you cut and slice it, and
| also on the categorisation of black - but we are talking
| about approximately 2-3 billion people there if we
| include India, Indonesia, Africa and the rest of the
| world.
|
| Can you point to a study that has comprehensively
| assessed that total population? Or just a few studies by
| Americans, who make their own racial biases, which from a
| cynical perspective can be boiled down to rampant and
| cruel exploitation over several centuries, abundantly
| clear in the articles concerned?
|
| Immigrants across the world tend to be a slightly self
| selecting class of folks - why would black immigrants be
| any different on that front?
| frozenseven wrote:
| South Asians are a wholly different and distinct group.
| Most Indonesians are South-East Asian, except for Chinese
| migrants and those living on the island of New Guinea.
| Africa is a continent, not a race.
|
| What you're saying is all over the place. And I'm not
| here to discuss politics.
| BlueTemplar wrote:
| This whole discussion is pointless - categorizing people
| by skin colour is ridiculous for almost all purposes, and
| also, considering history, racist.
|
| Also, didn't "black" and "n-word" switch meanings as
| slur/non-slur less than a century ago (and might switch
| them again in less than a century) ?
| frozenseven wrote:
| >categorizing people by skin colour is ridiculous for
| almost all purposes
|
| Sure. But race is most often about genetic (or ethno-
| linguistic) heritage, not skin color. A person from Japan
| might have the same skin color as someone from Greece.
| goatlover wrote:
| Sub-saharan Africans have greater genetic diversity than
| other continental populations. Race is just not an
| accurate categorization.
| malfist wrote:
| Are you trying to say that people didn't know the n word
| was offensive in....1996?
| shafyy wrote:
| Sorry, but how does this make it better? If anything, it
| makes it worse. And you describing his statement as "accurate
| and mild" is also not great.
| MisterBastahrd wrote:
| It isn't a smear. His qualifying remarks indicate that he's
| either too stupid, arrogant, or bigoted to understand or care
| how context works, and thus has no business running a hot dog
| stand, much less an institute. Even disregarding that,
| publicly revealing his thoughts and framing them in such a
| fashion shows he has no common sense.
| woopsn wrote:
| What Bostrom said is that he completely repudiates these
| remarks and that they were disgusting. Leave it at that.
| beezlebroxxxxxx wrote:
| An academic would need to be incredibly stupid to think that
| that's a good thing to say in writing or out loud. The idea
| that you can "just" say these things is almost entirely the
| purview of people who coincidentally _just so happen_ to not
| say or refer to all of the contextual and explicative ideas
| around them, making pointing to IQ without them essentially
| meaningless at best and racist at worst.
|
| It's also not really "accurate or mild", as Bostrom himself
| stated in his apology for the email that:
|
| > I completely repudiate this disgusting email from 26 years
| ago. It does not accurately represent my views, then or now.
| The invocation of a racial slur was repulsive. I immediately
| apologized for writing it at the time, within 24 hours; and I
| apologize again unreservedly today. I recoil when I read it
| and reject it utterly.
| satvikpendem wrote:
| How is this "both accurate and mild?" If anything, it makes
| Bostrom seem even more racist, harkening back to the 20th
| century notion of scientific racism, which, regardless of
| whether you put a pseudoscientific spin on it, is still
| racism.
| IncreasePosts wrote:
| I wonder if these people ever paused to consider they aren't
| as smart as they think they are, if they're just figuring out
| some basics of human communication in their mid 20s that my 8
| year old has known for years.
| throwaway290 wrote:
| The irony is that you are looking at an example where a guy
| literally paused to consider how he was not so smart about
| communication. He also shared it with others who may also
| lack this skill.
| IncreasePosts wrote:
| He said "I think it is laudable if you accustom people to
| the offensiveness of the truth, but be prepared that you
| may suffer some personal damage".
|
| Doesn't sound like someone doing introspection, it sounds
| more like he is lamenting that the world isn't as
| "logical" as he is.
| ceuk wrote:
| > Doesn't sound like someone doing introspection, it
| sounds more like he is lamenting that the world isn't as
| "logical" as he is.
|
| There's an autistic elephant in the room. "Why are people
| so irrational?" could be one of the slogans if there were a
| high-functioning autistic persons' society.
| robertlagrant wrote:
| Well, not logical. Truthful. "How much of communication
| is impaired by filtering through various politeness laws
| and offences?" Is how I read it.
| s1artibartfast wrote:
| Sure, I don't think anyone was claiming infallibility.
|
| I think it is easy, however, to romanticize such
| errors made in the pursuit of truth.
|
| It can be like pointing out the fallibility of
| Galileo Galilei in thinking he wouldn't be hauled before
| the Inquisition and made to recant his evidence for
| heliocentrism.
| NoMoreNicksLeft wrote:
| Has your 8 yr old really known this for years? He may
| behave in a conformist way, instinctively, without being
| able to describe it or understand the phenomenon... both of
| which are, in my opinion, required to _know_ it.
|
| And who's figuring out whose communication? He basically
| has to draw a picture in crayon of what he means, just so
| all the rest of you don't misconstrue his meaning. Your
| "human communication" is much too defective to be so proud
| of it.
| davidivadavid wrote:
| If he has to draw a picture in crayon, maybe he must
| think a little harder with that big head of his about how
| to say it properly in the first place so it doesn't
| require a second explanation?
|
| In this case, it's really hard to understand why someone
| not completely idiotic when it comes to communication
| would have used the phrase "I like that sentence" after
| saying "Blacks are more stupid than whites."
| IncreasePosts wrote:
| I assume by "draw a picture in crayon", you mean provide
| a very simple explanation. You seem to be confusing the
| fact that children generally draw simple things, and also
| draw with crayons commonly. But there is nothing about
| crayons intrinsically that means a crayon drawings must
| be simple.
| sokoloff wrote:
| Having drawn with crayons, pencils, and pens, I think
| there is an intrinsic property about crayon drawings that
| does severely limit their maximum complexity/detail.
| ceuk wrote:
| High dimensionality, more granular interpretation/models of
| the world. More conscious/deliberate behaviour, less
| benefit from neurological canalisation.
|
| I don't think you're seeing ineptitude, I think you're
| seeing lucidity, sapience.
| IncreasePosts wrote:
| 14 year old edgelords on tumblr are peak sapience by that
| standard.
| gizajob wrote:
| Bostrom has always had the air of knowing he's phenomenally
| intelligent and absolutely brilliant and almost certainly
| the smartest person in any room. Yet his work smacks of
| grind and storytelling rather than genius.
| itronitron wrote:
| The additional context you provide suggests that the smear
| was taken out of context in order to hide the fact that the
| smear was in fact covering a skid mark over a shit stain.
| malfist wrote:
| Yeah, I don't get how that context makes anything better.
| All it says to me is Bostrom is trying to claim he's not
| racist because he knows his racist view will get him called
| a racist and therefore he's not racist.
|
| The argument doesn't make sense. You don't get to claim
| your view that black people are inferior to "mankind" isn't
| racist just because someone calls you a racist
| golergka wrote:
| Depends on how you define racism: is it a descriptive
| (this is a fact about the world that I consider to be
| true: this group is smart, that group is not) or
| prescriptive (this is what I want to see in the world: this
| group should be given privileges, that group should not)
| view? In the email, his own statement is the former, and
| his assumed definition of racism clearly only relates to
| the latter.
| daveguy wrote:
| Bostrom's original statement was not remotely accurate or
| mild. The statement, "Take for example the following
| sentence: Blacks are more stupid than whites. I like that
| sentence and I think it is true." -- is a classic example of
| racism. It is racist to the core. Not only that, Bostrom knew
| at the time racism like that would make things more difficult
| for him and he was correct. Sometimes cancel culture is
| deserved. At least he eventually apologized for it.
| throw7 wrote:
| I suppose we could also say IQ highly correlates with social
| ineptitude. Who could've known people wouldn't like you if
| you made repugnant statements... surely not 'smart' people!
| robocat wrote:
| > IQ highly correlates with social ineptitude
|
| Does it?
|
| Being socially ept requires high intelligence. However
| people that deeply apply their brains to social situations
| are often unrecognised as being bright in wider society.
| Although they may well be highly rewarded. And I suspect
| the very skilled often hide their skill because it's a
| hidden weapon in political or business negotiations. It is
| really hard to see applied IQ and you need to be very
| trusted for someone to explain their thinking: plus you
| need to be EQ smart to spot others that are EQ smart (and
| the +ve side of Dunning-Kruger causes problems too).
|
| I think you are alluding to the stereotype of social
| ineptitude of geeks or academics. Personally I have found
| that focusing your IQ too tightly into one narrow
| discipline is not that smart. Really smart geeks seem to
| also be highly socially capable: IQ is _general_
| intelligence. Some of the smartest people I know left
| school at 15: you won 't have highly academic discussions
| with them because it usually doesn't interest them but
| their raw IQ shows up in a bunch of other unobvious ways.
|
| Disclaimer: I'm a geeky slow learner - a redundant
| disclaimer given I'm making comments on HN.
|
| Edit: given we are on HN, here's a good example of Paul
| Graham deeply recognising someone as smart and socially
| epter than himself: https://www.paulgraham.com/jessica.html
| mise_en_place wrote:
| Most people are not ready to have an honest discussion about
| the correlation between race and IQ. It sadly gets muddied by
| various political machinations. But it seems like a genuine
| effect that should be studied more. If we truly want equality
| of opportunity, we must understand what is causing certain
| races to be on the left side of the normal distribution. Is
| it nutrition? Social status? Lack of parenting? A combination
| of these?
|
| Bostrom's only crime there was hoping for an honest, curious,
| and intellectual discussion.
| cauch wrote:
| But these honest conversations are occurring.
|
| For example, scientists have honestly looked into the
| "biology" or "DNA" hypothesis. But this hypothesis is not
| very strong:
|
| - why would skin color be linked to IQ when eye color is
| not?
|
| (and also: why are some people so interested in IQ and skin
| color but lose interest as soon as the genetic factor is
| something less "visible to the eye"?)
|
| - how could there be an IQ disparity based on skin color
| when human DNA is so thoroughly mixed that, between two
| white men and one black man, one of the white men can
| easily be genetically closer to the black man than to the
| other white man? There is no "DNA of Black Men" group: the
| DNA of black men is as diverse as that of white men and
| mixes completely with it.
|
| - why do black men placed in different social situations
| score so differently on IQ when they have very similar DNA
| (same family, or even twins separated at birth)?
|
| - why do white men placed in different social situations
| score so differently on IQ? If you use white men to build a
| formula predicting IQ from sociological factors, that
| formula also predicts black IQ, so science would say that
| skin color is not the relevant factor here.
|
| There has been work on IQ and skin color for ages now, and
| the discourse always seems to go backwards, with people
| saying "sure, but let's forget that we know it does not
| make sense and try again". It is those people who stop the
| honest, curious, and intellectual discussion.
|
| And I'm pretty sure the first reaction to this will be
| "it's all lies", because instead of an honest, curious, and
| intellectual discussion, a lot of the people who want to
| have this discussion are in fact more interested in pushing
| one particular answer. For different reasons, but I think
| one of those reasons is the same one that made the EA
| movement popular despite being so flawed: those people want
| to think of themselves as very deep and very smart, they
| want to see "counterintuitive and repugnant" things and
| stroke their egos by explaining how smart they are for not
| finding them counterintuitive or repugnant. The problem is
| that they embrace things that are counterintuitive simply
| because those things are incorrect, and they force them
| into "look at me, I'm smart, it's counterintuitive and yet
| I dare to consider it".
|
| It's basically what Bostrom says: he says himself that he
| is attracted to the idea that black people have lower IQ
| because it is the rebel thing to believe. But being the
| rebel thing to believe does not make it scientifically
| correct or scientifically smart. Saying "women are
| biologically less apt to choose their leaders and therefore
| it makes sense they don't have the right to vote" or "the
| position of the stars in the sky affects our lives based on
| the month in which we were born" is just as
| "counterintuitive" and "repugnant" as the black IQ claim.
|
| It's a bit strange, because in the case of the black IQ
| question, the hypothesis "I see black men falling behind
| more often, so I guess they are not as smart" is not
| counterintuitive at all. It is the people who have
| considered this hypothesis and realised it is simplistic,
| and that the truth is more complicated, who went further
| than the basic intuition.
| arduanika wrote:
| The philosopher David Thorstad keeps a blog with critiques of
| EA and related ideas. His writing strikes me as fairly patient
| and in good faith. (Or at least, it's more measured than some
| of my own comments on this thread!)
|
| He wrote this good dissection of the Bostrom email
| controversy, and why the apology doesn't quite do it:
|
| https://ineffectivealtruismblog.com/2023/01/12/off-series-
| th...
|
| That said -- it did happen way back in the 90's. There has to
| be a place for forgiveness, even for imperfect people
| offering imperfect apologies. My sense is that there's plenty
| of other things to criticize that are more recent and more
| central to this general school of thought.
| pessimizer wrote:
| This context makes it worse. I was imagining a bunch of
| different framings that would make it sound thoughtful, but
| I've literally heard the same thing from Klansmen in
| Arkansas. Literally, not figuratively.
|
| That last sentence is a symptom of people thinking that the
| only important issue is whether they're good people or not.
| He's saying that saying dogs are stupider than humans is not
| the same as hating dogs. Who cares what he hates? The
| question is who he hires, and who he gives the benefit of the
| doubt to. Not hiring dogs isn't hating dogs either.
| karma_pharmer wrote:
| Yeah but this does not explain
|
| _Starting in 2020, the Faculty imposed a freeze on fundraising
| and hiring_
|
| (note the date)
|
| More likely the freeze and the smear have a common cause,
| rather than one causing the other.
| ctxc wrote:
| On an unrelated note - my HN client UI breaks since the domain
| name is long xD
| optimalsolver wrote:
| EDIT: Got my "future of" institutes mixed up.
| rmbyrro wrote:
| Or maybe some god scientist made an observation and the
| probability function collapsed in this non-funded state
| complianceowl wrote:
| I'm literally laughing out loud at my desk right now XD
| dotsam wrote:
| Tegmark's institution is the Future of Life Institute, this is
| the Oxford Future of Humanity Institute.
|
| Tegmark's institute is well-funded, apparently largely due to a
| big crypto donation from Vitalik Buterin.
| https://www.politico.com/news/2024/03/25/a-665m-crypto-war-c...
| swyx wrote:
| why do people not like Max Tegmark? he was kind of a rockstar
| at NeurIPS
| gojomo wrote:
| My read is that some former fans strongly disagree with,
| and are thus disappointed by, Tegmark's recent enthusiasm
| for "AI will kill us all" arguments, & advocacy of
| strong/intrusive policies against AI progress.
| zehaeva wrote:
| I like the guy, but I do think he's nutty over the level of
| multiverse he believes in.
| VirusNewbie wrote:
| I'm a huge fan in general, but some of his arguments are
| sloppier than others.
| karma_pharmer wrote:
| Well, he published a nonfiction book, the best part of
| which is the first chapter which consists of _literally a
| fiction story_.
|
| He also has some serious problems with blinders, but
| they're the same blinders HN has so if I explain any
| further this post will get flagged, flogged, deleted, and
| downvoted. Ah well.
| Vecr wrote:
| Tell me what they are. I won't do any of those things.
| kragen wrote:
| karma_pharmer cannot do that because others will
| Vecr wrote:
| Can you write it somewhere else then link it?
| exo-pla-net wrote:
| Seems the incompetent and curiously hostile Philosophy Faculty at
| Oxford killed FHI.
|
| > While FHI had achieved significant academic and policy impact,
| the final years were affected by a gradual suffocation by Faculty
| bureaucracy. The flexible, fast-moving approach of the institute
| did not function well with the rigid rules and slow decision-
| making of the surrounding organization. (One of our
| administrators developed a joke measurement unit, "the Oxford". 1
| Oxford is the amount of work it takes to read and write 308
| emails. This is the actual administrative effort it took for FHI
| to have a small grant disbursed into its account within the
| Philosophy Faculty so that we could start using it - after both
| the funder and the University had already approved the grant.)
| Starting in 2020, the Faculty imposed a freeze on fundraising and
| hiring. Unfortunately, this led to the eventual loss of lead
| researchers and especially the promising and diverse cohort of
| junior researchers, who have gone on to great things in the years
| since. While building an impressive alumni network and ecosystem
| of new nonprofits, these departures severely reduced the
| Institute. In late 2023, the Faculty of Philosophy announced that
| the contracts of the remaining FHI staff would not be renewed. On
| 16 April 2024, the Institute was closed down.
| goodcanadian wrote:
| Curious use of the word faculty . . . I would say the problem
| is the administration and bureaucracy and not the faculty
| (which usually refers to the academic staff). I guess they mean
| the word in the sense of the administrative unit of the
| university. Regardless, this is not a problem confined to
| Oxford; it seems to have proliferated throughout academia in
| the last couple of decades. The sheer amount of utter bullshit
| is mind boggling; I figure about 1 in 10 people actually do
| useful work while the other 9 conspire to make that person's
| life more difficult. Of course, industry is hardly any better .
| . .
| hollerith wrote:
| Why not assume that the OP knows the definition of the word
| "faculty" and that when the OP writes, "suffocation by
| Faculty bureaucracy", he meant suffocation by academic staff?
| goodcanadian wrote:
| 1. I was responding more to the commenter than to the
| quote.
|
| 2. That is, in fact, what I assume.
|
| 3. That was really tangential to my point, so I probably
| should have just left it out.
| polygamous_bat wrote:
| Surely it was Oxford, the institution older than most nations,
| being incompetent, and nothing to do with the easy money tap
| called Sam Bankman Fraud being thrown in jail. Oxford just
| seems like an easy scapegoat, as they can't just say "our biggest
| donor is in jail for the foreseeable future and we have no
| money left because we used it to buy a mansion [0]"
|
| [0] https://twitter.com/paulmainwood/status/1600433194691502081
| levocardia wrote:
| Center for Effective Altruism is different from FHI
| arduanika wrote:
| Sure, and Beria is different from Marx.
|
| Nobody in this entire movement wants to take responsibility
| for what anybody else does, and it's honestly exhausting.
| They have this dense belief that once a thinker releases
| his ideas into the water, he bears no responsibility for
| the crimes and excesses they inspire.
| arduanika wrote:
| Everybody who disagrees with me is incompetent.
| sgift wrote:
| You obviously missed this part:
|
| > This is the actual administrative effort it took for FHI to
| have a small grant disbursed into its account within the
| Philosophy Faculty so that we could start using it - after both
| the funder and the University had already approved the grant.
|
| They draw quite a clear distinction between Oxford (i.e. the
| University) and the faculty.
| karma_pharmer wrote:
| They froze the institute "starting in 2020", three years
| before anybody suspected SBF was anything other than the
| Great Tech Messiah.
|
| FTX didn't even get its initial funding until the last months
| of 2019.
|
| I mean sure, FTX might be a stain on FHI's reputation _now_ ,
| but it certainly can't have been the initial cause of these
| actions by Oxford. The dates just don't work.
| setgree wrote:
| I think that either the link has changed or that the statement
| has changed, because the statement I'm reading is very
| different from your quote "in both content and deliverance"
| (https://www.youtube.com/watch?v=8UGtlUMMkOU)
| exo-pla-net wrote:
| The quote is from their "Final Report", which is the first
| link in the submitted article.
| johngossman wrote:
| "It was to be free from almost all the tiresome restraints--"
| red tape" was the word its supporters used--which have hitherto
| hampered research in this country."
|
| -- That Hideous Strength: by C. S. Lewis (1943)
|
| The book literally starts with a competition between Oxford and
| a fictional university about who gets to host a "trans-
| humanist" research organization.
| johngossman wrote:
| Bear with a digression. CS Lewis was a professor at Oxford. His
| novel "That Hideous Strength" is about an organization that wants
| to use science to save humanity.
|
| "It's a little fantastic to base one's actions on a supposed
| concern for what's going to happen millions of years hence; and
| you must remember that the other side would claim to be
| preserving humanity, too."
|
| -- That Hideous Strength: (Space Trilogy, Book Three) (The Space
| Trilogy 3) by C. S. Lewis
|
| Of course, it all goes very, very bad. The whole book can be read
| as a warning against what we now call transhumanism, obviously
| from a Christian perspective.
|
| Given CS Lewis's Oxford connection, I have always wondered if
| some of the faculty had their doubts about the FHI
| n4r9 wrote:
| I doubt that the philosophy faculty would take CS Lewis'
| opinion seriously into consideration. He was a decent
| storyteller, but not much of a philosopher.
| busterarm wrote:
| His work is often quietly cited by a handful of moral
| philosophers. In particular, The Discarded Image gets cited
| anytime someone is writing about medieval philosophy.
|
| Anyway, being a philosopher simply isn't what the guy did and
| academics are fairly dismissive without consideration anytime
| religion (and specifically Christian apologetics) gets in the
| mix.
| n4r9 wrote:
| I'd be interested to take a look if you have any further info
| about which moral philosophers and which works cite The
| Discarded Image?
|
| I admit I haven't read that one myself; my own take on him
| as a philosopher stems from reading Mere Christianity and
| summaries of Surprised by Joy, and not being aware of any
| references to his work by more recent philosophers that
| I've been interested in such as Chomsky, Zizek, or Dennett.
| I got a strong feeling that Lewis' arguments and exposition
| were guided by something other than logic, though they
| pretended to be following it. The trilemma is an example of
| this.
| undershirt wrote:
| > I got a strong feeling that Lewis' arguments and
| exposition were guided by something other than logic,
| though they pretended to be following it. The trilemma is
| an example of this.
|
| One way to discover that we necessarily have worldviews
| outside of logic is to look at a statement like this, and
| realize it appeals to something outside of logic
| (feelings) to critique how someone else is only
| pretending to follow it.
| n4r9 wrote:
| I am not saying that I or anyone am perfectly logical or
| uninfluenced by feeling. Hume was exactly correct when he
| said that reason is the slave of the passions.
|
| _But_ - with his trilemma, Lewis claims "here is a
| logical argument for believing in the Christian god", and
| then presents a weak and illogical argument. It's
| difficult to know what to make of this except that Lewis
| was somehow blinded to the logical flaws.
| AnimalMuppet wrote:
| What do you think are the logical flaws in the trilemma
| argument?
| n4r9 wrote:
| It's highly debatable whether Jesus believed himself to
| be divine. There are more than three options, for example
| Jesus may have simply made a mistake in his own
| reasoning. It is not as inconceivable as Lewis makes out
| that Jesus was a lunatic or a liar.
| cool_dude85 wrote:
| It's a false trilemma, as there are other possibilities:
| he did not actually exist historically, or he was a wise,
| nice guy and others made up all the God stuff over the
| years, or he was a nice guy trying to help people and he
| thought tactically the mystical claims would allow him to
| help more people. One can probably imagine a bunch more
| or less realistic possibilities given a little while to
| think about it.
| jimbokun wrote:
| > "here is a logical argument for believing in the
| Christian god"
|
| That is definitely not the point of the trilemma. It
| simply argues that Christ must be placed into one of
| three categories, which do not include "generally nice
| and completely harmless moral teacher". He was clearly
| claiming to be God, which obviously leads to the trilemma
| of choices between Liar, Lunatic and Lord.
|
| That doesn't say which one of those three to pick. Just
| ruling out the other possibilities by considering the
| claims he made about himself.
| n4r9 wrote:
| > He was clearly claiming to be God
|
| I don't know if that's true. Certainly there are biblical
| scholars who would disagree. It's not mentioned in the
| first three gospels, for example.
|
| > obviously leads to the trilemma of choices
|
| Again, not obvious. The three choices are logically
| incomplete. There are other possibilities, such as that
| Jesus was not a full-on lunatic: he had a single
| delusional belief about his own divinity but was
| otherwise rational.
| busterarm wrote:
| Mary Midgley
| johngossman wrote:
| Thank you! I love Mary Midgley and now that you mention
| it, it makes sense. Of course, there's the Oxford
| connection, and Midgley was critical of reductionist
| materialism, though she was no Christian apologist. There
| are loops within loops here, such as the Anscombe
| connection to both of them.
|
| Your comment led me to this:
| https://www.lewisiana.nl/marymidgley/
| goatlover wrote:
| My feeling is that Chomsky, Zizek and Dennett aren't free
| of being guided by something other than logic for some of
| their arguments as well. For example, Dennett's arguments
| against the hard problem of consciousness come across as
| dogmatic materialism.
| wizzwizz4 wrote:
| > _academics are fairly dismissive without consideration
| anytime religion (and specifically Christian apologetics)
| gets in the mix._
|
| In fairness, it's a good heuristic. Most Christian
| apologetics are just less poetic subsets of the Book of
| Job, but their authors act like they have some new and
| exciting insight that will prove Christianity 100% for sure
| this time. It's a total waste of time to engage.
|
| C.S. Lewis's works are the exception, not the rule, since
| there's actually something there to engage with. Plus, he's
| just a good author.
| dylan604 wrote:
| How is that any more or less believable than "the deity of
| a religion will save humanity"?
| johngossman wrote:
| Not defending it. Just think the parallels with the novel are
| amusing. John Gray has been arguing for years that humanism
| and transhumanism grew out of Christian millennialism.
| arduanika wrote:
| At least Lewis admits it. For me, that goes a long way.
| arduanika wrote:
| While I do appreciate this reference and agree with CS Lewis
| here, I think it's a bit of a stretch to draw a connection
| between him and the present day Oxford philosophy faculty. He
| was a single don from nearly a century ago, who now really
| belongs to the canon at large rather than a single university.
| Perhaps the current faculty all hold some special reverence for
| him, but it seems more likely that modern mainstream
| philosophers (not only at Oxford) merely share with Lewis some
| basic grasp of common sense, which informs their skepticism of
| FHI and EA.
| johngossman wrote:
| I agree with you completely. And yet... I have to imagine at
| least some of them are aware of it, and it might have
| planted a seed. I could totally believe there was a faculty
| meeting and someone said: "Didn't CS Lewis write a book or
| two about this?" followed by laughter (some of it nervous).
| It's just so odd to me that out of all the universities in
| the world, FHI ended up at the place CS Lewis was when he
| wrote this book. I can pull quote after quote out of it
| that, if you didn't know it was written in 1943, you would
| think he was parodying FHI and EA. Which really shows that
| these two movements aren't that new.
| stainablesteel wrote:
| I'm honestly glad. There seem to be a lot of places that may as
| well be called "the institute of impending doom for all of
| humanity", and I just don't care for this fear-based grant
| entrenchment; it gets us nowhere. People who are pioneering the
| front lines of any technology understand risk better than people
| who write about it.
| mitthrowaway2 wrote:
| > people who are pioneering the front lines of any technology
| understand risk better than people who write about it
|
| That's far from guaranteed, and we have a long history of
| lessons written in blood that says otherwise. The people
| pioneering a technology are going to be self-selected to be the
| most optimistic about its potential and the most dismissive of
| its negative impacts.
| neilv wrote:
| The HN headline, "Future of Humanity Institute shuts down",
| sounds cynically funny, in context of current
| problems/challenges.
|
| The grandiose name Oxford chose now sounds like they've
| determined the situation is hopeless.
| jessriedel wrote:
| "Oxford" did not choose the name
| kikokikokiko wrote:
| By your definition no name ever was "chosen" by any
| corporation in history. Organizations are made of people, and
| when a new org inside the main org is named, it obviously was
| named by someone (or a committee). In the end, the
| "organization" choose the name, potato po-ta-to.
| jessriedel wrote:
| You misunderstand. There are many cases where there is a
| decision making process (either single person or
| collective) that reasonably represents an organization's
| decision. But no such organization representing Oxford
| picked the name of FHI. It was picked by the people, most
| likely Bostrom, who started FHI, and did not at all
| represent a decision by greater Oxford.
| colechristensen wrote:
| > in context of current problems/challenges
|
| People have been saying nonsense like this for as long as we
| have recorded history of people saying anything.
| stronglikedan wrote:
| _Nothing new under the sun_
|
| --Abraham Lincoln
| ryan_j_naughton wrote:
| I think it is actually from the bible:
|
| What has been will be again, what has been done will be
| done again; there is nothing new under the sun. -
| Ecclesiastes 1:9
| r2_pilot wrote:
| I've very happily been refuting this verse lately by
| building a smart robot that could not have existed before
| modern tech.
| eks391 wrote:
| "A house divided aginst itself cannot stand," part of a
| speech by Abe and thus credited to him, is also from the
| Bible: Mark 3:25. I'va a hunch he was religious.
| gizajob wrote:
| Nihil sub sole novum
| wigster wrote:
| and every time they do, they ARE closer to being correct
| neilv wrote:
| If you're not feeling the increasing dissatisfaction and
| worsening conditions that a lot of research reports, that's
| great for you, but I wouldn't call it nonsense.
| colechristensen wrote:
| My point is a large group of people have been feeling
| "dissatisfaction and worsening conditions" for thousands of
| years. It's what growing up and getting older feels like.
| You yearn for how things were in the past.
|
| It's easy to take for granted the things which got better
| before you knew about them and to overemphasize things
| which are getting worse now. There is always something,
| there always has been something, always will be. And always
| there have been people insisting that the modern problems
| are the real serious ones compared to the past.
|
| Doom sells. Lots of people buy it.
| laurex wrote:
| We've had recorded history of people saying things for less
| time than we've had a formal scientific method. Surely the
| accumulation of knowledge through our recording of people
| saying things carries some weight? It truly has not been that
| long since the great acceleration of climate conditions
| became existential, though I suppose in the timelines of
| life on earth, the era of recorded human thought is itself
| minuscule.
| colechristensen wrote:
| >We've had recorded history of people saying things for
| less time than we've had a formal scientific method.
|
| False, we have vast amounts of records of everyday
| correspondence, written speeches, graffiti, works of
| fiction, etc. from the Romans 2000 years ago, long before the
| scientific method (which I guess is usually attributed to
| Newton? Anyway, approximately contemporary).
|
| https://en.wikipedia.org/wiki/Ages_of_Man
|
| Hesiod 2700 years ago wrote about the degradation of
| society from the Golden Age where people lived among the
| Gods with a garden of eden vibe down through silver, bronze
| to "present" iron where life was toil and misery and people
| were awful and immoral. Ovid said the same 2000 years ago.
|
| "Everything is going to shit, things aren't as good as they
| used to be and they're getting worse now more than ever" is
| literally a meme as old as human history.
| shawn_w wrote:
| I think they're using a more limited meaning of recorded
| than is usual; probably just referring to voice/sound
| recording.
| ken47 wrote:
| I agree with your sentiment. Every era has had its major
| challenges, and it's egocentric + myopic to say ours are the
| most important ever.
| PeterStuer wrote:
| Seems at first glance to be the same type of slippery-eel org
| as the 'Future of Life Institute'. They seem to have far more
| in common under the covers than just similar-sounding names.
| Aromasin wrote:
| The full final report is well worth a read for those with time on
| their hands: https://www.futureofhumanityinstitute.org/s/FHI-
| Final-Report...
| arduanika wrote:
| Good riddance to these ideologues. It's a shame that the
| announcement shows no trace of remorse for the crimes they
| inspired (e.g. at FTX) or the lives that have been ruined by the
| cults to which they lent academic legitimacy. Instead, their
| demise is chalked up to some bland moaning about "administrative
| headwinds".
|
| Realistically, we're probably only looking at a brief respite
| before they regroup under some other benevolent-sounding name. Be
| on the lookout for their next incarnation, and let's hope that
| Oxford won't repeat the mistake of allowing them to evangelize
| under the seal of a great university.
|
| (Edit in response to causal's question below):
|
| They legitimized longtermism and pseudo-rational AI panic, which
| transformed much of EA into an apocalyptic sect, with all the
| high-demand group dynamics that come with it. Their research
| created an air of urgency and expediency which, among other bad
| outcomes, inspired the devoted EAs at FTX to justify their crimes
| to themselves. These crimes resulted in privation and a few
| suicides for their innocent depositors. To this day, there are
| EAs who rationalize these crimes against their outgroup as not
| that bad, considering that the "Future of Humanity" is at stake.
|
| You can look at the movement and find plenty of other negative
| effects of this urgency and expediency, but for me personally,
| the FTX crimes are what woke me up to the true nature of these
| hazardous ideas.
| causal wrote:
| That's a lot of vague vitriol, what exactly did FHI do?
| Genuinely curious.
| rurp wrote:
| It's quite a leap to assume that SBF only turned to a life of
| crime because the EA movement lured him into it.
|
| There are legitimate criticisms of EA and I'm not personally a
| follower of the movement but your post comes off as way too
| generally dismissive. Many of the people involved seem to have
| good intentions and a lot of money has been donated to
| objectively good causes. The longtermism stuff is more squishy,
| but our society is pretty bad at dealing with certain types of
| existential risks, and we could use more people thinking about
| solutions rather than fewer.
| arduanika wrote:
| > to assume that SBF only turned to a life of crime because
| the EA movement lured him into it.
|
| It's not an assumption. It's a matter of record, understood
| by anyone who knows the basics of the FTX saga. SBF is a
| lifelong utilitarian, and his co-conspirators were also
| committed EAs. Anyone who obscures that fact is abetting the
| campaign of obfuscation.
|
| > Many of the people involved seem to have good intentions
|
| Yup. That includes Sam and his friends. How did that turn
| out?
|
| > a lot of money has been donated
|
| To borrow your language, "it's quite a leap to assume" that
| anyone donated money just because EA lured them into it. How
| do we know they weren't going to behave altruistically in the
| absence of the movement?
|
| Why is there an isolated demand for rigor when confronting
| the movement's adverse effects? Do you think that Sam is
| inherently a criminal, with ideology playing no role, whereas
| EA donors aren't inherently generous?
|
| > to objectively good causes
|
| Fair enough. I will concede that there were a few of these.
| In some cases, early EA principles may have helped people
| arrive at these good ideas in a way that wouldn't have
| happened without the movement.
|
| It's just that I get annoyed when people dismiss the
| downsides, writing them off as aberrations or lone bad
| actors. And given the major flaws in the philosophy, it's
| hard to shake my suspicion that most of their longtermist
| explorations are worse than just "squishy". It seems quite
| plausible that they're doing more harm than good.
| johnthewise wrote:
| This type of argument can be used against pretty much any
| group that has ever existed, so it's either too broad to be
| a meaningful critique of the group itself, or it's really a
| critique of humans getting together in general.
| arduanika wrote:
| Sorry, you've lost me here. Which part of my argument are
| you referring to?
|
| I'm not criticizing a group, btw. I'm criticizing ideas.
| Ideas have specific consequences. Some ideas inspire good
| actions. Some inspire bad actions that outweigh the good.
|
| When an idea claims to have big consequences in the
| distant future, we can look at its consequences in the
| present day to help us guess the likely nature of those
| future consequences.
|
| Oxford deals in ideas, and gets to decide which ones to
| host and fund. Sometimes they get it wrong, mistaking bad
| ideas for good ones. That's unfortunate, but it's nice
| when they come around to the right assessment eventually.
| kvee wrote:
| Future of Humanity Institute final report from Anders Sandberg here:
|
| https://static1.squarespace.com/static/660e95991cf0293c2463b...
|
| and Google Doc version:
| https://docs.google.com/document/d/1jgl2KqtiJ6lLkpoZ1I_VeniP...
| gizajob wrote:
| Seems like they didn't see it coming...
| karma_pharmer wrote:
| _Starting in 2020, the Faculty imposed a freeze on fundraising
| and hiring. In late 2023, the Faculty of Philosophy decided that
| the contracts of the remaining FHI staff would not be renewed._
|
| Can anybody offer insight into the reasons here? Obviously there
| is not going to be an objective answer to this question. And that
| is probably why the linked page does not try to give an answer.
|
| I'm assuming lack of funding wasn't the problem, since they froze
| fundraising (if the inability to raise funds was the problem,
| it is unlikely that they would have done this).
|
| Edit: looks like he got smeared in January of 2023, but that
| doesn't explain the freeze starting three years earlier:
| https://news.ycombinator.com/item?id=40066352
| javajosh wrote:
| The real question is whether Mr. Beast can make compelling
| content about saving the Future of Humanity Institute. If so,
| they're good.
| klyrs wrote:
| I saw some Mr Beast branded candy bars at the supermarket this
| week. Pretty sure that's not a future-saving venture, that's
| just taking advantage of the sugar-addicted kids that form his
| fanbase.
| Lockal wrote:
| Good riddance.
|
| A group of scammers without specialized education, theorizing
| about immortality (from people without medical education), about
| saving humanity (thank you, but humanity thrives without your
| help), about ethics in artificial intelligence (from people who
| do not know any programming language), about drug promotion
| (from professional drug abusers), and about pumping money into
| cryptocurrencies (which they personally purchased) and even into
| crypto exchanges they own (hello SBF).
|
| I hope to see all these people in jail (as I see they have
| already removed their names from the site).
| andrelaszlo wrote:
| The future is here?
___________________________________________________________________
(page generated 2024-04-17 23:01 UTC)