[HN Gopher] Analyzing the historical rate of catastrophes
___________________________________________________________________
Analyzing the historical rate of catastrophes
Author : Hooke
Score : 39 points
Date : 2023-12-05 06:32 UTC (16 hours ago)
(HTM) web link (bounded-regret.ghost.io)
(TXT) w3m dump (bounded-regret.ghost.io)
| RcouF1uZ4gsC wrote:
| In the list of catastrophes, the Mongol Wars should probably be
| included:
| https://en.wikipedia.org/wiki/Destruction_under_the_Mongol_E...
|
| Some estimates are that 11% of the world's population was killed
| during this time.
| niccl wrote:
| Agree that it was a catastrophe by the article's definition,
| but the author specifically says 'since 1500', which excludes
| the Mongol Wars (from what I understood from your linked page)
|
| My first thought when I started reading TFA was that the list
| of catastrophes to consider would be biased because more recent
| events have better records. Maybe that's why the author decided
| on the 1500 cutoff?
| RcouF1uZ4gsC wrote:
| The table includes the Plague of Justinian and the An Lushan
| Rebellion, both of which occurred before 1500.
| jsteinhardt wrote:
| Author here. The Mongol invasions are in the .csv in the
| appendix (and represented in the scatter plot), but weren't
| included in the table because it restricts to events that
| lasted less than a decade.
|
| If you restrict to the single "worst" decade then the
| Mongol invasions would have been high enough to make the
| list, but I didn't want to start making too many manual
| adjustments to the data, so I left it as-is.
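|
| A minimal sketch of that duration filter, assuming hypothetical
| column names (the actual appendix .csv may label things
| differently):
|
|   import pandas as pd
|
|   # Hypothetical file/column names; the real appendix .csv may differ.
|   events = pd.read_csv("catastrophes.csv")
|   duration = events["end_year"] - events["start_year"]
|
|   # Table in the post: only events lasting less than a decade.
|   table_events = events[duration < 10]
|
|   # Appendix .csv / scatter plot: everything, Mongol invasions included.
|   print(table_events.sort_values("pct_killed", ascending=False).head(10))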
| RcouF1uZ4gsC wrote:
| Makes sense. Thanks for explaining. I really appreciated
| your in-depth quantitative analysis in the article.
| wodow wrote:
| For a narrower but deeper treatment of violence (and war) alone,
| Pinker's
| https://en.wikipedia.org/wiki/The_Better_Angels_of_Our_Natur...
| is well worth a read.
| brnaftr361 wrote:
| Skeptically.
|
| Pinker has been called out for cherry-picking by numerous other
| authors, particularly Graeber/Wengrow, an academic anthropologist
| and archaeologist respectively. Another is Christopher Ryan. In
| both cases, well-reasoned counterarguments are posed against
| Pinker's reasoning.
| sigilis wrote:
| Has there been a single instance of a self-replicating AI? The
| article seems to think so, but try as I might, none of the image
| generators, chess engines, LLMs, or linear regression models I've
| used or seen has even once copied itself to another location, let
| alone run itself.
|
| The idea of AI as a novel self-replicator is cool and appears in
| movies and books, but doesn't seem to exist outside of fiction.
| The other article referenced dreams of a future 2030 AI with
| every capability one can imagine, a vision that isn't supported
| by any reasonable projection for AI technology. It might as well
| be a warning about all the dangerously weaponizable portable
| fusion reactors that could exist if ITER development is super
| successful. In this respect, AI seems like an unlikely driver of
| catastrophe, as the article defines it, in the near term.
|
| Assigning even a 5-10% increase in the rate of calamity to this
| technology, for which there is no supporting evidence, while
| discounting all other technologies (including nuclear weapons) on
| the grounds that there is too little data, is not reasonable. The
| reality is that we don't know what risk value to assign, and we
| won't know for some time.
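|
| To illustrate how much an assumed number like that dominates the
| conclusion, here is a toy calculation (every rate in it is made
| up, which is exactly the problem):
|
|   # Toy numbers only: the baseline rate and the "AI bump" are assumptions.
|   baseline_per_decade = 0.01          # assumed 1% chance of catastrophe per decade
|   for bump in (0.0, 0.05, 0.10):      # the contested 5-10% additions
|       p = baseline_per_decade + bump
|       p_century = 1 - (1 - p) ** 10   # chance of at least one event in a century
|       print(f"per-decade {p:.2f} -> per-century {p_century:.2f}")
|
| The per-century figure swings from roughly 10% to roughly 70%
| depending entirely on the assumed bump.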
|
| Just leave the AI bit out of the otherwise reasonable-looking
| statistical analysis, and you'll be left with a more
| intellectually rigorous and useful work.
| AlbertCory wrote:
| You cannot "estimate" the probability of a catastrophic event,
| i.e. a Black Swan. All you can say is that it's possible, and
| over a long enough time period, things like that _will_ happen.
| As well as things you never imagined.
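|
| That said, the compounding itself is easy to see even with a
| made-up rate; a toy calculation (the 0.1% per-year figure is
| purely illustrative, and for a true Black Swan even that number
| is unknowable):
|
|   # Purely illustrative: assume a fixed 0.1% per-year chance of catastrophe.
|   p_year = 0.001
|   for years in (10, 100, 1000, 5000):
|       p_at_least_one = 1 - (1 - p_year) ** years
|       print(f"{years:>5} years: {p_at_least_one:.1%} chance of at least one")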
___________________________________________________________________
(page generated 2023-12-05 23:00 UTC)