Why do I think this ideology is so dangerous? The short answer is that elevating the fulfilment of humanity’s supposed potential above all else could nontrivially increase the probability that actual people – those alive today and in the near future – suffer extreme harms, even death. Consider that, as I noted elsewhere, the longtermist ideology inclines its adherents to take an insouciant attitude towards climate change. Why? Because even if climate change causes island nations to disappear, triggers mass migrations and kills millions of people, it probably isn’t going to compromise our longterm potential over the coming trillions of years. If one takes a cosmic view of the situation, even a climate catastrophe that cuts the human population by 75 per cent for the next two millennia will, in the grand scheme of things, be nothing more than a small blip – the equivalent of a 90-year-old man having stubbed his toe when he was two.
Bostrom’s argument is that ‘a non-existential disaster causing the breakdown of global civilisation is, from the perspective of humanity as a whole, a potentially recoverable setback.’ It might be ‘a giant massacre for man’, he adds, but so long as humanity bounces back to fulfil its potential, it will ultimately register as little more than ‘a small misstep for mankind’. Elsewhere, he writes that the worst natural disasters and devastating atrocities in history become almost imperceptible trivialities when seen from this grand perspective. Referring to the two world wars, AIDS and the Chernobyl nuclear accident, he declares that ‘tragic as such events are to the people immediately affected, in the big picture of things … even the worst of these catastrophes are mere ripples on the surface of the great sea of life.’
This way of seeing the world, of assessing the badness of AIDS and the Holocaust, implies that future disasters of the same (non-existential) scope and intensity should also be categorised as ‘mere ripples’. If they don’t pose a direct existential risk, then we ought not to worry much about them, however tragic they might be to individuals. As Bostrom wrote in 2003, ‘priority number one, two, three and four should … be to reduce existential risk.’ He reiterated this several years later in arguing that we mustn’t ‘fritter … away’ our finite resources on ‘feel-good projects of suboptimal efficacy’ such as alleviating global poverty and reducing animal suffering, since neither threatens our longterm potential, and our longterm potential is what really matters.
He does not care about us, so why should anyone care about him? Unfortunately, other rich people are into this too, because it helps them ignore the world's problems and do whatever they want to the people living now.
What a shit community this has become when hoping someone is dead gets this many upvotes.
The man is an asshole and a stain on society; his death would likely marginally improve the world's future outlook.
How dare you be mean to Fallenwout's big beloved internet daddy Elon Musk. /s
What a shit guy, who cares more about people who might live thousands of years from now than about the people alive today:
https://netzpolitik.org/2023/longtermism-an-odd-and-peculiar-ideology/
https://aeon.co/essays/why-longtermism-is-the-worlds-most-dangerous-secular-credo
https://en.wikipedia.org/wiki/Longtermism