The goal of "longtermism" and most AI evangelism or Singularity woo is to make trivial things sound important at the expense of actually important things so you will give these people money.
@gwynnion 100% THIS.
@gwynnion I'm a transhumanist and second all of this!
@gwynnion I think the goal of longtermism is to justify one's personal preferences behind a veneer of objectivity.
The fact is, nobody can tell the future, so pretending like you know what will maximize the human population in a million years is a level of hubris only attainable by those who have been given (unearned) the wealth of literal gods.
In a way, it's pretty sad...they can't just say to themselves "this is what I want so I'll make it happen"...they have to invent some pseudo-scientific justification for themselves.
Money corrupts....but it also weakens...billionaires are the weakest of us.
@gwynnion yeeeeeep.
“The real problem isn’t climate change, which would require us to cut into our profit margins to address now, but some hypothetical threat that may be presented by Roko’s Basilisk in 500 years!”
@gwynnion Oh, don't forget the ways that it normalizes and justifies eugenics and ethnic cleansing and genocide. After all, if the good of future generations matters far more than any pain, suffering, or crimes committed in the present...
The philosophy is basically just waiting for people to fill in the blanks with their preferred "solutions" to future "problems".
@gwynnion @davidgerard I disagree (somewhat). The singularity stuff and AI evangelism is the emergence of a post-God religion that started out as a rationalist/enlightenment Christian heresy. The boosters moving in are just a fresh paint job on the same old religious grifters and evangelical hucksters that parasitize Christianity. Yes, they're using it to make money—but they're plugging into a religious impulse, which makes it far more dangerous to the rest of us because zealots know no limits.
@davidgerard @cstross @gwynnion probably they believe in it as much as they do any other investment category: fervently agreeing with it when it makes them money, rapidly distancing from it when it moves out of favour