Scared Silicon Valley billionaire:

Look, you're just not grasping the scale of the AI threat. What if an AI one day is built that decides to harvest large chunks of the Earth's biosphere in order to make some numbers in a financial engine go up slightly? What would we all do then?

Climate change and species loss researchers:

Really? That's a thing that could happen? And people might not believe the predictions? Do go on.

Billionaire:

Well OBVIOUSLY I don't mean it's happening NOW

@natecull the AI threat isn't really a distinct threat to humanity as much as it is a threat to the power of billionaires

@sandrockcstm @natecull definitely

although I don't really see much of a difference between "rogue AI driven only by capital motive" and "out-of-touch billionaires driven only by capital motive" for the rest of us, y'know?

@lm @natecull I think the difference is one of capability.

An AI can act far faster than a human, and therefore can accomplish basically any task before we could stop it. This has always been the true danger of an intelligence running on hardware whose signals move at close to the speed of light.

The real nightmare is an AI deciding that the best way to accomplish its goal is to remove a confounding variable: humans. It simply wipes us out. (cont)

@lm @natecull The example often cited is the "paperclip maximizer", usually blended with the "Gray Goo" nanotech apocalypse.

Imagine an AI designed only to build the best paper clips in the world. It decides the best way to do this is to colonize all of known space and convert it into paper clip factories. It quietly converts some of its facilities to nano-bot production without anyone realizing, and then uses the nano-bots to kill humanity in a week.

It then happily continues its task on a barren Earth, making paper clips forever.
