IABIED

Edited by Gyrodiot, last updated 20th Sep 2025

If Anyone Builds It, Everyone Dies (abbreviated IABIED) is a book by Eliezer Yudkowsky and Nate Soares, released in September 2025.

The book's main thesis, quoted directly:

If any company or group, anywhere on the planet, builds an artificial superintelligence using anything remotely like current techniques, based on anything remotely like the present understanding of AI, then everyone, everywhere on Earth, will die.

We do not mean that as hyperbole. We are not exaggerating for effect. We think that is the most direct extrapolation from the knowledge, evidence, and institutional conduct around artificial intelligence today. In this book, we lay out our case, in the hope of rallying enough key decision-makers and regular people to take AI seriously. The default outcome is lethal, but the situation is not hopeless; machine superintelligence doesn't exist yet, and its creation can yet be prevented.

This tag is used for announcements related to the book, as well as reviews of it.

  • Official website
Posts tagged IABIED

  • If anyone builds it, everyone will plausibly be fine (joshc)