Suffering risks (also known as s-risks) are risks that the far future will contain suffering on an astronomical scale. In this sense they can be considered a form of existential risk under Bostrom's original definition, but it may be useful to distinguish between risks that threaten to prevent future populations from coming into existence (standard x-risks) and risks that the future will instantiate a large amount of suffering (s-risks).

Although the Machine Intelligence Research Institute and the Future of Humanity Institute have investigated strategies to prevent s-risks, the only EA organization with s-risk prevention as its primary focus is the Foundational Research Institute.

See also