Human Alignment

Edited by Jordan Arel last updated 6th Dec 2022

Human alignment is a state of humanity in which most or all people systematically cooperate to achieve positive-sum outcomes for everyone (or are, at a minimum, prevented from pursuing negative-sum outcomes), in a way that is perpetually sustainable into the future. Such a state of human alignment may be necessary to prevent an existential catastrophe if the "Vulnerable World Hypothesis" is correct.

Posts tagged Human Alignment

73. Uploading — RogerDearnaley