AI ALIGNMENT FORUM

Singularitarianism

Edited by steven0461, Daniel Trenor, Kaj_Sotala, et al. last updated 31st Jan 2013

Singularitarianism refers to attitudes or beliefs favoring a technological singularity.

The term was coined by Mark Plus, then given a more specific meaning by Eliezer Yudkowsky in his Singularitarian principles. Early on, "Singularitarianism" referred to a principled activist stance aimed at creating a singularity for the benefit of humanity as a whole, and in particular to the movement surrounding the Machine Intelligence Research Institute.

The term has since sometimes been used more loosely, without implying the specific principles listed by Yudkowsky. For example, Ray Kurzweil's book "The Singularity Is Near" contains a chapter titled "Ich bin ein Singularitarian", in which Kurzweil describes his own vision for technology improving the world. Others have used the term to refer to people working to influence the Singularity, or to "expanding one's mental faculties by merging with technology". Still others have used "Singularitarian" to refer to anyone who predicts that a technological singularity will happen.

Yudkowsky has (perhaps facetiously) suggested that those adhering to the original activist stance relabel themselves the "Elder Singularitarians".

External Links

  • The Singularity is Near by Ray Kurzweil
  • Transhumanist and Singularitarian articles by Nick Bostrom
  • Singularity articles by Eliezer Yudkowsky

See Also

  • Singularity
  • Hard takeoff
  • Artificial General Intelligence