Anthropomorphism

Edited by Zack_M_Davis, PeerInfinity, Vladimir_Nesov, Ruby, et al. last updated 15th Sep 2020

Anthropomorphism is the error of attributing distinctly human characteristics to nonhuman processes. As creatures who evolved in a social context, we all have adaptations for predicting the behavior of other humans by empathic inference. When trying to understand another person, it is often a helpful (and bias-correcting) heuristic to ask, "Well, what would I do in such a situation?" and let that be your prediction. Tempting as it is, however, this mode of prediction simply won't do for things (and in this wide universe there are many) that don't share the detailed structure evolution bequeathed to the human brain.

Related tags: Mind Projection Fallacy, Typical Mind Fallacy, Alien values, Paperclip maximizer

Blog posts

  • The Tragedy of Group Selectionism - A tale of how some pre-1960s biologists were led astray by expecting evolution to do the smart, nice things they themselves would have done.
  • When Anthropomorphism Became Stupid
  • Anthropomorphic Optimism - You shouldn't bother coming up with clever, persuasive arguments for why evolution will do things the way you prefer. It really isn't listening.
  • Humans in Funny Suits

See also

  • Evolution as alien god
  • Unsupervised universe
  • Alien values, Paperclip maximizer
  • Mind design space, Really powerful optimization process