AI ALIGNMENT FORUM

One-Boxing

Edited by Chris_Leong last updated 17th Sep 2020

In Newcomb's Problem, one-boxing means taking only the box which could contain the million. The term generally isn't used for taking only the box containing the thousand, even though that would also be taking just one box, since there is no reason to do so.

The general consensus on Less Wrong is that one-boxing is the rational decision, while two-boxing appears to be the more popular option among academic decision theorists.
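The intuition behind one-boxing can be made concrete with a toy expected-value calculation. This is only a sketch of the evidential reasoning, under assumed illustrative numbers: a predictor accuracy of 99%, the standard $1,000,000 and $1,000 payoffs, and a hypothetical helper function not taken from any source.

```python
# Toy expected-payoff calculation for Newcomb's Problem, treating the
# predictor's accuracy p as evidence about the opaque box's contents.
# The value p = 0.99 is an illustrative assumption.

MILLION = 1_000_000
THOUSAND = 1_000

def expected_payoff(p, one_box):
    """Expected payoff given predictor accuracy p and the chosen action."""
    if one_box:
        # With probability p the predictor foresaw one-boxing
        # and filled the opaque box with the million.
        return p * MILLION
    # With probability 1 - p the predictor wrongly expected one-boxing,
    # so the million is still there; the thousand is guaranteed.
    return (1 - p) * MILLION + THOUSAND

p = 0.99
print(expected_payoff(p, one_box=True))   # roughly 990,000
print(expected_payoff(p, one_box=False))  # roughly 11,000
```

For any reasonably accurate predictor, the one-boxer's expected payoff dominates, which is the calculation underlying the Less Wrong consensus; two-boxers object that the boxes' contents are already fixed at decision time, so this conditional expectation misleads.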
