AI ALIGNMENT FORUM

Two-Boxing

Edited by Chris_Leong last updated 17th Sep 2020

In Newcomb's Problem, two-boxing means taking both boxes, typically on the grounds that your decision cannot causally affect a prediction that has already been made. The general consensus on Less Wrong is that one-boxing is the rational choice, whilst two-boxing appears to be the most popular position among decision theorists.
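The tension can be made concrete with a quick expected-value sketch. The payoffs below (a $1,000,000 opaque box and a $1,000 transparent box) and the 99% predictor accuracy are the conventional illustrative numbers, assumed here for the sketch rather than drawn from this entry; the calculation conditions the prediction on your choice, in the evidential style that favours one-boxing.

```python
def expected_payoffs(accuracy: float) -> tuple[float, float]:
    """Return (one_box_ev, two_box_ev), conditioning the predictor's
    earlier prediction on the choice you make (evidential-style).

    Payoffs and accuracy are illustrative assumptions, not canon.
    """
    million, thousand = 1_000_000, 1_000
    # If you one-box, the predictor filled the opaque box
    # with probability `accuracy`.
    one_box = accuracy * million
    # If you two-box, the predictor left the opaque box empty
    # with probability `accuracy`; the $1,000 is yours either way.
    two_box = (1 - accuracy) * million + thousand
    return one_box, two_box

one_ev, two_ev = expected_payoffs(0.99)
# With a reliable predictor, one-boxing has the higher expected payoff
# under this conditioning; a causal calculation, which treats the
# prediction as fixed, instead favours two-boxing for the extra $1,000.
```

A causal decision theorist rejects the conditioning step itself: since the boxes are already filled, two-boxing dominates whatever the prediction was, which is exactly the disagreement the entry describes.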
