Counterfactual Mugging

Omega, a perfect predictor, flips a coin. If it comes up tails, Omega asks you for $100. If it comes up heads, Omega pays you $10,000 if it predicts that you would have paid if the coin had come up tails.

Two Types of Updatelessness makes a distinction between all-upside updatelessness and mixed-upside updatelessness. In the all-upside case, utilising an updateless decision theory provides a better result in the current situation, while in the mixed-upside case the benefits go to other possible selves. Unlike Newcomb's Problem or Parfit's Hitchhiker, Counterfactual Mugging is a mixed-upside case.

Depending on how the problem is phrased, intuition calls for different answers. For example, Eliezer Yudkowsky has argued that framing the problem so that Omega is a regular aspect of the environment which regularly asks such questions makes most people answer 'Yes'. However, Vladimir Nesov points out that Rationalists Should Win could be interpreted as suggesting that we should not pay. After all, even though paying in the tails case would cause you to do worse in the counterfactual where the coin came up heads, you already know that counterfactual didn't happen, so it's not obvious that you should pay. This issue has been discussed in this question.

Formal decision theories also diverge. In Causal Decision Theory, you can only affect outcomes you are causally linked to, and paying now cannot cause the coin to have come up heads, so the answer should be 'No'. In Evidential Decision Theory, any kind of connection is accounted for, but having already observed tails, paying is merely evidence that you lose $100, so the answer is again 'No'. Timeless Decision Theory's answer seems to be undefined; however, Yudkowsky has argued that if the problem is presented recurrently, one should answer 'Yes' on the basis of increasing one's probability of gaining $10,000 in the next round, and this seems to be Causal Decision Theory's prescription as well. Updateless Decision Theory prescribes giving the $100, on the basis that your decision can influence both the 'heads branch' and the 'tails branch' of the universe.
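
To make the divergence concrete, the two modes of evaluation can be contrasted in a minimal sketch (an illustration, not anyone's canonical formalisation; it assumes a fair coin, utility linear in dollars, and that Omega's prediction simply mirrors the agent's fixed policy). Updateful evaluation conditions on the observed tails before scoring the act, while updateless evaluation scores whole policies across both branches:

```python
HEADS_PROB = 0.5  # assumed fair coin

def payoff(policy_pays: bool, coin: str) -> int:
    """Dollar payoff of a fixed policy on one branch of the coin flip."""
    if coin == "heads":
        # Omega's prediction mirrors the policy: it pays iff the policy pays.
        return 10_000 if policy_pays else 0
    # Tails: a paying policy hands over $100, a refusing policy keeps it.
    return -100 if policy_pays else 0

def updateless_value(policy_pays: bool) -> float:
    """UDT-style: score the policy across both branches."""
    return (HEADS_PROB * payoff(policy_pays, "heads")
            + (1 - HEADS_PROB) * payoff(policy_pays, "tails"))

def updateful_value(policy_pays: bool) -> float:
    """CDT/EDT-style here: score the act after conditioning on observed tails."""
    return float(payoff(policy_pays, "tails"))

for pays in (True, False):
    print(f"pay={pays}: updateless={updateless_value(pays)}, "
          f"updateful={updateful_value(pays)}")
# pay=True: updateless=4950.0, updateful=-100.0
# pay=False: updateless=0.0, updateful=0.0
```

Updateless evaluation favours the paying policy at an expected $4,950 per encounter, while conditioning on tails favours refusing, which is exactly the split between the theories described above.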

Regardless of the particular decision theory, it is generally agreed that if you can pre-commit in advance, you should do so. The dispute is purely over what you should do if you didn't pre-commit.
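
The same numbers fall out of simulating many independent presentations of the problem, which is also the flavour of the recurrent-presentation argument above. The sketch below is illustrative only, again assuming a fair coin and an Omega whose prediction simply mirrors the committed policy:

```python
import random

def average_winnings(policy_pays: bool, rounds: int = 100_000, seed: int = 0) -> float:
    """Average per-round winnings of a fixed, pre-committed policy."""
    rng = random.Random(seed)
    total = 0
    for _ in range(rounds):
        if rng.random() < 0.5:  # heads: Omega pays iff it predicts payment
            total += 10_000 if policy_pays else 0
        else:                   # tails: a committed payer hands over $100
            total -= 100 if policy_pays else 0
    return total / rounds

print(average_winnings(policy_pays=True))   # roughly 4,950 per round
print(average_winnings(policy_pays=False))  # exactly 0.0
```

A committed payer nets roughly $4,950 per round on average, while a committed refuser nets nothing, which is why pre-commitment is the uncontroversial part.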

In Logical Counterfactual Mugging, instead of flipping a coin, Omega tells you the 10,000th digit of pi, which we assume you don't know off the top of your head. If it is odd, we treat it like heads in the original problem, and if it is even, we treat it like tails. Logical inductors have been proposed as a solution to this problem. It is possible to construct a version of the Counterfactual Prisoner's Dilemma for Logical Counterfactual Mugging too.
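
As an aside on the setup, the digit itself is straightforward to obtain with arbitrary-precision arithmetic. The sketch below assumes the '10,000th digit' means the 10,000th digit after the decimal point and uses the mpmath library:

```python
# Compute the 10,000th decimal digit of pi and map it to a branch of the
# problem (odd -> treated like heads, even -> treated like tails).
from mpmath import mp

mp.dps = 10_020                  # working precision: guard digits past 10,000
pi_str = mp.nstr(mp.pi, 10_010)  # "3.1415926535..." with 10,010 significant digits
digit = int(pi_str[1 + 10_000])  # skip "3."; index the 10,000th decimal digit
branch = "heads" if digit % 2 == 1 else "tails"
print(digit, branch)
```

Once the digit has been revealed, the agent faces the same question as in the coin version, with logical uncertainty standing in for the physical coin.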