{
  localUrl: '../page/newcombs_problem.html',
  arbitalUrl: 'https://arbital.com/p/newcombs_problem',
  rawJsonUrl: '../raw/5pv.json',
  likeableId: '3610',
  likeableType: 'page',
  myLikeValue: '0',
  likeCount: '1',
  dislikeCount: '0',
  likeScore: '1',
  individualLikes: [
    'EricBruylant'
  ],
  pageId: 'newcombs_problem',
  edit: '6',
  editSummary: 'formatting fixes',
  prevEdit: '4',
  currentEdit: '6',
  wasPublished: 'true',
  type: 'wiki',
  title: 'Newcomb's Problem',
  clickbait: 'There are two boxes in front of you, Box A and Box B.  You can take both boxes, or only Box B.  Box A contains $1000.  Box B contains $1,000,000 if and only if Omega predicted you'd take only Box B.',
  textLength: '7843',
  alias: 'newcombs_problem',
  externalUrl: '',
  sortChildrenBy: 'likes',
  hasVote: 'false',
  voteType: '',
  votesAnonymous: 'false',
  editCreatorId: 'EricBruylant',
  editCreatedAt: '2016-10-13 16:53:43',
  pageCreatorId: 'EliezerYudkowsky',
  pageCreatedAt: '2016-08-01 00:09:09',
  seeDomainId: '0',
  editDomainId: '123',
  submitToDomainId: '0',
  isAutosave: 'false',
  isSnapshot: 'false',
  isLiveEdit: 'true',
  isMinorEdit: 'false',
  indirectTeacher: 'false',
  todoCount: '0',
  isEditorComment: 'false',
  isApprovedComment: 'true',
  isResolved: 'false',
  snapshotText: '',
  anchorContext: '',
  anchorText: '',
  anchorOffset: '0',
  mergedInto: '',
  isDeleted: 'false',
  viewCount: '79',
text: '[summary:  A powerful alien named [5b2 Omega] has presented you with the following dilemma:\n\n- Before you are two boxes, Box A and Box B.\n- You can either take both boxes ("two-box"), or take only Box B ("one-box").\n- Box A is transparent and contains \\$1,000.  Box B is opaque.\n- If Omega predicted that you will take only Box B, Omega has put \\$1,000,000 in Box B.  Otherwise Omega left Box B empty.\n- Omega is an excellent predictor of human behavior (e.g. we can assume Omega has played this game many times before, and never been wrong).\n- At the time of your choice, Omega has already made its prediction and left.  Box B is already empty or already full.\n\nNewcomb's Problem was historically responsible for the invention of [5n9 causal decision theory] and its widespread adoption over [5px evidential decision theory].  For a discussion of [5pt Newcomblike decision problems] in general, see [58d].]\n\n\nNewcomb's Problem is the original [5pt Newcomblike decision problem] that inspired the creation of [5n9 causal decision theory] as distinct from [5px evidential decision theory], spawning a vast array of philosophical literature in the process.  It is sometimes called Newcomb's Paradox (despite not being a paradox).  The dilemma was originally formulated by [William Newcomb](https://en.wikipedia.org/wiki/William_Newcomb), and presented to the philosophical community by Robert Nozick.\n\nThe original formulation of Newcomb's Problem was as follows:\n\n[5b2 An alien named Omega] has come to Earth, and has offered some people the following dilemma.\n\nBefore you are two boxes, Box A and Box B.\n\nYou may choose to take both boxes ("two-box"), or take only Box B ("one-box").\n\nBox A is transparent and contains \\$1,000.\n\nBox B is opaque and contains either \\$1,000,000 or \\$0.\n\nThe alien Omega has already set up the situation and departed, but previously put \\$1,000,000 into Box B if and only if Omega predicted that you would one-box (take only the opaque Box B and leave Box A and its \\$1,000 behind).\n\nOmega is an excellent predictor of human behavior.  For the sake of quantifying this assertion and how we know it, we can assume e.g. that Omega has run 67 previous experiments and not been wrong even once.  Since people are often strongly opinionated about their choices in Newcomb's Problem, it isn't unrealistic to suppose this is the sort of thing you could predict by reasoning about, e.g., a scan of somebody's brain.\n\nNewcomb originally specified that Omega would leave Box B empty in the case that you tried to decide by flipping a coin; since this violates [fair_problem_class algorithm-independence], we can alternatively suppose that Omega can predict coinflips.\n\nWe may also assume, e.g., that Box A combusts if it is left behind, so nobody else can pick up Box A later; that Omega adds \\$1 of pollution-free electricity to the world economy for every \\$1 used in Its dilemmas, so that the currency does not represent a zero-sum wealth transfer; etcetera.  Omega never plays this game with a person more than once.\n\nThe two original opposing arguments given about Newcomb's Problem were, roughly:\n\n- Argument for one-boxing: People who take only Box B tend to walk away rich. People who two-box tend to walk away poor. It is better to be rich than poor.\n- Argument for two-boxing: Omega has already made its prediction. Box B is already empty or already full. It would be irrational to leave behind Box A when this choice cannot *cause* Box B's contents to change. It's true that Omega has chosen to reward people with irrational dispositions in this setup, but Box B is now already empty, and taking only one box won't change that.
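\n\nBoth arguments can be made numerically explicit.  The sketch below is purely illustrative: it uses the \\$1,000 and \\$1,000,000 payoffs above, plus a 99% prediction accuracy as a stand-in for "Omega has not been wrong yet" (the exact accuracy figure is an assumption, not part of the problem statement).\n\n
    ACCURACY = 0.99  # illustrative stand-in for "Omega is an excellent predictor"\n\n
    def payoff(action, prediction):\n
        """Dollars received, given your action and Omega's earlier prediction of it."""\n
        box_a = 1000  # the transparent box, always full\n
        box_b = 1000000 if prediction == "one-box" else 0  # filled iff one-boxing was predicted\n
        return box_b + (box_a if action == "two-box" else 0)\n\n
    # Argument for one-boxing: facing an accurate predictor, one-boxers walk away rich.\n
    for action in ("one-box", "two-box"):\n
        other = "two-box" if action == "one-box" else "one-box"\n
        expected = ACCURACY * payoff(action, action) + (1 - ACCURACY) * payoff(action, other)\n
        print(action, expected)  # roughly 990000 for one-boxing, roughly 11000 for two-boxing\n\n
    # Argument for two-boxing: whichever prediction Omega already made, two-boxing pays 1000 more.\n
    for prediction in ("one-box", "two-box"):\n
        print(prediction, payoff("two-box", prediction) - payoff("one-box", prediction))  # 1000 both times\n\n
The dispute is over which of these two computations expresses the principle of rational choice, and that is exactly the split between the decision theories surveyed below.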
\n\nFor the larger argument of which this became part, see [58d one of the introductions to logical decision theory].  As of 2016, the most academically common view of Newcomb's Problem is that it surfaces the split between [-5px] and [-5n9], and that [5n9 causal decision theory] is correct.  However, both that framing and that conclusion have been variously disputed, most recently by [-58b].\n\n%todo:  add a diagram of a causal model for Newcomb's Problem.%\n\nA more extensive discussion can be found on the Wikipedia page for Newcomb's Problem, under the title ["Newcomb's Paradox"](https://en.wikipedia.org/wiki/Newcomb%27s_paradox).\n\n# Replies by different decision theories\n\n(This section does not remotely do justice to the vast literature on Newcomb's Problem.)\n\n## Pretheoretic reactions\n\n- Well, by assumption, Omega is pretty good at predicting me, so I'd better take only Box B.\n- Omega's already gone.  I can't possibly get any more money by leaving behind Box A.\n- I have free will, so Omega *can't* predict me.  This problem is paradoxical.\n- This is a silly dilemma; why would Omega do that? %note: Other [5pt Newcomblike problems] may seem more naturally motivated, such as [5rb voting in elections], the [5py], [5s0], and the [5qh].%\n\n## Evidential decision theory\n\n[5px] can be seen as a form of decision theory that was originally written down by historical accident--writing the [18t expected utility formula] as if it [action_conditional conditioned] using [1ly Bayesian updating], because Bayesian updating is usually the way we condition probability functions.  Historically, though, [-5px] was explicitly named as such in an (arguably failed) attempt to rationalize the pretheoretic answer of "I expect to do better if I one-box" on Newcomb's Problem.\n\nOn [5px], the [-principle_of_rational_choice principle of rational choice] is to choose the act that would be the best news to learn you had taken; in other words, imagine being told that you had in fact made each of your possible choices, imagine what you would believe about the world in each case, and output the choice which would be the best news.  Thus, evidential agents one-box on Newcomb's Problem.\n\nAlthough the EDT answer happens to conform with "the behavior of the agents that end up rich" on Newcomb's Problem, LDT proponents note that it does not do so in general; see e.g. the [-5ry].\n\n## Causal decision theory\n\nOn causal decision theories, the principle of rational choice is to choose according to the causal consequences of your physical act; formally, to calculate expected utility by conditioning using a [causal_counterfactual causal counterfactual].  To choose, imagine the world as it is right up until the moment of your physical act; suppose that your physical act changes, without anything else about the world up until that point changing; then imagine time running forward under what your model says are the rules or physical laws.\n\nA causal agent thus believes that Box B is already empty, and takes both boxes.  When they imagine the (counterfactual) result of taking only Box B instead, they imagine the world being the same up until that point in time--including Box B remaining empty--and then imagine the result of taking only Box B under physical laws past that point, namely, going home with \\$0.
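\n\nThe contrast just described can be written out as two small expected-utility calculations.  This is only a sketch under assumed numbers: the payoffs are the ones given above, while the function names and the 99%/1% conditional probabilities are illustrative stand-ins for "Omega is an excellent predictor", not part of the problem statement.\n\n
    def payout(act, box_b_full):\n
        """Dollars received, given the act and whether Box B is (already) full."""\n
        return (1000000 if box_b_full else 0) + (1000 if act == "two-box" else 0)\n\n
    # EDT conditions on the act: your choice is evidence about what Omega predicted,\n
    # so P(Box B full | one-box) is high and P(Box B full | two-box) is low.\n
    def edt_eu(act, p_full_given_act):\n
        return p_full_given_act * payout(act, True) + (1 - p_full_given_act) * payout(act, False)\n\n
    print(edt_eu("one-box", 0.99), edt_eu("two-box", 0.01))  # roughly 990000 vs 11000: one-boxing is better news\n\n
    # CDT's causal counterfactual holds the already-settled contents of Box B fixed:\n
    # the same credence p_full is used whichever act is imagined.\n
    def cdt_eu(act, p_full):\n
        return p_full * payout(act, True) + (1 - p_full) * payout(act, False)\n\n
    for p_full in (0.0, 0.25, 0.5, 0.75, 1.0):\n
        print(p_full, cdt_eu("two-box", p_full) - cdt_eu("one-box", p_full))  # always 1000.0: CDT two-boxes\n\n
The last loop makes the dominance explicit: the CDT comparison does not depend on the agent's credence that Box B is full, which is the formal face of the intuition that leaving Box A behind cannot get you any more money.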
\n\nHistorically speaking, [5n9 causal decision theory] was first invented to justify two-boxing on Newcomb's Problem; we can see CDT as formalizing the pretheoretic intuition, "Omega's already gone, so I can't get more money by leaving behind Box A."\n\n## Logical decision theories\n\nOn [58b logical decision theories], the principle of rational choice is "Decide as though you are choosing the logical output of your decision algorithm."  E.g., on [-timeless_dt], our extended causal model of the world would include a logical proposition for whether the output of your decision algorithm is 'one-box' or 'two-box'; and this logical fact would affect both Omega's prediction of you, and your actual decision.  Thus, an LDT agent prefers that its algorithm have the logical output of one-boxing.\n\n%todo: add graph for TDT on NP%',
  metaText: '',
  isTextLoaded: 'true',
  isSubscribedToDiscussion: 'false',
  isSubscribedToUser: 'false',
  isSubscribedAsMaintainer: 'false',
  discussionSubscriberCount: '1',
  maintainerCount: '1',
  userSubscriberCount: '0',
  lastVisit: '',
  hasDraft: 'false',
  votes: [],
  voteSummary: 'null',
  muVoteSummary: '0',
  voteScaling: '0',
  currentUserVote: '-2',
  voteCount: '0',
  lockedVoteType: '',
  maxEditEver: '0',
  redLinkCount: '0',
  lockedBy: '',
  lockedUntil: '',
  nextPageId: '',
  prevPageId: '',
  usedAsMastery: 'false',
  proposalEditNum: '0',
  permissions: {
    edit: {
      has: 'false',
      reason: 'You don't have domain permission to edit this page'
    },
    proposeEdit: {
      has: 'true',
      reason: ''
    },
    delete: {
      has: 'false',
      reason: 'You don't have domain permission to delete this page'
    },
    comment: {
      has: 'false',
      reason: 'You can't comment in this domain because you are not a member'
    },
    proposeComment: {
      has: 'true',
      reason: ''
    }
  },
  summaries: {},
  creatorIds: [
    'EliezerYudkowsky',
    'EricBruylant'
  ],
  childIds: [],
  parentIds: [
    'newcomblike'
  ],
  commentIds: [],
  questionIds: [],
  tagIds: [],
  relatedIds: [],
  markIds: [],
  explanations: [],
  learnMore: [],
  requirements: [],
  subjects: [],
  lenses: [],
  lensParentId: '',
  pathPages: [],
  learnMoreTaughtMap: {},
  learnMoreCoveredMap: {},
  learnMoreRequiredMap: {},
  editHistory: {},
  domainSubmissions: {},
  answers: [],
  answerCount: '0',
  commentCount: '0',
  newCommentCount: '0',
  linkedMarkCount: '0',
  changeLogs: [
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '20138',
      pageId: 'newcombs_problem',
      userId: 'EricBruylant',
      edit: '6',
      type: 'newEdit',
      createdAt: '2016-10-13 16:53:45',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: 'formatting fixes'
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '17874',
      pageId: 'newcombs_problem',
      userId: 'EliezerYudkowsky',
      edit: '4',
      type: 'newEdit',
      createdAt: '2016-08-01 00:36:48',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '17873',
      pageId: 'newcombs_problem',
      userId: 'EliezerYudkowsky',
      edit: '3',
      type: 'newEdit',
      createdAt: '2016-08-01 00:35:12',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '17872',
      pageId: 'newcombs_problem',
      userId: 'EliezerYudkowsky',
      edit: '2',
      type: 'newEdit',
      createdAt: '2016-08-01 00:28:06',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '17867',
      pageId: 'newcombs_problem',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newTag',
      createdAt: '2016-08-01 00:09:11',
      auxPageId: '4v4',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '17866',
      pageId: 'newcombs_problem',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newParent',
      createdAt: '2016-08-01 00:09:10',
      auxPageId: 'newcomblike',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '17864',
      pageId: 'newcombs_problem',
      userId: 'EliezerYudkowsky',
      edit: '1',
      type: 'newEdit',
      createdAt: '2016-08-01 00:09:09',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    }
  ],
  feedSubmissions: [],
  searchStrings: {},
  hasChildren: 'false',
  hasParents: 'true',
  redAliases: {},
  improvementTagIds: [],
  nonMetaTagIds: [],
  todos: [],
  slowDownMap: 'null',
  speedUpMap: 'null',
  arcPageIds: 'null',
  contentRequests: {}
}