# Transparent Newcomb's Problem

[summary: A version of [5pv Newcomb's Problem] in which Box B is transparent. That is:

[5b2 Omega] has presented you with two boxes: Box A, which transparently contains \$1,000, and Box B, which transparently contains \$0 or \$1,000,000. You may take either both boxes, or only Box B. Omega has already filled Box B iff Omega predicted that you would, upon seeing a full Box B, take only Box B.

The Transparent Newcomb's Problem is noteworthy in that [5px evidential decision theory] and [5n9 causal decision theory] agree that a [principle_rational_choice rational] agent should take both boxes; only [58b logical decision agents] leave behind Box A and become rich. This is an apparent counterexample to a [edt_cdt_dichotomy widespread view] that EDT and CDT divide [5pt Newcomblike problems] between them, with EDT agents accepting 'Why aincha rich?' arguments.]

Like [5pv Newcomb's Problem], but Box B is *also* transparent. That is:

[5b2 Omega] has presented you with the following dilemma:

- There are two boxes before you, Box A and Box B.
- You can either take both boxes ("two-box"), or take only Box B ("one-box").
- Box A is transparent and contains \$1,000.
- Box B is *also* transparent and contains either \$1,000,000 or \$0.
- Omega has already put \$1,000,000 into Box B *if and only if* Omega **predicts that you will one-box when faced with a visibly full Box B.**
- Omega has been right in a couple of dozen games so far, but not a thousand games, and Omega *could* be wrong next time given our current knowledge. We may alternatively suppose that Omega is right 99%, but not 99.9%, of the time. %note: That is, Omega's success rate reflects that everyone who's seen a full Box B has one-boxed. Some people who've seen an empty Box B have been indignant about that. But based on Omega's accuracy in the testable cases, they're probably wrong about what they would have done.%

This [5pt Newcomblike dilemma] is structurally similar to [5s0] (no decision theory disputes this structural similarity, so far as we know).

Note that it is not, in general, possible to have a transparent Newcomb's Problem in which, for every possible agent, Omega fills Box B iff Omega predicts unconditionally that the agent ends up one-boxing. Some agent could two-box on seeing a full Box B and one-box on seeing an empty Box B, making the general rule impossible for Omega to fulfill.
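The impossibility claim can be checked by brute-force enumeration. Below is a minimal sketch (our own illustration, not part of the original problem statement) that represents each deterministic agent as a map from what it sees to what it does, and asks whether either of Omega's two possible fill-decisions satisfies the unconditional rule "fill Box B iff the agent ends up one-boxing":

```python
# For each deterministic agent policy (a map from what the agent sees to the
# action it takes), check which of Omega's fill-decisions, if any, satisfy
# the unconditional rule "fill Box B iff the agent ends up one-boxing."

POLICIES = {
    "always one-box": {"full": "one-box", "empty": "one-box"},
    "always two-box": {"full": "two-box", "empty": "two-box"},
    "one-box iff Box B is full": {"full": "one-box", "empty": "two-box"},
    "contrarian (one-box iff Box B is empty)": {"full": "two-box", "empty": "one-box"},
}

for name, policy in POLICIES.items():
    consistent = []
    for fill in (True, False):                    # Omega's two possible choices
        observation = "full" if fill else "empty"
        action = policy[observation]              # what this agent then does
        if fill == (action == "one-box"):         # rule: fill iff agent one-boxes
            consistent.append(fill)
    print(f"{name}: consistent fill-decisions = {consistent}")

# The contrarian policy prints an empty list: no fill-decision obeys the rule.
```

The contrarian agent admits no consistent fill-decision, which is why the problem statement instead conditions Omega's prediction on what the agent would do *upon seeing a full Box B*.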
Similarly, the problem setup stipulates that it is not entirely impossible for Omega to get the prediction wrong next time. Otherwise this would introduce a new and distracting problem of [action_conditional conditioning] on a visible impossibility when we see a full Box B and consider two-boxing.

# Analyses

## [5n9 Causal decision theory]

Two-boxes, because one-boxing cannot cause Box B to be full or empty, since Omega has already predicted and departed.

## [5px Evidential decision theory]

Two-boxes, because one-boxing cannot be further *good news* about Box B being full: the agent has already seen that Box B is full. The agent, upon imagining being told that it one-boxes here, imagines concluding "Omega made its first mistake!" rather than "My eyes are deceiving me and Box B is actually empty." (Thus, EDT agents never see a full Box B to begin with.)

## [58b Logical decision theory]

One-boxes, because:

• On [timeless_dt timeless decision theory] without the [5rz updateless] feature: Even after observing Box B being full, we conclude from our extended causal model that in the *counterfactual* case where our algorithm output "Take both boxes", Box B would have *counterfactually* been empty. (Updateful TDT does not [counterfactual_mugging in general] output the behavior corresponding to the highest score on problems in this [fair_problem_class decision class], but updateful TDT happens to get the highest score in this particular scenario.)

• On [5rz updateless decision theories]: The policy of mapping the sensory input "Box B is full" onto the action "Take only one box" leads to the highest expected utility, as evaluated relative to our non-updated prior (see the sketch below).
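To make the updateless comparison concrete, here is a minimal sketch under an assumed model of Omega: it predicts this policy's response to a full Box B correctly with probability 0.99 (per the problem statement) and predicts the opposite response otherwise. The sketch enumerates the four deterministic policies from observation to action and scores each against the non-updated prior:

```python
# Ex-ante (non-updated) expected value of each observation -> action policy.
# Assumed model: Omega predicts this policy's response to a *full* Box B
# correctly with probability ACCURACY, predicts the opposite otherwise,
# and fills Box B iff the predicted response is "one-box".
# Payoffs are the $1,000 and $1,000,000 from the problem statement.

ACCURACY = 0.99                      # assumed: "right 99% of the time"
BOX_A, BOX_B = 1_000, 1_000_000

def payoff(box_b_full, action):
    return (BOX_B if box_b_full else 0) + (BOX_A if action == "two-box" else 0)

def flip(action):
    return "two-box" if action == "one-box" else "one-box"

def expected_value(policy):
    total = 0.0
    for correct in (True, False):
        p = ACCURACY if correct else 1 - ACCURACY
        predicted = policy["full"] if correct else flip(policy["full"])
        box_b_full = (predicted == "one-box")   # Omega fills iff it predicts one-boxing
        seen = "full" if box_b_full else "empty"
        total += p * payoff(box_b_full, policy[seen])
    return total

policies = {
    "one-box on full, two-box on empty": {"full": "one-box", "empty": "two-box"},
    "one-box on full, one-box on empty": {"full": "one-box", "empty": "one-box"},
    "two-box on full, two-box on empty": {"full": "two-box", "empty": "two-box"},
    "two-box on full, one-box on empty": {"full": "two-box", "empty": "one-box"},
}

for name, policy in policies.items():
    print(f"{name}: ${expected_value(policy):,.0f}")

# one-box-on-full policies score about $990,000; two-box-on-full policies
# score about $11,000 or less.
```

On this model, the policies that one-box upon seeing a full Box B dominate ex ante, which is the sense in which the updateless evaluation favors one-boxing.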
The Transparent Newcomb's Problem is significant because it counterargues a [edt_cdt_dichotomy widespread view] that EDT and CDT split the [5pt Newcomblike problems] between them, with EDT being the decision theory that accepts 'why aincha rich?' arguments.

- EDT and CDT agree on two-boxing in the Transparent Newcomb's Problem, both saying, "Omega has chosen to penalize the rational behavior here, alas, but it is too late for me to do anything about that."
- LDT disagrees with both and one-boxes, saying "My algorithm can output whatever behavior I want."
- EDT and CDT agents exhibit the behavior pattern that corresponds to being poor; LDT agents ask, "If your principle of choice is so rational, why aincha rich?"

## Truly clever LDT and EDT agents

Truly clever agents will realize that the (transparently visible) state of Box B reflects oracular reasoning by Omega about any factor that could affect our decision whether to one-box after seeing a full Box B. The value of an advance prediction about any possible observable factor determining our decision could easily exceed a million dollars.

For example, suppose we have until the end of the day to actually decide how many boxes to take. On finding yourself in a transparent Newcomb's Problem, you could postcommit to an obvious strategy, such as: you will one-box iff the S&P 500 ends the day up. If you see that Box B is full, you can load up on margin and buy short-term call options (and then wait, and actually one-box at the end of the day iff the S&P 500 does go up).

You could also carry out the converse strategy (buy put options if you see Box B is empty), but only if you're confident that the S&P 500's daily movement is independent of any options you buy, and that both of your possible selves converge on the same postcommitment, since what you're learning from seeing Box B in this case is what your action *would* have been at the end of the day *if* Box B had been full.

This general strategy was observed by [Eliezer Yudkowsky and Jack LaSota](https://www.facebook.com/yudkowsky/posts/10154554618439228).
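The postcommitment just described can be written out schematically. This is a minimal sketch under the article's assumptions; the trading interface (`buy_calls`, `buy_puts`) is a placeholder of our own, not part of the original page:

```python
# Schematic form of the "truly clever" postcommitment: the state of Box B is
# treated as Omega's advance forecast of the observable fact (whether the
# S&P 500 ends the day up) that the precommitted rule makes decisive.

def on_seeing_boxes(box_b_full, buy_calls, buy_puts):
    """React to Omega's implicit forecast as soon as the boxes are revealed."""
    if box_b_full:
        buy_calls()   # Omega predicts we one-box, i.e. that the index ends the day up
    else:
        buy_puts()    # converse strategy: valid only under the extra assumptions above

def at_end_of_day(box_b_full, index_ended_up):
    """Carry out the precommitted rule: one-box iff the index ended the day up."""
    if box_b_full and index_ended_up:
        return "one-box"
    return "two-box"
```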