{ localUrl: '../page/ldt_guide.html', arbitalUrl: 'https://arbital.com/p/ldt_guide', rawJsonUrl: '../raw/58d.json', likeableId: '0', likeableType: 'page', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], pageId: 'ldt_guide', edit: '2', editSummary: '', prevEdit: '1', currentEdit: '2', wasPublished: 'true', type: 'wiki', title: 'Guide to Logical Decision Theory', clickbait: 'The entry point for learning about logical decision theory.', textLength: '4321', alias: 'ldt_guide', externalUrl: '', sortChildrenBy: 'likes', hasVote: 'false', voteType: '', votesAnonymous: 'false', editCreatorId: 'EliezerYudkowsky', editCreatedAt: '2016-07-08 20:41:57', pageCreatorId: 'EliezerYudkowsky', pageCreatedAt: '2016-07-08 20:39:19', seeDomainId: '0', editDomainId: '15', submitToDomainId: '0', isAutosave: 'false', isSnapshot: 'false', isLiveEdit: 'true', isMinorEdit: 'false', indirectTeacher: 'false', todoCount: '0', isEditorComment: 'false', isApprovedComment: 'true', isResolved: 'false', snapshotText: '', anchorContext: '', anchorText: '', anchorOffset: '0', mergedInto: '', isDeleted: 'false', viewCount: '97', text: '"Logical decision theory" is a new family of decision theories, of varying levels of formalization, which are argued to have key implications for *theories of economic rationality,* the design of sufficiently advanced *machine intelligence* algorithms, and of course the *philosophy of rational decision.* Logical decision theories impinge on whether it's rational to vote in elections, or rational to give in to blackmail, or [how computational agents should play in dilemmas if the agents have common knowledge of each other's source code](https://arxiv.org/abs/1401.5577).\n\nThe new idea in logical decision theories can be glossed as "choose as though you were choosing the logical output of your decision algorithm".\n\nBefore continuing, please help this page adapt to you by telling it a couple of things about yourself!\n\n[multiple-choice(q_dt_background): What's your primary background with respect to decision theory? How should we initially approach this subject?\na: An economic standpoint. Please start by telling me how this is relevant to economically rational agents deciding whether to vote in elections.\nknows: [dt_economics]\nwants: [dt_economics_intro]\n-wants: [dt_compsci_intro], [dt_newcomblike_intro], [dt_normal_intro], [dt_altruist_intro]\nb: A computer science or Artificial Intelligence standpoint. Start by talking to me about programs and code.\nknows: [dt_compsci]\nwants: [dt_compsci_intro]\n-wants: [dt_economics_intro], [dt_newcomblike_intro], [dt_normal_intro], [dt_altruist_intro]\nc: The standpoint of analytic philosophy. Talk to me about Newcomblike problems and why this new decision theory is better than a dozen other contenders.\nknows: [dt_philosophy]\nwants: [dt_newcomblike_intro]\n-wants: [dt_economics_intro], [dt_compsci_intro], [dt_normal_intro], [dt_altruist_intro]\nd: I'm just somebody off the Internet. Can you explain to me from scratch what's going on?\nwants: [dt_normal_intro]\n-wants: [dt_economics_intro], [dt_compsci_intro], [dt_newcomblike_intro], [dt_altruist_intro]\ne: I'm an effective altruist or philanthropic grantmaker. 
I'm mainly wondering how this mysterious work relates to humanity's big picture and why it was worth whatever funding it received.\nwants: [dt_altruist_intro]\n-wants: [dt_economics_intro], [dt_compsci_intro], [dt_newcomblike_intro], [dt_normal_intro]\n]\n\n[multiple-choice(q_dt_math): What level of math should we throw at you?\na: As little math as possible, please.\n-knows: [1r5], [1r6], [1r7]\nb: Normal algebra is fine. Please be careful about how you toss around large formulas full of Greek letters.\nknows: [1r5], [1lx]\n-knows: [1r7]\nc: I'm confident in my ability to deal with formulas, and you can go through them quickly.\nknows: [1r5], [1r6], [1lx]\n]\n\n%%knows-requisite([dt_philosophy]):\n[multiple-choice(q_dt_newcomblike): How much of the prior debate on decision theory are you familiar with?\na: None, I just have a general background in analytic philosophy.\n-knows: [dt_prisonersdilemma], [dt_newcomb], [dt_newcomblike]\nb: I'm familiar with the Prisoner's Dilemma.\nknows: [dt_prisonersdilemma]\n-knows: [dt_newcomb], [dt_newcomblike]\nc: I'm familiar with Newcomb's Problem and I understand its relation to the Prisoner's Dilemma.\nknows: [dt_prisonersdilemma], [dt_newcomb]\n-knows: [dt_newcomblike]\nd: I'm familiar with a significant portion of the prior debate on causal decision theory versus evidential decision theory.\nknows: [dt_prisonersdilemma], [dt_newcomb], [dt_newcomblike]\n]\n%%\n\n%%knows-requisite([dt_compsci]):\n[multiple-choice(q_dt_compsci): Are you already familiar with game theory and the Prisoner's Dilemma?\na: Nope.\n-knows: [dt_prisonersdilemma], [dt_gametheory]\nb: I'm familiar with the Prisoner's Dilemma.\nknows: [dt_prisonersdilemma]\n-knows: [dt_gametheory]\nc: I'm familiar with Nash equilibria.\nknows: [dt_prisonersdilemma], [dt_gametheory]\n]\n%%\n\n%%knows-requisite([dt_economics]):\n[multiple-choice(q_dt_compsci): Are you already familiar with game theory and the Prisoner's Dilemma?\na: Nope.\n-knows: [dt_prisonersdilemma], [dt_gametheory]\nb: I'm familiar with the Prisoner's Dilemma.\nknows: [dt_prisonersdilemma]\n-knows: [dt_gametheory]\nc: I understand Nash equilibria and Pareto optima, and how the Prisoner's Dilemma contrasts them.\nknows: [dt_prisonersdilemma], [dt_gametheory]\n]\n%%\n\n\n', metaText: '', isTextLoaded: 'true', isSubscribedToDiscussion: 'false', isSubscribedToUser: 'false', isSubscribedAsMaintainer: 'false', discussionSubscriberCount: '1', maintainerCount: '1', userSubscriberCount: '0', lastVisit: '', hasDraft: 'false', votes: [], voteSummary: 'null', muVoteSummary: '0', voteScaling: '0', currentUserVote: '-2', voteCount: '0', lockedVoteType: '', maxEditEver: '0', redLinkCount: '0', lockedBy: '', lockedUntil: '', nextPageId: '', prevPageId: '', usedAsMastery: 'false', proposalEditNum: '0', permissions: { edit: { has: 'false', reason: 'You don't have domain permission to edit this page' }, proposeEdit: { has: 'true', reason: '' }, delete: { has: 'false', reason: 'You don't have domain permission to delete this page' }, comment: { has: 'false', reason: 'You can't comment in this domain because you are not a member' }, proposeComment: { has: 'true', reason: '' } }, summaries: {}, creatorIds: [ 'EliezerYudkowsky' ], childIds: [], parentIds: [ 'logical_dt' ], commentIds: [], questionIds: [], tagIds: [ 'guide_meta_tag' ], relatedIds: [], markIds: [], explanations: [], learnMore: [], requirements: [], subjects: [], lenses: [], lensParentId: '', pathPages: [], learnMoreTaughtMap: {}, learnMoreCoveredMap: {}, learnMoreRequiredMap: {}, 
editHistory: {}, domainSubmissions: {}, answers: [], answerCount: '0', commentCount: '0', newCommentCount: '0', linkedMarkCount: '0', changeLogs: [ { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '16259', pageId: 'ldt_guide', userId: 'EliezerYudkowsky', edit: '0', type: 'newEditGroup', createdAt: '2016-07-08 20:53:32', auxPageId: 'DecisionTheory', oldSettingsValue: '', newSettingsValue: '58c' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '16258', pageId: 'ldt_guide', userId: 'EliezerYudkowsky', edit: '2', type: 'newEdit', createdAt: '2016-07-08 20:41:57', auxPageId: '', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '16257', pageId: 'ldt_guide', userId: 'EliezerYudkowsky', edit: '0', type: 'newAlias', createdAt: '2016-07-08 20:40:42', auxPageId: '', oldSettingsValue: '58d', newSettingsValue: 'ldt_guide' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '16254', pageId: 'ldt_guide', userId: 'EliezerYudkowsky', edit: '0', type: 'newParent', createdAt: '2016-07-08 20:39:21', auxPageId: 'logical_dt', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '16255', pageId: 'ldt_guide', userId: 'EliezerYudkowsky', edit: '0', type: 'newTag', createdAt: '2016-07-08 20:39:21', auxPageId: 'guide_meta_tag', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '16252', pageId: 'ldt_guide', userId: 'EliezerYudkowsky', edit: '1', type: 'newEdit', createdAt: '2016-07-08 20:39:19', auxPageId: '', oldSettingsValue: '', newSettingsValue: '' } ], feedSubmissions: [], searchStrings: {}, hasChildren: 'false', hasParents: 'true', redAliases: {}, improvementTagIds: [], nonMetaTagIds: [], todos: [], slowDownMap: 'null', speedUpMap: 'null', arcPageIds: 'null', contentRequests: {} }