# Epistemic exclusion

*How would you build an AI that, no matter what else it learned about the world, never knew or wanted to know what was inside your basement?*

An "epistemic exclusion" would be a hypothetical form of AI limitation that made the AI not model (and, if reflectively stable, not want to model) some particular part of physical or mathematical reality, or model it only using some restricted model class that didn't allow for the maximum possible predictive accuracy. For example, a [102 behaviorist genie] would not want to model human minds (except using a tightly restricted model class) in order to avoid [6v], [programmer_manipulation], and other possible problems.

At present, nobody has investigated how to do this (in any reflectively stable way), and there are all sorts of obvious problems stemming from the fact that, in reality, most facts are linked to a significant number of other facts. How would you make an AI that was really good at predicting everything else in the world, but didn't know or want to know what was inside your basement? Intuitively, it seems likely that many naive solutions would just cause the AI to *de facto* end up constructing something that wasn't technically a model of your basement, but played the same role as a model of your basement, in order to maximize predictive accuracy about everything that wasn't your basement. We could similarly ask how it would be possible to build a really good mathematician that never knew or cared whether 333 was a prime number: would it also have to ignore the 'casting out nines' procedure whenever it saw 333 as a decimal number (the digit sum 9 already shows that 333 is divisible by 3), what would happen if we asked it to multiply 3 by (100 + 10 + 1), and so on?
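To make the fact-entanglement problem concrete, here is a toy sketch (not from the original article; the names and routes are chosen purely for illustration) of how the "excluded" fact about 333 is pinned down by several ordinary arithmetic facts the reasoner would presumably want to keep: casting out nines, the distributive law, and trial division. Any one of these inference routes reconstructs the excluded answer.

```python
# Toy illustration: an "excluded" fact is entailed by many ordinary facts,
# so naively refusing to evaluate it directly does not keep it hidden.

N = 333  # the fact we are pretending to exclude: "is N prime?"

def digit_sum(n: int) -> int:
    """Casting out nines: the digit sum of 333 is 9."""
    return sum(int(d) for d in str(n))

# Route 1: casting out nines. A digit sum divisible by 3 means N is divisible
# by 3, so 333 (being larger than 3) cannot be prime.
route_1 = digit_sum(N) % 3 == 0

# Route 2: the distributive law. 3 * (100 + 10 + 1) = 333 directly exhibits
# a nontrivial factorization.
route_2 = 3 * (100 + 10 + 1) == N

# Route 3: ordinary trial division, run for any unrelated purpose.
route_3 = any(N % d == 0 for d in range(2, int(N ** 0.5) + 1))

# Every route independently reconstructs the "excluded" answer.
assert route_1 and route_2 and route_3
print("333 is composite; each inference route leaks the excluded fact.")
```

A real exclusion would have to block or quarantine all such routes at once, which is the difficulty the paragraph above is pointing at.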
That said, most *practical* reasons to create an epistemic exclusion (e.g. [102 against modeling humans in too much detail], or [1fz against modeling distant alien civilizations and superintelligences]) would come with some level of in-practice exclusion that was *good enough* for the purpose at hand, which might not require, e.g., maximum predictive accuracy about everything else combined with zero predictive accuracy about the excluded domain.
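As a deliberately naive illustration of what an in-practice attempt might look like (a minimal sketch under assumptions of my own, not a proposal from this page), consider a predictor whose training objective simply masks the excluded variables out of its loss. The second paragraph above explains why this is not, by itself, a real exclusion: nothing stops the model from representing the excluded region internally whenever doing so helps predict everything else.

```python
# Minimal sketch (illustrative assumption, not a method from the article):
# a training objective that ignores an "excluded" slice of the world state.
# The model's latent state can still encode that slice if it helps predict
# the rest -- the de-facto-model failure mode described above.

from dataclasses import dataclass

@dataclass
class MaskedObjective:
    excluded: set  # indices of world-state variables we try to exclude

    def loss(self, prediction: list, observation: list) -> float:
        """Squared error over every variable *except* the excluded ones."""
        return sum(
            (p - o) ** 2
            for i, (p, o) in enumerate(zip(prediction, observation))
            if i not in self.excluded
        )

# Usage: variable 2 is "inside the basement"; its error never enters the loss,
# but correlated variables 0 and 1 can still push the model to represent it.
objective = MaskedObjective(excluded={2})
print(objective.loss([1.0, 2.0, 9.9], [1.0, 2.5, 0.0]))  # -> 0.25
```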