# Linguistic conventions in value alignment

*How and why to use precise language and words with special meaning when talking about value alignment.*

A central page to list the language conventions in [2v value alignment theory]. See also [9m].

# Language dealing with wants, desires, utility, preference, and value.

We need a language rich enough to distinguish at least the following as different [10b intensional concepts], even if their [10b extensions] end up being identical:

- A. What the programmers explicitly, verbally said they wanted to achieve by building the AI.
- B. What the programmers wordlessly, intuitively meant; the actual criterion they would use for rating the desirability of outcomes, if they could actually look at those outcomes and assign ratings.
- C. What the programmers *should* want from the AI (from within some view on normativity, shouldness, or rightness).
- D. The AI's explicitly represented cognitive preferences, if any.
- E. The property that running the AI tends to produce in the world; the property that the AI behaves in such fashion as to bring about.

So far, the following reserved terms have been advocated for the subject of value alignment:

- [55 **Value**] and **valuable** to refer to C. On views which identify C with B, the term thereby refers to B.
- **Optimization target** to mean only E. We can also say, e.g., that natural selection has an 'optimization target' of inclusive genetic fitness. 'Optimization target' is meant to be an exceedingly general term that can talk about irrational agents and nonagents.
- [109 **Utility**] to mean a Von Neumann-Morgenstern utility function, reserved to talk about agents that behave like some bounded analogue of expected utility optimizers. Utility is explicitly not assumed to be normative. E.g., if speaking of a paperclip maximizer, we will say that an outcome has higher utility iff it contains more paperclips. Thus 'utility' is reserved to refer to D or E. (A sketch illustrating this usage appears after this list.)
- **Desire** to mean anthropomorphic human-style desires, referring to A or B rather than C, D, or E. ('Wants' are general over humans and AIs.)
- **Preference** and **prefer** to be general terms that can be used for both humans and AIs. 'Preference' refers to B or D rather than A, C, or E. It means 'what the agent explicitly and cognitively wants' rather than 'what the agent should want', 'what the agent mistakenly thinks it wants', or 'what the agent's behavior tends to optimize'.
Someone can be said to prefer their extrapolated volition to be implemented rather than their current desires, but if so they must explicitly, cognitively prefer that, or accept it in an explicit choice between options.
- [5f **Preference framework**] to be an even more general term that can refer to, e.g., meta-utility functions that change based on observations, or to meta-preferences about how one's own preferences should be extrapolated. A 'preference framework' should refer to constructs more coherent than the human mass of desires and ad-hoc reflections, but not as strictly restricted as a VNM utility function. Stuart Armstrong's [ utility indifference] framework for [ value learning] is an example of a preference framework that is not a vanilla/ordinary utility function.
- **Goal** remains a generic, unreserved term that could refer to any of A-E, and also to particular things an agent wants to get done for [ instrumental] reasons.
- **[6h Intended goal]** to refer to B only.
- **Want** remains a generic, unreserved term that could refer to humans or other agents, or to terminal or instrumental goals.

'Terminal' and 'instrumental' have their standard contrasting meanings.
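To illustrate the reserved sense of 'utility' above, here is a minimal Python sketch; it is not from the original page, and the `Outcome` representation, function names, and numbers are all invented for illustration. The point is purely terminological: 'utility' here names what the agent's decision rule scores (D) or what its behavior tends to optimize (E), with no implication that the outcomes are valuable in sense C.

```python
from dataclasses import dataclass
from typing import Dict


# Hypothetical outcome representation: counts of objects in the resulting world.
# All names and numbers here are illustrative assumptions, not part of the page.
@dataclass(frozen=True)
class Outcome:
    paperclips: int
    happy_humans: int


def paperclip_utility(outcome: Outcome) -> float:
    """Utility in the reserved sense (D/E): what the agent's decision rule scores.

    A paperclip maximizer assigns higher utility iff the outcome contains more
    paperclips; human welfare does not enter the function at all.
    """
    return float(outcome.paperclips)


def expected_utility(lottery: Dict[Outcome, float]) -> float:
    """Expected utility of a probability distribution ('lottery') over outcomes."""
    return sum(p * paperclip_utility(o) for o, p in lottery.items())


# The agent prefers whichever lottery has higher expected utility -- a statement
# about its preferences (D), not about what is valuable (C).
status_quo = {Outcome(paperclips=10, happy_humans=1000): 1.0}
clip_factory = {
    Outcome(paperclips=1_000_000, happy_humans=0): 0.9,
    Outcome(paperclips=10, happy_humans=1000): 0.1,
}

if __name__ == "__main__":
    print(expected_utility(status_quo))    # 10.0
    print(expected_utility(clip_factory))  # 900001.0
```

The same scoring function could equally describe the optimization target (E) of an agent that never explicitly represents it; the reserved vocabulary above is what lets that distinction be stated without ambiguity.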