{
  localUrl: '../page/value_alignment_utility.html',
  arbitalUrl: 'https://arbital.com/p/value_alignment_utility',
  rawJsonUrl: '../raw/109.json',
  likeableId: '12',
  likeableType: 'page',
  myLikeValue: '0',
  likeCount: '2',
  dislikeCount: '0',
  likeScore: '2',
  individualLikes: [
    'EliezerYudkowsky',
    'EricRogstad'
  ],
  pageId: 'value_alignment_utility',
  edit: '4',
  editSummary: '',
  prevEdit: '3',
  currentEdit: '4',
  wasPublished: 'true',
  type: 'wiki',
  title: 'Utility',
  clickbait: 'What is "utility" in the context of Value Alignment Theory?',
  textLength: '2278',
  alias: 'value_alignment_utility',
  externalUrl: '',
  sortChildrenBy: 'likes',
  hasVote: 'false',
  voteType: '',
  votesAnonymous: 'false',
  editCreatorId: 'AlexeiAndreev',
  editCreatedAt: '2015-12-17 00:59:11',
  pageCreatorId: 'EliezerYudkowsky',
  pageCreatedAt: '2015-07-15 23:03:28',
  seeDomainId: '0',
  editDomainId: 'EliezerYudkowsky',
  submitToDomainId: '0',
  isAutosave: 'false',
  isSnapshot: 'false',
  isLiveEdit: 'true',
  isMinorEdit: 'false',
  indirectTeacher: 'false',
  todoCount: '6',
  isEditorComment: 'false',
  isApprovedComment: 'true',
  isResolved: 'false',
  snapshotText: '',
  anchorContext: '',
  anchorText: '',
  anchorOffset: '0',
  mergedInto: '',
  isDeleted: 'false',
  viewCount: '102',
  text: 'In the context of value alignment theory, 'Utility' always refers to a goal held by an artificial agent.  It further implies that the agent is a [9h consequentialist]; that the agent has [ probabilistic] beliefs about the consequences of its actions; that the agent has a quantitative notion of "how much better" one outcome is than another and of the relative sizes of different intervals of betterness; and that the agent can therefore, e.g., trade off large probabilities of a small utility gain against small probabilities of a large utility loss.\n\nTrue coherence in the sense of a [ von Neumann-Morgenstern utility function] may be out of reach for [ bounded agents], but the term 'utility' may also be used for the [ bounded analogues] of such decision-making, provided that quantitative relative intervals of preferability are being combined with quantitative degrees of belief to yield decisions.\n\nUtility is explicitly not assumed to be normative.  E.g., if speaking of a [10h paperclip maximizer], we will say that an outcome has higher utility iff it contains more paperclips.\n\nHumans should not be said (without further justification) to have utilities over complicated outcomes.  On the mainstream view from psychology, humans are inconsistent enough that it would take additional assumptions to translate our psychology into a coherent utility function.  E.g., we may value the interval between two outcomes differently depending on whether it is framed as a 'gain' or a 'loss'. For the things humans do or should want, see the special use of the word [55 'value'].  For a general disambiguation page on words used to talk about human and AI wants, see [5b].\n\nSome [55 construals of value], e.g. [71 reflective equilibrium], may imply that the true values form a coherent utility function.  Nonetheless, by convention, we will not speak of value as a utility unless it has been spelled out that, e.g., the value in question has been assumed to be a reflective equilibrium.\n\nMultiple agents with different utility functions should not be said (without further exposition) to have a collective utility function over outcomes, since at present, there is no accepted [ canonical way to aggregate utility functions][todo: link loudness problem].',
  metaText: '',
  isTextLoaded: 'true',
  isSubscribedToDiscussion: 'false',
  isSubscribedToUser: 'false',
  isSubscribedAsMaintainer: 'false',
  discussionSubscriberCount: '1',
  maintainerCount: '1',
  userSubscriberCount: '0',
  lastVisit: '2016-02-25 07:48:32',
  hasDraft: 'false',
  votes: [],
  voteSummary: 'null',
  muVoteSummary: '0',
  voteScaling: '0',
  currentUserVote: '-2',
  voteCount: '0',
  lockedVoteType: '',
  maxEditEver: '0',
  redLinkCount: '0',
  lockedBy: '',
  lockedUntil: '',
  nextPageId: '',
  prevPageId: '',
  usedAsMastery: 'false',
  proposalEditNum: '0',
  permissions: {
    edit: {
      has: 'false',
      reason: 'You don't have domain permission to edit this page'
    },
    proposeEdit: {
      has: 'true',
      reason: ''
    },
    delete: {
      has: 'false',
      reason: 'You don't have domain permission to delete this page'
    },
    comment: {
      has: 'false',
      reason: 'You can't comment in this domain because you are not a member'
    },
    proposeComment: {
      has: 'true',
      reason: ''
    }
  },
  summaries: {},
  creatorIds: [
    'AlexeiAndreev',
    'EliezerYudkowsky'
  ],
  childIds: [],
  parentIds: [
    '5b'
  ],
  commentIds: [],
  questionIds: [],
  tagIds: [
    'definition_meta_tag',
    'value_alignment_glossary'
  ],
  relatedIds: [],
  markIds: [],
  explanations: [],
  learnMore: [],
  requirements: [],
  subjects: [],
  lenses: [],
  lensParentId: '',
  pathPages: [],
  learnMoreTaughtMap: {},
  learnMoreCoveredMap: {},
  learnMoreRequiredMap: {},
  editHistory: {},
  domainSubmissions: {},
  answers: [],
  answerCount: '0',
  commentCount: '0',
  newCommentCount: '0',
  linkedMarkCount: '0',
  changeLogs: [
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '4011',
      pageId: 'value_alignment_utility',
      userId: 'AlexeiAndreev',
      edit: '4',
      type: 'newEdit',
      createdAt: '2015-12-17 00:59:11',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '4010',
      pageId: 'value_alignment_utility',
      userId: 'AlexeiAndreev',
      edit: '3',
      type: 'newEdit',
      createdAt: '2015-12-17 00:58:28',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '4009',
      pageId: 'value_alignment_utility',
      userId: 'AlexeiAndreev',
      edit: '0',
      type: 'newAlias',
      createdAt: '2015-12-17 00:57:27',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '1087',
      pageId: 'value_alignment_utility',
      userId: 'AlexeiAndreev',
      edit: '1',
      type: 'newUsedAsTag',
      createdAt: '2015-10-28 03:47:09',
      auxPageId: 'definition_meta_tag',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '1107',
      pageId: 'value_alignment_utility',
      userId: 'AlexeiAndreev',
      edit: '1',
      type: 'newUsedAsTag',
      createdAt: '2015-10-28 03:47:09',
      auxPageId: 'value_alignment_glossary',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '268',
      pageId: 'value_alignment_utility',
      userId: 'AlexeiAndreev',
      edit: '1',
      type: 'newParent',
      createdAt: '2015-10-28 03:46:51',
      auxPageId: '5b',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '1296',
      pageId: 'value_alignment_utility',
      userId: 'AlexeiAndreev',
      edit: '2',
      type: 'newEdit',
      createdAt: '2015-10-13 17:38:36',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '1295',
      pageId: 'value_alignment_utility',
      userId: 'EliezerYudkowsky',
      edit: '1',
      type: 'newEdit',
      createdAt: '2015-07-15 23:03:28',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    }
  ],
  feedSubmissions: [],
  searchStrings: {},
  hasChildren: 'false',
  hasParents: 'true',
  redAliases: {},
  improvementTagIds: [],
  nonMetaTagIds: [],
  todos: [],
  slowDownMap: 'null',
  speedUpMap: 'null',
  arcPageIds: 'null',
  contentRequests: {}
}
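
A minimal numeric sketch of the expected-utility tradeoff described in the `text` field above (a large probability of a small utility gain weighed against a small probability of a large utility loss). The interface names, probabilities, and utility values here are illustrative assumptions and are not part of the Arbital page itself.

// Sketch, assuming made-up numbers: an agent with probabilistic beliefs and a
// quantitative utility over outcomes compares actions by expected utility.

interface Outcome {
  probability: number; // agent's degree of belief that this outcome occurs
  utility: number;     // quantitative "betterness" of the outcome
}

type Action = Outcome[]; // an action is treated as a lottery over outcomes

// Expected utility: probability-weighted sum of outcome utilities.
function expectedUtility(action: Action): number {
  return action.reduce((sum, o) => sum + o.probability * o.utility, 0);
}

// Action A: 90% chance of a small gain (+1), 10% chance of nothing.
const actionA: Action = [
  { probability: 0.9, utility: 1 },
  { probability: 0.1, utility: 0 },
];

// Action B: 95% chance of the same small gain, 5% chance of a large loss (-30).
const actionB: Action = [
  { probability: 0.95, utility: 1 },
  { probability: 0.05, utility: -30 },
];

// A coherent expected-utility maximizer prefers the action with higher
// expected utility: 0.9 for A versus 0.95 * 1 + 0.05 * (-30) = -0.55 for B,
// so it picks A even though B offers the higher probability of some gain.
console.log(expectedUtility(actionA)); // 0.9
console.log(expectedUtility(actionB)); // -0.55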