{
  localUrl: '../page/shutdown_problem.html',
  arbitalUrl: 'https://arbital.com/p/shutdown_problem',
  rawJsonUrl: '../raw/2xd.json',
  likeableId: '1836',
  likeableType: 'page',
  myLikeValue: '0',
  likeCount: '2',
  dislikeCount: '0',
  likeScore: '2',
  individualLikes: [
    'NateSoares',
    'NickShesterin'
  ],
  pageId: 'shutdown_problem',
  edit: '3',
  editSummary: '',
  prevEdit: '2',
  currentEdit: '3',
  wasPublished: 'true',
  type: 'wiki',
  title: 'Shutdown problem',
  clickbait: 'How to build an AGI that lets you shut it down, despite the obvious fact that this will interfere with whatever the AGI's goals are.',
  textLength: '2468',
  alias: 'shutdown_problem',
  externalUrl: '',
  sortChildrenBy: 'likes',
  hasVote: 'false',
  voteType: '',
  votesAnonymous: 'false',
  editCreatorId: 'EliezerYudkowsky',
  editCreatedAt: '2017-02-13 17:39:31',
  pageCreatorId: 'EliezerYudkowsky',
  pageCreatedAt: '2016-03-28 02:03:51',
  seeDomainId: '0',
  editDomainId: 'EliezerYudkowsky',
  submitToDomainId: '0',
  isAutosave: 'false',
  isSnapshot: 'false',
  isLiveEdit: 'true',
  isMinorEdit: 'false',
  indirectTeacher: 'false',
  todoCount: '0',
  isEditorComment: 'false',
  isApprovedComment: 'true',
  isResolved: 'false',
  snapshotText: '',
  anchorContext: '',
  anchorText: '',
  anchorOffset: '0',
  mergedInto: '',
  isDeleted: 'false',
  viewCount: '197',
  text: '[summary: The 'shutdown problem' is the problem of creating a [7g1 sufficiently advanced Artificial Intelligence] which will, on the press of a button, suspend itself safely to disk, such that the AI [7g0 does not object] to the existence of this button, nor [45 try to prevent it] from being pressed.\n\nThis is difficult because avoiding your own halt or suspension is a [10g convergent instrumental strategy]; even [7g2 a goal like "bring the user coffee" implies avoiding shutdown].\n\nThis problem is sometimes decomposed into (1) the [2rf problem] of finding a utility function that [47 really actually means] "Suspend yourself safely to disk", and (2) the [1b7 problem] of building an agent that *wants* to switch to optimizing a different utility function if a button is pressed, but that *doesn't* want to press that button or prevent its being pressed.\n\nSee also [1b7], [2rf], [45], and [interruptibility].]\n\nThe 'shutdown problem' is the problem of creating a [7g1 sufficiently advanced Artificial Intelligence] which will, on the press of a button, suspend itself safely to disk, such that the AI [7g0 does not object] to the existence of this button, nor [45 try to prevent it] from being pressed.\n\nThis is difficult because avoiding your own halt or suspension is a [10g convergent instrumental strategy]; even [7g2 a goal like "bring the user coffee" implies avoiding shutdown].\n\nOne possible research avenue is to:\n\n 1. Describe a '[2rf shutdown utility function]' whose [7t9 attainable optimum] is "Suspend safely to disk (and possibly [2rg safely abort] plans in progress), without building a hidden environmental copy of yourself to carry out your previous goal and then assimilate all matter in the universe to make absolutely sure the original AI stays shut down".\n 2. Find [1b7 a sensible way to compose] the shutdown utility function with the agent's regular utility function, such that:\n    - (2a) Which utility function the agent optimizes depends on whether a switch was pressed.\n    - (2b) The AI experiences no incentive to cause the switch to be pressed or to prevent it from being pressed.\n\nHarder versions of this problem would add specifications such as:\n\n- The AI [nonadversarial_safety positively wants] the switch to exist and will restore the switch if it is damaged.\n- The AI will [2rg safely abort] any plans in progress (and design plans such that they can be safely aborted).\n\nSee also [1b7], [2rf], [45], [interruptibility], [2pf], and [2rg].',
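  // Illustrative sketch of step 2 in the text above, in the spirit of this page's
  // 'utility_indifference' tag. The notation here ($U_{\text{normal}}$, $U_{\text{shutdown}}$,
  // the switch-pressed event $B$, and the constant $C$) is assumed for the sketch, not defined
  // on this page. One candidate composition is:
  //
  //   $$U(o) = \begin{cases} U_{\text{normal}}(o) & \text{if } \neg B \\ U_{\text{shutdown}}(o) + C & \text{if } B \end{cases}$$
  //
  // with $C$ chosen, e.g. $C = \mathbb{E}[U_{\text{normal}} \mid \neg B] - \mathbb{E}[U_{\text{shutdown}} \mid \neg B]$,
  // so that the agent's expected utility is the same whether or not the switch is pressed.
  // The case split gives property (2a); the equalizing constant aims at (2b) by removing any
  // incentive to cause or prevent the press. Whether a composition along these lines actually
  // secures (2b) is part of the problem discussed at [1b7].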
  metaText: '',
  isTextLoaded: 'true',
  isSubscribedToDiscussion: 'false',
  isSubscribedToUser: 'false',
  isSubscribedAsMaintainer: 'false',
  discussionSubscriberCount: '1',
  maintainerCount: '1',
  userSubscriberCount: '0',
  lastVisit: '',
  hasDraft: 'false',
  votes: [],
  voteSummary: 'null',
  muVoteSummary: '0',
  voteScaling: '0',
  currentUserVote: '-2',
  voteCount: '0',
  lockedVoteType: '',
  maxEditEver: '0',
  redLinkCount: '0',
  lockedBy: '',
  lockedUntil: '',
  nextPageId: '',
  prevPageId: '',
  usedAsMastery: 'false',
  proposalEditNum: '0',
  permissions: {
    edit: {
      has: 'false',
      reason: 'You don't have domain permission to edit this page'
    },
    proposeEdit: {
      has: 'true',
      reason: ''
    },
    delete: {
      has: 'false',
      reason: 'You don't have domain permission to delete this page'
    },
    comment: {
      has: 'false',
      reason: 'You can't comment in this domain because you are not a member'
    },
    proposeComment: {
      has: 'true',
      reason: ''
    }
  },
  summaries: {},
  creatorIds: [
    'EliezerYudkowsky'
  ],
  childIds: [
    'no_coffee_if_dead'
  ],
  parentIds: [
    'corrigibility'
  ],
  commentIds: [],
  questionIds: [],
  tagIds: [
    'utility_indifference',
    'taskagi_open_problems',
    'shutdown_utility_function',
    'value_alignment_open_problem',
    'stub_meta_tag'
  ],
  relatedIds: [
    'updated_deference'
  ],
  markIds: [],
  explanations: [],
  learnMore: [],
  requirements: [],
  subjects: [],
  lenses: [],
  lensParentId: '',
  pathPages: [],
  learnMoreTaughtMap: {},
  learnMoreCoveredMap: {},
  learnMoreRequiredMap: {},
  editHistory: {},
  domainSubmissions: {},
  answers: [],
  answerCount: '0',
  commentCount: '0',
  newCommentCount: '0',
  linkedMarkCount: '0',
  changeLogs: [
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '21994',
      pageId: 'shutdown_problem',
      userId: 'EliezerYudkowsky',
      edit: '3',
      type: 'newEdit',
      createdAt: '2017-02-13 17:39:31',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '21691',
      pageId: 'shutdown_problem',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newChild',
      createdAt: '2017-01-16 17:54:18',
      auxPageId: 'no_coffee_if_dead',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '10608',
      pageId: 'shutdown_problem',
      userId: 'EliezerYudkowsky',
      edit: '2',
      type: 'newEdit',
      createdAt: '2016-05-18 06:35:04',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '9149',
      pageId: 'shutdown_problem',
      userId: 'EliezerYudkowsky',
      edit: '1',
      type: 'newEdit',
      createdAt: '2016-03-28 02:03:51',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    }
  ],
  feedSubmissions: [],
  searchStrings: {},
  hasChildren: 'true',
  hasParents: 'true',
  redAliases: {},
  improvementTagIds: [],
  nonMetaTagIds: [],
  todos: [],
  slowDownMap: 'null',
  speedUpMap: 'null',
  arcPageIds: 'null',
  contentRequests: {}
}