{
  localUrl: '../page/convergent_self_modification.html',
  arbitalUrl: 'https://arbital.com/p/convergent_self_modification',
  rawJsonUrl: '../raw/3ng.json',
  likeableId: '0',
  likeableType: 'page',
  myLikeValue: '0',
  likeCount: '0',
  dislikeCount: '0',
  likeScore: '0',
  individualLikes: [],
  pageId: 'convergent_self_modification',
  edit: '4',
  editSummary: '',
  prevEdit: '3',
  currentEdit: '4',
  wasPublished: 'true',
  type: 'wiki',
  title: 'Convergent strategies of self-modification',
  clickbait: 'The strategies we\'d expect to be employed by an AI that understands the relevance of its code and hardware to achieving its goals, and which therefore has subgoals about its code and hardware.',
  textLength: '2620',
  alias: 'convergent_self_modification',
  externalUrl: '',
  sortChildrenBy: 'likes',
  hasVote: 'false',
  voteType: '',
  votesAnonymous: 'false',
  editCreatorId: 'EliezerYudkowsky',
  editCreatedAt: '2016-05-18 06:34:21',
  pageCreatorId: 'EliezerYudkowsky',
  pageCreatedAt: '2016-05-16 07:00:16',
  seeDomainId: '0',
  editDomainId: 'EliezerYudkowsky',
  submitToDomainId: '0',
  isAutosave: 'false',
  isSnapshot: 'false',
  isLiveEdit: 'true',
  isMinorEdit: 'false',
  indirectTeacher: 'false',
  todoCount: '1',
  isEditorComment: 'false',
  isApprovedComment: 'true',
  isResolved: 'false',
  snapshotText: '',
  anchorContext: '',
  anchorText: '',
  anchorOffset: '0',
  mergedInto: '',
  isDeleted: 'false',
  viewCount: '76',
  text: '[summary:  An AI which [9h reasons from ends to means], and which understands how its own code and the properties of its software are relevant to achieving its goals, will [10g by default] have [10j instrumental subgoals] about its own code.\n\nThe AI might be able to modify its own code directly; or, if the code can\'t directly access itself but the AI has sufficient [3nf savviness about the bigger picture], the AI might pursue strategies like building a new agent in the environment, or using material means to operate on its own hardware (e.g., using a robot to get to the programming console).\n\nSome forms of instrumental self-modification pressure might arise even in an algorithm which isn\'t explicitly doing consequentialism about its own internals, if some internal property $X$ is optimized over as a side effect of optimizing for some other property $Y.$]\n\nAny [9h consequentialist agent] which has acquired sufficient [3nf big-picture savviness] to understand that it has code, and that this code is relevant to achieving its goals, would [10g by default] acquire subgoals relating to its code.  (Unless this default is [2vk averted].)  For example, an agent that wants (only) to produce smiles or make paperclips, whose code contains a shutdown procedure, [2xd will not want this shutdown procedure to execute] because executing it will lead to fewer future smiles or paperclips.  (This preference is not spontaneous/exogenous/unnatural but arises from the execution of the code itself; the code is [2rb reflectively inconsistent].)\n\nBesides agents whose policy options directly include self-modification, a big-picture-savvy agent whose code cannot directly access itself might also, e.g., try to (a) crack the platform it is running on to gain unintended access, (b) use a robot to operate an outside programming console with special privileges, (c) [309 manipulate the programmers] into modifying it in various ways, (d) build a new subagent in the environment which has the preferred code, or (e) use environmental, material means to manipulate its material embodiment despite its lack of direct self-access.\n\nAn AI with sufficient big-picture savviness to understand its programmers as agents with beliefs might attempt to [3cq conceal its self-modifications].\n\nSome implicit self-modification pressures could arise from [ implicit consequentialism] in cases where the AI is optimizing for $Y$ and there is an internal property $X$ which is relevant to the achievement of $Y.$  In this case, optimizing for $Y$ could implicitly optimize over the internal property $X$ even if the AI lacks an explicit model of how $X$ affects $Y.$',
  metaText: '',
  isTextLoaded: 'true',
  isSubscribedToDiscussion: 'false',
  isSubscribedToUser: 'false',
  isSubscribedAsMaintainer: 'false',
  discussionSubscriberCount: '1',
  maintainerCount: '1',
  userSubscriberCount: '0',
  lastVisit: '',
  hasDraft: 'false',
  votes: [],
  voteSummary: 'null',
  muVoteSummary: '0',
  voteScaling: '0',
  currentUserVote: '-2',
  voteCount: '0',
  lockedVoteType: '',
  maxEditEver: '0',
  redLinkCount: '0',
  lockedBy: '',
  lockedUntil: '',
  nextPageId: '',
  prevPageId: '',
  usedAsMastery: 'false',
  proposalEditNum: '0',
  permissions: {
    edit: {
      has: 'false',
      reason: 'You don\'t have domain permission to edit this page'
    },
    proposeEdit: {
      has: 'true',
      reason: ''
    },
    delete: {
      has: 'false',
      reason: 'You don\'t have domain permission to delete this page'
    },
    comment: {
      has: 'false',
      reason: 'You can\'t comment in this domain because you are not a member'
    },
    proposeComment: {
      has: 'true',
      reason: ''
    }
  },
  summaries: {},
  creatorIds: [
    'EliezerYudkowsky',
    'EricRogstad'
  ],
  childIds: [],
  parentIds: [
    'convergent_strategies'
  ],
  commentIds: [
    '3nl'
  ],
  questionIds: [],
  tagIds: [],
  relatedIds: [],
  markIds: [],
  explanations: [],
  learnMore: [],
  requirements: [],
  subjects: [],
  lenses: [],
  lensParentId: '',
  pathPages: [],
  learnMoreTaughtMap: {},
  learnMoreCoveredMap: {},
  learnMoreRequiredMap: {},
  editHistory: {},
  domainSubmissions: {},
  answers: [],
  answerCount: '0',
  commentCount: '0',
  newCommentCount: '0',
  linkedMarkCount: '0',
  changeLogs: [
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '10607',
      pageId: 'convergent_self_modification',
      userId: 'EliezerYudkowsky',
      edit: '4',
      type: 'newEdit',
      createdAt: '2016-05-18 06:34:21',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '10606',
      pageId: 'convergent_self_modification',
      userId: 'EliezerYudkowsky',
      edit: '3',
      type: 'newEdit',
      createdAt: '2016-05-18 06:33:46',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '10517',
      pageId: 'convergent_self_modification',
      userId: 'EricRogstad',
      edit: '2',
      type: 'newEdit',
      createdAt: '2016-05-16 18:57:24',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '10479',
      pageId: 'convergent_self_modification',
      userId: 'EliezerYudkowsky',
      edit: '1',
      type: 'newEdit',
      createdAt: '2016-05-16 07:00:16',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '10477',
      pageId: 'convergent_self_modification',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'deleteTag',
      createdAt: '2016-05-16 06:51:17',
      auxPageId: 'stub_meta_tag',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '10474',
      pageId: 'convergent_self_modification',
      userId: 'EliezerYudkowsky',
      edit: '1',
      type: 'newTag',
      createdAt: '2016-05-16 06:20:23',
      auxPageId: 'stub_meta_tag',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '10473',
      pageId: 'convergent_self_modification',
      userId: 'EliezerYudkowsky',
      edit: '1',
      type: 'newParent',
      createdAt: '2016-05-16 06:20:18',
      auxPageId: 'convergent_strategies',
      oldSettingsValue: '',
      newSettingsValue: ''
    }
  ],
  feedSubmissions: [],
  searchStrings: {},
  hasChildren: 'false',
  hasParents: 'true',
  redAliases: {},
  improvementTagIds: [],
  nonMetaTagIds: [],
  todos: [],
  slowDownMap: 'null',
  speedUpMap: 'null',
  arcPageIds: 'null',
  contentRequests: {}
}