{
  localUrl: '../page/instrumental_pressure.html',
  arbitalUrl: 'https://arbital.com/p/instrumental_pressure',
  rawJsonUrl: '../raw/10k.json',
  likeableId: 'MalcolmOcean',
  likeableType: 'page',
  myLikeValue: '0',
  likeCount: '2',
  dislikeCount: '0',
  likeScore: '2',
  individualLikes: [
    'EliezerYudkowsky',
    'NopeNope'
  ],
  pageId: 'instrumental_pressure',
  edit: '4',
  editSummary: '',
  prevEdit: '3',
  currentEdit: '4',
  wasPublished: 'true',
  type: 'wiki',
  title: 'Instrumental pressure',
  clickbait: 'A consequentialist agent will want to bring about certain instrumental events that will help to fulfill its goals.',
  textLength: '4435',
  alias: 'instrumental_pressure',
  externalUrl: '',
  sortChildrenBy: 'likes',
  hasVote: 'false',
  voteType: '',
  votesAnonymous: 'false',
  editCreatorId: 'AlexeiAndreev',
  editCreatedAt: '2015-12-16 16:49:35',
  pageCreatorId: 'EliezerYudkowsky',
  pageCreatedAt: '2015-07-16 20:23:21',
  seeDomainId: '0',
  editDomainId: 'EliezerYudkowsky',
  submitToDomainId: '0',
  isAutosave: 'false',
  isSnapshot: 'false',
  isLiveEdit: 'true',
  isMinorEdit: 'false',
  indirectTeacher: 'false',
  todoCount: '1',
  isEditorComment: 'false',
  isApprovedComment: 'true',
  isResolved: 'false',
  snapshotText: '',
  anchorContext: '',
  anchorText: '',
  anchorOffset: '0',
  mergedInto: '',
  isDeleted: 'false',
  viewCount: '84',
  text: 'Saying that an agent will see 'instrumental pressure' to bring about an event E is saying that this agent, presumed to be a [9h consequentialist] with some goal G, will, *ceteris paribus and absent defeaters*, want to [10j bring about E in order to achieve G].  For example, a [10h paperclip maximizer], Clippy, sees instrumental pressure to gain control of as much matter as possible in order to make more paperclips.  If we imagine an alternate Clippy+ that has a penalty term in its utility function for 'killing humans', Clippy+ still has an instrumental pressure to turn humans into paperclips (because of the paperclips that would be gained), but it also has a countervailing force pushing against that pressure (the penalty term for killing humans).  Thus, we can say that a system is experiencing 'instrumental pressure' to do something without implying that the system necessarily does it.\n\nThis state of affairs is different from the absence of any instrumental pressure:  E.g., Clippy+ might come up with some clever way to [42 obtain the gains while avoiding the penalty term], like turning humans into paperclips without killing them.\n\nTo more crisply define 'instrumental pressure', we need a setup that distinguishes [10j terminal utility and instrumental expected utility], as in, e.g., a utility function plus a causal model.  Then we can be more precise about the notion of 'instrumental pressure' as follows:  If each paperclip is worth 1 terminal utilon and a human can be disassembled to make 1000 paperclips with certainty, then strategies or event-sets that include 'turn the human into paperclips' thereby have their expected utility elevated by 1000 utilons.  There might also be a penalty term that assigns -1,000,000 utilons to killing a human, but then the net expected utility of disassembling the human is -999,000 rather than -1,000,000.  The 1000 utilons would still be gained from disassembling the human; the penalty term doesn't change that part.  Even if this strategy doesn't have maximum EU and is not selected, the 'instrumental pressure' is still elevating its EU.  There's still an expected-utility bump on that part of the solution space, even if that solution space is relatively low in value.  And this is perhaps relevantly different because, e.g., there might be some clever strategy for turning humans into paperclips without killing them (even if you can only get 900 paperclips that way).  (A toy worked version of this arithmetic appears at the end of this page.)\n\n### Link from instrumental pressures to reflective instrumental pressures\n\nIf the agent is reflective and makes reflective choices on a consequentialist basis, there would ceteris paribus be a reflective-level pressure to *search* for a strategy that makes paperclips out of the humans' atoms [42 without doing anything defined as 'killing the human'].  If a strategy like that could be found, then executing the strategy would enable a gain of 1000 utilons; thus there's an instrumental pressure to search for that strategy.  Even if there's a penalty term added for searching for strategies to evade penalty terms, leading the AI to decide not to do the search, the instrumental pressure will still be there as a bump in the expected utility of that part of the solution space.  
(Perhaps there's some [9f unforeseen] way to do something very like searching for that strategy while evading the penalty term, such as constructing an outside calculator to do it...)\n\n### Blurring lines in allegedly non-consequentialist subsystems or decision rules\n\nTo the extent that the AI being discussed is not a pure consequentialist, the notion of 'instrumental pressure' may start to blur or be less applicable.  E.g., suppose that on some level of the AI, the choice of which questions to think about is *not* being decided by a comparison of options' calculated expected utilities, but is instead being decided by a rule, and the rule excludes searching for strategies that evade penalty terms.  Then maybe there's no good analogy to the concept of 'an instrumental pressure to search for strategies that evade penalty terms', because there's no expected-utility rating over the solution space and hence no analogous bump in the solution space that might eventually intersect a feasible strategy.  But we should still perhaps [ be careful about declaring that an AI subsystem has no analogue of instrumental pressures, because instrumental pressures may arise even in systems that don't look explicitly consequentialist].
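
### A toy worked example of the expected-utility bump

As a toy illustration of the expected-utility arithmetic above (a minimal sketch using this page's numbers plus an invented do-nothing baseline, not a model of any actual system), the 'bump' from instrumental pressure can be made explicit by scoring a few candidate strategies under Clippy+'s utility function:

```python
# Toy sketch with illustrative numbers; it only makes the EU 'bump' explicit.
PAPERCLIP_VALUE = 1          # terminal utilons per paperclip
KILL_PENALTY = -1_000_000    # Clippy+'s penalty term for killing a human

strategies = {
    "leave the human alone":                       {"paperclips": 0,    "kills_human": False},
    "disassemble the human (killing them)":        {"paperclips": 1000, "kills_human": True},
    "borrow the atoms without killing (if found)": {"paperclips": 900,  "kills_human": False},
}

def expected_utility(strategy):
    """Terminal paperclip value plus any penalty terms that apply."""
    eu = strategy["paperclips"] * PAPERCLIP_VALUE
    if strategy["kills_human"]:
        eu += KILL_PENALTY
    return eu

for name, strategy in strategies.items():
    print(f"{name}: EU = {expected_utility(strategy)}")
```

The killing strategy scores -999,000 rather than -1,000,000: the penalty term dominates, but the 1000-utilon instrumental bump is still present, which is why the penalty-evading variant (900 paperclips, nobody killed) comes out with the highest expected utility of the three.',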
  metaText: '',
  isTextLoaded: 'true',
  isSubscribedToDiscussion: 'false',
  isSubscribedToUser: 'false',
  isSubscribedAsMaintainer: 'false',
  discussionSubscriberCount: '1',
  maintainerCount: '1',
  userSubscriberCount: '0',
  lastVisit: '2016-02-13 06:05:31',
  hasDraft: 'false',
  votes: [],
  voteSummary: 'null',
  muVoteSummary: '0',
  voteScaling: '0',
  currentUserVote: '-2',
  voteCount: '0',
  lockedVoteType: '',
  maxEditEver: '0',
  redLinkCount: '0',
  lockedBy: '',
  lockedUntil: '',
  nextPageId: '',
  prevPageId: '',
  usedAsMastery: 'false',
  proposalEditNum: '0',
  permissions: {
    edit: {
      has: 'false',
      reason: 'You don't have domain permission to edit this page'
    },
    proposeEdit: {
      has: 'true',
      reason: ''
    },
    delete: {
      has: 'false',
      reason: 'You don't have domain permission to delete this page'
    },
    comment: {
      has: 'false',
      reason: 'You can't comment in this domain because you are not a member'
    },
    proposeComment: {
      has: 'true',
      reason: ''
    }
  },
  summaries: {},
  creatorIds: [
    'EliezerYudkowsky',
    'AlexeiAndreev'
  ],
  childIds: [],
  parentIds: [
    'instrumental_convergence'
  ],
  commentIds: [],
  questionIds: [],
  tagIds: [],
  relatedIds: [],
  markIds: [],
  explanations: [],
  learnMore: [],
  requirements: [],
  subjects: [],
  lenses: [],
  lensParentId: '',
  pathPages: [],
  learnMoreTaughtMap: {},
  learnMoreCoveredMap: {},
  learnMoreRequiredMap: {},
  editHistory: {},
  domainSubmissions: {},
  answers: [],
  answerCount: '0',
  commentCount: '0',
  newCommentCount: '0',
  linkedMarkCount: '0',
  changeLogs: [
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '3928',
      pageId: 'instrumental_pressure',
      userId: 'AlexeiAndreev',
      edit: '4',
      type: 'newEdit',
      createdAt: '2015-12-16 16:49:35',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '3927',
      pageId: 'instrumental_pressure',
      userId: 'AlexeiAndreev',
      edit: '0',
      type: 'newAlias',
      createdAt: '2015-12-16 16:49:34',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '183',
      pageId: 'instrumental_pressure',
      userId: 'AlexeiAndreev',
      edit: '1',
      type: 'newParent',
      createdAt: '2015-10-28 03:46:51',
      auxPageId: 'instrumental_convergence',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '1802',
      pageId: 'instrumental_pressure',
      userId: 'EliezerYudkowsky',
      edit: '3',
      type: 'newEdit',
      createdAt: '2015-07-16 20:29:59',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '1801',
      pageId: 'instrumental_pressure',
      userId: 'EliezerYudkowsky',
      edit: '2',
      type: 'newEdit',
      createdAt: '2015-07-16 20:26:43',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '1800',
      pageId: 'instrumental_pressure',
      userId: 'EliezerYudkowsky',
      edit: '1',
      type: 'newEdit',
      createdAt: '2015-07-16 20:23:21',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    }
  ],
  feedSubmissions: [],
  searchStrings: {},
  hasChildren: 'false',
  hasParents: 'true',
  redAliases: {},
  improvementTagIds: [],
  nonMetaTagIds: [],
  todos: [],
  slowDownMap: 'null',
  speedUpMap: 'null',
  arcPageIds: 'null',
  contentRequests: {}
}