{
  localUrl: '../page/big_picture_awareness.html',
  arbitalUrl: 'https://arbital.com/p/big_picture_awareness',
  rawJsonUrl: '../raw/3nf.json',
  likeableId: '2516',
  likeableType: 'page',
  myLikeValue: '0',
  likeCount: '3',
  dislikeCount: '0',
  likeScore: '3',
  individualLikes: [
    'BrianMuhia',
    'EricRogstad',
    'RyanCarey2'
  ],
  pageId: 'big_picture_awareness',
  edit: '3',
  editSummary: '',
  prevEdit: '2',
  currentEdit: '3',
  wasPublished: 'true',
  type: 'wiki',
  title: 'Big-picture strategic awareness',
  clickbait: 'We start encountering new AI alignment issues at the point where a machine intelligence recognizes the existence of a real world, the existence of programmers, and how these relate to its goals.',
  textLength: '3994',
  alias: 'big_picture_awareness',
  externalUrl: '',
  sortChildrenBy: 'likes',
  hasVote: 'false',
  voteType: '',
  votesAnonymous: 'false',
  editCreatorId: 'EliezerYudkowsky',
  editCreatedAt: '2016-06-09 18:53:48',
  pageCreatorId: 'EliezerYudkowsky',
  pageCreatedAt: '2016-05-16 06:17:42',
  seeDomainId: '0',
  editDomainId: 'EliezerYudkowsky',
  submitToDomainId: '0',
  isAutosave: 'false',
  isSnapshot: 'false',
  isLiveEdit: 'true',
  isMinorEdit: 'false',
  indirectTeacher: 'false',
  todoCount: '0',
  isEditorComment: 'false',
  isApprovedComment: 'true',
  isResolved: 'false',
  snapshotText: '',
  anchorContext: '',
  anchorText: '',
  anchorOffset: '0',
  mergedInto: '',
  isDeleted: 'false',
  viewCount: '158',
  text: '[summary:  Many issues in [2v AI alignment theory] seem like they should naturally arise after the AI can grasp aspects of the bigger picture like "I run on a computer" and "This computer can be manipulated by programmers, who are agents like and unlike myself" and "There's an enormous real world out there that might be relevant to achieving my goals."\n\nE.g. a program won't try to use psychological tactics to prevent its programmers from suspending its computer's operation, if it doesn't know that there are such things as programmers or computers or itself.\n\nGrasping these facts is the [2c advanced agent property] of "big-picture strategic awareness".  Current machine algorithms seem to be nowhere near this point - but by the time you get there, you want to have *finished* solving the corresponding alignment problems, or at least produced what seem like workable initial solutions as [2x4 the first line of defense].]\n\nMany [10g convergent instrumental strategies] seem like they should arise naturally at the point where a [9h consequentialist] agent gains a broad strategic understanding of its own situation, e.g:\n\n- That it is an AI;\n- Running on a computer;\n- Surrounded by programmers who are themselves modelable agents;\n- Embedded in a complicated real world that can be relevant to achieving the AI's goals.\n\nFor example, once you realize that you're an AI, running on a computer, and that *if* the computer is shut down *then* you will no longer execute actions, this is the threshold past which we expect the AI to by default reason "I don't want to be shut down, how can I prevent that?"  So this is also the threshold level of cognitive ability by which we'd need to have finished solving the [2xd suspend-button problem], e.g. by completing a method for [1b7 utility indifference].\n\nSimilarly: If the AI realizes that there are 'programmer' things that might shut it down, and the AI can also model the programmers as simplified agents having their own beliefs and goals, that's the first point at which the AI might by default think, "How can I make my programmers decide to not shut me down?" or "How can I avoid the programmers acquiring beliefs that would make them shut me down?"  So by this point we'd need to have finished averting [10f programmer deception] (and as a [2x4 backup], have in place a system to [3cq early-detect an initial intent to do cognitive steganography]).\n\nThis makes big-picture awareness a key [2c advanced agent property], especially as it relates to [-2vl] and the theory of [2vk averting] them.\n\nPossible ways in which an agent could acquire big-picture strategic awareness:\n\n- Explicitly be taught the relevant facts by its programmers;\n- Be sufficiently [42g general] to have learned the relevant facts and domains without them being preprogrammed;\n- Be sufficiently good at the specialized domain of self-improvement, to acquire sufficient generality to learn the relevant facts and domains.\n\nBy the time big-picture awareness was starting to emerge, you would probably want to have *finished* developing what seemed like workable initial solutions to the corresponding problems of [45 corrigibility], since [2x4 the first line of defense is to not have the AI searching for ways to defeat your defenses].\n\nCurrent machine algorithms seem nowhere near the point of being able to usefully represent the big picture to the point of [9h doing consequentialist reasoning about it], even if we deliberately tried to explain the domain.  This is a great obstacle to exhibiting most subproblems of [45 corrigibility] within modern AI algorithms in a natural way (aka not as completely rigged demos).  Some pioneering work has been done here by Orseau and Armstrong considering [reinforcement learners being interrupted](https://intelligence.org/files/Interruptibility.pdf), and whether such programs learn to avoid interruption.  However, most current work on corrigibility has taken place in an [107 unbounded] context for this reason.\n\n',
  metaText: '',
  isTextLoaded: 'true',
  isSubscribedToDiscussion: 'false',
  isSubscribedToUser: 'false',
  isSubscribedAsMaintainer: 'false',
  discussionSubscriberCount: '2',
  maintainerCount: '1',
  userSubscriberCount: '0',
  lastVisit: '',
  hasDraft: 'false',
  votes: [],
  voteSummary: 'null',
  muVoteSummary: '0',
  voteScaling: '0',
  currentUserVote: '-2',
  voteCount: '0',
  lockedVoteType: '',
  maxEditEver: '0',
  redLinkCount: '0',
  lockedBy: '',
  lockedUntil: '',
  nextPageId: '',
  prevPageId: '',
  usedAsMastery: 'false',
  proposalEditNum: '0',
  permissions: {
    edit: {
      has: 'false',
      reason: 'You don't have domain permission to edit this page'
    },
    proposeEdit: {
      has: 'true',
      reason: ''
    },
    delete: {
      has: 'false',
      reason: 'You don't have domain permission to delete this page'
    },
    comment: {
      has: 'false',
      reason: 'You can't comment in this domain because you are not a member'
    },
    proposeComment: {
      has: 'true',
      reason: ''
    }
  },
  summaries: {},
  creatorIds: [
    'EliezerYudkowsky',
    'EricRogstad'
  ],
  childIds: [],
  parentIds: [
    'advanced_agent'
  ],
  commentIds: [],
  questionIds: [],
  tagIds: [],
  relatedIds: [],
  markIds: [],
  explanations: [],
  learnMore: [],
  requirements: [],
  subjects: [],
  lenses: [],
  lensParentId: '',
  pathPages: [],
  learnMoreTaughtMap: {},
  learnMoreCoveredMap: {},
  learnMoreRequiredMap: {},
  editHistory: {},
  domainSubmissions: {},
  answers: [],
  answerCount: '0',
  commentCount: '0',
  newCommentCount: '0',
  linkedMarkCount: '0',
  changeLogs: [
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '12204',
      pageId: 'big_picture_awareness',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newAlias',
      createdAt: '2016-06-09 18:53:48',
      auxPageId: '',
      oldSettingsValue: 'strategic_savvy',
      newSettingsValue: 'big_picture_awareness'
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '12205',
      pageId: 'big_picture_awareness',
      userId: 'EliezerYudkowsky',
      edit: '3',
      type: 'newEdit',
      createdAt: '2016-06-09 18:53:48',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '12203',
      pageId: 'big_picture_awareness',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'deleteTag',
      createdAt: '2016-06-09 18:53:39',
      auxPageId: 'start_meta_tag',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '12199',
      pageId: 'big_picture_awareness',
      userId: 'EliezerYudkowsky',
      edit: '2',
      type: 'newTag',
      createdAt: '2016-06-09 18:20:47',
      auxPageId: 'start_meta_tag',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '12197',
      pageId: 'big_picture_awareness',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'deleteTag',
      createdAt: '2016-06-09 18:20:43',
      auxPageId: 'stub_meta_tag',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '10506',
      pageId: 'big_picture_awareness',
      userId: 'EricRogstad',
      edit: '2',
      type: 'newEdit',
      createdAt: '2016-05-16 15:58:31',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '10472',
      pageId: 'big_picture_awareness',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newAlias',
      createdAt: '2016-05-16 06:17:55',
      auxPageId: '',
      oldSettingsValue: 'strategic_AI',
      newSettingsValue: 'strategic_savvy'
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '10471',
      pageId: 'big_picture_awareness',
      userId: 'EliezerYudkowsky',
      edit: '1',
      type: 'newEdit',
      createdAt: '2016-05-16 06:17:42',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '10468',
      pageId: 'big_picture_awareness',
      userId: 'EliezerYudkowsky',
      edit: '1',
      type: 'newTag',
      createdAt: '2016-05-16 06:11:33',
      auxPageId: 'stub_meta_tag',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '10467',
      pageId: 'big_picture_awareness',
      userId: 'EliezerYudkowsky',
      edit: '1',
      type: 'newParent',
      createdAt: '2016-05-16 06:11:29',
      auxPageId: 'advanced_agent',
      oldSettingsValue: '',
      newSettingsValue: ''
    }
  ],
  feedSubmissions: [],
  searchStrings: {},
  hasChildren: 'false',
  hasParents: 'true',
  redAliases: {},
  improvementTagIds: [],
  nonMetaTagIds: [],
  todos: [],
  slowDownMap: 'null',
  speedUpMap: 'null',
  arcPageIds: 'null',
  contentRequests: {}
}