{
  localUrl: '../page/foreseeable_difficulties.html',
  arbitalUrl: 'https://arbital.com/p/foreseeable_difficulties',
  rawJsonUrl: '../raw/6r.json',
  likeableId: '2378',
  likeableType: 'page',
  myLikeValue: '0',
  likeCount: '1',
  dislikeCount: '0',
  likeScore: '1',
  individualLikes: [
    'EliezerYudkowsky'
  ],
  pageId: 'foreseeable_difficulties',
  edit: '7',
  editSummary: '',
  prevEdit: '6',
  currentEdit: '7',
  wasPublished: 'true',
  type: 'wiki',
  title: 'Methodology of foreseeable difficulties',
  clickbait: 'Building a nice AI is likely to be hard enough, and to contain enough gotchas that won't show up in the AI's early days, that we need to foresee problems coming in advance.',
  textLength: '3190',
  alias: 'foreseeable_difficulties',
  externalUrl: '',
  sortChildrenBy: 'likes',
  hasVote: 'false',
  voteType: 'approval',
  votesAnonymous: 'false',
  editCreatorId: 'MatthewGraves',
  editCreatedAt: '2016-11-23 00:34:12',
  pageCreatorId: 'EliezerYudkowsky',
  pageCreatedAt: '2015-06-09 19:56:09',
  seeDomainId: '0',
  editDomainId: 'EliezerYudkowsky',
  submitToDomainId: '0',
  isAutosave: 'false',
  isSnapshot: 'false',
  isLiveEdit: 'true',
  isMinorEdit: 'false',
  indirectTeacher: 'false',
  todoCount: '3',
  isEditorComment: 'false',
  isApprovedComment: 'true',
  isResolved: 'false',
  snapshotText: '',
  anchorContext: '',
  anchorText: '',
  anchorOffset: '0',
  mergedInto: '',
  isDeleted: 'false',
  viewCount: '248',
  text: 'Much of the current literature about value alignment centers on purported reasons to expect that certain problems will require a solution, or be difficult, or be more difficult than some people seem to expect.  The subject of this page's approval rating is this practice, considered as a policy or methodology.\n\nThe basic motivation behind trying to foresee difficulties is the large number of predicted [6q Context Change] problems in which an AI seems to behave nicely up until it reaches some threshold level of cognitive ability, and then behaves less nicely.  In some cases these problems arise without the AI having formed the intention to misbehave in advance, meaning that even transparency into the AI's thought processes during its earlier state can't save us.  This means we have to see problems of this type coming in advance.\n\n(The fact that Context Change problems of this type can be *hard* to see in advance, or that we might conceivably fail to see one, doesn't mean we can skip this duty of analysis.  Not trying to foresee them means relying on observation alone, and it seems *predictable* that eyeballing the AI while rejecting theory will fail to catch important classes of problems.)\n\n[todo: # Examples]\n\n[todo: ...most of value alignment theory, so try to pick 3 cases that illustrate the point in different ways.  Pick from Context Change?]\n\n# Arguments\n\nFor:  it's sometimes possible to strongly foresee a difficulty coming in cases where naive respondents seem to think that no difficulty exists, and in cases where the development trajectory of the agent seems to imply a potential [6q Treacherous Turn].  If there's even one real Treacherous Turn among all the cases that have been argued, then the conclusion carries that, past a certain capability threshold, you have to see the bullet coming before it actually hits you.  The theoretical analysis suggests very strongly that blindly forging ahead 'experimentally' will be fatal.  Someone with such a strong commitment to experimentalism that they want to ignore this theoretical analysis... it's not clear what we can say to them, except maybe to appeal to the normative principle of not predictably destroying the world in cases where it seems like we could have done better.\n\nAgainst:  there are no real arguments against this in the actual literature, but it would be surprising if somebody didn't claim that the foreseeable-difficulties program was too pessimistic, or inevitably ungrounded from reality and productive only of bad ideas even when refuted, etcetera.\n\nPrimary reply: look, dammit, people actually are way too optimistic about FAI, we have them on the record, [todo: find 3 prestigious examples] and it's hard to see how humanity could avoid walking directly into the whirling razor blades without better foresight of difficulty.  One potential strategy is to build enough academic respect for, and consensus on, enough really obvious foreseeable difficulties that the people claiming it will all be easy are actually asked to explain why the foreseeable-difficulty consensus is wrong, and lose respect if they can't explain it well.\n\nThis page will interact with the arguments on [108 empiricism vs. theorism is a false dichotomy].',
  metaText: '',
  isTextLoaded: 'true',
  isSubscribedToDiscussion: 'false',
  isSubscribedToUser: 'false',
  isSubscribedAsMaintainer: 'false',
  discussionSubscriberCount: '2',
  maintainerCount: '1',
  userSubscriberCount: '0',
  lastVisit: '2016-02-25 10:39:36',
  hasDraft: 'false',
  votes: [
    {
      value: '65',
      userId: 'EliezerYudkowsky',
      createdAt: '2015-06-09 19:56:25'
    }
  ],
  voteSummary: 'null',
  muVoteSummary: '0',
  voteScaling: '1',
  currentUserVote: '-2',
  voteCount: '1',
  lockedVoteType: '',
  maxEditEver: '0',
  redLinkCount: '0',
  lockedBy: '',
  lockedUntil: '',
  nextPageId: '',
  prevPageId: '',
  usedAsMastery: 'false',
  proposalEditNum: '0',
  permissions: {
    edit: {
      has: 'false',
      reason: 'You don't have domain permission to edit this page'
    },
    proposeEdit: {
      has: 'true',
      reason: ''
    },
    delete: {
      has: 'false',
      reason: 'You don't have domain permission to delete this page'
    },
    comment: {
      has: 'false',
      reason: 'You can't comment in this domain because you are not a member'
    },
    proposeComment: {
      has: 'true',
      reason: ''
    }
  },
  summaries: {},
  creatorIds: [
    'EliezerYudkowsky',
    'AlexeiAndreev',
    'MatthewGraves'
  ],
  childIds: [],
  parentIds: [
    'advanced_safety'
  ],
  commentIds: [
    '7f'
  ],
  questionIds: [],
  tagIds: [
    'work_in_progress_meta_tag'
  ],
  relatedIds: [
    'goodharts_curse'
  ],
  markIds: [],
  explanations: [],
  learnMore: [],
  requirements: [],
  subjects: [],
  lenses: [],
  lensParentId: '',
  pathPages: [],
  learnMoreTaughtMap: {},
  learnMoreCoveredMap: {},
  learnMoreRequiredMap: {},
  editHistory: {},
  domainSubmissions: {},
  answers: [],
  answerCount: '0',
  commentCount: '0',
  newCommentCount: '0',
  linkedMarkCount: '0',
  changeLogs: [
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '20367',
      pageId: 'foreseeable_difficulties',
      userId: 'MatthewGraves',
      edit: '7',
      type: 'newEdit',
      createdAt: '2016-11-23 00:34:12',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '12252',
      pageId: 'foreseeable_difficulties',
      userId: 'EliezerYudkowsky',
      edit: '6',
      type: 'newEdit',
      createdAt: '2016-06-10 00:21:14',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '12251',
      pageId: 'foreseeable_difficulties',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'turnOffVote',
      createdAt: '2016-06-10 00:21:13',
      auxPageId: '',
      oldSettingsValue: 'true',
      newSettingsValue: 'false'
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '12201',
      pageId: 'foreseeable_difficulties',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'deleteParent',
      createdAt: '2016-06-09 18:35:46',
      auxPageId: 'ai_alignment',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '3811',
      pageId: 'foreseeable_difficulties',
      userId: 'AlexeiAndreev',
      edit: '0',
      type: 'newAlias',
      createdAt: '2015-12-16 00:52:51',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '3812',
      pageId: 'foreseeable_difficulties',
      userId: 'AlexeiAndreev',
      edit: '0',
      type: 'turnOffVote',
      createdAt: '2015-12-16 00:52:51',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '3813',
      pageId: 'foreseeable_difficulties',
      userId: 'AlexeiAndreev',
      edit: '5',
      type: 'newEdit',
      createdAt: '2015-12-16 00:52:51',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '1132',
      pageId: 'foreseeable_difficulties',
      userId: 'AlexeiAndreev',
      edit: '1',
      type: 'newUsedAsTag',
      createdAt: '2015-10-28 03:47:09',
      auxPageId: 'work_in_progress_meta_tag',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '376',
      pageId: 'foreseeable_difficulties',
      userId: 'AlexeiAndreev',
      edit: '1',
      type: 'newParent',
      createdAt: '2015-10-28 03:46:51',
      auxPageId: 'ai_alignment',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '424',
      pageId: 'foreseeable_difficulties',
      userId: 'AlexeiAndreev',
      edit: '1',
      type: 'newParent',
      createdAt: '2015-10-28 03:46:51',
      auxPageId: 'advanced_safety',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '1942',
      pageId: 'foreseeable_difficulties',
      userId: 'EliezerYudkowsky',
      edit: '4',
      type: 'newEdit',
      createdAt: '2015-07-14 20:53:28',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '1941',
      pageId: 'foreseeable_difficulties',
      userId: 'EliezerYudkowsky',
      edit: '3',
      type: 'newEdit',
      createdAt: '2015-06-12 21:17:54',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '1940',
      pageId: 'foreseeable_difficulties',
      userId: 'EliezerYudkowsky',
      edit: '2',
      type: 'newEdit',
      createdAt: '2015-06-09 19:59:53',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '1939',
      pageId: 'foreseeable_difficulties',
      userId: 'EliezerYudkowsky',
      edit: '1',
      type: 'newEdit',
      createdAt: '2015-06-09 19:56:09',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    }
  ],
  feedSubmissions: [],
  searchStrings: {},
  hasChildren: 'false',
  hasParents: 'true',
  redAliases: {},
  improvementTagIds: [],
  nonMetaTagIds: [],
  todos: [],
  slowDownMap: 'null',
  speedUpMap: 'null',
  arcPageIds: 'null',
  contentRequests: {}
}