{
  localUrl: '../page/4s.html',
  arbitalUrl: 'https://arbital.com/p/4s',
  rawJsonUrl: '../raw/4s.json',
  likeableId: '2316',
  likeableType: 'page',
  myLikeValue: '0',
  likeCount: '0',
  dislikeCount: '0',
  likeScore: '0',
  individualLikes: [],
  pageId: '4s',
  edit: '7',
  editSummary: '',
  prevEdit: '6',
  currentEdit: '7',
  wasPublished: 'true',
  type: 'wiki',
  title: 'Natural language understanding of "right" will yield normativity',
  clickbait: 'What will happen if you tell an advanced agent to do the "right" thing?',
  textLength: '2179',
  alias: '4s',
  externalUrl: '',
  sortChildrenBy: 'likes',
  hasVote: 'true',
  voteType: 'probability',
  votesAnonymous: 'false',
  editCreatorId: 'AlexeiAndreev',
  editCreatedAt: '2015-12-16 04:33:27',
  pageCreatorId: 'EliezerYudkowsky',
  pageCreatedAt: '2015-04-17 01:26:24',
  seeDomainId: '0',
  editDomainId: 'EliezerYudkowsky',
  submitToDomainId: '0',
  isAutosave: 'false',
  isSnapshot: 'false',
  isLiveEdit: 'true',
  isMinorEdit: 'false',
  indirectTeacher: 'false',
  todoCount: '1',
  isEditorComment: 'false',
  isApprovedComment: 'true',
  isResolved: 'false',
  snapshotText: '',
  anchorContext: '',
  anchorText: '',
  anchorOffset: '0',
  mergedInto: '',
  isDeleted: 'false',
  viewCount: '93',
  text: "This proposition is true if you can take a cognitively powerful agent that otherwise seems competent at understanding natural language, and that has previously been trained out of infrahuman errors in natural language understanding, ask it to 'do the right thing' or 'do the right thing, defined the right way', and have its natural language understanding of 'right' yield what we would intuitively see as normativity.\n\n### Arguments\n\n[todo: expand]\n\nNatural categories have boundaries with low [5v algorithmic information] relative to the boundaries that a purely epistemic system with a simplicity prior would produce.\n\n'Unnatural' categories have value-laden boundaries.  Values have high algorithmic information because of the [1y] and [5l].  Unnatural categories appear simple to us because we perform dimensional reduction on value boundaries.  Things merely near the boundaries of unnatural categories can fall off rapidly in value because of fragility.\n\nThere's an inductive problem where, say, 18 features are important but only 17 of them vary between the positive and negative examples in the data; the induced concept is then unconstrained along the eighteenth dimension.\n\n[2w Edge instantiation] makes this worse because it tends to seek out extreme cases.\n\nThe word 'right' involves a lot of what we call 'philosophical competence', in the sense that humans figuring it out will traverse many new cognitive use-paths ('unprecedented excursions') that they didn't traverse while disambiguating blue from green.  This also holds when people reflect on how to figure out 'right'.  CDT versus UDT is an example case.\n\nThis also matters because edge instantiation on 'right', construed as persuasively-right, will produce things that humans find superpersuasive (perhaps by shoving brains onto strange new pathways).  So we can't define 'right' as that which would counterfactually cause a model of a human to agree that 'right' applies.\n\nThis keys into the inductive problem above: variation must be reflected in the data for the induced concept to cover it.\n\nBut given a complete predictive model of a human, it is then possible, though not guaranteed, that normative boundaries could be induced from examples and requests to clarify ambiguities.",
  metaText: '',
  isTextLoaded: 'true',
  isSubscribedToDiscussion: 'false',
  isSubscribedToUser: 'false',
  isSubscribedAsMaintainer: 'false',
  discussionSubscriberCount: '1',
  maintainerCount: '1',
  userSubscriberCount: '0',
  lastVisit: '2016-02-23 11:46:17',
  hasDraft: 'false',
  votes: [
    {
      value: '10',
      userId: 'EliezerYudkowsky',
      createdAt: '2015-04-17 01:28:13'
    },
    {
      value: '50',
      userId: 'PaulChristiano',
      createdAt: '2016-01-30 19:07:31'
    },
    {
      value: '50',
      userId: 'BenjyForstadt',
      createdAt: '2016-06-02 23:07:38'
    }
  ],
  voteSummary: 'null',
  muVoteSummary: '0',
  voteScaling: '2',
  currentUserVote: '-2',
  voteCount: '3',
  lockedVoteType: '',
  maxEditEver: '0',
  redLinkCount: '0',
  lockedBy: '',
  lockedUntil: '',
  nextPageId: '',
  prevPageId: '',
  usedAsMastery: 'false',
  proposalEditNum: '0',
  permissions: {
    edit: {
      has: 'false',
      reason: "You don't have domain permission to edit this page"
    },
    proposeEdit: {
      has: 'true',
      reason: ''
    },
    delete: {
      has: 'false',
      reason: "You don't have domain permission to delete this page"
    },
    comment: {
      has: 'false',
      reason: "You can't comment in this domain because you are not a member"
    },
    proposeComment: {
      has: 'true',
      reason: ''
    }
  },
  summaries: {},
  creatorIds: [
    'EliezerYudkowsky',
    'AlexeiAndreev'
  ],
  childIds: [],
  parentIds: [
    'ai_alignment'
  ],
  commentIds: [
    '1st'
  ],
  questionIds: [],
  tagIds: [
    'work_in_progress_meta_tag'
  ],
  relatedIds: [],
  markIds: [],
  explanations: [],
  learnMore: [],
  requirements: [],
  subjects: [],
  lenses: [],
  lensParentId: '',
  pathPages: [],
  learnMoreTaughtMap: {},
  learnMoreCoveredMap: {},
  learnMoreRequiredMap: {},
  editHistory: {},
  domainSubmissions: {},
  answers: [],
  answerCount: '0',
  commentCount: '0',
  newCommentCount: '0',
  linkedMarkCount: '0',
  changeLogs: [
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '3855',
      pageId: '4s',
      userId: 'AlexeiAndreev',
      edit: '7',
      type: 'newEdit',
      createdAt: '2015-12-16 04:33:28',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '3854',
      pageId: '4s',
      userId: 'AlexeiAndreev',
      edit: '6',
      type: 'newEdit',
      createdAt: '2015-12-16 04:29:51',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '1124',
      pageId: '4s',
      userId: 'AlexeiAndreev',
      edit: '1',
      type: 'newUsedAsTag',
      createdAt: '2015-10-28 03:47:09',
      auxPageId: 'work_in_progress_meta_tag',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '362',
      pageId: '4s',
      userId: 'AlexeiAndreev',
      edit: '1',
      type: 'newParent',
      createdAt: '2015-10-28 03:46:51',
      auxPageId: 'ai_alignment',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '1602',
      pageId: '4s',
      userId: 'EliezerYudkowsky',
      edit: '5',
      type: 'newEdit',
      createdAt: '2015-10-27 07:41:32',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '1601',
      pageId: '4s',
      userId: 'EliezerYudkowsky',
      edit: '4',
      type: 'newEdit',
      createdAt: '2015-06-09 19:54:12',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '1600',
      pageId: '4s',
      userId: 'EliezerYudkowsky',
      edit: '3',
      type: 'newEdit',
      createdAt: '2015-04-17 01:28:54',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '1599',
      pageId: '4s',
      userId: 'EliezerYudkowsky',
      edit: '2',
      type: 'newEdit',
      createdAt: '2015-04-17 01:27:53',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '1598',
      pageId: '4s',
      userId: 'EliezerYudkowsky',
      edit: '1',
      type: 'newEdit',
      createdAt: '2015-04-17 01:26:24',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    }
  ],
  feedSubmissions: [],
  searchStrings: {},
  hasChildren: 'false',
  hasParents: 'true',
  redAliases: {},
  improvementTagIds: [],
  nonMetaTagIds: [],
  todos: [],
  slowDownMap: 'null',
  speedUpMap: 'null',
  arcPageIds: 'null',
  contentRequests: {}
}