{
  localUrl: '../page/2qs.html',
  arbitalUrl: 'https://arbital.com/p/2qs',
  rawJsonUrl: '../raw/2qs.json',
  likeableId: '1657',
  likeableType: 'page',
  myLikeValue: '0',
  likeCount: '0',
  dislikeCount: '0',
  likeScore: '0',
  individualLikes: [],
  pageId: '2qs',
  edit: '3',
  editSummary: '',
  prevEdit: '2',
  currentEdit: '3',
  wasPublished: 'true',
  type: 'comment',
  title: '"I think the [key question](..."',
  clickbait: '',
  textLength: '2315',
  alias: '2qs',
  externalUrl: '',
  sortChildrenBy: 'recentFirst',
  hasVote: 'false',
  voteType: '',
  votesAnonymous: 'false',
  editCreatorId: 'PaulChristiano',
  editCreatedAt: '2016-03-20 18:18:54',
  pageCreatorId: 'PaulChristiano',
  pageCreatedAt: '2016-03-20 02:56:02',
  seeDomainId: '0',
  editDomainId: 'EliezerYudkowsky',
  submitToDomainId: '0',
  isAutosave: 'false',
  isSnapshot: 'false',
  isLiveEdit: 'true',
  isMinorEdit: 'false',
  indirectTeacher: 'false',
  todoCount: '0',
  isEditorComment: 'false',
  isApprovedComment: 'true',
  isResolved: 'false',
  snapshotText: '',
  anchorContext: '',
  anchorText: '',
  anchorOffset: '0',
  mergedInto: '',
  isDeleted: 'false',
  viewCount: '354',
  text: 'I think the [key question](https://medium.com/ai-control/the-informed-oversight-problem-1b51b4f66b35#.lqcjngn6w) is whether:\n\n1. the burrito judge needs to be extremely powerful, or\n2. the burrito judge needs to be modestly more powerful than the burrito producer.\n\nIn world 1 I agree that the burrito-evaluator seems pretty tough to build. We certainly have disagreements about that case, but I'm happy to set it aside for now.\n\nIn world 2 things seem much less scary. Because [I only need to run these evaluations with e.g. 1% probability](https://medium.com/ai-control/counterfactual-human-in-the-loop-a7822e36f399), the judge can use 50x more resources than the burrito producer. So it's imaginable that the judge can be more powerful than the producer.\n\nYou seem to think that we are in world 1. I think that we are probably in world 2, but I'm certainly not sure. I discuss the issue in [this post](https://medium.com/ai-control/the-informed-oversight-problem-1b51b4f66b35#.lqcjngn6w).\n\nSome observations:\n\n* The judge's job is easier if they are evaluating steps of the plan, before those steps are taken, rather than actually letting the burrito producer take actions. So let's do it that way.\n* The judge can look at the burrito producer's computation, and at the training process that produced that computation, and can change the burrito producer's training procedure to make that computation more understandable.\n* If the judge were epistemically efficient with respect to the producer, then maximizing the judge's expectation of a burrito's quality would be the same as maximizing the burrito producer's expectation of a burrito's quality. That's basically what we want. So the real issue is narrower than you might expect: it's some kind of epistemic version of "offense vs. defense," where the producer can think particular thoughts that the judge doesn't happen to think, and so the producer might expect to be able to deceive/attack the judge even though the judge is smarter. This is what the judge is trying to avoid by looking at the producer's computation.\n\nSo I don't think that we can just ask the judge to evaluate the burrito; but the judge has enough going for her that I expect we can find some strategy that lets her win. I think this is the biggest open problem for my current approach.',
  metaText: '',
  isTextLoaded: 'true',
  isSubscribedToDiscussion: 'false',
  isSubscribedToUser: 'false',
  isSubscribedAsMaintainer: 'false',
  discussionSubscriberCount: '0',
  maintainerCount: '0',
  userSubscriberCount: '0',
  lastVisit: '',
  hasDraft: 'false',
  votes: [],
  voteSummary: 'null',
  muVoteSummary: '0',
  voteScaling: '0',
  currentUserVote: '-2',
  voteCount: '0',
  lockedVoteType: '',
  maxEditEver: '0',
  redLinkCount: '0',
  lockedBy: '',
  lockedUntil: '',
  nextPageId: '',
  prevPageId: '',
  usedAsMastery: 'false',
  proposalEditNum: '0',
  permissions: {
    edit: {
      has: 'false',
      reason: 'You don't have domain permission to edit this page'
    },
    proposeEdit: {
      has: 'true',
      reason: ''
    },
    delete: {
      has: 'false',
      reason: 'You don't have domain permission to delete this page'
    },
    comment: {
      has: 'false',
      reason: 'You can't comment in this domain because you are not a member'
    },
    proposeComment: {
      has: 'true',
      reason: ''
    }
  },
  summaries: {},
  creatorIds: [
    'PaulChristiano'
  ],
  childIds: [],
  parentIds: [
    'taskagi_open_problems',
    '2ql'
  ],
  commentIds: [],
  questionIds: [],
  tagIds: [],
  relatedIds: [],
  markIds: [],
  explanations: [],
  learnMore: [],
  requirements: [],
  subjects: [],
  lenses: [],
  lensParentId: '',
  pathPages: [],
  learnMoreTaughtMap: {},
  learnMoreCoveredMap: {},
  learnMoreRequiredMap: {},
  editHistory: {},
  domainSubmissions: {},
  answers: [],
  answerCount: '0',
  commentCount: '0',
  newCommentCount: '0',
  linkedMarkCount: '0',
  changeLogs: [
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '8856',
      pageId: '2qs',
      userId: 'PaulChristiano',
      edit: '3',
      type: 'newEdit',
      createdAt: '2016-03-20 18:18:54',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '8855',
      pageId: '2qs',
      userId: 'PaulChristiano',
      edit: '2',
      type: 'newEdit',
      createdAt: '2016-03-20 18:17:58',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '8847',
      pageId: '2qs',
      userId: 'PaulChristiano',
      edit: '1',
      type: 'newEdit',
      createdAt: '2016-03-20 02:56:02',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '8833',
      pageId: '2qs',
      userId: 'PaulChristiano',
      edit: '0',
      type: 'newParent',
      createdAt: '2016-03-20 02:10:52',
      auxPageId: 'taskagi_open_problems',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '8835',
      pageId: '2qs',
      userId: 'PaulChristiano',
      edit: '0',
      type: 'newParent',
      createdAt: '2016-03-20 02:10:52',
      auxPageId: '2ql',
      oldSettingsValue: '',
      newSettingsValue: ''
    }
  ],
  feedSubmissions: [],
  searchStrings: {},
  hasChildren: 'false',
  hasParents: 'true',
  redAliases: {},
  improvementTagIds: [],
  nonMetaTagIds: [],
  todos: [],
  slowDownMap: 'null',
  speedUpMap: 'null',
  arcPageIds: 'null',
  contentRequests: {}
}