{
  localUrl: '../page/1jl.html',
  arbitalUrl: 'https://arbital.com/p/1jl',
  rawJsonUrl: '../raw/1jl.json',
  likeableId: '491',
  likeableType: 'page',
  myLikeValue: '0',
  likeCount: '0',
  dislikeCount: '0',
  likeScore: '0',
  individualLikes: [],
  pageId: '1jl',
  edit: '1',
  editSummary: '',
  prevEdit: '0',
  currentEdit: '1',
  wasPublished: 'true',
  type: 'comment',
  title: '"> on my view it seems extre..."',
  clickbait: '',
  textLength: '2715',
  alias: '1jl',
  externalUrl: '',
  sortChildrenBy: 'recentFirst',
  hasVote: 'false',
  voteType: '',
  votesAnonymous: 'false',
  editCreatorId: 'PaulChristiano',
  editCreatedAt: '2016-01-04 00:06:04',
  pageCreatorId: 'PaulChristiano',
  pageCreatedAt: '2016-01-04 00:06:04',
  seeDomainId: '0',
  editDomainId: 'EliezerYudkowsky',
  submitToDomainId: '0',
  isAutosave: 'false',
  isSnapshot: 'false',
  isLiveEdit: 'true',
  isMinorEdit: 'false',
  indirectTeacher: 'false',
  todoCount: '0',
  isEditorComment: 'false',
  isApprovedComment: 'true',
  isResolved: 'false',
  snapshotText: '',
  anchorContext: '',
  anchorText: '',
  anchorOffset: '0',
  mergedInto: '',
  isDeleted: 'false',
  viewCount: '1422',
  text: '> on my view it seems extremely probable that, whatever we have in the way of AI algorithms short of full FAI creating other AI algorithms, they'll be helping out not at all with alignment and control\n\nYou often say this, but I'm obviously not yet convinced.\n\nAs I see it, the biggest likely gap is that you can empirically validate work in AI, but maybe cannot validate work on alignment/control except by consulting a human. This is problematic if either human feedback ends up being a major cost/obstacle (e.g. because AI systems are extremely cheap/fast, or because they are too far beyond humans for humans to provide meaningful oversight), or if task definitions that involve human feedback end up being harder by virtue of being mushier goals that don't line up as well with the actual structure of reality.\n\nThese objections are more plausible for establishing that control work is a comparative advantage of humans. In that context I would accept them as plausible arguments, though I think there is a pretty good chance of working around them.\n\nBut those considerations don't seem to imply that AI will help out "not at all." It seems pretty plausible that you are drawing on some other intuitions that I haven't considered.\n\nAnother possible gap is that control may just be harder than capabilities. But in that case the development of AI wouldn't really change the game; it would just make the game go faster, so this doesn't seem relevant to the present discussion. (If humans can solve the control problem anyway, humans+AI systems would have a comparable chance.)\n\nAnother possible gap is that there are many more iterations of AI design, and a failure at any time cascades into future iterations. I've pointed out that there can't be many big productivity improvements before any earlier thinking about AI is thoroughly obsolete, but I'm certainly willing to grant that forcing control to keep up for a while does make the problem materially harder (more so the more that our solutions to the control problem are closely tied to details of the AI systems we are building). I agree that sticking with the same AI designs for longer can in some respects make the control problem easier. But it seems like you are talking about a difference in kind for safety work, rather than another way to slightly improve safety at the expense of efficacy.\n\nNote: I'm saying that if you can solve the AI control/alignment problem for the AI systems in year N, then the involvement of those AI systems in subsequent AI design doesn't exert a significant additional pressure that makes it harder to solve the control/alignment problem in year N+1. It seems like this is the relevant question in the context of the OP.',
  metaText: '',
  isTextLoaded: 'true',
  isSubscribedToDiscussion: 'false',
  isSubscribedToUser: 'false',
  isSubscribedAsMaintainer: 'false',
  discussionSubscriberCount: '0',
  maintainerCount: '0',
  userSubscriberCount: '0',
  lastVisit: '2016-02-21 14:22:30',
  hasDraft: 'false',
  votes: [],
  voteSummary: 'null',
  muVoteSummary: '0',
  voteScaling: '0',
  currentUserVote: '-2',
  voteCount: '0',
  lockedVoteType: '',
  maxEditEver: '0',
  redLinkCount: '0',
  lockedBy: '',
  lockedUntil: '',
  nextPageId: '',
  prevPageId: '',
  usedAsMastery: 'false',
  proposalEditNum: '0',
  permissions: {
    edit: {
      has: 'false',
      reason: 'You don't have domain permission to edit this page'
    },
    proposeEdit: {
      has: 'true',
      reason: ''
    },
    delete: {
      has: 'false',
      reason: 'You don't have domain permission to delete this page'
    },
    comment: {
      has: 'false',
      reason: 'You can't comment in this domain because you are not a member'
    },
    proposeComment: {
      has: 'true',
      reason: ''
    }
  },
  summaries: {},
  creatorIds: [
    'PaulChristiano'
  ],
  childIds: [],
  parentIds: [
    'KANSI',
    '1gp'
  ],
  commentIds: [],
  questionIds: [],
  tagIds: [],
  relatedIds: [],
  markIds: [],
  explanations: [],
  learnMore: [],
  requirements: [],
  subjects: [],
  lenses: [],
  lensParentId: '',
  pathPages: [],
  learnMoreTaughtMap: {},
  learnMoreCoveredMap: {},
  learnMoreRequiredMap: {},
  editHistory: {},
  domainSubmissions: {},
  answers: [],
  answerCount: '0',
  commentCount: '0',
  newCommentCount: '0',
  linkedMarkCount: '0',
  changeLogs: [
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '4946',
      pageId: '1jl',
      userId: 'PaulChristiano',
      edit: '1',
      type: 'newEdit',
      createdAt: '2016-01-04 00:06:04',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '4935',
      pageId: '1jl',
      userId: 'PaulChristiano',
      edit: '0',
      type: 'newParent',
      createdAt: '2016-01-03 23:35:04',
      auxPageId: 'KANSI',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '4937',
      pageId: '1jl',
      userId: 'PaulChristiano',
      edit: '0',
      type: 'newParent',
      createdAt: '2016-01-03 23:35:04',
      auxPageId: '1gp',
      oldSettingsValue: '',
      newSettingsValue: ''
    }
  ],
  feedSubmissions: [],
  searchStrings: {},
  hasChildren: 'false',
  hasParents: 'true',
  redAliases: {},
  improvementTagIds: [],
  nonMetaTagIds: [],
  todos: [],
  slowDownMap: 'null',
  speedUpMap: 'null',
  arcPageIds: 'null',
  contentRequests: {}
}