{
  localUrl: '../page/1h6.html',
  arbitalUrl: 'https://arbital.com/p/1h6',
  rawJsonUrl: '../raw/1h6.json',
  likeableId: '448',
  likeableType: 'page',
  myLikeValue: '0',
  likeCount: '0',
  dislikeCount: '0',
  likeScore: '0',
  individualLikes: [],
  pageId: '1h6',
  edit: '2',
  editSummary: '',
  prevEdit: '1',
  currentEdit: '2',
  wasPublished: 'true',
  type: 'comment',
  title: '"> Here is my understanding ..."',
  clickbait: '',
  textLength: '2442',
  alias: '1h6',
  externalUrl: '',
  sortChildrenBy: 'recentFirst',
  hasVote: 'false',
  voteType: '',
  votesAnonymous: 'false',
  editCreatorId: 'EliezerYudkowsky',
  editCreatedAt: '2016-01-04 01:28:52',
  pageCreatorId: 'EliezerYudkowsky',
  pageCreatedAt: '2015-12-30 00:33:11',
  seeDomainId: '0',
  editDomainId: 'EliezerYudkowsky',
  submitToDomainId: '0',
  isAutosave: 'false',
  isSnapshot: 'false',
  isLiveEdit: 'true',
  isMinorEdit: 'false',
  indirectTeacher: 'false',
  todoCount: '0',
  isEditorComment: 'false',
  isApprovedComment: 'true',
  isResolved: 'false',
  snapshotText: '',
  anchorContext: '',
  anchorText: '',
  anchorOffset: '0',
  mergedInto: '',
  isDeleted: 'false',
  viewCount: '1422',
  text: '> Here is my understanding of Eliezer\'s picture (translated into my worldview): we might be able to build AI systems that are extremely good at helping us build capable AI systems, but not nearly as good at helping us solve AI alignment/control or building alignable/controllable AI.\n\nThis indeed is the class of worrisome scenarios, and one should consider (a) that Eliezer thinks aligning the rocket is harder than fueling it in general, and (b) that this was certainly true of e.g. Eurisko, which was able to get some amount of self-improvement but with all control issues kicked squarely back to Douglas Lenat.  We can also see natural selection\'s creation of humans in the same light, etcetera.  On my view it seems *extremely* probable that, whatever we have in the way of AI algorithms (short of full FAI) creating other AI algorithms, they\'ll be helping out *not at all* with alignment and control and things like reflective stability and so on.\n\nThe case where KANSI becomes important is where we get to the level where AGI becomes possible, at a point where there are *not* huge foregone advantages from the types of AI-creating-AI *where existing transparency or control work doesn\'t generalize*.  You can define a neural network undergoing gradient descent as "improving itself", but relative to current systems this doesn\'t change the algorithm to the point where we no longer understand what\'s going on.  KANSI is relevant in the scenario where we first reach possible-advanced-AGI levels at a point where an organization with *lots* of resources and maybe a realistically-sized algorithmic lead, one that *foregoes* the class of AI-improving-AI benefits that would make important subprocesses very hard to understand, is not at a disadvantage relative to a medium-sized organization with fewer resources.  This is the level where we can put a big thing together out of things vaguely analogous to deep belief networks or whatever, and just run our current algorithms or minor variations on them, and have the AI\'s representation be reasonably transparent and known so that we can monitor the AI\'s thoughts - *without* some huge amount of work having gone into making transparency reflectively stable and corrigible through self-improvement, or getting the AI to help us out with that, etcetera, because we\'re just taking known algorithms and running them on a vast amount of computing power.',
  metaText: '',
  isTextLoaded: 'true',
  isSubscribedToDiscussion: 'false',
  isSubscribedToUser: 'false',
  isSubscribedAsMaintainer: 'false',
  discussionSubscriberCount: '0',
  maintainerCount: '0',
  userSubscriberCount: '0',
  lastVisit: '2016-02-21 14:22:30',
  hasDraft: 'false',
  votes: [],
  voteSummary: 'null',
  muVoteSummary: '0',
  voteScaling: '0',
  currentUserVote: '-2',
  voteCount: '0',
  lockedVoteType: '',
  maxEditEver: '0',
  redLinkCount: '0',
  lockedBy: '',
  lockedUntil: '',
  nextPageId: '',
  prevPageId: '',
  usedAsMastery: 'false',
  proposalEditNum: '0',
  permissions: {
    edit: {
      has: 'false',
      reason: 'You don\'t have domain permission to edit this page'
    },
    proposeEdit: {
      has: 'true',
      reason: ''
    },
    delete: {
      has: 'false',
      reason: 'You don\'t have domain permission to delete this page'
    },
    comment: {
      has: 'false',
      reason: 'You can\'t comment in this domain because you are not a member'
    },
    proposeComment: {
      has: 'true',
      reason: ''
    }
  },
  summaries: {},
  creatorIds: [
    'EliezerYudkowsky'
  ],
  childIds: [],
  parentIds: [
    'KANSI',
    '1gp'
  ],
  commentIds: [],
  questionIds: [],
  tagIds: [],
  relatedIds: [],
  markIds: [],
  explanations: [],
  learnMore: [],
  requirements: [],
  subjects: [],
  lenses: [],
  lensParentId: '',
  pathPages: [],
  learnMoreTaughtMap: {},
  learnMoreCoveredMap: {},
  learnMoreRequiredMap: {},
  editHistory: {},
  domainSubmissions: {},
  answers: [],
  answerCount: '0',
  commentCount: '0',
  newCommentCount: '0',
  linkedMarkCount: '0',
  changeLogs: [
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '4957',
      pageId: '1h6',
      userId: 'EliezerYudkowsky',
      edit: '2',
      type: 'newEdit',
      createdAt: '2016-01-04 01:28:52',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '4753',
      pageId: '1h6',
      userId: 'EliezerYudkowsky',
      edit: '1',
      type: 'newEdit',
      createdAt: '2015-12-30 00:33:11',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '4750',
      pageId: '1h6',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newParent',
      createdAt: '2015-12-30 00:24:43',
      auxPageId: 'KANSI',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '4752',
      pageId: '1h6',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newParent',
      createdAt: '2015-12-30 00:24:43',
      auxPageId: '1gp',
      oldSettingsValue: '',
      newSettingsValue: ''
    }
  ],
  feedSubmissions: [],
  searchStrings: {},
  hasChildren: 'false',
  hasParents: 'true',
  redAliases: {},
  improvementTagIds: [],
  nonMetaTagIds: [],
  todos: [],
  slowDownMap: 'null',
  speedUpMap: 'null',
  arcPageIds: 'null',
  contentRequests: {}
}