{
  localUrl: '../page/Vingean_uncertainty.html',
  arbitalUrl: 'https://arbital.com/p/Vingean_uncertainty',
  rawJsonUrl: '../raw/9g.json',
  likeableId: '2450',
  likeableType: 'page',
  myLikeValue: '0',
  likeCount: '2',
  dislikeCount: '0',
  likeScore: '2',
  individualLikes: [
    'EliezerYudkowsky',
    'JacksonFriess'
  ],
  pageId: 'Vingean_uncertainty',
  edit: '13',
  editSummary: '',
  prevEdit: '12',
  currentEdit: '13',
  wasPublished: 'true',
  type: 'wiki',
  title: 'Vingean uncertainty',
  clickbait: 'You can't predict the exact actions of an agent smarter than you - so is there anything you _can_ say about them?',
  textLength: '10842',
  alias: 'Vingean_uncertainty',
  externalUrl: '',
  sortChildrenBy: 'likes',
  hasVote: 'false',
  voteType: '',
  votesAnonymous: 'false',
  editCreatorId: 'EliezerYudkowsky',
  editCreatedAt: '2016-06-21 01:55:06',
  pageCreatorId: 'EliezerYudkowsky',
  pageCreatedAt: '2015-07-01 19:53:31',
  seeDomainId: '0',
  editDomainId: 'EliezerYudkowsky',
  submitToDomainId: '0',
  isAutosave: 'false',
  isSnapshot: 'false',
  isLiveEdit: 'true',
  isMinorEdit: 'false',
  indirectTeacher: 'false',
  todoCount: '1',
  isEditorComment: 'false',
  isApprovedComment: 'true',
  isResolved: 'false',
  snapshotText: '',
  anchorContext: '',
  anchorText: '',
  anchorOffset: '0',
  mergedInto: '',
  isDeleted: 'false',
  viewCount: '302',
  text: '[summary:  [1c0] says that you (usually) can't predict *exactly* what an entity smarter than you will do, because if you knew exactly what a smart agent would do, you would be at least that smart yourself.  If you can predict exactly what move [1bx Deep Blue] will make on a chessboard, you can play chess as well as Deep Blue by moving to the same place you predict Deep Blue would.\n\nThis doesn't mean Deep Blue's programmers were ignorant of all aspects of their creation.  They understood where Deep Blue was working to steer the board's future - that Deep Blue was trying to win (rather than lose) chess games.\n\n"Vingean uncertainty" is the epistemic state we enter into when we consider an agent too smart for us to predict its exact actions.  In particular, we will probably become *more* confident of the agent achieving its goals - that is, become more confident of which final outcomes will result from the agent's actions - even as we become *less* confident of which exact actions the agent will take.]\n\n> Of course, I never wrote the “important” story, the sequel about the first amplified human. Once I tried something similar. John Campbell’s letter of rejection began: “Sorry—you can’t write this story. Neither can anyone else.”...\n> “Bookworm, Run!” and its lesson were important to me. Here I had tried a straightforward extrapolation of technology, and found myself precipitated over an abyss. It’s a problem writers face every time we consider the creation of intelligences greater than our own. When this happens, human history will have reached a kind of singularity—a place where extrapolation breaks down and new models must be applied—and the world will pass beyond our understanding. \n> -- [Vernor Vinge](https://books.google.com/books?id=tEMQpbiboH0C&pg=PA44&lpg=PA44&dq=vinge+%22pass+beyond+our+understanding%22+%22john+campbell%22&source=bl&ots=UTTxJ7Pndr&sig=88zngfy45_he2nJePP5dd0CTuR4&hl=en&sa=X&ved=0ahUKEwjD34_wrubJAhUHzWMKHVXYAocQ6AEIHTAA#v=onepage&q=vinge%20%22pass%20beyond%20our%20understanding%22%20%22john%20campbell%22&f=false), True Names and other Dangers, p. 47.\n\nVingean unpredictability is a key part of how we think about a [9h consequentialist intelligence] which we believe is smarter than us in a domain.  In particular, we usually think we can't predict exactly what a smarter-than-us agent will do, because if we could predict that, we would be that smart ourselves ([1c0]).\n\nIf you could predict exactly what action [1bx Deep Blue] would take on a chessboard, you could play as well as Deep Blue by making whatever move you predicted Deep Blue would make.  It follows that Deep Blue's programmers necessarily sacrificed their ability to intuit Deep Blue's exact moves in advance, in the course of creating a superhuman chessplayer.\n\nBut this doesn't mean Deep Blue's programmers were confused about the criterion by which Deep Blue chose actions.  Deep Blue's programmers still knew in advance that Deep Blue would try to *win* rather than lose chess games.  They knew that Deep Blue would try to steer the chess board's future into a particular region that was high in Deep Blue's preference ordering over chess positions.  
We can predict the *consequences* of Deep Blue's moves better than we can predict the moves themselves.\n\n"Vingean uncertainty" is the peculiar epistemic state we enter when we're considering sufficiently intelligent programs; in particular, we become less confident that we can predict their exact actions, and more confident of the final outcome of those actions.\n\n(Note that this rejects the claim that we are epistemically helpless and can know nothing about beings smarter than ourselves.)\n\nFurthermore, our ability to think about agents smarter than ourselves is not limited to knowing a particular goal and predicting its achievement.  If we found a giant alien machine that seemed very well-designed, we might be able to infer the aliens were superhumanly intelligent even if we didn't know the aliens' ultimate goals.   If we saw metal pipes, we could guess that the pipes represented some stable, optimal mechanical solution which was made out of hard metal so as to retain its shape.  If we saw superconducting cables, we could guess that this was a way of efficiently transporting electrical work from one place to another, even if we didn't know what final purpose the electricity was being used for.  This is the idea behind [10g]: if we can recognize that an alien machine is efficiently harvesting and distributing energy, we might recognize it as an intelligently designed artifact in the service of *some* goal even if we don't know the goal.\n\n# Noncontainment of belief within the action probabilities\n\nWhen reasoning under Vingean uncertainty, due to our [ lack of logical omniscience], our beliefs about the consequences of the agent's actions are not fully contained in our probability distribution over the agent's actions.\n\nSuppose that on each turn of a chess game against Deep Blue, I ask you to put a probability distribution on Deep Blue's possible chess moves.  If you are a rational agent you should be able to put a [1bw well-calibrated] probability distribution on these moves - most trivially, by assigning every legal move an equal probability (if Deep Blue has 20 legal moves, and you assign each move 5% probability, you are guaranteed to be well-calibrated).\n\nNow imagine a randomized game player RandomBlue that, on each round, draws randomly from the probability distribution you'd assign to Deep Blue's move from the same chess position.  On each turn, your belief about where you'll observe RandomBlue move is equivalent to your belief about where you'd see Deep Blue move.  But your belief about the probable end of the game is very different.  (This is only possible due to your lack of logical omniscience - you lack the computing resources to map out the complete sequence of expected moves from your beliefs about each position.)\n\nIn particular, we could draw the following contrast between your reasoning about Deep Blue and your reasoning about RandomBlue:\n\n- When you see Deep Blue make a move to which you assigned a low probability, you think the rest of the game will go worse for you than you expected (that is, Deep Blue will do better than you previously expected).\n- When you see RandomBlue make a move that you assigned a low probability (i.e., a low probability that Deep Blue would make that move in that position), you expect to beat RandomBlue sooner than you previously expected (things will go worse for RandomBlue than your previous average expectation).\n\nThis reflects our belief in something like the [6s instrumental efficiency] of Deep Blue.  
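\n\nAs a minimal sketch of this asymmetry (a toy model with made-up hidden payoffs, not real chess): on each turn an agent picks one of 20 options whose payoffs it can see and we cannot.  Our per-turn prediction is 5% for every option whether the agent is a maximizer ("Deep Blue") or a uniform sampler ("RandomBlue"), yet our expectations for the total outcome differ sharply:\n\n    # Toy model (hypothetical illustration): identical per-move predictions,\n    # very different expected outcomes.  Payoffs are uniform random numbers\n    # the agent can see but the observer cannot.\n    import random\n\n    N_OPTIONS = 20   # "legal moves" per turn; the observer assigns each 5% probability\n    N_TURNS = 40     # turns per "game"\n    N_GAMES = 10000  # Monte Carlo samples\n\n    def play_game(smart):\n        total = 0.0\n        for _ in range(N_TURNS):\n            payoffs = [random.random() for _ in range(N_OPTIONS)]\n            if smart:\n                total += max(payoffs)            # "Deep Blue": the best option it can see\n            else:\n                total += random.choice(payoffs)  # "RandomBlue": a draw from the observer's distribution\n        return total\n\n    smart_scores = [play_game(smart=True) for _ in range(N_GAMES)]\n    random_scores = [play_game(smart=False) for _ in range(N_GAMES)]\n\n    print(sum(smart_scores) / N_GAMES)   # roughly 38: 40 turns * E[max of 20 draws] = 40 * 20/21\n    print(sum(random_scores) / N_GAMES)  # roughly 20: 40 turns * E[one draw] = 40 * 1/2\n\nThe per-turn move predictions for the two agents are identical, but the predicted outcomes are not; this is the sense in which our beliefs about consequences are not contained in our probability distribution over actions.\n\n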
When we estimate the probability that Deep Blue makes a move $x$, we're estimating the probability that, as Deep Blue estimated each move $y$'s expected probability of winning $EU[y]$, Deep Blue found $\\forall y \\neq x: EU[x] > EU[y]$ (neglecting the possibility of exact ties, which is unlikely with deep searches and floating-point position-value estimates).  If Deep Blue picks $z$ instead of $x$, we know that Deep Blue estimated $\\forall y \\neq z: EU[z] > EU[y]$ and in particular that Deep Blue estimated $EU[z] > EU[x]$.  This could be because the expected worth of $x$ to Deep Blue was less than expected, but for low-probability move $z$ to be better than all other moves as well implies that $z$ had an unexpectedly high value relative to our own estimates.  Thus, when Deep Blue makes a very unexpected move, we mostly expect that Deep Blue saw an unexpectedly good move that was better than what we thought was the best available move.\n\nIn contrast, when RandomBlue makes an unexpected move, we think the random number generator happened to land on a move that we justly assigned low worth, and hence we expect to defeat RandomBlue faster than we otherwise would have.\n\n# Features of Vingean reasoning\n\nSome interesting features of reasoning under Vingean uncertainty:\n\n- We may find ourselves more confident of the predicted consequences of an action than of the predicted action.\n- We may be more sure about the agent's instrumental strategies than its goals.\n- Due to our lack of logical omniscience, our beliefs about the agent's action-mediated relation to the environment are not screened off by our probability distribution over the system's probable next actions.\n    - We update on the probable consequence of an action, and on the probable consequences of other actions not taken, after observing that the agent actually outputs that action.\n- If there is a compact way to describe the previous consequences of the agent's previous actions, we might try to infer that this consequence is a *goal* of the agent.  We might then predict similar consequences in the future, even without being able to predict the agent's specific next actions.\n\nOur expectation of Vingean unpredictability in a domain may break down [9j if the domain is extremely simple and sufficiently closed].  In this case there may be an optimal play that we already know, making superhuman (unpredictable) play impossible.\n\n# Cognitive uncontainability\n\nVingean unpredictability is one of the core reasons to expect [9f cognitive uncontainability] in [2c sufficiently intelligent agents].\n\n# Vingean reflection\n\n[1c1] is reasoning about cognitive systems, especially cognitive systems very similar to yourself (including your actual self), under the constraint that you can't predict the exact future outputs.  Deep Blue's programmers, by reasoning about the way Deep Blue was searching through game trees, could arrive at a well-justified but abstract belief that Deep Blue was 'trying to win' (rather than trying to lose) and reasoning effectively to that end.\n\nIn [1c1] we need to make predictions about the consequence of operating an agent in an environment, without knowing the agent's exact future actions - presumably via reasoning on some more abstract level, somehow.  
In [-1mq], [1c0] appears in the rule that we should talk about our successor's specific actions only inside of quantifiers.\n\n"Vingean reflection" may be a much more general issue in the design of advanced cognitive systems than it might appear at first glance.  An agent reasoning about the consequences of *its current code*, or considering what will happen if it *spends another minute thinking,* can be viewed as doing Vingean reflection.  Vingean reflection can also be seen as the study of how a given agent *wants* thinking to occur in cognitive computations, which may be importantly different from how the agent *currently* thinks.  (If these two coincide, we say the agent is [1fx reflectively stable].)\n\n[1mq] is presently the main line of research trying to (slowly) get started on formalizing Vingean reflection and reflective stability.',
  metaText: '',
  isTextLoaded: 'true',
  isSubscribedToDiscussion: 'false',
  isSubscribedToUser: 'false',
  isSubscribedAsMaintainer: 'false',
  discussionSubscriberCount: '1',
  maintainerCount: '1',
  userSubscriberCount: '0',
  lastVisit: '2016-02-21 22:34:19',
  hasDraft: 'false',
  votes: [],
  voteSummary: 'null',
  muVoteSummary: '0',
  voteScaling: '0',
  currentUserVote: '-2',
  voteCount: '0',
  lockedVoteType: '',
  maxEditEver: '0',
  redLinkCount: '0',
  lockedBy: '',
  lockedUntil: '',
  nextPageId: '',
  prevPageId: '',
  usedAsMastery: 'false',
  proposalEditNum: '0',
  permissions: {
    edit: {
      has: 'false',
      reason: 'You don't have domain permission to edit this page'
    },
    proposeEdit: {
      has: 'true',
      reason: ''
    },
    delete: {
      has: 'false',
      reason: 'You don't have domain permission to delete this page'
    },
    comment: {
      has: 'false',
      reason: 'You can't comment in this domain because you are not a member'
    },
    proposeComment: {
      has: 'true',
      reason: ''
    }
  },
  summaries: {},
  creatorIds: [
    'EliezerYudkowsky',
    'NateSoares',
    'AlexeiAndreev'
  ],
  childIds: [
    'Vinge_law',
    'deep_blue'
  ],
  parentIds: [
    'advanced_agent'
  ],
  commentIds: [],
  questionIds: [],
  tagIds: [
    'work_in_progress_meta_tag',
    'stub_meta_tag'
  ],
  relatedIds: [
    'Vinge_principle',
    'Vingean_reflection'
  ],
  markIds: [],
  explanations: [],
  learnMore: [],
  requirements: [],
  subjects: [],
  lenses: [],
  lensParentId: '',
  pathPages: [],
  learnMoreTaughtMap: {},
  learnMoreCoveredMap: {},
  learnMoreRequiredMap: {},
  editHistory: {},
  domainSubmissions: {},
  answers: [],
  answerCount: '0',
  commentCount: '0',
  newCommentCount: '0',
  linkedMarkCount: '0',
  changeLogs: [
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '14238',
      pageId: 'Vingean_uncertainty',
      userId: 'EliezerYudkowsky',
      edit: '13',
      type: 'newEdit',
      createdAt: '2016-06-21 01:55:06',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '14235',
      pageId: 'Vingean_uncertainty',
      userId: 'EliezerYudkowsky',
      edit: '12',
      type: 'newEdit',
      createdAt: '2016-06-21 01:50:57',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '14233',
      pageId: 'Vingean_uncertainty',
      userId: 'EliezerYudkowsky',
      edit: '11',
      type: 'newEdit',
      createdAt: '2016-06-21 01:34:58',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '5346',
      pageId: 'Vingean_uncertainty',
      userId: 'EliezerYudkowsky',
      edit: '10',
      type: 'newEdit',
      createdAt: '2016-01-16 06:46:33',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '5338',
      pageId: 'Vingean_uncertainty',
      userId: 'NateSoares',
      edit: '9',
      type: 'newEdit',
      createdAt: '2016-01-16 04:44:37',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '5337',
      pageId: 'Vingean_uncertainty',
      userId: 'NateSoares',
      edit: '8',
      type: 'newEdit',
      createdAt: '2016-01-16 04:44:21',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '4195',
      pageId: 'Vingean_uncertainty',
      userId: 'EliezerYudkowsky',
      edit: '7',
      type: 'newUsedAsTag',
      createdAt: '2015-12-18 23:27:10',
      auxPageId: 'Vinge_principle',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '4193',
      pageId: 'Vingean_uncertainty',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'deleteChild',
      createdAt: '2015-12-18 23:27:06',
      auxPageId: 'Vinge_principle',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '4191',
      pageId: 'Vingean_uncertainty',
      userId: 'EliezerYudkowsky',
      edit: '7',
      type: 'newEdit',
      createdAt: '2015-12-18 23:25:52',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '4186',
      pageId: 'Vingean_uncertainty',
      userId: 'EliezerYudkowsky',
      edit: '6',
      type: 'newChild',
      createdAt: '2015-12-18 23:20:14',
      auxPageId: 'Vinge_principle',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '4185',
      pageId: 'Vingean_uncertainty',
      userId: 'EliezerYudkowsky',
      edit: '6',
      type: 'newEdit',
      createdAt: '2015-12-18 23:18:31',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '4184',
      pageId: 'Vingean_uncertainty',
      userId: 'EliezerYudkowsky',
      edit: '5',
      type: 'newEdit',
      createdAt: '2015-12-18 23:15:52',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '4178',
      pageId: 'Vingean_uncertainty',
      userId: 'EliezerYudkowsky',
      edit: '4',
      type: 'newChild',
      createdAt: '2015-12-18 22:25:49',
      auxPageId: 'deep_blue',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '4163',
      pageId: 'Vingean_uncertainty',
      userId: 'EliezerYudkowsky',
      edit: '4',
      type: 'newChild',
      createdAt: '2015-12-18 21:42:24',
      auxPageId: 'Vinge_law',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '4162',
      pageId: 'Vingean_uncertainty',
      userId: 'EliezerYudkowsky',
      edit: '4',
      type: 'newEdit',
      createdAt: '2015-12-18 21:42:07',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '4161',
      pageId: 'Vingean_uncertainty',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newAlias',
      createdAt: '2015-12-18 21:42:06',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '4159',
      pageId: 'Vingean_uncertainty',
      userId: 'EliezerYudkowsky',
      edit: '3',
      type: 'newUsedAsTag',
      createdAt: '2015-12-18 21:09:44',
      auxPageId: 'Vingean_reflection',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '4155',
      pageId: 'Vingean_uncertainty',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'deleteChild',
      createdAt: '2015-12-18 21:09:22',
      auxPageId: 'Vingean_reflection',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '4153',
      pageId: 'Vingean_uncertainty',
      userId: 'EliezerYudkowsky',
      edit: '3',
      type: 'newChild',
      createdAt: '2015-12-18 21:09:06',
      auxPageId: 'Vingean_reflection',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '3921',
      pageId: 'Vingean_uncertainty',
      userId: 'AlexeiAndreev',
      edit: '0',
      type: 'newAlias',
      createdAt: '2015-12-16 16:32:50',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '3922',
      pageId: 'Vingean_uncertainty',
      userId: 'AlexeiAndreev',
      edit: '3',
      type: 'newEdit',
      createdAt: '2015-12-16 16:32:50',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '3920',
      pageId: 'Vingean_uncertainty',
      userId: 'AlexeiAndreev',
      edit: '2',
      type: 'newTag',
      createdAt: '2015-12-16 16:32:46',
      auxPageId: 'stub_meta_tag',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '1127',
      pageId: 'Vingean_uncertainty',
      userId: 'AlexeiAndreev',
      edit: '1',
      type: 'newUsedAsTag',
      createdAt: '2015-10-28 03:47:09',
      auxPageId: 'work_in_progress_meta_tag',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '144',
      pageId: 'Vingean_uncertainty',
      userId: 'AlexeiAndreev',
      edit: '1',
      type: 'newParent',
      createdAt: '2015-10-28 03:46:51',
      auxPageId: 'advanced_agent',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '1679',
      pageId: 'Vingean_uncertainty',
      userId: 'EliezerYudkowsky',
      edit: '2',
      type: 'newEdit',
      createdAt: '2015-07-01 22:04:24',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '1678',
      pageId: 'Vingean_uncertainty',
      userId: 'EliezerYudkowsky',
      edit: '1',
      type: 'newEdit',
      createdAt: '2015-07-01 19:53:31',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    }
  ],
  feedSubmissions: [],
  searchStrings: {},
  hasChildren: 'true',
  hasParents: 'true',
  redAliases: {},
  improvementTagIds: [],
  nonMetaTagIds: [],
  todos: [],
  slowDownMap: 'null',
  speedUpMap: 'null',
  arcPageIds: 'null',
  contentRequests: {
    fewerWords: {
      likeableId: '3647',
      likeableType: 'contentRequest',
      myLikeValue: '0',
      likeCount: '1',
      dislikeCount: '0',
      likeScore: '1',
      individualLikes: [],
      id: '127',
      pageId: 'Vingean_uncertainty',
      requestType: 'fewerWords',
      createdAt: '2016-10-24 21:05:31'
    }
  }
}