{
  localUrl: '../page/152.html',
  arbitalUrl: 'https://arbital.com/p/152',
  rawJsonUrl: '../raw/152.json',
  likeableId: '142',
  likeableType: 'page',
  myLikeValue: '0',
  likeCount: '0',
  dislikeCount: '0',
  likeScore: '0',
  individualLikes: [],
  pageId: '152',
  edit: '7',
  editSummary: '',
  prevEdit: '6',
  currentEdit: '7',
  wasPublished: 'true',
  type: 'wiki',
  title: 'Processes of Rationality',
  clickbait: 'An idiosyncratic organization of rationality materials.',
  textLength: '5445',
  alias: '152',
  externalUrl: '',
  sortChildrenBy: 'likes',
  hasVote: 'false',
  voteType: '',
  votesAnonymous: 'false',
  editCreatorId: 'NicolasBourbaki',
  editCreatedAt: '2015-10-20 18:24:22',
  pageCreatorId: 'NicolasBourbaki',
  pageCreatedAt: '2015-10-12 06:10:22',
  seeDomainId: '0',
  editDomainId: 'MalcolmOcean.Malcolmoceanblog',
  submitToDomainId: '0',
  isAutosave: 'false',
  isSnapshot: 'false',
  isLiveEdit: 'true',
  isMinorEdit: 'false',
  indirectTeacher: 'false',
  todoCount: '0',
  isEditorComment: 'false',
  isApprovedComment: 'true',
  isResolved: 'false',
  snapshotText: '',
  anchorContext: '',
  anchorText: '',
  anchorOffset: '0',
  mergedInto: '',
  isDeleted: 'false',
  viewCount: '15',
  text: 'This page is an experiment. It may violate some conventions. I\'m sorry.\n\n(Our dithering between the authorial "I" and the authorial "we" is meaningful. Do not dismiss it or become inured to it.)\n\n## Self-Help for Animate Matter\n\nIf you are reading this page and it has the potential to influence your actions, then you are animate matter that has not reduced itself to an optimization process. This is good, maybe.\n\nWe can make the very weak assumption that you are human, or human-like enough (tulpa, *Homo sapiens* with unusual brain organization/damage/*thal*-ishness, etc.), but why should this matter? How do I know I\'m not giving you the rope you\'ll need to hang us all?\n\nA thread running through this entire exploration is the question of whether rationality (or the collection of rationality-adjacent concepts) is safe and ought to be propagated. So if you are reading this, we have evidence that you should be reading this. We just haven\'t exposed you to it yet.\n\n## Organization\n\nThis is an idiosyncratic organization of rationality materials. Rationality is here understood as optimization: systematically moving toward a goal; finding a desired option in a large space of options. Optimization in this sense is strongly entangled with the notion of a selection effect. A major goal here is to elucidate that connection. Another goal is to make the principles of rationality seem as basic and necessary as possible. Following the idea of [practical advice backed by deep theories](http://lesswrong.com/lw/d4/practical_advice_backed_by_deep_theories/), we wish to make rationality sound like a branch of physics.\n\nThe reason for this is to [dissolve](http://lesswrong.com/lw/of/dissolving_the_question/) mistakes. Our business here is largely to reduce cognitive bias. It is one thing to helpfully point out a mistake. It is quite another to dissolve the mistake. If you see someone making a mistake but you don\'t know *why* they are making it, you should be unsettled. It is all too easy to approach a problem with a simplified view and shout "Bias!" when the decisions of others differ from what you compute to be optimal. One should ask: "Is there a reason for their action which I\'ve not yet determined?" Let others try to give helpful advice; our business here is the more careful course.\n\nTherefore, before any cognitive bias is introduced, the deep theory should be in place to explain why it occurs. Following this, practical advice for removing the bias from oneself should be given. However, by this point the advice should seem almost obvious based on the deep theory. What remains is only experimental validation that the advice works.\n\nAbove all, write what is needed. We write the above as a way of defining the highest standard we can so far conceive for a text on rationality. We will not always live up to this standard. To give good advice is still better than not to give it. To explain why something is a mistake without being able to explain why the mistake is made still offers a path to improvement.\n\nThe following outline is a plan of pages, with descriptions of planned content in some cases. Some bullets may stand for messy sets of pages, i.e., portions of the outline to be filled in. Feel free to edit without worrying about stepping on anyone\'s toes.\n\n- [154 Being Strategic: The Very Idea]\n  - [15q Noticing Preferences]\n     - *Discuss the basic idea of noticing what you want and trying to get it. Discuss the research on value affirmation.*\n  - [15p Trying Things]\n     - *Optimization requires feedback. Discuss trial-and-error optimization processes, avoiding failures of the optimization feedback loop, seeking improvements by optimizing the feedback loop, and optimization processes as a special case of selection processes.*\n     - Trial and Error\n     - Exploration vs. Exploitation\n- Information and Control\n  - stuff about evolution which I need to research more\n     - *There is work connecting thermodynamics and evolution/ecology which seems like it would fit well here. This line of work starts with Schrödinger\'s book* What is Life? *Among other things, the book introduced the idea that life persists by feeding on negentropy. More recent work by Jeremy England pursues this line of thinking in a mathematical model of evolution. It\'s possible that some or all of this is nonsense, but if it turns out to be good, it seems like an interesting way to ground things in general principles, and also an excuse to talk about some information theory concepts.*\n  - Every Good Key Must Be a Model of the Lock It Opens\n     - *Another possibly interesting mathematical framework, which says something about why control systems need to accurately model the world if they do anything very complex.*\n  - Truth: The Very Idea\n     - Pragmatism\n     - Discussion of basic epistemic biases??\n- Selection Effects\n  - Selection bias; Bayesian solution\n  - Cherry-picking evidence and other sins\n  - (a bunch of biases can fit in here)\n  - Availability Heuristic\n     - Probability Neglect\n     - Base-Rate Neglect\n     - Normalcy Bias\n     - Frequency Illusion\n     - Mere Exposure Effect, Illusory Truth Effect\n     - Information Cascade, Availability Cascade\n  - Inspection Paradox\n     - (lots more specific biases can fit under this one too)\n- Tails Come Apart\n  - Optimizer\'s Curse\n  - Simultaneous Underestimation and Overestimation\n  - Principal-Agent Problem\n  - Lost Purposes',
  metaText: '',
  isTextLoaded: 'true',
  isSubscribedToDiscussion: 'false',
  isSubscribedToUser: 'false',
  isSubscribedAsMaintainer: 'false',
  discussionSubscriberCount: '1',
  maintainerCount: '1',
  userSubscriberCount: '0',
  lastVisit: '2016-02-24 01:25:38',
  hasDraft: 'false',
  votes: [],
  voteSummary: 'null',
  muVoteSummary: '0',
  voteScaling: '0',
  currentUserVote: '-2',
  voteCount: '0',
  lockedVoteType: '',
  maxEditEver: '0',
  redLinkCount: '0',
  lockedBy: '',
  lockedUntil: '',
  nextPageId: '',
  prevPageId: '',
  usedAsMastery: 'false',
  proposalEditNum: '0',
  permissions: {
    edit: {
      has: 'false',
      reason: 'You don\'t have domain permission to edit this page'
    },
    proposeEdit: {
      has: 'true',
      reason: ''
    },
    delete: {
      has: 'false',
      reason: 'You don\'t have domain permission to delete this page'
    },
    comment: {
      has: 'false',
      reason: 'You can\'t comment in this domain because you are not a member'
    },
    proposeComment: {
      has: 'true',
      reason: ''
    }
  },
  summaries: {},
  creatorIds: [
    'NicolasBourbaki'
  ],
  childIds: [
    '154',
    '15p',
    '15q'
  ],
  parentIds: [],
  commentIds: [],
  questionIds: [],
  tagIds: [],
  relatedIds: [],
  markIds: [],
  explanations: [],
  learnMore: [],
  requirements: [],
  subjects: [],
  lenses: [],
  lensParentId: '',
  pathPages: [],
  learnMoreTaughtMap: {},
  learnMoreCoveredMap: {},
  learnMoreRequiredMap: {},
  editHistory: {},
  domainSubmissions: {},
  answers: [],
  answerCount: '0',
  commentCount: '0',
  newCommentCount: '0',
  linkedMarkCount: '0',
  changeLogs: [
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '697',
      pageId: '152',
      userId: 'AlexeiAndreev',
      edit: '1',
      type: 'newChild',
      createdAt: '2015-10-28 03:46:58',
      auxPageId: '154',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '698',
      pageId: '152',
      userId: 'AlexeiAndreev',
      edit: '1',
      type: 'newChild',
      createdAt: '2015-10-28 03:46:58',
      auxPageId: '15q',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '699',
      pageId: '152',
      userId: 'AlexeiAndreev',
      edit: '1',
      type: 'newChild',
      createdAt: '2015-10-28 03:46:58',
      auxPageId: '15p',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '1705',
      pageId: '152',
      userId: 'NicolasBourbaki',
      edit: '7',
      type: 'newEdit',
      createdAt: '2015-10-20 18:24:22',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '1704',
      pageId: '152',
      userId: 'NicolasBourbaki',
      edit: '6',
      type: 'newEdit',
      createdAt: '2015-10-20 18:22:17',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '1703',
      pageId: '152',
      userId: 'NicolasBourbaki',
      edit: '5',
      type: 'newEdit',
      createdAt: '2015-10-14 21:10:12',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '1702',
      pageId: '152',
      userId: 'NicolasBourbaki',
      edit: '4',
      type: 'newEdit',
      createdAt: '2015-10-12 20:12:40',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '1701',
      pageId: '152',
      userId: 'NicolasBourbaki',
      edit: '3',
      type: 'newEdit',
      createdAt: '2015-10-12 12:07:11',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '1700',
      pageId: '152',
      userId: 'NicolasBourbaki',
      edit: '2',
      type: 'newEdit',
      createdAt: '2015-10-12 12:06:15',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '1699',
      pageId: '152',
      userId: 'NicolasBourbaki',
      edit: '1',
      type: 'newEdit',
      createdAt: '2015-10-12 06:10:22',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    }
  ],
  feedSubmissions: [],
  searchStrings: {},
  hasChildren: 'true',
  hasParents: 'false',
  redAliases: {},
  improvementTagIds: [],
  nonMetaTagIds: [],
  todos: [],
  slowDownMap: 'null',
  speedUpMap: 'null',
  arcPageIds: 'null',
  contentRequests: {}
}