{
  localUrl: '../page/explicit_bayes_counters_worry.html',
  arbitalUrl: 'https://arbital.com/p/explicit_bayes_counters_worry',
  rawJsonUrl: '../raw/1x3.json',
  likeableId: '846',
  likeableType: 'page',
  myLikeValue: '0',
  likeCount: '1',
  dislikeCount: '0',
  likeScore: '1',
  individualLikes: [
    'EricBruylant'
  ],
  pageId: 'explicit_bayes_counters_worry',
  edit: '6',
  editSummary: '',
  prevEdit: '5',
  currentEdit: '6',
  wasPublished: 'true',
  type: 'wiki',
  title: 'Explicit Bayes as a counter for \'worrying\'',
  clickbait: 'Explicitly walking through Bayes\'s Rule can summarize your knowledge and thereby stop you from bouncing around pieces of it.',
  textLength: '3965',
  alias: 'explicit_bayes_counters_worry',
  externalUrl: '',
  sortChildrenBy: 'likes',
  hasVote: 'false',
  voteType: '',
  votesAnonymous: 'false',
  editCreatorId: 'EliezerYudkowsky',
  editCreatedAt: '2016-02-08 01:13:04',
  pageCreatorId: 'EliezerYudkowsky',
  pageCreatedAt: '2016-02-08 00:47:22',
  seeDomainId: '0',
  editDomainId: 'AlexeiAndreev',
  submitToDomainId: '0',
  isAutosave: 'false',
  isSnapshot: 'false',
  isLiveEdit: 'true',
  isMinorEdit: 'false',
  indirectTeacher: 'false',
  todoCount: '0',
  isEditorComment: 'false',
  isApprovedComment: 'true',
  isResolved: 'false',
  snapshotText: '',
  anchorContext: '',
  anchorText: '',
  anchorOffset: '0',
  mergedInto: '',
  isDeleted: 'false',
  viewCount: '117',
  text: '[summary:  By making up plausible Bayesian probabilities, we can summarize all of our knowledge about a problem in the same place at the same time, rather than \'worrying\' by bouncing back and forth between different parts of it.  To use one real-life example, if an OKCupid match cancels your first date without explanation, then you might be tempted to \'worry\' by bouncing your focus back and forth between possible valid reasons to cancel the date, versus possible ways it was a bad sign.  Alternatively, you could estimate [1rm prior] [1rb odds] of 2 : 5 for desirability vs. undesirability of a 96% OKCupid match, a 1 : 3 [1rq likelihood ratio] for desirable vs. undesirable men flaking on the first date, decide that the [1rp posterior] odds of 2 : 15 don\'t warrant further pursuit, and then be done.  The point isn\'t that the made-up numbers are exact, but that they summarize the relevant weights into a single place and calculation, terminating the \'worry\' cycle of alternating attention on particular pieces, and letting us be done.]\n\nOne of the possible [1x2 human uses of explicit Bayesian reasoning] is that making up explicit probabilities and doing the Bayesian calculation lets us summarize the relevant considerations in one place.  This can counter a mental loop of \'worrying\' by bouncing back and forth between focusing on individual considerations.\n\nOne example comes from a woman who was a test subject for an early version of the Bayes intro at the same time she was dating on OKCupid, and a 96% OKCupid match canceled their first date for coffee without explanation.  After bouncing back and forth mentally between \'maybe there was a good reason he canceled\' versus \'that doesn\'t seem like a good sign\', she decided to try Bayes.  She estimated that a man like this one had [1rm prior] [1rb odds] of 2 : 5 for desirability vs. undesirability, based on his OKCupid profile and her past experience with 96% matches.  
She then estimated a 1 : 3 [1rq likelihood ratio] for desirable vs. undesirable men flaking on the first date.  This worked out to [1rp posterior] odds of 2 : 15 for the man being desirable vs. undesirable, which she decided was unfavorable enough to not pursue him further.\n\nThe point of this mental exercise wasn\'t that the numbers she made up were exact.  Rather, by laying out all the key factors in one place, she stopped her mind from bouncing back and forth between %%knows-requisite([1rj]): visualizing $\\mathbb P(\\text{cancel}|\\text{desirable})$ and visualizing $\\mathbb P(\\text{cancel}|\\text{undesirable})$, which were pulling her back and forth as arguments pointing in different directions.%% %%!knows-requisite([1rj]): switching between imagining reasons why a good prospect might\'ve canceled, versus imagining reasons why a bad prospect might\'ve canceled.%%  By combining both ideas into a likelihood ratio, and moving on to posterior odds, she summarized the considerations and was able to stop unproductively \'worrying\'.\n\nWorrying isn\'t always unproductive.  Paying more attention to an individual consideration like "Maybe there were reasons a good dating prospect might\'ve canceled anyway?" might cause you to adjust your estimate of this consideration, thereby changing your posterior odds.  But you can get the *illusion* of progress by switching your attention from one already-known consideration to another, since it feels like these considerations are pulling on your posterior intuition each time you focus on them.  It feels like your beliefs are changing and cognitive work is being performed, but actually you\'re just going in circles.  
This is the \'unproductive worrying\' process that you can interrupt by doing an explicitly Bayesian calculation that summarizes what you already know into a single answer.\n\n(She did send him a rejection notice spelling out her numerical reasoning, since if he wrote back in a way indicating that he actually understood it, he might\'ve been worth a second chance.)\n\n(He didn\'t.)',
  metaText: '',
  isTextLoaded: 'true',
  isSubscribedToDiscussion: 'false',
  isSubscribedToUser: 'false',
  isSubscribedAsMaintainer: 'false',
  discussionSubscriberCount: '1',
  maintainerCount: '1',
  userSubscriberCount: '0',
  lastVisit: '2016-02-20 04:10:40',
  hasDraft: 'false',
  votes: [],
  voteSummary: [
    '0',
    '0',
    '0',
    '0',
    '0',
    '0',
    '0',
    '0',
    '0',
    '0'
  ],
  muVoteSummary: '0',
  voteScaling: '0',
  currentUserVote: '-2',
  voteCount: '0',
  lockedVoteType: '',
  maxEditEver: '0',
  redLinkCount: '0',
  lockedBy: '',
  lockedUntil: '',
  nextPageId: '',
  prevPageId: '',
  usedAsMastery: 'false',
  proposalEditNum: '0',
  permissions: {
    edit: {
      has: 'false',
      reason: 'You don\'t have domain permission to edit this page'
    },
    proposeEdit: {
      has: 'true',
      reason: ''
    },
    delete: {
      has: 'false',
      reason: 'You don\'t have domain permission to delete this page'
    },
    comment: {
      has: 'false',
      reason: 'You can\'t comment in this domain because you are not a member'
    },
    proposeComment: {
      has: 'true',
      reason: ''
    }
  },
  summaries: {
    Summary: 'By making up plausible Bayesian probabilities, we can summarize all of our knowledge about a problem in the same place at the same time, rather than \'worrying\' by bouncing back and forth between different parts of it.  To use one real-life example, if an OKCupid match cancels your first date without explanation, then you might be tempted to \'worry\' by bouncing your focus back and forth between possible valid reasons to cancel the date, versus possible ways it was a bad sign.  Alternatively, you could estimate [1rm prior] [1rb odds] of 2 : 5 for desirability vs. undesirability of a 96% OKCupid match, a 1 : 3 [1rq likelihood ratio] for desirable vs. undesirable men flaking on the first date, decide that the [1rp posterior] odds of 2 : 15 don\'t warrant further pursuit, and then be done.  The point isn\'t that the made-up numbers are exact, but that they summarize the relevant weights into a single place and calculation, terminating the \'worry\' cycle of alternating attention on particular pieces, and letting us be done.'
  },
  creatorIds: [
    'EliezerYudkowsky'
  ],
  childIds: [],
  parentIds: [
    'bayes_for_humans'
  ],
  commentIds: [],
  questionIds: [],
  tagIds: [
    'c_class_meta_tag'
  ],
  relatedIds: [],
  markIds: [],
  explanations: [
    {
      id: '5827',
      parentId: 'explicit_bayes_counters_worry',
      childId: 'explicit_bayes_counters_worry',
      type: 'subject',
      creatorId: 'AlexeiAndreev',
      createdAt: '2016-08-02 17:01:15',
      level: '1',
      isStrong: 'true',
      everPublished: 'true'
    }
  ],
  learnMore: [],
  requirements: [
    {
      id: '2054',
      parentId: 'bayes_rule',
      childId: 'explicit_bayes_counters_worry',
      type: 'requirement',
      creatorId: 'AlexeiAndreev',
      createdAt: '2016-06-17 21:58:56',
      level: '1',
      isStrong: 'true',
      everPublished: 'true'
    },
    {
      id: '2055',
      parentId: 'odds',
      childId: 'explicit_bayes_counters_worry',
      type: 'requirement',
      creatorId: 'AlexeiAndreev',
      createdAt: '2016-06-17 21:58:56',
      level: '2',
      isStrong: 'true',
      everPublished: 'true'
    }
  ],
  subjects: [
    {
      id: '5826',
      parentId: 'bayes_for_humans',
      childId: 'explicit_bayes_counters_worry',
      type: 'subject',
      creatorId: 'AlexeiAndreev',
      createdAt: '2016-08-02 17:00:47',
      level: '1',
      isStrong: 'false',
      everPublished: 'true'
    },
    {
      id: '5827',
      parentId: 'explicit_bayes_counters_worry',
      childId: 'explicit_bayes_counters_worry',
      type: 'subject',
      creatorId: 'AlexeiAndreev',
      createdAt: '2016-08-02 17:01:15',
      level: '1',
      isStrong: 'true',
      everPublished: 'true'
    }
  ],
  lenses: [],
  lensParentId: '',
  pathPages: [],
  learnMoreTaughtMap: {},
  learnMoreCoveredMap: {},
  learnMoreRequiredMap: {},
  editHistory: {},
  domainSubmissions: {},
  answers: [],
  answerCount: '0',
  commentCount: '0',
  newCommentCount: '0',
  linkedMarkCount: '0',
  changeLogs: [
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '18267',
      pageId: 'explicit_bayes_counters_worry',
      userId: 'EricBruylant',
      edit: '0',
      type: 'newTag',
      createdAt: '2016-08-03 19:15:24',
      auxPageId: 'c_class_meta_tag',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '18094',
      pageId: 'explicit_bayes_counters_worry',
      userId: 'AlexeiAndreev',
      edit: '0',
      type: 'newTeacher',
      createdAt: '2016-08-02 17:01:16',
      auxPageId: 'explicit_bayes_counters_worry',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '18095',
      pageId: 'explicit_bayes_counters_worry',
      userId: 'AlexeiAndreev',
      edit: '0',
      type: 'newSubject',
      createdAt: '2016-08-02 17:01:16',
      auxPageId: 'explicit_bayes_counters_worry',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '18093',
      pageId: 'explicit_bayes_counters_worry',
      userId: 'AlexeiAndreev',
      edit: '0',
      type: 'newSubject',
      createdAt: '2016-08-02 17:00:48',
      auxPageId: 'bayes_for_humans',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '6526',
      pageId: 'explicit_bayes_counters_worry',
      userId: 'EliezerYudkowsky',
      edit: '6',
      type: 'newEdit',
      createdAt: '2016-02-08 01:13:04',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '6525',
      pageId: 'explicit_bayes_counters_worry',
      userId: 'EliezerYudkowsky',
      edit: '5',
      type: 'newEdit',
      createdAt: '2016-02-08 01:10:40',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '6514',
      pageId: 'explicit_bayes_counters_worry',
      userId: 'EliezerYudkowsky',
      edit: '4',
      type: 'newEdit',
      createdAt: '2016-02-08 00:57:07',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '6513',
      pageId: 'explicit_bayes_counters_worry',
      userId: 'EliezerYudkowsky',
      edit: '3',
      type: 'newEdit',
      createdAt: '2016-02-08 00:50:32',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '6512',
      pageId: 'explicit_bayes_counters_worry',
      userId: 'EliezerYudkowsky',
      edit: '2',
      type: 'newEdit',
      createdAt: '2016-02-08 00:48:39',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '6511',
      pageId: 'explicit_bayes_counters_worry',
      userId: 'EliezerYudkowsky',
      edit: '1',
      type: 'newEdit',
      createdAt: '2016-02-08 00:47:22',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '6510',
      pageId: 'explicit_bayes_counters_worry',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newRequirement',
      createdAt: '2016-02-08 00:32:15',
      auxPageId: 'odds',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '6508',
      pageId: 'explicit_bayes_counters_worry',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newRequirement',
      createdAt: '2016-02-08 00:32:10',
      auxPageId: 'bayes_rule',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '6506',
      pageId: 'explicit_bayes_counters_worry',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newParent',
      createdAt: '2016-02-08 00:31:08',
      auxPageId: 'bayes_for_humans',
      oldSettingsValue: '',
      newSettingsValue: ''
    }
  ],
  feedSubmissions: [],
  searchStrings: {},
  hasChildren: 'false',
  hasParents: 'true',
  redAliases: {},
  improvementTagIds: [],
  nonMetaTagIds: [],
  todos: [],
  slowDownMap: 'null',
  speedUpMap: 'null',
  arcPageIds: 'null',
  contentRequests: {}
}