To denote some of the quantities used in Bayes' rule, we'll need [1rj conditional probabilities]. The conditional probability $\mathbb{P}(X\mid Y)$ means "The [1rf probability] of $X$ given $Y$." That is, $\mathbb P(\mathrm{left}\mid \mathrm{right})$ means "The probability that $\mathrm{left}$ is true, assuming that $\mathrm{right}$ is true."

$\mathbb P(\mathrm{yellow}\mid \mathrm{banana})$ is the probability that a banana is yellow - if we know something to be a banana, what is the probability that it is yellow? $\mathbb P(\mathrm{banana}\mid \mathrm{yellow})$ is the probability that a yellow thing is a banana - if the known, right-hand side is yellowness, then we ask the question on the left: what is the probability that this is a banana?

In probability theory, the definition of "conditional probability" is that the conditional probability of $L,$ given $R,$ is found by looking at the probability of the possibilities where both $L$ *and* $R$ hold, *within* all possibilities where $R$ holds. Using $L \wedge R$ to denote the logical proposition "L and R both true":

$\mathbb P(L\mid R) = \frac{\mathbb P(L \wedge R)}{\mathbb P(R)}$

Suppose you have a bag containing objects that are either red or blue, and either square or round:

$$\begin{array}{l|r|r}
& Red & Blue \\
\hline
Square & 1 & 2 \\
\hline
Round & 3 & 4
\end{array}$$

If you reach in and feel a round object, the conditional probability that it is red is:

$\mathbb P(\mathrm{red} \mid \mathrm{round}) = \dfrac{\mathbb P(\mathrm{red} \wedge \mathrm{round})}{\mathbb P(\mathrm{round})} \propto \dfrac{3}{3 + 4} = \frac{3}{7}$

If you look at the object nearest the top, and can see that it's blue, but not see the shape, then the conditional probability that it's a square is:

$\mathbb P(\mathrm{square} \mid \mathrm{blue}) = \dfrac{\mathbb P(\mathrm{square} \wedge \mathrm{blue})}{\mathbb P(\mathrm{blue})} \propto \dfrac{2}{2 + 4} = \frac{1}{3}$

![conditional probabilities bag](https://i.imgur.com/zscEdLj.png?0)
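The same computation can be carried out mechanically from the counts in the table. Here is a minimal Python sketch; the dictionary layout and helper names (`p`, `p_given`) are just illustrative choices, not anything from the original problem:

```python
# Counts from the bag above: (color, shape) -> number of objects.
counts = {
    ("red", "square"): 1, ("blue", "square"): 2,
    ("red", "round"): 3,  ("blue", "round"): 4,
}
total = sum(counts.values())

def p(event):
    """Probability that a randomly drawn object satisfies `event`."""
    return sum(n for (color, shape), n in counts.items() if event(color, shape)) / total

def p_given(event, condition):
    """Conditional probability P(event | condition) = P(event AND condition) / P(condition)."""
    return p(lambda c, s: event(c, s) and condition(c, s)) / p(condition)

print(p_given(lambda c, s: c == "red", lambda c, s: s == "round"))      # 3/7 ≈ 0.43
print(p_given(lambda c, s: s == "square", lambda c, s: c == "blue"))    # 1/3 ≈ 0.33
```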
# Updating as conditioning

Bayes' rule is useful because the process of *observing new evidence* can be interpreted as *conditioning a probability distribution.*

Again, the Diseasitis problem:

> 20% of the patients in the screening population start out with Diseasitis. Among patients with Diseasitis, 90% turn the tongue depressor black. 30% of the patients without Diseasitis will also turn the tongue depressor black. Among all the patients with black tongue depressors, how many have Diseasitis?

Consider a single patient, before observing any evidence. There are four possible worlds we could be in, the product of (sick vs. healthy) times (positive vs. negative result):

$$\begin{array}{l|r|r}
& Sick & Healthy \\
\hline
Test + & 18\% & 24\% \\
\hline
Test - & 2\% & 56\%
\end{array}$$

To actually *observe* that the patient gets a negative result is to eliminate from further consideration the possible worlds where the patient gets a positive result:

![bayes elimination](https://i.imgur.com/LGeGIzW.png?0)

Similarly, once we observe the result $\mathrm{positive}$, all of our future reasoning should take place, not in our old $\mathbb P(\cdot),$ but in our new $\mathbb P(\cdot \mid \mathrm{positive}).$ This is why, after observing "$\mathrm{positive}$" and revising our probability distribution, when we ask about the probability that the patient is sick, we are interested in the new probability $\mathbb P(\mathrm{sick}\mid \mathrm{positive})$ and not the old probability $\mathbb P(\mathrm{sick}).$
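Seen this way, conditioning on the positive result is just "delete the eliminated worlds and rescale what remains so it sums to 1." A short sketch, assuming only the four numbers in the table above (the world labels are illustrative):

```python
# Joint distribution over the four possible worlds, from the table above.
joint = {
    ("sick", "positive"): 0.18, ("healthy", "positive"): 0.24,
    ("sick", "negative"): 0.02, ("healthy", "negative"): 0.56,
}

# Observing a positive result eliminates the negative-result worlds...
remaining = {world: p for world, p in joint.items() if world[1] == "positive"}

# ...and what is left gets rescaled so it sums to 1 again.
total = sum(remaining.values())
posterior = {world: p / total for world, p in remaining.items()}

print(posterior[("sick", "positive")])   # 0.18 / (0.18 + 0.24) = 3/7 ≈ 0.43
```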
## Example: Socks-dresser problem

Realizing that *observing evidence* corresponds to [1y6 eliminating probability mass], and to concerning ourselves only with the probability mass that remains, is the key to solving the [55b sock-dresser search] problem:

> You left your socks somewhere in your room. You think there's a 4/5 chance that they're in your dresser, so you start looking through your dresser's 8 drawers. After checking 6 drawers at random, you haven't found your socks yet. What is the probability you will find your socks in the next drawer you check?

We initially have 20% of the probability mass in "Socks outside the dresser", and 80% of the probability mass in "Socks inside the dresser". This corresponds to 10% probability mass for each of the 8 drawers.

After eliminating the probability mass in 6 of the drawers, we have 40% of the original mass remaining: 20% for "Socks outside the dresser" and 10% each for the remaining 2 drawers.

Since this remaining 40% probability mass is now our whole world, the effect on our probability distribution is like amplifying the 40% until it expands back up to 100%, aka [1rk renormalizing the probability distribution]. This is why we divide $\mathbb P(L \wedge R)$ by $\mathbb P(R)$ to get the new probabilities.

In this case, we divide "20% probability of being outside the dresser" by 40%, and then divide the 10% probability mass in each of the two remaining drawers by 40%. So the new probabilities are 1/2 for outside the dresser, and 1/4 each for the 2 drawers. Or more simply, we could observe that, among the remaining probability mass of 40%, the "outside the dresser" hypothesis has half of it, and the two drawers have a quarter each.

So the probability of finding our socks in the next drawer is 25%.

Note that as we open successive drawers, we both become more confident that the socks are not in the dresser at all (since we have eliminated several drawers they could have been in), *and* come to expect more strongly that we will find the socks in the next drawer we open (since so few drawers remain).
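The same elimination-and-renormalization can be run drawer by drawer. A minimal sketch, assuming the 20%/80% prior above; hypothesis labels like `drawer_1` are made-up names for illustration:

```python
# Hypotheses: socks are "outside" the dresser, or in one of 8 drawers.
probs = {"outside": 0.20, **{f"drawer_{i}": 0.80 / 8 for i in range(1, 9)}}

def condition_on_empty(probs, drawer):
    """Eliminate the world where the socks are in `drawer`, then renormalize."""
    remaining = {h: p for h, p in probs.items() if h != drawer}
    total = sum(remaining.values())
    return {h: p / total for h, p in remaining.items()}

# Check 6 drawers and come up empty each time.
for i in range(1, 7):
    probs = condition_on_empty(probs, f"drawer_{i}")

print(round(probs["outside"], 3))    # 0.5
print(round(probs["drawer_7"], 3))   # 0.25 -- chance the next drawer has the socks
```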
# Priors, likelihoods, and posteriors

Bayes' theorem is generally used to answer some question of the form $\mathbb P(\mathrm{hypothesis}\mid \mathrm{evidence})$ - the $\mathrm{evidence}$ is known or assumed, so that we are now mentally living in the revised probability distribution $\mathbb P(\cdot\mid \mathrm{evidence}),$ and we are asking what we infer or guess about the $\mathrm{hypothesis}.$ This quantity is the **[1rp posterior probability]** of the $\mathrm{hypothesis}.$

To carry out a Bayesian revision, we also need to know what our beliefs were before we saw the evidence. (E.g., in the Diseasitis problem, the chance that a patient who hasn't been tested yet is sick.) This is often written $\mathbb P(\mathrm{hypothesis}),$ and the hypothesis's probability isn't being conditioned on anything because it is our **[1rm prior]** belief.

The remaining pieces of key information are the **[1rq likelihoods]** of the evidence, given each hypothesis. To interpret the meaning of the positive test result as evidence, we need to imagine ourselves in the world where the patient is sick - *assume* the patient to be sick, as if that were known - and then ask, just as if we hadn't seen any test result yet, what we think the probability of the evidence would be in that world. And then we have to do a similar operation again, this time mentally inhabiting the world where the patient is healthy. Unfortunately, the standard notation is such that this idea is denoted $\mathbb P(\mathrm{evidence}\mid \mathrm{hypothesis})$ - looking deceptively like the notation for the posterior probability, but written in the reverse order. Not surprisingly, this trips people up a bunch until they get used to it. (You would at least hope that the standard symbol $\mathbb P(\cdot \mid \cdot)$ wouldn't be *symmetrical,* but it is. Alas.)

## Example

Suppose you're Sherlock Holmes investigating a case in which a red hair was left at the scene of the crime.

The Scotland Yard detective says, "Aha! Then it's Miss Scarlet. She has red hair, so if she was the murderer she almost certainly would have left a red hair there. $\mathbb P(\mathrm{redhair}\mid \mathrm{Scarlet}) = 99\%,$ let's say, which is a near-certain conviction, so we're done."

"But no," replies Sherlock Holmes. "You see, but you do not correctly track the meaning of the [1rj conditional probabilities], detective. The knowledge we require for a conviction is not $\mathbb P(\mathrm{redhair}\mid \mathrm{Scarlet}),$ the chance that Miss Scarlet would leave a red hair, but rather $\mathbb P(\mathrm{Scarlet}\mid \mathrm{redhair}),$ the chance that this red hair was left by Scarlet. There are other people in this city who have red hair."

"So you're saying..." the detective says slowly, "that $\mathbb P(\mathrm{redhair}\mid \mathrm{Scarlet})$ is actually much lower than $1$?"

"No, detective. I am saying that just because $\mathbb P(\mathrm{redhair}\mid \mathrm{Scarlet})$ is high does not imply that $\mathbb P(\mathrm{Scarlet}\mid \mathrm{redhair})$ is high. It is the latter probability in which we are interested - the degree to which, *knowing* that a red hair was left at the scene, we *infer* that Miss Scarlet was the murderer. The posterior, as the Bayesians say. This is not the same quantity as the degree to which, *assuming* Miss Scarlet was the murderer, we would *guess* that she might leave a red hair. That is merely the likelihood of the evidence, conditional on Miss Scarlet having done it."

## Visualization

Using the [1wy waterfall] for the [22s Diseasitis problem]:

![waterfall labeled probabilities](https://i.imgur.com/f9E0ltp.png?0)

[todo: add an Example 2 and Example 3, maybe with graphics, because I expect this part to be confusing. steal from L0 Bayes.]
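To tie the three terms together, here is a short sketch that recomputes the Diseasitis posterior from the prior and the two likelihoods, matching the waterfall above (the variable names are illustrative, and the numbers come straight from the problem statement):

```python
# Diseasitis: prior and likelihoods, straight from the problem statement.
prior = {"sick": 0.20, "healthy": 0.80}
likelihood_positive = {"sick": 0.90, "healthy": 0.30}   # P(positive | hypothesis)

# Bayes' rule: the posterior is proportional to prior * likelihood.
unnormalized = {h: prior[h] * likelihood_positive[h] for h in prior}
evidence = sum(unnormalized.values())                   # P(positive) = 0.18 + 0.24 = 0.42
posterior = {h: p / evidence for h, p in unnormalized.items()}

print(round(posterior["sick"], 3))   # 0.18 / 0.42 = 3/7 ≈ 0.429, matching the waterfall
```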