{ localUrl: '../page/bayes_rule_multiple.html', arbitalUrl: 'https://arbital.com/p/bayes_rule_multiple', rawJsonUrl: '../raw/1zg.json', likeableId: 'IndrajiUshanthaWanigaratne', likeableType: 'page', myLikeValue: '0', likeCount: '8', dislikeCount: '0', likeScore: '8', individualLikes: [ 'AndrewMcKnight', 'EricBruylant', 'AdeleLopez', 'TravisRivera', 'IanPitchford', 'NateSoares', 'SzymonWilczyski', 'StephanieKoo' ], pageId: 'bayes_rule_multiple', edit: '29', editSummary: '', prevEdit: '28', currentEdit: '29', wasPublished: 'true', type: 'wiki', title: 'Bayes' rule: Vector form', clickbait: 'For when you want to apply Bayes' rule to lots of evidence and lots of variables, all in one go. (This is more or less how spam filters work.)', textLength: '11401', alias: 'bayes_rule_multiple', externalUrl: '', sortChildrenBy: 'likes', hasVote: 'false', voteType: '', votesAnonymous: 'false', editCreatorId: 'giadang2', editCreatedAt: '2017-05-23 00:31:28', pageCreatorId: 'EliezerYudkowsky', pageCreatedAt: '2016-02-13 19:54:09', seeDomainId: '0', editDomainId: 'AlexeiAndreev', submitToDomainId: '0', isAutosave: 'false', isSnapshot: 'false', isLiveEdit: 'true', isMinorEdit: 'false', indirectTeacher: 'false', todoCount: '0', isEditorComment: 'false', isApprovedComment: 'true', isResolved: 'false', snapshotText: '', anchorContext: '', anchorText: '', anchorOffset: '0', mergedInto: '', isDeleted: 'false', viewCount: '5702', text: '[summary: The [1x5 odds form of Bayes' rule] works for odds ratios between more than two hypotheses, and applying multiple pieces of evidence. Suppose there's a bathtub full of coins. 1/2 of the coins are fair and have a 50% probability of producing heads on each coinflip; 1/3 of the coins produce 25% heads; and 1/6 produce 75% heads. You pull out a coin at random, flip it 3 times, and get the result HTH. 
You may calculate:\n\n$$\\begin{array}{rll}\n(1/2 : 1/3 : 1/6) = & (3 : 2 : 1) & \\\\\n\\times & (2 : 1 : 3) & \\\\\n\\times & (2 : 3 : 1) & \\\\\n\\times & (2 : 1 : 3) & \\\\\n= & (24 : 6 : 9) & = (8 : 2 : 3)\n\\end{array}$$]\n\n%todo: This page conflates two concepts: (1) You can perform a Bayesian update on multiple hypotheses at once, by representing hypotheses via vectors; and (2) you can perform multiple Bayesian updates by multiplying by all the likelihood functions (and only normalizing once at the end). We should probably have one page for each concept, and we should possibly split this page in order to make them. (It's not yet clear whether we want one unified page for both ideas, as this one currently is.)%\n%comment: Comment from ESY: it seems to me that these two concepts are sufficiently closely related, and sufficiently combined in their demonstration, that we want to explain them on the same page. They could arguably have different concept pages, though.%\n\n[1lz Bayes' rule] in the [1x5 odds form] says that for every pair of hypotheses, their relative prior odds, times the relative likelihood of the evidence, equals the relative posterior odds.\n\nLet $\\mathbf H$ be a vector of hypotheses $H_1, H_2, \\ldots$ Because Bayes' rule holds between every pair of hypotheses in $\\mathbf H,$ we can simply multiply an odds vector elementwise by a likelihood vector in order to get the correct posterior vector:\n\n$$\\mathbb O(\\mathbf H) \\times \\mathcal L_e(\\mathbf H) = \\mathbb O(\\mathbf H \\mid e)$$\n\n%comment: Comment from EN: It seems to me that the dot product would be more appropriate.%\n\nwhere $\\mathbb O(\\mathbf H)$ is the vector of relative prior odds between all the $H_i$, $\\mathcal L_e(\\mathbf H)$ is the vector of relative likelihoods with which each $H_i$ predicted $e,$ and $\\mathbb O(\\mathbf H \\mid e)$ is the relative posterior odds between all the $H_i.$\n\nIn fact, we can keep multiplying by likelihood vectors to perform multiple updates at 
once:\n\n$$\\begin{array}{r}\n\\mathbb O(\\mathbf H) \\\\\n\\times\\ \\mathcal L_{e_1}(\\mathbf H) \\\\\n\\times\\ \\mathcal L_{e_2}(\\mathbf H \\wedge e_1) \\\\\n\\times\\ \\mathcal L_{e_3}(\\mathbf H \\wedge e_1 \\wedge e_2) \\\\\n= \\mathbb O(\\mathbf H \\mid e_1 \\wedge e_2 \\wedge e_3)\n\\end{array}$$\n\nFor example, suppose there's a bathtub full of coins. Half of the coins are "fair" and have a 50% probability of producing heads on each coinflip. A third of the coins are biased towards heads and produce heads 75% of the time. The remaining coins are biased against heads, which they produce only 25% of the time. You pull out a coin at random, flip it 3 times, and get the result THT. What's the chance that this was a fair coin?\n\nWe have three hypotheses, which we'll call $H_{fair},$ $H_{heads},$ and $H_{tails}$ respectively, with relative odds of $(1/2 : 1/3 : 1/6).$ The relative likelihoods that these three hypotheses assign to a coin landing heads are $(2 : 3 : 1)$; the relative likelihoods that they assign to a coin landing tails are $(2 : 1 : 3).$ Thus, the posterior odds for all three hypotheses are:\n\n$$\\begin{array}{rll}\n(1/2 : 1/3 : 1/6) = & (3 : 2 : 1) & \\\\\n\\times & (2 : 1 : 3) & \\\\\n\\times & (2 : 3 : 1) & \\\\\n\\times & (2 : 1 : 3) & \\\\\n= & (24 : 6 : 9) & = (8 : 2 : 3) = (8/13 : 2/13 : 3/13)\n\\end{array}$$\n\n...so there is an 8/13 or ~62% probability that the coin is fair.\n\nIf you were only familiar with the [554 probability form] of Bayes' rule, which only works for one hypothesis at a time and which only uses probabilities (and so [1rk normalizes] the odds into probabilities at every step)...\n\n$$\\mathbb P(H_i\\mid e) = \\dfrac{\\mathbb P(e\\mid H_i)\\mathbb P(H_i)}{\\sum_k \\mathbb P(e\\mid H_k)\\mathbb P(H_k)}$$\n\n...then you might have had some gratuitous difficulty solving this problem.\n\nAlso, if you hear the idiom of "convert to odds, multiply lots and lots of things, convert back to probabilities" and think "hmm, this sounds like a 
place where [4h0 transforming into log-space] (where all multiplications become additions) might yield efficiency gains," then congratulations, you just invented the [1zh log-odds form of Bayes' rule]. Not only is it efficient, it also gives rise to a natural unit of measure for "strength of evidence" and "strength of belief".\n\n# Naive Bayes\n\nMultiplying an array of odds by an array of likelihoods is the idiom used in Bayesian spam filters. Suppose that there are three categories of email, "Business", "Personal", and "Spam", and that the user has hand-labeled the last 100 emails: 50 as Business, 30 as Personal, and 20 as Spam. The word "buy" has appeared in 10 Business emails, 3 Personal emails, and 10 Spam emails. The word "rationality" has appeared in 30 Business emails, 15 Personal emails, and 1 Spam email.\n\nFirst, we assume that the frequencies in our data are representative of the 'true' frequencies. (Taken literally, if we see a word we've never seen before, we'll be multiplying by a zero probability. 
[Good-Turing frequency estimation](https://en.wikipedia.org/wiki/Good%E2%80%93Turing_frequency_estimation) would do better.)\n\nSecond, we make the *naive Bayes* assumption that a spam email which contains the word "buy" is no more or less likely than any other spam email to contain the word "rationality", and so on with the other categories.\n\nThen we'd filter a message containing the phrase "buy rationality" as follows:\n\n[1rm Prior] odds: $(5 : 3 : 2)$\n\n[1rq Likelihood ratio] for "buy": \n\n$$\\left(\\frac{10}{50} : \\frac{3}{30} : \\frac{10}{20}\\right) = \\left(\\frac{1}{5} : \\frac{1}{10} : \\frac{1}{2}\\right) = (2 : 1 : 5)$$\n\nLikelihood ratio for "rationality": \n\n$$\\left(\\frac{30}{50} : \\frac{15}{30} : \\frac{1}{20}\\right) = \\left(\\frac{3}{5} : \\frac{1}{2} : \\frac{1}{20}\\right) = (12 : 10 : 1)$$\n\nPosterior odds:\n\n$$(5 : 3 : 2) \\times (2 : 1 : 5) \\times (12 : 10 : 1) = (120 : 30 : 10) = \\left(\\frac{12}{16} : \\frac{3}{16} : \\frac{1}{16}\\right)$$\n\n%%comment: 12/16 is intentionally not in lowest form so that the 12 : 3 : 1 ratio can be clear.%%\n\nThis email would be 75% likely to be a business email, if the Naive Bayes assumptions are true. They're almost certainly *not* true, for reasons discussed in more detail below. 
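Spelled out as code, the whole filtering step above is just elementwise multiplication of odds vectors, with a single normalization at the end. This is a minimal Python sketch using the counts from the example (the function and variable names are ours); exact fractions sidestep floating-point rounding:

```python
from fractions import Fraction as F

def normalize(odds):
    """Turn an odds vector into probabilities by dividing by the total."""
    total = sum(odds)
    return [x / total for x in odds]

# Prior odds over (Business, Personal, Spam), from the hand-labeled counts.
posterior = [F(5), F(3), F(2)]
word_likelihoods = [
    [F(10, 50), F(3, 30), F(10, 20)],   # "buy"
    [F(30, 50), F(15, 30), F(1, 20)],   # "rationality"
]
for likelihood in word_likelihoods:
    posterior = [o * l for o, l in zip(posterior, likelihood)]

print(normalize(posterior))
# -> [Fraction(3, 4), Fraction(3, 16), Fraction(1, 16)], i.e. 12/16 : 3/16 : 1/16
```

Note that nothing is normalized until the very end; the intermediate vectors are just unnormalized odds ratios.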
But while Naive Bayes calculations are usually quantitatively wrong, they often point in the right qualitative direction - this email may indeed be more likely than not to be a business email.\n\n(An actual implementation should add [1zh log-likelihoods] rather than multiplying by ratios, so as not to risk floating-point overflow or underflow.)\n\n# Non-naive multiple updates\n\nTo do a multiple update less naively, we must do the equivalent of asking about the probability that a Business email contains the word "rationality", *given* that it contained the word "buy".\n\nAs a real-life example, in a certain [rationality workshop](http://rationality.org/), one participant was observed to have taken another participant to a museum, and also, on a different day, to see their workplace. A betting market soon developed on whether the two were romantically involved. One participant argued that, as an eyeball estimate, someone was 12 times as likely to take a fellow participant to a museum, or to their workplace, if they were romantically involved, vs. just being strangers. They then multiplied their prior odds by a 12 : 1 likelihood ratio for the museum trip and another 12 : 1 likelihood ratio for the workplace trip, and concluded that these two were almost certainly romantically attracted.\n\nIt later turned out that the two were childhood acquaintances who were not romantically involved. What went wrong?\n\nIf we want to update hypotheses on multiple pieces of evidence, we need to mentally stay inside the world of each hypothesis, and condition the likelihood of future evidence on the evidence already observed. Suppose the two are *not* romantically attracted. We observe them visit a museum. 
Arguendo, we might indeed suppose that this has a probability of, say, 1% (we don't usually expect strangers to visit museums together), which might be about 1/12 the probability of making that observation if the two were romantically involved.\n\nBut after this, when we observe the workplace visit, we need to ask about the probability of the workplace visit, *given* that the two were *not* romantically attracted *and* that they visited a museum. Asking the question this way suggests that if two non-attracted people visit a museum together *for whatever reason*, they don't just have a non-attracted couple's default probability of making a workplace visit. In other words:\n\n$$\\mathbb P({workplace}\\mid \\neg {romance} \\wedge {museum}) \\neq \\mathbb P({workplace}\\mid \\neg {romance})$$\n\nNaive Bayes, in contrast, would try to approximate the quantity $\\mathbb P({museum} \\wedge {workplace} \\mid \\neg {romance})$ as the product of $\\mathbb P({museum}\\mid \\neg {romance}) \\cdot \\mathbb P({workplace}\\mid \\neg {romance}).$ This is what the participants did when they multiplied by the 12 : 1 likelihood ratio twice.\n\nThe result was a kind of double-counting of the evidence — they took into account the prior improbability of a random non-romantic couple "going places together" twice in a row, for the two pieces of evidence, and ended up performing a total update that was much too strong. \n\nNaive Bayes spam filters often end up assigning ludicrously extreme odds, on the order of googols to one, that an email is spam or personal; and then they're sometimes wrong anyway. If an email contains the words "pharmaceutical" and "pharmacy", a spam filter will double-count the improbability of a personal email talking about pharmacies, rather than considering that if I actually do get a personal email talking about a pharmacy, it is much more likely to contain the word "pharmaceutical" as well. 
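To see the size of the error numerically, compare the naive update with a conditioned one on made-up numbers. Only the 12 : 1 ratios come from the story; the 1 : 20 prior and the milder 2 : 1 second ratio below are assumptions for illustration:

```python
from fractions import Fraction as F

prior = F(1, 20)           # hypothetical prior odds of romance : no-romance

# Naive: treat the museum and workplace trips as independent evidence.
naive = prior * 12 * 12    # = 36/5, about 7 : 1 in favor of romance

# Conditioned: given one joint outing already happened for *some* reason,
# suppose a second outing only favors romance by a (made-up) 2 : 1 ratio.
careful = prior * 12 * 2   # = 6/5, barely above even odds
```

Under these assumed numbers, the naive calculation lands at roughly 88% romance while the conditioned one stays near a coin flip.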
So because of the Naive Bayes assumption, naive Bayesian spam filters are nowhere near well-calibrated, and they update much too extremely on the evidence. On the other hand, they're often extreme in the correct qualitative direction — something assigned googol-to-one odds of being spam isn't *always* spam but it might be spam, say, 99.999% of the time.\n\nTo do non-naive Bayesian updates on multiple pieces of evidence, just remember to mentally inhabit the world *where the hypothesis is true*, and then ask about the likelihood of each successive piece of evidence, *in* the world where the hypothesis is true *and* the previous pieces of evidence were observed. Don't ask, "What is the likelihood that a non-romantic couple would visit one person's workplace?" but "What is the likelihood that a non-romantic couple which previously visited a museum for some unknown reason would also visit the workplace?"\n\nIn our example with the coins in the bathtub, the likelihoods of the evidence were independent at each step: *assuming* a coin to be fair, it's no more or less likely to produce heads on the second flip after producing heads on the first flip. 
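That bathtub update can be replayed the same way as the spam filter, one likelihood vector per flip (a sketch using the numbers from the coin example; names are ours):

```python
from fractions import Fraction as F

# Prior odds over (fair, heads-biased, tails-biased) coins.
posterior = [F(1, 2), F(1, 3), F(1, 6)]
heads = [F(1, 2), F(3, 4), F(1, 4)]   # P(heads | each hypothesis)
tails = [F(1, 2), F(1, 4), F(3, 4)]   # P(tails | each hypothesis)

for flip in (tails, heads, tails):    # the observed sequence THT
    posterior = [o * l for o, l in zip(posterior, flip)]

total = sum(posterior)
print([p / total for p in posterior])
# -> [Fraction(8, 13), Fraction(2, 13), Fraction(3, 13)]
```

Here multiplying the per-flip likelihood vectors is exactly right, not an approximation, because each hypothesis treats the flips as independent.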
So in our bathtub-coins example, the Naive Bayes assumption was actually true.\n', metaText: '', isTextLoaded: 'true', isSubscribedToDiscussion: 'false', isSubscribedToUser: 'false', isSubscribedAsMaintainer: 'false', discussionSubscriberCount: '2', maintainerCount: '2', userSubscriberCount: '0', lastVisit: '2016-02-27 17:45:50', hasDraft: 'false', votes: [], voteSummary: [ '0', '0', '0', '0', '0', '0', '0', '0', '0', '0' ], muVoteSummary: '0', voteScaling: '0', currentUserVote: '-2', voteCount: '0', lockedVoteType: '', maxEditEver: '0', redLinkCount: '0', lockedBy: '', lockedUntil: '', nextPageId: '', prevPageId: '', usedAsMastery: 'true', proposalEditNum: '35', permissions: { edit: { has: 'false', reason: 'You don't have domain permission to edit this page' }, proposeEdit: { has: 'true', reason: '' }, delete: { has: 'false', reason: 'You don't have domain permission to delete this page' }, comment: { has: 'false', reason: 'You can't comment in this domain because you are not a member' }, proposeComment: { has: 'true', reason: '' } }, summaries: { Summary: 'The [1x5 odds form of Bayes' rule] works for odds ratios between more than two hypotheses, and applying multiple pieces of evidence. Suppose there's a bathtub full of coins. 1/2 of the coins are fair and have a 50% probability of producing heads on each coinflip; 1/3 of the coins produce 25% heads; and 1/6 produce 75% heads. You pull out a coin at random, flip it 3 times, and get the result HTH. 
You may calculate:\n\n$$\\begin{array}{rll}\n(1/2 : 1/3 : 1/6) = & (3 : 2 : 1) & \\\\\n\\times & (2 : 1 : 3) & \\\\\n\\times & (2 : 3 : 1) & \\\\\n\\times & (2 : 1 : 3) & \\\\\n= & (24 : 6 : 9) & = (8 : 2 : 3)\n\\end{array}$$' }, creatorIds: [ 'EliezerYudkowsky', 'NateSoares', 'AdomHartell', 'ErikNash', 'NateWindwood', 'AlexeiAndreev', 'PatrickLaVictoir', 'ConnorFlexman2', 'FrancisMarineau', 'giadang2', 'NickJordan', 'RyanWhite', 'FedorBelolutskiy' ], childIds: [], parentIds: [ 'bayes_rule' ], commentIds: [ '2gq', '3xx', '7tn', '86w', '89j', '8jx', '984', '9hk', '9hm' ], questionIds: [], tagIds: [ 'c_class_meta_tag' ], relatedIds: [], markIds: [], explanations: [ { id: '2171', parentId: 'bayes_rule_multiple', childId: 'bayes_rule_multiple', type: 'subject', creatorId: 'AlexeiAndreev', createdAt: '2016-06-17 21:58:56', level: '2', isStrong: 'true', everPublished: 'true' }, { id: '6502', parentId: 'bayes_rule_multiple', childId: 'bayes_rule_fast_intro', type: 'subject', creatorId: 'EliezerYudkowsky', createdAt: '2016-09-29 04:42:29', level: '2', isStrong: 'true', everPublished: 'true' } ], learnMore: [], requirements: [ { id: '2165', parentId: 'bayes_rule_odds', childId: 'bayes_rule_multiple', type: 'requirement', creatorId: 'AlexeiAndreev', createdAt: '2016-06-17 21:58:56', level: '3', isStrong: 'true', everPublished: 'true' }, { id: '5642', parentId: 'odds', childId: 'bayes_rule_multiple', type: 'requirement', creatorId: 'AlexeiAndreev', createdAt: '2016-07-26 17:11:53', level: '3', isStrong: 'true', everPublished: 'true' }, { id: '5804', parentId: 'math3', childId: 'bayes_rule_multiple', type: 'requirement', creatorId: 'AlexeiAndreev', createdAt: '2016-08-02 00:45:48', level: '2', isStrong: 'true', everPublished: 'true' }, { id: '5805', parentId: 'bayes_rule', childId: 'bayes_rule_multiple', type: 'requirement', creatorId: 'AlexeiAndreev', createdAt: '2016-08-02 00:46:48', level: '3', isStrong: 'true', everPublished: 'true' } ], subjects: [ { id: '2171', parentId: 
'bayes_rule_multiple', childId: 'bayes_rule_multiple', type: 'subject', creatorId: 'AlexeiAndreev', createdAt: '2016-06-17 21:58:56', level: '2', isStrong: 'true', everPublished: 'true' }, { id: '5299', parentId: 'bayes_rule', childId: 'bayes_rule_multiple', type: 'subject', creatorId: 'AlexeiAndreev', createdAt: '2016-07-16 16:09:21', level: '3', isStrong: 'false', everPublished: 'true' }, { id: '5811', parentId: 'bayes_rule_odds', childId: 'bayes_rule_multiple', type: 'subject', creatorId: 'AlexeiAndreev', createdAt: '2016-08-02 00:58:51', level: '3', isStrong: 'false', everPublished: 'true' } ], lenses: [], lensParentId: '', pathPages: [], learnMoreTaughtMap: {}, learnMoreCoveredMap: { '1lz': [ '1x5', '1yd', '1zj', '554' ], '1x5': [ '1yd' ] }, learnMoreRequiredMap: { '1zg': [ '207' ] }, editHistory: {}, domainSubmissions: {}, answers: [], answerCount: '0', commentCount: '0', newCommentCount: '0', linkedMarkCount: '0', changeLogs: [ { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '23139', pageId: 'bayes_rule_multiple', userId: 'FedorBelolutskiy', edit: '35', type: 'newEditProposal', createdAt: '2018-12-27 18:22:44', auxPageId: '', oldSettingsValue: '', newSettingsValue: 'The original 75% number (that Ryan White wished to change to "70%") seems correct... 
12/16 = 75%' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '23130', pageId: 'bayes_rule_multiple', userId: 'RyanWhite', edit: '34', type: 'newEditProposal', createdAt: '2018-11-25 18:31:36', auxPageId: '', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '22865', pageId: 'bayes_rule_multiple', userId: 'NickJordan', edit: '33', type: 'newEditProposal', createdAt: '2017-11-03 18:01:55', auxPageId: '', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '22718', pageId: 'bayes_rule_multiple', userId: 'NateWindwood', edit: '32', type: 'newEditProposal', createdAt: '2017-07-21 14:55:24', auxPageId: '', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '22717', pageId: 'bayes_rule_multiple', userId: 'NateWindwood', edit: '31', type: 'newEditProposal', createdAt: '2017-07-21 13:35:42', auxPageId: '', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '22561', pageId: 'bayes_rule_multiple', userId: 'giadang2', edit: '29', type: 'newEdit', createdAt: '2017-05-23 00:31:28', auxPageId: '', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '22035', pageId: 'bayes_rule_multiple', userId: 'EliezerYudkowsky', edit: '28', type: 'newEdit', createdAt: '2017-02-16 18:24:30', auxPageId: '', oldSettingsValue: '', 
newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '20167', pageId: 'bayes_rule_multiple', userId: 'FrancisMarineau', edit: '27', type: 'newEdit', createdAt: '2016-10-18 04:20:11', auxPageId: '', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '20144', pageId: 'bayes_rule_multiple', userId: 'ErikNash', edit: '26', type: 'newEditProposal', createdAt: '2016-10-14 10:57:40', auxPageId: '', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '20143', pageId: 'bayes_rule_multiple', userId: 'ErikNash', edit: '25', type: 'newEditProposal', createdAt: '2016-10-14 10:53:35', auxPageId: '', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '20063', pageId: 'bayes_rule_multiple', userId: 'AdomHartell', edit: '24', type: 'newEdit', createdAt: '2016-10-11 18:07:07', auxPageId: '', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '20062', pageId: 'bayes_rule_multiple', userId: 'AdomHartell', edit: '23', type: 'newEdit', createdAt: '2016-10-11 18:06:20', auxPageId: '', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '19978', pageId: 'bayes_rule_multiple', userId: 'EliezerYudkowsky', edit: '22', type: 'newEdit', createdAt: '2016-10-09 18:11:58', auxPageId: '', oldSettingsValue: '', newSettingsValue: 
'' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '19966', pageId: 'bayes_rule_multiple', userId: 'AdomHartell', edit: '21', type: 'newEdit', createdAt: '2016-10-08 22:09:40', auxPageId: '', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '3591', likeableType: 'changeLog', myLikeValue: '0', likeCount: '1', dislikeCount: '0', likeScore: '1', individualLikes: [], id: '19909', pageId: 'bayes_rule_multiple', userId: 'ConnorFlexman2', edit: '20', type: 'newEdit', createdAt: '2016-10-07 23:32:48', auxPageId: '', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '19754', pageId: 'bayes_rule_multiple', userId: 'EliezerYudkowsky', edit: '0', type: 'newTeacher', createdAt: '2016-09-29 04:42:29', auxPageId: 'bayes_rule_fast_intro', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '18242', pageId: 'bayes_rule_multiple', userId: 'EricBruylant', edit: '0', type: 'newTag', createdAt: '2016-08-03 17:08:19', auxPageId: 'c_class_meta_tag', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '18049', pageId: 'bayes_rule_multiple', userId: 'AlexeiAndreev', edit: '0', type: 'newSubject', createdAt: '2016-08-02 00:58:52', auxPageId: 'bayes_rule_odds', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '18039', pageId: 'bayes_rule_multiple', userId: 'AlexeiAndreev', edit: '0', type: 'newRequirement', createdAt: '2016-08-02 00:46:49', auxPageId: 
'bayes_rule', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '18038', pageId: 'bayes_rule_multiple', userId: 'AlexeiAndreev', edit: '0', type: 'deleteRequirement', createdAt: '2016-08-02 00:46:36', auxPageId: 'bayes_rule_probability', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '18036', pageId: 'bayes_rule_multiple', userId: 'AlexeiAndreev', edit: '0', type: 'deleteRequirement', createdAt: '2016-08-02 00:45:53', auxPageId: 'math2', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '18034', pageId: 'bayes_rule_multiple', userId: 'AlexeiAndreev', edit: '0', type: 'newRequirement', createdAt: '2016-08-02 00:45:48', auxPageId: 'math3', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '17544', pageId: 'bayes_rule_multiple', userId: 'AlexeiAndreev', edit: '0', type: 'newRequirement', createdAt: '2016-07-26 17:11:54', auxPageId: 'odds', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '16929', pageId: 'bayes_rule_multiple', userId: 'EliezerYudkowsky', edit: '19', type: 'newEdit', createdAt: '2016-07-16 20:39:05', auxPageId: '', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '16857', pageId: 'bayes_rule_multiple', userId: 'AlexeiAndreev', edit: '0', type: 
'newSubject', createdAt: '2016-07-16 16:09:21', auxPageId: 'bayes_rule', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '16512', pageId: 'bayes_rule_multiple', userId: 'NateSoares', edit: '0', type: 'newRequirement', createdAt: '2016-07-10 22:14:38', auxPageId: 'bayes_rule_probability', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '16511', pageId: 'bayes_rule_multiple', userId: 'NateSoares', edit: '0', type: 'deleteRequirement', createdAt: '2016-07-10 22:14:30', auxPageId: 'bayes_probability_notation', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '15756', pageId: 'bayes_rule_multiple', userId: 'NateSoares', edit: '18', type: 'newEdit', createdAt: '2016-07-06 20:08:43', auxPageId: '', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '15631', pageId: 'bayes_rule_multiple', userId: 'NateSoares', edit: '17', type: 'newEdit', createdAt: '2016-07-06 07:01:29', auxPageId: '', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '14063', pageId: 'bayes_rule_multiple', userId: 'NateSoares', edit: '15', type: 'newEdit', createdAt: '2016-06-20 04:00:47', auxPageId: '', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '13952', pageId: 'bayes_rule_multiple', userId: 
'NateSoares', edit: '14', type: 'newEdit', createdAt: '2016-06-18 18:18:40', auxPageId: '', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '13950', pageId: 'bayes_rule_multiple', userId: 'NateSoares', edit: '13', type: 'newEdit', createdAt: '2016-06-18 18:16:14', auxPageId: '', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '13949', pageId: 'bayes_rule_multiple', userId: 'NateSoares', edit: '12', type: 'newEdit', createdAt: '2016-06-18 18:15:29', auxPageId: '', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '11968', pageId: 'bayes_rule_multiple', userId: 'EliezerYudkowsky', edit: '11', type: 'newEdit', createdAt: '2016-06-07 23:20:37', auxPageId: '', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '11967', pageId: 'bayes_rule_multiple', userId: 'EliezerYudkowsky', edit: '10', type: 'newEdit', createdAt: '2016-06-07 23:19:41', auxPageId: '', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '11966', pageId: 'bayes_rule_multiple', userId: 'EliezerYudkowsky', edit: '9', type: 'newEdit', createdAt: '2016-06-07 23:17:53', auxPageId: '', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '8226', pageId: 'bayes_rule_multiple', userId: 'PatrickLaVictoir', edit: 