{
  localUrl: '../page/bayes_science_virtues.html',
  arbitalUrl: 'https://arbital.com/p/bayes_science_virtues',
  rawJsonUrl: '../raw/220.json',
  likeableId: 'JudeRyan',
  likeableType: 'page',
  myLikeValue: '0',
  likeCount: '19',
  dislikeCount: '0',
  likeScore: '19',
  individualLikes: [
    'AlexeiAndreev',
    'MatthewGraves',
    'AndrewMcKnight',
    'EricBruylant',
    'JoelSutherland',
    'VladArber',
    'JaimeSevillaMolina',
    'IanPitchford',
    'NateSoares',
    'JoeZeng',
    'EricRogstad',
    'SzymonWilczyski',
    'SzymonSlawinski',
    'DonyChristie',
    'KatherineSavoie2',
    'benjaminwinegard',
    'YevgeniyGritsan',
    '99n',
    'EyalRoth'
  ],
  pageId: 'bayes_science_virtues',
  edit: '33',
  editSummary: '',
  prevEdit: '32',
  currentEdit: '33',
  wasPublished: 'true',
  type: 'wiki',
  title: 'Bayesian view of scientific virtues',
  clickbait: 'Why is it that science relies on bold, precise, and falsifiable predictions? Because of Bayes' rule, of course.',
  textLength: '28159',
  alias: 'bayes_science_virtues',
  externalUrl: '',
  sortChildrenBy: 'likes',
  hasVote: 'false',
  voteType: '',
  votesAnonymous: 'false',
  editCreatorId: 'EliezerYudkowsky',
  editCreatedAt: '2017-02-16 18:22:52',
  pageCreatorId: 'EliezerYudkowsky',
  pageCreatedAt: '2016-02-19 07:20:33',
  seeDomainId: '0',
  editDomainId: 'AlexeiAndreev',
  submitToDomainId: '0',
  isAutosave: 'false',
  isSnapshot: 'false',
  isLiveEdit: 'true',
  isMinorEdit: 'false',
  indirectTeacher: 'false',
  todoCount: '2',
  isEditorComment: 'false',
  isApprovedComment: 'true',
  isResolved: 'false',
  snapshotText: '',
  anchorContext: '',
  anchorText: '',
  anchorOffset: '0',
  mergedInto: '',
  isDeleted: 'false',
  viewCount: '5827',
text: '[summary:  The classic scientific virtues of falsifiability (saying what we should *not* expect to see if a hypothesis is true), making bold experimental predictions (that aren't predicted by other theories), and precision (making narrow predictions about quantitative measurements) can be seen in the light of [1lz Bayes' rule] as properties which allow theories to gain stronger evidence (greater [1rq likelihood ratios]) in their favor when their predictions are correct.  We can also interpret traditional ideas like falsificationism - acceptance being provisional, and rejection being more forceful - as features that arise from the probability theory of Bayes' rule.]\n\nA number of scientific virtues are explained intuitively by Bayes' rule, including:\n\n- __Falsifiability:__ A good scientist should say what they do *not* expect to see if a theory is true.\n- __Boldness:__ A good theory makes bold experimental predictions (that we wouldn't otherwise expect).\n- __Precision:__ A good theory makes precise experimental predictions (that turn out correct).\n- __Falsificationism:__ Acceptance of a scientific theory is always provisional; rejection of a scientific theory is pretty permanent.\n- __Experimentation:__ You find better theories by making observations, and then updating your beliefs.\n\n# Falsifiability\n\n"Falsifiability" means saying which events and observations should definitely *not* happen if your theory is true.\n\nThis was first popularized as a scientific virtue by Karl Popper, who wrote, in a famous critique of Freudian psychoanalysis:\n\n> Neither Freud nor Adler excludes any particular person’s acting in any particular way, whatever the outward circumstances. Whether a man sacrificed his life to rescue a drowning child (a case of sublimation) or whether he murdered the child by drowning him (a case of repression) could not possibly be predicted or excluded by Freud’s theory; *the theory was compatible with everything that could happen.*\n\nIn a Bayesian sense, we can see a hypothesis's falsifiability as a requirement for obtaining strong [1rq likelihood ratios] in favor of the hypothesis, compared to, e.g., the alternative hypothesis "I don't know."\n\nSuppose you're a very early researcher on gravitation, named Grek.  Your friend Thag is holding a rock in one hand, about to let it go.  You need to predict whether the rock will move downward to the ground, fly upward into the sky, or do something else.  That is, you must say how your theory $Grek$ assigns its probabilities over $up, down,$ and $other.$\n\nAs it happens, your friend Thag has his own theory $Thag$ which says "Rocks do what they want to do."  If Thag sees the rock go down, he'll explain this by saying the rock wanted to go down.  If Thag sees the rock go up, he'll say the rock wanted to go up.  Thag thinks that the Thag Theory of Gravitation is a very good one because it can explain any possible thing the rock is observed to do.  
This makes it superior compared to a theory that could *only* explain, say, the rock falling down.\n\nAs a Bayesian, however, you realize that since $up, down,$ and $other$ are [1rd mutually exclusive and exhaustive] possibilities, and *something* must happen when Thag lets go of the rock, the [1rj conditional probabilities] $\\mathbb P(\\cdot\\mid Thag)$ must sum to $\\mathbb P(up\\mid Thag) + \\mathbb P(down\\mid Thag) + \\mathbb P(other\\mid Thag) = 1.$\n\nIf Thag is "equally good at explaining" all three outcomes - if Thag's theory is equally compatible with all three events and produces equally clever explanations for each of them - then we might as well call this $1/3$ probability for each of $\\mathbb P(up\\mid Thag), \\mathbb P(down\\mid Thag),$ and $\\mathbb P(other\\mid Thag)$.  Note that Thag's theory is isomorphic, in a probabilistic sense, to saying "I don't know."\n\nBut now suppose Grek make falsifiable prediction!  Grek say, "Most things fall down!"\n\nThen Grek not have all probability mass distributed equally!  Grek put 95% of probability mass in $\\mathbb P(down\\mid Grek)!$  Only leave 5% probability divided equally over $\\mathbb P(up\\mid Grek)$ and $\\mathbb P(other\\mid Grek)$ in case rock behave like bird.\n\nThag say this bad idea.  If rock go up, Grek Theory of Gravitation disconfirmed by false prediction!  Compared to Thag Theory that predicts 1/3 chance of $up,$ will be likelihood ratio of 2.5% : 33% ~ 1 : 13 against Grek Theory!  Grek embarrassed!\n\nGrek say, she is confident rock *does* go down.  Things like bird are rare.  So Grek willing to stick out neck and face potential embarrassment.  Besides, is more important to learn about if Grek Theory is true than to save face.\n\nThag let go of rock.  Rock fall down.\n\nThis evidence has likelihood ratio of 0.95 : 0.33 ~ 3 : 1 favoring Grek Theory over Thag Theory.\n\n"How you get such big likelihood ratio?" Thag demand.  "Thag never get big likelihood ratio!"\n\nGrek explain is possible to obtain big likelihood ratio because Grek Theory stick out neck and take probability mass *away* from outcomes $up$ and $other,$ risking disconfirmation if that happen.  This free up lots of probability mass that Grek can put in outcome $down$ to make big likelihood ratio if $down$ happen.\n\nGrek Theory win because falsifiable and make correct prediction!  If falsifiable and make wrong prediction, Grek Theory lose, but this okay because Grek Theory not Grek.
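\n\nFor readers who like to check this arithmetic mechanically, here is a minimal sketch in Python.  The probabilities are the ones from the story, and the helper function `likelihood_ratio` is purely illustrative, not part of any library:\n\n    # Likelihood-ratio arithmetic from the Grek vs. Thag exchange.\n    # P(e | H) / P(e | alt) measures how strongly evidence e favors H over alt.\n    def likelihood_ratio(p_e_given_h, p_e_given_alt):\n        return p_e_given_h / p_e_given_alt\n\n    p_down_thag = 1 / 3   # Thag spreads his mass evenly over up, down, other\n    p_down_grek = 0.95    # Grek concentrates her mass on "down"\n    p_up_grek = 0.05 / 2  # the remaining 5% is split between up and other\n\n    print(likelihood_ratio(p_down_grek, p_down_thag))  # ~2.85, roughly 3 : 1 for Grek\n    print(likelihood_ratio(p_up_grek, 1 / 3))          # ~0.075, roughly 1 : 13 against Grek\n\nThe point of the sketch is just that a falsifiable distribution buys its large ratio on $down$ by accepting a damning ratio on $up.$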
"Should be allowed to say afterwards so nobody can tell!"\n\n## Formality\n\nThe rule of advance prediction is much more pragmatically important for informal theories than formal ones; and for these purposes, a theory is 'formal' when the theory's predictions are produced in a sufficiently mechanical and determined way that anyone can plug the theory into a computer and get the same answer for what probability the theory assigns.\n\nWhen Newton's Theory of Gravitation was proposed, it was considered not-yet-fully-proven because retrodictions such as the tides, elliptical planetary orbits, and Kepler's Laws, had all been observed *before* Newton proposed the theory.  Even so, a pragmatic Bayesian would have given Newton's theory a lot of credit for these retrodictions, because unlike, say, a psychological theory of human behavior, it was possible for *anyone* - not just Newton - to sit down with a pencil and derive exactly the same predictions from Newton's Laws.  This wouldn't completely eliminate the possibility that Newton's Theory had in some sense been overfitted to Kepler's Laws and the tides, and would then be incapable of further correct new predictions.  But it did mean that, as a formal theory, there could be less pragmatic worry that Thagton was just saying, "Oh, well, of course my theory of 'Planets go where they want' would predict elliptical orbits; elliptical orbits look nice."\n\n## Asking a theory's adherents what the theory says.\n\nThag picks up another rock.  "I say in advance that Grek Theory assign 0% probability to rock going down."  Thag drops the rock.  "Thag disprove Grek Theory!"\n\nGrek shakes her head.  "Should ask advocates of Grek Theory what Grek Theory predicts."  Grek picks up another rock.  "I say Grek Theory assign $\\mathbb P(down\\mid Grek) = 0.95$."\n\n"I say Grek Theory assign $\\mathbb P(down\\mid Grek) = 0$," counters Thag.\n\n"That not how science work," replies Grek.  "Thag should say what Thag's Theory says."\n\nThag thinks for a moment.  "Thag Theory says rock has 95% probability of going down."\n\n"What?" says Grek.  "Thag just copying Grek Theory!  Also, Thag not say that *before* seeing rocks go down!"\n\nThag smiles smugly.  "Only Thag get to say what Thag Theory predict, right?"\n\nAgain for pragmatic reasons, we should *first* ask the adherents of an informal theory to say what the theory predicts (a formal theory can simply be operated by anyone, and if this is not true, we will not call the theory 'formal').\n\nFurthermore, since you can find a fool following any cause, you should ask the smartest or most technically adept advocates of the theory.  If there's any dispute about who those are, ask separate representatives from the leading groups.  Fame is definitely not the key qualifier; you should ask Murray Gell-Mann and not Deepak Chopra about quantum mechanics, even if more people have heard of Deepak Chopra's beliefs about quantum mechanics than have heard about Murray Gell-Mann.  
If you really can't tell the difference, ask them both; don't ask only Chopra and then claim that Chopra gets to be *the* representative because he is most famous.\n\nThese types of courtesy rules would not be necessary if we were dealing with a sufficiently advanced Artificial Intelligence or ideal rational agent, but they make sense for human science, where people may be motivated to misconstrue another theory's probability assignments.\n\nThis informal rule has its limits, and there may be cases where it seems really obvious that a hypothesis's predictions ought not to be what the hypothesis's adherents claim, or that the theory's adherents are just stealing the predictions of a more successful theory.  But there ought to be a *large* (if defeasible) bias in favor of letting a theory's adherents say what that theory predicts.\n\n# Boldness\n\nA few minutes later, Grek is picking up another rock.  "$\\mathbb P(down\\mid Grek) = 0.95$," says Grek.\n\n"$\\mathbb P(down\\mid Thag) = 0.95$," says Thag.  "See, Thag assign high probability to outcomes observed.  Thag win yet?"\n\n"No," says Grek.  "[1rq Likelihood ratios] 1 : 1 all time now, even if we believe Thag.  Thag's theory not pick up advantage.  Thag need to make *bold* prediction other theories not make."\n\nThag frowns.  "Thag say... rock will turn blue when you let go this time?  $\\mathbb P(blue\\mid Thag) = 0.90$."\n\n"That very bold," Grek says.  "Grek Theory not say that (nor any other obvious 'common sense' or 'business as usual' theories).  Grek think that $\\mathbb P(blue\\mid \\neg Thag) < 0.01$ so Thag prediction definitely has virtue of boldness.  Will be big deal if Thag prediction come true."\n\n$\\dfrac{\\mathbb P(Thag\\mid blue)}{\\mathbb P(\\neg Thag\\mid blue)} > 90 \\cdot \\dfrac{\\mathbb P(Thag)}{\\mathbb P(\\neg Thag)}$\n\n"Thag win now?" Thag says.\n\nGrek lets go of the rock.  It falls down to the ground.  It does not turn blue.\n\n"Bold prediction not correct," Grek says.  "Thag's prediction virtuous, but not win.  Now Thag lose by 1 : 10 likelihood ratio instead.  Very science, much falsification."\n\n"Grek lure Thag into trap!" yells Thag.\n\n"Look," says Grek, "whole point is to set up science rules so *correct* theories can win.  If wrong theorists lose quickly by trying to be scientifically virtuous, is feature rather than bug.  But if Thag try to be good and lose, we shake hands and everyone still think well of Thag.  Is normative social ideal, anyway."
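\n\nThe odds form of Bayes' rule makes Grek's bookkeeping easy to check.  Here is a small sketch; the likelihoods come from the dialogue, while the prior odds of 1 : 100 against Thag are made up purely for concreteness:\n\n    # Odds-form Bayes update for Thag's bold "rock turns blue" prediction.\n    def update_odds(prior_odds, p_e_given_h, p_e_given_alt):\n        return prior_odds * (p_e_given_h / p_e_given_alt)\n\n    prior = 1 / 100       # P(Thag) : P(not Thag), a hypothetical prior\n    p_blue_thag = 0.90    # Thag's bold prediction\n    p_blue_other = 0.01   # Grek's estimate under all other theories\n\n    # If the rock HAD turned blue, the posterior odds would jump 90-fold:\n    print(update_odds(prior, p_blue_thag, p_blue_other))          # 0.9\n    # The rock did not turn blue, so the odds fall by about 1 : 10 instead:\n    print(update_odds(prior, 1 - p_blue_thag, 1 - p_blue_other))  # ~0.001\n\nEither way the update is mechanical; boldness only determines how much is at stake.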
\n\n# Precision\n\nAt a somewhat later stage in the development of gravitational theory, the Aristotelian synthesis of Grek and Thag's theories, "Most things have a final destination of being at the center of the Earth, and try to approach that final destination," comes up against Galileo Galilei's "Most unsupported objects accelerate downwards, and each second of passing time the object gains another 9.8 meters per second of downward speed; don't ask me why, I'm just observing it."\n\n"You're not just predicting that rocks are observed to move downward when dropped, are you?" says Aristotle.  "Because I'm already predicting that."\n\n"What we're going to do next," says Galileo, "is predict *how long* it will take a bowling ball to fall from the Leaning Tower of Pisa."  Galileo takes out a pocket stopwatch.  "When my friend lets go of the ball, you hit the 'start' button, and as soon as the ball hits the ground, you hit the 'stop' button.  We're going to observe exactly what number appears on the watch."\n\nAfter some further calibration to determine that Aristotle has a pretty consistent reaction time for pressing the stopwatch button if Galileo snaps his fingers, Aristotle looks up at the bowling ball being held from the Leaning Tower of Pisa.\n\n"I think it'll take anywhere between 0 and 5 seconds inclusive," Aristotle says.  "Not sure beyond that."\n\n"Okay," says Galileo.  "I measured this tower to be 45 meters tall.  Now, if air resistance is 0, after 3 seconds the ball should be moving downward at a speed of 9.8 * 3 = 29.4 meters per second.  That speed increases continuously over the 3 seconds, so the ball's *average* speed will have been 29.4 / 2 = 14.7 meters per second.  And if the ball moves at an average speed of 14.7 meters per second, for 3 seconds, it will travel downward 44.1 meters.  So the ball should take just a little more than 3 seconds to fall 45 meters.  Like, an additional 1/33rd of a second or so."\n\n![tower cartoon](https://i.imgur.com/kUeMG6w.png?0)\n\n"Hm," says Aristotle.  "This pocketwatch only measures whole seconds, so your theory puts all its probability mass on 3, right?"\n\n"Not literally *all* its probability mass," Galileo says.  "It takes you some time to press the stopwatch button once you see the ball start to fall, but it also takes you some time to press the button after you see the ball hit the ground.  Those two sources of measurement error should *mostly* cancel out, but maybe they'll happen not to on this particular occasion.  We don't have all *that* precise or well-tested an experimental setup here.  Like, if the stopwatch breaks and we observe a 0, then that will be a *defeat* for Galilean gravity, but it wouldn't imply a *final* refutation - we could get another watch and make better predictions and make up for the defeat."\n\n"Okay, so what probabilities do you assign?" says Aristotle.  "I think my own theory is about equally good at explaining any falling time between 0 and 5 seconds."\n\n![aristotle probability](https://i.imgur.com/LQNDTki.png?0)\n\nGalileo ponders.  "Since we haven't tested this setup yet... I think I'll put something like 90% of my probability mass on a falling time between 3 and 4 seconds, which corresponds to an observable result of the watch showing '3'.  Maybe I'll put another 5% probability on air resistance having a bigger effect than I think it should over this distance, so between 4 and 5 seconds or an observable of '4'.  Another 4% probability on this watch being slower than I thought, so 4% for a *measured* time between 2 and 3 and an observation of '2'.  0.99% probability on the stopwatch picking this time to break and show '1' (or '0', but that we both agree shouldn't happen), and 0.01% probability on an observation of '5', which basically shouldn't happen for any reason I can think of."\n\n![galileo probability](https://i.imgur.com/4CoLkuW.png?0)\n\n"Well," says Aristotle, "your theory certainly has the scientific virtue of *precision,* in that, by concentrating most of its probability density on a narrow region of possible precise observations, it will gain a great likelihood advantage over *vaguer* theories like mine, which roughly say that 'things fall down' and have made 'successful predictions' each time things fall down, but which don't predict exactly how long they should take to fall.  If your prediction of '3' comes true, that'll be a 0.9 : 0.2 or 4.5 : 1 likelihood ratio favoring Galilean over Aristotelian gravity."
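\n\nGalileo's back-of-the-envelope numbers are easy to reproduce.  A minimal sketch, assuming the constant-acceleration model $d = \\frac{1}{2} g t^2$ and reading the probability table straight out of the dialogue:\n\n    import math\n\n    # Galileo's point prediction: time to fall h meters at constant acceleration g.\n    g, h = 9.8, 45.0\n    t = math.sqrt(2 * h / g)\n    print(t)  # ~3.03 seconds, so the watch should read '3'\n\n    # The two predictive distributions over the watch reading, as stated above.\n    # A reading of s means a falling time in [s, s+1); the '1' entry lumps in\n    # the watch-breaking case that the dialogue mentions.\n    p_galileo = {'3': 0.90, '4': 0.05, '2': 0.04, '1': 0.0099, '5': 0.0001}\n    p_aristotle = {str(s): 1 / 5 for s in range(5)}  # uniform over 0 to 5 seconds\n\n    # Likelihood ratio if the watch shows '3':\n    print(p_galileo['3'] / p_aristotle['3'])  # 4.5, the 4.5 : 1 ratio in the text\n\nThe uniform 1/5 per whole-second reading is what gives Aristotle the 0.2 in that ratio.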
\n\n"Yes," says Galileo.  "Of course, it's not enough for the prediction to be precise, it also has to be correct.  If the watch shows '4' instead, that'll be a likelihood ratio of 0.05 : 0.20 or 1 : 4 against my theory.  It's better to be vague and right than to be precise and wrong."\n\nAristotle nods.  "Well, let's test it, then."\n\nThe bowling ball is dropped.\n\nThe stopwatch shows 3 seconds.\n\n"So do you believe my theory yet?" says Galileo.\n\n"Well, I believe it somewhere in the range of four and a half times as much as I did previously," says Aristotle.  "But that part where you're plugging in numbers like 9.8 and calculations like the *square* of the time strike me as kinda complicated.  Like, if I'm allowed to plug in numbers that precise, and do things like square them, there must be hundreds of different theories I could make which would be that complicated.  By the [11w quantitative form of Occam's Razor], we need to [ penalize the prior probability] of your theory for its [5v algorithmic complexity].  One observation with a likelihood ratio of 4.5 : 1 isn't enough to support all that complexity.  I'm not going to believe something that complicated because I see a stopwatch showing '3' just that one time!  I need to see more objects dropped from various different heights and verify that the times are what you say they should be.  If I say the prior complexity of your theory is, say, [1zh 20 bits], then 9 more observations like this would do it.  Of course, I expect you've already made more observations than that in private, but it only becomes part of the public knowledge of humankind after someone replicates it."\n\n"But of course," says Galileo.  "I'd like to check your experimental setup and especially your calculations the first few times you try it, to make sure you're not measuring in feet instead of meters, or forgetting to halve the final speed to get the average speed, and so on.  It's a formal theory, but in practice I want to check to make sure you're not making a mistake in calculating it."\n\n"Naturally," says Aristotle.  "Wow, it sure is a good thing that we're both Bayesians and we both know the governing laws of probability theory and how they motivate the informal social procedures we're following, huh?""\n\n"Yes indeed," says Galileo.  "Otherwise we might have gotten into a heated argument that could have lasted for hours."\n\n# Falsificationism\n\nOne of the reasons why Karl Popper was so enamored of "falsification" was the observation that falsification, in science, is more definite and final than confirmation.  A classic parable along these lines is Newtonian gravitation versus General Relativity (Einsteinian gravitation) - despite the tons and tons of experimental evidence for Newton's theory that had accumulated up to the 19th century, there was no sense in which Newtonian gravity had been *finally* verified, and in the end it was finally discarded in favor of Einsteinian gravity.  Now that Newton's gravity has been tossed on the trash-heap, though, there's no realistic probability of it *ever* coming back; the discard, unlike the adoption, is final.\n\nWorking in the days before Bayes became widely known, Popper put a *logical* interpretation on this setup.  Suppose $H \\rightarrow E,$ hypothesis H logically implies that evidence E will be observed.  If instead we observe $\\neg E$ we can conclude $\\neg H$ by the law of the contrapositive.  
\n\n# Falsificationism\n\nOne of the reasons why Karl Popper was so enamored of "falsification" was the observation that falsification, in science, is more definite and final than confirmation.  A classic parable along these lines is Newtonian gravitation versus General Relativity (Einsteinian gravitation) - despite the tons and tons of experimental evidence for Newton's theory that had accumulated up to the 19th century, there was no sense in which Newtonian gravity had been *finally* verified, and in the end it was discarded in favor of Einsteinian gravity.  Now that Newton's gravity has been tossed on the trash-heap, though, there's no realistic probability of it *ever* coming back; the discard, unlike the adoption, is final.\n\nWorking in the days before Bayes became widely known, Popper put a *logical* interpretation on this setup.  Suppose $H \\rightarrow E,$ that is, hypothesis $H$ logically implies that evidence $E$ will be observed.  If instead we observe $\\neg E,$ we can conclude $\\neg H$ by the law of the contrapositive.  On the other hand, if we observe $E,$ we can't logically conclude $H.$  So we can logically falsify a theory, but not logically verify it.\n\nPragmatically, this often isn't how science works.\n\nIn the nineteenth century, anomalies were accumulating in observations of Uranus's orbit.  After taking into account all known influences from all other planets, Uranus still was not *exactly* where Newton's theory said it should be.  On the logical-falsification view, since Newtonian gravitation said that Uranus ought to be in a certain precise place and Uranus was not there, we ought to have become [ infinitely certain] that Newton's theory was false.  Several theorists did suggest that Newton's theory might have a small error term, and so be false in its original form.\n\nThe actual outcome was that Urbain Le Verrier and John Couch Adams independently suspected that the anomaly in Uranus's orbit could be accounted for by a previously unobserved eighth planet.  And, rather than *vaguely* say that this was their hypothesis, in a way that would just spread around the probability mass for Uranus's location and cause Newtonian mechanics to be not *too* falsified, Le Verrier and Adams independently went on to calculate where the eighth planet ought to be.  In 1846, Johann Galle observed Neptune, based on Le Verrier's calculations - a tremendous triumph for Newtonian mechanics.\n\nIn 1859, Urbain Le Verrier recognized another problem: Mercury was not exactly where it should be.  While Newtonian gravity did predict that Mercury's orbit should precess (the point of closest approach to the Sun should itself slowly rotate around the Sun), Mercury was precessing by 38 arc-seconds per century more than it ought to be (a figure later revised to 43).  This anomaly was harder to explain; Le Verrier thought there was [a tiny planetoid orbiting the Sun inside the orbit of Mercury](https://en.wikipedia.org/wiki/Vulcan_(hypothetical_planet)).\n\nA bit more than half a century later, Einstein, working on the equations for General Relativity, realized that Mercury's anomalous precession was exactly explained by the equations in their simplest and most elegant form.\n\nAnd that was the end of Newtonian gravitation, permanently.\n\nIf we try to take Popper's logical view of things, there's no obvious difference between the anomaly with Uranus and the anomaly with Mercury.  In both cases, the straightforward Newtonian prediction seemed to be falsified.  If Newtonian gravitation could bounce back from one logical disconfirmation, why not the other?\n\nFrom a Bayesian standpoint, we can see the difference as follows:\n\nIn the case of Uranus, there was no attractive alternative to Newtonian mechanics that was making better predictions.  The current theory seemed to be [227 strictly confused] about Uranus, in the sense that the current Newtonian model was making confident predictions about Uranus that were much wronger than the theory expected to be on average.  This meant that there ought to be *some* better alternative.  It didn't say that the alternative had to be a non-Newtonian one.  The low $\\mathbb P(UranusLocation\\mid currentNewton)$ created a potential for some modification of the current model to make a better prediction with higher $\\mathbb P(UranusLocation\\mid newModel)$, but it didn't say what had to change in the new model.\n\nEven after Neptune was observed, though, this wasn't a *final* confirmation of Newtonian mechanics.  
While the new model assigned very high $\\mathbb P(UranusLocation\\mid Neptune \\wedge Newton),$ there could, for all anyone knew, be some unknown Other theory that would assign equally high $\\mathbb P(UranusLocation\\mid Neptune \\wedge Other).$  In this case, Newton's theory would have no likelihood advantage over this unknown Other, so we could not say that Newton's theory of gravity had been confirmed over *every other possible* theory.\n\nIn the case of Mercury, when Einstein's formal theory came along and assigned much higher $\\mathbb P(MercuryLocation\\mid Einstein)$ compared to $\\mathbb P(MercuryLocation\\mid Newton),$ this created a huge likelihood ratio for Einstein over Newton and drove the probability of Newton's theory very low.  Even if someday some other theory turns out to be better than Einstein's - doing equally well at $\\mathbb P(MercuryLocation\\mid Other)$ and getting even better $\\mathbb P(newObservation\\mid Other)$ - the fact that Einstein's theory did do much better than Newton's on Mercury tells us that it's *possible* for simple theories to do much better on Mercury, in a simple way that's definitely not Newtonian.  So whatever Other theory comes along will also do better on Mercury than $\\mathbb P(MercuryLocation\\mid Newton)$ in a non-Newtonian fashion, and Newton will just be at a new, huge likelihood disadvantage against this Other theory.\n\nSo - from a Bayesian standpoint - after explaining Mercury's orbital precession, we can't be sure Einstein's gravitation is correct, but we *can* be sure that Newton's gravitation is wrong.\n\nBut this doesn't reflect a logical difference between falsification and verification - everything takes place inside a world of probabilities.
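\n\nTo make the asymmetry concrete, here is a toy posterior calculation.  Every number in it is invented for illustration - only the structure matters: once *any* theory beats Newton on Mercury by a large factor, Newton's posterior stays negligible no matter how the rest of the probability is divided among its rivals.\n\n    # Toy posterior over {Newton, Einstein, Other} given Mercury's precession.\n    # All priors and likelihoods are invented; nothing here is historical data.\n    priors = {'Newton': 0.90, 'Einstein': 0.05, 'Other': 0.05}\n    # Newton badly mispredicts Mercury; Einstein nails it; the unknown Other\n    # is assumed, hypothetically, to do exactly as well as Einstein.\n    likelihoods = {'Newton': 1e-6, 'Einstein': 1e-2, 'Other': 1e-2}\n\n    joint = {h: priors[h] * likelihoods[h] for h in priors}\n    total = sum(joint.values())\n    posterior = {h: joint[h] / total for h in joint}\n    print(posterior)  # Newton ~0.001; the remaining ~0.999 splits between the rivals\n\nEinstein cannot be *finally* confirmed here, because the hypothetical Other matches him; but Newton is finished either way.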
\n\n## Possibility of permanent confirmation\n\nIt's worth noting that although Newton's theory of gravitation was false, something *very much like it* was true.  So while the belief "Planets move *exactly* like Newton says" could only be provisionally accepted and was eventually overturned, the belief "All the kinds of planets we've seen so far, in the kinds of situations we've seen so far, move *pretty much* like Newtonian gravity says" was much more strongly confirmed.\n\nThis implies that, contra Popper's rejection of the very notion of confirmation, *some* theories can be finally confirmed, beyond all reasonable doubt.  E.g., the DNA theory of biological reproduction.  No matter what we wonder about quarks, there's no plausible way we could be wrong about the existence of molecules, or about there being a double helix molecule that encodes genetic information.  It's reasonable to say that the theory of DNA has been forever confirmed beyond a reasonable doubt, and will never go on the trash-heap of science no matter what new observations may come.\n\nThis is possible because DNA is a non-fundamental theory, given in terms like "molecules" and "atoms" rather than quarks.  Even if quarks aren't exactly what we think, there will be something enough *like* quarks to underlie the objects we call protons and neutrons and the existence of atoms and molecules above that, which means the objects we call DNA will still be there in the new theory.  In other words, the biological theory of DNA has a "something *sort of like this* must be true" theory *underneath* it.  The hypothesis that what Joseph Black called 'fixed air' and we call 'carbon dioxide' is in fact made up of one carbon atom and two oxygen atoms has been permanently confirmed in a way that Newtonian gravity was not.\n\nThere's some amount of observation which would convince us that all science was a lie and there were fairies in the garden, but short of that, carbon dioxide is here to stay.\n\nNonetheless, in ordinary science, when we're trying to figure out controversies, working from Bayes' rule implies that a virtuous scientist should think like Karl Popper suggested:\n\n- Treat disconfirmation as stronger than confirmation;\n- Only provisionally accept hypotheses that have a lot of favorable-seeming evidence;\n- Have some amount of disconfirming evidence and prediction-failures that makes you permanently put a hypothesis on the trash-heap and give up hope on its resurrection;\n- Require a qualitatively *more* powerful kind of evidence than that, with direct observation of the phenomenon's parts and processes in detail, before you start thinking of a theory as 'confirmed'.\n\n# Experimentation\n\nWhatever the likelihood for $\\mathbb P(observation\\mid hypothesis)$, it doesn't change your beliefs unless you actually execute the experiment, learn whether $observation$ or $\\neg observation$ is true, and [1y6 condition your beliefs in order to update your probabilities].\n\nIn this sense, Bayes' rule can also be said to motivate the experimental method - though you don't necessarily need a lot of math to realize that drawing an accurate map of a city requires looking at the city.  Still, since the experimental method wasn't quite obvious for a lot of human history, it could maybe use all the support it can get - including the central Bayesian idea of "Make observations to update your beliefs." \n',
  metaText: '',
  isTextLoaded: 'true',
  isSubscribedToDiscussion: 'false',
  isSubscribedToUser: 'false',
  isSubscribedAsMaintainer: 'false',
  discussionSubscriberCount: '5',
  maintainerCount: '1',
  userSubscriberCount: '0',
  lastVisit: '',
  hasDraft: 'false',
  votes: [],
  voteSummary: [
    '0',
    '0',
    '0',
    '0',
    '0',
    '0',
    '0',
    '0',
    '0',
    '0'
  ],
  muVoteSummary: '0',
  voteScaling: '0',
  currentUserVote: '-2',
  voteCount: '0',
  lockedVoteType: '',
  maxEditEver: '0',
  redLinkCount: '0',
  lockedBy: '',
  lockedUntil: '',
  nextPageId: '',
  prevPageId: '',
  usedAsMastery: 'true',
  proposalEditNum: '36',
  permissions: {
    edit: {
      has: 'false',
      reason: 'You don't have domain permission to edit this page'
    },
    proposeEdit: {
      has: 'true',
      reason: ''
    },
    delete: {
      has: 'false',
      reason: 'You don't have domain permission to delete this page'
    },
    comment: {
      has: 'false',
      reason: 'You can't comment in this domain because you are not a member'
    },
    proposeComment: {
      has: 'true',
      reason: ''
    }
  },
  summaries: {
    Summary: 'The classic scientific virtues of falsifiability (saying what we should *not* expect to see if a hypothesis is true), making bold experimental predictions (that aren't predicted by other theories), and precision (making narrow predictions about quantitative measurements) can be seen in the light of [1lz Bayes' rule] as properties which allow theories to gain stronger evidence (greater [1rq likelihood ratios]) in their favor when their predictions are correct.  We can also interpret traditional ideas like falsificationism - acceptance being provisional, and rejection being more forceful - as features that arise from the probability theory of Bayes' rule.'
  },
  creatorIds: [
    'EliezerYudkowsky',
    'NateSoares',
    'DewiMorgan',
    'EricBruylant',
    'PatrickLaVictoir',
    'EricRogstad',
    'OttoMossberg',
    'AdomHartell'
  ],
  childIds: [],
  parentIds: [
    'bayes_update'
  ],
  commentIds: [
    '413',
    '6ch',
    '8b3',
    '987',
    '988',
    '991'
  ],
  questionIds: [],
  tagIds: [
    'b_class_meta_tag'
  ],
  relatedIds: [],
  markIds: [],
  explanations: [
    {
      id: '2447',
      parentId: 'bayes_science_virtues',
      childId: 'bayes_science_virtues',
      type: 'subject',
      creatorId: 'AlexeiAndreev',
      createdAt: '2016-06-17 21:58:56',
      level: '2',
      isStrong: 'true',
      everPublished: 'true'
    }
  ],
  learnMore: [],
  requirements: [
    {
      id: '2273',
      parentId: 'math1',
      childId: 'bayes_science_virtues',
      type: 'requirement',
      creatorId: 'AlexeiAndreev',
      createdAt: '2016-06-17 21:58:56',
      level: '2',
      isStrong: 'true',
      everPublished: 'true'
    },
    {
      id: '5167',
      parentId: 'conditional_probability',
      childId: 'bayes_science_virtues',
      type: 'requirement',
      creatorId: 'NateSoares',
      createdAt: '2016-07-10 22:20:55',
      level: '2',
      isStrong: 'true',
      everPublished: 'true'
    },
    {
      id: '5169',
      parentId: 'bayes_log_odds',
      childId: 'bayes_science_virtues',
      type: 'requirement',
      creatorId: 'NateSoares',
      createdAt: '2016-07-10 22:21:30',
      level: '2',
      isStrong: 'false',
      everPublished: 'true'
    },
    {
      id: '5170',
      parentId: 'bayes_rule',
      childId: 'bayes_science_virtues',
      type: 'requirement',
      creatorId: 'NateSoares',
      createdAt: '2016-07-10 22:21:38',
      level: '2',
      isStrong: 'true',
      everPublished: 'true'
    }
  ],
  subjects: [
    {
      id: '2447',
      parentId: 'bayes_science_virtues',
      childId: 'bayes_science_virtues',
      type: 'subject',
      creatorId: 'AlexeiAndreev',
      createdAt: '2016-06-17 21:58:56',
      level: '2',
      isStrong: 'true',
      everPublished: 'true'
    },
    {
      id: '5643',
      parentId: 'bayes_rule',
      childId: 'bayes_science_virtues',
      type: 'subject',
      creatorId: 'AlexeiAndreev',
      createdAt: '2016-07-26 17:13:47',
      level: '2',
      isStrong: 'false',
      everPublished: 'true'
    }
  ],
  lenses: [],
  lensParentId: '',
  pathPages: [],
  learnMoreTaughtMap: {},
  learnMoreCoveredMap: {
    '1lz': [
      '1xr',
      '1yc',
      '1zh',
      '1zm',
      '552',
      '56j',
      '6cj'
    ]
  },
  learnMoreRequiredMap: {
    '220': [
      '25y'
    ]
  },
  editHistory: {},
  domainSubmissions: {},
  answers: [],
  answerCount: '0',
  commentCount: '0',
  newCommentCount: '0',
  linkedMarkCount: '0',
  changeLogs: [
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '23087',
      pageId: 'bayes_science_virtues',
      userId: 'DewiMorgan',
      edit: '36',
      type: 'newEditProposal',
      createdAt: '2018-09-25 17:56:53',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '23086',
      pageId: 'bayes_science_virtues',
      userId: 'DewiMorgan',
      edit: '35',
      type: 'newEditProposal',
      createdAt: '2018-09-25 17:54:55',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '22034',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '33',
      type: 'newEdit',
      createdAt: '2017-02-16 18:22:52',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '20061',
      pageId: 'bayes_science_virtues',
      userId: 'NateSoares',
      edit: '32',
      type: 'newEdit',
      createdAt: '2016-10-11 17:57:00',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '20029',
      pageId: 'bayes_science_virtues',
      userId: 'AdomHartell',
      edit: '31',
      type: 'newEdit',
      createdAt: '2016-10-10 22:57:05',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '3477',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '2',
      dislikeCount: '0',
      likeScore: '2',
      individualLikes: [],
      id: '19413',
      pageId: 'bayes_science_virtues',
      userId: 'OttoMossberg',
      edit: '30',
      type: 'newEdit',
      createdAt: '2016-08-30 17:15:54',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '18254',
      pageId: 'bayes_science_virtues',
      userId: 'EricBruylant',
      edit: '0',
      type: 'newTag',
      createdAt: '2016-08-03 17:54:22',
      auxPageId: 'b_class_meta_tag',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '17546',
      pageId: 'bayes_science_virtues',
      userId: 'AlexeiAndreev',
      edit: '0',
      type: 'newSubject',
      createdAt: '2016-07-26 17:13:47',
      auxPageId: 'bayes_rule',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '16933',
      pageId: 'bayes_science_virtues',
      userId: 'AlexeiAndreev',
      edit: '0',
      type: 'deleteSubject',
      createdAt: '2016-07-16 20:49:05',
      auxPageId: 'bayes_rule',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '16863',
      pageId: 'bayes_science_virtues',
      userId: 'AlexeiAndreev',
      edit: '0',
      type: 'newSubject',
      createdAt: '2016-07-16 16:17:35',
      auxPageId: 'bayes_rule',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '16525',
      pageId: 'bayes_science_virtues',
      userId: 'NateSoares',
      edit: '0',
      type: 'newRequirement',
      createdAt: '2016-07-10 22:21:38',
      auxPageId: 'bayes_rule',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '16524',
      pageId: 'bayes_science_virtues',
      userId: 'NateSoares',
      edit: '0',
      type: 'deleteRequirement',
      createdAt: '2016-07-10 22:21:33',
      auxPageId: 'bayes_rule_guide',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '16522',
      pageId: 'bayes_science_virtues',
      userId: 'NateSoares',
      edit: '0',
      type: 'newRequirement',
      createdAt: '2016-07-10 22:21:30',
      auxPageId: 'bayes_log_odds',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '16521',
      pageId: 'bayes_science_virtues',
      userId: 'NateSoares',
      edit: '0',
      type: 'newRequirement',
      createdAt: '2016-07-10 22:21:16',
      auxPageId: 'bayes_rule_guide',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '16520',
      pageId: 'bayes_science_virtues',
      userId: 'NateSoares',
      edit: '0',
      type: 'deleteRequirement',
      createdAt: '2016-07-10 22:21:11',
      auxPageId: 'bayes_rule_odds',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '16518',
      pageId: 'bayes_science_virtues',
      userId: 'NateSoares',
      edit: '0',
      type: 'newRequirement',
      createdAt: '2016-07-10 22:20:56',
      auxPageId: 'conditional_probability',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '16517',
      pageId: 'bayes_science_virtues',
      userId: 'NateSoares',
      edit: '0',
      type: 'deleteRequirement',
      createdAt: '2016-07-10 22:20:46',
      auxPageId: 'bayes_probability_notation',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '16461',
      pageId: 'bayes_science_virtues',
      userId: 'NateSoares',
      edit: '0',
      type: 'deleteRequiredBy',
      createdAt: '2016-07-10 21:55:21',
      auxPageId: 'bayes_update_details',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '16228',
      pageId: 'bayes_science_virtues',
      userId: 'NateSoares',
      edit: '29',
      type: 'newEdit',
      createdAt: '2016-07-08 15:57:47',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '14346',
      pageId: 'bayes_science_virtues',
      userId: 'NateSoares',
      edit: '28',
      type: 'newEdit',
      createdAt: '2016-06-22 05:34:29',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '11814',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '27',
      type: 'newEdit',
      createdAt: '2016-06-05 01:13:30',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '11811',
      pageId: 'bayes_science_virtues',
      userId: 'EricRogstad',
      edit: '26',
      type: 'newEdit',
      createdAt: '2016-06-05 00:32:39',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '8248',
      pageId: 'bayes_science_virtues',
      userId: 'PatrickLaVictoir',
      edit: '25',
      type: 'newEdit',
      createdAt: '2016-03-03 23:53:27',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '8195',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '24',
      type: 'newTeacher',
      createdAt: '2016-03-03 20:28:05',
      auxPageId: 'bayes_science_virtues',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '8196',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '24',
      type: 'newSubject',
      createdAt: '2016-03-03 20:28:05',
      auxPageId: 'bayes_science_virtues',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '8173',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '24',
      type: 'newRequiredBy',
      createdAt: '2016-03-03 19:30:12',
      auxPageId: 'bayes_update_details',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '8009',
      pageId: 'bayes_science_virtues',
      userId: 'EricBruylant',
      edit: '24',
      type: 'newEdit',
      createdAt: '2016-02-28 14:51:54',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '7703',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '23',
      type: 'newEdit',
      createdAt: '2016-02-23 02:17:34',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '7674',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '22',
      type: 'newEdit',
      createdAt: '2016-02-23 00:37:42',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '7673',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '21',
      type: 'newEdit',
      createdAt: '2016-02-23 00:29:52',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '7672',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '20',
      type: 'newEdit',
      createdAt: '2016-02-23 00:29:08',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '7515',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '19',
      type: 'newEdit',
      createdAt: '2016-02-21 05:12:17',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '7514',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '18',
      type: 'newEdit',
      createdAt: '2016-02-21 05:10:32',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '7513',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '17',
      type: 'newEdit',
      createdAt: '2016-02-21 05:10:00',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '7512',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '16',
      type: 'newEdit',
      createdAt: '2016-02-21 05:08:52',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '7511',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '15',
      type: 'newEdit',
      createdAt: '2016-02-21 05:07:04',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '7510',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '14',
      type: 'newEdit',
      createdAt: '2016-02-21 05:06:32',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '7509',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '13',
      type: 'newEdit',
      createdAt: '2016-02-21 05:05:57',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '7508',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '12',
      type: 'newEdit',
      createdAt: '2016-02-21 05:05:23',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '7505',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '11',
      type: 'newEdit',
      createdAt: '2016-02-21 04:55:55',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '7494',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '10',
      type: 'newEdit',
      createdAt: '2016-02-21 02:15:30',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '7493',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '9',
      type: 'newEdit',
      createdAt: '2016-02-21 02:13:36',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '7471',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '8',
      type: 'newEdit',
      createdAt: '2016-02-21 02:02:00',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '7467',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '7',
      type: 'newEdit',
      createdAt: '2016-02-21 01:40:36',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '7466',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '6',
      type: 'newEdit',
      createdAt: '2016-02-21 01:37:12',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '7465',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '5',
      type: 'newEdit',
      createdAt: '2016-02-21 01:21:41',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '7464',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '4',
      type: 'newEdit',
      createdAt: '2016-02-21 01:18:50',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '7463',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '3',
      type: 'newEdit',
      createdAt: '2016-02-20 22:15:30',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '7462',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'deleteTag',
      createdAt: '2016-02-20 22:15:27',
      auxPageId: 'work_in_progress_meta_tag',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '7460',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '2',
      type: 'newEdit',
      createdAt: '2016-02-20 21:28:39',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '7450',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '1',
      type: 'newTag',
      createdAt: '2016-02-20 04:28:43',
      auxPageId: 'work_in_progress_meta_tag',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '7440',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '1',
      type: 'newEdit',
      createdAt: '2016-02-19 07:20:33',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '7439',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newRequirement',
      createdAt: '2016-02-19 06:56:50',
      auxPageId: 'bayes_probability_notation',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '7437',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newRequirement',
      createdAt: '2016-02-19 06:56:13',
      auxPageId: 'bayes_rule_odds',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '7435',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newRequirement',
      createdAt: '2016-02-19 06:56:08',
      auxPageId: 'math1',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '7431',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'deleteTag',
      createdAt: '2016-02-19 06:55:57',
      auxPageId: 'bayes_rule_odds',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '7433',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'deleteTag',
      createdAt: '2016-02-19 06:55:57',
      auxPageId: 'bayes_probability_notation',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '7429',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'deleteTag',
      createdAt: '2016-02-19 06:55:56',
      auxPageId: 'math2',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '7427',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newTag',
      createdAt: '2016-02-19 06:55:43',
      auxPageId: 'bayes_probability_notation',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '7425',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newTag',
      createdAt: '2016-02-19 06:55:35',
      auxPageId: 'bayes_rule_odds',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '7423',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newTag',
      createdAt: '2016-02-19 06:55:26',
      auxPageId: 'math2',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '7421',
      pageId: 'bayes_science_virtues',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newParent',
      createdAt: '2016-02-19 06:49:01',
      auxPageId: 'bayes_update',
      oldSettingsValue: '',
      newSettingsValue: ''
    }
  ],
  feedSubmissions: [],
  searchStrings: {},
  hasChildren: 'false',
  hasParents: 'true',
  redAliases: {},
  improvementTagIds: [],
  nonMetaTagIds: [],
  todos: [],
  slowDownMap: 'null',
  speedUpMap: 'null',
  arcPageIds: 'null',
  contentRequests: {
    fewerWords: {
      likeableId: '3715',
      likeableType: 'contentRequest',
      myLikeValue: '0',
      likeCount: '1',
      dislikeCount: '0',
      likeScore: '1',
      individualLikes: [],
      id: '161',
      pageId: 'bayes_science_virtues',
      requestType: 'fewerWords',
      createdAt: '2016-11-22 18:18:01'
    },
    lessTechnical: {
      likeableId: '4029',
      likeableType: 'contentRequest',
      myLikeValue: '0',
      likeCount: '2',
      dislikeCount: '0',
      likeScore: '2',
      individualLikes: [],
      id: '182',
      pageId: 'bayes_science_virtues',
      requestType: 'lessTechnical',
      createdAt: '2017-04-09 14:56:01'
    },
    moreWords: {
      likeableId: '4019',
      likeableType: 'contentRequest',
      myLikeValue: '0',
      likeCount: '2',
      dislikeCount: '0',
      likeScore: '2',
      individualLikes: [],
      id: '178',
      pageId: 'bayes_science_virtues',
      requestType: 'moreWords',
      createdAt: '2017-03-24 16:50:00'
    }
  }
}