{
  localUrl: '../page/extraordinary_claims.html',
  arbitalUrl: 'https://arbital.com/p/extraordinary_claims',
  rawJsonUrl: '../raw/26y.json',
  likeableId: '1153',
  likeableType: 'page',
  myLikeValue: '0',
  likeCount: '4',
  dislikeCount: '0',
  likeScore: '4',
  individualLikes: [
    'EricBruylant',
    'AlexRay',
    'EricRogstad',
    'SzymonWilczyski'
  ],
  pageId: 'extraordinary_claims',
  edit: '2',
  editSummary: '',
  prevEdit: '1',
  currentEdit: '2',
  wasPublished: 'true',
  type: 'wiki',
  title: 'Extraordinary claims',
  clickbait: 'What makes something an 'extraordinary claim' that requires extraordinary evidence?',
  textLength: '9465',
  alias: 'extraordinary_claims',
  externalUrl: '',
  sortChildrenBy: 'likes',
  hasVote: 'false',
  voteType: '',
  votesAnonymous: 'false',
  editCreatorId: 'EliezerYudkowsky',
  editCreatedAt: '2016-03-04 06:16:35',
  pageCreatorId: 'EliezerYudkowsky',
  pageCreatedAt: '2016-03-04 05:55:57',
  seeDomainId: '0',
  editDomainId: 'AlexeiAndreev',
  submitToDomainId: '0',
  isAutosave: 'false',
  isSnapshot: 'false',
  isLiveEdit: 'true',
  isMinorEdit: 'false',
  indirectTeacher: 'false',
  todoCount: '0',
  isEditorComment: 'false',
  isApprovedComment: 'true',
  isResolved: 'false',
  snapshotText: '',
  anchorContext: '',
  anchorText: '',
  anchorOffset: '0',
  mergedInto: '',
  isDeleted: 'false',
  viewCount: '654',
  text: '[summary:  To determine whether something is an 'extraordinary claim', for purposes of deciding whether it [21v requires extraordinary evidence], we consider / don't consider these factors:\n\n- Do consider:  Whether the claim requires lots of complexity that isn't already implied by other evidence.\n- Don't consider:  Whether the claim has future consequences that are extreme or important-sounding.\n- Don't consider:  Whether the claim's truth would imply that we ought to do costly or painful things.\n- Do consider:  Whether the claim violates low-level or causal generalizations.\n- Don't consider:  Whether the claim, if locally true, would imply downstream effects that violate high-level or surface generalizations.\n\nWe don't automatically *believe* claims that pass such tests; we just consider them as ordinary claims which we'd believe on obtaining ordinary evidence.]\n\n# Summary\n\nWhat makes something count as an 'extraordinary claim' for the purposes of determining whether it [21v requires extraordinary evidence]?\n\nBroadly speaking:\n\n- Not-yet-supported complexity or precise details; by [occams_razor Occam's Razor], this requires added support from the evidence.\n- Violation of (deep, causal) generalizations; behaving in a way that's out of character for (the lowest level of) the universe.\n\nSome properties that do *not* make a claim inherently 'extraordinary' in the above sense:\n\n- Having future consequences that are important, or extreme-sounding, or that would imply we need to take costly policies.\n- Whether few or many people already believe it.\n- Whether there might be bad reasons to believe it.\n\nThe ultimate grounding for the notion of 'extraordinary claim' would come from [11w Solomonoff induction], or some generalization of Solomonoff induction to handle more [-naturalistic] reasoning about the world.  Since this page is a [4v Work In Progress], it currently only lists out the derived heuristics, rather than trying to explain in any great detail how those heuristics might follow from Solomonoff-style reasoning.\n\n# Example 1: Anthropogenic global warming\n\nConsider the idea of [anthropogenic global warming](https://en.wikipedia.org/wiki/Global_warming) as it might have been analyzed in *advance* of observing the actual temperature record.  Would the claim, "Adding lots of carbon dioxide to the atmosphere will result in increased global temperatures and climate change", be an 'extraordinary' claim requiring extraordinary evidence to verify, or an ordinary claim not requiring particularly strong evidence?  We assume in this case you can do all the physics reasoning or ecological reasoning you want, but you can't actually look at the temperature record yet.\n\nThe core argument (in *advance* of looking at the temperature record) will be:  "Carbon dioxide is a greenhouse gas, so adding sufficient amounts of it to the atmosphere ought to trap more heat, which ought to raise the equilibrium temperature of the Earth."\n\nTo evaluate the ordinariness or extraordinariness of this claim:\n\nWe *don't* ask whether the future consequences of the claim are extreme or important.  Suppose that adding carbon dioxide actually did trap more heat; would the Standard Model of physics think to itself, "Uh oh, that has some extreme consequences" and decide to let the heat radiate away anyway?  
Obviously not; the laws of physics have no privileged tendency to avoid consequences that are, on a human scale, very important in a positive or negative direction.\n\nWe *don't* ask whether the policies required to address the claim are very costly - this isn't something that would prevent the causal mechanisms behind the claim from operating, and more generally, reality doesn't try to avoid inconveniencing us, so it doesn't affect the prior probability we assign to a claim in advance of seeing any evidence.\n\nWe *don't* ask whether someone has a motive to lie to us about the claim, or whether they might be inclined to believe it for crazy reasons.  If someone has a motive to lie to us about the evidence, this affects the strength of evidence, rather than lowering the [1rm prior probability].  Suppose somebody said, "Hey, I own an apartment in New York, and I'll rent it to you for $2000/month."  They might be lying and trying to trick you out of the money, but this doesn't mean "I own an apartment in New York" is an extraordinary claim.  Lots of people own apartments in New York.  It happens all the time, even.  The monetary stake means that the person might have a motive to lie to you, but this affects the likelihood ratio, not the prior odds.  If we're just considering their unsupported word, the probability that they'll say "I own an apartment in New York", given that they *don't* own an apartment in New York, might be unusually high because they could be trying to run a rent scam.  But this doesn't mean we have to call in physicists to check out whether the apartment is really there - we just need stronger, but ordinary, evidence.  Similarly, even if there were someone tempted to lie about global warming, we'd consider this as a potential weakness of the evidence they offer us, but not a weakness in the prior probability of the proposition "Adding carbon dioxide to the atmosphere heats it up."\n\n(Similarly, wanting strong evidence about a subject doesn't always coincide with the underlying claim being improbable.  Maybe you're considering *buying* a house in San Francisco, and *millions* of dollars are at stake.  This implies a high [value_of_information value of information] and you might want to invest in extra-strong evidence like having a third party check the title to the house.  But this isn't because it's a Lovecraftian anomaly for anyone to own a house in San Francisco.  The money at stake just means that we're willing to pay more to eliminate small residues of improbability from this very ordinary claim.)\n\nWe *do* ask whether "adding carbon dioxide warms the atmosphere" or "carbon dioxide doesn't warm the atmosphere" seems more consonant with the previously observed behavior of carbon dioxide.\n\nAfter we finish figuring out how carbon dioxide molecules and infrared photons usually behave, we *don't* give priority to generalizations like, "For as long as we've observed it, the average summer temperature in Freedonia has never gone over 30°C."  It's true that the predicted consequences of carbon dioxide behaving like it usually does violate another generalization about how Freedonia usually behaves.  But we generally give priority to *deeper* generalizations continuing, i.e., generalizations that are *lower-level* or *closer to the start of causal chains.*\n\n- The behavior of carbon dioxide is lower-level - it's generalizing over a class of molecules that we can observe in very great detail and make very precise generalizations about.  
The weight of a carbon dioxide molecule (with the standard isotopes in both cases), or the amount of infrared light the corresponding gas allows to pass, is something that varies much less than the summer temperature in Freedonia - it's a very precise, very strong generalization.\n- The behavior of carbon dioxide is closer to the start of the chain of cause and effect.  The summer temperature in Freedonia is something that's caused by, or happens as a result of, a particular level of carbon dioxide in Freedonia's atmosphere.  We'd expect changes that happened toward the start of the causal chain to produce changes in the effects at the end of the causal chain.  Conversely, it would be very surprising if the Freedonia-surface-temperature generalization could reach back and force carbon dioxide to have a different permeability to infrared.\n\nWe *don't* consider whether lots of prestigious scientists believe in global warming.  If you expect that lots of prestigious scientists usually won't believe in a proposition like global warming in worlds where global warming is false, then observing an apparent scientific consensus might be moderately strong *evidence* favoring the claim.  But that isn't part of the *prior probability* before seeing any evidence.  For that, we want to ask about how complicated the claim is, and whether it violates or obeys generalizations we already know about.\n\nAnother way of looking at a test of extraordinariness is to ask whether the claim's truth or falsity would imply learning something about the universe that we didn't already know.  If you'd never observed the temperature record, and had only guessed *a priori* that adding carbon dioxide would warm the atmosphere, you wouldn't be too surprised to go look at the temperature record and find that nothing seemed to be happening.  In this case, rather than imagining that you were wrong about the behavior of infrared light, you might suspect, for example, that plants were growing more and absorbing the carbon dioxide, keeping the total atmospheric level in equilibrium.  But in this case you would have learned *a new fact not already known to you (or science)* which explained why global temperatures were not rising.  So to expect that outcome in *advance* would be a more extraordinary claim than to not expect it.  If we can imagine some not-too-implausible ways that a claim could be wrong, but they'd all require us to postulate new facts we don't solidly know, then this doesn't make the original claim 'extraordinary'.  It's still a very ordinary claim that we'd start believing in after seeing an ordinary amount of evidence.',
  metaText: '',
  isTextLoaded: 'true',
  isSubscribedToDiscussion: 'false',
  isSubscribedToUser: 'false',
  isSubscribedAsMaintainer: 'false',
  discussionSubscriberCount: '2',
  maintainerCount: '1',
  userSubscriberCount: '0',
  lastVisit: '',
  hasDraft: 'false',
  votes: [],
  voteSummary: [
    '0',
    '0',
    '0',
    '0',
    '0',
    '0',
    '0',
    '0',
    '0',
    '0'
  ],
  muVoteSummary: '0',
  voteScaling: '0',
  currentUserVote: '-2',
  voteCount: '0',
  lockedVoteType: '',
  maxEditEver: '0',
  redLinkCount: '0',
  lockedBy: '',
  lockedUntil: '',
  nextPageId: '',
  prevPageId: '',
  usedAsMastery: 'false',
  proposalEditNum: '0',
  permissions: {
    edit: {
      has: 'false',
      reason: 'You don't have domain permission to edit this page'
    },
    proposeEdit: {
      has: 'true',
      reason: ''
    },
    delete: {
      has: 'false',
      reason: 'You don't have domain permission to delete this page'
    },
    comment: {
      has: 'false',
      reason: 'You can't comment in this domain because you are not a member'
    },
    proposeComment: {
      has: 'true',
      reason: ''
    }
  },
  summaries: {
    Summary: 'To determine whether something is an 'extraordinary claim', for purposes of deciding whether it [21v requires extraordinary evidence], we consider / don't consider these factors:\n\n- Do consider:  Whether the claim requires lots of complexity that isn't already implied by other evidence.\n- Don't consider:  Whether the claim has future consequences that are extreme or important-sounding.\n- Don't consider:  Whether the claim's truth would imply that we ought to do costly or painful things.\n- Do consider:  Whether the claim violates low-level or causal generalizations.\n- Don't consider:  Whether the claim, if locally true, would imply downstream effects that violate high-level or surface generalizations.\n\nWe don't automatically *believe* claims that pass such tests; we just consider them as ordinary claims which we'd believe on obtaining ordinary evidence.'
  },
  creatorIds: [
    'EliezerYudkowsky'
  ],
  childIds: [],
  parentIds: [
    'bayes_extraordinary_claims'
  ],
  commentIds: [
    '2hj'
  ],
  questionIds: [],
  tagIds: [
    'c_class_meta_tag'
  ],
  relatedIds: [
    'bayes_extraordinary_claims'
  ],
  markIds: [],
  explanations: [
    {
      id: '6505',
      parentId: 'extraordinary_claims',
      childId: 'bayes_extraordinary_claims',
      type: 'subject',
      creatorId: 'EliezerYudkowsky',
      createdAt: '2016-09-29 22:26:21',
      level: '1',
      isStrong: 'true',
      everPublished: 'true'
    },
    {
      id: '5309',
      parentId: 'extraordinary_claims',
      childId: 'extraordinary_claims',
      type: 'subject',
      creatorId: 'AlexeiAndreev',
      createdAt: '2016-07-16 16:27:18',
      level: '2',
      isStrong: 'true',
      everPublished: 'true'
    }
  ],
  learnMore: [],
  requirements: [],
  subjects: [
    {
      id: '5309',
      parentId: 'extraordinary_claims',
      childId: 'extraordinary_claims',
      type: 'subject',
      creatorId: 'AlexeiAndreev',
      createdAt: '2016-07-16 16:27:18',
      level: '2',
      isStrong: 'true',
      everPublished: 'true'
    }
  ],
  lenses: [],
  lensParentId: '',
  pathPages: [],
  learnMoreTaughtMap: {},
  learnMoreCoveredMap: {},
  learnMoreRequiredMap: {},
  editHistory: {},
  domainSubmissions: {},
  answers: [],
  answerCount: '0',
  commentCount: '0',
  newCommentCount: '0',
  linkedMarkCount: '0',
  changeLogs: [
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '19781',
      pageId: 'extraordinary_claims',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'deleteTag',
      createdAt: '2016-09-29 22:28:19',
      auxPageId: 'work_in_progress_meta_tag',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '19779',
      pageId: 'extraordinary_claims',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'deleteTag',
      createdAt: '2016-09-29 22:28:07',
      auxPageId: 'bayes_extraordinary_claims',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '19777',
      pageId: 'extraordinary_claims',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newParent',
      createdAt: '2016-09-29 22:28:04',
      auxPageId: 'bayes_extraordinary_claims',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '19775',
      pageId: 'extraordinary_claims',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'deleteParent',
      createdAt: '2016-09-29 22:27:58',
      auxPageId: 'bayes_for_humans',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '19772',
      pageId: 'extraordinary_claims',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newTeacher',
      createdAt: '2016-09-29 22:26:21',
      auxPageId: 'bayes_extraordinary_claims',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '19770',
      pageId: 'extraordinary_claims',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'deleteRequiredBy',
      createdAt: '2016-09-29 22:26:12',
      auxPageId: 'bayes_extraordinary_claims',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '18268',
      pageId: 'extraordinary_claims',
      userId: 'EricBruylant',
      edit: '0',
      type: 'newTag',
      createdAt: '2016-08-03 19:18:11',
      auxPageId: 'c_class_meta_tag',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '18135',
      pageId: 'extraordinary_claims',
      userId: 'AlexeiAndreev',
      edit: '0',
      type: 'deleteParent',
      createdAt: '2016-08-02 17:20:13',
      auxPageId: 'bayesian_prior',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '16871',
      pageId: 'extraordinary_claims',
      userId: 'AlexeiAndreev',
      edit: '0',
      type: 'newTeacher',
      createdAt: '2016-07-16 16:27:19',
      auxPageId: 'extraordinary_claims',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '16872',
      pageId: 'extraordinary_claims',
      userId: 'AlexeiAndreev',
      edit: '0',
      type: 'newSubject',
      createdAt: '2016-07-16 16:27:19',
      auxPageId: 'extraordinary_claims',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '8329',
      pageId: 'extraordinary_claims',
      userId: 'EliezerYudkowsky',
      edit: '2',
      type: 'newEdit',
      createdAt: '2016-03-04 06:16:35',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '8317',
      pageId: 'extraordinary_claims',
      userId: 'EliezerYudkowsky',
      edit: '1',
      type: 'newUsedAsTag',
      createdAt: '2016-03-04 05:56:35',
      auxPageId: 'bayes_extraordinary_claims',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '8316',
      pageId: 'extraordinary_claims',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newAlias',
      createdAt: '2016-03-04 05:56:24',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '8315',
      pageId: 'extraordinary_claims',
      userId: 'EliezerYudkowsky',
      edit: '1',
      type: 'newEdit',
      createdAt: '2016-03-04 05:55:57',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '8308',
      pageId: 'extraordinary_claims',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newParent',
      createdAt: '2016-03-04 04:38:21',
      auxPageId: 'bayes_for_humans',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '8306',
      pageId: 'extraordinary_claims',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newParent',
      createdAt: '2016-03-04 04:38:05',
      auxPageId: 'bayesian_prior',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '8304',
      pageId: 'extraordinary_claims',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'deleteParent',
      createdAt: '2016-03-04 04:37:56',
      auxPageId: 'bayes_reasoning',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '8250',
      pageId: 'extraordinary_claims',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newTag',
      createdAt: '2016-03-03 23:53:41',
      auxPageId: 'bayes_extraordinary_claims',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '8247',
      pageId: 'extraordinary_claims',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newTag',
      createdAt: '2016-03-03 23:53:19',
      auxPageId: 'work_in_progress_meta_tag',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '8245',
      pageId: 'extraordinary_claims',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newParent',
      createdAt: '2016-03-03 23:53:09',
      auxPageId: 'bayes_reasoning',
      oldSettingsValue: '',
      newSettingsValue: ''
    }
  ],
  feedSubmissions: [],
  searchStrings: {},
  hasChildren: 'false',
  hasParents: 'true',
  redAliases: {},
  improvementTagIds: [],
  nonMetaTagIds: [],
  todos: [],
  slowDownMap: 'null',
  speedUpMap: 'null',
  arcPageIds: 'null',
  contentRequests: {}
}