{
  localUrl: '../page/nofreelunch_irrelevant.html',
  arbitalUrl: 'https://arbital.com/p/nofreelunch_irrelevant',
  rawJsonUrl: '../raw/4lx.json',
  likeableId: '2774',
  likeableType: 'page',
  myLikeValue: '0',
  likeCount: '5',
  dislikeCount: '0',
  likeScore: '5',
  individualLikes: [
    'AlexeiAndreev',
    'JaimeSevillaMolina',
    'NateSoares',
    'EricRogstad',
    'StevenZuber'
  ],
  pageId: 'nofreelunch_irrelevant',
  edit: '13',
  editSummary: '',
  prevEdit: '12',
  currentEdit: '13',
  wasPublished: 'true',
  type: 'wiki',
  title: 'No-Free-Lunch theorems are often irrelevant',
  clickbait: 'There's often a theorem proving that some problem has no optimal answer across every possible world.  But this may not matter, since the real world is a special case.  (E.g., a low-entropy universe.)',
  textLength: '4032',
  alias: 'nofreelunch_irrelevant',
  externalUrl: '',
  sortChildrenBy: 'likes',
  hasVote: 'false',
  voteType: '',
  votesAnonymous: 'false',
  editCreatorId: 'EliezerYudkowsky',
  editCreatedAt: '2016-06-20 06:02:01',
  pageCreatorId: 'EliezerYudkowsky',
  pageCreatedAt: '2016-06-19 19:35:09',
  seeDomainId: '0',
  editDomainId: 'EliezerYudkowsky',
  submitToDomainId: '0',
  isAutosave: 'false',
  isSnapshot: 'false',
  isLiveEdit: 'true',
  isMinorEdit: 'false',
  indirectTeacher: 'false',
  todoCount: '0',
  isEditorComment: 'false',
  isApprovedComment: 'true',
  isResolved: 'false',
  snapshotText: '',
  anchorContext: '',
  anchorText: '',
  anchorOffset: '0',
  mergedInto: '',
  isDeleted: 'false',
  viewCount: '678',
  text: '[summary:  "No Free Lunch" theorems prove that across some problem class, it's impossible to do better in one case without doing worse in some other case.  They say some equivalent of, "There's no universal good strategy:  For every universe where an action causes you to gain \\$5, there's a different universe where that same action causes you to lose \\$5.  You can't get a *free lunch* across every universe; if you gain \\$5 in one possible world you must lose \\$5 in another possible world."\n\nTo this the realistic reply is very often, "Okay, that's true across maximum-entropy universes in general.  But the real world is an extremely special case.  We happen to live in a low-entropy universe, where things are far more ordered and predictable than in a universe where all the atoms have been placed at random; and the world where we gain \\$5 is far more probable than the world where we lose \\$5."\n\nSimilarly:  In theory, the ideal form of the protein folding problem is NP-hard.  In practice, biology is happy to select proteins that reliably fold up a particular way even if they don't reach the exact theoretical minimum.  "NP-hard" problems in real life with nonrandom data often have regularities that make them much more easily solvable than the worst-case problems; or they have pretty good approximate solutions.]\n\nThere's a wide variety of "No Free Lunch" theorems proving that, in general, some problem is unsolvable.  Very often, these theorems are not relevant because the real universe is a special case.\n\nIn a very general and metaphorical sense, most of these theorems say some equivalent of, "There's no universal good strategy:  For every universe where an action causes you to gain \\$5, there's a different universe where that same action causes you to lose \\$5.  You can't get a *free lunch* across every universe; if you gain \\$5 in one universe you must lose \\$5 in another universe."  To which the reply is very often, "Sure, that's true across maximum-entropy universes in general, but we happen to live in a low-entropy universe where things are far more ordered and predictable than in a universe where all the atoms have been placed at random."\n\nSimilarly:  In theory, the [protein folding problem](https://en.wikipedia.org/wiki/Protein_folding) (predicting the lowest-energy configuration for a string of amino acids) is NP-hard ([sorta](https://arxiv.org/abs/1306.1372)).  But since quantum mechanics is not known to solve NP-hard problems, this just means that in the real world, some proteins fold up in a way that doesn't reach the ideal lowest energy.  Biology is happy with just picking out proteins that reliably fold up a particular way.  "NP-hard" problems in real life with nonrandom data often have regularities that make them much more easily solvable than the worst cases; or they have pretty good approximate solutions; or we can work with just the solvable cases.\n\nA related but distinct idea:  It's impossible to prove in general whether a Turing machine halts, or [has any other nontrivial property](https://en.wikipedia.org/wiki/Rice%27s_theorem), since it might be conditioned on a subprocess halting.  But that often doesn't stop us from picking particular machines that do halt, or limiting our consideration to computations that run in less than a quadrillion timesteps, etcetera.\n\nThis doesn't mean *all* No-Free-Lunch theorems are irrelevant in the real-universe special case.  E.g., the Second Law of Thermodynamics can also be seen as a No-Free-Lunch theorem, and does actually prohibit perpetual motion in our own real universe (on the standard model of physics).\n\nIt should finally be observed that human intelligence does work in the real world, meaning that there's no No-Free-Lunch theorem which prohibits an intelligence from working at least that well.  Any claim that a No-Free-Lunch theorem prohibits machine intelligence in general must definitely [3tc Prove Too Much], because the same reasoning could be applied to a human brain considered as a physical system.',
  metaText: '',
  isTextLoaded: 'true',
  isSubscribedToDiscussion: 'false',
  isSubscribedToUser: 'false',
  isSubscribedAsMaintainer: 'false',
  discussionSubscriberCount: '2',
  maintainerCount: '1',
  userSubscriberCount: '0',
  lastVisit: '',
  hasDraft: 'false',
  votes: [],
  voteSummary: 'null',
  muVoteSummary: '0',
  voteScaling: '0',
  currentUserVote: '-2',
  voteCount: '0',
  lockedVoteType: '',
  maxEditEver: '0',
  redLinkCount: '0',
  lockedBy: '',
  lockedUntil: '',
  nextPageId: '',
  prevPageId: '',
  usedAsMastery: 'false',
  proposalEditNum: '0',
  permissions: {
    edit: {
      has: 'false',
      reason: 'You don't have domain permission to edit this page'
    },
    proposeEdit: {
      has: 'true',
      reason: ''
    },
    delete: {
      has: 'false',
      reason: 'You don't have domain permission to delete this page'
    },
    comment: {
      has: 'false',
      reason: 'You can't comment in this domain because you are not a member'
    },
    proposeComment: {
      has: 'true',
      reason: ''
    }
  },
  summaries: {},
  creatorIds: [
    'EliezerYudkowsky'
  ],
  childIds: [],
  parentIds: [
    'unbounded_analysis'
  ],
  commentIds: [
    '4mb'
  ],
  questionIds: [],
  tagIds: [],
  relatedIds: [],
  markIds: [],
  explanations: [],
  learnMore: [],
  requirements: [],
  subjects: [],
  lenses: [],
  lensParentId: '',
  pathPages: [],
  learnMoreTaughtMap: {},
  learnMoreCoveredMap: {},
  learnMoreRequiredMap: {},
  editHistory: {},
  domainSubmissions: {},
  answers: [],
  answerCount: '0',
  commentCount: '0',
  newCommentCount: '0',
  linkedMarkCount: '0',
  changeLogs: [
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '14070',
      pageId: 'nofreelunch_irrelevant',
      userId: 'EliezerYudkowsky',
      edit: '13',
      type: 'newEdit',
      createdAt: '2016-06-20 06:02:01',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '14069',
      pageId: 'nofreelunch_irrelevant',
      userId: 'EliezerYudkowsky',
      edit: '12',
      type: 'newEdit',
      createdAt: '2016-06-20 06:00:59',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '14046',
      pageId: 'nofreelunch_irrelevant',
      userId: 'EliezerYudkowsky',
      edit: '11',
      type: 'newEdit',
      createdAt: '2016-06-20 00:02:57',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '14045',
      pageId: 'nofreelunch_irrelevant',
      userId: 'EliezerYudkowsky',
      edit: '10',
      type: 'newEdit',
      createdAt: '2016-06-20 00:02:29',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '14044',
      pageId: 'nofreelunch_irrelevant',
      userId: 'EliezerYudkowsky',
      edit: '9',
      type: 'newEdit',
      createdAt: '2016-06-20 00:02:11',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '14043',
      pageId: 'nofreelunch_irrelevant',
      userId: 'EliezerYudkowsky',
      edit: '8',
      type: 'newEdit',
      createdAt: '2016-06-20 00:01:31',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '14042',
      pageId: 'nofreelunch_irrelevant',
      userId: 'EliezerYudkowsky',
      edit: '7',
      type: 'newEdit',
      createdAt: '2016-06-20 00:00:54',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '14041',
      pageId: 'nofreelunch_irrelevant',
      userId: 'EliezerYudkowsky',
      edit: '6',
      type: 'newEdit',
      createdAt: '2016-06-19 23:59:51',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '14040',
      pageId: 'nofreelunch_irrelevant',
      userId: 'EliezerYudkowsky',
      edit: '5',
      type: 'newEdit',
      createdAt: '2016-06-19 23:58:18',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '14039',
      pageId: 'nofreelunch_irrelevant',
      userId: 'EliezerYudkowsky',
      edit: '4',
      type: 'newEdit',
      createdAt: '2016-06-19 23:57:40',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '14038',
      pageId: 'nofreelunch_irrelevant',
      userId: 'EliezerYudkowsky',
      edit: '3',
      type: 'newEdit',
      createdAt: '2016-06-19 23:57:04',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '14037',
      pageId: 'nofreelunch_irrelevant',
      userId: 'EliezerYudkowsky',
      edit: '2',
      type: 'newEdit',
      createdAt: '2016-06-19 23:56:50',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '14007',
      pageId: 'nofreelunch_irrelevant',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newParent',
      createdAt: '2016-06-19 19:35:11',
      auxPageId: 'unbounded_analysis',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '14005',
      pageId: 'nofreelunch_irrelevant',
      userId: 'EliezerYudkowsky',
      edit: '1',
      type: 'newEdit',
      createdAt: '2016-06-19 19:35:09',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    }
  ],
  feedSubmissions: [],
  searchStrings: {},
  hasChildren: 'false',
  hasParents: 'true',
  redAliases: {},
  improvementTagIds: [],
  nonMetaTagIds: [],
  todos: [],
  slowDownMap: 'null',
  speedUpMap: 'null',
  arcPageIds: 'null',
  contentRequests: {}
}