{
  localUrl: '../page/pd_tournament_99ldt_1cdt.html',
  arbitalUrl: 'https://arbital.com/p/pd_tournament_99ldt_1cdt',
  rawJsonUrl: '../raw/5hp.json',
  likeableId: '3220',
  likeableType: 'page',
  myLikeValue: '0',
  likeCount: '1',
  dislikeCount: '0',
  likeScore: '1',
  individualLikes: [
    'JaimeSevillaMolina'
  ],
  pageId: 'pd_tournament_99ldt_1cdt',
  edit: '5',
  editSummary: '',
  prevEdit: '4',
  currentEdit: '5',
  wasPublished: 'true',
  type: 'wiki',
  title: '99LDT x 1CDT oneshot PD tournament as arguable counterexample to LDT doing better than CDT',
  clickbait: 'Arguendo, if 99 LDT agents and 1 CDT agent are facing off in a one-shot Prisoner's Dilemma tournament, the CDT agent does better on a problem that CDT considers 'fair'.',
  textLength: '5283',
  alias: 'pd_tournament_99ldt_1cdt',
  externalUrl: '',
  sortChildrenBy: 'likes',
  hasVote: 'false',
  voteType: '',
  votesAnonymous: 'false',
  editCreatorId: 'EliezerYudkowsky',
  editCreatedAt: '2016-07-23 22:29:45',
  pageCreatorId: 'EliezerYudkowsky',
  pageCreatedAt: '2016-07-21 01:10:27',
  seeDomainId: '0',
  editDomainId: '123',
  submitToDomainId: '0',
  isAutosave: 'false',
  isSnapshot: 'false',
  isLiveEdit: 'true',
  isMinorEdit: 'false',
  indirectTeacher: 'false',
  todoCount: '1',
  isEditorComment: 'false',
  isApprovedComment: 'true',
  isResolved: 'false',
  snapshotText: '',
  anchorContext: '',
  anchorText: '',
  anchorOffset: '0',
  mergedInto: '',
  isDeleted: 'false',
  viewCount: '104',
  text: '[summary:  As a counterexample to the assertion that [58b] handles a strictly larger [fair_problem_class fair problem class] than [causal_dt CDT], [wei_dai Wei Dai] proposed the example of 1 CDT agent playing 99 LDT agents in a one-shot [prisoners_dilemma Prisoner's Dilemma] without knowledge of the other agent's code.  The LDT agents must cooperate uniformly to avoid accidentally defecting against each other; the CDT agent can defect against all of them.\n\nCounterargument:  Imagine that a CDT agent and an LDT agent see a \$20 bill lying in the street.  Two days previously, [5b2 Omega] has already bombed an orphanage if the LDT algorithm's output is to pick up the \$20 bill.  The LDT agent refuses the money, and the CDT agent happily picks it up.\n\nFrom the standpoint of a CDT agent, this situation seems [fair_problem_class fair] because the CDT agent thinks that Omega has already left, and so believes the LDT agent would get the same payoff for the same physical act.  But from the LDT agent's perspective, a behavior from one algorithm is being penalized while the identical behavior from another algorithm is being rewarded.  Similarly, on the LDT view, an LDT agent and a CDT agent each facing 98 LDT agents in a [prisoners_dilemma PD] tournament face an 'unfair' asymmetric correlation that the CDT agent doesn't care about.]\n\nOne of the arguments supporting [58b] over [causal_dt] is the assertion that [fair_problem_class LDT handles a strictly wider problem class] and therefore [ dominates]:  Arguendo, we lose nothing by moving from CDT to LDT, and instead gain the ability to handle a strictly wider class of problems.\n\n[wei_dai Wei Dai] posed the following counterexample:  Suppose that 99 LDT agents and 1 CDT agent are playing a tournament of one-shot [prisoners_dilemma].  
Then the LDT agents calculate that they will lose a great deal by defecting (since this would make them defect against the 98 other LDT agents), while the CDT agent cheerfully defects (and does not move in unison with the LDT agents).\n\nTo make this example more precise, we might also imagine an LDT agent, as against a CDT agent, playing the game "Me and 98 LDT agents in a PD tournament."  (In the original example, the CDT agent is facing 99 LDT agents, while each LDT agent is facing 98 LDT agents and 1 CDT agent, which technically breaks the [fair_problem_class fairness] of the problem.)\n\nThere are two counterarguments holding that this example does not favor CDT as a decision theory:\n\nFirst, imagine inserting a single logical-style agent running some slightly variant algorithm, LDT-Prime, such that it no longer sees itself as moving in lockstep with the other LDT agents in the PD tournament.  Then LDT-Prime will do as well as the CDT agent in the tournament, and do better on other Newcomblike problems.  This argues that the particulars of the CDT algorithm were not what gave the CDT agent its apparent advantage.\n\nSecond, an LDT agent would argue against the fairness of the PD tournament.  Since the LDT agent is facing off against 98 other agents moving in logical lockstep with itself, it is being faced with an environmental challenge unlike the one an LDT-Prime agent or CDT agent sees.  Arguendo, the LDT agent is being presented with different options or consequences for its decision algorithm, compared to the consequences for the LDT-Prime or CDT algorithm.\n\nSuppose that a CDT agent and an LDT agent both encounter a \$20 bill in the street.  Two days previously, [5b2 Omega] has already bombed an orphanage if the output of the LDT algorithm is to pick up the \$20 bill.  The LDT agent naturally refuses to pick up the \$20.  
The CDT agent laughs and remarks that Omega is already gone, and chides the LDT agent for not taking the \$20 reward that was equally available to any agent passing by.\n\nSince the CDT agent doesn't think the past correlation can affect outcomes now, the CDT agent believes that both agents would receive exactly the same payoff for picking up the \$20 bill, and thus that this scenario is a [fair_problem_class fair] challenge.  The LDT agent thinks that the CDT agent and LDT agent have been presented with different payoff matrices for the same outputs, and thus that this is an unfair challenge.  On the LDT view, CDT agents are blind to Newcomblike dependencies, so the CDT agent may believe that a scenario is a fair CDT problem when it is actually an unfair Newcomblike problem.\n\nOn the LDT view, something very similar happens when an LDT agent and an LDT-Prime agent, or an LDT agent and a CDT agent, are each presented with a PD tournament against 98 LDT agents.  Facing 98 other contestants who will mirror you, but not the competing agent, doesn't seem very 'fair'.\n\nHowever, even the orphanage-bombing example taken at face value seems sufficient to technically refute the general statement that the [fair_problem_class fair problem class] on which LDT agents end up rich is *strictly* larger than the corresponding 'fair' problem class for CDT agents.  And the 99LDT/1CDT tournament does seem, in some sense, like a *natural* or *realistic* scenario in which a CDT agent could get a higher payoff than an LDT agent; on the CDT view, this is exactly the sort of problem that CDT is good for.',
  metaText: '',
  isTextLoaded: 'true',
  isSubscribedToDiscussion: 'false',
  isSubscribedToUser: 'false',
  isSubscribedAsMaintainer: 'false',
  discussionSubscriberCount: '1',
  maintainerCount: '1',
  userSubscriberCount: '0',
  lastVisit: '',
  hasDraft: 'false',
  votes: [],
  voteSummary: 'null',
  muVoteSummary: '0',
  voteScaling: '0',
  currentUserVote: '-2',
  voteCount: '0',
  lockedVoteType: '',
  maxEditEver: '0',
  redLinkCount: '0',
  lockedBy: '',
  lockedUntil: '',
  nextPageId: '',
  prevPageId: '',
  usedAsMastery: 'false',
  proposalEditNum: '0',
  permissions: {
    edit: {
      has: 'false',
      reason: 'You don't have domain permission to edit this page'
    },
    proposeEdit: {
      has: 'true',
      reason: ''
    },
    delete: {
      has: 'false',
      reason: 'You don't have domain permission to delete this page'
    },
    comment: {
      has: 'false',
      reason: 'You can't comment in this domain because you are not a member'
    },
    proposeComment: {
      has: 'true',
      reason: ''
    }
  },
  summaries: {},
  creatorIds: [
    'EliezerYudkowsky'
  ],
  childIds: [],
  parentIds: [
    'newcomblike'
  ],
  commentIds: [
    '5j5'
  ],
  questionIds: [],
  tagIds: [
    'b_class_meta_tag'
  ],
  relatedIds: [],
  markIds: [],
  explanations: [],
  learnMore: [],
  requirements: [],
  subjects: [],
  lenses: [],
  lensParentId: '',
  pathPages: [],
  learnMoreTaughtMap: {},
  learnMoreCoveredMap: {},
  learnMoreRequiredMap: {},
  editHistory: {},
  domainSubmissions: {},
  answers: [],
  answerCount: '0',
  commentCount: '0',
  newCommentCount: '0',
  linkedMarkCount: '0',
  changeLogs: [
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '17886',
      pageId: 'pd_tournament_99ldt_1cdt',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newParent',
      createdAt: '2016-08-01 03:07:56',
      auxPageId: 'newcomblike',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '17884',
      pageId: 'pd_tournament_99ldt_1cdt',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'deleteParent',
      createdAt: '2016-08-01 03:07:51',
      auxPageId: 'logical_dt',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '3216',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '1',
      dislikeCount: '0',
      likeScore: '1',
      individualLikes: [],
      id: '17434',
      pageId: 'pd_tournament_99ldt_1cdt',
      userId: 'EliezerYudkowsky',
      edit: '5',
      type: 'newEdit',
      createdAt: '2016-07-23 22:29:45',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '17433',
      pageId: 'pd_tournament_99ldt_1cdt',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newAlias',
      createdAt: '2016-07-23 22:29:44',
      auxPageId: '',
      oldSettingsValue: 'pd_tournament_100ldt_1cdt',
      newSettingsValue: 'pd_tournament_99ldt_1cdt'
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '17372',
      pageId: 'pd_tournament_99ldt_1cdt',
      userId: 'EliezerYudkowsky',
      edit: '4',
      type: 'newEdit',
      createdAt: '2016-07-23 01:06:11',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '17371',
      pageId: 'pd_tournament_99ldt_1cdt',
      userId: 'EliezerYudkowsky',
      edit: '3',
      type: 'newEdit',
      createdAt: '2016-07-23 01:03:38',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '17369',
      pageId: 'pd_tournament_99ldt_1cdt',
      userId: 'EliezerYudkowsky',
      edit: '2',
      type: 'newEdit',
      createdAt: '2016-07-23 01:01:32',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '17253',
      pageId: 'pd_tournament_99ldt_1cdt',
      userId: 'EricRogstad',
      edit: '0',
      type: 'newAlias',
      createdAt: '2016-07-21 21:15:12',
      auxPageId: '',
      oldSettingsValue: '100ldt_1cdt_pd_tournament',
      newSettingsValue: 'pd_tournament_100ldt_1cdt'
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '17202',
      pageId: 'pd_tournament_99ldt_1cdt',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newParent',
      createdAt: '2016-07-21 01:10:29',
      auxPageId: 'logical_dt',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '17203',
      pageId: 'pd_tournament_99ldt_1cdt',
      userId: 'EliezerYudkowsky',
      edit: '0',
      type: 'newTag',
      createdAt: '2016-07-21 01:10:29',
      auxPageId: 'b_class_meta_tag',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '17200',
      pageId: 'pd_tournament_99ldt_1cdt',
      userId: 'EliezerYudkowsky',
      edit: '1',
      type: 'newEdit',
      createdAt: '2016-07-21 01:10:27',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    }
  ],
  feedSubmissions: [],
  searchStrings: {},
  hasChildren: 'false',
  hasParents: 'true',
  redAliases: {},
  improvementTagIds: [],
  nonMetaTagIds: [],
  todos: [],
  slowDownMap: 'null',
  speedUpMap: 'null',
  arcPageIds: 'null',
  contentRequests: {}
}