{
  localUrl: '../page/absentee_billionaire.html',
  arbitalUrl: 'https://arbital.com/p/absentee_billionaire',
  rawJsonUrl: '../raw/1yn.json',
  likeableId: '895',
  likeableType: 'page',
  myLikeValue: '0',
  likeCount: '1',
  dislikeCount: '0',
  likeScore: '1',
  individualLikes: [
    'EricBruylant'
  ],
  pageId: 'absentee_billionaire',
  edit: '2',
  editSummary: '',
  prevEdit: '1',
  currentEdit: '2',
  wasPublished: 'true',
  type: 'wiki',
  title: 'The absentee billionaire',
  clickbait: '',
  textLength: '6688',
  alias: 'absentee_billionaire',
  externalUrl: '',
  sortChildrenBy: 'likes',
  hasVote: 'false',
  voteType: '',
  votesAnonymous: 'false',
  editCreatorId: 'PaulChristiano',
  editCreatedAt: '2016-02-24 23:41:52',
  pageCreatorId: 'PaulChristiano',
  pageCreatedAt: '2016-02-11 23:31:57',
  seeDomainId: '0',
  editDomainId: '705',
  submitToDomainId: '0',
  isAutosave: 'false',
  isSnapshot: 'false',
  isLiveEdit: 'true',
  isMinorEdit: 'false',
  indirectTeacher: 'false',
  todoCount: '0',
  isEditorComment: 'false',
  isApprovedComment: 'true',
  isResolved: 'false',
  snapshotText: '',
  anchorContext: '',
  anchorText: '',
  anchorOffset: '0',
  mergedInto: '',
  isDeleted: 'false',
  viewCount: '437',
  text: 'Once each day, Hugh wakes for 10 minutes. During these 10 minutes, he spends 10 million dollars. The other 1430 minutes, he slumbers.\n\nHugh has a goal. Maybe he wants humanity to flourish. Maybe he wants to build the world’s tallest building. Maybe he wants to preserve the rainforest. Whatever Hugh wants, our question is: how can he get it?\n\nThe desideratum\n===============\n\nHugh has commissioned us to come up with a system for him to use—to tell him how to spend his ten minutes, what to look up, who to talk to, and how to decide where to send his money. We can give Hugh an instruction manual and spend a few hours explaining the system to him. After that, he’s on his own.\n\nHugh wishes he could pay people to adopt the goal “Hugh’s goals are satisfied” wholesale, to pursue it as effectively as they can. In the best case, he could do this with no more expense—and no more difficulty—than would be required to hire them to do similar work. Let’s call this ideal arrangement _perfect delegation_.\n\nWe’ll consider our system a success if it lets Hugh achieve his goals nearly as well as he could using perfect delegation.\n\nWith access to perfect delegation, Hugh could find the best team money could buy and hire them to administer his budget in the service of his goals. They would in turn identify and evaluate opportunities, and use perfect delegation to implement them. Doing the same using only 10 minutes a day, without perfect delegation, would be a tall order.\n\nThe details\n===========\n\nSome details of Hugh’s situation:\n\n1. Hugh has access to the internet. He can send and receive emails, can browse the internet, and so on. He can attach money to emails painlessly.\n2. In principle, the people of Earth could all agree not to accept Hugh’s money. On average, this would make them better off. But it’s not going to happen.\n3. The world isn’t on Hugh’s side, either. He doesn’t have any super-trustworthy assistants, much less thousands of them. There are just a lot of people who want their share of $10M / day. In the easy version of the problem, there are also some people who share Hugh’s vision—but even then, who can tell the difference?\n4. Hugh has written a lot about his goals and his outlook. Anyone who’s curious can go learn about Hugh on the internet; lots of people have.\n5. If someone doesn’t want Hugh to learn something, they can pay a news site not to cover it. They can launch a DDoS against Google. They could even go to more extreme lengths. But we’ll assume that Hugh’s connection to the internet is itself tamper-proof. And just as the world won’t coordinate to turn down Hugh’s money, it won’t coordinate to set up a parallel version of the internet just to delude him.\n6. Hugh has no prospect of fixing his debilitating sleep disorder. He could leave his room if he wanted, but what is he going to do in 10 minutes, anyway? He’s already arranged for his room to be secure and well-stocked, and for his infrastructure to be reliable.\n7. Hugh’s goals do not require the cooperation of any specific person, or any unverifiable private information. Everything that Hugh wants to do could be done by one of several people (and, as per assumption #2, we assume that these people will not all collude). All of the information that Hugh needs could in principle be verified by Hugh, if he had enough time to spend verifying it.\n\nThe prognosis\n=============\n\nHugh’s situation would not be interesting if it were hopeless. I’m posting this puzzle because I think it pushes the boundaries of what is possible without going beyond them.\n\nI have a few ideas and a partial solution. I’ll present some of them in upcoming posts. I’d also love to see other approaches to the problem!\n\nThe metaphor\n============\n\nHugh’s situation is a caricature, but the basic problem seems ubiquitous. If available, perfect delegation would be useful in many domains: from philanthropists making grants, to voters electing representatives, to executives managing companies. A lot can be lost between intention and implementation.\n\nThe deeper reason I care about Hugh’s predicament is that I think it illustrates a much more fundamental difficulty. I don’t think we as a society yet have the hang of building organizations that can effectively pursue an on-paper mandate, or that can make decisions explicitly and rationally. Instead we get bureaucracies, we get coalitional politics, we get crude rules applied bluntly, we get inertia, we get strong cultures and personal networks that can only grow so far before they decay or drift. In many cases, the best we can do is to leave decisions in the hands of the best-equipped individuals we can find and trust.\n\nSetting aside this more abstract difficulty, there are two concrete cases I find particularly interesting.\n\n**Philanthropy**. A philanthropist with a pure heart seeks the most effective way to use their money to improve the world. They are surrounded by people pursuing their own projects for their own reasons, uninterested in thinking about their expertise in the context of the philanthropist’s broader goal, and sometimes unable to do so. How can the philanthropist make funding decisions in a domain without implicitly adopting its practitioners’ values? How can the philanthropist best leverage the knowledge of many diverse experts?\n\nI think we have a lot of room to improve our basic institutions for this setting—Hugh’s extreme situation makes the problem especially obvious, but I think it is always there. [The Open Philanthropy Project](http://www.openphilanthropy.org/) is making a noble effort to attack this problem head-on, and I expect they will do a lot of good. So far they are relying on a combination of small scale and staff with closely aligned values, and it will be interesting to see how these efforts scale. (I expect that the practical issues the OPP is struggling with are more pressing than the theoretical question; but I think that theoretical question has still received less attention than it deserves.)\n\n**AI Control**. Unless we go out of our way to avoid it, we humans will eventually live in a world that outpaces us as much as our world outpaces Hugh. If things go well, we may also be comparably rich (in absolute terms). One way of understanding the AI control problem is: how can humans continue to influence the overall direction of society when decisions are made automatically, much more quickly and much more frequently than humans can directly oversee?\n\nThe analogy is not entirely straightforward, but I expect that a solution to Hugh’s problem would also prove useful for attacking the AI control problem.',
  metaText: '',
  isTextLoaded: 'true',
  isSubscribedToDiscussion: 'false',
  isSubscribedToUser: 'false',
  isSubscribedAsMaintainer: 'false',
  discussionSubscriberCount: '1',
  maintainerCount: '1',
  userSubscriberCount: '0',
  lastVisit: '2016-02-23 04:35:23',
  hasDraft: 'false',
  votes: [],
  voteSummary: 'null',
  muVoteSummary: '0',
  voteScaling: '0',
  currentUserVote: '-2',
  voteCount: '0',
  lockedVoteType: '',
  maxEditEver: '0',
  redLinkCount: '0',
  lockedBy: '',
  lockedUntil: '',
  nextPageId: '',
  prevPageId: '',
  usedAsMastery: 'false',
  proposalEditNum: '0',
  permissions: {
    edit: {
      has: 'false',
      reason: 'You don\'t have domain permission to edit this page'
    },
    proposeEdit: {
      has: 'true',
      reason: ''
    },
    delete: {
      has: 'false',
      reason: 'You don\'t have domain permission to delete this page'
    },
    comment: {
      has: 'false',
      reason: 'You can\'t comment in this domain because you are not a member'
    },
    proposeComment: {
      has: 'true',
      reason: ''
    }
  },
  summaries: {},
  creatorIds: [
    'PaulChristiano'
  ],
  childIds: [],
  parentIds: [
    'paul_ai_control'
  ],
  commentIds: [
    '208'
  ],
  questionIds: [],
  tagIds: [],
  relatedIds: [],
  markIds: [],
  explanations: [],
  learnMore: [],
  requirements: [],
  subjects: [],
  lenses: [],
  lensParentId: '',
  pathPages: [],
  learnMoreTaughtMap: {},
  learnMoreCoveredMap: {},
  learnMoreRequiredMap: {},
  editHistory: {},
  domainSubmissions: {},
  answers: [],
  answerCount: '0',
  commentCount: '0',
  newCommentCount: '0',
  linkedMarkCount: '0',
  changeLogs: [
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '7766',
      pageId: 'absentee_billionaire',
      userId: 'JessicaChuan',
      edit: '2',
      type: 'newEdit',
      createdAt: '2016-02-24 23:41:52',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '6896',
      pageId: 'absentee_billionaire',
      userId: 'JessicaChuan',
      edit: '1',
      type: 'newEdit',
      createdAt: '2016-02-11 23:31:57',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '6895',
      pageId: 'absentee_billionaire',
      userId: 'JessicaChuan',
      edit: '0',
      type: 'newParent',
      createdAt: '2016-02-11 23:28:57',
      auxPageId: 'paul_ai_control',
      oldSettingsValue: '',
      newSettingsValue: ''
    }
  ],
  feedSubmissions: [],
  searchStrings: {},
  hasChildren: 'false',
  hasParents: 'true',
  redAliases: {},
  improvementTagIds: [],
  nonMetaTagIds: [],
  todos: [],
  slowDownMap: 'null',
  speedUpMap: 'null',
  arcPageIds: 'null',
  contentRequests: {}
}