{
  localUrl: '../page/the_plan_experiment.html',
  arbitalUrl: 'https://arbital.com/p/the_plan_experiment',
  rawJsonUrl: '../raw/7gv.json',
  likeableId: '0',
  likeableType: 'page',
  myLikeValue: '0',
  likeCount: '0',
  dislikeCount: '0',
  likeScore: '0',
  individualLikes: [],
  pageId: 'the_plan_experiment',
  edit: '4',
  editSummary: '',
  prevEdit: '3',
  currentEdit: '4',
  wasPublished: 'true',
  type: 'wiki',
  title: 'The plan experiment',
  clickbait: 'Root page describing the rationale and process for planning how to approach and navigate AGI development.',
  textLength: '5133',
  alias: 'the_plan_experiment',
  externalUrl: '',
  sortChildrenBy: 'likes',
  hasVote: 'false',
  voteType: '',
  votesAnonymous: 'false',
  editCreatorId: 'AlexeiAndreev',
  editCreatedAt: '2017-01-31 02:34:04',
  pageCreatorId: 'AlexeiAndreev',
  pageCreatedAt: '2017-01-18 22:23:12',
  seeDomainId: '0',
  editDomainId: '2223',
  submitToDomainId: '0',
  isAutosave: 'false',
  isSnapshot: 'false',
  isLiveEdit: 'true',
  isMinorEdit: 'false',
  indirectTeacher: 'false',
  todoCount: '0',
  isEditorComment: 'false',
  isApprovedComment: 'false',
  isResolved: 'false',
  snapshotText: '',
  anchorContext: '',
  anchorText: '',
  anchorOffset: '0',
  mergedInto: '',
  isDeleted: 'false',
  viewCount: '39',
  text: '"The plan" is an experiment in collaborative discussion. The central topic is helping humans navigate development of AGI. Its development is a hard technical problem, and making sure that the way it's developed is beneficial to all humans makes it even harder. In the meantime, there is a perceived aura of pressure. This pressure comes from multiple sources:\n\n* Internal: this is a complex situation, so people want to make sure they are making the right decisions. The stakes are very high, so people want to get it right.\n* Novelty: humans have never built an AGI before, so there are a lot of unknown unknowns.\n* Us vs them perspective: people imagine that others are competing with them to get to AGI faster than they would. They imagine those people taking shortcuts in safety in order to speed up the timeline. (See [7hy])\n* Lack of clear communication: a lot of conversations about this happen in private.\n* Lack of clear direction: there is no plan for people to follow or discuss.\n\nThis experiment exists to address all of these issues by recognizing their manifestations in each individual. [7h3 To fully address these issues, the plan has to be public].\n\n## What's in the plan?\n\n[National response framework](https://www.fema.gov/national-response-framework)%%note: The National Response Framework describes not only how the Federal government organizes itself to respond to natural disasters, terrorist attacks, and other catastrophic events but also the importance of the whole community in assisting with response efforts. The intended audience for this document is individuals, families, communities, the private and nonprofit sectors, faith-based organizations, and local, state, tribal, territorial, insular area, and Federal governments.%% provides a [detailed document](https://www.fema.gov/media-library-data/1466014682982-9bcf8245ba4c60c120aa915abe74e15d/National_Response_Framework3rd.pdf) describing how various organizations and federal governments can respond to an event of emergency. I imagine the final output of the plan to be something along these lines, although it'll be framed in a more proactive way.%%note: It's possible [NIMS](https://www.fema.gov/national-incident-management-system) is closer to what I'm looking for. It's "a systematic, proactive approach to guide departments and agencies at all levels of government, nongovernmental organizations, and the private sector to work together seamlessly and manage incidents involving all threats and hazards—regardless of cause, size, location, or complexity—in order to reduce loss of life, property and harm to the environment."%%\n\nSpecifically, the plan will have these components:\n\n* [7gz List of all the relevant organizations and people]\n* Various milestones for anticipating and measuring AGI progress\n* Various suggested actions that should be taken once the milestones are reached\n* Various events that might happen (e.g. a call to nationalize AGI development) and possible responses\n\n## Execution\n\n### Steps\n\n1. Go through existing literature and extract the relevant parts. Use those to write the first version of the plan.\n * In particular, populate [7h0 the list of stages].\n2. I will show the current version of the plan to one person and get their feedback.\n * In that process, I will try to address the pressures I've outlined above.\n3. 
I will add as much of their feedback to the plan as possible, keeping their identity anonymous if requested, though I have a strong preference for people's opinions to be public.\n4. Iterate on this process and the plan.\n\n### Non-interference stance\n\nIt's hard to understand a person's point of view when your own gets in the way. This is why I'm going to follow these guidelines when having one-on-ones with people:\n\n1. My goal is not to make anyone do anything they don't want to do.\n2. My goal is not to get the person to adopt the plan.\n3. My goal is to explain various parts of the plan to a person, then understand and integrate their feedback into the plan. (Steelmanning their argument in the process.)\n4. My goal is for the person to be happy with the way their feedback was integrated into the plan.\n\nIn fact, if this approach works well, I'd be happy to take this to its logical conclusion of creating a new class of x-risks participants who are forbidden from interfering in AI development landscape.\n\n### The plan keeper\n\nA lot of the plan will come from one-on-one discussions. If all the conversations are stored in one person's mind, they can have future conversations much better. However, if the past conversations were done by two people, than each person can only bring half of all the relevant information. So, naively, this task is much better done by one person. However, that's probably unrealistic. In either case, keeping the group small seems best.\n\n### Measure of success\n\nInitial measure of success is: how many people's opinions are integrated into the plan and how happy those people are with their views being represented.\n\nFinal measure of success is: are people referring to and following the plan.\n\n(It's possible we should only use the initial measure.)',
  metaText: '',
  isTextLoaded: 'true',
  isSubscribedToDiscussion: 'false',
  isSubscribedToUser: 'false',
  isSubscribedAsMaintainer: 'false',
  discussionSubscriberCount: '1',
  maintainerCount: '1',
  userSubscriberCount: '0',
  lastVisit: '',
  hasDraft: 'false',
  votes: [],
  voteSummary: 'null',
  muVoteSummary: '0',
  voteScaling: '0',
  currentUserVote: '-2',
  voteCount: '0',
  lockedVoteType: '',
  maxEditEver: '0',
  redLinkCount: '0',
  lockedBy: '',
  lockedUntil: '',
  nextPageId: '',
  prevPageId: '',
  usedAsMastery: 'false',
  proposalEditNum: '0',
  permissions: {
    edit: {
      has: 'false',
      reason: 'You don't have domain permission to edit this page'
    },
    proposeEdit: {
      has: 'true',
      reason: ''
    },
    delete: {
      has: 'false',
      reason: 'You don't have domain permission to delete this page'
    },
    comment: {
      has: 'false',
      reason: 'You can't comment in this domain because you are not a member'
    },
    proposeComment: {
      has: 'true',
      reason: ''
    }
  },
  summaries: {},
  creatorIds: [
    'AlexeiAndreev'
  ],
  childIds: [
    'the_plan_orgs_and_people',
    'the_plan_stages',
    '7h3',
    '7h5',
    '7hv',
    'the_plan',
    '7hy'
  ],
  parentIds: [],
  commentIds: [],
  questionIds: [],
  tagIds: [
    'c_class_meta_tag'
  ],
  relatedIds: [],
  markIds: [],
  explanations: [],
  learnMore: [],
  requirements: [],
  subjects: [],
  lenses: [],
  lensParentId: '',
  pathPages: [],
  learnMoreTaughtMap: {},
  learnMoreCoveredMap: {},
  learnMoreRequiredMap: {},
  editHistory: {},
  domainSubmissions: {},
  answers: [],
  answerCount: '0',
  commentCount: '0',
  newCommentCount: '0',
  linkedMarkCount: '0',
  changeLogs: [
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '21905',
      pageId: 'the_plan_experiment',
      userId: 'AlexeiAndreev',
      edit: '4',
      type: 'newEdit',
      createdAt: '2017-01-31 02:34:04',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '21904',
      pageId: 'the_plan_experiment',
      userId: 'AlexeiAndreev',
      edit: '0',
      type: 'newTag',
      createdAt: '2017-01-31 02:20:44',
      auxPageId: 'c_class_meta_tag',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '21839',
      pageId: 'the_plan_experiment',
      userId: 'AlexeiAndreev',
      edit: '0',
      type: 'newChild',
      createdAt: '2017-01-26 00:01:18',
      auxPageId: '7hy',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '21835',
      pageId: 'the_plan_experiment',
      userId: 'AlexeiAndreev',
      edit: '0',
      type: 'newChild',
      createdAt: '2017-01-26 00:01:02',
      auxPageId: 'the_plan',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '21812',
      pageId: 'the_plan_experiment',
      userId: 'AlexeiAndreev',
      edit: '3',
      type: 'newEdit',
      createdAt: '2017-01-21 03:39:21',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '21810',
      pageId: 'the_plan_experiment',
      userId: 'AlexeiAndreev',
      edit: '0',
      type: 'newAlias',
      createdAt: '2017-01-21 02:49:21',
      auxPageId: '',
      oldSettingsValue: 'the_plan',
      newSettingsValue: 'the_plan_experiment'
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '21811',
      pageId: 'the_plan_experiment',
      userId: 'AlexeiAndreev',
      edit: '2',
      type: 'newEdit',
      createdAt: '2017-01-21 02:49:21',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '21808',
      pageId: 'the_plan_experiment',
      userId: 'AlexeiAndreev',
      edit: '0',
      type: 'newChild',
      createdAt: '2017-01-21 02:42:39',
      auxPageId: '7hv',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '21776',
      pageId: 'the_plan_experiment',
      userId: 'AlexeiAndreev',
      edit: '0',
      type: 'newChild',
      createdAt: '2017-01-18 23:09:14',
      auxPageId: 'the_plan_stages',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '21773',
      pageId: 'the_plan_experiment',
      userId: 'AlexeiAndreev',
      edit: '0',
      type: 'newChild',
      createdAt: '2017-01-18 23:09:00',
      auxPageId: 'the_plan_orgs_and_people',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '21770',
      pageId: 'the_plan_experiment',
      userId: 'AlexeiAndreev',
      edit: '0',
      type: 'newChild',
      createdAt: '2017-01-18 23:08:02',
      auxPageId: '7h3',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '21767',
      pageId: 'the_plan_experiment',
      userId: 'AlexeiAndreev',
      edit: '0',
      type: 'newChild',
      createdAt: '2017-01-18 23:06:54',
      auxPageId: '7h5',
      oldSettingsValue: '',
      newSettingsValue: ''
    },
    {
      likeableId: '0',
      likeableType: 'changeLog',
      myLikeValue: '0',
      likeCount: '0',
      dislikeCount: '0',
      likeScore: '0',
      individualLikes: [],
      id: '21763',
      pageId: 'the_plan_experiment',
      userId: 'AlexeiAndreev',
      edit: '1',
      type: 'newEdit',
      createdAt: '2017-01-18 22:23:12',
      auxPageId: '',
      oldSettingsValue: '',
      newSettingsValue: ''
    }
  ],
  feedSubmissions: [],
  searchStrings: {},
  hasChildren: 'true',
  hasParents: 'false',
  redAliases: {},
  improvementTagIds: [],
  nonMetaTagIds: [],
  todos: [],
  slowDownMap: 'null',
  speedUpMap: 'null',
  arcPageIds: 'null',
  contentRequests: {}
}