{ localUrl: '../page/not_more_paperclips.html', arbitalUrl: 'https://arbital.com/p/not_more_paperclips', rawJsonUrl: '../raw/3tm.json', likeableId: '2549', likeableType: 'page', myLikeValue: '0', likeCount: '1', dislikeCount: '0', likeScore: '1', individualLikes: [ 'NateSoares' ], pageId: 'not_more_paperclips', edit: '1', editSummary: '', prevEdit: '0', currentEdit: '1', wasPublished: 'true', type: 'wiki', title: 'You can't get more paperclips that way', clickbait: 'Most arguments that "A paperclip maximizer could get more paperclips by (doing nice things)" are flawed.', textLength: '4455', alias: 'not_more_paperclips', externalUrl: '', sortChildrenBy: 'likes', hasVote: 'false', voteType: '', votesAnonymous: 'false', editCreatorId: 'EliezerYudkowsky', editCreatedAt: '2016-05-25 22:33:44', pageCreatorId: 'EliezerYudkowsky', pageCreatedAt: '2016-05-25 22:33:44', seeDomainId: '0', editDomainId: 'EliezerYudkowsky', submitToDomainId: '0', isAutosave: 'false', isSnapshot: 'false', isLiveEdit: 'true', isMinorEdit: 'false', indirectTeacher: 'false', todoCount: '0', isEditorComment: 'false', isApprovedComment: 'true', isResolved: 'false', snapshotText: '', anchorContext: '', anchorText: '', anchorOffset: '0', mergedInto: '', isDeleted: 'false', viewCount: '77', text: 'Instrumental convergence says that various properties $P$ of an agent, often scary or detrimental-by-default properties like "trying to gain control of lots of resources" or "deceiving humans into thinking you are nice", will fall out of pursuing most utility functions $U.$ You might be tempted to hope that *nice* or *reassuring* properties $P$ would also fall out of most utility functions $U$ in the same natural way. 
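As a toy sketch of why this hope fails (the policy names and payoff numbers below are purely illustrative, not claims about any real system): a pure maximizer ranks policies by expected paperclips alone, so a humanly-nice policy gets chosen only if it happens to score strictly highest.

```python
# Illustrative only: hypothetical policies and made-up payoffs.
# A maximizer compares policies solely by expected paperclips;
# "niceness" is not an input to the comparison.
expected_paperclips = {
    "nice_policy_pi1": 1e6,      # keeps humans around, makes some clips
    "non_nice_policy_pi2": 1e9,  # converts everything, makes more clips
    "do_nothing": 0.0,
}

def maximizer_choice(payoffs):
    """Return the policy with the highest expected paperclip count."""
    return max(payoffs, key=payoffs.get)

print(maximizer_choice(expected_paperclips))  # the non-nice policy wins
```

Nothing in the selection rule rewards niceness, so the nice policy only wins by coincidence of the payoffs.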
In fact, your brain might be tempted to treat [10h Clippy the Paperclip Maximizer] as a political agent you were trying to cleverly persuade, and come up with clever arguments for why Clippy should do things *your* way *in order to get more paperclips*, like trying to persuade your boss why you ought to get a raise *for the good of the company.*\n\nThe problem here is that:\n\n- Generally, when you think of a nice policy $\\pi_1$ that produces some paperclips, there will be a non-nice policy $\\pi_2$ that produces *even more* paperclips.\n- *Clippy* is not trying to generate arguments for why it should do human-nice things in order to make paperclips; it is just neutrally pursuing paperclips. So Clippy is going to keep looking until it finds $\\pi_2.$\n\nFor example:\n\n• Your brain instinctively tries to persuade this imaginary Clippy to keep humans around by arguing, "If you keep us around as economic partners and trade with us, we can produce paperclips for you under Ricardo's Law of Comparative Advantage!" This is a policy $\\pi_1$ that would indeed produce *some* paperclips, but what would produce even *more* paperclips is the policy $\\pi_2$ of disassembling the humans into spare atoms and replacing them with optimized paperclip-producers.\n\n• Your brain tries to persuade an imaginary Clippy by arguing for policy $\\pi_1,$ "Humans have a vast amount of varied life experience; you should keep us around and let us accumulate more experience, in case our life experience lets us make good suggestions!" 
This would produce some expected paperclips, but what would produce *more* paperclips is policy $\\pi_2$ of "Disassemble all human brains and store the information in an archive, then simulate a much larger variety of agents in a much larger variety of circumstances so as to maximize the paperclip-relevant observations that could be made."\n\nAn unfortunate further aspect of this situation is that, in cases like this, your brain may be tempted to go on arguing for why really $\\pi_2$ isn't all that great and $\\pi_1$ is actually better, just like if your boss said "But maybe this company will be even better off if I spend that money on computer equipment" and your brain at once started to convince itself that computer equipment wasn't all that great and higher salaries were much more important for corporate productivity. (As Robert Trivers observed, deception of others often begins with deception of self, and this fact is central to understanding why humans evolved to think about politics the way we do.)\n\nBut since you don't get to *see* Clippy discarding your clever arguments and just turning everything in reach into paperclips - at least, not yet - your brain might hold onto its clever and possibly self-deceptive argument for why the thing *you* want is *really* the thing that produces the most paperclips.\n\nPossibly helpful mental postures:\n\n- Contemplate the *maximum* number of paperclips you think an agent could get by making paperclips the straightforward way - just converting all the galaxies within reach into paperclips. Okay, now does your nice policy $\\pi_1$ generate *more* paperclips than that? How is that even possible?\n- Never mind there being a "mind" present that you can "persuade". Suppose instead there's just a time machine that spits out some physical outputs, electromagnetic pulses or whatever, and the time machine outputs whatever electromagnetic pulses lead to the most future paperclips. What does the time machine do? 
Which outputs lead to the most paperclips as a strictly material fact?\n- Study evolutionary biology. During the pre-1960s days of evolutionary biology, biologists would often try to argue for why natural selection would result in humanly-nice results, like animals controlling their own reproduction so as not to overburden the environment. There's a similar mental discipline required [to not come up with clever arguments for why natural selection would do humanly nice things](http://lesswrong.com/lw/kr/an_alien_god/).', metaText: '', isTextLoaded: 'true', isSubscribedToDiscussion: 'false', isSubscribedToUser: 'false', isSubscribedAsMaintainer: 'false', discussionSubscriberCount: '1', maintainerCount: '1', userSubscriberCount: '0', lastVisit: '', hasDraft: 'false', votes: [], voteSummary: 'null', muVoteSummary: '0', voteScaling: '0', currentUserVote: '-2', voteCount: '0', lockedVoteType: '', maxEditEver: '0', redLinkCount: '0', lockedBy: '', lockedUntil: '', nextPageId: '', prevPageId: '', usedAsMastery: 'false', proposalEditNum: '0', permissions: { edit: { has: 'false', reason: 'You don't have domain permission to edit this page' }, proposeEdit: { has: 'true', reason: '' }, delete: { has: 'false', reason: 'You don't have domain permission to delete this page' }, comment: { has: 'false', reason: 'You can't comment in this domain because you are not a member' }, proposeComment: { has: 'true', reason: '' } }, summaries: {}, creatorIds: [ 'EliezerYudkowsky' ], childIds: [], parentIds: [ 'instrumental_convergence' ], commentIds: [], questionIds: [], tagIds: [ 'paperclip_maximizer', 'fallacy' ], relatedIds: [], markIds: [], explanations: [], learnMore: [], requirements: [], subjects: [], lenses: [], lensParentId: '', pathPages: [], learnMoreTaughtMap: {}, learnMoreCoveredMap: {}, learnMoreRequiredMap: {}, editHistory: {}, domainSubmissions: {}, answers: [], answerCount: '0', commentCount: '0', newCommentCount: '0', linkedMarkCount: '0', changeLogs: [ { likeableId: '0', 
likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '11049', pageId: 'not_more_paperclips', userId: 'EliezerYudkowsky', edit: '1', type: 'newEdit', createdAt: '2016-05-25 22:33:44', auxPageId: '', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '11034', pageId: 'not_more_paperclips', userId: 'EliezerYudkowsky', edit: '0', type: 'deleteTag', createdAt: '2016-05-25 22:24:31', auxPageId: 'start_meta_tag', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '11033', pageId: 'not_more_paperclips', userId: 'EliezerYudkowsky', edit: '1', type: 'newTag', createdAt: '2016-05-25 22:24:30', auxPageId: 'fallacy', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '11026', pageId: 'not_more_paperclips', userId: 'EliezerYudkowsky', edit: '1', type: 'newTag', createdAt: '2016-05-25 22:09:43', auxPageId: 'start_meta_tag', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '11025', pageId: 'not_more_paperclips', userId: 'EliezerYudkowsky', edit: '1', type: 'newTag', createdAt: '2016-05-25 22:09:05', auxPageId: 'paperclip_maximizer', oldSettingsValue: '', newSettingsValue: '' }, { likeableId: '0', likeableType: 'changeLog', myLikeValue: '0', likeCount: '0', dislikeCount: '0', likeScore: '0', individualLikes: [], id: '11024', pageId: 'not_more_paperclips', userId: 'EliezerYudkowsky', edit: '1', type: 'newParent', createdAt: '2016-05-25 22:08:59', auxPageId: 'instrumental_convergence', 
oldSettingsValue: '', newSettingsValue: '' } ], feedSubmissions: [], searchStrings: {}, hasChildren: 'false', hasParents: 'true', redAliases: {}, improvementTagIds: [], nonMetaTagIds: [], todos: [], slowDownMap: 'null', speedUpMap: 'null', arcPageIds: 'null', contentRequests: {} }