Ideological Turing test

by Eliezer Yudkowsky Jan 10 2017

Can you explain the opposing position well enough that people can't tell whether you or a real advocate of that position created the explanation?

The Ideological Turing Test is a way of checking that you really understand an opposing position: explain it so well that an advocate of that position cannot tell whether the explanation was written by you or by a fellow advocate. The test is sometimes run online as a formal exercise, in which an audience tries to distinguish the pro-X and anti-X essays written by genuine pro-X and anti-X advocates from the essays actually written by their opponents. If you're pro-X, and you write an intelligent essay advocating anti-X, and other people can't tell that you were really pro-X, you pass the Ideological Turing Test: you've demonstrated that you honestly and thoroughly understand the arguments for anti-X.