Mind projection fallacy

https://arbital.com/p/mind_projection

by Eliezer Yudkowsky Jun 30 2016 updated Jun 30 2016

Uncertainty is in the mind, not in the environment; a blank map does not correspond to a blank territory. In general, the territory may have a different ontology from the map.


[summary: One commits the mind projection fallacy by treating features of how one's model of the world works as if they were features of the world itself.

Suppose you flip a coin, slap it against your wrist, and don't look at the result. Does it make sense to say that the probability of the coin being heads is 50%? How can this be true, when the coin has already landed, and is either definitely heads or definitely tails? One who says "the coin is fundamentally uncertain; it is a feature of the coin that it is always 50% likely to be heads" commits the mind projection fallacy. Uncertainty is in the mind, not in reality. It makes sense that brains have an internal measure of how uncertain they are about the world, but that uncertainty is not a fact about the coin, it's a fact about the uncertain person. The coin itself is not sure or unsure.]

The "mind projection fallacy" occurs when somebody expects an overly direct resemblance between the intuitive language of the mind, and the language of physical reality.

Consider the [map_territory map and territory] metaphor, in which the world is like a territory and your mental model of the world is like a map of that territory. In this metaphor, the mind projection fallacy is analogous to thinking that the territory can be folded up and put into your pocket.

As an archetypal example: Suppose you flip a coin, slap it against your wrist, and don't yet look at it. Does it make sense to say that the probability of the coin being heads is 50%? How can this be true, when the coin itself is already either definitely heads or definitely tails?

One who says "the coin is fundamentally uncertain; it is a feature of the coin that it is always 50% likely to be heads" commits the mind projection fallacy. Uncertainty is in the mind, not in reality. If you're ignorant about a coin, that's not a fact about the coin; it's a fact about you. It makes sense that your brain, the map, has an internal measure of how sure it is of something. But that doesn't mean the coin itself has to contain a corresponding quantity of increased or decreased sureness; it is just heads or tails.

The [-ontology] of a system is its set of elementary or basic components. The ontology of your model of the world may include intuitive measures of uncertainty that it uses to represent the state of the coin, treating them as primitives much as [float floating-point numbers] are primitives in computers. The mind projection fallacy occurs whenever someone reasons as if the territory, the physical universe and its laws, must have the same sort of ontology as the map, our models of reality.
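As a loose illustration (not part of the original article), here is a minimal Python sketch of the distinction, with hypothetical `Coin` and `Observer` classes: the coin has a single definite state, while the uncertainty, represented as a floating-point credence, exists only inside each observer's model.

```python
import random

class Coin:
    """The territory: the coin is definitely heads or definitely tails."""
    def __init__(self):
        self.face = random.choice(["heads", "tails"])  # one definite fact

class Observer:
    """The map: holds a credence (a float), which is a fact about the observer."""
    def __init__(self):
        self.credence_heads = 0.5  # the uncertainty lives here, not in the coin

    def look_at(self, coin):
        # Looking at the coin changes the observer's credence,
        # not anything about the coin itself.
        self.credence_heads = 1.0 if coin.face == "heads" else 0.0

coin = Coin()
alice, bob = Observer(), Observer()
alice.look_at(coin)
# Alice and Bob now assign different probabilities to the same coin:
# the coin's face is fixed; only their maps differ.
print(coin.face, alice.credence_heads, bob.credence_heads)
```

Two observers can coherently hold different credences about the same coin, which is easy to see once the credence is stored in the observer object rather than the coin object.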
