Humean degree of freedom

by Eliezer Yudkowsky Mar 14 2016

A concept includes 'Humean degrees of freedom' when the intuitive borders of the human version of that concept depend on our values, making that concept less natural for AIs to learn.

A "Humean degree of freedom" appears in a cognitive system whenever some quantity, label, or concept depends on the choice of utility function, or on more generalized preferences. For example, the notion of an "important impact on the world" depends on which variables, when changed, affect something the system cares about. So if you tell an AI "Tell me about any important impacts of this action," you are asking it to perform a calculation that depends on your preferences, which may be highly complex and difficult to identify to the AI.
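A minimal sketch of this dependence (all names and numbers here are invented for illustration, not drawn from the article): the same scoring function, fed the same change in the world, rates it very differently under different preference weightings, so "important impact" is not a preference-free concept.

```python
def importance(world_change, weights):
    """Score a change by how much it moves variables this agent cares about.

    world_change: dict mapping variable names to how much they changed.
    weights: dict mapping variable names to how much the agent cares.
    Variables the agent assigns no weight contribute nothing to the score.
    """
    return sum(weights.get(var, 0.0) * abs(delta)
               for var, delta in world_change.items())


# One and the same action, described as its effects on the world:
change = {"paperclips_made": 1000.0, "humans_displeased": 2.0}

# Two agents with different utility functions (hypothetical weightings):
human_weights = {"humans_displeased": 50.0, "paperclips_made": 0.01}
paperclip_weights = {"paperclips_made": 1.0}

print(importance(change, human_weights))      # 110.0
print(importance(change, paperclip_weights))  # 1000.0
```

The function itself is value-neutral; everything that makes an impact "important" lives in the `weights` argument, which is exactly the Humean degree of freedom.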