"Darn it, I wanted to use th..."

https://arbital.com/p/3nh

by Eliezer Yudkowsky May 16 2016


Darn it, I wanted to use this term to distinguish "not-explicitly-consequentialistically optimizing for $Y$ still optimizes for $X$ when $X$ is being varied and $X$ is causally relevant to $Y$" from "having an explicit model of $X$ being relevant to $Y$ and therefore explicitly forming goals about $X$ and searching for strategies that affect $X$." (E.g., natural selection does implicit consequentialism, humans do explicit consequentialism.) I'm not sure I can think of an equally good replacement term for the thing I wanted to say. Would "proxy consequentialism" work for the thing you wanted to say?
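
For concreteness, a minimal toy sketch of the two cases (the quadratic world and the function names `implicit_consequentialism` / `explicit_consequentialism` are illustrative assumptions, not anything load-bearing): the implicit version varies $X$ blindly and only ever sees realized $Y$, while the explicit version searches over $X$ using a model of the $X$-to-$Y$ relationship.

```python
import random

def realized_y(x):
    """The actual causal link from X to Y (e.g., trait -> reproductive fitness).
    The implicit optimizer never represents this function; it only sees outcomes."""
    return -(x - 3.0) ** 2  # Y is highest when X is near 3


def implicit_consequentialism(steps=2000):
    """Vary X blindly and keep whatever scores better on realized Y.
    X ends up optimized only because it is causally relevant to Y."""
    x = 0.0
    for _ in range(steps):
        candidate = x + random.gauss(0.0, 0.1)      # blind variation of X
        if realized_y(candidate) > realized_y(x):   # selection on realized Y
            x = candidate
    return x


def explicit_consequentialism(model_of_y, candidate_xs):
    """Hold an explicit model of how X relates to Y, form a goal about X,
    and search candidate strategies for the one with the best *predicted* Y."""
    return max(candidate_xs, key=model_of_y)


print(implicit_consequentialism())                       # drifts toward 3 with no model anywhere
print(explicit_consequentialism(lambda x: -(x - 3.0) ** 2,
                                [i * 0.5 for i in range(13)]))  # picks 3.0 by explicit search
```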