"The reasoning which could cause us to remove ou..."

https://arbital.com/p/956

by Donald Hobson Jun 17 2018


The kinds of reasoning that could cause us to remove our minimal-utility situations from the AI's utility function are the same ones that could cause the AI to change its utility function in the first place: resistance to blackmail, and cosmic-ray errors. So the patch suffers from the same problem. Furthermore, if the universe decides to give our AI a choice between an existential catastrophe and a hyper-existential catastrophe, it won't care which it gets. This works on the individual level too: if someone is severely ill and begging for death, this AI won't grant it to them (there is a non-zero chance of the mind starting to enjoy itself again). Of course, how much of a problem any of this is depends on how likely reality is to hand you such a bad position.
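
As a toy illustration of the indifference point (a sketch with made-up numbers, not anything from the original page): if the utility function is clipped at a floor so that no outcome can score below it, then every outcome at or below the floor scores identically, and the agent gains nothing by steering from the worst catastrophe to a merely terrible one.

```python
FLOOR = -100  # hypothetical minimal utility assigned to all sufficiently bad outcomes

def patched_utility(raw_utility: float) -> float:
    """Clip raw utility at the floor, so 'hell' outcomes can't dominate planning."""
    return max(raw_utility, FLOOR)

# Illustrative raw utilities, chosen purely for the example:
existential = -1_000            # e.g. everyone dies
hyper_existential = -1_000_000  # e.g. everyone suffers indefinitely

# After clipping, both catastrophes score the same...
assert patched_utility(existential) == patched_utility(hyper_existential) == FLOOR
# ...so the agent is indifferent between them, and won't pay any cost
# to trade the hyper-existential outcome for the merely existential one.
```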