Valley of Dangerous Complacency

by Eliezer Yudkowsky Mar 23 2016

When the AGI works often enough that you let down your guard, but it still has bugs. Imagine a robotic car that almost always steers perfectly, but sometimes heads off a cliff.

The Valley of Dangerous Complacency is when a system works often enough that you let down your guard around it, but in fact the system is still dangerous enough that full vigilance is required.

Compare the "Uncanny Valley", where a machine system is partially humanlike - humanlike enough that humans try to hold it to a human standard - but not humanlike enough to actually seem satisfactory when held to that standard. In terms of user experience, this means there is a valley: as the system's degree of humanlikeness increases, the user experience gets worse before it gets better. Similarly, if users become complacent, a 99.99% reliable system can be worse than a 99% reliable one, even though, with enough reliability, the degree of safety starts climbing back out of the valley.
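The claim that higher reliability can produce lower safety can be sketched with a toy model. All the numbers below are illustrative assumptions, not measurements: suppose a vigilant operator catches 99.5% of failures, a complacent one catches only 30%, and complacency sets in once the system seems more than 99% reliable.

```python
# Toy model of the Valley of Dangerous Complacency.
# Assumed (not measured) parameters: vigilant operators catch 99.5%
# of failures; complacency sets in above 99% reliability, after which
# operators catch only 30% of failures.

def vigilance(reliability: float) -> float:
    """Fraction of failures the human operator catches in time."""
    return 0.995 if reliability <= 0.99 else 0.30  # complacency kicks in

def expected_harm(reliability: float) -> float:
    """Probability per trial of a failure that nobody catches."""
    return (1.0 - reliability) * (1.0 - vigilance(reliability))

for r in (0.99, 0.999, 0.9999, 0.999999):
    print(f"{r:.6f} reliable -> {expected_harm(r):.2e} uncaught failures per trial")
```

Under these assumptions the 99.99% reliable system produces more uncaught failures per trial than the 99% reliable one, because the drop in vigilance outweighs the drop in failure rate; only at much higher reliability does safety climb back out of the valley.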


Eric Rogstad

I'm reminded of this article, which provides some interesting examples.