"I fail to see how this setup is not fair - but ..."

https://arbital.com/p/5j5

by Jaime Sevilla Molina Jul 22 2016


I fail to see how this setup is not fair - but more importantly, I fail to see how LDT is losing in this situation. If the payoff matrix is CC: 2/2, CD: 0/3, DD: 1/1, then if LDT cooperates in every round it will get $~$99\cdot 2=198$~$ utilons, while if it defected it would get only $~$99\cdot 1=99$~$ utilons, since its LDT opponents would predict the defection and defect back.

Thus $~$LDT$~$ wins $~$198$~$ utilons in this situation, while a CDT agent in its shoes would win $~$99$~$ utilons by defecting each round.
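The arithmetic above can be checked with a small sketch. The decision rules here are my own simplifying assumptions, not part of the original setup: 100 agents total (so 99 opponents each), LDT agents mutually cooperate, LDT defects against anyone it predicts will defect, and CDT always defects in a one-shot PD.

```python
# Payoff matrix from the comment: CC: 2/2, CD: 0/3, DD: 1/1.
# Entries are (my move, opponent's move) -> my payoff.
PAYOFF = {("C", "C"): 2, ("C", "D"): 0, ("D", "C"): 3, ("D", "D"): 1}

def move(me, opponent):
    """Assumed decision rules: LDT cooperates only with other LDT agents;
    CDT defects unconditionally in a one-shot PD."""
    if me == "LDT":
        return "C" if opponent == "LDT" else "D"
    return "D"

def tournament_score(agent, opponents):
    """Total payoff for `agent` over a round of one-shot PDs."""
    return sum(PAYOFF[(move(agent, o), move(o, agent))] for o in opponents)

ldt_score = tournament_score("LDT", ["LDT"] * 99)  # 99 * 2 = 198
cdt_score = tournament_score("CDT", ["LDT"] * 99)  # 99 * 1 = 99 (mutual defection)
```

Under these assumptions LDT's 198 strictly beats the 99 that a defect-every-round agent collects from the same opponents.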

The situation changes if the payoff becomes: having a higher score than CDT, $~$1$~$; having an equal or lower score than CDT, $~$0$~$.

Then the game is clearly rigged, as there is no deterministic strategy that LDT could follow that would lead to a win. But neither could CDT win if it were placed in the same situation.


Comments

Eliezer Yudkowsky

I'll edit to be more precise: A CDT agent thinks "me and an LDT agent facing off against 99 other LDT agents in a oneshot PD tournament" is a fair test of it versus the LDT agent. A CDT agent does not think that "Me facing off against 99 CDT agents and 1 LDT agent, versus an LDT agent facing 99 LDT agents and 1 CDT agent" is a fair and symmetrical test. On the CDT view, the LDT agent is being presented with an entirely different payoff matrix for its options in the second test.

Jaime Sevilla Molina

I do not think that "Me facing off against 99 CDT agents and 1 LDT agent, versus an LDT agent facing 99 LDT agents and 1 CDT agent" is fair either.

The thing that confuses me is that you are changing the universe in which you put the agents to compete.

To my understanding, the universe should be something of the form "{blank} against a CDT agent and 100 LDT agents in a one-shot prisoners dilemma tournament", and then you fill the "blank" with agents and compare their scores.

If you are using different universe templates for different agents, then you are violating extensionality, and I can hardly consider that a fair test.
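The "fill in the blank" test described above can be sketched as a single fixed universe template evaluated at each candidate agent. The decision rules are illustrative assumptions (LDT cooperates only with LDT; CDT always defects), not a formal model:

```python
# Payoff matrix from the thread: CC: 2/2, CD: 0/3, DD: 1/1.
PAYOFF = {("C", "C"): 2, ("C", "D"): 0, ("D", "C"): 3, ("D", "D"): 1}

def move(me, opponent):
    # Assumed behavior: LDT cooperates only with other LDT agents; CDT defects.
    if me == "LDT":
        return "C" if opponent == "LDT" else "D"
    return "D"

def universe(blank):
    """One template for every candidate:
    {blank} against 1 CDT agent and 100 LDT agents, one-shot PDs."""
    opponents = ["CDT"] + ["LDT"] * 100
    return sum(PAYOFF[(move(blank, o), move(o, blank))] for o in opponents)

# Extensional comparison: both agents are dropped into literally the same
# universe, and only their scores are compared.
scores = {agent: universe(agent) for agent in ("LDT", "CDT")}
```

With this one template, LDT scores 1 (mutual defection with CDT) plus 100·2 = 201, while CDT scores 1 plus 100·1 = 101, so the extensional test favors LDT without using different universes for different agents.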

Eliezer Yudkowsky

Makes sense (though the comparison you quote wasn't being advocated as a fair example by either agent). I'll rewrite again.