Imagine that you are in a room with several mutually suspicious people who each want to collectively agree on a random number from 0 to 9. Each is suspicious that some of the others want to bias the choice. They can agree to the following protocol. They all submit a sealed number between 0 and 9. When they have all been submitted, the submissions are all unsealed and the numbers summed. The sum modulo 10 has a flat distribution if at least one of the submitters has chosen his number from a flat distribution.
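
Here is a minimal sketch of the arithmetic of that protocol in Python (my own illustration; the "sealing" step is modeled only as collecting every digit before any is revealed, and the function name is invented):

```python
import secrets

def collective_digit(num_parties):
    # Each party picks a digit 0-9; nothing is revealed until all are in.
    submissions = [secrets.randbelow(10) for _ in range(num_parties)]
    # Unseal everything and take the sum modulo 10 as the group's choice.
    return sum(submissions) % 10, submissions

digit, subs = collective_digit(5)
print(subs, "-> collective choice:", digit)
```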

If any person X actually wants a flat resulting distribution and is competent to choose one, then X can reason as follows. No one can know what X’s choice will be. The others will each choose a number, and the sum of those will be z. If X chooses x, then the ultimate collective choice will be x + z (mod 10). If x is from a flat distribution, then so is x + z.

If even one of them is honest and competent, then the others cannot cause a bias, even in collusion!

This uses x + y = y + x and x + (y + z) = (x + y) + z, where + is addition mod 10, together with the fact that adding any fixed z merely permutes the ten digits. That is all you need of +, and it is why exclusive or and several other operations work here as well.
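
A quick check of the property doing the real work (my own illustration): translation by any fixed z is a permutation, so a flat x yields a flat x + z. The XOR case is checked over the 4-bit values 0 to 15, since XOR acts bitwise.

```python
# Adding a fixed z modulo 10 permutes the digits 0..9, so a flat
# distribution on x stays flat on (x + z) % 10.
for z in range(10):
    assert sorted((x + z) % 10 for x in range(10)) == list(range(10))

# XOR with a fixed z likewise permutes the 4-bit values 0..15,
# which is why exclusive or works just as well for bit strings.
for z in range(16):
    assert sorted(x ^ z for x in range(16)) == list(range(16))

print("translation by any fixed z is a permutation in both cases")
```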

More generally, the derived number has at least as much entropy as the maximum entropy among the submitters, provided that the input from that maximum-entropy submitter is independent of the others’. You can compute entropy as follows: if the probability of digit d is Pd, then the entropy of the selection is −∑ Pd log(Pd). Shannon defined information (entropy) quantitatively and derived this formula.
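
The following sketch (my own, with made-up example distributions) computes that entropy and checks the claim for two independent submitters: the mod-10 sum of their digits has at least as much entropy as the better of the two.

```python
from math import log2

def entropy(p):
    # Shannon entropy in bits: -sum of Pd * log2(Pd) over digits d.
    return -sum(pd * log2(pd) for pd in p if pd > 0)

def sum_mod10(p, q):
    # Distribution of (x + y) % 10 for independent x ~ p and y ~ q.
    r = [0.0] * 10
    for d, pd in enumerate(p):
        for e, qe in enumerate(q):
            r[(d + e) % 10] += pd * qe
    return r

badly_biased  = [0.5, 0.5] + [0.0] * 8     # only ever submits 0 or 1
mildly_biased = [0.05] * 5 + [0.15] * 5    # favors the high digits
combined = sum_mod10(badly_biased, mildly_biased)

print(entropy(badly_biased), entropy(mildly_biased), entropy(combined))
assert entropy(combined) >= max(entropy(badly_biased), entropy(mildly_biased))
```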

If in the protocol the sum is not announced, then, barring total collusion, no one will know it.


Suppose that x and y are independent bits, that the probability that x is 1 is P(x), and that the probability that y is 1 is P(y). The probability that x ⊕ y is 1 is
P(x ⊕ y) = P(x)(1 − P(y)) + (1 − P(x))P(y) = P(x) + P(y) − 2P(x)P(y). P(x ⊕ y) is never farther from 0.5 than either P(x) or P(y) — a tedious but elementary slog.
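
The slog shortens if you write P(x) = 1/2 + a and P(y) = 1/2 + b: then P(x ⊕ y) = 1/2 − 2ab, and since |a| and |b| are each at most 1/2, |2ab| can exceed neither |a| nor |b|. Below is a brute-force numerical check of the same claim (my own illustration; the function name is invented):

```python
# Check over a grid that P(x ^ y) = p + q - 2*p*q is never farther from
# 0.5 than either p or q, for independent bits x and y.
def xor_prob(p, q):
    return p * (1 - q) + (1 - p) * q   # = p + q - 2*p*q

grid = [i / 100 for i in range(101)]
for p in grid:
    for q in grid:
        r = xor_prob(p, q)
        assert abs(r - 0.5) <= min(abs(p - 0.5), abs(q - 0.5)) + 1e-12

print("xor never increases the bias of either input bit")
```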