If any person X actually wants a flat resulting distribution and is competent to choose one, then X can reason as follows. No one else can know what X’s choice will be. The others will each choose a number, and the sum of their choices will be some z. If X chooses x then the ultimate collective choice will be x+z. If x is drawn from a flat distribution, then so is x+z.

If even one of them is honest and competent, then the others cannot cause a bias, even in collusion!

This uses x+y = y+x and x+(y+z) = (x+y)+z, where + is addition mod 10. That is all the protocol needs from +, and that is why exclusive or and several other operations work just as well here.
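The argument above is easy to check empirically. The following is a minimal sketch (the participant count and the colluders’ fixed picks are my own illustrative choices, not from the note): several colluders submit fixed digits while one honest participant draws uniformly from 0..9, and the collective choice is the sum mod 10.

```python
import random
from collections import Counter

def trial():
    colluders = [7, 7, 3]          # arbitrary, maximally biased picks; z = 17
    x = random.randrange(10)       # the honest participant X's flat choice
    return (sum(colluders) + x) % 10   # the collective choice, x + z mod 10

counts = Counter(trial() for _ in range(100_000))
# Despite the collusion, each digit 0..9 turns up roughly 10% of the time.
print(sorted(counts.items()))
```

Because adding a fixed z mod 10 merely permutes the ten digits, the flat distribution of x survives intact.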

More generally, the derived number has at least as much entropy as the maximum of the entropies of the submitters, provided the input from that maximum-entropy submitter is independent of the others’.
You can compute entropy as follows: if the probability of digit d is P_{d}, then the entropy of the selection is −∑_{d} P_{d} log(P_{d}).
Shannon defined information (entropy) quantitatively and derived this formula.
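Here is a small sketch of both points: Shannon’s entropy formula, and the fact that summing mod 10 with an independent flat digit restores full entropy. The particular skewed distributions are my own examples.

```python
from math import log2

def entropy(probs):
    """Shannon entropy, -sum p*log2(p) in bits; p == 0 terms contribute 0."""
    return -sum(p * log2(p) for p in probs if p > 0)

def add_mod10(p, q):
    """Distribution of (X + Y) mod 10 for independent digits X ~ p, Y ~ q."""
    r = [0.0] * 10
    for i, pi in enumerate(p):
        for j, qj in enumerate(q):
            r[(i + j) % 10] += pi * qj
    return r

uniform = [0.1] * 10               # a flat digit: log2(10) ~ 3.32 bits
skewed = [0.55] + [0.05] * 9       # a heavily biased digit: fewer bits
print(entropy(uniform))            # ~ 3.3219
print(entropy(skewed))             # strictly less
print(entropy(add_mod10(skewed, uniform)))  # flat again: ~ 3.3219
```

The last line illustrates the stronger claim as well: the entropy of the mod-10 sum is never below the larger of the two input entropies, as long as the inputs are independent.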

If the protocol does not announce the sum, then, barring total collusion, no one will know it.

Suppose that x and y are independent bits, with P(x) and P(y) the probabilities that each is 1. Then

P(x ⩝ y) = P(x)(1 − P(y)) + (1 − P(x))P(y) = P(x) + P(y) − 2P(x)P(y). P(x ⩝ y) is never farther from 0.5 than either P(x) or P(y): a tedious but elementary slog, or note that P(x ⩝ y) − 1/2 = −2(P(x) − 1/2)(P(y) − 1/2).
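To spare the reader the slog, a short check of the claim over a grid of biases (the grid values are mine, chosen to include the extreme cases 0 and 1):

```python
def p_xor(px, py):
    """P(x XOR y = 1) for independent bits with P(x)=px, P(y)=py."""
    return px * (1 - py) + (1 - px) * py

# The biases multiply: p_xor - 1/2 = -2(px - 1/2)(py - 1/2),
# so the result is never farther from fair than either input bit.
for px in (0.0, 0.1, 0.3, 0.5, 0.9, 1.0):
    for py in (0.0, 0.2, 0.5, 0.7, 1.0):
        bias = abs(p_xor(px, py) - 0.5)
        assert bias <= abs(px - 0.5) + 1e-12
        assert bias <= abs(py - 0.5) + 1e-12
print("XOR never moves the result farther from 0.5")
```

In particular, a single fair bit (px = 0.5) forces the XOR to be exactly fair, the one-bit analogue of the honest submitter in the mod-10 protocol.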