If you have a stream of random bits that you can consume, and these bits are 50% ones and independent of each other, then several cute tricks can be used to employ them “economically”.
If routine b() produces the next random bit, and routine r() produces the next bit of the binary expansion of 1−R, then

    int cut(){int B = b(), R = r(); if(B && !R) return 1; if(R && !B) return 0; return cut();}

returns a 1 with probability R. More obscurely, so does:

    int cut(){int B = b(), R = !r(); return (B && R) || ((B || R) && cut());}
Alas this is not good enough. On average I use 2 bits of entropy to produce −((1/3)log_2(1/3) + (2/3)log_2(2/3)) = 0.918296 bits of entropy, the amount conveyed by a bit in a 2:1 channel. The discrepancy indicates that this does not achieve the efficiency of arithmetic coding. I think there is an idea lurking here but I can’t find it just now. It’s still a useful trick for when entropy is scarce.
The original notion of arithmetic coding is most cleanly expressed in Scheme. The message is a real number m between 0 and 1. Given a channel in which a 0 occurs with probability R and a 1 with probability 1−R, a channel coder is yielded by
    (define ((cc R) m)
      (lambda ()
        (let ((r (> m R)))
          (set! m (if r (/ (- m R) (- 1 R)) (/ m R)))
          (if r 1 0))))

Invoking cc creates a procedure whose successive invocations produce the bits to be transmitted.
    (define (Do n p)
      (let Do ((n n))
        (if (> n 0)
            (let ((u (- n 1)))
              (p u)
              (Do u)))))

    (let ((w ((cc 1/3) 0.2540226510663)))
      (Do 60 (lambda (j) (write (w)))))

yields: 011101100111001001110101001110111111111011101001101111111110