This paper describes a computable normal number. One property of normal numbers is that, written in binary, 0-bits and 1-bits occur with equal asymptotic frequency: as $n$ gets large, the counts of 0s and 1s among the first $n$ bits are about the same (normality is actually stronger: every length-$k$ block of bits occurs with limiting frequency $2^{-k}$). Doesn't this mean that the algorithm computing the bits of this normal number is a perfect random number generator? I thought deterministic random number generation was inherently pseudo-random, by information-theoretic limits. (Also: can anyone figure out the complexity of computing the $i^{th}$ bit with their method? The math in that paper is beyond me.)
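
For concreteness: the binary Champernowne constant $0.110111001011101111\ldots$ (concatenate $1, 2, 3, \ldots$ written in binary) is, as far as I understand, provably normal in base 2, and its bits are trivial to compute. Here's a naive sketch of mine (not the paper's algorithm, and only linear-time per bit, nothing clever):

```python
def champernowne_bit(i):
    """Return bit i (0-indexed) after the binary point of
    0.(1)(10)(11)(100)(101)... -- the binary Champernowne constant."""
    n = 1  # positive integer whose binary digits we are scanning
    while True:
        length = n.bit_length()
        if i < length:
            # bit number i, counted from the left, of n's binary form
            return (n >> (length - 1 - i)) & 1
        i -= length  # skip past all of n's digits
        n += 1

# First 40 bits: balanced in the limit, yet completely predictable.
print("".join(str(champernowne_bit(i)) for i in range(40)))
```

So balanced bit statistics clearly coexist with total predictability here, which only makes my confusion worse.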
Finally: many easily computable irrationals, like $\sqrt{2}$ and $\pi$, are very strongly conjectured to be normal. If that were proven, wouldn't it give a poly-time perfect random number generator, which is definitely impossible? What am I missing here?
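
P.S. The conjecture at least holds up empirically as far as I can check. A quick exact-arithmetic count of the ones among the first 10,000 bits of $\sqrt{2}$ (again my own sketch, nothing to do with the paper):

```python
from math import isqrt

N = 10_000
# isqrt(2 << 2N) = floor(sqrt(2) * 2^N): its binary digits, after the
# leading 1 (the integer part), are the first N fractional bits of sqrt(2).
bits = bin(isqrt(2 << (2 * N)))[3:]  # drop the '0b1' prefix
print(len(bits), bits.count("1") / len(bits))  # ratio hovers near 0.5
```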
