
It is known that the existence of pseudorandom generators (PRGs) is equivalent to the existence of one-way functions. In turn, the latter is an open problem.

I am curious whether anyone has developed a kind of "practical" PRG, one that is weaker than a PRG in the sense of computational indistinguishability from a uniform random number generator.

I know of some statistical tests for randomness, but is there any rigorous theory on the subject?

This is a repost from MO, and I intend to remove the original question.

Rubi Shnol

2 Answers


It is worth mentioning that there is a connection in complexity theory, often called "Hardness vs. Pseudorandomness", that makes this question somewhat difficult. It may not be surprising that, given a strong enough PRG, one can derandomize certain randomized algorithms; in particular, one way to try to prove $P = BPP$ is to construct a provably strong enough pseudorandom generator.

What is perhaps surprising is that this is intimately connected to the existence of hard problems. Specifically, if there is a problem in $E$ (deterministic exponential time) that requires exponentially sized circuits, I believe it implies that $P = BPP$ via the construction of an explicit pseudorandom generator. Moreover, some form of the reverse holds --- being able to derandomize $BPP$ (via the construction of a provably good PRG of good enough parameters) leads to circuit lower bounds, which is a notoriously hard subject. Keywords for this are things like "Nisan-Wigderson PRG", or you can look at these notes for some pointers.

I think (but am fuzzy on the details) that one can generalize the above to other circuit classes as well, i.e. the existence of explicit provably secure PRGs against some circuit class is intimately related to circuit lower bounds against some related circuit class. As in general most circuit lower bounds we can prove are against fairly restricted models of computation, I would expect that the literature on "rigorous PRGs" is restricted to PRGs that are secure against relatively weak forms of computation.

I don't know a great place that summarizes this all --- I do not myself work in the area, but heard about it from someone who does. Perhaps their thesis would be a good reference, but I'll admit that I have not personally read it.

Mark Schultz-Wu

If the goal is cryptographic strength [given the context stated by the OP, I assume it is] in a PRNG that will be used in practice, then randomness-testing methods can be used to rule out generators as being weak, but obviously they cannot rigorously demonstrate randomness.
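As a concrete illustration of the "rule out, not prove" point, here is a minimal Python sketch of one standard statistical test, a frequency (monobit) test in the style of NIST SP 800-22; the function name and significance level are my own choices for illustration.

```python
import math
import os

def monobit_test(bits, alpha=0.01):
    """Frequency (monobit) test in the style of NIST SP 800-22.

    Rejects the hypothesis "bits are uniform" when the balance of ones
    and zeros deviates too far from 50/50.  Passing says nothing about
    cryptographic strength; failing rules the generator out as weak.
    """
    n = len(bits)
    s = sum(1 if b else -1 for b in bits)            # +1 for each 1, -1 for each 0
    p_value = math.erfc(abs(s) / math.sqrt(2 * n))   # two-sided tail probability
    return p_value >= alpha, p_value

# Example: an obviously biased stream fails, OS randomness should pass.
good = [byte & 1 for byte in os.urandom(100000)]     # one bit per byte of OS randomness
bad = [1] * 100000                                   # constant, clearly non-random stream
print(monobit_test(good))                            # expected: (True, p-value well above 0.01)
print(monobit_test(bad))                             # (False, 0.0)
```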

However, there are pitfalls in using generators that are based on problems assumed to be hard.

For CSPRNGs such as Blum-Blum-Shub (BBS), the most well-known example, care must be taken that the extraction rate of "cryptographically strong bits" is not too high compared to the state space of the BBS iteration. The theoretical suggestion is that if the iteration is $$ x_{k+1}=x_k^2 \pmod n,\quad n=pq $$ where $p$ and $q$ are large primes (with $p \equiv q \equiv 3 \pmod 4$), one should take at most $O(\log \log n)$ least significant bits of each $x_k$ and output them as pseudorandom bits.

This approach is full of pitfalls, however. First, the specification is asymptotic, so what is a reasonable constant to use in front of the $O(\cdot)$ expression? Vanstone and Menezes, in an Indocrypt paper, suggested using no more than 1 bit, i.e., just the least significant bit.
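To make the iteration concrete, here is a minimal Python sketch of a BBS generator that follows the conservative one-bit-per-iteration advice above; the toy primes and parameter names are mine, and a real instantiation needs large, randomly chosen Blum primes and a seed coprime to the modulus.

```python
def bbs_bits(p, q, seed, nbits, bits_per_iter=1):
    """Blum-Blum-Shub sketch: iterate x_{k+1} = x_k^2 mod n with n = p*q,
    p ≡ q ≡ 3 (mod 4), emitting `bits_per_iter` least-significant bits of
    each state.  The conservative choice discussed above is 1 bit."""
    n = p * q
    x = pow(seed, 2, n)                  # start from a quadratic residue mod n
    out = []
    while len(out) < nbits:
        x = pow(x, 2, n)
        out.extend((x >> i) & 1 for i in range(bits_per_iter))
    return out[:nbits]

# Toy-sized Blum primes (both ≡ 3 mod 4) for demonstration only; real use
# requires primes of cryptographic size.
demo = bbs_bits(p=499, q=547, seed=2021, nbits=32)
print("".join(map(str, demo)))
```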

More seriously [see the first answer to this question on Crypto Stack Exchange for details], it turns out that the security reduction for BBS is so inefficient that perhaps it should not be used at all in practice:

Suppose you use BBS with a 768-bit modulus. You've read that 768 bits is enough to make factoring infeasible, so this sounds peachy. You've read that it is safe to extract O(lg n) bits in each iteration; here n = 768, and lg n = 9.58, so you decide to extract 9 bits in each iteration. You use it to generate a pseudorandom stream of $10^7$ bits (about 1MB of pseudorandom data). How much security do you get? Answer: the security proof guarantees security against any adversary that uses at most $2^{-264}$ steps of computation. Yes, this is an utterly ridiculous and useless statement! To put it in plain English, the security proof guarantees absolutely nothing useful at all.

On the other hand, if one wanted security against an attack of complexity approximately $2^{100}$ steps (reasonable, since $2^{128}$ brute-force complexity is standard these days), one would have to choose $n$ of about 6800 bits. This is more feasible now that RSA moduli of 4096 bits are common.

Notwithstanding this, it is amazing how inefficient the security reduction for BBS is. One can demonstrate that breaking BBS can be as much as $1054\, n^3$ times faster than factoring the BBS modulus $n$.
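To get a feel for what a multiplicative loss of $1054\, n^3$ costs in concrete security terms, the arithmetic is just a few logarithms; in the sketch below I read $n$ as the bit length of the modulus, which is an assumption on my part.

```python
import math

# Assumed reading: n is the modulus bit length, and the reduction loses a
# multiplicative factor of 1054 * n^3 between "break BBS" and "factor n".
for n in (768, 6800):
    loss_bits = math.log2(1054 * n**3)
    print(f"n = {n:5d} bits: the reduction gives away about {loss_bits:.0f} bits of security")
```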

So practical CSPRNG security is a moving target, very dependent on algorithmic developments, even in the case of generators designed around problems assumed to be hard.

kodlu