
To what degree can we define an RSA variant, with a security argument that it is as safe as regular RSA with a given modulus size $m$ (e.g. $m=2048$), in which the public key has a compact representation of $k\ll m$ bits?

We can fix the public exponent to our favorite customary value, e.g. $e=2^{16}+1$ or $e=3$, and thus need to store only the public modulus $n$. We need not store the leftmost bit of $n$, which is always set by definition; nor the rightmost bit, which is always set since $n$ is odd. With a little effort, we could save a (very) few more bits by noticing that $n$ has no small divisors, but that still leaves $k \sim m$ bits.
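For concreteness, a minimal sketch of that bookkeeping (Python; the helper names are mine, not part of any standard):

```python
# Store an m-bit odd modulus n in m-2 bits by dropping the top bit
# (always 1 for an m-bit n) and the bottom bit (always 1 since n is odd).

def compress_modulus(n: int, m: int) -> int:
    assert n.bit_length() == m and n % 2 == 1
    return (n - (1 << (m - 1))) >> 1          # strip leading 1, then trailing 1

def decompress_modulus(c: int, m: int) -> int:
    return (1 << (m - 1)) | (c << 1) | 1      # restore leading and trailing 1
```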

We can do better by forcing the $\left\lfloor m/2-\log_2(m)-2\right\rfloor$ high bits of $n$ to some arbitrary constant such as $\left\lfloor\pi \cdot 2^{\left\lfloor m/2-\log_2(m)-4\right\rfloor}\right\rfloor$. Observe that we can choose the smallest prime factor $p$ of $n$ just as we would in regular RSA, then find the maximal integer interval $[q_0,q_1]$ such that any $q$ in that interval causes $n=p\cdot q$ to have the right high bits, then pick a random prime $q$ in that interval (most often there will be at least one; if not, we try another $p$). A sketch of this generation procedure follows the security argument below. Part of the security argument is that

  • generating an RSA key $(p,q')$ using a regular method, with random huge primes in some appropriate range and no other criteria besides the number of bits in $n'=p\cdot q'$, and $p<q'$;
  • then deciding the high bits of $n$ from those in $n'$;
  • then finding $[q_0,q_1]$, generating $q$ as a random prime in that interval, and setting $n=p\cdot q$;

demonstrably gives the same distribution of $(p,q)$ as said regular generation method, hence is as secure; then we remark that the high bits of $n$ are random (with some distribution not too far from uniform) and public, so fixing them can't help an attack much (I think this can be made rigorous).
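A minimal sketch of the generation procedure above, assuming Python 3 with sympy for prime generation; the constant $H$ only approximates the $\pi$-based formula (the float supplies about 53 bits of it), and the parameter details are illustrative rather than vetted:

```python
import math
from sympy import randprime

def gen_key_forced_high_bits(m=2048):
    t = m // 2 - int(math.log2(m)) - 2        # number of forced high bits of n
    # Arbitrary public constant for those bits; only the top ~53 bits really
    # come from pi here, which is fine for an arbitrary constant.
    H = int(math.pi * 2 ** (t - 2))
    lo_n = H << (m - t)                       # n must satisfy n >> (m - t) == H,
    hi_n = ((H + 1) << (m - t)) - 1           # i.e. lo_n <= n <= hi_n

    while True:
        p = randprime(2 ** (m // 2 - 1), 2 ** (m // 2))   # m/2-bit prime, as in regular RSA
        q0 = -(-lo_n // p)                                # ceil(lo_n / p)
        q1 = hi_n // p                                    # floor(hi_n / p)
        try:
            q = randprime(q0, q1 + 1)         # random prime in [q0, q1]
        except ValueError:
            continue                          # no prime in the interval: try another p
        n = p * q
        assert n >> (m - t) == H              # the forced high bits hold
        return p, q, n
```

With $m=2048$, the interval $[q_0,q_1]$ spans roughly $4m$ to $8m$ integers and so contains on the order of ten primes on average; retrying $p$ is therefore rare.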

This is now $k \sim m/2+\log_2(m)$ bits to express the public key. We can save a few more bits, each one at worst doubling the amount of work to generate the private key: we repeat the generation process outlined above until we find a key whose extra bits equal some arbitrary public constant (or equal bits from a hash of the other bits, if we want tighter assurance that the scheme is not weakened), as sketched below.
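Continuing the sketch above, the extra-bits trick is just a retry loop around `gen_key_forced_high_bits`; the target constant and the choice of which bits to pin are hypothetical:

```python
import math

EXTRA = 8         # extra pinned bits; each one roughly doubles the expected work
TARGET = 0xA5     # arbitrary public constant for those bits (could instead come from a hash)

def gen_key_extra_bits(m=2048):
    t = m // 2 - int(math.log2(m)) - 2
    while True:
        p, q, n = gen_key_forced_high_bits(m)             # sketch above
        # pin the EXTRA bits just below the forced high bits of n
        if (n >> (m - t - EXTRA)) & ((1 << EXTRA) - 1) == TARGET:
            return p, q, n                                # ~2**EXTRA attempts on average
```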

Can we do better, and what's the practical limit?

fgrieu

2 Answers


Daniel J. Bernstein mentioned your way of compressing RSA public keys in his paper "A secure public-key signature system with extremely fast verification". The naive way you outline roughly doubles the work for each extra bit. If there were a better method which did not run very slowly, then it could be repurposed as a factoring algorithm. So if it were possible to decompress arbitrary 104- to 128-bit strings into secure 2048-bit RSA public keys faster than factoring, as David Schwartz suggests, then that would be quite remarkable: every time you ran the algorithm, you'd effectively be finding the approximately equal-sized factors of some 2048-bit number of which you'd specified a lot of the bits. Although there's no theoretical reason I can think of why this should be impossible, nor would it necessarily render RSA insecure, it does strike me as rather unlikely.

As you hint, a further (impractical) way to reduce the storage of the unspecified low bits of the modulus would be to store their residues modulo various small primes and reconstruct them using the CRT. In theory this should save between 3 and 4 bits for a 2048-bit modulus. You save the space because you can rely on the residues not being zero.
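A hypothetical back-of-the-envelope check of that figure (Python with sympy): each stored residue modulo a small prime $r$ excludes one of its $r$ possible values, saving $\log_2\frac{r}{r-1}$ bits, and we need enough primes for their product to cover the stored low part; note the 1 bit contributed by $r=2$ is the same "$n$ is odd" bit already counted in the question.

```python
import math
from sympy import primerange

def crt_saving_bits(low_bits=1037):            # ~ m/2 + log2(m) + 2 low bits for m = 2048
    target = low_bits * math.log(2)            # need the product of the primes >= 2**low_bits
    saved, log_product = 0.0, 0.0
    for r in primerange(2, 10 ** 6):
        if log_product >= target:
            break
        log_product += math.log(r)
        saved += math.log2(r / (r - 1))        # one of the r residues is excluded
    return saved

print(crt_saving_bits())                       # roughly 3 to 4 bits
```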

ByteCoin

Actually, it appears that we can do a bit better by using an unbalanced RSA key; that is, one composed of two primes of different sizes.

For example, suppose we have a 512-bit $p$ and a 1536-bit $q$. To generate a key, we select a random 512-bit prime $p$, and then for $q$ we search for a prime in the range $(C/p,\ (C+2^k)/p)$, where $C$ is our 2048-bit constant which includes the bits we want to force, and $k$ is the number of bits we're willing to vary. We expect about $\frac{2^k/p}{\ln(C/p)} \approx \frac{2^{k-512}}{(2048-512)\ln 2}$ primes in that range; if $k \approx 522$, there is about one expected prime. This would allow us to express a 2048-bit RSA key with only 522 bits; a sketch follows below.
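A hedged sketch of this unbalanced construction, again in Python 3 with sympy; the constant `C` is an illustrative stand-in for whatever public 2048-bit constant carries the forced bits:

```python
from sympy import randprime

M, PBITS, K = 2048, 512, 522                   # modulus size, size of p, bits left free
C = 0xC0FFEE << (M - 24)                       # hypothetical public 2048-bit constant

def gen_unbalanced():
    while True:
        p = randprime(2 ** (PBITS - 1), 2 ** PBITS)       # random 512-bit prime
        lo = C // p + 1                                    # q must satisfy C < p*q <= C + 2**K
        hi = (C + (1 << K)) // p
        try:
            q = randprime(lo, hi + 1)          # about one prime expected in [lo, hi]
        except ValueError:
            continue                           # none this time: pick another p
        return p, q, p * q

def compress(n):                               # the public key is just the K-bit offset n - C
    return n - C

def decompress(d):
    return C + d
```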

Now, the obvious question is: what does this do to the security? I'm pretty sure that this algorithm generates an RSA modulus which is no easier to factor than a random modulus with the same-sized factors, but that doesn't really answer the question. We know that the time taken by NFS doesn't vary with the size of the factors, but ECM does speed up if there are smaller factors. So, how small can we make $p$ before ECM becomes faster than NFS (thus lowering our security level)? I don't know the answer to that one (or even whether my example of a 512-bit factor is already over the limit; it wouldn't surprise me if it were).

I believe this trick can be used to shrink RSA public keys to some extent, but I don't know how far you can take it. On the other hand, I do hope this is an academic exercise; if you're really interested in small public keys, it'd certainly be better to use an elliptic curve algorithm (which gives small public keys without having to see how close we can get to the security edge).

poncho