132

I think 1024-bit RSA keys were considered secure about five years ago, but I assume that's no longer true. Can 2048- or 4096-bit keys still be relied upon, or have we gained too much computing power in the meantime?

Edit: Let's assume an appropriate padding strategy. Also, I'm asking about both the security of signatures and the security of data encryption.

CodesInChaos
Inaimathi

7 Answers

160

Since 2000, on a given $\text{year}$, no RSA key bigger than $(\text{year} - 2000) \cdot 32 + 512$ bits has been openly factored other than by exploiting a flaw in the key generator (a pitfall observed in poorly implemented devices, including Smart Cards). This linear estimate of academic factoring progress should not be used for choosing a key length so as to be safe from attacks with high confidence (or, equivalently, conforming to standards with that aim), a goal best served by the keylength.com website.
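To make the rule of thumb concrete, here is a minimal Python sketch of that linear estimate (the function name is mine). It only reproduces the bound stated above, and as said, it should not be used for choosing key sizes:

```python
# Linear estimate above: an upper bound (so far) on the size of RSA keys
# openly factored in a given year, absent key-generation flaws.
def openly_factored_bound_bits(year: int) -> int:
    return (year - 2000) * 32 + 512

for year in (2009, 2019, 2020):
    print(year, openly_factored_bound_bits(year))
# 2009 -> 800  (RSA-768, factored Dec. 2009, stays below it)
# 2019 -> 1120 (RSA-240 is 795 bits)
# 2020 -> 1152 (RSA-250 is 829 bits)
```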

The current factoring record is the 829-bit RSA-250, factored in late February 2020 (see the summary by the CADO-NFS team). That came shortly after the 795-bit RSA-240 in December 2019 (see the detailed paper).

I emphasize that the above is about attacks actually performed by academics. As far as we know, hackers have always been some years behind (see below). On the other hand, it is very conceivable that well-funded government agencies are many years ahead in the factoring game. They have the hardware and CPU time. And there are so many 1024-bit keys around that being in a position to break these is likely a worthwhile capability. It is one of the few credible conjectured explanations for claims of a cryptanalytic breakthrough by the NSA. Also, dedicated hardware could change the picture someday; e.g. as outlined by Daniel Bernstein and Tanja Lange: Batch NFS (in proceedings of SAC 2014; also in Cryptology ePrint Archive, November 2014). Or, in the more distant future, quantum computers usable for cryptanalysis.

By 2020, the main practical threat to systems still using 1024-bit RSA to protect commercial assets is often not factorization of a public modulus, but rather penetration of the IT infrastructure by other means, such as hacking, and trust in digital certificates issued to entities that should not be trusted. With 2048 bits or more we are safe from that factorization threat for perhaps two decades, with fair (but not absolute) confidence.

Factorization progress is best shown on a graph (to get at the raw data, e.g. to make a better graph, edit this answer).

Graph of academic RSA factorization records

This also shows the linear approximation at the beginning of this answer, which is in fact a conjecture at even odds for the 2000-2016 period that I made privately circa 2002, in the context of deciding whether the European Digital Tachograph project should be postponed to upgrade its 1024-bit RSA crypto (still widely used today). I committed to it publicly in 2004 (in French). Also pictured are the only three events I know of where an RSA key was factored with hostile intent (other than copycats of these events, or exploitation of a flawed key generator):

  • The Blacknet PGP Key in 1995. Alec Muffett, Paul Leyland, Arjen Lenstra, and Jim Gillogly covertly factored a 384-bit RSA key that was used to PGP-encipher "the BlackNet message" spammed over many Usenet newsgroups. There was no monetary loss.

  • The French "YesCard" circa 1998. An individual factored the 321-bit key then used (even though it was clearly much too short) in issuer certificates for French debit/credit bank Smart Cards. Through a lawyer acting as proxy, he contacted the card issuing authority, trying to monetize his work. In order to prove his point, he made a handful of counterfeit Smart Cards and actually used them in metro ticket vending machine(s). He was caught and got a 10-month suspended sentence (judgment in French). In 2000 the factorization of the same key was posted (in French) and soon after, counterfeit Smart Cards burgeoned. These worked with any PIN, hence the name YesCard (in French) (another account in English). For a while, they caused some monetary loss in vending machines.

  • The TI-83 Plus OS Signing Key in 2009. An individual factored the 512-bit key used to sign downloadable firmware in this calculator, easing installation of custom OS, thus making him a hero among enthusiasts of the machine. There was no direct monetary loss, but the manufacturer was apparently less than amused. Following that, many 512-bit keys (including those of other calculators) have been factored.

Note: 512-bit RSA was no longer providing substantial security by 2000-2005. Despite that, reportedly, certificates with this key size were issued until 2011 by official Certification Authorities and used to sign malware, possibly by means of a hostile factorization.

fgrieu
28

You might want to look at NIST SP 800-57, section 5.2. As of 2011, new RSA keys generated by unclassified applications used by the U.S. Federal Government should have a modulus of at least 2048 bits, equivalent to 112 bits of security. If you are not asking on behalf of the U.S. Federal Government, or a supplier of unclassified software applications to the U.S. Federal Government, other rules might of course apply.
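For reference, here is the comparable-strengths mapping from SP 800-57 Part 1 expressed as a small Python table (a sketch for orientation only; consult the standard itself for normative guidance):

```python
# Comparable strengths per NIST SP 800-57 Part 1:
# security strength (bits) -> minimum RSA modulus size (bits).
NIST_RSA_MODULUS = {
    80: 1024,   # no longer acceptable for new keys
    112: 2048,
    128: 3072,
    192: 7680,
    256: 15360,
}

def min_rsa_bits(security_bits: int) -> int:
    """Smallest listed modulus size providing at least `security_bits`."""
    for strength, modulus in sorted(NIST_RSA_MODULUS.items()):
        if strength >= security_bits:
            return modulus
    raise ValueError("beyond the listed strengths")

print(min_rsa_bits(112))  # 2048
```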

However, at the very least, these figures indicate what the U.S. Federal Government thinks about the computational resources of its adversaries, and presuming they know what they are talking about and have no interest in deliberately disclosing their own sensitive information, that should give some hint about the state of the art.

Henrick Hellström
14

The simplest answer would be to look at the keylength.com site, and, if you don't trust that, at the linked papers, particularly those by NIST and ECRYPT II. Note that those mostly agree with the Lenstra equations, so you could use those as well.

You may have additional restrictions and - if you are brave or stupid - relaxations depending on the use case. But at least they establish a baseline that you can work with.
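If you want a back-of-the-envelope version of those Lenstra-style estimates, the sketch below (my own approximation, not the exact model used by keylength.com or the papers) evaluates the GNFS asymptotic complexity $L_n[1/3, (64/9)^{1/3}]$ and calibrates it so that 2048-bit RSA maps to 112 bits of security, matching the NIST equivalence:

```python
import math

def gnfs_log2_cost(modulus_bits: int) -> float:
    """log2 of the GNFS asymptotic complexity L_n[1/3, (64/9)^(1/3)]."""
    ln_n = modulus_bits * math.log(2)
    c = (64 / 9) ** (1 / 3)
    return c * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3) / math.log(2)

# Fold the o(1) term into a calibration offset so that 2048-bit RSA
# comes out at 112 bits of security (the NIST equivalence).
OFFSET = gnfs_log2_cost(2048) - 112

def approx_security_bits(modulus_bits: int) -> float:
    return gnfs_log2_cost(modulus_bits) - OFFSET

for k in (1024, 2048, 3072, 4096):
    print(k, round(approx_security_bits(k)))
# Roughly: 1024 -> 82, 2048 -> 112, 3072 -> 134, 4096 -> 152
```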

Maarten Bodewes
10

An adversary with a moderately large quantum computer to run Shor's algorithm will cut through a 1024-bit RSA modulus like a hot knife through butter, and maybe through a 2048-bit RSA modulus like a butter knife through a tough piece of steak.

The quantitative difference between ‘butter’ and ‘steak’ here is that the cost of Shor's algorithm run by an attacker is a quadratic function of the cost of computing RSA by a legitimate user. Traditionally in crypto we want the cost of attacks to be exponential in the cost of usage, like trying to use a rubber duckie to cut through a meter-thick steel bank vault door. The cost of the best classical attacks on RSA, NFS and ECM, is not actually an exponential function of the user's costs, but it's comfortably more than polynomial, which is why, for example, we use 2048-bit moduli and not 256-bit moduli for a >100-bit security level.

However, while the cost of Shor's algorithm is a quadratic function of the user's cost, it is a function of $\lg n$, where $n$ is the modulus. Users can use the well-known technique of multiple primes to drive $\lg n$ up, making Shor's algorithm (and the classical NFS) much costlier, at more or less linear cost to users. The best classical attack in that case is no longer the NFS but the ECM, whose cost depends on $\lg y$ where $y > p_i$ is an upper bound on all factors $p_i$ of $n$.
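A toy sketch of that multi-prime idea follows (parameters are purely illustrative and far too small for the post-quantum setting described here; a real deployment would keep each prime large enough to resist ECM and use vastly more of them):

```python
# Minimal sketch of multi-prime RSA modulus generation: build a large
# modulus n out of many moderately sized primes, so that lg n (which
# drives the cost of Shor's algorithm and of the NFS) grows, while the
# cost of the private operation (CRT over the individual primes) grows
# only roughly linearly in the number of primes.
# Uses sympy for prime generation; sizes here are illustrative only.
from math import prod
from sympy import randprime

def multiprime_modulus(prime_bits: int, num_primes: int):
    primes = [randprime(2 ** (prime_bits - 1), 2 ** prime_bits)
              for _ in range(num_primes)]
    return prod(primes), primes

n, primes = multiprime_modulus(prime_bits=512, num_primes=8)
print(n.bit_length())  # ~4096-bit modulus built from 512-bit primes
```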

An adversary with a large quantum computer can combine the ECM with Grover's algorithm to get a quadratic speedup, requiring the legitimate user to merely double their prime sizes. 1024-bit primes are considered safe enough for today against ECM, so we could double that to 2048-bit primes to be safe against Grover–ECM, but out of an abundance of caution, we might choose 4096-bit primes.

At what size moduli does the cost of Shor's algorithm exceed the cost of Grover–ECM? It's hard to know for sure how to extrapolate that far out, but we might surmise from conservative estimates of costs what might be good enough.

Thus, to attain security against all attacks known or plausibly imaginable today including adversaries with large quantum computers, cryptographers recommend one-terabyte RSA moduli of 4096-bit primes. Cryptographers also recommend that you brush your teeth and floss twice a day.
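For a sense of scale, here is the arithmetic behind that (tongue-in-cheek) terabyte figure, assuming pqRSA-style parameters:

```python
# A one-terabyte modulus assembled from 4096-bit primes.
modulus_bits = 8 * 2**40           # one terabyte expressed in bits
prime_bits = 4096
print(modulus_bits // prime_bits)  # 2147483648, i.e. about 2 billion primes
```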

Note that these estimates are very preliminary, because nobody has yet built a quantum computer large enough to factor anything larger than the dizzyingly large 21 with Shor's algorithm modified to get a little help from someone who knows the factors already. (Larger numbers like $291311 = 523 \times 557$ have been factored on adiabatic quantum computers, but nobody seems to know how the running time might scale even if we had enough qubits.)

So this recommendation may be unnecessarily conservative: maybe once we reach the limits on the cost of qubit operations, it will turn out to take only a few gigabytes to thwart Shor's algorithm. Moreover, standard multiprime RSA may not be the most efficient post-quantum RSA variant: maybe there is a middle ground between traditional RSA and RSA for paranoids, that will outperform this preliminary pqRSA proposal.

Squeamish Ossifrage
8

Another way of determining a key size that offers 'adequate security', presented initially by Lenstra, is the equivalence between symmetric and asymmetric key lengths: two systems offer cost-equivalent security if, at a given time, access to the hardware that allows a successful attack in a certain fixed amount of time costs the same amount of money for both systems. Adequate security was defined as the security offered by DES in 1982.

Several factors influence the choice of key length, for example the life span of the data you want to protect, the estimate of available computational resources (consider Moore's law), and cryptanalytic advances over the years (e.g. in integer factorisation). According to Lenstra, by 2013 a symmetric key size of 80 bits and an asymmetric key size of at least 1184 bits are considered to offer adequate security.
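As a toy illustration of how such an extrapolation works (my own simplification, assuming only that attacker hardware doubles every 18 months; the full Lenstra model also accounts for budget growth and cryptanalytic progress), one can project the 56-bit security of DES in 1982 forward:

```python
# Toy extrapolation of "adequate security": start from DES (56 bits) being
# adequate in 1982 and add 2/3 of a bit per year (hardware doubling every
# 18 months). This only lands in the right ballpark of the figure cited.
def adequate_symmetric_bits(year: int, base_year: int = 1982,
                            base_bits: int = 56) -> float:
    return base_bits + (year - base_year) * (2 / 3)

print(round(adequate_symmetric_bits(2013)))  # ~77, close to the 80 bits cited
```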

A more recent method of determining adequate key sizes, again by Lenstra ("Using the cloud to determine key strengths"), is to use cloud services to estimate the computational cost required to factor keys, assuming that the fastest way is the Number Field Sieve algorithm. He used Amazon's cloud services to develop his cost-based model. Note that the Number Field Sieve is over 20 years old, and that since 1989 there have been no major advances in this area besides small tweaks.

In recent surveys it has been observed that people tend to move towards 2048-bit keys, although certificates holding 1024-bit keys have no reason to be revoked as long as they have not expired.

Mike Edward Moras
alexandros
7

I personally think a key should have the potential to last a lifetime, if there are no easily predicted events. In other words: if, looking at history, you can say that a given key length is likely to be broken before your death, choose a longer length (unless, of course, there are other constraints).

Several factors have historically been surprisingly predictable: evolution in hardware (Moore's law) and mathematical advances. These have been in the same ballpark.

Others have looked at this evolution (the blue part of https://klenstra.win.tue.nl/key.pdf), which I have extended (the part in red).

Graph of key length over time

Based on that, I estimate that if Moore's law holds and mathematical advances stay in the same ballpark, then an RSA key of 10 kbit has the potential of being deemed secure in 2100 (with the usual quantum-computer warning attached).

I will happily grant that this is of course a guess - and by no means a guarantee. But it should be weighed against, for example, creating a 3-kbit key, which I am pretty sure will not be deemed secure in 2100.

One interesting aspect of choosing a 10-kbit key is that the performance penalty will only affect me - not the corresponding party. To everyone else performance will be similar to that of a 4-kbit key: --verify and --encrypt will run at roughly the same speed. The penalty is on --key-gen, --sign, and --decrypt, which are only run by the key holder.
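As a rough illustration of that asymmetry (not a benchmark; it uses the pyca/cryptography Python package rather than GnuPG, and stops at 4096 bits because key generation gets slow), signing time grows much faster with key size than verification time, since the public exponent stays small:

```python
# Compare sign vs. verify time at two key sizes. With a small public
# exponent, the public-key operation (verify/encrypt) stays cheap as the
# key grows; the private-key operation (sign/decrypt) bears the cost.
import time
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

message = b"hello"
for bits in (2048, 4096):
    key = rsa.generate_private_key(public_exponent=65537, key_size=bits)
    t0 = time.perf_counter()
    sig = key.sign(message, padding.PKCS1v15(), hashes.SHA256())
    t1 = time.perf_counter()
    key.public_key().verify(sig, message, padding.PKCS1v15(), hashes.SHA256())
    t2 = time.perf_counter()
    print(f"{bits}-bit: sign {t1 - t0:.4f}s, verify {t2 - t1:.4f}s")
```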

You can find details here:

Ole Tange
4

According to Schneier's book Cryptography Engineering, the modulus $n = pq$

...should be around 6800 bits...

in order to

design a system that will be used for 30 years, and the data must be kept secure for 20 years after it has first been processed.

The book also states that this is not the same as for symmetric keys:

...variable-sized...public key only needs to protect data 21 years, rather than the 50 years needed for symmetric keys. Each year, you generate a new public key, and you can choose larger public keys as computing technology progresses.

But it is also noted that key size almost never matters, as other parts of the system are, more often than not, weaker.

Nae