RSA key strength and math
2017-08-24 12:37:33 +0200 - Written by Mikal Villa
TL;DR Since time is a factor here, \(2048\) bit is probably fine for most systems and users as long as it is replaced often, like Let’s Encrypt does with its ninety-day certificate lifetimes. However, for a CA I wouldn’t use less than \(4096\) bit keys, and probably \(8192\) bit keys if they were to live past ~2030.
Update: There are algorithms with sub-exponential running time for factoring integers, which is much faster than iterating through all the numbers. I’m aware of that.
From people who really know math, friends of mine who studied math or physics at a higher level, but also random strangers, I’ve often heard that those who choose to use really big RSA or ECC keys don’t understand the math behind it. Maybe it’s true, maybe it’s not. Let’s find out.
As for myself, I’m not the best at math, for sure. Back at school I had great grades in math, but in my teenage years I decided it was more brainwashing and training to listen to authority. I remember joking about who would ever need algebra after school. Well, here I am learning machine learning with algebra. (Fail.) So don’t take me as an expert in math. This also means that if you have some higher education in math, I’ll probably just bore you with this post.
Many believe bigger is better, for various reasons. However, for TLS traffic, \(2048\)bit keys are still considered fine. How come? Let’s try the math.
First I want to mention that there are an estimated \(7.5×10^{18}\) grains of sand on Earth. That’s \(7.5E18\), or in human-readable form 7,500,000,000,000,000,000. So just for fun, let’s see the result if we multiply it by itself.
\((7.5×10^{18})×(7.5×10^{18})\) = 56,250,000,000,000,000,000,000,000,000,000,000,000
That’s still only a 38-digit number, even though we multiplied the estimated number of grains of sand on Earth by itself.
On the other hand, there are an estimated \(10^{80}\) atoms in the visible universe. That’s a little number with only 81 digits.
100,000,000,000,000,000,000,000,000,000,000,000,000,
000,000,000,000,000,000,000,000,000,000,000,000,000,000
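Python’s arbitrary-precision integers make it easy to sanity-check these digit counts. A small sketch using the estimates above:

```python
# Rough scale check: grains of sand on Earth vs. atoms in the
# observable universe, using the popular estimates quoted above.
sand = 75 * 10**17        # ~7.5e18 grains of sand on Earth
atoms = 10**80            # ~1e80 atoms in the observable universe

sand_squared = sand * sand
print(len(str(sand_squared)))  # digits in sand^2: 38
print(len(str(atoms)))         # digits in 10^80: 81
```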
Here is a list of how many different values you can store in a given number of bits.
- In \(1\)bit you can store \(2\) different values (0 or 1).
- In \(4\)bits you can store \(16\) different values (\(2x2x2x2\) or \(2^{4}\)).
- In \(8\)bits you can store \(256\) different values (\(2x2x2x2x2x2x2x2\) or \(2^{8}\) or \(2.56E2\)).
- In \(16\)bits you can store \(65536\) different values (\(2^{16}\) or \(6.5536E4\)).
- In \(32\)bits you can store \(4294967296\) different values (\(2^{32}\) or \(4.2E9\)).
- In \(64\)bits you can store \(18446744073709551616\) different values (\(2^{64}\)).
- In \(128\)bits you can store \(340282366920938463463374607431768211456\) different values (\(2^{128}\)).
And so on.
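The list above is easy to reproduce in Python, since its integers have arbitrary precision:

```python
# Number of distinct values (2**bits) for each bit width in the list above.
for bits in (1, 4, 8, 16, 32, 64, 128):
    values = 2**bits
    print(f"{bits:3d} bits -> {values} different values ({len(str(values))} digits)")
```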
At \(128\)bit we crossed the 38-digit mark: \(2^{128}\) has 39 digits. That means at \(128\)bit we’re talking about numbers greater than the number of grains of sand on Earth multiplied by itself.
Could we brute-force a \(1024\) bit key, if the approach were to enumerate every possible key?
Short answer: Nope.
The number of primes smaller than \(x\) is approximately \(\frac{x}{\ln x}\). Therefore the number of \(512\)bit primes (about the length you need for a \(1024\)bit modulus) is approximately:
\[\frac{2^{513}}{\ln 2^{513}}-\frac{2^{512}}{\ln 2^{512}} \approx 3.76×10^{151}\]
An RSA key is built on a pair of two distinct prime numbers; that’s the holy secret behind it. Since the order of the two primes doesn’t matter, the number of possible RSA moduli is therefore approximately:
\[\frac{(3.76×10^{151})^2}{2}-3.76×10^{151} \approx 7.08×10^{302}\]
Wikipedia tells me there are about \(10^{80}\) atoms in the observable universe. Assume that you could use each of those atoms as a CPU, and that each of those CPUs could enumerate one modulus per millisecond. To enumerate all \(1024\) bit RSA moduli you would need:
$$\begin{eqnarray*} 7.08×10^{302}ms / 10^{80}&=&7.08×10^{222}ms\\ &=&7.08×10^{219}s\\ &\approx&1.97×10^{216}h\\ &\approx&2.24×10^{212} \text{years}\\ \end{eqnarray*}$$
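The whole estimate can be redone in a few lines of Python. Floats are enough here since we only care about orders of magnitude (expect small differences in the leading digits):

```python
import math

# Prime number theorem: pi(x) ~ x / ln(x).
def pi_approx(x: float) -> float:
    return x / math.log(x)

# 512-bit primes lie between 2^512 and 2^513 (still fits in a float: ~1e154).
n_primes = pi_approx(2.0**513) - pi_approx(2.0**512)

# A modulus is a product of two distinct primes, order irrelevant.
n_moduli = n_primes**2 / 2 - n_primes

# 10^80 atom-CPUs, each enumerating one modulus per millisecond.
years = n_moduli / 1e80 / 1000 / 3600 / 24 / 365.25
print(f"{n_primes:.2e} primes, {n_moduli:.2e} moduli, {years:.2e} years")
```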
Just as a comparison: the universe is about \(13.75×10^{9}\) years old.
There are much faster ways to find a secret key, however: algorithms with sub-exponential running time for factoring integers. But as I’ve understood it, it’s still gonna take quite some time to find the key. I haven’t had time to learn about them yet, but I have some links if you want to get deeper into the topic: Integer factorization at Wikipedia, and Chapter 15, “Factoring and Discrete Logarithms in Subexponential Time”, in the book “Mathematics of Public Key Cryptography” by Steven Galbraith. The chapter is available as a PDF here.
However, \(1024\) bit is not recommended anymore, even though the modulus is a number of no less than 309 digits. A \(768\)bit key (RSA-768) has even been factored in practice. That’s a bit freaky.
Back to \(1024\) bit: those numbers above don’t even start closing in on how big a \(2048\) bit number is. It’s a freaking 617 digits, where \(1024\) bit was 309 digits and \(768\) bit is 232 digits. You can try the calculation yourself at wolframalpha. The calculation is \(2^{2048}\).
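If you want the digit counts without WolframAlpha, the same kind of one-liner works for any key size:

```python
# Decimal digit counts of 2**bits for the moduli sizes discussed above.
for bits in (768, 1024, 2048, 4096):
    print(f"2^{bits} has {len(str(2**bits))} digits")
```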
Remember, as said earlier, I’m not a math expert, but my conclusion on this topic is that \(2048\) bit seems fine for internal usage and webpages without sensitive information. Since time is a factor here, \(2048\) bit is probably fine for most systems and users as long as it is replaced often, like Let’s Encrypt does with its ninety-day certificate lifetimes. However, for a CA I wouldn’t use less than \(4096\) bit keys, and probably \(8192\) bit keys if they were to live past ~2030. In the end, maybe I’m too paranoid, but my minimum key size will still be \(4096\) bit as long as it’s an option over \(2048\) bit.
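If you want to follow that advice, generating a \(4096\) bit key with OpenSSL is one command (key.pem is just an example filename):

```shell
# Generate a 4096-bit RSA private key and print its modulus size.
openssl genpkey -algorithm RSA -pkeyopt rsa_keygen_bits:4096 -out key.pem
openssl rsa -in key.pem -noout -text | head -n 1   # shows "(4096 bit ..."
```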