This got me thinking.
Imagine this discovery led to a larger breakthrough in prime number theory that made factoring large integers easy, effectively rendering public key cryptography such as RSA useless overnight by letting anyone with a consumer-grade CPU crack a production-size key.
Does the industry have DR plans for this scenario? Can the big players quickly switch to a different, unbroken encryption system? While it would probably be a heavenly day for jailbreakers, console modders and other "device freedom" types generally, the overall impact would be disastrous and incalculable.
Does the industry simply not consider "sudden number theory breakthrough" a possible event?
I think this is where the move to elliptic curve cryptography comes in, and that seems well on its way, both in signatures and handshakes (Diffie-Hellman).
Perhaps DR wouldn't be a one-minute operation, but I doubt everything would be left wide open if RSA/DH were rendered insecure overnight. I know my SSH keys are a mixed bag right now.
Does the move to elliptic curve crypto suggest that the people in the know expect prime factorisation to be broken soon?
Since no one mentioned, a major reason to prefer elliptic curves is that you can get equivalent security for much smaller key sizes.
And the key constituents don't have to be prime. That use of heuristics has always scared me a bit.
Nit-pick: the key constituents only need to be co-prime rather than absolutely prime, though in practice this makes little or no difference.
So far as we know. It scares some people that maybe co-primes working might be a sign of some future attack. Nobody has been able to figure out how such an attack works, but considering RSA is defined for primes, that some non-prime numbers also work is scary.
I had to implement key generation on an embedded device @ 180MHz. RSA2048 would take upwards of five minutes if you didn't get lucky finding primes early on. Stronger ECC would be done within a second.
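For anyone who wants to see the gap on a desktop, here's a rough sketch (assuming the Python "cryptography" package is available); the RSA call is the one you'll actually notice:

    # Rough comparison of RSA vs EC key generation cost and public key size.
    # Assumes the third-party "cryptography" package is installed.
    import time
    from cryptography.hazmat.primitives import serialization
    from cryptography.hazmat.primitives.asymmetric import ec, rsa

    t0 = time.perf_counter()
    rsa_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    t1 = time.perf_counter()
    ec_key = ec.generate_private_key(ec.SECP256R1())  # ~128-bit security, comparable to RSA-3072
    t2 = time.perf_counter()

    def der_len(key):
        # Size of the serialized public key, just to make the size difference visible.
        return len(key.public_key().public_bytes(
            serialization.Encoding.DER,
            serialization.PublicFormat.SubjectPublicKeyInfo))

    print(f"RSA-2048 keygen: {t1 - t0:.3f}s, public key {der_len(rsa_key)} bytes")
    print(f"P-256 keygen:    {t2 - t1:.6f}s, public key {der_len(ec_key)} bytes")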
nice thank you!
I remember when I generated my first EC key and added it to an SSH client. I was so surprised by the size of the key I spent some time researching if I had made a mistake.
Honestly impressive.
https://en.wikipedia.org/wiki/Shor%27s_algorithm
As soon as quantum computers have enough qubits, prime factorisation can be done very quickly. Not sure of the timeline on that, as there are a lot of challenges in the technology and it is hideously expensive, but a lot of the move away from RSA to elliptic curves is driven by readiness for quantum computing.
https://en.wikipedia.org/wiki/Post-quantum_cryptography
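For intuition, the quantum part of Shor's algorithm only finds the multiplicative order r of a random a modulo N; everything else is classical. The toy sketch below brute-forces the order (exactly the step a quantum computer would replace), so it only works for tiny moduli:

    # Classical skeleton of Shor's algorithm on toy moduli.
    # The order-finding loop is the part a quantum computer would do exponentially faster.
    from math import gcd
    from random import randrange

    def shor_toy(N):
        while True:
            a = randrange(2, N)
            g = gcd(a, N)
            if g != 1:
                return g, N // g       # lucky guess: a already shares a factor with N
            # Brute-force the order r of a mod N (the quantum speed-up in real Shor).
            r, x = 1, a
            while x != 1:
                x = (x * a) % N
                r += 1
            if r % 2:
                continue               # need an even order
            y = pow(a, r // 2, N)
            if y == N - 1:
                continue               # trivial square root of 1, try another a
            return gcd(y - 1, N), gcd(y + 1, N)

    print(shor_toy(15))     # (3, 5) in some order
    print(shor_toy(3233))   # (53, 61) in some order -- the textbook RSA example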
Elliptic curve cryptography can be broken by Shor's algorithm as well
https://arxiv.org/pdf/1706.06752
... and easier than with RSA. Not that it would make a significant difference.
sgt101 posted a good comment about this a couple months back: https://news.ycombinator.com/item?id=40187560
tl;dr: not in our lifetime.
It's to do with "if we ever have big/good quantum computers, prime factorization is doable" and with "let's use the new shiny things".
On a related note, someone might discover some elliptic curve math tomorrow and your CPU can break all that stuff just as well...
so with my cynic hat on, maybe a bunch of people already have that and that's why we're being moved off the hard stuff.
The NSA had the option to do something like that when they (via NBS, NIST's predecessor) standardized DES.
They chose to standardize a version that's secure against attacks only they knew about at the time, to shorten the key length so they could still brute-force it if they really needed to, and they successfully kept the attack secret until researchers at a foreign university independently discovered it decades later.
Smaller key sizes and faster at a given level of security, PQ-safe (at least as far as we publicly know).
If anything, ECC is probably easier to break with qc. Shor's algorithm works "better" for discrete logarithms than for prime factorisation. "Better" as in requiring a smaller quantum computer. Still way beyond what is available today.
The reason for switching to ECC is speed and key size. The NSA recommends 3072-bit RSA keys for AES-128, but only 256-bit ECC keys for the same security against traditional attacks.
They also went ahead and recommended 384-bit keys for ECC, but I don't know if something like that is widely used anywhere. Curve448 is nice but slow. Curve41417 is fast but largely unused.
Arguably, we know a lot more about elliptic curves than prime number theory too. There have definitely been a lot more folks working on it.
There are other algorithms; NIST has a post-quantum project. There are options, but it's a lot more than having some algorithms: we need protocols, and those are still a bit off. Prime factorization isn't going to get easier or faster on its own, though. There might be another attack or approach to attacking, some other numerical breakthrough that leads to faster RSA cracking, but it's hard to imagine it being that practical.
If someone found a way to easily factorize large integers on consumer-grade hardware then it would be very painful, as RSA is one of the big public key algorithms.
Before you start worrying about it though consider that RSA has held up for 47 years of active cryptanalysis so far - during which time many alternative algorithms have been suggested as being superior, only to be broken a short time later.
Also the push to switch to Elliptic-Curve algorithms has been more down to them being easier for computers to use to encrypt/decrypt data.
Personally if I had to bet on which public key algorithm will still be around in 10 years time then I'd put my money on RSA.
Actually RSA has several "gotchas", so it's not so much that it has held up as that people have managed to work around those gotchas to build a working encryption system
(Basically your data is not encrypted with RSA, you encrypt a secondary key, send it with RSA but the main encryption is AES see https://en.wikipedia.org/wiki/Transport_Layer_Security#Key_e... )
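For anyone who hasn't seen it spelled out, here's a minimal sketch of that hybrid pattern with the Python "cryptography" package (assumed installed). Real TLS negotiates keys through a handshake and key schedule rather than literally doing this, but the shape is the same:

    # Hybrid encryption sketch: RSA-OAEP wraps a random AES key, AES-GCM encrypts the data.
    # Assumes the third-party "cryptography" package; real protocols are more involved.
    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)
    recipient = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    # Sender: random session key, bulk data under AES-GCM, session key under RSA.
    session_key = AESGCM.generate_key(bit_length=128)
    nonce = os.urandom(12)
    ciphertext = AESGCM(session_key).encrypt(nonce, b"the actual message", None)
    wrapped_key = recipient.public_key().encrypt(session_key, oaep)

    # Recipient: unwrap the session key with RSA, then decrypt the bulk data.
    key = recipient.decrypt(wrapped_key, oaep)
    print(AESGCM(key).decrypt(nonce, ciphertext, None))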
Key exchange is done for speed (symmetric key crypto is way faster than public key) and forward secrecy. It’s not done because RSA is flawed per se. We use DH instead of e.g. ElGamal encryption for the same reasons.
Yeah, it's not so much a flaw of RSA, but encrypting arbitrary plaintext with it, for example, is more complicated (and has more caveats around padding, etc.) than just encrypting a fixed number of bytes
Don’t think this merits an “actually” - using a session key et al. is basic usage and does not bear on the strength of RSA itself.
There's "gotchas" with every encryption scheme - in fact whenever TLS uses any Public Key encryption scheme it'll pair it with a Symmetric Key encryption scheme. So you could say that by your definition no Public Key encryption scheme has "held up" and they've all had to be worked round :)
There are benefits to pairing the slower Public Key schemes with a Symmetric Key encryption scheme using a session key, as you get the benefits of a Public Key encryption scheme with the performance of a Symmetric Key encryption scheme.
A lot of the RSA gotchas are due to trying to take implementation shortcuts either for convenience or speed.
If you don’t choose your primes truly randomly for example.
Using a secondary key and using RSA for the key share is not about avoiding RSA gotchas it’s just about performance.
RSA has effectively been broken many times. We literally had 128-bit RSA encryption hardware at one point. There were even export controls on keys beyond a certain length (512 bits) that today are trivial to break with the general number field sieve. You look at the history of RSA and it's not pretty. Dixon's method had us all scrambling to use 512-bit keys (pushing the export restrictions), the special number field sieve had us rushing to get to 1024 bits. The general number field sieve more recently pushed us to 2048 bits. Who can tell what's next here. In fact, look at the complexity of the special vs general number field sieves and you'll see the statements are almost the same, just some constants reduced. That's worrying because there's no reason to think the current constants are a minimum here. We may well find out 2048 bits is not enough.
Heck, just read a paper on state-of-the-art dedicated RSA encryption hardware from the '80s. All now completely broken. They are very impressed with some of the 512-bit hardware!
https://people.csail.mit.edu/rivest/pubs/pubs/Riv84.pdf
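For reference, the complexity statements being compared are usually written in L-notation. The Dixon/quadratic-sieve generation sits at exponent 1/2, both number field sieves at 1/3, and the special vs general variants differ only in the constant c (values below are the standard heuristic ones, quoted from memory):

    L_N[\alpha, c] = \exp\!\left( (c + o(1))\, (\ln N)^{\alpha} (\ln \ln N)^{1-\alpha} \right)

    \text{Quadratic sieve: } L_N[\tfrac{1}{2},\, 1], \qquad
    \text{SNFS: } L_N[\tfrac{1}{3},\, (32/9)^{1/3} \approx 1.53], \qquad
    \text{GNFS: } L_N[\tfrac{1}{3},\, (64/9)^{1/3} \approx 1.92]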
That is not when an encryption algorithm is usually considered to be broken; it just means that a certain key length is not sufficient anymore. You can break 20-bit RSA with pen and paper, but as long as a linear increase in the key length causes an exponential increase in the decryption time, the algorithm is not broken. At this moment, the record for the factorization of a specific RSA key is one of 829 bits, which suggests (by extrapolation) that within a few decades 1024 bits may not be safe if your adversary has the resources. No (reasonable) key length can be expected to be safe forever, even without any mathematical breakthroughs.
I’d say it’s a break if the encryption you once used (512bit and below RSA) is now trivially decrypted thanks to mathematical advances.
RSA 2048 hasn’t been broken but 512bit RSA definitely had been by any definition.
I feel “RSA is fine because much longer key lengths still work” is hiding what happened here. Yes we can still get into the infeasible realm with RSA and really long keys but the algorithm has definitely let us down multiple times thanks to mathematical improvements in factorization that just keep coming.
True, but is there much overlap between the set of people who actively do cryptanalysis on RSA and the set of people who actively research integer factoring?
As far as I know the skills needed to make progress on integer factoring are not really needed for anything in cryptography other than that specific problem and include a bunch of other skills that have nothing whatsoever to do with cryptography and take a long time to master. It's also been an open and important problem long enough that if it is solved it is likely to be by someone who is a world class expert in it, similar to how Fermat's Last Theorem was finally solved.
Similarly the skills needed for anything in cryptanalysis other than trying to break RSA by factoring are useless for working on integer factoring. The result is that very few, if any, people have the time and ability to become both cryptanalysts and world class integer factoring researchers.
Thus I'd expect nearly all of the 47 years of active RSA cryptanalysis has been on finding and fixing the numerous mistakes that can be made when implementing RSA that allow it to be broken without factoring.
I'd guess it is also similar with the P = NP problem. A solution to that has implications for cryptography but I doubt many cryptanalysts are working on the P = NP problem. Also maybe same for the Riemann hypothesis (I'm not sure if that one has implications for cryptography).
The world has moved away from RSA etc. to elliptic curves. Not everybody did, though. RSA is no longer the standard, and has not been for many years.
Still, plenty of old stuff was scraped/sniffed under the "store now, decrypt later" methodology.
True. The only solution is to keep your data outside the cloud (aka someone else's computer) no matter what encryption you use.
Also means it can’t transit the internet. So actually, only on airgapped networks.
If we're going to extremes like that, airgapped networks aren't truly safe either
Could you explain why that is? If I have an airgapped smart home network, someone has to come physically sniff the packets. If it’s only over ethernet, they have to physically plug in. That’s not a scalable attack strategy.
There's tons of ways to exfiltrate data from air-gapped systems if you can manage to get something installed on them. Ones I've read about include toggling the caps lock LED and recording it with a camera, or encoding data into the CPU fan speed and capturing the sound with a microphone for analysis (run a spin loop for a 1, thread.sleep for a 0). Variations of these can also be used, such as with screen brightness or monitoring power lines.
My personal favourite is one where they send specific patterns of data over usb, where the EM fields generated by the "data" flowing over the wire form a carrier signal onto which data can be encoded, which can be received up to 5m away. This requires no additional hardware.
All of these involve some malware installed on the system and have a tiny amount of bandwidth, but if there's a man on the inside, all they have to do is install the malware without having to worry about additional hardware for getting the data out of the machine.
Also, the safest data is the one never sampled into digital format and stored in computer systems.
You can’t imagine how many places I’ve seen “Elliptic curve certificates are not supported” as recently as this year.
It’s the IPv6 of the security world.
That's because certificates are all about authentication not encryption. There is no real reason to move away from RSA for authentication. The reason that TLS moved away from RSA for encryption is that it is awkward to do forward secrecy with RSA due to the slowness of generating new RSA keypairs. In practice you would want to generate a new RSA keypair every, say, hour on the server and then somehow get it down to the cryptography level for use. Totally doable, but a different way of doing things.
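A small sketch of why ephemeral (EC)DH makes forward secrecy cheap, using the Python "cryptography" package (assumed installed; authentication of the parties is left out): both sides make a throwaway keypair per session, derive the same key, and then discard the keypairs.

    # Ephemeral ECDH sketch: fresh keypairs per session make forward secrecy cheap,
    # which is exactly what is awkward to do with slow RSA key generation.
    # Assumes the third-party "cryptography" package; authentication is omitted.
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import ec
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    def session_key(my_ephemeral, their_public):
        shared = my_ephemeral.exchange(ec.ECDH(), their_public)
        return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                    info=b"handshake").derive(shared)

    # Per connection: both sides generate throwaway keys (fast, unlike RSA keygen).
    client_eph = ec.generate_private_key(ec.SECP256R1())
    server_eph = ec.generate_private_key(ec.SECP256R1())

    assert session_key(client_eph, server_eph.public_key()) == \
           session_key(server_eph, client_eph.public_key())
    # The ephemeral private keys are then thrown away, so a later compromise of a
    # long-term key cannot decrypt recorded traffic.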
If there is a such a breakthrough then the hackers or even spy agencies will not reveal it. They will instead silently make use of it. It will be essentially a backdoor for them.
Wouldn’t it likely originate from academia? If so you can bet that the work will be published just like this one.
It's hard to know where things are today, but historically, public academia has often been behind the true cutting edge of cryptanalysis. For example, take a look at the history of Differential Cryptanalysis https://en.wikipedia.org/wiki/Differential_cryptanalysis
That is both impressive and disappointing. I'm so used to seeing large corporations publishing AI models and other techniques (like Ghidra) that I assumed a finding like that would be disseminated to the public. But you're right, something that could be used to decrypt modern ciphers could very well be kept secret for as long as possible.
Ghidra was private for many years before it was public (I don't know precisely how many, I suppose that information is also private heh)
Edit: Wikipedia mentions 1999, with v1.0 existing in 2003 https://en.wikipedia.org/wiki/Ghidra#History
It depends on who makes the breakthrough.
SETEC ASTRONOMY
The industry couldn’t even prepare for a bad Crowdstrike update. And yet, it figured things out in a few days or so.
The ability to prepare for catastrophic scenarios is overestimated. The ability to survive them is underestimated.
This would be a lot worse than that. Crowdstrike was bad because everyone lets relatively untested code straight into the Windows kernel - i.e. known incompetence of approach. This would be bad despite massive care taken to have the right approach.
Yes, except there is no “massive care”. If people are OK to install other companies’ rootkits to their critical infrastructure, they will not care about anything else, too.
The massive care is the algorithm selection process, the careful implementations, and the long-term observation and correction of the performance of the algorithm implementations.
"Some people did X" !== "All people do X"
CS was a software update. RSA is baked into many silicon circuits and firmware ROMs.
Well, hardware is replaceable, too.
That happened many many times over with RSA!
The US government used to restrict export of long RSA keys. At one point much of the world was using 128-bit RSA keys, but Dixon's method had everyone scrambling to use 512-bit keys. Then the special number field sieve had us all scrambling to use 1024-bit keys, and the general number field sieve again had us scrambling to get to 2048-bit keys. And that really wasn't that long ago, relatively speaking.
Check out RSA encryption hardware from the '80s. They are really proud of some of the hardware that can do 512 bits! (Useless today)
https://people.csail.mit.edu/rivest/pubs/pubs/Riv84.pdf
The special and general number field sieve complexity statements differ by only a few constants. Look at those constants. Do they seem like some fundamental limit to you? Is it really that unlikely that there's a way to reduce them further, making even 2048-bit keys useless?
You don't need to ask "what would happen if RSA broke" because those of us who have been through this many times now can straight up tell you. You'll be scrambling to once more bump up the key size, and you'll be auditing all the data that was potentially leaked.
RSA failures with bit-depth were a matter of degree; a prime number factorization break-through would be a matter of kind.
It's not log(n), but it's still a break, since we were literally using lower bit strengths than were trivially factorable thanks to mathematical advances. And as for thinking RSA-2048 is safe: well, we once thought that about 128-bit RSA. If the above pans out like the general number field sieve did, we may yet need to move the goalposts further. And we really, really shouldn't be surprised if it happens, since it's happened so many times already.
I believe this was one of the reasons for the broad adoption of elliptic curve based cryptography. The mechanism is not based on prime numbers, so smaller keys were adequate, and it was hoped that they might avoid future degradation due to prime factorization research. Of course they could still be vulnerable to their own attacks, but it still requires an attacker to expend more resources.
When?
I was writing crypto libraries in the early 90s to support RSA, DSA and ElGamal at my company (this predates the times when good open source crypto libraries were broadly available).
Even back then 128 bit RSA keys were not used. The smallest the library supported was 512 and the smallest we ever used in production was 768 bits.
That's how far back my own memory goes. But here's a paper from Arjen Lenstra from 2001 which has a table showing computationally equivalent key sizes back to 1982.
https://infoscience.epfl.ch/server/api/core/bitstreams/c323a...
In 1982, security comparable (at the time!) to DES would have been 417 bit RSA keys.
So even in 1982, using 128 bit RSA keys made no sense!
If you've had to do this for RSA keys (more than once, even!) I respectfully suggest you need to be a lot more conservative picking key lengths. There has never been a sudden breakthrough in factorization that has rendered conservatively chosen RSA key lengths obsolete overnight.
Many people in the industry do not think that RSA is crackable, on the assumption that problems like the Riemann Hypothesis and the distribution of prime numbers are so hard that they have remained unsolved for a very long time.
A possible mitigation for things like websites would be either ECC or even one of the quantum-resistant encryption systems (the industry would more likely avoid the latter for now, since those systems are still very prototypical; we have only just started researching them).
Since old bitcoin wallets can’t be moved off of RSA you can transfer the coins to your wallet and there is no mitigation.
I don't see how proving the Riemann Hypothesis would help cracking RSA? If it helps, couldn't you just assume it is true and start cracking RSA today? If you ever hit a point where it doesn't work then BOOOM: Riemann Hypothesis disproven!
I think it is the other way around--disproving the RH might break some things.
Most mathematicians believe RH is true, and generally when doing industrial number theory people operate under the assumption that RH is indeed true and so if they need to use X to justify something and there is a theorem of the form "if RH is true then X" they use X.
Thus a proof of RH is not a problem. It just confirms that what people applying number theory already assumed was correct.
A disproof means that those X's might not be true and their use would need to be reassessed.
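A concrete instance of that pattern, if I'm remembering the bound correctly: the deterministic Miller primality test only needs to check witnesses up to about 2(ln n)^2, but that bound is a theorem conditional on GRH, so anyone using it is implicitly operating under the "RH is true" assumption. A small sketch:

    # Deterministic Miller primality test. The 2*(ln n)^2 witness bound (Bach)
    # is only proven *assuming GRH* -- a typical "if RH is true then X" usage.
    from math import log

    def is_prime_assuming_grh(n):
        if n < 2:
            return False
        for p in (2, 3, 5, 7):
            if n % p == 0:
                return n == p
        d, s = n - 1, 0
        while d % 2 == 0:
            d, s = d // 2, s + 1
        bound = min(n - 1, int(2 * log(n) ** 2))  # GRH-conditional witness bound
        for a in range(2, bound + 1):
            x = pow(a, d, n)
            if x in (1, n - 1):
                continue
            for _ in range(s - 1):
                x = pow(x, 2, n)
                if x == n - 1:
                    break
            else:
                return False   # a is a Miller witness: n is definitely composite
        return True            # no witness below the bound: prime, if GRH holds

    print([n for n in range(2, 60) if is_prime_assuming_grh(n)])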
RSA was once 128 bits and today has to be 2048 bits minimum to be secure, because it was essentially broken multiple times. There used to be 128-bit RSA encryption hardware that now doesn't work at all to protect data, due to previous mathematical breakthroughs.
The congruence-of-squares equivalence to factorization demonstrated we need at least 500 bits, and then the special number field sieve that built on this pushed it to 1024. The general number field sieve pushed it again to 2048.
Sure, it's not a log(n) break, but it's been broken. If you look at the complexity analysis of the special vs general number field sieve, the portion of the exponent going from 1/2 to 1/3 should give you pause. Can it be moved to 1/4? Could it be moved indefinitely to 1/x? The general number field sieve is relatively recent. If someone comes up with a similar breakthrough again (and this has happened many times over with RSA), your 2048-bit keys won't be secure, just as your 128-bit RSA keys from the past are no longer secure.
From what I remember of the math class where we did those cryptography calculations by hand, the teacher said many years ago that the day we could easily guess prime numbers would be a disaster, because many cryptographic calculations are based on the premise that we can't. I don't know if that has changed?
It's easy to find primes of a given bit length, and it's easy to multiply them together. It's hard to un-multiply a given number (public key) into its unique prime factors (private key).
But if we could more easily find primes, the search space for finding those prime factors would be significantly smaller.
In my mind, it’s not a question of easy vs hard… it’s a question of fast vs slow. The default algorithm for finding primes is pretty simple, but it takes a lot of math and time. If you reduce the time requirements, then we start to get into trouble.
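To make the fast-vs-slow asymmetry concrete, a small sketch (assuming sympy is available for the primality testing): picking two random 512-bit primes and multiplying them takes a fraction of a second, while going back from the product to the primes is the direction with no known fast algorithm.

    # The RSA asymmetry in miniature: finding big primes and multiplying them is
    # quick; recovering p and q from n is not. Assumes sympy is installed.
    from sympy import randprime

    def random_prime(bits):
        # Random prime with the requested bit length (sympy does the primality testing).
        return randprime(1 << (bits - 1), 1 << bits)

    p = random_prime(512)
    q = random_prime(512)
    n = p * q   # a ~1024-bit modulus, built almost instantly

    print(f"{n.bit_length()}-bit modulus generated")
    # Factoring n back into p and q needs sub-exponential (not polynomial) algorithms
    # like the general number field sieve -- that gap is what RSA's security rests on.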
IIRC, Elliptic curve cryptography doesn’t rely on factorization, so there’s already an interim PKC solution in place.
I also recall there were many problems with the ECC based algorithms or at least the implementations—something about discrete approximations weakening security?
Far beyond my comprehension
There is also lattice-based cryptography.
ECC is very closely related though (the abelian hidden subgroup problem is the category they both fall under).
It's actually concerning, because RSA was broken. The reason we're not using 128-bit RSA keys anymore and instead using 2048-bit keys is that RSA was broken by the general number field sieve. We're now all using ECC to avoid working with very large keys, but there's no mathematical proof that ECC is any more difficult. In fact it's widely believed to be the same problem underneath.
That may surprise people. ECC, the thing we rely on, is not proven except by the fact that no one has broken it yet, just like RSA was until someone broke it.
Pretty sure that it would require that P=NP if such an event happened. So if factorization was cracked, everything else would be too.
Integer factorization is an NP problem but is not known to be NP-complete. Therefore, we do not know how to solve all NP problems in P time using a hypothetical P time factorization.
P =? NP would remain open.
Are you sure about that?
And even if problems can be solved in polynomial time, the constants involved can be prohibitively large.
I think someone working in cryptography will worry about a sudden number theory breakthrough that allows for breaking of cryptography about as much as someone working in the energy sector will worry about a sudden physics breakthrough that allows for practically free cold fusion energy.
free energy still needs to be distributed and solving free energy does not solve the balance of the grid. Dunno about Cryptographers.
Do the mathematicians not prove that this is impossible before we all adopt a cryptosystem? Like I don't think everyone started using RSA on a prayer
No it's quite hard to prove that
If someone did find out how to do it, do you think they would make it public?
Since Setec Astronomy closed operation, we've been okay.
I always assumed that in the anime “Ghost in the Shell: Stand Alone Complex” they used “Barrier Mazes” rather than cryptography for a reason.
I guess one perspective is that fast factoring is considered a super rare possibility. The story is that so many smart people have looked at it, it's probably impossible... for now. But that story may be its own weakness.
Anyway, the risk, while just as real as something like the grid being downed by a massive solar storm (with a multi-year, back-to-the-stone-age recovery period due to transformer manufacturing delays and no stockpiles), just seems too minuscule/theoretical to spend much time on, from that point of view.
Regarding any plan, I don't know if it's so easy to just switch to ECC, because actual asymmetric encryption with ECC depends on a shared secret, which (if you're assuming an unsecured exchange channel due to RSA being broken) is more vulnerable to MITM than RSA. I don't think it's an easy swap-out.
All that aside, another point of view is RSA is probably already broken, the break is just secret to the codebreaking agencies. It would be very desirable for them to keep their breakthrough secret. That might even involve trying to find ways to suppress any "sudden number theory breakthroughs" hahaha! :)
I remember telling my high school students that if they found an efficient way to factor large numbers, they would destroy capitalism and a large number of them got very excited about number theory after that.
Isn’t this the same as zero-day vulnerabilities? Typically only a bunch of people out there know how to take advantage of such holes, and eventually they get fixed.
I guess if public key cryptography gets broken, only a bunch of people would know how to take advantage of it, and eventually it would get fixed.
We could always switch to symmetric keys (but key exchange would be somewhat problematic). Also, elliptic curve crypto doesn't use primes, so even public key/private key crypto would still be around and secure.