So what is the actual state of quantum computing relative to the level that would make something like this necessary?
Has it become like AI, where instead of actually coming into existence, the definition mostly just changes to bring forth a new product under a previously existing name?
If the answer was "the NSA is operating quantum cryptanalysis in production and ECDH should be considered completely broken", then anyone who knew this and told you would be in enormous trouble.
It seems unlikely that that's the case, but still, the question is sort of unanswerable. For now it's not known to be a threat, but how much paranoia you have over its potential is subjective.
One can be paranoid in a completely different direction, like “no quantum computer can survive long enough to perform any useful computation because objective collapse theories are right, and the whole post-quantum cryptography push exists to promote implementations with backdoors injected by governments”.
I have this type of tinfoil hat. I put it on whenever I see people blindly merging security fixes in response to vulnerability scans.
We're likely to see hybrid PQ + ECDH key exchange for the foreseeable future, running the pair of exchanged values through a hash-based key derivation function.
Partially due to fears of backdoors, but also out of caution because the mathematics of PQ cryptography has seen much less attention than DH / ECDH / RSA cryptography.
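In code, the combining step looks roughly like this (a minimal Python sketch using the pyca/cryptography package; the PQ shared secret is just a stand-in here, since the real KEM, labels, and transcript binding depend on the protocol):

    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    # Classical ECDH half of the hybrid: an X25519 exchange.
    client_priv = X25519PrivateKey.generate()
    server_priv = X25519PrivateKey.generate()
    ecdh_secret = client_priv.exchange(server_priv.public_key())

    # Stand-in for the post-quantum half (e.g., an ML-KEM shared secret);
    # a real handshake would obtain this via KEM encapsulation/decapsulation.
    pq_secret = os.urandom(32)

    # Run both secrets through a hash-based KDF; the derived key stays safe
    # as long as at least one of the two exchanges remains unbroken.
    session_key = HKDF(
        algorithm=hashes.SHA256(),
        length=32,
        salt=None,
        info=b"example hybrid PQ+ECDH handshake",
    ).derive(ecdh_secret + pq_secret)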
For a long time, there was similar conservative skepticism regarding ECDH.
The paranoid answer is to assume that certain organizations (e.g., nation-state intelligence services, corporations with enormous capital available, extremely wealthy people who can create organizations to pursue this) already have quantum cryptanalysis capabilities or very soon will.
In any case, it does no harm to be ready for a PQ world.
I'm just barely scratching the surface of quantum computing, mostly out of curiosity after almost a decade of traditional software work. So far I've mostly done things like Qiskit tutorials, but beyond reading about the theory, I have yet to see a coherent explanation of how the devices actually implement it.
Again, I'd love to be enlightened.
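For context, the kind of thing those tutorials start with is a circuit like the one below (assuming current Qiskit plus the qiskit-aer simulator); it says nothing about how the hardware realizes the gates, which is exactly the part I'm missing:

    from qiskit import QuantumCircuit
    from qiskit_aer import AerSimulator

    # Two-qubit Bell state: Hadamard on qubit 0, then CNOT onto qubit 1.
    qc = QuantumCircuit(2, 2)
    qc.h(0)
    qc.cx(0, 1)
    qc.measure([0, 1], [0, 1])

    # Run on a local simulator; counts come out roughly 50/50 '00' and '11'.
    counts = AerSimulator().run(qc, shots=1000).result().get_counts()
    print(counts)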
There is none. QC is a scam like nanotech and fusion reactors. To learn more I recommend 'Will We Ever Have a Quantum Computer?' by Dyakonov.
That paper[1] is a joke.
The main argument it makes is based on counting amplitudes and noting that there are far too many to ever control.
The reason this is a joke is that it fundamentally misunderstands what is required for a quantum computation to succeed. Yes, if you needed fine control over every individual amplitude, you would be hosed. But you don't need that.
For example, consider a quantum state that appears while factoring a 2048-bit number. This state has 2^2048 amplitudes with sorta-kinda-uniform magnitudes. Suppose I let you pick a million billion trillion of those amplitudes, and give you complete control over them. You can apply any arbitrary operation you want to those amplitudes, as long as it's allowed by the postulates of quantum mechanics. You can negate them, merge them, couple them to an external system, whatever. If you do your absolute worst... it will be completely irrelevant.
Errors in quantum mechanics are linear, so changing X% of the state can only perturb the output by X%. The million billion trillion amplitudes you picked will amount to at most 10^-580 % of the state, so you can reduce the success of the algorithm by at most 10^-580 %. You are damaging the state, but the damage is so negligible that it doesn't matter. (In fact, it's very strange to even talk about affecting one amplitude, or a fraction of the amplitudes, because rotating any one qubit affects all the amplitudes.)
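To put rough numbers on that (a quick back-of-the-envelope in Python, reading "a million billion trillion" as 10^27):

    from math import log10

    # The state has 2^2048 amplitudes; suppose an adversary fully controls
    # 10^27 of them (a million billion trillion).
    total_log10 = 2048 * log10(2)   # ~616.5, i.e. about 10^616 amplitudes
    controlled_log10 = 6 + 9 + 12   # 10^27

    # Fraction of the state affected, expressed as a percentage.
    fraction_pct_log10 = controlled_log10 - total_log10 + 2
    print(f"controlled fraction ~ 10^{fraction_pct_log10:.1f} percent")
    # -> ~10^-587.5 percent, comfortably under the "at most 10^-580 %" bound above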
To consistently stop me from factoring, you'd need to change well more than 10% of the amplitudes by rotations of well more than 10 degrees. That's a completely expected amount of error to accumulate over a billion operations if I'm not using error correction. That's why I need error correction. But Dyakonov argues like you'd only need to change 0.0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000001% of the amplitudes to stop me from factoring. He's simply wrong.
[1]: https://ebooks.iospress.nl/pdf/doi/10.3233/APC200019
I was referring to his book, which makes further arguments: https://link.springer.com/book/10.1007/978-3-030-42019-2
And your rebuttal amounts to "if I let you mess with a trivial number of amplitudes, then the error will be trivial". Well, duh. Another way of phrasing what you said is that you need to control 90% of 2^2048 amplitudes, which is Dyakonov's point: nobody knows how to do this.
I maintain that Dyakonov's arguments are completely missing the mark. I predict this will be experimentally obvious, instead of just theoretically obvious from linearity, within 5 years (due to the realization of logical qubits with lifetimes thousands of times better than their parts).
Do you not think that AI has been doing quite a lot of "actually coming into existence" the past decade? Did we have computers 10 years ago that could beat the best humans at Go, generate photorealistic images or converse in natural language? Rather than the definition changing, the goalpost might be moving a little bit here.
I don't really, no. 10, 20, and even 30+ years ago you could see the path to the current state of things like image generation, knowledge engines, and large language models. I don't personally consider anything that exists today to be AI as I would have back then. IBM's Watson won on Jeopardy 13 years ago. I don't think the goalposts have changed at all, because all of those things have existed for a long time; they have just gotten better. They weren't AI then and they aren't AI now.
In my mind, and I think in most people's minds before the current AI push, AI meant something with the critical thinking skills to develop new information. I certainly think the current wave of AI-branded tools is impressive, but they aren't the leap that makes something AI to me, because you could see them coming for a very long time.
I think the goalposts are definitely moving. ChatGPT absolutely counts as AI as I would have imagined it fifteen years ago.
Cryptography has a peculiar approach to the threat of quantum computers, because it is not acceptable for some of the data and connections encrypted today to become decryptable even thirty, fifty years in the future. That means that the question is not "are QC coming soon" but "might QC plausibly come into existence in the next half century". Since the answer, despite not having a precise consensus, is not "no", here we are.
This is also why you are seeing a lot more progress on PQ key exchanges, as opposed to signatures: signature verification today is not affected by QC fifty years from now, while encryption is.
There is also the problem of embedded devices. Some of these will still be operational in 20 years, at which point a "cryptographically relevant quantum computer" might exist. So we want to ensure today that these devices can support post-quantum algorithms, and depending on the device, this support might need side-channel and fault protection.
In some cases we can add support with a device firmware update, but things like secure boot flows and hardware accelerators can't always be updated in the field. And we need to make sure that the devices are fast enough, have big enough key storage, RAM and packet sizes, etc to support the new algorithms.
It's also important to have an established history of use and vetting, so that by the time PQ is needed, systems with a long track record of security are available.
Within the last two-ish years, NIST settled on some post-quantum crypto algorithms, and they've been implemented more and more since. Quantum computing is still far off, but I think the mindset is "why not start now?"
I don't know for certain, but I'd assume elliptic curve cryptography was implemented a good bit before it garnered mainstream usage. I'd love for someone who was around when it was happening to correct me if I'm wrong, though.
You're correct. For a while, elliptic curves were avoided by most for two reasons: (1) worry about the interpretation of some of the now-expired patents, and (2) there were some known "gotchas" in constructing curves, and fears that more "gotchas" were still to be found. (And, perhaps, that the NSA knew some of the gotchas and was keeping them in its back pocket.)
Side note: arithmetic/range coding had similar slow adoption due to patents. Depending on your interpretation, IBM's range coding was prior art for the arithmetic coding patents, but nobody really wanted to test it in court. For instance, bzip2 is the original bzip with the arithmetic coding step replaced by a Huffman code. Nobody wanted a repeat of the debacle with GIF images being widely adopted before many realized the use of the patented LZW compression might be a problem.
It’s not about protecting against QCs today.
The risk is that adversaries can store today’s payloads, and decrypt them in the future. So the sooner you switch to quantum-safe cryptography, the less of a “backlog” you leave vulnerable to future exploit.
For quantum computers to break RSA-2048, the current quality of physical qubits needs to go up 10x and the quantity needs to go up 10000x. These are rough numbers.
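As a back-of-the-envelope on those multipliers (the starting figures below are illustrative assumptions, not measurements: roughly a thousand physical qubits and ~10^-3 error rates today):

    # Purely illustrative arithmetic for the rough 10x / 10000x figures above.
    current_physical_qubits = 1_000   # assumed order of magnitude today
    current_error_rate = 1e-3         # assumed physical gate error rate today

    needed_qubits = current_physical_qubits * 10_000   # "quantity up 10000x"
    needed_error_rate = current_error_rate / 10        # "quality up 10x"

    print(f"~{needed_qubits:,} physical qubits at ~{needed_error_rate:.0e} error rate")
    # -> ~10,000,000 physical qubits at ~1e-04 error rate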
The next major milestone to watch for is a logical qubit with fidelity 1000x better than the physical qubits making it up. That will signal the physical qubits are good enough that you could start only scaling quantity.