
Show HN: filippo.io/mlkem768 – Post-Quantum Cryptography for the Go Ecosystem

hattmall
20 replies
2d4h

So what is the actual state of Quantum computing in regards to the level that would make something like this necessary?

Has it become like AI, where instead of the thing actually coming into existence, the definition mostly just changes to bring forth a new product under a previously existing name?

amenhotep
4 replies
2d3h

If the answer was "the NSA is operating quantum cryptanalysis in production and ECDH should be considered completely broken", then anyone who knew this and told you would be in enormous trouble.

It seems unlikely that that's the case, but still, the question is sort of unanswerable. For now it's not known to be a threat, but how much paranoia you have over its potential is subjective.

sesm
2 replies
2d2h

One can be paranoid in a completely different direction, like “any quantum computer can’t survive long enough to perform any useful computation because objective collapse theories are right, and the whole of post-quantum cryptography exists to push implementations with backdoors injected by governments”.

__MatrixMan__
0 replies
2d

I have this type of tinfoil hat. I put it on whenever I see people blindly merging security fixes in response to vulnerability scans.

KMag
0 replies
2d

We're likely to see hybrid PQ + ECDH key exchange for the foreseeable future, running the pair of exchanged values through a hash-based key derivation function.

Partially due to fears of backdoors, but also out of caution because the mathematics of PQ cryptography has seen much less attention than DH / ECDH / RSA cryptography.

For a long time, there was similar conservative skepticism regarding ECDH.

cryptonector
0 replies
2d2h

The paranoid answer is to assume that certain organizations (e.g., nation state intelligence services, corporations that have enormous capital available, extremely wealthy people who can create organizations to pursue this) already have quantum cryptanalysis capabilities or very soon will.

In any case, it does no harm to be ready for a PQ world.

Avicebron
4 replies
2d4h

I'm just barely scratching the surface of quantum computing mostly out of curiosity after almost a decade of traditional software work. So I mostly have done things like qiskit tutorials, but other than reading about the theory I have yet to see a coherent explanation as to how the devices actually implement the theory.

Again, I'd love to be enlightened.

squabbles
3 replies
2d2h

There is none. QC is a scam like nanotech and fusion reactors. To learn more I recommend 'Will We Ever Have a Quantum Computer?' by Dyakonov.

Strilanc
2 replies
1d23h

That paper[1] is a joke.

The main argument it makes is based on counting amplitudes, and noting there are far too many to ever control:

The hypothetical quantum computer is a system with an unimaginable number of continuous degrees of freedom - the values of the 2^N quantum amplitudes with N ~ 10^3–10^5 . [...] Now, imagine a bike having 1000 (or 2^1000 !) joints that allow free rotations of their parts with respect to each other. Will anybody be capable of riding this machine? [...] Thus, the answer to the question in title is: As soon as the physicists and the engineers will learn to control this number of degrees of freedom, which means - NEVER.

The reason this is a joke is because it fundamentally misunderstands what is required for a quantum computation to succeed. Yes, if you needed fine control over every individual amplitude, you would be hosed. But you don't need that.

For example, consider a quantum state that appears while factoring a 2048 bit number. This state has 2^2048 amplitudes with sorta-kinda-uniform magnitudes. Suppose I let you pick a million billion trillion of those amplitudes, and give you complete control over them. You can apply any arbitrary operation you want to those amplitudes, as long it's allowed by the postulates of quantum mechanics. You can negate them, merge them, couple them to an external system, whatever. If you do your absolute worst... it will be completely irrelevant.

Errors in quantum mechanics are linear, so changing X% of the state can only perturb the output by X%. The million billion trillion amplitudes you picked will amount to at most 10^-580 % of the state, so you can reduce the success of the algorithm by at most 10^-580 %. You are damaging the state, but it's such an irrelevantly negligible damage that it doesn't matter. (In fact, it's very strange to even talk about affecting 1 amplitude, or a fraction of the amplitudes, because rotating any one qubit affects all the amplitudes.)

To consistently stop me from factoring, you'd need to change well more than 10% of the amplitudes by rotations of well more than 10 degrees. That's a completely expected amount of error to accumulate over a billion operations if I'm not using error correction. That's why I need error correction. But Dyakonov argues like you'd only need to change 0.0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000001% of the amplitudes to stop me from factoring. He's simply wrong.

[1]: https://ebooks.iospress.nl/pdf/doi/10.3233/APC200019

squabbles
1 replies
1d23h

I said his book, which makes further arguments: https://link.springer.com/book/10.1007/978-3-030-42019-2

And your rebuttal amounts to "if I let you mess with a trivial number of amplitudes then the error will be trivial". Well, duh. Another way of phrasing what you said is that you need to control 90% of 2^2048 amplitudes. Which is Dyakonov's point: nobody knows how to do this.

Strilanc
0 replies
1d22h

I maintain that Dyakonov's arguments are completely missing the mark. I predict this will be experimentally obvious, instead of just theoretically obvious from linearity, within 5 years (due to the realization of logical qubits with lifetimes thousands of times better than their parts).

uh_uh
2 replies
2d1h

Do you not think that AI has been doing quite a lot of "actually coming into existence" the past decade? Did we have computers 10 years ago that could beat the best humans at Go, generate photorealistic images or converse in natural language? Rather than the definition changing, the goalpost might be moving a little bit here.

hattmall
0 replies
15h43m

I don't really, no. 10, 20, and even 30+ years ago you could see the path of existence for the current state of things like image generation, knowledge engines, and large language models. I don't personally envision anything that exists today as AI as I would have back then. IBM's Watson won on Jeopardy 13 years ago. I don't think of the goalposts as having changed at all, because all of those things have existed for a long time; they have just gotten better. They weren't AI then and they aren't AI now.

In my mind, and in what I think most people imagined before the current AI push, AI has critical thinking skills and can develop new information. I certainly think the current wave of AI-named tools is impressive, but they aren't the leap that makes something AI to me, because you could easily see them coming for a very long time.

GolDDranks
0 replies
1d12h

I think the goalposts are definitely moving. ChatGPT definitely counts as an AI, as I imagined an AI fifteen years ago.

FiloSottile
2 replies
2d3h

Cryptography has a peculiar approach to the threat of quantum computers, because it is not acceptable for some of the data and connections encrypted today to become decryptable even thirty, fifty years in the future. That means that the question is not "are QC coming soon" but "might QC plausibly come into existence in the next half century". Since the answer, despite not having a precise consensus, is not "no", here we are.

This is also why you are seeing a lot more progress on PQ key exchanges, as opposed to signatures: signature verification today is not affected by QC fifty years from now, while encryption is.

less_less
0 replies
1d23h

There is also the problem of embedded devices. Some of these will still be operational in 20 years, at which point a "cryptographically relevant quantum computer" might exist. So we want to ensure today that these devices can support post-quantum algorithms, and depending on the device, this support might need side-channel and fault protection.

In some cases we can add support with a device firmware update, but things like secure boot flows and hardware accelerators can't always be updated in the field. And we need to make sure that the devices are fast enough, have big enough key storage, RAM and packet sizes, etc to support the new algorithms.

hackcasual
0 replies
1d22h

It's also important to have an established history of use and vetting, so that by the time PQ is needed, systems with a long track record of security are available.

jjice
1 replies
2d4h

Within the last two-ish years, NIST settled on some post-quantum crypto algorithms, and they have been implemented more and more since. Quantum computing is still far off, but I think the mindset is "why not start now?"

I don't know for certain, but I'd assume things like elliptic curves were implemented a good bit before they garnered mainstream usage. I'd love for someone who was around when that was happening to correct me if I'm wrong, though.

KMag
0 replies
1d23h

You're correct. For a while, elliptic curves were avoided by most for 2 reasons (1) worry about interpretation of some of the now-expired patents and (2) there were some known "gotchas" for constructing curves, and there were some fears that there were still some "gotchas" to be found. (And, perhaps, the NSA knew some of the gotchas and were keeping them in their back pockets.)

Side note: arithmetic/range coding had similar slow adoption due to patents. Depending on your interpretation, IBM's range coding was prior art for the arithmetic coding patents, but nobody really wanted to test it in court. For instance, bzip2 is the original bzip with the arithmetic coding step replaced by a Huffman code. Nobody wanted a repeat of the debacle with GIF images being widely adopted before many realized the use of the patented LZW compression might be a problem.

amomchilov
0 replies
2d3h

It’s not about protecting against QCs today.

The risk is that adversaries can store today’s payloads, and decrypt them in the future. So the sooner you switch to quantum-safe cryptography, the less of a “backlog” you leave vulnerable to future exploit.

Strilanc
0 replies
2d

For quantum computers to break RSA2048, the current quality of physical qubits needs to go up 10x and the quantity needs to go up 10000x. These are rough numbers.

The next major milestone to watch for is a logical qubit with fidelity 1000x better than the physical qubits making it up. That will signal the physical qubits are good enough that you could start only scaling quantity.

vbezhenar
14 replies
2d4h

The same guy that brought us https://github.com/FiloSottile/age

I really like this tool.

varispeed
8 replies
2d3h

Shame it doesn't have plausible deniability built in, i.e. the ability to encrypt at least two files and decrypt one of them depending on which key you provide.

This seems to be a security flaw of most tools of this kind: there is only one possible key, so someone with a hammer can make you disclose it. But if the number of keys is unknown, you can give up some keys and hope the attacker will leave you alone, without revealing the actually protected file.
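A toy sketch of the kind of container being described, with two payloads sealed under independent keys so that a surrendered key opens only its own slot (illustrative only; this is not how age works, and a real design would also pad plaintext lengths so slots are indistinguishable):

```go
package main

import (
	"crypto/aes"
	"crypto/cipher"
	"crypto/rand"
	"crypto/sha256"
	"fmt"
)

// seal encrypts plaintext under a key-derived AES-256-GCM key,
// prepending the random nonce. Without the key, the output is
// indistinguishable from random bytes (lengths aside).
func seal(key, plaintext []byte) []byte {
	k := sha256.Sum256(key)
	block, _ := aes.NewCipher(k[:])
	gcm, _ := cipher.NewGCM(block)
	nonce := make([]byte, gcm.NonceSize())
	rand.Read(nonce)
	return gcm.Seal(nonce, nonce, plaintext, nil)
}

// tryOpen attempts decryption; GCM authentication fails cleanly
// for any slot the key doesn't match.
func tryOpen(key, sealed []byte) ([]byte, bool) {
	k := sha256.Sum256(key)
	block, _ := aes.NewCipher(k[:])
	gcm, _ := cipher.NewGCM(block)
	n := gcm.NonceSize()
	pt, err := gcm.Open(nil, sealed[:n], sealed[n:], nil)
	return pt, err == nil
}

func main() {
	container := [][]byte{
		seal([]byte("decoy key"), []byte("wallet you can afford to lose")),
		seal([]byte("real key"), []byte("the wallet that matters")),
	}
	// Whichever key is surrendered opens exactly one slot.
	for _, slot := range container {
		if pt, ok := tryOpen([]byte("decoy key"), slot); ok {
			fmt.Println(string(pt)) // only the decoy slot opens
		}
	}
}
```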

woodruffw
7 replies
2d3h

I don’t understand the threat model for plausible deniability in a file encryption tool: any adversary that would physically compel you to provide one key would also know that you’re potentially being dishonest about that key.

In other words: plausible deniability on decryption doesn’t satisfy the adversary; they’re just going to torture you until you hand over the real key.

(Maybe there are scenarios, like airport security, where the adversary probably won’t torture you but needs the appearance of decryption? But I don’t think the adversary in those scenarios is thinking about file encryption tools; they’re thinking about your phone’s PIN or biometric lock.)

varispeed
5 replies
2d2h

In other words: plausible deniability on decryption doesn’t satisfy the adversary;

If there is only one key that decrypts the file, then they will have validation whether you provided the right one.

For instance if you have encrypted your crypto wallet info. You would have to give up the real key that will decrypt the file.

With plausible deniability scenario, you could have encrypted two wallets, one that you afford to lose. You can give it up and it's possible the attacker will be satisfied and you can keep the wallet that you care about.

The attacker will also never know if there are more keys. Mind you, they can always shoot you either way, but with the plausible deniability you might have a chance to leave the wallet you care about to your dependents.

woodruffw
4 replies
2d1h

With plausible deniability scenario, you could have encrypted two wallets, one that you afford to lose. You can give it up and it's possible the attacker will be satisfied and you can keep the wallet that you care about.

The observation here is that there's a _very_ narrow slice of adversaries that satisfy the following constraints:

1. Are willing to force you to hand over private key material;

2. Are ignorant to the underlying thing they're looking for;

3. Are ignorant to the fact that your choice of encryption scheme allows for plausible deniability.

This model assumes all 3, when in reality almost any adversary that satisfies (1) is not going to satisfy (2) or (3) -- they almost always have a sense of what they're looking for (i.e., they know the wallet isn't empty or they wouldn't waste their time on you) and, given that, they aren't going to be fooled by a scheme that explicitly supports plausible deniability.

We can always contrive exceptions, of course. But part of secure design and threat modeling is having a reasonable conception of your adversary.

varispeed
3 replies
2d

You are moving the goal posts. I didn't say the second wallet is empty, but such that you can afford to lose.

For instance you could have one wallet with £10m on it and another with £1.5m. You could certainly convince the adversary that they got bad intel and £1.5m is all you have. It's better to lose £1.5m than £11.5m.

There are other scenarios, like a journalist taking compromising photos. They could have two sets of photos, one with the key photo missing and another with it included. When questioned by an adversary, they could claim they missed the shot and show the incomplete set as evidence.

Someone in an abusive relationship planning to leave their partner. They could have a folder of properties they are interested in, without the property they are actually going to rent. When caught, they could convince the partner that they were just looking but had not committed to anything.

If you are not in these kinds of situations, sure, this additional layer may not be of interest, and frankly you wouldn't have to use it! But for many people the lack of such a feature is a deal breaker.

woodruffw
2 replies
1d23h

The "empty wallet" isn't an operative part of the argument. The argument is that all three properties need to hold; this is true even in the 11.5m example.

Each of the scenarios above fails test (3). The most compelling of them is the abusive relationship one, since we can reasonably imagine a non-sophisticated adversary in that setting. But even then, you're relying on conflicting terms: what kind of abusive partner is sufficiently irrational to be abusive but also sufficiently rational to believe a cryptographic proof? Or, put another way: overwhelming evidence has not historically been a strong defense against abuse.

varispeed
1 replies
1d8h

You are moving goal posts again.

Sorry my friend, but you are not discussing this in good faith.

woodruffw
0 replies
1d1h

I don't think I am, but I'm sorry that this hasn't been a productive conversation.

eviks
0 replies
2d3h

You seem to understand it, given your own example of airport-security-type situations. There are also similar law-enforcement scenarios where the people involved don't have enough information to be certain you possess something, so they can't justify causing additional harm when they find nothing after the keys are disclosed.

candiddevmike
3 replies
2d4h

Age is nice but seems to have stagnated: the last release is from 2022, and it's not using a more modern PBKDF like Argon2.

If you're looking for something designed for secret storage/sharing, check out rot: https://github.com/candiddev/rot

tptacek
0 replies
2d3h

Not changing since 2022 is a feature, not a bug. It's an exchange format, so it was never going to switch KDFs just for funsies, and scrypt is fine.

dpatterbee
0 replies
2d3h

I wouldn't describe age as having stagnated, it's simply stable. It has a well defined spec which it fully implements (seemingly correctly), so there isn't any need for more recent releases.

Also I think it's a bit sly to not mention that you're the creator of the alternative you suggest.

FiloSottile
0 replies
2d3h

age is intentionally stable as a format and as a core tool.

There is a lot of activity in the integration and plugin ecosystem, which I am very happy about. https://github.com/FiloSottile/awesome-age

I have a wishlist for v2 changes, and I am considering slowly and carefully making such a release this year, but the difference in security between scrypt and Argon2 doesn't really justify making a change there.

vaylian
0 replies
2d2h

I want to like this tool, but it lacks a manual or a tutorial that describes typical usage. I don't mean how to use the command line. That is well-documented. But I want to know how I should manage and distribute my keys and what I need to look out for. The entire social layer on top of the technology is unclear to me.

Some stories featuring Alice and Bob would be great.

lopkeny12ko
14 replies
2d3h

Whatever happened to "don't roll your own crypto"? Isn't this work best left to, for example, OpenSSL?

sackfield
2 replies
2d2h

This is a good point. Most of Go's crypto library seems to be written in Go: https://cs.opensource.google/go/go/+/master:src/crypto/crypt...

Go can link to C, but the process is a bit horrible. I wonder whether Go's memory safety, compared to C, and its security implications reverse this a bit.

ecnahc515
1 replies
1d14h

None of that is an actual implementation of the cryptography, that's just the interfaces. The actual implementations are elsewhere and use architecture specific assembly for anything that needs constant time properties.

FiloSottile
0 replies
1d7h

No, we use assembly for access to specialized (e.g. AES-NI) or vector instructions, and we do so reluctantly. We much prefer writing cross-platform constant time Go. https://go.dev/wiki/AssemblyPolicy

The actual implementations are in that tree, too: https://cs.opensource.google/go/go/+/master:src/crypto/

programd
2 replies
2d2h

FiloSottile is one of the handful of people in the world whose crypto code you use instead of rolling your own.

archgoon
1 replies
2d2h

Why?

ecnahc515
0 replies
1d15h

Because he's a security engineer/cryptographer. He's the maintainer of the crypto package in the Go standard library. https://words.filippo.io/hi/. It's literally his job to write the crypto code everyone else uses.

gnfargbl
1 replies
2d2h

Golang doesn't typically use OpenSSL: there is a BoringCrypto mode which can be enabled, but it's unsupported outside Google [1]. Instead, they have a policy of implementing well-known cryptographic algorithms themselves, in Go.

The author in this case is the lead cryptographic maintainer for Go.

[1] https://go.dev/src/crypto/internal/boring/README

HillRat
0 replies
1d14h

Possibly worth noting for folks: BoringCrypto is necessary if you have to deal with FIPS certification (which hopefully you don't). There's the usual FFI penalty for using it, so Go-native code is preferred for most use cases.

e4m2
1 replies
2d2h

The author of the post happens to be the person who implements your crypto, so you don't have to roll your own: https://github.com/FiloSottile.

(I personally also wouldn't use OpenSSL as an example of good cryptographic code)

meesles
0 replies
2d2h

Taking this rare opportunity to put OpenSSL on blast a little bit: I have experienced no other software library or vendor where seemingly benign patch updates have completely broken functionality at the system level (in the web world). SemVer, people! Use it properly (and be consistent with the rest of the ecosystem you operate in)!

wfn
0 replies
2d2h

Golang has its own native crypto implementations where possible. Having used openssl extensively and having read some of its source code[1], I would personally like good alternatives. The developer (Filippo Valsorda) is a cryptographer and Go maintainer (incl. of some well known Go crypto packages in its std lib).

In this particular instance he seems to have implemented this (ML-KEM-768) as an exercise (incl. educational), but still, just some context!

[1] openssl is a gift that keeps on giving (CVEs). Just look at all those nice (incl. recent) issues, incl. RCEs iirc. Also, very anecdotal, but I find it funny that they haven't updated this page for the last I don't know 15 years? https://wiki.openssl.org/index.php/Code_Quality

mcpherrinm
0 replies
1d22h

Sibling comments have already addressed the fact that Filippo is one of the people you're probably "best leaving" cryptography implementation to.

The author has actually talked explicitly about "Don't roll your own crypto":

https://securitycryptographywhatever.com/2021/07/31/the-grea...

chjj
0 replies
2d

I always took that to mean, "don't come up with your own cryptographic schemes", not "don't implement well-specified and standardized algorithms which are accompanied by a reference implementation and test vectors".

Retr0id
0 replies
2d2h

Beyond what the other replies have already pointed out, the most sensible way to deploy PQ crypto today is to use it as part of a hybrid scheme. This way, even if the PQ crypto is broken (either fundamentally, or due to implementation bugs), the scheme as a whole will still be secure under a pre-quantum threat model - so not worse than what you started with, assuming you were using pre-quantum crypto to begin with.

gnfargbl
14 replies
2d4h

I have no ability to judge the quality of this algorithm or implementation, but I do thoroughly approve of the usage of unicode in variable names:

    ρ, σ := G[:32], G[32:]
Somehow much better than seeing "rho", "sigma".
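For the curious, this compiles as-is: Go identifiers may contain Unicode letters, so code can mirror the FIPS 203 notation directly (the zeroed G below is dummy data standing in for the real 64-byte hash output):

```go
package main

import "fmt"

func main() {
	// In ML-KEM, G is a 64-byte hash output split into ρ and σ.
	// Here G is dummy data; the point is that Go identifiers may
	// use Unicode letters, matching the spec's notation.
	G := make([]byte, 64)
	ρ, σ := G[:32], G[32:]
	fmt.Println(len(ρ), len(σ)) // 32 32
}
```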

Philip-J-Fry
7 replies
2d4h

Gotta disagree. It's neat, but I don't like to see it in the real world.

For a start, I don't know how to type these on a keyboard.

Secondly, most people wouldn't know what these symbols are called. Granted, those looking at the code probably have a greater chance of knowing. But it isn't friendly code in my opinion. I think clarity is key, and "rho" or "sigma" are pretty clear.

Also, add in that there's a constant "n" and a constant "η". Just begging for confusion.

Retr0id
5 replies
2d4h

In a more general sense I'd agree, but in this instance, the ML-KEM draft specification (FIPS 203) uses the exact same greek symbols.

This code will be read many more times than it is written, and anyone auditing its correctness will be comparing against the spec.

If the variables are spelt out, then you have to do the mental (or otherwise) translation before you can compare the two, adding overhead. For comparing the symbols visually, you don't even need to know their names.

riquito
2 replies
1d23h

For comparing the symbols visually, you don't even need to know their names.

Well, no, this was the main issue with homograph attacks in domain names [1] that brought us to the use of punycode in browsers [2]

In particular for a cryptographic library, I wouldn't want to constantly have to watch out for sneaky malicious variables put in the right place (e.g. try to visually compare the Cyrillic а, с, е, о, р, х, у with the ASCII a, c, e, o, p, x, y; no, they're not the same characters).

EDIT: I realize that many programming languages today allow Unicode variable names, and I like that it's a possibility; it's just not the best when you need to be paranoid about the code.

- [1] https://en.wikipedia.org/wiki/IDN_homograph_attack - [2] https://en.wikipedia.org/wiki/Punycode

Retr0id
1 replies
1d22h

Preventing/detecting homoglyph attacks is a feature of competent text editors, and not a feature of the source code itself. If the source spelt out the variable names using latin characters, it would be no more or less susceptible to being backdoored in this way.

GolDDranks
0 replies
1d12h

Also, for example, rustc does a homoglyph detection pass and emits warnings.

xenophonf
1 replies
2d3h

the ML-KEM draft specification (FIPS 203) uses the exact same greek symbols

I'm as proud of my heritage as the next Greek-American, but just because mathematicians use unintelligible symbols in their manuscripts doesn't mean we have do the same thing in code. Let's prioritize comprehensibility and give variables meaningful names, instead.

Retr0id
0 replies
2d3h

FIPS 203 is not a math paper, it's a specification, giving concrete algorithms that use aforementioned variable names. Maybe they should use more descriptive variable names in the spec (you could tell them about it - they accept and respond to feedback from the public), but in the meantime, I think it's more useful for an implementation to use the same variable names as the spec.

eviks
0 replies
2d3h

For a start, I don't know how to type these on a keyboard.

You can learn how to do that, or even find a way to type "sigma", but more importantly, this mostly benefits readers, not writers, so you don't need to learn to type it.

and "rho" or "sigma" are pretty clear.

No, it's not. Where would you get clarity from, when all the clarifying literature for these notions uses the actual math notation σ?

"n" and a constant "η". Just begging for confusion

They look very distinct; one is obviously mathy, the other isn't.

tzs
3 replies
2d4h

Does Go allow unicode subscripts in variable names?

Of the languages I've checked, Perl, Python, and JavaScript (in Chrome and Firefox) do not. PHP does.

sapiogram
1 replies
2d4h

It does. Not exactly surprising, since Rob Pike and Ken Thompson made both Golang and UTF-8.

eviks
0 replies
2d3h

it doesn't (that's subscript 1)?

invalid character U+2081 '₁' in identifier

Though that's also not surprising given how poor overall Unicode support is

eviks
0 replies
2d3h

doesn't seem like it

invalid character U+2081 '₁' in identifier

But AutoHotkey does allow it

super² := 1, sub₂ := 2

clktmr
1 replies
2d4h

I must say, I don't like it at all. As with all characters that aren't part of my keyboard, the extra steps to type these add so much friction. Also, I would probably misread ρ as p, giving me the weirdest compile errors.

How would you feel about adding acutes and cedillas to characters? It just adds complexity. Let's stick to the smallest common denominator.

eviks
0 replies
2d3h

The smallest common denominator for math is math symbols; full ASCII notation instead of those is the extra complexity in reading comprehension, which, as the saying goes, is the more frequent occurrence.

tgkudelski
7 replies
2d4h

Hello from Kudelski Security. This is super timely, because we recently had to discontinue one of the only other existing Go libraries for quantum-resistant cryptography! Full story at https://research.kudelskisecurity.com/2024/02/01/the-kybersl...

client4
6 replies
2d1h

Wasn't kyber-512 intentionally weakened by the NSA members of NIST?

tptacek
5 replies
2d1h

No.

less_less
4 replies
1d20h

To expand on this: Daniel J Bernstein (of Curve25519 etc fame) has alleged that NIST and/or NSA knows a secret weakness in Kyber and therefore pushed it instead of NTRU. While the allegations are vaguely plausible -- NSA employs some very clever people, and of course they would like to interfere -- the evidence DJB put forth hasn't convinced very many other cryptographers.

There was also an incident a few months back when someone with an NSA email address suggested significant last-minute changes to Kyber on the PQC forum mailing list. These changes had a security flaw, and they were rejected. NSA might still know a different weakness, of course.

Note also that DJB's allegations focus on Kyber-512 being too weak, and this post is about Kyber-768.

tptacek
3 replies
1d19h

I don't think there's a single PQC cryptography researcher other than Bernstein himself (corrections welcome) that takes the claims he made seriously, and one of the basic mathematical arguments he made may have been refuted, two messages down in the mailing list thread, by Chris Peikert.

less_less
2 replies
1d19h

Yeah, sorry, "hasn't convinced very many other cryptographers" was probably too much of an understatement.

client4
1 replies
1d12h

I appreciate the follow-up. I read the long DJB page but never saw any follow-up; to be fair I wasn't directly looking for any. In either case it's great to know the allegations don't apply to Kyber-768 and up (and great there's a Golang implementation now!).

less_less
0 replies
1d7h

I'm sure he's alleged something about the other variants. Like a few years ago he had this theory about "S-Unit attacks" on Kyber and NewHope, but it hasn't gone anywhere.

IMHO the lattice finalists -- Kyber, Saber and NTRU -- are all basically good, each having advantages over the others but no decisive advantages, and Kyber was the community favorite. So that whole rant about NIST picking Kyber for unconvincing reasons is like ... yeah, that's just what happens when all remaining choices are fine.

There is also the issue that cryptanalysis has advanced. There haven't been any fundamental breakthroughs yet, but there have been significant optimizations. If this trend continues, Kyber-512 might become certificationally weak (i.e. it might be considered weaker than AES-128), but unless there is a deeper breakthrough, it probably will not become feasible to break it in practice. This threat is why the Kyber team recommends Kyber-768 for mainstream use. The same threat applies to Saber and NTRU, with NTRU having (IIUC) the weakest security at a given dimension, but the most freedom in choosing how many dimensions to use.

mooreds
4 replies
2d6h

Anyone aware of such implementations for other languages (java, c#, etc)?

elliewithcolor
2 replies
2d6h

General implementation or a "from scratch" one?

For general, here is a list: https://pq-crystals.org/kyber/software.shtml

cipherboy
1 replies
2d5h

Note that there may be incompatibilities (as noted in the article) until NIST has published the final revisions. Some implementations follow Round 3 Kyber, others FIPS 203.

This one will interoperate with Bouncy Castle (both Java and C#) as we both use FIPS 203 draft, but it won't interoperate with OQS simultaneously (three-way interop) as that is still on the Round 3 submission.

See also: https://github.com/bcgit/bc-java/issues/1578

(Disclosure: BC is my employer)

elliewithcolor
0 replies
2d4h

Fair point.

glitchc
0 replies
2d3h

How about liboqs from OpenQuantumSafe? It includes an implementation of most PQC primitives proposed to date:

https://github.com/open-quantum-safe/liboqs

bennettnate5
4 replies
2d5h

Correct me if I'm wrong, but if it's written in pure Go, wouldn't that make it susceptible to timing/power side channel attacks?

bennettnate5
1 replies
2d4h

All critical operations are performed in constant time.

Should have clicked all the way through to the project docs; looks like they're keeping this in mind.

e1g
0 replies
2d3h

Just for context, the OP (FiloSottile) was in charge of cryptography and security on the Go team until recently.

l33t7332273
0 replies
2d1h

Is there a language that is invulnerable to power side channel attacks? The idea seems nonsensical to me.

As far as timing attacks go, what about Go makes it more susceptible to timing side channels than any other language?

FiloSottile
0 replies
2d3h

Go is as susceptible to timing side channels as C, if not less. (The difference being that while there is one major Go compiler, which usually does not go overboard with optimizations, when writing C you have to do increasingly complex tricks to defend against the compiler realizing what you are trying to do and replacing it with a more efficient variable-time branch.) This implementation was written to avoid any secret dependent code path.

Power side channels, which require physical access, are indeed outside the threat model of Go.

tux3
2 replies
2d5h

Neat that it can also work as draft00/kyber v3 =)

How hard would it be to support a fast Kyber 90's mode, without SHA-3? (I suppose you would have to break the abstraction for that one).

FiloSottile
1 replies
2d5h

Yeah, replacing the hash would take a fork. Note that this implementation spends only about 20% of CPU time in SHA-3, so the gain wouldn't be massive. That proportion would probably grow after optimizing the field implementation, but almost certainly not enough to make it worth using a non-standardized, less-tested mode.

tux3
0 replies
2d4h

That's fair. Thanks.

mauricesvp
1 replies
2d2h

Unrelated, but c'mon Filo, the 32 bit syscall table is still 'coming soon' :')

FiloSottile
0 replies
2d

Hah, touché my friend. It's just that every time I think about touching that page it scope creeps into making it autogenerated from the kernel sources via CI etc. etc. :)

teleforce
0 replies
2d1h

Perhaps relevant to the discussion is this friendly book by John Arundel on implementing crypto systems in the latest version of Go. The last section makes a passing mention of post-quantum crypto. Perhaps John can update the book with this library once the NIST PQ standard is finalized.

Explore Go: Cryptography (Go 1.22 edition):

https://bitfieldconsulting.com/books/crypto

dorianmariefr
0 replies
2d5h