
A quick post on Chen's algorithm

keepamovin
21 replies
6d15h

There'll probably be a class of problems discovered that is provably hard both classically and Q hard, and we'll be set. Something basically new to cryptography, something that was tried once in the past but failed, until seen through the new light of more recent maths or research fashion.

But until then it seems to me like something based on the difficulty of attacking hash functions would be a good bet for Q resistance. Totally unsure how to make a PK scheme from that, but it has a few nice properties:

- hashes are often tuneable, you can add more state and increase the keyspace/security

- good hashes don't have any weaknesses that Q can exploit

- hashes are often pretty fast

- hashes are well studied

- hashes seem to be hard in C and hard in Q

YoumuChan
13 replies
6d15h

We don't even know the complexity class of factorization or discrete log, yet we still use those problems in DH, RSA, ECDSA, ...

eru
12 replies
6d14h

All of those problems are known to be in NP and co-NP. In that sense, we know some complexity classes they belong to.

However, we don't know if these bounds are tight, or whether they are eg in P, or something in between.
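To make the "in NP" part concrete, here's a tiny sketch (my own illustration, not from the thread): the decision version of factoring asks "does N have a nontrivial factor below k?", and a claimed factor acts as the certificate. Checking it is a single division, which is polynomial time in the number of digits.

```python
# Why factoring-style problems sit in NP: a claimed factor is a
# certificate that can be verified in polynomial time. The names here
# are hypothetical, just for illustration.

def verify_factor_certificate(n: int, k: int, certificate: int) -> bool:
    """Accept iff `certificate` proves n has a nontrivial factor below k."""
    return 1 < certificate < k and n % certificate == 0

# 8633 = 89 * 97, so 89 certifies a "yes" answer to "factor below 90?"
assert verify_factor_certificate(8633, 90, 89)
# A bogus hint is rejected; we never trust the hint itself.
assert not verify_factor_certificate(8633, 90, 91)
```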

keepamovin
9 replies
6d14h

We don't know that factorization is NP-complete. Show me a reduction from SAT to factorization.

It's kind of trivial to say it's in NP because we can verify in P time; that's not a criticism of you, just of the definition!!

I think a better definition of NP is "only nonpoly algos can exist, no P algos can exist". By that definition of NP, we don't even know that it's in NP strictly because there could exist P algorithms for solving it. It's more in 'unknown-NP' if that were a class! hahaha! :)

eru
5 replies
6d12h

We don't know that factorization is NP-complete.

Yes? No one ever said it was.

None of the common cryptographic problems are expected to be NP-complete, even if they aren't in P. That's because they are known to be in both NP and in co-NP, and it's expected that NP != co-NP.

I think a better definition of NP is "only nonpoly algos can exist, no P algos can exist".

In what sense is that a 'better' definition than the standard definition? It sounds like what you are talking about is NP\P (where \ is set subtraction, ie 'NP minus P').

keepamovin
4 replies
6d8h

I think some people have asked whether it was. I'm not saying you did, just thought it was interesting! Haha :)

I don't even know what co-NP is. Could you explain?

I think that's a better definition because I find it more predictive and useful to think about: pretty concrete to know that you can't have a polytime algo for it.

Yeah, I guess what you're saying about NP\P is right in that it's a restatement of the definition of what I said, haha! I'm not an expert this is just what I think :)

eru
3 replies
5d17h

I don't even know what co-NP is. Could you explain?

See https://en.wikipedia.org/wiki/Co-NP (that article even mentions integer factorisation).

I think that's a better definition because I find it more predictive and useful to think about: pretty concrete to know that you can't have a polytime algo for it.

Well, that's a non-standard definition for NP, and you would have a hard time talking to anyone. And at the moment we have no clue whether your 'NP' has any problems in it at all, or whether it's an empty set. In that sense, it's a very impractical definition.

Btw, there are some nice alternative but equivalent definitions for traditional NP. The classic definition is basically: NP is the set of problems for which you can check an answer in polynomial time if someone gives you a hint (ie they give you the answer and whatever else you need, but you have to verify it, you can't trust the hint).

A nice alternative definition says that with access to O(log n) random bits, you only need to look at 3 randomly chosen bits of the hint, and you are still guaranteed to suss out any fake answer with constant probability (which you can amplify by repeating). See https://en.wikipedia.org/wiki/PCP_theorem

keepamovin
2 replies
5d17h

Thanks for the alt NP definition. I'd be fine to talk to people we just have to clarify the definitions first. Haha! :) I think mine's good but I get if you differ, no worries.

It's actually a very fascinating definition and question: Are there problems for which we can prove they are in NP but also prove they cannot have polynomial time (P time) solutions?

I did check out that wiki page first, but found it super difficult to parse. Do you have some insight that could help me understand more simply/intuitively??

For instance, I found the definition of NP as "P if you have a nondeterministic Turing machine" to be super easy to understand. But when that wiki starts talking about "certificates" I just have no idea.

That is, co-NP is the set of decision problems where there exists a polynomial p(n) and a polynomial-time bounded Turing machine M such that for every instance x, x is a no-instance if and only if: for some possible certificate c of length bounded by p(n), the Turing machine M accepts the pair (x, c).

eru
1 replies
5d15h

Are there problems for which we can prove they are in NP but also prove they cannot have polynomial time (P time) solutions?

That's exactly the famous P!=NP question.

I did check out that wiki page first, but found it super difficult to parse. Do you have some insight that could help me understand more simply/intuitively??

Scott Aaronson might have some good intro material on his blog. Otherwise, you can just ask your favourite search engine (or AI bot) for some intro material.

For instance, I found the definition of NP as P if you have an NFA, to be super easy to understand. But when that wiki starts talking about "certificates" I just have no idea.

The certificate is the 'cheatsheet' or 'hint'. Basically the question is, how well can you do in an exam where you have to show your work, if someone gives you all the answers? (But that guy is a troll, so you can't trust him, and still need to verify everything.)

keepamovin
0 replies
13h44m

Cool, thank you. Yeah that makes sense. I didn't expect you to actually explain the entire thing, I just wondered if you had some, you know, insight. It's all good hahaha! :) I like your cheatsheet; I guess that applies to your previous definition of co-NP! :)

SJC_Hacker
2 replies
6d13h

I think this is what a lot of people get wrong. The "N" in NP does not stand for "not"; it stands for "non-deterministic". Meaning you can solve it in P time with a non-deterministic Turing machine, or alternatively, a function executing on all inputs in parallel.

So maybe it should really be P and NDP.

keepamovin
0 replies
6d8h

That's a good explanation. I didn't know that.

eru
0 replies
5d16h

or alternatively, a function executing on all inputs in parallel.

I like to explain non-determinism in terms of getting a hint, or having an (untrusted) cheatsheet in a test. Or always making lucky guesses (but you don't trust your guesses).

But as long as your parallel executions don't interact at all, the definitions are identical, I think.

openasocket
1 replies
5d20h

I always found that part odd. I’d assume you would want the problem you build your cryptosystem around to be NP-complete, since that would seem to put you on the firmest possible ground. And yet those are most likely not NP-complete, and I think the proposed post-quantum systems aren’t NP-complete either.

Maybe being NP-complete isn’t as important as I assume? Or maybe there’s something about NP-complete problems that makes them less amenable to being a valid cryptosystem?

eru
0 replies
5d17h

No crypto-problem is NP-complete. People tried that for a while, see https://en.wikipedia.org/wiki/Knapsack_cryptosystems but it didn't work.

Or maybe there’s something about NP-complete problems that make them less amenable to be a valid crypto system?

To simplify a bit, the problem is that to work as a cryptosystem your particular problem needs to be both in NP and in co-NP. And we know of no problem that is both NP-complete and in co-NP; it's widely conjectured that there is no such problem. See https://en.wikipedia.org/wiki/Co-NP (that page even mentions integer factorisation).

That's why you can't just take the NP-complete problem itself as a basis for your cryptosystem; you have to pick some subset of instances that's also in co-NP. And apparently it's almost impossible for us to pick such a subset while still having the instances be hard to solve on average.

somat
1 replies
5d5h

Just remember that one-time pads are not only one of the simplest encryption schemes, they are also provably secure in any quantum computing regime. It's a shame about the key transport problem.
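The whole scheme fits in a few lines; here's a minimal sketch (my illustration, not from the thread). XOR the message with a truly random key of equal length; decryption is the same XOR. As long as the key is random, secret, and never reused, no amount of computing power, quantum or classical, helps.

```python
import secrets

def otp_xor(data: bytes, key: bytes) -> bytes:
    """One-time pad: XOR with a key as long as the message."""
    assert len(key) == len(data), "key must be as long as the message"
    return bytes(d ^ k for d, k in zip(data, key))

message = b"attack at dawn"
key = secrets.token_bytes(len(message))   # the hard part: transporting this
ciphertext = otp_xor(message, key)
assert otp_xor(ciphertext, key) == message  # the same operation decrypts
```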

One of my favorite sci-fi MacGuffins was in the book "A Fire Upon the Deep", where they were space truckers with a cargo of one-time-pad keys.

notfed
0 replies
2d22h

This is also true of most symmetric cryptography algorithms. (The one-time pad is one of them; so is AES-256. It's usually more convenient to just use AES-256.)

The implied context of "post-quantum cryptography" is usually asymmetric cryptography.

karl_gluck
1 replies
6d15h

Check out the Winternitz One-Time Signature

Sphere10.com/articles/cryptography/pqc/wots

Signing many things with one identity is possible by precomputing a Merkle tree, but this takes time and the signatures get big.
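For intuition, here's a sketch of Lamport one-time signatures, the simpler cousin of WOTS (my own illustration, not from the linked article). Security rests only on the hash function's preimage resistance, which is why these schemes are favoured for post-quantum signing; each key pair must sign exactly one message.

```python
import hashlib
import secrets

def H(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

def keygen():
    # Two random preimages per message-digest bit; the public key is their hashes.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def sign(msg: bytes, sk):
    digest = H(msg)
    bits = [(digest[i // 8] >> (i % 8)) & 1 for i in range(256)]
    # Reveal one preimage per bit of the digest.
    return [sk[i][bit] for i, bit in enumerate(bits)]

def verify(msg: bytes, sig, pk) -> bool:
    digest = H(msg)
    bits = [(digest[i // 8] >> (i % 8)) & 1 for i in range(256)]
    return all(H(s) == pk[i][bit] for i, (s, bit) in enumerate(zip(sig, bits)))

sk, pk = keygen()
sig = sign(b"hello", sk)
assert verify(b"hello", sig, pk)
assert not verify(b"goodbye", sig, pk)
```

Note the cost the comment alludes to: one key pair per message, which is exactly why you need a Merkle tree over many public keys to sign more than once with one identity.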

Vecr
1 replies
6d15h

https://en.wikipedia.org/wiki/McEliece_cryptosystem

Yes, it's reasonably fast even at very long lengths, but the main problem is the very long lengths. Code-based, not hash-based, though.

Edit: not very fast to generate a key though. It's mostly used for non-ephemeral stuff.

amluto
0 replies
6d14h

The new algorithm purports to solve LWE with certain choices of parameters. LWE is the problem of solving a linear system of equations over a finite ring, where each equation has an additive error from a certain distribution.
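A toy instance makes the LWE description above concrete (my illustration, with hypothetical toy parameters, nowhere near secure): given A and b = A·s + e (mod q) with small random errors e, recover s. Without the errors this is Gaussian elimination; with them it becomes the hard lattice problem that Kyber and friends are built on.

```python
import random

q, n, m = 97, 4, 8                 # toy modulus and dimensions
random.seed(0)

s = [random.randrange(q) for _ in range(n)]                    # secret
A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
e = [random.choice([-1, 0, 1]) for _ in range(m)]              # small errors
b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q for i in range(m)]

# Someone who knows s can check each equation is off by only a small error;
# the attacker's job is to find such an s from (A, b) alone.
for i in range(m):
    err = (b[i] - sum(A[i][j] * s[j] for j in range(n))) % q
    assert err in (0, 1, q - 1)    # i.e. the error is -1, 0, or +1 mod q
```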

McEliece has a public key that is a general linear code. A code is a bunch of linear equations constraining codewords, codewords are vectors over a finite field, and decoding a code is solving those equations subject to errors from a given distribution. Sound familiar?

They’re not the same problem, and the distributions are different in rather fundamental ways (which may or may not make a difference), but they are quite related. I would not move my eggs to the McEliece basket right now.

Hash-based signatures sound as safe as ever.

ihm
0 replies
6d15h

There are a bunch of hash based signature schemes, e.g., SPHINCS https://sphincs.org/

glitchc
7 replies
6d20h

Let's wait for the dust to settle to see if Chen has indeed broken the Shortest Vector Problem for lattices. Bold claims require strong evidence.

stoniejohnson
2 replies
6d20h

Extraordinary claims require extraordinary evidence!

(my favorite maxim from 2012 era atheism debates on youtube)

But yeah I totally agree. Putting the cart before the horse here with the outlined consequences probably isn't smart, but to be fair, I haven't (and probably couldn't) read the paper.

ecosystem
0 replies
6d12h

"Bananas: the atheist's worst nightmare"?

vitus
1 replies
6d20h

I was under the impression that the Shortest Vector Problem was NP hard (at least, in some cases). If this works even in those scenarios, that's an even bolder claim by far.

pbsd
0 replies
6d19h

SVP is NP-hard only for approximation factors much smaller than this algorithm reaches. This algorithm solves approximation factors of at best O(n^4.5), but NP-hardness is only shown for approximation factors well below n^(1/2). See Figure 1 on page 2 of [1] for a breakdown of the hardness of various approximation factors.

[1] https://cims.nyu.edu/~regev/papers/cvpconp.pdf

gradschoolfail
0 replies
6d18h

As pointed out by archgoon (and also pbsd), the problem is broken only for specific parameters, somewhat akin to how saying a problem is NP-hard doesn't mean all instances of the problem are hard. But in this case, for those parameters it isn't even NP-hard.

archgoon
0 replies
6d19h

Chen has not claimed to have broken SVP in general, only in certain situations. This is a better LLL, not a polynomial-hierarchy overturning.

faitswulff
7 replies
6d20h

It seems like the post-quantum algorithm that Signal selected [0] involves lattices [1] somehow:

Kyber is an IND-CCA2-secure key encapsulation mechanism (KEM), whose security is based on the hardness of solving the learning-with-errors (LWE) problem over module lattices.

Curious to see if Chen's work will eventually lead to Signal selecting a different algorithm.

[0]: https://signal.org/blog/pqxdh/

[1]: https://pq-crystals.org/kyber/

tptacek
3 replies
6d17h

More or less all of the "serious", actually deployed PQC schemes involve lattices, going back before the competition.

tptacek
1 replies
6d16h

Some helpful context is that lattice-based cryptography was a contender even before PQC became a thing (NTRU being the obvious example).

Really the only point I'm trying to make here is that there's nothing eyebrow-raising about systems using lattice crypto; after IFP/FFDLP stuff like RSA and ECDLP, lattices are maybe the next most mainstream approach to constructing asymmetric systems.

pclmulqdq
0 replies
4d2h

I'm late to the party, but McEliece codes have also been around for a very long time, predating AES by a fair margin. The biggest problem with them is that the public keys are gigantic and the private keys are very large - even bigger than the keys used in lattice-based cryptosystems. This has caused them to always be a sort of fringe form of cryptography.

The good part is that McEliece is based on a problem proven to be NP-hard (decoding a general linear code), so a polynomial-time algorithm for the general problem would imply P = NP.

contact9879
2 replies
6d19h

Green explicitly mentions Kyber in his post.

NIST-approved schemes based on lattice problems include Kyber and Dilithium (which I wrote about recently.)

Chen’s algorithm does not immediately apply to the recently-standardized NIST algorithms such as Kyber or Dilithium.

And it's not just Signal. Apple's new iMessage protocol, PQ3, uses Kyber, too. [1] Most deployments of PQ-crypto that I know of have used Kyber.

[1] https://security.apple.com/blog/imessage-pq3/

glitchc
1 replies
6d18h

deleted

jameswryan
0 replies
6d18h

Rainbow is not a KEM, but a signature scheme.

anonymous-panda
7 replies
6d15h

I find the panic over the potential threat of quantum computers quite amusing when the machines are still extremely theoretical: all existing machines are slower than classical ones, and it's not even clear they can scale to the required number of qubits.

There’s nowhere near the same urgency and significantly more denial over global warming. A bit apples and oranges but climate models have a better track record, are wildly conservative (ie our present is much worse than the past climate models predicted) and it’s a real problem we know exists and is a bullet train headed our way.

CobrastanJorji
3 replies
6d15h

I totally agree that climate change is a far more serious problem than quantum computing, but we do actually spend quite a bit on climate change, if not nearly as much as the problem warrants.

Depending on how you measure it, we theoretically spend about a trillion bucks a year worldwide on climate change ( https://www.climatepolicyinitiative.org/publication/global-l... ) and maybe a couple billion bucks a year on quantum computers.

eru
2 replies
6d14h

Measures to combat climate change are weird, and also politicised in weird ways.

Eg the US administration likes to pretend that climate change is a top priority, but then complains when Chinese taxpayers generously subsidise solar cells and electric cars for American consumers.

mensetmanusman
1 replies
6d6h

These technologies are manufactured with stricter environmental controls when done in the US versus China. The picture isn't as simple as it may seem.

eru
0 replies
5d17h

How does this complicate the picture?

The environmental controls you are talking about mostly protect the local environment. Protecting the local environment is great, but crudely speaking, the global climate doesn't much care about whether you dump some toxic waste in some part of China.

What I hear in complaints about the exports from the Americans is that they are worried about good old jobs for good old American boys and girls (preferably in a union that is a strong voting block one way or another). So I would take the Americans at their word here.

Yet again, if the local environment in China were a priority of the administration that outweighed concern about climate change, you would see a very different policy, instead of just tariffs and other restrictions on imports of some Chinese goods. You can probably come up with your own examples.

ziddoap
0 replies
6d14h

I think this may be your technology bubble showing.

Global warming is talked about on the radio, TV, movies. It's sung about in songs. There's several conferences, widely-attended protests, stickers on appliances, tax initiatives, etc.

A few comments on obscure sites like HN can hardly be called a panic. It is silly to suggest that there is more urgency about post-quantum attacks on crypto than global warming.

jon_richards
0 replies
6d15h

When global warming hits, the people who benefited from ignoring it will be best off. When quantum computers hit, the people benefiting the most right now will be the worst off as all their communications from this era are decrypted.

SJC_Hacker
0 replies
6d13h

The slow poison is a lot less alarming than the sharp knife.

givemeethekeys
5 replies
6d17h

If the research is true, will it make it possible for someone to easily steal my bitcoin?

peddling-brink
1 replies
6d17h

Only if you give them the keys.

harryp_peng
0 replies
6d14h

Dude, are you nuts? The whole point of all that was to steal it without the keys.

tptacek
0 replies
6d16h

Yes, if they have a sufficiently powerful quantum computer.

mensetmanusman
0 replies
6d6h

Sim swaps are easier

fsmv
0 replies
6d13h

This research has no impact on Bitcoin; it was already broken post-quantum by Shor's algorithm. They will simply change the signature algorithm when quantum computers become available, so it should be fine.

enthdegree
2 replies
6d19h

The Wikipedia link on "lattice" points to the wrong article. "Lattice-based cryptography" usually means group-sense lattices, not order-sense lattices. Lattice groups are even directly involved in the rest of the article.

matthewdgreen
0 replies
6d18h

Thanks, it's fixed. That's what I get for quickly adding links.

Upvoter33
0 replies
6d18h

fix it! (please)

mvkel
1 replies
6d12h

Quantum Luddite here.

So much of quantum computing is theory, and so much of current crypto is applied.

Is it realistic to think the first applied quantum computers could quickly get us to a P=NP solution, rendering all crypto effectively irrelevant?

foota
0 replies
6d12h

Quantum computers are only tangentially related to P=NP, in that some problems in NP may have polynomial-time solutions on quantum computers, but this says nothing about the rest of them. The class of problems quantum computers can solve in polynomial time is known as BQP. It is possible, but unknown, that all problems in NP belong to BQP, which would imply that a sufficiently large quantum computer could solve all problems in NP in polynomial time; but even then it would still be possible that P does not equal NP, since P and NP are about classical computers.

jonahx
0 replies
6d20h

I appreciate short, accessible writeups on subjects like these.