So how many gates are we talking to factor some "cryptographically useful" number? Is there some pathway that makes quantum computers useful this century?
lisper · 1h ago
> So how many gates are we talking to factor some "cryptographically useful" number?
That is a hard question to answer for two reasons. First, there is no bright line that delineates "cryptographically useful". And second, the exact design of a QC that could do such a calculation is not yet known. It's kind of like trying to estimate how many traditional gates would be needed to build a "semantically useful" neural network back in 1985.
But the answer is almost certainly in the millions.
[UPDATE] There is a third reason this is hard to predict: for quantum error correction, there is a tradeoff between the error rate in the raw qubit and the number of gates needed to build a reliable error-corrected virtual qubit. The lower the error rate in the raw qubit, the fewer gates are needed. And there is no way to know at this point what kind of raw error rates can be achieved.
> Is there some pathway that makes quantum computers useful this century?
This century has 75 years left in it, and that is an eternity in tech-time. 75 years ago the state of the art in classical computers was (I'll be generous here) the Univac [1]. Figuring out how much less powerful it was than a modern computer makes an interesting exercise, especially if you do it in terms of ops/watt. I haven't done the math, but it's many, many, many orders of magnitude. If the same progress can be achieved in quantum computing, then pre-quantum encryption is definitely toast by 2100. And it pretty much took only one breakthrough, the transistor, to achieve the improvement in classical computing that we enjoy today. We still don't have the equivalent of that for QC, but who knows when or if it will happen. Everything seems impossible until someone figures it out for the first time.

---

[1] https://en.wikipedia.org/wiki/UNIVAC_I#Technical_description
>> Is there some pathway that makes quantum computers useful this century?
> This century has 75 years left in it, and that is an eternity in tech-time.
As a comparison, we went from first heavier than air flight to man walking on the moon in only 66 years.
nabla9 · 1h ago
For RSA-4096: on the order of 10^7 qubits with a 10^-4 error rate.
You can already do useful and valuable quantum chemistry calculations with a few hundred qubits at that error rate, while post-quantum algorithms are becoming more common every day, removing the incentive to build crypto-cracking quantum computers.
I think the quantum technology will advance fastest in directions that are not easy to use in cryptography.
As a layman the pathway seems to exist behind multiple massive materials science breakthroughs
andrewflnr · 1h ago
How is it that we need to build logical qubits out of physical qubits for error correction purposes, but then still need to blow up our logical qubit counts for error correction purposes, again? It seems like there's something missing from this explanation, at least.
tripplyons · 1h ago
Not sure about the gate count, but if you look at the number of logical qubits required, we are still very far away from factoring numbers that traditional computing has already factored like the 829-bit RSA-250 number.
Legend2440 · 1h ago
Realistically, you want millions to billions of qubits to compete with classical computers that already have trillions of transistors.
jameshart · 27m ago
Ah - this helped me understand the numbers in quantum computing a little more clearly. I had been under the impression (based on my naive interpretation of the naming) that the number of qubits in a quantum processor might be analogous to the number of bits of register state in a regular CPU; that qubits should be thought of as more analogous to transistors, or maybe even gates, makes it a little clearer why you need so many more qubits to perform more complex operations.
AceJohnny2 · 3h ago
What does this mean about the size (and thus feasibility) of a circuit required to factor a cryptographically interesting number, say, to be generous, RSA1024?
Davidzheng · 2h ago
Off topic, but are cryptographers convinced that RSA-1024 is infeasible to factor even on the new gigawatt data centers? I gather that the fastest known algorithms are still too slow to factor it in reasonable time. But is the consensus that there will not be improvements to these algorithms in the near future?
rwmj · 2h ago
Number Field Sieves are still the best method, and the techniques are three or more decades old with only incremental improvements. (Of course there might be an incredible breakthrough tomorrow.)
tiahura · 2h ago
best published method
consp · 1h ago
Are the bitcoins in the first wallets gone? No? Then, absent any irrefutable evidence to the contrary, I'll assume it's still the best method.
tripplyons · 1h ago
Bitcoin uses ECDSA to sign transactions, not RSA.
In addition, selling information to a government on how to break either system would be more valuable than the amount of bitcoin you would be able to sell before exchanges stop accepting deposits or the price crashes.
aleph_minus_one · 1h ago
> In addition, selling information to a government on how to break either system would be more valuable
Honest question because one can find such claims very often on forums like HN:
Is there really a feasible way for some "lone hacker" to sell such information to a government and become insanely rich?
I know that people who apparently have deep knowledge of how exploit markets work have claimed on HN that "if you have to ask how/where to sell your exploit (i.e. you lack the respective contacts), you are very likely not able to".
This latter observation seems a lot more plausible to me than the claim, often found on HN, that some "lone individual" who found a way to break ECDSA or RSA would be able to monetize it by selling it to some government.
close04 · 1h ago
If a government knows you have such information they’ll take it not buy it.
So your best bet would probably be to try to sell as many BTC as possible then give away the solution for free to your/a government.
capitainenemo · 1h ago
Well, this discussion is about prime number factorisation, and bitcoins use elliptic curve...
littlestymaar · 1h ago
True, we can never know what state actors know that we don't, and my cryptography professor at university taught us that NSA likely had 20 years of mathematical advance over the academic crypto community.
That being said, NFS is almost thirty years old so maybe the NSA doesn't have anything better still.
tripplyons · 1h ago
I've seen pretty credible evidence that factoring large semiprime numbers is effectively a solved problem, even without considering quantum computing or gigawatt-scale computing. I'm not able to share specifics, but I would personally not trust RSA.
close04 · 52m ago
People who have seen this evidence don’t go around on the internet bragging they’ve seen this evidence.
ginko · 1h ago
It recently occurred to me that now would be the best time ever for state actors to build out massive data centers without anyone noticing.
tripplyons · 1h ago
I could see reasons for them to build datacenters for AI or for collecting encrypted messages to decrypt later, but not for brute-force attacks on encrypted messages.
add-sub-mul-div · 3h ago
"(Quick aside: the amount of optimization that has gone into this factoring-21 circuit is probably unrepresentative of what would be possible when factoring big numbers. I think a more plausible amount of optimization would produce a circuit with 500x the cost of the factoring-15 circuit… but a 100x overhead is sufficient to make my point. Regardless, special thanks to Noah Shutty for running expensive computer searches to find the conditional-multiplication-by-4-mod-21 subroutine used by this circuit.)"
NooneAtAll3 · 53m ago
> I think a more plausible amount of optimization would produce a circuit with 500x the cost of the factoring-15 circuit
I don't get this part
If author already produced "115x", how can optimizations make it worse?
tsimionescu · 5m ago
The point was that the amount of optimization work that went into making the circuit for factoring 21 only ~100x larger than the one for factoring 15 is not feasible for larger numbers. So, the general rule should be that you need ~500x more gates for a similar increase in the number to be factored, not just ~100x.
turtletontine · 31m ago
I think the idea is the minimal implementation will be unstable and unreliable. I don’t know the details, but there’s much work and thought on quantum error correcting qubits - where you hook up N qubits in a clever way to function as one very stable qubit. Terms such as “decoherence time” make appearances. You can imagine this quickly expands into an awful lot of qubits.
tromp · 34m ago
The 115x was obtained by doing a (less plausible) large amount of optimization...
It’s worth noting that the reason we are deploying PQ crypto is not that we are 100% convinced QC is coming soon. It may or may not be, depending on how development goes.
The goal of cryptography is to make something as close to theoretically unbreakable as possible. That means even theoretical vulnerabilities are taken seriously.
For ECC and RSA and related algorithms we have a theoretical and physically plausible pathway toward a practical machine that could break them. That means many cryptographers consider them theoretically broken even if such a machine does not exist and may not exist for a long time. The math works even if we can’t build it yet.
So it’s considered prudent to go ahead and upgrade now while no QC exists. That way if some major advance does arrive we are ready.
Nobody’s talking seriously about replacing SHA2, AES, ChaCha, etc because there is no physically plausible theoretically valid path to a machine that can break these in, say, less than many millions of years. AFAIK there is no proof that such a path does not exist but nobody has found one, hence they are considered unbroken.
Note that cryptography is not the only or even the most useful application of QC. Things like physical simulation of quantum systems, protein folding, machine learning, etc. could be more useful. Like digital computers, there’s probably a ton of uses we don’t know about because we need to tinker with the machine to figure them out.
leeoniya · 1h ago
> Things like physical simulation of quantum systems, protein folding, machine learning, etc. could be more useful
is there still more to do in protein folding after AlphaFold?
https://www.isomorphiclabs.com/articles/alphafold-3-predicts...

The predictions don't tell us anything about why the answer is what it is. There is probably important (useful) fundamental scientific knowledge in being able to know that vs. just being able to predict the result.
api · 1h ago
There’s a difference between good AI predictions and theoretically perfect QC computations. The AI estimates while the QC will give you the answer, full stop. The latter could be relied upon more strongly. It could also generate infinite training data to make much better models.
QC might be directly applicable to AI training too. It may be possible to compute the optimal model over a data set in linear time. It could allow training that is faster and consumes a tiny fraction of the energy current brute force methods need.
tsimionescu · 2m ago
Is there any known quantum exponential speedup for gradient descent?
tialaramex · 1h ago
For the symmetric cryptography (so obviously AES and ChaCha, but also in effect the SHA-2 family) we can hand wave the quantum attacks as halving key length by enabling a sort of meet-in-the-middle attack (this attack is why it was 3DES not 2DES when they strengthened DES). There's a lot of hand waving involved. Your real Quantum Computer won't in fact be equivalent cost to the non-quantum computer, or as fast, or as small, the speed-ups aren't quite halving, and so on. But it's enough to say OK, what if AES-128 was as weak as a hypothetical AES-64, and that's fine because we have AES-256 anyway.
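The "halved key length" hand-wave above can be put in rough numbers. A back-of-envelope sketch in Python, using the textbook Grover iteration count of about (pi/4)·sqrt(N); none of this models real hardware costs, clock speeds, or error correction overhead:

```python
import math

KEY_BITS = 128
N = 2 ** KEY_BITS                      # size of the AES-128 keyspace

# Expected classical brute-force guesses: about half the keyspace.
classical = N // 2

# Textbook Grover search: roughly (pi/4) * sqrt(N) oracle calls.
grover = round((math.pi / 4) * math.isqrt(N))

print(f"classical ~ 2^{classical.bit_length() - 1}")   # ~ 2^127 guesses
print(f"grover    ~ 2^{grover.bit_length() - 1}")      # ~ 2^63 oracle calls
```

Hence the rule of thumb that AES-128 degrades to roughly 64-bit security against an idealized quantum adversary, while AES-256 stays comfortably out of reach.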
However, the main focus is on Key Exchange. Why? Well, Key Exchange is the clever bit where we don't say our secrets out loud. Using a KEX two parties Alice and Bob agree a secret but neither of them utters it. Break that and you can learn the secret, which was used to encrypt everything else - for any conversation, including conversations which you recorded any time in the past, such as today.
If future bad guys did have a Quantum Computer, the Key Exchange lets them read existing conversations they've tapped but today can't read, whereas breaking, say, the signing algorithm wouldn't let them somehow go back in time and sign things now, because that's not how time works. So that's why the focus is on KEX. Once such a thing exists, or is clearly about to, it's important to solve a lot of other problems such as signing; but for KEX, waiting until then is already too late.
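The "two parties agree a secret but neither utters it" property is easy to see in classic finite-field Diffie-Hellman. A toy sketch, with a Mersenne prime modulus and base 3 chosen purely for illustration; real KEX uses vetted groups or elliptic curves, and this classical construction is exactly the kind of thing Shor's algorithm would break:

```python
import secrets

# Toy public parameters (illustrative only, not a vetted group).
p = 2**127 - 1   # a Mersenne prime
g = 3

# Each party picks a private exponent and says only g^x mod p out loud.
a = secrets.randbelow(p - 2) + 2          # Alice's secret, never transmitted
b = secrets.randbelow(p - 2) + 2          # Bob's secret, never transmitted
A, B = pow(g, a, p), pow(g, b, p)         # the only values on the wire

# Both sides arrive at the same shared secret without either uttering it.
alice_key = pow(B, a, p)
bob_key = pow(A, b, p)
assert alice_key == bob_key
```

An eavesdropper who records A and B today and later gets a discrete-log-breaking machine recovers a or b, and with it every session key derived from this exchange. That retroactive exposure is the "harvest now, decrypt later" threat described above.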
taway789aaa6 · 3h ago
> Third, notice that the only remaining multiplication is a multiplication by 4. Because 15 is one less than a power of 2, multiplying by 2 modulo 15 can be implemented using a circular shift. A multiplication by 4 is just two multiplications by 2, so it can also be implemented by a circular shift. This is a very rare property for a modular multiplication to have, and here it reduces what should be an expensive operation into a pair of conditional swaps.
> Aside: multiplication by 16 mod 21 is the inverse of multiplying by 4 mod 21, and the circuits are reversible, so multiplying by 16 uses the same number of Toffolis as multiplying by 4.
I couldn't really find anything explaining the significance of this. The only info I found said that "4 mod 21 = 4" (but I don't know if it was AI slop or not).
Is "multiplying by 4 mod 21" something distinct to quantum computing?
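The circular-shift property in the first quoted passage is purely classical modular arithmetic, and easy to verify directly (a quick Python check on 4-bit values):

```python
def rotl4(x: int) -> int:
    """Rotate a 4-bit value left by one position."""
    return ((x << 1) | (x >> 3)) & 0xF

# For every residue mod 15, doubling is exactly a 4-bit circular shift.
for x in range(15):
    assert rotl4(x) == (2 * x) % 15

# Multiplying by 4 is just two such shifts.
for x in range(15):
    assert rotl4(rotl4(x)) == (4 * x) % 15
```

It works precisely because 16 ≡ 1 (mod 15); 21 has no such relationship to a power of 2, which is part of why the factoring-21 circuit needs far more expensive multiplications.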
shiandow · 3h ago
It is phrasing mathematicians sometimes use. In this sentence 'mod 4' does not indicate an operator but denotes what number system you are in.
For instance the following are equivalent:
2 = 6 mod 4
6 = 2 mod 4
This 'mod 4' can also appear in parentheses or in some other way, but it must appear at the end. Like I said it is not an operator rather it denotes that the entire preceding statement takes place in the appropriate quotient space.
So it is not (multiplying by (4 mod 21)) but ((multiplying by 4) mod 21)
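In code, that "the entire statement lives in the quotient space" reading corresponds to checking a difference, not applying an operator (a small illustration; `congruent` is just a throwaway helper name):

```python
def congruent(a: int, b: int, m: int) -> bool:
    """a ≡ b (mod m): a and b differ by a multiple of m."""
    return (a - b) % m == 0

# Both readings of "2 = 6 mod 4" from the comment above hold,
# because congruence is symmetric:
assert congruent(2, 6, 4)
assert congruent(6, 2, 4)

# The % operator, by contrast, picks one canonical representative:
assert 6 % 4 == 2
assert 2 % 4 == 2
```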
AnotherGoodName · 2h ago
Fractions under any modulus are actually representable as whole numbers (provided the denominator doesn’t share factors with the modulus).
For example, under mod 21 a half can actually be represented by 11. Try it. Multiply any even number by 11 and you’ll see you’ve halved it.
Take any number that’s a multiple of 4 and multiply it by 16 under mod 21. You now have that number divided by 4.
Etc.
Absolutely nothing to do with quantum computers.
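Python's three-argument `pow` computes exactly these inverse representatives (`pow(x, -1, m)` is available since Python 3.8 and requires gcd(x, m) == 1):

```python
# "One half" mod 21 is the modular inverse of 2.
half = pow(2, -1, 21)
assert half == 11
assert (11 * 6) % 21 == 3          # "halving" 6 gives 3

# "One quarter" mod 21 is the inverse of 4, which is 16.
quarter = pow(4, -1, 21)
assert quarter == 16
assert (16 * 8) % 21 == 2          # "quartering" 8 gives 2

# This is also why multiplying by 16 undoes multiplying by 4
# in the circuit aside quoted earlier:
assert (4 * 16) % 21 == 1
```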
I mean 4 is equivalent to 4 mod 21. That part's not AI slop.
JohnKemeny · 3h ago
I think, in fact, for every prime number p at least 5, 4 mod p is (congruent to) 4.
oh_my_goodness · 2h ago
Even without the restriction to primes, that feels like a pretty good guess!
freehorse · 2h ago
Or for less than 5.
oh_my_goodness · 1h ago
lol good point.
alchemist1e9 · 4h ago
And these are the same quantum computers that will eventually break elliptic curve cryptography? Now I’m very confused.
griffzhowl · 3h ago
If we can build a machine with enough coherent qubits, then it'll be able to break ECC.
As it turns out, that's a big if, but the bigness of the if is about hardware implementation. The theory behind it is just basic quantum mechanics
oh_my_goodness · 2h ago
Article: it takes 2405 entangling gates to factor the number 21.
wongarsu · 3h ago
In many applications you are concerned about data you encrypt today still being secure 20 years from now. Especially if your threat model includes a malicious actor holding on to data in hopes of decrypting it later.
From the article it sounds like we will still be safe for 20+ years. On the other hand 15 was just extraordinarily easy, progress after 21 will be much quicker. And we never know which breakthroughs might come in the next decades that speed up progress.
oh_my_goodness · 3h ago
"progress after 21 will be much quicker"
Can you provide a quick verification for that?
wongarsu · 2h ago
The article lists three reasons why 21 is so much harder than 15. Mostly that with 15 most of the required conditional multiplications are multiplications by 1 and the remaining multiplication can be done with cheap shifts because it's a multiplication by a power of 2 modulo 15 (which is one less than a power of two).
But 22 and 24 are in the same boat as 21 here. All three require conditional multiplications by factors other than one, and all three are not one less than a power of 2. You need slightly more multiplications (and thus more gates) as the numbers get larger, but that only grows linearly. Maybe the conditional multiplications required get slightly more expensive to implement, but I wouldn't expect a 100x cost blowup from that. Error correction is still an issue, potentially making a linear complexity increase quadratic, but qubit counts in quantum computers are also increasing at an exponential rate.
oh_my_goodness · 2h ago
24 has 3 as a factor. 3 is one less than a power of 2.
freehorse · 1h ago
Well, n=21 too, but the solution for n=15 used that 15 is one less than a power of 2, not its divisors, because we are working modulo n.
oh_my_goodness · 1h ago
Thanks. I don't understand quantum computing at all.
oh_my_goodness · 2h ago
23 and 29 are prime.
wongarsu · 2h ago
That's what I get for not enough coffee. Same holds for 22 and 24 though
bArray · 4h ago
I think the idea is that it doesn’t matter if it takes billions of gates to achieve, the point is that it can do it very fast. If we thought a table-sized FPGA could do it, a state actor would most definitely build one.
lazide · 3h ago
theoretically
The practical problem is that ‘noise’ between gates seems to increase exponentially, so practically it may actually be impossible to construct anything with more than a handful of gates for the foreseeable (possibly indefinite?) future.
It’s essentially the crypto version of Fusion.
EthanHeilman · 3h ago
Quantum error correction addresses this problem and we now have tech demos showing that we can build scalable QCs with surface codes [0].

[0] https://scottaaronson.blog/?p=8525
This, like all other purported advancements in quantum error correction, is a meme with zero practical impact.
lazide · 3h ago
Cool, so we should totally be able to factor 21 (or larger numbers)…. When?
EthanHeilman · 1h ago
You brought up gate noise, there has been quite a bit of progress on that problem.
> so we should totally be able to factor 21 (or larger numbers)…. When?
Just because we solve one problem doesn't imply all the problems in QC are instantly solved. I guess it would if you assume noise is the only problem and that once it is solved the engineering is trivial. That is not the case. Even assuming all foundational problems have been solved, figuring out how to actually engineer, and mass-produce, large numbers of gates will take a while.
As the article pointed out, going from 15 to 21 requires a 100x increase in gates.
As the article that you posted under says:
"Because of the large cost of quantum factoring numbers (that aren’t 15), factoring isn’t yet a good benchmark for tracking the progress of quantum computers. If you want to stay abreast of progress in quantum computing, you should be paying attention to the arrival of quantum error correction (such as surface codes getting more reliable as their size is increased) and to architectures solving core scaling challenges (such as lost neutral atoms being continuously replaced)."
privatelypublic · 4h ago
Hasn't classical already severely crippled ECC because of some mathematical Assumptions that somebody came back in 2022 and Proved were wrong?
cwmma · 2h ago
I believe you are thinking of "Supersingular Isogeny Diffie–Hellman key exchange" (SIDH) and the related SIKE scheme, a post-quantum encryption algorithm that was spectacularly broken a couple of years ago. The math involves elliptic curves, but it's different from the elliptic curve cryptography used in your browser.
kevindamm · 3h ago
Which assumptions? ECDLP is still considered computationally hard, and ECC considered secure. There are invalid curve attacks and small subgroup attacks but that's a problem with key selection, not a fundamental problem with ECC.
Do you have a citation?
markusde · 3h ago
Could you link to any more information about this?
bitexploder · 3h ago
Not in general, no. It is still secure and used. There are of course attacks, but none of them completely break it.
Mistletoe · 3h ago
This is the reality a million clickbait fearmongering articles won’t tell you. And pr machines for quantum computing companies won’t either.
z3phyr · 3h ago
"Is this what you can conjure Saruman?"
I have a strong belief that new mathematical tools and methods can be developed that can make it "easy" to break a lot of modern cryptography primitives without ever using a quantum computer.
santiagobasulto · 3h ago
The potential is there, we haven't made it yet. It's the same with AI, AGI and all that. Imagine if you'd read a response from GPT-2 back in 2019, you'd also be like "and these are the same models that will eventually give us AGI".
heyjamesknight · 3h ago
Not a great analogy, since there’s zero chance the kinds of model involved in GPT-2 will give us AGI.
ACCount37 · 3h ago
Zero? Aren't you a little bit overconfident on that?
Transformer LLMs already gave us the most general AI as of yet, by far - and they keep getting developed further, with a number of recent breakthroughs and milestones.
No comments yet
fmbb · 3h ago
Imagine if you'd read a response from GPT-5 in 2025, you'd also be like "and these are the same models that will eventually give us AGI".
raverbashing · 2h ago
My belief in achieving actual quantum computing goes down as the noise in qubits goes up.
Digital computers were much easier than that: make it smaller, make more of them, and you're set.
Quantum computers' complexity goes up with ~ n^2 (or possibly ~ e^n), where n is the number of qubits.
At the same time, things like d-wave might be the most 'quantum' we might get in the practical sense
analog31 · 2h ago
It turns out that error correction was easy on digital computers, and was essentially a solved problem early in their development. In fact, "noise immunity" is arguably the defining feature of a digital system. And error correction can happen at each gate, since there's no reason to propagate an indeterminate number.