
> Our quantum volume data is taken on our commercial systems interwoven with customer jobs.

Are there use cases for QC other than ruining asymmetric encryption?



Anything involving simulation of quantum mechanical systems. Think materials, molecules, nuclei, general field theories, etc.
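A quick way to see why simulation is the natural fit: the memory needed just to store a full n-qubit state vector on classical hardware grows as 2^n. A back-of-the-envelope Python sketch, assuming 16 bytes per complex amplitude (numbers are illustrative):

    # Classical memory needed to hold a full n-qubit state vector.
    # Assumes complex128 amplitudes (16 bytes each).
    for n in (20, 40, 60):
        amplitudes = 2 ** n
        gib = amplitudes * 16 / 2 ** 30
        print(f"{n} qubits: {amplitudes:.2e} amplitudes, ~{gib:.2e} GiB")

At 40 qubits you're already in the tens of TiB; a quantum device holds that state natively.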

The fixation on asymmetric encryption vulnerability is overblown at this point. The near-completion of the post-quantum cryptographic standards, and the still long road to fault tolerance, make this essentially a non-issue for anyone who hasn't been living under a rock for the past 20 years.


People encrypting stuff 10 years ago did so under the impression that it would be safe far longer.

People encrypting stuff today might not have an option to make it resilient.

How is that not a problem?


The migration to post-quantum cryptosystems is a different issue than standardizing one in the first place, and I'm not going to pretend to know anything about the organizational cat-herding needed to achieve such a migration. The US government is beating the drum to have the migration of critical systems completed by 2030, and there are plenty of IT consulting firms who will happily sell post-quantum migration services to get ahead of that deadline.
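For what it's worth, the common migration pattern right now is hybrid key exchange: combine a classical (e.g. X25519) shared secret with a post-quantum KEM (e.g. ML-KEM) shared secret, so an attacker has to break both. A minimal Python sketch of just the combiner step, with random bytes standing in as placeholders for the real ECDH and KEM outputs:

    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    classical_secret = os.urandom(32)  # placeholder for an X25519 shared secret
    pq_secret = os.urandom(32)         # placeholder for an ML-KEM shared secret

    # Concatenate both secrets and derive the session key with HKDF.
    # The result stays secure unless an attacker recovers *both* inputs.
    session_key = HKDF(
        algorithm=hashes.SHA256(),
        length=32,
        salt=None,
        info=b"hybrid-kex-demo",
    ).derive(classical_secret + pq_secret)
    print(session_key.hex())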

If you're worried about pre-harvested ciphertext that nation-states are sitting on, waiting for the right hardware to come along to decrypt it, I personally have no solution. The genie's probably out of the bottle on that one. Hopefully the affected parties took, or are taking, mitigating actions.


Are you saying they are primarily a research tool?


In their present state, yes. These things are basically sophisticated physics experiments with a cloud API. This is fine for research, education and training, but currently insufficient for problems of economic value.


Thanks for answering. Of course, the result is more questions.

I suppose the end goal is to have the overall material report a status on its environment and stresses.

Are they relatively fast?

Are they on the same time frame as fusion?


In principle, the advantage of quantum computers lies not in speed or throughput, but in algorithmic scaling. For example, the well-known Grover's search algorithm scales as O(sqrt(N)), as opposed to O(N) for classical linear search. Similarly, Shor's factoring algorithm delivers a superpolynomial speedup, taking a problem that is nominally exponential on classical hardware down to O(n^k). Problems related to physics and chemistry simulation fall in various places along that quadratic-to-exponential speedup spectrum as well.
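To put rough numbers on the Grover case, here's a short Python sketch comparing worst-case classical queries to the ~(pi/4)*sqrt(N) iterations Grover needs (illustrative only):

    import math

    # Worst-case classical linear-search queries vs. Grover iterations.
    for exp in (6, 9, 12):  # N = 10^6, 10^9, 10^12
        N = 10 ** exp
        classical = N
        grover = math.ceil((math.pi / 4) * math.sqrt(N))
        print(f"N = 10^{exp}: classical ~{classical:.1e} queries, "
              f"Grover ~{grover:.1e} iterations")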

In practice, there's expected to be a larger constant overhead than with classical computing hardware, since error correction would need to be performed continuously during operation, and the physical operations of addressing qubits are inherently slower than classical CPU or GPU operations. There have been a few publications [1,2] asserting that this overhead essentially negates any quadratic scaling benefits, and that only cubic or better speedups would deliver any practical advantage. This assumes no further improvement in operation times or error correction overhead.
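A back-of-the-envelope way to see the problem: if each error-corrected logical operation runs a factor of s slower than a classical operation, a quadratic speedup only wins once sqrt(N)*s < N, i.e. N > s^2. A quick sketch (the slowdown factors are assumptions, not measurements):

    # Crossover size for a quadratic (Grover-style) speedup under a
    # per-operation slowdown factor s: quantum wins when sqrt(N)*s < N,
    # i.e. N > s**2. Slowdown values here are illustrative assumptions.
    for s in (1e3, 1e6, 1e9):
        print(f"slowdown {s:.0e}x -> quadratic speedup pays off only for N > {s**2:.0e}")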

As far as timeframe, it's anyone's guess. Error correction will be needed to get beyond these research prototypes, and I think we'll have a better sense over the next 2-5 years as to how hard that will be to engineer.

1. https://arxiv.org/abs/2011.04149

2. https://arxiv.org/abs/2307.00523


Thank you. It seems computing architecture might be driving toward the qubit's advantage (going really wide).

Your explanation nicely laid out the dynamics of development. I'll look over the papers you linked.



