So IBM introduced a quantum computer capability measure (Quantum Volume) that is more meaningful than counting qubits, which aren't comparable across architectures. But now this post sounds suspiciously like this company is goodharting[1] the new measure. They mention that there are criticisms of Quantum Volume, but they don't show their results for any other benchmark. I imagine it is way easier to goodhart a single benchmark than several.
TL;DR: Instead of using a single measure, just give all the details on the machine as clearly as possible:
> How many qubits do you have? With what coherence times? With what connectivity? What are the 1- and 2-qubit gate fidelities? What depth of circuit can you do? What resources do the standard classical algorithms need to simulate your system? Most importantly: what’s the main drawback of your system, the spec that’s the worst, the one you most need to improve? What prevents you from having a scalable quantum computer right now? And are you going to tell me, or will you make me scour Appendix III.B in your paper, or worse yet, ask one of your competitors?
I actually don't understand most of this quantum stuff. When they say 16 qubits, are these simulated qubits or real ones?
Aren't real qubits impossible to make currently because of the noise problem? When I was doing my master's, people were talking about replacing the heavily doped channel of a silicon transistor with a carbon nanotube. It turns out the hardest problem was making a decent connection between the nanotube and the metal. I don't think that problem has been solved yet.
I am guessing we'll see functional carbon nanotube silicon transistors at least a decade before we see real qubits? After all, fabrication processes must be similar at this scale.
There are real qubits, and they suffer from all the reliability issues you're alluding to. There are also error correction schemes that try to salvage, from a bunch of real-world, error-prone physical qubits, the computational equivalent of error-free qubits. The industry standard (such as it is) when reporting qubit counts is to report your logical qubit equivalent, because that's what counts for computation. If your chip has 500 qubits with a ruinous error rate that requires a god-tier error correction algorithm to achieve a functional result equivalent to a 12-qubit computer, you should report that as "We made a 12-qubit computer".
A qubit is any addressable 2-level quantum system. Think spins, oscillation modes, energy levels, polarization, etc. All of these have been implemented and demonstrated to varying degrees, with superconductors and trapped ions (this article) being the farthest along.
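To make that concrete, here is a minimal numerical sketch of the 2-level-system idea; the amplitudes are an arbitrary example and nothing here is specific to any hardware platform:

```python
import numpy as np

# A qubit state is a normalized pair of complex amplitudes over the basis
# states |0> and |1>; a measurement returns 0 or 1 with probability
# |amplitude|^2 (the Born rule).
state = np.array([1, 1j]) / np.sqrt(2)    # equal superposition of |0> and |1>
probs = np.abs(state) ** 2                # [0.5, 0.5]
print(np.random.choice([0, 1], p=probs))  # one simulated measurement
```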
Noise is indeed a problem, and right now these systems can’t execute more than a few hundred operations at most before decohering to the point of randomness. There is an error-rate threshold (roughly 1e-3 to 1e-4) below which error correction becomes possible, and the field has been hovering around that threshold for the past 3 years or so. I think we’ll know within the next 5 years whether or not it’s possible to engineer these error correction schemes in a scalable way.
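For intuition on why that threshold matters, here is a back-of-envelope sketch using the standard scaling heuristic for distance-d error-correcting codes; the constants (A = 0.1, threshold = 1e-3) are illustrative assumptions, not measured values:

```python
# Heuristic: p_logical ~ A * (p_physical / p_threshold)^((d+1)/2)
def logical_error_rate(p_physical, d, p_th=1e-3, A=0.1):
    return A * (p_physical / p_th) ** ((d + 1) // 2)

for p in (1e-2, 1e-4):  # above and below the assumed threshold
    print(p, [f"{logical_error_rate(p, d):.0e}" for d in (3, 5, 7)])
# Below threshold, raising the code distance suppresses logical errors
# exponentially; above it, adding qubits only makes things worse.
```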
What real problems have quantum computers solved? I would assume they can solve suitably trivial examples with their limited capability in a way that is demonstrably superior to traditional computing.
None.
We're in the stage of early research, still trying to figure out how practical it is to build useful QCs at all. Anyone claiming otherwise is trying to fool people.
Maybe QCs will have useful applications in the future. Maybe not. If so, then it's decades away.
Quantinuum is already tackling NLP question-answering tasks that run on quantum hardware. The quantum categorial crowd is adamant they'll surpass ChatGPT (the big promise, iirc, is that it will scale linearly with context length, whereas GPT scales quadratically). For instance see this tweet by Quantinuum's head of research and this burgeoning field's main rock:
Coecke went from supervising dozens of theses at the Oxford Quantum (logic) Group to preparing summer camps for high-school pupils this year. It's also taking off socially and academically, and watching the field evolve, I think we might have a quantum equivalent of ChatGPT before, or at the same time as, we get implementations of Shor's algorithm (source: my own intuition).
The problem you'll run into for any application of quantum computing to large language models is that quantum computers just aren't very good at big-data applications. There are two reasons for that:
- Current devices, as well as devices likely to be built in the near to medium term, are quite limited in the number of qubits they implement. The current record for the most fault-tolerant qubits in a single device is 1. That's a hell of a lot better than where the field was a couple of years ago, but it's far from the huge amount of data that needs to be processed for LLM training and evaluation.
- Even if you had enough qubits to store training data, looking them up on a quantum device is still challenging due to what's sometimes called the qRAM problem. It's not trivial to build a quantum oracle that returns the data stored at a given index, and figuring out how to do that efficiently is still an area of ongoing research (a toy version of the lookup circuit is sketched below).
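Here is what that lookup looks like at toy scale, sketched in Python with Qiskit; the stored table and the 2-bit address size are arbitrary choices, and this is an illustration of the idea rather than an efficient qRAM:

```python
from qiskit import QuantumCircuit

# Flip the data qubit for every address whose stored bit is 1. The gate
# count grows with the table size, which is the heart of the scaling
# problem for big-data workloads.
table = {0b00: 1, 0b01: 0, 0b10: 1, 0b11: 1}  # address -> stored bit

qc = QuantumCircuit(3)  # qubits 0 and 1: address register; qubit 2: data
for addr, bit in table.items():
    if not bit:
        continue
    zeros = [q for q in (0, 1) if not (addr >> q) & 1]
    for q in zeros:
        qc.x(q)          # match this address's 0-bits
    qc.ccx(0, 1, 2)      # fires only when the register holds this address
    for q in zeros:
        qc.x(q)          # uncompute
print(qc.draw())
```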
That's part of why you see quantum algorithms being developed less for big-data tasks and more for big-compute tasks like chemistry. There, the program might be very large, but the size of the input that has to be stored within the quantum device and the size of the output you measure back out are both quite small, even down to a single floating-point number in some cases.
(source: I've worked in quantum computing for about twenty years now.)
That's why I emphasized one _logical_ qubit. I'll definitely argue that fault tolerance is necessary to achieve useful results, as you say, though there is some argument in the research community about that. Even setting that discussion aside, there's absolutely no way to run something like LLM training directly on physical qubits (barring an improvement in error rates on the order of 10^15 to 10^18), even if you had enough of them to do so and had a good qRAM implementation.
> In some of its applications, the original Zeng-Coecke algorithm relies on the existence of a quantum random access memory (QRAM) [22], which is not yet known to be efficiently implementable in the absence of fault tolerant scalable quantum computers [1, 7]. Here we take a different approach, using the classical ansatz parameters to encode the distributional embedding and avoiding the need for QRAM entirely. The cost function for the parameter optimisation is informed by a corpus, already parsed and POS-tagged by classical means.
Following my intuition, i.e. as an outsider who has been watching the progress of quantum NLP since 2012, I see the current academic situation in quantum computing as two branches in the process of merging. One branch is the traditional quantum computing field, with concerns and applications rooted in mathematics, computing theory, and physics (and upwards: chemistry -> biochemistry -> biology). The other is a fork carried out by Coecke (quantum logic), Abramsky (computer science), and Sadrzadeh (epistemic logic), who saw in categorial formalisms of quantum logic a way to mix compositional (syntax, logical rules) and distributional (statistics, "bag-of-neighbor-words") representations of meaning. In this regard they bring not just new methods but also new applications of quantum computing, with a focus on NLP, as language, given this "natural tensor structure [20, 35, 23] [...] can be considered quantum-native [48, 2, 8]." (same paper)
I'd be happy to share more of my thoughts; if that'd be helpful, we can discuss my rates. Outside of that, though, I'll suggest that intuition is less helpful than experience in understanding which problems are more or less likely to have good quantum solutions.
Nah, you already showed that your so-called expertise as "a trans-woman who is very good at quantum" is not the hot shit you think it is, given how you trotted out that QRAM issue, carried away by overconfidence.
As for your snarky remark on intuition, these papers by Coecke and by Aerts, his thesis adviser, explain both what "my" intuition was focused on (quantum effects as perceived through Zipf distributions in linguistic data) and the driving mechanism behind it.
> Another finding that we will put forward, in Sect. 4, was completely unexpected. The method of attributing an energy level to a word depending on the number of appearances of the word in a text, introduces the typical ranking considered in the well-known Zipf’s law analysis of this text (Zipf 1935, 1949).
Well, guess what? I've been expecting that exact result for a decade (why else would I still be tracking progress in that field every 4 months?). My notes linking "semantic energy levels" to word frequency date back to 2014; the observations I made in real data, the ones that kickstarted the heavy rain of synchronicities I experienced afterwards, date back to 2012. I've always known, though, that I wasn't measuring shit: I was the one being measured, and I never felt like I was discovering something so much as being discovered. I wanted to isolate that phenomenon, and as a result (of failing to do so, probably) I got isolated. There is something deeper to these subject-verb-object inversions; there is even a paper about it, and I think Aerts hasn't gotten wind of it. Maybe with your extreme expertise you'll be able to figure it out and carry the message better than I would.
I sat in on a seminar with the Cleveland Clinic and IBM, who have a partnership around quantum computing, and one of the big problems they were working on was using it to speed up drug discovery. I don't recall if they have solved problems completely yet, but it has sped up the discovery process by orders of magnitude.
Nothing involving quantum computing has sped up the 'drug discovery' process by orders of magnitude. At best it has 'accelerated' some toy QM problem that is likely a dozen steps removed from anything you could call drug discovery. I'd love to be proven wrong, but I've yet to see any non-trivial computational chemistry work done on quantum computers.
Please also note that they did not run Shor's algorithm to compute this. They had to greatly simplify the algorithm so that it works specifically for the number 21.
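For context: the only job the quantum processor has in Shor's algorithm is order finding, i.e. given N and a base a, find the smallest r with a^r = 1 (mod N); everything else is classical pre- and post-processing. Here is that classical scaffolding for N = 21, with the order found by brute force instead of a quantum circuit (the base a = 2 is an arbitrary choice among bases coprime to 21):

```python
from math import gcd

N, a = 21, 2
r = 1
while pow(a, r, N) != 1:  # quantum period finding replaces this loop
    r += 1
print(f"order of {a} mod {N} is {r}")  # r = 6

assert r % 2 == 0                      # Shor needs an even order
x = pow(a, r // 2)
print(f"{N} = {gcd(x - 1, N)} * {gcd(x + 1, N)}")  # 21 = 7 * 3
```

The "compiled" demonstrations simplify the circuit using knowledge of exactly this answer, which is why they don't generalize.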
Interesting that electricity has been useful in bringing about things Voltaire championed, such as freedom of speech and the abolition of (slave) labor. However, I think you meant Volta, the inventor of the battery. Also, QC seems to suffer from the opposite of a lack of imagination about what it might be good for.
I think you rather miss the point: when electricity was postulated, no one knew where it would lead. I point out that the same was true of phlogiston; one changed the world and the other went nowhere. My point is that just because something has so far not been shown to be useful does not imply that it will eventually change the world.
It's a little sad to see the amount of cynicism and negativity here around quantum computing.
It's so early. To compare this to traditional computers, we are basically at the stage where a mainframe took up an entire room, when they were a novelty, not something every business had yet.
I'm sure there was a lot of skepticism from so-called "smart people" back then too.
Sure, maybe some PR people have over-hyped quantum computing for publicity, but that doesn't say anything about the future of the technology itself.
I'm pretty sure quantum computing is not seen negatively around here. In fact, articles on Scott Aaronson's blog for instance turn up regularly. What is seen negatively for sure is the hype train, to which this article clearly belongs, as it is basically a press release for Quantinuum. I mean, on the one hand, I'm happy that people have figured out how to get VC money for fundamental quantum research. However, I think in the end, this kind of hype will do more harm than good for the field. The hype train will eventually crash, and researchers will go elsewhere (see also: neural networks in the AI winter).
Depends on the time of day. Sometimes quantum topics generate good discussion here, and other times it’s a peanut gallery of minimally-informed dismissals. I’ve had a heck of a time trying to find a decent online community to discuss this stuff in a properly nuanced way.
> These circuits have a width, meaning how many qubits are involved, and a depth, meaning the number of discrete time steps during which the circuit can run gates before the qubits decohere... The Quantum Volume protocol identifies the largest square-shaped circuit — one where the width and depth are equal
They kind of say this, but we really need better quantum benchmarks. How does something like quantum volume map onto address space or FLOPs or something somewhat intelligible in classical computing? What were the space and cost requirements to achieve this growth? I was under the impression years back that a major obstacle to quantum computing was that keeping qubits from decohering required immense space and extreme cooling compared to classical computers. How has qubit density per dollar spent on the computer changed over this same span of time? Did the technology actually get better, or did they just throw more money at it?
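For what it's worth, once you have pass/fail results for those square circuits, the reported number is trivial arithmetic; all the contentious measurement detail hides in the pass criterion (the heavy-output test). A minimal sketch, with a made-up device:

```python
# QV is reported as 2^n for the largest n at which width-n, depth-n model
# circuits pass the heavy-output test; passes_square_test is a stand-in
# for that whole measurement campaign.
def quantum_volume(passes_square_test):
    n = 0
    while passes_square_test(n + 1):
        n += 1
    return 2 ** n

# Hypothetical device that handles square circuits up to size 6:
print(quantum_volume(lambda n: n <= 6))  # 64
```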
Without error correction, these benchmarks are going to be a bit weird and fluffy. It’s a bit like racing a half-built car. There’s the other bit of complexity that the connectivity matters a great deal. Supercomputers can be connected on a grid, torus, fat tree, etc. and the different connectivity will lend to different problem solving capabilities in a somewhat subtle way. Hence different benchmarks highlight different capabilities, and correspondingly, each company will push a benchmark that makes their machine look the best.
Titles are by far the biggest influence on comments so this rule is an important one! In this case the generic title led to a generic thread, which is not what we want here. I've changed it now.
I’m seeing a lot of quantum computing stuff everywhere (here, investing subreddits) now. It feels strongly like firms are trying to profit from AI hype 2.0, but make it quantum.
From a big-picture view it is starting to make sense. VCs throw large sums at a problem, find out which parts are hard, and move on to other low-hanging fruit until they hit a wall.
Financial failure may be the result, but human progress moves on.
Outside the hype cycle there is long-term development, feeding off whatever advancements can be applied.
Yeah, it’s not taking off till it reaches mass business use cases; right now it’s still for very niche use cases. It can’t beat classical computers for most practical cases.
Anything involving simulation of quantum mechanical systems. Think materials, molecules, nuclei, general field theories, etc.
The fixation on asymmetric-encryption vulnerability is overblown at this point. The near-completion of the post-quantum cryptographic standards and the still-long road to fault tolerance make this essentially a non-issue for anyone who hasn’t been living under a rock for the past 20 years.
The migration to post-quantum cryptosystems is a different issue than standardizing one in the first place, and I'm not going to pretend to know anything about the organizational cat-herding needed to achieve such a migration. The US government is beating the drum to have the migration of critical systems completed by 2030, and there are plenty of IT consulting firms who will happily sell post-quantum migration services to get ahead of that deadline.
If you're worried about pre-harvested ciphertext that nation-states are sitting on, waiting for the right hardware to come along to decrypt it, I personally have no solution. The genie's probably out of the bottle on that one. Hopefully the affected parties took, or are taking, mitigating actions.
In their present state, yes. These things are basically sophisticated physics experiments with a cloud API. This is fine for research, education and training, but currently insufficient for problems of economic value.
In principle, the advantages of quantum computers are delivered not in raw speed or throughput, but in algorithmic scaling. For example, the well-known Grover's search algorithm scales as O(sqrt(N)), as opposed to O(N) for classical linear search. Similarly, Shor's factoring algorithm comes with an exponential speedup, taking a nominally O(2^n) problem down to O(n^k). Problems related to physics and chemistry simulations fall in various places along that quadratic-to-exponential speedup spectrum as well.
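Grover's scaling is easy to see in a toy classical state-vector simulation (exponential memory, so small sizes only; the marked index is an arbitrary choice):

```python
import numpy as np

def grover(n_qubits, marked):
    N = 2 ** n_qubits
    state = np.full(N, 1 / np.sqrt(N))    # uniform superposition
    iters = int(np.pi / 4 * np.sqrt(N))   # ~optimal iteration count
    for _ in range(iters):
        state[marked] *= -1               # oracle: flip marked amplitude
        state = 2 * state.mean() - state  # diffusion: invert about the mean
    return iters, abs(state[marked]) ** 2

iters, p = grover(10, marked=123)  # N = 1024
print(iters, round(p, 4))          # ~25 oracle calls, success prob ~1.0,
                                   # vs ~512 lookups expected classically
```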
In practice, there's expected to be a larger constant overhead than with classical computing hardware, since error correction needs to run continuously during operation and the physical operations that address qubits are inherently slower than classical CPU or GPU operations. There have been a few publications [1,2] asserting that this overhead essentially negates any quadratic scaling benefits, and that only cubic or better speedups would deliver any practical advantage. This assumes no further improvement in operation times or error correction overhead.
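A crude way to see that argument: with a quadratic speedup, classical cost scales like N * t_c and quantum cost like sqrt(N) * t_q, so quantum wins only once N > (t_q / t_c)^2. The per-operation times below are illustrative assumptions (nanosecond-scale classical ops versus a commonly cited ~10 microsecond ballpark for error-corrected logical operations), not measurements:

```python
t_c = 1e-9  # seconds per classical op (assumption)
t_q = 1e-5  # seconds per logical quantum op (assumption)

crossover_N = (t_q / t_c) ** 2  # where N * t_c == sqrt(N) * t_q
print(f"quadratic speedup pays off only for N > {crossover_N:.0e}")  # ~1e8
```

Factor in classical parallelism and the crossover moves out further still, which is roughly how those publications arrive at their pessimism about quadratic speedups.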
As far as timeframe, it's anyone's guess. Error correction will be needed to get beyond these research prototypes, and I think we'll have a better sense over the next 2-5 years as to how hard that will be to engineer.
“There are lots of great crypto projects, just look for the builders!”
Sorry, I’m not convinced. Over a decade of media puffery about how quantum computing will break encryption and nothing to indicate this is actually a claim based in reality.
We know quantum algorithms can break some forms of asymmetric encryption. The only issue is the practical problem of engineering a quantum computer with enough qubits to run them.
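For a rough sense of scale on "enough qubits" (both figures below are commonly cited estimates, not hard requirements): a Beauregard-style circuit factors an n-bit modulus with about 2n + 3 logical qubits, and error-correction overhead is often ballparked at around a thousand physical qubits per logical qubit:

```python
n = 2048             # RSA-2048 modulus size
logical = 2 * n + 3  # ~4.1e3 logical qubits (Beauregard-style circuit)
overhead = 1000      # physical qubits per logical qubit (ballpark assumption)
print(logical, logical * overhead)  # ~4e3 logical, ~4e6 physical
```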
Now, it may be very, very hard to do this, and it may take decades more, or it may never be realised. Or a breakthrough may happen next year. There are no theoretical reasons it is impossible.
From a cryptography perspective, it takes a long time to create new cryptographic algorithms and gain trust in them. Many years of cryptanalysis are required by many people. So we are gradually moving towards quantum safe versions of asymmetric crypto. This is the only prudent thing to do.
In what sense do you feel that there is a claim not based in reality?
> The only issue is the practical problem of engineering a quantum computer with enough qubits to run them.
> In what sense do you feel that there is a claim not based in reality?
“Should be theoretically possible” - I’ll believe it when I see it. Anything quantum is always littered with qualifiers (“this isn’t possible now, but the math checks out!”) and hand-waving of potential issues.
Every few years I do a deep dive, only to learn that nothing has really changed: the machines still have some fundamental limitation, and nobody has solved how to scale them to a useful number of qubits.
I'd argue that a lot is changing, but it's a hard thing to do. We have processors with hundreds of (noisy) qubits. We have error correction schemes (that are not realisable yet though). We're exploring multiple different approaches to qubits, from superconducting ones to topological qubits.
I don't think I've seen any hand waving of potential issues by anyone. Everyone acknowledges it is hard.
Is your complaint simply that it's taking a long time? Or can you point me to some of the "hand waving" claims you refer to?
Are these systems delicate?
Is this a higher form of "solid state device"?
Are they simple to manufacture?
I'm curious about them, but I'm skeptical of their usefulness compared to existing things that can actually be observed with the naked eye.
I'm a big fan of troubleshooting with an ohm meter, and I could apply the same techniques to a visible-light optical circuit.
"I'm skeptical on their usefulness compared to existing things that can actual be observed with the naked eye. I'm a big fan of trouble shooting with an ohm meter and I could apply the same to a techniques to a visible light optical circuit."
What you are measuring with an ohm meter is not visible to the naked eye. Electricity is not visible to the naked eye. Neither is heat. Neither are x-rays. Neither is math. Their effects might be, but they themselves aren't.
It's easy to come up with more examples of things not visible to the naked eye which are very useful.
The ohm meter is visible and it is one of many ways of detecting electron flow.
I was trying to ask how the qubits are observed.
By observing phenomena, whether directly or indirectly, we built a sophisticated society. I want to know how hard it is to observe a qubit. The directness is what I want to know.
What indirect effects of a qubit can one observe with the naked eye?
[1] https://en.wikipedia.org/wiki/Goodhart%27s_law