Anything involving simulation of quantum mechanical systems. Think materials, molecules, nuclei, general field theories, etc.
The fixation on asymmetric-encryption vulnerability is overblown at this point. The near-completion of the post-quantum cryptographic standards, and the still-long road to fault tolerance, make this essentially a non-issue for anyone who hasn't been living under a rock for the past 20 years.
The migration to post-quantum cryptosystems is a different issue than standardizing one in the first place, and I'm not going to pretend to know anything about the organizational cat-herding needed to achieve such a migration. The US government is beating the drum to have the migration of critical systems completed by 2030, and there are plenty of IT consulting firms who will happily sell post-quantum migration services to get ahead of that deadline.
If you're worried about pre-harvested ciphertext that nation-states are sitting on, waiting for the right hardware to come along to decrypt it, I personally have no solution. The genie's probably out of the bottle on that one. Hopefully the affected parties took, or are taking, mitigating actions.
In their present state, yes. These things are basically sophisticated physics experiments with a cloud API. This is fine for research, education and training, but currently insufficient for problems of economic value.
In principle, the advantage of quantum computers lies not in raw speed or throughput, but in algorithmic scaling. For example, the well-known Grover's search algorithm scales as O(sqrt(N)), as opposed to O(N) for classical linear search. Similarly, Shor's factoring algorithm offers a superpolynomial speedup: a problem with no known polynomial-time classical algorithm is brought down to O(n^k) in the number of bits. Problems related to physics and chemistry simulation fall at various points along that quadratic-to-superpolynomial speedup spectrum as well.
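To make the Grover scaling concrete, here's a toy Python sketch comparing oracle-query counts. The (pi/4)*sqrt(N) iteration count is the standard approximation for finding one marked item; the helper names are mine, not from any library:

```python
import math

def classical_queries(n):
    """Worst-case oracle queries for unstructured classical search: O(N)."""
    return n

def grover_iterations(n):
    """Approximate Grover iterations to find one marked item among N: ~(pi/4)*sqrt(N)."""
    return math.ceil((math.pi / 4) * math.sqrt(n))

for n in (10**3, 10**6, 10**9):
    print(f"N={n:>10}: classical {classical_queries(n):>10}, Grover ~{grover_iterations(n):>6}")
```

At N = 10^9 the quantum query count is on the order of 25,000 versus a billion classically, which is the whole appeal, but as noted below, queries are not wall-clock time.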
In practice, a larger constant overhead is expected than with classical computing hardware, since error correction would need to run continuously during operation, and the physical operations of addressing qubits are inherently slower than classical CPU or GPU operations. A few publications [1,2] have argued that this overhead essentially negates any quadratic scaling benefit, and that only cubic-or-better speedups would deliver practical advantage. That analysis assumes no further improvement in operation times or error-correction overhead.
As for the timeframe, it's anyone's guess. Error correction will be needed to get beyond these research prototypes, and I think we'll have a better sense over the next 2-5 years of how hard that will be to engineer.
Are there use-cases for QC other than ruining asymmetric encryption?