But Hashcash with large memory requirements will likely not be affected as long as scaling quantum computers up to millions of qubits remains elusive.

I didn't find information on the time-memory trade-off of quantum computers, but if we assume that their trade-off is no worse than that of classical computers, then an increase in the memory requirement of the hashing function can be counteracted by increasing the time we run the computations. So Hashcash with large memory requirements won't save us.

Of course, I was talking about hash functions that don't allow for time-memory trade-offs.

I introduced a novel algorithm to solve the bitcoin mining problem without using (explicit) brute force. Instead, the nonce search is encoded as a decision problem and solved by a SAT solver in such a way that a satisfiable instance contains a valid nonce. The key ingredients in the algorithm are a non-deterministic nonce and the ability to take advantage of the known structure of a valid hash using assume statements.
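
To make the encoding concrete, here is a minimal CBMC-style sketch of the idea, not the original code: the nonce comes from an undefined nondet_* function (which CBMC treats as a free symbolic value), __CPROVER_assume pins down the known zero bytes of a valid hash, and a failing assert makes the solver return the nonce as a counterexample. The sha256d routine and the ZERO_BYTES count are assumptions made purely for illustration.

    /* Minimal sketch of the encoding idea, written for CBMC. */
    #include <assert.h>
    #include <stdint.h>

    uint32_t nondet_uint32(void);   /* undefined body: CBMC makes the value symbolic */
    extern void sha256d(const uint8_t header[80], uint8_t digest[32]); /* assumed double SHA-256 */

    #define ZERO_BYTES 4            /* illustrative difficulty: require 4 zero bytes */

    void mine(uint8_t header[80])
    {
        uint32_t nonce = nondet_uint32();      /* the non-deterministic nonce */
        header[76] = (uint8_t) nonce;          /* nonce is serialized little-endian */
        header[77] = (uint8_t)(nonce >> 8);
        header[78] = (uint8_t)(nonce >> 16);
        header[79] = (uint8_t)(nonce >> 24);

        uint8_t digest[32];
        sha256d(header, digest);

        /* Known structure of a valid hash: the displayed leading zeros are
           the trailing bytes of the raw digest. __CPROVER_assume is CBMC's
           built-in assume statement. */
        for (int i = 0; i < ZERO_BYTES; i++)
            __CPROVER_assume(digest[31 - i] == 0);

        /* Any counterexample to this assertion is a satisfying instance,
           i.e. it contains a valid nonce. */
        assert(0);
    }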

A couple of benchmarks demonstrated that dramatic speed-ups can already be achieved with simple parameter tuning. Additionally, I explored the contentious claim that the algorithm might become more efficient as the bitcoin difficulty increases. Initial tests showed that block 218430, which has considerably higher difficulty, is solved more efficiently than the genesis block 0 for a given nonce range.

This means that, on average, computing a single bit of the hash takes less time than computing the whole hash.

The Argon2 whitepaper says that a time-memory trade-off can still be used. At some point the trade-off stops working, because the extra computational units would occupy more space than the memory they remove, but this protection won't hold for a quantum computer with its perfect parallelism of computations. It looks like Argon2 fails to deliver protection against quantum computers.

The whitepaper (Table 1) says that reducing the memory for Argon2d by a mere factor of 7 requires increasing the amount of computation by a factor of 2^18, and it only gets much worse beyond that.
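
A rough back-of-the-envelope with those numbers, under my own simplifying assumption that the attacker's cost scales with the memory-time product M*T:

    \frac{(M/7)\cdot 2^{18}\,T}{M\cdot T} \;=\; \frac{2^{18}}{7} \;\approx\; 2^{15.2}

So an attacker who shrinks memory by a factor of 7 ends up paying roughly 37,000 times more overall, which is why the trade-off stops being attractive so quickly.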

Two years ago I was the first German blogger to take notice of Nxt. I hope that for IOTA I can also play an important role in creating attention in the German-speaking communities (which include Switzerland and Austria as well).

A small number of particles in superposition states can carry an enormous amount of information: a mere 1,000 particles can be in a superposition that represents every number from 1 to 2^1,000 (about 10^300), and a quantum computer would manipulate all those numbers in parallel, for instance, by hitting the particles with laser pulses.

While it's obvious that 1 number is not enough for an Argon2 computation, if we assume that 10 numbers are enough, then 18*10 extra qubits should solve the problem. Right?

Not sure why you bring up D-Wave; everyone knows they are doing quantum annealing, not proper quantum computation. None of this suggests we should not take a physical theory seriously. That's what this really boils down to: engineering challenges. The theory of quantum mechanics is crystal clear on this topic.

An idea has come to my mind. We could use a quantum computer to check SHA256 digests for different patterns using Kuperberg's quantum sieve algorithm; this would let us assess how secure SHA256 is. No patterns = the hash function is close to a random oracle. We could do the same for any algorithm, even one that requires petabytes of RAM, because we only need the digests.