The latest example of big banks organising themselves to exploit the potential of blockchain technology came this week with the announcement that four big lenders have teamed up to develop a “utility settlement coin” — a new form of digital cash.

The four banks — UBS, Santander, Deutsche Bank and BNY Mellon, which are working with UK broker ICAP and developer Clearmatics Technologies — stress that they are not creating a new cryptocurrency.

…

The aim is to speed up clearing and settlement in financial markets by allowing institutions to pay for securities, such as bonds and equities, without waiting for traditional money transfers to be completed in the so-called delivery-versus-payment process.

By switching clearing and settlement of financial markets on to a distributed ledger, the banks hope to do away with much of their costly back office operations that process trades and keep records up to date. Quicker settlement should also free up capital that banks hold against trading risk.

Hyder Jaffrey, head of fintech innovation at UBS, says: “Every bank, exchange and clearing house, we all have our own sets of the same data, which get out of sync and have to be updated and reconciled.”

“The distributed ledger is the first technology which could implement a shared golden copy of that data,” says Mr Jaffrey. “If you have that breakthrough you can really see how that would be revolutionary in the financial world.”

Yet the banks don’t seem to be using blockchain’s key feature at all:

Some sceptics reckon the banks are missing the point. “This is banks talking to each other and the point of blockchain is to establish consensus in the presence of potentially untrusted actors, as with bitcoin, on the internet,” says Dave Birch, of payments consultancy Consult Hyperion. “It’s a sorry state of affairs, that technology is not going to fix, if the banks don’t trust each other.”

There are also doubts about whether blockchain technology can actually do the thing the banks want it to do:

The project, however, still faces many challenges. One is transaction speed. Bitcoin is often criticised for being unable to scale up because its blockchain can handle only about seven transactions a second, as opposed to, say, the 24,000 a second that Visa can handle. Any solution from the banks will have to be fast and capable of processing heavy loads.
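That headline figure of roughly seven transactions a second falls straight out of Bitcoin’s protocol parameters. A back-of-the-envelope sketch (the figures are approximations: a 1 MB block limit, an average transaction of about 250 bytes, and one block roughly every ten minutes):

```python
# Rough throughput for Bitcoin-like parameters. All three inputs
# are approximations, not exact protocol constants.
block_size_bytes = 1_000_000   # ~1 MB block size limit
avg_tx_bytes = 250             # typical average transaction size
block_interval_s = 600         # one block roughly every 10 minutes

tx_per_block = block_size_bytes // avg_tx_bytes   # ~4000 transactions per block
tx_per_second = tx_per_block / block_interval_s   # ~6.7 tx/s

print(f"{tx_per_second:.1f} tx/s")  # → 6.7 tx/s
```

Any of the three knobs can be turned, which is exactly the trade-off the sceptics return to below: bigger blocks or shorter intervals buy throughput at the cost of propagation delay and weaker consistency.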

So who’s made the massive software design breakthrough necessary to get the banks interested? A clue comes from a quick look at the website of the consortium’s technology partner, Clearmatics Technologies:

Clearmatics is based on the Ethereum Virtual Machine, specialized for financial and fiduciary computations, and it uses a new consensus protocol designed to achieve finality of settlement and eligibility as a Designated Settlement System.

Ethereum last hit the headlines just a couple of months ago, in connection with the DAO hack, in which $50Mn of the $150Mn raised by the DAO a month earlier suddenly went for a walk, with assistance from some hackers and a not-so-smart Ethereum smart contract. To be fair, that little misadventure was hardly Ethereum’s fault: Ethereum was just the infrastructure of the DAO, not the guiding mind behind the DAO dumb contracts. The real questions are still: is Ethereum any good for the banks? What’s the new technology?

In this new algorithm, agreement within the blockchain would be measured not on the basis of how much computing power agrees with the current state, but instead on the basis of how much digital currency agrees with the current state. The owners of this digital currency hold a financial stake in the success of the blockchain that tracks it, which is where we get the name for the algorithm.
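That is proof of stake in a nutshell: influence over the chain is proportional to coins held, not hashpower expended. A toy sketch of stake-weighted proposer selection (the names, balances, and the use of a simple seed in place of shared on-chain randomness are all illustrative):

```python
import random

# Toy proof-of-stake proposer selection: the chance of being chosen
# to extend the chain is proportional to stake held, not hashpower.
stakes = {"alice": 60, "bob": 30, "carol": 10}  # total = 100 coins

def pick_proposer(stakes, seed):
    # The seed stands in for shared on-chain randomness, so every
    # node picks the same proposer for a given round.
    rng = random.Random(seed)
    total = sum(stakes.values())
    ticket = rng.uniform(0, total)
    cumulative = 0
    for validator, stake in sorted(stakes.items()):
        cumulative += stake
        if ticket <= cumulative:
            return validator

# Over many rounds, selection frequency tracks stake share.
wins = {v: 0 for v in stakes}
for round_no in range(10_000):
    wins[pick_proposer(stakes, round_no)] += 1
print(wins)  # alice ≈ 6000, bob ≈ 3000, carol ≈ 1000
```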

…the bulk of the [blockchain’s] state is in fact distributed between the different nodes in the network, with each node only holding a small number of ‘shards’ of the state.

…

Of course, the whole point of sharding is to move away from the “everyone processes everything” paradigm, and split up the validation responsibility among many nodes. With naïve proof of work, doing that securely is difficult: proof of work as implemented in Bitcoin is a completely anonymous…consensus algorithm, and so if any one shard is secured by only a small portion of the total hashpower, an attacker can direct all of their hashpower towards attacking that shard, thereby potentially disrupting the blockchain with less than 1% of the hashpower of the entire network.

This changes, however, with proof of stake…the participants in the consensus process do have some kind of identity, even if it’s just the pseudonymous cryptographic identity of an address, and so we can solve the “targeted attack” problem with random sampling schemes…making it impossible for attackers to specifically target any particular transaction or any particular shard.
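The security argument behind random sampling is ordinary binomial arithmetic: an attacker holding a small fraction of the total stake almost never draws a majority of a randomly sampled committee. A sketch (the committee sizes and attacker shares are illustrative, and this ignores the adaptive-adversary subtleties real protocols have to handle):

```python
import math

# If validators are randomly sampled into committees, an attacker with
# fraction p of the total stake needs a majority of one committee to
# disrupt its shard. With a committee of size n, the chance of drawing
# at least floor(n/2) + 1 attacker validators is binomial.
def attack_probability(p, n):
    k_needed = n // 2 + 1
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(k_needed, n + 1))

# A 1% attacker against a 100-validator committee: astronomically small.
print(attack_probability(0.01, 100))
# Contrast with naive proof-of-work sharding, where the attacker simply
# aims all hashpower at one shard and needs only a majority there.
```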

With that, and one or two more tweaks, the scalability issues will be banished:

The long term goal for Ethereum 2.0 and 3.0 is for the protocol to quite literally be able to maintain a blockchain capable of processing VISA-scale transaction levels, or even several orders of magnitude higher, using a network consisting of nothing but a sufficiently large set of users running nodes on consumer laptops.

Such schemes, however, can incur substantial overhead when cross-shard coordination is required in a Byzantine setting, so sharding protocols for blockchains are an open area of research.

It’s now August, so either the state of the art is advancing lickety-split, or there are still one or two loose ends in Mr Buterin’s scheme. The problem of duplicates, perhaps: it is not explicitly addressed in Mr Buterin’s paper, but it is just as relevant to Ethereum as it is to Bitcoin:

Bitcoin solves the “double spend” problem with double entry accounting, and it ensures the integrity of the accounting system by guaranteeing that all transactions are globally unique and partially ordered. A “new” transaction must not be a duplicate of a transaction in any prior block, nor may it be a duplicate of a transaction already in the current block. For this to work we must somehow compare every pair of transactions to determine that all transactions are unique.
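The uniqueness check described above can be sketched in a few lines. A set index makes the pairwise comparison implicit, but the underlying point stands: the index must cover every transaction ever accepted, which is global state (the transaction ids here are illustrative):

```python
# Sketch of the global-uniqueness check: every new transaction id is
# tested against the set of all ids already accepted, both in prior
# blocks and in the block currently being assembled.
class Ledger:
    def __init__(self):
        self.seen = set()        # ids from all sealed blocks
        self.pending = set()     # ids in the block under construction

    def accept(self, tx_id):
        if tx_id in self.seen or tx_id in self.pending:
            return False         # duplicate: reject as a double spend
        self.pending.add(tx_id)
        return True

    def seal_block(self):
        self.seen |= self.pending
        self.pending = set()

ledger = Ledger()
assert ledger.accept("tx-1")
assert not ledger.accept("tx-1")   # duplicate within the same block
ledger.seal_block()
assert not ledger.accept("tx-1")   # duplicate of a prior block
```

Shard the ledger naively, so that each node holds only part of `seen`, and the check can no longer be answered locally; that is the tension the scalability papers are wrestling with.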

This point seems frequently misunderstood, as you can see in this quote from one of the many Bitcoin scalability white papers, which is representative of the general trend:

The problem of simultaneously achieving the best of both worlds: having only a small portion of consensus participants explicitly participate in validating each transaction, but have the entire weight of the network implicitly stand behind each one, has proven surprisingly hard.

[Vitalik Buterin (31 May 2015), Notes on Scalable Blockchain Protocols. Available at <https://github.com/vbuterin/scalability_paper/blob/master/scalability.pdf>]

Well, no. If we want to provide a guarantee of each transaction being globally unique then we must inspect every transaction. There’s nothing “implicit” about this. And if we weaken this guarantee (such as only ensuring that transactions are locally unique) then we break Bitcoin.

The consortium promises to have a commercial-grade blockchain system by 2018. That’ll be a mighty impressive achievement, if it happens. If it doesn’t, there’s plenty of money riding on it, and on many other competing schemes too, so it’ll be a mighty impressive bust.

Over to our sceptic again (I particularly recommend the perpetual motion link):

A lot of smart people have been working on the Byzantine problem (which blockchain is a partial solution to) for over 20 years and it’s well understood. We know the shape of what is possible, and current focus is on incremental improvement and niche applications where we can relax a constraint.

If you want the guarantees of consistency provided by blockchain then ultimately everything must pass through a single consensus process. This is not a programming thing, it’s the laws of physics. We’ve known this for nearly a couple of decades, but unfortunately folk keep proposing perpetual motion machines.

We can easily do a lot better than Bitcoin’s few transactions per second, but the only parameters we have to play with are dwell time, block size and the strength of the consistency guarantee. Assuming that some smart person is just going to walk in and solve this problem is hubris.

Blockchain performance might always suck, but that’s not a problem

The sweet spot for blockchain – and distributed ledgers – is low volume, high-value exchanges. There’s a lot of interesting problems to solve in this space, from tracking diamond provenance through contract attestation and so on. It’s just that the pie in the sky, blockchain-taking-over-the-entire-financial-system predictions are likely wrong.

The final word goes to Reuters, hinting at what this UBS blockchain initiative is really all about: