Can you harpoon a Whale?

The London 'Whale' case has thrown up all sorts of issues around monitoring behaviour in banks. But, according to Nigel Cannings, a new approach is needed.

[Image: Monitoring the whale. Photo: Galyna Andrushko]

The headline figures themselves are probably enough to keep CEOs around the world awake for years to come. JPMorgan Chase faces not only some of the largest fines in regulatory history, but also serious scrutiny from the criminal side of the enforcement fraternity, and to a banker, that is of far more concern.

The days are gone when, as a Chief Executive or board member, you could fall on your sword and walk away with a reasonable pile of cash to fund a quiet retirement. Now people are looking for blood. In the UK, Vince Cable, the Business Secretary, has already announced proposals for dealing with “rogue directors”, and has made no secret of the fact that these are aimed at the financial services and banking sector.

On the other side of the pond, however, it has got more serious more quickly, with criminal charges already brought against a number of the players in the “London Whale” affair. And according to Reuters, the FBI is still looking at bringing criminal charges against other key JPMC staff involved in the trades and the subsequent lack of openness.

CNBC tweeted on Thursday that an “unnamed SEC source” had confirmed that JPMC CEO James Dimon is “unlikely to be prosecuted” in the London Whale scandal. I would be packing an overnight bag at this point.

Why another bank crisis?

What, though, are the root causes of yet another bank crisis? Surely with the increased scrutiny and regulation that has come in since the 2008 financial crash, this sort of thing should not happen anymore?

With the increasing complexity and speed of financial transactions, we have come to rely on computerised systems to allow us to spot rogue trades, insider dealing, unusual market volumes and breaches of risk and trading limits. In a utopian world, these should be enough to ensure that no-one can take a risk big enough to break a bank, or worse still, an entire financial system.
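To make the idea concrete, here is a minimal sketch of the kind of automated limit check such systems run. The field names and limit values are invented for illustration; real surveillance platforms are vastly more sophisticated.

```python
# Hypothetical sketch of an automated trading-limit check.
# Limits, field names and figures are illustrative only.

TRADING_LIMITS = {
    "max_single_trade": 50_000_000,        # hypothetical per-trade cap ($)
    "max_notional_per_desk": 500_000_000,  # hypothetical desk-level cap ($)
}

def check_trade(trade, desk_exposure):
    """Return a list of limit breaches this trade would trigger."""
    breaches = []
    if trade["notional"] > TRADING_LIMITS["max_single_trade"]:
        breaches.append("single-trade limit exceeded")
    if desk_exposure + trade["notional"] > TRADING_LIMITS["max_notional_per_desk"]:
        breaches.append("desk notional limit exceeded")
    return breaches

# Example: one oversized trade that also pushes the desk over its aggregate cap
alerts = check_trade({"trader": "T1", "notional": 60_000_000},
                     desk_exposure=450_000_000)
```

The point of the sketch is its weakness: a check like this only catches what it was told to look for, which is exactly the gap the rest of this article is about.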

What we have not weeded out, it seems, is the human element: the ability to look at a system and find a way to subvert it.

A slow motion crash

The Final Notice issued by the FCA reads like a slow motion car crash. If you want an example of how to take an appalling situation and make it worse through sheer human failing, you would be hard pushed to find a better one.

To quote the FCA:

“The losses were caused by a high risk trading strategy, weak management of that trading and an inadequate response to important information which should have notified the Firm of the huge risks present in the SCP (Synthetic Credit Portfolio)”

One trader had, in essence, built up a $100 billion bet that a basket of companies would remain solvent and creditworthy, and that their bonds would stay “investment grade”. So confident was he that he was right, he ignored the usual procedure of “hedging” his bets by buying Credit Default Swaps (CDS), a kind of insurance against the bet going wrong. In fact, so confident was he that his bet could not fail, he sold CDS against his own position.
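The asymmetry is worth making concrete. The toy calculation below uses invented numbers (not JPMC's actual positions or recovery rates): buying protection offsets a loss on the credit bet, while selling protection compounds it.

```python
# Toy illustration of hedging versus selling protection.
# All figures are invented; notionals in $m, recovery rate assumed at 40%.

def pnl_on_default(position_notional, cds_notional, recovery_rate=0.4):
    """P&L if the credits in the bet default.

    cds_notional > 0 means protection was BOUGHT (a hedge);
    cds_notional < 0 means protection was SOLD (doubling down).
    """
    loss_given_default = 1 - recovery_rate
    bet_loss = -position_notional * loss_given_default   # long-credit position loses
    cds_leg = cds_notional * loss_given_default          # bought CDS pays out; sold CDS pays away
    return bet_loss + cds_leg

hedged = pnl_on_default(100, 100)    # fully hedged: losses cancel
doubled = pnl_on_default(100, -50)   # sold protection: loss is amplified
```

With these made-up numbers, the fully hedged book breaks even on a default, while the book that sold protection loses more than the unhedged bet alone would have.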

In 2012, this position started to unravel, partly because other market participants had begun to realise that JPMC was so massively exposed. Their strategy was to force JPMC into a position where it was so exposed that it had to buy CDS to cover itself, but at an inflated price.

Changing the risk model

When it became apparent that risk levels had been exceeded at JPMC, the risk model was changed. As losses started to mount, traders were instructed to submit valuations for their portfolios that were presented, to quote the FCA, “in a noticeably favourable manner”.

And then, it seems, as the scale of the problem became more and more apparent internally, certain very key pieces of information were kept from the regulator, even though JPMC had been told that its “appetite for further surprises was close to zero”. It does not get much more damning than that.

There is no accounting for human behaviour

So initially rogue humans circumvented the systems designed to prevent rogue trading, then more rogue humans instructed other rogue humans to circumvent another system to allow it to remain hidden, and then a final bunch of rogue humans decided to hide it from the people who make the rules that are supposed to prevent rogue humans doing rogue things.

What does this teach us? Well, when designing systems that are supposed to control human behaviour, you have to build in safeguards that allow for human nature. Often you can only do this by indirect means: monitoring changing stress levels in email and telephone calls, for example, or looking at changing patterns of speech and behaviour. Key words and phrases can give trained investigators clues as to where to look.
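As a crude first pass, that keyword-and-phrase triage can be sketched in a few lines. The watch phrases and messages below are invented examples; real communications-surveillance systems layer stress and behavioural analysis on top of simple pattern matching like this.

```python
# Illustrative keyword-based flagging over unstructured text: the kind of
# first-pass filter that tells investigators where to look.
# The phrase list and sample messages are invented for this sketch.
import re

WATCH_PHRASES = [
    r"\bdon'?t put (this|that) in (an? )?e-?mail\b",
    r"\bcall me\b",
    r"\boff the books\b",
    r"\badjust the marks?\b",
]

def flag_messages(messages):
    """Return (message_index, pattern) pairs for every watch-phrase hit."""
    hits = []
    for i, text in enumerate(messages):
        for pattern in WATCH_PHRASES:
            if re.search(pattern, text, flags=re.IGNORECASE):
                hits.append((i, pattern))
    return hits

sample = [
    "Valuations attached as requested.",
    "Don't put that in an email - call me.",
]
hits = flag_messages(sample)
```

A sudden "call me" after months of detail-heavy emails is precisely the kind of behavioural change such a filter is meant to surface for a human to investigate.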

We rely too heavily on “structured” systems to do our dirty work. It is time we took a step back and thought about what unstructured data can teach us.