Banks do not need to be wedded to complexity, says Navin Suri, Percipient’s CEO

Marie Kondō’s bestseller, The Life-Changing Magic of Tidying Up: The Japanese Art of Decluttering and Organizing, is sweeping the world. Her message that simplicity pays off applies as much to a bank’s data architecture as it does to a person’s wardrobe.

Few bankers would argue with the notion that IT architecture in banks is overly complex and, as a result, far less productive than it could be. So how did we get here? Rather than following a single blueprint, most banks' IT evolved in response to the global financial industry's changing consumer demands, regulatory requirements, geographic expansion, and M&As. The result is a tangled web of diverse operational systems, databases and data tools.

Rapid Digitisation

Rapid digitisation has put this complex architecture under further stress. Amid dire warnings, such as the one from Francisco González, then CEO of BBVA, that non-tech-ready banks "face certain death", many rushed to pick up the pace of their digital transformation.

Banks rolled out their mobile apps and digital services by adopting a so-called "two-speed infrastructure", that is, enhanced capabilities at the front, built on a patchwork of legacy systems at the back. Now over a third of all banks, according to a 2015 Capgemini survey, say "building the architecture/application infrastructure supporting transformation of the apps landscape" is their topmost priority.

Fragmented Infrastructure

Meanwhile a key reward of digitisation – high value business intelligence – remains elusive. Banking circles may be abuzz with talk of big data, but the lack of interoperability across systems makes this difficult to achieve. In some cases, cost effective big data processing technologies like Hadoop have actually deepened the problem by introducing yet more elements to an already unwieldy architecture.

To address the problem, financial institutions have taken two vastly contrasting approaches: either paper over the cracks with a growing number of manual processes, or bite the bullet, as UBS is doing. The world's largest private bank announced in October last year that it would spend US$1 billion on an IT overhaul to integrate its "historically fragmented infrastructure".

Attack On Complexity

However, for those banks unable or unwilling to rip out and replace their existing systems, there is a third way. The availability of highly innovative open source software offers banks the option of using middleware to declutter and integrate what they already have.

Percipient’s data technology solutions, for example, enable banks to pull together all their data without the need for data duplication, enterprise data warehouses, an array of data transformation tools, or new processes and skills. These solutions are, at their core, an attack on the architectural complexity that banks have come to grudgingly accept.

Visible Order

As Marie Kondō points out, “Visible mess helps distract us from the true source of the disorder.” In the case of most banks, the true source of the disorder appears to be an IT infrastructure derived, rather than designed, to meet the huge demands placed on it by digitisation. There is now a real opportunity to turn this visible mess into visible order.

This article was a contribution to, and originally appeared in, finews.asia