The Five Deadly Sins of Financial Services IT

This blog has discussed time & again how Global, Domestic and Regional banks need to be innovative with their IT platforms to constantly evolve their product offerings & services. This is imperative due to several business realities – increased competition from the FinTechs, web-scale players delivering exciting services & sharply increasing regulatory compliance pressures. However, systems and software architecture remains a huge issue at nearly every large bank across the globe.

Regulation is also afoot in parts of the globe that will give non-traditional banks access to hitherto locked customer data – e.g. PSD2 in the European Union. Further, banking licenses have been granted more readily to non-banks that are primarily technology pioneers – e.g. PayPal.

It is 2016, and banks are waking up to the fact that IT architecture is a critical strategic differentiator. Players with agile & efficient architecture platforms and practices can not only add new service offerings but also experiment across a range of analytics-led, multi-channel offerings. These digital services can now be found abundantly in areas ranging from Retail Banking and Capital Markets to Payments & Wealth Management, especially at the FinTechs.

So, how did we get here…

The Financial Services IT landscape – no matter which segment one picks across the spectrum, be it Capital Markets, Retail & Consumer Banking, Payment Networks & Cards or Asset Management – is largely predicated on a few legacy anti-patterns. These anti-patterns have evolved over the years from a systems architecture, data architecture & middleware standpoint.

These anti-patterns have resulted in a mishmash of organically developed & shrink-wrapped systems that do everything from running critical Core Banking applications to Trade Lifecycle, Securities Settlement, Financial Reporting & more. Each of these systems operates in an application, workflow and data silo with its own view of the enterprise. These are all kept in sync largely via data replication & stovepiped process integration.

If this sounds too abstract, let us take an example – and a rather topical one at that. One of the most critical back-office functions every financial services organization needs to perform is Risk Data Aggregation & Regulatory Reporting (RDARR). This spans areas from Credit Risk, Market Risk and Operational Risk to Basel III, Solvency II & more – the list goes on.

The basic idea in any risk calculation is to gather a whole range of quality data in one place and to run computations to generate risk measures for reporting.

The required data feeds are then tactically placed in in-memory caches or in enterprise data warehouses (EDW). Once the data has been extracted, it is transformed using a series of batch jobs that prepare it for the calculator frameworks which run the risk models on it.
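To make the pattern concrete, here is a minimal, hedged sketch of that gather-transform-calculate flow in Python – the feed path, column names and the choice of a simple historical-simulation VaR are illustrative assumptions, not a description of any particular bank's calculators.

```python
# Illustrative sketch only: gather transaction-level data, aggregate it,
# and compute a simple historical-simulation VaR per desk.
# The file path and column names (desk, trade_date, pnl) are hypothetical.
import numpy as np
import pandas as pd

def load_position_feed(path: str) -> pd.DataFrame:
    """Gather: pull a transaction-level feed into one place (here, a CSV)."""
    return pd.read_csv(path, parse_dates=["trade_date"])

def daily_pnl_by_desk(positions: pd.DataFrame) -> pd.DataFrame:
    """Transform: roll transaction-level P&L up to desk / day granularity."""
    return (positions
            .groupby(["desk", "trade_date"])["pnl"]
            .sum()
            .unstack("desk")
            .fillna(0.0))

def historical_var(pnl: pd.Series, confidence: float = 0.99) -> float:
    """Calculate: a 1-day historical-simulation Value-at-Risk measure."""
    return -np.percentile(pnl, 100 * (1 - confidence))

positions = load_position_feed("trades_feed.csv")   # hypothetical feed
pnl = daily_pnl_by_desk(positions)
for desk in pnl.columns:
    print(desk, "1-day 99% VaR:", historical_var(pnl[desk]))
```

In a real RDARR pipeline each of these steps is typically a separate batch job or framework, which is precisely where the duplication and reconciliation problems described next come from.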

All of the above need access to large amounts of data at the individual transaction level. The Corporate Finance function within the bank then makes end-of-day adjustments to reconcile all of this data, and these adjustments need to be cascaded back to the source systems – down to individual transactions or classes of transactions.

These applications are typically deployed on clusters of bare-metal servers that are not particularly suited to portability, automated provisioning, patching & management – in short, nothing that can be moved over automatically at a moment's notice. These applications also run on legacy, proprietary technology platforms that do not lend themselves to a flexible, DevOps style of development.

Finally, there is always a need for statistical frameworks to make adjustments to customer transactions that somehow need to be reflected back in the source systems. All of these frameworks need access to, and the ability to work with, terabytes (TBs) of data.

Each of the above risk work streams has corresponding data sets, schemas & event flows that it needs to work with, and each has different temporal needs for reporting: some need to be run a few times a day (e.g. Traded Credit Risk), some daily (e.g. Market Risk) and some at the end of the week (e.g. Enterprise Credit Risk).
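As a rough illustration of those differing cadences, one could imagine a small catalogue mapping each work stream to its feeds and reporting frequency – the names and frequencies below are hypothetical examples, not a standard taxonomy.

```python
# Hypothetical catalogue of risk work streams, their feeds and cadences.
RISK_WORKSTREAMS = {
    "traded_credit_risk":     {"feeds": ["trades", "counterparties"],       "cadence": "intraday"},
    "market_risk":            {"feeds": ["positions", "market_data"],       "cadence": "daily"},
    "enterprise_credit_risk": {"feeds": ["loans", "ratings", "collateral"], "cadence": "weekly"},
}

def feeds_due(cadence: str) -> set:
    """Union of feeds needed by every work stream run at the given cadence."""
    return {feed
            for ws in RISK_WORKSTREAMS.values() if ws["cadence"] == cadence
            for feed in ws["feeds"]}

print(feeds_due("daily"))   # {'positions', 'market_data'}
```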

Illustration – The Five Deadly Sins of Financial IT Architectures

Let us examine why this is so, in the context of the anti-patterns proposed below –

THE FIVE DEADLY SINS…

The key challenges with current architectures –

Utter, total and complete lack of centralized data, leading to repeated data duplication – In the typical Risk Data Aggregation application, a massive amount of data is duplicated from system to system, leading to multiple inconsistencies at both the summary and transaction levels. Because different groups perform different risk reporting functions (e.g. Credit and Market Risk), the feeds, the ingestion and the calculators end up being duplicated as well. A huge mess, any way one looks at it.

Analytic applications that are not designed for throughput – Traditional risk algorithms cannot scale with this explosion of data, nor with the heterogeneity inherent in reporting across multiple kinds of risk. E.g. certain kinds of Credit Risk calculations need access to around 200 days of historical data to estimate the probability of a counterparty defaulting & to obtain a statistical measure of the resulting loss. Such calculations are highly computationally intensive and can run for days, as the sketch below illustrates.
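To give a feel for why these runs take so long, here is a hedged sketch of a one-factor Monte Carlo credit-loss simulation in Python. The portfolio size, default probabilities, correlation and scenario count are invented for illustration, and the estimation of default probabilities from historical data is assumed to have already happened upstream.

```python
# Illustrative only: a one-factor Gaussian Monte Carlo credit-loss simulation.
# All parameters are hypothetical; real runs over a full trading book with
# millions of scenarios are what take hours or days.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)

n_counterparties = 1_000
n_scenarios = 50_000                           # scale this up and runtimes explode
pd_annual = np.full(n_counterparties, 0.02)    # assumed 2% default probability each
exposure = np.full(n_counterparties, 1.0)      # unit exposure per counterparty
rho = 0.15                                     # assumed asset correlation

# Default occurs when the latent asset value falls below the PD threshold.
threshold = norm.ppf(pd_annual)
losses = np.empty(n_scenarios)
for s in range(n_scenarios):
    systemic = rng.standard_normal()
    idiosyncratic = rng.standard_normal(n_counterparties)
    asset_value = np.sqrt(rho) * systemic + np.sqrt(1 - rho) * idiosyncratic
    losses[s] = exposure[asset_value < threshold].sum()

print("Expected loss:", losses.mean())
print("99.9% credit loss percentile:", np.percentile(losses, 99.9))
```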

Lack of Application Blueprint, Analytic Model & Data Standardization – There is nothing SOA- or microservices-like in these architectures, which precludes best-practice development & deployment and only leads to maintenance headaches. Cloud computing enforces standards across the stack. Areas like risk model and analytic development need to be standardized to reflect realities post BCBS 239. The Volcker Rule aims to ban proprietary trading activity on the part of the banks; banks must now report on seven key metrics across tens of different data feeds spanning PBs of data. Most cannot do that without undertaking a large development and change management effort.

Lack of Scalability – It must be possible to operate the risk platform as a central system that can scale to carry the full load of the organization, with hundreds of applications built by disparate teams all plugged into the same central nervous system. One other factor to consider is the role of cloud computing in customer retention efforts. The analytical computational power required to extract insights from gigantic data sets is costly to maintain on an individual basis. The traditional owned data center will probably not disappear, but banks need to be able to leverage the power of the cloud to perform big data analysis in a cost-effective manner – see the sketch that follows.
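One way to picture the scale-out alternative is a distributed aggregation over the same transaction-level data. The sketch below uses PySpark, with a hypothetical data-lake path and schema, purely to show how the computation can spread across a cluster (on-premise or in the cloud) rather than a single over-sized server.

```python
# Illustrative sketch: a transaction-level aggregation expressed with PySpark,
# so the same job can scale from a laptop to a cluster.
# The storage paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("risk-aggregation").getOrCreate()

trades = spark.read.parquet("s3a://bank-datalake/trades/")   # hypothetical data lake path

exposure_by_counterparty = (trades
    .groupBy("counterparty_id", "risk_type")
    .agg(F.sum("notional").alias("total_notional"),
         F.count("*").alias("trade_count")))

exposure_by_counterparty.write.mode("overwrite").parquet(
    "s3a://bank-datalake/risk/exposures/")
```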

A Lack of Deployment Flexibility – The application & data requirements dictate the deployment platforms. This massive anti-pattern leads to silos and legacy OSs that cannot easily be moved to containers like Docker & instantiated by a modular cloud OS like OpenStack.

THE BUSINESS VALUE DRIVERS OF EFFICIENT ARCHITECTURES …

Doing IT architecture right – and in a manner responsive to the business – results in critical value drivers being met & exceeded. The drivers of this transformation are –

Effective compliance with increased regulatory risk mandates ranging from Basel III and FRTB to Liquidity Risk – all of which demand flexibility across the traditional IT tiers

Exist & evolve in a multichannel world dominated by the millennial generation

Reduced costs to satisfy pressure on the Cost to Income Ratio (CIR)

The ability to open up data & services that operate on the customer data to other institutions

A uniform architecture that works across all of these varied workloads would seem a commonsense requirement. However, this is a major problem for most banks. Forward-looking approaches that draw heavily on microservices-based application development, Big Data-enabled data & processing layers, the adoption of Message Oriented Middleware (MOM) & a cloud-native approach to developing applications (PaaS) & deploying them (IaaS) are the solution to the vexing problem of inflexible IT.
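As one small, hedged illustration of the message-oriented part of that approach: instead of each risk silo receiving its own point-to-point file feed, a source system could publish every transaction once to a shared topic that any number of risk, finance or reporting services consume. The broker address, topic name and event schema below are hypothetical, using the kafka-python client.

```python
# Illustrative sketch: publish a trade event once to a shared topic;
# downstream risk and reporting consumers subscribe instead of taking file feeds.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="kafka.bank.internal:9092",            # hypothetical broker
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

trade_event = {
    "trade_id": "T-000123",
    "desk": "rates",
    "counterparty_id": "CP-42",
    "notional": 5_000_000,
    "currency": "USD",
}

producer.send("trade-events", value=trade_event)             # one event, many consumers
producer.flush()
```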

The question is whether banks can change before they see a perceptible drop in revenues over the coming years.
