Banks Must Ditch Legacy IT

Banks with decades-old IT systems are struggling to adjust to the changing regulatory and financial landscape.

The banking industry is staring at a major challenge: how to drive growth, attract new customers and slice costs while relying on 40-year-old technology systems. Even with constrained IT budgets, many banks need to modernize the aging systems that run their core operations -- deposit gathering, lending, mortgages, cards and online banking.

Banks have had their reasons to put off modernization. In the pre-financial crisis years, with profits flush, there was little incentive to make a major investment and take on the risk of a large modernization project. Instead, banks opted for smaller, less costly alternatives such as product or feature enhancements, which often added complexity to their environments.

Additionally, during years of acquisitions -- there have been nearly 250 large mergers since 1990 -- most spending on core platforms was focused on integrating acquired banks' systems onto their own, rather than on improving capabilities. This left little time or dollars for simplification. The result -- a spaghetti-like maze of legacy systems -- was expensive to maintain but generally considered to be the cost of doing business.

But that is no longer the case. The banking industry's new world order is marked by regulations requiring greater transparency, more self-sufficient customers with rising expectations, stronger competition from traditional and non-traditional players, and relentless cost pressures.

Customers expect banks' systems to be flexible enough to give them one experience across channels -- branch, ATMs, online banking, mobile banking and even social media. That means, for example, enabling a customer to start her mortgage application process online and finish it at the branch or by phone with a customer service representative, without starting over. And, in today's slow growth environment, banks must be able to leverage their oceans of customer data to exploit cross-sell opportunities -- such as marketing investment products to that same mortgage customer -- and quickly bring new products to market.
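The "start online, finish at the branch" scenario boils down to keeping in-progress application state in one channel-agnostic store rather than in each channel's own silo. A minimal sketch of that idea, with all class and field names hypothetical rather than any particular bank's API:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: mortgage-application state lives in one shared store,
// keyed by customer, so any channel (web, branch, phone) can resume it.
public class ApplicationStore {
    // In a real bank this would be a durable, audited data store, not a map.
    private final Map<String, Map<String, String>> applications = new HashMap<>();

    // Save one field of the application, regardless of which channel supplied it.
    public void saveField(String customerId, String field, String value) {
        applications.computeIfAbsent(customerId, k -> new HashMap<>()).put(field, value);
    }

    // Any channel can fetch the in-progress application and pick up where another left off.
    public Map<String, String> resume(String customerId) {
        return applications.getOrDefault(customerId, new HashMap<>());
    }

    public static void main(String[] args) {
        ApplicationStore store = new ApplicationStore();
        store.saveField("cust-42", "income", "85000");       // entered online
        store.saveField("cust-42", "term", "30yr");          // added at the branch
        System.out.println(store.resume("cust-42").get("income")); // prints 85000
    }
}
```

The point of the sketch is only that channels write to and read from the same record; the siloed alternative -- one application record per channel -- is exactly what forces customers to start over.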

Yet many banks' core systems are a significant obstacle to achieving these strategic objectives. Some financial institutions, for example, run different platforms for related products: consumer loans run on a completely separate system from consumer deposits, which in turn run on a separate platform from commercial deposits, and data from these systems is integrated only on the back end. Some larger banks may have multiple platforms for the same line of business across multiple regions.

Bank executives resist a core transformation of their systems because it is such a daunting undertaking -- something we know firsthand, since Accenture sells software and services tied to these kinds of transformations.

The financial commitment can be substantial -- in some cases hundreds of millions of dollars over a time horizon of three years or longer, depending on the scope of the project. When these projects go wrong, the result can be delays, cost overruns and customer service disruptions. Lengthy employee training is another factor to weigh. Historically, many bank executives have adopted a "not on my watch" attitude because of the risks a modernization carries.

Three Factors Changing Banking IT Needs

The promise of modernized core systems, unlike traditional legacy systems, is a modern architecture built from components that banks can quickly configure, making it easier to design and deliver new products and services. Advances in enabling technologies such as service-oriented architectures are making transformations more manageable and affordable.
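One reason a service-oriented architecture makes transformation more manageable: a stable service interface can sit in front of the legacy core, so implementations can be swapped module by module while callers never notice. A hypothetical sketch of that routing idea, not tied to any vendor's product (all names invented):

```java
// Hypothetical sketch of incremental core replacement behind a stable interface.
interface DepositService {
    long balanceCents(String accountId);
}

// Adapter wrapping the existing legacy core (here just simulated with a constant).
class LegacyDepositAdapter implements DepositService {
    public long balanceCents(String accountId) {
        return 100_00; // would call the decades-old core via its existing interface
    }
}

// Modern replacement, rolled out one product line or account segment at a time.
class ModernDepositService implements DepositService {
    public long balanceCents(String accountId) {
        return 100_00; // would call the new platform
    }
}

public class CoreRouter {
    // A migration flag decides, per account or product line, which core serves the call.
    static DepositService route(boolean migrated) {
        return migrated ? new ModernDepositService() : new LegacyDepositAdapter();
    }

    public static void main(String[] args) {
        // Callers depend only on DepositService; the cutover is invisible to them.
        System.out.println(CoreRouter.route(false).balanceCents("acct-1"));
        System.out.println(CoreRouter.route(true).balanceCents("acct-1"));
    }
}
```

Because each module migrates behind the interface independently, the "big bang" replacement -- the scenario that produces the delays and disruptions described above -- can be broken into smaller, reversible steps.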

Currently, several of the top 20 U.S. banks are engaged in core banking upgrades -- whether full-blown system replacements or incremental modernization. Approximately 20% of U.S. banks have reached a high level of urgency regarding replacing their core systems, the research firm Aite Group finds, and an additional 56% would benefit significantly from a replacement.

I'm a recent hire at a top ten US bank. If they got the most modern system in the world, it would be legacy as soon as this IT department got its hands on it. The Java apps created by the developers are tightly coupled to the version of the WebSphere IDE the app was created with. When they need to work on a different app, these guys wait a few days for desktop services to install a different version of the WebSphere IDE on their desktops. Can you freaking imagine? I hope Google gets into the banking business and wipes these guys out.
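For what it's worth, the IDE-coupling problem described above is usually solved by making a build tool, not the IDE, the source of truth for how an app compiles. A minimal Maven `pom.xml` sketch (all coordinates are hypothetical placeholders) lets any developer build the app from the command line with any IDE version, or none:

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <!-- Hypothetical coordinates, for illustration only -->
  <groupId>com.example.bank</groupId>
  <artifactId>deposits-app</artifactId>
  <version>1.0.0</version>
  <packaging>war</packaging>
  <properties>
    <!-- Pin the Java level in the build, not in anyone's IDE settings -->
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
  </properties>
  <dependencies>
    <!-- Declare server APIs as 'provided' so the app compiles
         without a particular WebSphere install on the desktop -->
    <dependency>
      <groupId>javax.servlet</groupId>
      <artifactId>javax.servlet-api</artifactId>
      <version>3.1.0</version>
      <scope>provided</scope>
    </dependency>
  </dependencies>
</project>
```

With the build self-describing like this, "waiting days for the right IDE version" stops being a prerequisite for touching an app.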

I've heard similar arguments, for similar reasons, in the engineering computing realm - people feeling the need to try to replace highly technical applications written in Fortran with the language/flavor of the week. Why doesn't this approach work?

1. The folks that created the original applications are probably no longer with the organization. How many banks still have people on staff from 40 years ago, in the same technical positions they held at that time? Very few.

2. How well documented is the original application? If you have to go back and re-engineer an application because of poorly documented code, the sheer manpower and man-hour requirement is going to turn off a lot of execs with the mindset of "If it's not broke, don't fix it" -- but then these are the same folks who might start the project with the organization and not be around by the time delivery is planned 3-4 years down the road.

3. ROI -- why spend millions of dollars overhauling an existing system to replace it with something that does the exact same function? As long as existing hardware systems are supported by the manufacturer or a trusted third party, there's no impetus to move from that standpoint, and it creates a cottage industry (if one can call it that) that supports nothing but banking organizations.

Ultimately, it comes down to how much money is going to be made/saved by making the decision to move to a new system - and given the fuzzy math behind a lot of those calculations, once you start getting into the 7 figure range and higher, you'd better have some serious backup for those calculations.

That said, my mantra when I was in the consulting game was that upgrades are inevitable - doing them on your own time is much more cost effective than being forced into them by a system failure. Once a major bank starts having major technical failures, I think you'll see the temperature start to rise for other banking organizations to get going on their own modernization efforts.

I have to question the picture you are painting here. The IT departments of the largest banks are not putting off modernization. They are in fact using a multitude of platforms, each best suited to the task it is designed to handle. For example, mainframe systems sit at the very core of the largest banks because that platform is the best tool available for managing the highest transaction rates in the world. Banks, or any other organizations that rely on powerful IT systems at their core, must use the best systems available for specific tasks. Any bank using mainframe systems for ALL of its workloads just wouldn't be able to stay in business. That is why most businesses have dumped mainframe use for every purpose except one: processes that require tremendous throughput, like transaction processing. For most other workloads, I would agree -- offload as much as you can to cheaper PC-based servers. But only those workloads that make sense...
