Data Management Top IT Priority at Banks

Although IT spending at North American banks will grow slowly in 2008, rising just 3 percent over 2007 levels, banks will mount a full-court press on effective data management.

Bank systems generate unstructured data so prolifically that institutions are being forced to shore up their data management practices. Data warehousing, storage and security alone are not enough to organize data for relevant, accurate use. Savvy banks are adopting comprehensive “life cycle” strategies, according to the “Global Banking Top 10 Strategic Initiatives for 2008” report from Financial Insights, Framingham, Mass.

“Back in the ’60s and ’70s, banks collected data because it was a normal part of their business,” said Jim Scurlock, vice president of the global financial services group at EDS. “There was no thought at all as to what you could do with that data.”

In the ’80s, he said, IBM and Oracle introduced the relational database, which let banks query and manipulate the data they’d collected. There were challenges, however: some applications crippled the mainframe while it was tending to everyday bank business.

To escape this model’s troubles, Scurlock said, banks copied their data into data warehouses and ran queries there. Because the warehouses did not automatically update when new data became available, small groups within the institutions set up data marts where they could work with fresh data. But without synchronization among the data marts, the broader institution, and the many disciplines within it, learned nothing from any one group’s work.
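The synchronization gap described here can be sketched in a few lines. This is purely illustrative, with hypothetical customer data and department names; it simply shows how two unsynchronized copies of the same warehouse record drift apart:

```python
# Illustrative sketch: each department snapshots the warehouse into
# its own data mart. Without synchronization, the copies diverge.

warehouse = {"cust_1001": {"balance": 5000}}

# Two groups take independent snapshots of the warehouse.
marketing_mart = {k: dict(v) for k, v in warehouse.items()}
risk_mart = {k: dict(v) for k, v in warehouse.items()}

# A new deposit updates the warehouse; only marketing refreshes its mart.
warehouse["cust_1001"]["balance"] = 7500
marketing_mart["cust_1001"] = dict(warehouse["cust_1001"])

# The marts now disagree: each group "knows" a different customer.
print(marketing_mart["cust_1001"]["balance"])  # 7500
print(risk_mart["cust_1001"]["balance"])       # 5000
```

Each mart is internally consistent but mutually contradictory, which is why the institution as a whole could not learn from them.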

“This year, banks are focusing much more on back-office improvements with their technology spending,” said Jeanne Capachin, vice president of global banking and insurance at Financial Insights. “Business heads are recognizing that to meet their individual goals, the enterprise infrastructure must be refreshed. Product-focused investments will no longer solve the problems they are facing.”

“The next evolution is taking hold,” Scurlock said. “Banks are operating data stores off the mainframe but driven by users. They can update in real time, near real time or in batches. Insurance companies and banks are implementing them for cross-sell and upsell [marketing] and to create a customer experience, to become more thematic in sales.”

The shift in data management approaches lets large banks be proactive rather than reactive, focusing on the data that matters most to creating a customer experience.

“The last new bank product was in the 1600s — derivatives. Everything since then has been a combination of existing products,” he said. “Banks can’t differentiate on products or price. These are commodities. The way to differentiate is on customer experience.”

“There is a whole new frontier out there waiting to be looked at,” he continued. “It’s in the data. We’re just on the fringes of it.”

One element of the customer experience that banks know they must get right is data security. It ranked third in the Financial Insights research but is the No. 2 concern for Scurlock and EDS. As banks face potential security problems from channel expansion, real-time transaction processing, outsourcing and a mobile workforce, “the risk to reputation is more damaging than any aspect that may arise in data loss,” he said.

“People don’t read statement inserts used to communicate security of personal data. It’s important that people trust the system. Institutions have done an outstanding job getting out ahead of this.”

“A lot of initiatives have been put into place, and now they are going into implementation and rollout,” Capachin agreed.

Scurlock described these as “locks on the inside door” and prescribed a system of checks and balances to secure a bank from inadvertent loss. “This is a relatively new conversation — keeping information in.”

Kelly Shermach is a freelance writer based in Brooklyn, N.Y., who frequently writes about technology and data security. She can be reached at editor (at) certmag (dot) com.