The Business of Data Management

Enterprise Data Management is a topic that has consumed much media ink this year. Many firms, large and small, are realizing that data is fast becoming the “new currency”. Data, if properly managed, can help firms better understand and control their costs, manage their risk and satisfy their regulatory obligations, while at the same time enabling new insights into how their customers behave, what products and services they want, and how to penetrate new markets, grow revenue and increase wallet share. But realizing the value of one’s data is not a simple task. Many firms are steeped in legacy environments, littered with disparate and duplicative data, with marginal controls over, and understanding of, the data assets they own. Given the need to realize value from one’s data, what are firms doing to address this challenge and position their organizations to take advantage of their data?

What many firms do to address these challenges is throw technology at the problem. Don’t get me wrong: technology is critical to successfully managing data, but technology alone will not address the challenge or enable a lasting solution.
To build a successful and lasting data management solution, firms must adopt a data management discipline driven by business and enabled by technology – and make it part of their organizational DNA.

A business-driven data management discipline begins with the recognition that data is a critical corporate asset and needs to be managed like every other critical asset of a firm. Doing so involves three important concepts.
First is the unique identification of “things”. Data represents things. Whether it’s widgets produced by Factory X or financial instruments contracted by Financial Firm Y, all elements of business (products, customers, transactions, etc.) must be uniquely identified. Without identification standards, it is nearly impossible to aggregate holistically and to determine concentrations, weaknesses and opportunities. The financial industry witnessed this first-hand during the Lehman collapse. Lehman was not one company but a collection of subsidiaries, and subsidiaries of subsidiaries. There was no unique identification scheme for these entities, so the systemic risks arising from their inter-relationships, and from their relationships and dependencies with firms across the industry, were nearly impossible to calculate quickly. This left decision makers unsure of the aggregate impact of a Lehman failure.
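The aggregation problem above can be made concrete with a small sketch. The identifiers and figures below are invented for illustration (modeled loosely on the LEI idea of stable entity codes): once every legal entity carries a unique identifier and its parent is recorded against that identifier, rolling exposures up to the ultimate parent is a few lines of code. Without the identifiers, that roll-up is guesswork.

```python
from collections import defaultdict

# Hypothetical identifier scheme and figures, for illustration only.
parent_of = {
    "LEH-UK-001": "LEH-GROUP",         # subsidiary -> parent
    "LEH-ASIA-002": "LEH-GROUP",
    "LEH-ASIA-002-A": "LEH-ASIA-002",  # a subsidiary of a subsidiary
}

exposures = {  # counterparty identifier -> our exposure (millions)
    "LEH-UK-001": 120.0,
    "LEH-ASIA-002-A": 45.0,
    "OTHER-CO-9": 300.0,
}

def ultimate_parent(entity_id: str) -> str:
    """Walk up the hierarchy until an entity has no recorded parent."""
    while entity_id in parent_of:
        entity_id = parent_of[entity_id]
    return entity_id

# Roll every exposure up to the ultimate parent.
rollup = defaultdict(float)
for entity, amount in exposures.items():
    rollup[ultimate_parent(entity)] += amount

print(dict(rollup))  # aggregate exposure per ultimate parent
```

With consistent identifiers, the two Lehman-family exposures collapse into a single group-level number; without them, each subsidiary looks like an unrelated counterparty.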

Next is the realization that the meaning of data matters! As an industry, we have been managing the “movement” of data for many years – and we’re pretty good at it. The practice of acquiring, processing, storing and distributing data is well established. Every year we get better, and faster, at the technology of moving data.

But where we have come up short is in truly understanding the meaning of our data. Data that feeds our operations, our marketing engines and our risk and regulatory functions must be well understood, with unambiguous business meaning shared across departments, divisions, firms – and across entire industries. If we don’t control and govern the consistent meaning of data across our operational functions, neither regulatory bodies nor firms themselves can safeguard their industries.
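One common way to operationalize shared meaning is a business glossary that every feed validates against, rather than each department keeping its own local interpretation. The field names and definitions below are invented for illustration; the point is that the definition lives in one place and violations surface at the point of entry.

```python
# A hypothetical shared glossary: one agreed definition per data element.
GLOSSARY = {
    "notional": {"type": float, "unit": "USD", "definition": "Contract face value"},
    "trade_date": {"type": str, "unit": None, "definition": "ISO-8601 execution date"},
}

def validate(record: dict) -> list:
    """Return a list of fields that violate the shared glossary."""
    errors = []
    for field, value in record.items():
        entry = GLOSSARY.get(field)
        if entry is None:
            errors.append(f"{field}: not in shared glossary")
        elif not isinstance(value, entry["type"]):
            errors.append(f"{field}: expected {entry['type'].__name__}")
    return errors

print(validate({"notional": 1_000_000.0, "trade_date": "2015-03-02"}))  # []
print(validate({"notional": "1MM"}))  # type violation: expected float
```

The design choice that matters here is not the validation code but the single, governed `GLOSSARY`: two departments can only disagree about what “notional” means if each is allowed to define it locally.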

Finally, even if we are able to identify our entities without ambiguity, and we succeed in assigning consistent, shared meaning to our data across all of our functions, without a well-established control environment for managing how data flows throughout its lifecycle, the quality of that data, and the assurance of its proper use, will quickly deteriorate.
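A control environment usually takes the shape of checkpoints at each hand-off in the data lifecycle: records are re-checked against quality rules as they move, so defects are quarantined where they enter rather than discovered after they have spread downstream. The rule names and record fields below are invented for illustration.

```python
# Hypothetical quality rules applied at every lifecycle control point.
RULES = {
    "non_null_id": lambda r: r.get("id") is not None,
    "positive_amount": lambda r: isinstance(r.get("amount"), (int, float)) and r["amount"] > 0,
}

def control_point(records, stage: str):
    """Pass clean records through; quarantine failures tagged with stage and rule."""
    clean, quarantined = [], []
    for r in records:
        failed = [name for name, rule in RULES.items() if not rule(r)]
        if failed:
            quarantined.append({**r, "stage": stage, "failed": failed})
        else:
            clean.append(r)
    return clean, quarantined

clean, bad = control_point(
    [{"id": 1, "amount": 50.0}, {"id": None, "amount": -5}],
    stage="ingest",
)
```

Because each quarantined record carries the stage where it failed, the control environment also produces lineage: you can see not just that quality deteriorated, but where in the lifecycle it did.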

Creating such an environment is anchored in adhering to industry best practices, and several efforts are underway across the industry today. The EDM Council, the industry trade association focused on data management, has developed the Data Management Capability Assessment Model (DCAM), which defines these best practices. Other organizations and consulting firms have done similar work. This is an important trend. Best-practice models guide data professionals in developing strategies, defining business cases, building programs, establishing data governance, constructing data and technology architectures, building data quality programs and defining the operational models that will position firms to fully realize the value of their data.

We have been swimming in the vast ocean of data for many years – but it’s fair to say we’ve only been using one arm to paddle through the water. It’s time we use both arms – technology AND business – to propel us to properly manage this important asset.