Data Management – Is it time for a best-of-breed approach?

Data management – it's a catch-all term for any aspect of a firm's technology or operational processes that involves the loading, movement, transformation, storage, management or use of data, whether structured or unstructured.

Let's focus on structured data for now and assume that most financial organisations (both buy side and sell side) have a reasonably good understanding of how that data moves through the organisation: how and where it is stored, how it is disseminated to underlying applications, and how it is ultimately consumed in supporting the core business functions. Most firms have a data management process of some description, otherwise they wouldn't be functioning. Now for the tricky question: is that process any good, and how do we measure whether it is?

The first criterion for assessing the data management process naturally relates to the firm's ability to transact its normal business: client and firm trades must not be impeded, or at worst incorrectly executed, because of failures or gaps in the data management process. If we don't make too many mistakes, compliance is happy, we meet regulatory requirements and we have some level of governance, then the process must be fit for purpose; otherwise we probably wouldn't remain in existence. Sure enough, most firms can answer the question to this level.

However, do firms really understand their data management process? Are we an efficient organisation? Can we reduce our data costs? Do we understand the impact of poor data quality? Can we measure that quality, and do we have full oversight of the data management process? Can we pre-empt data issues? When issues arise, can we take remedial action quickly and make ongoing adjustments along the way without losing control or oversight of those changes?

Very few organisations have made significant inroads into this more qualitative assessment of their data management process. It is becoming increasingly evident that this is an area that needs attention and investment. Whether the drivers are greater operational efficiency (prompted by business expansion or contraction), tighter business integration, risk control, regulation or client pressure, a greater understanding and control of the data management process is well overdue in many firms.

Up to now, most attempts to introduce data management processes into firms have focused on the primary objectives: fix any glaring holes, satisfy clients and compliance, and meet core regulation. Typically this has meant applying the investment budget to the physical movement of data, and it depends in large part on the style and complexity of the business function. Obvious targets in the physical process include robust solutions for data loading, data transformation, data storage as appropriate (data marts, warehouses, gold copy, etc.) and data routing.

This plays well to the strengths of IT teams and core ETL/EDM toolsets. IT teams are usually happiest and most successful when working on projects that bring data and applications together into a coherent framework and a simplified topology. But IT teams are not usually, and arguably should not be, the best judges of what makes the data management process fit the business needs; surely that should be the Operations teams.

As these data management projects have grown lengthier and the price tags of EDM solutions more expensive, it is not uncommon for firms to claim that a refocused data management process, as described above, will also improve data quality, process control, operational risk, data lineage and ongoing data cost evaluation. The truth is that while these areas may take some benefit from a reworked process, they are not just functions of the physical data management architecture; they need their own analysis and business solutions. Let the business really get to grips with the data management process from a qualitative perspective and the benefits will really start coming through. In other words, don't just process the data, manage it to the benefit of the entire business.

EDM projects over the last five years or so have got us part of the way. Now we need to go back to the technology teams and solution vendors and start demanding solutions that deliver better control of the data management process, particularly better control over data quality from a process perspective. The benefits to the business of such a shift in emphasis are clear: better data governance, better risk control, and meaningful oversight of data vendors and suppliers, leading ultimately to operational efficiency and cost reduction.

So can the traditional EDM/ETL tools deliver the features and functions that the modern financial firm needs to move to this enhanced world of data management? Up to a point they can. In addition to their core functions of moving and storing data, they can offer some level of data cleansing and validation, configurable rules engines and some control over data provenance: in short, better physical control over the data process.
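To make the idea of a configurable rules engine concrete, here is a minimal sketch in Python. It is purely illustrative and not taken from any particular EDM product: the record fields (`isin`, `price`) and rule names are hypothetical examples of the kind of validation these toolsets apply as data is loaded.

```python
from dataclasses import dataclass
from typing import Callable

# A hypothetical data record: a security master row as a plain dict.
Record = dict

@dataclass
class ValidationRule:
    """A single configurable data-quality check."""
    name: str
    check: Callable[[Record], bool]   # returns True when the record passes
    message: str

def validate(record: Record, rules: list[ValidationRule]) -> list[str]:
    """Run every rule against a record and collect failure messages."""
    return [f"{r.name}: {r.message}" for r in rules if not r.check(record)]

# Illustrative rules for an instrument record.
rules = [
    ValidationRule("isin_present",
                   lambda rec: bool(rec.get("isin")),
                   "ISIN is missing"),
    ValidationRule("price_positive",
                   lambda rec: rec.get("price", 0) > 0,
                   "price must be positive"),
]

# A record with a missing ISIN fails the first rule but passes the second.
failures = validate({"isin": "", "price": 101.5}, rules)
```

The point of the sketch is that the rules are data, not hard-wired logic: adding or amending a check means editing the rule list, not the loading pipeline.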

What these tools are not typically good at, however, is giving control of that data management process to the Operations teams. What is needed are solutions that allow Operations teams to:
Build, enhance, test and tweak the data rules without constant reliance on IT.

Have control over the application of those rules: when to turn them on or off, or to ignore them. (Maybe we can't fix the data this week; maybe we are happy with the differences being detected, so we can ignore that issue until the data changes.)

Prioritise data issues around the business: work first on the issues that will have the biggest business impact.

Manage the lifecycle of an issue – who owns this issue, what happens next and how long does it take to fix?

Determine the impact of an issue – can I see where this data comes from and where it will make its biggest impact? Which clients are most affected?

Take control of data amendments. If I amend the data, can we track that amendment until the underlying data or supplier comes back in line?

Gauge whether the data is getting better or worse, or whether the same problems recur time and time again. How can management know which suppliers are best and measure the impact of decisions around data providers?
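The operational controls listed above can be sketched as a small data model. This is a hypothetical illustration, not any vendor's API: rules carry an enable/ignore toggle Operations can flip without IT, issues carry an owner, a status and a business-impact score used for prioritisation, and manual amendments are held as explicit overrides until the feed is corrected.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class DataRule:
    """A data-quality rule Operations can enable, disable or park without IT."""
    name: str
    enabled: bool = True
    ignore_until: Optional[date] = None  # park a known difference until the data changes

    def is_active(self, today: date) -> bool:
        if not self.enabled:
            return False
        return self.ignore_until is None or today >= self.ignore_until

@dataclass
class DataIssue:
    """Lifecycle of a detected issue: ownership, status and business priority."""
    rule: str
    owner: str
    impact: int           # e.g. number of affected clients, used for prioritisation
    status: str = "open"  # open -> in_progress -> resolved

def prioritise(issues: list[DataIssue]) -> list[DataIssue]:
    """Work the unresolved issues with the biggest business impact first."""
    return sorted((i for i in issues if i.status != "resolved"),
                  key=lambda i: i.impact, reverse=True)

@dataclass
class Amendment:
    """A manual override held in place until the supplier corrects the feed."""
    field_name: str
    value: object
    active: bool = True  # released once the underlying data comes back in line

# Illustrative usage: a parked rule and a prioritised issue backlog.
parked = DataRule("known_difference", ignore_until=date(2030, 1, 1))
backlog = prioritise([
    DataIssue("price_positive", "ops-team", impact=3),
    DataIssue("isin_present", "ops-team", impact=40),
    DataIssue("stale_rating", "ops-team", impact=99, status="resolved"),
])
```

The design choice the sketch makes is that toggles, ownership and impact are first-class attributes of the rule and the issue, so Operations adjust data by configuration and triage rather than by raising IT change requests.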

Providing solutions that help teams answer the questions above is what really determines whether a data management process is working and whether it is helping build a stronger and healthier business.

And why does the firm need a separate solution for managing the processing of structured data? Because only when Operations has full control over its core data functions can the firm move ahead with improving client satisfaction, internal processes, external vendor management, adherence to risk and regulatory initiatives, operational excellence and cost control.

There is also no reason why a firm should have only a single technology approach to data management, or why each firm should follow the same approach. Many firms will decide that they do not have the appetite to change the underlying plumbing of their data management process in order to achieve operational control of the data.

At a time when lower technology budgets are commonplace, the drive to generate efficiencies in the business is overriding and a rapid return on investment is a prerequisite for IT spend, reworking the physical data management process will become a less compelling approach to the problem. As noted above, maybe the process already works just fine from a physical perspective and it is only the operational control that needs to be enhanced. Again, let's not define data management by a catch-all EDM or super-ETL toolset. Firms should recognise that they need to solve more than one problem, and indeed different problems, under the same data management umbrella, and this will more likely than not take a best-of-breed solution, or a combination of solutions, to really build in the business process improvement they are seeking.

As most business functions evolve, particularly those involving technology components, history often records that initial solutions rarely deliver the entire answer, and that more granular analysis and a series of evolutionary solutions yield the best results in the long run. In that vein, just as the data management areas of financial firms are waking up to the new data management and best-of-breed data quality management solutions available to them, we can expect to see new approaches in the areas of data profiling, data lineage and data governance over the coming months.

At Sun Street we believe that a best-of-breed approach to data quality management can yield considerable business benefits for financial organisations. To discover more about the Curium approach, please get in touch.