Data Quality Management – What does Best Practice look like?

Data quality is key within Investment Management and financial services, as indeed it is in most industries. Because so much Investment Management data feeds directly into an organisation’s decision-making processes, data errors can translate directly into decision and execution errors. In an ideal world, therefore, an organisation should have 100% confidence that its data is correct, complete and fit for purpose.

At the extreme, what issues can incorrect data cause?

Incorrect trading decisions

Financial loss – both from incorrect decision making and from the cost of remediation

Obviously, these can have a major impact on an organisation’s efficiency and financial stability. With this level of criticality attached to data quality, does your organisation have the requisite systems and processes in place to answer the question: “Are we confident in the quality of our key data?”

Clearly, in an ideal world we want to get to a point of having 100% accurate data. But how do we get there and how do we know that we have got there?

The following are some of the facets of a data quality management (DQM) environment that can help in approaching DQM nirvana – or at least something close to best practice.

Identify what data checks need to be in place, including the types of data to be checked and the frequency of checking

Get some wins on the board by implementing high priority items early

Automate the data checks as far as possible, preferably on an event-driven basis, e.g. when new data arrives or existing data changes

Record data quality (DQ) exceptions using a mechanism that enforces ownership assignment, tracking, escalation capability and oversight of the issues. Workflow and dashboard capabilities will assist immensely in this regard

Provide tools that assist with analysis and resolution of DQ exceptions. This may include enquiries over associated data and the ability to issue data corrections with suitable authorisation levels where appropriate

Resolve the issues and record their root cause

Recheck automatically to ensure issues have been resolved

Perform trend analysis to understand why and how issues are occurring and then determine ways to fix the root cause of the problems – whether this be more timely data checks, changes to the data architecture or perhaps a review of data vendors
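The core of the workflow above – automated checks, exceptions with enforced ownership, resolution with a recorded root cause, and an automatic recheck – can be sketched in a few lines of Python. This is an illustrative sketch only; the names (`DQException`, `run_checks`, the `price_is_positive` check) are hypothetical and not taken from any particular product.

```python
from dataclasses import dataclass

@dataclass
class DQException:
    """A data quality exception with enforced ownership and status tracking."""
    check_name: str
    record_id: str
    detail: str
    owner: str            # ownership is mandatory: every exception is assigned
    status: str = "open"
    root_cause: str = ""

def price_is_positive(record):
    """A simple validity check: a price must be present and positive."""
    return record.get("price") is not None and record["price"] > 0

# Registry of automated checks; in practice this would be data-driven.
CHECKS = {"price_is_positive": price_is_positive}

def run_checks(records, owner="data-ops"):
    """Event-driven entry point: run on arrival or change of data."""
    exceptions = []
    for rec in records:
        for name, check in CHECKS.items():
            if not check(rec):
                exceptions.append(
                    DQException(name, rec["id"], f"failed {name}", owner))
    return exceptions

def resolve(exc, corrected_record, root_cause):
    """Resolve an exception, record its root cause, and recheck automatically."""
    if CHECKS[exc.check_name](corrected_record):
        exc.status = "resolved"
        exc.root_cause = root_cause
    return exc.status
```

A workflow or dashboard layer would then sit over the list of `DQException` objects to provide the tracking, escalation and oversight described above.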

The overall aim of a DQM environment is to ensure that data is, at any point in time, of the highest quality appropriate to the data set in question. To achieve this, it is important to quantify what data quality exceptions exist and how they trend over time. Once the DQM environment is understood, it can be analysed and remediated, and it can be measured going forward to ensure a commensurate improvement in data services.
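Quantifying exceptions and their historical trends can be as simple as grouping the exception history by period and by check. A minimal sketch, assuming each exception record carries a month and a check name (field names here are illustrative, not a real API):

```python
from collections import Counter

def exception_trend(history, key):
    """Count exceptions grouped by the given key, e.g. 'month' or 'check'."""
    return Counter(item[key] for item in history)

# Hypothetical exception history accumulated by the DQM process.
history = [
    {"month": "2024-01", "check": "price_is_positive"},
    {"month": "2024-01", "check": "missing_identifier"},
    {"month": "2024-02", "check": "price_is_positive"},
]

by_month = exception_trend(history, "month")  # volume trend over time
by_check = exception_trend(history, "check")  # which checks fail most often
```

Grouping by check highlights recurring root causes (a candidate for more timely checks, architecture changes or a vendor review), while grouping by month shows whether remediation is actually reducing exception volumes.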

Whilst ‘best practice’ suggests that having a measurable DQM process in place is highly desirable, the evolving regulatory environment is also accentuating the need to incorporate such a process sooner rather than later.

So, what does a good DQ business process look like from the user’s perspective? Being able to look at your DQ dashboard for an up-to-the-minute status on data quality, with confidence that data is being monitored and is fit for its intended purpose, is certainly a reasonable expectation. Can you do this now?

This article touches on the elements of Data Quality Management. DQM should also be viewed within the broader context of data governance – including ownership, lineage, provenance and data exploration – all of which assist in achieving data quality.

CuriumDQM, which is part of the Curium product suite, provides a purpose-built DQM process control solution that can assist firms in establishing best practice over their data environments and in providing the levels of confidence necessary to meet their operational and regulatory data quality needs.