DPM data dictionary

The Data Point Model (DPM) is a data dictionary encompassing the harmonised data requirements developed by the EBA and included in its Technical Standards and Guidelines. To achieve more complete data harmonisation, the DPM may also include other data definitions related to EBA regulatory data.

The DPM data contents

Why is the DPM needed?

The DPM enables the harmonisation of the banking regulatory framework, by providing a clear interpretation of data exchange requirements to all relevant stakeholders.

The DPM data dictionary accommodates changes and new regulatory developments. New data requirements or new definitions can therefore be added, while all previous data requirement frameworks remain traceable.

The DPM bridges the communication gap between business and IT by providing a common platform of understanding. The business concepts are specified in the DPM according to formal rules, as required by IT specialists, while still being manageable by policy experts and other data users.

The DPM provides the metadata support to fully automate the production of data exchange specifications, such as XBRL taxonomies, or other equivalent exchange formats.

The DPM can be viewed from three distinct aspects: the content, the process, and the supporting data structure.

What is the DPM content?

The DPM contains information about what data should be reported and how it should be exchanged. It includes:

A common core vocabulary of elementary financial and prudential concepts, arranged in different categories and ready to be used in the definition of the regulatory data;

A complete set of data points that are part of the regulatory frameworks, described by a distinct combination of elementary concepts of the core vocabulary;

The reporting templates, their rendering information, the mapping of template cells with the data points, and the agreed validation rules. Additional metadata required for automating the generation of data exchange formats (XBRL taxonomies) is also included.
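The idea of a data point as a distinct combination of elementary concepts can be illustrated with a short sketch. All dimension and member names below are hypothetical, chosen only to show the structure; they are not actual EBA vocabulary codes.

```python
# A DPM-style data point modelled as an immutable set of
# (Dimension, Member) pairs drawn from a core vocabulary.
# Dimension/member names are illustrative, not real EBA codes.

dp = frozenset({
    ("MetricType", "CarryingAmount"),
    ("BaseItem", "Loans"),
    ("Counterparty", "CreditInstitutions"),
})

# Because frozensets are hashable, data points can be collected in a set,
# which makes duplicates across a framework immediately detectable:
framework = {dp}
```

Representing a data point as a set (rather than an ordered list) reflects that its meaning is given by the combination of concepts, not by any particular ordering of them.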

The DPM does not contain information about institutions and their reporting obligations, which are the subject of a complementary Master Data Management system.

How is the DPM created?

The DPM is the result of a systematic modelling process with embedded automated checks, which translates each piece of regulation into a formal data dictionary ready to be used by electronic information systems, or directly accessed by experts on banking data.

The core part of the process is performed by policy experts who take as input the designed templates, instructions and the legal texts, and produce a formal description of the complete set of involved data points. The process starts with the modelling of each template row and column (and sheet, if applicable), followed by the assignment of a specific dimensional categorisation – a set of pairs of elementary concepts (Members) from different categories (Dimensions) that fully describes the concept associated with the row/column/sheet.

The full categorisation of a data point is obtained by combining the categorisations of the cell position coordinates in an automated process that generates the dimensional categorisation for each individual cell (data point), and checks the consistency and uniqueness of each data point definition. The modelling process must iterate until the DPM is complete, consistent and free of errors. The XBRL taxonomies can then be generated by an IT process, taking the DPM as an input.
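The combination step described above can be sketched in a few lines: each cell inherits the union of its row and column categorisations, with a consistency check (no dimension may be assigned two different members) and a uniqueness check (no two cells may define the same data point). The row/column codes and dimension names are purely illustrative.

```python
# Hypothetical sketch of combining row and column categorisations
# into per-cell data point definitions. Codes are illustrative only.

rows = {
    "r010": {("BaseItem", "Loans")},
    "r020": {("BaseItem", "DebtSecurities")},
}
cols = {
    "c010": {("MetricType", "CarryingAmount")},
    "c020": {("MetricType", "GrossCarryingAmount")},
}

data_points = {}
for r, rcat in rows.items():
    for c, ccat in cols.items():
        combined = rcat | ccat  # union of row and column categorisations
        # consistency: a dimension must not receive two different members
        dims = [d for d, _ in combined]
        assert len(dims) == len(set(dims)), f"conflicting members in {r}/{c}"
        data_points[(r, c)] = frozenset(combined)

# uniqueness: every cell must map to a distinct data point definition
assert len(set(data_points.values())) == len(data_points)
```

In the real process these checks run over full templates with many dimensions, but the principle is the same: the iteration stops only when every cell has a consistent and unique definition.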

How is the DPM structured?

The results of the modelling process are stored in the DPM database, which has a formal metamodel structure implemented as a relational database, agnostic to any particular financial or regulatory technology and available on the EBA website as a free public good.

The DPM database can be incorporated as a component of an IT platform, thus fully leveraging the possibilities of integration between all the institutions involved in the reporting, analysis and disclosure of banking data.

What is the DPM lifecycle?

The DPM lifecycle is supported by a set of ongoing processes organised in projects to deliver each new DPM release.

The EBA staff has been developing the DPM data dictionary in cooperation with banking and policy experts appointed by national competent authorities of the EEA countries, who regularly participate in the EBA committees and working groups.

The development of the DPM started in 2012, with its first publication in database format in 2013. Since then, the DPM has been extended by successive versions following the evolution of regulatory data requirements, amended by the decisions of the Single Rulebook Q&A process, and refined by the systematic feedback received from the Authorities' experience of the data harmonisation process.

Who is involved in the DPM?

Regulators, supervisors, financial institutions, service providers, other organisations, and the general public have access to the DPM data dictionary, and can use it as the common repository of clear and structured specifications of the data referred to in Banking Regulation.

The DPM serves, in different ways, both policy and IT experts involved in the regulatory data processing chain, from data exchange harmonisation to data analysis and dissemination, streamlining the processes that deal with regulatory data within and across institutions.

What is the DPM metamodel?

The DPM database is designed as a metamodel in order to ensure the necessary flexibility for extending the EBA Technical Standards and Guidelines to new requirements. The DPM revisions and extensions are implemented by adding new metadata content to the database, without the need for any change at the level of its structure. The metamodel also enables the definition of logical constraints on metadata content, and the implementation of automatic consistency checks, which are used to ensure the quality of the model.
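The metamodel principle (new releases add rows, never columns or tables) can be illustrated with a deliberately simplified two-table schema. The table and column names below are an assumption for illustration; the actual DPM database schema is larger and differs in detail.

```python
# Minimal sketch of the metamodel idea: regulatory concepts are stored as
# *rows* of generic tables, so a new framework release only inserts data
# and never alters the schema. Names are illustrative, not the real schema.

import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE Dimension (DimensionCode TEXT PRIMARY KEY, Label TEXT);
    CREATE TABLE Member (
        MemberCode    TEXT PRIMARY KEY,
        DimensionCode TEXT NOT NULL REFERENCES Dimension(DimensionCode),
        Label         TEXT
    );
""")

# First release of a (fictional) framework:
con.execute("INSERT INTO Dimension VALUES ('BAS', 'Base item')")
con.execute("INSERT INTO Member VALUES ('x1', 'BAS', 'Loans')")

# A later release extends the dictionary with new content only;
# the schema above is left untouched:
con.execute("INSERT INTO Member VALUES ('x2', 'BAS', 'Debt securities')")

count = con.execute("SELECT COUNT(*) FROM Member").fetchone()[0]
```

The foreign-key relationship stands in for the logical constraints mentioned above: validity rules live in the schema once, and every subsequent metadata release is checked against them automatically.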

The complete structure of the DPM database includes a large number of tables and relationships, which are needed in different contexts. However, a first approach to the DPM can be made easier by focusing just on the core part of the metamodel, which comprises only a relatively small number of concepts.