By Tom Kimner, Head of Global Operations for Risk Management, SAS

Financial institutions’ risk data aggregation and reporting techniques and systems are receiving increased attention both internally and externally. Regulatory agencies have stated that it is critically important that institutions report holistically and accurately on their key risk indicators, exposures, assets, liabilities and related positions across the entire firm for all major risk areas.

For most financial institutions, the need to have strong overall data quality, management and reporting practices for risk and financial information is nothing new. Many risk and financial managers will recall that in the early 2000s, the banking industry came under scrutiny for having, in some cases, grossly insufficient controls and management practices in place around their financial data and reporting. In 2002, the Sarbanes-Oxley Act demanded that banks improve controls on financial reporting – even going so far as requiring that top management individually certify the accuracy of financial information, with the possibility of prison time for those who submit false or inaccurate financial statements.

After the financial crisis, the importance of data accuracy for risk and financial reporting took on increased significance, with regulators increasing requirements for stress testing and capital adequacy. Once again, common themes around gaps in data completeness and accuracy, issues with aggregation and consolidation of information, reporting errors, and deficiencies in controls and overall governance emerged. While regulators worked diligently to address capital adequacy issues with stress testing and other quantitative and qualitative reviews, they also began to look more closely at risk data aggregation.

According to the Basel Committee on Banking Supervision (BCBS), the term risk data aggregation means “defining, gathering and processing risk data according to the bank’s risk reporting requirements to enable the bank to measure its performance against its particular risk tolerance/appetite.” The committee adds that “this includes sorting, merging or breaking down sets of data.”

When the committee released BCBS 239: Principles for Effective Risk Data Aggregation and Risk Reporting in 2013, it established a number of foundational principles for all financial institutions to provide strong governance around their risk data and reporting. The outlined principles are being incorporated into local regulatory regimes in each of the participating jurisdictions, including the US, UK, Canada, China, Germany and many other nations. The principles are structured around four broad areas: governance, data aggregation, risk reporting and supervisory requirements.

Taking a comprehensive approach to BCBS principles and risk data aggregation and management

A technology solution that helps financial institutions address all four of the key principles of BCBS 239 will provide operational benefits far beyond regulatory compliance. Programmatic risk data management will lead to better decision making across your enterprise. Here are some best practices and capabilities to consider:

Governance: Banks should have process controls and end-to-end transparency of data lineage and quality rules, as well as change management and review controls. Also critical for regulatory compliance are full traceability and auditability, with fully documented report-generation rules and controls. Lastly, standard templates that are easily customizable can aid in meeting regional or local regulatory requirements for electronic filing.
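To make the traceability idea concrete, here is a minimal sketch of a lineage log that records each transformation step so a reported figure can be walked back to its source data. The class, method names and dataset identifiers are illustrative assumptions, not any vendor's API.

```python
import datetime

class LineageLog:
    """Records each transformation step so a report value can be traced
    back to its source datasets (traceability and auditability)."""

    def __init__(self):
        self.steps = []

    def record(self, step_name, inputs, output, rule):
        # Each entry documents the business rule applied and when.
        self.steps.append({
            "step": step_name,
            "inputs": inputs,          # source dataset identifiers
            "output": output,          # produced dataset identifier
            "rule": rule,              # documented business rule
            "timestamp": datetime.datetime.utcnow().isoformat(),
        })

    def trace(self, output):
        """Walk backwards from a report output to its sources."""
        chain, targets = [], {output}
        for step in reversed(self.steps):
            if step["output"] in targets:
                chain.append(step)
                targets |= set(step["inputs"])
        return list(reversed(chain))

log = LineageLog()
log.record("load", ["core_banking.positions"], "raw_positions", "daily extract")
log.record("aggregate", ["raw_positions"], "exposure_by_desk",
           "sum notional by desk")

# The full chain behind the report figure: load -> aggregate
assert [s["step"] for s in log.trace("exposure_by_desk")] == ["load", "aggregate"]
```

In a production system this metadata would live in a governed repository with change-management controls, but the principle is the same: every published number carries a documented path back to its inputs.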

Data aggregation: A high-performance risk engine can quickly aggregate positions and exposures and perform a variety of risk calculations across many portfolios (e.g., the banking and trading books). Consider in-memory aggregation and the ability to process data at the lowest level of granularity with visual and dynamic querying. With this speed and capacity, you should be able to interactively explore and analyze; drill up, down and across on the fly; instantly aggregate to different levels of detail; apply user-defined filters; and easily pivot, group, rank and sort.
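The drill-up idea can be sketched in a few lines: hold positions at the lowest level of granularity and roll them up to whatever level of detail a query needs. The field names (desk, book, notional) and figures below are assumptions for illustration only, not any particular engine's data model.

```python
from collections import defaultdict

# Position-level data kept at the lowest level of granularity.
positions = [
    {"desk": "Rates",  "book": "trading", "notional": 100.0},
    {"desk": "Rates",  "book": "banking", "notional": 250.0},
    {"desk": "Credit", "book": "trading", "notional": 75.0},
]

def aggregate(rows, keys, measure="notional"):
    """Roll positions up to the level of detail defined by `keys`."""
    totals = defaultdict(float)
    for row in rows:
        totals[tuple(row[k] for k in keys)] += row[measure]
    return dict(totals)

# Finest view: exposure by desk and book.
by_desk_book = aggregate(positions, ["desk", "book"])
assert by_desk_book[("Rates", "trading")] == 100.0

# Drill up: firm-wide exposure per desk.
assert aggregate(positions, ["desk"]) == {("Rates",): 350.0, ("Credit",): 75.0}
```

Because the raw positions are never pre-summarized away, the same data answers both the detailed and the aggregated query, which is what makes on-the-fly pivoting and drill-down possible.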

Supervisory requirements: To meet supervisory requirements, you’ll need advanced data management capabilities for business rules management and execution, data quality measurement, workflow traceability and data documentation. Data quality monitoring with operational and data quality metrics can be presented in a dashboard-style interface with access to all relevant information. What you want is a consistent approach to getting correct data where and when it is needed to gain increased confidence in the accuracy and timeliness of operational and business information.
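A rule-based data quality check of the kind described above might look like the sketch below: each business rule is evaluated against the data and reduced to a pass rate that a dashboard could display. The rule names, fields and sample records are hypothetical, chosen purely for illustration.

```python
# Sample records with deliberate quality problems (a missing exposure,
# a blank currency code).
records = [
    {"id": 1, "exposure": 125.0, "currency": "USD"},
    {"id": 2, "exposure": None,  "currency": "EUR"},
    {"id": 3, "exposure": -40.0, "currency": ""},
]

# Business rules expressed as simple predicates over a record.
rules = {
    "exposure_present": lambda r: r["exposure"] is not None,
    "currency_present": lambda r: bool(r["currency"]),
}

def quality_metrics(rows, rules):
    """Return the pass rate per rule, suitable for a monitoring dashboard."""
    return {
        name: sum(1 for r in rows if check(r)) / len(rows)
        for name, check in rules.items()
    }

metrics = quality_metrics(records, rules)
assert metrics["exposure_present"] == 2 / 3
assert metrics["currency_present"] == 2 / 3
```

Tracking these pass rates over time, rather than inspecting records ad hoc, is what turns data quality from a one-off cleanup into the consistent, monitored process that supervisors look for.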

Strong risk and financial data management is not just about complying with new regulatory requirements. Better, faster risk data aggregation and reporting processes are essential to compete successfully and avoid unnecessary regulatory and reputational hits. Banks need to ensure that their data is complete, accurate and trustworthy. This is not a luxury for financial institutions; it is a necessity.

Tom Kimner leads Risk Operations and Pre-sales for SAS Risk Research and Quantitative Solutions. He is responsible for executing the overall risk management strategy by leading and coordinating the division’s marketing and communications functions. Prior to joining SAS, Mr. Kimner spent the bulk of his career at Fannie Mae in various senior management roles, spearheading corporate initiatives to more effectively manage credit and financial risk. Mr. Kimner has testified before the Financial Services Committee of the U.S. House of Representatives and regularly speaks at industry risk conferences and SAS-hosted events.