Basel III and Dodd-Frank are placing new requirements on liquidity management that banks can address if they take an enterprise-wide approach to data and analytics.

Liquidity risk is moving to the forefront of the regulatory agenda, and as a result, financial institutions, especially banks, are faced with a new compliance and business reality. While final details remain in flux, there is no question that Basel III and Dodd-Frank will bring more structured and stringent liquidity requirements, as well as greater compliance complexity, to the industry.

Traditionally, regulations have largely shaped financial institutions’ risk management programs. Today, however, progressive firms, focused on more proactive management of their businesses, are beginning to look to risk management, including liquidity risk management, as a strategic tool that can deliver valuable and timely views of costs and profitability.

Banks face two distinct sets of challenges when addressing liquidity risk requirements. First, they must work to achieve compliance – setting up the operational infrastructure and policies that support ratio calculation, capital realignment, and reporting requirements.

The second, and arguably more daunting, challenge is optimizing liquidity. Under the liquidity coverage ratio (LCR) requirement, the stock of high-quality assets is tightly defined, and one can expect a sustained escalation in the price of high-quality government and corporate bonds as demand for these asset classes increases. With the cost of liquidity steadily rising, banks are seeking ways to manage liquidity more efficiently, given the various constraints emanating from the regulatory landscape and market variability.
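As a rough illustration of the LCR arithmetic, the sketch below computes the ratio as the stock of high-quality liquid assets (HQLA), after haircuts, divided by net cash outflows over a 30-day stress horizon. The balances, haircuts, and run-off rates are illustrative placeholders, not the Basel-prescribed values; the 75 percent cap on inflows, however, is part of the Basel III framework.

```python
# Simplified, illustrative LCR calculation.
# LCR = stock of high-quality liquid assets (HQLA)
#       / total net cash outflows over the next 30 calendar days.
# Haircuts and run-off rates below are placeholders, not Basel-prescribed values.

def lcr(hqla_positions, outflows, inflows):
    """hqla_positions: list of (market_value, haircut) tuples.
    outflows / inflows: lists of (balance, rate) tuples."""
    hqla = sum(mv * (1 - haircut) for mv, haircut in hqla_positions)
    total_outflows = sum(bal * rate for bal, rate in outflows)
    # Basel III caps recognized inflows at 75% of gross outflows.
    total_inflows = min(sum(bal * rate for bal, rate in inflows),
                        0.75 * total_outflows)
    return hqla / (total_outflows - total_inflows)

ratio = lcr(
    hqla_positions=[(800.0, 0.0), (300.0, 0.15)],   # e.g. sovereigns, covered bonds
    outflows=[(2000.0, 0.10), (1500.0, 0.25)],      # e.g. retail deposits, wholesale funding
    inflows=[(500.0, 0.50)],
)
print(f"LCR = {ratio:.1%}")  # a ratio of at least 100% indicates compliance
```

A production calculation would, of course, pull position-level data from across the enterprise and apply the regulator-specified parameters, which is precisely why the unified data foundation discussed below matters.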

While most financial institutions are focused, first and foremost, on building a foundation for achieving and managing their LCRs and net stable funding ratios (NSFRs), it is important to avoid a “quick fix” approach when creating a platform to manage liquidity risk. Instead, banks should also look ahead and begin to plan for future optimization needs. Liquidity risk management, and ultimately optimization, requires a flexible IT infrastructure that can adapt to an ever-changing regulatory landscape and evolving business objectives.

To accurately calculate liquidity risk, as well as model options for how best to optimize it, banks must effectively integrate the liquidity risk framework with the governance of other risks, including credit and market risk, via a unified data model and enterprise analytical platform. Many firms, however, retain a stove-piped environment filled with point solutions that preclude enterprise-wide visibility and hinder data consistency, traceability, and availability.

In addition to adopting an enterprise-platform approach, banks should weigh the following questions to create an analytical environment that not only facilitates regulatory compliance and lays a solid foundation for future liquidity optimization, but also enhances the bank’s overall profitability.

• Does the bank have full and timely visibility into the balance sheet and all cash flows? A financial institution cannot meaningfully manage or optimize risk until it has visibility into its entire balance sheet – spanning retail, commercial, and wholesale operations.

• Does the bank have visibility into contingent exposures, including off-balance-sheet exposures? Tier one banks with significant collateralized off-balance-sheet exposures realize that increased market volatility will create additional margin requirements, a major drain on funding. Banks need to know, at the most granular level, what exposures they have on and off the balance sheet, by product, currency, counterparty, and more.

• Can the bank rapidly create liquidity risk stress testing scenarios? Stress testing is here to stay, and banks can count on it to be more rigorous and frequent, with regulators specifying much faster turnarounds. As such, financial institutions require analytical solutions that enable them to rapidly define scenarios, adapt them as needed, and store them for reuse.

• Can the solution estimate liquidity gaps under business-as-usual (BAU) conditions, as well as under various extreme-condition scenarios? Banks are looking to define specific BAU and shock conditions at various levels of granularity and then apply them to contractual cash flows, either as absolute values or as percentages, to estimate the liquidity gap under normal and extreme conditions. A bank’s risk management framework should support a wide range of assumptions and provide the flexibility to model and apply multiple shocks of varying magnitude to the BAU assumptions. Further, it should support comparison between scenarios to identify the most plausible conditions to use when making strategic decisions.

• Can the solution calculate liquidity ratios and funding concentrations enterprise-wide and for each currency, product, and counterparty? A liquidity risk management solution should give financial institutions the flexibility to calculate the LCR and NSFR using their specified parameters for liquidity horizon, liquidity haircuts, and funding factors to support compliance.

• Does the solution enable the bank to model and create “what if” scenarios and analyze various options to address liquidity shortfalls? Banks require the ability to model and create “what if” counterbalancing strategies to combat liquidity gaps under normal and stressed business conditions. With the ability to define multiple counterbalancing strategies on the same baseline and stressed liquidity gap reports and analyze them across multiple dimensions, banks can readily identify and adopt the optimal course of action as part of their liquidity management strategy.

• Does the bank’s liquidity risk management framework create a foundation for advanced analytics that will enable the bank to continually optimize liquidity under normal and stressed conditions? Flexibility, scalability, and a unified data model are essential to future optimization initiatives. To optimize liquidity options, organizations must be able to collate a wide spectrum of relationship data and process it rapidly to yield a set of strategies that reflect the dynamics of the current market environment.
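A minimal sketch of three of the capabilities above (reusable stress scenarios, BAU versus shocked liquidity gaps, and a “what if” counterbalancing overlay) might look like the following. All bucket boundaries, shock percentages, and cash-flow figures are hypothetical assumptions for illustration only.

```python
# Illustrative liquidity-gap model. Contractual net cash flows are grouped into
# time buckets; scenarios apply percentage shocks to those flows; a "what if"
# counterbalancing action is overlaid on the stressed gap. All figures are
# hypothetical.

BUCKETS = ["0-7d", "8-30d", "31-90d"]

# Net contractual cash flows per bucket under business-as-usual (BAU) conditions.
bau_flows = {"0-7d": -120.0, "8-30d": 40.0, "31-90d": 150.0}

# Scenarios are stored as data so they can be adapted and reused: a shock of
# 0.5 worsens that bucket's net flow by 50% of its magnitude.
scenarios = {
    "BAU": {},
    "deposit_run": {"0-7d": 0.50, "8-30d": 0.30},
}

def shocked_flows(flows, shocks):
    """Apply percentage deteriorations to the BAU cash flows."""
    return {b: f - abs(f) * shocks.get(b, 0.0) for b, f in flows.items()}

def cumulative_gap(flows):
    """Cumulative liquidity gap, bucket by bucket, in maturity order."""
    gaps, running = {}, 0.0
    for b in BUCKETS:
        running += flows[b]
        gaps[b] = running
    return gaps

def counterbalance(flows, bucket, amount):
    """Overlay a counterbalancing action, e.g. repo-ing unencumbered bonds."""
    adjusted = dict(flows)
    adjusted[bucket] += amount
    return adjusted

for name, shocks in scenarios.items():
    print(name, cumulative_gap(shocked_flows(bau_flows, shocks)))

# "What if": raise 100 via repo in the first bucket under the stressed scenario.
stressed = shocked_flows(bau_flows, scenarios["deposit_run"])
print("deposit_run + repo", cumulative_gap(counterbalance(stressed, "0-7d", 100.0)))
```

Comparing the cumulative gap across scenarios, and across counterbalancing overlays on the same stressed baseline, is exactly the comparison the framework described above needs to support; a production implementation would operate on position-level data feeds rather than hand-entered buckets.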

When it comes to liquidity risk, the stakes are high for today’s financial institutions. It is important that they create an infrastructure that not only meets short-term compliance requirements but also lays a foundation for future needs, including additional regulatory requirements, as well as the compelling desire to optimize liquidity management.