FDIC Part 370: More motivation for a holistic data approach

by Linda Jamison, Product Manager, Wolters Kluwer

Published January 09, 2018

Amid the discussion around
large-scale, transformative regulatory initiatives post-Dodd-Frank, it’s easy
to lose sight of the fact that new regulatory requirements are also emerging
from other sources, further underlining the need for banks to have systems and
processes in place that are capable of accommodating change quickly, and on
multiple fronts.

The Federal Deposit Insurance Corporation’s (FDIC) 12 CFR Part 370 rule, finalized in late 2016 and due to be implemented by early 2020, is a good example. The rule aims to ensure account holders have
swift access to their insured deposits in the event of a bank failure, which
the FDIC notes is critical to public confidence in the banking system. It
dictates that covered institutions must have the technological capability to
rapidly and accurately determine depositor insurance payouts in failure cases
-- an ability that the FDIC will test by requiring institutions to
perform and report multiple insurance calculations.
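To make the required determination concrete, here is a minimal, hypothetical sketch of such a calculation: balances are aggregated per depositor and ownership category, and each aggregate is split at the standard $250,000 maximum deposit insurance amount. The function name, sample accounts and two-category model are illustrative assumptions only; the actual rule defines many more ownership categories and far more detailed record-keeping requirements.

```python
from collections import defaultdict

# Standard maximum deposit insurance amount per depositor, per
# ownership category. The categories and data below are a
# hypothetical simplification of the actual Part 370 requirements.
LIMIT = 250_000

def insurance_determination(accounts):
    """Aggregate balances by (owner, ownership category) and split
    each aggregate into insured and uninsured portions."""
    totals = defaultdict(float)
    for owner, category, balance in accounts:
        totals[(owner, category)] += balance
    return {
        key: (min(total, LIMIT), max(total - LIMIT, 0.0))
        for key, total in totals.items()
    }

accounts = [
    ("alice", "single", 180_000.00),  # checking
    ("alice", "single", 120_000.00),  # savings, same category
    ("alice", "ira",     90_000.00),  # separate ownership category
]
print(insurance_determination(accounts))
# alice's two "single" accounts aggregate to 300,000: 250,000 insured,
# 50,000 uninsured; the IRA is insured separately and in full.
```

Even this toy version shows why complete, linked customer records matter: the split depends on correctly grouping every account that belongs to the same owner and category.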

FDIC Part 370
will apply to all insured institutions with two million or more deposit
accounts, meaning nearly 40 could be affected, the largest of which has a
staggering 87 million accounts. Like many other regulatory requirements
that have emerged in recent years, it is likely to involve the collection,
processing and reporting of data not captured in current or historical
regulatory reports. Underlying documentation must also support the
insurance coverage calculations, which puts the onus on banks to ensure
that materials such as customer signature cards are complete and up to date.

The picture will be complicated further by the fact that in some
cases, the required data or materials will not rest within the
institution. Collective or master accounts such as mortgage escrow
accounts, for example, may bundle deposits from thousands of customers
from which underlying details will be required to accurately process
insurance payments. Banks that have been involved in acquisitions are
also likely to struggle to gather records or information on customers
they have effectively inherited.
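The sketch below illustrates, with hypothetical names and data, why the per-beneficiary detail behind a master account matters: each beneficiary's escrow share is folded into that owner's other deposits in the same ownership category before the assumed $250,000 limit is applied. Without the sub-ledger breakdown, none of these per-owner figures could be computed.

```python
# Assumed per-depositor, per-category limit (illustrative; real
# pass-through coverage rules are considerably more involved).
LIMIT = 250_000

def merge_escrow(own_deposits, escrow_subledger):
    """Add each beneficiary's escrow share to their own deposits in
    the same ownership category, then split each owner's total into
    insured and uninsured portions at the limit."""
    combined = dict(own_deposits)
    for owner, share in escrow_subledger.items():
        combined[owner] = combined.get(owner, 0.0) + share
    return {owner: (min(total, LIMIT), max(total - LIMIT, 0.0))
            for owner, total in combined.items()}

# A mortgage-servicer escrow account bundling two borrowers' funds:
escrow = {"bob": 12_000.0, "carol": 8_000.0}
deposits = {"bob": 245_000.0}  # bob's own checking balance
print(merge_escrow(deposits, escrow))
# bob: 257,000 total -> 250,000 insured, 7,000 uninsured
# carol: 8,000 total -> fully insured
```

Note that bob's escrow share tips his aggregate over the limit, a result invisible to a bank that holds only the master account's total balance.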

The high price of failure

The stakes are high: failing to perform calculations adequately, or
being unable to provide the documents and details needed to satisfy
the FDIC, could lead to customers losing insurance coverage in the
event of a bank collapse. This makes it critical that
banks put procedures in place that will enable them to connect with
end-customers to obtain any necessary information quickly and efficiently,
in a way that minimizes the impact on customer relationships. Ownership
of this process will vary from bank to bank, but we recommend a blended
team under a single leader, as relevant deposits may be spread among
several business lines, including retail, commercial, mortgage, and
A&L. In essence, banks will need to draw on those who know the
customers best.

On the technology side, FDIC Part 370 underlines
yet again the need for a foundation where data from various departments
can be stored, processed and reused for regulatory reporting, as well
as other functions. Ideally, the bank will have a flexible reporting
solution in place that can be tweaked or enhanced to perform the complex
calculations demanded by FDIC Part 370, eliminating the need to build or
implement a specific solution from scratch to address the rule’s
requirements.

A lack of system or solution integration and
flexibility means any new regulation will trigger yet another round of
complex change management, challenging already strained plans and
resources. Rather than trying to respond to every emerging requirement in
an ad hoc manner, constantly building out systems and processes to fit
new purposes, banks should have a regulatory architecture in place that
can be easily extended to address new demands. At the heart of this
framework should be an integrated, transparent repository for the
consistent and efficient aggregation of financial instrument and
customer data, allowing easy consumption to address various and
ever-changing regulatory requirements.

Developing an end-to-end view of data

A more holistic view of data throughout the
processes of acquisition, aggregation, calculation and classification not
only enhances accuracy, but also creates the audit trail that is needed
to effectively meet the requirements of regulators like the FDIC, who
are increasingly demanding the supporting information and calculations
that underpin regulatory reports. Visibility of data flows throughout
the organization can also serve strategic goals, by highlighting areas
of duplication or inefficiency, or uncovering trends that can serve as a
basis for decision-making. FDIC Part 370 is further impetus for a new
approach to data, one that forward-looking institutions will adopt as
they aim to become more resilient to regulatory change.