This week I will begin a five-part series exploring data analysis and how it can be used by the Chief Compliance Officer (CCO) or compliance practitioner to support a best practices compliance program under the Foreign Corrupt Practices Act (FCPA), UK Bribery Act or other anti-corruption compliance regime. My partner in this exploration is Joe Oringel, a co-founder and Managing Director at Visual Risk IQ, a data analytics services firm, whom I interviewed for this series.

Today, we will focus on the basics of data analysis and how it differs from other forms of data testing, such as sampling and inspection of documents. Next, I will consider how to think through the use of data analysis and the COSO Framework. Then I will explore some of the ways Oringel and his team have used data analytics to assist companies in ways that are analogous to FCPA-based compliance programs. Additionally, Oringel and I recorded a three-part podcast series where we explored these issues in an interactive format. If you check out the podcasts, you will be eligible to receive an additional White Paper, at no cost, on the complete series and topic.

Being a recovering trial lawyer, I began with the basics: what are data analytics and data analysis? Oringel kept it simple, saying that it is merely using data to answer questions. He noted that such analysis predates computers; Sherlock Holmes, after all, became well known for using deductive reasoning to make determinations from data-based evidence. In the 21st-century business world, the best evidence we have as to whether something took place is most often digital. Oringel pointed to a variety of authoritative sources, which hold that modern data analysis is a process of inspecting, cleansing, transforming, and modeling data with the goals of highlighting useful information and supporting decision-making. In short, data analysis is answering a question with data.

Oringel next pointed to another set of definitions for data analysis, derived from Thomas Davenport, a well-known academic and author who teaches at Babson College. Davenport incorporates the notion of time to categorize data analytics as answering questions about the past, the present, or the future. Incorporating time into analytics focuses these efforts so you can build repeatable patterns into the questions that should be asked and answered.

Oringel, who has both academic and professional training as an internal auditor, said that external financial auditors, like the Big Four, usually focus on answering the question, “What has happened?” This is a focus on historical transactions, looking backward at how transactions were reported: What was recorded in the books and records of the company? How was the transaction recorded? Why was it recorded a certain way?

I next turned to the difference between data analysis and traditional internal auditing or sampling. Oringel believes this is the most significant change in technology in the last 25 to 30 years, driven by the advent of the personal computer and the associated spreadsheet and database software that allow auditors to base their conclusions not on a sample of data but on the entire population. He said, “In the late 1980s and early 1990s, the predominant technique that internal auditors used was sampling. If an audit was designed to vouch fixed assets, auditors would pick a sample of 25 or more fixed assets; re-compute, or test, the acquisition date and the disposition date; and finally re-compute depreciation by hand. If the fixed assets in our sample were properly recorded, then we looked up a statistical chart or table and concluded that we were sufficiently confident that all of the fixed assets at the company were properly stated.”

He further said “with today’s digital accounting software, every fixed asset can be downloaded and the depreciation re-computed based on the acquisition date and the disposition date and the various depreciation rules for each asset class. If there are any differences in the valuation of any asset, the differences can be found through data analysis. Data analysis allows a company’s auditor, whether internal or external, to re-compute or model the financial recording of transactions, as they ought to be recorded and, therefore, have even greater confidence than if they had tested using sampling. By analyzing every asset and related transaction, a company is able to test the entire population and be much more confident in the results. This has obvious implications for any FCPA audit as there is no materiality standard under the FCPA.”
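The population-level re-computation Oringel describes can be sketched in a few lines. This is a minimal illustration, not his firm's actual method; the straight-line depreciation rule, the field names, and the tolerance are all assumptions for the example:

```python
from datetime import date

# Hypothetical fixed-asset records as exported from an accounting system.
assets = [
    {"id": "FA-001", "cost": 12000.0, "salvage": 0.0, "life_years": 5,
     "acquired": date(2020, 1, 1), "recorded_depreciation": 7200.0},
    {"id": "FA-002", "cost": 8000.0, "salvage": 800.0, "life_years": 4,
     "acquired": date(2021, 7, 1), "recorded_depreciation": 5000.0},
]

def straight_line_depreciation(asset, as_of):
    """Re-compute accumulated straight-line depreciation as of a date."""
    years_held = (as_of - asset["acquired"]).days / 365.25
    annual = (asset["cost"] - asset["salvage"]) / asset["life_years"]
    return round(min(years_held, asset["life_years"]) * annual, 2)

def find_exceptions(assets, as_of, tolerance=5.0):
    """Test every asset in the population, not a sample, and flag differences."""
    exceptions = []
    for a in assets:
        expected = straight_line_depreciation(a, as_of)
        if abs(expected - a["recorded_depreciation"]) > tolerance:
            exceptions.append((a["id"], a["recorded_depreciation"], expected))
    return exceptions

print(find_exceptions(assets, date(2023, 1, 1)))
```

Running the same check against a company's full asset register replaces the 25-item sample with a test of the entire population, which is the shift in confidence Oringel describes.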

Data analytics can transition from a review of historical transactions to a review of current transactions simply by asking similar questions of similar data, but with a change in focus: answering “what is happening now and what should we do about it?” instead of merely “what has happened?” When your bank or credit card company puts a freeze on your card because of suspicious transactions, it is using data analysis as an alerting function. More sophisticated companies use these sorts of data analytics tools and processes as part of their compliance program for areas like monitoring for improper payments or identifying vendors who may match entities on a Denied Parties list.
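The denied-parties screen mentioned above can be sketched as a simple name-normalization match. Real screening tools use fuzzy matching and richer identity data; the names and the normalization rule here are invented for illustration:

```python
import re

# Hypothetical vendor master records and denied-parties list entries.
vendors = ["Acme Trading LLC", "Global Widgets, Inc.", "ACME TRADING L.L.C."]
denied_parties = ["Acme Trading LLC", "Shadow Imports Ltd"]

def normalize(name):
    """Lowercase, strip punctuation, and collapse whitespace for matching."""
    return re.sub(r"\s+", " ", re.sub(r"[^\w\s]", "", name.lower())).strip()

# Pre-normalize the denied list once for fast lookups.
denied_index = {normalize(d) for d in denied_parties}

def screen(vendors):
    """Return vendors whose normalized name matches a denied party."""
    return [v for v in vendors if normalize(v) in denied_index]

print(screen(vendors))
```

Even this crude normalization catches the punctuation and capitalization variants that defeat a naive exact-string comparison.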

This use of monitoring as an alerting task is a logical next step for compliance teams, but most are not there yet, for any number of reasons. The transition from data analytics as historical analysis to alerting through continual or continuous monitoring can be a challenge, and it is still an emerging best practice. Continual or continuous monitoring establishes these alerts and prompts us to take action based on something that happened only moments ago.

I asked Oringel if he could provide an example along the lines of the Department of Justice (DOJ) and Securities and Exchange Commission (SEC) jointly released FCPA Guidance, which says that the goal of a best practices compliance program should be to prevent, detect, and remedy matters before they become FCPA violations. He translated the FCPA Guidance into “stop, find and fix.” He believes the key is the time period from which you are pulling the data: if you are looking at transactions that happened six or nine months ago, then your analytics are serving as a reporting function. He gave an example in which a business development person entertained a government official yet did not seek preapproval to do so. Unfortunately, the amount spent was more than was allowed under the company’s Gifts and Entertainment Policy for entertaining a foreign official. Now the compliance function needs to fix that policy violation and make sure it does not happen again.
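That gifts and entertainment scenario reduces to a simple rule over transaction data. The threshold, the field names, and the records below are hypothetical; an actual policy engine would join T&E data with a preapproval log and guest classifications:

```python
# Assumed per-event spending limit for entertaining a foreign official.
THRESHOLD = 250.00

# Hypothetical T&E records already joined with a preapproval log.
expenses = [
    {"id": "E-101", "employee": "B. Dev", "guest_type": "government_official",
     "amount": 480.00, "preapproved": False},
    {"id": "E-102", "employee": "A. Sales", "guest_type": "customer",
     "amount": 600.00, "preapproved": False},
]

def flag_policy_violations(expenses):
    """Flag entertainment of officials over the limit without preapproval."""
    return [e["id"] for e in expenses
            if e["guest_type"] == "government_official"
            and e["amount"] > THRESHOLD
            and not e["preapproved"]]

print(flag_policy_violations(expenses))
```

Run daily against new transactions, a rule like this acts as an alert (“find”); run over last year's data, the same rule is merely a report of what has already happened.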

The next frontier for data analytics is a move from alerting to predictive analytics: using data analysis to answer not just questions about the past or present, but questions about what will likely happen in the future. While predictive analytics is common in many industries and processes, such as commercial lending and insurance, it is not at all common in compliance. Yet.

The “find” capability extends from the past to the present and into the future, and may be where the most advanced audit and compliance teams go next. This moves to an almost prescriptive posture, where, because you were able to predict, or have an insight, you are able to deliver a risk management solution to that potential situation going forward.

Oringel concluded by saying that it is this future orientation, with data analysis as a predictor, that he believes is the next step in the compliance function's use of data. A company can score high-risk employees in a business unit by identifying the salespeople who tend not to respect the organization’s T&E policies: those who spend too much on lavish meals or engage in other activities that contradict company policies, such as neglecting mandatory compliance training or simply being routinely late with expense report submissions.
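One minimal way to sketch that kind of scoring is a weighted sum over behavioral counts. The weights, the signals, and the employees below are all invented; in practice the weights would be tuned against past compliance outcomes rather than hand-picked:

```python
# Hypothetical per-employee counts drawn from T&E, training, and expense systems.
employees = [
    {"name": "Sales Rep A", "lavish_meals": 4, "missed_trainings": 1, "late_reports": 6},
    {"name": "Sales Rep B", "lavish_meals": 0, "missed_trainings": 0, "late_reports": 1},
]

# Assumed weights reflecting how strongly each signal suggests risk.
WEIGHTS = {"lavish_meals": 3, "missed_trainings": 2, "late_reports": 1}

def risk_score(emp):
    """Weighted sum of policy-deviation counts for one employee."""
    return sum(emp[signal] * weight for signal, weight in WEIGHTS.items())

# Rank the unit from highest to lowest risk for compliance follow-up.
ranked = sorted(employees, key=risk_score, reverse=True)
for e in ranked:
    print(e["name"], risk_score(e))
```

The output of such a model is not an accusation but a triage list: it tells the compliance function where a closer look is most likely to be worthwhile.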

Joe Oringel is a Managing Director at Visual Risk IQ, a risk advisory firm established in 2006 to help audit and compliance professionals see and understand their data. The firm has completed more than 100 successful data analytics and transaction monitoring engagements for clients across many industries, including Energy, Higher Education, Healthcare, and Financial Services, most often with a focus on compliance.

Joe has more than twenty-five years of experience in internal auditing, fraud detection, and forensics, including ten years of Big Four assurance and risk advisory services. His corporate roles included information security, compliance and internal auditing responsibilities in highly-regulated industries such as energy, pharmaceuticals, and financial services. He has a BS in Accounting from Louisiana State University, and an MBA from the Wharton School at the University of Pennsylvania.

This publication contains general information only and is based on the experiences and research of the author. The author is not, by means of this publication, rendering business, legal, or other professional advice or services. This publication is not a substitute for such legal advice or services, nor should it be used as a basis for any decision or action that may affect your business. Before making any decision or taking any action that may affect your business, you should consult a qualified legal advisor. The author, his affiliates, and related entities shall not be responsible for any loss sustained by any person or entity that relies on this publication. The author gives his permission to link, post, distribute, or reference this article for any lawful purpose, provided attribution is made to the author. The author can be reached at tfox@tfoxlaw.com.