
Speed Up Your Audit Team's Performance with Data Analytics

Data is doubling in size every two years. Experts estimate that by 2020, the world will reach 44 zettabytes of data. That’s 44 trillion gigabytes of data that we create and copy annually.

And the world is doing its best to capitalize on all this data. No human could pull together and analyze it all, so we turn to computers to make the data useful.

Given the talents and skills that auditors possess (analyzing data, spotting trends, forming conclusions), auditors are in a perfect position to be part of data analytics innovation in a company. Compared with the zettabytes of information out there, what an auditor can analyze is comparable to a grain of sand on the beach. But, who cares? Small granules turn into a huge beach. So let's get started!

This article proposes a plan to fill in the gaps and implement data analytics in the business.

Step 1. Define the objective

Figuring out what to analyze is much like creating an audit plan. Like an audit plan, the analysis of data will only be as good as the objective you set.

So, define the objective or business questions that you want to solve. It’s important that the objective is specific and correct. Part of defining the objective is defining specific elements to gather for analysis. It’s important to include the extended team as you discuss elements that will be part of the analysis or part of an algorithm that gets built. Discuss the idea with data owners, the IT department, and others who may need to be part of the data gathering process.

One myth behind analytics is that there is no bias in data analytics because computers don't lie. However, because humans build the analysis, it's easy for specific elements to be left out of the analysis or the algorithm being created. A simple example of algorithm bias was when a study revealed that Google's advertising system showed an ad for high-income jobs to men more often than it showed the same ad to women. Although the person creating the algorithm may not have held this specific bias, the elements that went into the algorithm created it. So consider every element, how it will be analyzed, and how the data will feed into any algorithms you create.

Considering the data elements that go into the analysis means learning all the systems, storage locations, data owners, data formats, and file requirements involved. You will need a method for extracting this information (and making sure it is complete), cleaning it, and organizing it into a machine-readable format.
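As a rough sketch of what "cleaning and organizing into a machine-readable format" can look like in practice, the snippet below normalizes a hypothetical invoice extract with mixed date formats and text amounts. The field names and source formats are illustrative assumptions, not taken from any specific system.

```python
import csv
import io
from datetime import datetime

# Hypothetical raw extract: mixed date formats, amounts stored as text
# with thousands separators. Real extracts vary by source system.
RAW_EXTRACT = """invoice_id,invoice_date,amount
INV-001,2019-01-15,1200.50
INV-002,15/01/2019,980.00
INV-003,2019-02-03,"1,450.75"
"""

def clean_row(row):
    """Normalize one record into machine-readable types."""
    parsed = None
    for fmt in ("%Y-%m-%d", "%d/%m/%Y"):  # the known source date formats
        try:
            parsed = datetime.strptime(row["invoice_date"], fmt).date()
            break
        except ValueError:
            continue  # unparseable dates stay None and can be flagged later
    amount = float(row["amount"].replace(",", ""))  # strip thousands separators
    return {"invoice_id": row["invoice_id"],
            "invoice_date": parsed,
            "amount": amount}

rows = [clean_row(r) for r in csv.DictReader(io.StringIO(RAW_EXTRACT))]
```

Once every record carries consistent types, the same cleaning script can be rerun on each new extract, which is what makes the data-gathering process repeatable rather than a one-off effort.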

In defining the specifics of your objective, you end up with the following deliverables:

List of data sources

Analytics requirements document

Proposed timeline and plan for resources

This information needs to be in a viewable format so it can be presented to your peers, the extended team, and to the company management.

Step 2. Collect and validate data

Once step 1 is completed, documented, reviewed, and approved, you can move on to the next step of collecting the data. Data collection is a very specific and organized effort.

Having complete and perfect data is another myth. Is there any perfect data? Inevitably, something will be missing. You need a process for requesting the data, storing the data, making sure the data is complete and accurate, and cleaning the data as needed. Auditors are used to this. When large samples are gathered, specific elements can be missing. For this step, there must be a process for parsing out these missing elements and figuring out how these outliers fit into the overall analysis and any algorithm created.
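One way to make that parsing-out process concrete is to partition incoming records into complete records and exceptions for follow-up, rather than silently dropping anything. This is only a minimal sketch; the required field names are hypothetical.

```python
# Required elements for each record; illustrative names only.
REQUIRED_FIELDS = ("employee_id", "expense_date", "amount")

def split_complete(records):
    """Partition records into (complete, exceptions) so gaps are
    reviewed explicitly instead of disappearing from the analysis."""
    complete, exceptions = [], []
    for rec in records:
        missing = [f for f in REQUIRED_FIELDS if not rec.get(f)]
        if missing:
            exceptions.append({"record": rec, "missing": missing})
        else:
            complete.append(rec)
    return complete, exceptions

sample = [
    {"employee_id": "E1", "expense_date": "2019-03-01", "amount": 45.00},
    {"employee_id": "E2", "expense_date": "", "amount": 12.50},  # gap: no date
]
ok, gaps = split_complete(sample)
```

Keeping the exceptions list alongside the clean data gives you an audit trail for how outliers were treated, which matters when you later defend the completeness of the analysis.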

The outcome of step 2 is a process for requesting and storing data, and a process for identifying gaps in the data received (or not received).


Step 3. Gather insight

By the third step, you should have completed the following items:

You have the exact objective or question that you want to answer

You have determined the time and expense required to carry out the analysis to meet your objective

You have developed repositories and processes for identifying gaps

Now you get to test the data.

Staying true to the initial objective, perform analytic tests on the data. With experience, analytics becomes more sophisticated. As those with a knack for analysis and IT tools get used to the tools on hand (like Microsoft Excel or Access), they will move on to more complex tools like ACL or IDEA. Eventually, analysis that starts small can reap huge rewards as the auditor becomes more adept at creating a solid analysis or algorithm.

The right analytic software will integrate with and handle large data sets efficiently and will be somewhat easy to use. Query tools (like SAP or Oracle), auditing software tools (like SAS, IDEA, or ACL) and even interactive data analytics visualization software (like Tableau), all accomplish different purposes in data analytics. Determine what you need first, and then shop for a tool that will integrate easily with your needs.

As you finalize the analysis approach, develop test scripts and queries, and run those scripts, you can analyze your results. Then go back and test again. It's important to start with a sample first and get the algorithm correct before trying it on the full population. Biases may surface in the testing and analysis stage. Since humans are imperfect and machines learn from humans, there will always be room for improvement in machine learning, algorithms, and data analysis.
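The sample-first discipline can be sketched with a classic audit analytic: flagging potential duplicate payments. The vendor data here is invented for illustration; the point is validating the test logic on a small sample before running the full population.

```python
import random
from collections import Counter

def duplicate_payment_test(records):
    """Flag (vendor, amount) pairs that appear more than once --
    a common indicator of potential duplicate payments."""
    counts = Counter((r["vendor"], r["amount"]) for r in records)
    return [key for key, n in counts.items() if n > 1]

# Hypothetical payment population.
population = [
    {"vendor": "Acme", "amount": 500.00},
    {"vendor": "Acme", "amount": 500.00},   # potential duplicate
    {"vendor": "Beta", "amount": 120.00},
]

# Prove the script on a small sample first...
random.seed(1)
sample = random.sample(population, k=2)
_ = duplicate_payment_test(sample)          # check the logic behaves sensibly

# ...then run it against the full population.
flags = duplicate_payment_test(population)
```

The same pattern applies whatever tool you use (ACL, IDEA, or SQL queries): script the test, validate it on a sample you can verify by hand, then scale it up.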

Step 4. Interpret and report results

Once the information has been tested, keep the audience apprised of how the analysis is working. As you evaluate the results and compare them against the initial objectives, form the conclusions you want your reader to draw. Much like an audit plan ends with an audit report, a data analysis ends with a report too.

The main thing to keep in mind in the analysis report is that the whole purpose of the analysis is to formulate conclusions and make recommendations. As the analyst, it's important that you formulate those conclusions to share. If your report is just full of facts, then they are just facts and you haven't really performed analysis. Share the results of the analysis, then share what you want your reader and the business to conclude from them. Are you trying to detect risk or show how the company can increase sales? Returning to the objective, make sure the conclusions reflect the initial objective.

Data analytics doesn’t have to be rocket science (though it certainly can grow to that point). Just like an audit plan, data analytics requires a solid plan as well, a team willing to figure it out, and a person (or two) with a knack for analysis and the ability and desire to learn new tools for data.


MIS Training Institute is registered with the National Association of State Boards of Accountancy (NASBA) as a sponsor of continuing professional education on the National Registry of CPE Sponsors. State boards of accountancy have final authority on the acceptance of individual courses for CPE credit. Complaints regarding registered sponsors may be submitted to the National Registry of CPE Sponsors through its website: www.nasbaregistry.org.