
Solutions To Spreadsheet Risk Post JPM's London Whale

His question was prompted by the JPMorgan Task Force Report into the Chief Investment Office’s $6 billion-plus loss. It found the bank’s Value at Risk (VaR) was being calculated with an Excel spreadsheet that “required time-consuming manual inputs to entries and formulas, which increased the potential for errors.” At another point the report found “the model operated through a series of Excel spreadsheets, which had to be completed manually, by a process of copying and pasting data from one spreadsheet to another.” (The best guide through the 125-plus page JPMorgan Report is the FT’s Alphaville which has a link to the full report and careful dissection of its details, which no doubt prompted the consumption of significant quantities of coffee. The oral history is a good place to start and has an excellent index near the bottom of the page for less stalwart readers who can’t take the whole report at one go.)

The news about Excel has bounced around blogs, like James Kwak’s Baseline Scenario, and drawn comments from people who are as “shocked, shocked” as Captain Renault in Casablanca to find that manual Excel processes play a fundamental role in large and complex financial operations.

Excel is so powerful, flexible and easy to use that it remains at the core of many critical financial operations, often for far longer than a desktop tool with no controls should be used. The problem is not a new one and several companies offer solutions.

ClusterSeven, based in London, provides a way to map all the spreadsheets inside a company’s operations and audit all changes, with the results stored in a Microsoft SQL Server database. WestClinTech has built XLeratorDB, which runs refined Excel calculations, along with many new financial calculations, inside SQL Server, where they can execute up to 100 times faster than in Excel and are fully protected by server security.

“Everyone seems to have spreadsheet policies but they don’t implement them,” said Ralph Baxter, CEO of ClusterSeven. That includes JPMorgan which set a January 31, 2012 deadline for automating its VaR model and then apparently forgot about it. One reason organizations ignore, or oppose, controls over spreadsheets is that they think the process will be too painful, he added. (See Baxter’s blog post on how pervasive spreadsheets are.)

BlackRock, the asset manager with $3.8 trillion in assets under management, has used ClusterSeven since 2007 to manage its spreadsheets.

“I can’t imagine it would be possible for an investment manager to not use Excel,” said Stuart Symonds, director Aladdin & Technology at BlackRock. “It is essential for flexibility and ingenuity. However, with that flexibility comes risk and the challenge that everyone faces is to control and minimize it. These issues are generally applicable across the investment management industry as Excel is so widely used.”

The niche ClusterSeven occupies is known as Enterprise Spreadsheet Management (ESM). Symonds said ClusterSeven was chosen because it is a non-invasive tool.

“Business users don’t need to know it is there. It was very important to achieve business buy-in that the control put in place didn’t affect their flexibility and productivity or become an onerous task.”

ClusterSeven offers similar controls for the Microsoft Access Database, which is tucked away in thousands of organizations running vital departmental tasks, often little known to anyone outside the group. It also has an integration layer with IBM Cognos provided by IBM Business Partner Assimil8. That allows an investment bank, for instance, to run a business intelligence (BI) layer on top of its spreadsheets.

XLeratorDB, from WestClinTech in Irvington, NY, says it extends the analytic and number-crunching capabilities of Excel into the SQL Server database. Co-founder Charles Flock is a financial tech veteran; he was also a co-founder of Frustum Group, which developed the OPICS international banking system later acquired by Misys.

Because Excel is so flexible, new trading ideas or risk management tools are often launched in Excel and usually remain there far too long before moving into applications, which are more secure, auditable, automated and comprehensive — eliminating the cut and paste of the JPMorgan Excel practice, for example.

WestClinTech launched a product that could do everything Excel can, sometimes more accurately, always faster, but placed it on a server running SQL Server instead of on a desktop running Microsoft Office.

“We come from an environment where traders take their P&L by hitting F9, the recalculation function in Excel worksheets,” said Flock. “In almost every environment in a trading room, traders are taking their own P&L in Excel using an algorithm, macro or formula that they developed, or that someone on the desk before them or the guy running the desk developed, and they have no idea whether or not the algo accurately captures all the data and catches the conditions of the market.”

Flock said companies they talk to want to get the model in one place and make sure everyone is using the same numbers and formulas. VBA allows firms to write their own applications, but someone has to check the code and maintain it and check regularly to make sure nothing has been changed.

Now with XLeratorDB, companies can put all their data into SQL Server and run as many algos as they want while being assured that users are using the right algos.

“We back up SQL Server every night. Now firms can create a platform that truly is unlimited in the number of rows of data. It takes all the flexibility and computational capability of Excel and puts it in one place.” And firms find big gains in performance because the data stays in one place rather than traveling back and forth across networks.

A REIT or private equity fund that has one million cash flows in its database and wants to use XIRR to send a statement to all its investors can do that from SQL very fast and with better accuracy than Excel calculations, he added.
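Neither XLeratorDB’s implementation nor Excel’s exact algorithm is published here, but a rough sketch of what an XIRR calculation does under the hood — assuming the usual actual/365 day-count convention and a bisection root-finder rather than whatever iteration Excel actually uses — looks like this:

```python
from datetime import date

def xnpv(rate, cashflows):
    """Net present value of dated cash flows at an annual rate.
    cashflows: list of (date, amount); the first date is the reference point."""
    t0 = cashflows[0][0]
    return sum(cf / (1.0 + rate) ** ((d - t0).days / 365.0) for d, cf in cashflows)

def xirr(cashflows, lo=-0.99, hi=10.0, tol=1e-10):
    """Annualized internal rate of return for irregularly dated cash flows,
    found by bisection on xnpv (robust, if slower than Newton's method)."""
    if xnpv(lo, cashflows) * xnpv(hi, cashflows) > 0:
        raise ValueError("no sign change in bracket; widen [lo, hi]")
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if xnpv(lo, cashflows) * xnpv(mid, cashflows) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2.0

# Hypothetical investor statement: one outlay, two distributions.
flows = [(date(2012, 1, 1), -10000.0),
         (date(2012, 7, 1), 2500.0),
         (date(2013, 1, 1), 8500.0)]
print(f"XIRR: {xirr(flows):.4%}")
```

Run over a million such cash-flow sets inside the database, and the appeal of keeping both the data and the root-finding in one server-side place becomes obvious.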

“We have articles on our Web site about errors from Excel,” said Flock. “That comes as a huge shock to these guys. It is so embedded in their thinking that if it is coming out of Excel it has to be right. We have firms telling us, ‘This calculation is embedded in our swap agreement, out to 10 decimal places, and you are telling us the Excel calculation is wrong?’ It always is an interesting conversation. We have put up some fairly lengthy pieces explaining how these calculations work.”


Comments

It always amazes me that with all the new technology out there with computer-aided spreadsheet audit tools so few people use them. Don’t know if it’s laziness, ego, or just a lack of awareness of the existence of these types of tools. There are several good ones I’ve found. Probably the two best (in my opinion) are the Audinator (http://www.audinator.com/) and FormulaRx (idk website). Typically, in just a few seconds these types of tools scan every single formula throughout your spreadsheet workbook testing for a wide variety of common (and uncommon) spreadsheet errors and risks. Certainly doesn’t replace good old roll-up the sleeves manual review and best practice spreadsheet development, but it definitely catches deeply hidden errors that a manual review would never find.

I’m glad to read this article and to see that issues like these are finally brought forward. Excel is an incredibly useful and dangerous piece of software. It is useful because it is easy to learn. It is dangerous for the very same reason. Excel tries to be everything and do a little bit of everything, though it does some things arguably badly. For example, in my field, it is well documented how inaccurate Excel has been in very elementary statistical computations. And while some things have improved, other problem areas have not been addressed by Microsoft.

What I find curious and worrying is people’s attitudes towards software. Who would question that a large database has to be managed in a database application? Who would question that a wordy document has to be created in a word editor? Emails are written in email applications, not Word. Technical drawings are done in CAD software, not MS Paint. Yet, when it comes to Excel, the appropriateness of a specialized solution is often overlooked in favour of what one already is familiar with.

When in the past I received data in Excel format from other people, sometimes it was so bad that I had to send it back. Color-coded cells had an obscure meaning that was clear only in the original author’s head; formulae lost their cell references because cells were moved — just to list a few issues.

To me the explanation that one uses Excel because that is what all businesses use is meaningless. Likewise, using Excel simply because that is all one knows is a far more dangerous approach. Unfortunately, I don’t think the solution to this Excel epidemic can be found in third party plugins. If anything, it will create even more managing headaches. Instead, companies, starting with analysts at the big financial institutions, should set standards by identifying a number of software applications aimed at performing each task that was previously done using Excel. Next, they should start training those same analysts in using these new tools. Needless to say, these tools have to be able to transparently identify what is being done and where. In other words, no more buried formulas in some random cell in a random worksheet. Ideally, a log is kept detailing each step of manual intervention so that when analyst A gives his or her file to analyst B, he/she will be able to trace and understand what was done. Furthermore, companies should also join with universities in order to train their students on those very same tools.

I see part of the Excel epidemic lies at the university level. In helping an MBA student at a well-known university, I couldn’t help but be shocked that Excel was being used for a quantitative course, while a similar course at the undergraduate level was using more appropriate software. In trying to help its students prepare for real-world situations, the university was becoming a partner in spreading the epidemic.

While changing behaviors towards software practices is not something that can be achieved overnight, it is never too late to start doing things right from the beginning.

Good points. Certainly one problem here is compliance. I assume most people will use Excel as long as they are allowed because they are comfortable with it. But I remember visiting ABN in London, oh maybe 10 years ago, and they were just implementing Apama, now part of Progress Software, I think, but at the time a fairly new application from Cambridge. If I remember my first interviews with ClusterSeven, a little after my meetings with ABN, they had some global oil company clients that were running several hundred thousand spreadsheets, many of them feeding each other. Good desktop tools are not necessarily appropriate for the enterprise, and I guess from what Thomas has to say, not necessarily appropriate for advanced maths. JPM had been running Numerix but then moved to an in-house spreadsheet tool that hadn’t been vetted or automated. This isn’t just a London problem — the Chicago Fed found a lot of funds around the city were modifying their algos and deploying with little if any testing.

I think that in many organizations Excel works so well that people and departments keep using it and expanding on it, and fail to see when they should make the move to a database or specific application. Not Excel’s fault, of course, that it is both powerful and flexible.

from Bloomberg (http://www.bloomberg.com/news/2013-04-17/7-data-disasters-more-embarrassing-than-reinhart-and-rogoff-s.html)

“The most glaring error: In an Excel data set of countries’ annual GDP growth and their public debt, Rogoff and Reinhart apparently forgot to select an entire row when it came to averaging growth figures, leaving out Australia, Austria, Belgium, Canada and Denmark. Rogoff and Reinhart acknowledged the error, writing that ‘it leads to a notable change in the average growth rate for the over-90-per-cent debt group.’ This means that one of the most influential claims in public discussions and government policies related to austerity, debt and stimulus — that there is a sharp fall-off in growth when debt reaches 90 percent of GDP — was partially due to a simple Excel error.” Ouch!
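The mechanics of that slip are easy to reproduce. A minimal sketch — with invented growth figures, not the actual Reinhart-Rogoff data set — of how a selection range that stops a few rows short skews an average:

```python
# Illustrative only: made-up annual growth figures (percent),
# not the real Reinhart-Rogoff numbers.
growth = {
    "Australia": 3.8, "Austria": 3.0, "Belgium": 2.6, "Canada": 3.0,
    "Denmark": 2.4, "Greece": -0.3, "Italy": 1.0, "Japan": 0.7,
}

# What AVERAGE() over the full column would report.
full = sum(growth.values()) / len(growth)

# The equivalent of dragging a selection that misses the first five rows.
omitted = ("Australia", "Austria", "Belgium", "Canada", "Denmark")
truncated = {k: v for k, v in growth.items() if k not in omitted}
partial = sum(truncated.values()) / len(truncated)

print(f"all countries: {full:.2f}%  truncated range: {partial:.2f}%")
```

Nothing in the spreadsheet flags the difference: both cells display a plausible-looking average, and only an audit of the formula’s range reveals which one is wrong.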

As a statistician, I tend to be exposed to the “most glaring errors” that were committed in Excel. Naturally, because of this my view is rather biased. I strive to see the good in Excel. I am convinced that there are useful aspects in Excel. But too many times those same aspects that have made Excel popular and useful also created inertia, in that people use it for tasks that it clearly wasn’t designed for, simply because that is all they know. It is deeply troubling that economists of the calibre of R & R are relying on Excel to carry out their analysis. The room for error is just too great (in this case they forgot to drag down a list of values, among other things…). More appropriate tools exist, let’s use them.

Yup, too useful for its own damned good. Or perhaps for its users’ good. Can you suggest other tools that users should turn to, and how to know when they are pushing the boundaries of Excel? Isn’t it really a user awareness issue rather than a tech issue?