
An anonymous reader writes "Computer science professor Daniel Lemire explains why spreadsheets shouldn't be used for important work, especially where dedicated software could do a better job. His post comes in response to evaluations of a new economics tome by Thomas Piketty, a book that is likely to be influential for years to come. Lemire writes, 'Unfortunately, like too many people, Piketty used spreadsheets instead of writing sane software. On the plus side, he published his code ... on the negative side, it appears that Piketty's code contains mistakes, fudging and other problems. ... Simply put, spreadsheets are good for quick and dirty work, but they are not designed for serious and reliable work. ... Spreadsheets make code review difficult. The code is hidden away in dozens if not hundreds of little cells. If you are not reviewing your code carefully, and if you make it difficult for others to review it, how do you expect it to be reliable?'"

Spreadsheets are just a part of the Darwinism of applications. Some sharp fellow within an organization thinks it's important to start tracking some data point or another. Maybe it gets ignored and forgotten. Other times it grows as other people see its utility and start making requests to track related data points. Eventually you get a multi-worksheet or even multi-workbook spreadsheet masquerading as an application. At some point it becomes far too hard to maintain or understand, so they contract out someone like me who moves it to a relational database with a web front end. Everyone is happy!

This work forms a major part of my workload, so don't fuck with it!

Also, it is appropriate. It would be inefficient to develop a proper relational database application on the whim that some set of data points might be useful. Spreadsheets are a proving ground, an important stage in the life cycle of an application.

To be fair, they have their place. They are excellent tools for creating a table of data with a chart that can be emailed to other people and read.

Sadly, one of the big selling points for spreadsheets is their ubiquity. Pretty much any computer being used for work will have something that can read and display Excel spreadsheets, so you can send one to anyone and not have to worry about what they have installed. Then again, you can get the same level of compatibility by outputting PDFs from MATLAB or something slightly saner like that....

Spreadsheets are really easy to use properly; all you have to do is adjust your mind to the idea of creating two styles of spreadsheet. First, the working spreadsheet, well laid out and documented, to ensure the workings are understandable and checked. Second, a linked presentation spreadsheet where the data is taken from the working spreadsheets and presented prettily for nepotistic management, so even the dumbest spawn of management can, well, at least pretend to understand.

I recall a survey of (non-trivial) corporate spreadsheets in the mid-'90s. It went something like this: 95% had a maths bug, in 80% of cases the bug made the sheet useless, and 50% of the spreadsheets were used to make (incorrect) financial decisions. The reason corporations' coffers don't evaporate is that they use thousands of them, so the positive and negative effects on the money buffer average out to roughly zero. It's a much more precarious situation if you're using a single homespun spreadsheet to run a corner store.

The question is whether having the logic squirreled away in code or a DB would have made it more correct, which is a big assumption!

I really think Piketty deserves a lot of credit for releasing his "source" spreadsheets on such a substantive and controversial work. Most authors do not. If the critiques turn out to be substantial and extensive, I plan on waiting for a second edition with corrections before investing time in reading it.

I've done audits on spreadsheets. They're not terribly difficult, and I dare say they're easier than many of the code reviews I've been through.

The most important thing is to understand how to use the spreadsheets. Either use separate worksheets for each major step in the calculation, or at least separate the computations using extra blank space. That serves the same function as code blocks, breaking up the computation into smaller, more manageable, pieces. Each small piece can be audited separately, and it provides a clear trail of how one number becomes another.

Next, use your formatting, even if it's not in a worksheet ever intended for public viewing. I'm particularly a fan of using conditional formatting to highlight the cells in a sheet (especially minimums and maximums) that will be passed on to the next worksheet. Then it's easy to check that the correct values are being passed, and the intermediate values all make sense.

Finally, use your fill tools correctly to ensure that the same computation is being applied to all cells. You should be able to audit the top of your worksheet and fill down to the bottom, without any formatting or visual elements getting in the way, and know that the whole worksheet is correct. When reviewing an old worksheet, note that Excel will highlight (with a green corner mark, as I recall) cells that don't fit the pattern.

Lastly, remember that writing an algorithm for a spreadsheet has some of the same pitfalls as any other implementation. Double-check any function whose parameters you're not certain of. Put comments in non-obvious areas. Don't be too clever, and of course, if someone else can't understand your brilliance, you're not being brilliant.
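The separate-worksheets-per-step advice maps directly onto code: each stage becomes a small function that can be audited alone. A sketch in Python, with an entirely invented tax calculation:

```python
# Each "worksheet" step becomes a small function that can be audited alone.
def gross_income(rows):
    """Step 1: sum the raw inputs (first worksheet)."""
    return sum(r["income"] for r in rows)

def taxable_income(gross, deductions):
    """Step 2: apply deductions (second worksheet)."""
    return max(gross - deductions, 0)

def tax_due(taxable, rate=0.2):
    """Step 3: apply the rate (final worksheet)."""
    return taxable * rate

# A clear trail of how one number becomes another:
rows = [{"income": 30000}, {"income": 12000}]
gross = gross_income(rows)              # 42000
taxable = taxable_income(gross, 10000)  # 32000
print(tax_due(taxable))                 # 6400.0
```

Each intermediate value is inspectable on its own, which is exactly what the blank-space or per-worksheet separation buys you in a spreadsheet.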

Excel checks formulas for "consistency" so if you have
B1: =A1+1, B2: =A2+1, B3: =A42+1, B4: =A4+1
then the ropey B3 will be flagged up. Of course there are sometimes false positives, so you switch this check off or ignore it, and who knows how many false negatives there are. The message "just say no to Excel" still stands.
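Outside Excel, the same consistency check can be approximated by normalizing each formula's relative references and flagging the odd one out. A rough Python sketch (the normalization scheme here is improvised, not Excel's actual rule):

```python
import re
from collections import Counter

def normalize(formula, row):
    """Rewrite row references relative to the formula's own row, so that
    =A2+1 in row 2 and =A3+1 in row 3 normalize to the same pattern."""
    def repl(m):
        col, ref = m.group(1), int(m.group(2))
        return f"{col}{{r{ref - row:+d}}}"
    return re.sub(r"([A-Z]+)(\d+)", repl, formula)

# The example from the comment above: B3 references A42 instead of A3.
formulas = {1: "=A1+1", 2: "=A2+1", 3: "=A42+1", 4: "=A4+1"}
patterns = {row: normalize(f, row) for row, f in formulas.items()}
majority, _ = Counter(patterns.values()).most_common(1)[0]
flagged = [row for row, p in patterns.items() if p != majority]
print(flagged)  # [3] -- the ropey cell
```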

Eh, I think it can be legitimately argued that spreadsheets are a bad place to do complex things. Even people who are skilled at setting them up produce work that is difficult to examine and track. In many ways it is a technology that is still stuck in the '80s; even though they keep throwing in more and more complex functionality, the method of storing and organizing the logic is dated in a bad (rather than proven) way.

Even teaching students MATLAB would probably be an improvement, but Excel is what they default to teaching anyone outside math and CS, building all the coursework around it.

No kidding. Also, it MAY not be that easy to review the code in a spreadsheet, but it is VERY VERY EASY to test it. If you want reliable spreadsheets, it's PERFECTLY possible to test them to the Nth degree, far more so than with most other code. You have a place to put the tests and a place to put the expected results; it's all rather devilishly simple, actually. For that matter, you can document the bejeezus out of them too.
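The test-cells idea translates directly to code: keep known inputs next to hand-checked expected outputs and compare. A hedged sketch with an invented compound-interest formula:

```python
# The spreadsheet habit of keeping a block of test inputs next to a block
# of expected outputs, translated to code (formula and figures invented):
def compound(principal, rate, years):
    return principal * (1 + rate) ** years

# "Test cells": known inputs alongside hand-checked expected results.
test_cases = [
    ((100, 0.0, 10), 100.0),   # zero interest: unchanged
    ((100, 0.1, 1), 110.0),    # one year at 10%
    ((100, 0.1, 2), 121.0),    # two years at 10%
]
failures = [(args, expected, compound(*args))
            for args, expected in test_cases
            if abs(compound(*args) - expected) > 1e-9]
print(failures)  # [] when every "cell" matches its expected value
```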

I think spreadsheets are like any sort of simple interpreted language. Idiots can easily blow their left foot off. Real software engineers can also do some very cool stuff. Most of the Perl code I've seen is ugly as all hell and pretty worthless, but MY Perl code is a thing of beauty that people maintain for years. It's all in how you use the tool.

In MS Excel 2003 and 2013, Ctrl+` (the formula-view toggle) is your friend when debugging a couple of thousand cells with formulas. It also indicates which cells have formulas and which cells have plain values.

Having said that, I am busy trying to convince my boss to have a massive pain-in-the-ass Excel working document with dozens of macros and thousands of rows converted to a relational database. Excel is useful, but people often exceed its limits.

No, what he is saying is that it is easy to "write" sloppy code for Excel and hard to write good code. And even harder to review it.

It's similar to the reason a) people moved away from BASIC, and b) BASIC evolved to be (duck, please no flame) almost usable (I still do not like it, but I recognize that it is possible to write usable code in Visual Basic).

If you want to criticize him, fine, but picking on Piketty is VERY political. "Excel" errors are galore in neocon publications, and of course the FT did not find anything not to love there. But saying that just maybe having a small group of people siphoning off all the cash from society is not sustainable forever does make them nervous, and very desirous to find some scab to pick at...

Nevertheless, he is right. It would be very good if decision makers were able to "read the numbers" and not just "massage the numbers". Something like R or ADaMSoft would drive you to test ideas on datasets and learn from them, whereas Excel (or Calc :)) has a tendency to get you to fiddle the numbers until the taxman, ahem, the reader sees what you would like them to see...

Krugman was joined by economists Justin Wolfers, James Hamilton, Gabriel Zucman, frequent Piketty critic Scott Winship and others, along with The Economist's Ryan Avent, The Washington Post's Matthew O'Brien, and The New York Times' Neil Irwin, to name a few.

Well, the part about figures being constructed "out of thin air" is a smear (whoever it may be who claims it), as becomes clear when one reads the rest of the article you cite. The most balanced assessment of the Giles vs. Piketty dispute is perhaps the piece Inequality: A Piketty problem? [economist.com] from The Economist.

One of the nastiest things about spreadsheets compared to software is that software is essentially linear, making it easier to follow what's affecting what.

Spreadsheets are two-dimensional and can incorporate data from invisible cells and even other sheets. Then on top of that, what's normally displayed is the results, not the code. There's no side-by-side view of code/value in any spreadsheet I'm aware of.

Also, since code is linear, one screwup tends to make itself obvious by propagating downstream.

What people fail to realize is that spreadsheets are like any other form of programming, and therefore should be treated as such. Write tests. Break complex formulas down into named cells. Use references to carry concepts. Beware of globals. Keep small concepts small, simple, and modular. Write more tests.

Does anybody do that with every spreadsheet they write? Doubtful. I know I only go to all that trouble myself when I have a boatload of inputs that have to get put together. I usually discover about partway in that the sheet is going to be complex enough to need tests. When I do, it's time to start refactoring it, and these are my general steps:

Give cells and ranges meaningful names

Break complex formulas down to several small formulas

Add tests for the formulas

Factor out duplicates

Of all of these, giving cells and ranges names is the most important, because it makes the sheets readable. I can then usually understand the results well enough to know if my formulas are working, but a complex formula often needs an independent set of tests to prove the discontinuities in the functions are actually where I think they should be.
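The refactoring steps above look much the same in any language. A small Python sketch with an invented pricing formula:

```python
# Before: one opaque "cell" -- everything inlined, like a long Excel formula.
def total_before(qty, price, tax_rate, discount):
    return qty * price * (1 - discount) + qty * price * (1 - discount) * tax_rate

# After: meaningful names, small formulas, and the duplicate factored out.
def subtotal(qty, price, discount):
    return qty * price * (1 - discount)

def total_after(qty, price, tax_rate, discount):
    net = subtotal(qty, price, discount)  # named once, reused
    return net + net * tax_rate

# A quick test proves the refactor preserved behavior.
args = (3, 9.99, 0.08, 0.10)
assert abs(total_before(*args) - total_after(*args)) < 1e-9
```

Naming `subtotal` plays the same role as naming a cell range: the intent becomes readable, and the duplicated sub-expression can no longer drift out of sync.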

The problem isn't that people who know how to program and test are writing crappy spreadsheets. The problem is that people who don't know how to program and test, i.e. the general non-IT population, are writing spreadsheets, because they don't know another way to build these tools and they do it in the only way they've taught themselves. If they knew better, they often wouldn't start by clicking the Excel icon in the first place.

I've had to take over maintenance of a few "excel" based applications. Never. Again.

That's Excel for you.

I use a lot of scripts that are based on CSV files for input, output and storage of values. You want to know what I edit them in? Notepad. Because Excel fucks around with them too much, and I'm sick of the "but this is not in our proprietary format" dialogue when closing it (it also refuses to save on exit unless I change it to .xlsx). However, the biggest sin Excel commits (to me) is removing leading zeros; that number has to fit an N-digit mask or it will fail.

So what's the alternative? There are no good, easy-to-use software packages for creating simple data-intensive apps.
The closest alternative was VB6, and if I had to choose between it and Excel, I'd choose Excel any day of the week.

I know it's huge overkill, but I've had times where it was honestly easier to drop the data into PostgreSQL (MySQL, if you prefer) than edit it in Excel / Gnumeric / Open/LibreOffice's spreadsheet tool.

There was one case where my friend needed to analyze a modest amount of data -- 70k rows, 30 columns or so -- and Excel would absolutely choke on her new laptop. Dropped it into Postgres on my anemic netbook and queries were lightning fast. No need to fuss over column types, either.
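A minimal sketch of the same move, using Python's built-in sqlite3 as a stand-in for Postgres (the table and data are invented):

```python
import sqlite3

# A few rows standing in for the 70k-row file.
rows = [("alice", 90), ("bob", 85), ("carol", 70)]

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE results (name TEXT, score INTEGER)")
con.executemany("INSERT INTO results VALUES (?, ?)", rows)

# Queries that would make a spreadsheet choke stay fast and explicit.
avg = con.execute("SELECT AVg(score) FROM results".replace("AVg", "AVG")).fetchone()[0]
print(avg)  # 81.666...
```

Once the data is in a database, every aggregation is a one-line query instead of a hand-dragged formula column.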

People will laugh, but in an office environment it's an excellent solution. One can still write formulas directly in reports and forms, though, so code review isn't necessarily easier.

For those who don't understand relational database concepts, Access can be a machine gun for shooting yourself in the foot. The types of errors that typically find their way into Excel spreadsheets can get magnified several times over by moving to Access.
Those who do understand relational database concepts are probably putting their data in a real DBMS (MSSQL, Oracle, Postgres, MySQL, etc).

Access has its place. Its front end and integration with other Office solutions make it the best quick, light database solution for the majority of small businesses.

But the emphasis is on "light". I am sure everyone has dealt with Access files that are used as whole business-tracking applications. Files that approach their 2GB size limit. Files that are simultaneously being used by all the users to record their timesheets in. And any time it is suggested to move to a proper database engine...

Perl. When things get too messy for a spreadsheet, I whip up a little Perl. It's easier to repeat the calculations for different data sets, as a bonus, and you get access to much richer libraries and can shell out to GNUPlot, Ploticus, Asymptote, or whatever.
Or Python, if that's your cup of tea... or Ruby, even R... whatever scripting language floats your boat.
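The repeat-for-different-data-sets point, sketched in Python rather than Perl (column names and figures are made up):

```python
import csv
import io

def column_stats(csv_text, column):
    """Repeat the same calculation for any data set: read a CSV,
    return (count, total, mean) for one numeric column."""
    reader = csv.DictReader(io.StringIO(csv_text))
    values = [float(row[column]) for row in reader]
    return len(values), sum(values), sum(values) / len(values)

# Run the identical calculation over two "data sets" with no re-editing,
# something a spreadsheet would need copy-pasted formulas to do.
q1 = "month,sales\njan,100\nfeb,120\nmar,110\n"
q2 = "month,sales\napr,90\nmay,95\njun,130\n"
for name, data in [("Q1", q1), ("Q2", q2)]:
    print(name, column_stats(data, "sales"))
```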

ugh... such anger! always with the nomenclature distinctions... this is a stupid approach to a real problem

a spreadsheet is a computer program

that's it...

to criticize the act of entering data and performing computations on that data using computer software is the height of ignorance

I don't know if he's right or not, but this guy's real criticism, once you fight through his ignorance of the issue, is that in his view Piketty didn't show enough of how he got his figures... or more accurately, the TFA author had to...

Maybe you should read it again? His real criticism is that spreadsheet software is horrible for any high-end work, or for anything you want to share, and he is correct.

"so he probably doesn't know how to use the interface of a spreadsheet very well, which makes the act of checking a formula tedious..."it is tedious, even if you are an expert and even if the user uses goof practices.

"P-hacking is the problem in social science/economics research, not using 'spreadsheets'"I don't think you know what P-Hacking is.

if it can execute the operation needed for the research then it is acceptable...if not, then no

I think geekoid is trying to say that even though spreadsheets can in theory "execute the operation needed for the research", practical limits inherent in the spreadsheet user interface make it difficult to verify that what the spreadsheet is calculating matches what you wanted to calculate. Consider this: An 8-bit microcomputer "can execute the operation needed for the research" but that doesn't make it the best tool.

if it can execute the operation needed for the research then it is acceptable...if not, then no

You could probably write this computational code in a shell script, too. But it would still be a terrible idea. Why? Because it's the wrong tool for the job. Simple as that. It doesn't matter what you can and cannot do, it matters what you should do, and you shouldn't use spreadsheets for anything complicated. It's simply too easy to make stupid mistakes that are difficult to trace and correct (or even notice).

you can't blame a spreadsheet for a poorly devised experiment...you *can* blame a researcher for using an inappropriate statistical model...you *cannot* criticize the method of analysis as long as it is physically capable of the computation

TFA isn't blaming the spreadsheets, he's blaming the people who use them for using them. It's not acceptable to use a tool that works poorly and is highly susceptible to mistakes, and no one should listen to anyone who does so unless that person is damned good at that tool: yes, it is possible that someone is so fantastically good with spreadsheets they can use them for massive data analysis with no problems. They are, however, the exception, and I would generally be inclined to disbelieve the results from anyone who does large work with spreadsheets (simply because of the possibility for errors and the lack of concern for accuracy that using spreadsheets demonstrates). So, the conclusion is that you shouldn't use spreadsheets for important work. You absolutely can criticize an analysis if it uses a tool that is highly likely to introduce errors, and that's fundamentally the point (and it's underscored by the fact that that is precisely what happened in Piketty’s case).

I agree, a well-made spreadsheet is far easier to follow than a proprietary program or even most studies' results.

If you have a custom formula in a spreadsheet, create it in the program's scripting language instead of copy/pasting it into tons of cells. Create the spreadsheet in a repeatable layout that makes it easy to understand the sections and the flow of the data.
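The same define-once principle, sketched in Python rather than a spreadsheet macro language (the pricing formula is purely illustrative):

```python
# Instead of pasting something like "=B2*(1-C2)" into hundreds of cells,
# define the formula once and apply it to the whole column.
def net_price(price, discount):
    return price * (1 - discount)

prices = [10.0, 20.0, 30.0]
discounts = [0.1, 0.0, 0.5]
column = [net_price(p, d) for p, d in zip(prices, discounts)]
print(column)  # [9.0, 20.0, 15.0]
```

One definition means one place to fix a bug, instead of hunting down every cell the formula was pasted into.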

I do not see how that is any different from using a proprietary program. At least with a spreadsheet you can look directly at the code for errors.

If the inability to code review spreadsheets was a real issue, it wouldn't be too hard to convert spreadsheet functions into a functional language. For non-programmers, a spreadsheet lowers the barrier to entry. This allows people to do something useful and productive who couldn't do so otherwise. That's a good thing.

Another major issue with spreadsheets is that they don't handle data typing very well. For example, if you try to add a list of numbers and somewhere in the list you have a number encoded as text, instead of throwing an error, it simply won't be included in the sum.
Errors should never pass silently.
Unless explicitly silenced.
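This failure mode is easy to demonstrate. A small Python sketch contrasting spreadsheet-style silent skipping with a loud alternative (function names are mine, not any library's):

```python
def sum_like_spreadsheet(cells):
    """Mimics SUM()-style behavior: silently ignores non-numbers."""
    return sum(c for c in cells if isinstance(c, (int, float)))

def sum_strict(cells):
    """Errors should never pass silently: refuse text in a numeric column."""
    total = 0
    for c in cells:
        if not isinstance(c, (int, float)):
            raise TypeError(f"non-numeric cell: {c!r}")
        total += c
    return total

cells = [10, 20, "30", 40]          # "30" is a number stored as text
print(sum_like_spreadsheet(cells))  # 70 -- wrong answer, no warning
# sum_strict(cells) would raise TypeError instead of returning 70.
```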

The fact that Piketty's work describes a damning indictment of the USA's most cherished concept -- free market capitalism -- means that thousands of neoliberal economists will pore over every single digit and operator in his spreadsheets looking for anything to negate the findings. If they can't find anything, they'll attack him. When you hear of character attacks against Piketty or some other diversionary tactic, you'll know his data is correct.

Other economics papers that reached similar conclusions, such as the well-known Growth in a Time of Debt [theconversation.com], also were based on flawed spreadsheets. It makes one question the entire hypothesis when the best-known works on the subject are based on incorrect (or just plain fabricated) data.

Well, what makes you think that Gates, Buffett, or Slim work harder than anyone else? Clearly there is plenty of luck involved, so R can be greater than G but there is a LOT of noise. As for expecting the richest man to be a Rockefeller, who says the Rockefellers aren't vastly more wealthy than Gates or any other one of these people that Forbes lists? Do you think they keep their money around in places where it can be counted? Nobody has EVEN THE SLIGHTEST IDEA how much money the Rockefellers, the Rothschilds...

Heh. They should just teach Econ 101, Physics 101, and Evolutionary Theory in kindergarten and then quit while the students are at the peak of explanatory power, with everything seeming so nice and simple.

My father was a wise man, and a solid programmer. He liked Basic, because it was simple, and readable (in his environment the alternatives were mainly Assembler, Cobol, and RPG). Whenever people made fun of his love for Basic, and how it resulted in bad code, he always replied “there are no bad languages, just bad programmers.”

The problem isn't the spreadsheet. The problem is people building ugly models in it. Do they seriously think that if those models were written in C, Java or Perl they would have been orders of magnitude better? I doubt it; you're just transplanting bad habits onto a different platform.

Of course, if he'd used trained professionals to build his models in whatever language of choice the models would be better. If he'd used trained professionals to build his spreadsheet models they would have been better as well.

Tools are built so that people can perform tasks they can't otherwise do. As a result, if a tool fails because it's not good enough for the task, at least part of the blame lies with the tool and its creator.

There are, however, languages that make it far easier to write code that is less readable and harder to maintain. As a specific example, compare Fortran 77 with Fortran 90. I can write the latter without any need for numerical statement labels. I can write a straightforward "DO WHILE" loop in Fortran 90, while in Fortran 77, I'd have to use the dreaded GOTO to get the same effect. Aside from basic stuff like that, I can write formulas in Fortran 90 with whole...

I think the title should be "Why You Shouldn't Use Spreadsheets for *Complicated* Work". Just because a job is important doesn't mean the calculation is complex and something that needs to be coded in, for example, matlab.

If my job is to make a pie chart, I can't see why using Excel is a bad idea. On the other hand, if I am examining the variance of several thousand data points and then plotting the residuals from a Gaussian fit, then yes, I can see why using something else would be a lot better.

Lemire is right, spreadsheets are terrible for complex models that need to be modified. He is right for precisely the reasons he outlined.

That doesn't mean that spreadsheets are useless. If you have a standard form where you're only modifying values, rather than functions, spreadsheets are great. There is a low barrier to entry and they are good for communicating results. But as soon as you need to audit or modify functions, you are jumping all over the place and it is easy to make mistakes. Yes, there are ways to consolidate your code (at least in spreadsheets that support scripting), but you are going to take so much time learning how to use the advanced features of your spreadsheet that you may as well learn a dedicated programming language in those cases.

And the reality is that it's pretty easy to learn how to use programming languages these days. Not as easy as using a spreadsheet, to be sure, but even the standard Python distribution can handle most of the vagaries of loading data into memory and storing it properly (i.e., you don't have to worry about parsing or data structures too much). By adding the appropriate modules you can do some decent visualization of data. In some cases the visualization will be better than spreadsheets, and in others spreadsheets will have the lead. And that's just Python, which I chose as an example because I'm familiar with it. The reality is that there are much more appropriate domain-specific languages out there.
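For instance, loading and summarizing a small data set with just the standard library (the figures below are invented):

```python
import csv
import io
import statistics

# The csv module does the parsing; plain lists and dicts are the data
# structures, so there is little to worry about on either front.
raw = "year,value\n2010,1.2\n2011,1.4\n2012,1.1\n2013,1.5\n"
rows = [dict(r, value=float(r["value"]))
        for r in csv.DictReader(io.StringIO(raw))]

values = [r["value"] for r in rows]
print(statistics.mean(values), statistics.pstdev(values))
```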

I think Excel stores formulas in a zipped XML document. Someone could write a tool that extracts each cell's formula from a workbook, sorts them topologically, and spits out JavaScript, Python, or whatever your favorite scripting language is.

I've actually written a very limited version of this. My boss likes to prototype algorithms in Excel, but I need to cram them into a machine with instructions written in a scripting language. I first use a VBA tool to tokenize and collect the Excel formulas, then move over to Python to convert a few built-in functions, then run it through a symbolic algebra toolkit (SymPy). SymPy has a nice feature where it can format its output as C code. At that point, if I were using C I would be all done, but I...
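The zipped-XML extraction idea can be sketched with nothing but the standard library. The sheet XML below is a heavily simplified stand-in for a real workbook's (real .xlsx files carry namespaces and many more parts):

```python
import io
import re
import zipfile

# Simplified stand-in for the sheet XML inside a real .xlsx workbook.
sheet_xml = (
    '<worksheet><sheetData>'
    '<row><c r="A1"><v>2</v></c>'
    '<c r="B1"><f>A1*3</f><v>6</v></c></row>'
    '</sheetData></worksheet>'
)
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("xl/worksheets/sheet1.xml", sheet_xml)

# "Unzip the workbook and pull out each cell's formula":
with zipfile.ZipFile(buf) as z:
    xml = z.read("xl/worksheets/sheet1.xml").decode()
formulas = dict(re.findall(r'<c r="([^"]+)"><f>([^<]*)</f>', xml))
print(formulas)  # {'B1': 'A1*3'}
```

From a mapping like this, topologically sorting cells by the references inside each formula and emitting code in your favorite scripting language is the remaining (and harder) half of the tool.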