Posted
by
samzenpus
on Wednesday October 01, 2008 @12:02PM
from the read-all-about-it dept.

cgjherr writes "If the recent financial meltdown has left you wondering, 'When does the exponential decay function stop?' then I have the book for you. Advanced Excel for Scientific Data Analysis is the kind of book that only comes along every twenty years. A tome so densely packed with scientific and mathematical formulas that it almost dares you to try to understand it all. A "For Dummies" book starts with a gentle introduction to the technology. This is more like a "for Mentats" book. It assumes that you know Excel very well. The first chapter alone will have you in awe as you see the author turn the lowly Excel into something that rivals Mathematica using VBA, brains, and a heaping helping of fortitude." Read on for the rest of Jack's review.

Advanced Excel for Scientific Data Analysis

author: Robert de Levie
pages: 700
publisher: Oxford University Press
rating: 9
reviewer: Jack Herrington
ISBN: 9780195370225
summary: Use Excel for high-end scientific data analysis akin to Mathematica

When I first opened this book my mouth just dropped. It had been years since I had seen a book typeset using LaTeX. But in an instant it made sense, as the book is crammed with the kind of equations that would have been a nightmare to build with any other tools. Chapter after chapter has everything a really smart person needs to do curve fitting, statistical measures, differential equations, and time-frequency analysis. But don't expect a play-by-play here. You will get the equations, set within a few dense paragraphs, with maybe a spreadsheet and a chart or two to show the results.

The first chapter concentrates on getting the most out of Excel as a tool. All the chapters that follow dig into specific data analysis techniques. Chapters two, three, and four are on least squares. Chapters five and six cover analysis in the time domain, including Fourier transforms. Chapter seven covers differential equations. Chapter eight returns to Excel by digging deeper into macros, which leads into chapter nine, where we dig into basic mathematical operations. Chapter ten covers matrix operations. And chapter eleven wraps it all up by giving you some spreadsheet best practices.

In university style, there are also some exercises you can do along the way if you want to tweak your brain pan a little more. To amuse myself I tried a few, and I believe the book would have assessed my attempts as 'wanting' if it had a voice to tell me.

Where most books like this would have several authors, this book has just one: Robert de Levie. This means that the tone, style, and quality of the book are consistent throughout, a fact you will come to appreciate as the book wades into ever deeper data analysis concepts as the chapters roll on.

Though I would have preferred the book to have code samples in C#, I understand that the language of Excel is VBA, and I guess I have to live with that. Thankfully VBA has come a long way, and if you're so inclined it would likely be easy to translate the code into C#, Java, or whatever else you like.

The fact that one person wrote the book left me wondering, "Who is this guy?" In my mind's eye I kind of figured he would look like one of those pulsing-brain guys from Star Trek. Turns out he is a professor at Bowdoin College. And his fields of study include ionic equilibria, electrochemical kinetics, electrochemical oscillators, stochastic processes, and a whole lot more stuff that almost seems made up to sound impressive.

When this book isn't serving as an amazing reference for Excel, scientific problem solving, or just insane equations, it serves other purposes as well. It's a handy portable IQ test, as the count of pages you can grind through in one sitting, plus 90, is roughly your intelligence quotient. And if you fail at that you can always put a copy of the book, along with the Orange Bible, under your pillow and try to osmose your way to becoming the Kwisatz Haderach.

In all seriousness, this is a great book. It represents the kind of in-depth work and research we used to see in books that came out twenty years ago. Robert is to be applauded for his work. This is an excellent resource for anyone looking to do scientific data analysis who was unaware of the powerful capabilities Excel provides, likely waiting just one Start menu click away.

The book is not without fault. I would have preferred that it had been in color, or at least had one color section to show some of the more impressive visualizations that I'm sure would look great in color. In addition, the index is laughably short for a book that clocks in at 700 pages. But those are only minor quibbles for what is, all in all, an amazing piece of work.

Python is the solution I recommend to everyone looking for tips on advanced uses of Excel. Excel is OK if you just want a quick and dirty solution for a small problem, but if you have to go to the trouble of reading a book, Excel is clearly not the best solution.

For scientists and engineers who need something more than what Excel (and possibly Matlab) offers, I recommend starting with either A Byte of Python [swaroopch.com] or Dive Into Python [diveintopython.org].

I use Python for certain tasks, and it's great and handy. For data analysis tasks, I tend to use R, which is far more interactive than Python, has much better graphing than Python or Excel, and supports more statistical analysis methodologies than pretty much anything else. I can prototype and figure out a methodology in R (and provide provable results) long before I get to start running my Python script.

I like scipy, but it certainly is not the 'in between' the parent is looking for. I haven't used it, but my wife likes gnumeric. From what I've seen peeking over her shoulder it looks to be much more apropos.

LaTeX is a markup language. You can express math with it, but it doesn't do anything for you in terms of analysis.

Excel is good for small data sets and quick looks at stuff - but painful to develop in.

Mathematica requires college-level calculus and linear algebra... not PhD stuff by any stretch.

Anyway, you left out Matlab - which is pretty awesome. Depending on what you are doing, there is also R, Maple, Minitab, MathCAD, yada, yada, yada. Lately I've been doing stuff in Python... SAGE is pretty nifty, and the NumPy/SciPy stuff is coming along well (it is included in SAGE).

Right, and when the data moved past 1,048,576 rows [microsoft.com] the whole thing crashed and took the market with it. Little-known fact: the dot-com crash in 2000 was actually caused by exceeding the earlier limit of 65536 rows.

It's OK for simple stuff, but try doing something like implementing a loop in a spreadsheet. And yes, the criticism applies to OO as well.

There are very good packages out there - some open source - for doing scientific analysis. I'd recommend R or Octave (a matlab clone), personally. Also, Python + NumPy + SciPy + Pylab is great for doing Matlab-like things, and it's all free as well.
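For anyone curious what that looks like in practice, here is a minimal sketch of the NumPy "Matlab-like" workflow the parent describes; the signal and sample rate are invented for the example:

```python
import numpy as np

# A 5 Hz sine wave sampled at 100 Hz for one second;
# in Matlab this would be t = 0:0.01:0.99; y = sin(2*pi*5*t)
t = np.arange(0, 1, 0.01)
y = np.sin(2 * np.pi * 5 * t)

# FFT to recover the dominant frequency, Matlab-style
spectrum = np.abs(np.fft.rfft(y))
freqs = np.fft.rfftfreq(len(y), d=0.01)
peak_hz = freqs[np.argmax(spectrum)]
print(peak_hz)  # → 5.0
```

Pylab/matplotlib then adds the plotting side of the Matlab experience on top of this.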

It's OK for simple stuff, but try doing something like implementing a loop in a spreadsheet.

This is one reason the VB scripting turns out to be highly useful. But that said... for half the things loops might be useful for in a normal context, they're wrong in a spreadsheet. Iterating over a set of data isn't done with loops, it's done with applying formulas over a range of cells. And if you turn on iteration for the spreadsheet, it *is* possible to build flow-controlling state machines without using the scripting engine.

Iterating over a set of data isn't done with loops, it's done with applying formulas over a range of cells. And if you turn on iteration for the spreadsheet, it *is* possible to build flow-controlling state machines without using the scripting engine. Not particularly natural for most imperative programmers, but definitely possible.
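The "formulas over a range of cells" idiom described above is the same idea as vectorized array operations; a rough sketch of it outside the spreadsheet (the numbers are invented):

```python
import numpy as np

# Spreadsheet thinking: instead of a row-by-row loop, one formula is
# applied to a whole column, the way =A1*1.08 gets filled down a range.
prices = np.array([10.0, 25.0, 40.0])
with_tax = prices * 1.08                       # one formula, whole range

# A spreadsheet IF() filled down a column becomes a vectorized where()
flag = np.where(with_tax > 30, "high", "low")
```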

You could go to all that trouble... or use the most appropriate tool for the job. Yes, if you kludge the living shit out of it, you can get Excel to do a whole lot of things…

I use spreadsheets to prototype and document ideas. Once I thought a full-blown reference implementation in a spreadsheet would be a good idea (basically, more time was spent on the reference than the final project). Fact is, spreadsheets are good for one-off problems, or simple problems that gather lots of data (e.g. accounting, statistics). When you have a heavy data model, heavy logic model, and complex results, spreadsheets are ultimate FAIL. They are good for developing algorithms quickly, good for…

- Crappy visualization
- Sometimes a 2-D data structure ain't the best
- Excel's pivot tables get the job done, but they have some pretty inconvenient behavior
- Sometimes you want to define a formula once and apply it everywhere, not once per row (when you have ~60k rows). Excel can really bog down when you start having a lot of formulas for big datasets; other tools handle this better
- Last I checked it still had some accuracy issues
- And many more

Last time I checked (and it has been a while), Excel has computational bugs in it which can result in valid data in -> garbage out. In my mind, 'meaningful scientific data analysis' involves accurate computation. But maybe I'm just a dreamer.

Well, 2007 has bugs in it. I don't use Excel, I use something that can utilize math correctly. Have you checked your spreadsheet program? Or do you just assume that Microsoft does everything correctly?

Well, 2007 has bugs in it. I don't use Excel, I use something that can utilize math correctly. Have you checked your spreadsheet program? Or do you just assume that Microsoft does everything correctly?

I use Excel for daily business functions and data analysis, and will continue to do so, but I don't assume Excel is perfect. I do what I should do with any program I use for calculations, though: I stay aware of all of the quirks and bugs I can of the program, and try to work around them.

Every program is going to have a bug or two (or five thousand, seeing as Excel is part of MS Office), but part of working with software is to know what those are and learn to not let them ruin work.

I certainly run the test suite before using a new build of SciPy/NumPy - but I'm largely dependent on the developers.

Still, in Excel I've caught errors and so now I usually calculate things in two different ways to try and catch stuff. For instance, fit a line using both the built-in methods and the solver. Don't use the line fit in the graphing tool - that's one error that I found.
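That double-checking habit is easy to script; here is a hedged sketch (the data are invented) that fits the same line two independent ways and confirms they agree:

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 3.0, 5.2, 6.9, 9.1])   # roughly y = 2x + 1

# Method 1: library least-squares fit
slope1, intercept1 = np.polyfit(x, y, 1)

# Method 2: closed-form normal equations, computed by hand
n = len(x)
slope2 = (n * (x * y).sum() - x.sum() * y.sum()) / (n * (x**2).sum() - x.sum()**2)
intercept2 = y.mean() - slope2 * x.mean()

# The two independent methods should agree to machine precision
assert abs(slope1 - slope2) < 1e-9 and abs(intercept1 - intercept2) < 1e-9
```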

Several scientific papers came out which had to be recalculated using Mathematica or Matlab or SPICE or something, because the data couldn't be trusted after the error was exposed. A small error introduced can become very large, depending on how it was used and the order in which the data were gathered... thus Excel got a very bad name for doing "meaningful scientific data analysis."

Care to expand on why you think you can't do 'meaningful scientific data analysis in Excel?' Are you one of these people who 'reviews' books without actually reading them?

Someone else has already posted a link to a page that nicely summarizes many (not all) of the problems with using Excel for science. But there is virtually no statistical technique which isn't already better implemented in R (free) and many other statistical packages. Real stats packages provide implementations of a given technique that are at least as reliable, provide more control, more options, more diagnostics, and often more guidance. The built-in stuff in Excel is so oversimplified that I think…

While I agree, sometimes being an engineer or analyst means working with one or two or six hands tied behind your back because of time, money, or IT-imposed user-permissions. If you aren't capable of identifying the sources of error in your data as well as those caused by your tools, then you are probably going to do a poor job even with the best tools. Bad tools should never be an acceptable excuse for delivering faulty analysis.

If I hadn't already commented similarly, I'd mod you up. R has been my bread and butter for serious (and ad-hoc) analysis for a few years now. It's fast to write, easy to get data into and out of, provides fantastic stats support, and creates beautiful graphs with very little effort. The interactivity is incredibly useful when prototyping.

If you're willing to spend hours setting it up, and getting it to "sort of" work with touch-ups required every few days. Maybe I'm a dolt, but I never got any of the embedded-R interfaces to work satisfactorily. The documentation was always just too outdated, and there were too many surprises and inconsistencies. By the time I worked them out, it would have been easier to do it another way.

If you're in a unix environment, I suggest looking at littleR [vanderbilt.edu], which makes the R libraries usable in unix "piping" style.

If you're going to mention that Office costs $150 for a student version, you might as well mention that Mathematica's student version (identical to the full version, except for a banner upon printing) is $140.

Although you don't have to be a student to use the Home and Student edition, keep in mind that it is not licensed for commercial use of any kind, including non-profits.

Yup, already pointed that out in another post: if you want to use it for commercial use, you have to step up to the standard version for $240. The point still stands with regards to Mathematica at $2500, though.

interact "directly" with the excel data, as in manipulate it in excel so you can use other excel functions

exporting does not allow the person with whom you gave the results to work with the data with the same functions that you used

The Mathematica student version is only licensed to students; Office Home and Student can be bought by non-students without violating the terms of the licensing agreement. If you need it for commercial use, you can buy the standard version for $240.

They are wasting their time on something enjoyable; mspaint "masterpieces" are comparable to playing WoW or doing macrame (all 3 of these things are great, don't get me wrong).

On the other hand, it's disgusting to have so much productivity and economy wasted on shoe-horning the godawful Excel into pretending intelligence. Not to mention how error-prone it is; a mistake in MSPaint is mildly annoying; a rounding error or outright bogus value in science is close to lying.

The money and availability arguments posted below are better. You can always write a simple conversion script to convert from the data-crunching program to Excel. And if you are doing serious number crunching, parsing a file to CSV isn't much of a task. Learning how to do something in Excel will not make it easier in Mathematica. It's always easier to do any kind of mathematical operation in Mathematica.

Now if you have a crazy person that needs to twiddle some numbers and watch the output in Excel…
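For what it's worth, the "simple conversion script" mentioned above really is only a few lines; a sketch in Python (the filename and columns are invented):

```python
import csv

# Pretend these rows came out of the number-crunching program
results = [("trial", "mean", "stddev"),
           (1, 4.92, 0.13),
           (2, 5.07, 0.11)]

# Write a CSV that Excel (or anything else) opens directly
with open("results.csv", "w", newline="") as f:
    csv.writer(f).writerows(results)
```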

Yeah. Try writing a paper full of equations with Word. You will feel like bashing the monitor in before you are a fifth of the way into the task. (Assuming you know LaTeX.)
It may be close now in output quality, but any point-and-click system will always be inferior to LaTeX when it comes to equations.

Output quality: does it have automatic equation numbering? An equivalent for BibTeX? Intelligent modifiable Table of Contents? Ability to replace a math symbol wherever used with another? Change aforementioned numerations at will?

I've seen a lecture by the main developer of SAGE. It seems to be more a tool for doing mathematics research. I've heard of scientists using S-PLUS and R [r-project.org] (the open source alternative to S) for their research. In any case, any of these tools is probably better than a spreadsheet for serious scientific research.

You see, there is a fundamental problem in science and the problem can be summarized as this: how do you get the right results in order to optimize the grants that you receive. Spreadsheets are ideal for this purpose for two reasons. First of all, they are designed to handle financial data. This is great because financial data are what grants are all about. For example: will result X allow for a conference in Hawaii or California this year.

The other big reason to use spreadsheets is that they make data more malleable. Normal scientific tools make it difficult to micromanage the data that you acquire, partially because the people who produce that software have this mistaken notion that data has to be managed in a consistent way. So you're usually stuck doing the same thing to an entire dataset, and it's even difficult to treat different datasets in different ways. But spreadsheets expose all of that data, so it is easy to tweak an observation here and a variable there to get the desired result to maximize your grant.

So you see, spreadsheets are a tremendously valuable tool for scientists. It is the best tool for the job.

turn the lowly Excel into something that rivals Mathematica using VBA, brains, and a heaping helping of fortitude

So? What's so special about that? You can turn C, Fortran, or even assembly language into something that rivals Mathematica using brains and a heaping helping of fortitude. This is arguably a better deal, since you don't need the VBA.

Hardcore data analysis in Excel is almost always a bad idea. You can almost always find a way to do it in excel, and you can almost always find a way to do it better, faster, and cheaper somewhere else.

R, Matlab, Mathematica, Python/NumPy, SigmaPlot, and any number of old, well-written, debugged, and vetted numerical libraries written in C or Fortran. I've used all of these at various times to solve something that a co-worker couldn't figure out how to do in Excel.

I fit quick linear regressions in Excel. For *anything* else, there is a better choice.

Hardcore data analysis in Excel is almost always a bad idea. You can almost always find a way to do it in excel, and you can almost always find a way to do it better, faster, and cheaper somewhere else.

I would have to disagree, having used both Excel and "rolled my own" in pure C.

My own code runs a few thousand times faster, I know exactly where the errors might pop up, and I don't need to try to squeeze the data into a form suitable to whatever MS decided I should use this week; I just handle anything.

It's much more effective to use a well-respected numerical analysis package rather than rolling your own everything from scratch. NumPy, for example, lets you do mathematical analysis far better than Excel very quickly.

As a graduate student in physics, I have never seen a serious researcher use excel for data analysis.
Nor for that matter, is it common to see a scientist using windows for the OS--all linux and mac OS.
This is akin to writing a book about publishing scientific papers with Office. Instead, learn LaTeX...
The only group of people who use excel for large data analysis are financial types and MBAs. Need I remind you how that turned out?

As a graduate student in physics, I have never seen a serious researcher use excel for data analysis.

Nor for that matter, is it common to see a scientist using windows for the OS--all linux and mac OS.

This is akin to writing a book about publishing scientific papers with Office. Instead, learn LaTeX...

The only group of people who use excel for large data analysis are financial types and MBAs. Need I remind you how that turned out?

Oh, so that's why at APS meetings I've seen maybe 5 presentations, ever (out of at the very least 500) given on something other than Windows (Ubuntu once, MacOS the other few times), despite the fact that nearly every speaker uses his or her own laptop for the presentation. Wait...my data seems to indicate that physicists hardly ever use Mac or Linux at all!

If we're just talking about computers controlling instruments, then I see about 90% Windows, 10% Linux if the instrument costs less than a million dollars…

I know it is popular, and many science and engineering faculty lazily encourage their graduate students to use it. However, something like Matlab beats the crap out of Excel any day. Spreadsheets tend to obfuscate relationships between data, require a lot more clicking (read: human intervention), waste time that could be spent thinking about the data, and are singularly unsuited for analysis of similar sets of data (a situation any scientist faces when he has to do a series of experiments).
Matlab might take some time initially to write the scripts, but it is so powerful and extensible that no one in their right mind would want to use Excel. If you are a slave to spreadsheets, get yourself a copy of Microcal Origin or Labplot.

Excel is especially unsuited to the task of preparing figures for scientific publications. The default formatting is at once wrong for the task and hard to change. Once you set your preferences in matlab (easy to do), you are set for life.

In my experience, Excel is also rarely used for anything serious outside of the US. Maybe it's an indictment of how lazy, slow-witted, and easily misled our pool of talent is becoming.

Thank you for mentioning Labplot (http://labplot.sourceforge.net/). I'd also like to put in a shout-out for QTIPlot (http://soft.proindependent.com/qtiplot.html) and Scilab (http://www.scilab.org/), and of course the aforementioned Sage (http://www.sagemath.org/).

QTIPlot and LabPlot, in particular, have amazingly responsive developers, who seem to go out of their way to help people.

In my experience, Excel is also rarely used for anything serious outside of the US. Maybe it's an indictment of how lazy, slow-witted, and easily misled our pool of talent is becoming.

I recently spent some time in Japan in a design group for a large Japanese company. I was shown the massive spreadsheet used to calculate power plant capacity and consumption. I almost cried. The whole sheet was based upon one large circular reference. Nobody understood it, and it referenced steam tables through a plugin, but…

In my experience, Excel is also rarely used for anything serious outside of the US. Maybe it's an indictment of how lazy, slow-witted, and easily misled our pool of talent is becoming.

I have experienced whole companies running on Excel spreadsheets: they use it for accounting, instead of a database, and, you guessed it, for scientific data analysis.
The company I'm talking about is in the power supply industry.

When I worked in the semiconductor industry in the late 90s, Excel nearly cost us several hundred grand. It had "helpfully" autocorrected a code in the documentation for a mask used in one of our clock buffer chip products. Had the engineers not caught this mistake in the printout, the fab of the chip would have been botched. The engineers were mad, as I recall, because they would change the code and Excel would change it back.
"If you can't prove what your tool is doing, you don't get to use it" is what they taught me in engineering school.

SPSS has now become the standard data analysis package for quantitative studies in the social sciences. It's very crappy software, and it wouldn't take a whole lot of augmentation to get Excel to do what SPSS does.

The problem is that social scientists don't want to mess with the internals too much, and SPSS made for them a point and click interface - in effect, they out-Microsofted Microsoft. They charge an insulting $1500/copy and completely dominate the universities, so they're making good money.

Now, who told you SPSS is crappy software? It's widely used software, not only in social science but in the biology and medical fields - in short, for anyone serious about statistics who's not a statistician.

Excel, OTOH, has a long track record of errors. Microsoft does not have the expertise for numerical and statistics software.

Which is not surprising, if you remind yourself that Microsoft did not even have security expertise for its own main product line... This software landscape is dominated by Matlab…

Advanced Excel for Scientific Data Analysis is the kind of book that only comes along every twenty years

Excel was introduced in 1990. So, assuming that it was introduced with a book just like this one in 1990, that would be "a kind of book that comes along every eighteen years". I'm certain the poster would have realized this had he or she applied what she'd read in the first edition to do the proper calculation.

When I was a freshman in engineering school, my intro to engineering class required us to purchase a book similar to this. We were given two class periods to work with Excel, supervised by a TA (it was considered a lab). I remember the assignment involved proving that sin^2+cos^2=1.

If you couldn't figure out Excel within those two class periods, it was recommended that you switch your major to business administration. The business administration school had a semester-long class devoted to learning Excel.

When I was a freshman in engineering school, my intro to engineering class required us to purchase a book similar to this. We were given two class periods to work with Excel, supervised by a TA (it was considered a lab). I remember the assignment involved proving that sin^2+cos^2=1.

Proving that with Excel? How does that work? That's a trigonometry problem, and it follows from the definitions of the sine and cosine functions, and from Pythagoras's theorem. You do it with a pen and paper and you write 'QED' at the bottom. To prove it with Excel, you'd have to calculate the result individually for every possible angle, and unless Microsoft have released an update I haven't seen yet, Excel doesn't have a transfinite number of available rows.

Oh, wait...

engineering school

That's dangerously close to reality. That's where they think that if something works the first fifty million times, then it's going to work every time.

Still, it could be worse. You could be in…
If you couldn't figure out Excel within those two class periods, it was recommended that you switch your major to business administration.

To prove it with Excel, you'd have to calculate the result individually for every possible angle, and unless Microsoft have released an update I haven't had yet then Excel doesn't have a transfinite number of available rows.

OK, so it wasn't a strict mathematical proof. Let's face it, in the real world you rarely have to worry about the infinitesimal. With empirical data in today's digital age, you will have a sampling rate to contend with. Why use an integral when a Riemann sum is all your data can support?
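In that spirit, the "empirical proof" of sin^2+cos^2=1 amounts to evaluating the identity on a grid of sampled angles, which takes one line outside a spreadsheet too (the sample count is arbitrary):

```python
import numpy as np

# Check sin^2 + cos^2 = 1 at a grid of sampled angles, as the lab did in Excel
theta = np.linspace(0, 2 * np.pi, 1000)
residual = np.abs(np.sin(theta)**2 + np.cos(theta)**2 - 1).max()
print(residual)   # tiny, on the order of machine epsilon
```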

Some of us have to crunch numbers every day, and it's interesting to consider Excel as a tool for this purpose. But to then have a reviewer talk about things like "insane equations" makes it clear that the reviewer sees "equations" as some kind of esoteric icon associated with peculiar people with "pulsing brains" rather than the bread and butter of the jobs of thousands, if not millions, of people the world over. How can /. post a review by such a clearly ignorant reviewer? It verges on embarrassing to read.

I thumbed through the book but was not impressed. The author has probably used Excel as well as anyone can for the tasks he intends. But for most people, the effort to acquire the skill by reading the book is not well spent, since one can probably learn other tools which are really intended for scientific analysis.

For statistics packages, R is probably much better, though I would prefer SAS. Try a huge data set (200MB) and put it in Excel: your system will crawl before Excel crashes, but in SAS it will be re…

So according to the book, here's the recipe:
1. Write your data analysis software in VBA
2. Use the Excel cells, buttons, bells, and whistles for the I/O
3. Profit!

The math is actually irrelevant. Any computational mathematics book that respects itself uses pseudocode for the examples. If it is possible to program in one programming language it should be possible in any other language too.

I tried it too, although I wasn't crazy enough to do any numerical computations in VBA. I wrote the program in pure good old Fortran 95, wrote some VBA scripts to read the Fortran ASCII output, and set everything else up in Excel the way my boss liked (I'm a chemical engineer). There you go: it's fast to program, fast to run, easy to maintain.

I would like to see anyone try to keep up with the Microsoft paraphernalia between VBA/Excel versions if the whole thing is written in VBA. Not to mention the problems that I had with the locale when I tried to run the VBA code on a computer running a German version of Excel that had decided that the decimal point was a thousands separator and the comma was the decimal point. The setting for it in Excel was nowhere to be seen (I still haven't really figured it out; the central Windows setting seemed to have no influence on it, although I suppose it should), and 1.234 was then 1234 and 1.2E-02 was a character string. Oh, the pain... Thankfully, my *basic* Fortran part absolutely did not care, it just worked, and only the I/O needed to be reviewed.

Try to send the program to a customer without knowing what kind of Excel version he is running. We had to go as far as Office 97 just to be sure, and there was still the problem with the locale. After a year, the I/O was useless, but who cares? It was only 1% of the code.

I would still use Excel, but for nothing other than the most trivial tasks. There are wonderful libraries out there that work with Fortran and produce very nice graphs on the fly.

Look at all those posts saying "Excel is not the right tool for this" or "When all you have is a hammer...". The point was not grokked by those folks.

I'll lay it out for you, plain and simple:

This book is like installing a Linux kernel onto a wristwatch.

We should be marvelling at the feat, not lambasting a tool that was "hacked" to do so much more than it is normally used for. If you can't appreciate that kind of work, maybe you should just stick to appreciating fine arts.

While some of us admire the author for doing something akin to fitting Linux on a watch, some of the objectors are pointing out that the watch still needs to tell time accurately. There are a number of papers that show how Excel does not have the accuracy necessary for detailed scientific analysis. Some things, like random number generation, are not implemented correctly. So back to the watch analogy: as long as the author clearly divulged that the watch only tells time accurately to the minute and is not waterproof…

The publishing industry (including my company) typesets books using LaTeX all the time. The reason you don't notice it (apart from the superior quality) is that it does its job of typesetting very well.

If this book has been typeset using LaTeX then I'm a Dutchman, or something has gone very wrong (and I'd like the author to contact me to let me know what).

Perhaps he was given faulty fonts, perhaps he was using a badly-written publisher's style, or perhaps he -- or his editor -- spent a long time making it look as bad as possible. Maybe OUP had it completely re-typeset in some other system without telling him. There are at least a dozen typographic faults in one paragraph alone, from unnecessary hyphenation to excessive word-spacing to bad math spacing, and LaTeX simply doesn't make those types of mistake unless you work very hard to introduce them manually.

As I don't have the book (and wouldn't understand it anyway:-) I'd be interested to know where the information came from that it was typeset with LaTeX; and if it really was done in LaTeX, I'd love to know WTF kind of style files, fonts, and preamble were used.

Maybe some of the people yelling about how Excel is the wrong tool can give some advice for my scientific data analysis and visualization needs.

I have simulations (written in C++ and Python) that spit out tab delimited data files. I then need to analyze that data, doing things like linear regression on subsets of the data and calculations to transform the raw data into something else for plotting.

I have a Mac (with Windows XP in Parallels), I am not a student, and I don't have much budget ($500) for software. Currently I use a Mac program called Plot which is a little buggy and incomplete but has some nice plotting abilities. When I need a spreadsheet I use Apple's Numbers, but that seems sorely limited in abilities. What's a better tool for this job?
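A sketch of how that workflow might look with plain NumPy, since several comments above recommend it (the column layout, subset threshold, and transform are invented for the example; matplotlib would handle the plotting step):

```python
import io
import numpy as np

# Stand-in for one of the simulator's tab-delimited output files:
# two columns, time and measurement
raw = "0.0\t1.0\n1.0\t2.9\n2.0\t5.1\n3.0\t7.0\n"
data = np.loadtxt(io.StringIO(raw), delimiter="\t")

t, v = data[:, 0], data[:, 1]

# Linear regression on a subset of the rows (here: t >= 1.0)
mask = t >= 1.0
slope, intercept = np.polyfit(t[mask], v[mask], 1)

# Transform the raw column into something else for plotting
log_v = np.log(v)
```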