(1) See Peter Drucker's new book, The Essential Drucker, for a thoughtful chapter on "the information executives need today." That is, you should start by considering the intellectual problems that the displays are supposed to help with. The point of information displays is to assist thinking; therefore, ask first of all: What are the thinking tasks that the displays are supposed to help with?

(2) It is essential to build in systematic checks of data quality into the display and analysis system. For example, good checks of the data on revenue recognition must be made, given the strong incentives for premature recognition. Beware, in management data, of what statisticians call "sampling to please"--selecting, sorting, fudging, choosing data so as to please management. Sampling to please occurs, for example, when the outflow from a polluting factory into the Hudson River is measured by dipping the sampling test-tube into the cleaner rather than the dirtier effluent.
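A minimal sketch of what one such built-in check might look like; the record fields and the recognized-before-delivery rule here are illustrative assumptions, not a statement of accounting policy:

```python
from datetime import date

def flag_premature_recognition(transactions):
    """Return transactions whose revenue was booked before delivery.

    Each transaction is a dict with 'id', 'recognized' (date revenue was
    booked), and 'delivered' (date the product actually shipped, or None).
    """
    return [t for t in transactions
            if t["delivered"] is None or t["recognized"] < t["delivered"]]

deals = [
    {"id": "A-1", "recognized": date(2001, 3, 30), "delivered": date(2001, 4, 12)},
    {"id": "A-2", "recognized": date(2001, 3, 31), "delivered": date(2001, 3, 20)},
    {"id": "A-3", "recognized": date(2001, 3, 31), "delivered": None},  # never shipped
]

suspect = flag_premature_recognition(deals)
print([t["id"] for t in suspect])  # → ['A-1', 'A-3']
```

The point is not the particular rule but that the check runs automatically on every record, rather than trusting whatever the reporting chain chooses to pass upward.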

(3) For information displays for management, avoid heavy-breathing metaphors such as the mission control center, the strategic air command, the cockpit, the dashboard, or Star Trek. As Peter Drucker once said, good management is boring. If you want excitement, don't go to a good management information system.

Simple designs showing high-resolution, well-labelled information in tables and graphics will do just fine. One model might be the medical interface in Visual Explanations (pages 110-111) and the articles by Seth Powsner and me cited there. A model for tables might be the supertable, shown in The Visual Display of Quantitative Information, p. 179. More generally, see chapter 9 of The Visual Display of Quantitative Information. The displays should often be accompanied by annotation, details from the field, and other supplements.

Sparklines show high-resolution data and also work to reduce the recency bias prevalent in data analysis and decision-making. Sparklines are ideal for executive decision support systems. See our threads on sparklines.
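As a rough plain-text sketch of the idea, with Unicode block characters standing in for properly typeset sparklines:

```python
BARS = "▁▂▃▄▅▆▇█"  # eight block characters, shortest to tallest

def sparkline(values):
    """Map a numeric series onto block characters, one per value."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1  # avoid division by zero for a flat series
    return "".join(BARS[int((v - lo) / span * (len(BARS) - 1))] for v in values)

# a word-sized graphic set in a line of text, newest value at the right
revenue = [310, 305, 340, 355, 330, 360, 390, 385, 410, 430]
print(f"revenue {sparkline(revenue)} {revenue[-1]}")
```

Because the whole history sits next to the latest number, the most recent reading is seen in the context of everything that came before it.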

(4) For understanding a process and for designing a display for understanding a process, a good way to learn about what is going on is to watch the actual data collection involved in describing the process. Watch the observations being made and recorded; chances are you will learn a lot about the meaning and quality of the numbers and about the actual process itself. Talk to the people making the actual measurements; maybe you'll learn something.

(5) Measurement itself (and the apparent review of the numbers) can govern a process. For example, in printing my books, I ask that, during the press run, the density of the black ink be measured in 6 or 8 different positions on every 3000th sheet printed. These pulled sheets are then inspected shortly after the run and before the next run. The idea is to try to ensure that the color of the black type is uniform and at the right level of blackness in 3 ways: (1) across the 8 pages printed up on each sheet of paper, called a "form", (2) over the 40,000 sheets printed of that form, and (3) over the many forms making up the entire book. We sometimes review these pulled sheets the next day to check these density readings and to yell at the printer if there is a problem. But mainly the mere fact that the printers are making these measurements keeps the process in control. And the fact that someone might review the measurements.
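The press check above is, in effect, a simple control-limit test. A hypothetical sketch (the target density and the tolerance are invented numbers, not my printer's actual specifications):

```python
TARGET = 1.70      # hypothetical target ink density
TOLERANCE = 0.10   # hypothetical acceptable deviation from target

def out_of_control(sheet_readings):
    """Return (sheet_index, position, density) for every densitometer
    reading that strays outside target ± tolerance.

    sheet_readings: one list of readings per pulled sheet, e.g. the
    6-8 positions measured across the form.
    """
    return [(s, p, d)
            for s, sheet in enumerate(sheet_readings)
            for p, d in enumerate(sheet)
            if abs(d - TARGET) > TOLERANCE]

pulls = [
    [1.68, 1.71, 1.70, 1.69, 1.72, 1.70],   # sheet pulled at 3,000
    [1.70, 1.66, 1.85, 1.71, 1.69, 1.70],   # sheet pulled at 6,000
]
print(out_of_control(pulls))  # → [(1, 2, 1.85)]
```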

Note that this example is mainly just common sense in workaday action; no jargon about an Executive Decision Protocol Monitoring Support Dashboard System is needed. In fact, such jargon would be an impediment to thinking.

(6) My own encounter with a real business trying to improve management data and the display of that data was in consulting for Bose. At one point it appeared to me that too many resources were devoted to collecting data. It is worth thinking about why employees are filling out forms for management busybody bureaucrats rather than doing something real, useful, productive. The story of this work is told in Michael H. Martin, "The Man Who Makes Sense of Numbers," Fortune, October 27, 1997, pp. 273-276; and in James Surowiecki, "Sermon on the Mountain: How Edward Tufte led Bose out of the land of chartjunk," Metropolis, January 1999, pp. 44-46. Both accounts make me appear excessively heroic. These articles are posted in the NEW section at http://www.edwardtufte.com/tufte/fortune_97 and http://www.edwardtufte.com/tufte/metropolis_0199

(7) Most of all, the right evidence needs to be located, measured, and displayed. And different evidence might be needed next quarter or next year.

-- Edward Tufte

Response to Executive Decision Support Systems

One reason management reports tend to be dull and unhelpful is that so many of them are designed to show variances between actual results and targets/budgets.

The idea is that reducing variances keeps an organisation on track. There is no need to look further.

Once you recognise that this technique doesn't work very well, it is natural to be much more imaginative about the displays you use.

The Beyond Budgeting Round Table (www.bbrt.org) is dedicated to helping member organisations (e.g. Unilever, the World Bank) get rid of budgetary control and similar systems, and part of doing that will be improvements to the display of management information.

I've had a go at tackling the subject in one of the papers on my website at www.internalcontrolsdesign.co.uk. The paper is called "Design ideas for Beyond Budgeting management information reports".

Back in the 70s and 80s, when state-of-the-art reporting was on 17x11 green-lined piano paper, I developed a reporting concept called "Across a crowded room". The mind experiment was a long glass-walled corridor in an old-fashioned office building with offices leading off it and desks facing the window. The employees all sat facing the window, and every hour the manager would walk down the corridor and rap on the glass wall of each office. The employees in each room, without turning round, would each hold one piece of paper above their heads that would tell the manager, from the distance of the corridor, what he needed to know. He would only enter an office where a report was bad.

The design criteria which result are useful principles for all information. Good reports should look less troubled than bad ones. Bad reports should be busy and highly coloured. Similar types of result should be shaped and coloured similarly. No text should be readable in the distant view, but each piece of paper should have all the information in detail upon it, so that it can be inspected up close. The distant appearance should be the sum total of the information, thus giving a fractal effect. Text can be used to give shapes (see The Visual Display of Quantitative Information, page 141, for an example).

My problem with conventional ESS/Dashboard/Balanced Scorecard displays is that they are often devoted largely to symbols that need further information from elsewhere. Why have a green "traffic light", for example, when the reporting text itself could be green, thus giving the same effect from a distance?
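To make the point concrete, here is a sketch in which the reporting text itself carries the colour; the ANSI terminal escape codes are an illustrative assumption about the display medium, and the figures and threshold are invented:

```python
GREEN, RED, RESET = "\033[32m", "\033[31m", "\033[0m"

def status_line(label, actual, threshold):
    """Set the figure itself in green or red: the number carries its
    own status, so no separate traffic-light symbol is needed."""
    colour = GREEN if actual >= threshold else RED
    return f"{label:<12}{colour}{actual:>8,.0f}{RESET}"

print(status_line("margin", 412_000, 380_000))   # figure appears in green
print(status_line("shipments", 9_120, 10_000))   # figure appears in red
```

From across the room only the colour registers; up close, the full number is there to be read.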

Some useful metaphors for data displays in business monitoring systems include the tables
on the sports data page, the weather page, and the financial page of a good newspaper.
There, readers quickly look through large data sets in order to examine selected tables of
data. The selection of the tables to look at is more in the hands of the viewers and less in the
hands of the editors. Thus executives can use their judgment in deciding which data reports
to look at while scanning over a large collection of monitoring tables.

Millions of people every day select and read the sports data tables and the weather pages found in newspapers; business executives can also work at that level of resolution. Also, the sports and weather (and stock market) tables have a straightforward, workaday, undesigned quality.

-- Edward Tufte

Another way to look at this: Why don't the data tables for business monitoring have the same
design and density of information as the financial tables in the Wall Street Journal or the New
York Times?
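One might sketch that density with nothing fancier than fixed-width columns; the regions and figures below are invented for illustration:

```python
# a compact monitoring table in the manner of a newspaper financial page
rows = [
    # region, revenue ($K), margin %, variance vs plan (points)
    ("Northeast", 4_210, 31.2, +2.4),
    ("Southeast", 3_855, 28.9, -1.1),
    ("Midwest",   2_990, 30.4, +0.3),
    ("West",      5_140, 24.1, -4.7),
]

print(f"{'Region':<10}{'Rev $K':>8}{'Mgn %':>8}{'vs Plan':>9}")
for region, rev, mgn, plan in rows:
    print(f"{region:<10}{rev:>8,}{mgn:>8.1f}{plan:>+9.1f}")
```

Dozens of such tables can sit on one page, and the reader, not the editor, decides which to inspect.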

I watched Microsoft's demo of the Business Scorecard Manager (BSM). The technology underlying the product looks like it could deliver a useful system. For example, SQL Server could pull useful information quickly and arrange it in a user-familiar format; the Pivot Table feature could allow for quickly seeing things in alternative perspectives by re-arranging and/or filtering them. But the system shown in the demo was atrocious.

Contoso is described as a fictitious sporting goods company. It is a good thing for them that they are fictitious: their BSM is abominable, and their executives make decisions based upon very little data.

The graphs shown include 3D exploded pie charts, polar charts (not always bad, but IMHO not useful for financial or operational graphs), 3D area graphs (lovely: when the data in front gets larger than the data in back, you cannot see the data in back), and pie charts embedded in a map of the United States (looking very much like the "what not to do" sample that Tufte showed on pg 178 of VDQI).

Then, to illustrate the functionality of the system, they show an example in which the CEO contacts the CFO (Laurie) to find out why margins are low. Laurie sees that margins are low in the western part of the US and concludes that the inexperienced salespeople are emphasizing volume over margin and are discounting too heavily. She then launches a sales training program based upon this analysis, and the next quarter the information is shown in green thanks to Laurie's prompt action. Hooray for Laurie and the Microsoft BSM!

Aaaarrgghh! I would hate to work in a company where that level of idiocy is displayed by the CFO. If one could make top-level decisions based on such thin data without a lot of nuanced understanding, why not let the computer run the company and save the CFO's salary?

It is possible that such a scenario might play out in the real world and that an experienced CFO might be drawing on a vast reservoir of knowledge about the nature of the markets, the sales force, etc., but this demo seems to imply that decisions are being based on some fairly thin data. At the very least, someone in a position like Laurie's would need to check all of her assumptions and ask lots of questions to determine whether the problem is as simple as the demo would have you believe.

The data displays and control panels on my photo-editing program (Aperture) are far more intelligent and informative than those on the Microsoft mock-up. The guy pacing around and saying "cool" a lot seems to be imitating Steve Jobs's presentation style. Better to imitate the Apple interface for, say, Aperture. Here is Jobs, some years ago, on Microsoft product development:

What do you think of GE's Digital Cockpit, as discussed in their 2001 Annual Report? Personally, I like the fact that a lot of data are displayed, but it sure is ugly.

Asked by Bill Weitze

Too low resolution; not serious. Scientific analysis these days operates at data-resolutions and data-densities 10 to 20 times greater (see any recent issue of Science or Nature) than what is shown in the GE digital cockpit. So should financial analysis.

My earlier answer (August 27, 2001) at this board on Executive Decision Support Systems, "Ideas for Monitoring Business and Other Processes," is reproduced in full at the top of this thread.

Coming from a project-driven engineering environment where the software production schedule is established from the delivery date (rather than from telling the customer when they can actually have the product), we began with an unrealistic baseline; so it should have come as no surprise when we engineers were then asked to "extract" our current status into simple green/yellow/red status bars for weekly reporting to the first level of the management chain.

So the data extraction process was broken at the lowest level of reporting. Not only was the fox put in charge of the hen-house (the chickens are fine, thanks), but, being engineers, we quickly learned to "game" the system to drive resources where we needed them, in a collaborative process held between the lead engineers at the local watering hole on Fridays.

The biggest problem with this model (outside of our continuing failure to meet marketing's promises) was that it circumvented and circumscribed three layers of management, who had effectively agreed to be circumvented.