Flaws in a Less Rigorous Education “Dashboard”

Last month, U.S. Secretary of Education Arne Duncan released an online “Education Dashboard” that is supposed to show how American schools and students are performing and to encourage public debate and discussion. The dashboard offers Web pages full of numbers, charts, and arrows meant to provide us with a meaningful picture at a glance of where we are and where we are heading.

Neither of us is a stranger to such an effort, and we appreciate the difficulty of concisely yet meaningfully presenting a complex endeavor such as education. In fact, some two years ago, we worked on a very similar project under then-Education Secretary Margaret Spellings. Our key objectives were focus and brevity (we targeted five indicators: achievement, achievement gap, high school graduation, college readiness, and college completion); after all, the U.S. Department of Education already provides extensive annual information in its Digest of Education Statistics and The Condition of Education.

We thought being concise was important because, despite an ocean of data, nothing existed at the time to which the general public could easily relate. There was nothing like the cost-of-living index or the unemployment figures that could quickly and clearly communicate to the nation how our schools and students were doing.

With this in mind, we were pleased to see that the current administration has chosen to continue what we believe is an important way of telling the story of education to the public. Yet we were disappointed to see that this administration has chosen to backtrack from the George W. Bush administration’s focus on outcomes. The Obama administration and Secretary Duncan have instead paid increased attention to process and inputs. So, for example, where the Bush-era indicators defined college readiness by college-admissions-test scores of high schoolers, Secretary Duncan has replaced them with the rather meaningless measure of “public school graduates who took at least one Advanced Placement test”—took, but did not necessarily pass.


While many distracting inputs have been added, an important one has been dropped. That input measure, included in the Bush-era indicators and broadly meaningful to the public, is average national K-12 spending per student. Such a figure is a vital component of any effort to measure productivity, and per-pupil spending has continued to rise even during the recession. Yet this figure is now absent from the Obama administration’s dashboard and thus hidden from the public.

Previously, a measure of educational attainment reported how many 25- to 34-year-olds held a bachelor’s degree or better; the new replacement measure reports how many hold an associate’s degree or better. Yes, the numbers are higher, but are they more meaningful? Do they indicate international competitiveness more accurately?

Perhaps the most wrongheaded Obama administration change is the elimination of anything pointing to the size of the achievement gap. Where the previous Bush-era indicators prominently included the ratio of disadvantaged to nondisadvantaged students achieving proficiency, the new dashboard simply shows achievement trends in a way that glosses over the gap.

The dilution of expectations and the change of focus from outcomes to inputs are major problems, but not the only ones. As we have mentioned, there is already an abundance of available educational data; what’s missing is a focused and succinct summary that can be easily communicated. The new dashboard includes more than a dozen numbers and more than three dozen arrows and indicators of trends, each leading to yet more supporting tables and graphs.

We find it difficult to believe that such large amounts of data will help the general public get a better sense of the state of public education. In fact, this abundance of numbers and indicators reminds us of a flaw once found in nuclear-plant control rooms, one that contributed to the Three Mile Island accident by overloading plant operators with too much unimportant data.

In summary, we are disappointed that Secretary Duncan and the Obama administration have chosen, in effect, to lower achievement expectations and regress to a focus on inputs rather than on outcomes.

Ze'ev Wurman is a manager in a Silicon Valley technology startup company and was a senior adviser to the U.S. Department of Education during President George W. Bush’s administration. Williamson M. Evers is a research fellow at Stanford University’s Hoover Institution and a member of the institution’s Koret Task Force on K-12 Education. He was the assistant U.S. secretary of education for planning, evaluation, and policy development during the same administration.

Vol. 30, Issue 20, Pages 28-29

Published in Print: February 9, 2011, as New Education Dashboard: Less Rigorous, Less Meaningful
