The 2013 RHSU Edu-Scholar Public Presence Rankings

Today, we unveil the 2013 RHSU Edu-Scholar Public Presence rankings. The metrics, as explained yesterday, are designed to recognize those university-based academics who are contributing most substantially to public debates about K-12 and higher education. The rankings offer a useful, if imperfect, gauge of the public impact edu-scholars had in 2012, reflecting both short-term activity and longer-term contributions. The rubric captures both a scholar's body of academic work--encompassing books, articles, and the degree to which these are cited--and their 2012 footprint on the public discourse.

The top scorers are familiar edu-names with long careers, bodies of influential scholarship, track records of commenting on public developments, and outsized public and professional profiles. Linda Darling-Hammond and Diane Ravitch are tied for first (in the case of ties, scholars are listed alphabetically by last name), followed by Howard Gardner, Rick Hanushek, and Paul Peterson. Rounding out the top ten are Larry Cuban, Gary Orfield, Yong Zhao, Richard Elmore, and Tony Wagner. These results reflect the nature of the scoring, which recognizes the influence of a scholar's body of work and not simply whether a scholar garnered press clippings or blog mentions in 2012.

The RHSU Edu-Scholar Rankings are restricted to university-based researchers and exclude think tankers (e.g., Checker Finn or John Chubb) whose job description is to influence the public discourse. After all, the point is to nudge what is rewarded and recognized at universities. (The term "university-based" provides a bit of useful flexibility. For instance, Tony Bryk currently hangs his hat at Carnegie. However, he is an established academic who retains a university affiliation and campus digs. So he's included.)

In calculating scores, we sought to be careful and consistent. That said, there were inevitable challenges in determining search parameters, dealing with common names or quirky diminutives, and so forth. Bottom line: this is a serious but inevitably imperfect attempt to nudge universities, foundations, and professional associations to consider the merits of doing more to cultivate, encourage, and recognize contributions to the public debate.

In terms of affiliations, Harvard fared impressively, claiming two of the top five spots, and six of the top 20. Stanford also claimed two of the top five and four of the top 20, while the University of Virginia and NYU each nabbed two top 20 spots. Other institutions with faculty placing in the top 20 were UCLA, the University of Oregon, Columbia (Teachers College), the University of Wisconsin, Johns Hopkins, UC Berkeley, and Arizona State.

As with any such ranking, this exercise ought to be interpreted with appropriate caveats and caution. Given that the ratings are a snapshot of 2012, the results obviously favor scholars who penned a successful book or big-impact study this year. But that's how the world works. And that's why we do this every year.

Because of the scoring caps, a few scholars tended to hit the ceiling in any given category:

➢ When it came to book points, Ravitch, Gardner, Nel Noddings, Cuban, and Peterson each maxed out. Ravitch also posted the highest Amazon score, at 19.7, as well as the highest Klout score, at 8.2.

➢ With regard to mentions in the education press, only Ravitch hit the cap, while Ravitch, Darling-Hammond, Gardner, Hanushek, Cuban, Wagner, Pedro Noguera, and Roland Fryer each hit the cap when it came to blog mentions. A similar cast of characters maxed out on newspaper mentions--Ravitch, Darling-Hammond, Gardner, Hanushek, Fryer, and Jonathan Zimmerman.

If readers want to argue the relevance, construction, reliability, or validity of the metrics, I'll be happy as a clam. I'm not sure that I've got the measures right, that categories have been normed in the smartest ways, or even how much these results can or should tell us. That said, I think the same can be said about U.S. News college rankings, NFL quarterback ratings, or international scorecards of human rights. For all their imperfections, I think such efforts convey real information--and help spark useful discussion. That's what I've sought to do here.

I'd welcome suggestions for possible improvements--whether that entails adding or subtracting metrics, devising smarter approaches to norming, or what have you. I'd be interested in hearing your critiques, concerns, questions, and suggestions. So, take a look, and have at it.


The opinions expressed in Rick Hess Straight Up are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.