How Productive Is Your Program?

U.S. News and the National Research Council have some new competition in the rankings business -- from a company that takes a very different approach to evaluating universities.

In recent weeks, a company called Academic Analytics has started selling its research to universities as a tool for evaluating graduate programs. More than 10 universities have already purchased the service, which promises a better way to analyze how productive departments are and how they compare with their peers. The new business is being talked about among graduate deans and institutional research leaders, but faculty members whose output is being analyzed are largely unaware of the tool.

Some experts believe that the company -- whose product is based on a graduate dean's research -- offers a much better way to measure program quality than anything now available. The tool appears to be particularly popular among up-and-coming research universities that want to demonstrate their quality. But others who have been briefed on the new tool are skeptical, or wonder why the information should be available only to those willing to buy it.

The rankings provided by Academic Analytics come from the Faculty Scholarly Productivity Index, which was developed by Lawrence Martin, dean of the graduate school at the State University of New York at Stony Brook. The index analyzes a series of measures of faculty productivity:

Journal publications per capita.

Book publications with university presses, per capita.

A combined book and journal publication index.

Journal citations.

Grant dollars per faculty member.

Percentage of faculty members in a department receiving grants.

A "Faculty Funding Index" based on the various grant measures.

Awards and honors per faculty member, gathered from a long list of groups issuing such commendations.
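The measures above could, in principle, be folded into a single departmental score by converting raw totals to per-capita rates and then averaging standardized values across measures. The sketch below is purely illustrative: Academic Analytics' actual weighting scheme is proprietary and not described in this article, so the z-score averaging, the measure names, and the sample figures are all assumptions.

```python
from statistics import mean, stdev

# Hypothetical sketch of a per-capita composite productivity index.
# The real Academic Analytics methodology is proprietary; the z-score
# averaging here is an illustrative assumption, not the company's method.

def per_capita(totals, faculty_count):
    """Convert raw departmental totals into per-faculty rates."""
    return {measure: value / faculty_count for measure, value in totals.items()}

def composite_index(departments):
    """Average each department's z-scores across all measures.

    departments maps a department name to a dict of per-capita rates.
    """
    measures = next(iter(departments.values())).keys()
    scores = {}
    for m in measures:
        values = [d[m] for d in departments.values()]
        mu, sigma = mean(values), stdev(values)
        for name, d in departments.items():
            z = (d[m] - mu) / sigma if sigma else 0.0
            scores.setdefault(name, []).append(z)
    return {name: mean(zs) for name, zs in scores.items()}

# Invented sample data: raw totals plus faculty headcounts.
raw = {
    "Chemistry": ({"journal_pubs": 120, "grant_dollars": 4.0e6, "citations": 900}, 30),
    "History":   ({"journal_pubs": 45,  "grant_dollars": 0.3e6, "citations": 200}, 25),
    "Biology":   ({"journal_pubs": 150, "grant_dollars": 6.5e6, "citations": 1400}, 40),
}
rates = {name: per_capita(totals, n) for name, (totals, n) in raw.items()}
index = composite_index(rates)
```

One design question such an index must settle, which the article hints at, is normalization: per-capita rates and standardization make a small history department comparable to a large chemistry department, but only within a discipline, since publication and grant norms differ sharply across fields.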

The company has gathered information on departments, faculty member by faculty member, for universities considered "research extensive" or "research intensive" under the categories developed by the Carnegie Foundation for the Advancement of Teaching. Combined, those categories cover most institutions offering any significant graduate education.

"What our system does is to look at the major areas of scholarly activity," said Martin. While some of the measures are used in other rankings, this system combines them and is also set up for regular updates and a broad range of disciplines (85 in all), allowing universities to track changes from year to year and see long-term patterns. In contrast, the National Research Council's departmental rankings are currently being revised for the first time since 1995.

Many universities are frustrated not only with the time between NRC rankings, but also with the council's generally cautious approach to adding fields. Academic Analytics takes a more inclusive approach. "If you can get a Ph.D. in it, we measure it," Martin said.

Universities that buy the service will receive reports that compare their departments to those of 10 other universities, selected by the purchasing institution. Universities can select 10 comparison institutions for all disciplines or (for an extra fee) vary the comparison institutions by discipline. The databases that produce these reports are proprietary and will not be published.

The fee structure is based on how many programs a university wants evaluated. Martin said that an institution with the broadest range of programs might spend $30,000 a year on the service while a place like Stony Brook would be able to have its departments evaluated for about $10,000.

"The only university that doesn't need this report is the institution where every program is wonderful and equally at the top of the pack," Martin said, adding that he thought the reports would be a great tool for deans seeking to figure out which departments could improve, which needed more resources and attention, and so forth.

He declined to release the names of those that have already purchased the service, but officials of the State University of New York at Albany and the University of Cincinnati confirmed that they are among them. Generally, Martin said, the universities buying have been those on the rise rather than those that have been known for decades for their top research departments.

Opinions about the service and its use vary among those who have been discussing it. Several university administrators said that they were intrigued by the idea, but didn't think their faculty members would like it and that they wanted to wait a bit before signing on.

Susan Herbst, provost at Albany, said that the Academic Analytics system is "not perfect," but that the breadth of its rankings across disciplines is much better than what the NRC offers. She praised the service and said she also planned to make use of rankings from The Center, a University of Florida program that evaluates research universities.

Herbst said that having multiple evaluations is a good thing. "Since we are moving this university ahead dramatically, we'll take all the data we can get and then triangulate," she said. "The more independent bodies that can help us assess quality, the better. It's nearly impossible for senior faculty, or even a stellar graduate dean, to judge where their programs sit on the national scene."

Robert Frank, dean of the graduate school at Cincinnati, said that the NRC rankings are "a bit old," and that state officials are pushing hard for measures of the quality of programs, making the new service attractive. "I like having these sorts of data available when we come to the table."

David Hardesty, president of West Virginia University, said he hasn't been approached about using the service, but that he could see why it would be popular with administrators. "In the private sector, data comparisons like this are common," he said. "Higher education needs to be more data driven" in discussions about faculty productivity, he added.

Some are waiting before deciding whether to use the service and are attending briefings about it. Officials from the company are soon meeting with the institutional research officials of members of the Association of American Universities.

Julie Carpenter-Hubin, director of institutional research and planning at Ohio State University, said she's interested in the project because "current data sources on faculty scholarship are limited" and there's a lot of interest in finding ways to make comparisons. "Academic Analytics is capturing that kind of information. A big part of what they are selling is the comparison," she said.

Carpenter-Hubin said she will be watching to see how the company's clients use the data, and whether they find it valid.

Lydia Snover, assistant to the provost at the Massachusetts Institute of Technology, said that she was originally "quite negative" about the idea, but that after receiving a pitch, she believes that the concept "has some potential." Snover said that people who lead universities very much want new measures of faculty productivity. But she said the data may be of less value to a university that already knows some of its departments sit at the top of any ranking.

There is also the issue of checking the accuracy of the data. Because the database is proprietary, universities won't generally see the information about their departments without making a purchase.

Elizabeth Capaldi, vice chancellor of the SUNY system, is also co-editor of the work of the Florida center on ranking research universities. That project is currently expanding to look at departments, not just entire universities. Capaldi noted that The Center's data are public -- so any university can challenge the basis for rankings. "The data should be checkable," she said. "It's very important to be able to correct errors, to understand the methodology."

In response to such concerns, Academic Analytics has shared some of its data with some universities, but officials at some of those institutions have been trading e-mail messages asking one another why they should check the data to enable a business to make money.

John V. Lombardi, who founded the Florida center and is the other editor of its project, said that he has heard that many academics are "not entirely comfortable with the profit-making model" used by Academic Analytics. Lombardi, chancellor of the University of Massachusetts at Amherst, is also a columnist for Inside Higher Ed.

In terms of methodology, Lombardi said that he was bothered by the lack of differentiation among journals (some of which are easy to publish in and others of which are quite difficult) and by the lack of precision about works with multiple authors. He said there may be other flaws as well, but that, as a non-subscriber to the service, he cannot see all parts of the methodology.

Still, he predicted that there would be demand for what the company is offering. "The desperation to prove institutional excellence will surely encourage some universities to subscribe just as most universities, whose administrators and faculty know better, nonetheless collaborate in helping U.S. News make a high profit on the dubious rankings they produce," he said.

If the Academic Analytics project is raising both interest and eyebrows now, there is another wrinkle that could make the project even more controversial down the road. The data reported is by department, not by individual. But all of the data is gathered faculty member by faculty member, so the system could allow for comparisons of individual professors on all of the factors in the system.

Academic Analytics officials state that while they have "unit record data" (meaning data on individuals), only the departmental aggregate data will be sold. But when explaining this policy, Martin said the company is not going to use the individual data "for the moment."

He said he worried that data on individuals drawn from any single year might not be reliable, and that his fear was that administrators armed with data on individuals "would hand out pink slips without thinking."

Asked to define "for the moment," he said that the company is in the process of creating an academic advisory board to see if it has made "good decisions" on many issues, including its non-release of individual records, and that the company "might think about" different approaches in the future. But he reiterated that there are no plans to sell individual data now.