Reputation Without Rigor

Submitted by Stephanie Lee on August 19, 2009 - 3:00am

The form submitted by the provost at the University of Wisconsin at Madison deemed 260 of its 262 peer institutions to be of “adequate” quality. A survey from the University of Vermont’s president listed “don’t know” for about half of the universities. The forms provided by Ohio State University’s president and provost were virtually identical. And the University of Florida’s president, like his highly publicized colleague at Clemson University, rated his own institution well above many of its competitors.

Long a sore spot for many critics, the peer assessment survey for U.S. News & World Report’s annual college rankings has been subjected to especially tough scrutiny since June, when an official at Clemson revealed that her bosses, as part of a larger strategy to propel the university up the rankings, had regularly given low scores on the “reputational” survey to other universities to make Clemson look better.

To try to gauge the extent to which Clemson was an anomaly or an example, Inside Higher Ed sought the reputational survey forms from the 48 other public institutions in the magazine’s 100 “best national universities” last year.

Our review found little of the sort of outright gaming that was apparent in Clemson’s strategy, and many institutions appeared to engage in honest, if imperfect, attempts to assess the quality of others. But as U.S. News prepares to release the 2010 edition of its survey tomorrow, the analysis also suggests that the reputational survey is subject to problems, such as haphazard responses and apathetic respondents, that add to the lingering questions about its legitimacy.

The Peer Survey

The peer assessment survey, worth 25 percent of a university’s ranking, asks college presidents, provosts and deans of admissions to rate the “academic quality” of undergraduate programs at hundreds of other institutions on a scale of “distinguished” to “marginal.” Inside Higher Ed filed public records requests for the most recent surveys from 48 institutions; U.S. News keeps the assessments confidential. Eighteen provided full or partial sets of records, while 22 said they had not kept copies. The remainder either claimed exemption under state laws, did not respond by press time, declined to make the data readily available, or said they had not participated in the U.S. News survey to begin with.

Among those that did provide their responses, several revealed major oddities. At the University of Wisconsin at Madison, the provost’s most recent peer assessment form gave the highest possible rating, “distinguished,” to just two institutions: its own and the New School.

To every other university but one, Madison’s response gave the second-lowest rating, “adequate.” Those 260 “adequate” institutions included Harvard, Yale and the rest of the Ivy League, the University of California at Berkeley, the Massachusetts Institute of Technology and Stanford University. Only Arizona State University scored below all the rest, given the lowest rating of “marginal.”

This news surprised Julie Underwood, then-interim provost, when informed by Inside Higher Ed this month. That’s because she didn’t fill out the survey submitted in her name. As many other officials do, she sent it to an administrator to complete on her behalf -- in this case, Aaron Brower, vice provost for teaching and learning, who had never filled it out before. Underwood did not give her input or approval before he returned it to U.S. News.

Brower says he was responding as neutrally as possible to a “bad survey” and an “impossible question” that have gained what he views as a “shocking” degree of importance. Universities are “good in some areas and not good in others,” he says, and catch-all ratings ignore those nuances.

“I first looked at this and started considering every institution and trying to fill it out that way. And I thought, if anyone were to ever ask me this and say, ‘Why did you put this institution as strong versus good?,’ I wouldn’t have an answer,” says Brower, who has been researching higher education for 25 years and became vice provost in 2007. “That was to me less defensible than saying, ‘It’s a mixed bag, here’s a neutral response.’ ”

Brower’s idea of a neutral response was to deem every institution “adequate” -- which he interpreted to mean “good enough” -- except for those he says he knows extremely well. Having worked at Madison since 1986, he says he is “very confident about saying that we're an excellent, distinguished school.” As for the New School, Brower says he admires its focus on writing seminars, internships and other programs aimed at improving learning outcomes. One of his sons is a rising sophomore at the New School, but Brower says he has long been impressed with the university, regardless of family ties.

He would not elaborate on why he singled out Arizona State as “marginal,” saying, “They were hit very hard by the economy and I know their program and felt like I couldn't rate them just kind of neutrally.”

Asked why he didn’t check “don’t know” for all the other institutions, or refuse to complete the survey, Brower says the possibility never occurred to him: “It seemed like this was a task that I was to do and I was to do it the best I could.” And he insists that he wasn’t trying to game the system, noting that his survey was one of hundreds. “There wasn’t any ulterior motive, like, ‘Oh, let’s increase our ranking,’ ” he says. “We already rank well. There’s sort of a marginal value for us to try and manipulate rankings.”

Underwood, who served as interim provost at Wisconsin from January to July, says it would not be “appropriate” for her to comment on the survey since Brower filled it out. But she says that she didn’t see anything wrong with forwarding it to him in the first place.

“[The official who] is most knowledgeable about other campuses fills that out, so it actually is valid,” Underwood says, adding, “It never would have been sent outside of the provost’s office.... We do have lots of people who work in the provost’s office. The office functions as an office, with people working in a collaborative team.”

U.S. News specifically addresses the survey to presidents, provosts and deans of admissions, as noted by Robert Morse, the magazine’s director of data research, in a March letter enclosed with the survey. The recipients “have direct unique knowledge of the quality of undergraduate programs in this country,” he wrote. The letter continues, “As a result, you are one of a select group of people being asked their opinion of undergraduate programs at colleges and universities in your category of institution. We believe that you have the broad experience and expertise needed to assess the academic quality of your peer institutions.”

Furthermore, the magazine relies on universities to report data accurately, Morse said in a June 11 blog post.

Paul M. DeLuca, now the provost at Wisconsin, says he is “not okay” with the survey responses and would take a more hands-on approach in the future. Overall, he says, he approves of the peer survey and its 25 percent weight; he believes that all the responses, averaged together, provide a reasonable gauge of quality.

“Given all the variables, this is the most reasonable mechanism of giving peer assessment of what do peers think of these institutions, [which] do have merit for this or merit for that,” he says. “In that sense, it’s reasonably useful. Would I imagine that I would select a school purely based on that? I doubt it, but it’s a good indicator.”

Clock’s A-Tickin’

Ten hours. With 260-some colleges, giving each two or three minutes of attention, that’s how long it would take to adequately respond to the U.S. News survey, estimates Daniel M. Fogel, president of the University of Vermont. And he says that’s time no one like him can afford to spend.

Fogel says he himself spent an hour filling in the bubbles in this year’s questionnaire, devoting 10 to 15 seconds to each institution. Asked why he didn’t invest more energy in the process, he said, “Nobody’s paying us to help U.S. News produce a commodity.… When I’m being paid hundreds of dollars a day, why would I spend time reading up on South Dakota State [University] so I could give U.S. News a better answer?”

While Fogel says he values the rankings overall, he called the assessment survey “clearly highly subjective and based for the most part on very partial and very insufficient measures by most of the people filling out the survey on most schools.” But despite his criticisms, he says he would never consider declining to respond: The magazine is “going to put it together whether we participate or not -- we might as well play and try to provide accurate, honest data.”

Last year, Fogel forwarded the survey to Fred Curran, Vermont’s director of institutional studies, to complete on his behalf. Curran marked “don’t know” for 156 of the universities, saying that his office has more pressing issues to deal with than researching the unfamiliar ones.

“I don’t think U.S. News expects you to evaluate every institution. This is a small piece of work that our office does -- very small,” Curran says. “I’m not going to spend an inordinate amount of time, when I’m responding for the president, to research some 250-odd institutions at this point.”

But opting out isn’t an option in Curran’s mind, either. “It’s something we’ve got to do -- it’s been around long enough.”

The Numbers Game

Among other unusual results from the surveys Inside Higher Ed received:

The surveys submitted by the president and provost at Ohio State were virtually identical in 2007, 2008 and 2009. For this year’s rankings, the president and provost rated Ohio State “strong” and gave an “adequate” rating to 108 and 104 institutions, respectively. Both gave identical ratings to all members of the Big Ten and the Ivy League, including “strong” ratings for Cornell University, Columbia University, Brown University, Dartmouth College and the University of Pennsylvania. They also both identified the same eight institutions as “distinguished.” Officials at Ohio State did not respond to repeated requests for comments or explanations about the similarities.

The presidents and/or provosts of 15 of the 18 universities rated their institutions “distinguished,” from Berkeley (No. 21 on last year’s list) to the University of Missouri at Columbia (No. 96).

At Berkeley in 2008, the chancellor rated other “top” publics -- including the University of Virginia, the University of Michigan at Ann Arbor and the University of North Carolina at Chapel Hill -- “strong.” However, he rated all of the University of California campuses “distinguished,” with the exceptions of Santa Cruz and Riverside, which were also “strong.” (Merced was not on the list.)

In a 2009 survey, an official at the University of California at San Diego (No. 35) rated that campus “distinguished,” above the University of Pennsylvania, Duke University, Dartmouth College, Northwestern University and Johns Hopkins University (all “strong”).

The president of the University of Florida (No. 49) rated his campus “distinguished” in this year’s survey -- along with Harvard, Stanford and MIT -- and no other institution in Florida above “good,” as reported by the Gainesville Sun.

Racing to the Top

Morse, who directs the magazine’s rankings, declined to comment for this article.

The 2010 edition of the rankings is hitting newsstands amid growing criticism from administrators and third parties. Lloyd Thacker, founder of the Education Conservancy, says he regards the peer assessment survey as “the most ludicrous” component of the process. His organization has circulated a letter asking college presidents to refuse to fill out the survey, part of a broader effort to make college admissions less commercial.

“It would be hard-pressed for any college president on a public stage to say they know more about more than 10 colleges. It’s not their job,” Thacker said, adding, “The rankings in and of themselves do a great disservice to education. They imply a degree of precision and authority that simply is not supported by data. Their influence has grown way beyond any kind of actual educational jurisdiction they might have, any educational reliability they might have.”

But Carolyn (Biddy) Martin, chancellor of the University of Wisconsin at Madison, says she doesn’t blame U.S. News per se, but rather the culture’s broader love of comparing and listing: “We're all susceptible to the lure of rankings of various things, and that sort of race to the top and reliance on rankings of this kind is itself part of the problem.”

As for her own survey, Martin said that she finds it difficult to evaluate every institution. (She said that an administrator in the Office of Academic Planning and Analysis fills out the chancellor’s survey in consultation with the chancellor.) But despite her concerns about U.S. News’s methodology, she said she feels as if she has increasingly little choice but to comply.

“Given the market-driven nature of higher education, as with other things, there’s always the inclination to try and do as well as possible on measures that are part of the survey,” she said. “And I guess I would like to have institutions have a little more say about what they think really matters when it comes to students' educations in colleges and universities.”