How Duke Achieved Its No. 1 Businessweek Ranking

When the new business school team at Bloomberg Businessweek did a review of the magazine’s rankings this past year, they quickly settled on one very big and disruptive change. Instead of surveying the corporate recruiters who literally make the MBA market—a relatively small group of companies and organizations—Businessweek’s editors chose to widen the net and survey every single person who showed up on a top business school campus to recruit MBAs.

That meant recruiters who visited only one school—their alma mater. That meant recruiters who hired only one or two MBA students. That meant multiple recruiters from the exact same companies. Instead of surveying a single person in charge of MBA recruitment at a company, as the magazine had done for the previous 26 years, Businessweek changed it up, and the voting went wild.

BUSINESSWEEK’S RECRUITER RESPONSE RATE PLUNGED TO LESS THAN HALF

The magazine sent its corporate recruiter surveys to 8,358 recruiters at 4,931 companies this year, nearly a 15-fold increase. Two years ago, Businessweek surveyed only 566 recruiters at 566 companies, and even that was a generous sample given the relatively small number of companies (fewer than 200 organizations) that actively recruit MBA talent. Businessweek had a very healthy 36% response rate two years ago, when 206 companies completed its employer surveys.

This year, the response rate plunged to less than half the earlier rate, just 15.8%, though some 1,320 recruiters completed the surveys, representing 614 employers. Yet the median number of schools rated by the respondents was only three, suggesting that many of the recruiters had highly limited exposure to the business school market and little basis for judging one school’s MBA graduates against another’s.

Worse, Businessweek conceded that the alumni in the recruiter sample were indeed biased.
“Alumni tended to rate their own school significantly more favorably than non-alumni rating that school,” explained Jonathan Rodkin, who oversaw the rankings methodology. “While some schools in our rankings had many alumni in the employer survey, others had zero.”

The magazine acknowledged that when a recruiter who was an alum of a school rated his or her alma mater, the scores were about 18% higher than the average non-alumni rating. Some 38 schools in the sample derived at least 25% of their ratings from alumni, while another five schools derived “at least 50%” of their ratings from alumni. Translation: Alumni recruiters played a substantial role in ranking nearly four in ten schools (43 out of 112) in this year’s Businessweek survey.

USING A STATISTICIAN’S LOGIC FOR THE CHANGE: THE LARGER THE SAMPLE, THE BETTER THE RESULT

Instead of damping the impact of this bias, Businessweek then chose to amplify it by using only this year’s recruiter survey for its employer score, discontinuing the practice of folding in historical data. In 2012, for example, BW gave that year’s survey results a 50% weight and added the results from both 2010 and 2008, each at a weight of 25%. In all, the 2012 employer scores reflected the views of 663 corporate executives in charge of MBA recruitment. That methodology tended to smooth out survey-to-survey variation.
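The old blending scheme amounts to a simple weighted average. A minimal sketch, using the weights the article cites (50% current survey, 25% for each of the two prior cycles); the scores themselves are hypothetical, not Businessweek data:

```python
def blended_employer_score(current, prior_2010, prior_2008):
    """Blend employer-survey scores: 50% current cycle, 25% each prior cycle."""
    return 0.50 * current + 0.25 * prior_2010 + 0.25 * prior_2008

# A one-year spike moves the blended score only half as far,
# which is why the old method produced less year-to-year churn.
spiked = blended_employer_score(90.0, 70.0, 72.0)   # 80.5, not 90
steady = blended_employer_score(80.0, 80.0, 80.0)   # 80.0
print(spiked, steady)
```

Dropping the historical terms makes each school's score ride entirely on a single year's (now noisier, partly alumni-rated) sample.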

The magazine used a statistician’s logic for the change: The larger the sample size, the better the result. Except that in this instance that logic is dead wrong. Surveying people who have no basis to make a comparative judgment about the MBAs they are evaluating only dilutes the impact of those in the sample who literally make the market. And by surveying alumni about their alma maters, you are inviting an inevitably biased result.
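The statistical point can be illustrated with a toy simulation; every number here is illustrative, not drawn from Businessweek's data. If a fixed share of raters systematically inflates scores (the article cites an 18% alumni boost), enlarging the sample shrinks random noise but leaves the bias fully intact:

```python
import random

def mean_rating(n_raters, alumni_share, true_score=70.0, alumni_boost=12.6):
    """Average rating from a sample where alumni inflate scores.

    alumni_boost of 12.6 is ~18% of a hypothetical true score of 70,
    echoing the inflation figure the article cites.
    """
    random.seed(0)  # deterministic for illustration
    total = 0.0
    for _ in range(n_raters):
        bias = alumni_boost if random.random() < alumni_share else 0.0
        total += true_score + bias + random.gauss(0, 5)  # rater noise
    return total / n_raters

# A fifty-fold larger sample converges on the same inflated average
# (~73.15 when a quarter of raters are alumni), not on the true 70.
print(round(mean_rating(100, 0.25), 1))
print(round(mean_rating(5000, 0.25), 1))
```

More respondents make a biased estimate more precise, not more accurate, which is exactly the flaw in the bigger-sample-is-better reasoning.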

To be sure, the new business school team, largely hired only this past year, made a well-intentioned effort to improve the rankings. They brought clarity and greater transparency to the process, and they were incredibly candid about the limitations of their new approach. But by surveying anyone who shows up on a campus to recruit students, including alumni of the schools, they unintentionally weakened the survey’s validity.

The consequence of this judgment is obvious in the ranking’s bizarre results, which go far beyond Duke’s emergence as No. 1.

The University of Cincinnati, which didn’t make the list last time, is ranked 21st in the employer survey, ahead of Cornell, Georgetown, Vanderbilt, and the University of Southern California.

The University of Maryland’s Smith School finished in first place in the magazine’s student survey, where Harvard finished 25th and Stanford finished 17th.

UC-San Diego’s Rady School of Management, which failed to even make the Businessweek ranking two years ago, finished first in intellectual capital, 24 places above Stanford, which was ranked 25th on this measure. Rady was No. 69 on the most recent UT-Dallas North American ranking for academic research.

The University at Buffalo is ranked 19th on the employer survey, up 26 positions from 2012, and above Cornell and UT-Austin.

Wake Forest University, which only last month announced that it will shut down its full-time MBA program, places 25th in the employer survey, well ahead of UC-Berkeley, Notre Dame, Minnesota, and Georgetown. If employers so deeply love Wake Forest MBAs, why is the school closing down its MBA program?

Harvard Business School failed to even make this year’s top five MBA programs. The school–widely regarded as having the world’s best MBA program–slumped to eighth place, its worst showing ever.

UC-Berkeley’s Haas School, the second most highly selective MBA program in the U.S., is way down at No. 19, with UCLA’s Anderson School high above it at a rank of 11.

The University of Virginia’s Darden School of Business skidded ten spots to rank 20th.

To be fair, this is not the first time a Businessweek ranking has prompted ridicule or criticism. In 2010, for example, the magazine ranked Southern Methodist University’s Cox School of Business 12th, the highest rank ever achieved by the school in any of the top five most influential B-school rankings published by BW, U.S. News, The Financial Times, Forbes, and The Economist.

NEVER HAVE THERE BEEN SO MANY UNCONVINCING RESULTS IN A BUSINESSWEEK RANKING

Businessweek was so embarrassed by the outcome that it changed the way it calculated its recruiter scores, causing Cox to fall 17 positions in 2012 to a rank of 29th. And last time the magazine admitted that it had “miscalculated intellectual capital scores” for 40 American business schools and 10 international schools and had to issue a significant correction.

But never before have there been so many peculiar and unconvincing outcomes in a Businessweek ranking of top MBA programs. A typical response comes from Betsy Massar, an HBS grad and an MBA admissions consultant at Master Admissions: “The whole thing smells funny,” she says. “There are just too many irregularities. Forget HBS. Everyone can find something wrong with it. But to see Tepper above Sloan and Tuck? That’s kind of insane.”

Matt Turner, who closely follows rankings for the University of Texas’ McCombs School, agrees.
“The new listing shakes up conventional wisdom,” writes Turner. “Harvard currently finds itself in eighth place, down six ranks from second in the last (2012) survey. For its venerable MBA program, which usually secures a top-three spot in any ranking, that’s a shocker. Harvard’s current standing among the other major media outlets includes: No. 1 (U.S. News, 2015), No. 1 (Financial Times, 2014), No. 1 (Poets&Quants, 2013), No. 3 (Forbes, 2013), and No. 4 among U.S. schools (Economist, 2014).”

Turner says that Businessweek’s “latest ranking is volatile across the board. The average rise or fall this year for 2012’s top-10 programs was almost four ranks. Among programs previously ranked between 11 and 20, the average was just over five, and those numbered between 21 and 30 saw a whopping 7.5 average change of rank.”