Can B-School Rankings Be Reformed?

Rankings have become a central part of many colleges' admissions strategies -- even as many experts have questioned their validity and whether they help anyone except those organizations that produce them. The latest salvo in the fight over rankings came this month from 21 scholars at a range of business schools who have published a joint call to change the way business schools are ranked. They argue that traditional methods -- which reduce business schools to ordinal rankings -- are deceptive and biased toward certain kinds of M.B.A. and other business school programs.

The article -- soon to appear in the journal Decision Sciences, abstract available here -- does not suggest that all rankings be abolished, and the authors see value in the release of data on which many rankings are currently based.

But the article argues for creating tools so different types of students may emphasize different characteristics of business schools. "Many of us continue to acquiesce to methods of comparison we know to be fundamentally misleading," the article says. "We continue to allow the strategic initiatives we establish to be undermined by assessment approaches that not only restrict views of these initiatives, but which also can entirely skew holistic considerations of programs and institutional strength."

While the rankings critique focuses on business schools, many of its arguments would apply to other kinds of rankings of professional and undergraduate academic programs. Inside Higher Ed asked a range of experts what they make of the new critique of rankings, and of the state of rankings for business schools and other parts of higher education. Here is what they had to say:

John A. Byrne, founder and editor-in-chief of PoetsandQuants.com, which produces an M.B.A. ranking:

There's no question that current rankings are imperfect at best and seriously flawed, if not intellectually dishonest, at worst. But business school rankings in particular are one reason why the M.B.A. has become the most popular graduate degree in the U.S. and the most successful higher education product of the post-war period. These lists from credible media brands have raised the awareness and importance of graduate business education.

More than that, they have forced reluctant school administrators to become more transparent, putting into the marketplace a wealth of data to allow students and parents to make more informed decisions on higher education. Given the substantial investment and debt burden schools are asking students to take on, it's important to have a third-party analysis of a degree, regardless of the flaws.

Having a debate about rankings is also a good thing. It helps to remind people not to take them too seriously. But what you get in these occasional flare-ups by deans is a self-interested dialogue that leads to proposed changes that tend to benefit their own schools at the expense of others.

Mark A. Zupan, president of Alfred University and former dean of the business school at the University of Rochester:

With regard to rankings, I think that there are plenty of opportunities for improvement. One of the flaws of the prevailing U.S. News & World Report rankings [in undergraduate admissions], for example, is that by creating a beauty pageant between schools on criteria such as high school grade-point average, ACT/SAT scores, etc., they encourage higher ed to promote more merit-based aid and shift away resources from need-based aid. There are many societal benefits, however, from need-based aid given the opportunity that universities provide for social mobility.

Existing rankings at the b-school level also don't adjust for cost of living considerations ($100K in NYC is not the same as $100K in Columbus, Ohio) and don't reflect the non-monetary rewards associated with various opportunities ($60K at a non-profit may be much more rewarding to an individual than $150K at a high-stress finance job on Wall Street). Looking at starting salary as a metric, moreover, is not the same as looking at ROI and has also caused b-schools to shift, since the advent of rankings in the 80s, toward an older incoming student body (best predictor of post-MBA salary is pre-MBA salary) at the expense of a more diverse (by gender, race, and ethnicity) student body.

All that said, where there are flaws in the existing methodologies, there is an opportunity for improvement. Instead of trying to ignore rankings or resist them, it is incumbent on individuals such as the authors of the study to come up with a better mousetrap ... and a mousetrap that has staying power. Until such a productive next step is taken, the exercise remains akin to King Canute trying to order the waves to stop advancing.

Chad Troutwine, Veritas Prep CEO and former member of the admissions committee at Harvard University's Kennedy School of Government:

The authors make a compelling argument that ordinal rankings are flawed. However, they are far less convincing that this is a genuine effort to replace the rankings with something better, or anything at all. Instead, it reminds one of the current debate around health care in America, where many support a "repeal at all costs" approach citing the clear limitations of the Affordable Care Act. Like the authors, those same repeal advocates are far less persuasive or -- as in this case -- silent when asked to provide a better alternative.

We subscribe to Winston Churchill's observation about democracy: it's the worst form of government, except for all of the others. Until someone creates a better approach, we support sharing total access to all types of data to prospective candidates. We have the technological tools to let them choose the factors they favor, weight those factors how they see fit, and create a customized ranking based on each candidate's individual needs. The applicant pool is filled with bright and capable people who will be making consequential decisions based on effectively analyzing data. We should equip them with the tools to hone those skills at the start of the admissions process.
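The "choose your factors, weight them, and generate your own ranking" tool described above can be sketched in a few lines. This is a minimal illustration with invented school names, invented figures, and an invented `custom_ranking` helper; it is not any publication's actual methodology, just one straightforward way such a tool could work (normalize each factor, then sort by a weighted sum).

```python
# Hypothetical "build your own ranking" sketch: each applicant supplies
# weights for the factors they care about, and schools are ordered by the
# resulting weighted score. All data below is invented for illustration.

def custom_ranking(schools, weights):
    """Rank schools by a weighted sum of min-max-normalized factor scores."""
    factors = list(weights)
    # Normalize each factor to [0, 1] so weights compare like with like.
    bounds = {f: (min(s[f] for s in schools), max(s[f] for s in schools))
              for f in factors}

    def score(s):
        total = 0.0
        for f, w in weights.items():
            lo, hi = bounds[f]
            norm = (s[f] - lo) / (hi - lo) if hi > lo else 0.0
            total += w * norm
        return total

    return sorted(schools, key=score, reverse=True)

schools = [
    {"name": "School A", "salary": 150, "aid": 40, "class_size": 900},
    {"name": "School B", "salary": 120, "aid": 70, "class_size": 300},
    {"name": "School C", "salary": 100, "aid": 90, "class_size": 120},
]

# An applicant who values need-based aid and small classes over salary
# (a negative weight penalizes larger values of that factor):
ranked = custom_ranking(schools, {"salary": 0.2, "aid": 0.5, "class_size": -0.3})
print([s["name"] for s in ranked])  # -> ['School C', 'School B', 'School A']
```

A different applicant could put all the weight on salary and get the opposite ordering from the same data, which is the point: the "ranking" becomes a function of the candidate's own priorities rather than a single ordinal list.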

While business school rankings may serve to provide exposure for graduate programs and a baseline for prospective students, reliance on rankings alone does not show the whole picture. Data gathered from a ranking system may influence an applicant’s decision to attend one graduate program over another, but as the forthcoming article discusses, a holistic approach may be a more accurate and equitable way to determine program quality.

Sometimes, ranking systems can favor bigger-name or more selective schools, which does not necessarily mean a student will be a good fit or successful at that institution. Applicants must therefore consider all aspects of a program's offerings, from environment to career placement, which may not be evident from rankings alone. Although the rankings system is ingrained in current business school culture, it is time to look outside the box and consider what really makes a quality program.

Times Higher Education has been producing the leading global university rankings since 2004, and while traditional ordinal rankings are here to stay, in recent years we've increasingly been encouraging a more holistic look at the data, well beyond the overall composite scores. So our website users can re-order our rankings on a range of broad criteria, including the teaching environment, or pure research impact, or even things like industry links or international outlook. We are also working with universities to give them subscription access to the underlying rankings data, so they can completely unpick the rankings and profile and benchmark themselves against any of the 13 performance indicators that make up the overall ranking score.

After a significant investment in our data science team over the last two years, we're currently consulting on adding a ranking of M.B.A. programs and other programs in business and management to our portfolio of global rankings. As we did with the U.S. college rankings that we produced with The Wall Street Journal for the first time last year, we'll only move into a new area if we believe we are truly adding a different perspective on excellence and improving on the existing offers. With the WSJ-THE College Rankings, we pioneered a nationwide survey of current U.S. college students, asking them how much they are challenged and stretched by their teachers, for example, to get a true handle on teaching excellence. We'd expect to bring this sort of new approach to any M.B.A. rankings.

Stacy Blackman, an M.B.A. admissions consultant:

Business school rankings do have a ton of power. They influence my clients enormously, in terms of which schools they choose to apply to. Many clients will say that they only want to apply to the "top 5" schools. This amuses me, because as an M.B.A. expert, I have no idea which five schools they are referring to. Every ranking is different and based on unique criteria, so those five schools vary depending on which publication they are viewing. They also influence the schools' admissions practices. For example, if an applicant wants to go into nonprofit, admissions might think twice about an admit if they are catering to a ranking based on salary. I don't believe that the rankings were developed to be truly helpful to applicants. They are a marketing ploy, as the annual rankings issues generate a lot of buzz and eyeballs for the publications.

As an admissions consultant, I try to educate my clients on the meaning behind the rankings, and give them other criteria to consider. I ask them to reflect on whether the culture is a fit for them and what size/location/teaching methodology appeals to them. I advise them to research the career centers and investigate whether they will be effective helpers for their career of choice. I believe they should analyze alumni networks, student populations and course offerings. I think we would be better off without rankings. But since they are such a draw (often the BIGGEST draw for a publication) I don't think they will change, so I do my best to educate clients, often in vain.

Jeremiah Nelson, director of enrollment management for the Charlotte program of the Wake Forest University business school, and a member of the board of NAGAP, the Association for Graduate Enrollment Management:

I've had the unique opportunity to work for several very different business schools. Some, like Wake Forest, are highly ranked and enjoy attention from prospective students due to the strong endorsement these rankings offer during the exploration process. This has been the value I've always discussed with prospective students. It doesn't matter how well ranked a school is if it is a terrible fit for you. Ultimately, rankings are only a starting place for researching schools in the early stages of the exploration process. Most savvy b-school candidates are not making their decisions on ranking alone.

There is certainly room for improvement in the research methodology, and I believe the authors are right that change is possible with the help of AACSB as a collective voice of the world's best business schools. It is difficult (and for some schools dangerous) for any one school to opt out if the rest of their peers still participate. It is possible for the data to be better, but I think the authors forget that the motivation of these publications is readership and advertising. Until their readers demand more, it may be hard for these publications to justify the time and money required to significantly complicate their processes more than they already are.

Bob Morse, head of rankings at U.S. News & World Report:

We created the Best Graduate Schools M.B.A. rankings to provide prospective students with the data-driven comparative information needed to make better-informed decisions about their higher education. This is why prospective U.S. and international M.B.A. students come to U.S. News when choosing an M.B.A. program.