Money Magazine’s College Rankings Try to Define Value

I’ve never been a fan of college rankings, but they are a necessary evil for magazine publishers. It’s more challenging to define the “value” of a college education than it is for a house, a car, or anything else that involves a long-term financial commitment.

As college rankings go, Money magazine’s rankings consider affordability. They look at the “net price,” the amount that the average family is expected to pay after scholarships and grants are considered. They also account for the fact that parents, like students, take out loans: PLUS loan indebtedness taken on by parents is a factor in Money’s college rankings.
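The net-price idea described above is simple arithmetic. Here is a minimal sketch with hypothetical numbers (the sticker price and aid figures are illustrative, not from Money’s data):

```python
# Hypothetical illustration of a "net price" calculation.
sticker_price = 55_000            # published tuition, fees, room and board
grants_and_scholarships = 30_000  # gift aid that does not need to be repaid

# Net price: what the family actually pays before any loans.
net_price = sticker_price - grants_and_scholarships
print(net_price)  # 25000
```

Note that loans, including parent PLUS loans, are not subtracted here: they defer the cost rather than reduce it, which is why Money tracks them separately.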

The problem I see with using net price is that college-going families have “their price.” They don’t pay an average. For example, a Delaware resident will pay less to attend the University of Delaware than a non-resident. Yet more than two-thirds of the students who attend the University of Delaware come from other states. When you make your list of schools, you need to know your price, not a theoretical average.

These college rankings also consider performance: whether the actual graduation rate for a class was better than a forecasted graduation rate based on the economic characteristics of that class. In other words, schools that admit a high percentage of students from economically disadvantaged backgrounds might be penalized in college rankings for having a low four-year graduation rate. But those same schools might be rewarded if they did better at graduating a class than they were expected to do. This reflects positively not only on college rankings, but also on a college’s commitment to academic and career advising, among the many other services that support student success.
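The performance measure described above is a difference between an actual and a predicted rate. A minimal sketch, with hypothetical rates (Money’s actual forecasting model is not published in this article):

```python
# Hypothetical "value added" comparison for one school's class.
predicted_rate = 0.58  # graduation rate forecast from the class's economic profile
actual_rate = 0.64     # graduation rate the class actually achieved

# Positive value => the school graduated more students than expected,
# which would help it in a performance-adjusted ranking.
value_added = actual_rate - predicted_rate
print(round(value_added, 2))  # 0.06
```

A school with a modest raw graduation rate can still score well on this measure if its predicted rate, given who it admits, was even lower.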

The downside is that Money used the six-year graduation rate. The writers cited academic studies noting that most college students do not graduate in four years. That may be true, but the job of a four-year college is to make it possible for students to graduate in four years, or five if the school has a co-op, bachelor’s/master’s, or other dual-degree program. Colleges should be evaluated on how well they make on-time graduation possible.

Money’s college rankings consider early-career salaries as well as “employable skills.” Both favor schools that grant a higher percentage of business and/or STEM degrees. If you excluded this information, as U.S. News does, you might see a different ranking. The problem with ranking schools based on salary data is that the school’s influence on a graduate’s performance in the workplace diminishes over time.

Suppose, for example, that you graduate from college as an entry-level engineer. You are offered a job based on your interview skills, your academic performance in college, and the evaluations you received from supervisors during your internships or co-ops. Once you are hired, past accomplishments mean nothing. It’s what you do going forward that gets you choice assignments, promotions, and higher salaries. You might be ambitious and quite capable of working your alumni network. But there will also be graduates of other schools who will do the same things.

“Ah,” you might hear, “the graduates of the ‘better’ schools received higher starting salaries for the same major.” Not always. You might run into an MIT or Caltech educated engineer who received an astronomical starting salary to work in a field such as chemical engineering or computer engineering. If that firm wants a very small number of MIT or Caltech educated engineers, then yes, those hires are likely to earn more than the state-school grads who work for other firms. The MIT and Caltech educated engineers were targeted by the employer, which did not want graduates from other schools.

Now suppose an employer has to cast a much wider net to recruit entry-level engineering talent. That employer can pay well, but not as well as the smaller, more selective employer; it needs too many people. It might recruit at MIT or Caltech. But it might also go to the University of California-Davis or Santa Clara University if the firm is in California. If the firm is located in Massachusetts, it might also go to UMass-Amherst, Boston University, Northeastern, or Tufts. But no matter where those hires went to school, there will be negligible, if any, differences in starting salary.

However, Money is a personal finance magazine. Its editors would be skewered if salary information were not part of their college rankings. The more financially oriented readers would not care about the graduates of liberal arts schools who are neither business nor STEM majors.

Stanford came out on top in Money magazine’s college rankings. Kudos well deserved: Stanford does an excellent job of aiding and supporting its graduates. One factor that helps is that computer science is the most popular major at Stanford. Other schools that ranked high are either quite selective and very well endowed, or have a low sticker price (Brigham Young, Cooper Union, and the College of the Ozarks, a work college, being examples among private schools).

Money’s college rankings, like others, validate the desirability of the more selective schools. That’s okay. Those schools are what they are. The problem is that only a small percentage of the college-going population will receive an invitation to be part of their freshman classes. What about those who will not? Yes, Money also provided additional rankings including “best” public schools and “best colleges that you can actually get into.” Some are legitimate bests, others are not.

But a school that fails to graduate at least half of a full-time freshman class on time should not be considered a “best” in any college ranking, unless there are valid reasons for the shortfall. Different schools serve different markets and purposes. College rankings do not do the best job of telling us whether a school did what it promised to do, or what it was expected to do.