Splitting the Guild — A Modest Proposal for American Academe

The system by which most universities grant doctorates mimics the unsavory characteristics of medieval guilds. Even if you have no direct connection with academe, the problem affects you. Your tax dollars fund these institutions. The quality and volume of their graduates’ research impact your health and well-being. Doctorate-driven innovation helps keep America competitive enough to keep you employed.

One modest institutional change could improve matters.

The problem is that doctoral studies consist of two radically different processes that, in general, must be completed at the same institution. First, a student takes several years of courses to acquire a broad knowledge of the field of study; coursework typically concludes with comprehensive examinations, taken collectively with classmates. Second, the student — supervised by a committee of professors — writes a dissertation. Unlike coursework and comprehensives, the dissertation is a solitary activity.

The dissertation process can take years and requires the student to obtain the approval of a committee of professors. Diligent, brilliant students can be thwarted for reasons entirely beyond their control. One friend, years into research, had to scrap his dissertation and start over because his adviser departed for overseas. Another friend’s dissertation collapsed when her adviser died. Dissertations implode because of personal disagreements among faculty committee members.

(Note: My own original dissertation withered on the vine in the 1980s because my adviser dawdled with it for over a year and then suddenly left the university. Hearing the facts a decade later, the university waived normal time limits, allowing me to return. This time, a superb adviser and committee shepherded me through in just over a year. My persistence paid off, but luck was crucial.)

In some departments, grad students must perform years of drudge work for professors as an unwritten requirement for ultimate approval. This means deferred income and, perhaps, starting research long after one’s most creative years have passed.

This guild-like process can homogenize thinking, discouraging departures from conventional wisdom or prevailing ideology. Students’ philosophical leanings are often unformed when their studies begin. In my field — the economics of health care — coursework and readings may steer a student’s initially amorphous views toward, say, a suspicion of heavy government involvement in financing and providing care. But, faced with a phalanx of professors with the opposite viewpoint, the student may have to suppress her own philosophy and write to the committee’s prejudices. Having done so, she will find it difficult later on to revert to her actual leanings without effectively renouncing her years of research.

And make no mistake: Ideological monocultures are detrimental to America’s research capabilities.

Less nefariously, a student may simply discover that her department has no scholars capable of supervising a dissertation on the particular subject that interests her. The problem is that once a student learns that her department is indifferent, incompetent or incompatible with her philosophy or research interests, it’s usually too late. Transferring to a different university generally requires repeating years of coursework, plus new comprehensive exams based around the new department’s philosophy and interests. It’s like telling a 12th-grader, “You can transfer to a better school, but only if you repeat 10th and 11th grades.”

My fix is to separate the two processes. Award a master’s degree upon completing coursework and comprehensives (some universities already do). Then, let the student earn the doctoral degree by writing the dissertation at any university willing to oversee her research — without additional coursework or exams.

This two-part process would roughly mimic the structure of medical education, where coursework occurs at one institution and residency, often, at another. Most students would likely finish the master’s and doctorate at the same institution. But the option to do otherwise would offer students greater career latitude and a route of escape from ill-suited advisers. Moreover, the threat of students departing after the master’s would discourage feckless advisers from imposing inordinate delays, ideological litmus tests, or other abuses upon advisees.

This isn’t a new problem, by the way. In “The Wealth of Nations” (1776), Adam Smith railed against the social costs of insulating universities from market forces. He celebrated the idea of mobile, dissatisfied students departing somnolent professors, thus forcing universities to better meet the needs of their students.

After two and a half centuries, perhaps it’s time to heed Smith’s advice.

About the Author

Robert Graboyes (@Robert_Graboyes) is a senior research fellow with the Mercatus Center at George Mason University, where he focuses on technological innovation in health care. He authored “Fortress and Frontier in American Health Care,” teaches health economics at Virginia Commonwealth University, and is a recipient of the Bastiat Prize for Journalism. He is a columnist with InsideSources.