Sandwiched between three mind-numbing years of basic science courses and hospital rotations and the lockstep years of residency training, the fourth year of medical school has long been a welcome respite for future doctors. It is the only time in their medical education when students have few requirements and a plethora of elective course offerings, as well as the time to go on vacation and be with friends and family.

“Do it now,” a mentor said as I was about to start my last year, “because you may never get the chance again.”

I followed that advice wholeheartedly. I spent most of my fourth year away from my medical school, caring for children with hematologic disorders one month, then shadowing cancer surgeons for another, in hopes of figuring out which specialty I liked more. I spent time working in a laboratory, something I’d never done, learning how to culture and freeze cells, care for mice, and critique studies. I attended national medical meetings, hung out with old friends, and slept and ate to my heart’s content at my parents’ home.

For me, it was a pivotal, reassuring year.

But not all of my classmates felt the same. One friend interested in a particularly competitive residency spent much of the year in high-stress “audition clerkships,” four-week clinical tours at hospitals where she hoped to train; she resented having to pay tuition at our home school while paying travel and living expenses so she could learn at other institutions. Another, older classmate, who had already spent 10 successful years in another profession, was just eager to get on with his training; for him, a fourth year filled with electives and extended vacations was a waste of time and tuition money.

“The fourth year is kind of bogus,” one friend recently recalled. “It might have been fun at the time, but I’m not sure it made me a better doctor.”

Established over a century ago as part of a sweeping change to a chaotic collection of schools, apprenticeships and fly-by-night training programs, the four-year medical school curriculum is the sacred cow of medical education. Like soldiers in lockstep, nearly all medical students over the last 100 years have spent their first two years in lecture halls learning the theory and basic science of medicine and their third and fourth years on the wards learning the practical clinical applications. Apart from a few short-lived experiments during World War II and in the 1970s to shorten the curriculum to three years, not even the most radical of educational reformers have dared stray from the norm, carefully integrating their changes well within the venerated four-year framework.

But now it appears that the convergence of physician shortages, rising health care costs and student debt has begun to tip this hallowed heifer. In 2010, responding to the physician shortage, Texas Tech University Health Sciences Center School of Medicine began offering a three-year medical school track for select students interested in primary care. Soon thereafter Mercer University School of Medicine’s campus in Savannah, Ga., followed suit; and this fall, New York University School of Medicine welcomed, in addition to its traditional four-year students, its first group of students to pursue a three-year option.

Proponents believe that the three-year programs will help address several pressing issues. By producing doctors faster, three-year M.D. programs could help ease the critical physician shortages projected over the next 15 years. And with almost two-thirds of medical students graduating with $150,000 or more of educational debt, and more students entering medical school at an older age, the three-year option allows students to begin practicing sooner and with as much as 25 percent less debt.

“We can’t dissociate medical education from societal and student needs,” said Dr. Steven B. Abramson, lead author of a recent perspective piece in favor of three-year programs and vice dean for medical education, faculty and academic affairs at N.Y.U. “We can’t just sit back in an ivory tower and support a mandatory year of prolonged adolescence and finding oneself, when society needs doctors to get out into the community sooner.”

But critics are quick to point out the failures of past attempts to do the same. In the 1970s, for example, with support from the federal government, as many as 33 medical schools began offering a three-year M.D. option to address the impending physician shortages of the time. While the three-year students did as well as or better than their four-year counterparts on tests, the vast majority, if offered a choice, would have chosen the traditional four-year route instead. Many who completed their work in three years were exhausted by the pace of accelerated study, and as many as a quarter asked to extend their studies by a year or two anyway.

Dr. Stanley Goldfarb, a co-author of an opposing perspective piece, and other critics contend that a host of new issues complicate the debate. The amount of material that students must assimilate, for example, has increased dramatically since the 1970s, and regulations now limit the number of hours trainees can work in the hospital. Taking away an entire year of the educational process dramatically whittles away the time young doctors have to interact with patients and gain critical clinical experience.

“The complexity of medicine is greater than it’s ever been,” Dr. Goldfarb noted. “Compressing what has become more complex than it’s ever been seems counterintuitive.”

Four more medical schools are nonetheless currently considering adding a three-year M.D. option to their traditional programs. And while these shorter programs remain an option rather than the norm, the debate they have incited has brought greater attention to other promising initiatives, like a novel assessment method based on a student’s actual skill rather than the number of years completed. This “competency-based” assessment would allow students to graduate once they demonstrated the required skills, not simply when they had fulfilled the four-year requirements of a 100-year-old standard.

“Everything has changed around us,” Dr. Abramson said. “Specialties have changed, society has changed and the debt burden has changed. We are never going to be able to adapt to changing needs of society and of our students if we continue to believe that the way we educate medical students is sacred.”

A version of this article appears in print on 10/29/2013, on page D6 of the New York edition with the headline: A Med School Issue Renewed.