A team of professors at UC San Diego who investigated the university's worst known case of research fraud has concluded that the checks and balances long relied on to detect scientific dishonesty are no longer sufficient to do the job.

In an article published today in the New England Journal of Medicine, five of the faculty members who studied the case of Dr. Robert Slutsky recommend that journals, granting agencies and research institutions more closely scrutinize researchers and their work.

"We conclude that peer review and replication will not always detect fraud, and that trust in a researcher's motivation ignores the ever-present dangers of conflict between personal and intellectual motives," the authors state.

Slutsky, a 36-year-old specialist in cardiac radiology, resigned from the UCSD School of Medicine in 1985 amid suspicions of research fraud. Over the following year, a university committee concluded that the fraud had been extensive.

The committee charged last year that Slutsky had fudged statistics, recycled data and inflated numbers of experimental animals--all apparently in an attempt to make a reputation for himself in the highly competitive world of academic medicine.

The case is believed to have been one of the most extensive academic fraud cases in recent history. It derailed Slutsky's career, indirectly damaged the careers of some colleagues and embarrassed a major research institution.

In the article published today, the authors conclude that the peer review system under which scientists review each other's work before publication failed to detect implausible claims as well as discrepancies in some of Slutsky's papers.

The professors also cast doubt on the traditional system of replication, under which scientists independently attempt to verify or refute other scientists' findings. The group concluded that it has become increasingly difficult to obtain funding for such duplicative work.

Fabricated Data

Furthermore, replication detects only incorrect results, not correct results based on fraudulent data, the professors note. In Slutsky's case, the committee found he wrote papers reaching correct conclusions but using fabricated data, apparently to pad his bibliography.

"Occasionally, as in the case of Slutsky, the culprit does not play for high stakes," the authors state. "The goal is simply to establish a niche in the profession, and to this end, the papers announce results that are predictable extensions of basic work done by others."

The professors suggest in today's article that the peer-review system may be overloaded and that "problems with the number of manuscripts" need to be addressed. They also suggest that so-called referees for journals may need improved training.

They recommend that institutions advise all researchers of specific policies and procedures for reporting unethical practices, and inform new faculty members and trainees of "realistic performance expectations" in order to discourage "excessive productivity."

"We felt that we had gone through a learning experience in going through this process and that it would be worthwhile to share it with others," Dr. Paul Friedman, a member of the committee, said Wednesday when asked about the decision to publish today's article.

Review Raised Suspicions

Slutsky resigned in May, 1985, while under consideration for appointment to an associate professorship in the field of cardiac radiology. A routine review of his lengthy bibliography in preparation for his promotion had raised faculty members' suspicions that Slutsky might have faked some research.

After a long investigation, the faculty committee concluded that 12 of Slutsky's published research papers contained fraudulent information. It classified 48 as questionable--that is, the co-authors were unable to prove the papers' validity--and 77 as valid.

Slutsky, now in training as an anesthesiologist in New York state, never responded to the faculty committee's final report. However, a lawyer representing him had requested retraction of 15 published papers citing "information . . . which cast some doubt on the results."

The new article does not exempt UCSD from blame.

The group of professors found that some of Slutsky's superiors failed to supervise his work adequately and to recognize his "excessive productivity." In addition, the professors concluded that colleagues had neglected to report suspicions about him.

They accused some of his 93 co-authors of "culpable . . . carelessness" in being named on papers they had never scrutinized. Others, they suggested, engaged in "deliberate misrepresentation" in accepting "gift co-authorship" without doing any work.

"Each university and department should develop guidelines for the supervision and evaluation of research trainees, as well as a policy on publication and authorship," the group recommends. It also suggests strict guidelines governing co-authorship.