Until recently, there had been few solid answers: just guesses and hunches, marketing hype, and extrapolations from small pilot studies. So far, the new research office, the Institute of Education Sciences, has supported 175 randomized studies.

Other countries are no further along than the United States, researchers say. They report that only Britain has begun to do the sort of randomized trials that are going on here, with the assistance of American researchers.

The director of the International Performance Indicators in Primary Schools center in England wrote:

"The wake-up call was a national realization that all the money spent on education reform had almost no impact on basic skills."

Suddenly, scholars who had long argued for randomized trials began to be heard. In the United States, the effort to put some rigor into education research began in 2002, when the Institute of Education Sciences was created and Dr. Whitehurst was appointed its director.

"I found on arriving that the status of education research was poor," he said. "It was more humanistic and qualitative than crunching numbers and evaluating impact. You could pick up an education journal and read pieces that reflected on the human condition. It was more like the work a historian might do than what a social scientist might do."

At the time, the Education Department had sponsored only a few randomized trials. One was a study of Upward Bound, a program that was thought to improve achievement among poor children; the study found it had no effect.

So Dr. Whitehurst brought in new people who had been trained in more rigorous fields and invested in doctoral training programs to nurture a new generation of more scientific education researchers.

He faced heated opposition from some people in schools of education, he said, but he prevailed. The studies are far from easy to do.

"It is an order of magnitude more complicated to do clinical trials in education than in medicine. In education, a lot of what is effective depends on your goal and how you measure it."

Most programs that had been sold as effective had no good evidence behind them. As many as 90 percent of programs that seemed promising in small, unscientific studies turned out to have no effect on achievement, or actually made achievement scores worse.

With a growing body of evidence on what works, researchers wonder how they can get educators and the public to pay attention.