For years I have espoused the principle that, under no circumstances, should a classroom instructor grade his own students. It is a principle that generalizes.

Yes, of course, quizzes are important to the instructor and his students for gauging progress. But when the time comes, certainly for a final exam and perhaps also for a midterm, arrangements must be made for external grading. Quite likely, this one act alone would do more than any other conceivable policy change to attenuate, maybe even eliminate, the corrosion of grade inflation and other corruptions in education.

If there is one person in the U.S. who is not surprised by the newspaper article below, it is your humble correspondent.

High schools that didn't grade their own Regents exams last year fared worse than those that did

By YOAV GONEN Education Reporter

Last Updated: 6:44 AM, June 8, 2013

Posted: 1:12 AM, June 8, 2013

They lost their home-field advantage.

Schools that were barred from grading their own students' Regents exams last year fared significantly worse than schools that continued to self-score, suggesting that students have benefited from inflated grading for years.

At about 150 schools that had their Regents exams scored by a central team of educators, scores sank deeper across all five subjects than they did at schools that graded their own exams.

At more than a dozen schools that participated in last year's Department of Education pilot program, passing rates plummeted by incredible margins, a Post analysis found.

"I know for a fact that some schools tell the teachers to pass the kids," one principal said of the results. "When there's a vested interest in your school doing well, because they're your students, you're going to tend to look for every point you have to so that your school has a good grade."

But a Brooklyn high school principal said the differences seen in the pilot program, which was spurred by a statewide crackdown on shady scoring practices, don't mean schools had been actively inflating their results.

He said that what could have played a role was the fact that teachers who scored the exams centrally work in schools with varying standards.

"If we had 10 people grading a paper . . . where that teacher's coming from may determine how that teacher sees that paper," he said. "You always have disagreement when you have that subjectivity."

In US History, schools whose exams were graded centrally saw their passing rates drop by an average of 3.9 percentage points from 2011 to 2012.

By contrast, schools that graded their own exams boosted their passing rates by an average of 2.4 percentage points.

In Living Environment, pilot schools saw their passing rates plummet by an average of 6.7 percentage points last year.

Self-grading schools dropped by only 2 percentage points.

Compared to nonpilot schools, pilot-school passing rates also dropped by an additional 2.1 percentage points in English and 1.7 percentage points in math.

Those differences signify thousands of additional failing kids, because the exams are taken by 90,000 to 122,000 kids annually in each subject.

Department of Education officials cautioned against drawing conclusions from the trial run, citing confounding factors.

They said pilot schools tended to be higher-poverty, which may explain their steeper declines.

Also, results can be skewed by students who take exams multiple times and by changes to student populations.

Still, all five public high schools with the biggest drops in Regents scores had their exams scored externally.

At Choir Academy of Harlem, 66 percent of test-takers passed the US History Regents in 2011. Last year, when the exams were graded elsewhere, 13 percent did.

At Brooklyn Theatre Arts HS in Canarsie, 77 percent of kids passed the US History exam in 2011. Last year, just 21 percent did.

This is the first year all high schools are barred from scoring their own Regents.