
Pa. examines why report on tests was ignored

Schools flagged for results contend state never investigated

Thursday, July 14, 2011

By Jonathan D. Silver, Pittsburgh Post-Gazette

A report analyzing irregularities among the standardized tests of nearly 1 million students arrived at the state Department of Education in July 2009 and "basically sat on a shelf," agency spokesman Tim Eller said Wednesday.

The report only came to light last week, when an online publication ran an article about it, and the department is trying to determine why its findings were never acted upon, Mr. Eller said.

The 2009 analysis of the Pennsylvania System of School Assessment exams flagged roughly 50 of the state's 747 districts and charter schools and stated that anomalies -- improbable results as calculated by several measures -- could indicate cheating by students or school officials.

However, several school districts in southwestern Pennsylvania that were flagged in the report said they were never contacted by the education department about the results.

George Batterson, superintendent of the New Kensington-Arnold School District, said he found it "kind of shocking" that he was first notified by a reporter -- and not state education officials -- about test results from the third grade of his district's Fort Crawford School.

The report was commissioned by Shula Nedley, who was director of the education department's Bureau of Assessment and Accountability. She said she wanted to restore analyses of test data, including an examination of erasure marks made on the standardized exams given annually in grades 3 through 8 and 11.

Ms. Nedley, who is now a consultant in Pittsburgh, said the impetus for the analysis was less to detect cheating than to have a third party validate test results in the eyes of federal officials responsible for ensuring that schools comply with the federal No Child Left Behind Act.

"This is an issue of data quality," she said. "One of the big purposes that this report fills in my way of thinking is to provide us with evidence of the quality of data so the feds will approve of our plans and have faith that the data from our schools is valid and accurate. The intent of this report wasn't grounded in suspicions of any particular school district."

Ms. Nedley, formerly the longtime top testing official for Pittsburgh Public Schools, said she recalled that the report arrived in her Harrisburg office in July 2009, about a week before she departed her position to move to the private sector in Pittsburgh.

"I can comment that it was left with instructions. I left it with one of the staff," Ms. Nedley said.

She added that she could not recall the specific person to whom she delegated responsibility.

"The people in my bureau, the senior staff in my bureau knew about the report. They were involved with the development and approval of the report. The report arrived ...[my] last week there, and it was included in my to-do list when I left," Ms. Nedley said.

"The intent was to have conversations with school districts to the point of, 'Hmm, you made great improvements this year, in fact way beyond what we typically see in a year. Can you tell me how you did it?' [Partly] to question, 'Is this data really valid?' "

Mr. Eller said the education department was trying to figure out why the report was never acted upon.

"We do care about it, and that's actually part of the internal investigation to find out what fell through the cracks," he said.

Ms. Nedley had a guess.

"Sometimes when people in politically appointed positions leave office," she said, "those things that are associated with them go out the window."

The 2009 report was unearthed by Philadelphia Public School Notebook, an online publication, which sought state data in May about the 2009 standardized exams, Mr. Eller said. The report was brought to the department's attention Monday.

The analysis by Data Recognition Corp. of Minnesota, which also develops and scores the exams, was supposed to be carried out again in 2010 as part of the company's three-year contract but was cut from the budget, Mr. Eller said.

Education Secretary Ron Tomalis directed that the analysis be redone this year at a cost of $113,000, and a report on this spring's exams is expected by the end of the month.

Meanwhile, Mr. Eller said, the department is asking the individual school districts flagged in the report to look into the findings and report back.

"The secretary is concerned with what the report shows, which is why he ordered a follow-up," Mr. Eller said. "As far as whether it points to anything significant, we're not going that far at this point."

In some cases, flags were raised not by erasure marks but simply because certain groups of students -- such as economically disadvantaged or special education -- grew or shrank by a large percentage.

Those results could accurately reflect true changes. Or, Ms. Nedley said, they could represent a manipulation of the district's demographics by administrators.

Mr. Eller said he did not anticipate that widespread cheating would be uncovered. Instead, the department expects to find mundane problems involving coding issues or students who did a lot of erasing to redo their answers after realizing they had skipped a question.

"I think any number showing up is of concern. It doesn't mean that there's guilt, but any inconsistency showing up is of concern," Mr. Eller said.