Mysteries of FCAT irk some observers

Scoring and other details about Florida's exams have not been shared with the public.

May 23, 2005|By Vicki McClure, Sentinel Staff Writer

Many teachers and parents think student scores on the FCAT reflect the number of questions children answered correctly on Florida's annual high-stakes test.

They are wrong.

The state uses what is known as "pattern scoring," which means that if students are careless and miss too many easy questions, they could be classified in the bottom level no matter how many difficult questions they get right.

The scoring system is one of many details of the Florida Comprehensive Assessment Test that state officials have not shared with the public, even though understanding how the exams are graded could affect whether a child passes third grade or graduates from high school.

Educators and parents alert to the practice could adjust their test preparation, for example by strongly emphasizing that students take their time on easy questions so they are not unfairly pegged as low performers.

To Bill Miller, principal of Umatilla Middle School in Lake County, the secrecy surrounding the FCAT is unfair.

"I know when I play a baseball game I have to have more runs to win, and we don't quite get that with what we are doing here," Miller said after 2005 test scores were released last week. "It is too much of a mystery for the stakes being as high as they are. . . . We need specifics, and we need precise expectations to meet the hurdles that are being raised each year."

Nearly 1.6 million Florida schoolchildren took the FCAT exams in reading, writing, math and science this year, an annual hurdle designed to measure how well public schools are performing.

Although such states as Indiana, Maryland, Alaska and New York clearly explain how and why officials use pattern scoring when grading their achievement tests, Florida does not.

Georgia and Massachusetts post tests and answer keys on the Internet. Ohio provides copies of children's individual responses upon request so parents and students can verify the results.

Again, Florida does not.

Cornelia Orr, the head of testing at the Florida Department of Education, said a dearth of funding has prevented her staff from providing documentation to educators and parents explaining how the state has graded the FCAT since its inception in 1997.

She said a new manual will be published this fall, and a section will be devoted to explaining pattern scoring. Currently, Florida merely states that the exam is scored on a point system, with different values for correct answers, depending upon the question.

Orr said she would like to release copies of the tests administered to students each year, but the Legislature has not provided enough money to create new questions to replace those that would be made public.

"We have requested more money every year," Orr said. "I don't think anyone would disagree that this is what we need to do."

Earlier this month, at a national education-writers conference in St. Petersburg, Gov. Jeb Bush said Florida does not have the money to create new tests each year to replace those released to the public.

The state, however, received an extra $2.2 billion in unanticipated revenue this year. Bush and legislators approved a $250 million tax cut and other spending priorities.

The governor and Education Department also successfully fought in court two years ago to keep the tests secret.

"The benefits [of releasing the FCAT] would be far less than the cost," Bush said in St. Petersburg, although he quickly acknowledged that taking the mystery out of the FCAT was "important."

The FCAT design is based on something called "item-response theory," which many statisticians support as providing a better estimation of a student's ability than calculating a score based on a "number correct" approach.

Each question is essentially weighted according to its difficulty, the probability its answer can be guessed correctly and how well it can discriminate between those with low and high levels of aptitude.

In scoring the test, each student is given a preliminary ability rating based on the overall number of correct answers. The responses then are considered one at a time, with the score being adjusted either up or down depending upon the type of question.

Such an approach provides a more accurate picture of a student's ability, Orr said. A student who misses a few simple questions but correctly answers the difficult ones should be considered, say, a Level 3 on FCAT's scale of 1 to 5, because that more accurately reflects the child's aptitude, she said.
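The effect Orr describes can be sketched with a toy model. The snippet below is purely illustrative: the item parameters are invented, and the state has not published the FCAT's actual scoring model or parameters. It uses a simple two-parameter logistic item-response model and a grid-search maximum-likelihood estimate to show how two students with the same number of correct answers can receive very different ability estimates depending on which questions they miss.

```python
import math

# Hypothetical item bank: (discrimination a, difficulty b) per question.
# These numbers are invented for illustration; real FCAT parameters are secret.
items = [(2.0, -1.5), (2.0, -1.0),                # two easy, sharply discriminating items
         (0.8,  1.0), (0.8,  1.5), (0.8,  2.0)]   # three hard, less discriminating items

def p_correct(theta, a, b):
    """Two-parameter logistic IRT model: probability that a student of
    ability theta answers an item with parameters (a, b) correctly."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def mle_ability(responses):
    """Grid-search maximum-likelihood ability estimate for one response
    pattern (1 = correct, 0 = wrong) over theta in [-4, 4]."""
    grid = [x / 100.0 for x in range(-400, 401)]
    def loglik(theta):
        total = 0.0
        for (a, b), r in zip(items, responses):
            p = p_correct(theta, a, b)
            total += math.log(p if r else 1.0 - p)
        return total
    return max(grid, key=loglik)

# Both students answer 3 of 5 questions correctly -- the same "number correct."
careless = [0, 0, 1, 1, 1]   # rushes past the easy items, gets the hard ones
steady   = [1, 1, 1, 0, 0]   # gets the easy items, misses the hardest two

print(mle_ability(careless))  # noticeably lower estimate
print(mle_ability(steady))
```

With these made-up parameters, the careless student's ability estimate lands well below the steady student's even though both got the same number of items right, because missing easy, highly discriminating questions is strong statistical evidence of low ability.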

Orr could not say, however, how many Level 1 -- or easy -- questions a child would have to miss to be ranked in the bottom category but added that it varied by test.

Orange County parent Kimberly Cornett said her third-grade daughter, Christina, tended to carelessly rush through simple questions because they did not interest her.

She worked with her daughter at home on pacing by having her take practice FCAT tests on the computer. She reminded her daughter to use the techniques she learned at school and to spend time on each question.