"Preparedness" Expected to Be Judged on NAEP

Federal officials have taken a major step toward setting standards that would allow the public to use scores on a test known as "the nation's report card" to judge 12th grade students' preparation for college and the job market.

The move essentially allows federal officials to move forward and begin conducting detailed studies to judge how students' NAEP test performance could be matched against their preparation for college and work. The goal is to have measures of student "preparedness" in place for testing in 2009, so that they could be publicly reported in 2010.

In theory, the move would allow the public to look at 12th graders' average NAEP scores from states and individual student demographic groups (currently presented on a scale of 0 to 500) and judge whether they were prepared for postsecondary demands, or for certain occupations.

Members of the National Assessment Governing Board hope that by providing information on 12th graders' "preparedness," they can provide clearer information to parents, policymakers, and the general public about how to interpret NAEP scores, said Mary Crovo, the board's interim executive director.

"It's a clear way to communicate what scores on the NAEP scale mean," Crovo said after the vote.

Added board member David Driscoll: "The question is, how do we make the 12th grade report much more effective?" Judging students' skill for life after high school, he said, "would add meaning to our reporting, for us, for students, for their parents."

The NAEP 12th grade test currently provides student scores on a national level, rather than for individual states. This year, however, 11 states agreed to take part in an assessment that will produce individual results for those jurisdictions.

The governing board had asked members of a 7-member panel to study the "preparedness" issue. That panel was chaired by Mike Kirst, a professor emeritus at Stanford University, who presented a report to the board on Friday.

Opinions vary on how to judge students' preparation for college study and jobs requiring varying degrees of skill, as Ed Week has reported. In his presentation, Kirst recommended that the board begin conducting a series of studies, including an examination of how NAEP content compares with certain measurements of college and workforce preparedness, such as the ACT and SAT.

2 Comments

NAEP assessment does not discover whether a HS student has read one (1) complete nonfiction book or written one (1) serious research paper while in high school. These accomplishments, or the lack of them, have an effect on survival and success in college.

Hooray! Another step taken toward further emphasizing the importance of standardized testing. State legislators have already given teachers a heavy weight to bear, with performance compared to other districts in the state. Now, with proposed reform at the national level, states will put more pressure on educators to perform so that they can compete at the national level.

I take it that the test is a norm-referenced measure that will weigh how well prepared students within a state are. I can only speak for Ohio, but in education circles here, standardized testing is the center of many heated debates. These tests hinder teacher creativity and vigor, and do the same to students. Standardized tests are designed to measure whether a student has learned enough to earn a high school diploma. Now there is a proposed test that is supposed to measure whether students will be able to perform in college.

Many in the field of education agree that standardized tests may be culturally biased. It is well documented that students in higher SES areas perform much better on standardized tests. For example, I will use two districts in Ohio, one that is notoriously wealthy, and one that is notoriously poverty-ridden. Hudson Schools (very high SES) scored a 30/30 on the state report card. On the other hand, East Cleveland Schools met only two of thirty state requirements. If a test were given at the national level, I suspect it will be regionally biased. There will be arguments over the validity of the test across the board, from Appalachia to New York to California.

There is no way that a single test will be able to measure or predict how well a student will perform in college. I will use pseudonyms to offer you two real-life examples. Manning has always been bright, but never had to try. He finished high school with a 3.4 GPA and scored a 35 of 36 overall on the ACT. Julia finished high school with a 3.2 GPA and scored a 17 of 36 on the ACT. If I had gone to high school with Manning and Julia, I would have predicted far greater things for Manning in college. Five years later, Manning has been placed on academic probation twice and has a 2.4 GPA. He still has two years until he receives his bachelor's degree. On the other hand, Julia has a 3.7 overall GPA and has been named to the Dean's List numerous times. The ACT did not take into account work ethic. Anyone who has gone to college has seen this phenomenon firsthand. A student's natural brilliance may in fact cripple his performance in college: the curriculum and workload are far more demanding, but because he never had to study in high school, he never developed good study habits and can't get by on natural ability alone.

That rant is finished, and I will now return to my original point. Though national tests will seemingly have no bearing on students individually...they really will. A national norm-referenced test will pressure state legislators into demanding higher performance from districts in their given state. When legislators demand more performance, teachers are offered less breathing room, hindering their own performance and the education of their students. I love that we are taking steps for education, but I fear that we may be walking backward.