This summer more than 100 New Jersey teachers got a head start on the new school year and some new ideas for thinking about how to monitor their students’ science learning. With funding from the state’s Department of Education, three New Jersey Mathematics and Science Partnerships (MSP) invited Project 2061 to provide several days of professional development focused on assessment.

New Jersey has adopted the Common Core mathematics standards and is a lead partner in the development of the Next Generation Science Standards scheduled for release in 2013, so the state's MSP program has focused closely on standards-based teaching and learning. That is one of the reasons why Mike Heinz, science coordinator for New Jersey, thought the Project 2061 workshop would be a good fit. “Our science teachers are beginning to realize the importance of quality formative assessment strategies as well as the impact that access to quality assessment items has on instruction,” Heinz said. “The Project 2061 assessment workshop was a good choice for a large number of our teachers who have been involved with the Mathematics and Science Partnerships and are looking for the next steps in their professional development,” he noted.

Led by Rowan University, Montclair State University, and Stevens Institute of Technology, the three New Jersey MSPs applied in early 2012 for supplemental funding from the state to include the Project 2061 workshop in their summer professional development programs. Each MSP works with about 18 school districts. Teachers of science and mathematics from all grade levels attended the summer workshops, along with some higher education faculty and administrators. With support from the U.S. Department of Education, the mission of the New Jersey MSPs is to enhance the content knowledge and teaching skills of classroom teachers in high-need school districts. By bringing the districts together with higher education faculty in science, technology, engineering, and mathematics and with other business and nonprofit partners, the MSPs provide services, expertise, and resources aimed at making significant improvements in STEM education.

Assessment as a Diagnostic Tool
“Teachers are really anxious about assessment,” said Colette Killian, administrator for the Professional Resources in Sciences and Mathematics (PRISM) MSP program at Montclair State University, “so this workshop had great value for them.” Day One began with an overview of Project 2061’s approach to assessment and a focus on how to clarify as precisely as possible the ideas that will be targeted by a test question and how to identify possible misconceptions that students may have about those ideas. “Project 2061 emphasizes the diagnostic value of assessment,” said workshop leader and Project 2061 researcher Cari Herrmann Abell. “We try to use common misconceptions as answer choices in our items as much as possible so that when students choose those answers, we get a much better sense of why they are having difficulties.”

On Day Two, the teachers were introduced to Project 2061’s criteria for judging the alignment of an assessment item to the knowledge being measured and to other factors—for example, unfamiliar language or unclear illustrations—that can affect students’ performance on an item. The teachers then used these criteria to critique items and began planning a team study project that would draw on all they were learning to develop and evaluate a set of items for use in their own classroom practice. Day Three focused on strategies for obtaining feedback from students that could be used to improve items, such as having students provide reasons for selecting or not selecting each answer choice and having them identify any words, illustrations, or other item features that they found confusing. Quantitative and qualitative strategies for analyzing items and student response data were also presented. Teachers then had a chance to explore Project 2061’s Science Assessment website, which provides access to more than 700 assessment items covering 16 topics in earth, life, and physical science and the nature of science. The site also offers data on how students performed on the items in national field tests and a “create and take tests” feature for assembling items into tests that can be administered and scored online.

Classroom Applications and More
One aspect of the workshop that many participants found particularly useful was learning how to evaluate the alignment of test items to the ideas students are expected to know. “I really enjoyed picking apart test items and the discussion that followed…using these strategies will really increase my ability to evaluate student understanding as well as the efficacy of my teaching,” reported one teacher who participated in the workshop at Stevens Institute of Technology. Pamela Fadden, a math teacher at Hawthorne High School, expects to use what she learned at the Stevens workshop to critique test questions so that she is “confident that the students understand what is being asked.” “As I complete a chapter, I’m going to review the multiple choice questions and select only the ones that ‘pass the test’,” Fadden said.

According to Mercedes McKay, deputy director at the Center for Innovation in Engineering & Science Education, which houses the MSP at Stevens, teachers at their workshop were also enthusiastic about the AAAS assessment website and discussed how the AAAS items could be used to focus more closely on identifying their students’ misconceptions. The Stevens MSP has already used AAAS items to develop instruments for evaluating teachers and students who have been participating in its program, said McKay.

Teachers attending the workshop at Rowan University liked the assessment focus but also valued the time spent learning about other Project 2061 materials that could help them align their curriculum and assessments to content standards. “The workshop was a great experience for our teachers as well as their administrators,” reported Jenny Murphy, program coordinator for the MSP at Rowan University. “They especially appreciated learning how to construct valid multiple choice questions and appropriate questions for pre-assessing students’ knowledge.”

“We would definitely recommend this workshop for other MSPs. We had a very positive response from teachers and from our local education agency partners,” Colette Killian of Montclair concluded. “The workshop was a great value, and we liked the fact that it was ‘turnkey’ and no fuss for us.”

# # #

If you are interested in bringing an assessment workshop to your organization—whether it’s an MSP, a school district, a university, or an informal science institution—please contact Mary Koppal at 202 326 6643 or at mkoppal@aaas.org for more information.