Identifying English Learners: The ELPA21 Dynamic Screener

August 27, 2018

Today we highlight the CRESSTCON’18 breakout session, Identifying English Learners: The ELPA21 Dynamic Screener. This session will describe efforts to develop an efficient and accurate test for the initial identification of potential English learners in the eight member states of the English Language Proficiency Assessment for the 21st Century (ELPA21) consortium. This is a high-stakes test, one that determines students’ eligibility for English language development services, and ELPA21’s online screener has broken the mold of legacy screeners in its member states. Come hear key members of the design team speak about the goals and data driving the development and administration of the Dynamic Screener.

Cathryn Still

Cathryn “Cat” Still is the Executive Director of ELPA21, an English language proficiency assessment system supported by member states and housed at CRESST. Prior to joining the team at CRESST, Cat worked as Program Director for ELPA21 at the Council of Chief State School Officers, leading the development of the ELPA21 assessment system under an Enhanced Assessment Grant from the US Department of Education. Cat brings strong expertise in organizational management, operations, and contract management, as well as years of experience building new programs, developing and executing strategic plans, and identifying and deepening operational efficiencies.

Previously, Cat supported Los Angeles Unified School District’s Comprehensive Assessment Program on behalf of the district’s assessment vendor, CORE ECS. Cat has also worked with 2U and The Princeton Review. Cat received her MA from the University of Texas and her BA from Trinity University. In this breakout session, she will provide an overview of the ELPA21 consortium and its assessment system.

Mark Hansen

Mark Hansen is an Assistant Professor in Residence in the UCLA Graduate School of Education and Research Scientist at CRESST. He is also the Lead Psychometrician for ELPA21. His work focuses on the use of latent variable models, particularly item response theory and diagnostic classification models, to support the design of educational, psychological, and health-related assessments. In this breakout session, he will describe the methods used to optimize the ELPA21 screener design, with particular focus on the specification of an early stopping rule to minimize testing burden while ensuring high classification accuracy.

Michelle McCoy

Michelle McCoy is the Manager of Assessment Design with the English Language Proficiency Assessment for the 21st Century (ELPA21) consortium. Michelle received her Master of Education and her Bachelor of Arts at the University of Oregon. Currently, she is a doctoral candidate in Educational Leadership at Lewis & Clark College in Portland, Oregon. Prior to joining ELPA21 at CRESST, Michelle was an Assessment and Implementation Specialist for the Oregon Department of Education, focusing on English language proficiency. In this breakout session, she will describe the content and design goals of the screening test.

EunHee Keum

EunHee Keum is a Research Scientist at CRESST. She received her PhD in Quantitative Psychology from the Ohio State University. Her research interests involve the development of multi-stage testing approaches and the application of latent variable models, particularly item response theory and factor analysis models, to small-sample longitudinal data. Her work focuses on evaluating and improving the design and delivery of educational assessments. She also provides general assistance on statistical and psychometric issues arising in other ongoing CRESST research projects. For this breakout session, she will detail the methods used to optimize the ELPA21 screener design to ensure high classification accuracy.