Paper Authors

Christian Anderson Arbogast
Oregon State University

Christian Arbogast is a graduate student in the School of Mechanical, Industrial, and Manufacturing Engineering at Oregon State University. His academic and research interests include adapting computer science techniques to supplement traditional qualitative analysis and the mechanical design process.

Abstract

This work in progress describes an approach to enriching a qualitative evaluation of students’ conceptual understanding by integrating assessment tools developed in the context of Computational Linguistics. Evaluating students’ conceptual understanding remains a significant challenge in engineering education and has led to the adoption of a variety of survey assessment tools, such as concept inventories. These existing tools typically compare conceptual understanding against an established (though often tacit) standard of expertise. They can help practicing educators categorize student learning outcomes, especially in aggregate, but they tend to fall short when describing the intricacies of an individual student’s conceptual understanding of a subject. The personal attention of an experienced qualitative researcher is still the unrivaled standard for a comprehensive assessment of an individual, but it comes at an often prohibitive expense.

The goal of this research is to automatically generate intermediary evidence of student understanding in order to supplement the non-prescriptive process of qualitatively analyzing student learning. Recent advances in the field of Natural Language Processing have greatly increased the practicality of using computer software to extract meaning from human language. These tools are especially good at identifying linguistic patterns in the way a person structures their communication. This allows us to programmatically parse and analyze student interview transcripts to identify the frequency and variation of linguistic artifacts. These artifacts can illuminate how an interviewee’s use of language changes when describing concepts they understand deeply versus concepts with which they have little expertise. By performing a statistical analysis of those linguistic properties, a collection of at-a-glance data can be generated to aid a qualitative researcher’s assessment.
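To make the idea concrete, the kind of at-a-glance transcript statistics described above can be sketched in a few lines of Python. This is an illustrative example only: the paper does not specify which linguistic artifacts were measured or which software was used, and the hedge-word list and metric choices here (type–token ratio, hedge rate) are assumptions for demonstration.

```python
import re
from collections import Counter

# Hypothetical set of hedging terms; the actual artifacts tracked in
# the study are not specified in this abstract.
HEDGES = {"maybe", "probably", "sort", "kind", "guess", "think", "like"}

def linguistic_profile(transcript: str) -> dict:
    """Compute simple at-a-glance statistics for an interview transcript."""
    tokens = re.findall(r"[a-z']+", transcript.lower())
    counts = Counter(tokens)
    total = len(tokens)
    return {
        "tokens": total,
        # Type-token ratio: lexical variety (higher = more varied vocabulary).
        "type_token_ratio": len(counts) / total if total else 0.0,
        # Share of hedging terms, a rough proxy for expressed uncertainty.
        "hedge_rate": sum(counts[h] for h in HEDGES) / total if total else 0.0,
    }

profile = linguistic_profile(
    "I think the beam, um, probably bends because, like, the load is maybe off-center."
)
```

A researcher could run such a profile over transcript segments covering different concepts and compare the resulting numbers, rather than relying on the statistics as a standalone judgment of understanding.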

This work is largely grounded in the aspect of Cognitive Load Theory that connects elements of learner expertise and transferability of knowledge structures to measurable performance ratings and physiological effects. Notable examples include EEG monitoring, eye tracking, and recording linguistic effects, all of which demonstrate significant interdependence. Variations in those performance and physiological effects have been shown to reflect changes in mental effort and working memory usage, which can serve as partial evidence of a student’s facility with a concept. Developing a methodology that focuses on examining linguistic effects holds great practical promise because of its very low dissemination and implementation costs. This investigation blends software-generated statistical indicators of student understanding with a more traditional process of qualitative analysis. Under this approach, the researcher’s role of discovering what is being communicated is aided by the natural language software’s extraction of how a student communicates it.

As with any new implementation of cross-disciplinary techniques, assessing the validity of this approach will be foundational. We currently see a correlation between some linguistic artifacts and student conceptual understanding, and would like to present our preliminary findings to the research community. Further development could yield greater repeatability and inter-rater reliability in qualitative data analysis, increasing confidence in findings and reducing the time a beginning researcher needs to gain proficiency. Ultimately, quick and reliable assessments of student conceptual understanding have the potential to dramatically change engineering education by encouraging effective formative feedback, assessing pedagogies and curricular materials, or by increasing the degree to which academic assessments reflect student knowledge and abilities.

EndNote - RIS

TY - CPAPER
AU - Christian Anderson Arbogast
AU - Devlin Montfort
CY - New Orleans, Louisiana
DA - 2016/06/26
PB - ASEE Conferences
TI - Applying Natural Language Processing Techniques to an Assessment of Student Conceptual Understanding
UR - https://peer.asee.org/26262
DO - 10.18260/p.26262
ER -