Paper Authors

Fred Meyer
U.S. Military Academy

Colonel Karl F. (Fred) Meyer is an Associate Professor and Civil Engineering Program Director in the Department of Civil and Mechanical Engineering at the United States Military Academy (USMA) at West Point, NY. He is a registered Professional Engineer in Virginia. COL Meyer received a B.S. degree from USMA in 1984, and M.S. and Ph.D. degrees in Civil Engineering from the Georgia Institute of Technology in 1993 and 2002, respectively.

Stephen Bert
U.S. Military Academy

Major Steve Bert is an instructor in the Department of Civil and Mechanical Engineering at the United States Military Academy. He serves as the Course Director for CE404, Design of Steel Structures, and CE492, Senior Capstone Design. He is a registered Professional Engineer in Virginia. MAJ Bert received a B.S. degree from Norwich University in 1995 and an M.S.C.E. degree from Virginia Tech in 2005.

A Technique for Program-Wide
Direct Assessment of Student Performance

Abstract

This paper builds on previous work related to the direct assessment of student performance.
Previous work assessed CE program outcomes using a single senior-level capstone design
course. This paper illustrates a systematic approach for the direct assessment of program
outcomes across the entire CE program. The civil engineering program outcomes reflect the
current ABET Criterion 3 outcomes (a-k) as well as the ASCE Body of Knowledge (BOK).

The approach integrates existing grading practices and correlates the results with the desired
program outcomes. This system of direct assessment provides a quantitative assessment without
increasing faculty workload by leveraging what is already being done in the evaluation and
grading of student work. The technique uses embedded indicators: specific student performance
events common to all students in a course, such as homework problems, projects, and tests. The
program director and course directors identify potential embedded indicators that correlate
strongly with the desired program outcomes. In addition to the embedded indicators,
non-standard measures of program outcomes, such as membership in the ASCE student chapter
and performance on the Fundamentals of Engineering Exam, are also considered.

The greatest benefit of a well-developed system of embedded indicators is that it provides a
quantitative assessment without increasing faculty workload. The quantitative assessment can
then be used to validate an "anecdotal" assessment or to identify areas for improvement that may
not be readily apparent. This simple yet thorough approach enables programs to spend their time
developing improvements or identifying needed resource re-allocation instead of collecting and
compiling assessment data.

Introduction

The purpose of this paper is to discuss a program-wide assessment system developed at the
United States Military Academy (USMA) and used in the Civil Engineering (CE) program. The
ABET requirement to demonstrate a process for program assessment is best approached on a
continual basis with annual updates. Within the Department of Civil & Mechanical Engineering
at USMA, course assessments are conducted at the conclusion of each course; in attendance are
the instructors who taught the course as well as the department leadership responsible for overall
course and program oversight. During the course assessment meeting, an in-depth analysis of the
course is conducted, covering not only administrative items but also a review of the course's
embedded indicators that contribute to the overall program assessment. The embedded indicators
for each course are specifically identified by the program director to provide a direct assessment
of student learning for a given program outcome. At the program level, the data from each
embedded indicator are compiled into an overall spreadsheet broken down by the 16 program
outcomes. The process of identifying specific embedded indicators for
each course began during Academic Year (AY) 05-06; the results are now being collected. The
focus of this paper is to provide an overview of the assessment process and to provide initial