Student course evaluations ignored by students, disliked by staff

Winter Term has ended and Spring Term is in motion. Much knowledge has been gained. Techniques have been acquired, and grades received. However, a process of critical analysis required for change has yet again been left out of the equation — student course evaluations.

Since course evaluations were moved strictly online last fall, the number of completed forms has dropped significantly. Just over 11 percent of students completed evaluations for the courses they were enrolled in for Winter Term, compared to an estimated 70 percent response rate for in-class paper evaluations, according to Faculty Council.

Some staff believe students should be using these tools more, and that the evaluations can help instructors adapt and change to fit new learning styles. However, a majority of students seem indifferent, having to fill out the same style of form for each class and teacher with no guarantee that it actually makes a difference.

Whether students bother to complete these evaluations seems to correlate with how effective they believe the evaluations to be.

“I don’t think mine alone [evaluation] will make much of a difference, it’s certainly not helpful for me,” Mitchel Miller, a computer science major, said. “I would have to take a class twice with the same teacher to understand if they [evaluations] were useful or not.”

A lot of students, including myself, question the purpose of these evaluations, wondering whether they will actually have an effect on a particular instructor’s teaching style.

This line of thinking is flawed: these evaluations do not measure teaching effectiveness, but rather student satisfaction with teachers and courses. A student could be biased toward a teacher for multiple reasons, such as personal preference, a low or high grade, or previous history, skewing the evaluation results.

“I think that there is a bias when filling out the evaluations, grade dependent, experience dependent, etc.,” Miller said.

Students and faculty need to question the purpose and effectiveness of the course evaluations. According to Lane’s website, evaluations “[provide] one method for evaluating the teaching component of the learning environment.

“Evaluation questions [are] selected to provide information for a faculty member(s) to create a better learning environment and to become a better teacher.” These include questions such as “what did you like about the class?” and ambiguous statements like “this teacher encouraged students to think.”

The purpose is good in theory, but the application tells a different story. If only 11 percent of students are even attempting to complete the forms, few teachers will receive enough evaluations to read and make possible changes.

After I contacted many faculty members, the general consensus seemed to be a distaste for the whole system. Wishing to avoid unnecessary “hassle” from their co-workers, the faculty members asked to remain anonymous.

“Students receive no benefit from these evaluations, they are unable to see them unlike other rating systems (such as ratemyprofessor.com),” one Lane instructor said. “There are no consequences for instructors who receive poor evaluations.”

Some teachers offer personal surveys with questions that differ drastically from the standard online course evaluation. These instructors have shown spectacular results and enthusiasm for designing their own evaluations, ones that purposely do not factor into possible promotion or other considerations that could advance their careers. One teacher explained that they had received fewer than five standard online course evaluations across their three courses, but offered their own personalized version and received more than 50. This is clearly a more effective route than a one-size-fits-all evaluation form for all types of teachers.

Since the switch to a voluntary online system of review, students have shown less and less enthusiasm. Clearly, a standardized evaluation for all teachers isn’t working. The process needs to evolve to become more adaptable and accessible to students and faculty.

We at The Torch believe in the power of information to change the world and our communities. Our student staff strive to uphold the ethics of journalism and produce engaging content. The Torch has an average circulation of over 2,000 print editions published weekly and content published online throughout the week.