For the want of a response, the data was crap

Any time I hear someone use data from one of the new freshman, senior, or recent graduate surveys to advocate for a particular idea, I can’t help but smile a little. It is deeply gratifying to see faculty and administrators comfortably use our data to evaluate new policy, programming, and strategic direction ideas. Moreover, we can all point to a growing list of data-driven decisions that we know have directly improved student learning.

So it might seem odd, but that smile slips away almost as quickly as it appears. Because underneath this pervasive use of data lies a deep trust in the veracity of those numbers. And the quality of our data depends almost entirely upon the participation of 18- to 22-year-olds who are . . . let’s just say “still developing.” Data quality is like milk – it can turn on you overnight. If students begin to think that survey questions don’t really apply to them, or if they start to suspect that the results aren’t valued by the college, they’ll breeze through the questions without giving them much thought or blow off the survey entirely. If that happens on a grand scale . . . I shudder to think about it. So you could say that I was “mildly concerned” as I organized fall IDEA course feedback forms for processing a few weeks ago and noticed several where the only bubbles colored in were “fives.” A few minutes later I found several where the only darkened bubbles were “ones.”

Fortunately, a larger sampling of students’ IDEA forms put my mind at ease. I found that on most forms the distribution of darkened circles varied and, as best as I could tell, students’ responses to the individual questions seemed to reflect at least a minimal effort to answer truthfully. However, this momentary heart attack got me wondering: to what degree might students’ approach to our course feedback process affect the quality of the data we get? This is how I ended up in front of Augustana’s student government (SGA) earlier this week talking about our course feedback process, the importance of good data, the reality of students’ perceptions of and experiences with these forms, and ways that we might convince more students to take this process seriously.

During this conversation, I learned three things that I hope you’ll take to heart. First, our students really come alive when they feel they are active participants in making Augustana the best place it can be. However, they begin to slip into the role of passive bystanders when they don’t know the “why” behind processes in which they are expected to be key contributors. When they become bystanders, they are much less likely to invest their own emotional energy in providing accurate data. Many of the students honestly didn’t think that the IDEA data they provided on the student form was used very often – if ever. If the data doesn’t really matter anyway, so their thinking goes, the effort they put into providing it doesn’t matter all that much either.

Second, students often felt that not all of the questions about how much progress they made on specific objectives applied in all classes equally. As I explained to them how the IDEA data analysis worked and how the information that faculty received was designed to connect the objectives of the course with the students’ sense of what they learned, I could almost hear the light bulbs popping on over their heads. They were accustomed to satisfaction-type surveys in which an ideal class would elicit a high score on every survey question. When they realized that they were expected to give lower scores to questions that didn’t fit the course (and that this data would be useful as well), their concern about the applicability of the form and all of the accompanying frustrations disappeared.

Third, even though we – faculty, staff, and administrators – know exactly what we mean when we talk about learning outcomes, our students still don’t really know that their success in launching their life after college is not just a function of their major and all the stuff they’ve listed on their resume. On numerous occasions, students expressed confusion about the learning objectives because they didn’t understand how they applied to the content of the course. Although they may have seen the lists of skills that employers and graduate schools look for, it seems that our students think these are skills that are largely set in stone long before they get to college, and that college is mostly about learning content knowledge and building a network of friends and “connections.” So when they see learning objectives on the IDEA forms, unless they have been clued in to understand that these are skills that the course is designed to develop, they are likely to be confused by the very idea of learning objectives above and beyond content knowledge.

Although SGA and I plan to work together to help students better understand the value of the course feedback process and its impact on the quality of their own college experience, we – faculty, staff, and administrators – need to do a much better job of making sure that our students understand the IDEA course feedback process. From the beginning of the course, students need to know that they will be learning more than content. They need to know exactly what the learning goals are for the course. They need to know that faculty genuinely want to find out how much their students learned and what worked best in each class to fuel that learning, and that satisfaction doesn’t always equate to learning. And students need to know how faculty have used course feedback data in the past to alter or adapt their classes. If you demonstrate to your students how this data benefits the quality of their learning experience, I think they will be much more willing to genuinely invest in providing you with good data.

Successfully creating an evidence-based culture of perpetual improvement that results in a better college requires faculty, staff, and administrators to take great care with the sources of our most important data. I hope you will take just a few minutes to help students understand the course feedback process. Because in the end, not only will they benefit from it, but so will you.