This past summer, Taskstream conducted extensive research into the “state of affairs” of assessment in higher education. Through focus groups, interviews, and an online survey, the research explored perceptions among faculty, institutional leaders, and assessment professionals about various assessment topics, such as the nature and perceived value of assessment, terminology in the field, and technology to support the work.

In this article, we will discuss findings from our national online survey, which received over 1,000 responses from institutional leaders, assessment professionals, and faculty members at institutions of all types and sizes across the country. The survey was restricted to full-time employees at higher education institutions in the United States and distributed via email and in a newsletter from the online news service Education Dive.

A total of 359 faculty members responded to the survey, the majority of whom came from public institutions (58%) and private not-for-profits (36%), with a small percentage from the private for-profit sector (6%). In terms of discipline/area, a large proportion of respondents were from schools of education (47%) and arts and sciences (26%), while the remainder were associated with business, health sciences, general studies, IT, nursing, and social work departments. With this article, we aim to provide insight into how faculty perceptions of their personal experience with assessment relate to their involvement in assessment, their views on its importance, and their specific needs for professional development. For this examination, we compared responses from faculty who rated their personal level of experience with assessment as “beginner/intermediate” (55% of faculty respondents) with responses from faculty who rated their experience as “advanced” (45% of faculty respondents).

Results

Involvement in assessment

Faculty members who identified their personal level of experience with assessment as beginner/intermediate indicated that they are highly involved in course (87%) and program (70%) level assessment at their institutions. Likewise, faculty members who rated themselves as advanced indicated that they are highly involved in course and program level assessment at their institutions (87% and 69%, respectively). At the department level, beginner/intermediate and advanced level faculty also indicated comparable levels of involvement in assessment, with both groups rating their participation at 54%.

The most notable difference between the two groups appeared in their involvement at the institutional level: 38% of beginner/intermediate faculty members said they are not involved in assessment at the institutional level, compared to 26% of faculty who rated themselves advanced in assessment. Only 13% of beginner/intermediate faculty said they are highly involved in assessment at the institutional level, compared to 20% of faculty who rated themselves advanced in assessment.

Comfort with data & the use of technology to support assessment

When faculty respondents rated their comfort with data (on a scale of 1-5, where 1 = “data challenged” and 5 = “data geek”), those who identified themselves as advanced in assessment were more likely to view themselves as “data geeks”: nearly 28% of this group rated their comfort with data at this level, compared to 12% of faculty who identified themselves as having beginner/intermediate levels of experience in assessment. Further, approximately 74% of the advanced group selected either a “4” or “5” on the scale, indicating a high degree of comfort with data, compared to 41% of the beginner/intermediate group.

When asked to rate how important it is for their institution to support assessment efforts with technology, nearly all of the respondents across both groups indicated that they think it is either “somewhat important” or “very important” for their institution to support assessment with technology (98% of the beginner/intermediate group and 99% of the advanced group). A higher percentage of the advanced group than of the beginner/intermediate group rated technology to support assessment as very important (85% versus 77%).

Institutional assessment maturity and the importance of assessment

When faculty were asked to rate their institution’s level of assessment “maturity,” 83% of the beginner/intermediate group said their institutions were also at a beginner/intermediate level, and only 16% believed their institutions were advanced when it came to assessment. However, 45% of faculty members who rated themselves as advanced in assessment also rated their institution as advanced. In other words, both groups were more inclined to rate their institution at the same level they rated their own personal experience with assessment.

When it comes to their personal opinion on the value of assessment for an institution, faculty respondents with an advanced level of experience with assessment were more likely to indicate that it is important for an institution (92%) than those at the beginner/intermediate level. Likewise, when asked how important assessment is to the future of higher education, the advanced group was more likely to indicate that it is very important (88%) than the beginner/intermediate group (80%).

Professional development interests/needs

Respondents were asked to what extent they felt they needed, or were interested in, professional development (PD) in the following areas: rubric design; data analysis and interpretation; scoring calibration/norming; developing/selecting assessment assignments; assessment terminology; documenting assessment results and reports; the benefits of assessment; inter-rater reliability; and curriculum mapping. These topics were rated on a 1-5 scale (1=not at all interested to 5=very interested).

The top two topics of most interest/need for beginner/intermediate faculty — as indicated by a “4” or “5” on the scale — were: 1) developing/selecting assessment assignments and 2) rubric design. Curriculum mapping and data analysis and interpretation tied for third most interesting to this group. The top three topics for advanced faculty were somewhat different: 1) documenting assessment results and reports, 2) inter-rater reliability, and 3) curriculum mapping.

Although both groups rated curriculum mapping as the third most interesting topic for PD, a larger percentage (36%) of those who identified themselves as advanced showed little to no interest in the topic — as indicated by a “1” or “2” on the scale — than those who identified themselves as beginner/intermediate in their assessment experience (28%). When comparing ratings between the two groups, the beginner/intermediate group indicated greater interest than the advanced group in the following topics: developing/selecting assessment assignments; rubric design; and assessment terminology. On the other hand, advanced faculty were more interested than the beginner/intermediate group in these topics: inter-rater reliability; documenting assessment results and reports; and scoring calibration/norming.

Discussion

Based on our survey findings, advanced faculty are more inclined to view assessment as very important both for an institution and for the future of higher education. They are also more likely to be involved in assessment at the institutional level, more comfortable with data, more likely to view technology to support assessment as very important, and more likely to perceive their institution’s assessment maturity as advanced.

Our research indicates that one’s personal level of experience with assessment shapes which professional development topics are of most interest. According to our survey, those who see themselves as beginner/intermediate were most interested in PD focused on developing/selecting assessment assignments; rubric design; curriculum mapping; and data analysis and interpretation. Meanwhile, those who rated themselves as advanced in assessment were most interested in PD on documenting assessment results and reports; inter-rater reliability; and curriculum mapping. Considered another way, faculty with less experience with assessment are interested in topics related to the beginning phases of the process (i.e., developing/selecting assignments that will provide evidence for assessment and creating rubrics to assess that evidence), whereas more advanced faculty are interested in documenting results and in more advanced data-analysis practices (e.g., inter-rater reliability). It’s worth noting that curriculum mapping was among the top areas of interest for both groups. This finding is in line with our experience working with a wide variety of institutions: we find a strong interest in, and need for, professional development around more strategic, planning-related topics among institutions at all levels and stages in the process.

As with all research, this study raises additional areas for further investigation. For example, our sample was limited to full-time faculty members; it would be interesting for future research to focus on part-time adjunct faculty, exploring their personal experience with assessment, the level(s) at which they are involved with assessment on their campuses, and the specific professional development areas they are most interested in. We can see from our initial examination of the survey data that, as with assessment itself, “one size does not fit all” when it comes to planning professional development activities on college campuses. Institutions need to consider not only their faculty’s perceived level of experience in assessment, but also the different faculty groups who are engaging in assessment practices on their campuses. Our data show that curriculum mapping seems to be the common denominator across all levels of expertise, and it is an integral step in the beginning stages of a systematic assessment effort. We encourage institutions to focus their initial professional development activities on this topic and build out more advanced sessions from there.