This blog post will highlight another “bite sized” piece of the Value Report, the section on Student Learning found on pages 37-45. Please see the full text of the Value of Academic Libraries Report for more information and for full citations, and if you’re currently working on a project that addresses the value of academic libraries, please share it with us at this link: http://www.surveymonkey.com/s/SMWQKNL

In the area of student learning, academic libraries are in the middle of a paradigm shift. In the past, academic libraries functioned primarily as information repositories; now they are becoming learning enterprises (Bennett 2009, 194). This shift requires academic librarians to embed library services and resources in the teaching and learning activities of their institutions (Lewis 2007). In the new paradigm, librarians focus on information skills, not information access (Bundy 2004, 3); they think like educators, not service providers (Bennett 2009, 194).

For librarians, the main content area of student learning is information literacy, so this is also the focus of much of our assessment of student learning. Traditionally, information literacy assessment focused on satisfaction (Association of College and Research Libraries 2000) or on self-report surveys, like the Research Practices Survey, rather than on outcomes. More recent literature is outcomes-focused and emphasizes multiple-choice tests like the Standardized Assessment of Information Literacy Skills (SAILS) as well as bibliography analysis (Walsh 2009, 21). However, most of the literature relates the details of case studies focused on one group of students, one class, or one semester (Matthews Evaluation p 243). In other words, most examples are “micro-level studies” (Streatfield and Markless, Evaluating the Impact 2008, 103) or “narrow and momentary glances” at the impact of instructional efforts (Shupe 2007, 54). What is missing are the broader, more coherent demonstrations of value that librarians need to articulate the importance of information literacy learning in an institutional context.

There are large gaps in the literature and a need for rigorous, larger-scale assessments that emphasize “changes in levels of student competence…changes in student behavior…effects of information literacy based changes in the curriculum…the comparative efficacy of different levels and types of information literacy interventions…[and] the overall value of library based information literacy work to the academic community” (Carrigan 1992, 104). Some of these gaps can be closed by using assessment management systems to compile small-scale institutional assessments into larger, more systematic investigations; others can be filled by organized, cooperative studies.