YALSA’s recently updated Teen Programming Guidelines encourage the use of evidence-based outcome measurement as a means of developing meaningful programs for young people. The Public Library Association – through its latest field-driven initiative, Project Outcome – is also working to help librarians capture the true value and impact of programs and services. At ALA Annual 2016, PLA will launch Project Outcome, designed to help any library programmer measure outcomes beyond traditional markers such as circulation and program attendance. Instead, Project Outcome focuses on documenting how library services and programs affect our patrons’ knowledge, skills, attitudes, and behaviors. It will help librarians use concrete data to prove what they intuitively know to be true: communities are strengthened by public libraries, and patrons find significant value in library services.

Lessons from the Field: Skokie (IL) Public Library

At Skokie Public Library, we participated in the pilot testing of Project Outcome in the fall of 2014 by administering surveys for 10 different programs. The surveys were conducted online, on paper, and through in-person interviews. In one example, teens attending a class about biotechnology were interviewed using a survey designed to measure outcomes for “Education/Lifelong Learning.” Participants ranked the extent to which they agreed or disagreed with statements measuring knowledge, confidence, application, and awareness. Results showed that 85% of respondents agreed or strongly agreed that they learned something helpful, while only 43% agreed or strongly agreed that they intended to apply what they had just learned. The results demonstrated some improvement in subject knowledge, information that can be useful for advocacy. But they also revealed that there’s room for growth in ensuring program participants understand how they can apply what they’re learning. In an open-ended question asking what they liked most about the program, teens mentioned the chemical experiments conducted during the program. This type of data is worth paying attention to when planning future programs.
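For libraries tallying this kind of Likert-scale survey data by hand, the arithmetic is straightforward to automate. Here is a minimal sketch in Python using hypothetical responses; the data and function name are illustrative, not part of Project Outcome or Skokie’s actual results:

```python
from collections import Counter

# Hypothetical responses to one survey statement
# (illustrative only -- not actual Project Outcome data).
responses = [
    "Strongly Agree", "Agree", "Agree", "Neutral",
    "Strongly Agree", "Disagree", "Agree",
]

def percent_agreement(responses):
    """Return the percentage of responses that are Agree or Strongly Agree."""
    counts = Counter(responses)
    agreed = counts["Agree"] + counts["Strongly Agree"]
    return round(100 * agreed / len(responses))

print(percent_agreement(responses))  # -> 71
```

The same tally can be run per statement to produce the knowledge, confidence, application, and awareness percentages described above.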

In another example, we surveyed teens participating in a program titled “Slam Poetry: Are We So Different?” Since this program was part of a community-wide initiative to discuss how race shapes our lives, we asked questions to measure the impact on participants’ knowledge, awareness, and application. We found that 83% of respondents agreed or strongly agreed that they felt more knowledgeable about the issues of race and racism in the community, while 67% agreed or strongly agreed that the program inspired them to take action or make a change. This type of outcome measurement goes much deeper into the true influence of a program than simply recording the number of attendees.

Moving forward, we’ll continue to experiment with different Project Outcome surveys while also exploring other techniques. For long-term engagement, we are developing in-house digital badging systems. We prototyped a simple badging game for Teen Tech Week that provided data about the preferences of our teen patrons (see report). Not only do badges tally how many people are participating, but they also illuminate user behaviors on a granular level. Badges also make different opportunities throughout the library more visible and help teens track their progress toward mastery of a skill or subject.
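The badge mechanics described above can be sketched as a small data structure: each badge requires a set number of distinct activities, and the tracker records both individual progress toward mastery and overall participation. This is a hypothetical design for illustration, not Skokie’s actual badging system; all badge names and thresholds are invented:

```python
# Hypothetical badge tracker (illustrative design, not Skokie's system).
class BadgeTracker:
    def __init__(self, requirements):
        # requirements: badge name -> distinct activities needed to earn it
        self.requirements = requirements
        self.progress = {}  # (patron_id, badge) -> set of completed activities

    def record(self, patron_id, badge, activity):
        """Log one completed activity toward a badge."""
        self.progress.setdefault((patron_id, badge), set()).add(activity)

    def earned(self, patron_id, badge):
        """Has this patron completed enough activities to earn the badge?"""
        done = self.progress.get((patron_id, badge), set())
        return len(done) >= self.requirements[badge]

    def participation(self, badge):
        """How many patrons have logged any activity toward this badge."""
        return sum(1 for (_pid, b) in self.progress if b == badge)

tracker = BadgeTracker({"3D Printing": 3})
tracker.record("teen-1", "3D Printing", "orientation")
tracker.record("teen-1", "3D Printing", "design a model")
print(tracker.earned("teen-1", "3D Printing"))  # -> False
tracker.record("teen-1", "3D Printing", "print a model")
print(tracker.earned("teen-1", "3D Printing"))  # -> True
```

The `participation` count corresponds to the traditional attendance tally, while `earned` and the per-patron activity sets are the granular behavioral data the paragraph above describes.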

Whether through Project Outcome or alternative techniques, evaluating outcomes is a fluid process. We’ll keep experimenting because the information we’re gathering is helpful for advocating for the library and improving what we’re doing so that we can have a greater impact on the people we serve. What we’re learning confirms that the library plays a crucial role in teens’ lives, which is why it is so important to use outcome measures to make an even stronger case for funding, partnerships, additional staff, and community support.