3
2
How to Use the Bruner Foundation Evaluation Essentials for Program Managers PowerPoint Slides

The Evaluation Essentials for Program Managers slides were developed as part of a Bruner Foundation special project by evaluation trainer Anita Baker of Evaluation Services, and jointly sponsored by the Hartford Foundation for Public Giving. They were tested initially with a single organization in Rochester, NY (Lifespan) as part of the Evaluation Support Project 2010. The materials were revised and re-tested with three nonprofit organizations as part of the Anchoring Evaluation project in 2011-12.

The slides, intended for use in organizations that have already participated in comprehensive evaluation training, include key basic information about evaluation planning, data collection, and analysis in three separate presentations. Organization officials or evaluation professionals working with nonprofit organization managers are encouraged to review the slides, modify their order, and add or remove content according to training needs. (Please note that the first session begins with a presentation of "results" as a framework to help trainees see the overall relevance of evaluative capacity, i.e., what they are working toward. There is an ancillary file with multiple "results" slides that can be substituted depending on the trainee organization's program focus.)

Additional Materials

To supplement these slides there are sample agendas, supporting materials for activities, and other handouts. There are "placeholder" slides, showing a target with an arrow in the bullseye, that signify places where activities can be undertaken; be sure to move or eliminate these depending on the planned agenda. Other, more detailed versions of the Evaluation Essentials materials are also available in Participatory Evaluation Essentials: An Updated Guide for Nonprofit Organizations and Their Evaluation Partners and the accompanying 6-session slide presentation.
These materials are also available on the Bruner Foundation and Evaluation Services websites free of charge. Whether you are an organization leader or an evaluation professional working to assist nonprofit organization staff, we hope that the materials provided here will support your efforts. When you have finished using the Evaluation Essentials for Program Managers series, have trainees take our survey: https://www.surveymonkey.com/s/EvalAnchoringSurvey

Bruner Foundation
Rochester, New York

5
Or results like these?

- More than 90% of case managers at all sites but location C indicated they had fully adopted the Program Model (PM).
- Two-thirds or more of clients at all sites but location C reported improved quality of life.

Site    % of clients reporting improved quality of life since PM initiated
A       69%
B       73%
C       40%
D       71%
E       66%
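Flagging the underperforming site in a table like this is simple arithmetic once the figures are tabulated. A minimal Python sketch; the function name and default cutoff are choices made for this illustration, not part of the slides:

```python
# Site-level improvement rates from the slide above.
site_rates = {"A": 0.69, "B": 0.73, "C": 0.40, "D": 0.71, "E": 0.66}

def below_threshold(rates, cutoff=0.66):
    """Return the sites whose improvement rate falls under the cutoff.

    The slide counts 66% as meeting its "two-thirds or more" mark,
    so the default cutoff follows that reading.
    """
    return sorted(site for site, rate in rates.items() if rate < cutoff)
```

With the slide's figures, `below_threshold(site_rates)` returns `["C"]`, matching the point that location C lags the other sites.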

7
What if you saw results like these?

Desired Outcome                                                      2009   2010
65% of clients show slowed or prevented disease progression
  at 6 and 12 months                                                  83%    87%
75% of clients are fully engaged in HIV primary medical care          96%
80% of clients show progress in 2 or more areas of service plan       90%    94%
50% of clients with mental health issues show improvement in
  mental health function by 6 months                                  97%
75% of clients enrolled in SA treatment decrease use of
  drugs/alcohol after accessing services                              93%    92%
90% of clients show improved or maintained oral health at
  6 and 12 months                                                     92%    94%

8
Logical Considerations for Planning

1. Think about the results you want.
2. Decide what strategies will help you achieve those results.
3. Think about what inputs you need to conduct the desired strategies.
4. Specify outcomes, identify indicators and targets. (Decide, in advance, how good is good enough.)
5. Document how services are delivered.
6. Evaluate actual results (outcomes).

12
Target: Reminders

- Targets should be specified in advance. This requires buy-in.
- Word targets carefully so they are not over- or under-ambitious, make sense, and are in sync with time frames.
- If a target indicates a change in magnitude, be sure to specify initial levels and what counts as positive.

13
Outcome, Indicator, Target - EXAMPLE

Outcome: Participants will be actively involved in afterschool activities.

Indicators:
- At least 500 students will participate each month.
- Students will attend 70% or more of all available sessions.
- At least half of participants will participate in 100 or more hours per semester.
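Indicators like these are each a simple check against attendance records. A hedged sketch of how the checks might be scripted; the record format, function name, and the decision to average attendance across sessions are all assumptions of this illustration, not prescribed by the slides:

```python
def check_indicators(monthly_counts, session_attendance_rates, hours_per_student):
    """Return indicator -> met? for one semester of afterschool data."""
    return {
        # At least 500 students participate each month.
        "500_students_per_month": all(n >= 500 for n in monthly_counts),
        # Students attend 70% or more of available sessions
        # (checked here as an average across sessions).
        "70pct_attendance": (
            sum(session_attendance_rates) / len(session_attendance_rates) >= 0.70
        ),
        # At least half of participants log 100+ hours this semester.
        "half_with_100_hours": (
            sum(1 for h in hours_per_student if h >= 100)
            / len(hours_per_student) >= 0.5
        ),
    }
```

With illustrative data, e.g. monthly counts of [512, 530, 498, 541], the first indicator is reported as not met because one month dips below 500: the "how good is good enough" decision was made in advance, and the data are simply checked against it.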

18
How do you know when your programs really work? EVALUATION

Program Evaluation: Thoughtful, systematic collection and analysis of information about activities, characteristics, and outcomes of programs, for use by specific people, to reduce uncertainties and inform decisions.

23
Key Questions

- Focus and drive the evaluation.
- Should be carefully specified and agreed upon in advance of other evaluation work.
- Generally represent a critical subset of the information that is desired.

24
Evaluation Question Criteria

- It is possible to obtain data to address the questions.
- There is more than one possible "answer" to the question.
- The information to address the questions is wanted and needed.
- It is known how the resulting information will be used internally (and externally).
- The questions are aimed at changeable aspects of activity.

27
Types, Focuses and Timing of Evaluation

Type        Focus                                     Timing
Monitoring  Compliance with terms of a grant,         Period of the grant or
            or program design                         program duration
Formative   Implementation                            While program is operating
            Short/mid-term outcomes                   While program is operating,
                                                      at certain key junctures
Summative   Long-term outcomes                        As or after the program ends

29
Characteristics of Effective Evaluators

- Basic knowledge of the substantive area being evaluated
- Knowledge about and experience with program evaluation
  - The field is unregulated.
  - The first graduate-level training programs in evaluation are recent.
- Good references from sources you trust
- Personal style and approach fit (MOST IMPORTANT)

30
Evaluation Strategy Clarification

- All evaluations are:
  - partly social
  - partly political
  - partly technical
- Both qualitative and quantitative data can be collected and used, and both are valuable.
- There are multiple ways to address most evaluation needs.
- Different evaluation needs call for different designs, data, and data collection strategies.