One of the most popular AEA365 posts over time is Annaliese Calhoun’s Measuring Sustainability Capacity and Planning for Long Term Success. I suspect this is because as evaluators, our work can regularly stray into the area of program planning. And if we’re providing input into program design, why not add extra value in terms of ways to promote program sustainability as well?

Years ago I was given an opportunity to research what factors are associated with greater program sustainability, and what I learned truly surprised me. It turns out achieving sustainability isn’t rocket science. In fact, the promoters of program sustainability read like many best practices for operating a nonprofit: diversified funding, program champions, and collaborative partners. Not surprisingly, evaluation is also strongly associated with greater program longevity. But there are other promoters you might not immediately think about, such as a strong volunteer base, in-kind resources, high visibility, local values and culture, and a sustainability plan. Although researchers are still learning about the conditions under which new program innovations are sustained, I’m excited by the potential of what we can achieve with what we currently understand.

Rad Resource: The Program Sustainability Assessment Tool (PSAT). There are several tools available to help groups appraise their level of sustainability, but this is the one I like best. Developed by the Center for Public Health Systems Science at Washington University in St. Louis, the PSAT allows you to assess a program on several dimensions that increase the likelihood of later sustainability. Although the Center originally developed it for chronic disease prevention programs, it’s relevant for most types of programs. The tool is quick, user-friendly, and best of all produces a score that groups can use to evaluate their progress over time.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Greetings! Jessica Manta-Meyer here, a Director at Public Profit, a consulting firm helping mission-driven organizations measure and manage what matters. Have you thought about creating a replication manual, but didn’t know where to start? Or worried you’ll put in a titanic amount of time creating a manual to train staff at new sites, only to see it sink into obscurity or go out-of-date? Fear not! I’m here to offer a few tips to help you avoid the icebergs, and maybe even make it possible for others to replicate that awesome program!

Hot Tips:

Identify what is loose vs. tight in your program. Every program has elements that are essential, and ones that can be customized. The replication site will inevitably be running the model program in a new context. Document what essential elements help your program thrive, and let the replication site adjust the other elements to fit their needs.

Make explicit what is implicit. Replication manuals aren’t just the “what” and “how” of a program, but the “why” as well. Often program staff understand the “why” on such a deep level that it feels self-evident. When you make space to highlight the “why,” it makes a stronger case for the more concrete elements of “what” and “how.”

Know your audience. I bet everything you’ve ever read about writing has told you this one, and it holds true for replication manuals too! Don’t leave it up to the partner program to sort out who should be using your manual. Manuals for front-line service providers will read differently than manuals meant for their managers. Figure out which group you are targeting and write to that.

Link to related sources. It is not efficient, or practical, to try to re-create every training agenda, background document, or handout in a program’s universe. Link to them instead! This ensures that whenever these items are updated, your partner program will be on the same page.

Public Profit has created a resource that outlines the tips above and provides examples for each. Check it out here. (Bonus: it also includes an amazing Carl Sagan quote!)


Hello from Debi Lang with the Massachusetts Area Health Education Center Network (MassAHEC) at the University of Massachusetts Medical School’s Center for Health Policy and Research. I last published an aea365 post on how evaluation and program staff collaborated to establish a competency-based model for a range of MassAHEC Health Careers Promotion and Preparation (HCPP) programs. The current post focuses on the importance of learning objectives as part of program design and evaluation, with some tips and resources on how to write clear objectives.

The AHEC HCPP model consists of 5 core competencies with learning goals that apply across a range of HCPP programs (see the chart below).

[Chart: Core Competencies and Learning Goals]

Each of the programs has written learning objectives that define specific knowledge, skills, and attitudes students will learn by participating in these programs. Learning objectives are important because they:

document the knowledge, skills, attitudes/behaviors students should be able to demonstrate after completing the program;

encourage good program design by guiding the use of appropriate class activities, materials, and assessments;

tell students what they can expect to learn/become competent in by participating in the program; and

help measure students’ learning.

Below are some of the learning objectives from one HCPP program and their connection to the competencies listed above:

[Table: sample learning objectives mapped to the core competencies]

Hot Tips: Here are some recommendations for writing learning objectives.

Think of learning objectives as outcomes. What will students know/be able to do once they complete the program? Start with the phrase: “At the end of this program, students will…”

Be careful not to write learning objectives as a description of the activities or tasks students will experience during the program.

Make sure student learning assessments are based on the learning objectives.

Rad Resource: “Bloom’s Taxonomy” is a framework based on 6 levels of knowledge (cognition) that progress from simple to more complex. When writing learning objectives, use the keywords associated with the knowledge level you expect students to achieve.
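To make the keyword idea concrete, here is a minimal sketch (not from the original post; the verb lists are abbreviated samples, and `draft_objective` is a hypothetical helper) showing how Bloom’s levels pair with action verbs in an outcome-style objective:

```python
# Sample action verbs for each of Bloom's 6 cognitive levels,
# ordered from simpler to more complex. Lists are illustrative, not exhaustive.
BLOOM_VERBS = {
    "remember": ["define", "list", "recall"],
    "understand": ["explain", "summarize", "classify"],
    "apply": ["demonstrate", "use", "solve"],
    "analyze": ["compare", "differentiate", "examine"],
    "evaluate": ["justify", "critique", "appraise"],
    "create": ["design", "construct", "formulate"],
}

def draft_objective(level: str, skill: str) -> str:
    """Draft an outcome-style learning objective using a verb from the given level."""
    verb = BLOOM_VERBS[level.lower()][0]
    return f"At the end of this program, students will {verb} {skill}."

print(draft_objective("understand", "the roles on a primary care team"))
# -> At the end of this program, students will explain the roles on a primary care team.
```

The point of the lookup is simply that the verb signals the cognitive level: “list” asks for recall, while “design” asks students to create something new.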

To be continued…

Program-specific learning objectives that connect to one or more core competencies can help measure student learning in order to report program outcomes from a competency perspective on a local and state level. In a future post, I’ll discuss how learning objectives are used in an evaluation method called the retrospective pre-post, along with ways to analyze data collected using this design.


I’m Chithra Adams, a program evaluator at the Human Development Institute, University of Kentucky. I explore how design principles and research can be applied to program evaluation.

Rad Idea – prototyping: A key part of the design process is prototyping. Prototyping is a process by which ideas and concepts are tested for technical and social feasibility. Prototypes are physical products. They can range from the expression of an early idea to an almost complete product. Depending on the context, prototyping can be used in several ways (Stapper, 2010):

Prototypes evoke a focused discussion in a team because the phenomenon is ‘on the table’.

Prototypes allow testing of a hypothesis.

Prototypes confront theories because instantiating one typically forces those involved to consider several overlapping perspectives/theories/frames.

Prototypes confront the world because the theory is not hidden in abstraction.

A prototype can change the world because in interventions it allows people to experience a situation that did not exist before.

Lessons learned in prototyping evaluation products and processes: I use prototyping to test out new ideas as well as to get the client involved in the evaluation process. Prototyping can be used both to test ideas for products (reports, briefs) and for processes (client interaction, stakeholder involvement). Regardless, the client/stakeholder should understand that the prototype is only a draft and further refinements will be made. In some cases, prototyping with a client/stakeholder is simply not feasible. In those cases, the idea or product can be tested with someone outside the project. Sometimes I try out the first few iterations with other evaluators and test the final versions with clients.

The timing of the prototyping process is critical. It should be done when the client has the time to provide feedback for further refinement. You cannot prototype an evaluation report a week before it is due! The key to prototyping is rapidly testing ideas and getting feedback. Be receptive to feedback. Being receptive to it does not mean that all feedback will be incorporated in the next iteration. Expert judgment should be used to identify what feedback will be adopted.

The American Evaluation Association is celebrating Program Design TIG Week with our colleagues in the Program Design Topical Interest Group. The contributions all this week to aea365 come from our Program Design TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

I’m Alvin Yapp, an evaluator with the Edmonton Oliver Primary Care Network (EOPCN), a team of doctors and other health providers, such as nurses, dietitians, kinesiologists, mental health workers, and pharmacists. We work as a team to provide coordinated primary health care to patients. I lead the evaluation efforts at EOPCN. In this post, I share my experiences in using evaluation to inform the design of primary health care programs.

I have found evaluation to be a powerful tool for program design; evaluators can help create a framework to gather evidence which informs program development. I have found that evaluators can positively influence program design through the following activities:

1. Identify important indicators to examine in order to measure the success of the program.

Teams that design programs without clear indicators of success and a plan to measure those indicators will have a difficult time identifying problems and fixing them.

2. Embed evaluation processes into the processes of the program; evaluation is a part of the program.

If the evaluation processes are a key part of the program and clearly articulated from the beginning, staff will be much more engaged with evaluation activities (i.e., providing feedback) and the quality of the data will be higher.

3. Provide periodic check-ins of indicators to support ongoing development.

This does not have to take the form of a weekly written report, but can be regular team huddles around what the data is currently showing and informal conversations about possible problems and solutions.

4. Support evidence-based program changes.

Use the evidence from evaluation activities to inform and rationalize changes to the program.
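As a toy sketch of steps 1 and 3 (the indicator names and targets below are hypothetical, not EOPCN data), a periodic check-in might simply compare current values against targets so a team huddle can see at a glance which areas need attention:

```python
# Hypothetical program indicators with targets and current values.
indicators = {
    "patients seen per week": {"target": 120, "current": 95},
    "referral follow-up rate": {"target": 0.80, "current": 0.84},
}

def check_in(indicators: dict) -> dict:
    """Flag each indicator as 'on track' or 'needs attention' against its target."""
    return {
        name: ("on track" if vals["current"] >= vals["target"] else "needs attention")
        for name, vals in indicators.items()
    }

for name, status in check_in(indicators).items():
    print(f"{name}: {status}")
# -> patients seen per week: needs attention
# -> referral follow-up rate: on track
```

Even something this simple can anchor the informal conversations about possible problems and solutions described above.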

Hot Tip: Spend some time instilling a culture of evaluation into the team/organization. If the team understands that you are there to support the design and success of their program, they will be much more open with identifying areas where the program can improve. That said, do not overburden them with evaluation activities; they still have a program to design and implement!

Lesson Learned in Evaluating Developing Programs: Program development does not happen in a straight line. Be nimble. Plan as much as you can, and plan for those plans to go wrong.


I’m Marti Frank, a researcher and program evaluator from Portland, Oregon. I’ve found the hardest part of a project is making recommendations that resonate with my client, and I’ve been working on an approach to developing and communicating recommendations that’s rooted in non-violent communication.

Lessons Learned: Evaluation clients are naturally attached to their programs. Finding non-threatening ways to draw attention to program design issues and inspire action can be a challenging skill for evaluators to master.

Hot Tip: Use a non-violent communication approach to develop and present program design recommendations.

Non-violent communication (NVC) dates back to the 1960s, and people have found it useful in incredibly diverse contexts, from peace-making to parenting. NVC focuses on three aspects of communication: self-empathy (an awareness of one’s inner experiences), empathy (understanding the feelings and emotions of others), and honest self-expression (expressing oneself authentically in a way that inspires compassion from others). The goal is to foster open, honest communication and avoid communication that blocks compassion or alienates people. My eight-year-old son is learning NVC at school as a way to mediate disputes. Why not use it in evaluation, too?

In my interpretation, NVC is a three-step process that frames a conversation.

Observations → consequences → requests

In the evaluation context:

Observations and consequences are both evidence-based; they differ only in their causal relationship. The evaluator’s challenge is to distinguish between them.

Observations can be any information that helps us describe the program and its context. This approach to thinking about and reporting on evaluation findings has helped me make a place for anecdotal or otherwise-hard-to-chart data.

Consequences are what result from the observed conditions. In my formulation, consequences are usually the social condition we want to explain, or change.

Requests are recommendations: what we think needs to happen to move from the consequences at hand to our ideal state.
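One way to picture the chain is as a small data structure that forces each finding into the observation → consequence → request narrative before report writing. This is a hypothetical sketch (the `Finding` class and example strings are illustrative, not part of the NVC literature):

```python
from dataclasses import dataclass, field

@dataclass
class Finding:
    observation: str   # evidence-based description of the program or its context
    consequence: str   # what results from the observed condition
    requests: list[str] = field(default_factory=list)  # recommendations

    def narrative(self) -> str:
        """Render the finding as an observation -> consequence -> request sentence."""
        asks = "; ".join(self.requests) or "(no recommendation yet)"
        return (f"We observed {self.observation}. "
                f"As a result, {self.consequence}. "
                f"We recommend: {asks}.")

f = Finding(
    observation="attendance drops sharply after week 4",
    consequence="participants miss the skills taught in later sessions",
    requests=["front-load core content", "add reminder calls in week 3"],
)
print(f.narrative())
```

Writing findings this way makes it obvious when a consequence is asserted without an observation behind it, or when a request has no consequence motivating it.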

Lessons Learned: I find the benefits of the NVC approach are that it:

Forces us to think about data in a narrative format. This means there’s no shying away from questions of causality.

Makes room for anecdotes and other one-off pieces of data that strike us intuitively as important but which otherwise may not find a home in an evaluation report structured around the statistically significant results of data collection activities.

Contextualizes recommendations so that, by the time we get to them in the report or presentation, we’ve brought the audience along on our chain of logical reasoning.

Can help tie the particular findings from any single evaluation back to the organization’s theory of change, supporting or adding nuance to the theory.

Rad Resource: Getting to Yes is a classic handbook on negotiation and communication with ties to the NVC approach.


My name is Alexey Kuzmin and I am the Director of the Process Consulting Company, based in Moscow, Russia (since 1992). We specialize in program evaluation and evaluation-related services such as evaluation training and the design and implementation of monitoring and evaluation systems. Clients often invite us to participate in program design.

Lesson Learned: Values influence program design to a great extent, but rarely become an explicit part of program description.

Cool trick: Suggest that your clients make program values an explicit part of their program design. Doing so will stimulate an in-depth and important discussion among the key stakeholders.

Hot Tips: Remember that if you decide to do so, you need to be ready to:

Raise stakeholders’ awareness of the role and importance of values in program design.

My colleague (Irina Efremova-Garth) and I reported on our experiences in including values in program design at the AEA15 conference. Our slides are available via the AEA Public eLibrary.


Hello! I am Terence Fitzgerald, Senior Director of Program Design & Evaluation at International Justice Mission (IJM), a nonprofit organization that works to protect people from violent crime. I lead a team of staff that provides leadership and technical support to IJM on program design, monitoring, evaluation, and research. My team’s remit is to “bring evidence to bear for the mission” so that leaders at executive, portfolio, and project levels can make informed decisions; and so that IJM can realize its vision and fulfil its mission. I want to share lessons that my team has learned from working on program design from inside a mission-driven organization.

Lessons Learned:

Demystify program design: Before we engage with a team on program design, we ask for and listen to their views and experiences of it; and, where necessary, we explain concepts, processes, definitions, benefits, and challenges. We meet the teams where they are. I have found that a one-hour session on designing a personal program around “being more healthy” can generate lots of input and allow for discussion of many core design issues – different types of impact; relationships between resources, activities, and results; milestones and indicators; and assumptions and risks. Even with inexperienced staff, it can be quite easy to convey key concepts, get enthusiastic buy-in, and then build a more solid design.

Progressively elaborate the design: As we work with teams on a design, my team creates draft logic models and other design artifacts based on our understanding of what they have told us. We give those back to the team to review our work, correct any erroneous content, note any unresolved issues, and secure buy-in for further elaboration. We raise concerns where we see threats to the design’s plausibility or feasibility, and we help teams understand the causes of our concerns. We escalate unresolved concerns to decision-makers.

Facilitate change to designs: Teams tend to revere the designs into which they have put time and energy, and are thus reluctant to change them. My team reinforces that designs are meant to change, based on the team’s experiences and environmental changes. We explain and, where necessary, guide teams through IJM’s program change management process.

Rad Resources:

The OECD DAC criteria for evaluating development assistance are very useful for bringing evaluation into the program design process. My team uses the criteria to inform design discussions that often lead to clarifications on program priorities, inclusions and exclusions, and other design elements.


Hi! We’re Angelina Lopez (NYC Department of Education’s iZone) and Chi Yan Lam (Queen’s University) from the Program Design TIG. The PD-TIG was founded to provide a forum to explore the theory and practice of program design. Our TIG is proud to sponsor this week’s AEA365 posts. The contributors this week each work at the intersection of evaluation and design. You’ll hear how program design has shaped the way they approach evaluation to enhance evaluation use and program impact.

Hot Tip: 2016 is the Year of Evaluation + Design. At #Eval16, AEA’s annual conference, the concept of design will be explored through the lens of program design, evaluation design, and information design.

Hot Tip: While the field of design has traditionally been associated with the creation and development of tangible products (e.g., industrial design of consumer products; architecture of public spaces), designers are increasingly transcending fields and applying their craft in the social, business, and public sectors:

Check out how design firms like IDEO.org and Catapult (presenters at #Eval13) are applying design for social change.

Rad Resource: The design and social innovation community is actively exploring how design methods, processes, and mindsets may be applied toward solving complex social issues. Toolkits including the Design Kit by IDEO.org, the Bootcamp Bootleg by the d.school at Stanford University, and the Development Impact & You Toolkit by Nesta show how to facilitate design methods and activities to explore the needs of end-users of programs and to think creatively about potential solutions (i.e., programs, products, and services).

Lesson Learned: “Design Thinking” has become a trendy buzzword and is often misconstrued as a clear, linear process. In reality, an integrated approach to program design is messy and requires continuous alignment between people and organizations. Additionally, many of these so-called “innovative” methods and processes are not new. We’ve found alignment with participatory and developmental evaluation frameworks, and believe evaluative thinking skills can add value to design approaches.
