My friends at TRAASS have launched a new e-learning course on Real-time evaluation and adaptive management:

“What exactly is an RTE/AM approach and how can it help in unstable or conflict affected situations? Do M&E practitioners need to ditch their standard approaches in jumping on this latest bandwagon? What can you do if there is no counterfactual or dataset? This modular course covers these challenges and more.”

For those interested in humanitarian work and advocacy, this presentation may be of interest: it explains what humanitarian advocacy is – its definition, levels, process and challenges.

We often talk about using mixed methods in evaluation, but we rarely see examples that go beyond a combination of surveys and interviews. So I wanted to share an evaluation that I thought made good use of a genuinely varied set of methods. I was part of a team (at the Independent Evaluation Office) that carried out an evaluation of knowledge management at the Global Environment Facility.

The methods we used included:
- Semi-structured interviews
- Online surveys
- Comparative study of four organisations
- Meta-analysis of country-level evaluations
- Citation analysis – qualitative and quantitative
The image shows the visualisation of the citation analysis (carried out by Matteo Borzoni) by theme – interesting stuff! I feel that the range of data collected gave us a very solid evidence base for the findings. The report is available publicly and can be viewed here (pdf)>>
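To give a flavour of what the quantitative side of a citation analysis by theme can look like, here is a minimal sketch. All of the records and theme names below are invented for illustration – the actual analysis behind the visualisation was far richer.

```python
# Hedged sketch: counting citations by theme, the simplest quantitative
# step in a citation analysis. The data below is hypothetical.
from collections import Counter

# Hypothetical records: (citing document, cited source, theme)
citations = [
    ("country-eval-1", "km-strategy", "knowledge sharing"),
    ("country-eval-2", "km-strategy", "knowledge sharing"),
    ("country-eval-2", "lessons-database", "learning"),
    ("country-eval-3", "lessons-database", "learning"),
    ("country-eval-3", "km-strategy", "knowledge sharing"),
]

# Tally how often each theme appears across the cited material
by_theme = Counter(theme for _, _, theme in citations)
print(by_theme.most_common())  # themes ranked by citation count
```

In practice the same tallies, aggregated across many documents, are what feed a thematic visualisation like the one shown in the image.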

A recent issue of the Policy and Politics journal has a special focus on influencing policy, mostly about the role of research and academia in this respect. There are many parallels to advocacy and policy influence work in general, and I found this particular lesson highly relevant:

“Avoid relying too much only on evidence and analyses, instead combine evidence with framing strategies and storytelling”

The Evaluation for Development blog from Zenda Ofir has been collating tips for young/emerging evaluators that even experienced evaluators will find interesting. Here are some highlights:
From Zenda herself:
Top Tip 1. Open your mind. Read
Top Tip 2. Be mindful and explicit about what frames and shapes your evaluative judgments.
Top Tip 3. Be open to what constitutes “credible evidence”.
Top Tip 4. Focus a good part of your evaluative activities on “understanding”.
Top Tip 5. Be or become a systems thinker who can also deal with some complexity concepts.
Read more about these tips>>

From Juha Uitto:
Top Tip 1. Think beyond individual interventions and their objectives.
Top Tip 2. Understand, deal with and assess choices and trade-offs made or that should have been made.
Top Tip 3. Methods should not drive evaluations.
Top Tip 4. Think about our interconnected world, and implore others to do the same.
Read more about these tips>>

From Benita Williams:
Top Tip 1. The cruel tyranny of deadlines.
Top Tip 2. Paralysis from juggling competing priorities.
Top Tip 3. Annoyance when you are the messenger who gets shot at.
Top Tip 4. Working with an evaluand that affects you emotionally.
Top Tip 5. Feeling rejected if you do not land an assignment.
Top Tip 6. Feeling demoralized when you work with people who do not understand evaluation.
Top Tip 7. Feeling discouraged because of wasted blood, sweat and tears.
Top Tip 8. Feeling lazy if you try to maintain work-life balance when other consultants seem to work 24/7.
Top Tip 9. Feeling overwhelmed by all of the skills and knowledge you should have.
Read more about these tips>>

In an evaluation of the Shifting the Power project, we were interested to see how local networks of NGOs had grown over the three years of the project. We were lucky that the project had mapped NGO networks at its start in 2015, so we repeated the mapping in early 2018; here you can see the results comparing 2015 to 2018 from Bangladesh – interesting data!
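One simple way to compare two snapshots of a network like this is network density – the share of possible ties between organisations that actually exist. The sketch below uses invented edge lists for five hypothetical NGOs; the real Shifting the Power mapping data is not reproduced here.

```python
# Hedged sketch: comparing the density of an NGO network mapped at two
# points in time. Node and edge data below is hypothetical.

def density(n_nodes: int, edges: set) -> float:
    """Share of possible undirected ties that actually exist."""
    possible = n_nodes * (n_nodes - 1) / 2
    return len(edges) / possible if possible else 0.0

# Hypothetical ties among five local NGOs (A-E) at baseline and endline
edges_2015 = {("A", "B"), ("B", "C")}
edges_2018 = {("A", "B"), ("B", "C"), ("A", "C"), ("C", "D"), ("D", "E")}

print(density(5, edges_2015))  # baseline: sparser network
print(density(5, edges_2018))  # endline: denser network
```

A rising density between the two mappings would be one (crude) quantitative signal that the network has grown tighter over the life of the project, alongside the richer qualitative picture.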