Slide 3: What are we 'evaluating'?
- Actions, programs, activities
- Conducted in a community setting, over a period of time
- Aimed at reducing deaths, injuries, and/or the events and behaviors that cause injuries
(Safety 2010 London; WHO VIP Webinar 2011)

Slide 5: Why do we 'evaluate'?
- To know for ourselves what works and whether we are doing some good
  - in performing some activity
  - in the community
  - in the country
- To convince funders and supporters that their investment is worthwhile
- To convince the community of the benefits of the multiple activities and actions carried out

Slide 6: Main purposes of evaluation
Evaluation helps determine:
- How well a program/policy works relative to its goals & objectives
- Why a program/policy did or didn't work, relative to the planned process
- How to restructure a program/policy to make it work, or work better
- Whether to change funding for a program

Notes: Throughout this presentation we usually use the word "program"; by that we mean programs, policies, efforts, and interventions. For example, coalition building is not a program per se, but it is clearly an activity that can be assessed and evaluated on how well it is being done relative to your goals for it.
(NSC Chicago 2010)

Slide 8: When should evaluation be considered?
- Evaluation needs to begin in, and be part of, the planning process...
- Otherwise, "if you do not know where you are going, it does not matter which way you go, and you will never know if you got there or not!" (paraphrasing Lewis Carroll, Alice in Wonderland)

Notes: One of our main messages today is that evaluation is not something you do after the fact. It is best done, and most easily done, when it is part of the planning process.
(Adapted from M. Garrettson)

Slide 9: Types of evaluation, depending on program phase
- Formative evaluation (program planning phase): How can the program activities be improved before implementation?
- Process evaluation (program implementation phase): How is/was the program being implemented?
- Impact/outcome evaluation (post-program phase): Did the program succeed in achieving the intended impact or outcome?

Slide 10: Cycle of program planning and evaluation
(Diagram of the planning and evaluation cycle; adapted from C. Runyan.)

Slide 13: Define target audience
- To whom is the program directed?
- Whose injuries need to be reduced?
- Who is the target of the program?
  - at-risk persons
  - caregivers (e.g. parents)
  - general public
  - media
  - decision makers

Notes: The people whose injuries you are trying to reduce and the target of the program may not be the same. If you are trying to reduce SBS, you are targeting the parents, not the babies.

Slide 18: Set goals & objectives
- Goal: a broad statement of what the program is trying to accomplish
- Objectives: specific, measurable, time-framed

Notes: This takes us back to Alice: where are you trying to get, and how are you planning to get there?

Slide 21: Haddon Matrix
A grid crossing the phases of an injury event (pre-event, event, post-event) with the factors that influence it (person, vehicle/vector, physical environment, social environment).

Notes: There are multiple models that can help you identify strategies; one often used in injury prevention is the Haddon Matrix. I'm not going to teach you how to use it, just show you what it is. This is the kind of thing we could help you with in the future.
(Haddon 1970, Am J Public Health)

Slide 22: 3-dimensional Haddon Matrix
Adds a third dimension of decision criteria to the phases (pre-event, event, post-event) and factors (person, vehicle/vector, physical environment, social environment): effectiveness, cost, feasibility, preferences, stigmatization, equity, freedom, and others.

Notes: The Haddon Matrix lets you look at the different phases of an injury event and the different kinds of factors that influence it. It can also help you apply other criteria, such as cost and equity, in choosing a strategy. For example, I mentioned champions before as a factor that might influence your choice; they can greatly improve the feasibility of actually carrying out a particular strategy or program.
(Runyan 1998, Injury Prevention)

Slide 24: Formative evaluation
Questions it answers:
- What is the best way to influence the target population?
- Will the activities reach the people intended, and be understood and accepted by the target population?
- How can activities be improved?

Why it's useful:
- Improves (pilot-tests) program activities before full-scale implementation
- May increase the likelihood that a program or policy will succeed
- May help stretch resources

Notes: Market research examples: the Chevy Nova; a questionnaire with Alaska Native communities about parenting practices (you can't ask yes/no questions). What we are doing now is formative evaluation: we are gathering information about our target audience (SC coalitions) to develop a program of evaluation support, and we are piloting this support in the US before trying to roll it out on a more global scale.
(Modified from Thompson & McClintock, 2000)

Slide 26: Implementation
- As planned, with attention to detail
- Documented clearly so others can replicate if appropriate

Notes: Enough said.

Slide 27: Cycle of program planning and evaluation (where evaluation fits)
Diagram labels: identify problem & population; define target audience; identify resources; set goals/objectives; choose strategies; test & refine implementation (formative evaluation); implement; evaluation: process, impact, outcome; disseminate.

Notes: So now we get to the part of evaluation that most people think about. Let's start with process evaluation.
(NSC Chicago 2010)

Slide 28: Process evaluation
Its purpose is to address:
- What was done?
- How was it implemented?
- How well was it implemented?
- Was it implemented as planned?

Slide 29: Process evaluation: examples of questions
- Who carried out the intervention? Was this the appropriate person or group?
- Who supported and who opposed the intervention?
- What methods and activities were used?

Notes: You ask these questions along the way, as part of the continuous quality improvement of your effort, and also in retrospect over a period of time, to help you understand any outcomes that you are or are not getting.

Slide 30: Process evaluation: why is it useful?
- Allows replication of programs that work.
- Helps understand why programs fail.

Notes: Process evaluation helps you know exactly what happened, so you can replicate it if you have found positive outcomes. It also helps you understand what did or did not happen in the implementation, so that a "failure" can be attributed either to a program that doesn't work or to a program that wasn't actually implemented as planned. In other words, it helps keep you from throwing the baby out with the bathwater. (Close to Home example.)
(Modified from Thompson & McClintock, 2000)

Slide 31: The intervention cannot be a black box
It must be clearly understood: idea → ? → outcome.

Notes: Think of it this way: if you are going to have heart surgery, you certainly don't want your surgeon to have been told only to (1) cut you open and (2) replace a valve. You want them to have a very detailed understanding of every little step.

Slide 34: Using impact measures for establishing effectiveness
- Suppose we have a public safety campaign as our strategy.
- We need to show: Campaign → Behavior → Outcome.
- If we have already demonstrated that Behavior → Outcome, we simply need to show: Campaign → Behavior.

Notes: You can then decide whether showing these impact changes is sufficient. If the correlation between the impacts and the outcomes is strong, the impacts can be used as a proxy.
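The proxy logic above can be made concrete with a small worked calculation. All figures below are invented for illustration; only the arithmetic is the point: once the Behavior → Outcome link is established elsewhere, the campaign evaluation needs only to measure the behavior change.

```python
# Illustrative proxy calculation -- every figure here is invented.
# Suppose prior research has established the Behavior -> Outcome link:
# each percentage-point rise in seat-belt use prevents roughly 0.8
# serious injuries per 100,000 population per year (assumed figure).
injuries_prevented_per_point = 0.8

# The campaign evaluation then only has to demonstrate Campaign -> Behavior:
belt_use_before = 62.0   # percent, pre-campaign survey
belt_use_after = 70.0    # percent, post-campaign survey
behavior_change = belt_use_after - belt_use_before   # 8 percentage points

# The established link converts the impact measure into a projected outcome.
projected_reduction = behavior_change * injuries_prevented_per_point
```

This is exactly the "proxy" use described in the notes: it is defensible only when the behavior-outcome correlation is strong and was demonstrated independently of the campaign.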

Slide 37: Evaluation: examples of questions for a local smoke-alarm policy
Did the local policy requiring smoke alarms in apartments...
- get passed?
- Were people aware of it?
- Did people have access to smoke alarms?
- Did people get them installed properly?
- Do people keep them maintained?
- Did it lead to a reduction in the number or rates of:
  - events (e.g. apartment fires)?
  - injuries?
  - deaths?
  - costs (e.g. burn center costs, family burden, property loss)?

Notes: Suppose you are working to pass a policy mandating that all apartment owners install smoke alarms; there are lots of questions you can ask. What kind of evaluation are these first questions?

Slide 42: Dissemination
Dissemination not done well:
- Not attempted
- Not based on research about how to disseminate information to the intended audience

Dissemination done well:
- Defining the audience
- Knowing how to reach the audience
- Knowing how best to communicate the change message to them
- Presenting clear, straightforward messages

Slide 43: Evaluation measures
Evaluation measures lead to evidence of effectiveness, but only if the research and study methodologies, and the statistical analysis methodologies, are appropriate and understandable enough to convince the funders and supporters, the skeptics, the stakeholders, and the community.

Notes: This might be the place for an exercise, so participants can think about ways to measure process, impact, and outcome for their specific topic. Then you can help them think about the ease or difficulty of getting those types of measures from secondary sources or primary data collection. At a minimum, get them to understand what they would want to measure.

Slide 45: Randomized controlled trial (RCT) / experiment: 'strongest' evidence
Design notation (O = observation, X = intervention), with randomized allocation to groups:
  Intervention group:  O  X   O
  Control group:       O  X'  O   (X' = alternative or no intervention)

Notes: I find that students are often confused about intervention and control groups versus case-control designs; your audience may know that jargon better, or not at all, so I always have to explain it. They also often don't understand the difference between random selection and randomization.

Slide 46: Quasi-experimental designs: 'qualified' evidence
  Intervention group:  O  X  O
  Comparison group:    O     O

Notes: I always make a point of explaining that the groups can be individuals or whole communities.

Slide 47: One group pre/post: 'weak' evidence
  Intervention group:  O  X  O

Notes: I used to teach this before the quasi-experimental design, but have found that students more easily see the flaws in this design if they work backwards from the ideal.

Slide 48: One group, multiple pre / multiple post: better 'weak' evidence
  Intervention group:  O O O  X  O O O O O

Slide 49: One group, post only: 'basically ignorable' evidence
  Intervention group:  X  O

Notes: When I present this, I point out that this is often what is done, but don't do it, and leave it at that. By this point, if they have understood the prior three or four slides, they will get it. You might want to finish with a review of why we evaluate, and of how evaluation has many types and is part of the planning cycle. Also help them understand that this was just a taste: you don't expect them to do evaluation based on this presentation alone, and they will need help and much more training. Knowing how to add and solve simple equations is necessary but not sufficient for doing statistics; the same holds for evaluation, which has many elements and subtleties. People take whole courses on this and spend their careers doing it. You want them to have neither a false sense of security nor too much discouragement. It is learnable: their job is to be clear about what they want to accomplish, make sure it is measurable, and engage evaluation experts to help with the details, just as one would engage a statistician to help with analysis planning and implementation.
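Why the single-group designs mislead can be shown with a deterministic toy calculation (all numbers invented): if injury rates are falling everywhere anyway, a one-group pre/post comparison credits that secular trend to the program, while the quasi-experimental comparison group subtracts it out.

```python
# Toy illustration -- all numbers invented. Injury rates (per 100,000)
# fall by a secular trend of 2 per year in every community, and the
# program itself prevents a further 5.
baseline = 50.0
trend = 2.0
program_effect = 5.0

pre_i, post_i = baseline, baseline - trend - program_effect  # intervention community
pre_c, post_c = baseline, baseline - trend                   # comparison community

# One group pre/post (O X O): the secular trend is wrongly credited to the program.
one_group_estimate = pre_i - post_i                          # 7.0, overstated

# Quasi-experimental (O X O vs. O O): the comparison group removes the trend.
controlled_estimate = (pre_i - post_i) - (pre_c - post_c)    # 5.0, the true effect
```

The one-group, post-only design is worse still: with only X O there is no pre-measure at all, so not even the biased difference can be computed.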

Slide 50: Observational designs: cohort study: evidence?
  Self-chosen intervention group:      X  X  X
  Self-chosen non-intervention group:  O  O  O

Slide 51: Observational designs: case-control study: evidence?
  Cases:     X  O
  Controls:  X  O

Slide 52: Observational designs: cross-sectional study: evidence?
  Injured:      X X O X O
  Non-injured:  O X O O O

Slide 53: Statistical analysis methodologies
- The choice is often guided by what has been done previously, what is feasible to do, or what is easy to explain.
- The choice should be tailored to the audience and their ability to understand the results, but also to the presenter's ability to explain the methodology.

Slide 54: Statistical analysis
- Determined by the research question(s)
- Guided by study design, experimental or observational:
  - group randomized controlled experiment
  - non-randomized comparison study
  - single-site pre/post; surveillance study
  - retrospective or cross-sectional study
- Guided by whether the outcome is studied at a single time point or at multiple time points (time series analyses)
- Guided by the audience (visual and descriptive presentation)

Slide 58: Systematic reviews
A protocol-driven, comprehensive review and synthesis of data focusing on a topic or on related key questions:
- formulate specific key questions
- develop a protocol
- refine the questions of interest
- conduct a literature search for evidence
- select studies that meet the inclusion criteria
- appraise the studies critically
- synthesize and interpret the results

Slide 60: Systematic reviews
- Of particular value in bringing together a number of separately conducted studies, sometimes with conflicting findings, and synthesizing their results.
- To this end, systematic reviews may or may not include a statistical synthesis called meta-analysis, depending on whether the studies are similar enough that combining their results is meaningful.
(Zaza et al. (2001) Am J Preventive Medicine, motor vehicles; Green (2005) Singapore Medical Journal)

Slide 61: Meta-analysis
- A method of combining the results of studies quantitatively to obtain a summary estimate of the effect of an intervention
- Often restricted to randomized controlled trials
- More recently, the Cochrane Collaboration has been 'branching out' to include both experimental and observational studies in meta-analyses

Slide 63: Meta-analysis
The combining of results should take into account:
- the 'quality' of the studies, assessed by the reciprocal of the within-study variance
- the 'heterogeneity' among the studies, assessed by the between-study variance

Slide 64: Meta-analysis: estimation of effect
- The pooled estimate is a weighted average, where the weight of a study is the reciprocal of its variance.
- To calculate the variance of a study, one can use either a 'fixed' effects model or a 'mixed'/'random' effects model:
  - Fixed effects model: uses no information from other studies
  - Random effects model: considers the variance both among and within studies
(Borenstein et al. (2009) Introduction to Meta-Analysis)
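The fixed-effect weighted average described above can be sketched in a few lines. The effect estimates and variances below are invented for illustration (think of them as log odds ratios from three hypothetical studies):

```python
# Invented summary data from three hypothetical studies:
# effect estimates (e.g. log odds ratios) and their within-study variances.
effects = [-0.70, 0.05, -0.40]
variances = [0.04, 0.09, 0.06]

# Fixed-effect model: each study's weight is the reciprocal of its variance,
# so more precise studies count for more in the pooled estimate.
weights = [1.0 / v for v in variances]
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
pooled_variance = 1.0 / sum(weights)
```

Note how the pooled variance is smaller than any single study's variance, which is the point of pooling; the fixed-effect weights, as the slide says, use no information about between-study variability.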

Slide 65: Meta-analysis & meta-regression
Dealing with 'heterogeneity' among the studies:
- Decompose the total variance into among-study and within-study components, using mixed effects models, to obtain a more precise estimate of the intervention effect.
- If residual heterogeneity remains, expand the mixed effects model to include study-level covariates that may explain some of the residual variability among studies: meta-regression.
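One common way to carry out this among/within decomposition is the DerSimonian-Laird estimator of the between-study variance. The sketch below uses the same style of invented study data; meta-regression would additionally add study-level covariates to the model, which is omitted here:

```python
# DerSimonian-Laird decomposition -- invented data, for illustration only.
effects = [-0.70, 0.05, -0.40]    # study effect estimates
variances = [0.04, 0.09, 0.06]    # within-study variances

w = [1.0 / v for v in variances]                        # fixed-effect weights
fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)

# Cochran's Q measures heterogeneity among the studies.
Q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
df = len(effects) - 1
c = sum(w) - sum(wi * wi for wi in w) / sum(w)
tau2 = max(0.0, (Q - df) / c)     # estimated among-study (between-study) variance

# Random-effects weights add tau^2 to each within-study variance,
# so extreme heterogeneity pulls the weights toward equality.
w_re = [1.0 / (v + tau2) for v in variances]
pooled_re = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
```

With these invented numbers the heterogeneity is non-zero, so the random-effects estimate differs from the fixed-effect one and carries a wider confidence interval, reflecting the among-study variability the slide describes.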

Slide 67: Meta-analysis
- Standard meta-analytic methods are typically restricted to comparisons of two interventions using direct, head-to-head evidence alone.
- For example, if we are interested in the comparison of Intervention A versus Intervention B, we would include only studies that compare A with B directly.
- Often we have multiple types of interventions for the same type of problem, yet few or no head-to-head comparisons.
- We may also have multiple-component interventions.

Slide 68: Mixed treatment meta-analysis
- Let the outcome variable be a binary response: 1 = positive response, 0 = negative response.
- We can record the binomial counts out of a total number at risk on the kth intervention in the jth study.
- We can then calculate the estimated probability of the outcome (risk of response) for the kth intervention in the jth study.
(Welton et al. 2009, Am J Epidemiology)
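The counts-to-risk step above is simple to sketch. The counts below are invented; the full mixed treatment comparison model of Welton et al. would go on to combine these risks across studies and interventions (e.g. on the log-odds scale), which is beyond this sketch:

```python
# Invented binomial counts: r[j][k] positive responses out of n[j][k]
# participants at risk on intervention k in study j.
r = [[30, 18],
     [45, 25]]
n = [[100, 95],
     [150, 140]]

# Estimated probability (risk) of a positive response
# for intervention k in study j.
p = [[r_jk / n_jk for r_jk, n_jk in zip(r_j, n_j)]
     for r_j, n_j in zip(r, n)]
```

These per-study, per-intervention risks are the raw material the network model works with, even when no single study compares every pair of interventions head to head.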

Slide 72: Statistical analysis
- Methodology does exist for developing stronger collective evidence and for evaluating the effectiveness of community-based interventions, using different types of study designs and interventions.
- This is how "practice-based evidence" is developed.

Slide 73: Dissemination
- We should not stop at developing the evidence.
- We must work alongside economists to develop ways of effectively communicating 'what works': methodology and cost models do exist for estimating the "return on investment".
- Money talks!

Slide 74: Challenges: evaluation requires
- Integration of evaluation from the beginning
- Appropriate measures that can be collected objectively, without bias, easily, and completely
- Appropriate qualitative and process information to complement the quantitative information
- Concrete and convincing evidence of which aspects work in individual communities
- Formal methodological statistical evaluation of specific elements of programs
- Collective evidence of which common elements of programs work
- Effective dissemination strategies: "return on investment"

Slide 75: We are really looking forward to working with you

Notes: We will have a brief session tomorrow to share and discuss what we have to offer more specifically and what that might look like.
