Methods

What did you do?

Describes ethical aspects of implementing and studying the improvement (for example, privacy concerns, protection of participants' physical well-being, and potential author conflicts of interest) and how those concerns were addressed

Describes the intervention and its component parts in sufficient detail that others could reproduce it

Indicates main factors that contributed to choice of the specific intervention (for example, analysis of causes of dysfunction; matching relevant improvement experience of others with the local situation)

Outlines initial plans for how the intervention was to be implemented: for example, what was to be done (initial steps; functions to be accomplished by those steps; how tests of change would be used to modify the intervention), and by whom (intended roles, qualifications, and training of staff)

Outlines plans for assessing how well the intervention was implemented (dose or intensity of exposure)

Describes mechanisms by which intervention components were expected to cause changes, and plans for testing whether those mechanisms were effective

Identifies the study design (for example, observational, quasi-experimental, experimental) chosen for measuring impact of the intervention on primary and secondary outcomes, if applicable

Explains plans for implementing essential aspects of the chosen study design, as described in publication guidelines for specific designs, if applicable (see, for example, http://www.equator-network.org)

Describes aspects of the study design that specifically concerned internal validity (integrity of the data) and external validity (generalizability)

Describes instruments and procedures (qualitative, quantitative, or mixed) used to assess a) the effectiveness of implementation, b) the contributions of intervention components and context factors to effectiveness of the intervention, and c) primary and secondary outcomes

Reports efforts to validate and test reliability of assessment instruments

Explores factors that could affect generalizability (external validity), for example: representativeness of participants; effectiveness of implementation; dose-response effects; features of local care setting

Addresses likelihood that observed gains may weaken over time, and describes plans, if any, for monitoring and maintaining improvement; explicitly states if such planning was not done

Reviews efforts made to minimize and adjust for study limitations

Assesses the effect of study limitations on interpretation and application of results

Explores possible reasons for differences between observed and expected outcomes

Draws inferences consistent with the strength of the data about causal mechanisms and the size of observed changes, paying particular attention to the components of the intervention and the context factors that helped determine its effectiveness (or lack thereof), and to the types of settings in which this intervention is most likely to be effective

Suggests steps that might be modified to improve future performance

Reviews issues of opportunity cost and actual financial cost of the intervention