Determining Measurement Methods and Data Sources

Once the evaluator, in collaboration with the stakeholders, has identified the appropriate type and scope of the evaluation and prepared or reviewed program information and theory, work on the evaluation design can begin. The evaluation literature and experts sometimes differentiate between an evaluation design (exactly how the evaluation will be constructed, particularly what will be compared and how achievements will be determined) and an evaluation plan (the systematic process of conducting public health and social program evaluations). In this guide, we use evaluation design to mean the construction of the evaluation. The evaluation plan is the document that describes the design, discusses and addresses issues, defines roles and responsibilities, and guides the evaluation implementation process.

The basic evaluation design types include experimental, quasi-experimental, and nonexperimental designs. The design specifies how the evaluation is constructed to show the effectiveness of the program intervention. Implementing the more rigorous of these designs usually requires staff or consultants with specific training and experience in epidemiology or experimental evaluation. See Appendix V-A for a description of these designs.

When developing an evaluation design, be sure to do the following:

• Provide background by conducting a literature search to identify evaluations of similar programs and the evaluation methods they used.

• Articulate the program theory or hypothesis and identify what will be measured. Define the standards and measures that will be used. Determine how factors or events external to the program may affect either the implementation or the outcomes of the planned program, and whether and how the evaluation can control for those factors.

• Identify sources of data (quantitative and qualitative).

• Select the intervention measurement methods/data collection instruments.
• Determine the process and timeline for data collection.

• Develop policies and procedures to protect client privacy and guarantee the confidentiality of client-specific information.

• Determine data analysis methods (to the extent possible at this time).

As discussed previously, in multifaceted public health programs, such as an MCH county program responsible for implementing a variety of interventions, the evaluation will generally be a performance monitoring evaluation that includes process evaluation and monitoring of outcomes. In this case, the evaluation design includes a measurement of change from a baseline or a simple comparison to an objective or standard. Sometimes a more rigorous outcome evaluation of a specific component program will be needed.
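The baseline-versus-objective comparison described above can be sketched in a few lines of code. This is a minimal illustration only; the indicator name, figures, and helper functions are hypothetical and not drawn from the guide, which does not prescribe any particular software.

```python
# Hypothetical sketch of a performance monitoring comparison:
# measure change from a baseline and compare the current value
# to a program objective. All names and figures are illustrative.

def change_from_baseline(baseline: float, current: float) -> float:
    """Absolute change in the indicator since the baseline measurement."""
    return current - baseline

def meets_objective(current: float, target: float,
                    higher_is_better: bool = True) -> bool:
    """Simple comparison of the current value to a program objective."""
    return current >= target if higher_is_better else current <= target

# Illustrative indicator: percentage of clients receiving a timely service.
baseline_pct = 62.0   # measured before the intervention
current_pct = 71.5    # measured during monitoring
target_pct = 75.0     # objective set in the evaluation plan

change = change_from_baseline(baseline_pct, current_pct)
print(f"Change from baseline: {change:+.1f} percentage points")
print("Objective met" if meets_objective(current_pct, target_pct)
      else "Objective not yet met")
```

A monitoring evaluation of this kind reports progress against the standard rather than attributing causation; demonstrating that the program caused the change would require one of the more rigorous designs described in Appendix V-A.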