Evaluation Tips

This KnowledgeBase archive includes content and external links that were accurate and relevant as of September 30, 2019.

Designing the Evaluation

  • Focus initial evaluation efforts on implementation questions; the findings can be used to improve program operations.
  • Don't forget to plan and schedule the collection of baseline data.
  • Plan to use existing data whenever possible and to integrate evaluation data collection into existing procedures.
  • Coordinate the data collection and reporting requirements of your Comprehensive School Reform Demonstration (CSRD) program with those of other programs already in place. For example, the reporting requirements of Title I clearly overlap with CSRD requirements.
  • Look for opportunities to use the same measures to answer multiple evaluation questions.

Getting Started

  • To make ongoing data collection more efficient, create a list or database of contact information for the individuals and groups who will serve as data sources.
  • Carefully plan data analysis procedures at the outset. If this expertise is not available in-house, consider hiring a consultant to start on the right foot.
  • Document data collection and data analysis procedures early in the evaluation process. You will need this information for reporting and for making revisions to the evaluation plan.

Analysis and Interpretation

  • Don't rely solely on statistical analysis. Other forms of analysis, such as qualitative analysis of interviews or observations, may better serve your needs.
  • Allow sufficient time to conduct thoughtful and in-depth analyses.
  • As a team, ask yourselves: Do the results make sense? What are the possible explanations for the findings? How will the results help us decide what actions will improve the program?
  • Involve others in interpreting the results to gain insights from their experiences and to maintain their excitement about and involvement in the evaluation.

Reporting Your Findings

  • Consider creating a master data file or report that compiles the local context, program characteristics, baseline data, and evaluation findings to date. Once developed, the file is relatively easy to update with new data and findings.
  • Use the master data file to create reports and presentations tailored to key audiences. Consider developing at least the following:
    • a written executive summary of the findings
    • an oral presentation supported with overheads or a computer-based presentation
    • a briefing paper that highlights key evaluation findings and recommendations
  • Don't forget to include recommendations for improvement in your reports and presentations. They are the blueprint for program improvement.
  • Be sure that findings are tailored for use by specific groups.
  • In your presentations and reports, be specific and give examples of what might be done in the future.
  • Let each stakeholder group know that using the findings is important and that you will be checking back to see if you can be of assistance.
  • Always follow up with key members of stakeholder groups to gain new insights.

Source:

Cicchinelli, L. F., & Barley, Z. (1999). Evaluating for Success: An Evaluation Guide for Schools and Districts.
