
Data Collection and Evaluation

It is important to develop an evaluation design in the planning stages of your program. Evaluation is an ongoing process that you will use to understand your program and make continuous improvements. Evaluation involves collecting data pertinent to your program goals and analyzing that data to determine whether your program is achieving those goals. This section will assist you in setting up a plan for evaluation. Current Nebraska 21st CCLC grantees can access additional evaluation information under the My 21st CCLC tab.

FIRST STEPS

  • Meet with your management team to design your evaluation plan.

  • Appoint your evaluator.

  • Understand the reporting requirements of your grant (if applicable).

  • Review your program goals and decide what questions you need to answer.

  • Determine your methods of data collection.

  • Develop an ongoing evaluation plan to promote continuous improvement.

DESIGN YOUR EVALUATION PLAN
Moving Towards Success: Framework for After-School Programs is a great resource for developing an evaluation plan. This document discusses the theory of change approach to evaluation planning, continuous improvement, and communication among partners. It outlines the steps of the theory of change approach:

  • Use the logic model to assess your program's goals, elements and outcomes. For more information on the logic model, visit http://www.collaborativecommunications.com/assets/78_framework.pdf.

  • Refine and identify program goals to meet the needs of participants.

  • Refine and select program elements needed to achieve program goals.

  • Brainstorm and refine participant outcomes aligned with program goals and elements.

  • Establish performance measures, data sources and data collection methods to assess implementation of program elements and progress toward program goals.

SELECT YOUR EVALUATOR

  • Recruit a qualified evaluator. You will want the evaluator to be involved in the development of your evaluation plan. Check with your school district to see if they have contracted an evaluator for other programs. That evaluator may be able to evaluate your program as well. Local universities may also have experienced evaluators.

  • Determine the evaluator job responsibilities.

  • Budget for the local evaluator's compensation. Some districts may have an in-house evaluator who will provide services as an in-kind contribution. Other programs hire an external evaluator and allocate up to 10% of the budget for evaluation costs, according to the Beyond the Bell Toolkit.

  • Conduct interviews with evaluator candidates. For examples, see the sample evaluator interview questions.

REVIEW GOALS AND CHOOSE INDICATORS

  • The management team should review and confirm the goals established by the planning committee. The program may have one goal or several goals, depending on the needs of the program.

  • Discuss how the program's activities will contribute to reaching these goals.

  • Identify indicators for each of the program's goals. An indicator is a quantified measurement that can be taken repeatedly over time to track progress. Your group may brainstorm many indicators before deciding on one or two to measure.

For example, your program's goal may be to improve academic performance. Specifically, you want to improve homework completion for participating students. To reach this goal, your program will offer homework help as an afterschool activity from 4:00-5:00 P.M., led by a certified teacher. Your indicator will be the number of completed assignments, as reported by classroom teachers, measured repeatedly to show whether completion is increasing.
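As a sketch of how such an indicator might be tracked over time, the snippet below computes a weekly homework-completion rate. The weekly counts are invented for illustration; in practice the numbers would come from classroom teachers' reports.

```python
# Hypothetical example: tracking a homework-completion indicator over time.
# The weekly counts below are invented for illustration only.

weekly_reports = [
    {"week": 1, "assigned": 40, "completed": 26},
    {"week": 2, "assigned": 40, "completed": 29},
    {"week": 3, "assigned": 38, "completed": 31},
]

def completion_rate(report):
    """Return the share of assigned homework that was completed."""
    return report["completed"] / report["assigned"]

for report in weekly_reports:
    rate = completion_rate(report)
    print(f"Week {report['week']}: {rate:.0%} of assignments completed")
```

Because the same measurement is taken each week, the evaluator can see whether completion is trending toward the goal rather than judging from a single snapshot.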

Tips for indicators:

  • Be sure the indicator is relevant to the goal.

  • Be certain that the data necessary to analyze the indicator can be collected.

  • For additional tips, see Beyond the Bell Toolkit- Chapter 3: Evaluation (http://www.beyondthebell.org/index.php).

DATA COLLECTION
There are various ways to collect data. Your method of collecting data will depend on your indicator. Here are some common ways to collect data:

  • Attendance should be taken at the start of the program each day. Preferably, enter attendance data electronically each day to stay current with data entry. Also, back up the data and save hard copies of all attendance files. See the sample Beyond the Bell Monthly Attendance Record.

  • Surveys can be given to a range of stakeholders such as community partners, parents, students, program staff and school staff. See these Survey Tips for suggestions.

  • Interviews, like surveys, can be given to a range of stakeholders. Interviews provide the opportunity to ask follow-up questions and gather more detailed information.

  • Information reports from the school district can provide essential information for evaluating your program (grades, test scores, crime statistics, detention reports, attendance records). If you require additional information that the school district doesn't provide, ask for that information on enrollment forms. Be sure to practice confidentiality with student information.

  • Miscellaneous information can provide a different snapshot of your program for your evaluator. Save newsletters, meeting minutes, photographs, staff journals, activity calendars, student portfolios and other projects.

  • Observation methods can address important issues such as program environment, safety and wellness, program administration, relationships, interactions, professional development opportunities, program activities, student engagement, program sustainability and school/community/family partnerships.

TIPS FOR AN ONGOING EVALUATION PLAN

  • Develop an evaluation calendar for the year. This will help your program stay on track for your evaluation activities such as data collection, written reports and grant requirements.

  • After the evaluator processes the data and completes written reports, review the evaluation with the management team to discuss the findings. Discuss whether or not the program reached its goals. Brainstorm reasons for your outcomes. Decide on future changes to program practices or data collection.

  • Inform stakeholders of the outcomes from the evaluation. Discuss future goals with program staff. Review results with community partners. Inform parents and staff about any changes based on evaluation results. Share success stories with the community.

WEB RESOURCES FOR EVALUATION
Afterschool Resources offers sample surveys and other instruments for measuring a variety of issues. It also provides information on research and evaluation studies. http://www.afterschoolresources.org/index.html.

Child Care and Early Education Research Connections provides a link to The Quality of School-Age Child Care in Afterschool Settings, by Priscilla Little. This resource identifies the characteristics of high-quality afterschool programs, examines the research linking program quality to positive outcomes and reviews measures used in program quality assessment. http://www.researchconnections.org/SendPdf?resourceId=12576.

C.S. Mott Foundation provides a working document titled Moving Towards Success: Framework for After-School Programs. This framework is designed to be a tool to help program staff and evaluators develop a long-term strategic plan for program development, program improvement and program effectiveness. http://www.collaborativecommunications.com/assets/78_framework.pdf.

Analyzing Qualitative Data, by Ellen Taylor-Powell and Marcus Renner, explores ways of examining narrative data and describes the analysis process. Related resources on quantitative data, planning program evaluation and questionnaire design can be purchased at http://learningstore.uwex.edu/Default.aspx.