What Is Program Evaluation?

Annual reports.

If you shuddered when you read that phrase, you’re not alone. Many program directors feel that their stress levels rise as annual report deadlines approach. Funders want to know how organizations are stewarding the grant money they have been given and what impact their investment is making. One day, you may scramble to find survey data, testimonials, and a few anecdotes that you can add to your report. The next day, you may be seeing double from hours spent analyzing your budget spreadsheet.

One way to lighten your load going into annual report season is to develop and implement a program evaluation strategy that ensures you have the information you need to complete your report on time.

Defining Program Evaluation

In Planning Programs for Adult Learners: A Practical Guide, Rosemary S. Caffarella and Sandra Ratcliff Daffron defined program evaluation as

a process used to determine whether the design and delivery of a program were effective and whether the proposed outcomes were met…. The central purposes that drive evaluation processes are gathering and analyzing data for decision making and accountability.

Caffarella and Daffron distinguish between “systematic” and “informal” evaluation approaches. Systematic evaluation is planned and often relies on trusted research methods to gather quality data. Administering a feedback survey at the end of a class is an example of systematic evaluation. Informal evaluation is unplanned and often responsive to real-time experiences in a program. Noticing how participants engage in a small group discussion is a type of informal evaluation.

While some funders may ask you to report on program changes you made based on informal evaluation, many are more interested in what you learned from systematic evaluation activities and what, if any, changes you made as a result.

Two Types of Systematic Evaluation Data

There are two types of systematic evaluation data: process data and impact data. Process data gives you information about the delivery or implementation of your key activities.

  • Did the application process run smoothly?
  • What technological components need work?
  • Did participants find the speakers engaging?
  • How can we improve communication with the participants?

Impact data gives you information about the effect your key activities had on the participants.

  • What did the participants learn as a result of participating?
  • How have participants changed their behavior as a result of the program?
  • In what ways are participants thinking differently in light of engaging with our program materials?

You can collect both types of data at the same time. For example, a single feedback survey can gather both process and impact data. When you design such a survey, include your program outcomes, because impact data should always link back to hoped-for outcomes.

Using Process and Impact Data in Your Report

You can use both process and impact data in your annual report. If a funder asks, “Was your program successful?” you could say something like this:

Feedback surveys administered after the workshops indicated that the workshops were very effective in helping participants identify their strengths, craft a compelling vision of their future, and develop goals to help them move toward their future. In fact, 92% of participants said that, as a result of attending the workshop, they feel motivated to achieve their goals. One person shared, “Thinking about my future in this way was new to me and a little scary at first, but I’m so excited to begin making progress on my goals because I think this could be attainable!” We look forward to following up with workshop participants later this year to understand the longer-term impact of the program.

With respect to program improvements, some participants noted that the registration process was complicated, so we are working with our IT team to streamline it. We also learned that one of our facilitators struggled to communicate clearly in the workshops, and we are providing her with additional mentoring, training, and support.