What is Program Evaluation?
The overall purpose of an evaluation of a social service organization's programs and services is to determine their effectiveness and to identify approaches for improving those programs and services as they continue to develop and evolve.
The evaluation plan provides a means for the organization to assess the degree of success it achieves in meeting:
• the goals of the organization;
• the goals of a parent or partner organization; and
• the goals of the individual programs and services.
This will allow the organization to:
• meet the requirements of any review or accreditation procedures;
• assess the degree to which client needs are being met;
• determine the congruence of service provided with funder expectations; and
• identify the degree to which organization activities are meeting community partner needs and expectations.
The evaluation plan complements and builds upon the planning and direction provided by other organizational activities:
• the strategic plan;
• program service logic models;
• Continuous Quality Improvement Assessments; and
• Risk Management Assessments.
Practical Approach to Evaluation
While a formal evaluation can be a complex, costly, and time-consuming exercise, there are basic approaches that can provide practical information to management in the organization. The approach used to evaluate programs and services must be aligned with the purpose of the evaluation and with the goals and values of the organization.
It is appropriate to use a “mixed method” approach to evaluation. This means that, as relevant to the program/service evaluated, the approach should be:
• Outcome/objective oriented: The focus is on making clear the goals and objectives, and measuring how the program has done in reaching them.
• Client/participant oriented: Program participants and stakeholders are key sources of both questions to be answered by the evaluation and of information to answer those questions. This information is useful for program improvement purposes.
Evaluation Questions
Evaluation questions are the specific questions an evaluation will answer. Being clear about these questions is key to obtaining an evaluation that meets the needs of the organization. There are specific evaluation questions for each program/service.
Identifying these questions begins with an examination of the service logic model for each program/service. This logic model describes the links between activities and expected outcomes, and raises obvious evaluation questions.
Stakeholders provide another major source of appropriate evaluation questions. The evaluation questions of interest to the various stakeholders associated with a specific program or service have to be identified and prioritized. Participants, service delivery staff, program managers, senior management, the Advisory Council, funders, community partners – each has a different perspective.
Questions to be addressed are, of course, limited by resources and practicality. Useful questions typically fall into the following categories:
• Planning and implementation: E.g. Are those in most need of help receiving services? Are community members satisfied that the program meets community needs? Why do clients enter, and why do they leave?
• Achievement of objectives: E.g. How well has the program met its stated objectives? How many people participated? How many hours of service were delivered?
• Impact on participants: E.g. How much and what kind of a difference has the program made for its participants? How has behaviour changed as a result of involvement in the program? Are participants satisfied with the experience? Were there any negative results from participation?
• Outcomes: E.g. What resulted from the program? Were there any negative results from the program? Do the benefits outweigh the cost?
Data Collection
A procedure has to be confirmed or set in place to ensure consistent, ongoing collection and analysis of information for use in decision making. Data collection is not a stand-alone activity, nor does it occur only at the end of the program. Evaluation is ongoing, and assists staff and stakeholders in making effective decisions to continuously strengthen and improve the program/service.
In deciding on appropriate data collection for a specific program or service, a first step is to examine the data already collected through routine program activities. A supplementary data collection system may need to be put in place if it will answer evaluation questions not already fully addressed. Data collection methods should be appropriate to the information to be collected, and can include both qualitative and quantitative measurement.
Baseline data on key outcome and implementation areas is collected first; this provides a performance standard against which later results can be compared.
Deciding on short-term outcomes will allow staff and managers to use information collected during the period of evaluation to make decisions about program/service performance and to make changes to improve the program if required.
During the evaluation period, information relating to short and long term goals and to other pertinent evaluation questions continues to be collected.
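The document does not prescribe any particular tool for this comparison; as a purely illustrative sketch, a simple tally of collected outcome measures against the baseline performance standard might look as follows. All field names and figures here are hypothetical examples, not data from any actual program.

```python
# Illustrative only: compare hypothetical baseline and follow-up outcome
# measures collected during an evaluation period.
from statistics import mean

# Hypothetical client outcome scores (e.g. a 0-10 well-being scale)
baseline_scores = [4.0, 5.5, 3.5, 6.0, 4.5]   # collected before the program
followup_scores = [6.5, 7.0, 5.0, 7.5, 6.0]   # collected during evaluation

baseline_avg = mean(baseline_scores)
followup_avg = mean(followup_scores)
change = followup_avg - baseline_avg           # positive = improvement

print(f"Baseline average:  {baseline_avg:.1f}")
print(f"Follow-up average: {followup_avg:.1f}")
print(f"Average change:    {change:+.1f}")
```

A real evaluation would of course use the program's own indicators and would combine quantitative summaries like this with qualitative information from participants and stakeholders.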
Analysis and Reporting
At the end of the planned evaluation period for each program or service, the information and data collected to address the evaluation questions are analyzed and interpreted. Conclusions based on this analysis lead to recommendations to, and by, senior management and, where applicable, the Advisory Council or Board about decisions that must be made about the program or service.
Part of this final evaluation includes evaluating the evaluation itself, identifying strengths and weaknesses in how it was carried out so that future evaluations can be improved.
Specified staff are assigned responsibility to ensure that identified challenges are addressed, and that an appropriate action plan is in place.
Conclusions and recommendations are formally documented, and communicated to relevant stakeholders, including funders. This information is also incorporated into the overall strategic planning cycle for the organization. The improvement of programs and practices is an agenda item for the Advisory Council or Board.