Program Evaluation for the Homelessness Sector

Reporting the Results

This lesson will help you structure and write an evaluation report.

What should you do with your findings? You want to present your results in a way that is easy for your audience to understand. The most common method is to write up a report.

How to Structure Your Report

Executive summary 

This is potentially the most important section of your report. Some of your audience simply won't have the opportunity to read an entire report, yet your findings may still be of interest to them. The executive summary is an opportunity to present key highlights from your evaluation to ensure that readers get the main messages.

Introduction 

You should include a brief background on why you are conducting your evaluation.

Methods

Provide a brief summary of how you collected and analyzed your data. For example, you might note that you gathered feedback from program participants through online surveys and interviews, and briefly describe how you analyzed the responses.

Results

This section covers the results of your data analysis. You can also mention any problems in your methodology or challenges you encountered during data collection in this section.

Interpretation 

Use this section to explain your results. What are the implications for the local context? What is working (facilitators) and what isn’t (challenges)? You may also explain how the results relate to previous work.

Recommendations 

Here is where you discuss what should follow from your results. You can describe what changes, if any, are needed and how you will go about implementing them.

Appendix

You can include copies of your data collection tools such as survey and interview guides in the appendices.

Example: Executive Summary
A pilot program on youth employment strategies was evaluated by an external evaluation team. The team used interviews and online surveys to capture feedback from program participants, and they also conducted interviews with representatives of youth-serving organizations (YSOs).

The report highlights general themes that emerged from the data. Job seekers reported that common barriers to employment included language and a lack of Canadian experience. Job seekers also experienced significant anxiety about the job search process.

From the perspective of YSOs, the report notes that job seekers would benefit from additional support for specific job search strategies. Job seekers had more successful outcomes with supportive employers. These employers had more awareness of the challenges that job seekers were facing and were able to be flexible and accommodate their needs.

You can make recommendations based on the results of your data collection and analysis. The findings from the EPIC pilot program evaluation suggest some improvements that could be made to the pilot program. For example, additional resources could be added to the site for job seekers to learn from, and program staff could do more work with employers to help them understand their role and how they could support job seekers to be successful.

Review the recommendations with stakeholders to identify actionable outcomes, discuss what has been learned from conducting the evaluation, and agree on next steps for incorporating the results. Prioritize actions and develop an action plan as a group.


Lessons Learned

Program evaluation should not be a one-time activity. There will always be opportunities to improve practices and to review data collection and analysis strategies. Therefore, after the end of each evaluation, time should be dedicated to reflecting on the evaluation process. This includes examining what worked well and what could be improved in the future. Each phase of the evaluation should be reviewed. This could include:

1. Reviewing the program logic model on an annual basis.

2. Reviewing the outcome indicators. Are there more relevant outcome indicators that could be used? Are there new tools that have been developed to measure your outcome indicators?

3. Reviewing the data collection plan. Did the plan work? Was there a substantial amount of missing data? Could qualitative or quantitative measures have been added? Was data collected at the right time? Do the data collectors require more training?

4. Reviewing the data analysis plan. Was the plan realistic? Are outside sources required to enhance the data analysis?

5. Reviewing communication. Who did the results go to, and in what form? What went well, and what could be done better?

Taking these steps will ensure that future evaluations are meaningful and valid. 

When findings are not what you expected

You may recall that in Lesson 1 we explored some reasons why people are reluctant to take on program evaluation. One important reason was a lack of trust and transparency about how the findings would be used.

A recent report calls for safe spaces for sharing evaluation findings so that nonprofit organizations can feel comfortable sharing negative evaluation results (ONN, 2017). The report suggests that other organizations within your sector can serve as safe spaces for sharing results that fall short of expectations. To make it okay to share results that are not the outcomes we had hoped for, we first have to commit to honest and transparent communication about how our services are working. That can lead to better solutions to tackle our most challenging social issues.