Program Evaluation Toolkit for Harm Reduction Organizations

External Focus: Using Evaluation Results for Program Sustainability

Program sustainability can mean different things depending on how new or established your harm reduction program is. A newer program, for example, may prioritize securing additional or supplemental funds as part of its sustainability strategy so that its harm reduction work can continue once the initial funding period ends. More established harm reduction programs may pursue funding as well, but their sustainability strategies may also include expanding their partnership networks or broadening their harm reduction policy initiatives. Sharing the results of your program evaluation with external audiences is an effective way to make the case that your harm reduction program is essential, impactful, and worth funding.

The data sets most often drawn on for these purposes are outcome data, impact data, and summative data. (Note: Summative data is typically collected after the program has ended and speaks to the program’s overall effectiveness.) These data sets help answer the questions, “What do we want people to know?” and “What do we need people to understand?” External audiences can include:

  • Funders (current and prospective)
  • Program participants
  • The general public

Table (5.4). Sharing results with your funder:

While sharing your evaluation results with your funder is often a program requirement, it is also an opportunity to make the case for continued funding. Using your evaluation data to illustrate your program’s “wins,” while also pointing out the additional areas you are positioned to address, is a compelling way to demonstrate to your funder the effectiveness of your organization’s work, the limitations of the existing funding, and the opportunities for addressing the needs of people who use drugs (PWUD) more fully were funding to continue or increase.

Types of data to include:
  • Data that shows the extent to which program activities have been accomplished.
  • Data that indicates whether program goals are being met.
  • Data that shows the influence on cost.
  • Data that demonstrates impact within the community.
  • Data that offers lessons learned and best practices.

Things to consider when sharing data:
  • Should be more formal; prioritize sharing the information your funder has requested in the format they have requested.
  • Should be visually appealing and data-driven.
  • Include human-interest stories where possible to strengthen the validity of the findings and proposed approaches.
  • Should be organized in a way that ties back clearly to program activities and deliverables.
  • Be intentional about highlighting your program’s accomplishments, as well as how your program will address areas that have not gone as planned.

Table (5.5). Sharing results with your clients:

Sharing your evaluation findings with current and potential program participants can be beneficial for several reasons. Communications about your program’s implementation can convey how important participants are to the program, ensure participants have the information they need to take part successfully, and remind them of upcoming services, tasks, or events. Using your evaluation results to share the benefits of the program and its ability to support the needs of existing participants can also be an effective way to sustain or increase interest in, and community demand for, your program’s services. While it is helpful to highlight and share the positive aspects of your program, it is just as important to be transparent about any limitations to what your program can offer so that current and prospective participants have a realistic understanding of what to expect.

Types of data to include:
  • Information that outlines how to participate in the program and any associated requirements.
  • Data that demonstrates existing participant perceptions and experiences.
  • Data that specifically aligns with participants’ interests and addresses known concerns.

Things to consider when sharing data:
  • Include human-interest stories where possible to elevate the social validity of your program.
  • Should be conversational and in participants’ primary languages (if not English) wherever possible.
  • Should be brief, visually interesting, and free of jargon.
  • Should be organized in a way that ties back clearly to the accessibility and availability of services.

“A lot of the behavioral models that are used [by the CDC] are, like they say, based on behavior. So intuitively we oftentimes have the answer, we’re doing the work, we just haven’t named it what the experts have named it. So I’d often backtrack and ask, ‘Well, tell me what you were doing? Oh, you were using thinkers in the community, so that’s actually a Popular Opinion Leader model.’ So now we can write it up and lean into that and be more intentional in calling it by the model. Most programmatic stuff comes from human content, so it’s a human design.”
—Al Forbes

Table (5.6). Sharing results with the general public:

Sharing your evaluation results with the general public can raise awareness about the importance of harm reduction programs and can serve as a marketing tool to attract donors and supporters within your community. In many places around the country, harm reduction work remains greatly underfunded, particularly when it comes to the supplies needed to save people’s lives. Generating awareness not just about the program itself, but also about the positive impact of harm reduction services and interventions within the community, can mobilize community support.

Types of data to include:
  • Data that demonstrates impact within the community.
  • Data that offers lessons learned and best practices.
  • Data that demonstrates existing participant perceptions and experiences.
  • Data that specifically aligns with known interests, priority areas, or concerns.

Things to consider when sharing data:
  • Depending on the specific audience, can range from conversational to more technical.
  • Should be visually appealing and data-driven.
  • Humanize the numbers, where possible, to connect the dots for people.
  • Should be organized in a way that ties back clearly to impact and lessons learned.
  • Be intentional about highlighting your program’s accomplishments and the importance of this work in the community.