Program Evaluation Toolkit for Harm Reduction Organizations

Glossary of Terms

MODULE 1: Making the Case for Program Evaluation

  • Program evaluation: the systematic method for collecting, analyzing, and using data to examine the effectiveness and efficiency of a program
  • Program monitoring: the process of collecting and assessing information on program activity to ensure that the program is accomplishing what is intended
  • Program research: use of program data to investigate observed phenomena in order to establish generalizable facts or reach new conclusions
  • Meaningful involvement of people who use drugs: the act of granting decision-making power to people who actively use drugs in ways that inform the design, implementation, and analysis or reporting of a program evaluation effort
  • Emotional hesitation: reluctance to engage in a program evaluation effort because of a previous negative experience
  • Ideological resistance: reluctance to engage in a program evaluation effort because of moral opposition

MODULE 2: Preparing for Your Evaluation

  • Evaluation capacity: a program’s ability to successfully support the activities of an evaluation process
  • Culture: the customs, values, and beliefs that inform how we behave and understand the world around us
  • Social justice: the pursuit of equal rights and equitable opportunity for all
  • Inclusion: the action or state of being granted equal access to opportunities to contribute to a program evaluation effort
  • Evaluation approach: a distinct way of thinking about, designing, and conducting an evaluation effort
  • Participatory evaluation: an evaluation approach that emphasizes the involvement of the individuals who are directly impacted by the results of the evaluation
  • Conventional evaluation: the traditional way we have been taught evaluations should be carried out
  • Process evaluation: focuses on whether your harm reduction program activities have been implemented in the way they were intended and resulted in the intended outputs
  • Outcome evaluation: measures the extent to which your program has influenced changes in behaviors, practices, or knowledge during the program period
  • Impact evaluation: assesses why or how a program has been able to influence sustained changes (impact) over time; it can also be used to determine which services help the program to accomplish its goals most effectively
  • Summative evaluation: provides an overall synopsis of the effectiveness of the program; typically, this type of evaluation helps determine whether a program should be continued, expanded, or ended

MODULE 3: Planning for Your Evaluation

  • Evaluation team: a group of individuals who are tasked with contributing input that influences the design, implementation, and/or communication of a program evaluation effort
  • Role clarity: establishing how the lead evaluator, evaluation team, and harm reduction program team will work together to support the evaluation in order to minimize ambiguity, build trust, and enhance efficiency
  • Program goal: the specific outcome (or impact) that your program is working to accomplish
  • Logic model: a visual depiction of all the activities, outputs, and outcomes that contribute to the program’s ability to solve a problem or achieve a goal (see the sketch after this list)
  • Inputs: the staffing, resources, supplies, and time that go into your harm reduction program
  • Activities: what your program is doing to accomplish your program goals; the program’s efforts
  • Outputs: what is produced or happens because of the activities
  • Short-term outcomes: the intended difference the program activities should make within the organization or the community at large in the short term
  • Intermediate (medium) outcomes: the intended difference the program activities should make within the organization or the community at large beyond the short-term time frame
  • Long-term outcomes (impact): the intended difference the program should ultimately make within the organization or the community at large
  • Assumption: what is understood to be true about the harm reduction program, program activities, and/or participants
  • External impacts: the environmental factors that will likely influence program activities
  • Evaluation questions: questions that reflect the purpose and priorities of a program evaluation and focus the evaluation effort
  • Survey questions: questions that are focused on assessing a specific behavior, feeling, or perception
  • Indicators: signs of progress that are used to determine if a program is meeting its objectives and goals
  • Qualitative data: data that describes qualities or characteristics
  • Quantitative data: data that is represented numerically
  • Close-ended questions: questions that have predetermined answers for respondents to choose from
  • Open-ended questions: questions that respondents answer in their own words rather than by choosing from predetermined options
  • Resource availability: the resources you have available to launch and carry out a program evaluation, which can be grouped into “people resources” and “project resources”
  • Evaluation plan: a written document that outlines how a program will be evaluated and how the results of the evaluation will be used
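
To make the logic model entries in this module concrete, here is a minimal sketch in Python. The program, its activities, and its outcomes are all invented for illustration and are not drawn from the toolkit itself.

    # A toy logic model for a hypothetical syringe services program.
    # Every entry is illustrative, not a recommendation.
    logic_model = {
        "inputs": ["peer staff", "syringe supplies", "outreach van", "funding"],
        "activities": ["mobile syringe distribution", "naloxone training"],
        "outputs": ["syringes distributed", "naloxone kits dispensed"],
        "short_term_outcomes": ["increased access to sterile supplies"],
        "intermediate_outcomes": ["reduced syringe reuse and sharing"],
        "long_term_outcomes": ["fewer injection-related infections"],
        "assumptions": ["participants can reach the van's stops"],
        "external_factors": ["local paraphernalia laws"],
    }

    # Print the model in reading order, component by component.
    for component, items in logic_model.items():
        print(component.replace("_", " ").title() + ": " + ", ".join(items))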

MODULE 4: Doing Your Evaluation

  • Data sources: entities that provide information that has been collected in a systematic way
  • Surveys: written tools that are used to collect information from multiple respondents
  • Individual interviews: conversational sessions conducted with program participants, either in person or virtually, to gain an in-depth understanding of their experiences and opinions
  • Observations: written documentation (usually completed by staff) of program events and/or participant interactions
  • Document review: the gathering of existing program documents, monitoring data, and records
  • Focus groups: conversational sessions conducted with a group of program participants, either in person or virtually, to gain an in-depth understanding of their experiences and opinions
  • Data reliability: program data that can be trusted (i.e., accurate, unique, and complete) to effectively inform your evaluation in the ways that are needed
  • Data analysis: the process of reviewing, summarizing, and comparing program data to draw useful conclusions about your program
  • Data validation: the process of ensuring that all the data collected for your program evaluation has been cleaned, is complete, and is labeled and stored properly
  • Data editing: the process of ensuring your quantitative data is clear and understandable by reading through the data to identify unclear entries and using reason to decipher the meaning or fill in missing information, where appropriate
  • Data coding: the process of grouping responses and assigning numeric values to them so they can be analyzed quantitatively (e.g., coding “yes” as 1 and “no” as 0)
  • Mean: a numerical average
  • Median: the midpoint of a data set when the values are arranged in numerical order
  • Mode: the most common value that appears in a range of values
  • Percentage: the ratio or number that represents a fraction of 100
  • Frequency: the number of times a value occurs or is repeated in a data set
  • Range: the largest number minus the smallest number in the data set
  • Descriptive statistics: analyzing data in a way that helps to describe or summarize the relationships and patterns that are present (see the sketch after this list)
  • Inferential statistics: conducting an advanced analysis of quantitative data to draw conclusions (or inferences) and make predictions about the larger population
  • Narrative analysis: a qualitative analysis that focuses on analyzing respondents’ experiences and motivations by looking closely at the individual stories that they share and interpreting meaning
  • Thematic analysis: a qualitative analysis that focuses on using the patterns identified to determine and compare common themes across the qualitative data sets to tell a larger, overarching story
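
To make the data coding and descriptive statistics terms above concrete, here is a minimal sketch in Python using only the standard library. The survey responses and ages are invented for illustration.

    from collections import Counter
    from statistics import mean, median, mode

    # Invented quantitative data: ages reported on a participant survey.
    ages = [24, 31, 27, 31, 45, 38, 31, 29]

    print("Mean:", mean(ages))              # numerical average -> 32
    print("Median:", median(ages))          # midpoint of the sorted values -> 31
    print("Mode:", mode(ages))              # most common value -> 31
    print("Range:", max(ages) - min(ages))  # largest minus smallest -> 21

    # Data coding: assign numeric values to close-ended responses.
    responses = ["yes", "no", "yes", "yes", "no", "yes", "yes", "no"]
    coded = [1 if r == "yes" else 0 for r in responses]

    # Frequency: how many times each value occurs in the data set.
    print("Frequencies:", Counter(responses))

    # Percentage: a part expressed as a fraction of 100.
    print("Percent yes:", 100 * sum(coded) / len(coded))  # -> 62.5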

MODULE 5: Using Your Evaluation

  • Internal audience: typically consists of the individuals who make up the organization’s staff and volunteers
  • External audience: consists of the individuals who are not formally a part of the organization, but may be stakeholders who are directly involved in program impact (e.g., funders, program participants, partners, and the broader community or community groups)
  • Evaluation report: documentation of the purpose, process, and results of your program evaluation effort so that it can be referenced and, if needed, shared with others
  • Communication strategy: a blueprint for developing and disseminating a message to an intended audience
  • Data visualization: the act of translating evaluation data into a visual context to make it relatable and easier to understand (see the sketch below)
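
To illustrate the data visualization term above, here is a minimal sketch in Python using the matplotlib library (assuming it is installed). The quarterly counts are invented for illustration.

    import matplotlib.pyplot as plt

    # Invented monitoring data: naloxone kits dispensed each quarter.
    quarters = ["Q1", "Q2", "Q3", "Q4"]
    kits = [120, 150, 210, 260]

    # A simple bar chart makes the trend easier to see than a table of numbers.
    plt.bar(quarters, kits)
    plt.title("Naloxone Kits Dispensed by Quarter (example data)")
    plt.ylabel("Kits dispensed")
    plt.savefig("kits_by_quarter.png")  # save the chart for a report or slide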