By John Shields, PhD, MSW | September 25, 2013
I’m the director of ETR’s portfolio of program evaluation projects with the San Francisco Unified School District’s Student, Family, and Community Support Department (SFUSD-SFCSD). ETR has been working in close partnership with SFUSD for over twenty years now—eleven during my tenure.
One of the most important components of our work with the SFUSD has been our biennial administration and analysis of the CDC’s Youth Risk Behavior Survey (YRBS). The YRBS provides high-quality data on the health risk behaviors of SFUSD’s middle and high school students. In partnership with SFUSD, we’ve taken steps to use the power of the YRBS to address the critical health and wellness needs of lesbian, gay, bisexual, transgender and questioning (LGBTQ) youth.
By Matt Cherry | September 18, 2013
Many organizations and companies are looking at the possibility of migrating existing face-to-face trainings into the e-learning environment. There are some compelling reasons to do so. E-learning can be more affordable, more accessible and more consistent for trainees across a broad geographic range.
It’s important to follow established best practices for training design and implementation. I like to use the ADDIE model—Analyze, Design, Develop, Implement and Evaluate. By following an established instructional design methodology such as ADDIE, you can produce an effective online course that meets your organizational objectives.
By Jamie Sparks | August 28, 2013
I’m the Coordinated School Health Project Director with the Kentucky Department of Education. When it comes to programs, policies and curricula, Kentucky is a local control state—the state sets up the standards, and the local districts determine how to meet them.
We’re a CDC-funded project, and we think very highly of comprehensive school health education. We’re taking active steps to foster more buy-in for this approach in districts throughout Kentucky. We’re building the foundations for an ongoing discussion with our districts on how to make comprehensive school health education work.
By Pam Drake, PhD | August 21, 2013
When we want to evaluate how well an evidence-based program (EBP) works, one of the important variables we need to measure accurately is implementation fidelity. This variable helps confirm that the program is being delivered as intended, and that different educators are doing essentially the same things in teaching the program.
With good implementation fidelity, there’s a better chance others can replicate the program’s outcomes. Schools and communities that show good implementation fidelity for a program can affirm they’re taking the correct steps to reach health goals.
Implementation fidelity also helps us interpret outcomes—for example, why an intervention did or didn’t work. We can assess how practical the program activities are, and we can refine programs by determining which components lead to the outcomes we want.
A list of national resources that provide information, training and materials for sexuality educators. "Appendix A" from Clint E. Bruess, EdD, FASHA, FAAHE and Elizabeth Schroeder, EdD, MSW. Sexuality Education: Theory and Practice (Seventh Edition). Scotts Valley, CA: ETR, 2018.