By ETR | July 19, 2016
Note: We're posting about some of the presentations ETR researchers and professional development specialists are offering at the Office of Adolescent Health Teen Pregnancy Prevention Grantee Conference July 19-21.
Here’s a challenge facing anyone delivering evidence-based teen pregnancy prevention programs. Educators usually want to adapt programs to boost their relevance for the teens they’re working with. Program managers need to be sure any adaptations are made in ways that maintain the fidelity and effectiveness of the program. If the project includes an evaluation component, managers also need to be sure that adaptations have been documented and are taken into consideration when the data are analyzed and reported.
How do you feel about fidelity monitoring of your teen pregnancy prevention programs? Have you faced challenges balancing adaptation and program fidelity?
BA Laris, MPH. Evaluations: Adaptations/Fidelity. Tuesday 7/19/16 1:00-2:00 p.m. Topical Roundtable in the Evaluation Section, Key-3.
When we talk about evaluation and fidelity, we are thinking about the core components of a program model—the content, the pedagogy, and the implementation. Essentially, we are asking whether the program is teaching the content the way it was designed to be delivered, within a supportive learning environment.
The Office of Adolescent Health Performance Measures are designed to capture this fidelity by looking at the percentage of sessions implemented and the percentage of activities completed.
One of the best ways for the field to learn more about what’s really happening with adaptation and fidelity is to engage in authentic, honest, informal conversations. What’s working for you? What are the challenges? What system are you using to monitor and gather real-time data from educators? Have you struggled with response rates or compliance?
ETR’s BA Laris, MPH, will be facilitating a roundtable at the OAH TPP conference addressing exactly these issues. “When everyone comes together to share insights and experiences, we help inform our own and others’ practice,” BA explains. “Sometimes this is a great way to think about new ways to gather fidelity data, support facilitators and enhance youth learning.”
If you’re at the conference, come by the Roundtable if you’d like to discuss questions such as these: