Evaluations: Adaptations and Fidelity

By ETR | July 19, 2016
Note: We're posting about some of the presentations ETR researchers and professional development specialists are offering at the Office of Adolescent Health Teen Pregnancy Prevention Grantee Conference July 19-21.

Here’s a challenge facing anyone delivering evidence-based teen pregnancy prevention programs. Educators usually want to adapt programs to boost their relevance for the teens they’re working with. Program managers need to be sure any adaptations are made in ways that maintain the fidelity and effectiveness of a program. If the project includes an evaluation component, managers also need to be sure that adaptations have been noted and are taken into consideration when data are analyzed and reported.


How do you feel about fidelity monitoring of your teen pregnancy prevention programs? Have you faced challenges balancing adaptation and program fidelity?

BA Laris, MPH. Evaluations: Adaptations/Fidelity. Tuesday 7/19/16 1:00-2:00 p.m. Topical Roundtable in the Evaluation Section, Key-3.

It’s About the Core

When we talk about evaluation and fidelity, we are thinking about the core components of a program model—the content, the pedagogy and the implementation. Essentially, we are asking whether the program is teaching the content the way it was designed to be delivered, within a supportive learning environment.

The Office of Adolescent Health Performance Measures are designed to capture this fidelity by looking at the percentage of sessions implemented and the percentage of activities completed.

Mixing It Up at the Roundtable

One of the best ways for the field to learn more about what’s really happening with adaptation and fidelity is to engage in authentic, honest, informal conversations. What’s working for you? What are the challenges? What system are you using to monitor and gather real-time data from educators? Have you struggled with response rates or compliance?

ETR’s BA Laris, MPH, will be facilitating a roundtable at the OAH TPP conference addressing exactly these issues. “When everyone comes together to share insights and experiences, we help inform our own and others’ practice,” BA explains. “Sometimes this is a great way to think about new ways to gather fidelity data, support facilitators and enhance youth learning.”

Good Questions to Ask

If you’re at the conference, come on by the Roundtable if you’d like to discuss questions such as these:

  1. What tools have you used to measure fidelity?
  2. What are pros and cons of using standard fidelity measures? How does that change when you make adaptations to a curriculum or program?
  3. Can you adapt fidelity measures to mirror the adaptations you have made to the curriculum?
  4. How can we overcome some of the challenges with self-report fidelity data?
  5. Looking at fidelity evaluation data, how can you distinguish between an individual’s interpretation of an adaptation and a violation of the core elements of the curriculum?
  6. How do you use your fidelity data? Is this different when you have adaptations?
