Variation in research designs used to test the effectiveness of dissemination and implementation strategies: A review

Stephanie Mazzucca, Rachel G. Tabak, Meagan Pilar, Alex T. Ramsey, Ana A. Baumann, Emily Kryzer, Ericka M. Lewis, Margaret Padek, Byron J. Powell, Ross C. Brownson

Research output: Contribution to journal › Review article

14 Scopus citations

Abstract

Background: The need for optimal study designs in dissemination and implementation (D & I) research is increasingly recognized. Despite the wide range of study designs available for D & I research, we lack understanding of the types of designs and methodologies that are routinely used in the field. This review assesses the designs and methodologies in recently proposed D & I studies and provides resources to guide design decisions.

Methods: We reviewed 404 study protocols published in the journal Implementation Science from 2/2006 to 9/2017. Eligible studies tested the efficacy or effectiveness of D & I strategies (i.e., not the effectiveness of the underlying clinical or public health intervention); had a comparison by group and/or time; and used ≥1 quantitative measure. Several design elements were extracted: design category (e.g., randomized); design type [e.g., cluster randomized controlled trial (RCT)]; data type (e.g., quantitative); D & I theoretical framework; levels of treatment assignment, intervention, and measurement; and country in which the research was conducted. Each protocol was double-coded, and discrepancies were resolved through discussion.

Results: Of the 404 protocols reviewed, 212 (52%) studies, reported across 208 manuscripts, tested one or more implementation strategies and therefore met the inclusion criteria. Of the included studies, 77% used randomized designs, primarily cluster RCTs. The use of alternative designs (e.g., stepped wedge) increased over time. Fewer studies were quasi-experimental (17%) or observational (6%). Many study design categories (e.g., controlled pre-post, matched pair cluster design) were represented by only one or two studies. Most articles proposed both quantitative and qualitative methods (61%); the remaining 39% proposed only quantitative methods. Half of the protocols (52%) reported using a theoretical framework to guide the study. The four most frequently reported frameworks were the Consolidated Framework for Implementation Research and RE-AIM (n = 16 each), followed by Promoting Action on Research Implementation in Health Services and the Theoretical Domains Framework (n = 12 each).

Conclusion: While several novel designs for D & I research have been proposed (e.g., stepped wedge, adaptive designs), the majority of the studies in our sample employed RCT designs. Alternative study designs are increasing in use but may be underutilized for a variety of reasons, including funder preferences or lack of awareness of these designs. Promisingly, the prevalent use of quantitative and qualitative methods together reflects methodological innovation in newer D & I research.

Original language: English
Article number: 32
Journal: Frontiers in Public Health
Volume: 6
Issue number: FEB
DOI: 10.3389/fpubh.2018.00032
State: Published - Feb 19 2018

Keywords

  • Dissemination research
  • Implementation research
  • Research methods
  • Research study design
  • Review


Cite this

Mazzucca, S., Tabak, R. G., Pilar, M., Ramsey, A. T., Baumann, A. A., Kryzer, E., Lewis, E. M., Padek, M., Powell, B. J., & Brownson, R. C. (2018). Variation in research designs used to test the effectiveness of dissemination and implementation strategies: A review. Frontiers in Public Health, 6(FEB), Article 32. https://doi.org/10.3389/fpubh.2018.00032