Study Designs and Systematic Reviews of Interventions: Building Evidence Across Study Designs

Cited by: 46
Authors
Sargeant, J. M. [1 ,2 ]
Kelton, D. F. [1 ,2 ]
O'Connor, A. M. [3 ]
Affiliations
[1] Univ Guelph, Ctr Publ Hlth & Zoonoses, Guelph, ON N1G 2W1, Canada
[2] Univ Guelph, Dept Populat Med, Ontario Vet Coll, Guelph, ON N1G 2W1, Canada
[3] Iowa State Univ, Dept Vet Diagnost & Prod Anim Med, Coll Vet Med, Ames, IA USA
Funding
Canadian Institutes of Health Research
Keywords
Study designs; evidence pyramids; systematic review; veterinary medicine; interpreting clinical research; randomized controlled trials; blinding terminology; equipoise; ethics
DOI
10.1111/zph.12127
CLC number
R1 [Preventive medicine and hygiene]
Subject classification codes
1004; 120402
Abstract
This article is the second in a series of six focusing on systematic reviews in animal agriculture and veterinary medicine. It addresses the strengths and limitations of study designs commonly used in animal agriculture and veterinary research to assess interventions (preventive or therapeutic treatments) and discusses the appropriateness of their use in systematic reviews of interventions. Different study designs provide different evidentiary value for addressing questions about the efficacy of interventions. Experimental study designs range from in vivo proof-of-concept experiments to randomized controlled trials (RCTs) under real-world conditions. The key characteristic of experimental design in intervention studies is that the investigator controls the allocation of individuals or groups to different intervention strategies. The RCT is considered the gold standard for evaluating the efficacy of interventions and, if well-executed RCTs are available for inclusion in a systematic review, that review may be restricted to this design alone. In some instances, RCTs may not be feasible or ethical to perform, and fewer RCTs are published in the veterinary literature than in the human healthcare literature. Therefore, observational study designs, in which the investigator does not control intervention allocation, may provide the only available evidence of intervention efficacy. While observational studies tend to be relevant to real-world use of an intervention, they are more prone to bias. Human healthcare researchers use an evidence pyramid diagram to describe the evidentiary value of different study designs for assessing interventions. Modifications for veterinary medicine are presented in this article.
Pages: 10-17
Number of pages: 8