Evaluating technology enhanced learning by using single-case experimental design: A systematic review

Cited by: 0
Authors
Dayo, Nadira [1 ,2 ,3 ,4 ]
Metwaly, Sameh Said [1 ,2 ]
Van Den Noortgate, Wim [1 ,2 ]
Affiliations
[1] Katholieke Univ Leuven, Fac Psychol & Educ Sci, Leuven, Belgium
[2] Katholieke Univ Leuven, ITEC Imec Res Grp, Leuven, Belgium
[3] Katholieke Univ Leuven, Fac Psychol & Educ Sci, KU Leuven campus Kulak Kortrijk, E Sabbelaan 51, B-8500 Kortrijk, Belgium
[4] Katholieke Univ Leuven, ITEC Imec Res Grp, KU Leuven campus Kulak Kortrijk, E Sabbelaan 51, B-8500 Kortrijk, Belgium
Keywords
systematic review; single-case experimental design; technology-enhanced learning; AUTISM SPECTRUM DISORDERS; SUBJECT RESEARCH; CHILDREN; STUDENTS; INTERVENTIONS; TRENDS; TUTOR;
DOI
10.1111/bjet.13468
CLC classification
G40 [Education];
Subject classification
040101 ; 120403 ;
Abstract
Single-case experimental designs (SCEDs) may offer a reliable and internally valid way to evaluate technology-enhanced learning (TEL). A systematic review was conducted to provide an overview of what, why and how SCEDs are used to evaluate TEL. Accordingly, 136 studies from nine databases fulfilling the inclusion criteria were included. The results showed that most of the studies were conducted in the field of special education, focusing on evaluating the effectiveness of computer-assisted instruction, video prompts and mobile devices to improve language and communication, socio-emotional skills and mental health. The research objective of most studies was to evaluate the effects of the intervention; often no specific justification for using a SCED was provided. Additionally, multiple baseline and phase designs were the most common SCED types, with most measurements taken in the intervention phase. Frequent data collection methods were observation, tests, questionnaires and task analysis, whereas visual and descriptive analysis were common methods for data analysis. Nearly half of the studies did not acknowledge any limitations, while a few mentioned generalization and small sample size as limitations. The review provides valuable insights into utilizing SCEDs to advance TEL evaluation methodology and concludes with a reflection on further opportunities that SCEDs can offer for evaluating TEL.

Practitioner notes
What is already known about this topic
  • SCEDs use multiple measurements to study a single participant over multiple conditions, in the absence and presence of an intervention.
  • SCEDs can be rigorous designs for evaluating behaviour change caused by any intervention, including testing technology-based interventions.
What this paper adds
  • Reveals patterns, trends and gaps in the use of SCEDs for TEL.
  • Identifies the study disciplines, EdTech tools and outcome variables studied using SCEDs.
  • Provides a comprehensive understanding of how SCEDs are used to evaluate TEL by shedding light on methodological techniques.
  • Enriches insights about justifications and limitations of using SCEDs for TEL.
Implications for practice and/or policy
  • Informs about the use of a rigorous method, the SCED, for the evaluation of technology-driven interventions across various disciplines.
  • Thereby contributes to the quality of the evidence base, which provides policymakers and different stakeholders a consolidated resource to design, implement and make decisions about TEL.
Pages: 2457-2477
Page count: 21