Pushing Critical Thinking Skills With Multiple-Choice Questions: Does Bloom's Taxonomy Work?

Cited: 49
Authors
Zaidi, Nikki L. Bibler [2 ]
Grob, Karri L. [2 ]
Monrad, Seetha M. [1 ,3 ,4 ]
Kurtz, Joshua B. [5 ]
Tai, Andrew [3 ,6 ]
Ahmed, Asra Z. [3 ]
Gruppen, Larry D. [7 ]
Santen, Sally A. [8 ,9 ]
Affiliations
[1] Univ Michigan, Sch Med, Off Med Student Educ, 5310 Taubman Hlth Sci Lib, Ann Arbor, MI 48109 USA
[2] Univ Michigan, Sch Med, Off Med Student Educ, Evaluat & Assessment, Ann Arbor, MI USA
[3] Univ Michigan, Sch Med, Internal Med, Ann Arbor, MI USA
[4] Univ Michigan, Sch Med, Learning Hlth Sci, Ann Arbor, MI USA
[5] Univ Michigan, Sch Med, Ann Arbor, MI USA
[6] Univ Michigan, Sch Med, Microbiol & Immunol, Ann Arbor, MI USA
[7] Univ Michigan, Sch Med, Dept Learning Hlth Sci, Ann Arbor, MI USA
[8] Univ Michigan, Sch Med, Off Med Student Educ, Educ Res & Qual Improvement, Ann Arbor, MI USA
[9] Univ Michigan, Sch Med, Dept Emergency Med, Educ, Ann Arbor, MI USA
Keywords
SUMMATIVE ASSESSMENT; MEDICAL-EDUCATION; RELIABILITY; ASSESSMENTS; KNOWLEDGE; IMPACT;
D O I
10.1097/ACM.0000000000002087
Chinese Library Classification (CLC)
G40 [Education]
Subject classification codes
040101; 120403
Abstract
Medical school assessments should foster the development of higher-order thinking skills to support clinical reasoning and a solid foundation of knowledge. Multiple-choice questions (MCQs) are commonly used to assess student learning, and well-written MCQs can support learner engagement in higher levels of cognitive reasoning such as application or synthesis of knowledge. Bloom's taxonomy has been used to identify MCQs that assess students' critical thinking skills, with evidence suggesting that higher-order MCQs support a deeper conceptual understanding of scientific process skills. Similarly, clinical practice requires learners to develop higher-order thinking skills that span all of Bloom's levels. Faculty question writers and examinees may approach the same material differently based on varying levels of knowledge and expertise, and these differences can influence the cognitive levels being measured by MCQs. Consequently, faculty question writers may perceive that certain MCQs require higher-order thinking skills to process the question, whereas examinees may only need to employ lower-order thinking skills to render a correct response. Likewise, seemingly lower-order questions may actually require higher-order thinking skills to answer correctly. In this Perspective, the authors describe some of the cognitive processes examinees use to respond to MCQs. The authors propose that various factors affect both the question writer's and the examinee's interactions with test material and the subsequent cognitive processes necessary to answer a question.
Pages: 856-859
Page count: 4
Related papers
48 records in total
  • [21] ChatGPT's quiz skills in different otolaryngology subspecialties: an analysis of 2576 single-choice and multiple-choice board certification preparation questions
    Hoch, Cosima C.
    Wollenberg, Barbara
    Lueers, Jan-Christoffer
    Knoedler, Samuel
    Knoedler, Leonard
    Frank, Konstantin
    Cotofana, Sebastian
    Alfertshofer, Michael
    EUROPEAN ARCHIVES OF OTO-RHINO-LARYNGOLOGY, 2023, 280 (09) : 4271 - 4278
  • [22] Author's response: What determines the success of generating multiple-choice questions
    Remick, Daniel
    ACADEMIC PATHOLOGY, 2024, 11 (02):
  • [23] Multiple choice questions can be designed or revised to challenge learners' critical thinking
    Tractenberg, Rochelle E.
    Gushta, Matthew M.
    Mulroney, Susan E.
    Weissinger, Peggy A.
    ADVANCES IN HEALTH SCIENCES EDUCATION, 2013, 18 (05) : 945 - 961
  • [25] Assessing Skills of Identifying Variables and Formulating Hypotheses Using Scenario-Based Multiple-Choice Questions
    Temiz, Burak Kagan
    INTERNATIONAL JOURNAL OF ASSESSMENT TOOLS IN EDUCATION, 2020, 7 (01): : 1 - 17
  • [26] Does the Rasch Model Really Work for Multiple-Choice Items? Not If You Look Closely
    Divgi, D. R.
    JOURNAL OF EDUCATIONAL MEASUREMENT, 1986, 23 (04) : 283 - 298
  • [27] Towards Automated Multiple Choice Question Generation and Evaluation: Aligning with Bloom's Taxonomy
    Hwang, Kevin
    Wang, Kenneth
    Alomair, Maryam
    Choa, Fow-Sen
    Chen, Lujie Karen
    ARTIFICIAL INTELLIGENCE IN EDUCATION, PT II, AIED 2024, 2024, 14830 : 389 - 396
  • [28] Paired Multiple-Choice Questions Reveal Students' Incomplete Statistical Thinking about Variation during Data Analysis
    Hicks, Jenna
    Dewey, Jessica
    Abebe, Michael
    Brandvain, Yaniv
    Schuchardt, Anita
    JOURNAL OF MICROBIOLOGY & BIOLOGY EDUCATION, 2021, 22 (02)
  • [29] Multiple-true-false questions reveal more thoroughly the complexity of student thinking than multiple-choice questions: a Bayesian item response model comparison
    Brassil, Chad E.
    Couch, Brian A.
    INTERNATIONAL JOURNAL OF STEM EDUCATION, 2019, 6 (1)
  • [30] Tapping into Bloom Taxonomy's Higher-Order Cognitive Processes: The Case for Multiple Choice Questions as a Valid Assessment Tool in the ESP Classroom
    Lenchuk, Iryna
    Ahmed, Amer
    ARAB WORLD ENGLISH JOURNAL, 2021, : 160 - 171