Using Multistage Testing to Enhance Measurement of an English Language Proficiency Test

Cited by: 4
Authors
MacGregor, David [1 ]
Yen, Shu Jing [2 ]
Yu, Xin [2 ]
Affiliations
[1] Univ Wisconsin, Wisconsin Ctr Educ Res, WIDA, 1025 West Johnson St, Madison, WI 53706 USA
[2] Ctr Appl Linguist, Washington, DC USA
Keywords
TEST DESIGNS;
DOI
10.1080/15434303.2021.1988953
CLC Number
G44 [Educational Psychology];
Discipline Codes
0402 ; 040202 ;
Abstract
How can one construct a test that measures accurately across the full range of performance levels and provides adequate coverage of all of the critical areas of the domain, yet is not unmanageably long? This paper discusses the approach taken in a linear test of academic English language, and how the transition to a computer-based test allowed for a design better suited to the demands of the test. It then describes the multistage adaptive approach that was devised. This approach yields a test that covers a broad range of performance levels while including items that assess the language of the content areas described in the English language development standards underpinning the test. The design also tailors the test closely to the ability level of the English learner taking it, and therefore produces a more precise measure. The efficacy of the design in enhancing measurement of two versions of a high-stakes English language assessment is explored, and the implications of the results are discussed.
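The core mechanism the abstract describes, routing each examinee to a next-stage module matched to a provisional ability estimate, can be illustrated with a minimal sketch. This is not the paper's actual design: the Rasch (1PL) model, the item difficulties, the module labels, and the routing cut scores below are all illustrative assumptions.

```python
import math

def p_correct(theta, b):
    """Rasch (1PL) probability of a correct response to an item of difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def mle_theta(responses, difficulties, iters=50):
    """Newton-Raphson maximum-likelihood estimate of ability from scored responses."""
    theta = 0.0
    for _ in range(iters):
        grad = sum(x - p_correct(theta, b) for x, b in zip(responses, difficulties))
        info = sum(p_correct(theta, b) * (1 - p_correct(theta, b)) for b in difficulties)
        if info == 0:
            break
        theta += grad / info
    return theta

def route(theta, cuts=(-0.5, 0.5)):
    """Route to a next-stage module based on the provisional ability estimate.

    The cut scores and module labels are hypothetical, for illustration only.
    """
    if theta < cuts[0]:
        return "easy"
    if theta < cuts[1]:
        return "medium"
    return "hard"

# Stage-1 (routing) module: illustrative item difficulties and one
# candidate's scored responses (1 = correct, 0 = incorrect).
stage1_b = [-1.0, -0.5, 0.0, 0.5, 1.0]
responses = [1, 1, 1, 0, 0]

theta_hat = mle_theta(responses, stage1_b)
print(route(theta_hat))  # routes this candidate to the middle-difficulty module
```

Adaptivity at the module level, rather than item by item, is what lets such a design balance content coverage (each module can be assembled to cover the required content areas) against measurement precision.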
Pages: 54-75 (22 pages)
Related Papers
50 records
  • [21] English language proficiency testing in an on-line course format: a genre analysis
    Buzetto, Bruno
    Marcuzzo, Patricia
    ENTREPALAVRAS, 2023, 13 (02): : 84 - 110
  • [22] Multistage complexity in language proficiency assessment: A framework for aligning theoretical perspectives, test development, and psychometrics
    Luecht, RM
    FOREIGN LANGUAGE ANNALS, 2003, 36 (04) : 527 - 535
  • [23] Issues in vertical scaling of a K-12 English language proficiency test
    Kenyon, Dorry M.
    MacGregor, David
    Li, Dongyang
    Cook, H. Gary
    LANGUAGE TESTING, 2011, 28 (03) : 383 - 400
  • [24] CONTEXTUALISING THE CEFR: THE UNIVERSITI MALAYSIA PAHANG ENGLISH LANGUAGE PROFICIENCY WRITING TEST
    Ali, Z. Mohd
    Ali, F.
    Radzuan, N. R. Mohd
    Alwi, N. A. Nik Mohd
    Abu Kassim, N. L.
    Don, Z. Mohd
    11TH INTERNATIONAL CONFERENCE OF EDUCATION, RESEARCH AND INNOVATION (ICERI2018), 2018, : 4892 - 4902
  • [25] Developing and validating a computerized oral proficiency test of English as a foreign language (Coptefl)
    Isler, Cemre
    Aydin, Belgin
    INTERNATIONAL JOURNAL OF ASSESSMENT TOOLS IN EDUCATION, 2021, 8 (01): : 38 - 66
  • [26] Enhancing the Interpretability of the Overall Results of an International Test of English-Language Proficiency
    Papageorgiou, Spiros
    Morgan, Rick
    Becker, Valerie
    INTERNATIONAL JOURNAL OF TESTING, 2015, 15 (04) : 310 - 336
  • [27] The Impact of English Language Proficiency on Neuropsychological Test Performance in Ethnically Diverse Individuals
    Scott, T.
    Funes, C.
    Kim, S.
    Razani, J.
    ARCHIVES OF CLINICAL NEUROPSYCHOLOGY, 2015, 30 (06) : 475 - 475
  • [28] A Construct Validation of the Malaysian University English Test and the English Placement Test: Two High-stakes English Language Proficiency Tests
    Abu Kassim, Noor Lide
    Zubairi, Ainol
    Daud, Nurahian Mat
    MALAYSIAN JOURNAL OF LEARNING & INSTRUCTION, 2007, 4 : 1 - 16
  • [29] Aligning a University English Language Proficiency Measurement Tool with the CEFR: A Case in Malaysia
    Baharum, Nurul Najwa
    Ismail, Lilliati
    Nordin, Nooreen
    Razali, Abu Bakar
    PERTANIKA JOURNAL OF SOCIAL SCIENCE AND HUMANITIES, 2021, 29 : 157 - 178
  • [30] From Test Performance to Language Use: Using Self-Assessment to Validate a High-Stakes English Proficiency Test
    Fan, Jinsong
    Yan, Xun
    ASIA-PACIFIC EDUCATION RESEARCHER, 2017, 26 (1-2): : 61 - 73