Gaining Insights into Course Difficulty Variations Using Item Response Theory

Cited by: 1
Authors
Baucks, Frederik [1 ]
Schmucker, Robin [2 ]
Wiskott, Laurenz [1 ]
Affiliations
[1] Ruhr Univ Bochum, Bochum, Germany
[2] Carnegie Mellon Univ, Pittsburgh, PA, USA
Keywords
curriculum analytics; item response theory; fairness; performance
DOI
10.1145/3636555.3636902
Chinese Library Classification
TP3 [Computing Technology; Computer Technology]
Discipline Code
0812
Abstract
Curriculum analytics (CA) studies curriculum structure and student data to ensure the quality of educational programs. To gain statistical robustness, most existing CA techniques rely on the assumption of time-invariant course difficulty, preventing them from capturing variations that might occur over time. However, ensuring low temporal variation in course difficulty is crucial to guarantee fairness in treating individual student cohorts and consistency in degree outcomes. We introduce item response theory (IRT) as a CA methodology that enables us to address the open problem of monitoring course difficulty variations over time. We use statistical criteria to quantify the degree to which course performance data meets IRT's theoretical assumptions and verify the validity and reliability of IRT-based course difficulty estimates. Using data from 664 Computer Science and 1,355 Mechanical Engineering undergraduate students, we show how IRT can yield valuable CA insights: First, by revealing temporal variations in course difficulty over several years, we find that course difficulty shifted systematically downward during the COVID-19 pandemic. Second, time-dependent course difficulty and cohort performance variations confound conventional course pass rate measures. We introduce IRT-adjusted pass rates as an alternative that accounts for these factors. Our findings have implications for policymakers, student advisors, accreditation, and course articulation.
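To make the abstract's methodology concrete, the following minimal sketch (not the authors' implementation) fits a simple Rasch (1PL) IRT model to synthetic pass/fail data: each student j has an ability theta_j, each course i a difficulty b_i, and the pass probability is sigmoid(theta_j - b_i). All data, sizes, and variable names here are illustrative assumptions; refitting such a model per semester would expose the temporal difficulty variations the paper studies. The final lines show one plausible reading of an "IRT-adjusted pass rate": each course's expected pass rate evaluated against a fixed reference ability distribution, removing cohort-strength effects.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic pass/fail matrix: rows = students, columns = courses (1 = pass).
n_students, n_courses = 200, 8
true_theta = rng.normal(0.0, 1.0, n_students)   # latent student ability
true_b = rng.normal(0.0, 1.0, n_courses)        # latent course difficulty
p_true = 1.0 / (1.0 + np.exp(-(true_theta[:, None] - true_b[None, :])))
X = (rng.random((n_students, n_courses)) < p_true).astype(float)

# Joint maximum-likelihood estimation by gradient ascent on the Rasch
# log-likelihood; mean-scaled gradients keep the step sizes stable.
theta = np.zeros(n_students)
b = np.zeros(n_courses)
lr = 0.5
for _ in range(1000):
    p_hat = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
    resid = X - p_hat                   # gradient of the log-likelihood
    theta += lr * resid.mean(axis=1)    # raise ability where passes exceed fit
    b -= lr * resid.mean(axis=0)        # raise difficulty where fails exceed fit
    theta -= theta.mean()               # anchor the scale: mean ability = 0

print("estimated course difficulties:", np.round(b, 2))

# Illustrative "IRT-adjusted pass rate": expected pass rate per course under
# a fixed reference ability grid, so cohort composition no longer confounds
# the comparison across semesters.
ref_theta = np.linspace(-2.0, 2.0, 101)
adj = 1.0 / (1.0 + np.exp(-(ref_theta[:, None] - b[None, :])))
print("IRT-adjusted pass rates:", np.round(adj.mean(axis=0), 2))

Anchoring mean ability at zero resolves the model's translation invariance (adding a constant to every theta_j and b_i leaves all pass probabilities unchanged), which is what makes difficulty estimates comparable across refits.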
Pages: 450-461 (12 pages)