Novel Considerations in the ML/AI Modeling of Large-Scale Learning Loss

Cited by: 0
Authors
Elizondo, Mirna [1 ]
Yu, June [2 ]
Payan, Daniel [3 ]
Feng, Li [4 ]
Tesic, Jelena [1 ]
Affiliations
[1] Texas State Univ, Dept Comp Sci, San Marcos, TX 78666 USA
[2] State Texas Legislat Budget Board, Austin, TX 78701 USA
[3] Loves Travel Stops & Country Stores, Yukon, OK 73099 USA
[4] Texas State Univ, Dept Finance & Econ, San Marcos, TX 78666 USA
Source
IEEE ACCESS | 2025, Vol. 13
Keywords
Data models; Deep learning; Random forests; Noise measurement; Nearest neighbor methods; Logistic regression; Biological system modeling; Artificial neural networks; Support vector machines; Radio frequency; Noisy tabular data; data in the wild; gradient boosting; feature selection; dimensionality reduction
DOI
10.1109/ACCESS.2025.3526412
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
This study charts a path forward for large-scale, data-driven quantitative analysis of noisy open-source data resources. The goal is to support the qualitative findings of smaller studies with extensive open-source, data-driven analytics in a new way. The research focuses on learning interventions: it uses nine publicly accessible datasets to understand and mitigate the factors contributing to learning loss and to assess practical learning-recovery measures in Texas public school districts after the recent school closures. The data came from the 2010 Census Bureau, USAFACTS, the Texas Department of State Health Services (DSHS), the National Center for Education Statistics (CCD), the US Bureau of Labor Statistics (LAUS), and three sources from the Texas Education Agency (TEA): STAAR, ADA, and ESSER. We demonstrate a novel data-driven approach to discovering insights from an extensive collection of heterogeneous public data sources. For the pandemic school-closure period, the mode of instruction and the prior score emerged as the primary resilience factors in the learning-recovery intervention. Grade level and census community income level are the most influential factors in predicting learning loss for both Math and Reading. We demonstrate that unbiased, data-driven analysis at a larger scale can offer policymakers an actionable understanding of how to identify learning-loss tendencies in public schools and prevent them.
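The record contains no code; the following is a minimal sketch, in the spirit of the abstract and keywords (gradient boosting, feature selection on noisy tabular data), of how feature importance for a learning-loss outcome could be inspected on merged district-level tables. All data, column names (prior_score, grade_level, median_income, pct_remote), and coefficients are synthetic assumptions for illustration, not the authors' pipeline or datasets; it assumes NumPy, pandas, and scikit-learn.

    # Sketch only: synthetic district-level table standing in for the merged public datasets.
    import numpy as np
    import pandas as pd
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 500  # hypothetical number of district-grade records

    # Stand-ins for the kinds of features the abstract names:
    # prior score, grade level, community income level, mode of instruction.
    districts = pd.DataFrame({
        "prior_score": rng.normal(1500, 120, n),          # prior assessment scale score
        "grade_level": rng.integers(3, 9, n),             # grades 3-8
        "median_income": rng.normal(55_000, 15_000, n),   # census income proxy
        "pct_remote": rng.uniform(0, 1, n),               # share of remote instruction
    })

    # Hypothetical outcome: change in scale score across the closure period.
    districts["score_change"] = (
        0.02 * (districts["prior_score"] - 1500)
        - 25 * districts["pct_remote"]
        + 0.0003 * (districts["median_income"] - 55_000)
        - 2 * districts["grade_level"]
        + rng.normal(0, 10, n)
    )

    features = ["prior_score", "grade_level", "median_income", "pct_remote"]
    X_train, X_test, y_train, y_test = train_test_split(
        districts[features], districts["score_change"], test_size=0.2, random_state=0
    )

    # Gradient boosting on noisy tabular data, then rank the drivers of learning loss.
    model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)
    print("R^2 on held-out records:", round(model.score(X_test, y_test), 3))
    for name, imp in sorted(zip(features, model.feature_importances_), key=lambda t: -t[1]):
        print(f"{name}: {imp:.3f}")

On this synthetic data the importance ranking simply recovers the coefficients used to generate it; the paper applies the same style of inspection to the real merged district-level tables to identify grade level and community income level as the dominant predictors.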
Pages: 7780-7792
Page count: 13