Poster: Attempts on detecting Alzheimer's disease by fine-tuning pre-trained model with Gaze Data

Cited by: 0
Authors
Nagasawa, Junichi [1 ,2 ]
Nakata, Yuichi [1 ,2 ]
Hiroe, Mamoru [1 ,3 ]
Zheng, Yujia [1 ]
Kawaguchi, Yutaka [1 ]
Maegawa, Yuji [1 ]
Hojo, Naoki [1 ]
Takiguchi, Tetsuya [1 ]
Nakayama, Minoru [4 ]
Uchimura, Maki [1 ]
Sonoda, Yuma [1 ]
Kowa, Hisatomo [1 ]
Nagamatsu, Takashi [1 ]
Affiliations
[1] Kobe Univ, Kobe, Hyogo, Japan
[2] Kwansei Gakuin Univ, Sanda, Japan
[3] Osaka Seikei Univ, Osaka, Japan
[4] Tokyo Inst Technol, Tokyo, Japan
Keywords
Alzheimer's disease; Antisaccade; Eye movement classifier; Fine-tuning
DOI
10.1145/3649902.3655650
Chinese Library Classification
TP18 [Artificial intelligence theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This poster presents a study on detecting Alzheimer's disease (AD) using deep learning from gaze data. In this study, we modify an existing pre-trained deep neural network model, gazeNet, for transfer learning. The results suggest the possibility of applying this method to mild cognitive impairment screening tests.
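The abstract describes adapting a pre-trained gaze-event model (gazeNet) via transfer learning for AD detection. As an illustration only, since the paper's actual architecture and training setup are not given here, the following toy NumPy sketch shows the general frozen-encoder/new-head pattern such fine-tuning follows; the names (`FrozenEncoder`, `train_head`) and the synthetic data are hypothetical, not from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

class FrozenEncoder:
    """Stand-in for a pre-trained feature extractor: a fixed random
    projection whose weights are never updated during fine-tuning."""
    def __init__(self, in_dim, feat_dim):
        self.W = rng.standard_normal((in_dim, feat_dim)) / np.sqrt(in_dim)

    def __call__(self, x):
        return np.tanh(x @ self.W)  # frozen: self.W is not trained

def train_head(features, labels, lr=0.5, epochs=200):
    """Fit a new binary classification head (e.g. patient vs. control)
    on the frozen features by logistic-regression gradient descent."""
    w = np.zeros(features.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(features @ w + b)))  # sigmoid
        grad = p - labels                              # dLoss/dlogit
        w -= lr * features.T @ grad / len(labels)
        b -= lr * grad.mean()
    return w, b

# Toy "gaze feature" data: two classes separated along dimension 0.
X = rng.standard_normal((200, 16))
y = (X[:, 0] > 0).astype(float)

enc = FrozenEncoder(16, 8)          # pretend-pretrained encoder
w, b = train_head(enc(X), y)        # only the new head is trained
preds = (enc(X) @ w + b > 0).astype(float)
print("train accuracy:", (preds == y).mean())
```

Training only a new head on frozen features (rather than updating the whole network) is the cheapest form of transfer learning and is often the first baseline tried when the target dataset, as in clinical gaze studies, is small.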
Pages: 3