Poster: Attempts on detecting Alzheimer's disease by fine-tuning pre-trained model with Gaze Data

Cited: 0
|
Authors
Nagasawa, Junichi [1 ,2 ]
Nakata, Yuichi [1 ,2 ]
Hiroe, Mamoru [1 ,3 ]
Zheng, Yujia [1 ]
Kawaguchi, Yutaka [1 ]
Maegawa, Yuji [1 ]
Hojo, Naoki [1 ]
Takiguchi, Tetsuya [1 ]
Nakayama, Minoru [4 ]
Uchimura, Maki [1 ]
Sonoda, Yuma [1 ]
Kowa, Hisatomo [1 ]
Nagamatsu, Takashi [1 ]
Affiliations
[1] Kobe Univ, Kobe, Hyogo, Japan
[2] Kwansei Gakuin Univ, Sanda, Japan
[3] Osaka Seikei Univ, Osaka, Japan
[4] Tokyo Inst Technol, Tokyo, Japan
Keywords
Alzheimer's disease; Antisaccade; Eye movement classifier; Fine-tuning;
DOI
10.1145/3649902.3655650
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
This poster presents a study on detecting Alzheimer's disease (AD) using deep learning from gaze data. In this study, we modify an existing pre-trained deep neural network model, gazeNet, for transfer learning. The results suggest the possibility of applying this method to mild cognitive impairment screening tests.
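The abstract's transfer-learning setup can be illustrated with a minimal sketch. This is not the authors' actual pipeline or the real gazeNet architecture: the frozen "pre-trained" layer, the toy gaze features, and the synthetic labels below are all stand-ins, assumed purely for illustration of the general idea (freeze the pre-trained feature extractor, train only a new classification head on the target data).

```python
import numpy as np

# Hypothetical sketch of fine-tuning for a screening classifier.
# W_pre stands in for a pre-trained feature extractor's learned
# weights (e.g. gazeNet's layers); it is kept frozen, and only a
# new binary head is trained on the (synthetic) gaze data.

rng = np.random.default_rng(0)

W_pre = rng.normal(size=(4, 8))  # frozen "pre-trained" weights

def extract(x):
    """Frozen feature extractor: raw gaze features -> embeddings."""
    return np.tanh(x @ W_pre)

# Toy stand-in data: 200 samples of 4 gaze-derived features,
# with synthetic binary labels (not real patient data).
X = rng.normal(size=(200, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

# New classification head: logistic regression, trained by
# gradient descent; gradients never touch W_pre.
w = np.zeros(8)
b = 0.0
lr = 0.5
for _ in range(300):
    Z = extract(X)
    p = 1.0 / (1.0 + np.exp(-(Z @ w + b)))   # sigmoid output
    grad_w = Z.T @ (p - y) / len(y)          # head-only gradient
    grad_b = np.mean(p - y)
    w -= lr * grad_w
    b -= lr * grad_b

p_final = 1.0 / (1.0 + np.exp(-(extract(X) @ w + b)))
acc = np.mean((p_final > 0.5) == (y == 1))
print(f"training accuracy of the fine-tuned head: {acc:.2f}")
```

In a real setting the frozen extractor would be the pre-trained network's convolutional/recurrent layers, and only the final layers would receive gradient updates, which is the usual way to adapt a model when the target dataset (here, clinical gaze recordings) is small.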
Pages: 3
Related Papers
50 records in total
  • [1] Attempts on detecting Alzheimer's disease by fine-tuning pre-trained model with Gaze Data
    Nagasawa, Junichi
    Nakata, Yuichi
    Hiroe, Mamoru
    Zheng, Yujia
    Kawaguchi, Yutaka
    Maegawa, Yuji
    Hojo, Naoki
    Takiguchi, Tetsuya
    Nakayama, Minoru
    Uchimura, Maki
    Sonoda, Yuma
    Kowa, Hisatomo
    Nagamatsu, Takashi
    PROCEEDINGS OF THE 2024 ACM SYMPOSIUM ON EYE TRACKING RESEARCH & APPLICATIONS, ETRA 2024, 2024,
  • [2] Fine-Tuning Pre-Trained Language Models with Gaze Supervision
    Deng, Shuwen
    Prasse, Paul
    Reich, David R.
    Scheffer, Tobias
Jäger, Lena A.
    PROCEEDINGS OF THE 62ND ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, VOL 2: SHORT PAPERS, 2024, : 217 - 224
  • [3] Disfluencies and Fine-Tuning Pre-trained Language Models for Detection of Alzheimer's Disease
    Yuan, Jiahong
    Bian, Yuchen
    Cai, Xingyu
    Huang, Jiaji
    Ye, Zheng
    Church, Kenneth
    INTERSPEECH 2020, 2020, : 2162 - 2166
  • [4] Enhancing Alzheimer's Disease Classification with Transfer Learning: Fine-tuning a Pre-trained Algorithm
    Boudi, Abdelmounim
    He, Jingfei
    Abd El Kader, Isselmou
    CURRENT MEDICAL IMAGING, 2024,
  • [5] Pruning Pre-trained Language Models Without Fine-Tuning
    Jiang, Ting
    Wang, Deqing
    Zhuang, Fuzhen
    Xie, Ruobing
    Xia, Feng
    PROCEEDINGS OF THE 61ST ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL 2023, VOL 1, 2023, : 594 - 605
  • [6] Span Fine-tuning for Pre-trained Language Models
    Bao, Rongzhou
    Zhang, Zhuosheng
    Zhao, Hai
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2021, 2021, : 1970 - 1979
  • [7] Overcoming Catastrophic Forgetting for Fine-Tuning Pre-trained GANs
    Zhang, Zeren
    Li, Xingjian
    Hong, Tao
    Wang, Tianyang
    Ma, Jinwen
    Xiong, Haoyi
    Xu, Cheng-Zhong
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES: RESEARCH TRACK, ECML PKDD 2023, PT V, 2023, 14173 : 293 - 308
  • [8] Waste Classification by Fine-Tuning Pre-trained CNN and GAN
    Alsabei, Amani
    Alsayed, Ashwaq
    Alzahrani, Manar
    Al-Shareef, Sarah
    INTERNATIONAL JOURNAL OF COMPUTER SCIENCE AND NETWORK SECURITY, 2021, 21 (08): : 65 - 70
  • [9] Fine-tuning pre-trained voice conversion model for adding new target speakers with limited data
    Koshizuka, Takeshi
    Ohmura, Hidefumi
    Katsurada, Kouichi
    INTERSPEECH 2021, 2021, : 1339 - 1343
  • [10] Virtual Data Augmentation: A Robust and General Framework for Fine-tuning Pre-trained Models
    Zhou, Kun
    Zhao, Wayne Xin
    Wang, Sirui
    Zhang, Fuzheng
    Wu, Wei
Wen, Ji-Rong
    2021 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2021), 2021, : 3875 - 3887