Poster: Attempts on detecting Alzheimer's disease by fine-tuning pre-trained model with Gaze Data

Cited: 0
Authors
Nagasawa, Junichi [1 ,2 ]
Nakata, Yuichi [1 ,2 ]
Hiroe, Mamoru [1 ,3 ]
Zheng, Yujia [1 ]
Kawaguchi, Yutaka [1 ]
Maegawa, Yuji [1 ]
Hojo, Naoki [1 ]
Takiguchi, Tetsuya [1 ]
Nakayama, Minoru [4 ]
Uchimura, Maki [1 ]
Sonoda, Yuma [1 ]
Kowa, Hisatomo [1 ]
Nagamatsu, Takashi [1 ]
Affiliations
[1] Kobe Univ, Kobe, Hyogo, Japan
[2] Kwansei Gakuin Univ, Sanda, Japan
[3] Osaka Seikei Univ, Osaka, Japan
[4] Tokyo Inst Technol, Tokyo, Japan
Keywords
Alzheimer's disease; Antisaccade; Eye movement classifier; Fine-tuning
DOI
10.1145/3649902.3655650
CLC number
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
This poster presents a study on detecting Alzheimer's disease (AD) using deep learning from gaze data. In this study, we modify an existing pre-trained deep neural network model, gazeNet, for transfer learning. The results suggest the possibility of applying this method to mild cognitive impairment screening tests.
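The abstract gives no implementation details, so the sketch below is only a rough illustration of the kind of transfer-learning setup it describes: load a pre-trained gaze-sequence encoder, freeze its weights, and train a new binary (AD vs. control) classification head on gaze data. The encoder architecture, checkpoint name, feature size, and input shapes are all hypothetical placeholders, not gazeNet's actual API.

    # Hypothetical transfer-learning sketch in PyTorch; names and shapes are
    # illustrative assumptions, not the poster's actual code or gazeNet's API.
    import torch
    import torch.nn as nn

    class GazeEncoder(nn.Module):
        """Stand-in for a pre-trained gaze-sequence encoder (e.g., a 1-D CNN)."""
        def __init__(self, in_channels=2, feat_dim=128):
            super().__init__()
            self.conv = nn.Sequential(
                nn.Conv1d(in_channels, 64, kernel_size=5, padding=2),
                nn.ReLU(),
                nn.Conv1d(64, feat_dim, kernel_size=5, padding=2),
                nn.ReLU(),
                nn.AdaptiveAvgPool1d(1),  # pool over time -> fixed-size feature
            )

        def forward(self, x):                 # x: (batch, channels, time)
            return self.conv(x).squeeze(-1)   # -> (batch, feat_dim)

    encoder = GazeEncoder()
    # encoder.load_state_dict(torch.load("pretrained_gaze_encoder.pt"))  # hypothetical checkpoint

    # Freeze the pre-trained weights; only the new head is trained.
    for p in encoder.parameters():
        p.requires_grad = False

    head = nn.Linear(128, 2)                  # AD vs. healthy control
    model = nn.Sequential(encoder, head)

    optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)
    criterion = nn.CrossEntropyLoss()

    # Toy batch: 8 recordings of (x, y) gaze coordinates over 500 samples.
    x = torch.randn(8, 2, 500)
    y = torch.randint(0, 2, (8,))

    logits = model(x)                         # one training step on the head
    loss = criterion(logits, y)
    loss.backward()
    optimizer.step()

A common variant is to unfreeze the encoder and fine-tune all layers at a lower learning rate; the poster does not specify which strategy was used.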
Pages: 3