Rule-based Natural Language Processing Approach to Detect Delirium on a Pre-Trained Deep Learning Model Framework

Cited: 0
Authors
Munoz, Ricardo [1 ]
Hua, Yining [1 ]
Seibold, Eva-Lotte [2 ]
Ahrens, Elena
Redaelli, Simone [2 ]
Suleiman, Aiman [2 ]
von Wedel, Dario [2 ]
Ashrafian, Sarah [1 ]
Chen, Guanqing [1 ]
Schaefer, Maximilian [1 ]
Ma, Haobo [1 ]
Affiliations
[1] Beth Israel Deaconess Med Ctr, Boston, MA USA
[2] Harvard Med Sch, Beth Israel Deaconess Med Ctr, Boston, MA USA
Source
ANESTHESIA AND ANALGESIA, 2023, Vol. 136
Keywords
DOI
Not available
Chinese Library Classification
R614 [Anesthesiology];
Discipline code
100217;
Abstract
Pages: 1028-1030
Page count: 3
Related papers
(50 total)
  • [1] Integrating Pre-trained Model into Rule-based Dialogue Management
    Quan, Jun
    Yang, Meng
    Gan, Qiang
    Xiong, Deyi
    Liu, Yiming
    Dong, Yuchen
    Ouyang, Fangxin
    Tian, Jun
    Deng, Ruiling
    Li, Yongzhi
    Yang, Yang
    Jiang, Daxin
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 16097 - 16099
  • [2] A Study of Pre-trained Language Models in Natural Language Processing
    Duan, Jiajia
    Zhao, Hui
    Zhou, Qian
    Qiu, Meikang
    Liu, Meiqin
    2020 IEEE INTERNATIONAL CONFERENCE ON SMART CLOUD (SMARTCLOUD 2020), 2020, : 116 - 121
  • [3] Pre-trained models for natural language processing: A survey
    Qiu XiPeng
    Sun TianXiang
    Xu YiGe
    Shao YunFan
    Dai Ning
    Huang XuanJing
    SCIENCE CHINA-TECHNOLOGICAL SCIENCES, 2020, 63 (10) : 1872 - 1897
  • [7] A pre-trained BERT for Korean medical natural language processing
    Kim, Yoojoong
    Kim, Jong-Ho
    Lee, Jeong Moon
    Jang, Moon Joung
    Yum, Yun Jin
    Kim, Seongtae
    Shin, Unsub
    Kim, Young-Min
    Joo, Hyung Joon
    Song, Sanghoun
    SCIENTIFIC REPORTS, 2022, 12 (01)
  • [9] Revisiting Pre-trained Models for Chinese Natural Language Processing
    Cui, Yiming
    Che, Wanxiang
    Liu, Ting
    Qin, Bing
    Wang, Shijin
    Hu, Guoping
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2020, 2020, : 657 - 668
  • [10] Pre-Trained Language Model-Based Deep Learning for Sentiment Classification of Vietnamese Feedback
    Loc, Cu Vinh
    Viet, Truong Xuan
    Viet, Tran Hoang
    Thao, Le Hoang
    Viet, Nguyen Hoang
    INTERNATIONAL JOURNAL OF COMPUTATIONAL INTELLIGENCE AND APPLICATIONS, 2023, 22 (03)