Domain-aware Neural Model for Sequence Labeling using Joint Learning

Cited: 0
Authors
Huang, Heng [1 ]
Yan, Yuliang [1 ]
Liu, Xiaozhong [2 ]
Affiliations
[1] Alibaba Grp, Hangzhou, Zhejiang, Peoples R China
[2] Indiana Univ, Bloomington, IN 47405 USA
Keywords
DOI
10.1145/3308558.3313566
Chinese Library Classification (CLC)
TP301 [Theory, Methods];
Discipline code
081202;
Abstract
Recently, scholars have demonstrated empirical successes of deep learning in sequence labeling, and most prior work has focused on word representations inside the target sentence. Unfortunately, global information, e.g., the domain information of the target document, was ignored in previous studies. In this paper, we propose an innovative joint-learning neural network that encapsulates global domain knowledge and local sentence/token information to enhance the sequence labeling model. Unlike existing studies, the proposed method employs the domain labeling output as latent evidence to facilitate the tagging model, and this joint embedding information is generated by an enhanced highway network. Meanwhile, a redesigned CRF layer is deployed to bridge the 'local output labels' and the 'global domain information'. The different kinds of information iteratively contribute to each other; moreover, domain knowledge can be learnt in either a supervised or an unsupervised environment via the new model. Experiments on multiple data sets show that the proposed algorithm outperforms classical and recent state-of-the-art labeling methods.
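The paper's own implementation is not part of this record. As a rough illustration of the gating idea the abstract describes (a highway unit fusing each token's representation with a document-level domain embedding before it reaches the tagging layer), here is a minimal NumPy sketch; all names, dimensions, and weight shapes are made up for the example and are not taken from the paper:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def highway_fuse(token_vec, domain_vec, W_h, b_h, W_t, b_t):
    """Highway-style fusion: y = g * H([token; domain]) + (1 - g) * token,
    where H is a learned transform and g a transform gate in (0, 1)."""
    x = np.concatenate([token_vec, domain_vec])
    h = np.tanh(W_h @ x + b_h)   # candidate fused representation H(x)
    g = sigmoid(W_t @ x + b_t)   # transform gate
    return g * h + (1.0 - g) * token_vec  # carry path keeps the token embedding

# Toy dimensions: 4-d token embedding, 3-d domain embedding.
rng = np.random.default_rng(0)
d, k = 4, 3
token = rng.normal(size=d)
domain = rng.normal(size=k)
W_h, W_t = rng.normal(size=(d, d + k)), rng.normal(size=(d, d + k))
b_h, b_t = np.zeros(d), np.zeros(d)

fused = highway_fuse(token, domain, W_h, b_h, W_t, b_t)
print(fused.shape)  # (4,)
```

The carry path means that when the gate closes, the unit falls back to the plain token embedding, so adding domain information cannot be worse than ignoring it; the fused vectors would then feed the (redesigned) CRF layer described in the abstract.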
Pages: 2837-2843
Page count: 7
Related papers
50 records in total
  • [1] Domain-Aware Multiagent Reinforcement Learning in Navigation
    Saeed, Ifrah
    Cullen, Andrew C.
    Erfani, Sarah
    Alpcan, Tansu
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,
  • [2] Feature Stylization and Domain-aware Contrastive Learning for Domain Generalization
    Jeon, Seogkyu
    Hong, Kibeom
    Lee, Pilhyeon
    Lee, Jewook
    Byun, Hyeran
    PROCEEDINGS OF THE 29TH ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2021, 2021, : 22 - 31
  • [3] Using domain-aware machine learning to forecast aqueous iodine reactions
    Bilbrey, Jenna
    Marrero, Carlos Ortiz
    Schram, Malachi
    Rallo, Robert
    ABSTRACTS OF PAPERS OF THE AMERICAN CHEMICAL SOCIETY, 2019, 258
  • [4] DOMAIN-AWARE NEURAL LANGUAGE MODELS FOR SPEECH RECOGNITION
    Liu, Linda
    Gu, Yile
    Gourav, Aditya
    Gandhe, Ankur
    Kalmane, Shashank
    Filimonov, Denis
    Rastrow, Ariya
    Bulyko, Ivan
    2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 7373 - 7377
  • [5] Learning Domain-Aware Detection Head with Prompt Tuning
    Li, Haochen
    Zhang, Rui
    Yao, Hantao
    Song, Xinkai
    Hao, Yifan
    Zhao, Yongwei
    Li, Ling
    Chen, Yunji
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [6] Relation Extraction via Domain-aware Transfer Learning
    Di, Shimin
    Shen, Yanyan
    Chen, Lei
    KDD'19: PROCEEDINGS OF THE 25TH ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, 2019, : 1348 - 1357
  • [7] DOMINO: Domain-aware loss for deep learning calibration
    Stolte, Skylar E.
    Volle, Kyle
    Indahlastari, Aprinda
    Albizu, Alejandro
    Woods, Adam J.
    Brink, Kevin
    Hale, Matthew
    Fang, Ruogu
    SOFTWARE IMPACTS, 2023, 15
  • [8] General Incremental Learning with Domain-aware Categorical Representations
    Xie, Jiangwei
    Yan, Shipeng
    He, Xuming
    2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2022, : 14331 - 14340
  • [9] Biometric spoofing detection by a Domain-aware Convolutional Neural Network
    Gragnaniello, Diego
    Poggi, Giovanni
    Sansone, Carlo
    Verdoliva, Luisa
    2016 12TH INTERNATIONAL CONFERENCE ON SIGNAL-IMAGE TECHNOLOGY & INTERNET-BASED SYSTEMS (SITIS), 2016, : 193 - 198
  • [10] Style Augmentation and Domain-Aware Parametric Contrastive Learning for Domain Generalization
    Li, Mingkang
    Zhang, Jiali
    Zhang, Wen
    Gong, Lu
    Zhang, Zili
    KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, PT IV, KSEM 2023, 2023, 14120 : 211 - 224