Fine-Tuning BERT for Generative Dialogue Domain Adaptation

Cited by: 0
Authors
Labruna, Tiziano [1 ,2 ]
Magnini, Bernardo [1 ]
Affiliations
[1] Fdn Bruno Kessler, Via Sommar 18, Povo, Trento, Italy
[2] Free Univ Bozen Bolzano, Piazza Univ 1, Bozen Bolzano, Italy
Keywords
Dialogue State Tracking; Task-oriented Dialogue; Domain Adaptation
DOI
10.1007/978-3-031-16270-1_42
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Current data-driven Dialogue State Tracking (DST) models exhibit a poor capacity to adapt to domain changes, resulting in a significant degradation in performance. We propose a methodology, called Generative Dialogue Domain Adaptation, which significantly simplifies the creation of training data when a number of changes (e.g., new slot-values or new instances) occur in a domain Knowledge Base. Starting from dialogues for a source domain, we apply generative methods based on language models such as BERT, fine-tuned on task-related data, to generate slot-value substitutions for a target domain. We have experimented with dialogue domain adaptation in a few-shot setting, showing promising results, although the task is still very challenging. We provide a deep analysis of the quality of the generated data and of the features that affect this task, and we emphasise that DST models are very sensitive to the distribution of slot-values in the corpus.
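To make the masked-LM substitution idea in the abstract concrete, the sketch below masks a source-domain slot-value in a dialogue utterance and asks a BERT checkpoint to propose replacement values. This is a minimal illustrative sketch, not the authors' released implementation: the generic bert-base-uncased checkpoint, the example utterance, and the slot-value are assumptions, whereas the paper fine-tunes BERT on task-related data before generating substitutions.

```python
# Minimal sketch of masked-LM slot-value substitution (not the authors' code).
# Assumptions: a generic bert-base-uncased checkpoint stands in for the
# task-fine-tuned BERT described in the paper; the utterance and the
# slot-value are invented for illustration only.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

utterance = "I am looking for a cheap restaurant serving italian food."
source_value = "italian"  # slot-value taken from the source-domain ontology

# Replace the source slot-value with the model's mask token and ask
# BERT for candidate substitutions to use in the target domain.
masked = utterance.replace(source_value, fill_mask.tokenizer.mask_token)

for candidate in fill_mask(masked, top_k=5):
    print(f"{candidate['token_str']}\t{candidate['score']:.3f}")
```

A natural follow-up step (not shown here) would be to constrain or filter the proposed candidates against the target-domain Knowledge Base mentioned in the abstract, so that only valid target-domain slot-values are substituted into the training dialogues.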
Pages: 513-524
Number of pages: 12
Related Papers
50 records in total; 10 shown below
  • [1] Fine-Tuning BERT for Aspect Extraction in Multi-domain ABSA
    Akram, Arwa
    Sabir, Aliea
Informatica (Slovenia), 2023, 47(9): 123-131
  • [2] Fine-tuning adaptation
    Jabri, E
NATURE STRUCTURAL & MOLECULAR BIOLOGY, 2005, 12(5): 394
  • [3] Fast Fine-Tuning using Curriculum Domain Adaptation
    Shen, Lulan
    Amara, Ibtihel
    Li, Ruofeng
    Meyer, Brett
    Gross, Warren
    Clark, James J.
2023 20TH CONFERENCE ON ROBOTS AND VISION (CRV), 2023: 296-303
  • [4] Transfer fine-tuning of BERT with phrasal paraphrases
    Arase, Yuki
    Tsujii, Junichi
COMPUTER SPEECH AND LANGUAGE, 2021, 66
  • [5] Efficient Fine-Tuning of BERT Models on the Edge
    Vucetic, Danilo
    Tayaranian, Mohammadreza
    Ziaeefard, Maryam
    Clark, James J.
    Meyer, Brett H.
    Gross, Warren J.
2022 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS (ISCAS 22), 2022: 1838-1842
  • [6] Transfer Fine-Tuning: A BERT Case Study
    Arase, Yuki
    Tsujii, Junichi
2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019: 5393-5404
  • [7] SPEECH RECOGNITION BY SIMPLY FINE-TUNING BERT
    Huang, Wen-Chin
    Wu, Chia-Hua
    Luo, Shang-Bao
    Chen, Kuan-Yu
    Wang, Hsin-Min
    Toda, Tomoki
2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021: 7343-7347
  • [8] Investigating Learning Dynamics of BERT Fine-Tuning
    Hao, Yaru
    Dong, Li
    Wei, Furu
    Xu, Ke
1ST CONFERENCE OF THE ASIA-PACIFIC CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 10TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (AACL-IJCNLP 2020), 2020: 87-92
  • [9] Patent classification by fine-tuning BERT language model
    Lee, Jieh-Sheng
    Hsiang, Jieh
WORLD PATENT INFORMATION, 2020, 61
  • [10] Dataset Distillation with Attention Labels for Fine-tuning BERT
    Maekawa, Aru
    Kobayashi, Naoki
    Funakoshi, Kotaro
    Okumura, Manabu
61ST CONFERENCE OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL 2023, VOL 2, 2023: 119-127