Sequence-to-Sequence Language Models for Character and Emotion Detection in Dream Narratives

Authors
Cortal, Gustave [1 ,2 ]
Affiliations
[1] Univ Paris Saclay, ENS Paris Saclay, CNRS, LMF, F-91190 Gif Sur Yvette, France
[2] Univ Paris Saclay, CNRS, LISN, F-91400 Orsay, France
Source
TRAITEMENT AUTOMATIQUE DES LANGUES | 2024, Vol. 65, No. 1
Keywords
Emotion detection; Language model; Quantitative analysis of dreams; Sleep
DOI
Not available
Chinese Library Classification
H0 [Linguistics]
Discipline classification codes
030303; 0501; 050102
Abstract
Analyzing dreams quantitatively depends on labor-intensive manual annotation of dream narratives. We automate this process through a natural language sequence-to-sequence generation framework. This paper presents the first study on character and emotion detection in the English portion of the open DreamBank corpus of dream narratives. We evaluate the impact of model size, the prediction order of characters, and the consideration of proper names and character traits. Our model and its generated annotations are publicly available.
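Framing annotation as sequence-to-sequence generation requires linearizing the structured labels into a flat target string that a text-to-text model can emit. The sketch below illustrates this idea with a hypothetical serialization format; the separator tokens, label names, and example annotation are illustrative assumptions, not the paper's actual annotation scheme.

```python
# Hypothetical sketch: linearizing character/emotion annotations into a
# target sequence for seq2seq training. A text-to-text model is then trained
# on (narrative, target) pairs and its decoded output is parsed back into
# structured annotations. Format and labels here are illustrative only.

def build_target(annotations: list[dict]) -> str:
    """Serialize a list of {"character": ..., "emotion": ...} dicts
    into a single flat target string."""
    parts = [
        f"character: {ann['character']} | emotion: {ann['emotion']}"
        for ann in annotations
    ]
    return " ; ".join(parts)

def parse_target(target: str) -> list[dict]:
    """Invert build_target: recover structured annotations from the
    generated sequence (assumes the model emitted a well-formed string)."""
    annotations = []
    for chunk in target.split(" ; "):
        if not chunk:
            continue
        char_part, emo_part = chunk.split(" | ")
        annotations.append({
            "character": char_part.removeprefix("character: "),
            "emotion": emo_part.removeprefix("emotion: "),
        })
    return annotations

narrative = "I was walking with my mother and suddenly felt afraid."
annotations = [{"character": "mother", "emotion": "apprehension"}]
target = build_target(annotations)
```

In practice the round trip `parse_target(build_target(x)) == x` must hold for the training data, so that decoded model outputs can be scored against gold annotations.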
Pages
11-35 (25 pages)