Approaching Human Performance in Behavior Estimation in Couples Therapy Using Deep Sentence Embeddings

Cited by: 4
Authors:
Tseng, Shao-Yen [1 ]
Baucom, Brian [2 ]
Georgiou, Panayiotis [1 ]
Affiliations:
[1] Univ Southern Calif, Los Angeles, CA 90089 USA
[2] Univ Utah, Dept Psychol, Salt Lake City, UT 84112 USA
Keywords:
Behavioral signal processing; natural language processing; sequence-to-sequence learning; recurrent neural network
DOI:
10.21437/Interspeech.2017-1621
CLC classification:
TP18 [Artificial Intelligence Theory]
Discipline codes:
081104; 0812; 0835; 1405
Abstract:
Identifying complex behavior in human interactions for observational studies often involves the tedious process of transcribing and annotating large amounts of data. While there is significant work towards accurate transcription in Automatic Speech Recognition, automatic Natural Language Understanding of high-level human behaviors from the transcribed text is still at an early stage of development. In this paper we present a novel approach for modeling human behavior using sentence embeddings and propose an automatic behavior annotation framework. We explore unsupervised methods, based on seq2seq models, for distilling semantic information into deep sentence embeddings, and demonstrate that these embeddings capture behaviorally meaningful information. Our proposed framework uses LSTM recurrent neural networks to estimate behavior trajectories from these sentence embeddings. Finally, we employ fusion to compare our high-resolution behavioral trajectories with the coarse, session-level behavioral ratings of human annotators in Couples Therapy. Our experiments show that behavior annotation using this framework achieves better results than prior methods and approaches or exceeds human performance in terms of annotator agreement.
Pages: 3291-3295
Page count: 5
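
The abstract describes a two-stage pipeline: a seq2seq-style encoder first maps each sentence to a fixed-length embedding, and an LSTM then maps the session's sequence of sentence embeddings to a behavior trajectory that can be fused into a session-level rating. Below is a minimal PyTorch sketch of that data flow; the module names, dimensions, scalar regression head, and mean-pooling fusion are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of the two-stage pipeline outlined in the abstract:
# (1) encoder half of a seq2seq model producing deep sentence embeddings,
# (2) an LSTM estimating a per-sentence behavior trajectory from those embeddings.
# All hyperparameters and the fusion step are assumptions for illustration only.
import torch
import torch.nn as nn


class SentenceEncoder(nn.Module):
    """Encoder of a seq2seq model; the final hidden state serves as the sentence embedding."""

    def __init__(self, vocab_size: int, embed_dim: int = 300, hidden_dim: int = 512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.LSTM(embed_dim, hidden_dim, batch_first=True)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        # token_ids: (num_sentences, max_sentence_len)
        _, (h_n, _) = self.rnn(self.embed(token_ids))
        return h_n[-1]  # (num_sentences, hidden_dim) sentence embeddings


class BehaviorTrajectoryLSTM(nn.Module):
    """LSTM over a session's sentence embeddings, emitting one behavior estimate per sentence."""

    def __init__(self, embedding_dim: int = 512, hidden_dim: int = 128):
        super().__init__()
        self.rnn = nn.LSTM(embedding_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, 1)  # assumed scalar rating for a single behavior code

    def forward(self, sentence_embeddings: torch.Tensor) -> torch.Tensor:
        # sentence_embeddings: (batch, num_sentences, embedding_dim)
        outputs, _ = self.rnn(sentence_embeddings)
        return self.head(outputs).squeeze(-1)  # (batch, num_sentences) behavior trajectory


if __name__ == "__main__":
    encoder = SentenceEncoder(vocab_size=10_000)
    estimator = BehaviorTrajectoryLSTM()
    # One toy session: 12 sentences of up to 20 tokens each.
    sentences = torch.randint(0, 10_000, (12, 20))
    embeddings = encoder(sentences).unsqueeze(0)  # (1, 12, 512)
    trajectory = estimator(embeddings)            # (1, 12) per-sentence estimates
    session_level = trajectory.mean(dim=1)        # crude fusion to a session-level rating
    print(trajectory.shape, session_level.shape)
```

The toy main block only wires randomly initialized modules together to show tensor shapes and data flow; the unsupervised seq2seq training of the encoder and the fusion scheme used to compare against session-level annotator ratings are not reproduced here.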