CreativeBot: a Creative Storyteller Agent Developed by Leveraging Pre-trained Language Models

Cited by: 2
Authors
Elgarf, Maha [1 ]
Peters, Christopher [1 ]
Affiliation
[1] KTH Royal Inst Technol, Stockholm, Sweden
DOI
10.1109/IROS47612.2022.9981033
CLC Classification Number
TP [Automation technology; computer technology];
Subject Classification Code
0812 ;
Abstract
In an attempt to nurture children's creativity, we developed a creative conversational agent for use in a collaborative storytelling context with a child. We present a novel approach to developing creative Artificial Intelligence (AI). Our approach uses four creativity measures (fluency, flexibility, elaboration, and originality) to generate creative behavior. We analyzed and annotated our previously collected storytelling datasets (collected with children) according to these four creativity measures. We then used the extracted and annotated data (636 statements) to fine-tune two pre-trained language models (OpenAI GPT-3), aimed at generating creative versus non-creative behavior in a collaborative storytelling scenario, respectively. We developed both models so that we could assess and compare their results. We conducted an evaluation in which adult users (n = 26) rated the creativity of stories generated collaboratively between a human and each agent separately. Results showed that the creative agent was perceived as significantly more creative than the non-creative agent. With the experiment results confirming the validity of our system, we may therefore proceed to test the effects of the agent's creative behavior on children's creativity skills.
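The abstract describes annotating storytelling statements with four creativity measures and then fine-tuning two GPT-3 models (creative vs. non-creative) on the resulting data. A minimal sketch of how such annotated statements might be split and serialized into the JSONL prompt/completion format used for GPT-3 fine-tuning is shown below; the example data, labels, and the score threshold are hypothetical, not taken from the paper.

```python
import json

# Hypothetical annotated statements: each pairs a story context with a
# response labeled by the four creativity measures (1 = present, 0 = absent).
annotated_statements = [
    {
        "context": "Once upon a time, a fox found a glowing stone.",
        "response": "The stone whispered riddles that only backwards-walkers could solve!",
        "labels": {"fluency": 1, "flexibility": 1, "elaboration": 1, "originality": 1},
    },
    {
        "context": "Once upon a time, a fox found a glowing stone.",
        "response": "The fox picked it up and went home.",
        "labels": {"fluency": 1, "flexibility": 0, "elaboration": 0, "originality": 0},
    },
]

def to_finetune_records(statements, creative=True):
    """Select statements matching the desired (non-)creative profile and
    format them as prompt/completion pairs for one of the two models."""
    records = []
    for s in statements:
        is_creative = sum(s["labels"].values()) >= 3  # illustrative threshold
        if is_creative == creative:
            records.append({
                # "###" separator and trailing stop token follow the common
                # GPT-3 fine-tuning data conventions.
                "prompt": s["context"] + "\n\n###\n\n",
                "completion": " " + s["response"] + " END",
            })
    return records

# One JSONL line per training example for the "creative" model.
creative_jsonl = "\n".join(json.dumps(r) for r in to_finetune_records(annotated_statements, creative=True))
print(creative_jsonl)
```

Splitting on the aggregate label yields two disjoint training sets, so the two fine-tuned models can be compared under otherwise identical conditions, as in the paper's evaluation.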
Pages: 13438-13444 (7 pages)