TaleBrush: Sketching Stories with Generative Pretrained Language Models

Cited by: 57
Authors
Chung, John Joon Young [1 ]
Kim, Wooseok [2 ]
Yoo, Kang Min [3 ]
Lee, Hwaran [3 ]
Adar, Eytan [1 ]
Chang, Minsuk [3 ]
Affiliations
[1] Univ Michigan, Ann Arbor, MI 48109 USA
[2] Korea Adv Inst Sci & Technol, Daejeon, South Korea
[3] Naver AI LAB, Seongnam, South Korea
Keywords
story writing; sketching; creativity support tool; story generation; controlled generation; PLOT;
DOI
10.1145/3491102.3501819
CLC Classification Number
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812 ;
Abstract
While advanced text generation algorithms (e.g., GPT-3) have enabled writers to co-create stories with an AI, guiding the narrative remains a challenge. Existing systems often leverage simple turn-taking between the writer and the AI in story development. However, writers remain unsupported in intuitively understanding the AI's actions or steering the iterative generation. We introduce TaleBrush, a generative story ideation tool that uses line sketching interactions with a GPT-based language model for control and sensemaking of a protagonist's fortune in co-created stories. Our empirical evaluation found our pipeline reliably controls story generation while maintaining the novelty of generated sentences. In a user study with 14 participants with diverse writing experiences, we found participants successfully leveraged sketching to iteratively explore and write stories according to their intentions about the character's fortune while taking inspiration from generated stories. We conclude with a reflection on how sketching interactions can facilitate the iterative human-AI co-creation process.
Pages: 19
Related Papers
50 items total
  • [1] TaleBrush: Visual Sketching of Story Generation with Pretrained Language Models
    Chung, John Joon Young
    Kim, Wooseok
    Yoo, Kang Min
    Lee, Hwaran
    Adar, Eytan
    Chang, Minsuk
    EXTENDED ABSTRACTS OF THE 2022 CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS, CHI 2022, 2022,
  • [2] Constructing Chinese taxonomy trees from understanding and generative pretrained language models
    Guo, Jianyu
    Chen, Jingnan
    Ren, Li
    Zhou, Huanlai
    Xu, Wenbo
    Jia, Haitao
    PEERJ COMPUTER SCIENCE, 2024, 10
  • [3] A Survey of Pretrained Language Models
    Sun, Kaili
    Luo, Xudong
    Luo, Michael Y.
    KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, PT II, 2022, 13369 : 442 - 456
  • [4] A Heideggerian analysis of generative pretrained transformer models
    Floroiu, Iustin
    Timisica, Daniela
    ROMANIAN JOURNAL OF INFORMATION TECHNOLOGY AND AUTOMATIC CONTROL-REVISTA ROMANA DE INFORMATICA SI AUTOMATICA, 2024, 34 (01): : 13 - 22
  • [5] Generating Datasets with Pretrained Language Models
    Schick, Timo
    Schuetze, Hinrich
    2021 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2021), 2021, : 6943 - 6951
  • [6] Geographic Adaptation of Pretrained Language Models
    Hofmann, Valentin
    Glavas, Goran
    Ljubesic, Nikola
    Pierrehumbert, Janet B.
    Schuetze, Hinrich
    TRANSACTIONS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, 2024, 12 : 411 - 431
  • [7] Discourse Probing of Pretrained Language Models
    Koto, Fajri
    Lau, Jey Han
    Baldwin, Timothy
    2021 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL-HLT 2021), 2021, : 3849 - 3864
  • [8] Textually Pretrained Speech Language Models
    Hassid, Michael
    Remez, Tal
    Nguyen, Tu Anh
    Gat, Itai
    Conneau, Alexis
    Kreuk, Felix
    Copet, Jade
    Defossez, Alexandre
    Synnaeve, Gabriel
    Dupoux, Emmanuel
    Schwartz, Roy
    Adi, Yossi
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [9] Investigating Transferability in Pretrained Language Models
    Tamkin, Alex
    Singh, Trisha
    Giovanardi, Davide
    Goodman, Noah
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2020, 2020, : 1393 - 1401
  • [10] Sketching Process Models by Mining Participant Stories
    Ivanchikj, Ana
    Pautasso, Cesare
    BUSINESS PROCESS MANAGEMENT FORUM, BPM FORUM 2019, 2019, 360 : 3 - 19