Effects of Proactive Dialogue Strategies on Human-Computer Trust

Cited: 32
Authors
Kraus, Matthias [1]
Wagner, Nicolas [1]
Minker, Wolfgang [1]
Affiliations
[1] Ulm Univ, Ulm, Germany
Keywords
Proactivity; Intelligent Assistant; Human-Robot-Interaction; Human-Computer Trust; Spoken Dialogue System; COGNITIVE LOAD THEORY; BEHAVIOR; AUTOMATION; ATTITUDES; ROBOTS; TASK
DOI
10.1145/3340631.3394840
Chinese Library Classification (CLC)
TP3 [Computing Technology; Computer Technology]
Subject Classification Code
0812
Abstract
Intelligent computer systems aim to provide user assistance for challenging tasks such as decision-making, planning, or learning. To offer optimal assistance, such systems must decide when to be reactive or proactive and how active system behaviour should be designed, especially since this decision may greatly influence the user's trust in the system. We therefore conducted a mixed-factorial study examining how subjects performing a planning task trust different levels of proactivity (none, notification, suggestion, and intervention) and different timing strategies (fixed-timing and insecurity-based). The results showed that proactive system behaviour is perceived as trustworthy in insecure situations regardless of timing. However, proactive dialogue had strong effects on cognition-based trust (the system's perceived competence and reliability) depending on task difficulty. Furthermore, fully autonomous system behaviour fails to establish an adequate human-computer trust relationship, in contrast to conservative strategies.
Pages: 107-116
Page count: 10
Related Papers (showing 10 of 50)
  • [1] A human-computer debating system prototype and its dialogue strategies
    Yuan, Tangming; Moore, David; Grierson, Alec
    INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, 2007, 22(01): 133-156
  • [2] Recent Advances on Human-Computer Dialogue
    Wang, Xiaojie; Yuan, Caixia
    CAAI TRANSACTIONS ON INTELLIGENCE TECHNOLOGY, 2016, 1(04): 303+
  • [3] A GRAMMATICAL SPECIFICATION OF HUMAN-COMPUTER DIALOGUE
    NYMEYER, A
    COMPUTER LANGUAGES, 1995, 21(01): 1-16
  • [4] EFFECTS OF SELF-DISCLOSURE AND EMPATHY IN HUMAN-COMPUTER DIALOGUE
    Higashinaka, Ryuichiro; Dohsaka, Kohji; Isozaki, Hideki
    2008 IEEE WORKSHOP ON SPOKEN LANGUAGE TECHNOLOGY: SLT 2008, PROCEEDINGS, 2008: 109-112
  • [5] Human-computer dialogue breakdown detection
    de Andrade, Leonardo; Paraboni, Ivandre
    LINGUAMATICA, 2022, 14(01): 17-31
  • [6] The human-computer dialogue in learning environments
    Tarby, JC
    COMPUTER APPLICATIONS IN ENGINEERING EDUCATION, 1997, 5(01): 29-39
  • [7] Informal logic dialogue games in human-computer dialogue
    Yuan, Tangming; Moore, David; Reed, Chris; Ravenscroft, Andrew; Maudet, Nicolas
    KNOWLEDGE ENGINEERING REVIEW, 2011, 26(02): 159-174
  • [8] Conceptual Effects of Audience Design in Human-Computer and Human-Human Dialogue
    Schmader, Christopher; Horton, William S.
    DISCOURSE PROCESSES, 2019, 56(02): 170-190
  • [9] Proactive Human-Computer Collaboration for Information Discovery
    DiBona, Phil; Shilliday, Andrew; Barry, Kevin
    NEXT-GENERATION ANALYST IV, 2016, 9851
  • [10] Effective Speaker Tracking Strategies for Multi-party Human-Computer Dialogue
    Popescu, Vladimir; Burileanu, Corneliu; Caelen, Jean
    INTELLIGENT SYSTEMS AND TECHNOLOGIES: METHODS AND APPLICATIONS, 2009, 217: 193+