Is staying out of bomb-shelters a human-automation interaction issue?

Cited by: 1
Authors
Cohen-Lazry, Guy [1]
Oron-Gilad, Tal [1]
Affiliations
[1] Ben Gurion Univ Negev, Dept Ind Engn & Management, Human Factors Lab, IL-84105 Beer Sheva, Israel
Keywords
Iron-Dome; Complacency; Trust in automation; Experience; Rocket defense system; TRUST; RELIANCE; WARNINGS; TASK
DOI
10.1016/j.techsoc.2016.08.002
Chinese Library Classification
D58 [Social life and social problems]; C913 [Social life and social problems]
Abstract
"Iron-Dome" is an anti-rocket air defense system placed around major urban areas of Israel. It was created to provide citizens with greater deal of protection against hostile rocket attacks. A study was conducted to examine whether civilians' experience with the "Iron-Dome" system affects people's perceived reliability of it, their trust in it, and their complacency to hostile rocket alerts. During the 2014 Israel-Gaza conflict (operation "Protective Edge"), an online questionnaire was used to measure civilian respondents' perceptions and actions. Results indicated that people living in geographical areas who had more experience with rocket attacks and thereby with the "Iron-Dome" system, perceived it as less reliable, had lower trust in it, and were less complacent. These results show that people's interaction with the "Iron-Dome" corresponds to the common prediction of theoretical models of human-automation interaction. This understanding may assist in planning of implementation programs and guidance of civilians for other mass protection systems in the future. (C) 2016 Elsevier Ltd. All rights reserved.
Pages: 25-30
Number of pages: 6
Related papers (50 records)
  • [1] Introduction to the Special Issue on Human-Automation Interaction in Aerospace Systems
    Feigh, Karen M.
    van Paassen, M. M.
    JOURNAL OF AEROSPACE INFORMATION SYSTEMS, 2015, 12 (07) : 429 - 430
  • [2] Preface to the Special Issue on Advancing Models of Human-Automation Interaction
    Roth, Emilie M.
    Pritchett, Amy R.
    JOURNAL OF COGNITIVE ENGINEERING AND DECISION MAKING, 2018, 12 (01) : 3 - 6
  • [3] Special issue on human-automation coagency
    Inagaki, T.
    COGNITION TECHNOLOGY & WORK, 2012, 14 (01) : 1 - 2
  • [4] Formal verification of human-automation interaction
    Degani, A
    Heymann, M
    HUMAN FACTORS, 2002, 44 (01) : 28 - 43
  • [5] History and future of human-automation interaction
    Janssen, Christian P.
    Donker, Stella F.
    Brumby, Duncan P.
    Kun, Andrew L.
    INTERNATIONAL JOURNAL OF HUMAN-COMPUTER STUDIES, 2019, 131 : 99 - 107
  • [6] Challenges in implementation of human-automation interaction models
    Fereidunian, Alireza
    Lucas, Caro
    Lesani, Hamid
    Lehtonen, Matti
    Nordman, Mikael
    2007 MEDITERRANEAN CONFERENCE ON CONTROL & AUTOMATION, VOLS 1-4, 2007, : 1266 - +
  • [7] Editorial to the virtual Special Issue: Human-automation interaction in the workplace: A broadened scope of paradigms
    Wesche, Jenny S.
    Langer, Markus
    Sonderegger, Andreas
    Landers, Richard N.
    COMPUTERS IN HUMAN BEHAVIOR, 2022, 134
  • [8] Intuitive Cognition and Models of Human-Automation Interaction
    Patterson, Robert Earl
    HUMAN FACTORS, 2017, 59 (01) : 101 - 115
  • [9] The role of social support in human-automation interaction
    Sauer, Juergen
    Sonderegger, Andreas
    Semmer, Norbert K.
    ERGONOMICS, 2024, 67 (06) : 732 - 743
  • [10] Work domain modeling of human-automation interaction for in-vehicle automation
    Zhang, You
    Lintern, Gavan
    COGNITION TECHNOLOGY & WORK, 2024, 26 (04) : 585 - 601