Visual performance in augmented reality systems for mobile use

Cited by: 7
Authors
Menozzi, M [1 ]
Hofer, F [1 ]
Näpflin, U [1 ]
Krueger, H [1 ]
Institution
[1] Swiss Fed Inst Technol, Inst Hyg & Appl Physiol, Appl Vis Res, CH-8092 Zurich, Switzerland
DOI
10.1207/S15327590IJHC1603_4
Chinese Library Classification (CLC)
TP3 [computing technology; computer technology];
Discipline code
0812;
Abstract
Users of augmented reality (AR) must direct their attention toward real-world as well as artificial information. The authors investigated some aspects of interference between the 2 sources of information that affect performance in completing a visual search task. The search task was carried out under 3 different conditions, 2 of them as found in mobile AR systems. Participants were asked to detect a target that was superimposed on a background. Target and background were presented on a screen subtending a rectangular area of 55° × 43° (horizontal × vertical). The target appeared at 6 different locations on the screen. A video recording of a car drive served as the background. In 1 condition, the recording was replayed continuously. In another condition, static images sampled from the recording at 5-sec intervals were displayed as the background. A uniform gray background served as a baseline. Detectability (d') of the target was highest in the baseline condition, reduced in the presence of static images, and lowest with continuous playback of the video recording. Reaction time deteriorated in the same order of conditions. Participants were more efficient in completing the detection task when targets were presented in the lower part of the screen than in the upper part. The authors concluded that performance in detecting artificial information depends not only on spatial characteristics but also on temporal variations of the background on which the artificial information is superimposed. Determining the suitability of AR systems for mobile applications therefore requires characterizing the temporal aspects of the presented visual information. They also concluded that presenting artificial information in the upper field of vision is a practicable alternative when the lower field is overloaded; however, this holds only in the absence of motion. It can be assumed that a visual task involving real-world information may be impeded by the addition of artificial information. Artificial information in AR systems should therefore be avoided whenever it is not needed. Because of the particular material used in the experiment, the outcome of the search task might have depended on the participants' driving experience. The results obtained, however, indicate that the total number of kilometers driven is not correlated with performance on the task.
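
The detectability measure d' reported in the abstract is the standard sensitivity index from signal detection theory. As a minimal sketch only (the abstract does not state how the authors computed d', and the hit and false-alarm rates shown are hypothetical), it is commonly obtained by z-transforming the hit and false-alarm rates:

    # Illustrative sketch: computing the sensitivity index d' (signal detection theory).
    # The hit and false-alarm rates below are hypothetical, not values from the paper.
    from scipy.stats import norm

    def d_prime(hit_rate, false_alarm_rate):
        # d' = z(hit rate) - z(false-alarm rate), where z is the inverse normal CDF
        return norm.ppf(hit_rate) - norm.ppf(false_alarm_rate)

    # Example: 90% hits and 10% false alarms give d' of about 2.56
    print(d_prime(0.90, 0.10))

On this convention, a higher d' means the target is more easily distinguished from the background, which is why the uniform gray baseline yields the highest values and the moving video background the lowest.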
Pages: 447-460
Number of pages: 14
Related papers
50 records in total
  • [41] Mobile visual media design based on artificial intelligence and augmented reality technology
    Wang X.
    Applied Mathematics and Nonlinear Sciences, 2024, 9 (01)
  • [42] Dropping Hints: Visual Hints for Improving Learning using Mobile Augmented Reality
    Wittig, Nick
    Krvavac, Mak
    Gruenefeld, Uwe
    Rademaker, Florian
    Glaser, Lukas
    Waltmann, Johannes
    Degraen, Donald
    Schneegass, Stefan
    PROCEEDINGS OF THE 2024 ACM SYMPOSIUM ON SPATIAL USER INTERACTION, SUI 2024, 2024,
  • [43] Decentralized Visual-Inertial Localization and Mapping on Mobile Devices for Augmented Reality
    Sartipi, Kourosh
    DuToit, Ryan C.
    Cohar, Christopher B.
    Roumeliotis, Stergios I.
    2019 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2019, : 2145 - 2152
  • [44] Fine-Grained Visual Recognition in Mobile Augmented Reality for Technical Support
    Zhou, Bing
    Guven, Sinem
    IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS, 2020, 26 (12) : 3514 - 3523
  • [45] A Visual-inertial Fusion Based Tracking System for Mobile Augmented Reality
    Lin, Cheng
    Wang, Lianghao
    Li, Dongxiao
    Zhang, Ming
    2015 IEEE INTERNATIONAL CONFERENCE ON SIGNAL PROCESSING, COMMUNICATIONS AND COMPUTING (ICSPCC), 2015, : 956 - 960
  • [46] Visual-Inertial RGB-D SLAM for Mobile Augmented Reality
    Williem
    Ivan, Andre
    Seok, Hochang
    Lim, Jongwoo
    Yoon, Kuk-Jin
    Cho, Ikhwan
    Park, In Kyu
    ADVANCES IN MULTIMEDIA INFORMATION PROCESSING - PCM 2017, PT II, 2018, 10736 : 928 - 938
  • [47] Transformative Reality: Augmented reality for visual prostheses
    Lui, Wen Lik Dennis
    Browne, Damien
    Kleeman, Lindsay
    Drummond, Tom
    Li, Wai Ho
    2011 10TH IEEE INTERNATIONAL SYMPOSIUM ON MIXED AND AUGMENTED REALITY (ISMAR), 2011,
  • [48] Streaming Mobile Augmented Reality on Mobile Phones
    Chen, David M.
    Tsai, Sam S.
    Vedantham, Ramakrishna
    Grzeszczuk, Radek
    Girod, Bernd
    2009 8TH IEEE INTERNATIONAL SYMPOSIUM ON MIXED AND AUGMENTED REALITY - SCIENCE AND TECHNOLOGY, 2009, : 181+
  • [49] Optimization in Mobile Augmented Reality Systems for the Metaverse over Wireless Communications
    Lan, Tianming
    Zhao, Jun
    IEEE CONFERENCE ON GLOBAL COMMUNICATIONS, GLOBECOM, 2023, : 5439 - 5444
  • [50] Architectural issues in mobile augmented reality systems: A prototyping case study
    Dutoit, AH
    Creighton, O
    Klinker, G
    Kobylinski, R
    Vilsmeier, C
    Bruegge, B
    APSEC 2001: EIGHTH ASIA-PACIFIC SOFTWARE ENGINEERING CONFERENCE, PROCEEDINGS, 2001, : 341 - 344