Measuring visually guided motor performance in ultra low vision using virtual reality

Citations: 0
Authors
Kartha, Arathy [1 ,2 ]
Sadeghi, Roksana [1 ,3 ]
Bradley, Chris [1 ]
Livingston, Brittnee [4 ]
Tran, Chau [5 ]
Gee, Will [5 ]
Dagnelie, Gislin [1 ]
Affiliations
[1] Johns Hopkins Sch Med, Wilmer Eye Inst, Baltimore, MD 21205 USA
[2] SUNY, Dept Biol & Vis Sci, Coll Optometry, New York, NY 10036 USA
[3] Univ Calif Berkeley, Herbert Wertheim Sch Optometry & Vis Sci, Berkeley, CA USA
[4] Cent Assoc Blind & Visually Impaired, Dept Hist, Utica, NY USA
[5] BMORE VIRTUAL LLC, Baltimore, MD USA
Keywords
hand-eye coordination; virtual reality; ultra low vision; outcome measures and assessments; vision restoration; HAND-EYE COORDINATION; INDIVIDUALS;
DOI
10.3389/fnins.2023.1251935
Chinese Library Classification (CLC): Q189 [Neuroscience]
Subject classification code: 071006
Abstract
Introduction: Ultra low vision (ULV) refers to profound visual impairment in which an individual cannot read even the top line of letters on an ETDRS chart from a distance of 0.5 m. Few tools are available to assess visual ability in ULV. The aim of this study was to develop and calibrate a new performance test, Wilmer VRH, to assess hand-eye coordination in individuals with ULV.
Methods: A set of 55 activities was developed for presentation in a virtual reality (VR) headset. Activities were grouped into 2-step and 5-step items. Participants performed a range of tasks involving reaching and grasping, stacking, sorting, pointing, throwing, and cutting. Data were collected from 20 healthy volunteers under normal vision (NV) and simulated ULV (sULV) conditions, and from 33 participants with ULV. Data were analyzed using the method of successive dichotomizations (MSD), a polytomous Rasch model, to estimate item (difficulty) and person (ability) measures. MSD was applied separately to the 2-step and 5-step performance data, and the resulting calibrations were then merged onto a single equal-interval scale.
Results: Mean ± SD completion rates were 98.6 ± 1.8%, 78.2 ± 12.5%, and 61.1 ± 34.2% for NV, sULV, and ULV, respectively. Item measures ranged from -1.09 to 5.7 logits and from -4.3 to 4.08 logits, and person measures ranged from -0.03 to 4.2 logits and from -3.5 to 5.2 logits, in the sULV and ULV groups, respectively. Ninety percent of item infits and 97% of person infits were within the desired range of [0.5, 1.5]. Together with item and person reliabilities of 0.94 and 0.91, respectively, this demonstrates the unidimensionality of Wilmer VRH. A person-item map showed that the items were well targeted to the sample of individuals with ULV in the study.
Discussion: We present the development of a calibrated set of activities in VR that can be used to assess hand-eye coordination in individuals with ULV. This helps bridge a gap in the field by providing a validated outcome measure for vision restoration trials that recruit people with ULV and for assessing rehabilitation outcomes in people with ULV.
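The abstract's Methods describe calibrating the test with the method of successive dichotomizations (MSD), a polytomous Rasch model that places item difficulty and person ability on a common logit scale and reports infit statistics and reliabilities. The sketch below is not the authors' MSD implementation; it is a minimal, hypothetical illustration of the underlying Rasch idea, using a simplified dichotomous (step completed / not completed) joint-maximum-likelihood fit on simulated data. All data and variable names are invented for the example.

# Minimal Rasch-style calibration sketch (assumption: simplified dichotomous
# stand-in for the polytomous MSD analysis described in the paper).
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical response matrix: rows = participants, columns = VR activities,
# entries = 1 (step completed) or 0 (not completed).
n_persons, n_items = 33, 55
true_ability = rng.normal(0.0, 1.5, n_persons)
true_difficulty = rng.normal(0.0, 1.5, n_items)
p_true = 1.0 / (1.0 + np.exp(-(true_ability[:, None] - true_difficulty[None, :])))
X = (rng.random((n_persons, n_items)) < p_true).astype(float)

def rasch_jml(X, n_iter=500, lr=0.05):
    """Joint maximum likelihood via simple gradient updates for person ability
    (theta) and item difficulty (beta), both in logits. Extreme scorers
    (all 0s or all 1s) are normally handled separately in practice."""
    theta = np.zeros(X.shape[0])
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        P = 1.0 / (1.0 + np.exp(-(theta[:, None] - beta[None, :])))
        resid = X - P                    # observed minus expected
        theta += lr * resid.sum(axis=1)  # ascend person log-likelihood
        beta -= lr * resid.sum(axis=0)   # ascend item log-likelihood
        beta -= beta.mean()              # anchor: mean item difficulty = 0 logits
    return theta, beta

theta_hat, beta_hat = rasch_jml(X)

# Infit mean-square: information-weighted squared residuals; values in
# [0.5, 1.5] are conventionally taken as acceptable fit.
P = 1.0 / (1.0 + np.exp(-(theta_hat[:, None] - beta_hat[None, :])))
W = P * (1.0 - P)
item_infit = ((X - P) ** 2).sum(axis=0) / W.sum(axis=0)
person_infit = ((X - P) ** 2).sum(axis=1) / W.sum(axis=1)

print("item measures (logits), first 5:", np.round(beta_hat[:5], 2))
print("person measures (logits), first 5:", np.round(theta_hat[:5], 2))
print("fraction of item infits in [0.5, 1.5]:",
      np.mean((item_infit >= 0.5) & (item_infit <= 1.5)))

In the actual study, MSD handles the multi-step (polytomous) items directly and the separate 2-step and 5-step calibrations are merged onto one equal-interval scale; this sketch only conveys how logit-scaled item and person measures and infit statistics arise from a response matrix.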
Pages: 9
Related papers (50 total)
  • [41] Toward Low-Latency and Ultra-Reliable Virtual Reality
    Elbamby, Mohammed S.
    Perfecto, Cristina
    Bennis, Mehdi
    Doppler, Klaus
    IEEE NETWORK, 2018, 32 (02): : 78 - 84
  • [42] Fog-Assisted Virtual Reality MMOG with Ultra Low Latency
    Yoshihara, Tsuyoshi
    Fujita, Satoshi
    2019 SEVENTH INTERNATIONAL SYMPOSIUM ON COMPUTING AND NETWORKING (CANDAR 2019), 2019, : 121 - 129
  • [43] Using Virtual Reality to Analyze Sports Performance
    Bideau, Benoit
    Kulpa, Richard
    Vignais, Nicolas
    Brault, Sebastien
    Multon, Franck
    Craig, Cathy
    IEEE COMPUTER GRAPHICS AND APPLICATIONS, 2010, 30 (02) : 14 - 21
  • [44] A Virtual Reality Assembly Assessment Benchmark for Measuring VR Performance & Limitations
    Otto, Michael
    Lampen, Eva
    Agethen, Philipp
    Langohr, Mareike
    Zachmann, Gabriel
    Rukzio, Enrico
    52ND CIRP CONFERENCE ON MANUFACTURING SYSTEMS (CMS), 2019, 81 : 785 - 790
  • [45] Improvement of real soccer motor skills using virtual reality
    Lozano-Tarazona, Alvaro M.
    Pinzon, Diego Mauricio Rivera
    INTERACTIVE LEARNING ENVIRONMENTS, 2024, 32 (09) : 5766 - 5778
  • [46] Virtual reality technologies to provide performance feedback for motor and imagery training
    Shi, Lei
    Xu, Chunxia
    EDUCATION AND INFORMATION TECHNOLOGIES, 2024, 29 (17) : 23781 - 23799
  • [47] Measuring the Impacts of Virtual Reality Games on Cognitive Ability Using EEG Signals and Game Performance Data
    Wan, Bo
    Wang, Qi
    Su, Kejia
    Dong, Caiyu
    Song, Wenjing
    Pang, Min
    IEEE ACCESS, 2021, 9 : 18326 - 18344
  • [48] Visually Improved Digital Media Communication Using Virtual Reality Technology and Digital Twin
    Zhang, Qian
    Guo, Xiaoying
    Sun, Maojun
    Samuel, R. Dinesh Jackson
    Kumar, Priyan Malarvizhi
    JOURNAL OF INTERCONNECTION NETWORKS, 2022, 22 (SUPP04)
  • [49] Effects of Displays on Visually Controlled Task Performance in Three-Dimensional Virtual Reality Environment
    Lin, Chiuhsiang Joe
    Chen, Hung-Jen
    Cheng, Ping-Yun
    Sun, Tien-Lung
    HUMAN FACTORS AND ERGONOMICS IN MANUFACTURING & SERVICE INDUSTRIES, 2015, 25 (05) : 523 - 533
  • [50] VRiAssist: An Eye-Tracked Virtual Reality Low Vision Assistance Tool
    Masnadi, Sina
    Williamson, Brian
    Gonzalez, Andres N. Vargas
    LaViola, Joseph J., Jr.
    2020 IEEE CONFERENCE ON VIRTUAL REALITY AND 3D USER INTERFACES WORKSHOPS (VRW 2020), 2020, : 809 - 810