Measuring visually guided motor performance in ultra low vision using virtual reality

Cited: 0
Authors
Kartha, Arathy [1 ,2 ]
Sadeghi, Roksana [1 ,3 ]
Bradley, Chris [1 ]
Livingston, Brittnee [4 ]
Tran, Chau [5 ]
Gee, Will [5 ]
Dagnelie, Gislin [1 ]
Affiliations
[1] Johns Hopkins Sch Med, Wilmer Eye Inst, Baltimore, MD 21205 USA
[2] SUNY, Dept Biol & Vis Sci, Coll Optometry, New York, NY 10036 USA
[3] Univ Calif Berkeley, Herbert Wertheim Sch Optometry & Vis Sci, Berkeley, CA USA
[4] Cent Assoc Blind & Visually Impaired, Dept Hist, Utica, NY USA
[5] BMORE VIRTUAL LLC, Baltimore, MD USA
Keywords
hand-eye coordination; virtual reality; ultra low vision; outcome measures and assessments; vision restoration; HAND-EYE COORDINATION; INDIVIDUALS;
DOI
10.3389/fnins.2023.1251935
CLC Number
Q189 [Neuroscience]
Subject Classification Code
071006
Abstract
Introduction: Ultra low vision (ULV) refers to profound visual impairment in which an individual cannot read even the top line of letters on an ETDRS chart from a distance of 0.5 m. Few tools are available to assess visual ability in ULV. The aim of this study was to develop and calibrate a new performance test, Wilmer VRH, to assess hand-eye coordination in individuals with ULV.
Methods: A set of 55 activities was developed for presentation in a virtual reality (VR) headset. Activities were grouped into 2-step and 5-step items. Participants performed a range of tasks involving reaching and grasping, stacking, sorting, pointing, throwing, and cutting. Data were collected from 20 healthy volunteers under normal vision (NV) and simulated ULV (sULV) conditions, and from 33 participants with ULV. Data were analyzed using the method of successive dichotomizations (MSD), a polytomous Rasch model, to estimate item (difficulty) and person (ability) measures. MSD was applied separately to the 2-step and 5-step performance data, which were then merged onto a single equal-interval scale.
Results: Mean ± SD completion rates were 98.6 ± 1.8%, 78.2 ± 12.5%, and 61.1 ± 34.2% for NV, sULV, and ULV, respectively. Item measures ranged from -1.09 to 5.7 logits in the sULV group and from -4.3 to 4.08 logits in the ULV group; person measures ranged from -0.03 to 4.2 logits and from -3.5 to 5.2 logits, respectively. Ninety percent of item infits and 97% of person infits were within the desired range of [0.5, 1.5]. Together with item and person reliabilities of 0.94 and 0.91, respectively, this demonstrates the unidimensionality of Wilmer VRH. A person-item map showed that the items were well targeted to the sample of individuals with ULV in the study.
Discussion: We present the development of a calibrated set of activities in VR that can be used to assess hand-eye coordination in individuals with ULV. This helps bridge a gap in the field by providing a validated outcome measure for vision restoration trials that recruit people with ULV and for assessing rehabilitation outcomes in people with ULV.
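The analysis described in the abstract rests on a Rasch-family model: the method of successive dichotomizations (MSD) places item difficulties and person abilities on a common logit scale. As a rough illustration only, the sketch below fits the simpler dichotomous Rasch model (not MSD's polytomous structure) by joint maximum likelihood on a hypothetical binary completion matrix; the data, the function name rasch_jml, and the estimation details are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch (assumption, not the authors' code): dichotomous Rasch estimation
# by joint maximum likelihood. MSD, used in the paper, is a polytomous Rasch model;
# this simplified 0/1 version only illustrates how item (difficulty) and person
# (ability) measures end up on a shared logit scale.
import numpy as np

def rasch_jml(X, n_iter=200, tol=1e-6):
    """X: persons-by-items matrix of 0/1 scores (1 = step completed)."""
    n_persons, n_items = X.shape
    theta = np.zeros(n_persons)   # person measures (ability, logits)
    beta = np.zeros(n_items)      # item measures (difficulty, logits)
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-(theta[:, None] - beta[None, :])))  # model probabilities
        info = p * (1.0 - p)                                # Fisher information per cell
        d_theta = (X - p).sum(axis=1) / info.sum(axis=1)    # Newton step for persons
        d_beta = -(X - p).sum(axis=0) / info.sum(axis=0)    # Newton step for items
        d_theta = np.clip(d_theta, -1.0, 1.0)               # damp steps for stability
        d_beta = np.clip(d_beta, -1.0, 1.0)
        theta += d_theta
        beta += d_beta
        beta -= beta.mean()                                 # anchor scale: mean item difficulty = 0
        if max(np.abs(d_theta).max(), np.abs(d_beta).max()) < tol:
            break
    return theta, beta

# Hypothetical completion data: 5 participants x 4 task steps (1 = completed)
X = np.array([[1, 1, 1, 0],
              [1, 1, 0, 0],
              [1, 0, 0, 0],
              [1, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)
theta, beta = rasch_jml(X)
print("person measures (logits):", np.round(theta, 2))
print("item measures (logits):  ", np.round(beta, 2))
```

An equal-interval logit scale of this kind is what allows the paper's 2-step and 5-step items to be merged and compared with person abilities on a single person-item map.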
Pages: 9