Vitreoretinal Surgical Instrument Tracking in Three Dimensions Using Deep Learning

Cited by: 6
Authors
Baldi, Pierre F. [1 ,2 ,3 ,4 ,6 ]
Abdelkarim, Sherif [1 ,2 ]
Liu, Junze [1 ,2 ]
To, Josiah K. [4 ]
Ibarra, Marialejandra Diaz [5 ]
Browne, Andrew W. [3 ,4 ,5 ,6 ]
Affiliations
[1] Univ Calif Irvine, Dept Comp Sci, Irvine, CA USA
[2] Univ Calif Irvine, Inst Genom & Bioinformat, Irvine, CA USA
[3] Univ Calif Irvine, Dept Biomed Engn, Irvine, CA USA
[4] Univ Calif Irvine, Ctr Translat Vis Res, Dept Ophthalmol, Irvine, CA USA
[5] Univ Calif Irvine, Gavin Herbert Eye Inst, Dept Ophthalmol, Irvine, CA USA
[6] Univ Calif Irvine, Dept Comp Sci, 4038 Bren Hall, Irvine, CA 92697 USA
Source
TRANSLATIONAL VISION SCIENCE & TECHNOLOGY, 2023, 12 (1)
Keywords
artificial intelligence; retina surgery; deep learning; VISUAL FUNCTION; MOBILITY TEST; ORIENTATION; VISION; BLIND;
DOI
10.1167/tvst.12.1.20
Chinese Library Classification
R77 [Ophthalmology]
Discipline Code
100212
Abstract
Purpose: To evaluate the potential for artificial intelligence-based video analysis to determine surgical instrument characteristics when moving in the three-dimensional vitreous space.
Methods: We designed and manufactured a model eye in which we recorded choreographed videos of many surgical instruments moving throughout the eye. We labeled each frame of the videos to describe the surgical tool characteristics: tool type, location, depth, and insertional laterality. We trained two different deep learning models to predict each of the tool characteristics and evaluated model performance on a subset of images.
Results: The accuracy of the classification model on the training set is 84% for the x-y region, 97% for depth, 100% for instrument type, and 100% for laterality of insertion. On the validation dataset, accuracy is 83% for the x-y region, 96% for depth, 100% for instrument type, and 100% for laterality of insertion. The close-up detection model runs at 67 frames per second, with precision above 75% for most instruments and a mean average precision of 79.3%.
Conclusions: We demonstrated that trained models can track surgical instrument movement in three-dimensional space and determine instrument depth, tip location, insertional laterality, and instrument type. Model performance is nearly instantaneous and justifies further investigation into application to real-world surgical videos.
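The record gives no architectural details for the two models, so the following is a minimal, hedged sketch of the frame-level classification task described in the abstract, assuming a PyTorch pipeline with a shared ResNet-18 backbone and one classification head per tool characteristic (x-y region, depth, instrument type, insertional laterality). The backbone choice and class counts are illustrative placeholders, not the authors' implementation; the close-up detection model quoted at 67 frames per second and 79.3% mean average precision would be a separate object detector and is not sketched here.

```python
# Hedged sketch: a multi-task frame classifier for surgical-tool
# characteristics. Backbone, head sizes, and class counts are assumptions
# made for illustration only; the paper's actual models are not described
# in this record.
import torch
import torch.nn as nn
from torchvision.models import resnet18


class InstrumentFrameClassifier(nn.Module):
    def __init__(self, n_regions=9, n_depths=3, n_tools=5, n_sides=2):
        super().__init__()
        backbone = resnet18(weights=None)      # shared feature extractor
        feat_dim = backbone.fc.in_features
        backbone.fc = nn.Identity()            # drop the ImageNet classifier
        self.backbone = backbone
        # One linear head per labeled characteristic from the abstract.
        self.region_head = nn.Linear(feat_dim, n_regions)  # x-y region
        self.depth_head = nn.Linear(feat_dim, n_depths)    # depth bin
        self.tool_head = nn.Linear(feat_dim, n_tools)      # instrument type
        self.side_head = nn.Linear(feat_dim, n_sides)      # insertional laterality

    def forward(self, frames):                 # frames: (B, 3, H, W)
        feats = self.backbone(frames)
        return {
            "region": self.region_head(feats),
            "depth": self.depth_head(feats),
            "tool": self.tool_head(feats),
            "laterality": self.side_head(feats),
        }


if __name__ == "__main__":
    model = InstrumentFrameClassifier().eval()
    with torch.no_grad():
        logits = model(torch.randn(2, 3, 224, 224))   # two dummy video frames
    # Per-frame predictions for each characteristic.
    print({k: v.argmax(dim=1).tolist() for k, v in logits.items()})
```

In such a setup each head would be trained with its own cross-entropy loss on the per-frame labels, and the per-characteristic accuracies reported in the Results would follow from comparing each head's argmax to its label.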
Pages: 12
Related Articles (50 records in total)
  • [31] On the tracking of shelly carbonate sands using deep learning
    Wu, Mengmeng
    Zhou, Bo
    Wang, Jianfeng
    GEOTECHNIQUE, 2022, 73 (11): 974 - 985
  • [32] Deep-learning based automated instrument-tracking and adaptive-sampling for 4D imaging of ophthalmic surgical maneuvers
    Tang, Eric
    El-Haddad, Mohamed T.
    Malone, Joseph D.
    Tao, Yuankai
    INVESTIGATIVE OPHTHALMOLOGY & VISUAL SCIENCE, 2020, 61 (07)
  • [33] Development of a Deep Learning-Based Algorithm to Detect the Distal End of a Surgical Instrument
    Sugimori, Hiroyuki
    Sugiyama, Taku
    Nakayama, Naoki
    Yamashita, Akemi
    Ogasawara, Katsuhiko
    APPLIED SCIENCES-BASEL, 2020, 10 (12)
  • [34] TensorRT-based Surgical Instrument Detection Assessment for Deep Learning on Edge Computing
    Belhaoua, Abdelkrim
    Kimpe, Tom
    IMAGE-GUIDED PROCEDURES, ROBOTIC INTERVENTIONS, AND MODELING, MEDICAL IMAGING 2024, 2024, 12928
  • [35] Image Reconstruction in Surgical Field Using Deep Learning
    Divya, S.
    Padmapriya, K.
    Ezhumalai, P.
    REVISTA GEINTEC-GESTAO INOVACAO E TECNOLOGIAS, 2021, 11 (02): 1489 - 1496
  • [36] Surgical Instrument tracking using Wiimote technology for training in minimally invasive spine surgeries
    Vazquez Ramirez, Javier
    Lorias Espinoza, Daniel
    Perez Escamirosa, Fernando
    Hernandez, Ignacio
    Gutierrez-Gneccchi, Jose A.
    2017 14TH INTERNATIONAL CONFERENCE ON ELECTRICAL ENGINEERING, COMPUTING SCIENCE AND AUTOMATIC CONTROL (CCE), 2017
  • [37] Correction to: Deep-Sea Organisms Tracking Using Dehazing and Deep Learning
    Huimin Lu
    Tomoki Uemura
    Dong Wang
    Jihua Zhu
    Zi Huang
    Hyoungseop Kim
    Mobile Networks and Applications, 2020, 25: 2536 - 2536
  • [38] Design Principles of an Electromagnetic System for Surgical Instrument Tracking
    Grunin L.Y.
    Rozhentsov A.A.
    Khalimov M.
    Bulletin of the Russian Academy of Sciences: Physics, 2018, 82 (12): 1522 - 1524
  • [39] Surgical Instrument Tracking for Vitreo-retinal Eye Surgical Procedures Using ARAS-EYE Dataset
    Lotfi, F.
    Hasani, P.
    Faraji, F.
    Motaharifar, M.
    Taghirad, H. D.
    Mohammadi, S. F.
    2020 28TH IRANIAN CONFERENCE ON ELECTRICAL ENGINEERING (ICEE), 2020, : 833 - 838
  • [40] Surgical instrument posture estimation and tracking based on LSTM
    Lu, Siyu
    Yang, Jun
    Yang, Bo
    Li, Xiaolu
    Yin, Zhengtong
    Yin, Lirong
    Zheng, Wenfeng
    ICT EXPRESS, 2024, 10 (03): 465 - 471