Interactive Endoscopy: A Next-Generation, Streamlined User Interface for Lung Surgery Navigation

Cited by: 4
Authors
Thienphrapa, Paul [1 ]
Bydlon, Torre [1 ]
Chen, Alvin [1 ]
Vagdargi, Prasad [2 ]
Varble, Nicole [1 ]
Stanton, Douglas [1 ]
Popovic, Aleksandra [1 ]
Affiliations
[1] Philips Res North Amer, Cambridge, MA 02141 USA
[2] Johns Hopkins Univ, I STAR Lab, Baltimore, MD USA
Keywords
Interactive endoscopy; Lung surgery; VATS; Augmented reality; Human-computer interaction; AUGMENTED-REALITY; TRACKING; RECONSTRUCTION; REGISTRATION; GUIDANCE;
DOI
10.1007/978-3-030-32254-0_10
CLC Classification
TP18 [Artificial intelligence theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Computer-generated graphics are superimposed onto live video from an endoscope, offering the surgeon visual information hidden in the native scene: this is the classical scenario of augmented reality in minimally invasive surgery. Over the past few decades, research efforts have pressed considerably against the challenges of infusing a priori knowledge into endoscopic streams. As framed, these contributions emulate perception at the level of the expert surgeon, perpetuating debates on the technical, clinical, and societal viability of the proposition. We herein introduce interactive endoscopy, which transforms passive visualization into an interface that lets the surgeon label noteworthy anatomical features found in the endoscopic video and have the virtual annotations remember their tissue locations during surgical manipulation. The streamlined interface combines vision-based tool tracking and speech recognition to enable interactive selection and labeling, followed by tissue tracking and optical flow for label persistence. These discrete capabilities have matured rapidly in recent years, promising technical viability; the system supports clinical viability by helping clinicians offload the cognitive demands of visually deciphering soft tissues, and societal viability by engaging, rather than emulating, surgeon expertise. Through a video-assisted thoracotomy use case, we develop a proof of concept that improves workflow by tracking surgical tools and visualizing tissue, while serving as a bridge to the classical promise of augmented reality in surgery.
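The label-persistence step described above (re-locating an annotated tissue point from frame to frame via optical flow) can be illustrated with a minimal, stdlib-only sketch. This is not the authors' implementation: real systems would use a pyramidal tracker such as Lucas-Kanade on full video; here a hypothetical `track_label` function re-finds a labeled point by block matching, i.e. exhaustively searching a small window for the patch with the lowest sum of squared differences (SSD).

```python
# Minimal sketch of label persistence via block-matching "optical flow".
# Hypothetical and illustrative only: a labeled tissue point is re-located
# in the next frame by searching a small window for the best-matching
# patch (lowest sum of squared differences). Frames are 2D lists of ints.

def ssd(frame_a, frame_b, ax, ay, bx, by, half):
    """Sum of squared differences between two patches of radius `half`."""
    total = 0
    for dy in range(-half, half + 1):
        for dx in range(-half, half + 1):
            diff = frame_a[ay + dy][ax + dx] - frame_b[by + dy][bx + dx]
            total += diff * diff
    return total

def track_label(prev_frame, next_frame, x, y, half=2, search=3):
    """Return the label's new (x, y) in next_frame (exhaustive search)."""
    best = (float("inf"), x, y)
    for oy in range(-search, search + 1):
        for ox in range(-search, search + 1):
            cost = ssd(prev_frame, next_frame, x, y, x + ox, y + oy, half)
            best = min(best, (cost, x + ox, y + oy))
    return best[1], best[2]

def make_frame(cx, cy, size=16):
    """Synthetic frame: a bright 3x3 blob centered at (cx, cy)."""
    return [[255 if abs(px - cx) <= 1 and abs(py - cy) <= 1 else 0
             for px in range(size)] for py in range(size)]

# The annotated point starts at (6, 6); the tissue shifts to (8, 7).
prev_frame = make_frame(6, 6)
next_frame = make_frame(8, 7)
print(track_label(prev_frame, next_frame, 6, 6))  # -> (8, 7)
```

In a deployed system this per-frame re-localization is what keeps a surgeon's virtual annotation attached to soft tissue as it deforms and moves under manipulation; production trackers add sub-pixel refinement, appearance updates, and occlusion handling.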
Pages: 83-91 (9 pages)