Microscopic Analysis Using Gaze-Based Interaction

Cited by: 1
Authors
Fruehberger, Peter [1]
Klaus, Edmund [1]
Beyerer, Juergen [1]
Affiliation
[1] Fraunhofer Institute of Optronics, System Technologies and Image Exploitation IOSB, D-76131 Karlsruhe, Germany
DOI: 10.1007/978-3-319-04639-6_27
Chinese Library Classification: TH742 (Microscopes)
Abstract
Fraunhofer IOSB is currently constructing an automated microscopic laboratory. Different optical microscopes will be used to analyze specimens and gain new information from the combination of the acquired sensor data. In order to narrow down specific patterns or tune the subsequent automated microscopy process, specimens have to be examined before the actual analysis process. To make this step less time-consuming and fatiguing, we designed an intuitive human-machine interface consisting of automated focusing algorithms and gaze-based interaction. The most important task in microscopy is focusing on a desired region of a specimen. Considering microscopic analysis as a visual search task [1], this region of interest is exactly the region the operator is looking at. We use this information to automatically detect the focus plane in this region, by moving the microscope's z-axis, and present the operator with a focused image. The user does not need detailed knowledge of how to operate the microscope, but succeeds by relying on encapsulated, sophisticated operational sequences such as initial and continuous focusing or synthetic image enhancement. This paper presents results of implementations for guided and fully automated microscopic analysis that combine intuitive interaction and eased operation with high-quality results.
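The gaze-driven autofocus described in the abstract can be sketched as a contrast-maximization search over a z-stack, restricted to the region the operator is fixating. The sketch below is not the paper's implementation; it assumes a pre-acquired stack of grayscale images and uses a variance-of-Laplacian sharpness metric (names such as `best_focus_plane` and the patch half-width are illustrative):

```python
import numpy as np

def sharpness(patch: np.ndarray) -> float:
    """Variance of a discrete Laplacian: a common contrast-based focus metric."""
    lap = (-4.0 * patch[1:-1, 1:-1]
           + patch[:-2, 1:-1] + patch[2:, 1:-1]
           + patch[1:-1, :-2] + patch[1:-1, 2:])
    return float(lap.var())

def best_focus_plane(z_stack, gaze_xy, half=32):
    """Pick the index of the z-slice whose patch around the gaze point is sharpest.

    z_stack : sequence of 2-D grayscale images, one per z-axis position
    gaze_xy : (x, y) pixel coordinates of the current gaze fixation
    half    : half-width of the square region of interest around the gaze
    """
    x, y = gaze_xy
    scores = [
        sharpness(img[max(0, y - half):y + half,
                      max(0, x - half):x + half].astype(float))
        for img in z_stack
    ]
    return int(np.argmax(scores))
```

In a live setup the stack would not be pre-acquired; the same metric would instead drive an iterative search (e.g., hill climbing) along the z-axis, re-evaluating sharpness in the gaze region after each motor step.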
Pages: 195-200 (6 pages)
Related Papers (50 total)
  • [1] Hild, Jutta; Peinsipp-Byma, Elisabeth; Klaus, Edmund. Improving usability for video analysis using gaze-based interaction. Full Motion Video (FMV) Workflows and Technologies for Intelligence, Surveillance, and Reconnaissance (ISR) and Situational Awareness, 2012, 8386.
  • [2] Pfeiffer, Thies; Wachsmuth, Ipke. Multimodal Gaze-based Interaction. at-Automatisierungstechnik, 2013, 61 (11): 770-776.
  • [3] Jimenez, Jorge; Gutierrez, Diego; Latorre, Pedro. Gaze-based Interaction for Virtual Environments. Journal of Universal Computer Science, 2008, 14 (19): 3085-3098.
  • [4] Piotrowski, Patryk; Nowosielski, Adam. Gaze-Based Interaction for VR Environments. Image Processing and Communications: Techniques, Algorithms and Applications, 2020, 1062: 41-48.
  • [5] Schniederjann, Florian; Korthing, Lars; Broesterhaus, Jonas; Mertens, Robert. A Scrolling Approach for Gaze-Based Interaction. 2019 IEEE International Symposium on Multimedia (ISM 2019), 2019: 233-234.
  • [6] Duchowski, Andrew T. Gaze-based interaction: A 30 year retrospective. Computers & Graphics, 2018, 73: 59-69.
  • [7] Drewes, Heiko; Mueller, Evelyn; Rothe, Sylvia; Hussmann, Heinrich. Gaze-Based Interaction for Interactive Storytelling in VR. Augmented Reality, Virtual Reality, and Computer Graphics, 2021, 12980: 91-108.
  • [8] Li, Zhenxing; Akkil, Deepak; Raisamo, Roope. Gaze-based Kinaesthetic Interaction for Virtual Reality. Interacting with Computers, 2020, 32 (01): 17-32.
  • [9] Namnakani, Omar. Gaze-based Interaction on Handheld Mobile Devices. ACM Symposium on Eye Tracking Research & Applications (ETRA 2023), 2023.
  • [10] Selim, Abdulrahman Mohamed; Barz, Michael; Bhatti, Omair Shahzad; Alam, Hasan Md Tusfiqur; Sonntag, Daniel. A review of machine learning in scanpath analysis for passive gaze-based interaction. Frontiers in Artificial Intelligence, 2024, 7.