Detection, reconstruction and segmentation of chronic wounds using Kinect v2 sensor

Cited by: 11
Authors
Filko, Damir [1 ]
Cupec, Robert [1 ]
Nyarko, Emmanuel Karlo [1 ]
Affiliations
[1] Faculty of Electrical Engineering, Kneza Trpimira 2B, Osijek 31000, Croatia
Keywords
chronic wound; detection; reconstruction; segmentation; measurement; Kinect v2
DOI
10.1016/j.procs.2016.07.022
Chinese Library Classification
TP301 [Theory and Methods]
Discipline Code
081202
Abstract
The advent of inexpensive RGB-D sensors, pioneered by the original Kinect, has paved the way for many innovations in computer and robot vision applications. In this article, we propose a system that uses the new Kinect v2 sensor in a medical application for the detection, 3D reconstruction and segmentation of chronic wounds. Wound detection is based on per-block classification of wound tissue using colour histograms and a nearest-neighbour approach. The 3D reconstruction is similar to KinectFusion, where the iterative closest point (ICP) algorithm determines the rigid-body transformation between frames. A colour-enhanced truncated signed distance function (TSDF) is applied for scene fusion, while the Marching Cubes algorithm creates the surface mesh. The wound contour is extracted by a segmentation procedure driven by the geometric and visual properties of the surface. Apart from the segmentation procedure, the entire system is implemented in CUDA, which enables real-time operation. The end result of the developed system is a precise 3D coloured model of the segmented wound together with its measurable properties, including perimeter, area and volume, which can be used to determine the correct therapy and treatment of chronic wounds. All experiments were conducted on a medical wound-care model. (C) 2016 The Authors. Published by Elsevier B.V.
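The abstract describes the detection and fusion steps only at a high level, and the record carries no code. The following is a minimal Python sketch of two of those steps: the per-block colour-histogram classification with a nearest-neighbour rule, and the classic weighted running-average TSDF update used in KinectFusion-style fusion. Block size, bin count, function names and the training-set interface are illustrative assumptions, not the authors' implementation.

import numpy as np

def block_histograms(image, block=16, bins=8):
    # Split an RGB image into non-overlapping blocks and build a per-block
    # colour histogram (bins per channel, concatenated, L1-normalised).
    # Block size and bin count are illustrative choices, not from the paper.
    h, w, _ = image.shape
    feats, coords = [], []
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            patch = image[y:y + block, x:x + block]
            hist = np.concatenate([
                np.histogram(patch[..., c], bins=bins, range=(0, 255))[0]
                for c in range(3)
            ]).astype(np.float64)
            feats.append(hist / hist.sum())
            coords.append((y, x))
    return np.asarray(feats), coords

def classify_blocks(feats, train_feats, train_labels):
    # 1-nearest-neighbour rule: each block takes the label of the closest
    # training histogram (Euclidean distance). The training set would hold
    # histograms of labelled wound-tissue and healthy-skin example blocks.
    d = np.linalg.norm(feats[:, None, :] - train_feats[None, :, :], axis=2)
    return train_labels[np.argmin(d, axis=1)]

def tsdf_update(tsdf, weight, colour, d_new, c_new, w_new=1.0):
    # Standard KinectFusion-style weighted running average: each voxel's
    # truncated signed distance and colour are blended with the values
    # observed in the new frame, weighted by the accumulated count.
    w = weight + w_new
    tsdf_out = (tsdf * weight + d_new * w_new) / w
    colour_out = (colour * weight[..., None] + c_new * w_new) / w[..., None]
    return tsdf_out, w, colour_out

In such a pipeline, blocks labelled as wound tissue would form the coarse detection mask that seeds the later geometry- and colour-driven contour segmentation, while the fused TSDF volume is polygonised with Marching Cubes to obtain the coloured mesh on which perimeter, area and volume are measured.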
Pages: 151-156
Page count: 6
Related Papers
50 items in total
  • [21] Development of a System for Quantitative Evaluation of Motor Function Using Kinect v2 Sensor
    Yoshida, Hirotaka
    Honda, Takeru
    Lee, Jongho
    Yano, Shiro
    Kakei, Shinji
    Kondo, Toshiyuki
    2016 INTERNATIONAL SYMPOSIUM ON MICRO-NANOMECHATRONICS AND HUMAN SCIENCE (MHS), 2016
  • [22] Monitoring Volcanic and Tectonic Sandbox Analogue Models Using the Kinect v2 Sensor
    Rincon, M.
    Marquez, A.
    Herrera, R.
    Galland, O.
    Sanchez-Oro, J.
    Concha, D.
    Montemayor, A. S.
    EARTH AND SPACE SCIENCE, 2022, 9 (06)
  • [23] Comparative analysis of respiratory motion tracking using Microsoft Kinect v2 sensor
    Silverstein, Evan
    Snyder, Michael
    JOURNAL OF APPLIED CLINICAL MEDICAL PHYSICS, 2018, 19 (03): 193-204
  • [24] Indoor 3D Path Planning Using a Kinect V2 Sensor
    Nie, Wen
    Li, QunMing
    Zhong, Guoliang
    Deng, Hua
    2017 IEEE 3RD INFORMATION TECHNOLOGY AND MECHATRONICS ENGINEERING CONFERENCE (ITOEC), 2017: 527-531
  • [25] 3D Reconstruction Method for Fruit Tree Branches Based on Kinect v2 Sensor
    Ren D.
    Li X.
    Lin T.
    Xiong M.
    Xu Z.
    Cui G.
    Nongye Jixie Xuebao/Transactions of the Chinese Society for Agricultural Machinery, 2022, 53: 197-203
  • [26] Quantifying Facial Paralysis using the Kinect v2
    Gaber, Amira
    Taher, Mona F.
    Wahed, Manal Abdel
    2015 37TH ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY (EMBC), 2015: 2497-2501
  • [27] Respiratory Motion Tracking Using Kinect V2
    Silverstein, E.
    Snyder, M.
    MEDICAL PHYSICS, 2016, 43 (06): 3651
  • [28] Evaluating the Accuracy of the Azure Kinect and Kinect v2
    Kurillo, Gregorij
    Hemingway, Evan
    Cheng, Mu-Lin
    Cheng, Louis
    SENSORS, 2022, 22 (07)
  • [29] Correction to: Depth analysis of Kinect v2 sensor in different mediums
    Bhateja, Aditi
    Shrivastav, Adarsh
    Chaudhary, Himanshu
    Lall, Brejesh
    Kalra, Prem K.
    Multimedia Tools and Applications, 2022, 81 (25): 35801
  • [30] Kinect sensor performance for Windows V2 through graphical processing
    Vargas, Javier
    Marino, Christian
    Aldas, Clay
    Morales, Luis
    Toasa, Renato
    PROCEEDINGS OF 2018 10TH INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND COMPUTING (ICMLC 2018), 2018: 263-268