Deep face segmentation for improved heart and respiratory rate estimation from videos

Cited by: 1
Authors
Fiedler M.-A. [1]
Werner P. [1]
Rapczyński M. [1]
Al-Hamadi A. [1]
Affiliations
[1] Neuro-Information Technology Group, Institute for Information Technology and Communications, Otto von Guericke University Magdeburg, Magdeburg
Keywords
Camera-based monitoring; Heart rate; Remote photoplethysmography; Respiratory rate; Vital signs
DOI
10.1007/s12652-023-04607-8
Abstract
The selection of a suitable region of interest (ROI) is of great importance in camera-based vital signs estimation, as it represents the first step in the processing pipeline. Since all further processing relies on the quality of the signal extracted from the ROI, the tracking of this area is decisive for the performance of the overall algorithm. To overcome the limitations of classical ROI selection approaches, such as partial occlusions or illumination variations, a custom neural network for pixel-precise face segmentation called FaSeNet was developed. It achieves better segmentation results on two datasets than state-of-the-art architectures while maintaining high execution efficiency. Furthermore, the Matthews Correlation Coefficient was proposed as a loss function that yields a better fit of the network weights than losses commonly applied in multi-class segmentation. In an extensive evaluation with a variety of algorithms for vital signs estimation, FaSeNet achieved better results for both heart and respiratory rate estimation. Thus, an ROI for vital signs estimation can be created that is superior to those produced by other approaches. © 2023, The Author(s).
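The abstract does not give the exact formulation of the proposed MCC loss, but the general idea can be illustrated: compute soft confusion counts (TP, FP, FN, TN) per class from the softmax probabilities and minimize one minus the resulting per-class MCC. The snippet below is a minimal PyTorch sketch under these assumptions; the function name `soft_mcc_loss` and the mean reduction over classes are illustrative choices, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def soft_mcc_loss(logits, target, eps=1e-7):
    """Differentiable multi-class MCC loss (illustrative sketch, not the paper's exact loss).

    logits: (B, C, H, W) raw network outputs
    target: (B, H, W) integer class labels (torch.long)
    Returns 1 - mean per-class soft MCC, so minimizing the loss maximizes the MCC.
    """
    num_classes = logits.shape[1]
    probs = F.softmax(logits, dim=1)                         # (B, C, H, W)
    onehot = F.one_hot(target, num_classes)                  # (B, H, W, C)
    onehot = onehot.permute(0, 3, 1, 2).float()              # (B, C, H, W)

    dims = (0, 2, 3)                                         # sum over batch and pixels
    tp = (probs * onehot).sum(dims)
    fp = (probs * (1.0 - onehot)).sum(dims)
    fn = ((1.0 - probs) * onehot).sum(dims)
    tn = ((1.0 - probs) * (1.0 - onehot)).sum(dims)

    numerator = tp * tn - fp * fn
    denominator = torch.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn) + eps)
    mcc = numerator / denominator                            # per-class soft MCC in [-1, 1]
    return 1.0 - mcc.mean()
```

One reason MCC is attractive as a segmentation loss is that it accounts for all four confusion-matrix entries, which makes it comparatively robust to class imbalance, e.g. between small facial regions and background.

For context, a generic rPPG heart-rate baseline consuming such a segmentation mask might look like the following NumPy sketch; the green-channel averaging and FFT peak picking are common baseline choices and are assumptions here, not the algorithms evaluated in the paper.

```python
import numpy as np

def estimate_heart_rate(frames, masks, fps, f_lo=0.7, f_hi=4.0):
    """Generic rPPG baseline (illustrative, not the paper's method).

    frames: (T, H, W, 3) RGB video, masks: (T, H, W) boolean face/skin masks
    from a segmentation network such as FaSeNet, fps: frame rate.
    Returns the heart rate in beats per minute from the dominant spectral peak.
    """
    # Spatially average the green channel over the segmented pixels of each frame.
    signal = np.array([f[..., 1][m].mean() for f, m in zip(frames, masks)])
    signal = signal - signal.mean()

    # Pick the dominant frequency within the plausible heart-rate band.
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= f_lo) & (freqs <= f_hi)    # 0.7-4 Hz corresponds to 42-240 bpm
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0
```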
Pages: 9383–9402
Number of pages: 19
Related papers
50 records in total
  • [21] ACCURATE HEART-RATE ESTIMATION FROM FACE VIDEOS USING QUALITY-BASED FUSION
    Gupta, Puneet
    Bhowmick, Brojeshwar
    Pal, Arpan
    2017 24TH IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2017, : 4132 - 4136
  • [22] Intelligent Remote Photoplethysmography-Based Methods for Heart Rate Estimation from Face Videos: A Survey
    Premkumar, Smera
    Hemanth, Duraisamy Jude
    INFORMATICS-BASEL, 2022, 9 (03):
  • [23] RealSense = Real Heart Rate: Illumination Invariant Heart Rate Estimation from Videos
    Chen, Jie
    Chang, Zhuoqing
    Qiu, Qiang
    Li, Xiaobai
    Sapiro, Guillermo
    Bronstein, Alex
    Pietikainen, Matti
    2016 SIXTH INTERNATIONAL CONFERENCE ON IMAGE PROCESSING THEORY, TOOLS AND APPLICATIONS (IPTA), 2016,
  • [24] Heart Rate Estimation From Facial Videos for Depression Analysis
    Mustafa, Aamir
    Bhatia, Shalini
    Hayat, Munawar
    Goecke, Roland
    2017 SEVENTH INTERNATIONAL CONFERENCE ON AFFECTIVE COMPUTING AND INTELLIGENT INTERACTION (ACII), 2017, : 498 - 503
  • [25] Heart rate estimation from facial videos using nonlinear mode decomposition and improved consistency check
    Halil Demirezen
    Cigdem Eroglu Erdem
    Signal, Image and Video Processing, 2021, 15 : 1415 - 1423
  • [26] Heart rate estimation from facial videos using nonlinear mode decomposition and improved consistency check
    Demirezen, Halil
    Eroglu Erdem, Cigdem
    SIGNAL IMAGE AND VIDEO PROCESSING, 2021, 15 (07) : 1415 - 1423
  • [27] Accurate face alignment and adaptive patch selection for heart rate estimation from videos under realistic scenarios
    Wang, Zhiwei
    Yang, Xin
    Cheng, Kwang-Ting
    PLOS ONE, 2018, 13 (05):
  • [28] Self-Adaptive Matrix Completion for Heart Rate Estimation from Face Videos under Realistic Conditions
    Tulyakov, Sergey
    Alameda-Pineda, Xavier
    Ricci, Elisa
    Yin, Lijun
    Cohn, Jeffrey F.
    Sebe, Nicu
    2016 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2016, : 2396 - 2404
  • [29] Heart Rate and Oxygen Level Estimation from Facial Videos Using a Hybrid Deep Learning Model
    Zheng, Yufeng
    MULTIMODAL IMAGE EXPLOITATION AND LEARNING 2024, 2024, 13033
  • [30] FITNESS HEART RATE MEASUREMENT USING FACE VIDEOS
    Zhu, Qiang
    Wong, Chau-Wai
    Fu, Chang-Hong
    Wu, Min
    2017 24TH IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2017, : 2000 - 2004