Video-based heart rate estimation from challenging scenarios using synthetic video generation

Cited: 0
Authors
Benezeth, Yannick [1 ]
Krishnamoorthy, Deepak [2 ]
Monsalve, Deivid Johan Botina [1 ]
Nakamura, Keisuke [3 ]
Gomez, Randy [3 ]
Miteran, Johel [1 ]
Affiliations
[1] Univ Bourgogne, ImViA, EA7535, Dijon, France
[2] Amrita Vishwa Vidyapeetham, Amrita Sch Comp, Dept Comp Sci & Engn, Chennai 601103, India
[3] Honda Res Inst Japan Co, Wako, Saitama, Japan
Keywords
rPPG estimation; Data augmentation; Near-infrared; Fitness scenarios; REMOTE PHOTOPLETHYSMOGRAPHY
DOI
10.1016/j.bspc.2024.106598
Chinese Library Classification
R318 [Biomedical Engineering]
Subject Classification Code
0831
Abstract
Remote photoplethysmography (rPPG) is an emerging technology that allows for non-invasive monitoring of physiological signals such as heart rate, blood oxygen saturation, and respiration rate using a camera. This technology has the potential to revolutionize healthcare, sports science, and affective computing by enabling continuous monitoring in real-world environments without the need for cumbersome sensors. However, rPPG technology is still in its early stages. It faces challenges such as motion artifacts, low signal-to-noise ratio, and the difficulty of conducting near-infrared measurements in low-light or nighttime conditions. The performance of existing rPPG techniques has been significantly improved by deep learning approaches, primarily due to the availability of large public datasets. However, most of these datasets are limited to the regular RGB color modality, with only a few available in near-infrared. Additionally, training deep neural networks for specific applications with distinctive movements, such as sports and fitness, would require extensive amounts of video data to achieve optimal specialization and efficiency, which can be prohibitively expensive. Therefore, exploring alternative methods to augment datasets for specific applications is crucial to improve the performance of deep neural networks in rPPG. In response to these challenges, this paper presents a novel methodology to generate synthetic videos for pre-training deep neural networks to accurately estimate heart rates from videos captured under challenging conditions. We have evaluated this approach using two publicly available near-infrared datasets, i.e., MERL (Nowara et al., 2020) and Tokyotech (Maki et al., 2019), and one challenging fitness dataset, i.e., ECG-Fitness (Špetlík et al., 2018). Furthermore, we have collected and made publicly available a novel collection of near-infrared videos named IMVIA-NIR. Our data augmentation strategy involves generating video sequences that animate a person in a source image based on the motion captured in a driving video. Furthermore, we integrate a synthetic rPPG signal into the faces, considering important aspects such as the temporal shape of the signal, its spatial and spectral distribution, and the distribution of heart rates. This comprehensive integration process ensures a realistic incorporation of the rPPG signals into the synthetic videos. Experimental results demonstrated a significant reduction in the mean absolute error (MAE) score on all datasets. Overall, this approach provides a promising solution for improving the performance of deep neural networks in rPPG under challenging conditions.
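To illustrate the signal-embedding idea summarized in the abstract, the sketch below shows one plausible way to synthesize a pulse waveform at a sampled heart rate and add it, with a small amplitude and a spatial face mask, to the frames of a motion-transferred clip. The function names, channel weights, and amplitude are assumptions for demonstration only, not the authors' implementation.

    # Illustrative sketch (not the paper's code): embedding a synthetic rPPG
    # signal into the face pixels of a synthetic video clip.
    import numpy as np

    def synthetic_ppg(n_frames, fps=30.0, heart_rate_bpm=72.0):
        """Simple pulse waveform: fundamental plus a weaker first harmonic,
        roughly mimicking the asymmetric shape of a PPG pulse."""
        t = np.arange(n_frames) / fps
        f = heart_rate_bpm / 60.0  # beats per second
        return np.sin(2 * np.pi * f * t) + 0.3 * np.sin(4 * np.pi * f * t)

    def embed_rppg(frames, face_mask, fps=30.0, heart_rate_bpm=72.0, amplitude=1.5):
        """Add a small periodic intensity modulation to skin pixels.

        frames    : (N, H, W, 3) uint8 video, e.g. a motion-transferred clip
        face_mask : (H, W) float in [0, 1], spatial weighting of skin regions
        amplitude : peak modulation in intensity levels (kept small for realism)
        """
        signal = synthetic_ppg(len(frames), fps, heart_rate_bpm)
        # Assumed per-channel weights: the pulsatile component is strongest in
        # green for RGB; a near-infrared clip would use a single-channel weight.
        channel_weights = np.array([0.3, 1.0, 0.6])
        out = frames.astype(np.float32)
        for i, s in enumerate(signal):
            out[i] += amplitude * s * face_mask[..., None] * channel_weights
        return np.clip(out, 0, 255).astype(np.uint8)

    # Usage example: sample a heart rate from a plausible distribution, then embed.
    rng = np.random.default_rng(0)
    frames = rng.integers(0, 255, size=(300, 64, 64, 3), dtype=np.uint8)
    mask = np.zeros((64, 64), dtype=np.float32)
    mask[16:48, 16:48] = 1.0          # placeholder face region
    hr = rng.normal(75, 10)           # sampled heart rate in BPM
    augmented = embed_rppg(frames, mask, heart_rate_bpm=hr)

In practice the mask, amplitude, and per-channel weights would be tuned so that the embedded pulse matches the spatial, spectral, and temporal statistics of real rPPG signals, as the abstract describes.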
Pages: 10
Related Papers (50 records)
  • [1] Video-based Heart Rate Estimation using Embedded Architectures
    El Boussaki, Hoda
    Latif, Rachid
    Saddik, Amine
    [J]. INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2023, 14 (05) : 1155 - 1164
  • [2] VIDEO-BASED VEHICLE DETECTION AND CLASSIFICATION IN CHALLENGING SCENARIOS
    Chen, Yiling
    Qin, GuoFeng
    [J]. INTERNATIONAL JOURNAL ON SMART SENSING AND INTELLIGENT SYSTEMS, 2014, 7 (03): : 1077 - 1094
  • [3] Automated Detection of Engagement Using Video-Based Estimation of Facial Expressions and Heart Rate
    Monkaresi, Hamed
    Bosch, Nigel
    Calvo, Rafael A.
    D'Mello, Sidney K.
    [J]. IEEE TRANSACTIONS ON AFFECTIVE COMPUTING, 2017, 8 (01) : 15 - 28
  • [4] Video-based Heart Rate Measurement From Human Faces
    Pursche, T.
    Krajewski, J.
    Moeller, Reinhard
    [J]. 2012 IEEE INTERNATIONAL CONFERENCE ON CONSUMER ELECTRONICS (ICCE), 2012, : 544 - +
  • [5] Region of Interest Analysis Using Delaunay Triangulation for Facial Video-Based Heart Rate Estimation
    Gao, Haoyuan
    Zhang, Chao
    Pei, Shengbing
    Wu, Xiaopei
    [J]. IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2024, 73 : 1 - 12
  • [6] Improving Video-Based Resting Heart Rate Estimation: A Comparison of Two Methods
    Choe, Jeehyun
    Chung, Dahjung
    Schwichtenberg, A. J.
    Delp, Edward J.
    [J]. 2015 IEEE 58TH INTERNATIONAL MIDWEST SYMPOSIUM ON CIRCUITS AND SYSTEMS (MWSCAS), 2015,
  • [7] Video-based Noncontact Heart Rate Measurement Using Ear Features
    Cai, Xi
    Han, Guang
    Wang, Jinkuan
    [J]. PROCEEDINGS OF 2015 IEEE INTERNATIONAL CONFERENCE ON PROGRESS IN INFORMATICS AND COMPUTING (IEEE PIC), 2015, : 262 - 265
  • [8] An Improvement for Video-based Heart Rate Variability Measurement
    Li, Peixi
    Benezeth, Yannick
    Nakamura, Keisuke
    Gomez, Randy
    Li, Chao
    Yang, Fan
    [J]. 2019 IEEE 4TH INTERNATIONAL CONFERENCE ON SIGNAL AND IMAGE PROCESSING (ICSIP 2019), 2019, : 435 - 439
  • [9] Video-Based Analysis of Heart Rate Applied to Falls
    He, Xiaochuan
    Goubran, Rafik
    Robinovitch, Stephen
    Symes, Bobbi
    Lo, Bryan
    Ejupi, Andreas
    Knoefel, Frank
    [J]. 2018 IEEE INTERNATIONAL SYMPOSIUM ON MEDICAL MEASUREMENTS AND APPLICATIONS (MEMEA), 2018, : 930 - 934
  • [10] Evaluation of a video-based measure of driver heart rate
    Kuo, Jonny
    Koppel, Sjaan
    Charlton, Judith L.
    Rudin-Brown, Christina M.
    [J]. JOURNAL OF SAFETY RESEARCH, 2015, 54 : 55 - 59