A temporally adaptive schlieren approach by fusing frame- and event-based cameras

Cited by: 1
Authors
Lyu, Zhen [1 ]
Cai, Weiwei [1 ]
Wang, Benlong [2 ]
Liu, Yingzheng [1 ]
Affiliations
[1] Shanghai Jiao Tong Univ, Gas Turbine Res Inst, 800 Dongchuan Rd, Shanghai 200240, Peoples R China
[2] Shanghai Jiao Tong Univ, Sch Ocean & Civil Engn, 800 Dongchuan Rd, Shanghai 200240, Peoples R China
Funding
National Natural Science Foundation of China;
DOI
10.1364/OL.545691
Chinese Library Classification
O43 [Optics];
Discipline Code
070207; 0803;
Abstract
This Letter reports what we believe to be a novel schlieren approach with adaptive temporal resolution. The fundamental concept is to fuse an event-based camera with a low-speed frame-based camera, leveraging the strengths of both to generate high-frame-rate videos. Using a novel experimental setup, events and frames are accurately aligned in both space and time. The aligned data are then fed into a neural network to generate intermediate frames. The proposed approach is examined in tests on a plate heater, a burning candle, and a pulsed jet. These tests show that the approach enables continuous visualization and recording of flows at an adaptive frame rate of up to 3.3 kFPS, overcoming the short operating times of existing high-speed schlieren techniques. The developed intermediate-frame generation method also outperforms similar methods by minimizing the impact of event-camera readout latency, achieving a maximum improvement of 3.99 dB in peak signal-to-noise ratio. (c) 2025 Optica Publishing Group. All rights, including for text and data mining (TDM), artificial intelligence (AI) training, and similar technologies, are reserved.
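The core idea behind such event-frame fusion can be illustrated with the classical direct-integration model: each event marks a fixed step in log intensity, so accumulating event polarities between two frames yields an estimate of an intermediate frame. The sketch below is a simplified, hypothetical illustration of that model and of the PSNR metric quoted in the abstract; it is not the paper's neural-network method, and the function names and the `contrast_threshold` value are assumptions.

```python
import numpy as np

def synthesize_intermediate_frame(last_frame, events, contrast_threshold=0.2):
    """Estimate an intermediate frame by integrating event polarities.

    last_frame: (H, W) float array, linear intensity in (0, 1].
    events: iterable of (x, y, polarity) with polarity in {+1, -1},
            timestamped between the last frame and the desired instant.
    contrast_threshold: per-event log-intensity step (sensor dependent).
    """
    log_img = np.log(np.clip(last_frame, 1e-3, None))
    for x, y, p in events:
        log_img[y, x] += p * contrast_threshold  # each event = one log-step
    return np.clip(np.exp(log_img), 0.0, 1.0)

def psnr(ref, test):
    """Peak signal-to-noise ratio in dB for images scaled to [0, 1]."""
    mse = np.mean((ref - test) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(1.0 / mse)
```

In practice, per-pixel threshold variation and readout latency make this direct integration noisy, which is the gap learning-based fusion methods like the one reported here aim to close.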
Pages: 289-292
Page count: 4
Related Papers (50 total)
  • [1] Combined frame- and event-based detection and tracking
    Liu, Hongjie
    Moeys, Diederik Paul
    Das, Gautham
    Neil, Daniel
    Liu, Shih-Chii
    Delbruck, Tobi
    2016 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS (ISCAS), 2016, : 2511 - 2514
  • [2] On the Benefits of Visual Stabilization for Frame- and Event-Based Perception
    Rodriguez-Gomez, J. P.
    Martinez-de Dios, J. R.
    Ollero, A.
    Gallego, G.
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2024, 9 (10): : 8802 - 8809
  • [3] Investigating Event-Based Cameras for Video Frame Interpolation in Sports
    Deckyvere, Antoine
    Cioppa, Anthony
    Giancola, Silvio
    Ghanem, Bernard
    Van Droogenbroeck, Marc
    2024 IEEE INTERNATIONAL WORKSHOP ON SPORT, TECHNOLOGY AND RESEARCH, STAR 2024, 2024, : 138 - 143
  • [4] Target Tracking with Frame- and Event-based Cameras Involving Delayed and Irregularly-Sampled Visual Feedback for a Robotic Air-Hockey System
    Xiao, Hui
    Chen, Xu
    2022 AMERICAN CONTROL CONFERENCE, ACC, 2022, : 3771 - 3776
  • [5] UAV human teleoperation using event-based and frame-based cameras
    Rodriguez-Gomez, J. P.
    Tapia, R.
    Gomez Eguiluz, A.
    Martinez-de Dios, J. R.
    Ollero, A.
    1ST AIRPHARO WORKSHOP ON AERIAL ROBOTIC SYSTEMS PHYSICALLY INTERACTING WITH THE ENVIRONMENT (AIRPHARO 2021), 2021,
  • [6] Event-Based Background-Oriented Schlieren
    Shiba, Shintaro
    Hamann, Friedhelm
    Aoki, Yoshimitsu
    Gallego, Guillermo
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2024, 46 (04) : 2011 - 2026
  • [7] Real-Time, High-Speed Video Decompression Using a Frame- and Event-Based DAVIS Sensor
    Brandli, Christian
    Muller, Lorenz
    Delbruck, Tobi
    2014 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS (ISCAS), 2014, : 686 - 689
  • [8] Event-based imaging velocimetry: an assessment of event-based cameras for the measurement of fluid flows
    Willert, Christian E.
    Klinner, Joachim
    EXPERIMENTS IN FLUIDS, 2022, 63 (06)
  • [10] Movement Detection with Event-Based Cameras: Comparison with Frame-Based Cameras in Robot Object Tracking Using Powerlink Communication
    Barrios-Aviles, Juan
    Iakymchuk, Taras
    Samaniego, Jorge
    Medus, Leandro D.
    Rosado-Munoz, Alfredo
    ELECTRONICS, 2018, 7 (11)