Towards Continuous Streamflow Monitoring with Time-Lapse Cameras and Deep Learning

Cited by: 3
Authors
Gupta, Amrita [1 ]
Chang, Tony [1 ]
Walker, Jeffrey D. [2 ]
Letcher, Benjamin H. [3 ]
Affiliations
[1] Conservat Sci Partners, Truckee, CA 96161 USA
[2] Walker Environm Res LLC, Brunswick, ME USA
[3] US Geol Survey, Eastern Ecol Sci Ctr, Turners Falls, MA USA
Keywords
computational sustainability; computer vision; neural networks; learning to rank; weakly supervised learning; FLOW DURATION CURVES; PREDICTION; DROUGHT;
DOI
10.1145/3530190.3534805
Chinese Library Classification
TP39 [Computer Applications];
Subject Classification Code
081203; 0835
Abstract
Effective water resources management depends on monitoring the volume of water flowing through streams and rivers, but collecting continuous discharge measurements using traditional streamflow gages is prohibitively expensive. Time-lapse cameras offer a low-cost option for streamflow monitoring, but training models for predicting streamflow directly from images requires streamflow data to use as labels, which are often unavailable. We address this data gap by proposing the alternative task of Streamflow Rank Estimation (SRE), in which the goal is to predict relative measures of streamflow, such as percentile rank, rather than absolute flow. In particular, we use a learning-to-rank framework to train SRE models using pairs of stream images ranked in order of discharge by an annotator, obviating the need for discharge training data and thus facilitating the monitoring of streamflow conditions at streams without gages. We also demonstrate a technique for converting SRE model predictions to stream discharge estimates given an estimated streamflow distribution. Using data and images from six small US streams, we compare the performance of SRE with conventional regression models trained to predict absolute discharge. Our results show that SRE performs nearly as well as regression models on relative flow prediction. Further, we observe that the accuracy of absolute discharge estimates obtained by mapping SRE model predictions through a discharge distribution largely depends on how well the assumed discharge distribution matches the observed field data.
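To make the abstract's formulation concrete, the sketch below outlines how a Streamflow Rank Estimation model could be trained from annotator-ranked image pairs and how its scores could be mapped to discharge. This is an illustrative sketch only, not the authors' released code: the PyTorch setup, the ResNet-18 score network, the RankNet-style logistic pair loss, and the placeholder log-normal flow duration curve are all assumptions introduced here.

# Minimal SRE sketch (hypothetical; not the authors' implementation).
# A CNN scores each image; a pairwise logistic loss pushes the image the
# annotator ranked as higher flow to receive the higher score, so no gaged
# discharge values are needed for training. At inference, scores are mapped
# to discharge by passing their percentile ranks through an assumed
# flow duration curve (here a placeholder log-normal distribution).
import torch
import torch.nn as nn
from torchvision import models
from scipy.stats import lognorm


class ScoreNet(nn.Module):
    """Maps a stream image to a scalar flow score used only for ranking."""
    def __init__(self):
        super().__init__()
        backbone = models.resnet18(weights=None)
        backbone.fc = nn.Linear(backbone.fc.in_features, 1)  # single ranking score
        self.backbone = backbone

    def forward(self, x):
        return self.backbone(x).squeeze(-1)  # shape: (batch,)


def pairwise_rank_loss(score_hi, score_lo):
    """RankNet-style loss: log(1 + exp(-(s_hi - s_lo))), small when s_hi > s_lo."""
    return nn.functional.softplus(-(score_hi - score_lo)).mean()


def train_step(model, optimizer, img_hi, img_lo):
    """One update from a batch of image pairs ranked by an annotator (img_hi wetter)."""
    optimizer.zero_grad()
    loss = pairwise_rank_loss(model(img_hi), model(img_lo))
    loss.backward()
    optimizer.step()
    return loss.item()


def scores_to_discharge(scores, flow_dist=lognorm(s=1.0, scale=1.0)):
    """Convert ranking scores to discharge: scores -> percentile ranks ->
    quantiles of an assumed streamflow distribution (placeholder log-normal)."""
    ranks = scores.detach().cpu().argsort().argsort().float()
    percentiles = (ranks + 0.5) / len(scores)   # plotting-position percentiles
    return flow_dist.ppf(percentiles.numpy())   # estimated discharge values

Because only the ordering within each pair is supervised, training never touches gaged discharge; absolute discharge enters only through the assumed distribution at inference, which is why the abstract notes that accuracy hinges on how well that distribution matches the observed field data.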
Pages: 353-363
Page count: 11
Related Papers
50 items in total
  • [31] Quantification of invertebrates on fungal fruit bodies by the use of time-lapse cameras
    Lunde, Lisa F.
    Ferkingstad, Bendik
    Wegger, Hermann
    Hoye, Toke T.
    Mann, Hjalte M. R.
    Birkemoe, Tone
    ECOLOGICAL ENTOMOLOGY, 2023, 48 (05) : 544 - 556
  • [32] Time-Lapse Cameras for Measurement of Grain Corn Phenology on the Canadian Prairies
    Zhanda, Justice
    Bullock, Paul R.
    Zvomuya, Francis
    Shaykewich, Carl
    Reid, Lana M.
    Lawley, Yvonne
    Flaten, Don
    AGROSYSTEMS GEOSCIENCES & ENVIRONMENT, 2019, 2 (01) : 1 - 12
  • [33] A focus on time-lapse ethnography: learning to teach
    Douglas, Alaster Scott
    ETHNOGRAPHY AND EDUCATION, 2019, 14 (02) : 192 - 205
  • [34] Continuous models and algorithms for time-lapse seismic inversion
    Chen Yong
    Han Bo
    CHINESE JOURNAL OF GEOPHYSICS-CHINESE EDITION, 2006, 49 (04): 1164 - 1168
  • [35] DeepNRMS: Unsupervised deep learning for noise-robust CO2 monitoring in time-lapse seismic images
    Park, Min Jun
    Frigerio, Julio
    Clapp, Bob
    Biondi, Biondo
    GEOPHYSICS, 2024, 89 (04) : IM1 - IM11
  • [37] Point Cloud Stacking: A Workflow to Enhance 3D Monitoring Capabilities Using Time-Lapse Cameras
    Blanch, Xabier
    Abellan, Antonio
    Guinau, Marta
    REMOTE SENSING, 2020, 12 (08)
  • [38] Neural network for time-lapse seismic reservoir monitoring
    [Anonymous]
    JOURNAL OF PETROLEUM TECHNOLOGY, 2001, 53 (08): 44+
  • [39] Time-lapse seismic monitoring: Repeatability processing tests
    [Anonymous]
    JOURNAL OF PETROLEUM TECHNOLOGY, 1998, 50 (01): 34