Towards Continuous Streamflow Monitoring with Time-Lapse Cameras and Deep Learning

Cited by: 3
Authors
Gupta, Amrita [1 ]
Chang, Tony [1 ]
Walker, Jeffrey D. [2 ]
Letcher, Benjamin H. [3 ]
Affiliations
[1] Conservat Sci Partners, Truckee, CA 96161 USA
[2] Walker Environm Res LLC, Brunswick, ME USA
[3] US Geol Survey, Eastern Ecol Sci Ctr, Turners Falls, MA USA
Keywords
computational sustainability; computer vision; neural networks; learning to rank; weakly supervised learning; flow duration curves; prediction; drought
DOI: 10.1145/3530190.3534805
Chinese Library Classification: TP39 [Computer Applications]
Subject Classification Code: 081203; 0835
Abstract
Effective water resources management depends on monitoring the volume of water flowing through streams and rivers, but collecting continuous discharge measurements using traditional streamflow gages is prohibitively expensive. Time-lapse cameras offer a low-cost option for streamflow monitoring, but training models to predict streamflow directly from images requires streamflow data to use as labels, which are often unavailable. We address this data gap by proposing the alternative task of Streamflow Rank Estimation (SRE), in which the goal is to predict relative measures of streamflow, such as percentile rank, rather than absolute flow. In particular, we use a learning-to-rank framework to train SRE models on pairs of stream images ranked in order of discharge by an annotator, obviating the need for discharge training data and thus facilitating the monitoring of streamflow conditions at ungaged streams. We also demonstrate a technique for converting SRE model predictions to stream discharge estimates given an estimated streamflow distribution. Using data and images from six small US streams, we compare the performance of SRE with conventional regression models trained to predict absolute discharge. Our results show that SRE performs nearly as well as regression models on relative flow prediction. Further, we observe that the accuracy of absolute discharge estimates obtained by mapping SRE model predictions through a discharge distribution depends largely on how well the assumed distribution matches the field-observed data.
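Illustrative sketches (not from the paper): the two ideas in the abstract, pairwise learning-to-rank training and mapping rank predictions through an assumed discharge distribution, can be outlined in a few lines of Python. The backbone choice (ResNet-18), the margin ranking loss, and all hyperparameters below are assumptions made for illustration; the abstract does not specify the authors' architecture or training details.

import torch
import torch.nn as nn
from torchvision import models

class SREScorer(nn.Module):
    # Maps a stream image to a scalar ranking score (higher = more flow).
    def __init__(self):
        super().__init__()
        backbone = models.resnet18(weights=None)  # assumed backbone; pretrained weights optional
        backbone.fc = nn.Linear(backbone.fc.in_features, 1)
        self.backbone = backbone

    def forward(self, x):
        return self.backbone(x).squeeze(-1)  # shape: (batch,) of scores

model = SREScorer()
criterion = nn.MarginRankingLoss(margin=1.0)  # assumed margin hyperparameter
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

def train_step(img_hi, img_lo):
    # img_hi, img_lo: batches of image pairs in which an annotator judged
    # img_hi to show higher discharge than img_lo (no gage data needed).
    optimizer.zero_grad()
    s_hi, s_lo = model(img_hi), model(img_lo)
    target = torch.ones_like(s_hi)  # +1: first input should score higher
    loss = criterion(s_hi, s_lo, target)
    loss.backward()
    optimizer.step()
    return loss.item()

Given a trained scorer, the abstract describes converting SRE predictions to discharge estimates through an estimated streamflow distribution. A minimal sketch, assuming scores are first converted to empirical percentile ranks and the flow duration curve is log-normal with hypothetical parameters:

import numpy as np
from scipy import stats

def scores_to_percentiles(scores):
    # Empirical percentile rank of each score within the given set, kept
    # strictly inside (0, 1); in practice one would rank new scores against
    # a reference set of scores collected at the same camera site.
    ranks = scores.argsort().argsort()
    return (ranks + 0.5) / len(scores)

def percentiles_to_discharge(p, mu_log, sigma_log):
    # Invert an assumed log-normal discharge CDF (flow duration curve) at p.
    # scipy parameterization: s = std of log-flow, scale = exp(mean of log-flow).
    return stats.lognorm.ppf(p, s=sigma_log, scale=np.exp(mu_log))

# Hypothetical usage with made-up model scores and distribution parameters:
scores = np.array([0.3, -1.2, 2.5, 0.9])
q_est = percentiles_to_discharge(scores_to_percentiles(scores),
                                 mu_log=1.0, sigma_log=0.8)

As the abstract notes, the accuracy of the resulting discharge estimates depends largely on how well the assumed distribution matches observed flows, so a mis-specified flow duration curve propagates directly into the absolute estimates even when the relative rankings are accurate.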
Pages: 353-363 (11 pages)