Enhancing River Channel Dimension Estimation: A Machine Learning Approach Leveraging the National Water Model, Hydrographic Networks, and Landscape Characteristics
Arash Modaresi Rad, J. Michael Johnson, Zahra Ghahremani

et al.

Journal of Geophysical Research: Machine Learning and Computation, Journal Year: 2024, Volume and Issue: 1(4)

Published: Nov. 25, 2024

Abstract Knowledge of bankfull hydraulic geometry is an essential requirement for many applications, including accurate flood prediction, hydrological routing, river behavior analysis, management and engineering practices, water resource management, and beyond. Our work builds upon an extensive body of literature on estimating top-width and depth at ungauged locations to enhance the understanding of observable factors that affect these parameters. Using more than 200,000 USGS Acoustic Doppler Current Profiler (ADCP) records, we developed a method employing machine learning (ML) that uses discharge estimates and landscape characteristics from several sources: the National Water Model (NWM), the National Hydrologic Geospatial Fabric (NHGF) network, the EPA stream characteristic data set (StreamCat), and an array of satellite and reanalysis products. The method achieved a log-transformed R² of 0.8 when predicting width (0.77 for in-channel conditions) and 0.76 (0.66) when predicting depth in the testing set. The width predictions showed the lowest skill in mountainous and plateau regions. The analysis demonstrates the benefit of data-driven modeling in contrast to other global scaling-based or regional statistical methods. In summary, our study illustrates how channel dimensions can be better predicted with ML, streamflow simulations, hydrographic networks, and summarized geospatial data.
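
The abstract describes a supervised regression of log-transformed channel dimensions on discharge estimates and landscape descriptors, scored as R² in log space. A minimal sketch of that general workflow is given below, assuming a gradient-boosted regressor and synthetic placeholder features (discharge, slope, drainage_area); it illustrates the pattern, not the paper's actual feature set or model configuration.

```python
# Minimal sketch: regression of log-transformed channel width on discharge and
# landscape descriptors, scored as R^2 in log space. Feature names, the model
# choice, and all data are illustrative assumptions, not the paper's setup.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 5000
discharge = rng.lognormal(mean=3.0, sigma=1.0, size=n)      # m^3/s (synthetic)
slope = rng.lognormal(mean=-5.0, sigma=0.8, size=n)         # channel slope (synthetic)
drainage_area = rng.lognormal(mean=5.0, sigma=1.2, size=n)  # km^2 (synthetic)

# Synthetic "observed" width loosely following a power-law relation with noise.
width = 2.5 * discharge**0.5 * (1.0 + 0.1 * rng.standard_normal(n))

X = np.column_stack([np.log(discharge), np.log(slope), np.log(drainage_area)])
y = np.log(width)  # train and evaluate in log space, as in the abstract

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = GradientBoostingRegressor(n_estimators=300, max_depth=3, learning_rate=0.05)
model.fit(X_tr, y_tr)

print("log-transformed R^2:", r2_score(y_te, model.predict(X_te)))
```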

Language: English

Deep learning for water quality
Wei Zhi, Alison P. Appling, Heather E. Golden

et al.

Nature Water, Journal Year: 2024, Volume and Issue: 2(3), P. 228 - 241

Published: March 12, 2024

Language: English

Citations

64

Distributed Hydrological Modeling With Physics‐Encoded Deep Learning: A General Framework and Its Application in the Amazon
Chao Wang, Shijie Jiang, Yi Zheng

et al.

Water Resources Research, Journal Year: 2024, Volume and Issue: 60(4)

Published: April 1, 2024

Abstract While deep learning (DL) models exhibit superior simulation accuracy over traditional distributed hydrological models (DHMs), their main limitations lie in their opacity and the absence of underlying physical mechanisms. The pursuit of synergies between DL and DHMs is an engaging research domain, yet a definitive roadmap remains elusive. In this study, a novel framework is developed that seamlessly integrates a process-based model encoded as a neural network (NN), an additional NN for mapping spatially distributed, physically meaningful parameters from watershed attributes, and an NN-based replacement representing inadequately understood processes. Multi-source observations are used as training data, and the framework is fully differentiable, enabling fast parameter tuning by backpropagation. A hybrid model of the Amazon Basin (∼6 × 10⁶ km²) was established based on the framework, with HydroPy, a global-scale DHM, as its backbone. Trained simultaneously with streamflow and Gravity Recovery and Climate Experiment (GRACE) satellite data, the hybrid model yielded median Nash-Sutcliffe efficiencies of 0.83 and 0.77 for the dynamic simulations of streamflow and total water storage, respectively, 41% and 35% higher than those of the original HydroPy model. Replacing the Penman-Monteith formulation with the NN-based module produces more plausible potential evapotranspiration (PET) estimates and unravels the spatial pattern of PET across the giant basin. The learned parameterization is interpreted to identify the factors controlling the variability of key parameters. Overall, the study lays out a feasible technical roadmap for integrating DL and DHMs in the big data era.
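
The framework couples a differentiable process-based model with a neural network that maps static watershed attributes to physically meaningful parameters, so the whole chain can be tuned jointly by backpropagation. A toy sketch of that pattern is shown below, using a single linear-reservoir bucket in place of HydroPy, an MSE loss in place of the paper's multi-source objectives, and synthetic data; it illustrates the design, not the authors' implementation.

```python
# Toy sketch of a "physics-encoded" differentiable model: an NN maps static
# catchment attributes to the parameter of a simple linear reservoir, and the
# whole chain is trained end-to-end by backpropagation. HydroPy and the
# streamflow + GRACE training of the paper are not reproduced; data are synthetic.
import torch
import torch.nn as nn

torch.manual_seed(0)
n_basins, n_steps, n_attrs = 16, 100, 5

attrs = torch.randn(n_basins, n_attrs)          # static watershed attributes (synthetic)
precip = torch.rand(n_basins, n_steps) * 10.0   # forcing, mm/day (synthetic)
q_obs = torch.rand(n_basins, n_steps) * 5.0     # "observed" discharge (synthetic)

param_net = nn.Sequential(nn.Linear(n_attrs, 32), nn.ReLU(), nn.Linear(32, 1))

def simulate(k, p):
    """Differentiable linear reservoir: S_{t+1} = S_t + P_t - k*S_t, Q_t = k*S_{t+1}."""
    storage = torch.zeros(p.shape[0])
    flows = []
    for t in range(p.shape[1]):
        storage = storage + p[:, t] - k * storage
        flows.append(k * storage)
    return torch.stack(flows, dim=1)

opt = torch.optim.Adam(param_net.parameters(), lr=1e-2)
for epoch in range(100):
    k = torch.sigmoid(param_net(attrs)).squeeze(-1)  # recession parameter in (0, 1)
    q_sim = simulate(k, precip)
    loss = ((q_sim - q_obs) ** 2).mean()             # stand-in for the paper's objectives
    opt.zero_grad()
    loss.backward()
    opt.step()
print("final loss:", float(loss))
```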

Language: English

Citations

19

Enhancing long short-term memory (LSTM)-based streamflow prediction with a spatially distributed approach

Qiutong Yu, Bryan A. Tolson, Hongren Shen

et al.

Hydrology and earth system sciences, Journal Year: 2024, Volume and Issue: 28(9), P. 2107 - 2122

Published: May 14, 2024

Abstract. Deep learning (DL) algorithms have previously demonstrated their effectiveness in streamflow prediction. However, in hydrological time series modelling, the performance of existing DL methods is often bound by limited spatial information, as these data-driven models are typically trained with lumped (spatially aggregated) input data. In this study, we propose a hybrid approach, namely the spatially recursive (SR) model, that seamlessly integrates a long short-term memory (LSTM) network with a physics-based routing simulation for enhanced streamflow prediction. The LSTM was trained on basin-averaged meteorological and hydrological variables derived from 141 gauged basins located in the Great Lakes region of North America. The SR model involves applying the LSTM at the subbasin scale to obtain local predictions, which are then translated to the basin outlet by the routing model. We evaluated the efficacy of the SR model with respect to predicting streamflow at 224 stations across the region and compared it with the standalone LSTM. The results indicate that the SR model achieved performance levels on par with the standalone LSTM in the basins used for training the LSTM. Additionally, the SR model was able to predict streamflow more accurately in large basins (e.g., drainage area greater than 2000 km2), underscoring the substantial information loss associated with basin-wise feature aggregation. Furthermore, the SR model outperformed the standalone LSTM when applied to basins that were not part of the training set (i.e., pseudo-ungauged basins). The implication of this study is that streamflow predictions, especially in ungauged basins, can be reliably improved by considering spatial heterogeneity at finer resolution via the SR model.
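
The SR model runs the LSTM at the subbasin scale and then routes the local predictions to the basin outlet with a physics-based routing step. The sketch below shows the general pattern with a shared PyTorch LSTM over per-subbasin forcings and a simple lag-and-sum stand-in for the routing; the actual routing scheme, features, and training used in the paper are not reproduced.

```python
# Sketch of the spatially recursive idea: a shared LSTM predicts local runoff
# for each subbasin, and a lag-and-sum step (a placeholder for physics-based
# routing) translates the local predictions to the basin outlet.
import torch
import torch.nn as nn

torch.manual_seed(0)
n_subbasins, n_steps, n_feats = 8, 120, 4
forcings = torch.randn(n_subbasins, n_steps, n_feats)  # per-subbasin meteorology (synthetic)
lags = [0, 1, 1, 2, 2, 3, 4, 4]                        # travel time to outlet, in days (synthetic)

class LocalRunoffLSTM(nn.Module):
    def __init__(self, n_feats, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(n_feats, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):
        h, _ = self.lstm(x)              # (subbasins, time, hidden)
        return self.head(h).squeeze(-1)  # local runoff, (subbasins, time)

model = LocalRunoffLSTM(n_feats)
with torch.no_grad():
    local_q = model(forcings)            # local predictions at the subbasin scale

# Shift each subbasin series by its travel time and sum at the outlet.
outlet_q = torch.zeros(n_steps)
for i, lag in enumerate(lags):
    outlet_q[lag:] += local_q[i, :n_steps - lag]
print(outlet_q.shape)  # torch.Size([120])
```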

Language: English

Citations

10

When ancient numerical demons meet physics-informed machine learning: adjoint-based gradients for implicit differentiable modeling
Yalan Song, Wouter Knoben, Martyn Clark

et al.

Hydrology and earth system sciences, Journal Year: 2024, Volume and Issue: 28(13), P. 3051 - 3077

Published: July 15, 2024

Abstract. Recent advances in differentiable modeling, a genre of physics-informed machine learning that trains neural networks (NNs) together with process-based equations, have shown promise in enhancing hydrological models' accuracy, interpretability, and knowledge-discovery potential. Current differentiable models are efficient for NN-based parameter regionalization, but the simple explicit numerical schemes paired with sequential calculations (operator splitting) can incur numerical errors whose impacts on the representation power of the models and the learned parameters are not clear. Implicit schemes, however, cannot rely on automatic differentiation to calculate gradients due to potential issues with gradient vanishing and memory demand. Here we propose a “discretize-then-optimize” adjoint method to enable implicit schemes for the first time in large-scale differentiable hydrological modeling. The resulting model demonstrates comprehensively improved performance, with Kling–Gupta efficiency coefficients, peak-flow and low-flow metrics, and evapotranspiration estimates that moderately surpass those of the already-competitive explicit model. Therefore, the previous sequential-calculation approach had a detrimental impact on the model's ability to represent hydrological dynamics. Furthermore, a structural update that describes capillary rise helps to better describe baseflow in arid regions and to produce low flows that outperform even pure machine learning methods such as long short-term memory networks. These changes rectified some parameter distortions while leaving the spatial distributions of the learned parameters essentially unchanged, demonstrating the robustness of the regionalized parameterization. Despite higher computational expenses and modest performance improvements, this success removes a barrier to using more complex numerical schemes and process representations to enrich differentiable modeling in hydrology.
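
The key idea is to differentiate through an implicit time step without backpropagating through the solver iterations, by applying the implicit function theorem to the discretized residual ("discretize then optimize"). A minimal scalar sketch of that pattern for a backward-Euler step of dS/dt = u - k·S is shown below; it illustrates the adjoint-style gradient, not the paper's large-scale implementation.

```python
# Minimal sketch of "discretize-then-optimize" adjoint gradients for an implicit
# (backward Euler) step of dS/dt = u - k*S. The forward pass solves the residual
# g(S_new) = S_new - S_old - dt*(u - k*S_new) = 0 with Newton's method; the
# backward pass uses the implicit function theorem instead of differentiating
# through the Newton iterations. Values below are illustrative.
import torch

class ImplicitEulerStep(torch.autograd.Function):
    @staticmethod
    def forward(ctx, s_old, k, u, dt):
        s = s_old.detach().clone()
        for _ in range(20):                      # Newton iterations on g(s) = 0
            g = s - s_old - dt * (u - k * s)
            dg_ds = 1.0 + dt * k
            s = s - g / dg_ds
        ctx.save_for_backward(s, s_old, k, u)
        ctx.dt = dt
        return s

    @staticmethod
    def backward(ctx, grad_out):
        s, s_old, k, u = ctx.saved_tensors
        dt = ctx.dt
        dg_ds = 1.0 + dt * k                     # dg/dS_new
        # Implicit function theorem: dS_new/dx = -(dg/dx) / (dg/dS_new)
        ds_dsold = 1.0 / dg_ds                   # dg/dS_old = -1
        ds_dk = -(dt * s) / dg_ds                # dg/dk = dt * S_new
        ds_du = dt / dg_ds                       # dg/du = -dt
        return grad_out * ds_dsold, grad_out * ds_dk, grad_out * ds_du, None

k = torch.tensor(0.3, requires_grad=True)
s = torch.tensor(1.0)
for u in [2.0, 1.5, 0.5]:                        # a tiny forcing series
    s = ImplicitEulerStep.apply(s, k, torch.tensor(u), 1.0)
loss = (s - 1.2) ** 2
loss.backward()
print("dloss/dk via adjoint-style backward:", float(k.grad))
```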

Language: English

Citations

9

Machine Learning for a Heterogeneous Water Modeling Framework
Jonathan Frame, Ryoko Araki, Soelem Aafnan Bhuiyan

et al.

JAWRA Journal of the American Water Resources Association, Journal Year: 2025, Volume and Issue: 61(1)

Published: Feb. 1, 2025

ABSTRACT This technical note describes recent efforts to integrate machine learning (ML) models, specifically long short-term memory (LSTM) networks and differentiable parameter learning conceptual hydrological models (δ models), into the next-generation water resources modeling framework (Nextgen) to enhance future versions of the U.S. National Water Model (NWM). We address three specific methodology gaps in this new framework: (1) assessing model performance across many ungauged catchments, (2) diagnostic-based model selection, and (3) model regionalization based on catchment attributes. We demonstrate that an LSTM trained on CAMELS catchments can make large-scale predictions with Nextgen in the New England region that match the average flow duration curve observed by stream gauges for streamflow with low exceedance probability (high flows), but diverge from the observed mean at high exceedance probability (low flows). While we see improvements in peak flows when using the δ model, the results also suggest that these increases may come at the cost of accurately representing hydrologic states within the model. We propose a novel approach that uses ML to predict the most performant model for each catchment, producing a model mosaic with improved distributions of efficiency scores throughout a large sample of basins. Our findings advocate for the development of these capabilities for advancing operational water resources modeling.
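
The "model mosaic" idea uses catchment attributes to predict which member model is likely to perform best in each catchment. A small sketch of that selection step is given below, with a random-forest classifier, synthetic attributes, and placeholder model labels ("lstm", "delta"); none of these reflect the paper's actual configuration.

```python
# Sketch of a model mosaic: a classifier trained on catchment attributes
# predicts the most performant member model per catchment; predictions are
# then drawn from the selected model. Attributes, labels, and the split into
# "gauged" and "ungauged" catchments are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_catchments, n_attrs = 400, 6
attrs = rng.standard_normal((n_catchments, n_attrs))   # catchment attributes (synthetic)

# Best-performing model per catchment, as would be diagnosed on gauged basins.
best_model = np.where(attrs[:, 0] + 0.5 * attrs[:, 1] > 0, "lstm", "delta")

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(attrs[:300], best_model[:300])                 # train on "gauged" catchments

# Assign models to held-out ("ungauged") catchments to form the mosaic.
mosaic = clf.predict(attrs[300:])
print(dict(zip(*np.unique(mosaic, return_counts=True))))
```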

Language: English

Citations

1

CH-RUN: a deep-learning-based spatially contiguous runoff reconstruction for Switzerland
Basil Kraft, Michael Schirmer, William H. Aeberhard

et al.

Hydrology and earth system sciences, Journal Year: 2025, Volume and Issue: 29(4), P. 1061 - 1082

Published: Feb. 27, 2025

Abstract. This study presents a data-driven reconstruction of daily runoff that covers the entirety of Switzerland over an extensive period from 1962 to 2023. To this end, we harness the capabilities of deep-learning-based models to learn complex runoff-generating processes directly from observations, thereby facilitating efficient large-scale simulation of runoff rates at ungauged locations. We test two sequential deep-learning architectures: a long short-term memory (LSTM) model, which is a recurrent neural network able to learn temporal features from sequences, and a convolution-based model, which learns temporal dependencies via 1D convolutions in the time domain. The models receive temperature, precipitation, and static catchment properties as input. By driving the resulting model with gridded temperature and precipitation data available since the 1960s, we provide a spatiotemporally continuous reconstruction of runoff. The efficacy of the developed model is thoroughly assessed through spatiotemporal cross-validation and compared against a distributed hydrological model used operationally in Switzerland. The model demonstrates not only competitive performance, but also notable improvements over traditional hydrological modeling in replicating runoff patterns, capturing interannual variability, and discerning long-term trends. We subsequently delineate substantial shifts in Swiss water resources throughout the past decades. These are characterized by an increased occurrence of dry years, contributing to a negative decadal trend in runoff, particularly during the summer months. These insights are pivotal for the understanding and management of water resources, particularly in the context of climate change and environmental conservation. The reconstruction product is made available online. Furthermore, the low data requirements and computational efficiency of our model pave the way for simulating diverse scenarios and conducting comprehensive attribution studies. This represents a significant progression in the field, allowing the analysis of thousands of scenarios in a time frame significantly shorter than that of traditional methods.
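
Both architectures map daily temperature and precipitation plus static catchment properties to runoff; the second learns temporal dependencies with 1D convolutions instead of recurrence. A small sketch of such a convolution-based variant in PyTorch is shown below; layer sizes and the way static attributes are injected are illustrative assumptions, not the CH-RUN configuration.

```python
# Sketch of a convolution-based runoff model: dynamic forcings (temperature,
# precipitation) are processed with 1D convolutions over time, while static
# catchment properties are broadcast along the time axis and concatenated.
# Layer sizes and the feature layout are illustrative.
import torch
import torch.nn as nn

class ConvRunoff(nn.Module):
    def __init__(self, n_dynamic=2, n_static=8, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(n_dynamic + n_static, hidden, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.Conv1d(hidden, 1, kernel_size=1),
        )

    def forward(self, dynamic, static):
        # dynamic: (batch, time, n_dynamic); static: (batch, n_static)
        t = dynamic.shape[1]
        static_rep = static.unsqueeze(1).expand(-1, t, -1)            # repeat along time
        x = torch.cat([dynamic, static_rep], dim=-1).transpose(1, 2)  # (batch, channels, time)
        return self.net(x).squeeze(1)                                 # runoff: (batch, time)

model = ConvRunoff()
dynamic = torch.randn(4, 365, 2)     # one year of daily temperature + precipitation (synthetic)
static = torch.randn(4, 8)           # static catchment properties (synthetic)
print(model(dynamic, static).shape)  # torch.Size([4, 365])
```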

Language: English

Citations

1

Metamorphic testing of machine learning and conceptual hydrologic models
Peter Reichert, Kai Ma, Marvin Höge

et al.

Hydrology and earth system sciences, Journal Year: 2024, Volume and Issue: 28(11), P. 2505 - 2529

Published: June 13, 2024

Abstract. Predicting the response of hydrologic systems to modified driving forces beyond the patterns that have occurred in the past is of high importance for estimating climate change impacts or the effect of management measures. This kind of prediction requires a model, but the impossibility of testing such predictions against observed data makes it difficult to estimate their reliability. Metamorphic testing offers a methodology for assessing models beyond validation with real data. It consists of defining input changes for which the expected responses are assumed to be known, at least qualitatively, and testing model behavior for consistency with these expectations. To increase the gain of information and reduce the subjectivity of this approach, we extend it to a multi-model approach and include a sensitivity analysis with respect to training and calibration options. This allows us to quantitatively analyze the differences between different model structures and calibration options in addition to the qualitative test of the expected responses. In our case study, we apply this approach to selected conceptual and machine learning hydrological models calibrated for basins from the CAMELS data set. Our results confirm the superiority of the machine learning models over the conceptual models regarding the quality of fit during calibration and validation periods. However, we also find that the responses to modified inputs can deviate from the expectations in magnitude, and even in sign, and can depend on the training and calibration options. In addition, in cases in which all models passed the metamorphic test, there were still quantitative differences between the model structures. This demonstrates how metamorphic testing complements the usual calibration–validation approach by helping to identify potential problems and stimulating the development of improved models.
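
A metamorphic test applies an input change whose qualitative effect is assumed known and checks whether the model response is consistent with that expectation. The sketch below shows one such relation (uniformly increasing precipitation should not decrease mean simulated runoff) against a trivial placeholder model; the relation, model, and data are illustrative, not those of the study.

```python
# Sketch of a metamorphic test: apply a defined input change with a known
# qualitative expectation and check the model response for consistency.
# The "model" below is a trivial stand-in; data are synthetic.
import numpy as np

def toy_model(precip, temp, k=0.4):
    """Placeholder hydrological model: runoff as a temperature-modulated fraction of precipitation."""
    return k * precip * np.clip(1.0 - 0.01 * temp, 0.0, 1.0)

def metamorphic_precip_test(model, precip, temp, factor=1.1):
    """Expectation: scaling precipitation by a factor > 1 should not reduce mean runoff."""
    base = model(precip, temp).mean()
    perturbed = model(factor * precip, temp).mean()
    return perturbed >= base, float(perturbed - base)

rng = np.random.default_rng(0)
precip = rng.gamma(shape=2.0, scale=3.0, size=365)   # synthetic daily precipitation
temp = rng.normal(10.0, 8.0, size=365)               # synthetic daily temperature

passed, delta = metamorphic_precip_test(toy_model, precip, temp)
print("test passed:", passed, "| change in mean runoff:", round(delta, 3))
```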

Language: English

Citations

6

Development of a Distributed Physics‐Informed Deep Learning Hydrological Model for Data‐Scarce Regions
L. Zhong, Huimin Lei, Jingjing Yang

et al.

Water Resources Research, Journal Year: 2024, Volume and Issue: 60(6)

Published: June 1, 2024

Abstract Climate change has exacerbated water stress and water-related disasters, necessitating more precise streamflow simulations. However, in the majority of global regions, a deficiency of observational data constitutes a significant constraint on modeling endeavors. Traditional distributed hydrological models and regionalization approaches have shown suboptimal performance. While current deep learning (DL)-related models trained on large data sets excel in spatial generalization, their direct applicability in certain regions with unique hydrological processes can be challenging due to limited representativeness within the training set. Furthermore, transferring pre-trained DL models still necessitates local data for retraining, thereby constraining their applicability. To address these challenges, we present a distributed physics-informed deep learning model based on a differentiable modeling framework. It involves the spatial discretization of the watershed and the establishment of differentiable hydrological models for the discrete sub-basins, coupled with the Muskingum method for channel routing. By introducing upstream-downstream relationships, errors in the sub-basins propagate through the river network to the watershed outlet, enabling optimization using downstream data and achieving simulation in ungauged internal sub-basins. The model, when trained solely on the downstream-most station, outperforms traditional approaches at both this station and upstream held-out stations. Additionally, in comparison with DL models, our model requires fewer gauge stations for training, but achieves higher precision in simulating spatially distributed stations, indicating better spatial generalization ability. Consequently, this model offers a novel approach for data-scarce regions, especially those with poor representativeness.
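
The abstract couples sub-basin models through Muskingum channel routing, which propagates flows (and, in the differentiable setting, gradients) downstream. A minimal sketch of the Muskingum recursion is given below; the parameter values and inflow series are illustrative, not those used in the paper.

```python
# Sketch of Muskingum channel routing: O_{t+1} = C0*I_{t+1} + C1*I_t + C2*O_t,
# with coefficients derived from the storage constant K, weighting factor X,
# and time step dt. Parameter values and the inflow hydrograph are illustrative.
import numpy as np

def muskingum_route(inflow, K=12.0, X=0.2, dt=6.0):
    denom = 2.0 * K * (1.0 - X) + dt
    c0 = (dt - 2.0 * K * X) / denom
    c1 = (dt + 2.0 * K * X) / denom
    c2 = (2.0 * K * (1.0 - X) - dt) / denom          # c0 + c1 + c2 == 1
    outflow = np.zeros_like(inflow)
    outflow[0] = inflow[0]                           # assume initial steady state
    for t in range(len(inflow) - 1):
        outflow[t + 1] = c0 * inflow[t + 1] + c1 * inflow[t] + c2 * outflow[t]
    return outflow

# Triangular flood wave as inflow (time units consistent with K and dt).
inflow = np.concatenate([np.linspace(10, 100, 10), np.linspace(100, 10, 20)])
print(np.round(muskingum_route(inflow), 1))
```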

Language: English

Citations

5

Deep dive into hydrologic simulations at global scale: harnessing the power of deep learning and physics-informed differentiable models (δHBV-globe1.0-hydroDL)

Dapeng Feng, Hylke E. Beck, Jens de Bruijn

et al.

Geoscientific model development, Journal Year: 2024, Volume and Issue: 17(18), P. 7181 - 7198

Published: Sept. 26, 2024

Abstract. Accurate hydrologic modeling is vital to characterizing how the terrestrial water cycle responds to climate change. Pure deep learning (DL) models have been shown to outperform process-based ones while remaining difficult to interpret. More recently, differentiable, physics-informed machine learning models with a physical backbone can systematically integrate physical equations and DL, predicting untrained variables and processes with high performance. However, it is unclear if such models are competitive for global-scale applications with a simple backbone. Therefore, we use, for the first time at this scale, a differentiable model (full name δHBV-globe1.0-hydroDL, shortened to δHBV here) to simulate rainfall–runoff processes in 3753 basins around the world. Moreover, we compare it with a purely data-driven long short-term memory (LSTM) model to examine their strengths and limitations. Both δHBV and the LSTM provide daily simulation capabilities in global basins, with median Kling–Gupta efficiency (KGE) values close to or higher than 0.7 (and 0.78 for a subset of 1675 basins with long-term discharge records), significantly outperforming traditional models. The regionalized differentiable model demonstrated stronger spatial generalization ability (median KGE of 0.64) than a traditional parameter regionalization approach (median KGE of 0.46), even in ungauged-region tests across continents. Nevertheless, relative to the LSTM, the differentiable model was hampered by structural deficiencies in cold or polar regions, highly arid regions, and basins with significant human impacts. This study also sets a benchmark for streamflow estimates around the world and builds a foundation for improving global hydrologic simulations.
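
The skill scores quoted here are Kling–Gupta efficiencies. For reference, a minimal sketch of the standard KGE computation is shown below with synthetic discharge series; it illustrates the metric, not the paper's evaluation pipeline.

```python
# Sketch of the Kling-Gupta efficiency (KGE):
# KGE = 1 - sqrt((r-1)^2 + (alpha-1)^2 + (beta-1)^2), where r is the correlation,
# alpha the ratio of standard deviations, and beta the ratio of means between
# simulated and observed discharge. Data below are synthetic.
import numpy as np

def kge(sim, obs):
    r = np.corrcoef(sim, obs)[0, 1]
    alpha = np.std(sim) / np.std(obs)
    beta = np.mean(sim) / np.mean(obs)
    return 1.0 - np.sqrt((r - 1.0) ** 2 + (alpha - 1.0) ** 2 + (beta - 1.0) ** 2)

rng = np.random.default_rng(0)
obs = rng.gamma(shape=2.0, scale=5.0, size=1000)           # synthetic "observed" discharge
sim = obs * 0.9 + rng.normal(0.0, 1.0, size=1000)          # synthetic "simulated" discharge
print("KGE:", round(kge(sim, obs), 3))
```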

Language: English

Citations

5

Deep learning insights into suspended sediment concentrations across the conterminous United States: Strengths and limitations
Yalan Song, Piyaphat Chaemchuen, Farshid Rahmani

et al.

Journal of Hydrology, Journal Year: 2024, Volume and Issue: 639, P. 131573 - 131573

Published: June 24, 2024

Language: English

Citations

4