Department of Civil, Environmental, and Geospatial Engineering; Great Lakes Research Center; College of Forest Resources and Environmental Science; Department of Computer Science
Accurate estimates of the lake surface temperature (LST) of the Great Lakes are critical to understanding the regional climate. Dedicated lake models of varying complexity have been used to simulate LST, but they suffer from noticeable biases and can be computationally expensive. Additionally, the available historical LST datasets are limited by either short temporal coverage (<30 years) or coarse spatial resolution (0.25° × 0.25°). Therefore, in this study, we employed a deep learning model based on Long Short-Term Memory (LSTM) neural networks to produce a daily LST dataset for the Great Lakes that spans an unparalleled 42 years (1979–2020) at a spatial resolution of ~1 km. In our dataset, the Great Lakes are represented by ~33,000 unstructured grid points, and the LSTM training incorporated the information from each grid point. The LSTM was trained with seven meteorological variables from reanalysis data as feature variables and the LST from a historical satellite-derived dataset as the target variable. The LSTM captured the spatial heterogeneity of LST in the Great Lakes well and exhibited high correlation (≥0.92) and low bias (within ±1.5 °C) for the temporal evolution of LST during the training (1995–2020) and testing (1979–1994) periods.
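The abstract describes an LSTM that maps daily meteorological features at each grid point to an LST estimate. As a minimal sketch of that setup (not the authors' code: the hidden size, sequence length, and linear output head are illustrative assumptions, and the weights here are random rather than trained), an LSTM cell over a sequence of seven reanalysis variables could look like:

```python
import numpy as np

rng = np.random.default_rng(0)

# Seven meteorological feature variables per day, per the abstract;
# the hidden size is an arbitrary choice for illustration.
n_features, n_hidden = 7, 16

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Stacked gate weights for the input, forget, cell, and output gates,
# acting on the concatenated [x_t, h_{t-1}] vector.
W = rng.standard_normal((4 * n_hidden, n_features + n_hidden)) * 0.1
b = np.zeros(4 * n_hidden)
w_out = rng.standard_normal(n_hidden) * 0.1  # hypothetical linear head -> LST (degC)

def lstm_predict(seq):
    """seq: (T, 7) array of daily meteorological features for one grid point."""
    h = np.zeros(n_hidden)
    c = np.zeros(n_hidden)
    for x_t in seq:
        z = W @ np.concatenate([x_t, h]) + b
        i, f, g, o = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(g)   # cell state update
        h = o * np.tanh(c)           # hidden state
    return float(w_out @ h)         # scalar LST estimate for the final day

# Example: 30 days of (standardized) features at a single grid point.
lst_hat = lstm_predict(rng.standard_normal((30, n_features)))
print(lst_hat)
```

In the study's setting, one such model would be trained across the ~33,000 grid points with satellite-derived LST as the regression target; a framework like PyTorch or Keras would replace this hand-rolled cell in practice.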
Havens, T. C.,
Reconstructing 42 Years (1979–2020) of Great Lakes Surface Temperature through a Deep Learning Approach.
Retrieved from: https://digitalcommons.mtu.edu/michigantech-p2/108
This work is licensed under a Creative Commons Attribution 4.0 International License.