The effects of hyperparameters on deep learning of turbulent signals DOI
Panagiotis Tirchas, Dimitris Drikakis, Ioannis W. Kokkinakis

et al.

Physics of Fluids, Journal Year: 2024, Volume and Issue: 36(12)

Published: Dec. 1, 2024

The effect of hyperparameter selection in deep learning (DL) models for fluid dynamics remains an open question in the current scientific literature. Many authors report results using such models; however, better insight is required to assess the models' behavior, particularly on complex datasets such as turbulent signals. This study presents a meticulous investigation of long short-term memory (LSTM) hyperparameters, focusing specifically on applications involving the prediction of signals from shock boundary layer interaction. Unlike conventional methodologies that rely on automated optimization techniques, this research explores the intricacies and impact of manual adjustments to the model. These include the number of layers, the number of neurons per layer, the learning rate, the dropout rate, and the batch size, examined to investigate their effect on the model's predictive accuracy and computational efficiency. The paper details the iterative tuning process through a series of experimental setups, highlighting how each parameter adjustment contributes to a deeper understanding of complex, time-series data. The findings emphasize the effectiveness of precise tuning in achieving superior model performance, providing valuable insights for researchers and practitioners who seek to leverage LSTM networks for intricate temporal data analysis. The approach not only refines predictability in specific contexts but also serves as a guide for similar applications in other specialized domains, thereby informing the development of more effective models.
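
The manual tuning workflow described above can be illustrated with a short, hypothetical PyTorch sketch of a sweep over the listed hyperparameters (layers, neurons per layer, learning rate, dropout; batch size would be varied analogously). The model, synthetic data, and values below are assumptions for illustration only, not the authors' implementation.

```python
# Hypothetical sketch of a manual LSTM hyperparameter sweep (not the authors' code).
import itertools
import torch
import torch.nn as nn

class SignalLSTM(nn.Module):
    """LSTM regressor mapping a window of past samples to the next sample."""
    def __init__(self, n_layers, hidden_size, dropout):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size,
                            num_layers=n_layers, batch_first=True,
                            dropout=dropout if n_layers > 1 else 0.0)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):                  # x: (batch, window, 1)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])    # next-sample prediction

def train_one_config(loader, n_layers, hidden, lr, dropout, epochs=3):
    model = SignalLSTM(n_layers, hidden, dropout)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for xb, yb in loader:
            opt.zero_grad()
            loss_fn(model(xb), yb).backward()
            opt.step()
    return model

# Synthetic surrogate for a turbulent pressure signal (illustrative only).
t = torch.linspace(0, 100, 5000)
signal = torch.sin(t) + 0.3 * torch.randn_like(t)
window = 64
X = torch.stack([signal[i:i + window] for i in range(len(signal) - window)]).unsqueeze(-1)
y = signal[window:].unsqueeze(-1)
loader = torch.utils.data.DataLoader(
    torch.utils.data.TensorDataset(X, y), batch_size=64, shuffle=True)

# Manual grid over the hyperparameters discussed in the abstract; batch size
# would be varied analogously by rebuilding the DataLoader.
for n_layers, hidden, lr, dropout in itertools.product(
        [1, 2, 3], [32, 64, 128], [1e-3, 1e-4], [0.0, 0.2]):
    model = train_one_config(loader, n_layers, hidden, lr, dropout)
    # ...evaluate RMSE on a held-out split and log it for this configuration
```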

Language: English

Informers for turbulent time series data forecast DOI
Dimitris Drikakis, Ioannis W. Kokkinakis, Daryl L. X. Fung

et al.

Physics of Fluids, Journal Year: 2025, Volume and Issue: 37(1)

Published: Jan. 1, 2025

Long-sequence time-series forecasting requires deep learning models with high predictive capacity to capture long-range dependencies between inputs and outputs effectively. This study presents a methodology for forecasting pressure time series in shock-wave, turbulent boundary layer interaction flows. Pressure signals were extracted below the λ-shock foot for six deformed rigid panel surface cases, where the low-frequency unsteadiness of the shock–boundary layer interaction is most prominent. The Informer model demonstrated superior performance in accurately predicting the signals. Comparative numerical experiments revealed that the Informer generally outperformed the Transformer, as indicated by lower root mean square errors and a more accurate power spectrum. The Informer effectively resolved the signals and better matched the ground truth's low- and mid-frequency content. The forecast accuracy remained robust across all deformation cases, though subtle yet noticeable discrepancies still manifested. The accuracy was heavily dependent on the forecasting step size. A step size of four provided a closer match to the ground truth in a deterministic manner, while a step size of eight achieved agreement in a stochastic sense. Larger step sizes resulted in a gradual decline in accuracy.
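
As an illustration of the step-size dependence noted above, the following hypothetical PyTorch sketch shows rolling multi-step forecasting with a configurable step size. It uses a generic Transformer encoder as a stand-in; it does not reproduce the Informer architecture or the paper's data.

```python
# Hypothetical rolling multi-step forecasting with different step sizes
# (a generic Transformer stand-in, not the Informer itself).
import torch
import torch.nn as nn

class TinyForecaster(nn.Module):
    """Stand-in encoder mapping an input window to the next `step` samples."""
    def __init__(self, window=96, step=4, d_model=64):
        super().__init__()
        self.proj = nn.Linear(1, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, step)
        self.window, self.step = window, step

    def forward(self, x):                        # x: (batch, window, 1)
        h = self.encoder(self.proj(x))
        return self.head(h[:, -1, :])            # (batch, step)

@torch.no_grad()
def rolling_forecast(model, history, horizon):
    """Forecast `horizon` samples, `model.step` samples per call."""
    buf, preds, n = history.clone(), [], 0
    while n < horizon:
        x = buf[-model.window:].view(1, -1, 1)
        nxt = model(x).squeeze(0)                # (step,) newly forecast samples
        preds.append(nxt)
        buf = torch.cat([buf, nxt])
        n += nxt.numel()
    return torch.cat(preds)[:horizon]

# Smaller step sizes roll forward more often; the abstract reports a step size
# of four tracking the signal deterministically and eight only statistically.
history = torch.randn(96)                        # placeholder pressure history
for step in (4, 8, 16):
    model = TinyForecaster(step=step)            # would be trained in practice
    print(step, rolling_forecast(model, history, horizon=64).shape)
```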

Language: English

Citations

0

Flow field prediction with self-supervised learning and graph transformer: A high performance solution for limited data and complex flow scenarios DOI
Hang Shen, Dan Zhang, Akira Rinoshika

et al.

Physics of Fluids, Journal Year: 2025, Volume and Issue: 37(4)

Published: April 1, 2025

To address the challenges of limited labeled data and insufficient global feature extraction in flow field prediction, this paper proposes a modeling approach that combines self-supervised learning with a Graph Transformer. The self-supervised module leverages reconstruction and contrastive tasks to fully exploit the latent information in unlabeled data, thereby enhancing the joint capability for extracting local features. The Graph Transformer incorporates a self-attention mechanism, enabling effective modeling of long-range dependencies and multiscale features in complex flow fields. Experimental results demonstrate that, under 100% labeled-data conditions, the proposed method reduces the root mean squared error achieved by a graph convolutional neural network model on the cylinder and airfoil datasets from 0.970 and 0.561 to 0.616 and 0.305, respectively, achieving significant accuracy improvements of 36.5% and 45.6%. Under 50% labeled-data conditions, the method still exhibits outstanding robustness, with RMSEs of 0.792 and 0.390. Ablation studies reveal that the self-supervised learning and Graph Transformer modules exhibit strong complementarity, with optimal performance achieved when they are jointly employed. Furthermore, the self-attention mechanism significantly enhances the extraction of global features, demonstrating its effectiveness in capturing long-range dependencies. The approach demonstrates superior prediction accuracy and robustness, providing an efficient solution with broad application potential.
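
The combination of reconstruction and contrastive pretraining described above might look roughly like the following hypothetical PyTorch sketch. The simplified node-level attention model, augmentations, and equal loss weighting are assumptions for illustration, not the paper's Graph Transformer.

```python
# Hypothetical joint reconstruction + contrastive pretraining on flow-field
# nodes (illustrative only; not the paper's model or data).
import torch
import torch.nn as nn
import torch.nn.functional as F

class NodeTransformer(nn.Module):
    """Self-attention over mesh nodes; graph connectivity is ignored for brevity."""
    def __init__(self, in_dim=3, d_model=64):
        super().__init__()
        self.embed = nn.Linear(in_dim, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.decode = nn.Linear(d_model, in_dim)    # reconstruction head

    def forward(self, x):                           # x: (batch, nodes, in_dim)
        z = self.encoder(self.embed(x))              # node embeddings
        return z, self.decode(z)

def ssl_loss(model, x, mask_ratio=0.3, tau=0.2):
    # Reconstruction task: hide a fraction of node features and rebuild them.
    mask = torch.rand(x.shape[:2]) < mask_ratio      # (batch, nodes)
    x_masked = x.clone()
    x_masked[mask] = 0.0
    z, recon = model(x_masked)
    rec_loss = F.mse_loss(recon[mask], x[mask])

    # Contrastive task: embeddings of two noisy views of the same node agree.
    z2, _ = model(x + 0.01 * torch.randn_like(x))
    a = F.normalize(z.flatten(0, 1), dim=-1)         # (batch*nodes, d_model)
    b = F.normalize(z2.flatten(0, 1), dim=-1)
    logits = a @ b.t() / tau
    targets = torch.arange(a.size(0))
    con_loss = F.cross_entropy(logits, targets)
    return rec_loss + con_loss                       # equal weighting assumed

# Unlabeled snapshots: a batch of 4 "meshes", 128 nodes, 3 flow variables each.
x = torch.randn(4, 128, 3)
model = NodeTransformer()
loss = ssl_loss(model, x)
loss.backward()
```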

Language: English

Citations

0

High-speed fluid–structure interaction predictions using a deep learning transformer architecture DOI
Dimitris Drikakis, Daryl L. X. Fung, Ioannis W. Kokkinakis

et al.

Physics of Fluids, Journal Year: 2025, Volume and Issue: 37(5)

Published: May 1, 2025

This paper presents the development and application of a Transformer deep-learning model to fluid–structure interaction problems induced by shock–turbulent boundary layer interaction. The model was trained on data from experiments conducted in a hypersonic wind tunnel under flow conditions corresponding to a Mach number of 5.3 and a Reynolds number of ∼19.3×10⁶/m. The shock-wave turbulent boundary layer interaction occurred over an elastic panel. The model was trained using panel deformation measurements taken at different probe locations and the pressure in the cavity beneath the panel, and was subsequently applied to unseen data corresponding to various mean pressures and deformations. Its capability to capture the aeroelastic trends is promising, with the interpolation accuracy shown to depend on the volume of data used for training and the location at which the model is applied. The practical implications of this study are significant, offering new insights and potential solutions to real-world challenges.
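
A minimal, hypothetical sketch of the kind of mapping assumed here (a time window of multi-probe deformation signals to a cavity-pressure value) is shown below; the paper's actual inputs, outputs, and Transformer architecture may differ.

```python
# Hypothetical Transformer encoder for probe-to-pressure prediction
# (assumed inputs/outputs; not the paper's implementation).
import torch
import torch.nn as nn

class FSITransformer(nn.Module):
    """Encodes a time window of deformation probes and outputs one pressure value."""
    def __init__(self, n_probes=4, d_model=64, n_layers=3):
        super().__init__()
        self.embed = nn.Linear(n_probes, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, 1)

    def forward(self, probes):                   # probes: (batch, time, n_probes)
        h = self.encoder(self.embed(probes))
        return self.head(h[:, -1, :])            # predicted cavity pressure

# Train on measured conditions, then apply to an unseen mean-pressure case.
model = FSITransformer()
window = torch.randn(8, 256, 4)                  # placeholder deformation probes
print(model(window).shape)                       # torch.Size([8, 1])
```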

Language: English

Citations

0
