
Journal of Computational Physics, Journal Year: 2024, Volume and Issue: 522, P. 113560 - 113560
Published: Nov. 14, 2024
Language: English
Physics of Fluids, Journal Year: 2025, Volume and Issue: 37(3)
Published: March 1, 2025
This work presents OpenFOAMGPT, a large language model (LLM)-based agent tailored for OpenFOAM-centric computational fluid dynamics (CFD) simulations, leveraging two foundation models from OpenAI: GPT-4o (GPT meaning Generative Pre-trained Transformer) and the chain-of-thought-enabled o1 preview model. Both agents demonstrate success across multiple tasks. While the token price of o1 is six times that of GPT-4o, it consistently exhibits superior performance in handling complex tasks, from zero-shot/few-shot case setup to boundary condition modifications, zero-shot turbulence model adjustments, and code translation. Through an iterative correction loop, the agent efficiently addressed single-phase and multiphase flow, heat transfer, Reynolds-averaged Navier–Stokes modeling, large eddy simulation, and other engineering scenarios, often converging within a limited number of iterations at low cost. To embed domain-specific knowledge, we employed a retrieval-augmented generation pipeline, demonstrating how preexisting simulation setups can further specialize the agent for subdomains such as energy and aerospace. Despite the strong performance of the agent, human oversight remains crucial for ensuring accuracy and adapting to shifting contexts. Fluctuations in performance over time suggest the need for monitoring in mission-critical applications. Although our demonstrations focus on OpenFOAM, the adaptable nature of this framework opens the door to developing LLM-driven agents for a wide range of solvers and codes. By streamlining CFD workflows, this approach has the potential to accelerate both fundamental research and industrial advancements.
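As a rough illustration of the iterative correction loop described in this abstract, the sketch below lets an LLM propose OpenFOAM case files, runs the solver, and feeds any failure log back for another attempt. It is a minimal sketch under assumed interfaces: the `query_llm` callable, the prompt wording, and the returned file dictionary are hypothetical placeholders, not the authors' implementation.

```python
# Minimal sketch (not OpenFOAMGPT itself) of an LLM-driven iterative
# correction loop for OpenFOAM case setup. `query_llm` is a hypothetical
# callable that returns {relative_path: file_content} for the case.
import subprocess
from pathlib import Path


def run_solver(case_dir: Path, solver: str = "simpleFoam") -> tuple[bool, str]:
    """Run an OpenFOAM solver on `case_dir`; return (success, combined log)."""
    proc = subprocess.run([solver, "-case", str(case_dir)],
                          capture_output=True, text=True)
    return proc.returncode == 0, proc.stdout + proc.stderr


def correction_loop(query_llm, task: str, case_dir: Path, max_iters: int = 5) -> bool:
    """Ask the LLM for case files, run the solver, and iterate on failures."""
    prompt = f"Generate OpenFOAM case files for: {task}"
    for _ in range(max_iters):
        for rel_path, content in query_llm(prompt).items():
            target = case_dir / rel_path
            target.parent.mkdir(parents=True, exist_ok=True)
            target.write_text(content)
        ok, log = run_solver(case_dir)
        if ok:
            return True
        # Feed the solver error log back so the LLM can correct the setup.
        prompt = f"The solver failed with this log:\n{log}\nFix the case files."
    return False
```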
Language: English
Citations: 3
International Journal of Heat and Mass Transfer, Journal Year: 2024, Volume and Issue: 235, P. 126149 - 126149
Published: Sept. 7, 2024
Language: English
Citations: 15
Foundations of Data Science, Journal Year: 2024, Volume and Issue: 7(1), P. 298 - 337
Published: Oct. 9, 2024
Language: English
Citations: 10
Research Square (Research Square), Journal Year: 2024, Volume and Issue: unknown
Published: May 29, 2024
Language: English
Citations: 7
Physics of Fluids, Journal Year: 2025, Volume and Issue: 37(1)
Published: Jan. 1, 2025
We study large eddy simulation (LES) in the form of the vorticity transport equations (VTE), employing six subgrid-scale (SGS) models, including the dynamic Smagorinsky model, the mixed model, the velocity gradient model, the scale similarity model, the approximate deconvolution model, and the dynamic iterative approximate deconvolution (DIAD) model. In the a priori study, the correlation coefficient of the SGS stress given by the DIAD model is significantly higher than those of the other structural models, and its relative error is the lowest. In the a posteriori validation, the structural models outperform the functional models in predicting the energy spectra, the enstrophy, the probability density functions (PDFs) of the vorticity and the strain-rate tensor, the SGS flux, and the SGS production term. These results confirm the feasibility of VTE-based LES. They also indicate that the classic modeling approaches for the unclosed terms of the filtered Navier–Stokes equations (NSE) are applicable to their counterparts in the VTE.
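The a priori metrics quoted here (correlation coefficient and relative error of a modeled SGS quantity against the exact value from filtered DNS) follow standard definitions. The NumPy sketch below shows those two formulas on placeholder arrays; it is a generic illustration and not the paper's code.

```python
# Standard a priori test metrics for an SGS model: pointwise correlation
# coefficient and relative L2 error of the modeled field versus the exact
# field from filtered DNS. The arrays below are random stand-ins.
import numpy as np


def correlation_coefficient(exact: np.ndarray, model: np.ndarray) -> float:
    """Correlation between exact and modeled SGS fields (1 = perfect)."""
    a = exact - exact.mean()
    b = model - model.mean()
    return float((a * b).mean() / np.sqrt((a * a).mean() * (b * b).mean()))


def relative_error(exact: np.ndarray, model: np.ndarray) -> float:
    """L2-norm error of the model, normalized by the exact field."""
    return float(np.linalg.norm(model - exact) / np.linalg.norm(exact))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    tau = rng.standard_normal((64, 64, 64))                       # stand-in exact SGS stress
    tau_model = 0.8 * tau + 0.2 * rng.standard_normal(tau.shape)  # stand-in model output
    print(correlation_coefficient(tau, tau_model))
    print(relative_error(tau, tau_model))
```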
Language: English
Citations: 1
AIAA SCITECH 2022 Forum, Journal Year: 2025, Volume and Issue: unknown
Published: Jan. 3, 2025
Language: English
Citations: 0
AIAA SCITECH 2022 Forum, Journal Year: 2025, Volume and Issue: unknown
Published: Jan. 3, 2025
Language: English
Citations: 0
Journal of Computational Physics, Journal Year: 2025, Volume and Issue: unknown, P. 114080 - 114080
Published: May 1, 2025
Language: English
Citations: 0
Journal of Computational Physics, Journal Year: 2024, Volume and Issue: 522, P. 113577 - 113577
Published: Nov. 15, 2024
We propose a new neural-network-based large eddy simulation framework for the incompressible Navier–Stokes equations built on the paradigm "discretize first, filter and close next". This leads to full model-data consistency and allows employing the closure models in the same environment as the one in which they have been trained. Since the LES discretization error is included in the learning process, the closure models can learn to account for the discretization. Furthermore, we employ a divergence-consistent discrete filter defined through face-averaging and provide a novel theoretical and numerical analysis of it. This filter preserves the divergence-free constraint by construction, unlike general filters such as volume-averaging filters. We show that using the divergence-consistent formulation coupled with a convolutional closure model produces stable and accurate results in both a priori and a posteriori training, while the (divergence-inconsistent) alternative requires a posteriori training or other stability-enforcing measures.
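The divergence-consistency claim can be checked on a small 2D staggered (MAC) grid: averaging the fine faces that make up each coarse face turns the coarse-cell divergence into the mean of the underlying fine-cell divergences, so a discretely divergence-free field remains divergence-free after filtering. The sketch below is an assumed, simplified 2D construction (streamfunction-generated random field, coarsening factor 2) and not the paper's implementation.

```python
# 2D illustration of a divergence-consistent face-averaging filter on a
# staggered (MAC) grid: the coarse divergence equals the average of the
# fine-cell divergences, so divergence-free fields stay divergence-free.
import numpy as np


def face_average(u: np.ndarray, v: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Coarsen staggered face velocities by a factor of 2 via face averaging."""
    U = 0.5 * (u[::2, 0::2] + u[::2, 1::2])  # two fine x-faces per coarse x-face
    V = 0.5 * (v[0::2, ::2] + v[1::2, ::2])  # two fine y-faces per coarse y-face
    return U, V


def divergence(u: np.ndarray, v: np.ndarray, h: float) -> np.ndarray:
    """Discrete divergence in each cell of a staggered grid with spacing h."""
    return (u[1:, :] - u[:-1, :]) / h + (v[:, 1:] - v[:, :-1]) / h


if __name__ == "__main__":
    n, h = 64, 1.0 / 64
    psi = np.random.default_rng(0).standard_normal((n + 1, n + 1))  # streamfunction at nodes
    u = (psi[:, 1:] - psi[:, :-1]) / h    # x-face velocities, shape (n+1, n)
    v = -(psi[1:, :] - psi[:-1, :]) / h   # y-face velocities, shape (n, n+1)
    U, V = face_average(u, v)
    print(abs(divergence(u, v, h)).max())      # ~0: fine field is divergence-free
    print(abs(divergence(U, V, 2 * h)).max())  # ~0: preserved by the filter
```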
Language: English
Citations: 3
Results in Engineering, Journal Year: 2024, Volume and Issue: unknown, P. 102949 - 102949
Published: Sept. 1, 2024
Language: English
Citations: 2