Smart models for next-generation aircraft engine design
SAFRAN relies on high-fidelity numerical simulations of the physics of helicopter engine combustors for their design. These unsteady simulations require massive computational power, as they encompass high-Reynolds-number flows with multi-physics components such as flow turbulence, reaction chemistry, and thermal radiation and conduction. For the simulations, the full combustion chamber is represented by a computational mesh whose resolution is constrained by the smallest physical scales. Many industrial systems span several orders of magnitude between the smallest and largest scales, resulting in intractable mesh sizes, and the combustion chamber is no exception. To deal with this multi-scale issue, it is common practice in CFD to solve for the largest scales only and to model the smallest ones with sub-grid scale (SGS) models, which keeps the mesh resolution in check. The corresponding modelling approach is called Large Eddy Simulation (LES).
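The scale separation underlying LES can be sketched as follows. This is the standard Favre-filtered formulation found in compressible-LES textbooks, not a description of the specific SAFRAN solver: a spatial filter (overbar, with tilde denoting density weighting) is applied to the governing equations, leaving an unclosed sub-grid stress term that the SGS model must supply.

```latex
% Favre-filtered momentum equation (standard compressible LES form):
\frac{\partial \bar{\rho}\tilde{u}_i}{\partial t}
+ \frac{\partial \bar{\rho}\tilde{u}_i\tilde{u}_j}{\partial x_j}
= -\frac{\partial \bar{p}}{\partial x_i}
+ \frac{\partial \bar{\tau}_{ij}}{\partial x_j}
- \frac{\partial \tau_{ij}^{\mathrm{sgs}}}{\partial x_j},
\qquad
\tau_{ij}^{\mathrm{sgs}} = \bar{\rho}\left(\widetilde{u_i u_j} - \tilde{u}_i\tilde{u}_j\right)
```

Only the filtered (large-scale) fields are resolved on the mesh; the sub-grid stress $\tau_{ij}^{\mathrm{sgs}}$ represents the effect of the unresolved small scales and is the quantity the SGS model approximates.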
Figure 1: Very high fidelity LES of a real Safran helicopter engine combustor (1.1 billion points)
(courtesy of Safran Helicopter Engines)
Exascale computing enables the computation of multi-scale multi-physics problems on ever bigger meshes with billions of elements, opening new avenues for obtaining highly-resolved results. Design trends towards more sustainable engines follow a fast pace, with increasing operating pressures for higher efficiency and the exploration of alternative fuels, notably a growing interest in combusting H₂. These factors combine to significantly decrease the smallest physical scales, thus requiring exponentially larger meshes to solve the governing equations with sufficient accuracy. These challenges will not be met by relying solely on the brute force of exascale computing. Other strategies are needed, enabling smarter and more complex SGS models that mimic more complex physics, thereby increasing the workload of the modeled scales and reducing the burden on the resolved ones. Designing such complex models by hand is infeasible, which is why interest in data-driven approaches is continuously growing in the CFD community. Novel approaches to automatically address the model complexity range from simple statistical methods to modern deep-learning techniques.
This SAFRAN use-case focuses on the design of new high-pressure combustors burning hydrogen. An important scale determining the mesh size is the flame thickness, which will be much smaller for these new configurations than for traditional combustor setups. The flame can be artificially thickened, e.g., by using the popular thickened-flame approach. With this approach, however, the finest flame-turbulence interactions are no longer resolved and must be represented by an appropriate SGS model, known as a turbulent combustion model. SAFRAN relies on several such models, with successive increments aiming to describe ever more unresolved physics. Most of these historical models are hand-designed from physical principles, but recently several data-driven approaches have been shown to capture more scales, both in a priori and in a posteriori setups. To advance these results and to enable hybrid simulations with data-driven SGS models, it is essential to further investigate these methods, notably by addressing the following questions:
What constitutes a sufficient training database to cover the application cases, and how can out-of-distribution use of the trained models be avoided?
How do models trained a priori translate to a posteriori setups, and how can the training procedure be made robust to this transfer?
What approaches can be used to propagate errors observed a posteriori back into the training setup?
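As a concrete illustration of the thickened-flame approach discussed above, the textbook scaling argument (a generic sketch, not the exact SAFRAN implementation) runs as follows: laminar flame theory relates the flame speed $S_L$ and thickness $\delta_L$ to the diffusivity $D$ and reaction rate $\dot{\omega}$, so multiplying $D$ by a factor $F$ while dividing $\dot{\omega}$ by $F$ thickens the flame without changing its speed.

```latex
% Laminar flame theory:
S_L \propto \sqrt{D\,\dot{\omega}}, \qquad \delta_L \propto \frac{D}{S_L}
% Thickening by a factor F (D -> F D, reaction rate -> reaction rate / F):
S_L^{F} \propto \sqrt{(F D)\,\frac{\dot{\omega}}{F}} = S_L, \qquad \delta_L^{F} = F\,\delta_L
```

The thickened flame, however, no longer wrinkles at the finest turbulent scales; the lost sub-grid flame-turbulence interaction is typically reintroduced through an efficiency function multiplying $D$ and $\dot{\omega}$, and it is precisely this kind of closure that the data-driven turbulent combustion models investigated here aim to learn.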
To address these challenges, transverse methodologies from the fields of data topology, surrogate modeling, and data assimilation are employed in RAISE.