ExaFLOW Flagship runs underway

All of these cases were designed to expose difficulties commonly encountered in CFD, such as complex geometry and intricate physical interactions, but they share a common denominator: each represents a relevant case that can be scaled to large sizes (in number of grid points and running time) so as to be of industrial relevance.

In the present blog, we describe the progress that we are making on one of those flagship runs, namely the incompressible flow around an asymmetric wing profile, the NACA 4412 airfoil. While our group has performed similar simulations before, the major innovation from ExaFLOW comes with a novel treatment of the discretisation inside the computational domain: for the first time, we allow the mesh to evolve dynamically depending on the estimated computational error at any given point in space and time. During ExaFLOW, we coupled this so-called adaptive mesh refinement to the highly accurate spectral-element code Nek5000. Special focus has been placed on the design of the preconditioners needed to efficiently solve the arising linear systems, the definition of the error indicators (in this case the so-called spectral error indicators), and the overall scalability of our implementation.
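To give a flavour of the idea behind spectral error indicators, consider that within each element the solution is expanded in orthogonal polynomials; when the expansion coefficients decay rapidly, the element is well resolved, and when the highest-order coefficients remain large, it is not. The following is a minimal one-dimensional sketch of this principle (the function name and thresholds are ours for illustration, not the actual Nek5000 implementation):

```python
import numpy as np

def spectral_error_indicator(f_vals, x, degree):
    """Estimate how well a Legendre expansion of the given degree
    resolves the sampled function: ratio of the energy in the two
    highest modes to the total coefficient energy."""
    coeffs = np.polynomial.legendre.legfit(x, f_vals, degree)
    energy = np.linalg.norm(coeffs)
    tail = np.linalg.norm(coeffs[-2:])  # highest-order modes
    return tail / energy

x = np.linspace(-1.0, 1.0, 201)
# A smooth polynomial is fully resolved: the tail is essentially zero.
smooth = spectral_error_indicator(x**2, x, 8)
# A kink (|x|) forces slowly decaying coefficients: a large indicator.
rough = spectral_error_indicator(np.abs(x), x, 8)
```

In practice the indicator drives the refinement decision: elements whose indicator exceeds a tolerance are flagged for splitting.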

So far, we are running the NACA 4412 profile at a Reynolds number of 200,000, based on chord length and free-stream velocity. This corresponds roughly to the speed of a glider aircraft. Our simulations start from a very coarse initial mesh, which is conformal (i.e. all element vertices coincide between neighbouring elements). We then use Nek5000 to integrate the governing Navier-Stokes equations in time, producing progressively smaller turbulent scales. The error indicator identifies these regions as under-resolved and refines the mesh accordingly. In Figure 1, this refinement is shown schematically: the boundary layers close to the surface are properly identified and resolved; similarly, the complex region just behind the trailing edge (the so-called wake) also receives increased resolution.

Figure 1: Regions of low resolution

Previously, all these considerations had to be made manually by the computational scientist, who essentially had to predict beforehand where a given resolution would be necessary. Now, the computer does this automatically as the simulation progresses.

Figure 2 shows the final refinement structure of the mesh by highlighting those regions which require refinement levels higher than three (i.e. more than three consecutive refinements were needed to reach the specified error goal). Again, it is evident how the boundary layers and the wake region are properly identified as critical regions.

Figure 2: Mesh structure, highlighting the regions which require refinement levels higher than three

The actual turbulent flow is visualised in Figure 3. We show isocontours of negative lambda2, i.e. the vortices that make up the turbulent fluctuations. As one can see, the whole domain is characterised by small-scale turbulence, which must be properly resolved on the numerical mesh. Note that the large-scale, quasi-2D vortex downstream of the wing is a transient feature due to the initial condition.
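For reference, the lambda2 criterion identifies a vortex wherever the second (middle) eigenvalue of S² + Ω² is negative, where S and Ω are the symmetric and antisymmetric parts of the velocity-gradient tensor. A minimal pointwise sketch (our own helper for illustration, not tied to the actual post-processing pipeline):

```python
import numpy as np

def lambda2(grad_u):
    """Second eigenvalue of S^2 + Omega^2 for a 3x3 velocity-gradient
    tensor grad_u[i, j] = du_i/dx_j; negative values indicate a vortex."""
    S = 0.5 * (grad_u + grad_u.T)      # strain-rate tensor
    O = 0.5 * (grad_u - grad_u.T)      # rotation-rate tensor
    eig = np.linalg.eigvalsh(S @ S + O @ O)  # symmetric -> real eigenvalues
    return np.sort(eig)[1]             # middle eigenvalue

# Solid-body rotation u = (-y, x, 0): a vortex, so lambda2 is negative.
rotation = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 0.]])
# Pure shear u = (y, 0, 0): no swirling motion, so lambda2 is not negative.
shear = np.array([[0., 1., 0.], [0., 0., 0.], [0., 0., 0.]])
```

In a visualisation such as Figure 3, this quantity is evaluated at every grid point and an isocontour at a small negative threshold is drawn.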

Figure 3: Turbulent flow

Let us mention just a few technical aspects: the present simulation was run on 2048 cores of the Beskow supercomputer, located at PDC (KTH Stockholm). The refined mesh contains around 180,000 elements, connected non-conformally. With our preconditioners, only around 30 pressure iterations are needed, which is similar to a standard, conformal case. The overhead induced by the mesh adaptation turns out to be very small, not exceeding 5% of the total running time, so code performance and scalability are essentially uncompromised.

The new adaptive simulation capabilities included in the open-source code Nek5000 provide a very useful feature that will serve many future simulations: as mentioned above, the simulation determines the required mesh on its own, making the design of the initial mesh much simpler. The turnaround time for a computational engineer is therefore much shorter, while prescribed error measures of the solution are still fulfilled. We anticipate that solution-aware adaptive simulation techniques, in particular those based on an underlying high-order discretisation, will play a major role in future CFD. This will be particularly true in cases where local high accuracy is necessary to predict a specific flow behaviour, e.g. the region of flow separation from the surface, or the transition point from laminar to turbulent flow.