Successful final review ExaQUte Project

Feb 21, 2022

The ExaQUte project is a European H2020 research project coordinated by CIMNE that ran from June 2018 to November 2021. The institutions involved in the project are BSC, TUM, INRIA, VSB-TUO, EPFL, UPC and STR. The main objective of ExaQUte was to construct a framework for solving Uncertainty Quantification and Optimization Under Uncertainties problems, preparing the ground for the exploitation of the upcoming exascale computational systems. Check out this video for a brief introduction to the project: What is ExaQUte?

After three years of activity, the project was reviewed by the European Commission on 27 January 2022 and received a successful evaluation.

The project targeted the design of civil engineering structures as its final application. Achieving this goal required advances in several fields. The tools developed were combined on a range of test cases and benchmarks to ensure the software packages are validated and ready for industrial use.

Some of the results accomplished during the project are listed below.

Development of a scheduling tool to extract parallelism in the MLMC algorithm across samples and levels

The ExaQUte project uses a task-based approach to run the Monte Carlo algorithms, as these algorithms require the solution of independent realizations, from which statistics are computed. The scheduling of these tasks on the computing infrastructure and the management of the task dependencies are handled by the programming models (PyCOMPSs/HyperLoom/Quake), which enable the exploitation of supercomputers.


Figure 1. ExaQUte software framework
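The task-based pattern described above can be sketched in plain Python. This is a hypothetical stand-in, not ExaQUte code: a thread pool plays the role of the PyCOMPSs/HyperLoom/Quake scheduler, and `run_sample` replaces an actual solver call.

```python
from concurrent.futures import ThreadPoolExecutor
from statistics import mean, variance
import random

def run_sample(level: int, seed: int) -> float:
    """Hypothetical stand-in for one independent realization at a given
    refinement level (in ExaQUte this would be a solver run scheduled
    as a task by the programming model)."""
    rng = random.Random(seed)
    # Coarser levels -> noisier quantity of interest (illustrative only).
    return 1.0 + rng.gauss(0.0, 1.0 / (level + 1))

def monte_carlo(level: int, n_samples: int) -> tuple[float, float]:
    """Launch all realizations as independent tasks; the scheduler
    (here a thread pool) resolves them in parallel, and statistics are
    computed once all samples are available."""
    with ThreadPoolExecutor() as pool:
        results = list(pool.map(lambda s: run_sample(level, s), range(n_samples)))
    return mean(results), variance(results)

m, v = monte_carlo(level=0, n_samples=200)
```

Because the realizations are mutually independent, the only synchronization point is the final statistics step, which is what makes the algorithm attractive for large machines.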

Development of embedded solvers for multiphysics problems

The simulation of complex geometries was tackled with the so-called embedded methods, which allow the geometry under study to be represented implicitly. During the project, a complete workflow was constructed that enables the use of CAD geometries to solve embedded multiphysics problems. In this process, the exact geometry is used to create a level set representation, including the runtime computation of the tessellation and the adaptive refinement of the volume mesh.


Figure 2. Level set representation of the geometry of interest from CAD. The tessellation is performed at runtime with a user-defined accuracy.
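A minimal sketch of the implicit representation, assuming a toy circular geometry in place of a CAD model: the level set (a signed-distance function) is negative inside the body, positive outside, and zero on the surface.

```python
import numpy as np

def circle_level_set(nx: int, ny: int, cx: float, cy: float, r: float) -> np.ndarray:
    """Discrete level-set representation of a circle on a background
    grid, mimicking how an embedded method represents a surface
    implicitly instead of meshing it explicitly."""
    x = np.linspace(0.0, 1.0, nx)
    y = np.linspace(0.0, 1.0, ny)
    X, Y = np.meshgrid(x, y)
    # Signed distance: negative inside the body, zero on its boundary.
    return np.hypot(X - cx, Y - cy) - r

phi = circle_level_set(64, 64, cx=0.5, cy=0.5, r=0.25)
inside = phi < 0.0  # grid nodes covered by the embedded body
```

The solver never needs a body-fitted mesh; it only queries the sign and value of `phi` at each node, which is what makes complex CAD inputs tractable.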

Development of parallel adaptive refinement methods for embedded domains

Adaptive refinement has been widely used across the project to improve the computational meshes. Metric-based refinement was employed, which enables a user-defined strategy to remesh the domain. For the scope of the project, the metric was constructed using both the geometrical information and the solution of previous simulations.

A major contribution of the project has been the release of the MPI-parallel version of the MMG remesher, ParMMG (MmgTools/ParMmg (github.com)). This permits the refinement of large-scale cases in distributed environments, where the remeshing operation can be parallelized.


Figure 3. Computational mesh adapted using the velocity field of a previous solution. Nodes are concentrated in the regions of the domain where the velocity changes most.
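The metric-based idea can be illustrated with a toy one-dimensional field (an assumed sketch, not ParMMG's actual API): regions with a larger solution gradient are assigned a smaller target element size, which the remesher then honors.

```python
import numpy as np

def target_sizes(u: np.ndarray, h_min: float = 0.01, h_max: float = 0.2) -> np.ndarray:
    """Build a nodal size metric from the gradient of a previous
    solution: large gradient -> small target element, clipped to
    user-defined bounds (the 'user-defined strategy')."""
    grad = np.abs(np.gradient(u))
    g = grad / grad.max() if grad.max() > 0 else grad
    # Blend linearly between the coarse and fine size limits.
    return np.clip(h_max * (1.0 - g) + h_min * g, h_min, h_max)

# A sharp layer at x = 0.5 should attract the smallest elements.
u = np.tanh(10.0 * (np.linspace(0.0, 1.0, 101) - 0.5))
h = target_sizes(u)
```

In the project the metric also folds in geometrical information from the level set; the sketch above keeps only the solution-driven part.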

Development of space-time methods for the numerical simulation of multiphysics problems

Performing accurate simulations of the wind flow around high-rise buildings carries a high computational cost, since long time windows must be considered to model wind gusts. To accelerate these simulations, an ensemble approach was adopted, in which one long simulation is replaced with a combination of shorter ones, exploiting the parallelism of supercomputers. Perturbed initial conditions are applied to the shorter simulations to produce different realizations, which are combined to compute statistics in place of a single large simulation. Special care was taken in the definition of the initial conditions to minimize the time that must be discarded (the burn-in time) at the start of each simulation.


Figure 4. Simulation of a high-rise building under the action of wind accelerated with ensemble average methods. The wind generation is performed on-the-fly.
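The ensemble idea can be sketched with a toy scalar "simulation" (hypothetical dynamics, not the actual wind solver): each short run starts from a perturbed initial condition, discards its burn-in transient, and the time-averaged results of all runs are combined.

```python
import random

def short_run(seed: int, n_steps: int, burn_in: int) -> float:
    """One short simulation with a perturbed initial condition. The
    first `burn_in` steps are discarded so that the spin-up transient
    does not pollute the statistics."""
    rng = random.Random(seed)
    state = rng.gauss(0.0, 1.0)  # perturbed initial condition
    samples = []
    for step in range(n_steps):
        state = 0.9 * state + rng.gauss(0.0, 0.1)  # toy dynamics
        if step >= burn_in:
            samples.append(state)
    return sum(samples) / len(samples)

# An ensemble of independent short runs replaces one long simulation;
# each run is an independent task, so all can execute in parallel.
ensemble = [short_run(seed=s, n_steps=400, burn_in=100) for s in range(8)]
estimate = sum(ensemble) / len(ensemble)
```

The trade-off is the wasted burn-in time per member, which is why the choice of initial conditions mentioned above matters.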

Extension of MLMC to use an adaptively refined space-time mesh hierarchy

The application of Multilevel Monte Carlo (MLMC) methods that exploit the multiphysics solvers and the adaptive refinement techniques was one of the main objectives of the project. These methods perform uncertainty quantification on several levels of refinement, with a different number of simulations run at each level. The key idea is that most of the uncertainty is captured at the low-resolution levels, for which many simulations are run, and corrected with a few runs at the higher-resolution levels. Adaptive algorithms for MC and MLMC were developed for different risk measures; these algorithms predict the number of levels and the number of simulations per level required to reach a given accuracy.


Figure 5. Hierarchy of three levels of refinement. The meshes were adaptively refined using the velocity field of the previous solutions.
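The MLMC telescoping sum E[Q_L] = E[Q_0] + Σ_l E[Q_l − Q_{l−1}] can be sketched as follows (a toy model, not the project's solvers): the same random input drives both levels of each correction term, so the corrections have small variance and need only a few samples.

```python
import random

def qoi(level: int, omega: float) -> float:
    """Hypothetical solver call: quantity of interest on refinement
    level `level` for the shared random input `omega`. The 1/2**level
    factor mimics a discretization error that shrinks on finer meshes."""
    return 2.0 + (1.0 + omega) / 2 ** level

def mlmc_estimate(samples_per_level: list[int], seed: int = 0) -> float:
    """Telescoping MLMC sum: many cheap coarse samples capture most of
    the uncertainty, a few fine-level corrections remove the bias."""
    rng = random.Random(seed)
    total = 0.0
    for level, n in enumerate(samples_per_level):
        diffs = []
        for _ in range(n):
            omega = rng.gauss(0.0, 1.0)  # same input on both levels
            fine = qoi(level, omega)
            coarse = qoi(level - 1, omega) if level > 0 else 0.0
            diffs.append(fine - coarse)
        total += sum(diffs) / n
    return total

est = mlmc_estimate([1000, 100, 10])  # decreasing samples per level
```

The adaptive algorithms developed in the project choose the list of sample counts automatically from on-the-fly variance estimates; here it is fixed by hand for illustration.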

Combination of MLMC methods with gradient-based optimization techniques based on adjoint problems

Another main objective of ExaQUte was the solution of Optimization Under Uncertainties problems for the target application. To this end, a workflow was developed that starts from a CAD geometry and optimizes it taking uncertainties into account. Once the optimal values of the geometry parameters are found, the final shape is recovered as a CAD geometry.


Figure 6. Initial and final designs of a high-rise building with the twist angle and taper ratio as geometry parameters. The analysis was performed considering random variables on the wind generation. The shape is optimized by selecting a risk measure (mean, CVaR) which is estimated using Monte Carlo methods.
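A minimal sketch of the optimization loop, under strong simplifying assumptions: `wind_load` is a made-up response surface, the risk measure is the Monte Carlo mean, and the adjoint gradient is replaced by a finite difference computed with common random numbers.

```python
import random

def wind_load(theta: float, omega: float) -> float:
    """Hypothetical response: load on the building as a function of a
    design parameter `theta` (e.g. twist angle) and a random wind input
    `omega`. The mean is minimized at theta = 1 by construction."""
    return (theta - 1.0) ** 2 + 0.5 * omega ** 2

def mc_risk(theta: float, n: int = 500, seed: int = 0) -> float:
    """Monte Carlo estimate of the mean load; a CVaR estimate would
    instead average the worst tail of the samples."""
    rng = random.Random(seed)
    return sum(wind_load(theta, rng.gauss(0.0, 1.0)) for _ in range(n)) / n

def optimize(theta: float = 0.0, lr: float = 0.2, iters: int = 50,
             eps: float = 1e-3) -> float:
    """Gradient descent on the risk estimate. The fixed seed in mc_risk
    gives common random numbers, so the finite-difference gradient is
    noise-free (the adjoint solve plays this role in the project)."""
    for _ in range(iters):
        grad = (mc_risk(theta + eps) - mc_risk(theta - eps)) / (2.0 * eps)
        theta -= lr * grad
    return theta

theta_opt = optimize()
```

Swapping the mean for CVaR only changes how the samples are aggregated in `mc_risk`; the outer optimization loop is unchanged.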
