Description
Predictive and reliable simulations of fusion plasmas provide an important pathway toward accelerating fusion research. A popular approach for efficiently computing the dynamics of turbulent systems across a wide range of applications is the Large Eddy Simulation (LES) technique. Here, the system is simulated with only the largest scales resolved explicitly, while the unresolved scales are accounted for by a Sub-Grid-Scale (SGS) model. In this presentation, we will give this old idea a new twist. Specifically, we will develop an SGS model based on a Neural Network (NN) with Learned Corrections (LC) applied to the resolved scales, creating a hybrid numerical and ML approach. As will be demonstrated, by using a non-propagated field, this approach can be remarkably effective, allowing us to cut off virtually the entire inertial range and retain only the drive range. This is fundamentally different from previous studies, which focused on the much simpler problem of removing diffusion-dominated scales in the dissipation range. Removing (large) parts of the inertial range while retaining the integrity of the cascade dynamics has been the major challenge facing LES approaches; simply extending methods that work within the dissipation range to the inertial range is typically not viable. Here, we introduce a model that overcomes these difficulties, and does so very efficiently. In fact, it is able to produce physically indistinguishable results even when removing (large) parts of the inertial range, while allowing for a relative speedup of about three orders of magnitude.
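The core idea of the hybrid scheme can be illustrated with a minimal sketch: advance the resolved field with a coarse-grid numerical step, then add a correction produced by a neural network. This is an illustrative toy, not the authors' implementation; it uses a 1D viscous Burgers equation as the coarse solver and a tiny fixed-weight MLP as a stand-in for a trained correction network (all names, sizes, and the equation choice are assumptions).

```python
import numpy as np

def coarse_step(u, dx, dt, nu=0.01):
    """One explicit finite-difference step of viscous Burgers on a coarse grid."""
    dudx = (np.roll(u, -1) - np.roll(u, 1)) / (2 * dx)          # central advection
    d2udx2 = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2   # diffusion
    return u + dt * (-u * dudx + nu * d2udx2)

class TinyMLP:
    """Placeholder correction network (untrained, fixed random weights).
    In the actual approach this would be trained so the corrected coarse
    simulation matches a fully resolved reference."""
    def __init__(self, n, hidden=16, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = 0.1 * rng.standard_normal((hidden, n))
        self.W2 = 0.1 * rng.standard_normal((n, hidden))

    def __call__(self, u):
        return self.W2 @ np.tanh(self.W1 @ u)

def hybrid_step(u, dx, dt, model):
    """Numerical step followed by the learned correction on the resolved field."""
    u_star = coarse_step(u, dx, dt)
    return u_star + dt * model(u_star)

# Run a few steps on a coarse 64-point periodic grid.
n = 64
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
u = np.sin(x)
model = TinyMLP(n)
dx = x[1] - x[0]
for _ in range(10):
    u = hybrid_step(u, dx, dt=1e-3, model=model)
```

The design choice illustrated here is that the correction acts on the resolved state after each numerical step, so the network only has to learn the mismatch introduced by the missing scales rather than the full dynamics.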
| Speaker's Affiliation | Max Planck Institute for Plasma Physics |
|---|---|
| Member State or IGO/NGO | Germany |