
28 November 2023 to 1 December 2023
IAEA Headquarters

Data-driven model for divertor plasma

28 Nov 2023, 11:00
35m
Conference Room 1 (CR1), C Building, 2nd floor (IAEA Headquarters)

Invited Physics/Engineering

Speaker

Ben Zhu (Lawrence Livermore National Laboratory)

Description

Magnetically confining a high-temperature plasma in a toroidal device (e.g., tokamak, stellarator) is arguably the most promising approach for achieving controlled thermonuclear fusion energy. One critical concern with this approach is finding and maintaining proper heat and particle exhaust in the divertor region – the region where the magnetic topology changes from “closed” to “open” and the high-temperature plasma may come into direct contact with the vessel wall. Modeling divertor plasma is not a trivial task due to its multi-physics, multi-scale nature. For instance, the widely used axisymmetric 2D edge transport codes in the community (e.g., SOLPS, UEDGE) often take days or even months to attain converged steady-state divertor solutions once sophisticated plasma and neutral dynamics are included. This time-consuming process not only hinders divertor plasma physics research but also limits high-fidelity divertor model applications, e.g., in new device design, discharge scenario development, and real-time plasma control.

Machine learning techniques offer an alternative solution to this challenge. A fast yet fairly accurate data-driven surrogate model for divertor plasma prediction is possible by leveraging the latent-space concept. The idea is to construct and train two neural networks – an autoencoder that finds a proper latent space representation (LSR) of the plasma state by compressing the desired multi-modal diagnostic measurements, and a forward model, a multi-layer perceptron (MLP), that maps a set of divertor plasma control parameters to the corresponding LSR. By combining the forward model with the decoder network of the autoencoder, this data-driven surrogate model predicts a consistent set of diagnostic measurements from a few key parameters controlling the divertor plasma (see the sketch after this description). This idea was first tested by predicting downstream plasma properties from limited upstream information in a one-dimensional flux-tube configuration. The resulting surrogate model is at least four orders of magnitude faster than the conventional numerical model and provides fairly accurate divertor plasma predictions, usually within a few percent relative error. It achieves a 99% success rate in predicting divertor plasma detachment – a bifurcation phenomenon featuring a sudden decrease of the heat load on the divertor plate. This pilot study demonstrates that the complicated divertor plasma state has a low-dimensional representation in latent space that can be utilized for surrogate modeling. Following the same methodology, we recently extended this work to modeling a realistic 2D axisymmetric configuration. Application-specific surrogate models were constructed, trained, and tested. These models can fulfill many different tasks (e.g., initial-solution prediction for code acceleration, integrated tokamak divertor design, and divertor plasma detachment control), suggesting that machine learning is a powerful tool for divertor plasma physics and fusion energy research.
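The two-network architecture described above can be illustrated with a minimal sketch. The code below is an assumed PyTorch implementation; the layer widths, dimensions, and names (N_DIAG, N_LATENT, N_CTRL) are illustrative placeholders, not the authors' actual model.

```python
import torch
import torch.nn as nn

# Illustrative dimensions (assumptions, not the authors' values):
N_DIAG = 256    # flattened multi-modal diagnostic measurements
N_LATENT = 8    # low-dimensional latent space representation (LSR)
N_CTRL = 4      # key divertor plasma control parameters

class Autoencoder(nn.Module):
    """Compresses diagnostic measurements into an LSR and reconstructs them."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(N_DIAG, 64), nn.ReLU(), nn.Linear(64, N_LATENT))
        self.decoder = nn.Sequential(
            nn.Linear(N_LATENT, 64), nn.ReLU(), nn.Linear(64, N_DIAG))

    def forward(self, x):
        return self.decoder(self.encoder(x))

class ForwardModel(nn.Module):
    """MLP mapping control parameters to the corresponding LSR."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(N_CTRL, 32), nn.ReLU(), nn.Linear(32, N_LATENT))

    def forward(self, c):
        return self.net(c)

# Surrogate prediction: control parameters -> LSR -> diagnostic measurements.
ae, fwd = Autoencoder(), ForwardModel()
controls = torch.randn(1, N_CTRL)           # dummy control-parameter vector
predicted_diag = ae.decoder(fwd(controls))  # stands in for the slow transport code
```

Note that at inference time only the forward model and the decoder are evaluated, which is why such a surrogate can run orders of magnitude faster than a converged 2D edge transport calculation.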

Speaker's Affiliation: Lawrence Livermore National Laboratory
Member State or IGO/NGO: USA

Primary author

Ben Zhu (Lawrence Livermore National Laboratory)

Co-authors

Mr Harsh Bhatia (LLNL), Mr Menglong Zhao (LLNL), Xueqiao Xu (Lawrence Livermore National Laboratory)
