
Technical Meeting on the Compilation of Nuclear Data Experiments for Radiation Characterization

Europe/Vienna
M Building Press Room, Vienna International Centre (VIC), Wagramer Str. 5, 1400 Wien
Jean-Christophe SUBLET (IAEA)
Description

The purpose of the event is to transfer into technology the experimental integral radiation information used as part of the validation and verification (V&V) processes of nuclear model and simulation code systems, to provide various schemas for performing V&V, and to deploy the numerical data streams to users through an open application programming interface (API).
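
As an illustration of how such numerical data streams might be consumed programmatically, the short Python sketch below fetches and inspects one data set over HTTP. The endpoint URL, path and JSON layout are hypothetical placeholders, not the actual CoNDERC interface.

    import requests

    # Hypothetical CoNDERC-style endpoint; the real API paths and response
    # schema are defined by the project and may differ from this sketch.
    BASE_URL = "https://example.org/conderc/api"

    def fetch_benchmark(name: str) -> dict:
        """Download one benchmark data set as JSON (illustrative only)."""
        resp = requests.get(f"{BASE_URL}/benchmarks/{name}", timeout=30)
        resp.raise_for_status()
        return resp.json()

    data = fetch_benchmark("example-activation-foil")
    print(sorted(data.keys()))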


Participants
  • Albert (Skip) Kahler
  • Benoit Forget
  • Bor Kos
  • Coline Larmier
  • Cédric Jouanne
  • Daniel Siefman
  • David Heinrichs
  • Davide Laghi
  • Davide Mancusi
  • Gareth Morgan
  • Gediminas Stankunas
  • Ludmila MARIAN
  • Lydie GIOT
  • Marco Fabbri
  • Mark Gilbert
  • Muriel Fallot
  • Oliver Buss
  • Paul Romano
  • Yannick Peneliau
  • Shin OKUMURA
  • Ville Valtavirta
  • Xiaofei WU
  • Yanyan Ding
  • Zhigang Ge
    • 12:30
      Lunch Break
    • Opening Session: Arjan Koning / Jean-Christophe Sublet
    • 1
      CoNDERC white paper and status

      The purpose of the project entitled Compilation of Nuclear Data Experiments for Radiation Characterisation (CoNDERC) is to transfer into technology the experimental integral radiation information that can be used as part of the validation and verification (V&V) processes of nuclear model and code systems, and to provide various schemas to perform the necessary V&V protocols. The IAEA coordinates and organizes affiliate institutions' research and development to construct several databases and code infrastructures based on their own V&V activities, mainly associated with inventory, activation-transmutation, source term, reaction rate and radiation shielding R&D, but also reaching out to engineering systems.

      The project has made significant progress in the past four years towards its goals. It is expected to gain further momentum once its features and deliverables are fully functional.

      Speaker: Dr Jean-Christophe SUBLET (IAEA)
    • 2
      Data Management in CoNDERC

      A short presentation explaining the workflow of ingesting and disseminating data on the new CoNDERC website, including possible future developments using GitHub for collecting and releasing data sets.

      Speaker: Ludmila MARIAN
    • 3
      Integral benchmark activities for radiation transport under the auspices of the NEA Nuclear Science Committee

      The international SINBAD, ICSBEP, IRPhE, and SFCOMPO projects under the auspices of the Nuclear Science Committee (NSC) of the OECD Nuclear Energy Agency (NEA) provide integral benchmarks to improve nuclear data evaluations and serve as cornerstones in the verification and validation (V&V) process of radiation transport simulations. Experts from OECD NEA member countries, and beyond, continuously challenge state-of-the-art radiation transport simulation methods in international benchmarks organised by the NSC Working Parties on Nuclear Criticality Safety (WPNCS) and on Scientific Issues and Uncertainty Analysis of Reactor Systems (WPRS). WPRS is currently conducting 17 different benchmark phases on light water (LWR), molten salt (MSR), sodium-cooled fast (SFR), lead-cooled fast (LFR), and high temperature gas-cooled (HTGR) reactor systems, and co-sponsors the biannual SATIF workshops with a focus on accelerators and irradiation facilities. This presentation gives an update on the status of NSC activities with regard to integral benchmarks for radiation transport and shows how they benefit from new services provided by the OECD NEA Data Bank (DB). It serves as a basis to discuss future cooperation with the CoNDERC project.

      Speaker: Oliver Buss (OECD Nuclear Energy Agency (NEA))
    • 4
      Status of JADE, an open-source software for nuclear data libraries V&V

      In the last couple of years, a combined effort between NIER, Università di Bologna and Fusion For Energy has led to the development of JADE, a Python-based open-source software for the verification and validation of nuclear data libraries. Nuclear data are fundamental for particle and radiation transport simulations which, in turn, are responsible for the evaluation of key quantities in the design of fusion-related machines, such as nuclear heating, DPA, particle production and dose rates. The aim of the project is to bring standardization and automation to the V&V process of data libraries in order to speed up their release cycles and, at the same time, improve the quality of the data. JADE uses MCNP for the particle and radiation transport simulations and, even if it is potentially applicable to the whole nuclear industry, a particular focus on fusion applications is obtained through the selection of the default benchmarks that have been implemented. The code was recently made publicly available to the community and the status of its development is summarized. The most important features and benchmarks (both computational and experimental) are described, together with a brief discussion of the major case studies where JADE has recently been used. Lastly, the current strengths and limitations of the tool are evaluated and the foreseen future developments of the project are outlined.

      Speakers: Davide Laghi (NIER ingegneria / Università di Bologna), Dr Marco Fabbri (Fusion For Energy)
    • 5
      Comment on the preparation of a new benchmark based on ASP foil irradiation campaigns

      In this presentation I will show some early work to create a fusion-relevant nuclear data benchmark based on a series of experimental campaigns performed at the ASP DT accelerator in the UK. The experiments involved the irradiation of thin foil samples with a near mono-energetic 14 MeV neutron source. High-resolution gamma spectroscopy was then performed as a function of time to produce activity decay curves for each gamma signal (corresponding to different nuclear reactions) produced. To date, the analysis has only looked at a few of the several hundred experiments performed in those campaigns, which could become a useful, scriptable and rapid test suite for many key reactions that produce short-term gamma activation in materials. (A sketch of the kind of decay-curve fit involved follows this abstract.)

      Speaker: Mark Gilbert (CCFE)
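
      As a minimal illustration of fitting a single-reaction activity decay curve of the kind described above (synthetic data standing in for measurements; this is not the authors' actual analysis pipeline), one can fit A(t) = A0*exp(-lambda*t) to the measured activities:

        import numpy as np
        from scipy.optimize import curve_fit

        def activity(t, a0, lam):
            """Single-component radioactive decay, A(t) = A0 * exp(-lam * t)."""
            return a0 * np.exp(-lam * t)

        # Synthetic "measured" decay curve for illustration only.
        rng = np.random.default_rng(42)
        t = np.linspace(0.0, 3600.0, 40)                # s
        true_a0, true_lam = 1.0e4, np.log(2) / 600.0    # 10-minute half-life
        y = activity(t, true_a0, true_lam) * rng.normal(1.0, 0.02, t.size)

        popt, pcov = curve_fit(activity, t, y, p0=(y[0], 1e-3))
        a0_fit, lam_fit = popt
        print(f"half-life = {np.log(2) / lam_fit:.1f} s "
              f"(+/- {np.log(2) * np.sqrt(pcov[1, 1]) / lam_fit**2:.1f} s)")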
    • 6
      Nuclear Data Needs for Fusion Neutronics Applications

      Fusion neutronics is an important component of fusion power plant development. Neutronics informs key engineering and design decisions, including shielding specifications, safe operating procedures, and dose limits. Neutronics workflows are typically validated by simulating a prototypic experiment from which experimental data has been collected. This allows for the calculation of uncertainty estimates due to systematic and stochastic errors. Characterization of these uncertainties allows for evaluation of their relative importance to key modeling output and gives insight to engineers on design margins and trade-offs.
      Currently there are insufficient benchmarks in the Shielding Integral Benchmark Archive and Database (SINBAD) [1] database to comprehensively cover all aspects of neutronics in fusion devices. Moreover, some evaluations are lacking in detail. Several crucial areas of interest have been identified by both public and private enterprises who are in the process of designing fusion pilot plants.
      We propose a two-fold campaign of benchmark experiments. Firstly, a series of computational benchmark experiments which would focus on the validation of neutronics workflows used to determine quantities of interest such as: fluid activation, analysis of very large models, the skyshine effect, variance reduction techniques, the effect of homogenization, and shutdown dose rate calculations. As part of the computational analysis, we also propose a series of 1D models [2] based on the current ITER and fusion pilot plant designs to assess sensitivity to nuclear data in such configurations.
      The second set of benchmarks should be new experimental benchmark designs driven by data needs that are identified by the computational benchmarks. This should include benchmarks which address the lack of data in operational regimes outside of the currently operating machines.
      The presentation will include a more detailed overview of the needs of fusion neutronics analysts regarding benchmarking, uncertainty quantification, and modeling. The recent ORNL endeavor of creating a new shutdown dose rate computational benchmark experiment based on ITER geometry for the needs of a blind test will also be presented.

      References
      1. Kodeli et al., "Radiation Shielding and Dosimetry Experiments Updates in the SINBAD Database," Radiation Protection Dosimetry 116, no. 1-4 (2005): 558-561.
      2. Bohm et al., "Neutronics Calculations to Support the Fusion Evaluated Nuclear Data Library (FENDL)," Fusion Science and Technology 77, no. 7-8 (2021): 813-828.

      *Notice: This manuscript has been authored by UT-Battelle, LLC, under contract DE-AC05-00OR22725 with the US Department of Energy (DOE). The US government retains and the publisher, by accepting the article for publication, acknowledges that the US government retains a nonexclusive, paid-up, irrevocable, worldwide license to publish or reproduce the published form of this manuscript, or allow others to do so, for US government purposes. DOE will provide public access to these results of federally sponsored research in accordance with the DOE Public Access Plan (http://energy.gov/downloads/doe-public-access-plan).

      Speaker: Bor Kos (ORNL)
    • 12:30
      Lunch Break
    • 7
      Analysis of the FNS Duct experiment with the Monte Carlo code TRIPOLI-4®

      The FNS Duct experiment is a neutron streaming experiment performed at the JAERI Fusion Neutronics Source facility. It aims at studying the transport of neutrons in a complex labyrinth. A deuterium beam hits a titanium hydride target (enriched in tritium) and produces 14 MeV neutrons. A stainless steel block with a dogleg inside is located in front of the neutron source, and suitable detectors measure activation rates and neutron flux spectra in the dogleg or behind the steel block. These dogleg configurations are typical of ITER diagnostic configurations. The diagnostics monitor the plasma and the facing components; their signals travel through the shielding of the reactor to be processed far away, in a dedicated area where the electronic devices are protected against the neutron and gamma fluxes. The dogleg is a typical configuration that protects the back end of the facility while at the same time allowing the transmission of the signals.
      The purpose of the study is to analyze the FNS Duct experiment with the Monte Carlo code TRIPOLI-4®. The latter is a Monte Carlo code dedicated to the transport of neutrons, photons, electrons and positrons. It is widely used in the field of radiation protection simulation thanks to the variance reduction techniques that it implements, and is well suited to this kind of analysis.
      The analysis focuses first on the model of the experiment (geometrical model, source model) and tests different variance reduction techniques. The Adaptive Multilevel Splitting technique is applied to the simulation of the experiment. Then, TRIPOLI-4® results are compared with the experiment for both the FENDL-2.1 and FENDL-3.1d nuclear data libraries. The results are quite consistent with the experimental ones when a dedicated normalization is applied, which makes the experimental and calculated results agree for a detector at the entrance of the labyrinth (a minimal sketch of this normalization follows this abstract).

      Speaker: Yannick PENELIAU (CEA)
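
      As a minimal sketch of the entrance-detector normalization described above (simple arrays of calculated and measured responses are assumed here; this is not the TRIPOLI-4® implementation), the calculation is rescaled so that it reproduces the measurement at the reference detector:

        import numpy as np

        # Illustrative detector responses along the duct (arbitrary units).
        measured   = np.array([4.1e3, 6.2e2, 8.5e1, 1.3e1])  # experiment
        calculated = np.array([3.6e3, 5.7e2, 8.1e1, 1.1e1])  # Monte Carlo

        # Normalize to the detector at the labyrinth entrance (index 0).
        scale = measured[0] / calculated[0]
        calc_norm = calculated * scale

        # C/E ratios after normalization; index 0 is 1.0 by construction.
        print(calc_norm / measured)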
    • 8
      MCNP benchmarking activities of FENDL 3.2b nuclear data library employing fully-heterogeneous DEMO divertor model

      The paper presents results from neutron flux calculations in the DEMO divertor region for the case in which the Helium Cooled Pebble Bed (HCPB) concept is used as the breeding blanket (BB) option. Calculations were performed with the support of ADVANTG (AutomateD VAriaNce reducTion Generator) with FW-CADIS used to generate the variance reduction parameters. This coupled computational method led to the estimation of the neutron flux in cells and to the production of flux maps, which were obtained for the divertor of the EU DEMO reactor. The specified DEMO neutron source was used for benchmark studies with the FENDL-2.1 and FENDL-3.2b nuclear data libraries, using the MCNP6 code for the neutron transport calculations. A neutron energy spectrum with 709 energy groups from 1.05e-11 MeV to 10e3 MeV was used. This paper contains computational data from calculations using EU DEMO1 2017 as an 11.25° toroidal sector of the full tokamak model with a homogeneous HCPB breeding blanket structure. As for the divertor, the 2019 fully heterogeneous configuration model has been used for benchmarking the FENDL libraries.
      The neutron flux maps showed that the results from the two nuclear data libraries are similar in many cases; however, the highest ratio between results is 1.3775 for the statistical error and 1.3453 for the neutron flux inside the divertor. Neutron flux calculations in different cells of the divertor MCNP model showed a similar distribution of neutrons over energy for both investigated nuclear data libraries. Although the ratio can vary between 0.768 and 1.274, the average ratio between the two investigated libraries is 1.006.

      Speaker: Gediminas Stankunas (Lithuanian Energy Institute)
    • 9
      Serpent and Nuclear Data: Needs, processing and verification

      This talk gives an overview of the development and applications of Serpent at VTT Technical Research Centre of Finland Ltd, focusing on the nuclear and atomic data needs of Serpent as well as on the validation of Serpent at VTT.

      Serpent transports neutrons and photons in a decoupled or coupled manner starting from a criticality source, a fixed source or a radioactive decay source. Serpent is often used for burnup calculations.

      Neutron interaction modelling in Serpent is based on ACE format data, with some application-specific data either appended to the end of normal ACE files (such as additional nonlocal/local energy deposition data) or read directly from ENDF files (dec, nfy and sfy data as well as energy-dependent branching ratios). Photoatomic interactions are modelled based on data that is not as cohesively collected as the neutron interaction data: at the moment, ACE-format photoatomic cross sections are complemented with additional data from auxiliary files.

      The neutron interaction data processing at VTT is conducted with NJOY. Serpent contains an automated stochastic testing routine for ACE libraries, based on reaction, energy and angular sampling for neutrons in randomly generated materials. This routine is used at VTT to identify inconsistencies in newly generated libraries. Serpent also runs several checks on the nuclide data included in a calculation to identify inconsistencies, such as stable nuclides with decay channels, and issues a warning to the user.

      The validation of Serpent at VTT has been application- and target-specific. The largest amount of preparation has gone into the criticality safety validation package, which contains several hundred Serpent inputs for critical experiments from the LEU-COMP-THERM section of the ICSBEP handbook. Intended for criticality safety validation of Serpent for wet storage geometries, this effort is mainly aimed at determining the Upper Safety Limit for k-eff for a specific application. A smaller amount of effort has been put into using SINBAD experiments for the validation of photon transport in Serpent, and into using SFCOMPO data for the validation of Serpent's burnup capabilities.

      Speaker: Dr Ville Valtavirta (VTT)
    • 10
      Don't Forget What We Already Know

      A number of benchmark compilations have been developed in recent years that are utilized by nuclear data testers worldwide. These include the Handbooks of the International Criticality Safety Benchmark Evaluation Project (ICSBEP) and the International Reactor Physics Evaluation Project (IRPhEP), as well as the Shielding Integral Benchmark Archive Database (SINBAD) and the Spent Fuel Isotopic Composition Database (SFCOMPO). Preceding these was the Cross Section Evaluation Working Group's (CSEWG) Benchmark Book. First issued in the 1970s, with updates in the 1980s and 1990s, it contains separate chapters for fast and thermal critical systems, the Coupled Fast Reactor Measurement Facility (CFRMF) dosimetry benchmark, as well as a variety of shielding benchmarks. Beyond that, there have been thousands of critical experiments performed over the decades, yielding a wealth of data suitable for cross section data testing. Some of these have been highlighted in recent "Big Papers" validating the ENDF/B-VII.0, VII.1 and VIII.0 neutron and thermal scattering law nuclear data files. A number of web links are provided that lead the reader to these data. We also review a number of long-standing approximations that exist in current Monte Carlo benchmark models. These approximations date from when the typical stochastic uncertainty in a Monte Carlo kcalc calculation was several hundred pcm, as opposed to modern calculations that often produce single-digit stochastic uncertainties. With the recent release of a new Japanese Evaluated Nuclear Data Library, JENDL-5, as well as ongoing nuclear data testing to support future nuclear data file releases (e.g., JEFF-4 and ENDF/B-VIII.1), now is an opportune time to review the applicability of these approximations.

      Speaker: Dr Albert (Skip) Kahler (Kahler Nuclear Data Services, LLC)
    • 11
      Lessons Learned from the BEAVRS Benchmark

      With the escalating costs of nuclear experiments, there has been a growing reliance on Monte Carlo simulations for the design of advanced reactors and the "validation" of high-fidelity deterministic transport codes, but validation is also needed for Monte Carlo codes. The current integral experimental database is composed of many simple critical experiments and small research reactors, but these often lack the complexity and pitfalls of real nuclear systems. In this talk, I will present the BEAVRS benchmark that was developed as a realistic test of high-fidelity methods and discuss some of the limitations of both the methods and the benchmark. The benchmark has been used by many groups to test both deterministic and stochastic codes with great success, but the results also highlight some of the limitations of the benchmark, from the difficulties in modelling the geometric complexity of a relatively "simple" reactor design to the limitations of the measurement acquisition systems. Additionally, while the design is meant to be symmetric, a large tilt is observed in the core detector measurements which is unexplained by the core description. Without accounting for this tilt through post-processing of the results, the comparison with high-fidelity codes is quite poor, and the addition of corrections increases the uncertainties of the measurements.

      The benchmark and the results gathered from the literature also lead to another interesting conclusion: deterministic codes provide results just as good as the Monte Carlo ones. While this is not entirely surprising, since no one publishes bad results, it also highlights the fidelity of multigroup self-shielding methods, as well as the limitations of Monte Carlo codes in obtaining statistically significant results in many small regions and their convergence issues on such large systems.

      In the latter part of this presentation, a recent validation effort for the energy domain will be presented. An analytical benchmark was developed where the flux is resolved analytically in energy. This benchmark definition relies on the pole representation of nuclear data and provides an analytical expression for the scalar flux (a schematic form is sketched after this abstract). Additionally, the benchmark was extended to also compute the adjoint flux, thus allowing for the validation of uncertainty quantification methods.

      Speaker: Benoit Forget (MIT)
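
      Schematically, the pole (multipole) representation referred to above writes a resonant cross section in terms of complex poles $p_j$ and residues $r_j$ in the $\sqrt{E}$ variable; the generic form below is an assumption of this note (sign and normalization conventions vary) and not necessarily the exact one used in the benchmark:

        $$\sigma(E) \;\approx\; \frac{1}{E}\sum_j \operatorname{Re}\!\left[\frac{r_j}{p_j - \sqrt{E}}\right]$$

      Because each term is a simple pole in $\sqrt{E}$, integrals of the flux against such cross sections can be carried out in closed form, which is what makes a fully analytical energy-resolved benchmark tractable.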
    • 12
      Verification and Validation Activities with OpenMC

      This talk will give a broad overview of the verification and validation activities being performed using the OpenMC particle transport code. The OpenMC community has relied extensively on benchmark models from the ICSBEP handbook for both cross-code comparisons and comparison to experiment. To date, about 400 different benchmark models from ICSBEP have been created with OpenMC. Along with this, a set of Python tools has been developed for automating the execution and analysis of benchmark simulations. Separately, tools have been developed for cross-code comparison of simple broomstick and spherical shell models that have been invaluable for neutron and photon physics validation.

      Recently, a set of OpenMC models based on ICSBEP benchmarks has been created for inclusion in the CoNDERC repository, taking advantage of the unique capabilities in OpenMC. These benchmarks go beyond simple evaluation of k-eff and include reaction rate tallies, spatial flux profiles, and other physical measures. These additions to CoNDERC lay the groundwork for future additions of OpenMC models focused on other areas (e.g., SINBAD benchmarks for shielding/fusion applications).

      Two pathways for converting MCNP models to OpenMC models currently exist: the csg2csg converter developed by Andy Davis and a more recent project called openmc_mcnp_adapter. These capabilities provide another useful resource for performing cross-code comparisons. These efforts will be discussed, in particular how they fit into the overall V&V activities with OpenMC and promising areas for future work. (A schematic example of automating benchmark runs follows this abstract.)

      Speaker: Paul Romano (Argonne National Laboratory)
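
      As a minimal sketch of what automated execution and k-eff collection over a set of benchmark models could look like (the directory layout and model names are hypothetical, and this is not the community's actual tooling), using OpenMC's documented Python API:

        from pathlib import Path
        import openmc

        # Hypothetical layout: one OpenMC XML model set per benchmark directory.
        benchmark_root = Path("icsbep_models")

        results = {}
        for case in sorted(benchmark_root.iterdir()):
            if not case.is_dir():
                continue
            openmc.run(cwd=str(case))  # run the transport calculation in place
            # Pick the most recent statepoint file written by the run.
            statepoint = max(case.glob("statepoint.*.h5"),
                             key=lambda p: p.stat().st_mtime)
            with openmc.StatePoint(statepoint) as sp:
                # 'keff' on recent OpenMC versions ('k_combined' on older ones).
                results[case.name] = sp.keff

        for name, keff in results.items():
            print(f"{name}: k-eff = {keff}")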
    • 12:30
      Lunch Break
    • 13
      Benchmarking of nuclear data for TRIPOLI-5, the new Monte Carlo code at CEA

      The PATMOS mini-app is a prototype of a massively parallel Monte Carlo particle transport code, developed at CEA in order to conceive alternative algorithms for novel HPC architectures, in view of the TRIPOLI-5® production code. Recently, the sampling laws for modeling neutron physics as provided in nuclear data libraries have been implemented into PATMOS, first within the so-called "free-gas" model (without treatment of the unresolved resonance range) and then by adding a thermal neutron scattering treatment in order to include crystalline or molecular binding effects. As a first step towards the validation and verification of this implementation, code-to-code comparisons have been performed between PATMOS and two other reference Monte Carlo transport codes, TRIPOLI-4® and OpenMC, over around 560 isotopes taken from the JEFF-3.3 nuclear data library. First, the energy or angle distributions have been compared between the three codes for each isotope and reaction, at various incident energies, by resorting to Kolmogorov-Smirnov statistical tests, thanks to dedicated sampling routines. Then, the evaluation of the microscopic cross sections (as well as the multiplicity) by each code has been verified, in order to detect possible discrepancies. Finally, more than 5000 configurations have been tested for a simple benchmark consisting of a sphere filled with a single isotope, irradiated by a single-energy isotropic source located at the center of the sphere (ten representative incident energies have been considered). The results for the fiducial quantity (flux per unit lethargy) obtained with PATMOS and with the other reference Monte Carlo codes have then been compared using the Holm-Bonferroni statistical test (a sketch of such a statistical comparison follows this abstract). The comparison between PATMOS and TRIPOLI-4® was found to be more involved because of the post-processing of nuclear data; indeed, TRIPOLI-4® relies on ENDF files, while PATMOS relies on ACE files, which leads to discrepancies in the underlying nuclear data "seen" by the different codes. Our work has allowed us to i) validate the implementation of the free-gas model and of the thermal scattering laws in PATMOS, thanks to the perfect statistical agreement between PATMOS and OpenMC; ii) highlight some inconsistencies in nuclear data; and iii) detect some implementation errors in the sampling routines.

      Speaker: Coline Larmier (CEA)
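
      As a minimal sketch of the statistical machinery mentioned above (synthetic samples stand in for the tallies of two codes; this is not the PATMOS test harness itself), a two-sample Kolmogorov-Smirnov test per configuration followed by a Holm-Bonferroni correction for multiple comparisons could look like this:

        import numpy as np
        from scipy.stats import ks_2samp

        rng = np.random.default_rng(0)

        # Synthetic stand-ins for per-configuration samples from two codes.
        n_config = 20
        p_values = []
        for i in range(n_config):
            sample_a = rng.exponential(1.0, 5000)   # "code A" tally sample
            sample_b = rng.exponential(1.0, 5000)   # "code B" tally sample
            p_values.append(ks_2samp(sample_a, sample_b).pvalue)

        # Holm-Bonferroni: sort p-values, compare p_(k) against alpha / (m - k).
        alpha = 0.05
        order = np.argsort(p_values)
        reject = np.zeros(n_config, dtype=bool)
        for rank, idx in enumerate(order):
            if p_values[idx] < alpha / (n_config - rank):
                reject[idx] = True
            else:
                break  # step-down procedure: stop at first non-rejection

        print(f"configurations flagged as discrepant: {np.flatnonzero(reject)}")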
    • 14
      Testing and verification of nuclear data with the GALILEE-1 processing code and the TRIPOLI-4 Monte Carlo code

      This work follows the MCNP - TRIPOLI-4 comparisons for criticality benchmarks using U5, U8 and Pu9, for which comparisons were made between the ENDF/B-VIII, JEFF-3.3, TENDL-17 and TENDL-19 libraries. The study of shielding benchmarks shows differences between the libraries for different nuclei and indicates the importance of the scattering anisotropy. The new Fe56 evaluation of TENDL-21 provides a significant improvement over TENDL-19. Anisotropy plays a very important role for these configurations, and we observe that the JEFF40T1 library, with an Fe56 evaluation using LRF=7 in the resolved resonance range (RRR), highlights the importance of processing anisotropy during reconstruction.
      In addition, examples will be presented on the influence of probability tables (PTs) in the unresolved resonance range (URR), on the thermal scattering law (TSL) data, in particular for ZrH, and on the treatment of photonuclear reactions.

      Speaker: Dr Cédric Jouanne (CEA Saclay)
    • 19:30
      Social Gathering at Das Bootshaus - Alte Donau

      Untere Alte Donau 61, A-1220 Vienna

      Please choose your starter/dessert in advance; the main course is chosen at the restaurant.

    • 15
      What we've been up to beyond k-eff at LLNL

      The Nuclear Criticality Safety Division at Lawrence Livermore National Laboratory is developing an extended suite of validation benchmarks for its Monte Carlo code COG. The benchmarks serve to validate nuclear data and to support COG's software quality assurance framework. The current database has 3,395 criticality benchmarks. However, particular focus has been given to including benchmarks that are not criticality experiments. These experiments include $\beta_{eff}$, shielding, photoneutron, spectral indices, neutron spectra, subcritical assemblies, Godiva thermo-mechanical behavior after a pulse, time-of-flight spectra, pulsed neutron die-away in moderators, and pulsed sphere experiments. Many of these benchmarks are reproductions of historical experiments, but some are new experiments that have been conducted at Lawrence Livermore National Laboratory and submitted to international benchmark databases. Here, we present the benchmarks that we are investigating and results on the simulation bias obtained with different computational methods and nuclear data libraries.

      Speaker: Daniel Siefman (Lawrence Livermore National Laboratory)
    • 16
      Pulsed Neutron Die-Away Experiments at LLNL

      Pulsed-neutron die-away (PNDA) experiments can be useful benchmarks to validate neutron thermal scattering laws (TSLs). The experiment uses a neutron generator to impinge a short ($\sim$10$^{-4}$ s), mono-energetic neutron pulse on a target sample. After the pulse, the neutron population within the sample moderates and reaches thermal equilibrium with a fundamental spatial mode and a characteristic decay-time eigenvalue. The eigenvalue can be extracted from the experimental measurements of the neutron flux and used as an integral parameter in validation (a sketch of such an extraction follows this abstract). For certain materials and geometric configurations, the eigenvalue is heavily influenced by thermal neutron scattering in the target material alone. For that reason, a PNDA experiment can have a higher sensitivity to TSLs than is commonly available in critical experiments. Herein, we present results for a series of new PNDA experiments conducted at Lawrence Livermore National Laboratory with plastic materials (high-density polyethylene and Lucite) and with light water. We compare the experimental integral parameters to simulated results and report trends in the biases.

      Speaker: Daniel Siefman (Lawrence Livermore National Laboratory)
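
      As a minimal sketch of extracting the fundamental-mode decay constant from the asymptotic tail of a die-away curve (synthetic data; this is not the LLNL analysis), one can fit a straight line to the logarithm of the measured flux, since the flux decays as exp(-alpha*t) once higher spatial modes have died out:

        import numpy as np

        rng = np.random.default_rng(1)

        # Synthetic die-away curve: fast-decaying higher mode + fundamental mode.
        t = np.linspace(0.0, 5.0e-3, 200)                     # s
        flux = 5e5 * np.exp(-4000.0 * t) + 1e5 * np.exp(-800.0 * t)
        flux *= rng.normal(1.0, 0.01, t.size)                 # counting noise

        # Fit only the asymptotic tail, where the fundamental mode dominates.
        tail = t > 2.0e-3
        slope, intercept = np.polyfit(t[tail], np.log(flux[tail]), 1)
        alpha = -slope
        print(f"decay-time eigenvalue alpha ~ {alpha:.0f} 1/s")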
    • 17
      New fission-product decay data measurements to improve decay heat calculations

      The determination of decay heat is a major safety issue for a reactor in operation, but also for the transport of spent fuel and for nuclear waste management. The calculation of decay heat through the summation method (recalled schematically below) relies on the combination of reactor simulations, to estimate the fuel inventory, with nuclear data: decay properties of the fission products and actinides, fission yields and cross sections. Some fission products in the decay data libraries have decay schemes which are biased by the Pandemonium effect. The Pandemonium effect arises from the low efficiency of germanium detectors at high energy. This effect has direct consequences on decay heat calculations, with an overestimation of the β- contribution and an underestimation of the γ contribution. To overcome this effect, Total Absorption Gamma Spectroscopy (TAGS), based on the full detection of the de-excitation gamma cascade for each populated level, is used. The impact of these new decay data measurements performed with the TAGS technique on decay heat calculations will be discussed. This work is part of an IAEA coordinated paper, in preparation.

      Speaker: Lydie GIOT (CNRS/IN2P3/SUBATECH)
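
      For reference, a common schematic form of the summation method mentioned above (the notation here is assumed, not taken from the talk) evaluates the decay heat as

        $$f(t) \;=\; \sum_i \left(\bar{E}_{\beta,i} + \bar{E}_{\gamma,i}\right) \lambda_i \, N_i(t)$$

      where $N_i(t)$ is the inventory of nuclide $i$ at cooling time $t$, $\lambda_i$ its decay constant, and $\bar{E}_{\beta,i}$ and $\bar{E}_{\gamma,i}$ the mean beta and gamma energies released per decay; the Pandemonium effect biases precisely these mean energies.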
    • 12:30
      Lunch Break
    • 18
      Radioactive Decay Calculations from the Independent Fission Product Yield

      This talk will give a brief introduction to radioactive decay calculations. We developed a Python package for radioactive decay calculations from the independent yields, Y_I(Z,A,M), which keeps track of the inventory including isomeric states, M.

      It supports decay chains of radionuclides, metastable states and branching decays. By default it reads ENDF-6 format decay data, converts it into simple text or JSON format, and then creates a decay chain from a particular fission product. The code solves the Bateman equations analytically (a sketch of such an analytic solution follows this abstract). To go from Y_I(Z,A,M) to the cumulative yields, Y_C(Z,A,M), a time-independent calculation is performed. The outputs are Y_C(Z,A,M), the decay heat from the gamma and beta ray components, and the delayed neutron yield. The code includes a plot method for drawing decay chain diagrams.

      We also developed web tools for nuclear data visualization, mostly for cross sections but also for fission product yields.

      Speaker: Shin OKUMURA (IAEA)
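
      As a minimal illustration of solving the Bateman equations analytically for a linear decay chain with distinct decay constants (a generic textbook form in Python, not the package described in the talk):

        import numpy as np

        def bateman(t, lambdas, n1_0=1.0):
            """Analytic Bateman solution for a linear chain 1 -> 2 -> ... -> n.

            Returns N_i(t) for each chain member, assuming all decay constants
            are distinct and only the first member is present at t = 0.
            """
            lam = np.asarray(lambdas, dtype=float)
            n = len(lam)
            result = np.zeros((n, np.size(t)))
            for i in range(n):
                # N_i(t) = N1(0) * prod_{j<i} lambda_j
                #          * sum_k exp(-lambda_k t) / prod_{j!=k} (lambda_j - lambda_k)
                prefactor = n1_0 * np.prod(lam[:i])
                for k in range(i + 1):
                    denom = np.prod([lam[j] - lam[k]
                                     for j in range(i + 1) if j != k])
                    result[i] += np.exp(-lam[k] * np.asarray(t)) / denom
                result[i] *= prefactor
            return result

        # Example: three-member chain with half-lives of 10 s, 100 s and 1000 s.
        lams = np.log(2) / np.array([10.0, 100.0, 1000.0])
        t = np.array([0.0, 50.0, 500.0])
        print(bateman(t, lams))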
    • Discussion: Technical discussions
    • Discussion: Wrap up
      Convener: Dr Jean-Christophe SUBLET (IAEA)
    • 12:30
      Lunch Break
    • Closing Session