1. Benchmarking Experiments for Criticality Safety and Reactor Physics Applications – II – Tutorial
John D. Bess and J. Blair Briggs – INL
Ian Hill (IDAT) – OECD/NEA
www.inl.gov
2012 ANS Annual Meeting
Chicago, Illinois
June 24-28, 2012
This paper was prepared at Idaho National Laboratory for the U.S. Department of Energy under Contract Number DE-AC07-05ID14517.
2. Purpose of this Tutorial
• Discuss the benchmark process for ICSBEP and IRPhEP
• Provide a brief demonstration of DICE and IDAT
3. Acknowledgements
• ICSBEP and IRPhEP are collaborative efforts that involve numerous scientists, engineers, administrative support personnel, and program sponsors from 24 different countries and the OECD/NEA.
• The authors would like to acknowledge the efforts of all of those dedicated individuals, without whom those two projects would not be possible.
4. Outline
I. Introduction to Benchmarking
a. Overview
b. ICSBEP/IRPhEP
II. Benchmark Experiment Availability
a. DICE Demonstration
b. IDAT Demonstration
III. Dissection of a Benchmark Report
a. Experimental Data
b. Experiment Evaluation
c. Benchmark Model
d. Sample Calculations
e. Benchmark Measurements
IV. Benchmark Participation
(Photo: NRAD reactor)
10. Why Do We Have Nuclear Benchmarks?
• Nuclear Safety
– Plant Operations
– Transportation
– Waste Disposal
– Experimentation
– Accident Analysis
– Homeland Security
• Training
• Materials
– Testing
– Physics Validation
– Interrogation
• Research and Development
– New Reactor Designs
– Design Validation
• Computational Methods
– Cross-Section Data
– Code Verification
– Standards Development
– Model Validation
• Fundamental Physics
• Fun
11. Cross Section Evaluation Working Group (CSEWG)
The majority of data testing utilizes critical benchmark models defined in the ICSBEP Handbook. CENDL-3.1 was also extensively tested with ICSBEP Handbook data.
12. Monte Carlo Code Validation
Select ICSBEP benchmarks are used to validate code performance.
Also useful for validating other codes: SCALE, SERPENT, TRIPOLI, etc.
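A benchmark comparison of this kind typically asks whether the calculated k-effective agrees with the benchmark-model value within the combined uncertainty of the calculation and the benchmark. The following is a minimal sketch; the function name, the sample values, and the 3σ acceptance band are illustrative choices, not taken from any handbook evaluation:

```python
import math

def keff_agrees(k_calc, sigma_calc, k_bench, sigma_bench, n_sigma=3.0):
    """Return True if the calculated k-eff agrees with the benchmark
    k-eff within n_sigma combined (quadrature) standard deviations."""
    sigma_comb = math.sqrt(sigma_calc**2 + sigma_bench**2)
    return abs(k_calc - k_bench) <= n_sigma * sigma_comb

# Hypothetical case: calculated 0.9987 +/- 0.0003 (statistical) vs
# benchmark-model k-eff 1.0000 +/- 0.0010 (evaluated uncertainty)
print(keff_agrees(0.9987, 0.0003, 1.0000, 0.0010))  # True
```

A result that falls outside the band flags either a data/code deficiency or an unaccounted-for bias in the benchmark model.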
14. Physics Criteria for Benchmarks
Physics Criteria for Benchmarks Working Group was initiated in the 1980s as part of the U.S. DOE-sponsored Nuclear Criticality Technology and Safety Project (NCT&SP):
(1) Method used to determine k-effective.
(2) Consistency among experimentally measured parameters (e.g., fundamental-mode multiplication should be determined by more than one method in order to ensure consistency).
(3) A rigorous and detailed description of the experimental mockup, its mechanical supports, and its surroundings is necessary. (Accompanying photographs and drawings are essential.)
(4) A complete specification of the geometry dimensions and material compositions, including the methods of determination and the known sources of error and their potential propagation, is necessary. Also, for completeness, unknown but suspected sources of error should be listed.
(5) A series of experiments is desirable in order to demonstrate the reproducibility of the results.
15. Physics Criteria for Benchmarks (continued)
Physics Criteria for Benchmarks Working Group (continued)
(6) A description of the experiment and results should appear in a refereed publication.
(7) These criteria were established primarily to provide guidelines for future experiments.
(8) Many of the earlier experiments do not satisfy all of these criteria.
(9) Failure to meet these criteria does not automatically disqualify an experiment from being considered acceptable for use as a benchmark.
(10) An attempt is being made, through the evaluation process, to supplement the originally published data to meet these criteria.
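Criteria (2) and (5) call for consistency among measured parameters and reproducibility across a series of experiments. One common statistical check for such repeated measurements (shown here as an illustrative sketch, not a method prescribed by the working group) is the weighted mean together with the reduced chi-square:

```python
import math

def weighted_mean_consistency(values, sigmas):
    """Weighted mean of repeated measurements and the reduced chi-square
    (chi-square per degree of freedom). A reduced chi-square near 1
    indicates the quoted uncertainties are consistent with the observed
    scatter; a much larger value suggests an unaccounted-for error."""
    weights = [1.0 / s**2 for s in sigmas]
    mean = sum(w * v for w, v in zip(weights, values)) / sum(weights)
    chi2 = sum(w * (v - mean)**2 for w, v in zip(weights, values))
    return mean, chi2 / (len(values) - 1)

# Hypothetical repeated k-eff measurements with quoted 1-sigma uncertainties
mean, red_chi2 = weighted_mean_consistency(
    [1.0002, 0.9998, 1.0000], [0.0002, 0.0002, 0.0002])
```

Here the scatter matches the quoted uncertainties, so the reduced chi-square comes out near 1.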
16. ANSI/ANS-19.5-1995 (Currently being Revised)
• Requirements for Reference Reactor Physics Measurements
– Criteria for Evaluation of Data:
• To qualify as reference data:
– Measurements performed with accepted and proven techniques, or techniques adequately demonstrated to be valid
• Reported uncertainties should be consistent with the evidence
• Compare with similar independent configurations or with a wide range of data sets using accepted calculational methods
• Check for consistency
18. Purpose of the ICSBEP
• Compile benchmark-experiment data into a standardized format that allows analysts to easily use the data to validate calculational techniques and cross-section data
• Evaluate the data and quantify overall uncertainties through various types of sensitivity analyses
• Eliminate a large portion of the tedious and redundant research and processing of experiment data
• Streamline the necessary step of validating computer codes and nuclear data with experimental data
• Preserve valuable experimental data that will be of use for decades
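The overall benchmark uncertainty quantified through such sensitivity analyses is commonly obtained by combining independent component effects on k-effective in quadrature. A minimal sketch, with hypothetical Δk components standing in for the effects derived from individual perturbation studies:

```python
import math

def combined_benchmark_uncertainty(delta_k_components):
    """Combine independent uncertainty components, each already expressed
    as an effect on k-eff (Delta-k) from a sensitivity or perturbation
    study, in quadrature to obtain the overall benchmark uncertainty."""
    return math.sqrt(sum(dk * dk for dk in delta_k_components))

# Hypothetical Delta-k components (e.g., enrichment, density, geometry)
total = combined_benchmark_uncertainty([0.0012, 0.0009, 0.0020])
```

Quadrature combination assumes the components are independent; correlated components have to be combined with their covariances instead.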
19. History of ICSBEP
• Initiated in October of 1992 by the Department of
Energy’s Defense Programs
• Organized and managed through the Idaho
National Laboratory (INL)
• Involves nationally known criticality safety experts
21. History of ICSBEP (continued)
• An International Criticality Safety Data Exchange
component was added to the project in 1994
• The ICSBEP became an official activity of the
OECD/NEA’s Nuclear Science Committee (NSC)
in 1995
22. ICSBEP International Partners – 20 Countries
(Map of partner countries with year of joining:)
United States – 1992
France, Hungary, Japan, Russian Federation, United Kingdom – 1994
Republic of Korea – 1996
Slovenia – 1998
Kazakhstan, Serbia – 1999
Israel, Spain – 2000
Brazil, Czech Republic – 2004
Canada, India, Poland – 2005
China – 2006
Sweden – 2007
Argentina – 2008
23. Purpose of the IRPhEP
• Similar to the ICSBEP
• Focuses on collecting data from the numerous experiments performed at research laboratories in support of nuclear energy and technology
• Experiments represent significant investments of time, infrastructure, expertise, and cost that might not have received adequate documentation
• Measurements also include data regarding reactivity measurements, reaction rates, buckling, burnup, etc., that are of significant worth for current and future research and development efforts
24. History of IRPhEP
• Initiated as a pilot activity in 1999 by the OECD/NEA Nuclear Science Committee – INL involvement from the beginning
• Endorsed as an official activity of the NSC in June of 2003
• Patterned after its predecessor, the ICSBEP
• Focuses, in addition to the critical configuration, on other integral measurements such as buckling, spectral characteristics, reactivity effects, reactivity coefficients, kinetics measurements, reaction-rate and power distributions, nuclide compositions, and other miscellaneous types of measurements
• Involves internationally known reactor physics experts
• Technical Review Group managed through the Idaho National Laboratory (INL) for the OECD/NEA
27. What Gets Benchmarked?
• Criticality/Subcriticality
• Buckling/Extrapolation Length
• Spectral Characteristics
• Reactivity Effects
• Reactivity Coefficient Data
• Kinetics Measurements Data
• Reaction-Rate Distributions
• Power Distribution Data
• Isotopic Measurements
• Miscellaneous

If it is worth measuring, then it is worth evaluating.
28. International Handbook of Evaluated Criticality Safety Benchmark Experiments
September 2011 Edition:
• 20 Contributing Countries
• Spans over 62,600 Pages
• Evaluation of 532 Experimental Series
• 4,550 Critical, Subcritical, or K∞ Configurations
• 24 Criticality-Alarm/Shielding Benchmark Configurations – numerous dose points each
• 155 fission-rate and transmission measurements and reaction-rate ratios for 45 different materials
http://icsbep.inl.gov/
http://www.oecd-nea.org/science/wpncs/icsbep/
(September 2012 edition: 617 series and 4,703 configurations)
29. International Handbook of Evaluated Reactor Physics Benchmark Experiments
March 2012 Edition:
• 16 Contributing Countries
• Data from 56 Experimental Series performed at 32 Reactor Facilities
• Data from 52 of the 56 series are published as approved benchmarks
• Data from the remaining 4 series are published in DRAFT form
http://irphep.inl.gov
http://www.oecd-nea.org/science/wprs/irphe/
30. Accomplishments of the ICSBEP and IRPhEP
As a result of ICSBEP and IRPhEP efforts:
• A large portion of the tedious and redundant research and processing of critical experiment data has been eliminated
• The necessary step in criticality safety analyses of validating computer codes with benchmark critical data is greatly streamlined
• Valuable criticality safety experimental data are preserved and will be of use for decades
31. Accomplishments of the ICSBEP and IRPhEP (continued)
The work of the ICSBEP and IRPhEP has:
• Highlighted gaps in data
• Retrieved “lost” data
• Helped to identify deficiencies and errors in cross-section processing codes and neutronics codes
• Improved experimental planning, execution, and reporting
32. Accomplishments of the ICSBEP and IRPhEP (continued)
Over 400 scientists from 24 different countries have combined their efforts to produce the ICSBEP and IRPhEP Handbooks.
These two handbooks continue to grow and provide high-quality integral benchmark data that will be of use to the criticality safety, nuclear data, and reactor physics communities for decades to come.
33. The ICSBEP is Featured in the September and October 2003 Issues of NS&E
34. ICSBEP Evaluation Content & Format
All ICSBEP evaluations follow the same general format:
1. Describe the Experiments
2. Evaluate the Experiments
3. Derive Benchmark Specifications
4. Provide Results from Sample Calculations
A. Typical Input Listings
B. Supporting Information
35. IRPhEP Evaluation Format
1.0 DETAILED DESCRIPTION
1.1 Description of the Critical or Subcritical Configuration
1.1.1 Overview of Experiment
1.1.2 Description of Experimental Configuration
1.1.3 Description of Material Data
1.1.4 Temperature Data
1.1.5 Additional Information Relevant to Critical and Subcritical Measurements
36. IRPhEP Evaluation Format (continued)
1.2 Description of Buckling and Extrapolation Length Measurements
1.3 Description of Spectral Characteristics Measurements
1.4 Description of Reactivity Effects Measurements
1.5 Description of Reactivity Coefficient Measurements
1.6 Description of Kinetics Measurements
1.7 Description of Reaction Rate Distribution Measurements
1.8 Description of Power Distribution Measurements
1.9 Description of Isotopic Measurements
1.10 Description of Other Miscellaneous Types of Measurements
37. Benchmark Process General Overview
1. Identify Experiment
2. Evaluate Experiment
   a. Prepare Benchmark Report
3. Internal Review of Benchmark Report
4. Submit Benchmark Experiment to ICSBEP/IRPhEP
5. Independent Review of Benchmark Report
6. Distribution of Benchmark Report to Technical Review Group
7. Technical Review Meeting
8. Resolve Action Items
9. Handbook Publication