Benchmarking Experiments for Criticality Safety and
Reactor Physics Applications – II – Tutorial

John D. Bess and J. Blair Briggs – INL
Ian Hill (IDAT) – OECD/NEA

www.inl.gov

2012 ANS Annual Meeting
Chicago, Illinois
June 24-28, 2012

This paper was prepared at Idaho National Laboratory for the U.S. Department of
Energy under Contract Number DE-AC07-05ID14517
Purpose of this Tutorial

• Discuss the benchmark process for
  ICSBEP and IRPhEP

• Provide brief demonstration of
  DICE and IDAT



Acknowledgements
• ICSBEP and IRPhEP are collaborative efforts that
  involve numerous scientists, engineers,
  administrative support personnel, and program
  sponsors from 24 different countries and the
  OECD/NEA.
• The authors would like to acknowledge the efforts
  of all of those dedicated individuals without whom
  those two projects would not be possible.




Outline
I.   Introduction to Benchmarking
     a.   Overview
     b.   ICSBEP/IRPhEP
II. Benchmark Experiment Availability
     a.   DICE Demonstration
     b.   IDAT Demonstration
III. Dissection of a Benchmark Report
     a.   Experimental Data
     b.   Experiment Evaluation
     c.   Benchmark Model
     d.   Sample Calculations
     e.   Benchmark Measurements
IV. Benchmark Participation
INTRODUCTION TO
BENCHMARKING

What Is a Benchmark?
• Merriam-Webster:
  – “a point of reference from which measurements may be made”
  – “something that serves as a standard by which others are measured or judged”
  – “a standardized problem or test that serves as a basis for evaluation or
    comparison (as of computer system performance)”

(photo © Gary Price)
WHY ARE YOU INTERESTED IN
BENCHMARKS?

How Does Benchmark Design Apply to You?




Why Do We Have Nuclear Benchmarks?
• Nuclear Safety
  – Plant Operations
    • Training
  – Transportation
  – Waste Disposal
  – Experimentation
  – Accident Analysis
  – Standards Development
  – Homeland Security
• Materials
  – Testing
  – Physics Validation
  – Interrogation
• Research and Development
  – New Reactor Designs
  – Design Validation
• Computational Methods
  – Cross-Section Data
  – Code Verification
• Fundamental Physics
  – Model Validation
• Fun
Cross Section Evaluation Working Group (CSEWG)

The majority of data testing utilizes critical benchmark models defined in the
ICSBEP Handbook.

CENDL-3.1 was also extensively tested with ICSBEP Handbook data.
Monte Carlo Code Validation

Select ICSBEP benchmarks are used to validate code performance.

Also useful for validating other codes: SCALE, SERPENT, TRIPOLI, etc.
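Validation of the kind this slide describes comes down to comparing calculated k-effective values against the benchmark values across a suite of cases. A minimal sketch of the bookkeeping (the numbers are illustrative, not real ICSBEP data):

```python
# Compute calculated-to-expected (C/E) ratios and simple suite statistics
# for a handful of hypothetical benchmark cases; all values are illustrative.
from statistics import mean, stdev

# (benchmark k-eff, benchmark 1-sigma uncertainty, calculated k-eff)
cases = [
    (1.0000, 0.0010, 0.9987),
    (1.0000, 0.0030, 1.0021),
    (0.9985, 0.0020, 0.9978),
]

ce = [calc / bench for bench, _sigma, calc in cases]
bias = mean(ce) - 1.0   # average deviation of the code from the benchmarks
spread = stdev(ce)      # scatter of C/E across the suite

print(f"mean C/E = {mean(ce):.5f}  bias = {bias:+.5f}  spread = {spread:.5f}")
```

A real validation study would also fold the benchmark uncertainties (the second column) into the comparison when judging whether the observed bias is significant.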
Prior to the ICSBEP and IRPhEP




Physics Criteria for Benchmarks
The Physics Criteria for Benchmarks Working Group was initiated in the 1980s
as part of the U.S. DOE-sponsored Nuclear Criticality Technology and
Safety Project (NCT&SP):
(1) The method used to determine k-effective should be documented.
(2) Consistency among experimentally measured parameters (e.g.,
    fundamental-mode multiplication should be determined by more than
    one method in order to ensure consistency).
(3) A rigorous and detailed description of the experimental mockup, its
    mechanical supports, and its surroundings is necessary. (Accompanying
    photographs and drawings are essential.)
(4) A complete specification of the geometry dimensions and material
    compositions, including the methods of determination and the known
    sources of error and their potential propagation, is necessary. Also, for
    completeness, unknown but suspected sources of error should be listed.
(5) A series of experiments is desirable in order to demonstrate the
    reproducibility of the results.
Physics Criteria for Benchmarks (continued)
Physics Criteria for Benchmarks Working Group (continued)
(6) A description of the experiment and results should appear in a
     refereed publication.
(7) These criteria were established primarily to provide guidelines for
     future experiments.
(8) Many of the earlier experiments do not satisfy all of these criteria.
(9) Failure to meet these criteria does not automatically disqualify an
     experiment from being considered as acceptable for use as a
     benchmark.
(10) An attempt is being made here to supplement the originally published
     data, through the evaluation process, to meet these criteria.




ANSI/ANS-19.5-1995 (Currently being Revised)
• Requirements for Reference Reactor Physics
  Measurements
  – Criteria for Evaluation of Data:
      • To qualify as reference data:
          – Measurements shall be performed with accepted and proven techniques,
          – or the techniques shall be adequately demonstrated to be valid
     • Reported uncertainties should be consistent with evidence
     • Compare with similar independent configurations or with a
       wide range of data sets using accepted calculational methods
     • Check for consistency




Dilbert’s Incorrect View of Benchmarking




Purpose of the ICSBEP
• Compile benchmark-experiment data into a
  standardized format that allows analysts to easily
  use the data to validate calculational techniques and
  cross section data.
• Evaluate the data and quantify overall uncertainties
  through various types of sensitivity analyses
• Eliminate a large portion of the tedious and redundant
  research and processing of experiment data
• Streamline the necessary step of validating computer
  codes and nuclear data with experimental data
• Preserve valuable experimental data that will be of
  use for decades

History of ICSBEP
• Initiated in October of 1992 by the Department of
  Energy’s Defense Programs
• Organized and managed through the Idaho
  National Laboratory (INL)
• Involves nationally known criticality safety experts




ICSBEP U.S. Participants

Hanford, INL, LLNL, ANL, BAPL, RFP, LANL, ORNL,
Oak Ridge Y-12, SNL, SRNL

(sites shown on a U.S. map in the original slide)
History of ICSBEP (continued)
• An International Criticality Safety Data Exchange
  component was added to the project in 1994


• The ICSBEP became an official activity of the
  OECD/NEA’s Nuclear Science Committee (NSC)
  in 1995




ICSBEP International Partners – 20 Countries

United States (1992), United Kingdom (1994), France (1994), Hungary (1994),
Russian Federation (1994), Japan (1994), Republic of Korea (1996),
Slovenia (1998), Serbia (1999), Kazakhstan (1999), Israel (2000), Spain (2000),
Brazil (2004), Czech Republic (2004), Canada (2005), Poland (2005), India (2005),
China (2006), Sweden (2007), Argentina (2008)
Purpose of the IRPhEP
• Similar to the ICSBEP
• Focuses on collecting data from the numerous
  experiments performed at research laboratories in
  support of nuclear energy and technology
• Experiments represent significant investments of
  time, infrastructure, expertise, and cost that might
  not have received adequate documentation
• Data also include reactivity measurements,
  reaction rates, buckling, burnup, etc., that are of
  significant worth for current and future research
  and development efforts
History of IRPhEP
• Initiated as a pilot activity in 1999 by the OECD NEA Nuclear Science
  Committee – INL involvement from the beginning

• Endorsed as an official activity of the NSC in June of 2003

• Patterned after its predecessor, the ICSBEP

• Focuses on other integral measurements such as buckling, spectral
  characteristics, reactivity effects, reactivity coefficients, kinetics
  measurements, reaction-rate and power distributions, nuclide
  compositions and other miscellaneous types of measurements in
  addition to the critical configuration

• Involves internationally known reactor physics experts

• Technical Review Group managed through the Idaho National
  Laboratory (INL) for the OECD/NEA

IRPhEP International Partners – 16 Countries




Purpose of the ICSBEP and IRPhEP




What Gets Benchmarked?
• Criticality/Subcriticality
• Buckling/Extrapolation Length
• Spectral Characteristics
• Reactivity Effects
• Reactivity Coefficient Data
• Kinetics Measurements Data
• Reaction-Rate Distributions
• Power Distribution Data
• Isotopic Measurements
• Miscellaneous

If it is worth measuring, then it is worth evaluating.
International Handbook of Evaluated Criticality
Safety Benchmark Experiments
September 2011 Edition
• 20 Contributing Countries
• Spans over 62,600 Pages
• Evaluation of 532 Experimental Series
• 4,550 Critical, Subcritical, or K∞ Configurations
• 24 Criticality-Alarm/Shielding Benchmark
  Configurations – numerous dose points each
• 155 fission-rate and transmission measurements
  and reaction-rate ratios for 45 different materials

http://icsbep.inl.gov/
http://www.oecd-nea.org/science/wpncs/icsbep/
(Sept 2012: 617 & 4,703)
International Handbook of Evaluated Reactor
Physics Benchmark Experiments
March 2012 Edition
• 16 Contributing Countries
• Data from 56 Experimental Series
  performed at 32 Reactor Facilities
• Data from 52 out of the 56 series are
  published as approved benchmarks
• Data from 4 out of the 56 series are
  published in DRAFT form




  http://irphep.inl.gov
  http://www.oecd-nea.org/science/wprs/irphe/
Accomplishments of the ICSBEP and IRPhEP


As a result of ICSBEP and IRPhEP efforts:
• A large portion of the tedious and redundant
  research and processing of critical experiment
  data has been eliminated
• The necessary step of validating computer codes
  against benchmark critical data in criticality safety
  analyses is greatly streamlined
• Valuable criticality safety experimental data are
  preserved and will be of use for decades


Accomplishments of the ICSBEP and IRPhEP
(continued)

The work of the ICSBEP and IRPhEP has:
• Highlighted gaps in data
• Retrieved “lost” data
• Helped to identify deficiencies and errors in cross
  section processing codes and neutronics codes
• Improved experimental planning, execution and
  reporting


Accomplishments of the ICSBEP and IRPhEP
(continued)

• Over 400 scientists from 24 different countries
  have combined their efforts to produce the
  ICSBEP and IRPhEP Handbooks.

• These two handbooks continue to grow and
  provide high-quality integral benchmark data that
  will be of use to the criticality safety, nuclear data,
  and reactor physics communities for future
  decades.
The ICSBEP is Featured in the September and
October 2003 Issues of NS&E
ICSBEP Evaluation Content & Format


All ICSBEP evaluations follow the same general
format:
 1.   Describe the Experiments
 2.   Evaluate the Experiments
 3.   Derive Benchmark Specifications
 4.   Provide Results from Sample Calculations
 A.   Typical Input Listings
 B.   Supporting Information




IRPhEP Evaluation Format


1.0 DETAILED DESCRIPTION
1.1 Description of the Critical or Subcritical
   Configuration
    1.1.1   Overview of Experiment
    1.1.2   Description of Experimental Configuration
    1.1.3   Description of Material Data
    1.1.4   Temperature Data
    1.1.5   Additional Information Relevant to Critical and
            Subcritical Measurements



IRPhEP Evaluation Format (continued)

1.2  Description of Buckling and Extrapolation Length
     Measurements
1.3 Description of Spectral Characteristics Measurements
1.4 Description of Reactivity Effects Measurements
1.5 Description of Reactivity Coefficient Measurements
1.6 Description of Kinetics Measurements
1.7 Description of Reaction Rate Distribution Measurements
1.8 Description of Power Distribution Measurements
1.9 Description of Isotopic Measurements
1.10 Description of Other Miscellaneous Types of Measurements


Benchmark Process General Overview
1. Identify Experiment
2. Evaluate Experiment
   a. Prepare Benchmark Report
3. Internal Review of Benchmark Report
4. Submit Benchmark Experiment to ICSBEP/IRPhEP
5. Independent Review of Benchmark Report
6. Distribution of Benchmark Report to Technical Review Group
7. Technical Review Meeting
8. Resolve Action Items
9. Handbook Publication
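The nine steps above are strictly sequential. As a minimal illustration (the tracking helper is hypothetical, not project tooling; only the step names come from the slide), the workflow can be modeled as an ordered list with a simple progress lookup:

```python
# Model the benchmark review workflow as an ordered sequence of steps;
# the step names come from the slide, the helper is illustrative only.
STEPS = [
    "Identify experiment",
    "Evaluate experiment (prepare benchmark report)",
    "Internal review of benchmark report",
    "Submit benchmark experiment to ICSBEP/IRPhEP",
    "Independent review of benchmark report",
    "Distribute benchmark report to Technical Review Group",
    "Technical review meeting",
    "Resolve action items",
    "Handbook publication",
]

def next_step(completed: int) -> str:
    """Given the number of completed steps, return the next step to perform."""
    if completed >= len(STEPS):
        return "published"
    return STEPS[completed]

print(next_step(0))  # nothing done yet: start by identifying an experiment
print(next_step(9))  # all nine steps done
```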
Questions?




             GROTESQUE





More Related Content

Similar to Benchmark Tutorial -- I - Introduction

The influence of data curation on QSAR Modeling – Presented at American Chemi...
The influence of data curation on QSAR Modeling – Presented at American Chemi...The influence of data curation on QSAR Modeling – Presented at American Chemi...
The influence of data curation on QSAR Modeling – Presented at American Chemi...Kamel Mansouri
 
Henry&Hobbs, 'Developing long-term agro-ecological trial datasets for C and N...
Henry&Hobbs, 'Developing long-term agro-ecological trial datasets for C and N...Henry&Hobbs, 'Developing long-term agro-ecological trial datasets for C and N...
Henry&Hobbs, 'Developing long-term agro-ecological trial datasets for C and N...TERN Australia
 
Development Of An ICSBEP Benchmark 2011
Development Of An ICSBEP Benchmark 2011Development Of An ICSBEP Benchmark 2011
Development Of An ICSBEP Benchmark 2011jdbess
 
An examination of data quality on QSAR Modeling in regards to the environment...
An examination of data quality on QSAR Modeling in regards to the environment...An examination of data quality on QSAR Modeling in regards to the environment...
An examination of data quality on QSAR Modeling in regards to the environment...Kamel Mansouri
 
The eNanoMapper database for nanomaterial safety information: storage and query
The eNanoMapper database for nanomaterial safety information: storage and queryThe eNanoMapper database for nanomaterial safety information: storage and query
The eNanoMapper database for nanomaterial safety information: storage and queryNina Jeliazkova
 
Benchmarking and Performance on AWS - AWS India Summit 2012
Benchmarking and Performance on AWS - AWS India Summit 2012Benchmarking and Performance on AWS - AWS India Summit 2012
Benchmarking and Performance on AWS - AWS India Summit 2012Amazon Web Services
 
167 - Productivity for proof engineering
167 - Productivity for proof engineering167 - Productivity for proof engineering
167 - Productivity for proof engineeringESEM 2014
 
Orbital presentation pt1_200112_v1
Orbital presentation pt1_200112_v1Orbital presentation pt1_200112_v1
Orbital presentation pt1_200112_v1ensmjd
 
Orbital presentation pt1_200112_v1
Orbital presentation pt1_200112_v1Orbital presentation pt1_200112_v1
Orbital presentation pt1_200112_v1ensmjd
 
Orbital presentation pt1_200112_v1
Orbital presentation pt1_200112_v1Orbital presentation pt1_200112_v1
Orbital presentation pt1_200112_v1ensmjd
 
ERIM, REDm-MED and Orbital
ERIM, REDm-MED and OrbitalERIM, REDm-MED and Orbital
ERIM, REDm-MED and Orbitalulcerd
 
HITSC 2010 06-30 slides
HITSC 2010 06-30 slidesHITSC 2010 06-30 slides
HITSC 2010 06-30 slidesBrian Ahier
 
Empirical se 2013-01-17
Empirical se 2013-01-17Empirical se 2013-01-17
Empirical se 2013-01-17Ivica Crnkovic
 
Genome in a Bottle Consortium Workshop Welcome Aug. 16
Genome in a Bottle Consortium Workshop Welcome Aug. 16Genome in a Bottle Consortium Workshop Welcome Aug. 16
Genome in a Bottle Consortium Workshop Welcome Aug. 16GenomeInABottle
 

Similar to Benchmark Tutorial -- I - Introduction (20)

The influence of data curation on QSAR Modeling – Presented at American Chemi...
The influence of data curation on QSAR Modeling – Presented at American Chemi...The influence of data curation on QSAR Modeling – Presented at American Chemi...
The influence of data curation on QSAR Modeling – Presented at American Chemi...
 
Henry&Hobbs, 'Developing long-term agro-ecological trial datasets for C and N...
Henry&Hobbs, 'Developing long-term agro-ecological trial datasets for C and N...Henry&Hobbs, 'Developing long-term agro-ecological trial datasets for C and N...
Henry&Hobbs, 'Developing long-term agro-ecological trial datasets for C and N...
 
Development Of An ICSBEP Benchmark 2011
Development Of An ICSBEP Benchmark 2011Development Of An ICSBEP Benchmark 2011
Development Of An ICSBEP Benchmark 2011
 
An examination of data quality on QSAR Modeling in regards to the environment...
An examination of data quality on QSAR Modeling in regards to the environment...An examination of data quality on QSAR Modeling in regards to the environment...
An examination of data quality on QSAR Modeling in regards to the environment...
 
An examination of data quality on QSAR Modeling in regards to the environment...
An examination of data quality on QSAR Modeling in regards to the environment...An examination of data quality on QSAR Modeling in regards to the environment...
An examination of data quality on QSAR Modeling in regards to the environment...
 
The needs for chemistry standards, database tools and data curation at the ch...
The needs for chemistry standards, database tools and data curation at the ch...The needs for chemistry standards, database tools and data curation at the ch...
The needs for chemistry standards, database tools and data curation at the ch...
 
The eNanoMapper database for nanomaterial safety information: storage and query
The eNanoMapper database for nanomaterial safety information: storage and queryThe eNanoMapper database for nanomaterial safety information: storage and query
The eNanoMapper database for nanomaterial safety information: storage and query
 
Benchmarking and Performance on AWS - AWS India Summit 2012
Benchmarking and Performance on AWS - AWS India Summit 2012Benchmarking and Performance on AWS - AWS India Summit 2012
Benchmarking and Performance on AWS - AWS India Summit 2012
 
Visual Search for Musical Performances and Endoscopic Videos
Visual Search for Musical Performances and Endoscopic VideosVisual Search for Musical Performances and Endoscopic Videos
Visual Search for Musical Performances and Endoscopic Videos
 
167 - Productivity for proof engineering
167 - Productivity for proof engineering167 - Productivity for proof engineering
167 - Productivity for proof engineering
 
Orbital presentation pt1_200112_v1
Orbital presentation pt1_200112_v1Orbital presentation pt1_200112_v1
Orbital presentation pt1_200112_v1
 
Orbital presentation pt1_200112_v1
Orbital presentation pt1_200112_v1Orbital presentation pt1_200112_v1
Orbital presentation pt1_200112_v1
 
Orbital presentation pt1_200112_v1
Orbital presentation pt1_200112_v1Orbital presentation pt1_200112_v1
Orbital presentation pt1_200112_v1
 
ERIM, REDm-MED and Orbital
ERIM, REDm-MED and OrbitalERIM, REDm-MED and Orbital
ERIM, REDm-MED and Orbital
 
HITSC 2010 06-30 slides
HITSC 2010 06-30 slidesHITSC 2010 06-30 slides
HITSC 2010 06-30 slides
 
NAA ko method
NAA ko methodNAA ko method
NAA ko method
 
Chapter9
Chapter9Chapter9
Chapter9
 
Empirical se 2013-01-17
Empirical se 2013-01-17Empirical se 2013-01-17
Empirical se 2013-01-17
 
The Importance of Open Data and Models for Energy Systems Analysis
The Importance of Open Data and Models for Energy Systems AnalysisThe Importance of Open Data and Models for Energy Systems Analysis
The Importance of Open Data and Models for Energy Systems Analysis
 
Genome in a Bottle Consortium Workshop Welcome Aug. 16
Genome in a Bottle Consortium Workshop Welcome Aug. 16Genome in a Bottle Consortium Workshop Welcome Aug. 16
Genome in a Bottle Consortium Workshop Welcome Aug. 16
 

More from jdbess

Benchmark Tutorial -- II - Availability
Benchmark Tutorial -- II - AvailabilityBenchmark Tutorial -- II - Availability
Benchmark Tutorial -- II - Availabilityjdbess
 
NRAD - ANS 2012
NRAD - ANS 2012NRAD - ANS 2012
NRAD - ANS 2012jdbess
 
GROTESQUE - ANS 2012
GROTESQUE - ANS 2012GROTESQUE - ANS 2012
GROTESQUE - ANS 2012jdbess
 
NRAD Reactor Benchmark Update
NRAD Reactor Benchmark UpdateNRAD Reactor Benchmark Update
NRAD Reactor Benchmark Updatejdbess
 
MIRTE 2010
MIRTE 2010MIRTE 2010
MIRTE 2010jdbess
 
IRPhEP 2011
IRPhEP 2011IRPhEP 2011
IRPhEP 2011jdbess
 
ICSBEP 2010
ICSBEP 2010ICSBEP 2010
ICSBEP 2010jdbess
 
FSP-Be - NETS2011
FSP-Be - NETS2011FSP-Be - NETS2011
FSP-Be - NETS2011jdbess
 
IRPhEP - ICAPP2010
IRPhEP - ICAPP2010IRPhEP - ICAPP2010
IRPhEP - ICAPP2010jdbess
 
FFTF - PHYSOR2010
FFTF - PHYSOR2010FFTF - PHYSOR2010
FFTF - PHYSOR2010jdbess
 
HTTR - PHYSOR2010
HTTR - PHYSOR2010HTTR - PHYSOR2010
HTTR - PHYSOR2010jdbess
 
HTTR - ANSWM 2009
HTTR - ANSWM 2009HTTR - ANSWM 2009
HTTR - ANSWM 2009jdbess
 
UO2F2 - ANSWM 2009
UO2F2 - ANSWM 2009UO2F2 - ANSWM 2009
UO2F2 - ANSWM 2009jdbess
 
CSNR - YPC 2009
CSNR - YPC 2009CSNR - YPC 2009
CSNR - YPC 2009jdbess
 
TANKS - ANS 2009
TANKS - ANS 2009TANKS - ANS 2009
TANKS - ANS 2009jdbess
 
FSP - NETS 2009
FSP - NETS 2009FSP - NETS 2009
FSP - NETS 2009jdbess
 
NTR - NETS 2009
NTR -  NETS 2009NTR -  NETS 2009
NTR - NETS 2009jdbess
 
HTTR - M&C 2009
HTTR - M&C 2009HTTR - M&C 2009
HTTR - M&C 2009jdbess
 
CSNR - STAIF 2008
CSNR - STAIF 2008CSNR - STAIF 2008
CSNR - STAIF 2008jdbess
 
LEGO Reactor - ICAPP 2008
LEGO Reactor - ICAPP 2008LEGO Reactor - ICAPP 2008
LEGO Reactor - ICAPP 2008jdbess
 

More from jdbess (20)

Benchmark Tutorial -- II - Availability
Benchmark Tutorial -- II - AvailabilityBenchmark Tutorial -- II - Availability
Benchmark Tutorial -- II - Availability
 
NRAD - ANS 2012
NRAD - ANS 2012NRAD - ANS 2012
NRAD - ANS 2012
 
GROTESQUE - ANS 2012
GROTESQUE - ANS 2012GROTESQUE - ANS 2012
GROTESQUE - ANS 2012
 
NRAD Reactor Benchmark Update
NRAD Reactor Benchmark UpdateNRAD Reactor Benchmark Update
NRAD Reactor Benchmark Update
 
MIRTE 2010
MIRTE 2010MIRTE 2010
MIRTE 2010
 
IRPhEP 2011
IRPhEP 2011IRPhEP 2011
IRPhEP 2011
 
ICSBEP 2010
ICSBEP 2010ICSBEP 2010
ICSBEP 2010
 
FSP-Be - NETS2011
FSP-Be - NETS2011FSP-Be - NETS2011
FSP-Be - NETS2011
 
IRPhEP - ICAPP2010
IRPhEP - ICAPP2010IRPhEP - ICAPP2010
IRPhEP - ICAPP2010
 
FFTF - PHYSOR2010
FFTF - PHYSOR2010FFTF - PHYSOR2010
FFTF - PHYSOR2010
 
HTTR - PHYSOR2010
HTTR - PHYSOR2010HTTR - PHYSOR2010
HTTR - PHYSOR2010
 
HTTR - ANSWM 2009
HTTR - ANSWM 2009HTTR - ANSWM 2009
HTTR - ANSWM 2009
 
UO2F2 - ANSWM 2009
UO2F2 - ANSWM 2009UO2F2 - ANSWM 2009
UO2F2 - ANSWM 2009
 
CSNR - YPC 2009
CSNR - YPC 2009CSNR - YPC 2009
CSNR - YPC 2009
 
TANKS - ANS 2009
TANKS - ANS 2009TANKS - ANS 2009
TANKS - ANS 2009
 
FSP - NETS 2009
FSP - NETS 2009FSP - NETS 2009
FSP - NETS 2009
 
NTR - NETS 2009
NTR -  NETS 2009NTR -  NETS 2009
NTR - NETS 2009
 
HTTR - M&C 2009
HTTR - M&C 2009HTTR - M&C 2009
HTTR - M&C 2009
 
CSNR - STAIF 2008
CSNR - STAIF 2008CSNR - STAIF 2008
CSNR - STAIF 2008
 
LEGO Reactor - ICAPP 2008
LEGO Reactor - ICAPP 2008LEGO Reactor - ICAPP 2008
LEGO Reactor - ICAPP 2008
 

Benchmark Tutorial -- I - Introduction

  • 1. Benchmarking Experiments for Criticality Safety and Reactor Physics Applications – II – Tutorial John D. Bess and J. Blair Briggs – INL Ian Hill (IDAT) – OECD/NEA www.inl.gov 2012 ANS Annual Meeting Chicago, Illinois June 24-28, 2012 This paper was prepared at Idaho National Laboratory for the U.S. Department of Energy under Contract Number (DE-AC07-05ID14517)
  • 2. Purpose of this Tutorial • Discuss the benchmark process for ICSBEP and IRPhEP • Provide brief demonstration of DICE and IDAT 2
  • 3. Acknowledgements • ICSBEP and IRPhEP are collaborative efforts that involve numerous scientists, engineers, administrative support personnel and program sponsors from 24 different countries and the OECD/NEA. • The authors would like to acknowledge the efforts of all of those dedicated individuals without whom those two projects would not be possible. 3
  • 4. Outline I. Introduction to Benchmarking a. Overview b. ICSBEP/IRPhEP II. Benchmark Experiment Availability a. DICE Demonstration b. IDAT Demonstration III. Dissection of a Benchmark Report a. Experimental Data b. Experiment Evaluation c. Benchmark Model d. Sample Calculations e. Benchmark Measurements IV. Benchmark Participation NRAD 4
  • 5. Outline I. Introduction to Benchmarking a. Overview b. ICSBEP/IRPhEP II. Benchmark Experiment Availability a. DICE Demonstration b. IDAT Demonstration III. Dissection of a Benchmark Report a. Experimental Data b. Experiment Evaluation c. Benchmark Model d. Sample Calculations e. Benchmark Measurements IV. Benchmark Participation 5
  • 7. What Is a Benchmark? • Merriam-Webster • “a point of reference from which measurements may be made” • “something that serves as a standard by which others are measured or judged” • “a standardized problem or test that serves as a basis for evaluation or © Gary Price comparison (as of computer system performance)” 7
  • 8. WHY ARE YOU INTERESTED IN BENCHMARKS? 8
  • 9. How Does Benchmark Design Apply to You? 9
  • 10. Why Do We Have Nuclear Benchmarks? • Nuclear Safety • Research and – Plant Operations Development • Training – New Reactor Designs – Transportation – Design Validation – Waste Disposal • Computational Methods – Experimentation – Cross-Section Data – Accident Analysis – Code Verification – Standards Development – Homeland Security • Fundamental Physics – Model Validation • Materials – Testing • Fun – Physics Validation – Interrogation 10
  • 11. Cross Section Evaluation Working Group (CSEWG) The majority of data testing utilize critical benchmark models defined in the ICSBEP Handbook. CENDL-3.1 was also extensively tested with ICSBEP Handbook data. 11
  • 12. Monte Carlo Code Validation Use of select ICSBEP benchmarks to validate code performance Useful to validate other codes: SCALE, SERPENT, TRIPOLI, etc. 12
  • 13. Prior to the ICSBEP and IRPhEP 13
  • 14. Physics Criteria for Benchmarks Physics Criteria for Benchmarks Working Group was initiated in the 1980s as part of the U. S. DOE sponsored Nuclear Criticality Technology and Safety Project (NCT&SP) (1) Method used to determine k-effective (2) Consistency among experimentally measured parameters. (e.g., fundamental mode multiplication should be determined by more than one method in order to insure consistency.) (3) A rigorous and detailed description of the experimental mockup, its mechanical supports, and its surroundings is necessary. (Accompanying photographs and drawings are essential.) (4) A complete specification of the geometry dimensions and material compositions including the methods of determination and the known sources of error and their potential propagation is necessary. Also, for completeness, unknown but suspected sources of error should be listed. (5) A series of experiments is desirable in order to demonstrate the reproducibility of the results. 14
  • 15. Physics Criteria for Benchmarks (continued) Physics Criteria for Benchmarks Working Group (continued) (6) A description of the experiment and results should appear in a refereed publication. (7) These criteria were established primarily to provide guidelines for future experiments. (8) Many of the earlier experiments do not satisfy all of these criteria. (9) Failure to meet these criteria does not automatically disqualify an experiment from being considered as acceptable for use as a benchmark. (10) An attempt is being made here to supplement the originally published data, through the evaluation process, to meet these criteria. 15
• 16. ANSI/ANS-19.5-1995 (currently being revised), Requirements for Reference Reactor Physics Measurements. Criteria for evaluation of data:
- To qualify as reference data, measurements shall be performed with accepted and proven techniques, or the techniques shall be adequately demonstrated to be valid.
- Reported uncertainties should be consistent with the evidence.
- Compare with similar independent configurations, or with a wide range of data sets, using accepted calculational methods.
- Check for consistency.
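One way to check whether reported uncertainties are "consistent with the evidence" is a simple scatter test: for repeated measurements of nominally identical configurations, the chi-square per degree of freedom about the weighted mean should be near 1. The sketch below is illustrative; the measurement values are invented, not taken from any real experiment.

```python
# Hypothetical consistency check: do repeated measurements scatter
# consistently with their reported uncertainties?
# (measured value, reported 1-sigma uncertainty) for nominally identical configurations
measurements = [(1.0002, 0.0010), (0.9995, 0.0012), (1.0008, 0.0011)]

# Inverse-variance weighted mean
weights = [1.0 / s**2 for _, s in measurements]
mean = sum(w * v for (v, _), w in zip(measurements, weights)) / sum(weights)

# Chi-square per degree of freedom: values near 1 suggest the reported
# uncertainties are consistent with the observed scatter; values well above 1
# suggest the uncertainties are underestimated.
chi2 = sum(((v - mean) / s) ** 2 for v, s in measurements)
chi2_per_dof = chi2 / (len(measurements) - 1)
print(f"weighted mean = {mean:.5f}, chi2/dof = {chi2_per_dof:.2f}")
```

A chi2/dof well below 1, as in this toy data, can also be informative: it may indicate conservatively overestimated uncertainties.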
• 17. Dilbert’s Incorrect View of Benchmarking
• 18. Purpose of the ICSBEP:
- Compile benchmark-experiment data into a standardized format that allows analysts to easily use the data to validate calculational techniques and cross section data.
- Evaluate the data and quantify overall uncertainties through various types of sensitivity analyses.
- Eliminate a large portion of the tedious and redundant research and processing of experiment data.
- Streamline the necessary step of validating computer codes and nuclear data with experimental data.
- Preserve valuable experimental data that will be of use for decades.
• 19. History of the ICSBEP:
- Initiated in October 1992 by the Department of Energy’s Defense Programs.
- Organized and managed through the Idaho National Laboratory (INL).
- Involves nationally known criticality safety experts.
• 20. ICSBEP U.S. Participants: ANL, BAPL, Hanford, INL, LANL, LLNL, ORNL, RFP, SNL, SRNL, and Y-12 (Oak Ridge).
• 21. History of the ICSBEP (continued):
- An International Criticality Safety Data Exchange component was added to the project in 1994.
- The ICSBEP became an official activity of the OECD/NEA’s Nuclear Science Committee (NSC) in 1995.
• 22. ICSBEP International Partners (20 countries, with year of joining): United States (1992); France, Hungary, Japan, Russian Federation, United Kingdom (1994); Republic of Korea (1996); Slovenia (1998); Kazakhstan, Serbia (1999); Israel, Spain (2000); Brazil, Czech Republic (2004); Canada, India, Poland (2005); China (2006); Sweden (2007); Argentina (2008).
• 23. Purpose of the IRPhEP:
- Similar to the ICSBEP.
- Focuses on collecting data from the numerous experiments performed at research laboratories in support of nuclear energy and technology.
- These experiments represent significant investments of time, infrastructure, expertise, and cost that might not have received adequate documentation.
- The measurements also include data on reactivity, reaction rates, buckling, burnup, etc., that are of significant worth for current and future research and development efforts.
• 24. History of the IRPhEP:
- Initiated as a pilot activity in 1999 by the OECD/NEA Nuclear Science Committee, with INL involvement from the beginning.
- Endorsed as an official activity of the NSC in June 2003.
- Patterned after its predecessor, the ICSBEP.
- In addition to critical configurations, focuses on other integral measurements such as buckling, spectral characteristics, reactivity effects, reactivity coefficients, kinetics measurements, reaction-rate and power distributions, nuclide compositions, and other miscellaneous types of measurements.
- Involves internationally known reactor physics experts.
- Technical Review Group managed through the Idaho National Laboratory (INL) for the OECD/NEA.
• 25. IRPhEP International Partners – 16 Countries
• 26. Purpose of the ICSBEP and IRPhEP
• 27. What Gets Benchmarked?
- Criticality/Subcriticality
- Buckling/Extrapolation Length
- Spectral Characteristics
- Reactivity Effects
- Reactivity Coefficient Data
- Kinetics Measurements Data
- Reaction-Rate Distributions
- Power Distribution Data
- Isotopic Measurements
- Miscellaneous
If it is worth measuring, then it is worth evaluating.
• 28. International Handbook of Evaluated Criticality Safety Benchmark Experiments, September 2011 Edition:
- 20 contributing countries
- Spans over 62,600 pages
- Evaluations of 532 experimental series
- 4,550 critical, subcritical, or K∞ configurations
- 24 criticality-alarm/shielding benchmark configurations, with numerous dose points each
- 155 fission-rate and transmission measurements and reaction-rate ratios for 45 different materials
- September 2012 edition: 617 experimental series and 4,703 configurations
http://icsbep.inl.gov/
http://www.oecd-nea.org/science/wpncs/icsbep/
• 29. International Handbook of Evaluated Reactor Physics Benchmark Experiments, March 2012 Edition:
- 16 contributing countries
- Data from 56 experimental series performed at 32 reactor facilities
- Data from 52 of the 56 series are published as approved benchmarks
- Data from 4 of the 56 series are published in DRAFT form
http://irphep.inl.gov
http://www.oecd-nea.org/science/wprs/irphe/
• 30. Accomplishments of the ICSBEP and IRPhEP. As a result of ICSBEP and IRPhEP efforts:
- A large portion of the tedious and redundant research and processing of critical experiment data has been eliminated.
- The necessary step in criticality safety analyses of validating computer codes with benchmark critical data is greatly streamlined.
- Valuable criticality safety experimental data are preserved and will be of use for decades.
• 31. Accomplishments of the ICSBEP and IRPhEP (continued). The work of the ICSBEP and IRPhEP has:
- Highlighted gaps in data
- Retrieved “lost” data
- Helped to identify deficiencies and errors in cross section processing codes and neutronics codes
- Improved experimental planning, execution, and reporting
• 32. Accomplishments of the ICSBEP and IRPhEP (continued):
- Over 400 scientists from 24 different countries have combined their efforts to produce the ICSBEP and IRPhEP Handbooks.
- These two handbooks continue to grow and provide high-quality integral benchmark data that will be of use to the criticality safety, nuclear data, and reactor physics communities for decades to come.
• 33. The ICSBEP is featured in the September and October 2003 issues of Nuclear Science and Engineering (NS&E).
• 34. ICSBEP Evaluation Content and Format. All ICSBEP evaluations follow the same general format:
1. Describe the Experiments
2. Evaluate the Experiments
3. Derive Benchmark Specifications
4. Provide Results from Sample Calculations
Appendices: A. Typical Input Listings; B. Supporting Information
• 35. IRPhEP Evaluation Format:
1.0 DETAILED DESCRIPTION
1.1 Description of the Critical or Subcritical Configuration
1.1.1 Overview of Experiment
1.1.2 Description of Experimental Configuration
1.1.3 Description of Material Data
1.1.4 Temperature Data
1.1.5 Additional Information Relevant to Critical and Subcritical Measurements
• 36. IRPhEP Evaluation Format (continued):
1.2 Description of Buckling and Extrapolation Length Measurements
1.3 Description of Spectral Characteristics Measurements
1.4 Description of Reactivity Effects Measurements
1.5 Description of Reactivity Coefficient Measurements
1.6 Description of Kinetics Measurements
1.7 Description of Reaction Rate Distribution Measurements
1.8 Description of Power Distribution Measurements
1.9 Description of Isotopic Measurements
1.10 Description of Other Miscellaneous Types of Measurements
• 37. Benchmark Process, General Overview:
1. Identify Experiment
2. Evaluate Experiment
   a. Prepare Benchmark Report
3. Internal Review of Benchmark Report
4. Submit Benchmark Experiment to ICSBEP/IRPhEP
5. Independent Review of Benchmark Report
6. Distribution of Benchmark Report to Technical Review Group
7. Technical Review Meeting
8. Resolve Action Items
9. Handbook Publication
• 38. Questions?