HSEEP Exercise Evaluation and Improvement
ODP’s Mission <ul><li>Primary responsibility within the executive branch to build and sustain the preparedness of the US t...
ODP’s Responsibilities <ul><li>Grant programs for planning, equipment, training and exercises </li></ul><ul><li>National t...
Grant Programs <ul><li>State Homeland Security Program </li></ul><ul><li>Law Enforcement Terrorism Prevention Program </li...
State Homeland  Security Program <ul><ul><li>Purpose: to enhance capacity of states and locals to prevent, respond to, and...
Law Enforcement Terrorism Prevention Program <ul><ul><li>Provide law enforcement communities with funds to support the fol...
Citizen Corps Program <ul><ul><li>Provides funds to support Citizen Corps Councils with planning, outreach, and management...
Urban Areas Security  Initiative Program <ul><ul><li>Address the unique needs of large urban areas – 50 cities </li></ul><...
Fire Fighter Assistance <ul><li>Protect public and fire fighters against fire and fire-related hazards </li></ul><ul><ul><...
Strategy Process Overview END RESULT  = Capability Improvements Statewide Homeland Security Strategy Conducted at the  loc...
Strategy Participants  <ul><ul><li>Fire Service </li></ul></ul><ul><ul><li>HazMat </li></ul></ul><ul><ul><li>Emergency Med...
Assessment Overview Shortfalls or “Gaps” Agricultural Vulnerability Assessment * CBRNE: Chemical, Biological, Radiological...
Threat Assessment Agricultural Vulnerability Assessment Threat Assessment Vulnerability Assessment <ul><li>Who:   </li></u...
Vulnerability Assessment Risk Assessment <ul><li>Who:   </li></ul><ul><li>All response disciplines at local, state, and fe...
Capabilities and Needs:  Planning <ul><li>The results from the risk assessment process (threat and vulnerability) provide ...
State Homeland Security Strategy <ul><li>Developed by State based on local needs </li></ul><ul><li>Provides blueprint for ...
State Assistance Plans <ul><li>ODP uses the strategies and needs assessment data to tailor and formulate a State/Metro Ass...
National Training Program <ul><li>Training for federal, state and local homeland security professionals </li></ul><ul><li>...
ODP Training Program <ul><li>ODP offers more than 40 courses  (Examples) </li></ul><ul><ul><li>Live chemical agents traini...
National Exercise Program <ul><li>Responsible for National Exercise Program </li></ul><ul><li>Threat and performance-based...
Assess Program Success Through Exercises  <ul><li>Performance measures for ODP’s grant, training, and exercise programs ar...
Overview of HSEEP <ul><li>Threat- and Performance-based Exercises </li></ul><ul><li>Cycle of exercises </li></ul><ul><li>I...
HSEEP Manuals <ul><li>Volume I:  Program Overview and  Doctrine </li></ul><ul><li>Volume II: Exercise Evaluation and  Impr...
Vol I:  HSEEP Overview  and Doctrine <ul><li>ODP’s exercise and evaluation doctrine </li></ul><ul><li>Uniform approach for...
Vol II:  Exercise Evaluation  and Improvement <ul><li>Defines exercise evaluation and improvement process </li></ul><ul><l...
Vol III: Exercise  Development <ul><li>Defines exercise planning and design process </li></ul><ul><li>Provides guidance fo...
Vol IV: Sample Documents <ul><li>Provides sample letters, planning documents, checklists, scenarios, etc. </li></ul><ul><l...
Exercise Evaluation <ul><li>Assess preparedness at federal, state and local levels </li></ul><ul><li>Validate strengths an...
Evaluation  Enhancements <ul><li>Focus on performance of critical tasks and mission outcomes  </li></ul><ul><li>Use of uni...
Exercise Evaluation Methodology Development <ul><li>Exercise Evaluation Working Group </li></ul><ul><li>Builds on  </li></...
Exercise Evaluation and Improvement Process Exercise Evaluation and Improvement Process  Data Collection and Analysis Step...
Levels of Analysis <ul><li>Performance is analyzed at three levels: </li></ul><ul><ul><li>Task level </li></ul></ul><ul><u...
Levels of Analysis <ul><li>Task Level Performance </li></ul><ul><ul><li>Answers the question:  did the person or team do t...
Levels of Analysis <ul><li>Agency/Discipline/Function Level Performance  —  Multiple teams </li></ul><ul><ul><li>Answers t...
Levels of Analysis <ul><li>Mission Level Performance </li></ul><ul><ul><li>Answers the question: were the mission level ou...
Mission Outcomes Pre-Event Emergency Response Post-Event <ul><li>Prevention/Deterrence </li></ul><ul><li>Emergency Assessm...
Evaluation Requirements <ul><li>Determine what outcomes will be evaluated, based on exercise objectives </li></ul><ul><li>...
Exercise  Evaluation Guides <ul><li>ODP has developed Exercise Evaluation Guides that: </li></ul><ul><ul><li>Identify the ...
The EVALPLAN <ul><li>Exercise-specific information </li></ul><ul><li>Plans, policies, procedures, and agreements </li></ul...
Recruiting and  Assigning Evaluators <ul><li>Setting expectations – evaluators must be available for: </li></ul><ul><ul><l...
Recording Observations <ul><li>The emphasis is on Who? What? When? Where? How? Why? </li></ul><ul><li>Record observations ...
Record  Significant Activities <ul><li>Initiating scenario events </li></ul><ul><li>Facility activities </li></ul><ul><li>...
Evaluator Summary  <ul><li>Compile observations into chronological narrative of events </li></ul><ul><li>Describe outcomes...
Data Analysis <ul><li>Conduct Hotwash </li></ul><ul><li>Develop timeline of significant events </li></ul><ul><li>Analyze p...
Hotwash <ul><li>Player Hotwash: </li></ul><ul><ul><li>Usually held immediately following exercise play  </li></ul></ul><ul...
Timeline Development <ul><li>Identify the appropriate outcome for  </li></ul><ul><li>each activity </li></ul>Make a team t...
Analysis of  Performance <ul><li>Analysis of activities </li></ul><ul><ul><li>What tasks were to be accomplished </li></ul...
Root Cause Analysis 1. Why did it happen?   2. Why did that happen? 3. Why was that? 4. And why was that?  5. And why was ...
Integrated Analysis <ul><li>Allows further identification of: </li></ul><ul><ul><li>Successes and best practices </li></ul...
Recommendations for Improvement <ul><li>Questions for identifying recommendations for improvement:  </li></ul><ul><ul><li>...
The After-Action  Report (AAR) <ul><li>Serves as feedback tool </li></ul><ul><li>Summarizes what happened </li></ul><ul><l...
After-Action Report <ul><li>Prepared in two stages:  </li></ul><ul><ul><li>Draft AAR – completed immediately after the exe...
AAR Format <ul><li>Executive Summary </li></ul><ul><li>Part 1:  Exercise  Overview  </li></ul><ul><li>Part 2: Exercise Goa...
Improvement Process <ul><li>Improving preparedness activities:  </li></ul><ul><ul><li>Conduct exercise debrief </li></ul><...
Exercise Debrief <ul><li>Provides a forum for jurisdiction officials to: </li></ul><ul><ul><li>Hear the results of the ana...
Improvement Plan <ul><li>Developed by local jurisdiction during debrief </li></ul><ul><li>Identifies how recommendations w...
Finalize AAR <ul><li>Improvement Plan is included in final AAR  </li></ul><ul><li>Final AAR submitted to ODP through State...
Monitor Implementation <ul><li>ODP Exercise Management System (under development) will provide: </li></ul><ul><ul><li>Cent...
Sharing Lessons Learned <ul><li>Ready-Net – Web-based, secure information network </li></ul><ul><ul><li>National repositor...
Benefits of HSEEP Approach <ul><li>Nationwide consistency </li></ul><ul><li>More useful after action reports and improveme...
Exercise Evaluation Training Course <ul><li>2 ½ days - Exercise Evaluation methodology </li></ul><ul><li>6 sessions to tra...
Goal for Working Group <ul><li>Review and modify Exercise Evaluation Guides for Radiological and Biological attacks </li><...
HSEEP Mission Overview Briefing

  • Welcome to the HSEEP Exercise Evaluation Training Course. This course has been developed by ODP with support from contractors who helped put together the HSEEP Exercise Evaluation and Improvement guidance document. A feedback form is included in your notebook. We need your help to make this course more helpful to you and others. Please complete it at the end of each section, no later than the end of each day. You'll be reminded again later. Instructor Notes: Introduce yourself and the rest of the instructors and support staff, including the facility host if appropriate. Include relevant aspects of your background to establish credibility, or allow each instructor to do the same. Make general administrative remarks, e.g.: locations of restrooms and other facilities (e.g., phones); where to pick up phone/fax messages; time allocated for breaks and lunch; exit locations; rules of engagement (use cell phones outside the classroom; set pagers/beepers on vibrate; no smoking in the facility). NO TEST, but there will be a course completion certificate, based on full participation.
  • ODP was created within the Department of Justice in 1998 to enhance preparedness to respond to acts of terrorism. ODP moved into the new Department of Homeland Security, along with 21 other agencies, on March 1, 2003. With that move, ODP was given a broader mission: ODP now has primary…. The assignment of expanded responsibilities broadens ODP's constituency from a primarily state and local focus to include federal departments and agencies, tribal governments, the private sector, and the international community. ODP's original emphasis on response capabilities has been broadened to include efforts to reduce vulnerabilities and to prevent and recover from acts of terrorism.
  • ODP's mission is implemented through an integrated program of grants, training, and exercises. For those of you who are contractors, your focus is on developing and conducting exercises. Do you know how those exercises fit into the larger integrated program? How many of you are familiar with the Assessment and Strategy Development process implemented under the grant programs? How many are familiar with ODP's training programs and the course catalogue? I am going to provide a quick overview of ODP's programs, the assessment and strategy process, and our national training and exercise program. I will then talk about the Homeland Security Exercise and Evaluation Program and how the evaluation of performance under exercises serves as a primary measure of effectiveness for all of ODP's programs.
  • SHSP Purpose: to enhance the capacity of states and local jurisdictions to prevent, respond to, and recover from terrorism. Provides funds for: homeland security and emergency operations planning; the purchase of specialized equipment; costs for the design, development, and conduct of state CBRNE and cyber security training programs and attendance at ODP-sponsored training; costs related to the design, development, conduct, and evaluation of CBRNE and cyber security exercises; and costs associated with implementing State Homeland Security Assessments and Strategies. Funds provided to states, D.C., and territories (2003: $1.9B; 2004: $1.9B).
  • LETPP provides law enforcement communities with funds to support the following prevention activities: information sharing to preempt terrorist attacks; target hardening to reduce the vulnerability of selected high-value targets; recognition of potential or developing threats; interoperable communications; and intervention of terrorists before they can execute a threat. Funds can be used for planning, organization, training, exercises, and equipment.
  • CCP provides funds to support Citizen Corps Councils with planning, outreach, and management of Citizen Corps programs and activities: bring together the appropriate leadership to form and sustain a Citizen Corps Council; develop and implement a plan for the community to engage all citizens in homeland security, community preparedness, and family safety; conduct public education and outreach to inform the public about their role in crime prevention, mitigation, emergency preparedness, and public health measures; develop and implement Citizen Corps programs offering training and volunteer opportunities to support first responders, disaster relief groups, and community safety efforts, including the four charter federal programs: Community Emergency Response Teams (CERT), Neighborhood Watch, Volunteers in Police Service (VIPS), and Medical Reserve Corps (MRC); and coordinate Citizen Corps activities with other DHS-funded programs and other federal initiatives.
  • UASI: new program in FY 2003, designed to address the unique needs of large urban areas – 50 cities. Conduct a jurisdictional assessment and develop an Urban Area Homeland Security Strategy. Direct grants to local jurisdictions for planning, equipment, training, exercises, and administration, and for operational activities related to heightened threat levels.
  • The states manage the assessment and strategy process, engaging local jurisdictions in the assessments. The state designates local jurisdictions for this purpose, which may be done at a county or regional level. For the Urban Areas Security Initiative Program, the assessment must be coordinated with and developed jointly by the core city, core county, jurisdictions contiguous to the core city and county, or jurisdictions with which the core city and county have established mutual aid agreements. Each Urban Area develops a strategy that is coordinated with the state strategy.
  • The assessment and strategy process is an inclusive process with active participants from the full spectrum of homeland security disciplines.
  • The assessment process is comprehensive and is supported by an on-line assessment tool. It requires that local jurisdictions and state agencies conduct a risk and needs assessment. The risk assessment includes a threat and a vulnerability assessment. Jurisdictions may also choose to conduct an agricultural vulnerability assessment. The agricultural assessment is new with the current assessment tool. Jurisdictions also conduct a needs assessment that looks at capabilities and needs.
  • The threat assessment measures the existence of potential threat elements located within the jurisdiction and their capability, targeting, motivations, and history. Because this information is sensitive, only the rating is included in the assessment. The assessment is used to determine the most probable kind of WMD incident that could occur at a potential target. This is useful information to get from the local jurisdiction when working with them to determine the type of exercise to conduct.
  • The vulnerability assessment provides a vulnerability profile and rating for all potential targets within the jurisdiction.
  • Through the capabilities and needs assessment, local jurisdictions assess their capabilities by discipline in five areas: planning, organization, equipment, training, and exercises.
  • This section of the course describes some basic principles of the exercise evaluation and improvement activities. This section will be given by ODP for all contractors and state sessions. Additional slides and materials to be added by ODP.
  • Exercises provide a means to train and practice prevention, response, and recovery capabilities in a risk-free environment and to assess and improve performance. The goal of exercise evaluation is to validate strengths and identify improvement opportunities for the participating organization(s). This is accomplished by: observing the exercise and collecting supporting data; analyzing the data to compare performance against expected outcomes; and determining what changes need to be made to the procedures, plans, staffing, equipment, communications, organizations, and interagency coordination to ensure expected outcomes. The information obtained during an exercise can help review performance at several different levels, listed on this slide. We'll discuss these in more detail in a moment. The level of analysis conducted on the exercise data will vary depending on the type of exercise. This training focuses on the most complicated exercise, the full-scale, multi-jurisdictional exercise. In addition, we will discuss the differences involved in conducting a tabletop exercise.
  • Here we'll give you a very brief overview of the exercise design process and how evaluation fits into it. The overall process can be thought of in two major phases: Evaluation Planning, Observation, and Analysis (Steps 1 through 4) and Improving Preparedness (Steps 5 through 8). Instructor's Notes: The rest of this section should be done quickly, since most of it is covered in Sections III and IV.
  • Task Level Performance: At the most fundamental level, an exercise evaluation can look at the ability to perform individual prevention and response tasks. A task can be defined as "work with a measurable output that has utility". Analysis at this level will answer the question: Did the individuals or team carry out the task the way that you expected and in a way that achieved the goal of the function? In other words: Did the person or small team do the right thing the right way at the right time? The evaluation of the performance of individual tasks can help determine whether personnel, training, and equipment are sufficient for the individuals/teams to do their job. Such information is useful for team leaders and first-line supervisors when determining training needs, scheduling maintenance, and routine purchasing. Note: DHS/ODP developed Emergency Responder Guidelines that identify the essential tasks that response agencies must perform to effectively prevent, respond to, and recover from a threat or act of terrorism, including those involving the use of CBRNE weapons. The task in this picture is a firefighter sizing up a fire. The outcome or result of this task is an assessment of the extent and type of fire, so that firefighters can determine how to attack it. {Note: need more action-oriented task.}
  • Exercise evaluation also assesses performance of: agencies (e.g., police or fire department), disciplines (e.g., local, state, and federal law enforcement agencies), and functions, often as defined within the Incident Command System (e.g., HazMat team, Emergency Operations Center, or fire services). The purpose of evaluation at this level is to answer the question: Did the larger team or organization accomplish its duties correctly in accordance with approved plans, policies, procedures, and agreements? Or did the team deviate from the planned response in an appropriate and successful way to meet the need, threat, and resources available at the time? The analysis at this level is useful for assessing such issues as advanced planning and preparation; how the members work together at the discipline, department, or organizational level; and how well team members communicate across organizational and jurisdictional boundaries. This information is used by department managers and agency executives at the state and local level in developing annual operating plans and budgets, in communicating with political officials, in setting long-range training and planning goals, and in developing interagency and interjurisdictional agreements. In this example, the agency or discipline represented is fire or Emergency Medical Services (EMS). Multiple tasks are being performed to evacuate this patient, such as monitoring the patient's medical signs, transporting the patient on a stretcher, and using appropriate PPE.
  • Mission Level Performance: As public officials know, success in a real emergency is measured by outcomes (or results). The public expects its government and its law enforcement and response agencies to prevent terrorist attacks if possible and, if attacked, to respond to and recover from these attacks quickly and effectively, mitigate the associated hazards, care for the victims, and protect the public. By focusing on performance and the root causes of variances from expected outcomes, public officials will be able to target their limited resources on improvements that will have the greatest effect on terrorism preparedness. In this example, we have multiple disciplines and agencies represented in an Emergency Operations Center (EOC); their mission is emergency management for public safety. Why are we concerned about outcomes? Example: In the TOPOFF 2000 exercise, a military team was fully trained and equipped to rescue and decontaminate victims. In the scenario, there was a chemical WMD release at a port facility. After the live victims had been evacuated, there were many contaminated dead bodies remaining on the pier. The bodies were baking in the sun. Seagulls had arrived to pick at the remains. The victims' families were traumatized, and the media were in a frenzy. But this military team wouldn't remove the bodies to a makeshift morgue because it wasn't in their procedure. Their procedure only called for handling live victims – not dead ones. So they followed procedures, but missed the point: victim care. In this case, care for the victims' remains and the well-being of the victims' families. So while we want to measure adherence to procedures, we also want to make sure we achieve the desired results.
  • For purposes of the HSEEP, ODP has defined eight Mission Level Outcomes: Prevention/Deterrence – ability to prevent or deter terrorist actions. Emergency Assessment – ability to detect an event, determine its impact, classify the event, conduct environmental monitoring, and make government-to-government notifications. Emergency Management – ability to direct, control, and coordinate a response, provide emergency public information to the population-at-risk and the population-at-large, and manage resources. Incident/Hazard Mitigation – ability to control and contain an incident at its source and to reduce the magnitude of its impact; this outcome also includes all response tasks conducted at the incident scene except those specifically associated with victim care. Public Protection – ability to keep uninjured people from becoming injured once an incident has occurred. Victim Care – ability to medically treat victims and handle human remains. Investigation/Apprehension – ability to find the cause and source of the attack, prevent secondary attacks, and identify and apprehend those responsible. Recovery/Remediation – ability to restore essential services and business, clean up the environment and render the affected area safe, compensate victims, and restore a sense of well-being to the community.
  • Section I- DRAFT 05/21/10 Commonly, many of the same people are involved in both the Exercise Planning Team and the Exercise Evaluation Team. Certainly the Exercise Director will participate in and monitor the evaluation team's planning, although the team should be led by the Lead Evaluator. The Exercise Lead Controller should also be involved. Exercise planning and evaluation planning are best accomplished concurrently. Let's look now at general requirements for successful exercise evaluation planning. The evaluation planning team will use the EXPLAN and MSEL to plan the evaluation, as follows: The evaluation planning team will first use the exercise goals and objectives to determine what performance outcomes should be evaluated. Once the outcomes to be evaluated are determined, the team identifies what activities should be evaluated. Based on these activities, the team identifies which functions (e.g., individuals, teams, disciplines, organizations) should be evaluated. From the functions, the evaluation planning team can identify where the observations should take place (i.e., what locations) and which specific tasks should be evaluated. Once these steps have been completed, the evaluation planning team can identify or develop the appropriate evaluation tools for the evaluators to use.
  • At the end of this section, we will discuss the ODP Exercise Evaluation Guides. These are tools that ODP has developed to create consistency in the evaluation process across all exercises. Instructor Notes: Ask the students how an evaluator knows what to look for once they have been assigned to a particular location. Explain that EEGs provide a consistent way of defining those tasks. In the set of EEGs, there is basically one guide for every task to be observed. The Evaluation Team would just select those EEGs pertinent to the particular exercise, as part of the planning. Direct students to the EEGs in the notebooks. Explain how the guides are organized according to the eight mission outcomes described earlier. Thus, there will be 8 sets of EEGs, once completed. Since the first set, pertaining to tasks that lead to the prevention and deterrence outcome, is not yet developed, the EEGs in the notebook start with the second set, under the outcome of Assessment. All EEGs under this outcome start with Roman numeral II. ODP wants to have these EEGs used for every ODP-sponsored exercise. This way, the AARs can be written up following this structure, so that data can later be rolled up consistently to the national level, as described in the beginning of this course. Explain that the next 3 slides walk through the structure of an individual task EEG. EEGs can be sorted by function/discipline or outcome.
  • Good evaluation planning should result in the development of an exercise Evaluation Plan (EVALPLAN), and as we have already said, evaluation planning starts as early as possible in exercise design and planning. The EVALPLAN provides an overview of the exercise and the plan for the evaluation of the exercise. It is distributed to exercise planners, controllers, and evaluators. It should include the purpose of the exercise, a list of tasks and outcomes to be evaluated, and a list of participating jurisdictions, as well as administrative and logistical information for the exercise. There may be some duplication of material between the EXPLAN and the EVALPLAN. This allows evaluators to obtain all the important information they need in one planning document. The EVALPLAN typically consists of: exercise-specific information – the scenario/MSEL, the map of the play site (including evaluation locations), the exercise schedule (including the evaluation schedule), and appropriate plans, policies, procedures, and agreements; evaluator team information – how many evaluators are needed, where they will be located, and how they are organized; and evaluator instructions – what to do before they arrive (e.g., review exercise materials, jurisdictional plans and procedures, and the EVALPLAN) as well as how to proceed upon arrival. Ask students for examples, good and bad, from their experience, and whether (and when) the evaluation planning was or was not incorporated into the exercise development process.
  • Expectations: Evaluators are generally expected to be available for the pre-exercise training and briefing/site visit, the exercise itself, the post-exercise hotwash, and for data analysis and contribution to the AAR. This time commitment is usually equivalent to one day before the exercise, the exercise day(s), and one full day after the exercise. Instructor Note: Facilitate a discussion as to how these expectations can be met. For example: incentives; the highest possible level of support from management and leadership.
  • One way to record and capture information is to ask the questions who, what, when, where, how, and why. Information can be captured using the ODP EEGs or even just blank sheets of paper with a column for recording events and times. The EEGs provide space to create a chronological record of the action to address the above questions. Although the evaluator should be familiar with the expected outcomes and steps outlined in the "What to Look For" section of the guide and the questions in the "Data Analysis Questions and Measures" section, he/she should not try to use these as a checklist. It is important to concentrate on simply recording what is happening. The analysis of how well the exercise met expectations is done later, during the analysis phase.
  • Here are a few items usually considered significant events during an exercise: initiating scenario events (release begins); facility staffing, activation times, and completion patterns (for example, who is there vs. who should be at the facility, and when did they arrive); actions of responders; key decisions made by players such as directors, coordinators, judges, and politicians, along with the times the decisions were initiated and the times they were completed; deviations from plans and procedures; and the exact times events were completed. Evaluators should remember that real-world emergency events can occur and should be recorded as well. Of course, a real-world event could delay exercise play, cancel play at one or more exercise locations, or cancel the entire exercise. Perhaps the best way to record significant events is to use a timeline format in taking your notes.
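If a jurisdiction captures evaluator notes electronically rather than on paper, the timeline-format note taking described above can be sketched as a small log structure. This is only an illustration; the class and field names here are invented, not part of HSEEP or any ODP tool:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class Observation:
    """One timestamped evaluator note (hypothetical structure)."""
    time: datetime
    location: str
    event: str

@dataclass
class EvaluatorLog:
    """Collects raw observations and renders them as a timeline."""
    observations: List[Observation] = field(default_factory=list)

    def record(self, hhmm: str, location: str, event: str) -> None:
        # Times are noted as HH:MM on the exercise day.
        self.observations.append(
            Observation(datetime.strptime(hhmm, "%H:%M"), location, event)
        )

    def timeline(self) -> List[str]:
        # Chronological order, regardless of the order notes were taken.
        return [
            f"{o.time:%H:%M}  {o.location}: {o.event}"
            for o in sorted(self.observations, key=lambda o: o.time)
        ]

log = EvaluatorLog()
log.record("09:42", "EOC", "Director orders shelter-in-place")
log.record("09:15", "Incident scene", "Initiating event: release begins")
print("\n".join(log.timeline()))
```

The point of the sketch is simply that notes recorded out of order can still be sorted into a chronological timeline afterward, which is what the analysis phase needs.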
  • Once the exercise is completed and the evaluators have collected the appropriate additional information, each evaluator should compile his or her observations into a chronological narrative of events, describing outcomes achieved or not achieved. For any outcomes that are not achieved, the evaluator and the evaluator's team should analyze the sequence of events and attempt to determine the cause of the issue, using the questions discussed above, i.e.: What happened? What was supposed to happen? If there is a difference, why? What is the impact of that difference? What should be learned from this? What improvements might be recommended? The evaluator should also refer to the specific questions provided at the end of each evaluation form, which may help in conducting a more detailed analysis of the specific events observed. Evaluators will then bring their individual narratives to the team analysis, described below.
  • For the next several viewgraphs, we will discuss data analysis of the exercise activities. This includes conducting a post-exercise hotwash, developing a timeline of events and the initial narrative of what happened, and finally analyzing the performance of the individuals, teams, functions, and the outcomes. But first, the player hotwash.
  • The hotwash is best held as soon as possible after exercise play ends, and before players start departing from the area. It shouldn't take more than an hour. The hotwash is typically facilitated by the evaluator assigned to the particular location. It allows the evaluator to ask questions to clarify points or situations. It is important not to make judgmental or subjective statements. The hotwash: allows players to participate in a self-assessment of the exercise play; facilitates an interactive discussion to clarify actual activities during the response; gives a general assessment of how the organization performed in the exercise; fills in information for evaluators; and provides an opportunity for players to comment on how well the exercise was planned and conducted, and on the effectiveness of the mock-ups and simulations.
  • Second, the participants identify the mission outcomes associated with particular activities and events. The outcomes are shown here in the far right column of the viewgraph; these are all tied to the Emergency Management outcome. Separate timelines are produced for each exercise location. During the analysis and report writing activities, the timelines will be shared as necessary to facilitate the analysis. A consolidated timeline will also be produced, as discussed a little later. If the timelines are produced and combined electronically, there are many ways to sort the data to help in discovering just what happened and the root cause of an issue.
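When the per-location timelines are kept electronically, producing the consolidated timeline is just a merge of already-sorted lists. The entries below are invented for illustration; the outcome labels are taken from the eight mission outcomes defined earlier:

```python
import heapq

# Each location's evaluation group produces a chronologically sorted
# timeline of (time, location, activity, mission outcome) entries.
eoc_timeline = [
    ("09:20", "EOC", "EOC activated", "Emergency Management"),
    ("09:42", "EOC", "Public warning issued", "Public Protection"),
]
scene_timeline = [
    ("09:15", "Scene", "Release begins", "Emergency Assessment"),
    ("09:30", "Scene", "HazMat team arrives on site", "Incident/Hazard Mitigation"),
]

# heapq.merge interleaves the sorted per-location timelines into one
# consolidated, chronological timeline without re-sorting everything.
consolidated = list(
    heapq.merge(eoc_timeline, scene_timeline, key=lambda e: e[0])
)
for time, loc, activity, outcome in consolidated:
    print(f"{time}  [{loc}] {activity}  ({outcome})")
```

Once combined, the same list can be re-sorted by location or by outcome, which is the kind of slicing the notes suggest for finding root causes.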
  • Following timeline development, the group can analyze the events as follows: review the site-specific objectives and tasks to be accomplished at that location; determine which tasks went well and which need improvement; identify the strengths and weaknesses in carrying out those tasks; determine why an action was not accomplished (the root cause analysis); and recommend an improvement action, including, if possible, who will carry out the action, what the action is, and when it should be accomplished. The next two slides discuss these activities in more detail. The evaluation group from each location presents the results of its findings to the larger exercise evaluation group, and the process is then repeated across locations and jurisdictions.
  • For each action, the participants should search for the "root cause" to try to determine the reason that an expected action did not occur or was not performed as expected. A number of analysis tools are available for root-cause analysis. One commonly used tool is the "why staircase" (or 5-why technique). It is used to help determine why there was a difference between what was planned and what actually occurred, and it helps an analysis team detect flaws in its reasoning. The team should keep asking why something happened or did not happen until it is satisfied that it has found the cause. Reaching this level of understanding is essential to making recommendations that enhance preparedness. Instructor Notes: Time permitting, ask students for a real-life example of a real or exercise issue that they are aware of. Here are a few examples: An order was given to evacuate an area after a chemical release, but an elderly shut-in was not evacuated. Why? (No one checked on her, because she was not on a list, because the list-making process was flawed, because not all sources for identifying special-needs people were used, and so on.) A supply of atropine delivered to a staging area was discovered to be beyond its shelf life. Why? (No one was assigned responsibility to check expiration dates, because it was not in anyone's procedures, because . . .?)
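The why staircase can be recorded as a simple chain of question-and-answer steps, where each answer must explain the step above it and the final answer is the candidate root cause. A sketch of that record-keeping (the function name and step wording are illustrative, not from HSEEP):

```python
def why_staircase(issue, answers):
    """Walk a 5-why chain: pair each answer with the question it explains.

    Returns a list of (question, answer) steps; the last answer is
    treated as the candidate root cause.
    """
    steps = []
    current = issue
    for answer in answers:
        steps.append((f"Why: {current}?", answer))
        current = answer  # each answer becomes the next step to explain
    return steps

# Example using the atropine scenario from the text
steps = why_staircase(
    "Atropine at the staging area was past its shelf life",
    ["No one checked expiration dates",
     "Checking dates was not in anyone's procedures",
     "Supply-management procedures omit shelf-life review"])
root_cause = steps[-1][1]
assert root_cause == "Supply-management procedures omit shelf-life review"
```

The point of the structure is the discipline it enforces: each step must completely explain the one above it, which is exactly the check the analysis team applies verbally.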
  • We began this morning by discussing analysis within single lanes: functions, disciplines, and locations. We now want to cross outside our lanes to examine performance across them. This is called "integrated analysis." Remember the Subaru commercial with the frustrated girl in kindergarten being told to "stay within the lines" while drawing? It then flashes forward to her 4-wheeling in the mud and says "sometimes you just have to break the rules": sometimes you have to go outside the lines and interact with the rest of the environment. Integrated analysis is where we get creative and expand beyond our basic analysis of what happened at each function, discipline, and location. We are looking at how each group interacts with the others and where disconnects may be occurring. This further helps us identify: additional successes and best practices; additional gaps or problems; root causes (of new problems or those previously identified); and recommendations for improvement. How is this done? Following the analysis by each group, a more limited group of evaluators, controllers, and players meets to analyze performance across the various functions and locations. Location-specific analysis focuses on assessing the performance of individual (or team) tasks and, to some extent, functions and disciplines. An integrated analysis cuts across locations and further identifies functional or discipline performance and, most important, mission-critical performance.
  • Once a root cause is determined, the individual evaluator and the team should use the following questions as a guide for developing recommendations for improvement: What changes need to be made to plans, policies, procedures, and relationships or agreements to support the performance of essential tasks? What changes need to be made to organizational structures? What changes need to be made to leadership and management processes? What training is needed? What changes to existing resources, or what additional resources, are needed? Note that these are the initial recommendations of the individual evaluators and their evaluation teams. They play an important part in developing the final set of recommendations that will be contained in the draft and final After-Action Reports. Although some recommendations will be immediately on target and adopted fully in the final report, others will bear modifying once all exercise information is in and the senior evaluators, exercise director, and players confer. Instructor Notes: Time permitting, ask students for examples of each item.
  • So let's begin writing our report. The AAR is the capstone of the exercise: the tool used to provide feedback to the participating jurisdiction(s) on their performance during the exercise. The AAR provides a summary of what happened and recommendations for improvement. It may also include "lessons learned": knowledge gained from an innovation or experience that provides valuable evidence, positive or negative, for how to approach a similar problem in the future. Although every recommendation that comes out of the analysis process may result in a lesson learned for the participating jurisdictions, it is those with applicability to other jurisdictions that should be highlighted as lessons learned in the After Action Report (AAR).
  • The first draft is prepared by the whole evaluation team, with input from controllers, SIMCELL members, and participants, while on site. A draft report is then prepared by the evaluation team leaders for review by the participants, who add how they will address the recommendations. The review is conducted during a follow-up visit to debrief the exercise, and then the final report is prepared.
  • AAR Format: Refer to Appendix D in your HSEEP Volume II; it has no page numbers, but it is the very last section of the manual. The outline of the AAR is: Executive Summary; Part 1: Exercise Overview; Part 2: Exercise Goals and Objectives; Part 3: Exercise Events Synopsis; Part 4: Analysis of Mission Outcomes; Part 5: Analysis of Critical Task Performance; Part 6: Conclusion; Appendix A: Improvement Plan Matrix. Notice Parts 4 and 5, the analyses of mission outcomes and critical task performance: these are the meat of the report. The instruction and practical activities of the last two days are intended to help you prepare this report. If you use the evaluation process as defined in the HSEEP, the report drafting should be almost anticlimactic; it will not quite write itself, but you should be nearly there. Instructor Notes: Ask students to describe the features of the Executive Summary; this will give the flavor of the report. Ask about: audience; content (exercise design, successes, improvements, follow-up); length; and the introduction (report purpose).
  • The effort of an exercise is wasted if its lessons are not translated into actions that improve the capabilities tested. The draft After Action Report (AAR) presents observations and recommendations based on the data collection and analysis completed by the evaluation team. The evaluation team will assist the jurisdiction(s) that conducted the exercise in turning those recommendations into action: they will debrief the exercise to the participating agency officials and, as appropriate, to public officials, and assist them in identifying and documenting corrective actions for program improvement.
  • The exercise debrief provides a forum for jurisdiction officials to hear the results of the analysis and validate the findings and recommendations presented in the draft AAR. The presentation includes a discussion of: the exercise objectives; what happened during the exercise; any differences between expected and actual performance; the reasons for those differences and their impact on the response; lessons learned; and recommendations for improvement. The debrief should be interactive, with the jurisdiction officials validating the observations and recommendations and/or providing insights into activities that might have been missed or misinterpreted by the evaluation team. The draft AAR is then modified to incorporate any clarifying information. The debrief should also include a facilitated discussion of ways the jurisdiction can build on identified strengths and begin to address recommendations for improvement through the development of an Improvement Plan.
  • The Improvement Plan (IP) is the means by which the lessons learned from the exercise are turned into concrete, measurable steps that result in improved response capabilities. An initial IP should be developed at the debrief, while all of the key officials are together. The initial IP should identify: what actions will be taken to address each recommendation presented in the draft AAR; who or what agency will be responsible for taking the action; and the timeline for completion. If more information is needed to answer these questions, the initial IP should at least identify which agency will explore the issue further. The IP should be realistic and establish priorities for the use of limited resources. During the meeting, the facilitator should assist the officials in identifying sources of funding, or in exploring alternative solutions if funds will not be immediately available.
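Because each IP entry pairs a recommendation with an action, a responsible party, and a completion date, the entries lend themselves to simple record-keeping so that implementation can later be tracked (Step 8 of the process). A minimal sketch, with field names that are assumptions rather than the official Improvement Plan Matrix headings:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class IPAction:
    recommendation: str   # recommendation from the draft AAR
    action: str           # concrete step that will be taken
    responsible: str      # agency or official responsible
    due: date             # timeline for completion
    completed: bool = False

def overdue(actions, today):
    """Return open actions whose due date has passed."""
    return [a for a in actions if not a.completed and a.due < today]

# Example entry drawn from the atropine scenario discussed earlier
plan = [IPAction("Verify pharmaceutical shelf life at staging areas",
                 "Add expiration-date checks to the supply SOP",
                 "County Health Dept", date(2004, 6, 1))]
late = overdue(plan, date(2004, 7, 1))
assert late[0].responsible == "County Health Dept"
```

Even this much structure answers the three questions the initial IP must address: what action, who is responsible, and by when.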
  • Generally, the initial IP will be included in the final AAR. The final AAR should follow the same format as the draft AAR discussed in the previous chapters, with the addition of the improvement steps that will be taken. The improvement steps that address a specific recommendation will generally be listed in the AAR immediately following that recommendation.
  • DHS/ODP is developing a secure, Internet-based Exercise Management System that will: provide a centralized calendar of exercises across the country; provide for the electronic submission of AAR/IPs to the SAA and DHS/ODP; and monitor the implementation of IPs. The system is being designed so that all information flows through the SAA, giving them a tool to enhance the management of their exercise programs. All AAR/IPs and follow-up information will be designated "For Official Use Only."
  • DHS/ODP will provide copies of the AARs to the Memorial Institute for the Prevention of Terrorism's (MIPT) Ready-Net, a Web-based best practices and lessons learned information network for first responders and emergency planners nationwide. MIPT Ready-Net will serve as the national repository for best practices and lessons learned; it will analyze the information and pull out the best practices, lessons learned, and trends. It will be accessible to approved users within the response community through the DHS/ODP secure portal. All AAR information will be secure and will be provided to approved users in summary form and/or with all identifying information removed.
  • As ODP contractors, we are looking to you to help states and local jurisdictions implement the HSEEP evaluation process. It helps to have everyone using the same, familiar approach at different exercises; it improves state and local preparedness; and it results in realistic plans that can be implemented. Because planning and conducting an exercise requires a significant commitment of resources, it is important to maximize the benefits gained from the exercise through implementation of this DHS/ODP evaluation and improvement process.
  • HSEEP Mission Overview Briefing

    1. 1. HSEEP Exercise Evaluation and Improvement
    2. 2. ODP’s Mission <ul><li>Primary responsibility within the executive branch to build and sustain the preparedness of the US to reduce vulnerabilities, prevent, respond to, and recover from acts of terrorism (Homeland Security Act). </li></ul>
    3. 3. ODP’s Responsibilities <ul><li>Grant programs for planning, equipment, training and exercises </li></ul><ul><li>National training program </li></ul><ul><li>National exercise program </li></ul>
    4. 4. Grant Programs <ul><li>State Homeland Security Program </li></ul><ul><li>Law Enforcement Terrorism Prevention Program </li></ul><ul><li>Citizen Corps Program </li></ul><ul><li>Urban Areas Security Initiative Program </li></ul><ul><li>Fire Fighter Assistance Program </li></ul>
    5. 5. State Homeland Security Program <ul><ul><li>Purpose: to enhance capacity of states and locals to prevent, respond to, and recover from terrorism </li></ul></ul><ul><ul><li>Provides funds for </li></ul></ul><ul><ul><ul><li>Homeland security and emergency operations planning </li></ul></ul></ul><ul><ul><ul><li>The purchase of specialized equipment </li></ul></ul></ul><ul><ul><ul><li>CBRNE and cyber security training programs </li></ul></ul></ul><ul><ul><ul><li>CBRNE and cyber security exercises </li></ul></ul></ul><ul><ul><ul><li>State Homeland Security Assessments and Strategies </li></ul></ul></ul>
    6. 6. Law Enforcement Terrorism Prevention Program <ul><ul><li>Provide law enforcement communities with funds to support the following prevention activities: </li></ul></ul><ul><ul><ul><li>Information sharing to preempt terrorist attacks </li></ul></ul></ul><ul><ul><ul><li>Target hardening </li></ul></ul></ul><ul><ul><ul><li>Recognition of potential or developing threats </li></ul></ul></ul><ul><ul><ul><li>Interoperable communications </li></ul></ul></ul><ul><ul><ul><li>Intervention of terrorists before they can execute a threat </li></ul></ul></ul><ul><ul><ul><li>Planning, organization, training, exercises, and equipment </li></ul></ul></ul>
    7. 7. Citizen Corps Program <ul><ul><li>Provides funds to support Citizen Corps Councils with planning, outreach, and management of Citizen Corps program and activities </li></ul></ul><ul><ul><ul><li>Form and sustain a Citizen Corps Council </li></ul></ul></ul><ul><ul><ul><li>Engage citizens in homeland security </li></ul></ul></ul><ul><ul><ul><li>Conduct public education and outreach </li></ul></ul></ul><ul><ul><ul><li>Develop and implement Citizen Corps programs </li></ul></ul></ul><ul><ul><ul><li>Coordinate Citizen Corps activities with other DHS funded programs and other federal initiatives </li></ul></ul></ul>
    8. 8. Urban Areas Security Initiative Program <ul><ul><li>Address the unique needs of large urban areas – 50 cities </li></ul></ul><ul><ul><li>Conduct jurisdictional assessment and develop Urban Area Homeland Security Strategy. </li></ul></ul><ul><ul><li>Funds for planning, equipment, training, exercise, and administration and operational activities related to heightened threat levels </li></ul></ul>
    9. 9. Fire Fighter Assistance <ul><li>Protect public and fire fighters against fire and fire-related hazards </li></ul><ul><ul><li>Fire fighting Operations and Safety </li></ul></ul><ul><ul><li>Fire Prevention </li></ul></ul><ul><ul><li>Fire fighting Vehicles </li></ul></ul>
    10. 10. Strategy Process Overview END RESULT = Capability Improvements Statewide Homeland Security Strategy Conducted at the local and state levels Created at the regional and state level Created by ODP STEP 1 STEP 2 STEP 3 Assessments State Assistance Plan State and Urban Area use strategy to identify & allocate all HS resources
    11. 11. Strategy Participants <ul><ul><li>Fire Service </li></ul></ul><ul><ul><li>HazMat </li></ul></ul><ul><ul><li>Emergency Medical Services </li></ul></ul><ul><ul><li>Law Enforcement </li></ul></ul><ul><ul><li>Emergency Management </li></ul></ul><ul><ul><li>Public Safety Communications </li></ul></ul><ul><ul><li>Public Health </li></ul></ul><ul><ul><li>Health Care </li></ul></ul><ul><ul><li>Public Works </li></ul></ul><ul><ul><li>Government Administrative </li></ul></ul><ul><ul><li>Private Sector </li></ul></ul><ul><ul><li>Non-Profit/Voluntary Sector </li></ul></ul><ul><li>State and local jurisdictions </li></ul><ul><li>All First Responder Disciplines </li></ul>
    12. 12. Assessment Overview Shortfalls or “Gaps” Agricultural Vulnerability Assessment * CBRNE: Chemical, Biological, Radiological, Nuclear, Explosive Planning Factors CBRNE* Scenarios Required Capabilities Current Capabilities Statewide Homeland Security Strategy Threat Assessment Vulnerability Assessment Risk Assessment Needs Assessment
    13. 13. Threat Assessment Agricultural Vulnerability Assessment Threat Assessment Vulnerability Assessment <ul><li>Who: </li></ul><ul><li>Local, state, and federal law enforcement officials </li></ul><ul><li>What: </li></ul><ul><ul><li>Identify number of Potential Threat Elements (PTEs) </li></ul></ul><ul><ul><li>Identify threat factors (existence, violent history, intentions, WMD capability, and targeting) </li></ul></ul><ul><ul><li>Identify motivations (political, religious, environmental, racial, or special interest) </li></ul></ul><ul><ul><li>Identify WMD capabilities (CBRNE) </li></ul></ul>Risk Assessment
    14. 14. Vulnerability Assessment Risk Assessment <ul><li>Who: </li></ul><ul><li>All response disciplines at local, state, and federal levels </li></ul><ul><li>What: </li></ul><ul><li>Identify critical infrastructure/ potential targets </li></ul><ul><li>Evaluate targets for: </li></ul><ul><ul><li>Level of visibility </li></ul></ul><ul><ul><li>Criticality of target site </li></ul></ul><ul><ul><li>Impact outside of jurisdiction </li></ul></ul><ul><ul><li>Access to target </li></ul></ul><ul><ul><li>Target threat of hazard </li></ul></ul><ul><ul><li>Target site population capacity </li></ul></ul><ul><ul><li>Potential collateral mass casualties </li></ul></ul>Agricultural Vulnerability Assessment Vulnerability Assessment Threat Assessment
    15. 15. Capabilities and Needs: Planning <ul><li>The results from the risk assessment process (threat and vulnerability) provide a link to the capabilities and needs assessment process. </li></ul><ul><ul><li>Planning </li></ul></ul><ul><ul><li>Organization </li></ul></ul><ul><ul><li>Equipment </li></ul></ul><ul><ul><li>Training </li></ul></ul><ul><ul><li>Exercises </li></ul></ul>
    16. 16. State Homeland Security Strategy <ul><li>Developed by State based on local needs </li></ul><ul><li>Provides blueprint for planning of homeland security efforts to enhance preparedness and for use of resources </li></ul>
    17. 17. State Assistance Plans <ul><li>ODP uses the strategies and needs assessment data to tailor and formulate a State/Metro Assistance Plan (SAP/MAP) for each state </li></ul><ul><li>A SAP/MAP is a blueprint for the delivery of ODP training, exercise, technical assistance and equipment services </li></ul>
    18. 18. National Training Program <ul><li>Training for federal, state and local homeland security professionals </li></ul><ul><li>Based on critical tasks to prevent, respond to or recover from a terrorist incident </li></ul><ul><li>Over 40 courses available </li></ul>
    19. 19. ODP Training Program <ul><li>ODP offers more than 40 courses (Examples) </li></ul><ul><ul><li>Live chemical agents training – Center for Domestic Preparedness </li></ul></ul><ul><ul><li>Live explosives training – New Mexico Institute of Mining and Technology </li></ul></ul><ul><ul><li>Radiological and nuclear agents training – Nevada Test Site </li></ul></ul><ul><ul><li>Advanced emergency medical training using human patient simulators – Texas A&M </li></ul></ul><ul><ul><li>Training on bioterrorism – Louisiana State University </li></ul></ul>
    20. 20. National Exercise Program <ul><li>Responsible for National Exercise Program </li></ul><ul><li>Threat- and performance-based exercises at federal, state, local, and international levels </li></ul><ul><li>Strategy and Exercise Planning Workshops to define exercise needs and plan for each state </li></ul>
    21. 21. Assess Program Success Through Exercises <ul><li>Performance measures for ODP’s grant, training, and exercise programs are tied to performance of critical tasks </li></ul><ul><li>Percent of jurisdictions that can perform critical tasks as demonstrated through exercises </li></ul><ul><ul><li>500,000+ population </li></ul></ul><ul><ul><li>100,000+ population </li></ul></ul><ul><ul><li>50,000+ population </li></ul></ul>
    22. 22. Overview of HSEEP <ul><li>Threat- and Performance-based Exercises </li></ul><ul><li>Cycle of exercises </li></ul><ul><li>Increasing complexity </li></ul><ul><li>To improve preparedness </li></ul>
    23. 23. HSEEP Manuals <ul><li>Volume I: Program Overview and Doctrine </li></ul><ul><li>Volume II: Exercise Evaluation and Improvement </li></ul><ul><li>Volume III: Exercise Development </li></ul><ul><li>Volume IV: Sample Exercise Documents and Formats </li></ul>
    24. 24. Vol I: HSEEP Overview and Doctrine <ul><li>ODP’s exercise and evaluation doctrine </li></ul><ul><li>Uniform approach for exercise design, development, conduct, and evaluation </li></ul><ul><li>Exercise design and implementation process </li></ul><ul><li>Suite of common scenarios (TBD) </li></ul>
    25. 25. Vol II: Exercise Evaluation and Improvement <ul><li>Defines exercise evaluation and improvement process </li></ul><ul><li>Provides uniform set of evaluation guides </li></ul><ul><li>Defines data analysis process </li></ul><ul><li>Includes standardized After-Action Report template </li></ul>
    26. 26. Vol III: Exercise Development <ul><li>Defines exercise planning and design process </li></ul><ul><li>Provides guidance for the development and conduct of various types of exercises </li></ul>
    27. 27. Vol IV: Sample Documents <ul><li>Provides sample letters, planning documents, checklists, scenarios, etc. </li></ul><ul><li>Reduces development time for exercise design team </li></ul>
    28. 28. Exercise Evaluation <ul><li>Assess preparedness at federal, state and local levels </li></ul><ul><li>Validate strengths and identify improvement opportunities, resulting in improved preparedness </li></ul><ul><li>Provide guide for resource allocations </li></ul>
    29. 29. Evaluation Enhancements <ul><li>Focus on performance of critical tasks and mission outcomes </li></ul><ul><li>Use of uniform evaluation tools </li></ul><ul><li>Enhanced data analysis </li></ul><ul><li>Debriefing meeting with key officials </li></ul><ul><li>Improvement Plan </li></ul><ul><li>Track implementation of improvements </li></ul><ul><li>Suite of common scenarios (TBD) </li></ul>
    30. 30. Exercise Evaluation Methodology Development <ul><li>Exercise Evaluation Working Group </li></ul><ul><li>Builds on </li></ul><ul><ul><li>Responder Guidelines </li></ul></ul><ul><ul><li>ODP exercise experience </li></ul></ul><ul><ul><li>CSEP and other programs </li></ul></ul><ul><li>Will continue to enhance and improve </li></ul>
    31. 31. Exercise Evaluation and Improvement Process Exercise Evaluation and Improvement Process Data Collection and Analysis Step 1 Plan & Organize the Evaluation Step 2 Observe the Exercise & Collect Data Step 3 Analyze Data Step 4 Develop After Action Report Improving Preparedness Step 5 Conduct Debriefing Step 6 Identify Improvements Step 7 Finalize After Action Report Step 8   T rack Implementation Evaluation Planning, Observation, and Analysis
    32. 32. Levels of Analysis <ul><li>Performance is analyzed at three levels: </li></ul><ul><ul><li>Task level </li></ul></ul><ul><ul><li>Agency/discipline/function level </li></ul></ul><ul><ul><li>Mission level (within and across communities) </li></ul></ul>
    33. 33. Levels of Analysis <ul><li>Task Level Performance </li></ul><ul><ul><li>Answers the question: did the person or team do the right thing the right way at the right time? </li></ul></ul><ul><ul><li>Helps assess need for training, equipment, personnel, etc. </li></ul></ul><ul><li>Task = work with measurable output that has utility </li></ul>
    34. 34. Levels of Analysis <ul><li>Agency/Discipline/Function Level Performance — Multiple teams </li></ul><ul><ul><li>Answers the question: did the larger team or organization perform duties in accordance with plans and policies? </li></ul></ul><ul><ul><li>Helps assess communication, coordination, planning, budgets, etc. </li></ul></ul>
    35. 35. Levels of Analysis <ul><li>Mission Level Performance </li></ul><ul><ul><li>Answers the question: were the mission level outcomes achieved? </li></ul></ul><ul><ul><li>Addresses jurisdictional preparedness </li></ul></ul><ul><li>Outcomes = results </li></ul>
    36. 36. Mission Outcomes Pre-Event Emergency Response Post-Event <ul><li>Prevention/Deterrence </li></ul><ul><li>Emergency Assessment </li></ul><ul><li>Emergency Management </li></ul><ul><li>Hazard Mitigation </li></ul><ul><li>Public Protection </li></ul><ul><li>Victim Care </li></ul><ul><li>Investigation/Apprehension </li></ul><ul><li>Recovery/Remediation </li></ul>
    37. 37. Evaluation Requirements <ul><li>Determine what outcomes will be evaluated, based on exercise objectives </li></ul><ul><li>Identify activities to be evaluated </li></ul><ul><li>Identify which functions should be observed </li></ul><ul><li>Determine where observations will take place </li></ul><ul><li>Identify the appropriate evaluation tools </li></ul>
    38. 38. Exercise Evaluation Guides <ul><li>ODP has developed Exercise Evaluation Guides that: </li></ul><ul><ul><li>Identify the activities that the evaluator should be observing </li></ul></ul><ul><ul><li>Provide consistency in tasks across exercises </li></ul></ul><ul><ul><li>Link individual tasks to disciplines and outcomes </li></ul></ul>
    39. 39. The EVALPLAN <ul><li>Exercise-specific information </li></ul><ul><li>Plans, policies, procedures, and agreements </li></ul><ul><li>Evaluator recruiting and assignments </li></ul><ul><li>Evaluator training and instructions </li></ul>
    40. 40. Recruiting and Assigning Evaluators <ul><li>Setting expectations – evaluators must be available for: </li></ul><ul><ul><li>pre-exercise training and briefing </li></ul></ul><ul><ul><li>pre-exercise site visit </li></ul></ul><ul><ul><li>the entire exercise (hours to days) </li></ul></ul><ul><ul><li>post-exercise hot-wash </li></ul></ul><ul><ul><li>post-exercise data analysis (1 day) </li></ul></ul><ul><ul><li>contribution to the draft AAR </li></ul></ul>
    41. 41. Recording Observations <ul><li>The emphasis is on Who? What? When? Where? How? Why? </li></ul><ul><li>Record observations through: </li></ul><ul><ul><li>use of Evaluator Guides </li></ul></ul><ul><ul><li>blank sheets of paper </li></ul></ul><ul><li>Collect exercise documents </li></ul>
    42. 42. Record Significant Activities <ul><li>Initiating scenario events </li></ul><ul><li>Facility activities </li></ul><ul><li>Response actions </li></ul><ul><li>Key decisions made by Players </li></ul><ul><li>Deviations from plans and procedures </li></ul><ul><li>Completion time of events </li></ul>
    43. 43. Evaluator Summary <ul><li>Compile observations into chronological narrative of events </li></ul><ul><li>Describe outcomes achieved or not – use questions below and evaluation guides: </li></ul><ul><ul><li>What happened? </li></ul></ul><ul><ul><li>What was supposed to happen? </li></ul></ul><ul><ul><li>If there is a difference, why? </li></ul></ul><ul><ul><li>What is the impact of that difference? </li></ul></ul><ul><ul><li>What should be learned from this? </li></ul></ul><ul><ul><li>What improvements might be recommended? </li></ul></ul>
    44. 44. Data Analysis <ul><li>Conduct Hotwash </li></ul><ul><li>Develop timeline of significant events </li></ul><ul><li>Analyze performance: </li></ul><ul><ul><li>Individuals </li></ul></ul><ul><ul><li>Teams/Functions </li></ul></ul><ul><ul><li>Outcomes </li></ul></ul>
    45. 45. Hotwash <ul><li>Player Hotwash: </li></ul><ul><ul><li>Usually held immediately following exercise play </li></ul></ul><ul><ul><li>Typically facilitated by the evaluator </li></ul></ul><ul><li>Provides opportunity for: </li></ul><ul><ul><li>Player self-assessment </li></ul></ul><ul><ul><li>An interactive discussion </li></ul></ul><ul><ul><li>Clarification of observed events </li></ul></ul><ul><ul><li>Assessment of exercise simulations </li></ul></ul>
    46. 46. Timeline Development <ul><li>Identify the appropriate outcome for each activity </li></ul>Make a team timeline of actions. Focus on significant actions.
    Time | Observations | Location | Team | Outcome
    0852 | Three staff members arrive at JIC – PIO, Deputy, Admin. Ass’t; begin setting up (computers, removing displays from storage, job aids at work stations, etc.) | JIC | Rushmore Co. | EM
    0905 | First media call to JIC requesting info. on event. PIO provides initial incident info. & tells reporter to watch for news release shortly re: JIC activation. | JIC | Rushmore Co. PIO | EM
    0906 | Forest Co. PIO and Assistant arrive at EOC; PIO immediately calls in to EOC | JIC | Forest Co. | EM
    0910 | EAS message from EOC received by Fax | JIC | EOC | EM
    0912 | EAS copied & distributed at JIC to all staff work stations; additional copies on tables in media room. | JIC | JIC staff | EM
    47. 47. Analysis of Performance <ul><li>Analysis of activities </li></ul><ul><ul><li>What tasks were to be accomplished </li></ul></ul><ul><ul><li>How well were they performed </li></ul></ul><ul><ul><li>Root causes of differences between expected and actual performance </li></ul></ul><ul><ul><li>Recommendations </li></ul></ul>
    48. 48. Root Cause Analysis 1. Why did it happen? 2. Why did that happen? 3. Why was that? 4. And why was that? 5. And why was that? ***Root Cause*** 6. And so on… Each step must completely explain the step above … … down to the basic underlying causal factor.
    49. 49. Integrated Analysis <ul><li>Allows further identification of: </li></ul><ul><ul><li>Successes and best practices </li></ul></ul><ul><ul><li>New gaps and problems </li></ul></ul><ul><ul><li>Root causes </li></ul></ul><ul><ul><li>Recommendations for improvement </li></ul></ul><ul><li>Compares observations from different locations and functions </li></ul>
    50. 50. Recommendations for Improvement <ul><li>Questions for identifying recommendations for improvement: </li></ul><ul><ul><li>What training and/or equipment is needed? </li></ul></ul><ul><ul><li>What changes need to be made to plans and procedures, or organization structures? </li></ul></ul><ul><ul><li>What changes could be made to the management processes? </li></ul></ul>
    51. 51. The After-Action Report (AAR) <ul><li>Serves as feedback tool </li></ul><ul><li>Summarizes what happened </li></ul><ul><li>Identifies successes and recommendations for improvement </li></ul><ul><li>May include lessons learned to share with other jurisdictions </li></ul><ul><li>Help jurisdictions focus resources on greatest needs </li></ul>
    52. 52. After-Action Report <ul><li>Prepared in two stages: </li></ul><ul><ul><li>Draft AAR – completed immediately after the exercise for review </li></ul></ul><ul><ul><ul><li>Community adds improvement steps/corrective actions </li></ul></ul></ul><ul><ul><li>Final AAR </li></ul></ul>
    53. 53. AAR Format <ul><li>Executive Summary </li></ul><ul><li>Part 1: Exercise Overview </li></ul><ul><li>Part 2: Exercise Goals and Objectives </li></ul><ul><li>Part 3: Exercise Events Synopsis </li></ul><ul><li>Part 4: Analysis of Mission Outcomes </li></ul><ul><li>Part 5: Analysis of Critical Task Performance </li></ul><ul><li>Part 6: Conclusion </li></ul><ul><li>Appendix A: Improvement Plan Matrix </li></ul>
    54. 54. Improvement Process <ul><li>Improving preparedness activities: </li></ul><ul><ul><li>Conduct exercise debrief </li></ul></ul><ul><ul><li>Identify improvements </li></ul></ul><ul><ul><li>Finalize AAR </li></ul></ul><ul><ul><li>Track implementation </li></ul></ul>Step 5 Conduct Debriefing Step 6 Identify Improvements Step 7 Finalize After Action Report Step 8 Track Implementation
    55. Exercise Debrief <ul><li>Provides a forum for jurisdiction officials to: </li></ul><ul><ul><li>Hear the results of the analysis </li></ul></ul><ul><ul><li>Validate the findings and recommendations in the draft AAR </li></ul></ul><ul><ul><li>Begin development of the Improvement Plan </li></ul></ul>
    56. Improvement Plan <ul><li>Developed by the local jurisdiction during the debrief </li></ul><ul><li>Identifies how recommendations will be addressed: </li></ul><ul><ul><li>What actions </li></ul></ul><ul><ul><li>Who is responsible </li></ul></ul><ul><ul><li>Timeline for completion </li></ul></ul>
    57. Finalize AAR <ul><li>Improvement Plan is included in the final AAR </li></ul><ul><li>Final AAR submitted to ODP through the State Administrative Agency </li></ul>
    58. Monitor Implementation <ul><li>ODP Exercise Management System (under development) will provide: </li></ul><ul><ul><li>Centralized calendar of exercises across the country </li></ul></ul><ul><ul><li>Electronic submission of AARs/IPs to the SAA and ODP </li></ul></ul><ul><ul><li>Monitoring of Improvement Plan implementation </li></ul></ul>
    59. Sharing Lessons Learned <ul><li>Ready-Net – Web-based, secure information network </li></ul><ul><ul><li>National repository for best practices and lessons learned </li></ul></ul><ul><ul><li>Accessible to approved users within the response community </li></ul></ul><ul><ul><li>Administered by the Memorial Institute for the Prevention of Terrorism </li></ul></ul>
    60. Benefits of HSEEP Approach <ul><li>Nationwide consistency </li></ul><ul><li>More useful after-action reports and improvement plans </li></ul><ul><li>Ability of jurisdictions to focus resources on greatest needs </li></ul><ul><li>ENHANCED PREPAREDNESS </li></ul>
    61. Exercise Evaluation Training Course <ul><li>2½ days – Exercise Evaluation methodology </li></ul><ul><li>6 sessions to train ODP staff and contractors as change agents (225 people) </li></ul><ul><li>Training for SAAs Feb–May 2004 </li></ul><ul><li>ODP Exercise Design Course being revised to deliver a consistent message </li></ul>
    62. Goal for Working Group <ul><li>Review and modify Exercise Evaluation Guides for Radiological and Biological attacks </li></ul><ul><ul><li>Are the right tasks identified? </li></ul></ul><ul><ul><li>Do other tasks need to be added? </li></ul></ul><ul><ul><li>Are the conditions and typical steps logical and complete? </li></ul></ul><ul><ul><li>Are the follow-up analysis questions the right questions to assess performance? </li></ul></ul>