Key_carver_human_fac..

Speaker notes
  • Human factors is not human resources, and is not just concerned with civil liberties and the public perception of risk. 'In family' and 'out of family' problems (NASA): an in-family problem is a reportable problem, previously experienced, analysed and understood. But this ignores unreportable or not-reportable events - events that are not understood, where we are not even sure there was an event, and where the value of the event is not known or is missing.
  • Problem domain: multiple agencies - land/sea/air forces, security services, intelligence services, support services, media, local authorities, the public.
  • CONNECT: infrastructure to support a network, with processes in place that are trusted by all parties, to connect the right agencies when required - vertical and horizontal. COMMUNICATE: share data; standards for exchange; ontologies - be able to share information which is transferred with its meaning and/or context; understand the difficulties in communicating, particularly failure modes (inclusion, isolation, understanding, situation awareness). COLLABORATE: the virtual organisation - how to build, manage and dismantle it; socio-technical aspects; communication strategy; who to share with, what to share; knowing and understanding the value of information; formal and informal communication routes; isolation and situation awareness; trust/confidence in the data, the people and the process.
  • How to make the virtual organisation work: not just the technology. Macro processes: partnership formation, partnership assessment, relationship evaluation - interdependence, cultural learning, alignment of understanding. Micro processes: the cognitive processes involved in collaboration - planning, decision making, communication, failure modes. Use knowledge of human behaviour and cognition to create better interfaces and better support tools at each stage of the crisis management lifecycle, to support the individual, the team and the virtual organisation. Shorten the distance between the HQ and the coal face, reduce time lags, ensure good feedback loops, maintain awareness and achieve understanding, reduce misunderstanding and ignorance. The right information at the right place at the right time, with its significance understood by the right person.
  • Detecting small amounts of visual or acoustic energy; perceiving patterns of light or sound; detecting signals in high noise levels; improvisation and use of flexible procedures; storing very large amounts of information for long periods and recalling relevant facts at the appropriate time; inductive reasoning; exercising judgement; reacting to unexpected, low-probability events; applying originality in solving problems, i.e. generating alternative solutions; profiting from experience and altering course of action accordingly; performing fine manipulation, especially where misalignment appears unexpectedly.
  • 1980s: so many displays that not all can be displayed at once. Concept of the multi-function display - information stored in nested displays. The pilot needs to know the architecture of the information storage, needs to know which page to go to for the information, and needs to know that the information is there in the first place. Information is now processed (not raw), largely visual; feedback cues that were once received naturally as a result of the control mechanisms now have to be added, i.e. designed in by the systems engineer - e.g. DMU: manikin capability, size and scale cues, weight, texture... Operational concerns across the 1920s, 1940s and 1980s cockpits: automation bias, automation surprises, automation complacency, mode awareness, information management, feedback and situation awareness, behavioural adaptation, changes in human roles (possibly unforeseen), changes in the sources and patterns of workload, visual overload, acceptance that there will always be human error, usage beyond the original specification.
  • Situation awareness: the systems view versus the cognitive view.
  • Second, shared SA. We define this as the extent to which common SA requirements are met across the team - in other words, the degree of overlap between the team members' individual SA. For example, A+B+C would be the shared SA of all three team members.
  • (MNE and LOE)
  • Skill-based slips and lapses; rule-based mistakes; knowledge-based mistakes. Operational level: environment; tactical level: process; strategic level: competence. (Rasmussen, Hollnagel, Reason)
  • Task: conflict resolution - pipes impacting on a bulkhead. System integrator - for the DMU; structures manager - responsible for the bulkhead; system manager - responsible for the pipe network.
  • The kinds of questions that pilots ask themselves are: Is the runway long enough? (weight on arrival, hydraulic system failures, braking efficiency). How long would it take to get there? (altitude, distance, number of engines still working). How much will the diversion cost the company? (additional fuel, landing fees, passenger accommodation and/or forwarding to the original destination). What facilities are there at the airport? (maintenance facilities, emergency facilities).
  • There are different ways to use a cognitive model in this context: we could simply automate the function and program the autopilot directly; we could present the 'best option' and let the pilot decide if he wants it; we could provide a prioritised list of options and let the pilot select one; or the model could act as a consultant, with the pilot asking it what it thinks of an option. Simply presenting an answer can be appropriate, but in this instance we suspect that more supporting evidence will be required to convince the pilots that the tool has considered everything they would want considered. So we have put a lot of emphasis on making the model 'transparent' to the pilot, as we believe he needs to understand why the model thinks an airport is (or isn't) suitable. However, during the trials we will vary the amount of information presented, so we can test this theory. The 'full' implementation, shown on the right, works like this: when a critical fault is detected, the system assesses every airport in its database and lists its top 6 candidates; the pilots can also request this analysis themselves. The top of the screen presents some of the system performance data on which the calculations have been based - in this example, the failure of an engine has reduced the maximum long-range cruising height to Flight Level 194, which affects fuel consumption and, therefore, range. Any airport on the list can be directly selected, and there is a look-up list containing every airport in the database. Selecting an airport brings up the information on the right, showing the required and actual parameters for a diversion to that airport.
  • Supply the customer with the system and let them make it work, OR understand the issues ahead of time, leading to better requirement definition and a better match of the system to the requirements. KM - pattern and connection recognition, monitoring and situation assessment, a predictive capability, time criticality Communication and collaboration both within and across crisis management agencies and to public and media
  • Situational awareness or SA is an actor’s perception, understanding and appreciation of the state of his task environment. Good SA is needed to support effective decision-making and action. This is now a central concept in applied human factors research, and the assessment of individual SA is reasonably well developed. However, there is a need to pay attention to the added complexities of SA in teams. Within teams, some SA requirements will be specific to different individuals, while other aspects of SA must be shared across the group. In other words, the team should have a common perception and understanding and appreciation of certain aspects of the situation. And to achieve this, they will have to share information and coordinate their SA.
  • Clumsy - Inability of computer to solve complex critical problems
  • The overall aim is to optimise the system from both technical and human perspectives.

    1. Human Factors in Crisis Management. Liz Carver, Advanced Technology Centre, BAE SYSTEMS. [email_address]
    2.
       • Why do we need to worry about human factors?
       • An integrated approach is needed to take into account human performance at every stage of the crisis management system development lifecycle
       • Key challenges: virtual organisations, communication, situational awareness, failure modes, decision making under pressure
       • Conclusions
       © BAE SYSTEMS 2005 All rights reserved
    3. Managing the unpredictable: what should we expect?
       • Crisis management is in many ways a Catch-22 scenario: we want to try to predict the unpredictable; we learn lessons, and then it is different next time, or a completely new thing happens, and we have to deal with it.
       • Can we re-use the generic, and be ready, trained and agile enough to deal with the next challenge?
    4. Crisis and emergency scenarios
       • Cross-border international cooperation
         - Cross-border police cooperation
         - Euro-wide monitoring of non-EU citizens' migration streams
         - Secure European inbound and outbound commodity streams
       • Crisis management
         - Civil protection agencies: local, regional, national, international
       • Security of large events
       • Counteracting terrorist activities
       • Computer incursion
       • Law enforcement to support security
    5. Significance vs. Probability
       Chart: threat examples plotted by significance and probability (high/low axes) - thermonuclear attack, dirty bomb, biological release, chemical agent release, assassination, hijacking, hostage/ransom, car bomb, suicide bomb, cyber attack, MANPAD.
       The terrorist needs to be lucky once; we need to be lucky every time.
    6. The environment versus effective work
       • Critical tools failed
       • Deliberate efforts at infrastructure disruption
       • Lack of clean water
       • Sickness...
    7. Advice from the field
       "Based on our growing body of experience, we need:
       • reliable, robust, flexible, simple, inexpensive, shareable, peer-to-peer, securable, collaborative, wireless voice and data over the internet at decent bandwidth in any location in the world on short notice for all players immediately...
       • and experts available 24x7 for discussion...
       • and a resilient power source to run it all...
       or our leadership responsibilities will not be met."
       Eric Rasmussen, lessons learned from interviews on civil-military operations post-conflict, Operation Strong Angel. Key needs: communication and collaboration; knowledge.
    8. Knowledge assets: examples
       • 10-20-30 Document: lessons learned from interviews on civil-military operations post-conflict. Civil-Military Interaction Advice from Strong Angel. Eric Rasmussen, MD, FACP, Fleet Surgeon, Third Fleet, 27 June 2000, Rim of the Pacific 2000.
       • Centre for Army Lessons Learned: What does the Army know about hurricane clean-up? Resources required, types of equipment, challenges, FAQs...
       • Compilation of experience, simply and accessibly
    9. 10-20-30 document: 10 Commandments, 20 Recommendations, 30 Advisories
       • The military should not generally be in overall charge
       • Technology cannot substitute for personal interaction
       • Personalities are more important than processes
       • Know the culture and issues that surround you
       • Work on building communication networks as you begin to plan
       • Centralise planning and decentralise action
       • Co-ordinate everything with everybody to the greatest possible extent
       • Remember that the UN agencies are distinct, anarchic and highly effective
       • Senior commanders and staffs need education and training for non-traditional roles
       • Even in a seemingly simple operation there WILL BE more media and more politics than anticipated...
    10. Priorities
       • Enhance situational awareness
       • Collecting intelligence / assessing threats / information sharing
       • Improve planning, coordinating, and operating co-operatively
       'Integrated intelligence operations and community-wide information sharing are the exception rather than the norm.' House Intelligence Committee Authorisation Report FY03
    11. Strategic Challenges
       • The challenge to society: to form agile organisations, avoiding institutionalisation, that are capable of responding quickly and appropriately to an ever-uncertain and rapidly changing threat.
       • The challenge to the research and engineering community: to evolve new ways of working to form the socio-technology synergies that deliver a step change in capability.
    12. Virtual Organisation in Crisis Scenarios (diagram)
       Participants: police, fire, ambulance and other emergency services; military forces with a Joint Forces HQ (army, navy, air force); NGOs, PVOs and charities; media (TV, radio, text); and the public.
       Activities: threat assessment, collaborative action planning, implementation, and through-life plan and re-plan.
    13. Framework of Command/Management
       Dealing with Disaster (UK Home Office publication): http://www.homeoffice.gov.uk/docs2/dwdrevised.pdf
       Planning for sustained capability; planning for specific operations.
       • Strategic: establish strategic objectives and overall management framework; ensure long-term resourcing/expertise.
       • Tactical: determine priorities in obtaining and allocating resources; plan and co-ordinate the overall response.
       • Operational: the 'doers' - manage front-line operations.
    14. Importance of Virtual Organisations in HL/CM
       What do we mean by VO? ...a collaborative endeavour spanning enterprise (and possibly national) boundaries in which participating entities pool resources, capabilities and information in order to achieve specific objectives.
       The context:
       • To collect, access and understand a vast amount of information
       • To collaborate across multiple agencies to act on that information
       • where there is time pressure, very high importance, high risk, high uncertainty, and large consequences for failure
       Current problems:
       • Connectivity OK within organisations, but patchy across organisations
       • Internet security not always trusted
       • Huge data crunching required
       • Data gathered by one organisation cannot always be used or even accessed by another (firewalls)
       • Complex analysis tools - data mining, visualisation, data fusion
       • Understanding the meaning - picking out patterns, normal versus threat (some of this can be automated, but remember the vital role of the people in the system)
       • Communication - different cultures, terminology, ways of working, isolation, information and knowledge flows
       • Virtual collaboration difficult
    15. Requirements to:
       • CONNECT: infrastructure - vertical and horizontal, heterogeneous
       • COMMUNICATE: access to, and sharing of, data, information and knowledge; ontologies - be able to share information which is transferred with its meaning and/or context; understand the difficulties in communicating
       • COLLABORATE: within a virtual organisation; socio-technical aspects including...
         - communication strategy
         - who to share with, what to share
         - knowing and understanding the value of information
         - formal and informal communication routes
         - isolation and situation awareness
         - trust/confidence in the data, the people and the process
         - cultural differences...
       Connect → Communicate → Collaborate → Achieve goals
    16. Global collection and sharing of data via GRID (based on a real example dealing with engine maintenance)
       • GRID-enabled virtual organisation: data centres, analysis centre, operations management and command chain, site(s) of incidents/attacks, multiple connections
       • Agencies working as one team: governments, intelligence services, emergency services, military forces
       • Ontologies; distributed diagnostic tools; plug-and-play diagnostic analysis toolsets
       • Integration into existing processes and systems; access to global subject matter experts
       • World-wide data collection: agreements for sharing, cross-border co-operation, cross-agency co-operation
       • Historic and political data, high-risk personnel data, sensor data, tracking of finances/people/materials, etc.
    17. Collaboration in Virtual Organisations: not just the technology... but also the risks associated with 'not knowing' or 'not understanding'
       • Dispersed nature of the organisation: multi-agency, multi-location, multi-organisation, multi-nationality
       • Cultures: expectations, ways of working, language, values and beliefs; cultural learning
       • Knowledge exploitation: knowledge capture, sharing and re-use; turning the tacit explicit; tacit-to-tacit sharing; access, understanding
       • Communication: information flow, decision making, SA; information overload, breadth versus depth; trust - in people, in information and in the technology; feedback loops (check/confirm); leadership and organisational structure; shared and understood goals (vertical/horizontal); collaborative spaces, tools, methods; media, richness, isolation, flow; formal vs informal; motivation; intent; failure modes - e.g. e-mail
    18. Systems approach - a balance: effective crisis management rests on people, process and technology.
       Process:
       • Information and knowledge flows
       • Network structure
       • Management of data, information and knowledge: access, security
       • Reporting procedures
       • Cross-agency standardisation and protocols
       • Understanding the requirements
       • Functional allocation - what are the people best at? What is the technology best suited to? Where are the best hybrids?
       • Where are the operational failure modes? How can we ameliorate these - technology, training, organisational design, communication strategy?
       Technology:
       • Ontologies - negotiation strategies between data sets
       • Intelligent agents for information fusion - software-agent-supported activities; dealing with huge amounts of data: fusion, pattern extraction, making connections
       • Displays and interfaces that support sensemaking; analysis and visualisation tools
       • Tools to support decision making and collaborative planning, e.g. Bayesian networks for status assessment
       People:
       • Cognitive processes and behaviours
       • Knowledge capture, sharing and re-use - learning from experience
       • Organisational structures - power distance
       • Team situation awareness, shared situation awareness and distributed cognition
       • Multi-agency, multi-culture communication and collaboration
       • Understanding impacts and costs
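[Editorial illustration] Slide 18 mentions Bayesian networks for status assessment. As a minimal sketch only, and not the tooling described in the presentation, the Python below applies Bayes' rule to update belief in a single threat hypothesis as evidence reports arrive; the hypothesis, priors and likelihood values are hypothetical.

    # Minimal Bayesian status-assessment sketch (illustrative only).
    # The priors and likelihoods below are hypothetical examples.

    def update(prior: float, p_evidence_given_h: float, p_evidence_given_not_h: float) -> float:
        """Return P(H | evidence) via Bayes' rule for a binary hypothesis H."""
        numerator = p_evidence_given_h * prior
        denominator = numerator + p_evidence_given_not_h * (1.0 - prior)
        return numerator / denominator

    # Prior belief that an incident report indicates a genuine threat.
    belief = 0.05

    # Each tuple: (P(report | threat), P(report | no threat)) for an incoming cue.
    evidence_stream = [
        (0.80, 0.10),   # e.g. a sensor alert that is rare under normal conditions
        (0.60, 0.30),   # e.g. a corroborating eyewitness report
    ]

    for p_given_h, p_given_not_h in evidence_stream:
        belief = update(belief, p_given_h, p_given_not_h)
        print(f"updated belief in threat: {belief:.2f}")

A full Bayesian network would generalise this single-hypothesis update across many linked status variables.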
    19. Task analysis and functional allocation: man and machine capabilities (Fitts, 1951); cognitive task analysis
       What MAN is good at:
       • Detecting small amounts of visual or acoustic energy
       • Perceiving patterns of light or sound
       • Detecting signals in high noise levels
       • Improvisation and use of flexible procedures
       • Storing very large amounts of information for long periods and recalling relevant facts at the appropriate time
       • Inductive reasoning
       • Exercising judgement
       • Reacting to unexpected, low-probability events
       • Applying originality in solving problems, i.e. generating alternative solutions
       • Profiting from experience and altering course of action accordingly
       • Performing fine manipulation, especially where misalignment appears unexpectedly
       What MACHINES are good at:
       • Responding quickly to control signals and applying great force smoothly and precisely
       • Performing repetitive, routine tasks
       • Monitoring tasks (both man and machine)
       • Storing and recalling large amounts of information in a short time period
       • Storing information briefly and then erasing it completely
       • Reasoning deductively, including computational ability
       • Handling highly complex operations and rapid computations with high accuracy
       • Doing many different things at once
       • Insensitivity to extraneous factors
       • Repeating operations very rapidly, continuously and in precisely the same way over a long time period
       • Operating in environments which are hostile to man or are beyond human tolerance
       • Sensitivity to stimuli outside the human range, e.g. infra-red
    20. Impacts of new technology systems
       Automation benefits:
       • Common approaches
       • Standardised quality
       • Reduction in error
       • Capitalise on previous experience in storing good practice and ideas
       • Reduced task time
       • Visibility of information to the wider team
       Hidden impacts that may not be immediately obvious:
       • Potentially different sources of error which may not be traceable
       • Reduced innovation in solution development
       • Reduced situation awareness
       • Short cuts and non-standard workarounds to overcome shortcomings in the software
       • Increased functionality which may or may not be required
       • Integration of the automated system with other relevant systems - errors propagate across many systems
    21. The development of cockpit displays (images: 1920s, 1940s and 1980s cockpits)
    22. Situation Awareness
       • Noticing what is happening...
       • Understanding what it all means...
       • Assessing implications for responses...
       SA vulnerabilities:
       • Automation / mode awareness
       • Unexpected events
       • Perceptual distortions
       Who is that? What is he doing? Why? What should I do? What do I need to know?
    23. Situation awareness states (actual state of SA versus the self-perceived state of SA)
       • Actual SA good, perceived as good: appropriate confidence
       • Actual SA poor, perceived as good: inappropriate confidence
       • Actual SA poor, perceived as poor: appropriate caution
       • Actual SA good, perceived as poor: inappropriate caution
    24. SA in Teams
       Shared SA = the amount of correct SA held in common by multiple team members. Overlapping areas represent the correct SA that is shared by individuals.
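[Editorial illustration] If each team member's SA is represented as the set of situation elements they correctly hold, the shared SA on slide 24 can be read as the overlap of those sets. A minimal sketch, with hypothetical member names and SA elements:

    # Shared SA as set overlap: illustrative only, with hypothetical SA elements.

    team_sa = {
        "commander": {"casualty count", "road closures", "weather", "resource ETA"},
        "fire":      {"casualty count", "road closures", "hazard location"},
        "police":    {"road closures", "casualty count", "crowd status"},
    }

    # Shared SA of the whole team: elements correctly held in common by everyone.
    shared_all = set.intersection(*team_sa.values())

    # Pairwise overlap, e.g. between the fire and police members.
    shared_fire_police = team_sa["fire"] & team_sa["police"]

    print("shared by all:", shared_all)            # {'casualty count', 'road closures'}
    print("fire & police:", shared_fire_police)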
    25. Commander's Intent / EBO / E-Briefing
       • Common intent: the aim, objective or desired intent as explicitly articulated by the commander, as well as the implicit intent - i.e. the interpretation of the meaning behind the explicit intent, which is affected by role, training, doctrine, expectation and so on.
       • Effects-based operations: taking into account the unintended effects of a specific plan of action from the beginning - i.e. ensuring that the desired effects are the outcome, with as few of the negative effects as possible. Use of operational net assessment - nodes, links and relationships between nodes - to assess the likelihood of effects.
       • Common intent drives the outcomes, since the resulting plan should embody and express the commander's intent.
       • E-Briefing: how to get the commander's intent over to groups of people who are dispersed?
    26. Objectives for a Combined Response (the doctrine - rules of engagement)
       • saving and protecting life
       • relieving suffering
       • protecting property
       • providing the public with information
       • containing the emergency - limiting its escalation or spread
       • maintaining critical services
       • maintaining normal services at an appropriate level
       • protecting the health and safety of personnel
       • safeguarding the environment
       • facilitating investigations and inquiries
       • promoting self-help and recovery
       • restoring normality as soon as possible
       • evaluating the response and identifying lessons to be learned
    27. Understanding the thinking of military commanders in a coalition (multinational: UK, US, FR, GE, CA, AU), MNE3, Feb '04
       • Effects-based operations need alignment of the instruments of power (MOD, FCO, DFID, HMCE)
       • All groups need to understand the Coalition Commander's intentions
       • OGDs understand the Commander, but this highlights that others in the HQ do not understand as well
    28. Communication challenges
       Media spectrum (decreasing communication richness, decreasing emotional content): face to face; video conferencing (Access Grid, Netmeeting/Webex, Centric studio); teleconferencing (cell phone?); text.
       • Face to face: closed-loop, tightly coupled, very rich communication, verbal and non-verbal; tight feedback loop and short response time.
       • Leaner media: discrete activities for message preparation and sending, with possibilities for misunderstandings and errors; reduced emotional content and ability to motivate; a requirement for interactivity to develop clarity.
    29. Failure modes
       e.g. for communication, for data exchange, for analysis and diagnosis, for the logistics chain, for maintenance. There will be organisational errors, system errors, and human errors.
       For example, e-mail: message preparation → message sending/transmission → message receiving → message reading/viewing → message understanding → message response → receipt of response; plus message prioritisation and message selection.
    30. Taxonomy of communication problems (Grayson and Billings, 1981)
       Category | % | Definition
       Absent (not sent) | 37.2 | Failure to originate or transmit a required or appropriate message
       Other inaccuracies in content | 14.8 | Erroneous data (formulation errors), errors in judgement, conflicting interpretation
       Untimely transmission | 13.3 | Message not useful to the recipient because it arrived too early or too late
       Recipient not monitoring | 10.3 | Failure to maintain listening watch, proper lookout, or read available correct information
       Ambiguous phraseology | 9.9 | Message composition, phraseology, or presentation could lead to misinterpretation or misunderstanding by the recipient
       Incomplete content | 5.5 | Originator failed to provide all the information necessary for the recipient to understand the communication
       Garbled phraseology | 3.2 | Content of the message lost or severely distorted to the point where the recipient could not understand the intended message
       Absent (equipment failure) | 2.9 | Equipment malfunction resulting in a complete loss of a message
       Mis-interpretable (phonetic similarity) | 1.6 | Similar-sounding names or numerics led to confusion in meaning or in the identity of the intended recipient
    31. Information Processing Failures
       As a result of communication and information processing:
       • Search - did not even see it, therefore not known: 25%
       • Detection - saw it but did not understand it, so not understood: 30%
       • Interpretation - seen but not used, so not valued: 45%
       (Gale, baggage searching tasks, 2005)
       References to 9/11 and to the Columbia shuttle disaster.
    32. Dimensions of information trust
       • Disposition: innate tendency to trust information
       • Habit: acquired tendency to trust information
       • Cognition: perceived trustworthiness of information
       • Decision: conscious decision to trust ambiguous information
       • Affect: emotional influences associated with information
       • Volition: willingness to accept information
       • Behaviour: information usage
       Individual differences in dealing with information, and variation over time; setting confidence limits and accept/reject criteria.
    33. Attributes of the information itself
       Perceived trustworthiness of information (cognition) is reflected in the criteria used to evaluate information quality (Chopra and Wallace, 2002):
       • Perceived security: network security and data protection safeguards
       • Perceived stability: unlikelihood of accidental loss or alteration
       • Perceived credibility: accuracy, currency, validity, coverage, believability
       • Perceived objectivity: freedom from human bias, distortion
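[Editorial illustration] Purely as a sketch of how the four perceived-quality criteria on slide 33 might be operationalised as a checklist for an incoming item of information; the 0-1 ratings, the equal weighting and the idea of averaging are assumptions made here, not part of Chopra and Wallace's work or the presentation.

    # Illustrative only: a naive checklist score over the four perceived-quality
    # criteria. The ratings and the equal weighting are assumptions.

    from dataclasses import dataclass

    @dataclass
    class InformationQuality:
        security: float     # network security / data protection safeguards
        stability: float    # unlikelihood of accidental loss or alteration
        credibility: float  # accuracy, currency, validity, coverage, believability
        objectivity: float  # freedom from human bias or distortion

        def score(self) -> float:
            """Unweighted mean of the four perceived-quality ratings."""
            return (self.security + self.stability + self.credibility + self.objectivity) / 4.0

    report = InformationQuality(security=0.9, stability=0.8, credibility=0.6, objectivity=0.5)
    print(f"perceived quality score: {report.score():.2f}")  # 0.70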
    34. Transformation of information during its lifecycle
       How does the information change as it flows round the system?
       • Verbal (public, vernacular)
       • Text, notes
       • Translation/codification (common shared language, short codes)
       • Edited, codified, filtered: selection/emphasis, summarised, translated
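[Editorial illustration] One hypothetical way to make the transformations on slide 34 visible in a system is to carry a provenance trail with each item of information as it is codified, summarised and filtered. The sketch below is an assumption-laden illustration of that idea, not a description of any tool in the presentation.

    # Hypothetical sketch: recording how an item of information is transformed
    # (codified, edited, filtered) as it flows through the system.

    from dataclasses import dataclass, field

    @dataclass
    class InfoItem:
        content: str
        provenance: list[str] = field(default_factory=list)

        def transform(self, new_content: str, step: str) -> "InfoItem":
            """Return a new item with updated content and an extended provenance trail."""
            return InfoItem(new_content, self.provenance + [step])

    item = InfoItem("Caller reports smoke and several people trapped on the third floor.")
    item = item.transform("SMOKE + PERSONS REPORTED, 3RD FLR", "codified to control-room shorthand")
    item = item.transform("Persons reported at incident", "summarised for the tasking log")

    print(item.content)
    for step in item.provenance:
        print(" -", step)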
    35. ENHANCE collaborative working demonstrator
       System Integrator (prime contractor), Structures Manager (sub-contractor) and System Manager (sub-contractor) connected over the Internet, sharing DMU geometry, whiteboard, telephone, shared applications, text chat, file transfer and video.
    36. ENHANCE DMU study - Structural Engineer
    37. DMU Demonstrator - confidence in being understood (chart: rated from less confident to more confident)
    38. DMU Demonstrator - ease of understanding with respect to media (chart: rated from more difficult to easier to understand)
    39. Understanding the picture
       • Understanding: building the picture vs being given the picture; readiness to listen, capability to understand, mapping; what sense to make of the quantities of data and information collected
       • Knowledge sharing and re-use
       Karl Weick, 'imagination' versus 'fancy' (OKLC, Boston, 2005): imagination is not a gift usually associated with bureaucracy; susceptibility to mirror your own expectation in the data; ...in new situations we turn to Tom Clancy for inspiration about what to do...
       'In family' and 'out of family' problems: a reportable problem, previously experienced, analysed and understood, versus unreportable or not-reportable events, events not understood - not even sure that there was an event...
    40. Socio-technical checklist for virtual team-working: data exchange
       • Did the technology adequately support the detail of the information being shared?
       • Did all parties have access to the same support and analysis tools? With the same degree of functionality?
       • Were there adequate translators between databases to allow transparent reading by all?
       • Were participants able to interact with, modify and offer alternatives to the shared data, and to log the changes for future reference?
       • Was there sufficient opportunity for participants to question the rationale behind decisions?
       • Were participants aware of any risks of errors? Is this any greater or less than the risk when interacting with a data set alone?
       • Were participants aware of the assumptions made in any processing of raw data? Can they find out? Are they aware of any limitations this might mean for re-use?
    41. Decision Making Models
       Traditional decision-making theory: identify the problem, generate a set of options for solving the problem / choice alternatives, evaluate, choose and implement - e.g. OODA: Observe, Orient, Decide, Act.
       But what about when...? (Klein et al., 1993)
       • The goals are ill-defined, and tasks ill-structured
       • Uncertainty, ambiguity and missing data
       • Shifting and competing goals
       • Dynamic and continually changing conditions
       • Action feedback loops - need to update, replan, react to changing conditions
       • Time stress
       • High stakes
       • Multiple players
       • Organisational goals and norms - cultural differences
       • Experienced decision makers
    42. Attributes of the recognition-primed decision model (Klein, 1993)
       • Focus on situation assessment
       • A 'good enough' rather than an 'optimised' solution is the goal
       • The first option (if deemed workable) is usually adopted by experienced decision makers
       • Serial generation and evaluation of options
       • Check that an option will work by simulating it mentally
       • Focus on elaborating and improving the system
       • The decision maker is primed to act
       • In practice, commanders were not even comparing two options - they were acting and reacting on the basis of their prior experience. (Rhona Flin)
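[Editorial illustration] Read as a procedure, the recognition-primed model on slide 42 generates options serially in the order experience suggests them, mentally simulates each, and adopts the first one judged workable rather than comparing a full option set. A minimal sketch of that control flow; the options and the workability check are hypothetical.

    # Minimal sketch of recognition-primed decision logic: serial generation,
    # mental simulation, adopt the first workable option. Illustrative only.

    from typing import Callable, Iterable, Optional

    def recognition_primed_choice(
        options_in_recognition_order: Iterable[str],
        seems_workable: Callable[[str], bool],
    ) -> Optional[str]:
        """Return the first option that passes the (mental) simulation check."""
        for option in options_in_recognition_order:
            if seems_workable(option):
                return option          # 'good enough', not optimised
        return None                    # no recognised option works: reassess the situation

    # Experience suggests options in this order for a hypothetical building-fire scenario.
    options = ["interior attack", "defensive attack from outside", "withdraw and contain"]

    def simulate(option: str) -> bool:
        # Stand-in for the commander's mental simulation against the current picture.
        return option != "interior attack"   # e.g. structure judged unsafe to enter

    print(recognition_primed_choice(options, simulate))   # defensive attack from outside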
    43. London Fire and Emergency Planning Authority: decision-making model
       • Gathering and thinking: information about the task or event, information about resources, information about risk and benefit
       • Deciding: objectives, plan, evaluating
       • Acting: communicating, controlling
       • Outcome
    44. Cognitive Task Analysis
       Sample data for an engine-failure-in-flight scenario:
       • Is the runway long enough? Weight on arrival, hydraulics status...
       • How long would it take to get there? Altitude, distance, engine status...
       • How much would this diversion cost the carrier? Additional fuel, fees, passenger accommodation...
       • What facilities are there at the airport? Emergency facilities, maintenance facilities...
    45. Use of a Cognitive Task Model in Cognitive Engineering
       Ways of using the model:
       • Automating human functions
       • Recommending the 'best' option
       • Providing a prioritised list of options
       • Asking the model 'what do you think?' of an option
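[Editorial illustration] The speaker notes describe the 'prioritised list of options' idea as assessing every airport in a database against required versus actual parameters and presenting the top six candidates. Purely as an illustrative sketch of that idea, the code below ranks hypothetical airports by a simple feasibility check; the airports, figures, field names and ordering rule are assumptions, not the trialled system.

    # Illustrative diversion-ranking sketch. Airports, figures and the scoring
    # rule are hypothetical; the real tool's model and data are not shown here.

    from dataclasses import dataclass

    @dataclass
    class Airport:
        name: str
        distance_nm: float        # distance from present position
        runway_length_m: float
        emergency_cover: bool

    def feasible(a: Airport, required_runway_m: float, max_range_nm: float) -> bool:
        """Compare required parameters (landing distance, remaining range) with actual values."""
        return a.runway_length_m >= required_runway_m and a.distance_nm <= max_range_nm

    def rank_diversions(airports, required_runway_m, max_range_nm, top_n=6):
        candidates = [a for a in airports if feasible(a, required_runway_m, max_range_nm)]
        # Prefer nearby fields with emergency cover (one possible, simplistic ordering).
        candidates.sort(key=lambda a: (not a.emergency_cover, a.distance_nm))
        return candidates[:top_n]

    airports = [
        Airport("ALPHA", 120, 2400, True),
        Airport("BRAVO", 90, 1700, True),     # runway too short for this failure case
        Airport("CHARLIE", 200, 3000, False),
    ]

    for a in rank_diversions(airports, required_runway_m=2200, max_range_nm=250):
        print(a.name, a.distance_nm, "nm")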
    46. Crisis lifecycle - operator support
       Building the picture → understanding the picture → changing the picture in a goal-oriented fashion.
       Lifecycle stages: surveillance, crisis identification, crisis aversion/mitigation, action planning (what), implementation plan (how), through-crisis plan and re-plan.
       Supporting capabilities:
       • Communication infrastructure + communication protocols + communication behaviours
       • Information fusion; data display and visualisation; data and information fusion - how to display and visualise value, impacts, reliability, currency, etc.
       • Situation awareness and HCI throughout
       • Decision support - cognitive task analysis and modelling to aid multi-criteria decision analysis
       • Collaborative planning; 'what if' scenario testing; impact assessment; effects-based planning
       • Ontology-based information exchange; knowledge sharing - with whom? how much? push and pull
       • Risk and threat assessment; risk perception
       • E-briefing and tasking; logistics and resource allocation
       • Metrics for trust and confidence in information; assessment of reliability
       • Prioritisation and dealing with information overload
    47. Challenges
       How to work together...
       • multi-site, multi-discipline, multi-organisation, multi-national
       • how to handle the vast amounts of information
       • value the informal communication routes
       • how to get the functional allocation right
       • intent, SA, decision-making support, experiential training
       • how to deal with very different levels of sophistication, cultural differences...
       Visualisation of reality and support for understanding and sense making:
       • not just a black box
       • information trust
       • checks and balances - error capture, tracking
       • confidence limits / assumptions
    48. Conclusions
       • A systems view is required: collaboration doesn't happen on its own; knowledge sharing doesn't happen on its own
       • The systems we build must enable these activities to both happen and flourish
       • Systems approach - people, technology and process
       • HF tools, models and analysis techniques have much to offer throughout the system development lifecycle
       • Much research is out there - it needs to be implemented and the lessons learned applied
    49. Thanks for listening!
    50. CM and related topics - project links
       • ORCHESTRA - www.eu-orchestra.org - improving risk management. The overall goal of ORCHESTRA is to design and implement an open service-oriented software architecture that will improve interoperability among the actors involved in multi-risk management.
       • GMES - http://www.gmes.info - Global Monitoring for Environment and Security
       • INSPIRE - http://inspire.jrc.it/state_of_play.cfm
       • WIN - http://www.win-eu.org - Wide Information Network: a community information system for risk management providing access to international humanitarian aid, alert services, the international charter on space and major disasters, user communities and so on.
       • OASIS - http://www.oasis-fp6.org - open advanced system for improved crisis management
       • EGERIS - http://www.egeris.org/ - European Generic Emergency Response Information System: planning and tasking, situation monitoring and decision support (forest fire, flood and earthquake scenarios)
       • MOSAIC - http://www.mosaic-network.org/ - Mobile Worker Support Environments: mobile technologies, applications and visions
    51. What does a successful system look like?
       • It does what it says on the tin
       • It is interoperable with existing systems and with other current and potential systems in the network
       • It is acceptable to the users: it does what they want it to, and it allows them to do what they need to do in an intuitive and effective manner, supporting their behaviours
    53. Learning from experience: a Do → Reflect → Learn → Adapt cycle
       Questions asked around the cycle: there is a problem - what can I do about it? Do I know what to do? Do I need to find out something new? What do I need? Does it match what I need? What worked? What didn't work? Why did it work? Why didn't it? What will be the impact of the changes?
    54. Situational Awareness in Teams (diagram)
       Each team member runs an observation → orientation → decision-making → action loop on the situation, maintaining individual SA; SA among multiple team members is built through information sharing and coordination of SA.
    55. Attributes of (poor) automation
       • Strong - can behave autonomously
       • Silent - provides inadequate feedback about activities and intentions
       • Clumsy - non-intuitive, interrupts activity, causes frustration
       • Obstructive - difficult for the automation to be reconfigured in the desired way
       • Brittle - fails to degrade gracefully
    56. HCI Checklist
    57. Socio-technical evaluation
       Organisational impacts:
       • Management styles
       • Team structures
       • Cultural differences
       • Communication issues - information flow
       • Knowledge exploitation
       Task-level impacts:
       • Human Computer Interaction (HCI)
       • Workload, situation awareness
       • Error management
       • Automation issues
       • Functional allocation
       Identification of user requirements, target audience descriptions, cultural assessments, knowledge and information flow, SNA, decision-making support, visualisation, guidelines and evaluation checklists.
    58. Human Factors considerations in the design, implementation and evaluation process
       • System design process: the design process should enable the involvement of a wide range of people so that all necessary experience is utilised.
       • Allocation of system functions: the allocation of system functions between humans and machines should aim to meet both the human and technical requirements of the system, and should not be solely technically driven.
       • Information and control systems: the local information and control systems should be designed to meet the operational needs of the people working in the system.
       • Job design: the human functions in the system should be in jobs which maximise the flexibility, responsiveness, mental health and motivation of those operating and supporting the system.
       • Organisation structure: the local organisation structure within which the system operates should provide sufficient local control, flexibility and support to ensure optimum efficiency.
       • Hardware and software: all hardware and software in the system should be designed and located for ease of use, efficiency, safety and comfort.
       • Health and safety: a safe, healthy and comfortable working environment should be provided.
    59. Information needs
       8.25 "Organisations must consider both their own and other organisations' information needs:
       • who needs information
       • what information is needed
       • when it is needed
       • what channels are available and most suitable
       • how to convey information clearly and unambiguously
       • how to keep full and proper records
       • how to build in feedback mechanisms that ensure information is not only received, but also understood and acted upon appropriately."
       Dealing with Disaster (UK Home Office publication): http://www.homeoffice.gov.uk/docs2/dwdrevised.pdf
    60. DMU Demonstrator - ease of communication (chart: rated from more difficult to easier)
    61. Data, information management, knowledge sharing and exploitation
       • Communication, situation awareness (both shared and individual), shared view of intent
       • Dealing with large amounts of data and information
       • Technologies and behaviours
       • Tacit and explicit
       • Formal and informal
       • Completeness, reliability, currency
       • Information trust, people trust
       • Capture, sharing, storing, re-use
       • Visualisation
       • Sensemaking, understanding, readiness to see/understand
