The concept of “Resilience Analytics” has recently been proposed as a means to leverage the promise of big data to improve the resilience of interdependent critical infrastructure systems and the communities they support. Given recent advances in machine learning and other data-driven analytic techniques, as well as the prevalence of high-profile natural and man-made disasters, the temptation to pursue resilience analytics without question is almost overwhelming. Indeed, we find big data analytics capable of supporting resilience to rare, situational surprises captured in analytic models. Nonetheless, this article examines the efficacy of resilience analytics by answering a single motivating question: can big data analytics help cyber-physical-social (CPS) systems adapt to surprise? This article explains the limitations of resilience analytics when critical infrastructure systems are challenged by fundamental surprises never conceived of during model development. In such cases, adopting resilience analytics may prove either useless for decision support or actively harmful by increasing danger during unprecedented events. We demonstrate that these dangers are not limited to a single CPS context by highlighting the limits of analytic models during hurricanes, dam failures, blackouts, and stock market crashes. We conclude that resilience analytics alone cannot adapt to the very events that motivate their use and may, ironically, make CPS systems more vulnerable. We present avenues for future research to address this deficiency, with emphasis on improvisation as a means of adapting CPS systems to fundamental surprise.
Recently, the machine learning community has expressed strong interest in applying latent variable modeling strategies to causal inference problems with unobserved confounding. Here, I discuss one of the big debates that occurred over the past year, and how we can move forward. I will focus specifically on the failure of point identification in this setting, and discuss how this can be used to design flexible sensitivity analyses that cleanly separate identified and unidentified components of the causal model.
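The failure of point identification mentioned above can be made concrete with a small sketch: with a binary treatment and outcome and unrestricted unobserved confounding, the observed data pin down only worst-case (Manski-style) bounds on the average treatment effect, not a point value. This is a minimal illustration of the identified/unidentified split, not code from the talk, and the numbers are hypothetical.

```python
def manski_ate_bounds(p_t1, p_y1_given_t1, p_y1_given_t0):
    """Worst-case bounds on the ATE for a binary outcome when
    confounding is unrestricted (no assumptions on the unobservables)."""
    p_t0 = 1.0 - p_t1
    # E[Y(1)] is observed only on the treated arm; for the untreated arm
    # the potential outcome under treatment can lie anywhere in [0, 1].
    ey1_lo = p_y1_given_t1 * p_t1
    ey1_hi = p_y1_given_t1 * p_t1 + p_t0
    # Symmetrically for E[Y(0)] on the treated arm.
    ey0_lo = p_y1_given_t0 * p_t0
    ey0_hi = p_y1_given_t0 * p_t0 + p_t1
    return ey1_lo - ey0_hi, ey1_hi - ey0_lo

# Illustrative observed quantities: P(T=1)=0.5, P(Y=1|T=1)=0.8, P(Y=1|T=0)=0.2.
lo, hi = manski_ate_bounds(0.5, 0.8, 0.2)
# The naive difference in means is 0.6, but without assumptions on the
# confounder the identified interval is roughly [-0.2, 0.8].
print(lo, hi)
```

A sensitivity analysis of the kind the abstract describes proceeds by shrinking this interval as interpretable restrictions on the unobserved confounding are imposed, cleanly separating what the data identify from what is assumed.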
I will discuss paradigmatic statistical models of inference and learning from high dimensional data, such as sparse PCA and the perceptron neural network, in the sub-linear sparsity regime. In this limit the underlying hidden signal, i.e., the low-rank matrix in PCA or the neural network weights, has a number of non-zero components that scales sub-linearly with the total dimension of the vector. I will provide explicit low-dimensional variational formulas for the asymptotic mutual information between the signal and the data in suitable sparse limits. In the setting of support recovery these formulas imply sharp 0-1 phase transitions for the asymptotic minimum mean-square-error (or generalization error in the neural network setting). A similar phase transition was analyzed recently in the context of sparse high-dimensional linear regression by Reeves et al.
GDRR Opening Workshop - Rethinking Resilience Analytics - Daniel Eisenberg, August 5, 2019
This document provides an overview of the scientific method and importance of research skills. It discusses how human knowledge is often flawed and unscientific, developing through guesses and speculation rather than scientific evidence. It then covers the scientific method, which involves generating hypotheses based on initial observations and theories, then collecting data to test the theories. The document outlines the key stages of the scientific method and importance of validity, reliability and analyzing data to make inferences beyond the sample. Finally, it discusses discussing the implications of findings and how they can inform future research directions.
The document discusses scientific method and research. It defines scientific method as integrating deductive and inductive reasoning to systematically study problems through hypothesis formulation, evidence collection, and hypothesis testing. Research is defined as the objective analysis and recording of controlled observations that can lead to generalizations, principles, or theories. The key aspects of research discussed include formulating problems, developing hypotheses, collecting evidence, analyzing data, and reporting results. Both qualitative and quantitative research approaches are described.
This document discusses the nature and structure of science. It defines science as a systematic and organized body of knowledge accumulated through empirical observations and experimentation. The key aspects covered include:
- Science has both a product form (organized knowledge) and process form (scientific method of inquiry).
- Scientific knowledge is structured in a hierarchy from facts to concepts to generalizations to theories and laws.
- The scientific method involves systematic steps like observation, hypothesis testing, and theory building.
- Science aims to describe, predict, and further our understanding of nature in an objective and evidence-based manner. Scientific findings are also subject to change with new evidence.
Similar to GDRR Opening Workshop - Rethinking Resilience Analytics - Daniel Eisenberg, August 5, 2019 (20)
Recently, the machine learning community has expressed strong interest in applying latent variable modeling strategies to causal inference problems with unobserved confounding. Here, I discuss one of the big debates that occurred over the past year, and how we can move forward. I will focus specifically on the failure of point identification in this setting, and discuss how this can be used to design flexible sensitivity analyses that cleanly separate identified and unidentified components of the causal model.
I will discuss paradigmatic statistical models of inference and learning from high dimensional data, such as sparse PCA and the perceptron neural network, in the sub-linear sparsity regime. In this limit the underlying hidden signal, i.e., the low-rank matrix in PCA or the neural network weights, has a number of non-zero components that scales sub-linearly with the total dimension of the vector. I will provide explicit low-dimensional variational formulas for the asymptotic mutual information between the signal and the data in suitable sparse limits. In the setting of support recovery these formulas imply sharp 0-1 phase transitions for the asymptotic minimum mean-square-error (or generalization error in the neural network setting). A similar phase transition was analyzed recently in the context of sparse high-dimensional linear regression by Reeves et al.
Many different measurement techniques are used to record neural activity in the brains of different organisms, including fMRI, EEG, MEG, lightsheet microscopy and direct recordings with electrodes. Each of these measurement modes have their advantages and disadvantages concerning the resolution of the data in space and time, the directness of measurement of the neural activity and which organisms they can be applied to. For some of these modes and for some organisms, significant amounts of data are now available in large standardized open-source datasets. I will report on our efforts to apply causal discovery algorithms to, among others, fMRI data from the Human Connectome Project, and to lightsheet microscopy data from zebrafish larvae. In particular, I will focus on the challenges we have faced both in terms of the nature of the data and the computational features of the discovery algorithms, as well as the modeling of experimental interventions.
1) The document presents a statistical modeling approach called targeted smooth Bayesian causal forests (tsbcf) to smoothly estimate heterogeneous treatment effects over gestational age using observational data from early medical abortion regimens.
2) The tsbcf method extends Bayesian additive regression trees (BART) to estimate treatment effects that evolve smoothly over gestational age, while allowing for heterogeneous effects across patient subgroups.
3) The tsbcf analysis of early medical abortion regimen data found the simultaneous administration to be similarly effective overall to the interval administration, but identified some patient subgroups where effectiveness may vary more over gestational age.
Difference-in-differences is a widely used evaluation strategy that draws causal inference from observational panel data. Its causal identification relies on the assumption of parallel trends, which is scale-dependent and may be questionable in some applications. A common alternative is a regression model that adjusts for the lagged dependent variable, which rests on the assumption of ignorability conditional on past outcomes. In the context of linear models, Angrist and Pischke (2009) show that the difference-in-differences and lagged-dependent-variable regression estimates have a bracketing relationship. Namely, for a true positive effect, if ignorability is correct, then mistakenly assuming parallel trends will overestimate the effect; in contrast, if the parallel trends assumption is correct, then mistakenly assuming ignorability will underestimate the effect. We show that the same bracketing relationship holds in general nonparametric (model-free) settings. We also extend the result to semiparametric estimation based on inverse probability weighting.
We develop sensitivity analyses for weak nulls in matched observational studies while allowing unit-level treatment effects to vary. In contrast to randomized experiments and paired observational studies, we show for general matched designs that over a large class of test statistics, any valid sensitivity analysis for the weak null must be unnecessarily conservative if Fisher's sharp null of no treatment effect for any individual also holds. We present a sensitivity analysis valid for the weak null, and illustrate why it is conservative if the sharp null holds through connections to inverse probability weighted estimators. An alternative procedure is presented that is asymptotically sharp if treatment effects are constant, and is valid for the weak null under additional assumptions which may be deemed reasonable by practitioners. The methods may be applied to matched observational studies constructed using any optimal without-replacement matching algorithm, allowing practitioners to assess robustness to hidden bias while allowing for treatment effect heterogeneity.
This document discusses difference-in-differences (DiD) analysis, a quasi-experimental method used to estimate treatment effects. The author notes that while widely applicable, DiD relies on strong assumptions about the counterfactual. She recommends approaches like matching on observed variables between similar populations, thoughtfully specifying regression models to adjust for confounding factors, testing for parallel pre-treatment trends under different assumptions, and considering more complex models that allow for different types of changes over time. The overall message is that DiD requires careful consideration and testing of its underlying assumptions to draw valid causal conclusions.
We present recent advances and statistical developments for evaluating Dynamic Treatment Regimes (DTR), which allow the treatment to be dynamically tailored according to evolving subject-level data. Identification of an optimal DTR is a key component for precision medicine and personalized health care. Specific topics covered in this talk include several recent projects with robust and flexible methods developed for the above research area. We will first introduce a dynamic statistical learning method, adaptive contrast weighted learning (ACWL), which combines doubly robust semiparametric regression estimators with flexible machine learning methods. We will further develop a tree-based reinforcement learning (T-RL) method, which builds an unsupervised decision tree that maintains the nature of batch-mode reinforcement learning. Unlike ACWL, T-RL handles the optimization problem with multiple treatment comparisons directly through a purity measure constructed with augmented inverse probability weighted estimators. T-RL is robust, efficient and easy to interpret for the identification of optimal DTRs. However, ACWL seems more robust against tree-type misspecification than T-RL when the true optimal DTR is non-tree-type. At the end of this talk, we will also present a new Stochastic-Tree Search method called ST-RL for evaluating optimal DTRs.
A fundamental feature of evaluating causal health effects of air quality regulations is that air pollution moves through space, rendering health outcomes at a particular population location dependent upon regulatory actions taken at multiple, possibly distant, pollution sources. Motivated by studies of the public-health impacts of power plant regulations in the U.S., this talk introduces the novel setting of bipartite causal inference with interference, which arises when 1) treatments are defined on observational units that are distinct from those at which outcomes are measured and 2) there is interference between units in the sense that outcomes for some units depend on the treatments assigned to many other units. Interference in this setting arises due to complex exposure patterns dictated by physical-chemical atmospheric processes of pollution transport, with intervention effects framed as propagating across a bipartite network of power plants and residential zip codes. New causal estimands are introduced for the bipartite setting, along with an estimation approach based on generalized propensity scores for treatments on a network. The new methods are deployed to estimate how emission-reduction technologies implemented at coal-fired power plants causally affect health outcomes among Medicare beneficiaries in the U.S.
Laine Thomas presented information about how causal inference is being used to determine the cost/benefit of the two most common surgical surgical treatments for women - hysterectomy and myomectomy.
We provide an overview of some recent developments in machine learning tools for dynamic treatment regime discovery in precision medicine. The first development is a new off-policy reinforcement learning tool for continual learning in mobile health to enable patients with type 1 diabetes to exercise safely. The second development is a new inverse reinforcement learning tools which enables use of observational data to learn how clinicians balance competing priorities for treating depression and mania in patients with bipolar disorder. Both practical and technical challenges are discussed.
The method of differences-in-differences (DID) is widely used to estimate causal effects. The primary advantage of DID is that it can account for time-invariant bias from unobserved confounders. However, the standard DID estimator will be biased if there is an interaction between history in the after period and the groups. That is, bias will be present if an event besides the treatment occurs at the same time and affects the treated group in a differential fashion. We present a method of bounds based on DID that accounts for an unmeasured confounder that has a differential effect in the post-treatment time period. These DID bracketing bounds are simple to implement and only require partitioning the controls into two separate groups. We also develop two key extensions for DID bracketing bounds. First, we develop a new falsification test to probe the key assumption that is necessary for the bounds estimator to provide consistent estimates of the treatment effect. Next, we develop a method of sensitivity analysis that adjusts the bounds for possible bias based on differences between the treated and control units from the pretreatment period. We apply these DID bracketing bounds and the new methods we develop to an application on the effect of voter identification laws on turnout. Specifically, we focus estimating whether the enactment of voter identification laws in Georgia and Indiana had an effect on voter turnout.
This document summarizes a simulation study evaluating causal inference methods for assessing the effects of opioid and gun policies. The study used real US state-level data to simulate the adoption of policies by some states and estimated the effects using different statistical models. It found that with fewer adopting states, type 1 error rates were too high, and most models lacked power. It recommends using cluster-robust standard errors and lagged outcomes to improve model performance. The study aims to help identify best practices for policy evaluation studies.
We study experimental design in large-scale stochastic systems with substantial uncertainty and structured cross-unit interference. We consider the problem of a platform that seeks to optimize supply-side payments p in a centralized marketplace where different suppliers interact via their effects on the overall supply-demand equilibrium, and propose a class of local experimentation schemes that can be used to optimize these payments without perturbing the overall market equilibrium. We show that, as the system size grows, our scheme can estimate the gradient of the platform’s utility with respect to p while perturbing the overall market equilibrium by only a vanishingly small amount. We can then use these gradient estimates to optimize p via any stochastic first-order optimization method. These results stem from the insight that, while the system involves a large number of interacting units, any interference can only be channeled through a small number of key statistics, and this structure allows us to accurately predict feedback effects that arise from global system changes using only information collected while remaining in equilibrium.
We discuss a general roadmap for generating causal inference based on observational studies used to general real world evidence. We review targeted minimum loss estimation (TMLE), which provides a general template for the construction of asymptotically efficient plug-in estimators of a target estimand for realistic (i.e, infinite dimensional) statistical models. TMLE is a two stage procedure that first involves using ensemble machine learning termed super-learning to estimate the relevant stochastic relations between the treatment, censoring, covariates and outcome of interest. The super-learner allows one to fully utilize all the advances in machine learning (in addition to more conventional parametric model based estimators) to build a single most powerful ensemble machine learning algorithm. We present Highly Adaptive Lasso as an important machine learning algorithm to include.
In the second step, the TMLE involves maximizing a parametric likelihood along a so-called least favorable parametric model through the super-learner fit of the relevant stochastic relations in the observed data. This second step bridges the state of the art in machine learning to estimators of target estimands for which statistical inference is available (i.e, confidence intervals, p-values etc). We also review recent advances in collaborative TMLE in which the fit of the treatment and censoring mechanism is tailored w.r.t. performance of TMLE. We also discuss asymptotically valid bootstrap based inference. Simulations and data analyses are provided as demonstrations.
We describe different approaches for specifying models and prior distributions for estimating heterogeneous treatment effects using Bayesian nonparametric models. We make an affirmative case for direct, informative (or partially informative) prior distributions on heterogeneous treatment effects, especially when treatment effect size and treatment effect variation is small relative to other sources of variability. We also consider how to provide scientifically meaningful summaries of complicated, high-dimensional posterior distributions over heterogeneous treatment effects with appropriate measures of uncertainty.
Climate change mitigation has traditionally been analyzed as some version of a public goods game (PGG) in which a group is most successful if everybody contributes, but players are best off individually by not contributing anything (i.e., “free-riding”)—thereby creating a social dilemma. Analysis of climate change using the PGG and its variants has helped explain why global cooperation on GHG reductions is so difficult, as nations have an incentive to free-ride on the reductions of others. Rather than inspire collective action, it seems that the lack of progress in addressing the climate crisis is driving the search for a “quick fix” technological solution that circumvents the need for cooperation.
This document discusses various types of academic writing and provides tips for effective academic writing. It outlines common academic writing formats such as journal papers, books, and reports. It also lists writing necessities like having a clear purpose, understanding your audience, using proper grammar and being concise. The document cautions against plagiarism and not proofreading. It provides additional dos and don'ts for writing, such as using simple language and avoiding filler words. Overall, the key message is that academic writing requires selling your ideas effectively to the reader.
Machine learning (including deep and reinforcement learning) and blockchain are two of the most noticeable technologies in recent years. The first one is the foundation of artificial intelligence and big data, and the second one has significantly disrupted the financial industry. Both technologies are data-driven, and thus there are rapidly growing interests in integrating them for more secure and efficient data sharing and analysis. In this paper, we review the research on combining blockchain and machine learning technologies and demonstrate that they can collaborate efficiently and effectively. In the end, we point out some future directions and expect more researches on deeper integration of the two promising technologies.
In this talk, we discuss QuTrack, a Blockchain-based approach to track experiment and model changes primarily for AI and ML models. In addition, we discuss how change analytics can be used for process improvement and to enhance the model development and deployment processes.
More from The Statistical and Applied Mathematical Sciences Institute (20)
Level 3 NCEA - NZ: A Nation In the Making 1872 - 1900 SML.pptHenry Hollis
The History of NZ 1870-1900.
Making of a Nation.
From the NZ Wars to Liberals,
Richard Seddon, George Grey,
Social Laboratory, New Zealand,
Confiscations, Kotahitanga, Kingitanga, Parliament, Suffrage, Repudiation, Economic Change, Agriculture, Gold Mining, Timber, Flax, Sheep, Dairying,
Temple of Asclepius in Thrace. Excavation resultsKrassimira Luka
The temple and the sanctuary around were dedicated to Asklepios Zmidrenus. This name has been known since 1875 when an inscription dedicated to him was discovered in Rome. The inscription is dated in 227 AD and was left by soldiers originating from the city of Philippopolis (modern Plovdiv).
How to Download & Install Module From the Odoo App Store in Odoo 17Celine George
Custom modules offer the flexibility to extend Odoo's capabilities, address unique requirements, and optimize workflows to align seamlessly with your organization's processes. By leveraging custom modules, businesses can unlock greater efficiency, productivity, and innovation, empowering them to stay competitive in today's dynamic market landscape. In this tutorial, we'll guide you step by step on how to easily download and install modules from the Odoo App Store.
Philippine Edukasyong Pantahanan at Pangkabuhayan (EPP) CurriculumMJDuyan
(𝐓𝐋𝐄 𝟏𝟎𝟎) (𝐋𝐞𝐬𝐬𝐨𝐧 𝟏)-𝐏𝐫𝐞𝐥𝐢𝐦𝐬
𝐃𝐢𝐬𝐜𝐮𝐬𝐬 𝐭𝐡𝐞 𝐄𝐏𝐏 𝐂𝐮𝐫𝐫𝐢𝐜𝐮𝐥𝐮𝐦 𝐢𝐧 𝐭𝐡𝐞 𝐏𝐡𝐢𝐥𝐢𝐩𝐩𝐢𝐧𝐞𝐬:
- Understand the goals and objectives of the Edukasyong Pantahanan at Pangkabuhayan (EPP) curriculum, recognizing its importance in fostering practical life skills and values among students. Students will also be able to identify the key components and subjects covered, such as agriculture, home economics, industrial arts, and information and communication technology.
𝐄𝐱𝐩𝐥𝐚𝐢𝐧 𝐭𝐡𝐞 𝐍𝐚𝐭𝐮𝐫𝐞 𝐚𝐧𝐝 𝐒𝐜𝐨𝐩𝐞 𝐨𝐟 𝐚𝐧 𝐄𝐧𝐭𝐫𝐞𝐩𝐫𝐞𝐧𝐞𝐮𝐫:
-Define entrepreneurship, distinguishing it from general business activities by emphasizing its focus on innovation, risk-taking, and value creation. Students will describe the characteristics and traits of successful entrepreneurs, including their roles and responsibilities, and discuss the broader economic and social impacts of entrepreneurial activities on both local and global scales.
How to Setup Default Value for a Field in Odoo 17Celine George
In Odoo, we can set a default value for a field during the creation of a record for a model. We have many methods in odoo for setting a default value to the field.
🔥🔥🔥🔥🔥🔥🔥🔥🔥
إضغ بين إيديكم من أقوى الملازم التي صممتها
ملزمة تشريح الجهاز الهيكلي (نظري 3)
💀💀💀💀💀💀💀💀💀💀
تتميز هذهِ الملزمة بعِدة مُميزات :
1- مُترجمة ترجمة تُناسب جميع المستويات
2- تحتوي على 78 رسم توضيحي لكل كلمة موجودة بالملزمة (لكل كلمة !!!!)
#فهم_ماكو_درخ
3- دقة الكتابة والصور عالية جداً جداً جداً
4- هُنالك بعض المعلومات تم توضيحها بشكل تفصيلي جداً (تُعتبر لدى الطالب أو الطالبة بإنها معلومات مُبهمة ومع ذلك تم توضيح هذهِ المعلومات المُبهمة بشكل تفصيلي جداً
5- الملزمة تشرح نفسها ب نفسها بس تكلك تعال اقراني
6- تحتوي الملزمة في اول سلايد على خارطة تتضمن جميع تفرُعات معلومات الجهاز الهيكلي المذكورة في هذهِ الملزمة
واخيراً هذهِ الملزمة حلالٌ عليكم وإتمنى منكم إن تدعولي بالخير والصحة والعافية فقط
كل التوفيق زملائي وزميلاتي ، زميلكم محمد الذهبي 💊💊
🔥🔥🔥🔥🔥🔥🔥🔥🔥
A Visual Guide to 1 Samuel | A Tale of Two HeartsSteve Thomason
These slides walk through the story of 1 Samuel. Samuel is the last judge of Israel. The people reject God and want a king. Saul is anointed as the first king, but he is not a good king. David, the shepherd boy is anointed and Saul is envious of him. David shows honor while Saul continues to self destruct.
GDRR Opening Workshop - Rethinking Resilience Analytics - Daniel Eisenberg, August 5, 2019
1. Rethinking Resilience Analytics
Daniel Eisenberg (NPS), David L. Alderson (NPS), Thomas Seager (ASU)
SAMSI Games, Decisions, Risk, & Reliability Workshop, Aug 2019
2. Naval Postgraduate School (NPS)
America's national security research university
History Highlights
• 1909 Founded at U.S. Naval Academy
• 1951 Moved to Monterey, CA
Operations Research Curriculum
• Facilities of a graduate research university
• Faculty who work for the U.S. Navy, with clearances
• Students with fresh operational experience
2017:
• 65 M.S. and 15 Ph.D. programs
• 612 faculty
• 1432 resident students (166 international, from 47 countries)
• 909 distributed learning students
3. NPS Center for Infrastructure Defense (CID)
Operations Research Department
• David Alderson — Professor, OR; Director, NPS Center for Infrastructure Defense; Ph.D., Stanford University, 2003
• Gerald Brown — Distinguished Emeritus Professor, OR; Member, National Academy of Engineering; Ph.D., U.C.L.A., 1974
• W. Matthew Carlyle — Professor & Chair, OR; Ph.D., Stanford University, 1997
• Javier Salmerón — Professor, OR; Ph.D., Universidad Politécnica (Spain), 1998
• Robert Dell — Professor, OR; Ph.D., S.U.N.Y. Buffalo, 1990
• Daniel Eisenberg — Research Assistant Professor, OR; Deputy Director, NPS Center for Infrastructure Defense; Ph.D., Arizona State University, 2018
• Dan Nussbaum — Visiting Professor, OR; Chair, NPS Energy Academic Group; Ph.D., Michigan State Univ., 1971
NPS Energy Academic Group (EAG)
• Alan Howard — Deputy Director, NPS Energy Academic Group; MBA/MIM in International Management, 2000
• Larry Walzer — Program Manager, NPS Energy Academic Group; M.B.A., Boston Univ., 1999; M.S., National Security Affairs, NPS, 2007
5. resilience analytics = “the systematic use of advanced data-driven methods to understand, visualize, design, and manage interdependent infrastructures to enhance their resilience and the resilience of the communities and services that rely upon them”
7. Twitter Feeds + Machine Learning Models = More Resilience
8. Uhh… No
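To make the caricature concrete: the kind of pipeline implied by "Twitter feeds + machine learning models" might, in its simplest form, score incoming posts for outage-related language. The sketch below is a hypothetical illustration (the keyword list and posts are invented; real systems use trained classifiers). Note its built-in limitation: it can only recognize the vocabulary of surprises someone enumerated in advance.

```python
# Toy caricature of "Twitter feeds + ML" resilience analytics:
# estimate outage extent by scoring posts for outage-related terms.
# Keyword list and example posts are invented for illustration.
OUTAGE_TERMS = {"outage", "blackout", "no power", "flooded"}

def outage_signal(posts):
    """Fraction of posts mentioning a pre-enumerated outage term."""
    hits = sum(any(t in p.lower() for t in OUTAGE_TERMS) for p in posts)
    return hits / len(posts)

posts = [
    "Blackout on 5th street again",   # matched: "blackout"
    "Great game tonight!",
    "Still no power since the storm", # matched: "no power"
    "Coffee time",
]
print(outage_signal(posts))  # -> 0.5
```

A disruption described in language outside `OUTAGE_TERMS` produces no signal at all, which previews the argument about surprise that follows.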
16. 3. SURPRISE happens…
Cognitive scientists* typically distinguish between two types of surprise:
*References:
• Lanir Z. Fundamental surprise. Eugene, OR: Decision Research. 1986.
• Wears RL, Webb LK. Fundamental on situational surprise: A case study with implications for resilience.
In: Resilience engineering in practice. vol. 2; 2014. p. 33-46.
Situational Surprise Fundamental Surprise
17. 3. SURPRISE happens…
Cognitive scientists* typically distinguish between two types of surprise:
*References:
• Lanir Z. Fundamental surprise. Eugene, OR: Decision Research. 1986.
• Wears RL, Webb LK. Fundamental on situational surprise: A case study with implications for resilience.
In: Resilience engineering in practice. vol. 2; 2014. p. 33-46.
Situational Surprise Fundamental Surprise
• compatible with previous beliefs • refutes basic beliefs about 'how things work'
18. 3. SURPRISE happens…
Cognitive scientists* typically distinguish between two types of surprise:
*References:
• Lanir Z. Fundamental surprise. Eugene, OR: Decision Research. 1986.
• Wears RL, Webb LK. Fundamental on situational surprise: A case study with implications for resilience.
In: Resilience engineering in practice. vol. 2; 2014. p. 33-46.
19–24. 3. SURPRISE happens…
Cognitive scientists* typically distinguish between two types of surprise:

Situational Surprise
• compatible with previous beliefs
• can be averted or mitigated by information about the future
• learning from situational surprise seems easy
• failures and system responses are well-modeled or measurable

Fundamental Surprise
• refutes basic beliefs about 'how things work'
• one cannot define in advance the issues for which one must be alert
• advance information on fundamental surprise actually causes the surprise
• learning from fundamental surprise is difficult

Lottery example: buy a ticket and lose; buy a ticket and win (situational surprise); don’t buy a ticket and win (fundamental surprise).

*References:
• Lanir Z. Fundamental surprise. Eugene, OR: Decision Research; 1986.
• Wears RL, Webb LK. Fundamental or situational surprise: A case study with implications for resilience. In: Resilience engineering in practice. vol. 2; 2014. p. 33-46.
28. 5. Responding to Fundamental Surprise (I)
What happens when the Modeler and User are both present…?
…the User and the Modeler can work together to fix the model.
Hurricane Ophelia, 2017
[Figures: Hurricane Ophelia, Oct 13, 2017 – 02:00 and 14:00]
32. 5. Responding to Fundamental Surprise (II)
What happens when the Modeler goes away…?
…the User can only abandon the model and is forced to improvise!
Oroville Dam, February 2017
35. 6. Responding to Fundamental Surprise (III)
What happens when the User is automated…?
In the face of fundamental surprise, the Modeler might be too slow to adapt.
[Diagram: the WORLD (may contain fundamental surprise) supplies data (Volume, Velocity, Variety) to a MODEL and its ANALYTICS (may contain situational surprise); the MODELER, with a frame of reference (background, beliefs, biases) and goals (describe, predict, prescribe), maintains the model through communication & alignment of goals.]
36. 6. Responding to Fundamental Surprise (III)
What happens when the User is automated…?
In the face of fundamental surprise, the Modeler might be too slow to adapt.
…again, there are many examples of this occurring.
37. 7. Responding to Fundamental Surprise (IV)
What happens when the Modeler goes away... and the User is automated…?
[Diagram: the WORLD (may contain fundamental surprise) supplies data (Volume, Velocity, Variety) to a MODEL and its ANALYTICS (may contain situational surprise); the MODELER (frame of reference: background, beliefs, biases; goals: describe, predict, prescribe) and the USER (with its own goals and frame) are connected by communication & alignment of goals.]
38. 7. Responding to Fundamental Surprise (IV)
What happens when the Modeler goes away... and the User is automated…?
No longer any mechanism for adapting to fundamental surprise!
[Diagram: only the WORLD, the MODEL, and its ANALYTICS remain; data (Volume, Velocity, Variety) still flows in, and the analytics may contain situational surprise, but no Modeler or User is left to adapt.]
39. 7. Responding to Fundamental Surprise (IV)
What happens when the Modeler goes away... and the User is automated…?
No longer any mechanism for adapting to fundamental surprise!
…we might be seeing more of these situations in the future.
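The fully automated case can be sketched in code. In this minimal, hypothetical example (names like `automated_user` and `TRAINING_ENVELOPE` are invented for illustration, not taken from the talk), an automated User applies model outputs directly, so once the Modeler is gone the only remaining defense against fundamental surprise is refusing to act on inputs the model never saw during development:

```python
# Minimal sketch (all names hypothetical): an automated User applies model
# outputs directly; a crude guard refuses inputs outside the envelope the
# model was built for, since no Modeler remains to adapt the model.

TRAINING_ENVELOPE = (0.0, 100.0)  # input range captured during model development

def model_predict(demand):
    """Stand-in analytic model: only meaningful inside its envelope."""
    return 0.5 * demand

def automated_user(demand):
    """Automated User with a fail-silent guard."""
    lo, hi = TRAINING_ENVELOPE
    if not (lo <= demand <= hi):
        # Input refutes the model's assumptions: with no Modeler present,
        # the safest automated response is to decline to act.
        return None
    return model_predict(demand)

print(automated_user(50.0))   # → 25.0 (acts on in-envelope input)
print(automated_user(500.0))  # → None (fails silent out of envelope)
```

A guard like this handles only the surprises it was told to watch for; by definition it cannot anticipate a fundamental surprise, which is exactly the gap the slide highlights.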
40. 8. Rethinking Resilience Analytics
So we need analytics that can also adapt to fundamental surprise… but how?
[Diagram: the USER, MODELER, and YOU together face the WORLD through the MODEL and its ANALYTICS (data Volume, Velocity, Variety).]
42–45. 8. Rethinking Resilience Analytics
Responses to fundamental surprise depend on who is present (a 2×2 matrix: USER present/absent × MODELER present/absent):

Modes of User–Modeler interaction:
• Collective Improvisation – unconstrained, improvised actions drawing on tacit knowledge from shared User/Modeler experiences.
• Phronesis – shared intent; the ability to disobey User/Modeler requirements.
• Explicit Commands – implementing User/Modeler needs based on strict requirements.
• Working at Cross Purposes – inability to share tacit or explicit requirements; collective action worsens the situation.

Modeler responses (User automated):
• Override the Model – slow down, isolate, or turn off analytics before they cause damage.
• Roll Back the Model – revert the model to a previous version that worked.
• Kludge the Model – implement a “quick fix” that improves current decisions but may increase overall model deficiencies.
• Replace the Model – develop and deploy an entirely new model and analytics.

User responses (Modeler absent):
• Query the Model – experiment with existing analytics to find new ways they can support decisions during surprise.
• Augment the Model – find new data or observations not captured in current analytics to augment decision-making.
• Abandon the Model and Apply Heuristics – ignore analytics and base decisions on expert judgement gained in past or similar conditions.
• Guess or Gamble – make decisions without decision support, based on luck.

Automated outcomes (User and Modeler both absent):
• Fail Silent / Fail Safe – automatic overrides shut down automated systems before they incur human or economic loss.
• Safe Fail – system failures incur losses, but in a planned and expected way.
• Cascade and Catastrophe – system failure is not arrested or absorbed in any effective way and causes even greater, compounding failures.
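The matrix above can be read as a lookup from who is present to which responses remain. The following hypothetical Python sketch encodes one reading of the slides' build order (the assignment of strategy groups to quadrants is an interpretation, not stated verbatim on the slides):

```python
# Hypothetical sketch of the 2x2 matrix: responses to fundamental surprise
# keyed by which of the Modeler and the User are present.

RESPONSES = {
    ("modeler", "user"): [  # both present: improvise together
        "Collective Improvisation", "Phronesis",
        "Explicit Commands", "Working at Cross Purposes",
    ],
    ("modeler",): [  # Modeler present, User automated: fix the model
        "Override the Model", "Roll Back the Model",
        "Kludge the Model", "Replace the Model",
    ],
    ("user",): [  # User present, Modeler gone: work around the model
        "Query the Model", "Augment the Model",
        "Abandon the Model and Apply Heuristics", "Guess or Gamble",
    ],
    (): [  # both absent: outcomes, not choices
        "Fail Silent / Fail Safe", "Safe Fail", "Cascade and Catastrophe",
    ],
}

def available_responses(modeler_present, user_present):
    """Return the response strategies left once presence is known."""
    key = tuple(
        name for name, present in
        (("modeler", modeler_present), ("user", user_present)) if present
    )
    return RESPONSES[key]
```

For example, `available_responses(False, False)` returns only the three automated outcomes, making concrete the slide's point that with no Modeler and no User there are no adaptive choices left, only failure modes.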
46. Rethinking Resilience Analytics
Daniel Eisenberg (NPS), David L. Alderson (NPS), Thomas Seager (ASU)
SAMSI Games, Decisions, Risk, & Reliability Workshop, Aug 2019
resilience analytics = “the systematic use of advanced data-driven methods to understand, visualize, design, and manage interdependent infrastructures to enhance their resilience and the resilience of the communities and services that rely upon them”
47. Rethinking Resilience Analytics
Twitter Feeds + Machine Learning Models = More Resilience
Sometimes…
But We Know When it Won’t
48. Contact Information
• Dr. Daniel Eisenberg
Research Assistant Professor
Deputy Director, NPS Center for Infrastructure Defense
Department of Operations Research
Naval Postgraduate School
831-656-2358, daniel.eisenberg@nps.edu
http://faculty.nps.edu/deisenberg
• NPS Center for Infrastructure Defense
Director: Dr. David Alderson
http://www.nps.edu/cid
Unclassified. Distribution unlimited. Material contained herein represents the sole opinion of the author and does
not necessarily represent the views of the U.S. Department of Defense or its components.