Disasters and Humans (DEMS3706 SU2020, Dr. Eric Kennedy)
AP/DEMS3706 Note Share
Hello everyone! Think of this space as a crowdsourced notebook . . . everyone is welcome to take and share DEMS3706 lecture and reading notes here.
Module One - Rational, Irrational, or Something Else?
Cognitive Biases - Definitions
Bounded Rationality (Tversky, Kahneman)
Representativeness
Availability Bias
Adjustment and Anchoring
Cultural Cognition (Kahan, Braman)
DEMS3706 Lecture #1
DEMS3706 Lecture #2 (Cultural Cognition)
Module Two - Uncertainty & Prediction
Prediction, Cognition and the Brain (Bubic, von Cramon, Schubotz)
“A 30% Chance of Rain Tomorrow”: How Does the Public Understand Probabilistic Weather Forecasts? (Gigerenzer et al.)
Don’t Believe the COVID-19 Models (Tufekci)
Lecture #1
Lecture #2
Module Three - Fear, Anxiety, and All Things Scary
Lecture #1
Module Four - Decision-making Under Pressure
Lecture #1
Module Five - Expertise & Thinking as an Institution
Lecture #1
Module Six - PTSD & Mental Health
Module One - Rational, Irrational, or Something Else?
Cognitive Biases - Definitions
Here are two lists of cognitive biases that cover the ones required by the reading guide. The examples are simple and easy to follow:
12 Cognitive Biases That Can Impact Search Committee Decisions
https://www.visualcapitalist.com/50-cognitive-biases-in-the-modern-world/
Bias
Definition
Bias in Action (how this bias applies to disasters)
Anchoring
This bias is described by individuals relying on an initial piece of information to make decisions. Comment by Eric Kennedy: Nice! Think of the example I gave during tutorial: students first were asked to think of the last two digits of their student number, then guess the number of countries in Africa. The lower the student #, the lower the guess. The higher the student #, the higher the guess. They got /anchored/ towards their initial number!
-During a large-scale disaster, a country may choose to proceed in a manner similar to a different country that went through the same experience, instead of searching for additional information to create the most successful plan. Comment by Eric Kennedy: Yes, these are good: early reactions to the pandemic will shape later ones... although this is also an example of priming.
If you wanted an example that's specific to anchoring, think about the magic "2 meter" number for physical distancing in lines. That number being introduced so early has powerfully affected what we see as "reasonable" physical distancing amounts... if it had started at 5m, we would be in a very different world of assumptions!
-This could also have been observed in how different countries proceeded with closures and containment during the pandemic.
Authority bias
This is defined as the tendency for people to rely more heavily on the opinion of someone perceived as a figure of authority.
-People may perceive the risk of a disaster differently based on
how information is conveyed by figures of authority.
Comment by Eric Kennedy: Yes, or think about people on
Twitter who claim to be doctors or economists being seen as
more credible because of their perceived authority... even if
they aren't saying something true!
Automation bias
This bias is the dependence and excessive use of automated
systems which can potentially lead to incorrect information and
decisions. Comment by Eric Kennedy: A slight adjustment here:
It's also excessive deference or trust to automation.
For the example, think about how much people are trusting computer models of how COVID will spread. They seem trustworthy because they come from computers - more so than if they were calculated by hand!
-During a disaster: ordering, importing or making protective equipment. A system can help estimate the quantity required but cannot determine the quality of the product needed or its effectiveness in protecting people.
Availability Heuristic
This bias is the tendency to think an event is more likely to
happen because it is more fresh, new, different, relevant and
present in the memory or experience. Comment by Eric
Kennedy: Slight adjustment: This isn't just thinking that an
event is more likely to happen - it's more generally being
influenced by fresh, new, or prominent ideas.
So, the example of thinking of a repeat disaster is great! But, it
could also be something like asking people "where are you most
likely to catch COVID?" People will be more likely to
remember places they've been recently, even if they're less risky
(e.g., I think about the grocery store I visited three days ago,
not the hospital I visited three weeks ago).
-After major disasters like the 2004 earthquake and tsunami, every subsequent earthquake brought the same dread that it would be followed by another tsunami.
-Post-9/11, politicians and citizens around the world believed
intentional hazards such as terrorist attacks were a highly likely
threat and therefore dedicated (and continue to dedicate)
millions of dollars of public and private funding toward the
“War on Terrorism”. In reality, natural hazards are far more
likely to occur yet receive less funding, media attention, etc.
Bandwagon Effect
The increased likelihood for an individual to accept a belief or engage in a behaviour as more of those around them accept the trend. They do not evaluate the underlying reasons or implications.
-People may prepare for disaster in the same way they see
others prepare rather than thinking through their individual
needs. For example, the toilet paper craze at the start of quarantine/social distancing measures. People saw some individuals stocking up and decided to do the same, even though there was no reason for toilet paper supplies to run out and COVID targets the respiratory system. Comment by Eric
Kennedy: Great example! Yes, the run on toilet paper was in
part thanks to the bandwagon effect. Imagine being in the store
and seeing everyone grabbing at toilet paper; you'd feel
compelled to grab it too!
Bizarreness Effect
The tendency to remember things that are unusual or “bizarre”.
Comment by Eric Kennedy: Yes - think of this as a subset
of availability heuristic: here, things stand out because they're
unique, rather than because they're recent.
-The time spent inside due to COVID might stick in people's minds longer and later in life because it is not our usual behaviour.
Choice-Supportive Bias
Tendency to overemphasize the benefits of a decision once it
has been made, and minimize the negatives Comment by Eric
Kennedy: Yes, good work! Basically, we subconsciously rewrite
the stories: we play up the reasons our decisions made sense,
and downplay the benefits of other alternatives.
-First responders to a disaster may highlight the benefits of
their decisions to the press even if they made the wrong call in
order to keep people calm and trusting in the authorities
Confirmation Bias
Being more attentive to information that reinforces already held beliefs while ignoring evidence that opposes one's beliefs. Comment by Eric Kennedy: Yes, and also more
easily accepting of these pieces of evidence (e.g., be less
critical of supportive evidence than you would be of opposing
evidence).
This is closely linked to what we're talking about in the Kahan
reading in terms of "biased assimilation"
-Reading evidence/articles and interpreting information in a way that supports your own belief/point of view. Algorithms on Facebook, for example, make this very easy because they might show you only the views of friends who support the same political party.
Dunning-Kruger Effect
You overestimate your own low ability to do something
Comment by Eric Kennedy: Yes - I'd phrase this a little
differently: People with low abilities tend to overestimate their
abilities at the given task.
-Instinctive reaction to rescue someone from a burning house
without proper protection might result in your own fatality.
Duration Effect
(More commonly called “Duration Neglect”) Comment by Eric
Kennedy: I've adapted this answer to make it a little clearer, so
double check this.
We tend to under-weight the duration of the event, and
overweight the experience at the end of an event.
-Public perceptions of how bad the COVID-19 pandemic was
will be influenced by the last few weeks/months (e.g., did it
taper off quietly as vaccines arrived, or was it in a bad
upswing).
Framing Effect
Decisions are influenced by whether the options are presented positively or negatively.
- If a COVID-19 vaccine is advertised as safe in 99% of people
who receive it, people will be more supportive than if the
vaccine is advertised as having dangerous effects in 1% of
people who receive it. Comment by Eric Kennedy: Changed
the example here to make this clearer.
Fundamental Attribution Error
Under-emphasizing situational explanations for one’s behaviour
while over-emphasizing dispositional explanations Comment by
Eric Kennedy: In simple terms, we're more charitable with
ourselves than others! So, if someone else does something bad,
we blame that on their character and ignore the difficult
situation they were in. But, if we do something bad, we explain
it using the situation.
- If someone else causes a car accident, we’d say “they’re a
terrible driver!” But, if we cause a car accident, we’d say “the
weather conditions were bad; this was an anomaly; I’m
normally a good driver” Comment by Eric Kennedy: Changed
the example here to make it a little clearer.
Gambler’s Fallacy
Believing that a particular outcome “is due to occur” since a
different outcome has occurred more frequently than usual in
the past when the outcome is independent of the past.
-Believing that after a bad hurricane season we’re “due” for a
mild one. Comment by Eric Kennedy: Changed this to make it a
disaster-related example.
Hindsight Bias
Overestimating the possibility of predicting a past event when
the event was unforeseeable.
- For instance, claiming that a plane crash was inevitable… when there wasn’t consensus on that beforehand.
Lake Wobegon Effect
Overestimating one’s achievement or capability relative to
others.
-80% of drivers consider themselves to be above-average
drivers. Obviously, that can't be true, but it's common that we
think what we experience is above average. People would rate
themselves as more prepared for disasters; more adherent with
physical distancing guidelines
Placebo Effect
Reporting effects from a treatment when the effects could not
have originated from the treatment.
-A sugar pill that's used in a control group during a clinical trial. The placebo effect is when an improvement of symptoms is observed despite using a non-active treatment. It's believed to occur due to psychological factors like expectations or classical conditioning.
Planning Fallacy
Underestimating the amount of time required to accomplish a
task.
-Saying a house will be completed in a certain time frame. But a
house can only be built on time if there are no delivery delays,
no employee absences, no hazardous weather conditions, etc.
Comment by Eric Kennedy: Yes, good. A disaster example
of this might be underestimating the amount of time, effort, and
resources required to evacuate a population.
Priming
Previous stimulus unintentionally and unconsciously guides
responses to future stimulus.
-Exposing someone to the word "yellow" will evoke a faster
response to the word "banana" than it would to unrelated words
like "television." Because yellow and banana are more closely
linked in memory, people respond faster when the second word
is presented. Comment by Eric Kennedy: To add a disaster
example here, imagine that you made a brochure to encourage
people to prepare for disasters. They're going to be influenced
in how they prepare based on the imagery and words used,
rather than the 'real' threats they face.
Sunk Cost Fallacy
Justifying future expenditure on something using past
investment on that thing.
-Individuals continue a behavior or endeavor as a result of
previously invested resources (time, money or effort). This
fallacy, which is related to loss aversion and status quo bias,
can also be viewed as bias resulting from an ongoing
commitment.
-“I might as well keep eating because I already bought the food”
or “I might as well keep watching this terrible movie because
I've watched an hour of it already”. Comment by Eric Kennedy:
Or, in a disaster context: continuing to build levees higher and
higher, even if no longer a good mitigation solution.
Bounded Rationality (Tversky, Kahneman)
Many decisions are based on beliefs about the probability of uncertain events.
People rely on a limited number of heuristic principles which reduce complex tasks of assessing probability to simple judgements.
Representativeness
Many probabilistic questions take one of the following forms:
· What is the probability that object A belongs to class B?
· What is the probability that event A originates from process
B?
· What is the probability that process B will generate event A?
People typically rely on the representativeness heuristic: the
probability is evaluated by the degree to which A resembles (is
representative of) B.
“Steve is quiet. What’s Steve’s job? Steve is probably a
librarian.”
This can cause insensitivity to the prior probability of
outcomes.
The prior probability/base-rate frequency is ignored in favour of
representativeness.
“Most of the people in town are farmers, but Steve is quiet so
he must be a librarian.”
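The base-rate point can be made concrete with Bayes' rule. All the numbers below are hypothetical, chosen only to show how a strong base rate overwhelms a fitting description:

```python
# Hypothetical town: base rates dominate the stereotype (all numbers invented).
farmers, librarians = 950, 50            # 1000 residents, mostly farmers
p_quiet_given_farmer = 0.30              # assumed: 30% of farmers are quiet
p_quiet_given_librarian = 0.80           # assumed: 80% of librarians are quiet

quiet_farmers = farmers * p_quiet_given_farmer           # 285 quiet farmers
quiet_librarians = librarians * p_quiet_given_librarian  # 40 quiet librarians

# Bayes' rule: among all quiet residents, what fraction are librarians?
p_librarian_given_quiet = quiet_librarians / (quiet_librarians + quiet_farmers)
print(round(p_librarian_given_quiet, 3))  # 0.123
```

Even though "quiet" is far more representative of librarians, a quiet resident of this town is still most likely a farmer - the 19:1 base rate swamps the stereotype.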
The heuristic can cause insensitivity to sample size.
The probability that a sample statistic stays close to the population parameter is lower when the sample size is small. The heuristic ignores this fact.
· “The probability of getting 60% heads when tossing coins is
the same whether you toss a coin 10 times or 1000 times.”
The heuristic can cause people to expect a sequence based on
the essential characteristics of a random process to be locally
representative of the process.
“The sequence HTHTTH is more likely than HHHTTT or
HHHHTH”
Gambler’s fallacy is an example of this expectation.
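Both of the quoted intuitions can be checked exactly with a short calculation (a fair coin is assumed throughout):

```python
from math import comb

def p_at_least(n, k):
    # Exact probability of at least k heads in n fair tosses (binomial tail).
    return sum(comb(n, i) for i in range(k, n + 1)) / 2**n

# Sample size matters: 60%+ heads is common in 10 tosses but
# essentially impossible in 1000 tosses.
print(p_at_least(10, 6))      # 0.376953125
print(p_at_least(1000, 600))  # vanishingly small (~1e-10)

# Local representativeness is an illusion: every specific 6-toss
# sequence has the same probability, "random-looking" or not.
p_any_sequence = 0.5 ** 6     # HTHTTH, HHHTTT and HHHHTH all equal 1/64
```

The heuristic treats the 10-toss and 1000-toss cases alike, and ranks HTHTTH above HHHTTT, even though the arithmetic says otherwise in both cases.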
The heuristic can cause insensitivity to predictability: one ignores how much predictive power a piece of information has when making a prediction.
· Ex. “Eric Kennedy did a guest lecture for one of my courses
once and I was very impressed, therefore Eric Kennedy must be
the best professor at York.“
A good fit between the predicted outcome and the input
information may be called the illusion of validity.
The internal consistency of a pattern of inputs is a major
determinant of one's confidence in predictions based on these
inputs.
This illusion persists even when the judge is aware of the
factors that limit the accuracy of his predictions.
The heuristic causes misconceptions of regression to the mean.
Consider two random variables X and Y which have the same
distribution. If one selects individuals whose average X score
deviates from the mean of X by k units, then the average of
their Y scores will usually deviate from the mean of Y by less
than k units.
People often invent spurious causal explanations for this purely statistical phenomenon.
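Regression to the mean falls out of a quick simulation. Here X and Y are two noisy measurements of the same underlying quantity (a stand-in for "same distribution, imperfectly correlated"); the particular parameters are arbitrary:

```python
import random

random.seed(0)

# X and Y are two noisy test scores of the same underlying "ability",
# so they share a distribution but are imperfectly correlated.
pairs = []
for _ in range(100_000):
    ability = random.gauss(0, 1)
    x = ability + random.gauss(0, 1)  # first test score
    y = ability + random.gauss(0, 1)  # second test score
    pairs.append((x, y))

# Select individuals who scored far above the mean on the first test...
top = [(x, y) for x, y in pairs if x > 2.0]
mean_x = sum(x for x, _ in top) / len(top)
mean_y = sum(y for _, y in top) / len(top)

# ...their second scores regress toward the mean: 0 < mean_y < mean_x.
print(mean_x, mean_y)
```

No special cause is needed: selecting on a high X guarantees the noise in X was, on average, favourable, and that luck does not repeat in Y.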
Availability Bias
The probability of an event is assessed by the ease of which
instances or occurrences can be brought to mind.
“I can think of many terrorist attacks that happened in Toronto,
therefore terrorism is a major risk in Toronto.”
Biases due to the retrievability of instances: larger, more
frequent classes of outcomes may be recalled more easily than
less frequent classes, but other factors also affect recall.
The size of a class is judged by the availability of its instances.
A class whose instances are easily retrieved will appear more
numerous than a class of equal frequency whose instances are
less retrievable.
Biases due to the effectiveness of a search set: if the method of retrieving instances of one class is more effective than that of another, the first will appear to be more frequent.
“There are more words that start with ‘r’ than those that have
‘r’ as the third letter.”
Biases of imaginability: if the instances of a class are generated
rather than stored in memory, the more easily imagined class is
perceived to be more frequent.
“There are more groups of 2 that can be constructed from 100
people than there are groups of 8.”
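The groups example is easy to check: pairs are far easier to imagine, but committees of 8 vastly outnumber them.

```python
from math import comb

pairs = comb(100, 2)    # groups of 2 from 100 people
octets = comb(100, 8)   # groups of 8 from 100 people

print(pairs)   # 4950
print(octets)  # 186087894300 - tens of millions of times as many
```

Because pairs are generated in the mind so much more fluently, the imaginability heuristic gets the comparison exactly backwards.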
Illusory correlation: the co-occurrence of events is
overestimated when the events are associated.
“Suspiciousness is seen in the eyes. When this man draws
pictures, the eyes are weird. He must be suspicious.”
Adjustment and Anchoring
People typically start an estimate from an initial value then
make adjustments. These adjustments are insufficient.
Biases in the evaluation of conjunctive and disjunctive events: the probability of conjunctive events is overestimated and the probability of disjunctive events is underestimated.
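A quick sketch with assumed numbers (10 steps or components, 95% per-step reliability; both figures are invented for illustration) shows how large these mis-estimates can be:

```python
# Assumed numbers for illustration: 10 stages, each 95% reliable.
n = 10
p_step = 0.95

# Conjunctive: the project succeeds only if every step succeeds.
p_project = p_step ** n
print(round(p_project, 3))  # 0.599 - "likely" steps, yet ~40% failure rate

# Disjunctive: the system fails if any one critical component fails.
p_component_fails = 1 - p_step
p_system_fails = 1 - (1 - p_component_fails) ** n
print(round(p_system_fails, 3))  # 0.401 - "safe" parts, yet ~40% failure rate
```

Tversky and Kahneman attribute both errors to anchoring: people anchor on the per-step probability (0.95 or 0.05) and adjust insufficiently toward the overall figure.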
“A project with many consecutive steps, each one depending on
the previous is likely to succeed.”
“A system with many critical components, with the failure of any individual component causing the entire system to fail, is not unsafe.”
Cultural Cognition (Kahan, Braman)
· People agree that well-being is good, but disagree on what
generates well-being
· It is naive to assume poor education causes this disagreement
· Yes, empirical proof often requires high technical skill to
understand
· But there often isn’t even a consensus to believe
· Opinions correlate to membership in a variety of social
groups, even among experts
· Beliefs cluster (gun control opponents often don’t believe in
global warming)
· Cultural cognition: the psychological disposition of persons to conform their factual beliefs about the instrumental efficacy (or perversity) of law to their cultural evaluations of the activities subject to regulation.
· Cultural beliefs are prior to factual beliefs
· People like policies that line up with their cultural values
· What people think about the consequences of policies derives from their cultural worldviews
· We can’t all be experts, so this is what we do ¯\_(ツ)_/¯
· People have cultural worldviews that follow the dimensions of
“group” and “grid” proposed by Douglas and Wildavsky in Risk
and Culture
· Group: Individualist (Low) vs Solidarist/Communitarian
(High)
· Collective responsibility vs individual responsibility
· Grid: Egalitarian (Low) vs Hierarchist (High)
· Social distribution vs hierarchical distribution
· Individuals select certain risks for attention and disregard
others in a way that reflects and reinforces the particular
worldviews to which they adhere
· On the environment:
· Egalitarians, solidarists: Reducing environmental risk justifies regulating commercial activities that produce social inequality and legitimize unconstrained self-interest; therefore the risk exists
· Individualists: The existence of environmental risk threatens
market autonomy and other private orderings, therefore it
doesn’t exist
· Hierarchists: The existence of environmental risk would mean the social and governmental elites are incompetent, therefore it doesn’t exist
· Douglas wrote in Purity and Danger that morality, defined by
culture, prescribes what causes danger
· Don’t commit adultery or incest, don’t disrespect or challenge
your rulers
· Disgust and revulsion are strongly connected to contravention
of ordered relations, contradiction of cherished classifications
· Why do we have cultural cognition?
· The nexus between danger and morality justifies the accepted
system of morality
· We need to know what actions and states promote our interests
· Cultural dissonance avoidance: It’s uncomfortable to have
beliefs about what’s harmful and benign that contradict
commitments and affiliations essential to one’s identity
· Affect: Perceptions of how harmful an activity is are informed by its affect, the visceral reaction it triggers, which is largely determined by culture.
· In-group/out-group dynamics: We trust people like us
· Group polarization: We want to avoid censure for opposing
opinions, so we agree with the dominant opinion even if we
don’t agree all that much
· Naive realism = individuals tend to believe that the
opinions/beliefs/values held by their own cultural group were
reached via objective assessment whereas the beliefs held by
opposing cultural groups are based on biased information
sources and influenced by their worldviews. Therefore,
evidence and truth can never be transmitted across borders
because each side thinks the others’ sources are biased.
· Reactive Devaluation = individuals who belong to one cultural group downplay and even dismiss the persuasiveness of the opposing group’s arguments and evidence, especially when intergroup conflict takes place.
· What should we do about it?
· Cultural cognition diminishes our ability to integrate reliable
information or even identify it
· Scientists aren’t immune to it
· We should make policies palatable to all cultural sides
DEMS3706 Lecture #1
· Air France
· the problem here is not the plane, the problem is that in the
minutes that followed the computer glitch, the pilots
experienced serious confusion, lack of orientation, lack of
awareness. This disorientation caused the pilots to stall the
plane; this decision is what ultimately caused the fatal plane
crash
· Sensory confusion leads to incorrect inputs . . . this is why we
should pay attention to cognitive biases
· Different Types of Illusions
· Luminance and contrast (dark spots appear at the intersection
of white lines) = this illusion is based on the contrast of light
and dark colours and plays with the way our brain processes
colours and light.
· Geometric/angle illusions
· Motion
· Impossible Figures
· Size consistency
· Basically, brains are constantly searching for ways to process
information more quickly and easily, and we continue to make
predictable mistakes because we take shortcuts which cause us
to incorrectly interpret information
· Tversky & Kahneman
· Bounded Rationality = in order to process the overwhelming
amounts of information that we face daily, our brains use
shortcuts or heuristics
· Loss aversion = we tend to prefer avoiding losses to acquiring
equivalent gains
· We are less likely to take chances if we perceive a loss will
result from our actions; we are more likely to take a chance if
we perceive our chance-taking will result in some positive gain
· There is the way we should think of something
(mathematically) and the way we do think of things
(heuristically)
· Representativeness
· Our probability estimations are influenced by how closely an
example (let’s say person) matches our pre-existing stereotypes.
EG. the probability that Steve is a librarian seems higher
because he sounds like the stereotypical librarian
· Cass Sunstein
· Nudges: By identifying and understanding cognitive
heuristics, you can harness the power of these biases and use
them to elicit positive behaviour (or, in the case of
corporations, elicit corporate-friendly behaviours)
· Kahneman and Tversky’s system 1 vs system 2 thinking
· System 1 - rapid and subconscious thinking
· System 2 - formal, logical and mathematical thinking
· Kahneman and Tversky's theory is that, even when we think we're using the slow, methodical, logical system-two mode, we're often being tricked by system-one mistakes.
DEMS3706 Lecture #2 (Cultural Cognition)
· Optical illusions, cognitive biases and heuristics (optical
illusions for our brains/ shortcuts our brains take)
· Bounded rationality = we don’t possess the bandwidth to
process all stimuli and information rationally
· Beliefs vs Actions
· Belief disagreements pertain to beliefs rather than actions . . .
people disagree about what to believe
· Action disagreements pertain to actions . . . people disagree
about what to do in a given situation and that is when dangerous
consequences can occur (because inaction is as much of a
decision as action)
· So why do disagreements occur?
· Theory 1: Disagreement exists because people are uninformed.
· We disagree because I am well versed in the topic, well read,
well informed and the disagreer is not.
· We often associate disagreement with the belief that “they are
wrong” or use fundamental attribution error/out-group
homogeneity effect
· Theory 2: Affective Preferences
· Some of our preferences and beliefs are influenced by our past
experiences, how we feel when thinking of a certain belief or
value
· Our feelings and past experiences can bleed into our beliefs
and actions
· But this theory only explains light, affective disagreements; it
doesn’t really explain substantial, significant disagreements
· Theory 3: Upbringing
· Perhaps our beliefs are influenced by how we were raised/the things we are taught to believe (after all, our parents’ beliefs are strong predictors of our religions, politics, etc)
· However, we possess different beliefs than our parents and our
parents don’t teach us about everything - so how do we form
beliefs that are not directly related to our parents, upbringing?
· Theory 4: Lack of Education?
· Perhaps those who disagree just don’t possess enough
information? Links to Theory 1.
· However, evidence suggests that the more education one receives, the greater the degree of divergence between people’s beliefs. In other words, two people tend to disagree MORE as their education increases.
· This finding indicates that confirmation bias plays a role in
education
· Cultural Cognition
· Kahan and Braman argue that people disagree because of
cultural cognition . . . people hold the same goals, but they
disagree about the best way to achieve these goals
· K and B posit that beliefs travel as a pack - beliefs follow an
identifiable pattern. That is, opinions regarding controversial
issues such as climate change, abortion, and gun control are
linked, despite that they are unrelated
· Belief patterns do not occur by chance. We have prior beliefs
and prior values that shape what we believe.
· We do not approach issues as blank slates. We approach issues
with our pre-existing set of values and worldviews and make
decisions based on what solutions best fit with our current
beliefs.
· Why does this happen? We cannot answer big, controversial
questions by ourselves - we cannot gather sufficient proof
independently. Subsequently, to answer these questions, we
have to trust experts/take the word of people we trust. And we
tend to trust the experts we know of/are exposed to - we are
familiar with the experts we CHOOSE to watch and those we
choose to watch are often those that share our values. In other
words, we trust the people that share our values.
· EG. If you choose to watch Fox News, you are more likely to
trust the experts featured on Fox News.
· But how do we know who is “like us”? The answer lies in
groups and grids.
· Group (individualistic vs communitarian)
· Individualistic = singular, value being able to take care of
yourself
· Communitarian = concerned with the well being of the whole,
value a system that cares for others,
· Grid (who ought to get what -- egalitarian vs hierarchical)
· Egalitarian (low grid) = everyone should have access to the
same resources
Grid is called “grid” because it relates to levels of hierarchy. Group is called “group” because it refers to those in our circle: do we see ourselves as part of a wider community or as singular individuals?
· Individualist/Hierarchical = Republicans and Conservatives
· Egalitarian/Communitarian = NDP
· Egalitarian/Individualist = Green Party
· Values turn into beliefs
· Biased Assimilation = you accept information and view it as
reliable if it matches with your previous beliefs -- explains
confirmation bias
· Naive realism = individuals attribute their beliefs to research
and objective assessment whereas they view other people’s
beliefs as biased
· Reactive devaluation = dismiss valid information/evidence if
it presented by the “other” group
Last week, we learned why we don’t always perceive things the way they actually are -- why our senses deceive us. This week, we explain how our flawed perceptions impact our beliefs.
Module Two - Uncertainty & Prediction
Prediction, Cognition and the Brain (Bubic, von Cramon, Schubotz)
· Prediction/Predictive Processing - Any type of processing that
incorporates or generates information about future states of the
body or environment
· Dimensions of Prediction
· Type (probabilistic, deterministic)
· Specificity (low, high)
· Level (explicit, implicit)
· Domain (motor, perceptual, cognitive)
· Timescale (long, short)
· Events can be predictable if they occur in a non-random
fashion allowing the brain to find a deterministic or
probabilistic regularity to the relationship between different
events
· The brain may still attempt to predict if the input is random
· The brain may use analogies if the input is novel
· In non-random contexts predictions are generated by learning
and identifying associations, esp. temporal associations
· Accumulate info about statistical regularities while dealing
with noise and uncertainty
· Applying inference rules, analogies
· Concrete/first-order rules - Repetition of stimulus triggers an
expectation of continuation of its appearance
· Higher-order rules - Specific non-interchangeable stimuli,
more complex events, different contexts
· Serial order processing
· Linear/flat sequences (local dependencies)
· Non-linear/hierarchical sequences (long-distance
dependencies)
· Spatial or abstract associations can also be used (contextual
frames) to help identify stimuli
· Recombining past events creates more complex predictions,
“memories of the future”
· Anticipation/Preparation/Predictive Coding - Formulating
short-term expectations and communicating them to sensory or
motor areas
· Elevated levels of processing in sensory or motor areas of the
brain prior to and facilitating the processing of an expected
perceptual or motor event (David LaBerge); impact of
predictions on current behaviour (Martin V. Butz et al.);
Mapping of causes to sensory expressions (Karl Friston)
· Levels
· Explicit (Representations of future states)
· Implicit (Behaviours or habits)
· Explicit is more likely to predict upcoming stimulus before it
appears
· Expectation/Prediction/Prospective Code - Representation of
what is predicted to occur in the future
· An item stored in working or long-term memory including
information about the spatial and temporal characteristics of an
expected event
· May be abstract or verbal, which does not necessarily
pre-activate relevant sensory cortices
· Prospective Code - Representations of present events which
contain info about their future effects or goals
· Timescales
· Formulation
· Based on long-term experience
· Based on short-term exposure to non-random patterns
· Event distance
· Short-term/online
· Ongoing behaviour, motor control, more accurate
· Long-term/offline
· Not immediately relevant, less accurate
· Multiple expectations with different time/space can be made
for the same event across different brain systems
· Prospection - Consideration of potential distant future events
· Ability to pre-experience the future by simulating it in our
minds (Gilbert and Wilson); stored information being used to
imagine, simulate, and predict future events (Daniel Schacter)
· May lack the detail of genuine perception; shortened,
essentialized
· May not be realistic or reliable; prone to error
· Based on exemplars of a specific scenario, but run in a
decontextualized, comparative manner
· Benefits of prediction
· Shortens perception time
· Saves resources
· Prepares reactions
· Faster recognition and interpretation by limiting potential
responses to environment
· Increased accuracy, speed, maintenance of info processing
· Creates coherent, stable representations of environment
· Guide top-down deployment of attention, improve info
seeking, decision-making
· Trigger and guide behaviour
· Prospective codes
· Actions are preceded by response-related anticipation,
voluntary behaviour is controlled by a representation of its
outcomes (goal-directed behaviour)
· Could also be facilitated by expected emotional consequences
of actions, which are used as the basis of additional predictions
· Prediction is a “bias signal” that improves the computational
efficiency of specific areas
· Expected stimuli (matches) are processed more efficiently
than unexpected stimuli (mismatches)
· However, mismatches have higher relevancy and priority
· They are more valuable since they signal unsuccessful
learning
· They cost more attention (less efficient)
· Must check behavioural relevance
· Relevant to the current mental set?
· Environmental noise?
· If relevant and informative, update knowledge, adapt
behaviour
· Novel events, environment changes
· Prediction allows us to direct our behavior towards the future,
while remaining well grounded and guided by the information
pertaining to the present and the past.
“A 30% Chance of Rain Tomorrow”: How Does the Public
Understand Probabilistic Weather Forecasts? (Gigerenzer et al.)
· In 1980 a study of college students by Murphy et al. found
people misunderstand the meaning of probabilistic statements
like “a precipitation probability forecast of 30%”
· It wasn’t a misunderstanding of probabilities, but a
misunderstanding of “the event to which the probabilities refer”
· This study examined how the public in five different countries
understood such probabilistic statements
· The paper also argues that the confusion is due to not knowing
the reference class to which the statement refers
· A forecast such as “There is a 30% chance of rain tomorrow”
conveys a single-event probability, which, by definition, does
not specify the class of events to which it refers
· In view of this ambiguity, the public will likely interpret the
statement by attaching more than one reference class to
probabilities of rain, and not necessarily the class intended by
meteorologists.
· Consequently, laypeople may interpret a probability of rain
very differently than intended by experts.
· A psychiatrist who prescribed Prozac to depressed patients
used to inform them that they had a 30–50% chance of
developing a sexual problem such as impotence or loss of sexual
interest
· Many patients became concerned and anxious.
· Telling patients that out of every 10 people to whom he
prescribes Prozac, three to five experience sexual problems
seemed to put patients more at ease
· Many had thought that something would go awry in 30–50% of
their sexual encounters
· The original approach to risk communication left the reference
class unclear
· When risks are solely communicated in terms of single-event
probabilities, people have little choice but to fill in a class
spontaneously, based on their own perspective on the situation
· The National Weather Service defines the probability of
precipitation as “the likelihood of occurrence (expressed as a
percentage) of a measurable amount of liquid
precipitation...during a specified period of time at any given
point in the forecast area”
· In practice, this means that out of all the days on which this
forecast is issued, it rains on that percentage of them
· 30% chance of rain does not mean that it will rain tomorrow in
30% of the area or 30% of the time.
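The intended reference class can be made concrete with a small simulation. This is a hedged sketch with invented numbers (10,000 forecast days), not data from the study:

```python
import random

random.seed(0)

# Hypothetical sketch: the "days" reference class that forecasters intend.
# Out of all days on which a "30% chance of rain" forecast is issued, it
# rains on roughly 30% of them -- not on 30% of the area, and not for
# 30% of the day.
forecast_days = 10_000
rainy_days = sum(random.random() < 0.30 for _ in range(forecast_days))
print(f"It rained on {rainy_days} of {forecast_days} forecast days "
      f"({rainy_days / forecast_days:.0%})")
```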
· Hypotheses:
· The public has no common understanding of what a
probability of rain means
· The confusion should be lower, and the prevalence of the days
interpretation higher:
· among people in countries that have been exposed to
probabilistic weather forecasts for longer
· among people who have been exposed to probabilistic weather
forecasts for a larger proportion of their lives
· Probabilities of rain were introduced into mass media weather
forecasts in:
· New York in 1965
· Amsterdam in 1975
· Berlin in the late 1980s
· Milan only on the Internet
· Athens, never
· The prevalence of the days interpretation in the five countries
is not positively correlated with length of national exposure
· Consistent with the individual-exposure hypothesis, the
proportion of individual exposure was positively related to
choosing the days interpretation
· Individual exposure = the proportion of each participant’s life
during which he or she had been exposed to weather forecasts
expressed in probabilistic terms
· A few respondents to the open question referred to classes of
events other than area or time
· The inclusion of quantitative probabilities in weather forecasts
has been advocated because probabilities can “express the
uncertainty inherent in forecasts in a precise, unambiguous
manner, whereas...traditional forecast terminology is subject
to...misinterpretations”
· If probabilities are really unambiguous, one may ask why
probabilistic forecasts are still so widely misunderstood.
· In countries such as Greece, probabilities of rain are simply
not provided to the public
· This also holds to some degree in other countries, where only
some mass media use probabilities.
· When probabilistic weather forecasts are provided, they are
typically presented without explaining what class of events they
refer to.
· Third, in the rare cases where an explanation is presented, it
sometimes specifies the wrong reference class. (e.g.
Netherlands)
· The authors suggest misunderstandings can be easily reduced
if a statement specifying the intended reference class is added.
· Quantitative probabilities will continue to confuse the public
as long as experts do not spell out the reference class when they
communicate with the public.

Don’t Believe the COVID-19 Models (Tufekci)
· Epidemiologists routinely turn to models to predict the
progression of an infectious disease.
· The Trump administration has just released the model for the
trajectory of the COVID-19 pandemic in America.
· We can expect a lot of back-and-forth about whether its
mortality estimates are too high or low.
· There is no right answer.
· Right answers are not what epidemiological models are for.
· Sometimes, when we succeed it looks like we overreacted.
· A near miss can make a model look false.
· But that’s not always what happened.
· It just means we won.
· Fighting public suspicion of these models is as old as modern
epidemiology
· John Snow’s famous cholera maps in 1854 proved London’s
cholera was spreading through water that came out of pumps,
not the city’s foul-smelling air
· Many people didn’t believe Snow, because they lived in a
world without a clear understanding of germ theory and only the
most rudimentary microscopes.
· In our time, however, the problem is sometimes that people
believe epidemiologists, and then get mad when their models
aren’t crystal balls.
· When an epidemiological model is believed and acted on, it
can look like it was false.
· Epidemiologists also have to estimate the impact of
interventions like social isolation.
· Limited, perhaps censored data
· Can we trust China?
· Differing interventions
· (e.g. China entire families quarantined vs US quarantines)
· These models describe a range of possibilities
· The variety of potential outcomes coming from a single
epidemiological model may seem extreme and even
counterintuitive
· Those possibilities are highly sensitive to our actions
· Epidemics are especially sensitive to initial inputs and timing
· Where should those parameters come from?
· Model-makers have to work with the data they have, yet a
novel virus has a lot of unknowns.
· Epidemics grow exponentially
· A model’s robustness depends on how often it gets tried out
and tweaked based on data and its performance.
· Trump was projected to lose in most US election polls
· Only two elections before 2016 had occurred in the Facebook
era, so the models had little comparable data
· Were the models bad, or did the less likely outcome simply
occur?
· With this novel coronavirus, there are a lot of things we don’t
know because we’ve never tested our models, and we have no
way to do so.
· Why should we use models if they’re not certain?
· Epidemiology gives us agency to identify and calibrate our
actions by pruning catastrophic branches of a tree of
possibilities that lies before us.
· Epidemiological models have “tails”—the extreme ends of the
probability spectrum.
· Think of those tails as branches in a decision tree.
· At the beginning of a pandemic, we have the disadvantage of
higher uncertainty, but the advantage of lower costs to actions
· The disease is less widespread.
· By acting we change the underlying parameters
· The U.K. initially had almost no social-isolation measures in
place, planning to let the virus run its course through the
population (with the exception of the elderly) to create “herd
immunity.”
· Let enough people get sick and recover from the mild version
of the disease
· An epidemiological model from Imperial College London
projected that without drastic interventions, more than half a
million Britons would die from COVID-19
· The stark numbers prompted British Prime Minister Boris
Johnson to change course, shutting down public life and
ordering the population to stay at home.
· Neil Ferguson, the scientist who led the Imperial College
team, testified before Parliament that he expected deaths in the
U.K. to top out at about 20,000
· This caused media outrage
· One former New York Times reporter described it as “a
remarkable turn,”
· The British tabloid the Daily Mail ran a story about how the
scientist had a “patchy” record in modeling.
· The conservative site The Federalist even declared, “The
Scientist Whose Doomsday Pandemic Model Predicted
Armageddon Just Walked Back the Apocalyptic Predictions.”
· There was no turn in the model.
· The model lays out a range of predictions, from tens of
thousands to 500,000 dead, which all depend on how people
react.
· The spread of the disease depends on exactly when you stop
cases from doubling
· Even a few days can make an enormous difference.
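The “few days” point can be sketched with toy numbers. The 100 initial cases and 5-day doubling time below are illustrative assumptions, not real COVID-19 data:

```python
# Toy sketch of unchecked exponential growth: waiting 10 extra days with a
# 5-day doubling time means two extra doublings, i.e. 4x the cases.
def cases_after(days, initial=100, doubling_time=5):
    """Cases after `days` of growth that doubles every `doubling_time` days."""
    return initial * 2 ** (days / doubling_time)

early = cases_after(30)  # intervene on day 30
late = cases_after(40)   # wait just 10 more days
print(f"Day 30: {early:,.0f} cases; day 40: {late:,.0f} cases "
      f"({late / early:.0f}x more)")
```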
· Two Italian regions, Lombardy and Veneto, took different
approaches to the community spread of the epidemic.
· Both mandated social distancing, but only Veneto undertook
massive contact tracing and testing early on.
· Lombardy is now tragically overrun with the disease, while
Veneto has managed to mostly contain the epidemic
· The model worked with limited parameters and data
· Imperial College model uses numbers from Wuhan, China,
along with some early data from Italy
· Many of these data are not yet settled, and many questions
remain
· What’s the attack rate—the number of people who get infected
within an exposed group, like a household?
· Do people who recover have immunity?
· How widespread are asymptomatic cases, and how infectious
are they?
· Are there super-spreaders?
· What are the false positive and false negative rates of our
tests?
· Etc.

Lecture #1
· Some predictions are so vast and overwhelming that they feel
impossible to make
· Prediction is composed of different components and we must
pull apart different parts of prediction to make a forecast about
the future
· Anticipation = things that are short-term, relatively simple,
and closely connected to our sensory and motor skills (eg. seeing
a visual pattern)
· Prospection = long-term, abstract prediction . . . about
long-term events that lack a sensory or motor component (more
theoretical things that lack clear patterns). EG. when will the
second wave of COVID-19 occur?
· Anticipation is the process of forming an expectation
· Expectation is very sense based, what we generate from
anticipation and prospection. Anticipation and prospection are
the processes and expectation is the result
· There are many brain regions involved in making predictions
and different brain regions light up when making predictions.
Lots of involvement from prefrontal cortex
· Prediction is not located in one part of your brain, it involves
many areas of your brain
· The parts of the brain involved in prediction are also used in
other tasks
· Blurred line . . . all parts work together, there is not just one
part of the brain that handles prediction
· Our brains have evolved to handle some predictions very well
(linear predictions) and some predictions pretty poorly
(exponential predictions)
· Bad at handling exponential predictions because we haven’t
been exposed to exponential changes as much . . . we are biased
toward linear patterns and predictions, we simplify things down
and get fixated on one specific aspect or component of our
prediction (we create certainty where certainty doesn’t exist)
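The linear bias described above can be illustrated with a toy comparison (all numbers invented): fit a straight line to the first week of an exponential curve and project it forward.

```python
# Illustrative sketch of linear bias: intuition extrapolates early growth
# as a straight line, drastically underestimating an exponential process.
# The 3-day doubling time and starting value of 10 are invented.
DOUBLING_TIME = 3  # days

def exponential(day, initial=10):
    return initial * 2 ** (day / DOUBLING_TIME)

# "Linear thinking": extend the day-0-to-day-7 trend out to day 28.
slope = (exponential(7) - exponential(0)) / 7
linear_guess = exponential(0) + slope * 28
actual = exponential(28)
print(f"Linear guess for day 28: {linear_guess:.0f}")
print(f"Actual value on day 28:  {actual:.0f}")
```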
· When we see an event over and over again, our neural
connections get stronger and stronger (if you have experienced
hurricanes before and have always gotten by without
evacuating, you will tend not to evacuate when faced with a
hurricane warning)
· neural connections that make up the memory of an event make
you think that this future scenario will follow past scenarios
· Our brains are good at handling short-term predictions because
evolutionarily, we have had hundreds of thousands of years of
practice making short-term predictions/looking at patterns
· Long-term predictions involve looking at situations that will
break the pattern/deviate from the linear pattern
Lecture #2
· Prediction can inform an individual’s future behaviour,
prediction can also influence at the organizational level
· Prediction is useful because it helps us get a picture of what is
occurring/it informs us
· Prediction reduces anxiety and gives us clarity and relief from
fear, helping move things from the general realm of anxiety to
specific fear/focused anxiety.
· Prediction allows us to do stuff in the future, and it also
reduces cognitive load by reducing stress
· Risk = probability x consequence
· Probability = how likely it is that something occurs
· Consequence = the impact of that event if it occurs
· Therefore, predicting risk requires you to predict the elements
of risk
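The lecture’s formula can be sketched in a couple of lines. The hazards and numbers below are invented purely to show how a rare, severe event can out-rank a common, mild one:

```python
# Minimal sketch of risk = probability x consequence (invented numbers).
def risk(probability, consequence):
    """Annual probability of the event times its impact if it occurs."""
    return probability * consequence

flood = risk(probability=0.01, consequence=1_000_000)  # rare but severe
outage = risk(probability=0.5, consequence=10_000)     # common but mild
print(f"Flood risk: {flood:,.0f}  Outage risk: {outage:,.0f}")
```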
· 30% Chance of Rain (misunderstanding of reference classes)
· People often misunderstand the reference class a prediction is
about
· A reference class is the event that the prediction refers to
· When people see weather forecasts, they frequently
misinterpret the forecast statement and apply the probability to
all different classes
· The theory states that people misunderstand predictions because
they misunderstand the reference class the prediction refers to
· Predictions are powerful because they simplify things. We feel
like we have control over our anxieties
· Imaginaries are complex expectations about the future, often
shared by many people, regarding norms, values, institutions,
and possibilities
· Pandemics are controllable and predictable vs pandemics are
divine punishment
· Imaginaries are composed by several sub-predictions and sub-
beliefs, so imaginaries depend on assumptions and beliefs about
what will occur
· They are essentially meta-predictions (a prediction about lots
of other predictions)
· Imaginaries are influenced by our group-grid status and
influence what we think is possible, what we worry about and
what we don’t, and how much control we think we have
· Imaginaries tend to be shared by some people (people working
in public health) but not others (people who don’t think the
virus is a big deal)
· Google’s prediction algorithm holds an imaginary different
from the individual “googler’s” imaginary - Google predicts the
individual is ill, but the individual might just be curious
· Optional Reading: Pandemic Prophecies - How we construct
imaginaries
· Optional Reading: Algorithms and Google Flu Trends -
Creators of Google Flu Trends have diff. imaginaries than users,
which makes it break
· Optional Reading: Simulation and expertise in bushfire
prediction - Experts have certain ways of building imaginaries
· Thinking about the future involves more brain regions and
complexity than the present since it involves simulation
· But it’s not all that different
· Prediction involves simplification
· Prospection is different from anticipation
· Dimensions of prediction
· We don’t know much about how correction happens or how the
different regions of the brain work together
· Cultural cognition and other biases shape our reference
classes, imaginaries and other parts of our predictions
· E.g. Trump administration: Only x number of people will die
from COVID-19, but we’re framing it as y number of people
would’ve died
· Imaginaries are from sociology
Module Three - Fear, Anxiety, and All Things Scary

Lecture #1

Fear:
· an emotional or affective response to a situation
· Fear is a reaction, rather than a thing itself
· Fear is an emotion that is caused by a perceived threat
· Often connected to behavioural responses like fight, flight,
freeze response
· Fear, like other cognitive processes, has an evolutionary
purpose, it helps us stay alive
· Fear is temporary, happens just when the fear trigger is
activated
· Fear triggers biophysical reactions
· Fear is not the same as risk . . . our biases and bounded
rationality will affect what fears are more salient or significant
to us
· Perceived risk may or may not line up with objective risk
What Fears Exist With Respect to C-19?
What Distinguishes Fear?
· Compared to fear, a phobia is seen as an “irrational” or
“overactive” fear response . . . a fear response that doesn’t
make sense and exists for no beneficial or apparent reason
· Anxiety is more generalized feeling of dread . . . longer
lasting than fear (fear response is not just triggered when you
are in a situation but also any time you think of that situation),
future-oriented, more diffuse threat (not a focused threat but a
general threat -- dogs vs any animal), creates excessive caution
and avoidance of situation (whereas fear often leads to coping
strategies)
Social construction
· Some things are decided by society: we as a society decide
what is considered a fear and what is deemed a phobia, based on
what we see as rational and what fear we see as irrational.
· Fear vs phobia isn’t preordained, the difference between the
two is decided by societal perceptions
· We as a society decide what is a rational fear vs an irrational
phobia
From Anxiety to Fear
· Something can switch from being a fear to an anxiety or vice
versa (anxiety to fear)
· Sometimes this is done explicitly
· Capitalizing on an anxiety to cause a desired change . . . take
an anxiety and turn it into a specific fear that helps achieve a
purpose (blaming a disaster or incident on an ethnic group will
cause fear of that ethnic group)
· Scapegoating . . . turn a big anxiety and manipulate it into a
fear about a specific ethnic group
· Can also take a fear and manipulate it into a big anxiety . . .
· EG. environmental groups may take a specific fear (fear of a
terrorist breaking into a nuclear plant and using the plant for
negative means) and try to turn it into a general anxiety about
anything nuclear (so now we want to avoid nuclear at all costs)
· Innate vs Learned Fear (a spectrum, not binary)
· Some fears are relatively common across all humans and
society (eg. disasters, tornadoes, earthquakes) . . . these are
considered “innate fears” because most people possess these
fears
· Learned fears are fears that have developed through particular
experiences. Eg. a traumatizing childhood experience results in
development of a fear
· A personal experience that triggers a fear in a particular
individual
· Little Albert Experiment: exposed an infant to an array of
different stimuli to gauge his reaction, then linked a stimulus
to a frightening noise . . . a white rat paired with scary sounds.
This is an example of a learned fear.
· Fear of dying is an innate fear, but the degree to which we fear
particular causes of sickness or death varies.
Fears are Historically Situated
· Fears have changed throughout history
· Some fears are no longer relevant (or as relevant) because we
have found solutions to them . . . eg. polio or smallpox
· New stimuli can create new manifestations of fear (eg. fear of
flying was created due to creation of airplane)
· Some groups try to intentionally create fears to advance their
purposes (government tries to amplify fear of terrorism because
that will allow them to execute certain goals . . . more funding
for defence, airport security, etc)
Normalization and Acclimatization (researchers use these terms
synonymously)
· Fears can also change at the individual level
· Normalization occurs when we become less fearful of certain
stimuli. Through prolonged exposure, you subconsciously
become trained to not fear that stimuli
· Often caused by prolonged exposure
· Results in less acute stress reactions
· Doesn’t mean the stress disappears, just that peak stress
subsides
Measuring fear
· There are two types of definitions . . . conceptual and
operational
· Operational definitions are linked to measurement. Define
something in a way that allows you to MEASURE it. Define
fear in a measurable way.
· Conceptual definitions are just theory/words
Fear Appeals
· We can use fear to motivate change
Protection Motivation Theory
· Evaluate threat (how much of a risk is this), evaluate coping
mechanisms (how would I cope with this), and evaluate rewards
of coping with threat and costs of not coping with threat to
determine to what degree we will allow the fear to guide our
decision
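Protection Motivation Theory is usually stated qualitatively; the toy scoring rule below is only a hedged sketch of the appraisal logic above, with invented 0-1 scales and no claim to being the published psychometric model:

```python
# Toy sketch of Protection Motivation Theory's two appraisals (the scales
# and arithmetic are illustrative assumptions, not the actual model).
def protection_motivation(severity, vulnerability,
                          response_efficacy, self_efficacy, response_cost):
    """All inputs in [0, 1]; a higher score = stronger motivation to act."""
    threat_appraisal = severity * vulnerability
    coping_appraisal = response_efficacy * self_efficacy - response_cost
    return threat_appraisal + coping_appraisal

# Severe, likely threat with a workable, cheap response -> high motivation.
high = protection_motivation(0.9, 0.8, 0.9, 0.7, 0.2)
# Same threat, but coping feels ineffective and costly -> low motivation.
low = protection_motivation(0.9, 0.8, 0.2, 0.3, 0.6)
print(f"{high:.2f} vs {low:.2f}")
```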
Module Four - Decision-making Under Pressure

Lecture #1
· Use first person statements (in this reflection, I will show that
this theory caused this phenomenon)
· Clear use of theory statement
· Have you thought of other perspectives? Have you anticipated
objections?
· Give a clear definition of a theory and choose one aspect of it
as a whole
· Choose an aspect of RPDM (the comparison to past experience
types, how this is contrasted with making explicit choices)
· Use shorter quotations (single out a few words rather than
long-winded quotes)
· Use quotes from the theory
· Change and adapt as things go
· Don’t commit to your initial thesis or plan
· “I started with automation bias, but I actually wanted to talk
about confirmation bias”
· The granularity of the aspect of COVID-19 you use in your
reflections should be like “mask-wearing,” “vaccine hesitancy,”
etc.
· Use news sources or even /r/coronavirus on Reddit for
inspiration
· Acute stress vs chronic stress
· Acute stress
· Flight/fight/freeze
· Evolutionary basis
· If we didn’t take one of these actions, we were less likely to
survive a sudden threat
· Fight - stop the threat
· Flight - run away
· Freeze - don’t be seen
· Freeze is a newer element/newer response. We haven’t studied
it as much because it does not align with our understanding of
threat response, does not adhere to our understanding of
physical responses to threats
· Amygdala
· Part of the limbic system
· Emotions
· Learning
· Memory
· Feeds into the hypothalamus
· Hypothalamus
· Controls autonomic nervous system
· Unconscious activities
· Breathing
· Heart rate
· BP
· Blood vessels
· 3 branches of ANS
· Enteric nervous system
· Eating, digestion
· Sympathetic nervous system
· Quick mobilization
· Heart races, eyes dilate
· Adrenaline
· [See slides]
· “Speed up”
· Parasympathetic nervous system
· Rest, digestion
· “Slow down”
· Shock
· Two meanings: Medical shock, and Acute Stress Reaction
(ASR) “shock”
· Medical shock
· Insufficient blood flow to the body
· ASR
· Increased heart rate
· Sweating
· [See slides]
· Everyone feels ASR
· Even emergency responders
· Train emergency responders to control their stress responses
and reduce severity of stress and ASR symptoms via repeated
exposure, checklists and procedures (guidance/having a
plan/and slowing down reduces stress)
· How do people behave under stress?
· How do we find out?
· [See slides]
· Ask people
· Unreliable, biased
· Planning bias, etc.
· “Oh, I’ll be calm, then I’ll do a kung-fu kick and save
everyone”
· Modelling
· Mathematical computer simulations
· But model might not be accurate to real life
· Drills/simulations
· High fidelity simulation
· Desensitization achieved with drills
· Study past disasters
· Excessive drills might desensitize people to a real emergency
· Hotel evac experiment
· Volunteers chosen for fake evac, around 40-50
· Volunteers drilled evac
· Then taken to a hotel as a “reward” which was the actual
experiment
· Simulated smoke
· Very early morning
· What did we learn from these experiments?
· Not everyone will evac
· If someone personally tells you to evac, your chance increases
· E.g. police or fire knock on your door
· Factors in evac rate
· Perceived threat
· Frequency of false alarms
· Difficulty of evac
· Responsibility for dependents
· The more responsibility, the less likely
· Specificity
· More info = perceived more severe
· More general is less effective
· Perceived accuracy of past warnings
· People don’t drop everything and evac
· They gather belongings, etc.
· Sudden evacuations
· People will act irrationally under stress
· 2016 Fort Mac fires
· Take wrong evac routes
· Take inappropriate/bizarre items
· No clothes, but a bear-head mantelpiece
· Half a blender and a watermelon
· Garbage bin
· Bag of potatoes
· Read the reading guide this module, the theories are stated
within
· Moodle forum is open
· Slides include some theories not covered this lecture
· Goal seduction
· Salience
· Reliability of tests . . . if the actual emergency differs from
the tests (eg. tests don’t have smoke, extreme heat, etc.), the
test might not give an accurate estimation of how people react.
· Goal seduction . . . get so focused on one goal that you
forget/lose sight of other information
· Can be explicit = people take on more risk because they
believe they have to take risks in order to achieve the goal
· Subconscious = sympathetic response and adrenaline causes
you to forget obvious information (get a call to fight a fire and
you forget to turn the stove off)
· Making your reading reflections go from A to A+
· Clear use of the theories
· Refuting rebuttals
· Succinctness
· Every sentence should have a purpose and meaning
· Some creativity
· Salience
· How pronounced something is
· A cyclist is more conspicuous wearing hi-vis than all black
· Inattention blindness
· Missing something in plain sight
· E.g. a driver sees an obstacle but hits it anyway; a nurse
retrieves a neuromuscular blocking agent instead of an
antibiotic, reads the label, and administers it anyway, killing
the patient
· “Reading” vs reading
· System 1 vs System 2 thinking
· We need to filter noise from relevant info
· Conspicuousness vs salience
· Conspicuousness - “How much something jumps out”
· Conspicuousness is diminished if we are fatigued, overloaded
with tasks, have reduced capacity (alcohol, age)
· Sensory conspicuousness (brightness, shape, is the font big,
does it stand out to our senses)
· Cognitively conspicuous (cocktail party effect)
· Purposeful attention (you are looking for something in
particular so you see that thing more easily)
· Salience - “How important it is”, is the information judged as
being relevant
Module Five - Expertise & Thinking as an Institution

Lecture #1
· Theories of expertise development
· Credentials and training (formal courses, apprenticeships)
· Experience (number of hours, number of experiences, range of
experiences)
· Socialization (develop expertise by nature of who you spend
time with, who you speak with . . . Learning how to think,
debate, exchange ideas)
· 10,000 Hours (practice makes an expert . . . experts tend to
demonstrate a lot of experience doing something, specifically
10,000 hours of experience. However, the raw number of hours
does not provide a full picture of expertise . . . expertise
depends on mentorship, if you are practicing in a meaningful
way, etc.)
· Linear Development (Dreyfus) . . . Expertise consists of a
series of stages (1) Novice, (2) Advanced Beginner, (3)
Competence, (4) Proficient, and (5) Expert.
· Experience and Socialization (Collins and Evans) . . .
spending time around OTHER experts is important to becoming
an expert because it gives you the tacit or implicit knowledge
that cannot be explained explicitly. EG. nurses/doctors can’t
become experts in bedside manner by reading a book or paper,
you have to experience this knowledge or witness other people
demonstrating it. Learning from example. For Collins and
Evans, you only gain tacit knowledge by witnessing and
spending time around other experts and observing
(socialization, training, and experience)
· How much should we trust the different recommendations
experts have and what are some problems associated with
listening to experts?
· Only respecting/listening to certain kinds of experts leads to a
narrowing of knowledge
· Theory of Lay Expertise . . . Many researchers argue that we
don’t appreciate the range of experts that exist, we often
dismiss people because they don’t have credentials/formal
training, don’t “look like” traditional experts, or are perceived
to be too close to situation
· E.g., during the AIDS crisis, professional researchers were viewed as experts, but the different forms of expertise of patients, family members, paramedics, etc. were ignored.
· Expertise may not be the same but it is complementary
· For too long, we thought the experts were the people wearing the lab coats; however, "lay communities" have expertise that is different from traditional expertise but still valuable.
· Experts can tell us how to do something, but they can’t tell us
what to do.
· Theory: Honest Brokers (called honest brokers because they provide you with guidance without trying to sway you to one side . . . first tell me what your goal is, then I will tell you the best way to achieve it. Honest brokers try to give you as many options as possible based on your goal, unlike issue advocates, who give you one option and really push you to choose it).
· When selecting an expert to rely on, you have to identify an expert who will be able to act as an honest broker and fulfill all those roles. When listening to experts on the news, you have to identify which role they assume, because that will help you become a more astute citizen/listener.
Lecture #2
· Knowledge Generation . . . what info/knowledge do we need
and how are we going to acquire this info
· Knowledge Validation . . . once we have produced or acquired
this knowledge, we have to decide whether it is trustworthy (ask
questions regarding veracity, credibility, salience, legibility)
· Knowledge Circulation . . . After the knowledge has been
deemed reliable, it needs to be shared with individuals who
NEED the info
· Application . . . what do we do with this information, how do
we apply it?
· James Scott . . . three components of “governmental regimes”
or “bureaucratic logics”
· Simplification = removing elements that are seen as excessive, irrelevant, or superfluous to what you are trying to achieve
· Legibility = simplifying things to make them easier to count/read/fit into your databases or knowledge systems
· Manipulability = making it easier to manipulate your landscape
· By adhering to simplification, legibility, and manipulability, you can unintentionally create a harmful monoculture
· We use S, L, and M in all four phases of Knowledge Systems
because they help us achieve our knowledge systems goals.
· COMPSTAT and S, L, M . . . simplified crime into a two-dimensional system; the pressure to decrease crime turned things into a monoculture
Module Six - PTSD & Mental Health