A Psychological Approach to How Trust
is Built and Lost in the Context of Risk
J. Richard Eiser
University of Sheffield, UK
Mathew White
Friedrich-Schiller Universität, Jena, Germany
Structure of this talk
 How risk depends on human decisions.
 Decisions and their consequences.
 Trust as a social judgement about decision-makers
and information sources.
 ‘Marginal trust’ – changes in trust as a consequence
of specific events.
 Contributory factors – negativity bias, cognitive consistency, diagnosticity, decision types.
 Conclusions.
Risk depends on human decisions
Risk involves uncertainty about the likelihood of events and the value of their consequences.
Risk arises from interactions between
people and their social and physical
environment.
Risk depends not only on physical
conditions but also on human actions and
decisions (e.g. Chernobyl, Hurricane
Katrina, Kashmir earthquake).
Risks are social
Poor decisions exacerbate risk for ourselves
and others.
We often rely on others to manage and
alleviate risks on our behalf.
We often rely on others to inform us about
risks and advise us what to do.
Inequality within and between societies
increases vulnerability and limits access to
help and information.
Hence…
Understanding risk involves understanding
not only physical conditions but also how
people make decisions.
Risk perception implies judgements about
the quality of our own and others’
decisions.
Experts should make higher quality
decisions and/or give higher quality
information (or else they’re not experts).
What do we mean by ‘quality’?
Within the context of risk management:
Ability to discriminate danger and safety.
Use of appropriate criterion for balancing
different costs and benefits.
Within the context of risk communication:
These, plus…
Avoidance of bias due to personal interest.
Use of appropriate criterion for warning about
danger (neither too alarmist nor complacent).
Decisions and their consequences
In an uncertain environment, we need to
differentiate between safety and danger.
Some situations are clearly safe, others are
clearly dangerous.
What happens in between?
An approach derived from the psychology
of perception: Signal Detection Theory.
Discriminating danger
[Figure: overlapping "Safety" and "Danger" distributions, with a "cautious" and a "risky" decision criterion marked and the ambiguous region between them labelled "?".]
Decision-outcome combinations
When deciding whether something is safe
or dangerous, there are four possibilities:
Dangerous – treat as dangerous (“Hit”).
Dangerous – treat as safe (“Miss”).
Safe – treat as dangerous (“False alarm”).
Safe – treat as safe (“Correct all clear”).
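Not part of the talk, but as a minimal sketch of the standard equal-variance Gaussian Signal Detection Theory measures behind these four outcomes (Python, scipy assumed), discrimination ability (d′) and criterion placement (c) can be computed from the outcome counts:

from scipy.stats import norm

def sdt_measures(hits, misses, false_alarms, all_clears):
    """Discrimination ability (d') and criterion placement (c)
    from counts of the four decision-outcome combinations."""
    h = hits / (hits + misses)                      # P("dangerous" | really dangerous)
    f = false_alarms / (false_alarms + all_clears)  # P("dangerous" | really safe)
    d_prime = norm.ppf(h) - norm.ppf(f)             # higher = better discrimination
    c = -0.5 * (norm.ppf(h) + norm.ppf(f))          # c < 0: cautious (warns readily,
                                                    # more false alarms); c > 0: risky
                                                    # (reluctant to warn, more misses)
    return d_prime, c

# Illustrative counts: 8 hits, 2 misses, 3 false alarms, 7 correct all-clears
print(sdt_measures(8, 2, 3, 7))   # d' ≈ 1.37, c ≈ -0.16 (slightly cautious)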
Consequences
These different combinations can have
different costs and benefits.
Misses can often appear more costly than
false alarms.
An excessively precautionary approach can
deprive users of benefits of a technology,
and/or expose them to alternative, perhaps
greater, risks (e.g. using cars after a train
crash).
Trust as a social judgement
Trust in experts implies a positive judgement
of the quality of their decisions and/or
information.
Trust can depend on implicit estimates of others' competence, partiality and honesty.
If ‘experts’ are seen as having a vested
interest, this may undermine trust.
Decision-makers who share one’s interests
and values are more trusted.
Example 1: Mobile Phones
Respondents rated different sources of
information about possible health risks of
mobile phones in terms of:
Trust.
Knowledge.
Warning criterion (how much evidence source
would need before warning).
Industry seen as knowledgeable, but
reluctant to warn and therefore distrusted.
[Figure: trust (0–5) plotted against perceived knowledge (0–5) for six information sources: environmentalists, media, medics, government, scientists, industry.]
[Figure: trust (0–5) plotted against warning criterion (0–6) for the same six sources.]
Example 2: Contaminated land.
 Local residents rated different sources of
information about possible health risks of
contaminated land in terms of:
Trust.
Expertise at judging how safe or dangerous.
Bias in decision-making/communication.
Openness.
Having residents’ own interests at heart.
 Perceived expertise does not guarantee trust
without impartiality, openness and shared values.
How much would you trust what each of the
following might tell you about risks from
contaminated land?
If there was contaminated land in your
neighbourhood, how able do you think each of the
following would be to judge how safe or dangerous
it was?
Conclusions of surveys
Baseline levels of trust only partly reflect
perceived expertise.
Perceived self-interest, openness and shared
values are also important.
Need for an experimental approach to
unconfound these factors.
Need to examine how specific events may
influence marginal trust.
Marginal trust
Many policy makers know public trust is low
But how can they build it & avoid losing it?
Four psychological insights:
1) Negativity bias (prior)
2) Desire for cognitive consistency
3) Information diagnosticity
4) Decision outcome types (Miss, False Alarms etc.)
1) Negativity bias
 "Bad is stronger than good"
(Baumeister et al, 2001; Rozin & Royzman, 2001)
 Info. valence → effect on trust:
Positive → small increase
Negative → large decrease
 Trust = easier to lose than gain (trust asymmetry)
 “Trust comes on foot and leaves on horseback”
Slovic (1993)
Mean effect on trust by valence: negative events −4.73; positive events +3.07.
F(1,102) = 82.64, p < .001, partial η² = .45.
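As a consistency check (my addition, using the standard conversion from an F ratio and its degrees of freedom to partial eta squared):

\eta_p^2 = \frac{F \cdot df_1}{F \cdot df_1 + df_2} = \frac{82.64 \times 1}{82.64 \times 1 + 102} \approx .45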
[Figure: change in trust (−7, "terrible news", to +7) for 45 hypothetical "events" in a nuclear power plant, ranging from trust-increasing (e.g. keeping good records, on-site government inspector) to trust-decreasing (e.g. plant covered up problem, records were falsified).]
2) Desire for cognitive consistency
 People want stability in their belief structures
 We tend to trust good news about things/from
people we like but not for things/people we don’t
(Hovland, Janis & Kelley, 1953)
 People don’t like nuclear power
 So greater effect of bad news may be due to a
confirmatory bias
 What about a less negatively viewed industry?
Negativity or cognitive consistency?
Sample = 68 students
1) Attitudes (-3 to +3): Nuc. = -.47; Phar. = +.50, p < 0.01
2) DV: trust change (Slovic, 1993; Cvetkovich et al., 2002)
"How would your level of trust in the management of a particular nuclear power (pharmaceutical) plant be affected by the following information?"
("Much less trust" = −3 to "Much more trust" = +3)
Negativity or cognitive consistency?
• 12 events (6 positive & 6 negative) either nuclear
power or pharmaceuticals
[Figure: absolute effect on trust (0–3) of negative vs positive events, for the nuclear power vs pharmaceutical industries. F(1,66) = 8.16, p < .01, partial η² = .11.]
Negativity or cognitive consistency?
So ‘Trust Asymmetry’ isn’t ubiquitous
 Replicated in other domains (e.g. additives)
Good news for already trusted sources but
doesn’t help distrusted sources build trust
Fortunately there is more to the story
Slovic (1993)
[Figure: the same 45-event plot as before.]
Look at the variance!
Some good news is very good for trust.
Some bad news is not so bad for trust.
Unpacking why might help us build trust.
3) Information diagnosticity
 We make the world simpler by categorising others
E.g. Friendly/Unfriendly; Honest/Dishonest etc.
 The info. we use varies in terms of diagnosticity
i.e. how good is it at differentiating people
 One important aspect = information specificity
i.e. relate to a single event or many events
 Jo took £10 from the till … a) last Wednesday, or b) every day last week.
3) Information diagnosticity
 Slovic info. differed in terms of specificity:
A) High specificity: Events
“A plant official is found to have lied about a safety matter.”
B) Low specificity: Policies
“There is careful selection and training of plant employees.”
 Trust should be more affected by policy (low
specificity) than event (high specificity) info.
 Re-analysed data in terms of events vs policies
Re-analysis of Slovic (1993)
[Figure: two panels showing change in trust (−8 to +8) for high-specificity information (events) and low-specificity information (policies), split into positive and negative items.]
Re-analysis of Slovic (1993) + new study
[Figure: absolute change in trust (0–3) for negative vs positive events and policies, in the re-analysis and the new study.]
Reanalysis: Valence F(1,102) = 82.64, p < .001; Specificity F(1,102) = 3.89, p = .051; Valence × Specificity F(1,102) = 118.17, p < .001.
New study: Valence F(1,35) = 7.61, p < .01; Specificity F(1,35) = 12.19, p < .001; Valence × Specificity F(1,35) = 13.26, p < .001.
3) Information diagnosticity
Trust asymmetry exists for events (high specificity) but not for policies (low specificity):
a) Bad events have large negative effects on trust.
b) Good events have small positive effects.
c) Good and bad policies have similar, large effects.
Conclusion: promote positive policies, not events!
4) Event types
 Our final psychological insight again suggests it's a little more complicated.

                          Thought reactor operations were:
                          "Dangerous"         "Safe"
Reactor      Dangerous    A) HIT              B) MISS
really was   Safe         C) FALSE ALARM      D) ALL CLEAR

 Which engineer would you trust/distrust most?
Risk communication
 What if you learned that some of them had tried to cover up their mistakes?
Predictions
H1) Discrimination ability: Correct > Incorrect
Hits & All Clears > False Alarms & Misses
H2) Response bias: Caution > Risk
Hits & False Alarms > All Clears & Misses
(benefits of a Hit loom larger than of an All Clear; costs of a Miss loom larger than of a False Alarm)
H3) Communication bias: Transparency > Reticence
Open > Closed
Predictions
[Figure: predicted trust change (−3 to +3) for each outcome (Hit, Miss, False Alarm, All Clear), crossing correct/incorrect decisions, "dangerous"/"safe" judgements, and open vs closed communication.]
4) Event types
 189 students; three different scenarios:
1) Nuclear power: tank corrosion.
2) Vaccine: holiday.
3) Computer virus: in the university library.
Between-participants design per scenario:
2 (discrimination ability: correct/incorrect)
× 2 (response bias: "safe"/"dangerous")
× 2 (communication bias: "open"/"closed")
DV = trust change.
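As an illustrative sketch (mine, not the authors' materials), the eight between-participants cells and the Signal Detection outcome implied by each ability × bias combination can be enumerated as follows:

import itertools

def outcome_type(correct: bool, judgement: str) -> str:
    # "Dangerous" judgements are Hits when right and False Alarms when wrong;
    # "safe" judgements are All Clears when right and Misses when wrong.
    if judgement == "dangerous":
        return "Hit" if correct else "False alarm"
    return "All clear" if correct else "Miss"

# The eight cells of the 2 x 2 x 2 between-participants design
for correct, judgement, comms in itertools.product(
        (True, False), ("dangerous", "safe"), ("open", "closed")):
    label = "correct" if correct else "incorrect"
    print(f"{label:9} | judged {judgement:9} | {comms:6} -> "
          f"{outcome_type(correct, judgement)}")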
Nuclear power
[Figure: observed trust change (−3 to +3) by outcome (Hit, Miss, False Alarm, All Clear) and communication (open vs closed) for the nuclear power scenario.]
Travel vaccines
[Figure: the same plot for the travel vaccines scenario.]
Computer viruses
[Figure: the same plot for the computer viruses scenario.]
Summary
Correct decisions (Hits & All Clears): as predicted.
"False Alarm effect": increases in trust.
Closed Misses: big falls in trust!
Trust change generalises from exemplar to category (specific doctor to doctors in general).
But: 1) Single event (cry-wolf effect?)
2) Might Misses sometimes be preferred (e.g. legal/rights concerns)?
Suicide bombers (with C. Cohrs)
Increases the costs of a False Alarm (shooting an innocent person).
Trust in an armed police unit following an incident: busy train station, person willing to die for the cause.
Person was either a) a real armed terrorist, or b) someone with a mental illness.
Almost identical to the Miami airport incident last week, where marshals shot Rigoberto Alpizar, who had bipolar disorder.
Piloting: 50% shoot, 50% don't shoot.
Suicide bombers
Only "open" communication.
DV = 3-item trust scale (α = .87).
 N = 172
Moderation analysis: Right Wing Authoritarianism (14-item scale related to prejudice and civil rights).
H1: High RWA: usual pattern.
H2: Low RWA: reverse pattern (sensitive to costs of a False Alarm).
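For reference, a minimal sketch (my addition; NumPy assumed, example ratings made up) of how Cronbach's α, the internal-consistency figure reported for the 3-item trust scale, is computed:

import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: (n_respondents, n_items) matrix of scale ratings."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances.sum() / total_variance)

# Hypothetical ratings for a 3-item trust scale, five respondents
ratings = np.array([[5, 4, 5], [2, 2, 3], [4, 4, 4], [1, 2, 1], [3, 3, 4]])
print(round(cronbach_alpha(ratings), 2))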
Main analysis
Ability: F(1, 171) = 37.14***
Bias: F(1, 171) = 7.07**
[Figure: trust in the police unit (−3 to +3) by response bias ("Shoot" vs "Don't shoot") and decision correctness, with Hit, Miss, False Alarm and All Clear outcomes marked.]
Moderation analysis (Bias × RWA: F(1,171) = 8.79***)
[Figure: trust in the relevant police unit (−3 to +3) by response bias ("Shoot" vs "Don't shoot") and correctness, within three Right Wing Authoritarianism bands: very low 0–2.08 (N = 53), low 2.09–2.75 (N = 64), moderate 2.76–4.42 (N = 54).]
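A rough sketch of how a moderation test of this shape can be run (my addition: simulated data, hypothetical column names, statsmodels OLS; not the authors' analysis):

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 172
df = pd.DataFrame({
    "shoot": rng.integers(0, 2, n),    # response bias: 1 = "shoot", 0 = "don't shoot"
    "correct": rng.integers(0, 2, n),  # discrimination ability: 1 = correct decision
    "rwa": rng.uniform(0, 4.42, n),    # Right Wing Authoritarianism score
})
# Simulated trust ratings in which the bias effect reverses as RWA rises
df["trust"] = (1.5 * df["correct"]
               + (df["rwa"] - 2.2) * (2 * df["shoot"] - 1)
               + rng.normal(0, 1, n))

# The shoot:rwa coefficient tests whether RWA moderates the bias effect
res = smf.ols("trust ~ shoot * rwa + correct", data=df).fit()
print(res.summary())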
Marginal trust conclusions
1) Trust asymmetry does occur (Bad > Good)
2) In part because of congruency effects
3) But events (rather than policies) still suffer
4) Even some negative events (False Alarms) can lead
to increases in trust - but not in all situations and not
for all people!
Marginal trust conclusions
 If you want to lose trust:
Try to cover up Misses (esp. in a high risk context)
 If you want to build trust:
a) Focus on communicating positive policies
b) If you have to talk about events be open
c) And be sensitive to the public's perceptions of the costs and benefits of correct/incorrect decisions.
General Conclusions (1)
Risk depends on human decisions.
Perception of risk involves evaluating
decisions.
Decisions can be evaluated in terms of:
Competence (discrimination ability)
Partiality (response bias)
Communications can also be evaluated in terms of:
Openness
General Conclusions (2)
Trust is an outcome of such evaluations,
plus liking for the decision-maker.
Changes in trust depend on how events are
interpreted.
Prior attitudes can guide interpretations.
Bad news can have more impact than good.
But openness/willingness to admit mistakes
may increase trust.