We aren’t surprised by facial recognition at security checkpoints. But how do you feel about face-scanning toilet roll dispensers? What if they don’t just find criminals but try to detect “criminality”? Laws and policies almost always lag technology, so data scientists and machine learning experts are among the first line of ethical defense. The argument of this talk is that to be ethical, any system that classifies human beings has to consider the goals of the people affected by it, not just the goals of its builders. This is not particularly convenient, but there are concrete ways to put goal-oriented design into practice. Doing so puts us in a better position to behave ethically and to address problems of power and the reproduction of inequality.
The Ethics of Everybody Else
1. THE ETHICS OF EVERYBODY ELSE
TYLER SCHNOEBELEN, INTEGRATE.AI
2. SO LET’S KICK SOME SHIT
MY DAD CALLS THESE SHIT KICKERS
3. I REALLY REALLY DON’T LIKE “SHIT”
4. SANG THE NIGHTMARE TOILET ROLL DISPENSER OF MY JAPANESE HOST FAMILY
“It’s a small world after all…”
5. – REPRESENTATIVE STEVE KING (R-MY HOME STATE)
“We can't restore our civilization with somebody
else's babies.”
6. OTHERING
IN-GROUPS GET TO BE HETEROGENEOUS INDIVIDUALS; FOR EVERYONE ELSE, THERE’S OTHERING
7. THE CORE CLAIM
Data scientists and AI practitioners must consider
the goals of the people affected by the systems
they design and build
8. BASIC OUTLINE
• 3 kinds of problems
• An easy unethical project
• Training data, ethical frameworks, and categories
• What you think of people
• Practical recommendations
• Technology doesn’t just happen
9. A TYPOLOGY OF PROBLEMS (RITTEL AND WEBBER, 1973)
• Simple problems: Identify stakeholders, articulate
their goals, build a plan, execute
• Complex problems: Decompose into multiple simple
problems
• But some problems are…
10.
11. A TYPOLOGY OF PROBLEMS (RITTEL AND WEBBER, 1973)
• Simple problems: Identify stakeholders, articulate their
goals, build a plan, execute
• Complex problems: Decompose into multiple simple
problems
• Wicked problems: You can articulate goals but they
are fundamentally in conflict. There is no definitive
solution.
12. AN EASY UNETHICAL PROJECT
DETECT “CRIMINALITY” (BUILDER GOAL ~ PUBLIC SAFETY)
13. All four classifiers perform consistently well and
produce evidence for the validity of automated
face-induced inference on criminality… Also, we
find some discriminating structural features for
predicting criminality, such as lip curvature, eye
inner corner distance, and the so-called nose-
mouth angle.
14. GET YOUR FACE SCANNED FOR 70 CM OF TOILET PAPER
15. PERCENTAGE OF MODELS WITH NO FALSE POSITIVES
~0%
16. OKAY, FACIAL RECOGNITION IS DOING WELL
FACT CHECK
17. ONE YEAR AFTER THOSE STATS
ALTHOUGH YOU MAY REMEMBER…
18. PLACE YOUR TRUST IN BIAS
https://openpolicing.stanford.edu/findings/ (PIERSON ET AL, 2017)
19. IN OAKLAND, 60% OF STOPS WERE OF AFRICAN AMERICANS, WHO MAKE UP 28% OF THE CITY’S POPULATION
(EBERHARDT ET AL, 2016)
20. THE INTERACTIONS THEMSELVES HAVE DIFFERENT QUALITIES
LOG-ODDS RATIOS FOR OFFICER SPEECH IN OAKLAND
21. AND VOICES ARE IGNORED
SEE RICKFORD & KING (2016) ON HOW RACHEL JEANTEL’S TESTIMONY WAS DISCOUNTED
23. VIRTUE ETHICS: THE ACTOR’S MORAL CHARACTER AND DISPOSITION
SEE, FOR EXAMPLE, ANNAS 1998
24. DEONTOLOGY: THE DUTIES AND OBLIGATIONS OF THE ACTOR GIVEN THEIR ROLE
SEE, FOR EXAMPLE, KAMM 2008
25. CONSEQUENTIALISM: IT’S THE OUTCOMES OF THE ACTIONS (UTILITARIANISM IS THE MOST FAMOUS VERSION OF THIS: DO THE MOST GOOD FOR THE MOST PEOPLE)
SEE, FOR EXAMPLE, FOOT 1967; TAUREK 1977; PARFIT 1978; THOMSON 1985
26. THE CORE CLAIM
Regardless of your preferred ethical framework,
data scientists and AI practitioners must consider
the goals of the people affected by the systems
they design and build
27. PEOPLE HAVE IMPLICIT BIASES (AND THESE ARE FOUND IN DATA, CALISKAN ET AL 2017)
TRY OUT https://implicit.harvard.edu/implicit/takeatest.html
28. YOUR CATEGORIES ARE WRONG (THEY MAY BE USEFUL)
29. CONSIDER XO ACROSS 14K TWITTER USERS
• A lot more women use xo than men
• 11% of all women
• 2.5% of all men
• But that means that 89% of women aren’t using
it at all.
• People who use xo are three times more likely
to use ttyl (‘talk to you later’)
• The style is more commonly adopted by
women
• But there’s other stuff going on here: age,
job, etc.
• It’s not clear that gender is even the most
important, it’s just that we’re starting with
gender-colored glasses
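The gap above can be restated as an odds ratio, the same statistic behind the log-odds plots for officer speech in Oakland later in the deck. Below is a minimal sketch using only the rates quoted on this slide (11% of women, 2.5% of men); the variable and function names are illustrative and not from the original analysis:

```python
import math

def odds(rate):
    """Convert a usage rate (a proportion) to odds: p / (1 - p)."""
    return rate / (1.0 - rate)

# Rates quoted on the slide for the 14k-user sample.
women_rate = 0.11   # 11% of women use "xo"
men_rate = 0.025    # 2.5% of men use "xo"

# How much higher are women's odds of using "xo" than men's?
odds_ratio = odds(women_rate) / odds(men_rate)      # ~4.82
log_odds_ratio = math.log(odds_ratio)               # ~1.57

print(f"odds ratio:     {odds_ratio:.2f}")
print(f"log-odds ratio: {log_odds_ratio:.2f}")
```

Even with an odds ratio near 5, 89% of women never use xo at all, which is the slide’s point: a strong group-level association says very little about any individual.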
30. PEOPLE ARE NOT JUST THE SUM OF DIFFERENT DEMOGRAPHIC CHARACTERISTICS
INTERSECTIONALITY (CRENSHAW, 1989)
31. DO YOU THINK PEOPLE ARE STATIC?
FOR YOU, ARE THEY INHERENTLY GOOD OR BAD?
MOST RESEARCH SUGGESTS THAT GOODNESS IS CONTEXTUAL
32. THEOLOGY STUDENTS IN A RUSH TO GIVE A TALK DO NOT HELP A STRANGER IN NEED, EVEN WHEN THE TALK THEY ARE HURRYING TO GIVE IS ABOUT THE GOOD SAMARITAN
DARLEY AND BATSON (1973)
33. WE SEEM CONSISTENT BECAUSE WE TEND TO BE IN CONSISTENT SITUATIONS/RELATIONSHIPS TO EACH OTHER
THE STATUS QUO MAINTAINS ITSELF BECAUSE WE TEND TO DO THE THING WE DID BEFORE
FOR SOCIAL THEORY ALONG THESE LINES, SEE BOURDIEU, 1977; GIDDENS, 1984; BUTLER, 1999
34. – JAMES SCOTT (1990)
“Power means not having to act, or more
accurately, the capacity to be more negligent and
casual about any single performance”
Systems are not equally hospitable to all people
They require some people to perform acrobatics and contortions to get by
36. 1) DO A PREMORTEM
HAVE THE TEAM WRITE OUT WHAT WENT WRONG… BEFORE THE PROJECT EVEN BEGINS (KLEIN 2007)
37. 2) LIST PEOPLE AFFECTED
AND YOU NEED TO TALK TO THEM
38. AFFECTED MEANS AFFECTED IN THEIR OWN TERMS
For example, Jehovah’s
Witnesses refuse blood
transfusions
You could choose to ignore
what someone says matters
to them…but when, where,
why, and with whom?
39. 3) DETERMINE IF IT’S A WMD (O’NEIL, 2016)
• Opaque to the people they affect
• Affect important aspects of life
• Education
• Housing
• Health
• Work
• Justice
• Finance/credit
• Can do real damage
40. 4) ASK FOR JUSTIFICATIONS
• Go on Ethical High Alert when you hear:
• Everyone else is doing it and we
have to keep up
• No one else is doing it so we can
lead the pack
• It makes money
• It's legal
• It's inevitable
• Check out Pope & Vasquez (2016) and
https://kspope.com/ethics/ethicalstandards.php
41. 5) NAME THE VALUES ENSHRINED (AND THE ONES AT ODDS)
WHAT *ARE* YOUR VALUES?
42. It’s not a principle until it costs you something.
43. 6) CONSIDER DEFENSIVE ETHICAL POSITIONING
(WORKS BETTER IN INDIA AND THE US THAN IN AUSTRALIA, DESAI & KOUCHAKI 2017)
44.
45. IF YOU’RE IN THIS ROOM, YOU CAN PROBABLY WRITE YOUR OWN TICKET AND HELP OTHERS SEE THAT THEY CAN, TOO
BTW, WHAT DO YOU WANT TO BE DOING?
(PS - WE’RE HIRING)
46. – JACK MA, FOUNDER/EXEC CHAIRMAN OF ALIBABA
“The first technology revolution caused World War I”
47.
48. ~ BREAKDOWN OF THE CONGRESS OF VIENNA
MORE LIKE IMPERIALIST POLITICS COMING HOME TO ROOST
49. A HISTORY PROFESSOR RESPONDS
“It also sort of annoys me
because it ignores politics
and actual decisions. People
decide to go to war.”
“We can decide not to go
to war.”
50. THE CORE CLAIM
Technology does not just happen
Data scientists and AI practitioners must consider
the goals of the people affected by the systems
they design and build
56. SCOURGED FROM HEAVEN AND HELL WILL NOT ACCEPT THEM
And I worry about being among The Uncommitted
57. IF WE TRACE SHIT TO ITS ROOTS WE FIND *SKEI ‘TO CUT, SPLIT, DIVIDE, SEPARATE’
58. WHERE DOES THIS LEAVE US?
• We can’t actually do our jobs or live our lives without
making distinctions
• We can recognize that distinctions have consequences
• We can practice more care and questioning in our
cutting
• But…
59. THERE IS STILL A WORLD OF OTHER PEOPLE OUTSIDE OF THIS ROOM
• We need to take seriously Kate Crawford’s critique
• Most of the people who build technology come from
privileged backgrounds
• This makes it difficult for our imagination and empathy to
extend out to everyone our systems will affect
• The implication is that we need not only to attend to issues of
diversity and representation
• But also to educate communities who will be affected so that they,
too, can voice their goals and values
60. THE EXTENSION OF THE CORE CLAIM
Data scientists and AI practitioners must consider
the goals of the people affected by the systems
they design and build
The practice of ethical design among experts leads
to greater ethical capacity
But ethics are too important to be left only to
experts