Introduction to the ethics of machine learning
Daniel Wilson (daniel.wilson@auckland.ac.nz)
Outline:
1. What is big data ethics / data science ethics / AI ethics?
2. O’Neil’s analysis of Weapons of Math Destruction (WMDs)
3. Case study: COMPAS predictions of risk of recidivism
4. How do we incorporate ethics into systems development?
[Figures: “Classic version” vs. “AI version” comparison]
Some ethical questions in AI and data science:
- Who owns data that is generated by you and/or about you online?
- Why are transparent and explainable systems important?
- When should you get consent to run experiments on users?
- Under what conditions does your right to privacy come into play?
- How do we judge whether an algorithm is fair?
2. O’Neil’s analysis of Weapons of Math Destruction (WMDs)
The Truth About Algorithms | Cathy O'Neil
Audio extracted from a free talk given by O'Neil at the RSA in London, 2017.
https://www.youtube.com/watch?v=heQzqX35c9A
The three characteristics of a Weapon of Math Destruction:
Opacity: “Even if the participant is aware of being modelled, or what the model is used for, is the model opaque, or even invisible?”
Scale: Does the model have the capacity to grow exponentially? Can it scale?
Damage: “Does the model work against the subject’s interest? In short, is it unfair? Does it damage or destroy lives?”
3. Case study: COMPAS predictions of risk of recidivism
Applying the WMD criteria to COMPAS:
Opacity: The COMPAS algorithm is proprietary to Northpointe (now Equivant).
Scale: In Broward County, everyone who is arrested must be assessed using COMPAS.
Damage: The analysis indicates that African-Americans are more likely than Caucasians to receive a false positive for medium/high risk of recidivism. This negatively impacts bail decisions for them.
Verdict: Based on the above characteristics, COMPAS arguably is a WMD.
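The disparity in the Damage row can be made concrete with a small computation. The sketch below, using entirely invented toy records (not the real COMPAS data), shows how a group-wise false positive rate — low-risk people wrongly flagged as medium/high risk — is measured:

```python
# Toy audit: compare false positive rates across two groups.
# Each record is (predicted med/high risk, actually reoffended).
# All data here is invented for illustration.

def false_positive_rate(records):
    """FPR = FP / (FP + TN), over people who did NOT reoffend."""
    fp = sum(1 for pred, actual in records if pred and not actual)
    tn = sum(1 for pred, actual in records if not pred and not actual)
    return fp / (fp + tn)

group_a = [(True, False), (True, False), (False, False), (True, True), (False, False)]
group_b = [(False, False), (True, False), (False, False), (False, False), (True, True)]

for name, records in [("A", group_a), ("B", group_b)]:
    print(f"group {name}: FPR = {false_positive_rate(records):.2f}")
```

With these made-up records, group A's false positive rate is twice group B's — the shape of disparity the COMPAS analysis reported.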
Diagnosing the disparity in false positive rates:
Images: Nature, "Bias detectives: the researchers striving to make algorithms fair", 20 June 2018
4. How do we incorporate ethics into systems development?
Pledges: E.g., the
Guidelines and Principles (resources):
Regulation
Algorithmic audits / tools:
http://aequitas.dssg.io/
http://www.oneilrisk.com/
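The core of what audit tools like Aequitas produce — a per-group table of fairness metrics — can be hand-rolled in a few lines. The sketch below is not the Aequitas API, just an illustrative re-implementation of the idea over invented records:

```python
# Minimal group-audit sketch in the spirit of tools like Aequitas.
# NOT the Aequitas API; metrics are computed by hand on toy data.
from collections import defaultdict

def group_report(rows):
    """rows: (group, predicted_positive, actually_positive) triples."""
    buckets = defaultdict(list)
    for group, pred, actual in rows:
        buckets[group].append((pred, actual))
    report = {}
    for group, pairs in buckets.items():
        tp = sum(1 for p, a in pairs if p and a)
        fp = sum(1 for p, a in pairs if p and not a)
        tn = sum(1 for p, a in pairs if not p and not a)
        fn = sum(1 for p, a in pairs if not p and a)
        report[group] = {
            "flag_rate": (tp + fp) / len(pairs),
            "fpr": fp / (fp + tn) if fp + tn else None,
            "fnr": fn / (fn + tp) if fn + tp else None,
        }
    return report

rows = [("A", True, False), ("A", False, False), ("A", True, True),
        ("B", False, False), ("B", False, True), ("B", True, True)]
for group, metrics in sorted(group_report(rows).items()):
    print(group, metrics)
```

An audit then reduces to comparing the rows of this report: large gaps in flag_rate, fpr, or fnr between groups are the disparities the tools flag for human review.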
