IEEE P7003 Algorithmic Bias Considerations
Minimize unintended, unjustified and unacceptable algorithmic bias
Ansgar Koene, Dr. Ir.
The Hague, 11 June 2018
Algorithmic Discrimination
Confirmation Bias: Predictive policing
• A need for counterfactual, double-blind trials
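The feedback loop behind this slide can be shown with a toy simulation (illustrative, not from the deck): two districts with identical true crime rates, where patrols follow historical arrest counts and arrests can only be recorded where patrols go. The initial gap in the records never closes, which is why the slide calls for counterfactual, double-blind trials.

```python
import random

random.seed(0)

# Toy model of confirmation bias in predictive policing (illustrative,
# not from the slides): two districts with the SAME true crime rate.
true_crime_rate = {"A": 0.1, "B": 0.1}
arrests = {"A": 12, "B": 8}          # historical records happen to differ

for day in range(1000):
    # Patrols go to the district with more recorded arrests...
    district = max(arrests, key=arrests.get)
    # ...and arrests can only be made where officers actually are.
    if random.random() < true_crime_rate[district]:
        arrests[district] += 1

print(arrests)  # e.g. {'A': ~112, 'B': 8}: the initial gap is amplified,
# even though both districts are identical. A counterfactual trial would
# patrol some areas independently of the predictions, to measure what
# the arrest records alone cannot show.
```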
Complex individuals reduced to simplistic binary stereotypes
Case study: Recidivism risk prediction
• COMPAS recidivism prediction tool
  – Built by a commercial company, Northpointe, Inc.
• Estimates the likelihood that offenders will re-offend in future
  – Inputs: based on a long questionnaire
  – Outputs: used across the US by judges and parole officers
• Are COMPAS’ estimates fair to salient social groups?
“Machine Bias: There’s software used across the country to predict future criminals.” (ProPublica)
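One way to make the fairness question concrete (a sketch, not part of the deck) is to audit group-wise error rates on ProPublica’s published data. The column names below follow their compas-scores-two-years.csv release, with decile scores of 5 or above read as “high risk”; treat both as assumptions if your copy of the data differs.

```python
import pandas as pd

# Sketch of a group-wise outcome audit (not from the slides). Column
# names follow ProPublica's compas-scores-two-years.csv release.
df = pd.read_csv("compas-scores-two-years.csv")
df["pred"] = (df["decile_score"] >= 5).astype(int)  # medium/high risk
df["actual"] = df["two_year_recid"]

for race, g in df.groupby("race"):
    fp = ((g.pred == 1) & (g.actual == 0)).sum()
    fn = ((g.pred == 0) & (g.actual == 1)).sum()
    tn = ((g.pred == 0) & (g.actual == 0)).sum()
    tp = ((g.pred == 1) & (g.actual == 1)).sum()
    print(race,
          f"FPR={fp / (fp + tn):.2f}",  # labelled high risk, did not re-offend
          f"FNR={fn / (fn + tp):.2f}")  # labelled low risk, did re-offend
```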
Case study: Recidivism risk prediction
Is the algorithm fair to all groups?
When base rates differ, no non-trivial solution can achieve similar false positive rate (FPR), false negative rate (FNR), false discovery rate (FDR) and false omission rate (FOR) across groups.
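This impossibility follows from an accounting identity (Chouldechova, 2017): FPR = p/(1-p) · FDR/(1-FDR) · (1-FNR), where p is a group’s base rate. The sketch below, with made-up numbers, shows the consequence: pin FNR and FDR to the same values for two groups with different base rates, and their FPRs are forced apart.

```python
# The identity FPR = p/(1-p) * FDR/(1-FDR) * (1-FNR) ties the error
# rates to the base rate p. Fix FNR and FDR across groups, and a
# different p forces a different FPR (numbers below are illustrative).

def implied_fpr(p, fnr, fdr):
    """FPR forced by base rate p once FNR and FDR are pinned down."""
    return (p / (1 - p)) * (fdr / (1 - fdr)) * (1 - fnr)

print(implied_fpr(p=0.5, fnr=0.2, fdr=0.25))  # group A: FPR ~ 0.27
print(implied_fpr(p=0.3, fnr=0.2, fdr=0.25))  # group B: FPR ~ 0.11
```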
Open invitation to join the P7003 working group
http://sites.ieee.org/sagroups-7003/
Key questions when developing or deploying an algorithmic system
• Who will be affected?
• What are the decision/optimization criteria?
• How are these criteria justified?
• Are these justifications acceptable in the context where the system is used?
P7003 foundational sections
• Taxonomy of algorithmic bias
• Person categorization and identifying affected population groups
• Legal frameworks related to bias
• Psychology of bias
P7003 algorithm development sections
• Algorithmic system design stages
• Assurance of representativeness of testing/training/validation data (see the sketch after this list)
• Evaluation of system outcomes
• Evaluation of algorithmic processing
• Assessment of resilience against external attempts to manipulate the system towards bias
• Documentation of criteria, scope and justifications of choices
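As one concrete reading of the representativeness item above (a sketch; the population shares, group names and the 5-point threshold are all invented for illustration): compare group shares in the training data against a reference population and flag large gaps.

```python
# Sketch of a representativeness check for training data (illustrative;
# dataset, group names and the 5-point threshold are assumptions).
reference_population = {"group_a": 0.60, "group_b": 0.30, "group_c": 0.10}
training_counts = {"group_a": 7200, "group_b": 2100, "group_c": 700}

total = sum(training_counts.values())
for group, target_share in reference_population.items():
    share = training_counts[group] / total
    gap = share - target_share
    flag = "  <-- under-represented" if gap < -0.05 else ""
    print(f"{group}: data {share:.1%} vs population {target_share:.1%}{flag}")
```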
Thank You!