
IEEE P7003 Algorithmic Bias Considerations


Presentation by Ansgar Koene (Chair of the IEEE P7003 Algorithmic Bias Working Group, University of Nottingham, UK) at the event "Incorporating Ethical Considerations in Autonomous & Intelligent Systems (A/IS) – Policy & Industry Requirements in the Algorithmic Age". The event took place on 11 June 2018 and was jointly organized by the IEEE Standards Association (IEEE-SA) and the Delft Design for Values Institute (DDFV). For more info see http://designforvalues.tudelft.nl/event/incorporating-ethical-considerations-in-ai-policy-industry-requirements/



  1. IEEE P7003 Algorithmic Bias Considerations: Minimize unintended, unjustified and unacceptable algorithmic bias. Ansgar Koene, Dr. Ir. The Hague, 11 June 2018
  2. Algorithmic Discrimination
  3. Confirmation Bias: Predictive policing
     • A need for counterfactual double-blind trials
  4. Complex individuals reduced to simplistic binary stereotypes
  5. Case study: Recidivism risk prediction
     • COMPAS recidivism prediction tool, built by a commercial company, Northpointe, Inc.
     • Estimates the likelihood of criminals re-offending in the future. Inputs: based on a long questionnaire. Outputs: used across the US by judges and parole officers.
     • Are COMPAS's estimates fair to salient social groups?
     (ProPublica, "Machine Bias: There's software used across the country to predict future criminals.")
  6. Case study: Recidivism risk prediction. Is the algorithm fair to all groups? When base rates differ, no non-trivial classifier can achieve equal false positive, false negative, false discovery and false omission rates (FPR, FNR, FDR, FOR) across groups.
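The impossibility result on slide 6 can be checked numerically. The following sketch uses hypothetical confusion-matrix counts (not COMPAS data): two groups are constructed to share the same FPR and FNR, yet because their base rates differ (50% vs. 20%), their FDR and FOR come apart.

```python
def rates(tp, fp, fn, tn):
    """Fairness-relevant error rates from confusion-matrix counts."""
    return {
        "FPR": fp / (fp + tn),  # false positive rate
        "FNR": fn / (fn + tp),  # false negative rate
        "FDR": fp / (fp + tp),  # false discovery rate
        "FOR": fn / (fn + tn),  # false omission rate
    }

# Hypothetical counts for two groups of 100 people each, chosen so that
# both groups have FPR = 0.2 and FNR = 0.3, but different base rates:
group_a = rates(tp=35, fp=10, fn=15, tn=40)  # base rate 50/100
group_b = rates(tp=14, fp=16, fn=6, tn=64)   # base rate 20/100

print(group_a)  # FPR = 0.2, FNR = 0.3, FDR ≈ 0.22, FOR ≈ 0.27
print(group_b)  # FPR = 0.2, FNR = 0.3, FDR ≈ 0.53, FOR ≈ 0.09
```

Even with identical error rates conditioned on the true outcome (FPR, FNR), the error rates conditioned on the prediction (FDR, FOR) diverge sharply, which is exactly the tension at the heart of the COMPAS fairness debate.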
  7. Open invitation to join the P7003 working group: http://sites.ieee.org/sagroups-7003/
  8. Key questions when developing or deploying an algorithmic system:
     • Who will be affected?
     • What are the decision/optimization criteria?
     • How are these criteria justified?
     • Are these justifications acceptable in the context where the system is used?
  9. P7003 foundational sections:
     • Taxonomy of algorithmic bias
     • Person categorization and identifying affected population groups
     • Legal frameworks related to bias
     • Psychology of bias
     P7003 algorithm development sections:
     • Algorithmic system design stages
     • Assurance of representativeness of testing/training/validation data
     • Evaluation of system outcomes
     • Evaluation of algorithmic processing
     • Assessment of resilience against external manipulation to bias
     • Documentation of criteria, scope and justifications of choices
  10. Thank You!
