AI and Ethics - We are the guardians of our future

As data scientists and software engineers, we are in charge of software where mistakes carry a very high price. It's time to look at the impact we have, how our software will be used, and how to avoid creating unfair, biased, and dangerous software.

  1. WE ARE THE GUARDIANS OF OUR FUTURE Tess Ferrandez Photo: Porapak Apichodilok
  2. Photo: Rosemary Ketchum GANG CRIME CLASSIFICATION Partially Generative Neural Networks for Gang Crime Classification with Partial Information
  3. Photo: Perry Wilson 42 BABIES 28 ADMITTED GANG MEMBERS
  4. I’M JUST AN ENGINEER
  5. ONCE THE ROCKETS ARE UP, WHO CARES WHERE THEY COME DOWN? That is not my department, said Wernher von Braun Photo: Pixabay
  6. I’M JUST AN ENGINEER
  7. I’M NOT JUST AN ENGINEER
  8. Photo: m01229
  9. Photo: Canned Muffins AI GAYDAR PAPER Deep Neural Networks Can Detect Sexual Orientation from Faces
  10. Photo: m01229 CRIMINAL FACES Automated Inference on Criminality Using Face Images
  11. CRIMINALS (VIOLENT): GOVERNMENT IDS · NON-CRIMINALS: CORPORATE HEADSHOTS
  12. CRIMINAL NON-CRIMINAL
  13. 90%
  14. 86%
  15. CAN WE DO IT?
  16. SHOULD WE DO IT?
  17. WE ARE CURRENTLY IN THE BIGGEST EXPERIMENT OF CLASSIFICATION IN HUMAN HISTORY (Kate Crawford)
  18. Photo: from cover of Neue Illustrierte Zeitung on June 1, 1933
  19. Photo: Bartek Wojtas
  20. RESPONSIBILITY LAUNDERING Photo: Bartek Wojtas
  21. Photo: Anton Mislawsky ALGORITHMS CAN’T BE RACIST, THEY’RE JUST MATH
  22. Photo: Sam Galison
  23. LABELED FACES IN THE WILD
  24. 77% MALE
  25. 80% WHITE
  26. 5% GEORGE W BUSH
  27. JOY BUOLAMWINI GENDER SHADES
  28. https://metro.co.uk/2018/07/06/weve-got-to-stop-the-met-polices-dangerously-authoritarian-facial-recognition-surveillance-7687833/
  29. Photo: Anton Mislawsky ALGORITHMS CAN’T BE RACIST, THEY’RE JUST MATH: BIAS LAUNDERING
  30. CONFIRMATION BUBBLE
  31. Photo: Fadil Elmansour RUNAWAY FEEDBACK LOOP: AREA WITH HIGH CRIME → SEND MORE POLICE → DO MORE ARRESTS → APPEARS TO HAVE MORE CRIME → SEND MORE POLICE → DO MORE ARRESTS → ... (see the simulation sketch after the slide list)
  32. BIAS IS AN NP-HARD PROBLEM
  33. REMOVE GENDER · EQUAL OUTCOME · EQUAL OPPORTUNITY (see the metrics sketch after the slide list)
  34. SHOULD WE JUST GIVE UP?
  35. FAIR AND INCLUSIVE · TRANSPARENT · ACCOUNTABLE · SAFE AND RELIABLE · SECURITY AND PRIVACY
  36. CONSIDER HOW THE TECHNOLOGY COULD BE USED Photo: Pixabay
  37. DATASHEETS FOR DATASETS (see the template sketch after the slide list)
  38. EVALUATE FOR FAIRNESS (see the evaluation sketch after the slide list)
  39. TRANSPARENT DECISIONS
  40. DETECT VS PREDICT
  41. SHOPLIFTING POSES · COVERED FACES · ALONE · MEN 20-40 · HOODIES
  42. Photo: George Becker DEALING WITH ERRORS
  43. HUMAN IN THE LOOP (see the routing sketch after the slide list)
  44. WE’RE NOT JUST ENGINEERS
  45. WE ARE THE GUARDIANS OF OUR FUTURE Tess Ferrandez Photo: Porapak Apichodilok
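
The runaway feedback loop on slide 31 is easy to make concrete. Below is a minimal simulation in Python, not from the talk: it assumes two districts with identical true crime rates, invents every number, and only shows how dispatching police to wherever the record shows crime makes a small initial tilt self-reinforcing.

```python
import random

# Toy model of slide 31's runaway feedback loop (all numbers invented).
# Two districts share the SAME true crime rate; district 0 merely starts
# with a couple more incidents on record.
TRUE_RATE = 0.10
recorded = [12, 10]  # historical incident counts per district

for week in range(8):
    # "Predictive" dispatch: patrol wherever the record shows more crime.
    target = 0 if recorded[0] >= recorded[1] else 1
    # Crime is only *observed* where the patrols are sent.
    new_incidents = sum(random.random() < TRUE_RATE for _ in range(100))
    recorded[target] += new_incidents
    print(f"week {week}: records={recorded}, patrolling district {target}")

# District 0's record grows every week while district 1's never changes,
# even though the underlying rates are identical: the data confirms itself.
```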
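
Slide 33's three options mean different things in code. Removing the gender column is a one-line data change; the other two are properties you have to measure. The sketch below uses the standard textbook definitions (equal outcome as demographic parity, equal opportunity as equal true positive rates) on made-up predictions; the data and function names are illustrative assumptions.

```python
def selection_rate(pairs):
    """Equal outcome (demographic parity): fraction predicted positive."""
    return sum(pred for _, pred in pairs) / len(pairs)

def true_positive_rate(pairs):
    """Equal opportunity: positive predictions among actual positives."""
    positives = [pred for actual, pred in pairs if actual == 1]
    return sum(positives) / len(positives)

# Toy (actual, predicted) labels for two demographic groups.
group_a = [(1, 1), (1, 1), (0, 1), (0, 0), (1, 0)]
group_b = [(1, 1), (0, 0), (0, 0), (1, 0), (0, 0)]

print("selection rate:", selection_rate(group_a), "vs", selection_rate(group_b))
print("TPR:", true_positive_rate(group_a), "vs", true_positive_rate(group_b))
```

Dropping the gender column alone guarantees neither metric, since proxies such as names or postcodes can still carry the signal, which is presumably why the slide lists the three as separate choices.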
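
Slide 37 refers to Datasheets for Datasets (Gebru et al.), which proposes answering a fixed set of questions about every dataset's motivation, composition, collection, and intended use. A minimal sketch of the idea follows; the fields paraphrase a few of the paper's section headings, and the example entry is illustrative, filled in from the Labeled Faces in the Wild numbers on slides 23-26.

```python
from dataclasses import dataclass, field

@dataclass
class Datasheet:
    """A tiny subset of the questions a full datasheet answers."""
    name: str
    motivation: str                 # why was the dataset created?
    composition: str                # what is in it? known skews?
    collection: str                 # how was it gathered? with what consent?
    recommended_uses: list[str] = field(default_factory=list)
    uses_to_avoid: list[str] = field(default_factory=list)

lfw = Datasheet(
    name="Labeled Faces in the Wild",
    motivation="Academic benchmark for face verification.",
    composition="News photos: 77% male, 80% white, 5% George W Bush.",
    collection="Scraped from news articles, no subject consent.",
    recommended_uses=["research benchmarks"],
    uses_to_avoid=["production face recognition", "demographic inference"],
)
print(lfw.composition)
```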
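
Evaluating for fairness (slide 38) is what Gender Shades (slide 27) did to commercial face APIs: report accuracy per subgroup instead of one aggregate number. A minimal sketch with invented records; the point is the disaggregation, not the values.

```python
from collections import defaultdict

def accuracy_by_group(records):
    """records: iterable of (group, actual, predicted) triples."""
    hits, totals = defaultdict(int), defaultdict(int)
    for group, actual, predicted in records:
        totals[group] += 1
        hits[group] += (actual == predicted)
    return {g: hits[g] / totals[g] for g in totals}

records = [
    ("lighter male", 1, 1), ("lighter male", 0, 0), ("lighter male", 1, 1),
    ("darker female", 1, 0), ("darker female", 0, 0), ("darker female", 1, 0),
]
for group, acc in accuracy_by_group(records).items():
    print(f"{group}: {acc:.0%}")
# Aggregate accuracy is 67% here, which hides a 100% vs 33% gap.
```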
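
One common reading of slide 43's human in the loop is confidence-based routing: the system acts alone only above a confidence threshold and queues everything else for a person. The sketch below is an assumption-laden illustration, not anything the talk specifies; the threshold in particular should be set by the price of a mistake, echoing the deck's framing.

```python
REVIEW_THRESHOLD = 0.95  # arbitrary; pick per the cost of a wrong decision

def route(case_id: str, score: float) -> str:
    """Auto-act only on high-confidence predictions; queue the rest."""
    if score >= REVIEW_THRESHOLD:
        return f"{case_id}: handled automatically (score {score:.2f})"
    return f"{case_id}: queued for human review (score {score:.2f})"

for case_id, score in [("A17", 0.99), ("B03", 0.62), ("C88", 0.94)]:
    print(route(case_id, score))
```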
