AI and Ethics - We are the guardians of our future

We're in charge, as data scientists and software engineers, of software where mistakes have a very high price. It's time to look at the impact we have, how our software will be used, and how to avoid creating unfair, biased and dangerous software.

  1. 1. WE ARE THE GUARDIANS OF OUR FUTURE Tess Ferrandez Photo: Porapak Apichodilok
  2. 2. Photo: Rosemary Ketchum GANG CRIME CLASSIFICATION Partially Generative Neural Networks For Gang Crime Classification With Partial Information
  3. 3. Photo: Perry Wilson 42 BABIES 28 ADMITTED GANG MEMBERS
  4. 4. I’M JUST AN ENGINEER
  5. 5. ONCE THE ROCKETS ARE UP WHO CARES WHERE THEY COME DOWN? That's not my department, says Wernher von Braun Photo: Pixabay
  6. 6. I’M JUST AN ENGINEER
  7. 7. I’M NOT JUST AN ENGINEER
  8. 8. Photo: m01229
  9. 9. Photo: Canned Muffins AI GAYDAR PAPER Deep Neural Networks Can Detect Sexual Orientation from Faces
  10. 10. Photo: m01229 CRIMINAL FACES Automated Inference on Criminality Using Face Images
  11. 11. CRIMINALS VIOLENT NON-CRIMINALS GOVERNMENT IDS CORPORATE HEADSHOTS
  12. 12. CRIMINAL NON-CRIMINAL
  13. 13. 90%
  14. 14. 86%
  15. 15. CAN WE DO IT?
  16. 16. SHOULD WE DO IT?
  17. 17. WE ARE CURRENTLY IN THE BIGGEST EXPERIMENT OF CLASSIFICATION IN HUMAN HISTORY Kate Crawford
  18. 18. Photo: from cover of Neue Illustrierte Zeitung on June 1, 1933
  19. 19. Photo: Bartek Wojtas
  20. 20. RESPONSIBILITY LAUNDERING Photo: Bartek Wojtas
  21. 21. Photo: Anton Mislawsky ALGORITHMS CAN’T BE RACIST THEY’RE JUST MATH
  22. 22. Photo: Sam Galison
  23. 23. LABELED FACES IN THE WILD
  24. 24. 77% MALE
  25. 25. 80% WHITE
  26. 26. 5% GEORGE W BUSH
  27. 27. JOY BUOLAMWINI GENDER SHADES
  28. 28. https://metro.co.uk/2018/07/06/weve-got-to-stop-the-met-polices-dangerously-authoritarian-facial-recognition-surveillance-7687833/
  29. 29. Photo: Anton Mislawsky ALGORITHMS CAN’T BE RACIST THEY’RE JUST MATH BIAS LAUNDERING
  30. 30. CONFIRMATION BUBBLE
  31. 31. Photo: Fadil Elmansour AREA W. HIGH CRIME SEND MORE POLICE DO MORE ARRESTS APPEARS TO HAVE MORE CRIME SEND MORE POLICE DO MORE ARRESTS RUNAWAY FEEDBACK LOOP
  32. 32. BIAS IS AN NP HARD PROBLEM
  33. 33. REMOVE GENDER EQUAL OUTCOME EQUAL OPPORTUNITY
  34. 34. SHOULD WE JUST GIVE UP?
  35. 35. FAIR AND INCLUSIVE TRANSPARENT ACCOUNTABLE SAFE AND RELIABLE SECURITY AND PRIVACY
  36. 36. CONSIDER HOW THE TECHNOLOGY COULD BE USED Photo: Pixabay
  37. 37. DATASHEETS FOR DATASETS
  38. 38. EVALUATE FOR FAIRNESS (a small per-group evaluation sketch follows the slide transcript)
  39. 39. TRANSPARENT DECISIONS
  40. 40. DETECT VS PREDICT
  41. 41. COVERED FACES ALONE MEN 20-40 HOODIES SHOPLIFTING POSES
  42. 42. Photo: George Becker DEALING WITH ERRORS
  43. 43. HUMAN IN THE LOOP
  44. 44. WE’RE NOT JUST ENGINEERS
  45. 45. WE ARE THE GUARDIANS OF OUR FUTURE Tess Ferrandez Photo: Porapak Apichodilok
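
The deck itself contains no code, but slide 33 ("equal outcome / equal opportunity") and slide 38 ("evaluate for fairness") can be made concrete with a minimal Python sketch. The group names, labels and predictions below are invented for illustration; only the metrics (selection rate, true positive rate, per-group accuracy as in Gender Shades) come from the ideas on the slides.

import numpy as np

def group_rates(y_true, y_pred, groups):
    """Per-group selection rate (equal outcome), true positive rate
    (equal opportunity) and accuracy (Gender Shades-style breakdown)."""
    report = {}
    for g in np.unique(groups):
        mask = groups == g
        yt, yp = y_true[mask], y_pred[mask]
        report[g] = {
            "selection_rate": yp.mean(),                # P(pred = 1 | group)
            "tpr": yp[yt == 1].mean() if (yt == 1).any() else float("nan"),
            "accuracy": (yt == yp).mean(),
        }
    return report

# Made-up data: two groups, random labels and predictions
rng = np.random.default_rng(0)
groups = np.array(["group_a"] * 500 + ["group_b"] * 500)
y_true = rng.integers(0, 2, size=1000)
y_pred = rng.integers(0, 2, size=1000)

for g, r in group_rates(y_true, y_pred, groups).items():
    print(g, r)

Equal outcome asks the selection rates to match across groups; equal opportunity asks the true positive rates to match. In general the different criteria cannot all be satisfied at once, which is one reason the deck calls bias such a hard problem.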

Editor's Notes

  • 23% no proof
    Living on a certain block can label you as Crip or Hells Angel
    Used to justify arrest, support maximum penalties
    No right to know if you’re on it, or to challenge it
    Used for background checks for jobs
  • Quote from a song by Tom Lehrer
  • The AI Gaydar paper

    Wang, Y., & Kosinski, M. (2018). Deep neural networks are more accurate than humans at detecting sexual orientation from facial images

    50 clicks on Facebook
    Used by Cambridge Analytica

    81% men, 74% women
    Data from dating websites

    Fluidity – what determines someone's preferences? Bi? Bi-curious? Outspoken?

    Long tail, criminal in k countries, death penalty,
  • Criminal or Not = 90%
  • 1856 images of Chinese men 18-55, no facial hair or scars – faces cut out
    730 – criminals, ID photos from police departments (235 violent crimes; 536 theft, fraud, corruption, forgery, racketeering)
    1126 – random images from the internet, “corporate photos”, as non-criminals
    Note: criminal vs non-criminal = 90%; the same network architecture was only able to pick up gender in 86%
  • Criminals
    Non-criminals
  • Criminal or Not = 90%
  • Gender w. AlexNet = 86%
  • Harm can further be classified into five types: stereotyping, recognition, denigration, under-representation and ex-nomination.

    Classification is always a product of its time
    We are currently in the biggest experiment of classification in human history

    Kate Crawford
  • Wouldn’t be able to tell one Wookiee from the next
    Wife: Malla, Son: Lumpy
  • Areas with more crime get more policing => more crimes reported => considered worse neighborhood, more policing (a minimal simulation of this runaway loop follows these notes)
  • Scrubbing to neutral
    What is neutral – gender, ethnicity, are people from Lapland counted as Finnish, sexual orientation, political orientation?
    What about words like pregnancy, or color blindness? Queen?
    Who decides?
    Everyone is a majority somewhere and a minority somewhere else
  • Quote from a song by Tom Lehrer
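
The runaway feedback loop described in the note above can be illustrated with a tiny simulation. This is a sketch with invented numbers, not anything from the talk: two areas have the same underlying crime rate, crime is only observed where patrols are present, and next round's patrols follow the observed counts.

import random

random.seed(42)
true_crime_rate = {"area_1": 0.10, "area_2": 0.10}   # identical by construction
patrols = {"area_1": 60, "area_2": 40}               # small initial imbalance
TOTAL_PATROLS = 100

for step in range(10):
    # Crime is only *observed* where patrols are present
    observed = {area: sum(random.random() < true_crime_rate[area]
                          for _ in range(patrols[area]))
                for area in patrols}
    total_observed = sum(observed.values()) or 1
    # Next round's patrols follow observed crime, not true crime
    patrols = {area: round(TOTAL_PATROLS * observed[area] / total_observed)
               for area in patrols}
    print(step, "observed:", observed, "next patrols:", patrols)

Although both areas are equally "criminal" by construction, the area that happens to be patrolled more reports more crime and so keeps attracting patrols: the data confirms the allocation that produced it.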