-
1.
WE ARE THE
GUARDIANS
OF OUR FUTURE
Tess Ferrandez
Photo: Porapak Apichodilok
-
2.
Photo: Rosemary Ketchum
GANG CRIME
CLASSIFICATION
Partially Generative Neural Networks For Gang
Crime Classification With Partial Information
-
3.
Photo: Perry Wilson
42
BABIES
28
ADMITTED
GANG MEMBERS
-
4.
I’M JUST AN ENGINEER
-
5.
ONCE THE ROCKETS ARE
UP
WHO CARES WHERE THEY
COME DOWN?
That is not my department
said Wernher von Braun
Photo: Pixabay
-
6.
I’M JUST AN ENGINEER
-
7.
I’M NOT JUST AN ENGINEER
-
8.
Photo: m01229
-
9.
Photo: Canned Muffins
AI GAYDAR
PAPER
Deep Neural Networks Can Detect
Sexual Orientation from Faces
-
10.
Photo: m01229
CRIMINAL
FACES
Automated Inference on Criminality
Using Face Images
-
11.
CRIMINALS
VIOLENT
NON-CRIMINALS
GOVERNMENT
IDS
CORPORATE
HEADSHOTS
-
12.
CRIMINAL NON-CRIMINAL
-
13.
90%
-
14.
86%
-
15.
CAN WE DO IT?
-
16.
SHOULD WE DO IT?
-
17.
WE ARE CURRENTLY
IN THE BIGGEST
EXPERIMENT OF
CLASSIFICATION
IN HUMAN
HISTORY
Kate Crawford
-
18.
Photo: from cover of Neue Illustrierte Zeitung on June 1, 1933
-
19.
Photo: Bartek Wojtas
-
20.
RESPONSIBILITY
LAUNDERING
Photo: Bartek Wojtas
-
21.
Photo: Anton Mislawsky
ALGORITHMS
CAN’T BE RACIST
THEY’RE JUST MATH
-
22.
Photo: Sam Galison
-
23.
LABELED
FACES
IN THE WILD
-
24.
77% MALE
-
25.
80% WHITE
-
26.
5% GEORGE W BUSH
-
27.
JOY BOULAMWINI
GENDER SHADES
-
28.
https://metro.co.uk/2018/07/06/weve-got-to-stop-the-met-polices-dangerously-authoritarian-facial-recognition-surveillance-7687833/
-
29.
Photo: Anton Mislawsky
ALGORITHMS
CAN’T BE RACIST
THEY’RE JUST MATH
BIAS LAUNDERING
-
30.
CONFIRMATION BUBBLE
-
31.
Photo: Fadil Elmansour
AREA WITH HIGH CRIME
SEND MORE POLICE
DO MORE ARRESTS
APPEARS TO HAVE MORE CRIME
SEND MORE POLICE
DO MORE ARRESTS
RUNAWAY FEEDBACK LOOP
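The loop on this slide can be sketched as a toy simulation. All numbers below are made up; the point is that allocating patrols by *recorded* crime makes one area appear worse even when the underlying crime rates are identical:

```python
# Toy sketch of the runaway feedback loop (hypothetical numbers).
# Two areas have the SAME underlying crime rate; area 0 merely
# starts with a couple more recorded arrests.
true_crime_rate = [0.10, 0.10]   # identical real crime in both areas
recorded = [12.0, 10.0]          # historical arrest counts (made up)
patrol_size = 20                 # officers available per step

for step in range(10):
    # "Area with high crime: send more police" -- patrols go to the
    # area with the most *recorded* crime.
    target = 0 if recorded[0] >= recorded[1] else 1
    # "Do more arrests" -- the patrolled area yields arrests, while
    # crime in the unpatrolled area goes unrecorded entirely.
    recorded[target] += patrol_size * true_crime_rate[target]

# Area 0 now *appears* far worse, purely because of where police
# were sent -- the recorded data confirms its own bias.
print(recorded)  # -> [32.0, 10.0]
```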
-
32.
BIAS IS AN
NP HARD PROBLEM
-
33.
REMOVE GENDER
EQUAL OUTCOME
EQUAL OPPORTUNITY
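The two fairness criteria on this slide can be made concrete with a minimal sketch (the labels, predictions, and groups below are entirely made up; no real fairness library is used):

```python
# Minimal sketch of two group-fairness metrics for a binary
# classifier. All data here is hypothetical.
def selection_rate(y_pred, group, g):
    """Fraction of group g that received a positive prediction."""
    idx = [i for i, x in enumerate(group) if x == g]
    return sum(y_pred[i] for i in idx) / len(idx)

def true_positive_rate(y_true, y_pred, group, g):
    """Fraction of truly-positive members of group g predicted positive."""
    idx = [i for i, x in enumerate(group) if x == g and y_true[i] == 1]
    return sum(y_pred[i] for i in idx) / len(idx)

y_true = [1, 0, 1, 1, 0, 1, 0, 1]
y_pred = [1, 0, 1, 0, 0, 1, 1, 1]
group  = ['a', 'a', 'a', 'a', 'b', 'b', 'b', 'b']

# Equal outcome (demographic parity): positive rates match across groups.
parity_gap = abs(selection_rate(y_pred, group, 'a')
                 - selection_rate(y_pred, group, 'b'))

# Equal opportunity: true-positive rates match across groups.
opportunity_gap = abs(true_positive_rate(y_true, y_pred, group, 'a')
                      - true_positive_rate(y_true, y_pred, group, 'b'))

# Note: simply removing the gender column guarantees neither metric,
# since other features can act as proxies for it.
```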
-
34.
SHOULD WE JUST GIVE
UP?
-
35.
FAIR AND INCLUSIVE
TRANSPARENT
ACCOUNTABLE
SAFE AND RELIABLE
SECURITY AND PRIVACY
-
36.
CONSIDER HOW
THE TECHNOLOGY
COULD BE USED
Photo: Pixabay
-
37.
DATASHEETS
FOR DATASETS
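A datasheet for a dataset answers a fixed set of questions before the data is ever used. A minimal sketch of the question categories (section names follow Gebru et al.'s "Datasheets for Datasets"; the prompts are paraphrased, and any answers would be filled in per dataset):

```python
# Skeleton of a datasheet for a dataset. Each entry holds the kind
# of question that section is meant to answer; a real datasheet
# replaces the questions with concrete answers.
datasheet = {
    "motivation":    "Why was the dataset created, and by whom?",
    "composition":   "What do instances represent? Who is over- or under-represented?",
    "collection":    "How was the data acquired? Was consent obtained?",
    "preprocessing": "What cleaning or labeling was done, and by whom?",
    "uses":          "What tasks is it suited for? What uses are out of scope?",
    "distribution":  "How is it shared, and under what license?",
    "maintenance":   "Who maintains it, and how are errors corrected?",
}

for section, question in datasheet.items():
    print(f"{section.upper()}: {question}")
```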
-
38.
EVALUATE FOR
FAIRNESS
-
39.
TRANSPARENT DECISIONS
-
40.
DETECT
VS PREDICT
-
41.
COVERED FACES
ALONE
MEN 20-40 HOODIES
SHOPLIFTING POSES
-
42.
Photo: George Becker
DEALING WITH ERRORS
-
43.
HUMAN
IN THE LOOP
-
44.
WE’RE NOT JUST ENGINEERS
-
45.
WE ARE THE
GUARDIANS
OF OUR FUTURE
Tess Ferrandez
Photo: Porapak Apichodilok
23% no proof
Living on a certain block can label you as Crip or Hells Angel
Used to justify arrest, support maximum penalties
No right to know if you’re on it, or to challenge it
Used for background checks for jobs
Quote from a song by Tom Lehrer
The AI Gaydar paper
Wang, Y., & Kosinski, M. (2018). Deep neural networks are more accurate than humans at detecting sexual orientation from facial images. Journal of Personality and Social Psychology.
50 clicks on facebook
Used by Cambridge analytica
81% men, 74% women
Data from dating websites
Fluidity: what determines someone's preferences? Bi? Bi-curious? Outspoken?
Long tail: homosexuality is a crime in k countries, some with the death penalty.
Criminal or Not = 90%
1856 images of Chinese men aged 18-55, no facial hair or scars, faces cut out.
730 criminals – ID photos from police departments: 235 violent crimes; 536 theft, fraud, corruption, forgery, racketeering.
1126 non-criminals – random images from the internet, "corporate photos".
Note: 90% criminal/non-criminal accuracy, while the same network architecture was only able to pick up gender at 86%.
Criminals
Non-criminals
Criminal or Not = 90%
Gender w. AlexNet = 86%
Harm can further be classified into five types: stereotyping, recognition, denigration, under-representation and ex-nomination.
Classification is always a product of its time
We are currently in the biggest experiment of classification in human history
Kate Crawford
Wouldn’t be able to tell one Wookie from the next
Wife: Malla, Son: Lumpy
Areas with more crime get more policing => more crimes reported => considered worse neighborhood, more policing
Scrubbing to neutral
What is neutral – gender, ethnicity, are people from Lapland counted as Finnish, sexual orientation, political orientation?
What about words like pregnancy, or color blindness? Queen?
Who decides?
Everyone is a majority somewhere and a minority somewhere else