2. This class
• Algorithms in society
• Previous algorithmic accountability stories
• Measuring race and gender bias
• Unpacking ProPublica’s “Machine Bias”
7. Algorithm Tips: A Resource for Algorithmic Accountability in Government
Trielli, Stark, Diakopoulos
Search terms used to find “algorithms” on .gov sites
18. Title VII of Civil Rights Act, 1964
• It shall be an unlawful employment practice for an employer -
• (1) to fail or refuse to hire or to discharge any individual, or otherwise to
discriminate against any individual with respect to his compensation,
terms, conditions, or privileges of employment, because of such
individual’s race, color, religion, sex, or national origin; or
• (2) to limit, segregate, or classify his employees or applicants for
employment in any way which would deprive or tend to deprive any
individual of employment opportunities or otherwise adversely affect his
status as an employee, because of such individual’s race, color,
religion, sex, or national origin.
19. Regulated Domains
Credit (Equal Credit Opportunity Act)
Education (Civil Rights Act of 1964; Education Amendments of 1972)
Employment (Civil Rights Act of 1964)
Housing (Fair Housing Act)
‘Public Accommodation’ (Civil Rights Act of 1964)
Fairness in Machine Learning, NIPS 2017 Tutorial
Solon Barocas and Moritz Hardt
20. Protected Classes
Race (Civil Rights Act of 1964)
Color (Civil Rights Act of 1964)
Sex (Equal Pay Act of 1963; Civil Rights Act of 1964)
Religion (Civil Rights Act of 1964)
National origin (Civil Rights Act of 1964)
Citizenship (Immigration Reform and Control Act)
Age (Age Discrimination in Employment Act of 1967)
Pregnancy (Pregnancy Discrimination Act)
Familial status (Civil Rights Act of 1968)
Disability status (Rehabilitation Act of 1973; Americans with Disabilities Act of 1990)
Veteran status (Vietnam Era Veterans' Readjustment Assistance Act of 1974; Uniformed Services Employment and Reemployment Rights Act)
Genetic information (Genetic Information Nondiscrimination Act)
Fairness in Machine Learning, NIPS 2017 Tutorial
Solon Barocas and Moritz Hardt
32. Fairness as calibration: P(outcome | score) is the same for all groups
Fair prediction with disparate impact: A study of bias in recidivism prediction instruments,
Chouldechova
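To make the criterion concrete, here is a minimal Python sketch (with made-up scores, outcomes, and group labels, not ProPublica's data) that checks whether P(outcome | score) is the same for each group:

import numpy as np

# Hypothetical data: decile risk scores, observed outcomes, group labels.
rng = np.random.default_rng(1)
n = 5000
score = rng.integers(1, 11, n)        # decile scores 1..10
group = rng.choice(["a", "b"], n)
# Outcomes drawn so that each decile means the same thing in both groups.
outcome = rng.random(n) < score / 10

for g in ["a", "b"]:
    mask = group == g
    # P(outcome | score = s, group = g) for each decile s
    rates = [outcome[mask & (score == s)].mean() for s in range(1, 11)]
    print(g, np.round(rates, 2))

If the two rows of per-decile outcome rates track each other, the score satisfies the criterion; this is roughly the property Northpointe pointed to in its rebuttal to ProPublica.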
33. Or, as ProPublica put it
How We Analyzed the COMPAS Recidivism Algorithm, ProPublica
34. Even if two groups of the population admit simple classifiers, the whole population may
not (from How Big Data is Unfair)
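A hypothetical Python illustration of this point (synthetic one-dimensional data; the group label is deliberately withheld from the classifier, as it often is in deployed systems):

import numpy as np

rng = np.random.default_rng(0)

# Each group's labels follow a simple rule, but the rules point in
# opposite directions; group b is a small minority.
n_a, n_b = 900, 100
x_a = rng.normal(size=n_a); y_a = x_a > 0   # group a: positive iff x > 0
x_b = rng.normal(size=n_b); y_b = x_b < 0   # group b: positive iff x < 0

x = np.concatenate([x_a, x_b])
y = np.concatenate([y_a, y_b])

# The best single threshold on the pooled data is the majority group's
# rule: high overall accuracy, near-zero accuracy on the minority group.
pred = x > 0
print("overall accuracy:", (pred == y).mean())              # about 0.9
print("group a accuracy:", (pred[:n_a] == y[:n_a]).mean())  # about 1.0
print("group b accuracy:", (pred[n_a:] == y[n_a:]).mean())  # about 0.0

The minority group absorbs nearly all the error: a classifier that looks accurate in aggregate can still work badly for a subpopulation whose pattern differs.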
35. The Problem
Fair prediction with disparate impact: A study of bias in recidivism prediction instruments,
Chouldechova
36. When the base rates differ by protected group and when there is not separation, one cannot have
both conditional use accuracy equality and equality in the false negative and false positive rates.
…
The goal of complete race or gender neutrality is unachievable.
…
Altering a risk algorithm to improve matters can lead to difficult stakeholder choices. If it is
essential to have conditional use accuracy equality, the algorithm will produce different false
positive and false negative rates across the protected group categories. Conversely, if it is
essential to have the same rates of false positives and false negatives across protected group
categories, the algorithm cannot produce conditional use accuracy equality. Stakeholders will
have to settle for an increase in one for a decrease in the other.
Fairness in Criminal Justice Risk Assessments: The State of the Art, Berk et al.
Impossibility theorem
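The arithmetic behind the impossibility is short. In Chouldechova's notation, with base rate p = P(Y = 1), positive predictive value PPV, and error rates FPR and FNR, a counting argument gives the identity below (sketched in LaTeX):

% With FP false positives, TP true positives, FN false negatives, and
% N_+, N_- the positive and negative class sizes (so p = N_+/(N_+ + N_-)):
%   PPV = TP/(TP + FP),  FNR = FN/N_+,  FPR = FP/N_-.
% Since TP = (1 - FNR) N_+ and FP = TP (1 - PPV)/PPV, it follows that
\[
  \mathrm{FPR}
  \;=\; \frac{\mathrm{FP}}{N_-}
  \;=\; \frac{p}{1-p}\cdot\frac{1-\mathrm{PPV}}{\mathrm{PPV}}\cdot\bigl(1-\mathrm{FNR}\bigr).
\]
% If two groups have different base rates p but equal PPV and equal FNR,
% the right-hand sides differ, so the FPRs cannot also be equal.

Equal PPV and equal FNR across groups with unequal base rates force unequal FPR, which is exactly the trade-off Berk et al. describe.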