8. Female Dummies in Car Crash Tests
USA:
● Female dummy introduced in 2011
● 4 feet 10 inches tall, 108 pounds
● Can double as a 12-year-old child
EU:
● Female dummy used in one test
● … and only in the passenger seat
● A smaller version of a male dummy
http://www.diva-portal.org/smash/record.jsf?pid=diva2%3A1203713&dswid=-7449
https://www.washingtonpost.com/local/trafficandcommuting/female-dummy-makes-her-mark-on-male-dominated-crash-tests/2012/03/07/gIQANBLjaS_story.html
https://www.humaneticsatd.com/crash-test-dummies/frontal-impact/hiii-5f
9. A female vehicle occupant’s odds of being injured in a frontal crash are 73% greater than the odds for a male occupant.
- University of Virginia, 2019
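To make the statistic concrete: “odds 73% greater” means the ratio of female to male injury odds is about 1.73. A minimal Python sketch of the arithmetic, using made-up counts rather than the University of Virginia data:

```python
# What "odds 73% greater" means: the female-to-male odds ratio is ~1.73.
# Counts below are hypothetical, NOT the University of Virginia data.

def odds(injured, uninjured):
    """Odds of injury = injured / uninjured."""
    return injured / uninjured

# Hypothetical frontal-crash outcomes per 1,000 occupants of each sex
female_odds = odds(injured=146, uninjured=854)  # ~0.171
male_odds = odds(injured=90, uninjured=910)     # ~0.099

odds_ratio = female_odds / male_odds
print(f"odds ratio: {odds_ratio:.2f}")  # ~1.73, i.e. odds 73% greater
```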
10. Heart Attacks
▪ The British Heart Foundation (BHF) found that women are 50% more likely than men to receive an incorrect diagnosis
▪ The BHF estimates that over 10 years, 8,200 women have died needlessly after a heart attack
11. Office Temperature
▪ The standard office temperature was developed in the 1960s
▪ It overestimates female metabolic rates by as much as 35%
https://www.nature.com/articles/nclimate2741
12. Voice Recognition Software
▪ Google’s voice recognition software is 70% more accurate for male voices than for female voices
https://makingnoiseandhearingthings.com/2016/07/12/googles-speech-recognition-has-a-gender-bias/
13. Face Recognition Software
▪ Microsoft’s and IBM’s systems can identify white male faces with 99% accuracy
▪ Both had error rates of up to 35% for dark-skinned women
https://www.theregister.co.uk/2018/02/13/facial_recognition_software_is_better_at_white_men_than_black_women/
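Headline numbers like these come from a per-group audit: run the classifier over a benchmark labelled by demographic group and compare error rates. A rough Python sketch of that kind of audit, using entirely made-up predictions rather than the Microsoft or IBM systems:

```python
# Sketch of a per-group accuracy audit: compare a gender classifier's
# error rate across demographic groups. All data here is made up for
# illustration; this is not the Gender Shades benchmark or either system.
from collections import defaultdict

# (group, true label, predicted label) -- hypothetical classifier output
results = [
    ("lighter-skinned men", "male", "male"),
    ("lighter-skinned men", "male", "male"),
    ("darker-skinned women", "female", "male"),    # misclassified
    ("darker-skinned women", "female", "female"),
    ("darker-skinned women", "female", "male"),    # misclassified
]

tally = defaultdict(lambda: [0, 0])  # group -> [errors, total]
for group, truth, predicted in results:
    tally[group][0] += truth != predicted
    tally[group][1] += 1

for group, (errors, total) in tally.items():
    print(f"{group}: error rate {errors / total:.0%} ({errors}/{total})")
```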
16. Racial Bias
▪ COMPAS, designed by Northpointe, is a tool that scores offenders by their likelihood of reoffending; the scores are used to inform bail and sentencing decisions
▪ It has been criticised as racially biased
https://medium.com/thoughts-and-reflections/racial-bias-and-gender-bias-examples-in-ai-systems-7211e4c166a1
https://www.washingtonpost.com/news/monkey-cage/wp/2016/10/17/can-an-algorithm-be-racist-our-analysis-is-more-cautious-than-propublicas/
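Much of the criticism turned on unequal false positive rates: among defendants who did not reoffend, black defendants were flagged high risk far more often (the Washington Post piece above discusses how a score can be well calibrated and still show this gap). A minimal Python sketch of that comparison, with synthetic records whose rates only loosely echo the figures ProPublica reported:

```python
# Sketch of the disparity check at the centre of the COMPAS debate:
# the false positive rate (scored high risk but did NOT reoffend),
# computed per group. Records are synthetic; the rates only loosely
# echo the false positive rates ProPublica reported.

def false_positive_rate(records):
    """Share of non-reoffenders who were scored high risk."""
    non_reoffenders = [r for r in records if not r["reoffended"]]
    flagged = sum(r["high_risk"] for r in non_reoffenders)
    return flagged / len(non_reoffenders)

# Each record: high-risk score given, and whether the person reoffended
by_group = {
    "group A": [{"high_risk": True,  "reoffended": False}] * 45
             + [{"high_risk": False, "reoffended": False}] * 55,
    "group B": [{"high_risk": True,  "reoffended": False}] * 23
             + [{"high_risk": False, "reoffended": False}] * 77,
}

for group, records in by_group.items():
    print(f"{group}: false positive rate {false_positive_rate(records):.0%}")
```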
19. Human Classification Bias
▪ YouTube was sued by creators who allege its demonetisation algorithm discriminates against LGBTQ content
▪ Demonetisation is ‘confirmed’ by human classifiers
□ Allegedly outsourced to countries where homosexuality is illegal
https://www.theverge.com/2019/8/14/20805283/lgbtq-youtuber-lawsuit-discrimination-alleged-video-recommendations-demonetization
https://www.htxt.co.za/2019/09/30/youtubers-discover-why-lgtbqi-content-is-being-demonetised-on-youtube/