5. Today
• Digital infrastructures
• Are shaped by people with values, power, and privilege
• Reflect these values, power, and privilege
• Serve and preserve these values, power, and privilege
6. Digital Infrastructures
• The internet: a public global technical system and information space
• Digital infrastructures enable new (compared to print) modes of information delivery and analysis
• Algorithms
• Big Data
• Social media
7. Algorithms
Martin, K. Ethical implications and accountability of algorithms. Journal of Business Ethics (2018). https://doi.org/10.1007/s10551-018-3921-3
9. Algorithms: Hiring
“Amazon’s computer models were trained to vet applicants by observing patterns in resumes submitted to the company over a 10-year period. Most came from men, a reflection of male dominance across the tech industry.”
Dastin, J. October 9, 2018. “Amazon scraps secret AI recruiting tool that showed bias against women.” Reuters. https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G
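A minimal sketch of the failure mode Dastin describes, assuming a simple bag-of-words classifier (the details of Amazon’s actual models are not public): a model trained on historically skewed hiring outcomes learns a negative weight for a token that tracks gender rather than job performance. All resumes and labels below are synthetic.

```python
# Hypothetical sketch: a resume classifier trained on skewed historical
# outcomes encodes the bias as if it were a predictive signal.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Synthetic training data: (resume text, 1 = historically hired, 0 = not)
resumes = [
    ("python java leadership chess club", 1),
    ("c++ systems football team captain", 1),
    ("python data analysis women's chess club", 0),
    ("java leadership women's coding society", 0),
    ("python java systems design", 1),
    ("data analysis women's debate team", 0),
]
texts, labels = zip(*resumes)

# Tokenize (pattern keeps apostrophes so "women's" stays one token)
vec = CountVectorizer(token_pattern=r"[a-z+']+")
X = vec.fit_transform(texts)
clf = LogisticRegression().fit(X, labels)

# The learned coefficient for "women's" is negative: the model has
# absorbed the historical gender imbalance, not job performance.
idx = vec.vocabulary_["women's"]
print("weight for token \"women's\":", clf.coef_[0][idx])
```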
10. Algorithms: Justice (?!)
COMPAS
• predicts the risk that a person accused of a crime will re-offend in the future
• used in sentencing decisions in some US courtrooms
• the algorithm is a commercial trade secret
• outcomes:
• significant disparities between black and white defendants (Angwin et al. 2016);
• accuracy no better than that of untrained laypersons (Yong 2018, The Atlantic).
Martin, K. Ethical implications and accountability of algorithms. Journal of Business Ethics (2018). https://doi.org/10.1007/s10551-018-3921-3
Angwin, J., Larson, J., Mattu, S., & Kirchner, L. (2016). Machine bias: There’s software used across the country to predict future criminals. And it’s biased against blacks. ProPublica. https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing. Accessed 3 Aug 2016.
Yong, E. A popular algorithm is no better at predicting crimes than random people. The Atlantic (2018). https://www.theatlantic.com/technology/archive/2018/01/equivant-compas-algorithm/550646/.
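The core of Angwin et al.’s analysis can be illustrated schematically: compare false-positive rates (defendants flagged high-risk who did not in fact re-offend) across groups. The records below are invented for illustration, not actual COMPAS data.

```python
# Schematic version of the fairness check ProPublica ran on COMPAS:
# (group, predicted_high_risk, actually_reoffended) -- all made up.
records = [
    ("black", True,  False), ("black", True,  True),
    ("black", True,  False), ("black", False, False),
    ("white", False, False), ("white", True,  True),
    ("white", False, True),  ("white", False, False),
]

def false_positive_rate(group):
    # FPR = share flagged high-risk among those who did NOT re-offend
    negatives = [r for r in records if r[0] == group and not r[2]]
    flagged = [r for r in negatives if r[1]]
    return len(flagged) / len(negatives)

for g in ("black", "white"):
    print(g, round(false_positive_rate(g), 2))
```

With these toy numbers the black group’s false-positive rate is far higher than the white group’s, which is the shape of the disparity ProPublica reported on real data.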
11. Algorithms
• Understanding of problem
• Understanding of training data
• Incomplete source data (or false proxies)
• Opaque algorithm
• Knowledge vs. harm at scale
Cathy O’Neil, Weapons of Math Destruction
14. Social Media: Fake News
• On Twitter, falsehoods spread significantly faster than truths
“To understand how false news spreads, Vosoughi et al. used a data set of rumor cascades on Twitter from 2006 to 2017. About 126,000 rumors were spread by ∼3 million people. False news reached more people than the truth; the top 1% of false news cascades diffused to between 1000 and 100,000 people, whereas the truth rarely diffused to more than 1000 people. Falsehood also diffused faster than the truth. The degree of novelty and the emotional reactions of recipients may be responsible for the differences observed.” (http://science.sciencemag.org/content/359/6380/1146)
15. Social Media: Personal Data
• Facebook & Cambridge Analytica scandal: harvesting of 87 million users’ personally identifiable data via the Facebook API, without users’ individual consent
• Cadwalladr, C. and E. Graham-Harrison, “Revealed: 50 Million Facebook Profiles Harvested for Cambridge Analytica in Major Data Breach,” The Observer, 17 March 2018.
• Rosenberg, M., N. Confessore and C. Cadwalladr, “How Trump Consultants Exploited the Facebook Data of Millions,” The New York Times, 17 March 2018.
• Solon, O., “Facebook Says Cambridge Analytica May Have Gained 37m More Users’ Data,” The Guardian, 4 April 2018.
16. Social Media
How Cambridge Analytica turned Facebook ‘likes’ into a lucrative political tool
https://www.theguardian.com/technology/2018/mar/17/facebook-cambridge-analytica-kogan-data-algorithm
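The technique behind that headline, as described in the research the article draws on, is to treat each “like” as a binary feature and fit a model that predicts personal traits from the like matrix. A schematic sketch with entirely invented data; the page names and trait labels are placeholders, not anything Cambridge Analytica actually used.

```python
# Schematic "likes to traits" model: rows = users, columns = pages liked.
import numpy as np
from sklearn.linear_model import LogisticRegression

pages = ["page_a", "page_b", "page_c", "page_d"]  # hypothetical pages
likes = np.array([
    [1, 0, 1, 0],
    [1, 1, 0, 0],
    [0, 0, 1, 1],
    [0, 1, 0, 1],
    [1, 0, 0, 0],
    [0, 0, 1, 1],
])
trait = np.array([1, 1, 0, 0, 1, 0])  # e.g. a self-reported trait label

model = LogisticRegression().fit(likes, trait)

# Once fitted, the model scores any user whose likes are visible,
# including users who never took the survey themselves.
new_user = np.array([[1, 0, 1, 0]])
print("predicted trait probability:", model.predict_proba(new_user)[0][1])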
17. Social Media: Platform Cultures
Amnesty International. Violence & Abuse against Women Online. https://infogram.com/1px2nrj55p5ndwcqxxdvp2q03zh3x6lk0m
Amnesty International. Troll Patrol: Findings. https://decoders.amnesty.org/projects/troll-patrol/findings
18. Cyberhate
Goals
• Recruit new group members
• Disseminate propaganda
• Create a sense of transnational identity
• Increase one’s own or the group’s social desirability, e.g. by normalizing hateful views
• Reframe news stories and historical documents
• Mobilize group members for real-life events
Means & Methods
• SEO’d online content repositories
• Dedicated community spaces: forums and social media groups
• Weaponization of platform affordances
• Algorithmic amplification
• Reach, scale, instantaneity
20. Google Search vs. Library Search
Library Search
• Content is supplied by scholarly experts trained in content-specific expertise and research methods
• Content is “vetted” by peer reviewers, journal editors, and content librarians
• Organizational priority: research & education
Google Search
• Content is supplied by people and organizations
• Content is ordered algorithmically by PageRank (algorithm details withheld; a sketch of the core idea follows below)
• Content is vetted only in countries with e.g. anti-hate laws
• Organizational priority: profit
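PageRank’s core idea is published (Brin & Page, 1998), even though Google’s production ranking layers many undisclosed signals on top of it. A minimal power-iteration sketch on a toy link graph:

```python
# Minimal power-iteration PageRank on a toy link graph.
# Google's production ranking adds many undisclosed signals on top.
def pagerank(links, damping=0.85, iters=50):
    pages = list(links)
    rank = {p: 1 / len(pages) for p in pages}
    for _ in range(iters):
        # Teleportation term: every page gets a small baseline share.
        new = {p: (1 - damping) / len(pages) for p in pages}
        for p, outs in links.items():
            targets = outs or pages  # dangling page: spread rank evenly
            for q in targets:
                new[q] += damping * rank[p] / len(targets)
        rank = new
    return rank

# Toy web: pages a and b both link to c; c links back to a.
links = {"a": ["c"], "b": ["c"], "c": ["a"]}
for page, score in sorted(pagerank(links).items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))
```

The heavily linked-to page c ranks highest; b, which nothing links to, ranks lowest. Importance flows through links, with no judgment of whether the content itself is accurate or vetted.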
21. Google Search
“Google is not a conventional company. We do not intend to become one. Throughout Google’s evolution as a privately held company, we have managed Google differently. We have also emphasized an atmosphere of creativity and challenge, which has helped us provide unbiased, accurate and free access to information for those who rely on us around the world.”
(Brin & Page, Letter to Shareholders, https://abc.xyz/investor/founders-letters/2004-ipo-letter/)
22. Google Search Fails
• “Professor”
• Noble, Algorithms of
Oppression
• “black on black crime”
23. Google Search
Search engines are not (legally) public goods or public infrastructure, though part of what makes them powerful is that they are treated as such.
The promise: “unbiased, accurate and free access to information”
The reality:
• unvetted (except where legally mandated) access to information
• profit-driven normalization of e.g. racist stereotypes
• normalization and dissemination of hateful views about marginalized groups
• amplification of extremist points of view