In 2018, Google’s CEO announced seven new ethical principles for the company’s AI research. Stating that “at its heart” AI is computer programming, Google put together an ethical advisory board. After only a week, and without ever meeting, the board was disbanded following a Twitter storm of protest over one of its members, who came from a right-wing think tank. This development, along with Google’s framing of AI as fundamentally “computer programming”, reveals a potential weakness in how technology companies approach ethics, sometimes known as ethics-washing: a performative malpractice that leaves social injustices untouched. In this presentation I draw on two recent research projects on surveillance and facial recognition technologies to explore ways in which technological governance may need to move beyond ethics, bias and fairness to engage with how values are structurally encoded into decision-making systems.
1. Surveillance and the Smart City
Is Ethics Enough?
Jeremy W. Crampton
Professor of Urban Data Analysis
2. The engineer’s view
“Solutionism”
Problem + tools = solutions
Problems externally derived
Tools drive possibilities
–“We can’t code for that”
–Tools are tools! Neutral
Melvin Kranzberg, History of Technology professor, Georgia Tech
4. Personal data extraction…
We touch our smartphones 2,617 times a day
79% of owners check their device within 15 minutes of waking
33% of Americans would rather give up sex than lose their phone*
* Rana Foroohar, Don’t Be Evil: The Case Against Big Tech, 2019, p. 28
9. Is ethics enough? (If not, why not?)
1. Taking “the ecosystem” seriously
…the Microsoft, Google, or Apple ecosystem
Typically, all the hardware devices and software on that system
Ecology: the interaction of all organisms & their environment
10. 2. Technology is the symptom, not the cause, of injustice
> Not assessing biases of data outputs, but the systemic or structural encoding of values
https://www.newstatesman.com/science-tech/technology/2019/09/new-jim-code-ruha-benjamin-racial-discrimination-algorithm
11. 3. Ethics over-emphasizes (Hoffmann 2019):
> Bad actors
> Single-axis disadvantage (vs. intersectionality)
> Technological consequences over…
structural norms of injustice (esp. racism and patriarchy)
15. Survey of Surveillance Anxiety and Festivals
Rationale:
NHS: 8% of the UK has an “anxiety disorder”; self-harm is rising (esp. among women)
Surveillance is introduced piecemeal, slowly, into the smart city (boiling frog)
Music festivals (Glasto, Wire) have experienced an upsurge in theft, assault, even mass killing
16. Festivals as formerly “liminal” spaces
2018 YouGov survey
Suspect at music concert after being detected by facial recognition
17. What role does surveillance play in producing anxiety (psychological) and stress (biological)?
Is the stress burden distributed equally across demographic groups?
Online pilot survey, n = 201
18. Top-line findings:
> Surveillance is not a universal public good
> A majority felt surveillance itself was a safety concern
> Disproportionality across gender
> Peer/bottom-up measures of security were much preferred
> People nevertheless did not report changing any of their behaviours
20. …and implications for the smart city
1. Women had more concerns prior to attending; more in favor of surveillance than men and non-binary respondents (44%-29%-11%)
But ambivalent: surveillant security seen as good only for reporting after a crime (comments)
--Only 1 in 3 women felt more surveillance makes them safer
--Women fear sex-related assault much more than men (51%-13%)
Related to the long history of the women’s movement questioning privacy as patriarchal?
--But complex: women as subjects of surveillance; Rose’s paradoxical space
2. Bottom-up, peer-based security measures much more widely favored than top-down security surveillance
--health tents, going with friends
A one-size-fits-all smart city (increased security surveillance) may be detrimental
29. Conclusions & Further Research
> How do background (infra)structures such as the built environment produce & encode normative values?
e.g., “People’s AI” (w/ J. Xing, P. Blythe, UO)
> Algorithms and data do not “sit above” but are “intertwined in the production of social and cultural meaning” (Hoffmann 2019)
e.g., value-based ML, esp. gender
Editor's Notes
“Vibe” is a key factor: a particular socio-spatial experience
Applications include driver emotion & attention evaluation
Putting it all together: realtime head pose, eye gaze, lips, brows and eye widening
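To make the attention-evaluation note concrete, the sketch below computes the eye aspect ratio (EAR), a common proxy for eye openness (blinking, drowsiness, “eye widening”) in facial-landmark pipelines. This is a minimal illustration, not the system the notes describe: the landmark coordinates here are hypothetical, and a real application would obtain them from a facial landmark detector.

```python
# Eye aspect ratio (EAR): ratio of the eye's vertical openings to its
# horizontal width. Open eyes have a higher EAR; blinks/closed eyes
# drive it toward zero. Landmarks below are hypothetical examples.
from math import dist

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmarks around one eye, ordered
    [outer corner, upper-1, upper-2, inner corner, lower-2, lower-1]."""
    vertical = dist(eye[1], eye[5]) + dist(eye[2], eye[4])   # two vertical spans
    horizontal = dist(eye[0], eye[3])                        # corner-to-corner width
    return vertical / (2.0 * horizontal)

# Hypothetical landmark sets: a wide-open eye vs. a nearly closed one.
open_eye = [(0, 0), (1, 2), (3, 2), (4, 0), (3, -2), (1, -2)]
closed_eye = [(0, 0), (1, 0.3), (3, 0.3), (4, 0), (3, -0.3), (1, -0.3)]

print(eye_aspect_ratio(open_eye))    # → 1.0
print(eye_aspect_ratio(closed_eye))  # → 0.15
```

A detector would combine signals like this (per frame, alongside head pose and gaze) and threshold or smooth them over time to flag inattention.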