Hartmut Winkler (1998) described search engines as "metamedia" on the Internet. Cf. Angela Merkel's 2016 remark that search engines are "distorting our perception": https://www.theguardian.com/world/2016/oct/27/angela-merkel-internet-search-engines-are-distorting-our-perception
Giddens, Anthony (1994). ‘Living in a Post-Traditional Society’, in: Reflexive Modernization: Politics, Tradition and Aesthetics in the Modern Social Order, ed. Ulrich Beck, Anthony Giddens, and Scott Lash, Stanford: Stanford University Press, 56-109.
Spence, Michael. 1973. “Job Market Signaling.” The Quarterly Journal of Economics 87(3). Oxford University Press: 355–374, p. 357.
Desrosières, Alain. 2001. “How Real Are Statistics? Four Possible Attitudes.” Social Research 68 (2). JSTOR: 339–55.
For a debate concerning the existence of an actual "Right to Explanation" see: Wachter, Sandra, Brent Mittelstadt, and Luciano Floridi. 2017. "Why a Right to Explanation of Automated Decision-Making Does Not Exist in the General Data Protection Regulation." International Data Privacy Law. Available at SSRN: https://ssrn.com/abstract=2903469 or http://dx.doi.org/10.2139/ssrn.2903469
An inquiry into the blind spots of the ethical approach: extending its reach? Complications? Once we recognize that these are contested normative issues, things get complicated ("the good", at a time when it is under pressure). The humanities can widen the perspective and show the interdependence.
Brent Daniel Mittelstadt, Patrick Allo, Mariarosaria Taddeo, Sandra Wachter and Luciano Floridi. 2016. The Ethics of Algorithms: Mapping the Debate. Big Data & Society 3(2).
Barocas, Solon, and Andrew D. Selbst. 2015. “Big Data's Disparate Impact.” SSRN.
Dwork, Cynthia, Moritz Hardt, Toniann Pitassi, Omer Reingold, and Richard Zemel. 2011. “Fairness Through Awareness.” arXiv.org.
What the talk somewhat misses is "and their politics". https://www.youtube.com/watch?v=jIXIuYdnyyk
Different understandings and hierarchies of values: Rawls stresses equal opportunity; Sen's critique of Rawls concerns, first and foremost, the focus on institutional procedures (not just consequentialism); Cohen defends egalitarianism against Rawls' difference principle.
Rather than asking "is this fair or unfair?", we can inquire what "brand" of fairness is being pursued.
Dahrendorf, Ralf (2005): “The Rise and Fall of Meritocracy.” In: Project Syndicate, April 13 (https://www.project-syndicate.org/commentary/the-rise-and-fall-of-meritocracy).
Allen, Ansgar (2012): “Life Without the ‘X’ Factor: Meritocracy Past and Present.” In: Power and Education 4/1, pp. 4–19.
Littler, Jo. 2013. “Meritocracy as Plutocracy: the Marketising of ‘Equality’ Under Neoliberalism.” New Formations 80 (80): 52–72. "Through neoliberalism meritocracy has become an alibi for plutocracy, or government by a wealthy elite. It has become a key ideological term in the reproduction of neoliberal culture in Britain. It has done so by seizing the idea, practice and discourse of greater social equality which emerged in the first half of the twentieth century and marketising it. Meritocracy, as a potent blend of an essentialised notion of ‘talent’, competitive individualism and belief in social mobility, is mobilised to both disguise and gain consent for the economic inequalities wrought through neoliberalism. However, at the same time, such discourse is neither inevitable nor consistent."
What constellations are algorithms embedded in? A whole program for investigation!
Hayek, Friedrich A. 2002. “Competition as a Discovery Procedure.” The Quarterly Journal of Austrian Economics 5 (3): 9–23.
Werron, Tobias. 2015. “Why Do We Believe in Competition? A Historical-Sociological View of Competition as an Institutionalized Modern Imaginary.” Distinktion: Scandinavian Journal of Social Theory 16 (2). Taylor & Francis: 186–210. doi:10.1080/1600910X.2015.1049190.
Goody, Jack (1977): The Domestication of the Savage Mind, Cambridge, UK: Cambridge University Press.
Ingber, Stanley. 1984. "The Marketplace of Ideas: a Legitimizing Myth." Duke Law Journal 1984 (1):1–91.
Leads to more theoretically grounded critiques than "filter bubbles".
What is prison for? Rehabilitation? Punishment? Extraction? What are recidivism scores for? Longer incarceration? Distribution of resources?
Bogost, Ian (2015, January 15). The Cathedral of Computation. The Atlantic. https://www.theatlantic.com/technology/archive/2015/01/the-cathedral-of-computation/384300/
AAAS AMA on r/Science: “Hi, we’re researchers from Google, Microsoft, and Facebook who study Artificial Intelligence. Ask us anything!” The Winnower 5:e151896.65484 (2018).
Chabot, Pascal (2003). La philosophie de Simondon, Paris: Vrin.
Bucher, Taina (2013). ‘The Friendship Assemblage: Investigating Programmed Sociality on Facebook’, in: Television & New Media, vol. 14, no. 6, 479-493.
Ratto, M. (2005). Embedded technical expression: code and the leveraging of functionality. Information Society 21: 205–213.
Behavior is already steered (Facebook is not "natural" or "normal" in any sense of the term).
Design itself often guided by A/B testing
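As an aside, the statistical core of a simple A/B test is a two-proportion z-test: did variant B convert at a different rate than variant A? The sketch below is a generic illustration of that statistic, not a description of any particular platform's testing pipeline; the function name and numbers are hypothetical.

```python
import math

def ab_test_z(conversions_a, n_a, conversions_b, n_b):
    """Two-proportion z-test, the standard statistic behind a simple
    A/B test. Returns the z-score and a two-sided p-value."""
    p_a, p_b = conversions_a / n_a, conversions_b / n_b
    # Pooled conversion rate under the null hypothesis of no difference.
    pooled = (conversions_a + conversions_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical experiment: variant B converts 15% vs. 10% for A.
z, p_value = ab_test_z(100, 1000, 150, 1000)
```

A design team would typically ship the variant only when the p-value falls below a pre-agreed threshold; the politics lie in which metric counts as a "conversion" in the first place.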
Gürses, S., & van Hoboken, J. V. J. (2017, October 19). Privacy after the Agile Turn. https://doi.org/10.31235/osf.io/9gy73
Samuelson, Paul A. 1948. "Consumption Theory in Terms of Revealed Preference." Economica 15 (60): 243–253.
Objectivity as a democratic value?
Agre, Philip E. 1994. “Surveillance and Capture: Two Models of Privacy.” The Information Society 10 (2): 101–127.
Ciborra, Claudio U. (1985). ‘Reframing the Role of Computers in Organizations: The Transaction Costs Approach’, in: Proceedings of the Sixth International Conference on Information Systems, ed. Lynn Gallegos, Richard Welke and James C. Wetherbe, Chicago: Society of Information Management, 57-69.
Go beyond the individual service, look at competition, monopoly.
The production of behavior? COMPAS is part of a system that produces recidivism.
Nourished by earlier reasoning
This is a way to generalize and to widen the field beyond small cases? How does this compare to other systems? Average RBD per day? Variance?
Rieder, Bernhard, Ariadna Matamoros-Fernández, and Òscar Coromina. 2018. “From Ranking Algorithms to ‘Ranking Cultures’.” Convergence: The International Journal of Research Into New Media Technologies 24 (1): 50–68. doi:10.1177/1354856517736982.
Webber, William, Alistair Moffat, and Justin Zobel. 2010. “A Similarity Measure for Indefinite Rankings.” ACM Transactions on Information Systems 28 (4).
Not "bias", but Christopher Small's (1998) notion of "musicking". http://labs.polsys.net/playground/spotify/
https://www.youtube.com/watch?v=fMym_BKWQzk
Stanfill, M. 2015. “The Interface as Discourse: the Production of Norms Through Web Design.” New Media & Society 17 (7). SAGE Publications: 1059–74. doi:10.1177/1461444814520873.
Test and develop the margins; what kind of cognition/interestedness is possible? See how your various decisions have consequences.
Critical categories? Retraining?
Nissenbaum H. Values in technical design. In: Mitcham C, editor. Encyclopedia of science, technology, and ethics. New York: Macmillan; 2005. pp. 66–70.
Truth, Justice, and Technicity: from Bias to the Politics of Systems
Universiteit van Amsterdam
Amsterdam, July 2, 2018
Terms like data mining, data science, big data, machine learning, or algorithmic decision-making point toward a set of practices that have come to play important roles in a variety of domains. These roles are increasingly under scrutiny.
What is the role of the humanities scholar in this?
computer says no…
Areas of decision-making like hiring, criminal justice and policing,
access to credit and insurance, dynamic pricing, and information
ordering are some of the most emblematic problem areas.
Data mining is concerned with assessing differences and similarities
between entities in a dataset in the context of some task or decision.
The desire to discriminate – in "decentered, non-traditional societies"
(Giddens 1994) that complicate "signaling" (Spence 1973) – lies at the
heart of the practice, raising specific issues in areas like hiring, credit,
or criminal justice.
"We have stipulated that the employer cannot directly observe the marginal product
prior to hiring. What he does observe is a plethora of personal data in the form of
observable characteristics and attributes of the individual, and it is these that must
ultimately determine his assessment of the lottery he is buying." (Spence 1973)
Data mining is often used to realize "interested readings of reality" that
detect patterns in data as they relate to desired operational outcomes.
Machine learning, in particular, shifts the normative "core" of decisions
towards the empirical: the data, the target variable, and some process
of labeling. Signals become meaningful in relation to a distinction.
This indicates a further shift from "metrological realism" to "accounting
realism", where the "'equivalence space' is composed not of physical
quantities (space and time), but of a general equivalent: money"
(Desrosières 2001)
Inductive techniques allow for a deep embedding of "interestedness" into
practices and infrastructures.
The "dominant perspective" on the normative dimension of algorithmic
decision-making is highlighting important problems, but the
phenomenon merits broader interrogation.
This presentation suggests two directions for critical expansion:
1) interrogating and engaging notions of justice;
2) thinking in terms of systems rather than individual algorithms;
These lines of reasoning lead toward:
3) three ideas for possible digital methods projects;
Epistemic (~truth) and "normative" concerns raised by algorithmic decision-making (Mittelstadt et al. 2016)
Expansion 1: from truth to justice
Epistemic concerns include the transformations of probability
assessments into binary decisions, opacity, and data problems such as the
over- or underrepresentation of a group in a dataset. Normative concerns
include disparate impact, loss of autonomy, and challenges to privacy.
"An action can be found discriminatory, for example, solely from its effect on a protected
class of people, even if made on the basis of conclusive, scrutable and well-founded
evidence." (Mittelstadt et al. 2017)
This more difficult problem arises because social reality is biased:
"Data mining takes the existing state of the world as a given, and ranks candidates according
to their predicted attributes in that world." (Barocas & Selbst 2015)
Decisions are based on the traces from societies characterized by
centuries of inequality and domination. Data mining can make these
inequalities actionable and lead to disparate impact on protected classes.
Proposed solutions focus on "disparate impact detection" (Barocas &
Selbst 2015) and (technical) compensation strategies that implement "fair
affirmative action" (Dwork et al. 2011)
These strategies raise complicated questions:
☉ How to define and delimit protected classes?
☉ What are the risks of collecting data containing these class attributes?
☉ What are the broader societal effects of such classifications?
☉ Can these strategies be transferred to different national contexts?
"Any discrimination based on any ground such as sex, race, colour, ethnic or social origin,
genetic features, language, religion or belief, political or any other opinion, membership of a
national minority, property, birth, disability, age or sexual orientation shall be prohibited."
(EU Charter of Fundamental Rights)
"This paper discusses several fairness criteria that have
recently been applied to assess the fairness of recidivism
prediction instruments. We demonstrate that the criteria
cannot all be simultaneously satisfied when recidivism
prevalence differs across groups." (Chouldechova 2017)
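The arithmetic behind this impossibility can be illustrated with a small sketch. Chouldechova derives an identity tying a group's false positive rate to its prevalence, the instrument's positive predictive value (PPV), and its false negative rate; the function and numbers below are illustrative assumptions, not figures from any actual recidivism instrument.

```python
def fpr_from_calibration(prevalence, ppv, fnr):
    """Chouldechova's (2017) identity for a binary classifier:
    FPR = p/(1-p) * (1-PPV)/PPV * (1-FNR), where p is prevalence.
    If PPV and FNR are equal across groups (a form of calibration),
    differing prevalence forces differing false positive rates."""
    return prevalence / (1 - prevalence) * (1 - ppv) / ppv * (1 - fnr)

# Two hypothetical groups with the same PPV and FNR but different
# base rates of recidivism:
fpr_a = fpr_from_calibration(prevalence=0.5, ppv=0.7, fnr=0.3)
fpr_b = fpr_from_calibration(prevalence=0.3, ppv=0.7, fnr=0.3)
# fpr_a and fpr_b necessarily differ, so error-rate balance fails.
```

The identity follows directly from the confusion-matrix definitions, which is why no amount of tuning can satisfy all criteria at once when prevalence differs.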
Justice is NP-hard
Different – and (possibly) incommensurable – notions of justice are
embedded in different moral and political philosophies.
Moral and political narratives
The deep entanglement between algorithmic data analysis and
organizational practice in business and government suggests an
examination of the specific normative commitments made.
Which understandings of justice (and other values) are being put forward
explicitly or implicitly?
The narrative informing much of the procedural fairness movement is that of meritocracy:
"Meritocracy has become an idea as uncontroversial and as homely as 'motherhood and
apple pie'." (Littler 2013)
"Indeed, nowadays meritocracy seems to be simply another version of the inequality that
characterises all societies." (Dahrendorf 2005)
While Dahrendorf still criticized academic achievement as the central
measure, data mining can take almost everything as an indicator for
"merit", performing opaque readings of the very inequalities the
certificate system hoped to reduce.
"Meritocracy has shifted from impersonal technology to a situation where the relation
between abilities and rewards has been deeply personalised." (Allen 2012)
True or not, the narrative remains a powerful legitimizing myth.
"Meritocracy, as a potent blend of an essentialised notion of 'talent', competitive
individualism and belief in social mobility, is mobilised to both disguise and gain consent
for the economic inequalities wrought through neoliberalism." (Littler 2013)
The notion of meritocracy points to the central role of competition in our
cultural and moral imaginaries.
Foucault (2004) argues that the key difference between classical liberalism
and neoliberalism is not the belief in markets, but whether specialization
and exchange or competition are the main source of wealth creation.
"Competition is important primarily as a discovery procedure whereby entrepreneurs
constantly search for unexploited opportunities that can also be taken advantage of by
others." (Hayek 2002 )
The extension of competitive constellations (e.g. into the public sector)
multiplies opportunities for data-based decision-making.
"Why do we believe in competition? Why do we, at least many of us, think of it as a
beneficial societal institution? Which particular kind of competition is at the heart of this
belief?" (Werron 2015)
If data mining provides new "levers on 'reality'" (Goody 1977), new
forms of designating winners and losers, is the focus on procedural
fairness and non-discrimination enough?
Beyond procedural justice
Another example: information diversity
Information diversity is often seen as a desirable good for democratic life.
"Personalisation algorithms reduce the diversity of information users encounter by excluding
content deemed irrelevant or contradictory to the user's beliefs. Information diversity can
thus be considered an enabling condition for autonomy." (Mittelstadt et al. 2016)
A critical perspective should interrogate the concept of a "marketplace of
ideas" as a "legitimizing myth" (Ingber 1984).
"In our complex society, affected by both sophisticated communication technology and
unequal allocations of resources and skills, the marketplace's inevitable bias supports
entrenched power structures or ideologies. […] A diversity of perspectives first requires a
corresponding diversity of social experiences and opportunities." (Ingber 1984)
The singular focus on the isolated agency of individual algorithms runs
against recent understandings of technology as infrastructure,
assemblage, or actor-network rather than as artifacts without causal
coupling or history.
"Concepts like 'algorithm' have become sloppy shorthands, slang terms for the act of
mistaking multipart complex systems for simple, singular ones." (Bogost 2015)
"[T]he competitive advantage really comes from the hard work of what you do with the
algorithm and all the processes around making a product, not from the core algorithm
itself." (Norvig 2018)
"Technical invention consists in rendering a system of disparate elements coherent."
This can go (far) beyond questions of data collection/construction.
Expansion 2: from algorithms to systems
Grammars of action
"Specific configurations of code, or programs, enable
some actions over others, reflecting the choices of
programmers; these choices structure users’ experiences,
what they can and cannot do or say with or through a
program. This structuring is not only political, but can be
considered expressive: a sort of embedded expression." (Ratto 2005)
"Software configures friendship online by encoding
values and decisions about what is important, useful, and
relevant and what is not. Software restricts certain
activities by making others possible or impossible. As I
have shown, this becomes apparent when considering
the multifarious ways in which software […]." (Bucher 2013)
To move beyond individual algorithms we can look at the whole range of
"ontologies" and "functional expressions" (Petersen 2013) at work.
We can also inquire into the principles and practices informing design.
"[A]gile programming practices allow developers across services to continuously tweak,
remove, or add new features using 'build-measure-learn feedback loops'." (Gürses & van Hoboken 2017)
At the center of the guiding rationale lies the theory of "revealed
preference", which holds that "the individual guinea-pig, by his market
behaviour, reveals his preference pattern" (Samuelson 1948). This fuels
and justifies the use of feedback signals as "votes".
"[W]e're making a major change to how we build
Facebook. I'm changing the goal I give our product
teams from focusing on helping you find relevant
content to helping you have more meaningful social
interactions. […] Now, I want to be clear: by making
these changes, I expect the time people spend on
Facebook and some measures of engagement will go
down. But I also expect the time you do spend on
Facebook will be more valuable. And if we do the right
thing, I believe that will be good for our community and
our business over the long term too." (Zuckerberg 2018)
Here, values seem to clash directly with the
ad-driven business model and the IPO logic.
Data mining calls for more descriptive ethics!
Values in design
The expansion of market forms
By lowering transaction cost, information technology has facilitated the
organization of many activities around market forms (Ciborra 1985).
"[B]y imposing a mathematically precise form upon previously unformalized activities,
capture standardizes those activities and their component elements and thereby prepares
them […] for an eventual transition to market-based relationships." (Agre 1994)
Some heatedly debated instances of algorithmic structuring concern
platforms that enact some kind of market structure – e.g. Facebook News
Feed (posts), Google Search (documents), Uber (transportation), etc.
Data and algorithms are used to optimize transactions, often with explicit
appeals to democratic values or consumer benefit.
Algorithmic coordination affords interested optimization.
From a purely economic framing of platforms as "intermediaries" to a
wider understanding as "mediators" and "curators of public discourse"
"New operators such as Google, Microsoft, Yahoo! and Apple, as well as the new, rising
social media firms, such as Facebook or Twitter, should by now be included in the list of the
most powerful media organisations worldwide." (Centre for Media Pluralism and Media Freedom)
Do we accept "winner takes all" dynamics and cross-sector ownership /
expansion in the media sector? There is a tradition of limiting
concentration and fostering diversity in "media-like" domains.
Shifts in media power
If data mining is concerned with assessing differences and similarities,
in relation to a desire to distinguish, we can interrogate more than the
methodologies and outcomes of decisions.
What are the normative commitments made in particular contexts and
how do they connect to wider systems of value?
Thinking in terms of larger ensembles and distributed causality points
toward many instances of technical and institutional design that have to
be taken into account.
Three interconnected ideas for possible digital methods projects (or
1) descriptive assemblage;
2) engaging techniques;
3) engaging values (in design);
Three ideas for digital methods projects
RankFlow for [gamergate], [syria], [trump].
(Rieder, Matamoros, Coromina, 2018)
Correlating Change (Rieder,
Matamoros, Coromina 2018)
Rank-Biased Distance (Webber, Moffat, Zobel 2010), shown for persistence parameters p = 1 and p = 0.8.
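For reference, the truncated form of rank-biased overlap (RBO) from Webber, Moffat and Zobel (2010) can be sketched in a few lines. Taking "distance" as the complement of overlap is an assumption here, and note that the truncated prefix sum, unlike the full extrapolated RBO, stays below 1 even for identical lists.

```python
def rank_biased_overlap(s, t, p=0.8):
    """Truncated rank-biased overlap between two ranked lists.
    p is the persistence parameter: higher p weights deeper ranks more.
    Identical lists of length k score 1 - p**k under this truncation."""
    depth = min(len(s), len(t))
    seen_s, seen_t = set(), set()
    total = 0.0
    for d in range(1, depth + 1):
        seen_s.add(s[d - 1])
        seen_t.add(t[d - 1])
        agreement = len(seen_s & seen_t) / d  # prefix overlap at depth d
        total += p ** (d - 1) * agreement
    return (1 - p) * total

def rank_biased_distance(s, t, p=0.8):
    # Distance as the complement of overlap (an illustrative convention).
    return 1.0 - rank_biased_overlap(s, t, p)
```

Averaging rank_biased_distance over consecutive daily rankings would yield something like the "average RBD per day" mentioned in the notes above.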
Three directions: description
"[S]ocial scientists might seek to elaborate a set of social scientific inscription devices,
borrowing from their colleagues in natural sciences, in market research, information
technology, etc., or they may prefer to champion description in the form of unique
narratives, much as it has been deployed in the humanities and cultural disciplines."
Combining "fairness forensics" (Crawford 2017) with other forms of
analysis, e.g. "discursive interface analysis" (Stanfill 2015) and forms of
critique, can identify layers of integrated forms and functions, discuss
their politics, and identify "points of intervention".
First idea: descriptive assemblage
Second idea: engaging techniques
Many of the techniques in use are available and ready for
experimentation. Digital methods is a great space to experiment in
relation with actual cases.
Third idea: engaging values (in design)
Engaging "values in technical design" (Nissenbaum 2005) can take
different forms, one is to think about alternatives.
What should the "politics" of YouTube be? What would that look like?
Thinking about alternatives can deepen critique and help us confront
our own politics.
"[W]e're making a major change to how we build
Facebook. I'm changing the goal I give our product teams
from focusing on helping you find relevant content to
helping you have more meaningful social interactions.
[…] Now, I want to be clear: by making these changes, I
expect the time people spend on Facebook and some
measures of engagement will go down. But I also expect
the time you do spend on Facebook will be more
valuable. And if we do the right thing, I believe that will
be good for our community and our business over the
long term too." (Zuckerberg 2018)
What more is there to say about alternative
modes and possibilities?