Human Agency on
Algorithmic Systems
ANSGAR KOENE & ELVIRA PEREZ VALLEJOS, UNIVERSITY OF NOTTINGHAM
HELENA WEBB & MENISHA PATEL, UNIVERSITY OF OXFORD
AOIR 2017
User experience satisfaction on social network sites
Human attention is a limited resource
Filter
Good information service = good filtering
Sacrificing control for convenience
Personalized recommendations
• Content-based – similarity to past results the user liked
• Collaborative – results that similar users liked (people with statistically similar tastes/interests)
• Community-based – results that people in the same social network liked (people who are linked on a social network, e.g. ‘friends’)
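To make the three strategies concrete, below is a minimal sketch in Python on invented toy data; the ratings matrix, item features and ‘friends’ graph are all hypothetical, and production recommenders are far more elaborate.

```python
# Minimal sketch of the three recommendation strategies on toy data.
# The ratings, item features and social links are invented for illustration.
import numpy as np

# Rows = users, columns = items; entries are ratings (0 = unrated).
ratings = np.array([[5, 4, 0, 1],
                    [4, 5, 1, 0],
                    [0, 1, 5, 4]], dtype=float)
# One feature vector per item (e.g. topic indicators).
item_features = np.array([[1, 0], [1, 0], [0, 1], [0, 1]], dtype=float)
# Hypothetical social graph: user -> list of friends.
friends = {0: [1], 1: [0], 2: []}

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9)

def content_based(user, item):
    # Similarity of `item` to items this user rated highly in the past.
    liked = [i for i in range(ratings.shape[1]) if ratings[user, i] >= 4]
    return float(np.mean([cosine(item_features[item], item_features[i])
                          for i in liked]))

def collaborative(user, item):
    # Other users' ratings of `item`, weighted by taste similarity.
    pairs = [(cosine(ratings[user], ratings[u]), ratings[u, item])
             for u in range(ratings.shape[0])
             if u != user and ratings[u, item] > 0]
    return sum(s * r for s, r in pairs) / (sum(s for s, _ in pairs) + 1e-9)

def community_based(user, item):
    # Average rating of `item` among the user's declared friends.
    rs = [ratings[f, item] for f in friends[user] if ratings[f, item] > 0]
    return sum(rs) / len(rs) if rs else 0.0

print(content_based(0, 2), collaborative(0, 2), community_based(0, 2))
```

Even this toy version exposes the classic failure modes: the collaborative score degrades to zero when no similar user has rated the item (cold start), and the community score is empty for users with no friends on the platform.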
How do the algorithms work?
User understanding of social media algorithms: Facebook News Feed
Of 40 interviewed participants, more than 60% were entirely unaware of any algorithmic curation on Facebook at all: “They believed every single story from their friends and followed pages appeared in their news feed”.
Published at CHI 2015 (Eslami et al.)
Pre-workshop survey of 96 teenagers (13-17 years old)
• No clear preference between a more personalised or a more ‘organic’ internet experience: 53% more personalised, 47% more ‘organic’
• Lack of awareness about the way search engines rank information, but participants believe it is important for people to know:
  • How much do you know? 36% Not much, 58% A little, 6% Quite a lot
  • Do you think it’s important to know? 62% Yes, 16% Not really, 22% Don’t know
• Regulation role: Who makes sure that the Internet and digital world is safe and neutral? 4% Police, 23% Nobody, 29% Government, 44% The big tech companies
Multi-stakeholder workshop
Participants from academia, education, NGOs and industry; 30 participants in total
Topic: fairness in relation to algorithmic design and practice
Four key case studies: fake news, personalisation, gaming the system, and transparency
What constitutes a fair algorithm?
What kinds of (legal and ethical) responsibilities do Internet companies have to ensure their algorithms produce results that are fair and without bias?
Fairness in relation to algorithmic design and practice: participant recommendations
• Criteria relating to social norms and values
• Criteria relating to system reliability
• Criteria relating to (non-)interference with user control
Criteria relating to social norms and values:
(i) Sometimes disparate outcomes are acceptable if they are based on individual lifestyle choices over which people have control.
(ii) Ethical precautions are more important than higher accuracy.
(iii) There needs to be a balancing of individual values and socio-cultural values. Problem: how to weigh the relevant socio-cultural values?
Criteria relating to system reliability:
(i) Results must be balanced with due regard for trustworthiness.
(ii) Need for independent system evaluation and monitoring over
time.
Criteria relating to (non-)interference with user control:
(i) The subjective experience of fairness depends on the user's objectives at the time of use, and therefore requires an ability to tune the data and the algorithm.
(ii) Users should be able to limit data collection about them and its use. Inferred personal data is still personal data. The meaning assigned to the data must be justified to the user.
(iii) The functioning of the algorithm should be demonstrated/explained in a way that can be understood by the data subject.
Criteria relating to (non-)interference with user control (continued):
(iv) If not vital to the task, there should be an option to opt out of the algorithm.
(v) Users must have the freedom to explore algorithm effects, even if this would increase the ability to “game the system”.
(vi) Need for clear means of appeal/redress for the impact of the algorithmic system.
Take (some) control of News Feed priorities
Letting users choose the algorithm
Evaluating fairness from outputs only
[Chart: participants' rankings of the allocation outputs, from most preferred to least preferred]
Evaluating fairness with knowledge about the algorithm decision principles
• A1: minimise disparity while guaranteeing at least 70% of the maximum possible total
• A2: maximise the minimum individual outcome while guaranteeing at least 70% of the maximum possible total
• A3: maximise the total
• A4: maximise the minimum individual outcome
• A5: minimise disparity
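Stated as code, each of A1-A5 is just a different objective (and optional constraint) over the same set of candidate allocations. The sketch below uses invented numbers, not the workshop's actual task, to show how the five rules can pick different ‘best’ allocations:

```python
# Hypothetical sketch: choosing an allocation of a scarce resource to two
# people under objectives A1-A5. The candidate (outcome_x, outcome_y)
# pairs are invented numbers for illustration.
candidates = [(12, 0), (8, 4), (6, 5), (5, 5), (4, 4)]

def disparity(c):
    return abs(c[0] - c[1])

best_total = max(sum(c) for c in candidates)
# Candidates that keep at least 70% of the maximum possible total.
constrained = [c for c in candidates if sum(c) >= 0.7 * best_total]

a1 = min(constrained, key=disparity)  # A1: min disparity, >= 70% of max total
a2 = max(constrained, key=min)        # A2: max the minimum outcome, same constraint
a3 = max(candidates, key=sum)         # A3: maximise total
a4 = max(candidates, key=min)         # A4: maximise the minimum outcome
a5 = min(candidates, key=disparity)   # A5: minimise disparity

for name, alloc in zip(["A1", "A2", "A3", "A4", "A5"], [a1, a2, a3, a4, a5]):
    print(name, alloc, "total:", sum(alloc), "gap:", disparity(alloc))
```

With these numbers A3 picks (12, 0), A4 picks (6, 5) and A5 picks (5, 5), which is exactly why participants can rank the same five algorithms very differently.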
[Chart: participants' rankings of algorithms A1-A5, from most preferred to least preferred]
Conclusion
Algorithmic mediation can play an important role in improving the usefulness of online services.
Users want more options to understand, adjust, or even opt out of algorithmic mediation.
Users do not agree on a single option when choosing a ‘best’ algorithm for a given task.
Thank you!
http://unbias.wp.horizon.ac.uk/
Open invitation to join the P7003 working group
http://sites.ieee.org/sagroups-7003/
Revealing News Feed behaviour
Participants indicate desired changes
Machine learning principles
Classifiers
Clustering
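As a minimal, generic illustration of the two principles named here (not material from the talk): a classifier learns a decision rule from labelled examples, while clustering groups the same points without ever seeing labels. The toy example below assumes scikit-learn is available:

```python
# Classification (supervised) vs clustering (unsupervised) on toy data.
from sklearn.datasets import make_blobs
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

# 200 points in 2-D, drawn from 3 Gaussian blobs; y holds the true labels.
X, y = make_blobs(n_samples=200, centers=3, random_state=0)

# Classifier: fit on labelled data, then predict labels for new points.
clf = LogisticRegression(max_iter=1000).fit(X, y)
print("training accuracy:", clf.score(X, y))

# Clustering: group the same points without ever using y.
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
print("first cluster assignments:", km.labels_[:10])
```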
E. Bakshy, S. Messing & L.A. Adamic, “Exposure to ideologically diverse news and opinion on Facebook”, Science, 348, 1130-1132, 2015
Echo-chamber enhancement by the News Feed algorithm
[Figure from the study: proportion of content that is cross-cutting, based on 10.1 million active US Facebook users]
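The study's headline quantity, the proportion of content that is cross-cutting, counts stories whose ideological alignment is opposite to the user's own affiliation. A toy computation with invented data (not the paper's) looks like this:

```python
# Hypothetical feed for one self-identified liberal user; alignments invented.
user_affiliation = "liberal"
feed = [
    {"story": "a", "alignment": "conservative"},
    {"story": "b", "alignment": "liberal"},
    {"story": "c", "alignment": "conservative"},
    {"story": "d", "alignment": "liberal"},
]

# Cross-cutting = content from the opposite side of the user's affiliation.
cross_cutting = sum(1 for s in feed if s["alignment"] != user_affiliation)
print(f"proportion cross-cutting: {cross_cutting / len(feed):.2f}")  # 0.50
```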
E. Bakshy, S. Messing & L.A. Adamic, “Exposure to ideologically diverse news and opinion on Facebook”, Science, 348, 1130-1132, 2015
Positioning effect in the News Feed


Editor's Notes

• #13 Our first stakeholder workshop was held on February 3rd 2017, at the Digital Catapult in London. The first workshop brought together participants from academia, education, NGOs and enterprises. We were fortunate to have 30 participants on the day, which was a great turnout. The workshop itself focused on four case studies, each chosen because it concerned a key current debate surrounding the use of algorithms and fairness. The case studies centred on: fake news, personalisation, gaming the system, and transparency.
• #20 This WP aims to develop a methodology, and the necessary IT and techniques, for revealing the impact of algorithmic biases in personalisation-based platforms to non-experts (e.g. youths), and for co-developing “fairer” algorithms in close collaboration with specialists and non-expert users. In Year 1, Sofia and Michael have been running a task that asks participants to make task allocation decisions. In a situation in which resources are limited, different algorithms might be used to determine who receives what. Participants are asked to determine which algorithm is best suited to make the allocation, and this inevitably brings up issues of fairness. Discussion reveals different models of fairness. These findings will be put towards further work on the processes of algorithm design and the possibility of developing a fair algorithm.