User agency on social networks that
are mediated by algorithms
Ansgar Koene
HORIZON Digital Economy Research,
University of Nottingham
User experience satisfaction on social network sites
Human attention is a limited resource
Filter
Information services, e.g. internet search, news feeds, etc.
• free-to-use => no competition on price
• lots of results => no competition on quantity
• Competition on quality of service
• Quality = relevance
= appropriate filtering
Good information service = good filtering
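To make "quality = relevance = appropriate filtering" concrete, here is a minimal sketch of relevance-ranked retrieval; the keyword-overlap score and the toy documents are illustrative assumptions, not how any real service scores results.

```python
# Minimal sketch: rank results by a toy relevance score.
# The overlap score is an illustrative assumption; real services
# combine many signals (text match, freshness, personalization, ...).

def relevance(query: str, document: str) -> float:
    """Fraction of query terms that appear in the document."""
    q_terms = set(query.lower().split())
    d_terms = set(document.lower().split())
    return len(q_terms & d_terms) / len(q_terms)

def search(query: str, documents: list[str], top_k: int = 3) -> list[str]:
    """Return the top_k documents, best relevance first: the 'filter'."""
    ranked = sorted(documents, key=lambda d: relevance(query, d), reverse=True)
    return ranked[:top_k]

docs = [
    "algorithmic curation of news feeds",
    "cat pictures and memes",
    "news about social network algorithms",
]
print(search("social network news", docs, top_k=2))
# ['news about social network algorithms', 'algorithmic curation of news feeds']
```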
Sacrificing control for Convenience
Personalized recommendations
• Content-based – similarity to past results the user liked
• Collaborative – results that similar users liked (people with statistically similar tastes/interests)
• Community-based – results that people in the same social network liked (people who are linked on a social network, e.g. ‘friends’)
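As a rough illustration of how these three flavours differ, here is a toy sketch over a hand-made ratings table; all data, names and scoring functions are assumptions for the example, not how production recommenders (matrix factorization, learned embeddings, etc.) actually work.

```python
# Toy sketches of the three recommendation flavours. All data and
# function names here are illustrative assumptions.

ratings = {  # user -> {item: rating from 1 (disliked) to 5 (loved)}
    "alice": {"jazz": 5, "rock": 1, "folk": 4},
    "bob":   {"jazz": 4, "rock": 2, "folk": 5},
    "carol": {"jazz": 1, "rock": 5},
}
features = {  # item -> content descriptors
    "jazz": {"acoustic", "instrumental"},
    "rock": {"electric", "loud"},
    "folk": {"acoustic", "vocal"},
}
friends = {"alice": {"carol"}, "bob": {"alice"}, "carol": {"alice"}}

def content_based(user, item):
    """Content-based: how much the item resembles items the user liked."""
    liked = [i for i, r in ratings[user].items() if r >= 4 and i != item]
    if not liked:
        return 0.0
    return max(len(features[item] & features[i]) / len(features[item] | features[i])
               for i in liked)

def similarity(u, v):
    """Taste similarity: share of co-rated items where ratings roughly agree."""
    shared = ratings[u].keys() & ratings[v].keys()
    if not shared:
        return 0.0
    return sum(abs(ratings[u][i] - ratings[v][i]) <= 1 for i in shared) / len(shared)

def collaborative(user, item):
    """Collaborative: other users' ratings, weighted by taste similarity."""
    others = [v for v in ratings if v != user and item in ratings[v]]
    if not others:
        return 0.0
    return sum(similarity(user, v) * ratings[v][item] for v in others) / len(others)

def community(user, item):
    """Community-based: only the ratings of linked 'friends' count."""
    scores = [ratings[f][item] for f in friends[user] if item in ratings[f]]
    return sum(scores) / len(scores) if scores else 0.0

print(content_based("alice", "folk"))   # 0.33: folk shares 'acoustic' with liked jazz
print(collaborative("alice", "rock"))   # 1.0: carol loves rock but has dissimilar taste
print(community("alice", "rock"))       # 5.0: friend carol's rating counts directly
```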
How do the algorithms work?
User understanding of social
media algorithms
More than 60% of Facebook users are entirely unaware of
any algorithmic curation on Facebook at all: “They
believed every single story from their friends and followed
pages appeared in their news feed”.
Published at CHI 2015 (Eslami et al.)
Revealing News Feed behaviour
Participants indicate desired changes
Manipulation: conflict of interest
Information filtering, or ranking, implicitly manipulates choice behaviour.
Many online information services are ‘free-to-use’; the service is paid for by advertising revenue, not by users directly.
Potential conflict of interest: promote advertisements vs. match user interests.
Advertising inherently tries to manipulate consumer behaviour.
Personalized filtering can also be used for political spin, propaganda, etc.
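The conflict can be made concrete with a toy ranking function that blends user relevance with advertiser payment; the blend and the ad_weight parameter are illustrative assumptions, not any platform's actual formula.

```python
# Illustrative sketch of the conflict of interest: a ranking score
# that blends user relevance with advertiser payment. The weighting
# scheme and all numbers are invented for the example.

items = [
    # (title, relevance_to_user in [0, 1], ad_payment in [0, 1])
    ("in-depth article the user would love", 0.9, 0.0),
    ("sponsored listicle",                   0.3, 0.8),
    ("decent but unsponsored post",          0.6, 0.0),
]

def rank(items, ad_weight: float):
    """Order items by a blend of relevance and advertising value."""
    score = lambda it: (1 - ad_weight) * it[1] + ad_weight * it[2]
    return sorted(items, key=score, reverse=True)

for w in (0.0, 0.5):
    print(f"ad_weight={w}:", [title for title, *_ in rank(items, w)])
# With ad_weight=0.0 the ranking follows the user's interests;
# at 0.5 the sponsored item overtakes them, invisibly to the user.
```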
Trending Topics controversy
Q&A with N. Lundblad (Google)
Nicklas Lundblad, Head of EMEA Public Policy and Government Relations at Google
Human attention is the limited resource that services need to
compete for.
As long as competing platforms exist, loss of agency due to algorithms deciding what to show to users is not an issue:
users can switch to another platform.
UnBias: Emancipating Users Against Algorithmic
Biases for a Trusted Digital Economy
WP1: ‘Youth Juries’ workshops with 13–17-year-olds to co-produce citizen education materials on the properties of information filtering/recommendation algorithms;
WP2: co-design workshops, hackathons and double-blind
testing to produce user-friendly open source tools for
benchmarking and visualizing biases in the algorithms;
WP3: design requirements for algorithms that satisfy subjective criteria of bias avoidance, based on interviews and observation of users’ sense-making behaviour;
WP4: policy briefs for an information and education governance
framework for social media usage. Developed through broad
stakeholder focus groups with representatives of
government, industry, third-sector organizations, educators,
lay-people and young people (a.k.a. “digital natives”).
Thank you for your attention
ansgar.koene@nottingham.ac.uk
http://casma.wp.horizon.ac.uk/
It’s based on data so it must be true
“More data, not better models”
Belief that the ‘law of large numbers’ means Big Data methods do not need to worry about model quality or sampling bias as long as enough data is used.
“More data” is the key to deep-learning success compared to previous AI.
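A toy simulation (invented numbers, not a real dataset) shows why the law of large numbers does not rescue a biased sample: with more data the estimate converges ever more precisely to the wrong value.

```python
# Toy simulation: the law of large numbers makes a biased sample
# converge precisely to the *wrong* answer. Population and bias
# here are assumptions invented for illustration.
import random

random.seed(0)
population = [random.gauss(50, 10) for _ in range(1_000_000)]
true_mean = sum(population) / len(population)

# Biased sampling: only individuals above 45 ever enter the dataset
# (e.g. only users of a particular platform get observed).
visible = [x for x in population if x > 45]

for n in (100, 10_000, len(visible)):
    sample = visible[:n]
    print(f"n={n:>7}: estimate={sum(sample)/len(sample):.2f}  (truth={true_mean:.2f})")
# More data tightens the estimate around ~55, not around the true ~50.
```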
Garbage in -> garbage out
perpetuating the status quo
ProPublica “Machine Bias”
‘equal opportunity by design’
“Big Data: A Report on Algorithmic Systems, Opportunities, and Civil Rights”, a White House report focused on the problem of avoiding discriminatory outcomes:
“To avoid exacerbating biases by encoding them into technological systems, [the report proposes] a principle of ‘equal opportunity by design’: designing data systems that promote fairness and safeguard against discrimination from the first step of the engineering process and continuing throughout their lifespan.”
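As one minimal illustration of what building such a safeguard into the engineering process could look like (the check, data and threshold below are assumptions, not the report's prescription), a pipeline can routinely compare outcome rates across groups:

```python
# Minimal sketch of a fairness safeguard baked into a pipeline:
# compare selection rates across groups and flag large disparities.
# The 0.8 threshold echoes the US 'four-fifths rule' heuristic; the
# data and the check's design are illustrative assumptions.

def selection_rates(decisions):
    """decisions: list of (group, selected: bool) -> rate per group."""
    totals, hits = {}, {}
    for group, selected in decisions:
        totals[group] = totals.get(group, 0) + 1
        hits[group] = hits.get(group, 0) + int(selected)
    return {g: hits[g] / totals[g] for g in totals}

def disparate_impact_check(decisions, threshold=0.8):
    """Fail any group whose rate falls below threshold * best rate."""
    rates = selection_rates(decisions)
    best = max(rates.values())
    return {g: (r, r >= threshold * best) for g, r in rates.items()}

decisions = [("A", True)] * 60 + [("A", False)] * 40 \
          + [("B", True)] * 30 + [("B", False)] * 70
print(disparate_impact_check(decisions))
# {'A': (0.6, True), 'B': (0.3, False)} -> group B flagged for review.
```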
