Dr. Pierre-Nicolas Schwab
HEAD OF BIG DATA, RTBF
FOUNDER, INTOTHEMINDS
Data Innovation Summit
March 30, 2017
#DIS2017
WHAT WE NEED ARE ETHICAL ALGORITHMS
WHY WE NEED ETHICAL ALGORITHMS
• Do you trust companies
behaving badly?
• Cowboy behaviors must end!
– Uber charging more when your
battery is low
– Orbitz proposing more
expensive hotels to Mac users
– Biased selection algorithm for
French universities
WHY TRUST MATTERS FOR RTBF
• Trust in media has declined
further: only 26% trust in online
media in 2017 1
• Online media are the least trusted
of all, yet the main source of news
among younger audiences 2
1 Kantar Sofres, January 2017
2 Reuters Digital News Report 2016
ALGORITHMS PLAY A ROLE IN
BUILDING TRUST
Personalization algorithms
present a challenge, with 3 key
problems identified:
– Too much personalization → key
information may be missing
– Alternative viewpoints may be
absent
– Privacy may be threatened
Source: Reuters Digital News Report 2016
ALGORITHMS ARE NEVER NEUTRAL!
• Algorithms are
– Designed for a goal
– A reflection of a person’s biases
• Algorithms use data without
knowledge of the context
• Automated data treatment
may pose ethical threats
2 TYPES OF ALGORITHMIC THREATS
1. Technical limitations
Racist chatbots, less-than-perfect image
recognition
2. The vision behind the algorithms
– Filter bubbles
– Gender inequity
– Insurance: personalization vs. risk sharing
– Company’s commercial goals:
“Netflix’s metrics cannot distinguish between
an enriched life and addiction”
Neil Hunt, Netflix CPO, RecSys 2014
5 RULES TO MAKE ALGORITHMS MORE
ETHICAL
1. Avoid discrimination
2. Promote gender equity
3. Open up customers’ minds
(exploration) rather than
trapping them (exploitation)
4. Respect the right not to be
tracked
5. Educate users on algorithms
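Rule 3 above is the classic exploration/exploitation trade-off from recommender systems. As a minimal illustration (not RTBF's actual implementation; all names and the epsilon value are hypothetical), an epsilon-greedy recommender can reserve a fraction of recommendations for content outside the user's usual interests:

```python
import random

def recommend(user_genres, catalog, epsilon=0.2):
    """Epsilon-greedy sketch: with probability `epsilon`, explore by
    suggesting an item outside the user's known genres; otherwise
    exploit known tastes. Purely illustrative."""
    familiar = [item for item in catalog if item["genre"] in user_genres]
    unfamiliar = [item for item in catalog if item["genre"] not in user_genres]
    if unfamiliar and random.random() < epsilon:
        return random.choice(unfamiliar)       # exploration: open minds
    return random.choice(familiar or catalog)  # exploitation: known tastes

catalog = [
    {"title": "A", "genre": "news"},
    {"title": "B", "genre": "drama"},
    {"title": "C", "genre": "science"},
]
pick = recommend({"news"}, catalog)
```

Setting epsilon to 0 reproduces a pure "filter bubble" recommender; raising it trades short-term relevance for discovery.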
EDUCATION IS KEY
• Empower users: give control back
• Build knowledge
• “Show” your algorithms (open
the black boxes!)
“An [algorithmic] system that you
can’t audit is a system that you
can’t use”
Marc Rotenberg, CPDP conference 2017
WHAT WE DO CONCRETELY AT RTBF
(1/3)
• Design of recommendation
systems follows ethical rules
(the deliberative democracy model)
• Focus on serendipity (≠ filter
bubble)
• Data scientists are not left
alone: functional specifications
describe what is acceptable
WHAT WE DO CONCRETELY AT RTBF
(2/3)
Recommendation systems follow
the deliberative democracy model:
• Autonomy and choice
• Information quality and debate
• Respect of minorities and
marginalised groups
• Gender equity
• Promotion of real alternative
viewpoints
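One way the last principle can be operationalized is a re-ranking step that guarantees alternative viewpoints a minimum share of every recommendation slate. The sketch below is hypothetical (the `alternative` tag, slate size, and quota are assumptions, not RTBF specifics):

```python
def rerank_with_viewpoints(ranked_items, slate_size=5, min_alternative=2):
    """Fill a slate by relevance order, then swap lower-ranked items
    tagged with an alternative viewpoint into the least-relevant slots
    until the slate holds at least `min_alternative` of them."""
    slate = list(ranked_items[:slate_size])
    pool = [it for it in ranked_items[slate_size:] if it.get("alternative")]
    have = sum(1 for it in slate if it.get("alternative"))
    idx = len(slate) - 1
    while have < min_alternative and pool and idx >= 0:
        if not slate[idx].get("alternative"):
            slate[idx] = pool.pop(0)  # replace a mainstream item
            have += 1
        idx -= 1
    return slate

ranked = (
    [{"id": i, "alternative": False} for i in range(5)]
    + [{"id": 5, "alternative": True}, {"id": 6, "alternative": True}]
)
slate = rerank_with_viewpoints(ranked)
```

The top-relevance items are kept intact; only the tail of the slate is sacrificed, which bounds the relevance cost of enforcing viewpoint diversity.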
WHAT WE DO CONCRETELY AT RTBF
(3/3)
GDPR and privacy → “Privacy
by design”:
• A “no recommendation” mode
(ON/OFF switch)
• Standard profiles (personas)
and the possibility to “put
yourself in the shoes of …”
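The two controls on this slide can be sketched as a single profile-building step. This is a minimal illustration, not RTBF's implementation; the persona names and function are hypothetical:

```python
# Standard personas a user can browse "in the shoes of" instead of
# exposing their own data (names are made up for illustration).
STANDARD_PERSONAS = {
    "news_lover": {"genres": ["news", "politics"]},
    "culture_fan": {"genres": ["drama", "arts"]},
}

def build_profile(user_data, personalization_on=True, persona=None):
    """Privacy-by-design sketch: OFF switch disables profiling entirely;
    a chosen persona substitutes a standard profile for personal data."""
    if not personalization_on:
        return None                  # OFF: no recommendation, no profiling
    if persona is not None:
        return STANDARD_PERSONAS[persona]  # persona replaces own data
    return user_data                 # regular personalized profile

profile = build_profile({"genres": ["sports"]}, personalization_on=False)
```

The key design point is that personal data never enters the pipeline when the switch is off or a persona is selected, rather than being collected and then discarded.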
CONCLUSION: DESIGN ALGORITHMS
FOR THE GREATER GOOD
• Algorithms are instrumental in
gaining consumers’ trust
• Recommendation algorithms
are not neutral → design
them carefully and ethically
• Educate users on the role of
algorithms

Slides pierre nicolas schwab DISummit 2017 (Big Data, Brussels)
