5. ▪ Opaque technologies
▪ Algorithms optimized for goals that are non-obvious to users
▪ Reinforcing prejudices and inequalities
▪ Information security risks
THE PROBLEM?
7. ▪ Decisions in critical situations
▫ Trolley problem
▪ Supported “terrains”
▪ Tracking
▪ Information security
▫ Jeep CAN bus
▪ We have no idea what our car can do
SELF-DRIVING CARS
8. ▪ Maximizing view time
▪ Conspiracy theories
▪ Sensationalism
▪ Polarization
▪ Political side-effects
▪ Balanced opinions don’t maximize view time
▪ AlgoTransparency
YOUTUBE RECOMMENDATION ENGINE
9. ▪ Maximizing time on site
▪ Creates echo chambers
▪ Sensationalism
▪ Polarization
▪ Identifying fake news
▪ Using groups for political propaganda
FACEBOOK NEWSFEED
10. ▪ Human or algorithm chose to block a profile?
▪ Parameters of the decision
▪ Criteria; text analysis
BLOCKING ON SOCIAL NETWORKS
11. ▪ Filtering potential copyright-infringing uploads
▪ Is that a problem?
▪ Ad revenue?
▪ Making overprotective filters
▪ “Exceptions and limitations”
ARTICLE 13
12. ▪ Risk analysis based on historical data
▪ Judges have access to the results
▪ Reinforcing social prejudices
▪ Next: Minority Report?
ASSISTING CONVICTIONS
▪ Routers, cameras, and other connected devices
▪ Low security that people don’t know about
▪ Participation in DDoS
▫ Mirai
▪ “Internet of Shit”
IoT
14. ▪ Random assignment of court cases
▪ Automatic welfare decisions
▫ Bug in the Colorado welfare system
▪ Access to data?
▪ Fraud-detection
▪ Information security
PUBLIC SECTOR SYSTEMS
16. ▪ Decision making
▪ Content recommendation
▪ Information security
THREE PROBLEMATIC ASPECTS
17.
“Man is a hackable animal [...]
Computers are hacked through pre-existing faulty code lines.
Humans are hacked through pre-existing fears, hatreds, biases and cravings”
Yuval Harari
18.
Algorithms can make us extremist,
help us meet other extremists,
convict us and then crash us on the
highway…
And we’ll have no idea why…
20. Right: The free market will take care of it. If companies
make money, it means their clients agree not to know how things work.
Left: Let’s ban algorithms. Or at least write a law that says
exactly how they work.
SOLUTIONS?
23.
“...who made it, what was the thinking behind
it, what human oversight sits atop the
algorithmic decisions, what are the
assumptions underlying the algorithms, are
there hard-coded rules...”
(Expert X)
24. ▪ Description of functionality
▪ “Why am I seeing this?”
▪ Public stats
▪ Action transparency
▪ Data source transparency
▪ Public data
▪ Transparency of ML algorithm steps
▪ Open source
LEVELS OF TRANSPARENCY
25. ▪ Blogposts
▪ Interviews
▪ Pop-up descriptions
▪ Is there human interaction?
▪ Regulation usually only reaches this level
DESCRIPTION OF FUNCTIONALITY
26. ▪ Why am I seeing this ad?
▪ Why am I seeing this video?
▪ Why am I seeing this comment?
▪ UX
“WHY AM I SEEING THIS?”
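A “Why am I seeing this?” panel ultimately exposes which signals drove a ranking decision. A minimal sketch in Python, with all signal names and weights invented for illustration (no real platform API is implied):

```python
def explain_ranking(item_id, signals):
    """Order scoring signals so users see the strongest reasons first."""
    top = sorted(signals.items(), key=lambda kv: kv[1], reverse=True)
    return {
        "item": item_id,
        "reasons": [f"{name} (weight {w:.2f})" for name, w in top],
    }

panel = explain_ranking("video-123", {
    "watched a similar channel": 0.61,
    "trending in your region": 0.25,
    "matches your search history": 0.14,
})
# panel["reasons"][0] == "watched a similar channel (weight 0.61)"
```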
27. ▪ Data on the operation of algorithms
▪ Examples:
▫ Takedowns by Content ID, % of disputed takedowns
▫ % false positives
▫ Confidence intervals
▫ A/B data, human vs algorithm
PUBLIC STATS
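The bullets above translate into very simple arithmetic. A minimal sketch of how such public stats could be computed, with all figures and field names invented:

```python
def takedown_stats(total, disputed, reinstated):
    """Public-facing percentages for an automated takedown filter."""
    return {
        "disputed_pct": 100.0 * disputed / total,
        # Reinstated-after-dispute is only a lower bound on false
        # positives: undisputed bad takedowns are never counted.
        "false_positive_pct_lower_bound": 100.0 * reinstated / total,
    }

stats = takedown_stats(total=10_000, disputed=800, reinstated=300)
# stats == {'disputed_pct': 8.0, 'false_positive_pct_lower_bound': 3.0}
```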
28. ▪ Every action can leave a trace
▫ Who had access to our data in government systems?
▫ Which bank employee has been looking at our bank account?
▫ Which system administrator had access to our car?
▪ Public verifiability of the audit trail
▫ Merkle trees
ACTION TRANSPARENCY
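A publicly verifiable audit trail can be built with a Merkle tree: each log entry is hashed, and a single published root commits to the whole log. A minimal sketch, not any specific system’s implementation:

```python
import hashlib

def _h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Merkle root of a list of byte-string log entries."""
    level = [_h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:           # duplicate the last node on odd levels
            level.append(level[-1])
        level = [_h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

# Any tampering with a logged access changes the published root:
log = [b"admin:read:account42", b"clerk:read:account42"]
root = merkle_root(log)
assert merkle_root([b"admin:WRITE:account42", b"clerk:read:account42"]) != root
```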
29. ▪ Publishing intermediate steps
▫ ML algorithms usually work in iterations
▫ Neural networks – weights, values in hidden layers
▪ Public verifiability of steps
▫ Merkle trees
▫ Blockchain?
TRANSPARENCY OF ML ALGORITHM STEPS
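Verifiable publication of intermediate steps can be sketched as a hash chain over per-iteration weights: each published digest commits to the current weights and the previous digest, so a rewritten training history is detectable. A sketch under those assumptions:

```python
import hashlib
import json

def chain_checkpoint(prev_digest: str, step: int, weights) -> str:
    """Digest committing to this step's weights AND the previous digest."""
    payload = json.dumps(
        {"prev": prev_digest, "step": step, "weights": weights},
        sort_keys=True,              # deterministic serialization
    )
    return hashlib.sha256(payload.encode()).hexdigest()

# Publish one digest per iteration; anyone holding the weights can
# recompute the chain and detect a silently altered history.
digest = "0" * 64                    # genesis value
for step, weights in enumerate([[0.10, -0.30], [0.05, -0.20], [0.02, -0.10]]):
    digest = chain_checkpoint(digest, step, weights)
```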
30. ▪ Data sources – how the data was collected, and under what rules
▫ Example: Facebook collects location data via GPS, WiFi, … maybe Bluetooth?
▪ Publish (partial) training sets
▫ Example: training with historical conviction data
▫ Example: training on the US highway system (and using it in countries with worse infrastructure)
PUBLIC DATA
31. ▪ Opening critical components
▫ CAN bus
▫ Communication modules in cars
▫ Rules for decision-making
▫ Password-storing components
▪ Bug bounties
OPEN SOURCE
32. ▪ ...rarely
▪ Transparency doesn’t mean leaking company secrets
▪ Transparency doesn’t mean yielding one’s competitive advantage
▪ Transparency may be beneficial for reputation
DOESN’T THIS HARM BUSINESS?
33. ▪ Best practices
▪ Industry codes of conduct
▪ Standards
▪ Regulation for critical industries
▪ General regulations (nuclear option)
HOW?
34. We don’t have the right
to let technology remain a black box