
Algorithmic and technological transparency


Slides from my talk on algorithmic and technological transparency: what are the problems, and what are the potential solutions?


  1. Algorithmic and technological transparency
  2. ABOUT ME: Bozhidar Bozhanov, software engineer, former e-gov advisor, founder @ LogSentinel.com
  3. “Technology is now everywhere” (favourite cliché)
  4. Technology affects our lives and our societies
  5. THE PROBLEM?
     ▪ Opaque technologies
     ▪ Algorithms optimized for goals that are non-obvious to their users
     ▪ Confirming prejudices and inequalities
     ▪ Information security risks
  6. Technology is everywhere around us… and we have no idea what it does
  7. SELF-DRIVING CARS
     ▪ Decisions in critical situations
       ▫ The trolley problem
     ▪ Supported “terrains”
     ▪ Tracking
     ▪ Information security
       ▫ The Jeep CAN bus hack
     ▪ We have no idea what our car can do
  8. YOUTUBE RECOMMENDATION ENGINE
     ▪ Maximizing view time
     ▪ Conspiracy theories
     ▪ Sensationalism
     ▪ Polarization
     ▪ Political side effects
     ▪ Balanced opinions don’t maximize view time
     ▪ AlgoTransparency
  9. FACEBOOK NEWSFEED
     ▪ Maximizing time on site
     ▪ Creates echo chambers
     ▪ Sensationalism
     ▪ Polarization
     ▪ Identifying fake news
     ▪ Groups used for political propaganda
 10. BLOCKING ON SOCIAL NETWORKS
     ▪ Did a human or an algorithm choose to block a profile?
     ▪ The parameters of the decision
     ▪ The criteria; text analysis
 11. ARTICLE 13
     ▪ Filtering potentially copyright-infringing uploads
     ▪ Is that a problem?
     ▪ Ad revenue?
     ▪ Overprotective filters
     ▪ “Exceptions and limitations”
 12. ASSISTING CONVICTIONS
     ▪ Risk analysis based on historical data
     ▪ Judges have access to the results
     ▪ Confirming social prejudices
     ▪ Next: Minority Report?
 13. IoT
     ▪ Routers, cameras, and other connected devices
     ▪ Low security that people don’t know about
     ▪ Participation in DDoS attacks
       ▫ Mirai
     ▪ “Internet of Shit”
 14. PUBLIC SECTOR SYSTEMS
     ▪ Random assignment of court cases
     ▪ Automatic welfare decisions
       ▫ The bug in the Colorado welfare system
     ▪ Access to data?
     ▪ Fraud detection
     ▪ Information security
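To make the “random assignment of court cases” point concrete: one way such an assignment can be made publicly verifiable is to derive it from a seed published in advance, so anyone can recompute it. A minimal sketch in Python; the seed source, case ids and judge names are invented, not taken from any real system.

    import hashlib

    def assign_judge(case_id: str, judges: list[str], public_seed: str) -> str:
        """Deterministically pick a judge from a seed published in advance.

        Anyone holding the seed, the case id and the judge list can
        recompute the assignment and check it was not manipulated.
        """
        digest = hashlib.sha256(f"{public_seed}:{case_id}".encode()).digest()
        # A 256-bit digest makes the modulo bias negligible for small lists.
        return judges[int.from_bytes(digest, "big") % len(judges)]

    # Invented example: the seed would be published before cases arrive,
    # e.g. drawn from a public randomness beacon or lottery.
    judges = ["Judge A", "Judge B", "Judge C"]
    print(assign_judge("case-2018-0042", judges, "beacon-2018-10-01"))

The important property is not the hash function but that the seed is committed to before the cases are known, so the operator cannot steer assignments.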
 15. Companies often deny wrongdoing… until someone finds out or information is leaked
 16. THREE PROBLEMATIC ASPECTS
     ▪ Decision making
     ▪ Content recommendation
     ▪ Information security
 17. “Man is a hackable animal [..] Computers are hacked through pre-existing faulty code lines. Humans are hacked through pre-existing fears, hatreds, biases and cravings” (Yuval Harari)
 18. Algorithms can make us extremists, help us meet other extremists, convict us, and then crash us on the highway… and we’ll have no idea why…
 20. SOLUTIONS?
     ▪ Right: the free market will take care of it. If companies make money, it means their clients agree not to know how things work.
     ▪ Left: let’s ban algorithms. Or at least write a law that says exactly how they work.
 22. We need more algorithmic and technological transparency
 23. “...who made it, what was the thinking behind it, what human oversight sits atop the algorithmic decisions, what are the assumptions underlying the algorithms, are there hard-coded rules...” (Expert X)
 24. LEVELS OF TRANSPARENCY
     ▪ Description of functionality
     ▪ “Why am I seeing this?”
     ▪ Public stats
     ▪ Action transparency
     ▪ Data source transparency
     ▪ Public data
     ▪ Transparency of ML algorithm steps
     ▪ Open source
 25. DESCRIPTION OF FUNCTIONALITY
     ▪ Blog posts
     ▪ Interviews
     ▪ Pop-up descriptions
     ▪ Is there human interaction?
     ▪ Regulation usually stops at this level
 26. “WHY AM I SEEING THIS?”
     ▪ Why am I seeing this ad?
     ▪ Why am I seeing this video?
     ▪ Why am I seeing this comment?
     ▪ UX
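A minimal sketch of what “Why am I seeing this?” can look like beneath the UX: each recommended item carries machine-readable reasons the interface can render. The field names and reasons are invented for illustration, not any platform’s actual API.

    from dataclasses import dataclass, field

    @dataclass
    class Recommendation:
        """A recommended item bundled with the reasons it was selected."""
        item_id: str
        score: float
        reasons: list[str] = field(default_factory=list)

    def explain(rec: Recommendation) -> str:
        """Render a 'Why am I seeing this?' message for the UI."""
        return "You are seeing this because: " + "; ".join(rec.reasons)

    rec = Recommendation(
        item_id="video-123",
        score=0.87,
        reasons=["you watched similar videos", "it is popular in your region"],
    )
    print(explain(rec))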
 27. PUBLIC STATS
     ▪ Data on the operation of algorithms
     ▪ Examples:
       ▫ Takedowns by ContentID; % of disputed takedowns
       ▫ % of false positives
       ▫ Confidence intervals
       ▫ A/B data, human vs. algorithm
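Rates like “% of false positives” are only meaningful alongside sample sizes and confidence intervals. A sketch of computing a Wilson score interval for a disputed-takedown rate; the counts are invented.

    import math

    def wilson_interval(hits: int, total: int, z: float = 1.96) -> tuple[float, float]:
        """95% Wilson score confidence interval for a proportion (z = 1.96)."""
        if total == 0:
            return (0.0, 0.0)
        p = hits / total
        denom = 1 + z * z / total
        centre = p + z * z / (2 * total)
        margin = z * math.sqrt(p * (1 - p) / total + z * z / (4 * total * total))
        return ((centre - margin) / denom, (centre + margin) / denom)

    # Invented counts: 1,200 disputed takedowns out of 48,000 total.
    low, high = wilson_interval(1200, 48000)
    print(f"disputed rate: {1200 / 48000:.2%} (95% CI {low:.2%} to {high:.2%})")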
 28. ACTION TRANSPARENCY
     ▪ Every action can leave a trace
       ▫ Who had access to our data in government systems?
       ▫ Which bank employee has been looking at our bank account?
       ▫ Which system administrator had access to our car?
     ▪ Public verifiability of the audit trail
       ▫ Merkle trees
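A minimal sketch of the Merkle-tree idea behind a verifiable audit trail: hash every audit event, combine the hashes pairwise up to a single root, and publish the root. Changing or deleting any past event changes the root. Real systems also provide inclusion and consistency proofs; this shows only the core construction, with invented events.

    import hashlib

    def sha(data: bytes) -> bytes:
        return hashlib.sha256(data).digest()

    def merkle_root(events: list[str]) -> bytes:
        """Fold a non-empty list of audit events into a single Merkle root."""
        level = [sha(e.encode()) for e in events]
        while len(level) > 1:
            if len(level) % 2 == 1:
                level.append(level[-1])  # duplicate the last node on odd levels
            level = [sha(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        return level[0]

    events = [
        "2018-10-01T10:00 admin42 read account 1234",
        "2018-10-01T10:05 clerk17 updated record 5678",
    ]
    # Publishing this value commits the operator to the log's exact contents.
    print(merkle_root(events).hex())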
 29. TRANSPARENCY OF ML ALGORITHM STEPS
     ▪ Publishing intermediate steps
       ▫ ML algorithms usually work in iterations
       ▫ Neural networks: weights, values in hidden layers
     ▪ Public verifiability of the steps
       ▫ Merkle trees
       ▫ Blockchain?
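One way to commit to intermediate ML steps without necessarily publishing the weights themselves: hash each iteration’s weights and chain the hashes, then publish the chain (on a blockchain or any other tamper-evident medium). A sketch with a fake “training loop” standing in for a real model:

    import hashlib
    import json

    def commit_step(prev_hash: str, step: int, weights: list[float]) -> str:
        """Chain a hash of this iteration's weights to the previous commitment."""
        payload = json.dumps({"step": step, "weights": weights}, sort_keys=True)
        return hashlib.sha256((prev_hash + payload).encode()).hexdigest()

    weights = [0.1, -0.3, 0.7]
    chain = ["0" * 64]  # genesis value
    for step in range(3):
        weights = [w * 0.9 for w in weights]  # stand-in for a gradient update
        chain.append(commit_step(chain[-1], step, weights))

    # Publishing `chain` commits to every intermediate state; an auditor who is
    # later given the weights can recompute each link and verify the sequence.
    print(chain[-1])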
 30. PUBLIC DATA
     ▪ Data sources: how was the data collected, and under what rules?
       ▫ Example: Facebook collects location data via GPS, WiFi, … maybe Bluetooth?
     ▪ Publish (partial) training sets
       ▫ Example: training on historical conviction data
       ▫ Example: training on the US highway system, then deploying in countries with worse infrastructure
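Even when raw training records cannot be shared, a publishable manifest of per-record hashes lets outsiders later verify whether a disclosed record really was in the training set. A sketch with invented records:

    import hashlib

    def manifest(records: list[str]) -> set[str]:
        """Per-record SHA-256 hashes; publishable even if the records are not."""
        # Real deployments would salt these to prevent guessing attacks
        # against low-entropy records.
        return {hashlib.sha256(r.encode()).hexdigest() for r in records}

    # Invented rows standing in for a training set of historical convictions.
    training_rows = ["case,2015,guilty,...", "case,2016,acquitted,..."]
    published = manifest(training_rows)

    # Later, anyone handed a claimed training record can check membership:
    claimed = "case,2015,guilty,..."
    print(hashlib.sha256(claimed.encode()).hexdigest() in published)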
 31. OPEN SOURCE
     ▪ Opening critical components
       ▫ The CAN bus
       ▫ Communication modules in cars
       ▫ Rules for decision making
       ▫ Password-storing components
     ▪ Bug bounties
 32. DOESN’T THIS HARM BUSINESS?
     ▪ ...rarely
     ▪ Transparency doesn’t mean leaking company secrets
     ▪ Transparency doesn’t mean yielding one’s competitive advantage
     ▪ Transparency may be beneficial for reputation
 33. HOW?
     ▪ Best practices
     ▪ Industry codes
     ▪ Standards
     ▪ Regulation for critical industries
     ▪ General regulation (the nuclear option)
 34. We don’t have the right to let technology remain a black box
 35. THANK YOU! Contacts:
     ▪ @bozhobg
     ▪ techblog.bozho.net
 36. SOURCES
     ▪ https://news.vice.com/en_us/article/d3w9ja/how-youtubes-algorithm-prioritizes-conspiracy-theories
     ▪ https://sci-hub.tw/10.1080/21670811.2016.1208053
     ▪ https://www.theguardian.com/technology/2018/feb/02/youtube-algorithm-election-clinton-trump-guillaume-chaslot
     ▪ https://techblog.bozho.net/self-driving-cars-open-source/
     ▪ https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing
     ▪ http://www.austlii.edu.au/au/journals/FedJSchol/2014/17.html
     ▪ https://www.bellingcat.com/news/americas/2018/10/11/memes-infowars-75-fascist-activists-red-pilled/
