
Alice has a Blue Car: Beginning the Conversation Around Ethically Aware Decision Making


Who we meet, what we choose to do, and the costs of our choices are increasingly influenced and even driven by software systems, algorithms and infrastructure. The decisions underlying how those work are made by people just like you.

This talk challenges the audience to consider the ongoing implications of decisions made early in the design process, and provides practical examples of translating moral standpoints into real-world implementations. Presented by ThoughtWorks Principal Data Engineer Simon Aubury and Project Manager Dr. Maia Sauren.

Published in: Data & Analytics

Alice has a Blue Car: Beginning the Conversation Around Ethically Aware Decision Making

  1. Dr. Maia Sauren - Program and Project Manager; Simon Aubury - Principal Data Engineer
  2. Alice
  3. why do we care?
  4. the unintended consequences of not knowing your users
  5.
  6. To understand our users we need data
  7. “Data is not the new oil, data is the new asbestos” (reference: https://uxdesign.cc/designing-ethically-pt-2-535ac61e2992)
  8.
  9.
  10. 10 01 11
  11. 10 01 11 ??
  12. tolerable noise or tolerable signal?
  13. relative noise relative signal
  14. WHAT COULD POSSIBLY GO WRONG </sarcasm>
  15.
  16. the law is a blunt instrument for a reason
  17. technology is a blunt instrument for the same reason
  18. the user is the signal not the noise
  19. design for high signal tolerance
  20. which of your users provide the broadest definition of signal?
  21.
  22.
  23.
  24.
  25. From theory to practice
  26. Product - we’re an insurance co
  27. Alice wants some insurance
  28. What should Alice’s premiums be? A machine can do this: Premium ($) = Risk (%) × Value ($) ● Features ● Algorithms
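The formula on slide 28 is simple enough to sketch directly. Below is a minimal, hypothetical Python rendering of it; the function name and the example figures are invented for illustration and are not from the deck's actual model.

    # Premium ($) = Risk (%) x Value ($), as written on the slide.
    def premium(risk: float, insured_value: float) -> float:
        """Annual premium from an estimated claim probability and an insured value."""
        return risk * insured_value

    # Example: a 3% estimated claim risk on a $20,000 car.
    print(premium(0.03, 20_000))  # 600.0

The multiplication is the easy part; the bullets under the formula (which features feed the risk estimate, and which algorithm produces it) are where the rest of the deck spends its time.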
  29. UX ● Consent - meaningful consent ○ Gathering ○ Storage ○ Aligned to 3rd parties ● TOS - did you read it? What are our assumptions?
  30. UX - simple example. What are our assumptions?
  31. UX - simple example. What are our assumptions?
  32. UX - protect the individual: Randomized Response Technique
  33. Do you drive drunk? Collect useful insights across a population
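Slides 32-33 name the Randomized Response Technique for exactly this kind of sensitive question. Here is a minimal sketch of the classic coin-flip variant, assuming a 50/50 split between honest and forced answers; the population size and the 10% true rate are fabricated for illustration.

    import random

    def randomized_response(truth: bool) -> bool:
        """One respondent's answer under the coin-flip protocol."""
        if random.random() < 0.5:        # heads: answer the real question honestly
            return truth
        return random.random() < 0.5     # tails: give a random forced yes/no

    # Simulate a population where the true rate of the behaviour is 10%.
    true_rate = 0.10
    answers = [randomized_response(random.random() < true_rate) for _ in range(100_000)]

    # De-bias: P(yes) = 0.5 * true_rate + 0.25, so true_rate = 2 * P(yes) - 0.5.
    observed_yes = sum(answers) / len(answers)
    print(f"observed yes rate: {observed_yes:.3f}")
    print(f"estimated true rate: {2 * observed_yes - 0.5:.3f}")

No single answer is incriminating, because any individual "yes" can be blamed on the coin, yet the aggregate still recovers a usable population-level estimate.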
  34. What data to calculate risk? What are our assumptions?
  35. Dev. What are our assumptions?
  36. Dev. What are our assumptions?
  37. Infra & security & storage. What are our assumptions?
  38. Legal - Aust. What are our assumptions?
  39. Legal - UK. What are our assumptions?
  40. Expectations. Is this fair? example@hotmail.com example@gmail.com
  41. Expectations. Is this fair?
  42. Back to our risk algorithm. How’s this looking? Premium ($) = Risk (%) × Value ($) ● Features ○ Gender if !EU ○ Car Model ○ Colour (maybe) ○ Postcode ○ Customer Name ○ Email
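Slide 42's feature list is easier to interrogate once it is written down as data. The record below is entirely invented; the comments only restate concerns the deck itself raises (gender rating outside the EU, colour as a "maybe", postcode, and the hotmail-vs-gmail example).

    # Hypothetical quote request - none of these values come from the deck.
    alice = {
        "gender": "F",                 # "if !EU": banned as a rating factor within the EU
        "car_model": "hatchback",
        "colour": "blue",              # "maybe" - see the Monash colour/crash-risk study in the links
        "postcode": "2000",            # frequently a proxy for income and demographics
        "customer_name": "Alice",      # an identifier, not a risk signal
        "email": "alice@hotmail.com",  # charging by email domain is the "is this fair?" slide
    }

    # An obvious first pass: strip fields that identify rather than predict.
    identifiers = {"customer_name", "email"}
    features = {k: v for k, v in alice.items() if k not in identifiers}
    print(features)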
  43. What data to calculate risk? What are our assumptions?
  44. What data to calculate risk? What are our assumptions?
  45. Alice has a blue car. Even if we don’t explicitly set out to capture the data, there are implicit relationships in our feature set
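Slide 45's point about implicit relationships can be demonstrated in a few lines: drop the sensitive attribute from the features, then check how well a retained feature predicts it anyway. The records below are fabricated so that colour happens to track gender; against a real dataset the same check would run over the actual feature set.

    from collections import Counter, defaultdict

    # Fabricated (colour, gender) pairs: gender was dropped, colour was kept.
    records = [
        ("blue", "F"), ("blue", "F"), ("blue", "F"), ("blue", "M"),
        ("red",  "M"), ("red",  "M"), ("red",  "M"), ("red",  "F"),
    ]

    # If colour lets us guess gender well above the base rate, it is acting as a proxy.
    by_colour = defaultdict(Counter)
    for colour, gender in records:
        by_colour[colour][gender] += 1

    hits = sum(counts.most_common(1)[0][1] for counts in by_colour.values())
    print(f"gender recoverable from colour alone: {hits / len(records):.0%}")  # 75% here

If that recoverable share sits well above the base rate, the attribute never really left the model; it just changed its name.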
  46. don’t reinvent the trolley problem - other people have found solutions!
  47. Google AI Fairness
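Slide 47 (and the What-If Tool link on slide 64) points to existing tooling rather than hand-rolled checks, but the simplest number such tools report is easy to show: the gap in favourable outcomes between two groups, here keyed by email domain to echo slide 40. Every value below is fabricated.

    # 1 = offered the standard premium, 0 = surcharged; illustrative data only.
    outcomes = {
        "gmail.com":   [1, 1, 1, 0, 1, 1, 1, 1],
        "hotmail.com": [1, 0, 1, 0, 1, 0, 0, 1],
    }

    rates = {domain: sum(v) / len(v) for domain, v in outcomes.items()}
    print(rates)  # {'gmail.com': 0.875, 'hotmail.com': 0.5}
    print("demographic parity gap:", max(rates.values()) - min(rates.values()))  # 0.375

A gap like this is a prompt for a conversation, not a verdict; the value of the listed tools is that they make asking the question routine.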
  48.
  49.
  50. EthicalOS
  51. Tarot Cards of Tech
  52. Design Ethically Toolkit
  53. Design Ethically Toolkit
  54. Data Ethics Canvas
  55. Data Ethics Canvas
  56. Nicolas Steenhout
  57. Game Access
  58. Glitch TOS
  59. Tenon.io
  60. rtl.wtf
  61.
  62. Risk-Based Ethical Use of Data
  63. which of your users provide the broadest definition of signal?
  64. Links:
      http://ecee.colorado.edu/~ecen7438/
      https://gameaccessblog.org.uk/
      https://pair-code.github.io/what-if-tool/
      https://theodi.org/wp-content/uploads/2019/07/ODI-Data-Ethics-Canvas-2019-05.pdf
      https://slideplayer.com/slide/4472170/
      DrAfter123 / Vetta / Getty Images
      https://www.flickr.com/photos/spiderman/2061070613
      https://broadlygenderphotos.vice.com/
      https://www.monash.edu/__data/assets/pdf_file/0007/216475/An-investigation-into-the-relationship-between-vehicle-colour-and-crash-risk.pdf
      https://www.mdpi.com/journal/ijerph
      https://www.express.co.uk/life-style/cars/910479/car-insurance-email-address-gmail-hotmail
      https://gradientinstitute.org/news/Gradient%20Institute%20Submission%20-%20Australia's%20AI%20Ethics%20Framework%20Discussion%20Paper.pdf
      https://consult.industry.gov.au/strategic-policy/artificial-intelligence-ethics-framework/
      https://incl.ca/wp-content/uploads/2018/05/accessibility-security.pdf
      http://rtl.wtf/tac/booboos/
      http://tarotcardsoftech.com/
      https://evapenzeymoog.com/
      https://www.designethically.com/toolkit
      http://www.soc.univ.kiev.ua/sites/default/files/newsfiles/4_slides_rrt.pdf
      Tenon.io
      https://incl.ca/
      https://glitch.com/legal/#tos
  65. Thank you. Maia Sauren (maia.sauren@thoughtworks.com, @sauramaia); Simon Aubury (simon.aubury@thoughtworks.com, @SimonAubury)
