Alice has a Blue Car: Beginning the Conversation Around Ethically Aware Decision Making

Vice President at Thoughtworks
Nov. 18, 2019


Editor's Notes

  1. MC SPEAKING. Dr. Maia Sauren is a program and project manager at Thoughtworks, with backgrounds in large-scale organisational transformation and healthcare applications. Maia's previous lives were in biomedical engineering, science communication, and education. Simon is a Principal Data Engineer with expertise in enterprise system design for large data systems.
  2. MAIA SPEAKING. Who is Alice? What does Alice need, what is relevant, and what are the consequences of our decisions? Tonight we want to engage you in a conversation: what guidelines should you consider for your organisation and yourself when building solutions? What are the implications of decisions made early in the design process, and how do they translate to real-world outcomes?
  3. MAIA SPEAKING. Why do we care about Alice, or her data? Why would anyone want that?
  4. MAIA SPEAKING. Why do we care about Alice, or her data? Why would anyone want that?
  5. MAIA SPEAKING. Gender-neutral ads for STEM jobs: women click on the ads more, so the ads become more expensive to show to women, so women see fewer ads for jobs in STEM. The algorithm's aim: minimise spend. (A toy sketch of this spend-minimising loop follows these notes.)
  6. MAIA SPEAKING. Data needs to be accurate, clean, and diverse.
  7. MAIA SPEAKING
  8. MAIA SPEAKING What is a signal? What is an error?
  9. MAIA SPEAKING What is a signal? What is an error?
  10. MAIA SPEAKING What is a signal? What is an error?
  11. MAIA SPEAKING What is a signal? What is an error?
  12. MAIA SPEAKING. Noise as it applies within a greater context: contextualised as relative harm, relative good, and relative harm minimisation.
  13. MAIA SPEAKING. How do you teach that to a child? To a machine?
  14. MAIA SPEAKING. How do you teach that to a child? To a machine?
  15. "Bias laundering" through technology - Gretchen McCulloch, Because Internet.
  16. MAIA SPEAKING
  17. MAIA SPEAKING. So you have to give it blunt flexibility in the design stage.
  18. MAIA SPEAKING
  19. MAIA SPEAKING. So you have to give it blunt flexibility in the design stage.
  20. MAIA SPEAKING
  21. Alice Saunders: refused an airline ticket by British Aerospace.
  22. MAIA SPEAKING
  23. MAIA SPEAKING https://broadlygenderphotos.vice.com/
  24. MAIA SPEAKING. Photo: Frank Starmer, "A large red stone (Uluru, south of Alice Springs)" - an adventure south of Alice Springs to visit Uluru; a sunset photo. CC / https://www.flickr.com/photos/spiderman/2061070613
  25. SIMON SPEAKING. Let's illustrate data ethics with a scenario. Can we describe the value judgements we make about Alice, and the approaches we take when generating, analysing and disseminating data about her? Do we understand good practice in computing techniques, ethics and information assurance, appreciative of relevant legislation? Guidelines - sensible defaults for individual teams. Framework - translating values to behaviours. Ethical position - organisational value statement.
  26. SIMON SPEAKING. We're going to apply this lens by imagining the product decisions in a hypothetical insurance company. Congrats on the new job. Think about the
  27. SIMON SPEAKING. Alice has a need: an appropriately priced insurance product. What is the best thing for Alice? Let's design a product.
  28. SIMON SPEAKING. We could have actuaries and risk tables. People are dumb and lazy; we need robots to do the maths for them.
  29. SIMON SPEAKING. Firstly, we need to engage with Alice. This is typically a sales funnel; our business is based on data. Terms of service: plain language (LinkedIn has a sentence that's 91 words long); no dark patterns; accessible and meaningful (e.g. language); organisational value statement.
  30. SIMON SPEAKING. Option 1 - poor. Option 2 - better: has choice; an explanation of WHY; no word "Other", which may make people feel like an afterthought; and an explicit call-out that it's kept private. Option 3 - best: don't ask.
  31. SIMON SPEAKING. A credit decision based on non-compliance with instructions: blue-pen users get a different weighting. After much investigation, this was discovered to be most correlated with the pen left in the office. Data is not truth. Humans create, generate, collect, capture, and extend data. The results are often incomplete, and the process of analysing them can be messy. Data can be biased through what is included or excluded, how it is interpreted, and how it is presented. If the data is crappy, even the best algorithm won't help; sometimes this is referred to as "garbage in, garbage out".
  32. SIMON SPEAKING. Respect privacy and the collective good. Now let's discuss an approach to data capture for a sensitive topic: "Did you cheat on your partner in the last 12 months?", "Did you dispose of waste illegally?", or "Did you lie on an insurance application?" The randomized response technique (RRT), nearly 50 years old, can be used to minimise the response bias from questions about sensitive issues: protect the individual, but collect accurate data in aggregate. Respondents are randomly assigned to answer either the sensitive question or the inverse question with a "yes" or "no" answer [8]. As is shown in Table 1, the assignment undergoes a randomization procedure, such as rolling a die. (A simulation sketch follows these notes.)
  33. SIMON SPEAKING. Back to our example: how can we capture a meaningful response to "Do you drive drunk?" We want a representative answer, but if we ask Alice directly we're unlikely to get an accurate one. How do you respect privacy and the collective good online, without a die? The randomized response technique can be applied online with a bit of A/B testing, as the sketch after these notes shows.
  34. SIMON SPEAKING. Imagine we've now got data that our users are happy to share and that we feel is important to our business, or we have access to a partner data provider. What can we derive from these insights? A basic example: features (also known as parameters or variables) are the factors for a machine to look at. Those could be car mileage, or the user's gender, age, or postcode; features are column names. Algorithms: we might pick an ML algorithm (maybe a decision tree for classification, or a clustering model) that, based on the data, classifies users and determines a likely risk. (A toy classifier sketch follows these notes.)
  35. SIMON SPEAKING. Let's discuss algorithmic approaches to understanding Alice. There's nothing new here; the techniques have been used for more than 40 years. To illustrate, let's think of the simplified case where we have a corpus of relationships. We can use algorithmic classifiers to associate frequent clusters of information with historic outcomes. If historically we've seen high correlations between April and Aries, perhaps we can automate this assumption as part of our business. We could gloss this up as "AI". Just because AI can do something doesn't mean that it should. What's the effect on Alice if we apply this approach blindly?
  36. SIMON SPEAKING. With only gender-neutral pronouns in languages like Malay, Google Translate says men are leaders, doctors, soldiers, and professors, while women are assistants, prostitutes, nurses, and maids. The results are similar in Hungarian. According to findings by Strawberry Bomb, Google Translate interprets men as intelligent and women as stupid; it also suggests that women are unemployed, cook, and clean. Don't presume the desirability of AI: just because AI can do something doesn't mean that it should. When AI is incorporated into a design, designers should continually pay attention to whether people's needs are changing, or an AI's behaviour is changing.
  37. SIMON SPEAKING. Developer responsibilities: information handling. Responsibilities for all parties: partners, third-party providers, stakeholders. PCI, PII, HIPAA. HIPAA alone runs to 51 pages in the overview document for one tech vendor; payments (PCI) is 71 pages for the same vendor.
  38. SIMON SPEAKING. Notify the Australian Information Commissioner and individuals whose personal information has been subject to a data breach likely to result in serious harm. A notifiable data breach arises when three criteria are satisfied: there is unauthorised access to or unauthorised disclosure of personal information; this is likely to result in serious harm to one or more individuals; and the entity has not been able to prevent the likely risk of serious harm with remedial action. Assessment within 30 days. Respect privacy and the collective good. While there are policies and laws that shape the governance, collection, and use of data, we must hold ourselves to a higher standard than "will we get sued?" Consider design, governance of data use for new purposes, and communication of how people's data will be used. Call out the downsides of holding data: it's a liability, and an enduring one.
  39. SIMON SPEAKING Last line highlight
  40. SIMON SPEAKING. Admiral admitted to charging drivers higher premiums for using a Hotmail address. It found some enquiries using a Hotmail address were up to £31.36 more expensive than those using a Gmail account.
  41. SIMON SPEAKING. Car colour does in fact make a big difference to road safety: silver, grey, red and black cars are most likely to be involved in accidents, whereas white, orange and yellow are the safest colour choices.
  42. SIMON SPEAKING So it’s getting a little harder to work out what features to use
  43. SIMON SPEAKING. What do we now know about Alice?
  44. SIMON SPEAKING. Back to our example: did we pick good features? Features (also known as parameters or variables) are the factors for a machine to look at; those could be car mileage, or the user's gender, age, or postcode; features are column names. Algorithms: we might pick an ML algorithm (maybe a decision tree for classification, or a clustering model) that, based on the data, classifies users and determines a likely risk. Did we find or influence our algorithm in any other way?
  45. SIMON SPEAKING. Even if we don't explicitly set out to capture the data, there are implicit relationships in our feature set; models can discover remarkable correlations (see the proxy-feature sketch after these notes). Can we describe the value judgements we make about Alice and the approaches we take when generating, analysing and disseminating data about her? Do we understand good practice in computing techniques, ethics and information assurance, appreciative of relevant legislation? Guidelines - sensible defaults for individual teams. Framework - translating values to behaviours. Ethical position - organisational value statement.
  46. MAIA SPEAKING
  47. MAIA SPEAKING. The What-If Tool from Google: inspect a machine learning model, test algorithmic fairness constraints, and examine the effect of preset fairness constraints, such as equality of opportunity in supervised learning. (A bare-bones version of that check follows these notes.)
  48. MAIA SPEAKING. This is how we're thinking internally: a canvas, "Risk-Based Ethical Use of Data".
  49. MAIA SPEAKING
  50. MAIA SPEAKING
  51. MAIA SPEAKING
  52. MAIA SPEAKING
  53. MAIA SPEAKING
  54. Moriel Schottlender
  55. MAIA SPEAKING. This is how we're thinking internally: a canvas, "Risk-Based Ethical Use of Data".
  56. MAIA SPEAKING
  57. Nobody speaking
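
A minimal sketch of the spend-minimising ad loop from slide 5. Every number here is invented for illustration: we assume a fixed budget, a higher cost-per-impression for women (because they click more, per the notes), and an optimizer whose only objective is impressions per dollar. The point is that a cost objective with no fairness constraint quietly starves the costlier group of job ads.

    # Hypothetical illustration of the STEM-ads feedback loop (slide 5).
    # A spend-minimising optimizer allocates a fixed budget between two
    # audiences; the audience that clicks more costs more per impression,
    # so it ends up seeing fewer ads. All figures are invented.

    BUDGET = 1000.0                                      # dollars per campaign
    cost_per_impression = {"men": 0.05, "women": 0.08}   # women click more -> pricier

    def allocate_greedy(budget, costs):
        """Spend the whole budget on the cheapest audience (pure spend minimisation)."""
        cheapest = min(costs, key=costs.get)
        return {group: (budget / costs[group] if group == cheapest else 0.0)
                for group in costs}

    def allocate_even(budget, costs):
        """Split the budget evenly, accepting a higher total cost per impression."""
        per_group = budget / len(costs)
        return {group: per_group / costs[group] for group in costs}

    print("greedy:", allocate_greedy(BUDGET, cost_per_impression))
    print("even:  ", allocate_even(BUDGET, cost_per_impression))
    # greedy: {'men': 20000.0, 'women': 0.0}  -> women see no STEM ads at all
    # even:   {'men': 10000.0, 'women': 6250.0}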
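
A minimal simulation of the randomized response technique from slides 32-33, in the variant the notes describe: each respondent is randomly assigned either the sensitive question or its inverse, and answers truthfully. The probabilities, question wording, and true prevalence below are assumptions for the simulation only; no individual answer is interpretable, but the population rate can be recovered.

    import random

    # Randomized response (slides 32-33): each respondent answers the sensitive
    # question with probability P_SENSITIVE, and its inverse otherwise. The
    # interviewer never learns which question was answered.
    P_SENSITIVE = 2 / 3    # e.g. die shows 1-4: answer "Do you drive drunk?"
                           #      die shows 5-6: answer "Do you NOT drive drunk?"
    TRUE_RATE = 0.12       # hypothetical true prevalence we hope to recover
    N = 100_000            # respondents

    def respond(has_trait: bool) -> bool:
        """One respondent's randomized yes/no answer (truthful either way)."""
        if random.random() < P_SENSITIVE:
            return has_trait       # answered the sensitive question
        return not has_trait       # answered the inverse question

    yes_count = sum(respond(random.random() < TRUE_RATE) for _ in range(N))
    p_yes = yes_count / N

    # P(yes) = p*pi + (1-p)*(1-pi)  =>  pi = (P(yes) - (1-p)) / (2p - 1)
    estimate = (p_yes - (1 - P_SENSITIVE)) / (2 * P_SENSITIVE - 1)
    print(f"observed yes-rate {p_yes:.3f} -> estimated prevalence {estimate:.3f}")

Online, the die roll is simply which experiment bucket a user lands in, so the same estimator works on A/B-test style random assignment.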
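
A minimal sketch of the feature/algorithm framing from slide 34, with a scikit-learn decision tree standing in for "an ML algorithm that classifies users and determines a likely risk". The feature names, training rows, and labels are all fabricated; the point is that the columns we choose to feed in are themselves a design decision.

    from sklearn.tree import DecisionTreeClassifier

    # Toy risk classifier (slide 34). Features are the columns we chose to
    # look at; every row and label here is invented for illustration.
    FEATURES = ["car_mileage_km", "driver_age", "postcode_prefix"]

    X = [
        [12_000, 45, 30],
        [45_000, 22, 30],
        [8_000,  60, 42],
        [60_000, 19, 42],
    ]
    y = [0, 1, 0, 1]   # 0 = low claimed-risk, 1 = high claimed-risk (invented)

    model = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

    alice = [[15_000, 31, 30]]
    print("features considered:", FEATURES)
    print("predicted risk class for Alice:", model.predict(alice)[0])

Postcode in particular can smuggle protected attributes into the model, which is exactly the concern slide 45 raises.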
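
A minimal sketch of the "implicit relationships" warning from slide 45. Even after a sensitive column is dropped, a model can rediscover it through a correlated proxy; here we assume (synthetically) that postcode correlates with the dropped attribute, and measure that correlation directly.

    import numpy as np

    # Slide 45: dropping a sensitive column does not drop its signal if a
    # proxy remains. We synthesise a sensitive attribute, derive a skewed
    # "postcode prefix" from it, drop the sensitive column, and check
    # what is still recoverable.
    rng = np.random.default_rng(0)
    n = 10_000

    sensitive = rng.integers(0, 2, size=n)              # the column we intend to drop
    postcode = sensitive * 40 + rng.integers(0, 20, n)  # proxy derived from it

    # The feature matrix the model actually sees (sensitive column removed):
    X = np.column_stack([postcode, rng.normal(size=n)])

    corr = np.corrcoef(X[:, 0], sensitive)[0, 1]
    print(f"correlation between retained 'postcode' and dropped attribute: {corr:.2f}")
    # A high value means the model can still learn the sensitive attribute.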
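
A bare-bones version of the equality-of-opportunity check mentioned on slide 47. Google's What-If Tool does this interactively; the underlying computation is just comparing true positive rates across groups (among people who truly qualify, does the model say "yes" at the same rate for each group?). The records below are invented to show the computation only.

    # Equality of opportunity (slide 47): equal true positive rates across
    # groups among the truly qualified (label == 1). Data is fabricated.
    records = [
        # (group, true_label, predicted_label)
        ("A", 1, 1), ("A", 1, 1), ("A", 1, 0), ("A", 0, 0),
        ("B", 1, 1), ("B", 1, 0), ("B", 1, 0), ("B", 0, 0),
    ]

    def true_positive_rate(group):
        """Share of truly-positive members of `group` the model flags positive."""
        preds = [pred for g, label, pred in records if g == group and label == 1]
        return sum(preds) / len(preds)

    for group in ("A", "B"):
        print(f"group {group}: TPR = {true_positive_rate(group):.2f}")
    # Unequal TPRs (here 0.67 vs 0.33) violate equality of opportunity.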