
Product Management Ethics in A.I. by Yammer's former Dir. of Product


From maximizing the crave-ability of food additives to engineering notification addiction, Product Managers have a profound impact on society. In the not-too-distant future, many of those product decisions will be made by artificial intelligence. In this talk, we discussed ethical lessons from the history of Product Management and how to apply them to building ethical AI.

Yammer's former Director of Product also discussed how to understand data quality, biases, and potential impacts, and what your self-driving car will do when it encounters the Trolley Problem.

Published in: Technology


  1. Product Management Ethics in A.I. with Yammer’s former Director of Product
  2. FREE INVITE Join 12,000+ Product Managers on
  3. Product Management 2-month part-time Courses
  4. Coding for Managers 2-month part-time Courses
  5. Data Analytics for Managers 2-month part-time Courses
  6. Include @productschool and #prodmgmt at the end of your tweet Tweet to get a free ticket for our next Event!
  7. Drew Dillon Tonight’s Speaker
  8. Product Management Ethics in A.I.
  9. Drew Dillon Exec In Residence, Costanoa Ventures @drewdil
  10. Ethics
  11. Ethics in AI
  12. Dr. Julie Manella
  13. AI Ethics
  14. Observe - actively absorb the entire situation Orient - understand blind spots and biases Decide - form a hypothesis for action Act - test your hypothesis
  15. O O D A People Powered
  16. O O D A Big Data Driven
  17. O O D A Machine Learning
  18. O O D A Artificial Intelligence
  19. Interface Isolation
  20. Don’t Let Interfaces Isolate Interfaces are increasingly dynamic Interface decisions are increasingly made by machines We must design systems with a north star beyond personal preference and metric optimization
  21. BIASED DATA
  22. All data has an opinion. You can’t trust the opinions of data you didn’t collect. Otis Anderson Data Analyst, OpenDoor
  23. What’s wrong with this?
  24. Dr. Hermann Joseph Muller
  25. Muller began to realize that positive eugenics was achievable only in a society that had already achieved radical equality. Eugenics could not be the prelude to equality. Instead, equality had to be the precondition for eugenics. Without equality, eugenics would inevitably falter on the false premise that social ills, such as vagrancy, pauperism, deviance, alcoholism, and feeblemindedness were genetic ills—while, in fact, they merely reflected inequality.
  26. “Equality had to be the precondition for eugenics.”
  27. “Equality has to be the precondition for Artificial Intelligence.” To avoid interface isolation.
  28. Black Swans
  29. Digital Systems Are (mostly) Deterministic
  30. Uhh… Natural Systems Are Stochastic
  31. 26 Heads 30 Tails
  32. Make Allowances For Black Swans Attempted optimization of Black Swan events makes them worse We barely understand probability, often basing decisions on small samples We often are just optimizing the wrong things
  33. Feedback & Experimentation
  34. Program Feedback & Experimentation Optimizing systems need probabilistic models & failure states that incorporate this Product and Systems Thinking can fix this, but the whole of our profession needs to level up
  35. Transparency
  36. Algorithm Inventors PHDs Undergrads Specializing In AI Any Engineer Today
  37. Engineers increasingly rely on defaults Defaults aren’t transparent ∴ We’re heading towards opaque decisions
  38. Responsibility (Liability)
  39. “Why do I care about any of this?”
  40. “No one is going to buy a car that kills them.”
  41. If Facebook lets users segregate themselves into smaller and smaller groups leading to isolation and conflict, whose fault is it?
  42. If an autonomous drone bombs a hospital?
  43. If a teenager is repeatedly arrested for typical boneheaded teenager stuff and they spend the rest of their lives in and out of the legal system?
  44. Deep Remorse We Did Nothing Wrong I Was Paid To Do A Job
  45. Responsibility and ethics are murky and personal Liability won’t be, but it’ll take a while
  46. Okay, Let’s Wrap Up
  47. But If We’re Not Thoughtful… Greater isolation and unequal treatment Biased data becoming a self-fulfilling prophecy Leaning into and intensifying tragic disasters Systems that malfunction without human input The inability to even understand the malfunction
  48. Don’t let interfaces isolate Understand biases in data Make Allowances For Black Swans Program Feedback & Experimentation Encourage Transparency
  49. Questions?
  50. Part-time Product Management Courses in San Francisco, Silicon Valley, Los Angeles and New York
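Slides 29–32 contrast deterministic digital systems with stochastic natural ones and warn that we often base decisions on small samples, as in the 26-heads / 30-tails coin run on slide 31. A minimal sketch (not from the talk; the function name and seeds are illustrative) showing how small samples of a fair coin routinely stray from 50/50 while large samples settle near the true rate:

```python
import random

def flip_ratio(n, seed=None):
    """Flip a fair coin n times; return the observed fraction of heads."""
    rng = random.Random(seed)
    return sum(rng.random() < 0.5 for _ in range(n)) / n

# Small samples, like the 56 flips implied on slide 31, swing widely...
small = [flip_ratio(56, seed=s) for s in range(5)]
# ...while large samples converge toward the true probability of 0.5.
large = [flip_ratio(100_000, seed=s) for s in range(5)]

print("56 flips:  ", [round(r, 2) for r in small])
print("100k flips:", [round(r, 4) for r in large])
```

Seeing 26 heads in 56 flips (≈46%) is entirely consistent with a fair coin; optimizing a product metric on a sample that small mostly optimizes noise.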