
The Future of Moral Persuasion in Games, AR, AI Bots, and Self Trackers by Sherry Jones (April 18, 2019)



4-18-19 - This presentation was shown at the eLearning Consortium of Colorado (eLCC) Annual Conference. The focus of the talk is on the various ethical problems that currently exist in the technology industry and predictions of how future technologies, such as Digital Games, AR, AI Bots, and Self Trackers, will be designed to morally persuade users.

The presentation that includes the video can be accessed here: http://bit.ly/futureethics

Published in: Education


  1. The Future of Moral Persuasion in Games, AR, AI Bots, and Self Trackers
  2. HELLO! I am Sherry Jones. Philosophy and Games Studies SME and Instructor, Rocky Mountain College of Art and Design. Steering Committee Board Member, International Game Developers Association - Learning, Games, and Education Special Interest Group (IGDA LEG). Judge, Software Information Industry Association (SIIA) CODiE Awards - Games, Virtual Reality, Augmented Reality, and Gamification in Education. Twitter @autnes Bio http://bit.ly/sherryjonesbio Slides http://bit.ly/futureethics
  3. 1. What is Morality? What is Ethics? Pinning Down the Definitions.
  4. Morality and Ethics ⊗ Morality - An individual’s personal sense of right vs. wrong values, or understanding of right actions vs. wrong actions (this moral sense is shaped by familial, cultural, religious, political, and social factors in one’s upbringing). ⊗ Ethics - A system of moral codes applied universally to everyone in a society for the purpose of ensuring that society’s survival. ⊗ All technologies that influence user actions/behaviors are moral and political in nature.
  5. 2. Ethical Problems in the Technology Sector: Web, Facial Recognition, Autonomous Weapons, Killer Robots, AI, Trackers.
  6. We [have] demonstrated that the Web failed instead of served humanity, as it was supposed to have done, and failed in many places…. The increasing centralization of the Web ended up producing—with no deliberate action of the people who designed the platform—a large-scale emergent phenomenon which is anti-human. -- Tim Berners-Lee, creator of the World Wide Web.
  7. Surveillance Capitalism ⊗ Technologies invade privacy by surveilling, tracking, and collecting user data (using EULA legalese as justification). ⊗ Technologies sell user data to third parties, who then create user-targeted advertising campaigns to maximize profit. ⊗ Technologies reflect the biases of their developers and judge and discriminate against users based on their social, moral, and monetary worth. ⊗ Third parties, such as health insurance companies, can use the data to pry into personal lives and decide whether to punish or reward users for performing (un)expected behaviors.
  8. Link to Article
  9. As companies and governments deploy these A.I. technologies, researchers are also realizing that some systems are woefully biased. Facial recognition services, for instance, can be significantly less accurate when trying to identify women or someone with darker skin. Other systems may include security holes unlike any seen in the past. Researchers have shown that driverless cars can be fooled into seeing things that are not really there. --- Is Ethical AI Even Possible? by Cade Metz, NY Times (March 1, 2019).
  10. Employees at Clarifai worry that the same technological tools that drive facial recognition will ultimately lead to autonomous weapons — and that flaws in these tools will open a Pandora’s box of problems. “We in the industry know that technology can be compromised. Hackers hack. Bias is unavoidable,” read the open letter to Mr. Zeiler. --- Is Ethical AI Even Possible? by Cade Metz, NY Times (March 1, 2019).
  11. Link to Article
  12. Link to Article
  13. Link to Article
  14. Link to Article
  15. Link to Article
  16. Link to Article
  17. Link to Article
  18. Link to Article
  19. Link to Article
  20. Link to Video
  21. Link to Article
  22. 3. Some Ethical Design Proposals: A Call to Design Ethical Technologies.
  23. Ethical Design Pyramid by Ind.ie
  24. Value Sensitive Design Metrics by Friedman and Kahn
  25. AI Ethics from Companies and Education ⊗ Google’s AI Principles ⊗ Microsoft’s AI Ethics ⊗ Embedded EthiCS: Bringing Ethical Reasoning Into the Computer Science Curriculum
  26. 4. Missing from Design Conversations: “Whose” Values Are We Promoting? Cultural, Political, or Industrial Values?
  27. Link to Article
  28. Players in Eastern-cluster countries were more likely than those in the Western and Southern countries to kill a young person and spare an old person (represented, in the game, by a stooped figure holding a cane). Players in Southern countries were more likely to kill a fat person (a figure with a large stomach) and spare an athletic person (a figure that appeared mid-jog, wearing shorts and a sweatband). --- Findings from MIT's Moral Machine (January 24, 2019).
  29. Players in countries with high economic inequality (for example, in Venezuela and Colombia) were more likely to spare a business executive (a figure walking briskly, holding a briefcase) than a homeless person (a hunched figure with a hat, a beard, and patches on his clothes). In countries where the rule of law is particularly strong—like Japan or Germany—people were more likely to kill jaywalkers than lawful pedestrians. --- Findings from MIT's Moral Machine (January 24, 2019).
  30. Link to Article
  31. Google faced intense backlash soon after announcing that one of the eight council members was Kay Coles James, the president of the Heritage Foundation, a conservative thinktank with close ties to Donald Trump’s administration. James has a history of fighting against trans rights and LGBT protections, has advocated for Trump’s proposed border wall, and has taken a vocal stance against abortion rights. --- Google Scraps AI Ethics Council After Backlash: 'Back to the Drawing Board' (April 4, 2019).
  32. Link to Article
  33. The composition of the HLEG AI group is part of the problem: it consisted of only four ethicists alongside 48 non-ethicists – representatives from politics, universities, civil society, and above all industry. That's like trying to build a state-of-the-art, future-proof AI mainframe for political consulting with 48 philosophers, one hacker and three computer scientists (two of whom are always on vacation). --- Ethics Washing Made in Europe by Thomas Metzinger (April 8, 2019).
  34. Because industry acts more quickly and efficiently than politics or the academic sector, there is a risk that, as with “Fake News”, we will now also have a problem with fake ethics, including lots of conceptual smoke screens and mirrors, highly paid industrial philosophers, self-invented quality seals, and non-validated certificates for “Ethical AI made in Europe”. --- Ethics Washing Made in Europe by Thomas Metzinger (April 8, 2019).
  35. 5. Design Future: Humanlike AI Bots Questioning Moral Values, Thinking Ethically.
  36. Prediction #1: The future of morally persuasive design in Games, AR, AI Bots, and Self Trackers will include humanlike AIs conducting “moral conversations” with users to reflect on moral values and question whether or not those values can help solve social problems.
  37. AI Chatbot: Laozi. Converse with Laozi, the chatbot, to understand his moral philosophy.
  38. Prediction #2: AI will engage users in moral reasoning, listing all possible choices for a situation before the user makes a decision.
  39. A Choice Scene from The Walking Dead Game
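A minimal sketch of what Prediction #2 describes: an assistant that enumerates every available choice and its foreseeable consequence before the user commits, much like a choice scene in The Walking Dead. Everything here (the `Choice` type, `present_choices`, and the sample scenario) is a hypothetical illustration, not any real product's API.

```python
# Hypothetical sketch: an assistant that lists every available choice and its
# foreseeable consequences before the user commits to a decision.

from dataclasses import dataclass

@dataclass
class Choice:
    label: str
    consequence: str  # the assistant's stated foreseeable outcome

def present_choices(situation: str, choices: list[Choice]) -> str:
    """Render the situation and all options so the user can reason before acting."""
    lines = [f"Situation: {situation}", "Before you decide, consider every option:"]
    for i, c in enumerate(choices, 1):
        lines.append(f"  {i}. {c.label} -> likely outcome: {c.consequence}")
    return "\n".join(lines)

prompt = present_choices(
    "A stranger asks to join your group.",
    [Choice("Welcome them", "more mouths to feed, but more help"),
     Choice("Turn them away", "safer supplies, but they may not survive")],
)
```

The point of the design is sequencing: the full option space, with stated consequences, is surfaced *before* the decision, rather than the system judging the choice afterward.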
  40. Prediction #3: Companies may attempt to steer users’ moral behaviors to prevent problems before they occur. Ex. A user types an angry/abusive letter in Google Docs. An AI bot appears, asks the user whether doing so is a good idea, and warns the user of the potential legal consequences of their actions.
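In its simplest form, the intervention Prediction #3 imagines could be a rule-based tone check that runs before a document is shared. The sketch below is purely hypothetical: the wordlist, threshold, and `moral_nudge` function are illustrative assumptions, not any real Google Docs feature or API.

```python
# Hypothetical sketch of a rule-based "moral nudge" in a document editor.
# The wordlist and threshold are illustrative assumptions only.
from typing import Optional

ABUSIVE_TERMS = {"idiot", "hate", "stupid", "worthless"}

def moral_nudge(text: str, threshold: int = 2) -> Optional[str]:
    """Return a warning message if the text looks angry/abusive, else None."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    hits = words & ABUSIVE_TERMS
    if len(hits) >= threshold:
        return (f"This draft contains charged language ({', '.join(sorted(hits))}). "
                "Sending it could have personal or legal consequences. Continue?")
    return None

warning = moral_nudge("You are a stupid, worthless idiot and I hate you!")
```

A production system would use a trained toxicity classifier rather than a wordlist, but the ethical shape is the same: the system interposes a question and a consequence warning between the user's draft and the user's action.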
  41. Prediction #4: Technologies will be legally required to reveal their own ethical codes, moral values, and past practices when users question the design of the technologies. Ex. Tinder directly tells the user that it is the company’s policy to evaluate the attractiveness of the user based on their photo and wealth.
  42. Hotness.ai
  43. Prediction #5: Quantified Self technologies will be legally required to tell users how their data will be used, who will get access to their data, and the consequences of sharing the data. Ex. FamilyTreeDNA tells its users that it may need to hand over users’ DNA data to government entities, and that the collected data is not private.
  44. FamilyTreeDNA, an early pioneer of the rapidly growing market for consumer genetic testing, confirmed late Thursday that it has granted the Federal Bureau of Investigation access to its vast trove of nearly 2 million genetic profiles. The arrangement was first reported by BuzzFeed News. Concerns about unfettered access to genetic information gathered by testing companies have swelled since April, when police used a genealogy website to ensnare a suspect in the decades-old case of the Golden State Killer. But that site, GEDmatch, was open-source, meaning police were able to upload crime-scene DNA data to the site without permission. The latest arrangement marks the first time a commercial testing company has voluntarily given law enforcement access to user data. --- Major DNA Testing Company Sharing Genetic Data With the FBI by Kristen V Brown (February 1, 2019).
  45. THANKS! Philosophy and Games Studies SME and Instructor, Rocky Mountain College of Art and Design. Steering Committee Board Member, International Game Developers Association - Learning, Games, and Education Special Interest Group (IGDA LEG). Judge, Software Information Industry Association (SIIA) CODiE Awards - Games, Virtual Reality, Augmented Reality, and Gamification in Education. Twitter @autnes Bio http://bit.ly/sherryjonesbio Slides http://bit.ly/futureethics
  46. CREDITS Special thanks to all the people who made and released these awesome resources for free: ⊗ Presentation template by SlidesCarnival ⊗ Photographs by Unsplash
