
Caroline Sinders, Online Harassment Researcher, Wikimedia at The AI Conference 2017


Caroline Sinders is a machine learning designer, user researcher, and artist. For the past few years, she has been focusing on the intersections of natural language processing, artificial intelligence, abuse, online harassment, and politics in digital, conversational spaces. Caroline is a designer and researcher at Wikimedia and a BuzzFeed/Eyebeam Open Lab Fellow. She holds a master's degree from New York University's Interactive Telecommunications Program.
Emotional Trauma and Machine Learning
How do we create, code, and make emotional data inside of systems? And how do we create the necessary context in the larger systems that use that data? Is it possible to use machine learning to solve very hard problems around conversation?

For the past two years, I've been studying internet culture, online conversations, memes, and online harassment. I also worked as a user researcher at IBM Watson, helping design and lay out systems for chatbot software. As a designer and researcher interested in all of the nuances of human conversation and emotion, from humor to sadness, from memes to harassment, I wonder: is it possible to code emotions into machine learning systems? And what are the ethical implications of doing so? Can we design systems to mitigate harassment, to elevate humor? And can these systems promote human agency, allowing users to participate in deciding and structuring the system? Can design and user participation help set what is harassment and what is not?

With machine learning, the creators of the system often decide its norms, and the users are left out of the collaboration. How do we create systems that are transparent to users and that also facilitate user participation? With online communities, communication, and culture, users make, users do, users are the community.

  1. Emotional (labor + machine) Learning
  2. Caroline Sinders. User Researcher. Artist. hi! i study people on the internet.
  3. how a computer understands language @carolinesinders
  4. weather API: stormy / raining / snowy / sunny / hail. “hello! what is the weather today?” + location → “Hello, it’s sunny and 70 degrees.” (a sketch of this pipeline follows the list)
  5. @carolinesinders
  6. @carolinesinders
  7. cunt.
  8. scunthorpe. (a word-boundary filter sketch follows the list)
  9. how do we transparently design with algorithms?
  10. @carolinesinders
  11. Social Media is a mixed emotional and identity space. @carolinesinders
  12. a regular, ole, decontextualized facebook photo.
  13. “remove tag.”
  14. “remove photo.”
  15. “remove photo.” why?
  16. “remove photo.” why? “because i feel uncomfortable”
  17. “remove photo.” why? “because i feel uncomfortable” • I don’t like the photo • afraid to lose my job • afraid to upset my family • afraid to upset my peers • I feel unsafe
  18. “because i feel uncomfortable” • I don’t like the photo • afraid to lose my job • afraid to upset my family • afraid to upset my peers • I feel unsafe. Actions A User Can Take: mark as ‘annoying’, learn. Algorithmic Intervention: 1. mark as abuse, 2. rate the level of abuse, 3. put in moderator queue, 4. add in an estimated wait time for response; ask for more information, contact immediately, etc.
  19. Algorithmic Intervention: 1. mark as abuse, 2. rate the level of abuse, 3. put in moderator queue, 4. add in an estimated wait time for response. Moderator Enabled Learning: moderators, in real time, can re-rate, re-tag, flag new abusive content and words, and add in adjusted wait times. Supervised Machine Learning: the data being fed to the algorithm is constantly updated, so it’s never stale. (a sketch of this loop follows the list)
  20. Machines process data and images faster than humans. @carolinesinders
  21. Slang and harassment-specific vernacular changes quickly and often.
  22. How do we create spaces that have user-generated definitions of safety? @carolinesinders
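A minimal sketch of the intent-and-slot pipeline slide 4 illustrates: the bot matches a weather intent, fills a location slot, queries a weather API, and renders a reply. The get_forecast function is a hypothetical stand-in for a real weather service, and the keyword-matching "intent model" is deliberately simplistic; this is not the system described in the talk.

```python
import re

def get_forecast(location):
    # Hypothetical stand-in for a real weather API call; a real bot
    # would query a weather service with the resolved location.
    return {"condition": "sunny", "temperature_f": 70}

# Toy "intent model": a production system would use an NLP classifier.
WEATHER_INTENT = re.compile(r"\bweather\b", re.IGNORECASE)

def respond(message, user_location):
    """Match the intent, fill the location slot, call the API, reply."""
    if WEATHER_INTENT.search(message):
        forecast = get_forecast(user_location)
        return "Hello, it's {condition} and {temperature_f} degrees.".format(**forecast)
    return "Sorry, I didn't understand that."

print(respond("hello! what is the weather today?", "New Orleans"))
# -> Hello, it's sunny and 70 degrees.
```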
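Slides 7 and 8 point at the classic Scunthorpe problem: a profanity filter that matches substrings flags the English town of Scunthorpe because its letters contain a slur. A word-boundary match, sketched below under the assumption of a simple blocklist, avoids that particular false positive, though it still misses misspellings and says nothing about context, which is the talk's larger point about designing transparently with algorithms.

```python
import re

BLOCKLIST = ["cunt"]

def naive_filter(text):
    # Substring matching: flags "Scunthorpe" along with actual abuse.
    lower = text.lower()
    return any(word in lower for word in BLOCKLIST)

def boundary_filter(text):
    # Word-boundary matching: flags the word only when it stands alone.
    return any(re.search(r"\b" + re.escape(word) + r"\b", text, re.IGNORECASE)
               for word in BLOCKLIST)

print(naive_filter("I live in Scunthorpe."))     # True (false positive)
print(boundary_filter("I live in Scunthorpe."))  # False
```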
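Slides 18 and 19 describe a loop in which user reports enter a moderator queue, moderators re-rate and re-tag items in real time, and the relabeled data feeds a supervised model so its training set never goes stale. The sketch below is an assumption-laden illustration of that loop, not Wikimedia's actual system: the AbuseReport structure, the two-minutes-per-item wait estimate, and the method names are all invented for the example.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class AbuseReport:
    text: str
    reporter_rating: int                    # user-rated level of abuse, e.g. 1-5
    created: datetime = field(default_factory=datetime.now)
    moderator_rating: Optional[int] = None  # filled in on review

class ModerationQueue:
    def __init__(self):
        self.pending = []   # reports awaiting a moderator
        self.labeled = []   # (text, label) pairs for retraining the model

    def report(self, text, reporter_rating):
        # Slide 18: mark as abuse, rate it, queue it, estimate a wait time.
        self.pending.append(AbuseReport(text, reporter_rating))
        return "Estimated wait: {} minutes".format(2 * len(self.pending))

    def review(self, report, moderator_rating):
        # Slide 19: moderators re-rate in real time, so the data fed to
        # the supervised model is constantly updated, never stale.
        report.moderator_rating = moderator_rating
        self.pending.remove(report)
        self.labeled.append((report.text, moderator_rating))

queue = ModerationQueue()
print(queue.report("example abusive message", reporter_rating=4))
queue.review(queue.pending[0], moderator_rating=5)
print(queue.labeled)  # fresh labels, ready for periodic retraining
```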
