
The Web We Want: Dealing with the dark side of social media (work in progress)


Work in progress, presented at the Let's Get Real conference in London.



  1. The Web We Want, or, Dealing with the Dark Side of Social Media. Michael Peter Edson | @mpedson. Let’s Get Real conference, London, 2 March 2020. Work in progress.
  2. https://youtu.be/HmTcLlNJNqY (with NWA’s Express Yourself in your mind: https://youtu.be/u31FO_4d9TY)
  3. Framing questions: Are we compromising the safety of our audiences if we invite them to engage with us on 3rd party social media? ● Are we complicit? ● Do we have an obligation to protect our audiences? ● Do we have an obligation to protect/shepherd technology platforms (Internet, web, mobile, etc.)? Hypothesis: I say yes, we have an obligation here. There are few perfect choices in front of us, but consequential short-term action is possible, and necessary, to mitigate the harms and establish clearer paths forward for ourselves and our communities.
  4. I have a new scar on my face from some recent surgery, so I’ve been thinking about knots and surgeons, knowledge and practice…
  5. https://www.ted.com/talks/ed_gavagan_a_story_about_knots_and_surgeons/transcript?language=en
  6. https://www.ted.com/talks/ed_gavagan_a_story_about_knots_and_surgeons/transcript?language=en “So he starts to tell them, and he's like, ‘No, this is very important here. You know, when you're needing these knots, it's going to be, you know, everything's going to be happening at the same time, it's going to be -- you're going to have all this information coming at you, there's going to be organs getting in the way, it's going to be slippery, and it's just very important that you be able to do these beyond second nature, each hand, left hand, right hand, you have to be able to do them without seeing your fingers.’”
  7. https://www.ted.com/talks/ed_gavagan_a_story_about_knots_and_surgeons/transcript?language=en Practice and skill — professionalism — matter in every profession. And so does luck. “Chance favors the prepared mind” (more on that later).
  8. https://anildash.com/2012/12/13/the_web_we_lost/ “The first step [is…] to know your s**t.”
  9. (Our s**t has changed)
  10. 1994: Unix Wizard Web
  11. 2000: The web of HTML (and CSS)
  12. 2005: The dynamic web (Separate content and design. ColdFusion!)
  13. 2006: Ajax
  14. 2010: The programmable web (APIs!)
  15. 2011: The web as a platform (Drupal! LAMP!)
  16. 2006-2011: Mobile
  17. 3rd party social media
  18. • 1994: Unix Wizard Web • 2000: The web of HTML (and CSS) • 2004: The dynamic web (Separate content and design. ColdFusion!) • 2006: Ajax • 2010: The web as an application that you program • 2011: The web as a platform (Drupal! LAMP!) • 2011: Mobile • 3rd party Social Media • The corporate walled garden (“The Frightful Five” — F. Manjoo) • ...Next?! The splinternet? Skynet? The Web We Want? Each jump required different skills. New people came, (some) existing people left.
  19. • 1994: Unix Wizard Web • 2000: The web of HTML (and CSS) • 2004: The dynamic web (Separate content and design. ColdFusion!) • 2006: Ajax • 2010: The web as an application that you program • 2011: The web as a platform (Drupal! LAMP!) • 2011: Mobile • 3rd party Social Media • The corporate walled garden (“The Frightful Five” — F. Manjoo) • ...Next?! The splinternet? Skynet? The Web We Want? Also, somewhere in here… “executives” started to buy in, hire staff.
  20. • 1994: Unix Wizard Web • 2000: The web of HTML (and CSS) • 2004: The dynamic web (Separate content and design. ColdFusion!) • 2006: Ajax • 2010: The web as an application that you program • 2011: The web as a platform (Drupal! LAMP!) • 2011: Mobile • 3rd party Social Media • The corporate walled garden (“The Frightful Five” — F. Manjoo) • ...Next?! The splinternet? Skynet? The Web We Want? Also, somewhere in here… “executives” started to buy in, hire staff. The tech was more recognizable to them (because they used it too), and the people, the employees, were more recognizable (less technical, more social).
  21. (Our s**t has changed) And a lot has changed very recently, exactly in the epicenter of our current, most important skillset: community and the social web. …Therefore, we need to know what’s going on (Anil Dash) and we need a new set of sensibilities, knowledge and skills. We’ve got to get *good* at this (like Ed Gavagan’s surgeons).
  22. Book Research / UN Live / Partner stories (Through personal experience, making the case for the impact of “the dark side” on our institutions and the public)
  23. Book Research: Over 2,000 articles reviewed in 2019. About 1,000 pages of notes.
  24. Book Research: Over 2,000 articles reviewed in 2019. About 1,000 pages of notes. Re-living a descent into hell…
  25. The Dark Side emerges: “Nutsville”, 2012-2019, from the Sony Hack to Cambridge Analytica to election interference. 2000s: Worms and viruses. 2013: Snowden. 2014: Sony Pictures Hack. 2016: Election interference. 2017–2018: Elections, bots, Facebook, YouTube (recommendations, conspiracy theorists…). 2018: Cambridge Analytica.
  26. https://www.nytimes.com/2017/05/10/technology/techs-frightful-five-theyve-got-us.html “…we have not, as a society, come to grips with the scope of [the Frightful Five’s] control over our lives. And we don’t have many good ways to limit it, if we decided that’s what we’d like to do.”
  27. https://www.nytimes.com/2017/05/10/technology/techs-frightful-five-theyve-got-us.html “In 2007, when Mr. Jobs unveiled the iPhone, just about everyone greeted the new device as an unalloyed good. […] That’s no longer true. The State of the Art, today, is a bag of mixed emotions. Tech might improve everything. And it’s probably also terrible in ways we’re only just starting to understand.”
  28. Some of what’s screwed in our Social Media environment: 1. Aggressive manipulation of civic messages by 3rd parties — Example: Russia in US elections; YouTube and Brazilian elections; Facebook & Indian elections. 2. Aggressive promotion of incendiary content; conspiracy theories — Example: anti-vaccination content. 3. Predatory advertising to the vulnerable — Example: offer payday loans to the financially vulnerable. 4. Targeted advertising designed to exclude people — Example: hiding apartment rentals from minority populations; hiding elections information from those likely to vote for a political opponent. 5. Undermining regulatory scrutiny — Example: ensure lawmakers/regulators don’t get illegal ads/messages/products; real example = Uber’s “operation greyball”. 6. Undermining legitimate groups — Example: fake accounts undermine legitimate First Nations groups. 7. Tolerating hate speech and harassment — Example: Twitter bots and harassment. 8. Wanton disregard of viral, but destructive, content — Example: YouTube’s recommendation engine. 9. Data collection, aggregation, and sale — Example: adtech brokering. 10. Anti-democratic behavior — Example: everything.
  29. Some of what’s screwed in our Social Media environment: 1. Aggressive manipulation of civic messages by 3rd parties — Example: Russia in US elections; YouTube and Brazilian elections; Facebook & Indian elections. 2. Aggressive promotion of incendiary content; conspiracy theories — Example: anti-vaccination content. 3. Predatory advertising to the vulnerable — Example: offer payday loans to the financially vulnerable. 4. Targeted advertising designed to exclude people — Example: hiding apartment rentals from minority populations; hiding elections information from those likely to vote for a political opponent. 5. Undermining regulatory scrutiny — Example: ensure lawmakers/regulators don’t get illegal ads/messages/products; real example = Uber’s “operation greyball”. 6. Undermining legitimate groups — Example: fake accounts undermine legitimate First Nations groups. 7. Tolerating hate speech and harassment — Example: Twitter bots and harassment. 8. Wanton disregard of viral, but destructive, content — Example: YouTube’s recommendation engine. 9. Data collection, aggregation, and sale — Example: adtech brokering. 10. Anti-democratic behavior — Example: everything. Drawn from my own thinking, and from “6 ways social media has become a direct threat to democracy” by Pierre Omidyar; “information fiduciaries” by J. Zittrain; Jason Koebler; Zeynep Tufekci.
  30. https://www.omidyar.com/news/6-ways-social-media-has-become-direct-threat-democracy
  31. https://cyber.harvard.edu/story/2018-09/platforms-should-become-information-fiduciaries
  32. 1. Aggressive manipulation of civic messages by 3rd parties. Examples: Russia in US elections; YouTube and Brazilian elections; Facebook & Indian elections.
  33. Recommended Reading: https://www.buzzfeednews.com/article/ryanhatesthis/brazil-jair-bolsonaro-facebook-elections
  34. https://www.nytimes.com/2019/03/06/opinion/india-pakistan-news.html “The internet truly is super-duper fake, and thanks to the malleability of digital media and the jet fuel of network virality, a digital lie can spread more quickly, and cause more damage, than an analog one. […] In India, Pakistan and everywhere else, addressing digital mendacity will require a complete social overhaul. … The information war is a forever war. We’re just getting started.”
  35. https://www.nytimes.com/2018/10/15/technology/myanmar-facebook-genocide.html By Paul Mozur, Oct. 15, 2018
  36. 2. Aggressive promotion of incendiary content; conspiracy theories. Example: anti-vaccination content; France’s Yellow Vests.
  37. “This isn’t the first time real-life violence has followed a viral Facebook storm and it certainly won’t be the last. Much has already been written about the anti-Muslim Facebook riots in Myanmar and Sri Lanka and the WhatsApp lynchings in Brazil and India. Well, the same process is happening in Europe now, on a massive scale. Here’s how Facebook tore France apart.” https://www.buzzfeednews.com/article/ryanhatesthis/france-paris-yellow-jackets-facebook
  38. It’s not just YouTube and Facebook… Instagram and WhatsApp are powerful vectors for fake news and conspiracy theories. https://www.telegraph.co.uk/technology/2019/03/26/just-clicks-instagram-led-terrifying-anti-vaxx-wormhole/
  39. http://time.com/5512032/whatsapp-india-election-2019/
  40. 3. Predatory advertising to the vulnerable. Example: offer payday loans to the financially vulnerable. 4. Targeted advertising designed to exclude people. Example: hiding apartment rentals from minority populations; hiding elections information from those likely to vote for a political opponent. These two are mostly from “information fiduciaries” by J. Zittrain. Not decisively proven at this point, as far as I can tell, but freaky and likely in my judgment.
  41. https://www.theguardian.com/technology/2019/mar/28/facebook-ads-housing-discrimination-charges-us-government-hud
  42. 5. Undermining regulatory scrutiny. Example: ensure lawmakers/regulators don’t get illegal ads/messages/products; Uber’s “operation greyball”.
  43. https://www.nytimes.com/2017/03/03/technology/uber-greyball-program-evade-authorities.html
  44. 6. Undermining legitimate groups. Example: fake accounts undermine legitimate First Nations groups.
  45. “Our human faculties for sense-making, and evaluating and validating information, are being challenged and in some ways destroyed in this new information ecosystem. We are all getting false signals. This affects our ability to construct and apply trust. #trust “It also creates an opportunity for bad actors who understand how to exploit this ecosystem.” https://www.niemanlab.org/2018/03/living-in-a-sea-of-false-signals-are-we-being-pushed-from-trust-but-verify-to-verify-then-trust/
  46. “The fundamental issue with the bogus Native American pages is that they mislead the public to believe that they are coming from Native American people. They allow problematic ideas to go unanswered and, worse yet, they are frequently the source of dissemination of such toxic mindsets. They do not build communities, but destroy them.” — Sarah Thompson. https://www.facebook.com/notes/exploiting-the-niche/part-i-warning-signs/416598235447814/
  47. 7. Tolerating hate speech and harassment. Example: Twitter bots and harassment.
  48. https://mashable.com/article/amnesty-study-twitter-abuse-women/
  49. “TikTok’s efforts to provide locally sensitive moderation have resulted in it banning any content that could be seen as positive to gay people or gay rights, down to same-sex couples holding hands, even in countries where homosexuality has never been illegal, the Guardian can reveal.” https://www.theguardian.com/technology/2019/sep/26/tiktoks-local-moderation-guidelines-ban-pro-lgbt-content
  50. 8. Wanton disregard of viral, but destructive, content. Example: YouTube’s recommendation engine.
  51. https://www.scpr.org/programs/airtalk/2019/06/05/64558/down-the-rabbit-hole-the-dark-side-of-youtube-s-au/ https://www.nytimes.com/2018/03/10/opinion/sunday/youtube-politics-radical.html https://www.nytimes.com/2019/06/03/world/americas/youtube-pedophiles.html
  52. Recommended Reading: Everything by Zeynep. https://twitter.com/zeynep/status/1002611580859834368
  53. 9. Data collection, aggregation, and sale. Example: adtech brokering.
  54. https://logicmag.io/play/shengwu-li-on-online-advertising/
  55. https://logicmag.io/play/shengwu-li-on-online-advertising/
  56. https://logicmag.io/play/shengwu-li-on-online-advertising/
  57. 10. Anti-democratic behavior. Example: everything.
  58. https://freedomhouse.org/report/freedom-net/freedom-net-2018/rise-digital-authoritarianism
  59. Still don’t believe me? https://twitter.com/baekdal/status/1198227126996099074
  60. 30 references re: the dark side of social media: https://www.usingdata.com/usingdata/2019/11/25/society-is-more-than-a-bazaar-links-on-social-media?rq=bazaar
  61. So what’s going on? What are the big patterns?
  62. “Social media is in a pre-Newtonian moment, where we all understand that it works, but not how it works.” — Kevin Systrom, Facebook/Instagram, 2018. https://www.nytimes.com/2018/09/27/opinion/facebook-instagram-systrom.html
  63. Moxie Marlinspike on Expanding Choice Scope: https://youtu.be/DoeNbZlxfUM?t=778
  64. Moxie Marlinspike on Expanding Choice Scope: https://youtu.be/DoeNbZlxfUM?t=778 The decision to use Facebook today is a different decision than it was 10 years ago.
  65. Global-local context collapse: https://medium.com/berkman-klein-center/rebalancing-regulation-of-speech-hyper-local-content-on-global-web-based-platforms-1-386d65d86e32
  66. Content moderation at scale: https://www.vice.com/en_us/article/xwk9zd/how-facebook-content-moderation-works
  67. Corporate self-interest: https://www.vice.com/en_us/article/xwk9zd/how-facebook-content-moderation-works
  68. Corporate self-interest: “When Facebook users learned last spring that the company had compromised their privacy in its rush to expand, allowing access to the personal information of tens of millions of people to a political data firm linked to President Trump, Facebook sought to deflect blame and mask the extent of the problem. “And when that failed […] Facebook went on the attack.” https://www.nytimes.com/2018/11/14/technology/facebook-data-russia-election-racism.html
  69. The “handoffs” between different sectors of society (e.g., journalism, courts, lawmakers) aren’t working now, so everyone has to do more than they would in normal times. The mixture of good and bad is confusing… as is “chance”: you don’t always know if what you do, the risks you take, will help, and that uncertainty can be hard.
  70. We have a dilemma. All of this presents a dilemma for small/medium-sized enterprises like most of ours: 1. The “no new network” problem: new networks, with no users, are not an attractive option. 2. Few resources/depth with which to react: most in our network are small/medium enterprises that run on shoestring budgets, lack political clout, and work in relative isolation (though note libraries as a possible exception). 3. 3rd party social media is efficient for us: it’s where our audiences are, and good still happens there. 4. SO….
  71. Remedies?
  72. Back to our framing questions: Are we compromising the safety of our audiences if we invite them to engage with us on 3rd party social media? ● Are we complicit? ● Do we have an obligation to protect our audiences? ● Do we have an obligation to protect/shepherd technology platforms (Internet, web, mobile, etc.)? Hypothesis: I say yes, we have an obligation here. There are few perfect choices in front of us, but consequential short-term action is possible, and necessary, to mitigate the harms and establish clearer paths forward for ourselves and our communities.
  73. What to do #1: Know your s**t. If you use these platforms professionally, you should become conversant with the social/civic/ethical realities of 3rd party social media use.
  74. What to do #2: Take responsibility for informing your community. Inform your community through privacy policies, notices, blogging, social media profile statements, events announcements and signage, etc. Make sure you have educated your audience.
  75. What to do #3: Don’t put your eggs in one basket. Continue to use social media, but also invest in the development of your own digital properties (i.e., your core website, newsletters, mailing lists). Use social media as a pointer to more robust content on your own site. Build your mailing lists, and use newsletters and other content to create value around these. Provide alternate means of accessing your content/community, for example RSS and email subscriptions. These methods won’t replace 3rd party social media platforms, but they will help to undermine their inordinate power and influence.
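  As a small illustration of the “alternate means of access” point above, here is a minimal sketch (not from the talk; the site name, URL, and items are invented placeholders) of generating a bare-bones RSS 2.0 feed with nothing but the Python standard library — the kind of low-cost, platform-independent channel a small institution can own outright:

```python
# Minimal sketch: build an RSS 2.0 feed for your own site with the Python
# standard library, so subscribers can follow you without a 3rd party platform.
import xml.etree.ElementTree as ET

def build_rss(site_title, site_url, items):
    """Return an RSS 2.0 document (as a string) for a list of (title, link) items."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = site_title
    ET.SubElement(channel, "link").text = site_url
    ET.SubElement(channel, "description").text = f"Updates from {site_title}"
    for title, link in items:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = title
        ET.SubElement(item, "link").text = link
    return ET.tostring(rss, encoding="unicode")

# Hypothetical example content:
feed = build_rss("Example Museum", "https://example.org",
                 [("New exhibition", "https://example.org/exhibit")])
print(feed)
```

  Serving the resulting string at `/feed.xml` is enough for any feed reader; no account, no algorithm, no tracking.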
  76. What to do #4: Have a halves-and-doubles mindset. It’s not necessarily realistic to completely eliminate the use of 3rd party social media, but think about halving your use, and doubling the speed at which you would normally make that kind of change. Amazon —> other merchants. Google —> DuckDuckGo search engine. Chrome —> Brave browser, Firefox. Gmail —> paid mail service. Mobile device —> put down the phone.
  77. What to do #5: Evaluate the platforms you use. 1. Analyze which platforms you use (e.g., Facebook, YouTube, Twitter): a. What do you do there? What does your audience do? i. Create “write only” content, with minimal interaction with the audience. ii. Interact with comments and users. iii. Develop apps, features, surveys, and other activities that generate personalized data. b. The ethical implications of each (a moving target). 2. Tactics for mitigating harm (a spectrum of choices; each has pros and cons to be weighed): a. Withdraw altogether. b. Inform/educate participants. c. Substitutions (Vimeo for YouTube?). d. Minimize content/interaction and direct the public to other properties (like your own website). e. Eliminate the use of 3rd party cookies. f. Eliminate the use of micro-targeted advertising. g. Choose vendors/platforms carefully.
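  A concrete starting point for item 2e in the list above, sketched with Python’s standard library (the cookie name and value are hypothetical, not from the talk): while you phase out third-party trackers, you can also harden your own first-party cookies so they are never sent on cross-site requests and never exposed to injected scripts.

```python
# Minimal sketch: emit a hardened first-party Set-Cookie header.
# SameSite=Strict stops the cookie riding along on cross-site requests;
# HttpOnly hides it from JavaScript; Secure restricts it to HTTPS.
from http.cookies import SimpleCookie

cookie = SimpleCookie()
cookie["session"] = "abc123"              # hypothetical session token
cookie["session"]["secure"] = True        # HTTPS only
cookie["session"]["httponly"] = True      # no script access
cookie["session"]["samesite"] = "Strict"  # never sent cross-site

header = cookie.output()                  # a ready-to-send header line
print(header)
```

  Any web framework will accept a header built this way; the `samesite` attribute requires Python 3.8 or later.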
  78. What to do #6: Become an “information fiduciary”. A concept invented by Jack Balkin, Yale Law School. “‘Fiduciary’ has a legalese ring to it, but it’s a long-standing, commonsense notion. The key characteristic of fiduciaries is loyalty: They must act in their charges’ best interests, and when conflicts arise, must put their charges’ interests above their own. That makes them trustworthy. Like doctors, lawyers, and financial advisers, social media platforms and their concierges are given sensitive information by their users, and those users expect a fair shake — whether they’re trying to find out what’s going on in the world or how to get somewhere or do something.” …A fiduciary duty wouldn’t broadly rule out targeted advertising — dog owners would still get dog food ads — but it would preclude predatory advertising, like promotions for payday loans. It would also prevent data from being used for purposes unrelated to the expectations of the people who shared it… (Quotes from “How to Exercise the Power You Didn’t Ask For”, Jonathan Zittrain, Harvard Business Review, September 19, 2018.) https://blogs.harvard.edu/jzwrites/2018/10/29/how-to-exercise-the-power-you-didnt-ask-for/
  79. A thought: using a human rights standard. Pretty macro, but a constructive and necessary perspective on the “discretionary and vague” policy decisions by platforms. From David Kaye, UN Special Rapporteur on freedom of opinion and expression. https://twitter.com/davidakaye/status/1099107647763054594
  80. What to do #7: Model the behaviors of civic discourse. “Find a well-moderated corner of the internet. It can sometimes seem as if all the internet is deep fakes and culture wars, Trump tweets and influencer scams. It’s not, of course. The internet still abounds in lovely, wholesome niches — the fantasy sports circles, the YouTube and Instagram communities devoted to any kind of craft, the many subreddits where strangers come together to help one another out of real problems in life. “What distinguishes the productive online communities from the disturbing ones? Often it’s something simple: content moderation. The best places online are bounded by clear, well-enforced community guidelines for participation. Twitter and Facebook are toxic because there are few rules and few penalties for flouting them. A Reddit community like r/relationships, meanwhile, is a haven of incredible, empathetic discussion because its hosts spend a lot of effort policing the discussion toward productive dialogue. This gets at the plain truth of the internet: A better digital world takes work. It’s work all of us should do.” From Farhad Manjoo, NY Times Opinion, 1 January 2020, “Only You Can Prevent Dystopia”. https://www.nytimes.com/2020/01/01/opinion/social-media-2020.html
  81. What to do #8: Use “POLP” at work and home. POLP, the Principle Of Least Privilege from IT security, means using the minimum permissions you need for any given task (so you don’t accidentally delete your own hard drive or give root permissions to a virus). I.e., give/leak as little information to 3rd parties as possible. Use The Big Platforms when you need to, but when you don’t, don’t. ● Switch to more private browsers (Brave, Firefox, Opera) ● Switch to more private email, documents ● Disable location tracking when you don’t need it; turn off your phone; use a Faraday (signal-blocking) case; use a “dumb phone” ● Don’t install apps that “leak” data back to aggregators. https://en.wikipedia.org/wiki/Principle_of_least_privilege
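  The IT-security idea behind the slide above can be sketched in a few lines. This is a toy model (the task names and scope strings are invented for illustration, not from the talk or any real API): each task declares the minimum scopes it needs, and anything beyond that allow-list is refused rather than silently granted.

```python
# Toy sketch of the Principle of Least Privilege (POLP): grant a task only
# the scopes it is entitled to, and refuse any over-broad request outright.

# Hypothetical allow-list: the minimum scopes each task genuinely needs.
TASK_SCOPES = {
    "read_feed":   {"feed:read"},
    "post_update": {"feed:read", "feed:write"},
}

def grant(task, requested):
    """Return the granted scopes; raise if the request exceeds the allow-list."""
    allowed = TASK_SCOPES.get(task, set())
    excess = set(requested) - allowed
    if excess:
        raise PermissionError(f"{task} asked for more than it needs: {excess}")
    return set(requested) & allowed

print(grant("read_feed", {"feed:read"}))  # minimal request: granted
# grant("read_feed", {"feed:read", "contacts:read"})  -> raises PermissionError
```

  The same shape applies at home: an app that “needs” your contacts to show you videos is the `excess` set, and the least-privilege answer is to refuse the install.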
  82. Sidebar: popular (& obscure) apps pump data to 3rd parties. “You Give Apps Sensitive Personal Information. Then They Tell Facebook.” Wall Street Journal testing reveals how the social-media giant collects a wide range of private data from developers; “This is a big mess.” By Sam Schechner and Mark Secada, Feb. 22, 2019. Article (paywalled), summarized on Twitter by Mark Schoofs, USC Annenberg School: “Facebook sweeps up sensitive data — including heart rate and when a woman is having her period — from top phone apps. And users have no way to opt out.” https://www.wsj.com/articles/you-give-apps-sensitive-personal-information-then-they-tell-facebook-11550851636 https://twitter.com/SchoofsFeed/status/1098999479141752832
  83. What to do #9: Exit or Voice (or Loyalty). You have a choice: be loyal, leave, or use your voice. The Sleeping Giants movement started with a single individual and has had enormous influence by draining advertisers away from toxic platforms like Fox News and Breitbart. The Amnesty International model is straightforward, effective, and achievable. The Greenpeace playbook works: write a letter; get 3 friends to write letters; sign petitions… You just have to act. You have to know your s**t, and do. https://en.wikipedia.org/wiki/Exit,_Voice,_and_Loyalty
  84. What to do #10: Be like RPG and Ravelry. RPG, a venerable role-playing community, and Ravelry, a community and marketplace for knitters and fiber artists, both banned pro-Trump speech from their sites (in 2018 and 2019, respectively). “We cannot provide a space that is inclusive of all and also allow support for open white supremacy.” https://www.ravelry.com/content/no-trump https://www.youtube.com/watch?v=6ValJMOpt7s https://www.vice.com/en_us/article/bj4wkq/one-of-the-oldest-online-rpg-communities-banned-pro-trump-speech
  85. What to do #10: Be like RPG and Ravelry. https://twitter.com/yehudi_eyif/status/1145263347568467969
  86. What to do about the dark side of social media: 1. Know your s**t. 2. Take responsibility for informing your community. 3. Don’t put your eggs in one basket. 4. Have a halves-and-doubles mindset. 5. Evaluate the platforms you use. 6. Become an “information fiduciary”. 7. Model the behaviors of civic discourse. 8. Use “POLP” at work and home. 9. Exit or Voice (or Loyalty). 10. Be like RPG and Ravelry.
  88. https://www.ted.com/talks/ed_gavagan_a_story_about_knots_and_surgeons/transcript?language=en “And so I just think, they got their lecture to go to. I step off, I’m standing on the platform, and I feel my index finger in the first scar that I ever got, from my umbilical cord, and then around that, is traced the last scar that I got from my surgeon, and I think that, that chance encounter with those kids on the street with their knives led me to my surgical team, and their training and their skill and, always, a little bit of luck pushed back against chaos.”
  89. Thank you! Michael Peter Edson | @mpedson. Let’s Get Real conference, London, 2 March 2020.
