5. Reducing the spread of misinformation on social media
must be efficient, timely, and consistent
● Semantic dissonance detection (ex. pause before allowing retweet)
● Fact-checking from knowledge bases (ex. Hoaxy.com)
● Fact-checking using crowd-sourcing (ex. Snopes.com)
● Multimedia false information detection (ex. DeepFake Authenticator)
● Bridging echo-chambers (ex. EscapeYourBubble extension)
● Detection and takedown of false information (ex. Bot Sentinel)
● “Vaccinating” against misinformation (ex. exposing viewers to the scientific consensus on climate change first reduces climate change denialism in the “vaccinated” population)
(Kumar and Shah, 2018)
6. Experimental designs
● Game theory: mediate emergent collective outcomes in voluntary vaccination schemes by providing epidemic prevalence information in networks (Sharma et al., 2019).
● Strategic node placement: create a majority illusion in the network to reach a tipping point (~25%) that triggers behavioral change in the population (Centola, 2018).
● Create homogeneous networks conducive to large-scale coordination (Piedrahita et al., 2018).
● Algorithmic governance: use AI to detect and remove hostility and incivility from online interactions.
(Sharma et al., PLoS Computational Biology, 2019)
9. Network science: network structures and properties
and actor/tie characteristics influence spread
● Random (Erdős-Rényi model: rarely observed in real-world networks)
● Small-world (Watts-Strogatz model: “six degrees of separation”)
● Scale-free (Barabási-Albert model)
(Peterson et al., 2021; Fix, 2016)
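The three models above can be generated and contrasted in a few lines. This is a minimal sketch assuming the networkx library is available; the parameters (n, p, k, m) are illustrative toy values, not taken from the cited studies. The comparison of maximum degrees illustrates why scale-free networks contain highly connected "hub" nodes that matter for spreading.

```python
# Sketch: the three canonical network models from the slide (networkx assumed).
import networkx as nx

n = 1000  # nodes in each toy network

# Random (Erdős-Rényi): every pair connected independently with probability p
er = nx.erdos_renyi_graph(n, p=0.01, seed=1)

# Small-world (Watts-Strogatz): ring lattice with k neighbours, rewired with probability 0.1
ws = nx.watts_strogatz_graph(n, k=10, p=0.1, seed=1)

# Scale-free (Barabási-Albert): preferential attachment, m edges per new node
ba = nx.barabasi_albert_graph(n, m=5, seed=1)

for name, g in [("random", er), ("small-world", ws), ("scale-free", ba)]:
    degrees = [d for _, d in g.degree()]
    # Scale-free networks show a much larger maximum degree (hubs)
    print(f"{name:12s} max degree = {max(degrees):4d}  mean = {sum(degrees) / n:.1f}")
```

All three networks here have roughly the same mean degree (~10), so the difference in maximum degree isolates the effect of the attachment mechanism rather than network density.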
10. Concepts in network science
● Homophily
● Complex contagion
● Information cascade
● Network interventions
(Leskovec, Adamic, and Huberman, 2008)
13. Concepts in network science
Homophily, complex contagion, and information cascades, interacting with the way social media is designed, create problems like “echo chambers”, “polarization”, and “filter bubbles”.
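Complex contagion, unlike simple contagion, requires reinforcement: a node adopts only after a sufficient fraction of its neighbours has adopted. A minimal pure-Python sketch of a threshold model on a hypothetical five-node network shows why a single seed often fails where two reinforcing seeds trigger a cascade:

```python
# Sketch: complex contagion via a fractional threshold model (toy network,
# hypothetical node names; a node adopts once >= 50% of neighbours adopted).
neighbours = {
    "a": ["b", "c"], "b": ["a", "c", "d"], "c": ["a", "b", "d"],
    "d": ["b", "c", "e"], "e": ["d"],
}

def spread(seeds, threshold=0.5):
    adopted = set(seeds)
    changed = True
    while changed:
        changed = False
        for node, nbrs in neighbours.items():
            if node in adopted:
                continue
            share = sum(n in adopted for n in nbrs) / len(nbrs)
            if share >= threshold:
                adopted.add(node)
                changed = True
    return sorted(adopted)

print(spread({"a"}))       # → ['a']  (a single contact is not enough)
print(spread({"a", "b"}))  # → ['a', 'b', 'c', 'd', 'e']  (reinforcement cascades)
```

A simple contagion (threshold of one contact) would spread from the single seed; the need for social reinforcement is what makes clustered, homophilous networks so effective at propagating behaviors and beliefs.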
14. Network interventions to disrupt information flow and
reduce exposure to misinformation
1. Examine the network structure and properties of misinformation flow on different
social media platforms
2. Keep a pulse on misinformation typology and sentiment using ready-made tools
3. Collaborate with social media platforms to correct recommender systems’ collaborative
filtering
4. Conduct longitudinal experimental designs to examine vaccine message engagement
15. Visualizing vaccine-related tweets in Ontario, 2013-2016
● “Jenny McCarthy’s anti-vaccine views = misinformation. Please ask The View to change their mind,” July 22, 2013
● “Reality Check: CDC Scientist Admits Data of Vaccines and Autism Was Trashed,” Mar 4, 2016
● Ottawa’s vaccine-free daycare, Feb 7, 2015
(Song, 2018)
17. Online communities retrieve information from different sources
● https://www.youtube.com/watch?v=K-4LkJfEBDw
● Getman example (Getman et al., Health Education, 2017)
19. NOT ALL INFORMATION IS DISSEMINATED EQUALLY
In a study of historical Twitter data from 2006-2017, covering about 126,000 stories shared by 3 million people, messages that evoked surprise and disgust were shared farther, wider, and deeper than messages that evoked sadness and joy.
(Vosoughi et al., Science, 2018)
20. Semantic network analysis of 26,389 vaccine-related tweets in the US (Kang et al., Vaccine, 2017)
● High-degree keywords (node size) are the most central ideas in the narrative.
● High-closeness keywords provide cohesiveness to the narrative.
● High-betweenness keywords bridge ideas from one cluster to another.
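The three keyword centralities described above can be computed with networkx. This sketch uses a small hypothetical keyword co-occurrence graph, not the Kang et al. dataset; the edges and keywords are illustrative only.

```python
# Sketch: degree, closeness, and betweenness centrality on a toy
# keyword co-occurrence network (networkx assumed; hypothetical data).
import networkx as nx

edges = [("vaccine", "autism"), ("vaccine", "safety"), ("vaccine", "cdc"),
         ("autism", "mercury"), ("safety", "children"), ("cdc", "study"),
         ("study", "children"), ("vaccine", "children")]
g = nx.Graph(edges)

degree = nx.degree_centrality(g)            # most central ideas (node size)
closeness = nx.closeness_centrality(g)      # cohesiveness of the narrative
betweenness = nx.betweenness_centrality(g)  # bridges between clusters

for name, scores in [("degree", degree), ("closeness", closeness),
                     ("betweenness", betweenness)]:
    top = max(scores, key=scores.get)
    print(f"top {name:11s} keyword: {top} ({scores[top]:.2f})")
```

On real tweet data the co-occurrence edges would typically be weighted by how often two keywords appear in the same tweet, but the centrality calls are the same.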
21. Network interventions to disrupt information flow and
reduce exposure to misinformation
1. Examine the network structure and properties of misinformation flow on different social
media platforms
2. Keep a pulse on vaccine sentiment and misinformation typologies at the local level
3. Collaborate with social media platforms to correct recommender systems’ collaborative
filtering
4. Conduct longitudinal experimental designs to examine vaccine message engagement
23. Network interventions to disrupt information flow and
reduce exposure to misinformation
1. Examine the network structure and properties of misinformation flow on different social
media platforms
2. Keep a pulse on misinformation typology and sentiment
3. Work with social media platforms to correct recommender systems’ collaborative
filtering
4. Conduct longitudinal experimental designs to examine vaccine message engagement
24. “The top 50 Facebook pages ranked by the number of public posts they made about vaccines generated nearly half (46 percent) of the top 10,000 posts for or against vaccinations” (Madrigal, The Atlantic, 2019).
26. Research Questions
1. What is the prevalence of pro-, neutral- and anti-vaccine videos on YouTube?
2. Are anti-vaccine videos more likely to be recommended than pro-vaccine
videos?
3. Do we observe a homophily effect among pro- and anti-vaccine videos? For
example, do pro-vaccine videos tend to recommend more pro-vaccine videos
and vice versa?
Gruzd, Abul-Fottouh, Song
27. Methodology: Dataset - Seed Videos
● Using Facebook’s CrowdTangle platform, we retrieved 8,549 public Facebook posts linking to YouTube videos in June 2020.
● Search keywords: vaccine OR vaccines OR vaccination OR vaxx OR vaxxed OR immunization
● The CrowdTangle database covers 3.5M+ public Facebook pages (with >100K likes), groups, and verified profiles.
29. Methodology: Dataset - Seed Videos
● Using Excel, we pre-processed the dataset down to 539 “seed” videos that are (1) vaccine-related, (2) in English, and (3) still publicly available at the time of data collection on July 4.
30. Methodology: Dataset - Recommended Videos
● Using the “seed” videos, we retrieved metadata for video recommendations based on YouTube’s recommender system.
● In total, we retrieved 3,058 recommended videos.
● For this step, we used YouTube Data Tools.
31. Methodology: Dataset - Vaccine Sentiment Coding
● With the help of 8 public health research assistants, we manually coded videos as pro-, anti-, or neutral.
● Intraclass correlation coefficients were above the recommended threshold of 0.7 for all four sets of datasets: 0.710, 0.765, 0.780, 0.772 (based on SPSS).
Sample dataset with vaccine sentiment labels as assigned by a coder
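The reliability check above can be illustrated with one common intraclass correlation variant, ICC(3,1) consistency; the slide reports SPSS output, which may use a different ICC model, and the ratings below are hypothetical.

```python
# Sketch (pure Python, hypothetical codes): ICC(3,1) consistency, a common
# inter-rater reliability measure for labels like pro/anti/neutral sentiment.
def icc_3_1(ratings):
    """ratings: list of rows, one row per video, one column per coder."""
    n = len(ratings)     # subjects (videos)
    k = len(ratings[0])  # raters (coders)
    grand = sum(sum(row) for row in ratings) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    col_means = [sum(row[j] for row in ratings) / n for j in range(k)]
    ss_total = sum((x - grand) ** 2 for row in ratings for x in row)
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)   # between-subject
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)   # between-rater
    msr = ss_rows / (n - 1)
    mse = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))  # residual
    return (msr - mse) / (msr + (k - 1) * mse)

# Hypothetical codes: 1 = pro, 2 = neutral, 3 = anti; two coders, mostly agreeing
codes = [[1, 1], [3, 3], [2, 2], [1, 2], [3, 3], [2, 2], [1, 1], [3, 2]]
print(round(icc_3_1(codes), 3))  # → 0.8
```

Values above ~0.7 are conventionally treated as acceptable reliability, which is the threshold the slide references.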
32. Preliminary Results: Video Engagement
● Pro- and neutral-vaccine videos are more likely to be viewed than anti-vaccine videos (p < .001).
[Chart: views by video sentiment: PRO, ANTI, NEUTRAL, other]
33. Preliminary Results: Video Engagement
● Anti-vaccine videos are more likely to be “liked” than pro- or neutral-vaccine videos (p < .001).
[Chart: likes by video sentiment: PRO, ANTI, NEUTRAL, other]
34. Method: Social Network Analysis and Visualization
● Number of nodes: 3,058 (videos)
● Number of edges: 27,383 (= YouTube recommendations)
○ Example: A -> B: if you view video A, YouTube will likely recommend video B
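The recommendation network described above can be modeled as a directed graph. This sketch assumes networkx and uses hypothetical video IDs rather than the study's 3,058 videos; the reciprocity measure previews the reciprocity effect reported later.

```python
# Sketch: recommendations as a directed graph, where an edge A -> B means
# "viewing A leads YouTube to recommend B" (networkx assumed; toy data).
import networkx as nx

g = nx.DiGraph()
# Hypothetical recommendations: (source video, recommended video)
g.add_edges_from([
    ("pro_1", "pro_2"), ("pro_2", "pro_1"),      # reciprocated pair
    ("anti_1", "anti_2"), ("anti_2", "anti_3"),
    ("neutral_1", "pro_1"), ("anti_3", "anti_1"),
])

print("nodes:", g.number_of_nodes(), "edges:", g.number_of_edges())
# Reciprocity: share of edges whose reverse edge also exists
print("reciprocity:", nx.reciprocity(g))
```

In the actual study, each node would also carry attributes (sentiment label, video category) used as covariates in the network models.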
36. Preliminary Results: LOLOG Models
1. Clear effect of homophily based on video sentiment (p < .001)
2. Clear effect of homophily based on video category (p < .001)
3. No significant difference between pro/neutral and anti videos in the likelihood of being recommended (p = .904/.667, respectively)
4. Pro/neutral videos likely recommend more “diverse” videos than anti (p < .001 for both)
5. Reciprocity effect (p < .001)
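As a rough proxy for the homophily effects the LOLOG models test, attribute assortativity measures whether recommendation edges stay within a sentiment group. This is a networkx sketch on a toy graph, not a re-analysis of the study's data.

```python
# Sketch: sentiment homophily as attribute assortativity
# (networkx assumed; hypothetical videos and edges).
import networkx as nx

g = nx.DiGraph()
g.add_edges_from([("p1", "p2"), ("p2", "p3"), ("p3", "p1"),   # pro cluster
                  ("a1", "a2"), ("a2", "a3"), ("a3", "a1"),   # anti cluster
                  ("p1", "a1")])                              # one cross-edge
sentiment = {n: ("pro" if n.startswith("p") else "anti") for n in g.nodes}
nx.set_node_attributes(g, sentiment, "sentiment")

# Near +1 when edges stay within a sentiment group (homophily),
# near -1 when edges mostly cross groups
r = nx.attribute_assortativity_coefficient(g, "sentiment")
print(round(r, 3))
```

Assortativity is a descriptive summary; the LOLOG models in the study go further by estimating homophily while controlling for other network effects such as reciprocity.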
37. Discussion
1. Are there any incentives for Facebook and YouTube to combat COVID- and/or vaccine-related misinformation?
2. What interventions/models can be deployed to effectively take down misinformation beyond fact-checking, flagging, and muting?
3. What stakeholders are at play in the decision-making processes to regulate/penalize vaccine misinformation spreaders?
4. What are the societal, ethical, and legal implications of spreading COVID-vaccine misinformation?
5. Can the intentional spread of false information be treated as a violation of human rights?
38. Network interventions to disrupt information flow and
reduce exposure to misinformation
1. Examine the network structure and properties of misinformation flow on different social
media platforms
2. Keep a pulse on misinformation typology and sentiment using ready-made tools
3. Collaborate with social media platforms to correct recommender systems’ collaborative
filtering
4. Conduct longitudinal experimental designs to examine vaccine message engagement
41. The battle to stop vaccine hesitancy happens here.
Most of our strategies for countering anti-vaccine communication focus on in-clinic settings, inside “the fortress”.
42. Future considerations
● Research:
○ Establish research capacities in public health to collect, code, and interpret how content is being shared on various social media platforms.
○ Evaluate the effectiveness of digital methods by creating different types of content (e.g., dynamic vs. static) and measuring informational spread.
● Practice:
○ Promote the use of dynamic content (images, GIFs, and videos).
○ Monitor how, where, and how frequently misinformation promulgators connect with their communities.
○ Document case studies across the Province to inform social media practice in support of new Public Health Standards.