1. “Social Media, Self-Segregation and
the Spiral of Silence”
Robert Bodle, PhD (USC)
Associate Professor of Communication and New
Media Studies, Mount St. Joseph University
@robertbodle
2. overview
• News on Twitter & Facebook
• Self-segregation, Algorithmic Filtering, &
Ferguson
• The Spiral of Silence & Snowden-NSA
Revelations
• What my students say
• Implications (for news exposure)
4. Other key findings . . .
Differences between Twitter and FB users:
1) Twitter news users more likely (than
those on FB) to report seeing news about:
•national government and politics (72% vs.
61%),
•international affairs (63% vs. 51%),
•business (55% vs. 42%)
•sports (70% vs. 55%)
2) Growth of Twitter news use cuts across
every demographic group; however FB
news use skews toward younger users
3) Re: news about gov and politics -
more people on Twitter directly follow
news outlets than on FB (46% vs. 28%)
5. Where was Ferguson in my FB feed?
Algorithmic filtering via
procedural inputs:
- popularity, relevance,
recency
“A complex interplay between the
platform’s designs and your own
behavior” (Jesse Holcomb, 2014)
“Facebook’s algorithm tends
toward more positive items in
the newsfeed” (Martin, 2014;
Casey, 2014).
“Facebook Wants You to
Be Happy” (Martin, 2014)
6. “Facebook is for Ice Buckets, Twitter
is for Ferguson” (McDermott, 2014)
• Roughly 8x more
posts about
Ferguson than
ALS Ice Bucket
Challenge
• However the
stories received
roughly the same
exposure on
Facebook
• Ice Bucket
challenge more
heavily promoted
by FB’s algorithms
(by a factor of 8)
than Ferguson.
7. Other reasons for news disparity
• Inability to search
Facebook
• Non-reciprocal
following makes it easier
to diversify whom to
follow on Twitter
• Homophily - 75%
of white Americans
have entirely white
social networks.
8. Social Media & the “Spiral of Silence”
(Noelle-Neumann, 1974)
• Previous ‘spiral of silence’
findings offline also apply to
social media users.
• People less willing to discuss
the Snowden-NSA story on
social media than in person
(42% vs. 86%).
• Social media did not provide an
alternative discussion platform
for those reluctant to discuss
the issues in person.
• Social media users 0.74 times
as likely to discuss the
Snowden-NSA story F2F
The study asked participants about
Edward Snowden’s revelations
of NSA spying
9. How might social media help contribute to
your own self-segregation and spiral of
silence?
“I know from my experience that the people I
am "friends" with are not the people who I want
to share my viewpoint with. My dad's side of
the family is older southern Baptists and they
have vastly different viewpoints on the world
than I do. I know for a fact that I don't
want to talk about Ferguson on my
Facebook with them because they only
see one side of the story.”
10. My students
“I usually choose to remain quiet because I do
not want to offend others, but also because I
am worried about receiving some of that
vitriol. As a result, I usually keep my thoughts
to myself, but I also do not tweet that much
about issues outside of my general interests. I
try to maintain my Twitter feed as funny (to
me) little jokes about sports or things I see on
the Internet and look to be inoffensive with
what I post.”
11. My students (cont.)
“When there are major social issues or news events, a
common belief begins to appear amongst many
prominent people on Twitter . . . Once that argument is
established, you will see many people choose to not
comment on the story or if they do, they will often be
bombarded with negative feedback. As a result, people
choose to sit out the discussion over these issues or
they resort to simply being trolls on the platform.
I believe that has become more common . . . people feel
uncomfortable sharing views that are unpopular or
contrarian because they fear the reactions of those they
are speaking to. I think there is less diversity
amongst opinions being shared now and it is
problematic that groupthink is becoming so widespread.”
13. “Social Media, Self-Segregation and
the Spiral of Silence ”
Robert Bodle, PhD (USC)
Associate Professor of Communication and New
Media Studies, Mount St. Joseph University
robert.bodle@msj.edu
@robertbodle
Thank you for your attention.
Editor's Notes
Hello, my name is Robert Bodle. Great to be with you. I am happy to be on the panel Innovative and Unexpected Uses of Social Media, sponsored by the Communication and Future Division, with colleagues from Miami University.
I teach at Mount St. Joseph University in Cincinnati OH, and as an Adjunct at Miami University, in Oxford OH. I am also (an outgoing) Co-Chair of the Internet Rights and Principles Coalition under the UN’s Internet Governance Forum. Over the last ten years I have researched the human rights implications of sociotechnical systems like social network sites, focusing primarily on Privacy and Freedom of Expression.
I have written previously about Predictive Algorithms and Personalization Services on Social Network Sites (implications for users and society), featured in this collection.
In this chapter I specifically look at the broader implications of algorithmic filtering on Facebook, but today I’d like to share some related insights that arose from teaching New Media courses, recognizing my students’ evolving “info-diet”--as Howard Rheingold puts it--of news and information, and share some studies that elaborate on trends in news consumption in the digital age.
NEXT SLIDE
I’ll first look at the evolving role of News on Twitter and Facebook
Two case studies 1) coverage of Michael Brown’s death and related events in Ferguson Missouri
2) The Snowden-NSA revelations
A recent study by Facebook’s own Research Team
And some insights from my own classes and the implications for future news and information dissemination
NEXT SLIDE
--
Personalization is when online content conforms to the prior actions of the user
in an algorithmically generated feedback loop, also known as mass customization.
These services can be quite convenient and include:
Google’s personalized search,
behavioral advertising,
featured recommendations on Amazon.com,
taste preferences on Netflix,
headlines on Yahoo! News,
Twitter Trends,
and Facebook’s News Feed rankings
Personalization services depend on tracking users’ browsing histories, purchasing data, and social media production
so it comes at a price – user privacy, autonomy, and freedom
Personalization poses difficulties for research due to the opacity of the underlying technological processes
- the proprietary black box of Facebook’s algorithms, which indicates asymmetrical power relations between Facebook and its users – we don’t really know how it works.
Yet we can identify the underlying market logic of personalization, which is to provide an environment that can guide user interactions toward commercial ends:
The goal of personalization on Facebook (evident in public remarks by the company) is to encourage people to share and engage more with their friends,
to produce even more information to sell us (we are the product) for highly targeted ads.
According to a recent poll (July 14, 2015) conducted by the Pew Research Center, the share of Americans for whom Twitter and Facebook serve as a source of news is continuing to rise.
Each platform serves as a source for news, with differences in news distribution strengths – for example, more people follow Twitter for breaking news (as it happens coverage and commentary on live events) than on FB.
Other key findings include:
NEXT SLIDE
Other key findings include:
Platforms are roughly comparable for the remaining seven topics covered: people and events in your community, local weather and traffic, entertainment, crime, local government, science and technology, and health and medicine
News use growth among all users on Twitter, and mostly younger users on FB
People on Twitter more likely to directly follow news organizations, reporters, or commentators; however, FB users are more interactive, commenting on news posts
So–
1) Twitter use more important for national and international news,
2) Twitter news use growth demographically even, whereas more young people taking to FB for news,
3) and more Twitter users follow news sources directly (as opposed to friends and family).
Final thought – as platforms recognize and adapt their role in the news environment, they will begin to offer unique features that may influence changes in news use.
One key feature would be personalization algorithms for personalized news feeds, which might explain why many people saw the Ice Bucket Challenge in their FB news feed but no sign of the Ferguson story
NEXT SLIDE
Last year many observers noticed differences in the news content of their Facebook and Twitter feeds
-for example many people saw many posts of the Ice Bucket Challenge to raise funds for the fight against Lou Gehrig’s disease in their FB news feed and nothing about Ferguson MO.
Sociologist Zeynep Tufekci suggests this delayed reaction on FB is due to the platform’s algorithmic filtering: the platform filters out stories based on our own preferences and behavior, so that the news we see results from
“a complex interplay between the platform’s designs and your own behavior” (Jesse Holcomb, 2014)
Algorithmic filtering can be understood as a personalization service that curates content based on our own prior interests, activities, and interactions; however, our own activities and interactions can be structured and shaped in a way that can filter out or minimize news exposure about important events.
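The procedural inputs named on slide 5 (popularity, relevance, recency) can be sketched as a toy feed ranker. Everything below is hypothetical for illustration: the weights, the decay function, and the example posts are invented, and Facebook's actual ranking is proprietary and far more complex.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int        # popularity signal
    affinity: float   # relevance proxy: 0-1, how often the viewer interacts with this author
    hours_old: float  # recency signal

def score(post, w_pop=1.0, w_rel=50.0, w_rec=30.0):
    """Hypothetical weighted sum of popularity, relevance, and recency."""
    recency = 1.0 / (1.0 + post.hours_old)  # decays as the post ages
    return w_pop * post.likes + w_rel * post.affinity + w_rec * recency

def rank_feed(posts):
    """Order posts by score; lower-scoring items effectively sink out of view."""
    return sorted(posts, key=score, reverse=True)

feed = rank_feed([
    Post("hard news from an outlet you rarely engage with", likes=40, affinity=0.05, hours_old=6.0),
    Post("viral challenge video from a close friend", likes=40, affinity=0.90, hours_old=6.0),
])
```

With equal likes and equal age, the higher-affinity post from a friend ranks first, which illustrates how a behavior-driven relevance signal can crowd hard news out of the feed without any editorial decision being made.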
Carla D. Martin, on her course blog Race and Technology at Harvard, suggests that FB’s algorithm may hide the opinions of those with whom we disagree, that the algorithm tends toward more positive items, and that FB wants you to be happy.
NEXT SLIDE
Ethan Zuckerman, co-founder of Global Voices and director of the Center for Civic Media at MIT, suggests that the slow uptake for the story on FB as opposed to the early coverage on Twitter could be a result of algorithmic censorship – citing analysis by John McDermott of Digiday (Digital marketing firm in NY),
That although there were roughly eight times as many stories about events in Ferguson posted to Facebook as stories about the ALS Ice Bucket challenge, the stories received roughly the same exposure.
The average story about the Ice Bucket Challenge was much more heavily promoted by Facebook’s algorithms (a factor of 8x) than the average story about Ferguson
Again to account for the disparity FB’s algorithm gave the Bucket Challenge story more prominence by its placement at the top of the Newsfeed, and thus censored exposure to Ferguson (thereby posing a danger to democratic discourse)
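The per-post arithmetic implied by McDermott's figures can be checked directly. The absolute counts below are invented for illustration; only the roughly 8x ratios come from the cited analysis.

```python
# Illustrative counts (invented); only the ~8x ratios come from McDermott's analysis.
ferguson_posts = 8_000      # roughly 8x as many posts about Ferguson...
ice_bucket_posts = 1_000
story_exposure = 1_000_000  # ...yet each story got roughly the same total exposure

per_post_ferguson = story_exposure / ferguson_posts      # 125.0 views per post
per_post_ice_bucket = story_exposure / ice_bucket_posts  # 1000.0 views per post

# The average Ice Bucket post was promoted ~8x more than the average Ferguson post.
promotion_ratio = per_post_ice_bucket / per_post_ferguson
```

Equal total exposure spread over eight times as many posts is exactly what "promoted by a factor of 8 per story" means here.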
But there are other factors having to do with the design and use of the platform, suggests Zuckerman –
Inability to search FB, whereas easy to find on Twitter
Non-reciprocal following makes it easier to diversify who you follow on Twitter (and use of hashtags)
People have a small # of friends on FB (average 200), and many white American Facebook users likely have few or no African-American Facebook friends, according to a 2013 American Values Survey.
AND Facebook’s algorithms support our own self-segregation by race, class, etc. by favoring stories that our similar friends are sharing. FB is not causing this behavior but helping to shape it.
Another reason for this disparity could be
NEXT SLIDE
According to a report “Social Media and the Spiral of Silence” by researchers at Pew Research Center and Rutgers University
The study set out to investigate the effect of the Internet on the so-called spiral of silence
– a theory that people are less likely to express their views if they believe them to differ from those of their friends, family, and colleagues, and that they will silence themselves rather than express divergent or unpopular opinions.
The study asked participants about the revelations of government spying made by Edward Snowden, and found that (read above)
The most damning thing about this study is the suggestion that those who use social media (Facebook and Twitter) regularly are more reluctant to express dissenting views in the offline world
And that the spiral of silence is not merely mirroring the offline world: the Internet is silencing many of us even beyond our reluctance to express dissenting views in person, face to face.
NEXT SLIDE
Predictive analytics reveals a fundamental power asymmetry and inequality between users and FB
FB has unprecedented access to and control over user data, whereas people have little power over their own data, how it is gathered, and how it is being used
(with myriad implications for user privacy, freedom, and autonomy)
As users become more transparent and understandable, Facebook’s data-handling practices grow more sophisticated and opaque;
People have no clue their Facebook feeds are being culled, curated, and filtered.
As someone mentioned this morning, without freedom of expression, you can’t be radical, you can’t be anything.
Political theorists see Facebook’s relevancy filtering as a way to conceal rivalrous information and diverse points of view,
which can contribute to polarization of political beliefs – an extension of the segmentation and social sorting already taking place in physical communities.
Algorithms and personalization filters can filter out ideology along with offensive content, screening out different points of view as well as incivility,
Social segmentation through personalized ads can also reinforce boundaries between social groups
And deepen social divisions based on differences between people based on race, class, age, education, location, and political ideology.
Increased social divisions are likely to result in a less tolerant society, where stereotypes, prejudice, suspicion, and hatred of the other can flourish.
Algorithmic inferences about people can reinforce traditional patterns of discrimination in the marketplace, where some groups are not valued as highly as others.
Big data can be used instrumentally not only to predict behavior but also to coerce it, directing consumer behavior and political belief
Asymmetrical power and inequality
Surveillance - informational privacy and safety, autonomy, volition
Censorship; algorithmic gate-keeping (Morozov, 2012)
Group polarization and deliberative democracy (Bollinger, 2010; Benkler, 2001; Pariser, 2011; Sunstein, 2009)
Homophily and intolerance (Nakamura, 2002)
Social segmentation and discrimination (Gandy, 2003; McStay, 2013; Sweeney, 2013; Terranova, 2004)
Scientificity (Schroeder, 2013)
Coercion (Mayer-Schonberger & Cukier, 2013)