Echo Chambers? Filter Bubbles?
Reviewing the Evidence
Prof. Axel Bruns
Guest Professor, IKMZ, University of Zürich
a.bruns@qut.edu.au — a.bruns@ikmz.uzh.ch
Journalism and Social Media
• Last week:
• The emerging digital public sphere(s) in the contemporary media environment
• This week:
• Are these publics connected, or fragmented into echo chambers and filter bubbles?
• Next week:
• What power do the major platforms have over the emerging media ecosystem?
@qutdmrc
https://www.wiley.com/en-us/Are+Filter+Bubbles+Real%3F-p-9781509536443
The Concerns
Obama, B.H. (2017)
• Barack Obama’s farewell address:
• “For too many of us it’s become safer to retreat into our own bubbles, whether in our neighborhoods, or on college campuses, or places of worship, or especially our social media feeds, surrounded by people who look like us and share the same political outlook and never challenge our assumptions.”
• Nicholas Negroponte: Daily Me (1995)
• Cass Sunstein: echo chambers (2001, 2009, 2017, …)
• Eli Pariser: filter bubbles (2011)
(https://edition.cnn.com/2017/01/10/politics/president-obama-farewell-speech/index.html, 11 Jan. 2017)
(https://www.politico.com/news/2021/01/11/guards-lament-helplessness-capitol-riot-457877)
(https://www.reuters.com/world/us/53-republicans-view-trump-true-us-president-reutersipsos-2021-05-24/)
Can we simply blame our platforms and their algorithms?
Filter bubbles?
Echo chambers?
(https://commons.wikimedia.org/wiki/File:Eli_Pariser,_author_of_The_Filter_Bubble_-_Flickr_-_Knight_Foundation.jpg)
Bubble Trouble
• Echo Chambers? Filter Bubbles?
• Where exactly?
• General search engines
• News search engines, portals, and recommender systems
• Social media (but where – profiles, pages, hashtags, groups …?)
• What exactly?
• Hermetically sealed information enclaves full of misinformation?
• Self-reinforcing ideological in-groups of hyperpartisans?
• Politically partisan communities of any kind?
• Why exactly?
• Ideological and societal polarisation amongst citizens?
• Algorithmic construction of distinct and separate publics?
• Feedback loop between the two?
• Defined how exactly?
• Argument from anecdote and ‘common sense’, rather than empirical evidence
• Promoted by non-experts (Sunstein: legal scholar; Pariser: activist and tech entrepreneur)
Image: Midjourney
Echo Chambers? Filter Bubbles?
• What even are they?
• Definitional uncertainty, despite (or because of) Sunstein and Pariser
• Vague uses especially in mainstream discourse, often used interchangeably
• Both fundamentally related to underlying network structures
• Fundamental differences:
• Echo chambers: connectivity, i.e. closed groups vs. overlapping publics
• Filter bubbles: communication, i.e. deliberate exclusion vs. widespread sharing
(Diagrams: echo chamber vs. filter bubble network structures)
Working Definitions
• An echo chamber comes into being where a group of participants choose to preferentially connect with each other, to the exclusion of outsiders. The more fully formed this network is (that is, the more connections are created within the group, and the more connections with outsiders are severed), the more isolated the group is from the introduction of outside views, while the views of its members are able to circulate widely within it.
• A filter bubble emerges when a group of participants, independent of the underlying network structures of their connections with others, choose to preferentially communicate with each other, to the exclusion of outsiders. The more consistently they exercise this choice, the more likely it is that participants’ own views and information will circulate amongst group members, rather than any information introduced from the outside.
• Note that these patterns are determined by a mix of algorithmic curation and shaping, and personal choice. (One way such network closure might be quantified is sketched below.)
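To make the echo chamber definition above concrete, here is a minimal Python sketch (using the networkx library and an invented toy network; the group labels and edges are hypothetical, not data from any study cited here). It counts how many ties stay inside a group versus cross its boundary and reports the Krackhardt & Stern E-I index, one common way of expressing how “fully formed”, i.e. closed, such a network is:

```python
# Illustrative sketch only: how closed is a group's connection network?
# Toy data and hypothetical group labels, not results from any cited study.
import networkx as nx

# Hypothetical follow/friendship ties; each node belongs to group "A" or "B".
edges = [
    ("a1", "a2"), ("a2", "a3"), ("a1", "a3"), ("a3", "a4"),  # internal to group A
    ("b1", "b2"), ("b2", "b3"), ("b1", "b3"),                 # internal to group B
    ("a4", "b1"),                                             # a single cross-group tie
]
group = {"a1": "A", "a2": "A", "a3": "A", "a4": "A",
         "b1": "B", "b2": "B", "b3": "B"}

G = nx.Graph(edges)

internal = sum(1 for u, v in G.edges if group[u] == group[v])
external = G.number_of_edges() - internal

# Krackhardt & Stern's E-I index: (external - internal) / total ties.
# -1.0 means fully closed groups (strong echo chamber structure); +1.0 means
# every tie crosses group lines (fully overlapping publics).
ei_index = (external - internal) / G.number_of_edges()
print(f"internal: {internal}, external: {external}, E-I index: {ei_index:.2f}")
```

A filter bubble, by contrast, would be assessed on the same logic but applied to actual communication flows (who shares with or replies to whom) rather than to the underlying connection network.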
The Evidence
Echo Chambers and Filter Bubbles in Social Media
• Early blogosphere studies:
• Strong U.S. focus
• Polarisation and ‘mild echo chambers’
• E.g. Adamic & Glance (2005)
• Social media studies:
• Especially Twitter, less research on Facebook or other platforms
• Hashtag / keyword datasets
• ‘Open forums and echo chambers’
• Significant distinctions between @mention, retweet, follow networks
• And between lead users and more casual participants
• E.g. Williams et al. (2015)
Figures: Adamic & Glance (2005); Williams et al. (2015)
And Yet…
Figures: Pew Center (2016); Süddeutsche Zeitung (2017); Bruns et al. (2017)
• Social media surveys:
• Users do encounter counter-attitudinal political views in their networks …
• … to the point of exhaustion
• E.g. Pew Center (2016)
• Broader network mapping:
• Political partisans share similar interests (except for the political fringe)
• E.g. Süddeutsche Zeitung (2017)
• Comprehensive national studies:
• Whole-of-platform networks show thematic clustering, but few fundamental disconnections
• E.g. Bruns et al. (2017)
New Evidence on Filter Bubbles in Search
• Mid-scale tests:
• No personalised filter bubbles in search results for U.S. politicians
• 41 of 47 outlets recommended to conservatives and liberals
• Five dominant news sources: almost too much uniformity
• See Nechushtai & Lewis (2019)
• Also for Germany: Haim et al. (2018)
• Large-scale tests:
• No personalised filter bubble in searches for German parties and politicians
• Largely identical search results
• In 5-10% of cases even in the same order
• See AlgorithmWatch (2018); a toy illustration of this kind of result comparison follows below
Figure: Nechushtai & Lewis (2019)
(https://www.youtube.com/watch?v=lQ3KHiqGmDE)
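As a purely hypothetical illustration of the kind of comparison these search studies rest on (not their actual code or data), the sketch below takes two users’ result lists for the same query and reports their set overlap (Jaccard similarity) and whether the ordering is identical; high overlap across users indicates little content personalisation:

```python
# Hypothetical illustration of comparing search results across users for the
# same query, in the spirit of the studies cited above. Toy data only.

def jaccard(a, b):
    """Set overlap between two result lists, ignoring rank: 1.0 = identical sets."""
    sa, sb = set(a), set(b)
    return len(sa & sb) / len(sa | sb) if (sa or sb) else 1.0

# Invented top-5 results for one query, as returned to two different users.
user_1 = ["outlet-A", "outlet-B", "outlet-C", "outlet-D", "outlet-E"]
user_2 = ["outlet-B", "outlet-A", "outlet-C", "outlet-D", "outlet-E"]

overlap = jaccard(user_1, user_2)
same_order = user_1 == user_2

print(f"set overlap: {overlap:.2f}, identical ordering: {same_order}")
# Strong personalisation would show up as low overlap between users; here the
# sets are identical (overlap 1.0) and only the order differs.
```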
The Reasons
Case Studies Shouldn’t Be Generalised
• Need to see the big picture:
• Individual hashtags or pages may be ideologically pure, …
• … but they’re embedded in a complex platform structure (Dubois & Blank 2018)
• Serendipity is ubiquitous:
• Habitual newssharing in everyday, non-political contexts
• Selective exposure ≠ selective avoidance: we seek, but we don’t evade (Weeks et al. 2016)
• Homophily ≠ heterophobia: ‘echo chambers’ might just be communities of interest
Case Studies Shouldn’t Be Generalised
• Cross-ideological connections almost impossible to avoid:
• Facebook pages may be engines of homophily, …
• … but Facebook profiles are engines of context collapse (Litt & Hargittai 2016)
• Because we don’t only connect with our ‘political compadres’, pace Pariser (2015)
• ‘Hard’ echo chambers / filter bubbles are possible, but very rare:
• Require cultish levels of devotion to ideological purity (O’Hara & Stevens 2015)
• E.g. specialty platforms for hyperpartisan fringe groups (4chan, 8chan, Gab, Parler), …
• … but the hyperpartisans are also heavy users of mainstream news
• Even if only to develop new conspiracy theories and disinformation (Garrett et al. 2013)
Image: Midjourney
Self-Serving Techno-Determinism
• Humans are complicated:
• Our interests and networks are diverse, complex, inconsistent
• Politics is just a small part of what people are interested in
• Algorithms provide only limited personalisation
• Mainstream information sources + random serendipity = mixed information diet
• Moral panics based on simplistic arguments:
• Sunstein & Pariser mainly provide personal, anecdotal evidence
• Significant overestimation of the power of AI at least since Negroponte
• “A myth just waiting to concretize into common wisdom” (Weinberger 2004)
• But very handy for blame-shifting and attacking social media platforms
• “The dumbest metaphor on the Internet” (Meineck 2018):
• Not just dumb, but keeping us from seeing more important challenges
• People do encounter a diverse range of content, …
• … but the question is what they do with it
https://www.vice.com/de/article/pam5nz/deshalb-ist-filterblase-die-blodeste-metapher-des-internets
The Problem
How they imagine it… What it’s really like…
Images: Midjourney
Hyperpartisans, Hyperconnected
Ready access to information enables the spread of ‘fake news’, hyperpartisanship, and polarisation. (But also social connection and community support.)
(https://twitter.com/bigfudge212121/status/1259317174776115201)
Hate-Reading
Gentzkow, M., & Shapiro, J. M. (2011). Ideological Segregation Online and Offline. The Quarterly Journal of Economics, 126, 1799–1839. https://doi.org/10.1093/qje/qjr044
Also see:
Roberts, J., & Wahl-Jorgensen, K. (2020). Breitbart’s Attacks on Mainstream Media: Victories, Victimhood, and Vilification. In Affective Politics of Digital Media. Routledge.
It’s the People, Stupid – Not the Technology
The problem with an extraterrestrial-conspiracy mailing list isn’t that it’s an echo chamber; it’s that it thinks there’s a conspiracy by extraterrestrials.
— Weinberger (2004)
• Eighteen years later:
• The problem isn’t that there are hyperpartisan echo chambers or filter bubbles; it’s that there are hyperpartisan fringe groups that fundamentally reject, and actively fight, any mainstream societal and democratic consensus.
The problem is political polarisation, not communicative fragmentation.
There is no echo chamber or filter bubble – the filter is in our heads.
Researching Polarisation and Hyperpartisanship
(https://www.pewresearch.org/politics/2017/10/05/1-partisan-divides-over-political-values-widen/)
The Filter in Our (?) Heads
• Understanding polarisation:
• How might we assess levels of polarisation – over time, across countries, between groups, across platforms? (One simple measure is sketched after this list.)
• What types of polarisation are there – issue-based, ideological, affective, identity-based?
• How do individuals slide into hyperpartisanship, and how can this be reversed?
• How do hyperpartisan groups process information that challenges their worldviews?
• What processes drive their dissemination of mis/disinformation, conspiracy theories, trolling, and abuse?
• Combatting polarisation:
• How can mainstream society be protected from hyperpartisanship?
• How can hyperpartisan extremists be deradicalised?
• How can mis/disinformation be countered and neutralised?
• What role can digital media literacy play, and how can its abuse be prevented?
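For the first question above (“how might we assess levels of polarisation”), one widely used operationalisation of affective polarisation is the gap between how warmly survey respondents rate their own party and the opposing party on a 0–100 feeling thermometer. The sketch below, with entirely invented figures, shows the arithmetic of that measure; it illustrates one option among many rather than a method endorsed in this lecture:

```python
# Illustrative only: affective polarisation as the mean in-party minus
# out-party "feeling thermometer" gap (0-100 scale). All figures invented.
from statistics import mean

# Each respondent: (rating of own party, rating of the opposing party).
respondents = [
    (85, 20),  # very warm towards own party, cold towards the other
    (70, 45),
    (90, 10),
    (60, 55),  # only mildly polarised
]

gaps = [own - other for own, other in respondents]
affective_polarisation = mean(gaps)

print(f"mean in-party vs. out-party gap: {affective_polarisation:.1f} points")
# Tracking this gap over time, across countries, or between groups allows
# levels of (affective) polarisation to be compared.
```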
Next Time
Readings
11. 2.12.: Echo Chambers? Filter Bubbles? Reviewing the Evidence
Bruns, A. (2022). Echo Chambers? Filter Bubbles? The Misleading Metaphors That Obscure the Real Problem. In M. Pérez-Escolar & J. M. Noguera-Vivo (Eds.), Hate Speech and Polarization in Participatory Society (pp. 33–48). Routledge. https://doi.org/10.4324/9781003109891-4
12. 9.12.: Platform Power: How Social Media Platforms Reshape the News Industry
Meese, J., & Hurcombe, E. (2021). Facebook, News Media and Platform Dependency: The Institutional Impacts of News Distribution on Social Platforms. New Media & Society, 23(8), 2367–2384. https://doi.org/10.1177/1461444820926472
Lecture
• NOTE:
• All lectures from here on will be in person in BIN-0-K.02!
