AI, Social Media and
Political Polarization:
Is Our Democracy at Risk?
Dave Hess &
Jim Isaak
Sept 27 & Oct 4th 2019
OLLI - Concord
Social media and artificial intelligence (AI) are reshaping our political and electoral processes – and
not for the better. Increasingly sophisticated algorithms searching through massive data sets enable
political parties, candidates, and nation states to micro-target advertising with messages tailored
to small, carefully selected segments of the population that are most susceptible to a particular
message, thereby reinforcing the political dispositions of the recipients. Even more perniciously, this
technology provides insights into how to tailor that message to most effectively influence those
recipients. Online “bots” (automated accounts) can disseminate such messages – both real and
indistinguishably fake ones – to millions of online recipients instantaneously – all known only to
those who are receiving them. And these methodologies will become even more sophisticated and
undetectable as AI advances to “deep learning”. This technology is contributing to, if not to a
significant extent causing, the further political polarization of our population as people increasingly
receive only political messages tailored to and reinforcing their own political predispositions to the
exclusion of other perspectives and points of view. We will ask: What are algorithms? Bots?
Machine learning? Deep learning? How do they work? How do they shape, influence, manipulate,
distort and polarize political discourse? And how have they been used historically to shape political
thought and influence the outcome of elections as, for example: by ISIS in Iraq; in the Brexit vote in
the U.K.; and in our own 2016 presidential election (to mention but a few). And what does the
future hold?
9/30/2019 https://is.gd/AIPolitics
A few Technology Concepts
• Algorithms?
• Big Data?
• Digital Footprints?
• Deep learning?
• Trolls, Sockpuppets and Bots (oh my!)
And why do you care?
What are algorithms?
Recipes
Jim's Fudge
• In a double boiler:
– Melt 6 Tbsp butter
– Stir in
• ½ cup cocoa
• 4 Tbsp milk
• 1 tsp vanilla
• 1 lb powdered sugar
until smooth
– Pour into buttered pan
– Cool in fridge until hardened
Actual Political Algorithm circa 1970
• For each precinct(I) in town
– R=0, D=0, U=0
– For each voter in precinct
• If voter is Republican then R=R+1
• Else if voter is Democrat then D=D+1
• Else U=U+1
– Next voter
– Precinct(I) = {R,D,U}
• Next precinct
• Sort precincts (highest R first,
lowest D second)
• Print priority list for
republican_voter_contact
• Candidate objective:
prioritize precincts to
visit door-to-door
with maximum
republican contact
density
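The circa-1970 precinct algorithm above translates almost line for line into modern Python. A minimal sketch, with invented voter data standing in for the registration rolls:

```python
# A modern sketch of the circa-1970 canvassing algorithm.
# The voter list below is invented for illustration.
voters = [
    ("Precinct 1", "R"), ("Precinct 1", "R"), ("Precinct 1", "D"),
    ("Precinct 2", "D"), ("Precinct 2", "U"),
    ("Precinct 3", "R"), ("Precinct 3", "U"), ("Precinct 3", "R"),
]

# Tally registered Republicans, Democrats, and Undeclared per precinct.
tallies = {}
for precinct, party in voters:
    counts = tallies.setdefault(precinct, {"R": 0, "D": 0, "U": 0})
    counts[party] += 1

# Prioritize: highest Republican count first, lowest Democratic count second.
priority = sorted(tallies.items(), key=lambda item: (-item[1]["R"], item[1]["D"]))

for precinct, counts in priority:
    print(precinct, counts)
```

The sort key does the work the old "Sort Precincts" step described: negating R sorts Republican counts descending while D breaks ties ascending.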
But, a typical program
• Contains thousands or millions of instructions
• Often broken into "subroutines" or
"methods" which can be reused in many
programs
[send_email(emailaddr, msg) might be an example]
• An associate tells me that there are 100,000+
identifiable "flaws" in MS Windows, but in
sections of code so old that no one is prepared to
fix them (or perhaps the source code is lost)
Software/Program objectives
• Traditional – programmer implements a model that solves the problem
• Emerging AI – Deep Blue (chess, 1997) ("brute force" search)
• Trained Artificial Intelligence – (2010)
programmer provides data set, examples and evaluation criteria, and
the program develops the model (AlphaGo, language recognition/translation)
Watson (Jeopardy 2011; medical diagnosis, X-ray evaluation 2017)
Google/Facebook ad insertion, headline selection, recommendations,
job candidate selection, credit evaluation, real estate ad presentation
• Goal driven AI – 2017
Programmer provides model for evaluation and defines output set,
program develops the model (AlphaZero, facial analytics)
• General AI -- ??? --
Program revises actions, and perhaps goals, independently
(convincing a significant % of people that it is human, or even superhuman)
Big Data?
• Disk storage is now cheap – often measured in
terabytes even on consumer computers
• 15 terabytes = total Library of Congress (text)
• Tools exist to process diverse data formats – so
queries can be made that span different
databases, and even “unstructured” data
• Facebook collects 500+ terabytes/day (2017)
• Youtube uploads 500 hours of video/minute (2019)
Don’t delete anything, you never know ….
Big Data + Deep Learning
• Can infer information about individuals that
they might not “know” or admit*
– Political orientation, property holdings
– Sexual orientation, credit card transactions
• Key "hot buttons" that will motivate them
to act (buy, vote, …) or
not act (voter suppression)
*Key papers by Michal Kosinski, 2012+, working with Facebook data; now at Stanford
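As a toy illustration of the inference idea above (not the actual method in the Kosinski-style studies), a scoring model can weight each "like" as evidence for or against a trait. All page names and weights below are invented; real systems learn such weights from millions of profiles:

```python
# Toy trait inference from "likes". Weights and page names are invented;
# positive weight = evidence for trait A, negative = evidence for trait B.
like_weights = {
    "HuntingDaily": +0.9,
    "OrganicGardening": -0.4,
    "TalkRadioFans": +0.7,
    "IndieFilmClub": -0.8,
}

def trait_score(likes):
    """Sum the learned weights of everything the user has liked."""
    return sum(like_weights.get(page, 0.0) for page in likes)

# A hypothetical user; unknown pages ("LocalDiner") contribute nothing.
user = ["HuntingDaily", "TalkRadioFans", "LocalDiner"]
score = trait_score(user)
label = "A" if score > 0 else "B"
```

The point is that no single "like" reveals the trait; the aggregate of many weak signals does, which is why the inferred result can be something the user never stated anywhere.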
Digital Footprints?
• When you connect to the web/net that is your
first “foot print”
– User from this IP address, MAC id, date/time
– Store cookie #123 on user’s system
• Go to a web site/service/server
– #123 at site/url, date/time/query string
…duration
• Email content (written, read), actions
• “click thru”, “like”, even “hover” in some cases
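The footprint fields listed above can be pictured as a per-request log record. A minimal sketch with illustrative field names (not any real service's schema):

```python
# A sketch of the kind of record a site might log for each page view.
# Field names and values are illustrative only.
import datetime

def footprint(ip, cookie_id, url, action):
    return {
        "ip": ip,               # network address the request came from
        "cookie": cookie_id,    # ID stored on the user's machine on a prior visit
        "url": url,             # page and query string requested
        "action": action,       # click, like, hover, duration, ...
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    }

visit = footprint("203.0.113.7", "#123", "/news?topic=election", "click")
```

Each record is trivial on its own; joined by cookie ID across thousands of visits, the records become the behavioral profile the following slides describe.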
“But I never use Facebook”
• Even if you never use Facebook, they can track
you … initially they may not know your name,
phone number, address, personality, age,
income level, political orientation, sexual
orientation, facial image … but
“we only share your personal data with our trusted
business partners” (typical privacy statement)
• Which either means a contract, or simply sale
of your data (what is a business partner?)
Advertising Business Model
• Facebook, Google, et al use this model
• Revenue comes from selling ad placements,
and views
– Advertisers are the customers/revenue source
We are the product
• The greatest revenue comes from
– Maximum selectivity – “Micro targeting”
– Maximum attention opportunities
(“engagement” = Getting you to stay on their site)
Channels
• A new "ad" or "news feed" item can be placed
on each page as you click through the web
• A “campaign” targeting you (or users like you)
can span from web site to web site
• In theory it can span from the web to cable TV
“ad insertion” directed to your TV,
• Certainly it can touch the paid content inserted
in streaming videos, online “radio” etc.
Fake net users/sockpuppets
• Part of the allure of the web is anonymity
• Folks often do not use their real names
• And can have many accounts with various
services – Facebook, Gmail, Hotmail, Instagram
• Some of this may be “legitimate”
• Others may be malicious
I could have 500 folks who “Like” me on Facebook,
498 are my own fake accounts: “sockpuppets”
Trolls (not just under bridges)
• Persons or organizations operating a number of
fake user IDs pushing an agenda, independent of
Truth, Justice or the American Way
• Often using outrage and fear to gain attention and
get their content to go viral
(i.e. shared by initial users to their lists, and from
those folks to their lists, etc.)
• Create fake user “pages”, also fake “news sites”,
feed content to blogs/news-sites that are not
trying to be objective/accurate/etc.
Not every abuser is a troll
• You can get paid by Google, Facebook, etc. for
putting their ads on your site
• So, if you create a “fake news site”, and get some
good outrage/fear going, you can get rich –
some kids in Macedonia have done this
• “The Pope has endorsed Donald Trump”
(a real instance of a fake news meme that went viral)
• And the real trolls can point to this as part of
their campaigns
Bots?
“On the Internet, no one knows you are a dog”
This is half true – it is fairly easy to create “fake accounts”
(gmail, Facebook, etc.) … particularly if you have an army
of folks working at this (literally)
Facebook has tried to purge fake accounts (millions!), but
many have years of apparent activity
Bots are captive computers that can be triggered to convert
a troll post into a “trending” item by clicks, LIKEs, sharing,
re-posting, etc.
(turning off your computer reduces the risk of your computer
participating in a zombie bot-net)
Click farms are large numbers of accounts, with either
automated or humans that “click” on targeted content to
trigger “trending”, etc.
The IRA is not Irish
• Internet Research Agency is a Russian troll
farm (lots of employees, millions of fake
accounts) used to promote their agenda over
a number of years.
• One way to build web-credibility is to have a
long established site/id that gradually builds a
network … may use “real names” and bios also
IRA Trolls triggering US events (NYU)
In 2016, the technique entailed phony
IRA social media personas or groups
first announcing events, such as dueling
anti-Muslim and pro-Muslim demonstrations
that the Russians successfully
organized in May 2016 in Houston
LikeWar (2018) – some notes
• When Trump announced, 58% of his followers were
overseas, many were click farms or bots
• 30+% of the tweets in the Brexit window were from
botnets; ditto the 2016 US election, ditto the Brazilian election
• “Tennessee GOP” was one fake user id,
managed by the Russian Trolls
it trended at #7 just before the 2016 election
• "Angee Dixson" was a Russian troll account inflaming tensions prior
to the Charlottesville white supremacist rally
“And the beat goes on”
The really sophisticated promoters
• Combine many components
– Web sites for (fake) organizations
– Sock puppet “Friends”
– Email lists and postings
– Real world “events”
– Fake news stories and sites
– Paid placements with micro targeting
– “Clickbait” ideally trending and going viral
Trolls, sockpuppets and bots – oh my!
Psych Traits Social Media Exploits
1. Confirmation bias
2. Availability bias
3. Emotive bias
4. Repetition bias
5. Visual bias
6. Belonging/”herd” bias
7. Social acceptability
Attention “grabbers” (Emotive bias)
Fear and Outrage
• “Race to the bottom of the brainstem”
(Tristan Harris, ex Google Ethics manager)
– Likely to trigger a “click thru”
– Likely to be shared
• The “paid placements”
(ads, “news feed positioning”, etc.)
headlines are selected knowing this
• “Trending” is used to present current “hot” content to
target communities to build engagement
Truth, accuracy, credibility, etc are NOT a factor
Implementing a Social Media Campaign
1. ID macro audience
2. Goals (participation, persuasion, activation, GOTV, discouragement)
3. Collect "big data"
4. In-depth surveys (10,000+) – "computational linguistics"
5. Segmentation
6. Mine big data – "Lookalike" audience builder (Facebook)
7. Micro-target segments
tailored, cheap, confidential, "under the radar"
8. Conversion Rate Optimization – find the most effective slogan:
A/B testing, constantly fine-tuned (200,000+ posts)
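Step 8 above, conversion rate optimization, reduces at its core to comparing click-through rates across message variants and shifting spend toward the winner. A minimal sketch with invented counts (real campaigns run this continuously across hundreds of thousands of variants):

```python
# Minimal A/B message test. Impression and click counts are invented.
variants = {
    "A": {"shown": 1000, "clicked": 41},
    "B": {"shown": 1000, "clicked": 67},
}

def conversion_rate(v):
    """Fraction of people shown the message who clicked on it."""
    return v["clicked"] / v["shown"]

# The variant with the higher rate "wins" and gets more of the ad
# budget in the next round; the loser is retired or mutated.
best = max(variants, key=lambda name: conversion_rate(variants[name]))
```

"Dynamic creative," described on the next slide, automates exactly this loop: upload many images and text options and let the platform converge on the best-performing combination per audience.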
“This ad uses dynamic creative, a process whereby
advertisers upload multiple image and text
options and the best performing combination for
the audience is automatically created.”
• Facebook Ad Archive note
(on a Frosted Flakes Ad, listed as a national issue)
• i.e. You have no idea what version is being presented to
any specific individual or community
Micro Targeting
• The more specific the target individual(s):
– Fewer # to “buy” (lower cost)
– More likely the desired impact
• Ideal: a single individual as he/she approaches
their decision (currently “looking”) --- which
requires
– real time monitoring of “everyone” and
– knowing who they are in detail
This is now automated
Looking Forward to 2020?
Are we there yet?
Slides posted, along with syllabus,
Resource list and links to related content
https://is.gd/AIPolitics
Perspective Difference
• How did specific initiatives impact the 2016
Trump Campaign? …
– Cambridge Analytica – not a factor
– Russia – minimal impact
• Impact on national discourse
– Cambridge Analytica – created visibility for issues
– Russia – Driving polarization at every opportunity
with “Strategic Intent” and significant financial,
staffing and expertise invested
Voter Suppression (NYU)
Facebook has said that in the weeks before the
2018 midterm election it found and removed
45,000 pieces of voter suppression content.
“Few behaviors strike as directly at the
heart of democracy as confusing or
bullying people who are entitled to vote.
The social media companies will have
to remain vigilant on this front.“
[An interesting 1st amendment question]
2020 Political Algorithm
• Select set of critical precincts in swing states
(or targeted congressional districts)
• Identify all persons likely to vote
• For each voter:
– If on-our-side then flood with ads that maximize probability of voting
(terrible things “they” will do – tied to individual’s hot button issues
and matched to their personality)
– If not(on-our-side) then flood to minimize probability of voting (at
least for viable opponent)
– Track each paid placement/click-bait to see which
individuals are reading/sharing, learn what works, more
finely profile individuals for next point of
placement/contact. – in real time, per target individual
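The targeting loop described above can be sketched in a few lines. The voters, modeled "support" probabilities, hot-button issues, and threshold below are all invented for illustration:

```python
# Sketch of the per-voter targeting loop. "support" is a modeled
# probability (from the profiling described earlier) that the voter
# favors our candidate; all data here is invented.
voters = [
    {"name": "v1", "support": 0.9, "hot_button": "taxes"},
    {"name": "v2", "support": 0.2, "hot_button": "healthcare"},
    {"name": "v3", "support": 0.6, "hot_button": "immigration"},
]

def choose_message(voter, threshold=0.5):
    if voter["support"] >= threshold:
        # On our side: flood with mobilizing ads tied to their hot-button issue.
        return ("mobilize", voter["hot_button"])
    # Not on our side: discourage turnout (digital voter suppression).
    return ("suppress", voter["hot_button"])

plan = {v["name"]: choose_message(v) for v in voters}
```

The real-time feedback the slide describes would then update each voter's "support" estimate from which placements they read and share, re-running this loop before every new contact.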
Social Media makes Ads Visible
(sort of)
• Facebook / Instagram Ad Archive
• Google / Youtube ad Transparency
• Twitter Transparency site
Google Transparency Report May 2018 to Sept 2019
TRUMP MAKE AMERICA GREAT AGAIN COMMITTEE $6,800,800
SENATE LEADERSHIP FUND $5,124,600
CONGRESSIONAL LEADERSHIP FUND $4,159,200
Need to Impeach $3,596,500
NRCC $3,453,900
DEDICATEDEMAILS.COM $2,800,900
NRSC $2,695,000
PRIORITIES USA ACTION & SMP $2,569,200
DONALD J. TRUMP FOR PRESIDENT, INC. $2,412,700
TOM STEYER 2020 $2,255,900
WARREN FOR PRESIDENT, INC. $2,223,500
INDEPENDENCE USA PAC $2,208,900
PETE FOR AMERICA, INC $2,029,200
BETO FOR TEXAS $1,944,000
ONE NATION $1,740,100
KAMALA HARRIS FOR THE PEOPLE $1,430,600
Analysis and Reports
• 2018 US Senate received 2 reports on 2016 Russian
Interference:
The IRA and Political Polarization in the United States,
(from Oxford (UK) Computational Propaganda Project, Dec 2018, 47 pages)
The Tactics & Tropes of the Internet Research Agency,
(from New Knowledge, a U.S. Web security firm; Dec 2018, 101 pages)
• Aug. 2019 – NYU did a report based on these, the
Mueller Report and subsequent analysis:
Disinformation and the 2020 Election:
How the Social Media Industry Should Prepare
Links on website: https://is.gd/AIPolitics
From NYU report
In its 2019 Worldwide Threat Assessment,
the Office of the Director of National
Intelligence predicted that next year,
Russia and other American adversaries
“almost certainly will use online influence
operations to try to weaken democratic
institutions, undermine U.S. alliances and
partnerships, and shape policy outcomes.”
NYU recommendations for 2020
recommendations of additional steps social media companies should take to
prepare for 2020, including:
• Detect and remove deepfake videos: Realistic but fraudulent videos have
the potential to undermine political candidates and exacerbate voter
cynicism.
• Remove provably false content in general: The platforms already remove
hate speech, voter suppression, and other categories of content; the
report recommends that they add one more.
• Hire a senior content overseer: Each company needs an executive with
clout to supervise the process of guarding against disinformation.
• Improve industry-wide collaboration on disinformation: For example, when
one platform takes down abusive accounts, others should do the same
with affiliated accounts.
• Teach social media literacy in a more direct, sustained way: Users have to
take responsibility for recognizing false content, but they need more help
to do it.
Obama PSA/Deepfake Example
NYU Report - Expectations
• Deep fake videos of “candidates”
• Digital Voter Suppression – major partisan goal
• Solicitation into "real world" rallies/demonstrations
• For profit firms hired to generate disinformation
• Domestic dis-info more prevalent than foreign
• Abuse via Instagram
Images playing bigger role than simple text
• WhatsApp – encrypted messaging, hard to monitor
– WhatsApp has served as a powerful vehicle for disseminating
false information during recent presidential elections in both
Brazil and India.
From NYU
The March 2019 Mueller report fleshed
out our understanding of how IRA operatives,
posing as grassroots U.S. activists,
mobilized Americans to participate in
dozens of rallies in 2016 and thereafter.
By expanding on this strategy in 2020,
the Russians would accomplish one of
their main goals—translating influence
online into real-world discord
2019
• Instagram has 1 billion users
– Images & video
That Instagram has outperformed Facebook as a Russian engagement machine may
‘indicate its strength as a tool in image-centric memetic warfare,’
according to a report commissioned by the Senate Intelligence Committee
• 2.38 billion for Facebook
• 2 billion for YouTube
– video
• 330 million for Twitter
• WhatsApp has 1.5 billion users globally,
300 million of them in India and another 120 million in Brazil.
In the U.S. it has fewer than 60 million adult users.
From NYU – re Iran
• Some of the phony Iranian accounts impersonated real
Republican political candidates who ran for office in
2018.
• The Iranian network promoted anti-Saudi, anti-Israeli,
and pro-Palestinian themes. It also expressed support
for the 2015 nuclear deal, from which President Trump
unilaterally withdrew. Some of the phony personas—
using names such as “John Turner” and “Ed Sullivan”—
had letters published on these themes in the New York
Daily News, Los Angeles Times, and other mainstream
media outlets.
• Misleading content created by U.S.
citizens can be difficult to distinguish
from ordinary political expression, which
enjoys First Amendment protection from
government regulation or prosecution
• Private, LLC’s protected by Citizens United are
buying placements – unlike PACs they do not
have to disclose revenue sources
(opportunities for foreign and domestic abuses)
Removing Fake Accounts
Facebook uses improved AI to delete automated
fake accounts by the billions—2.19
billion in just the first three months of 2019. Most
of these accounts were blocked within minutes of
their creation, preventing them from doing any
harm, according to the company. (NYU)
Twitter has challenged and taken down millions of
fake accounts for similar reasons. And YouTube
has done so as well, if to a lesser degree
Demoting/flagging untrue content
• Facebook is considering banning
and removing all deep-fakes
• generally, the platforms don’t remove content
deemed to be false
they typically reduce the prominence of the
untrue material, based on the notion that their
executives and employees shouldn’t be “arbiters
of the truth.”
• Facebook, for example, “down-ranks” falsehoods
in users’ News Feeds by 80%, labels the content
as untrue, and provides context on the topic. (NYU)
Removing Content (NYU)
• Facebook:
– Risk of physical violence
– Voter suppression
– Census interference
• Youtube
– Denying events
(Holocaust, Sandy Hook, Moon landing)
– Demoting miracle cures
“no obvious principle underpins these”
Archiving Ads for Analysis
• Facebook, Twitter and Google/YouTube now
require that ad purchasers confirm
their identities and locations within the U.S.
• All three have created searchable
political-advertising databases, which
allow anyone to look for patterns in
this kind of spending.
• The bulk of the distribution is via un-paid channels
– shares, retweets, etc.
• Fake assertions of "paid for by" are a problem
Account suspended on Twitter
• Goldstein, who is making her second run for
Congress .. posted a statement on Twitter that
linked Cleveland's 66 percent illiteracy rate to
Tuesday's passage of the legislation meant to
grant equal access to employment, housing, and
public accommodations to lesbian, gay, bisexual,
transgender, and queer people.
• "If CCC had LITERATE inner-city church attending
Black voters following this issue=entirely opposite
outcome," her tweet said.
Oxford research 2018
• During a 30-day period before the U.S. midterms, 25%
of Facebook and Twitter shares related to the election
contained “junk news”
• Disinformation sharing is down as a %,
but not in absolute numbers
• individual junk news stories can “hugely outperform
even the best, most important professionally produced
stories, drawing as much as four times the volume of
shares, likes, and comments.”
Information Wars: How We Lost the Global Battle
Against Disinformation and What We Can Do About It
(2019 – R. Stengel*)
1. Prohibit foreign entities from buying online advertising to
influence U.S. elections
2. Make source & buyers of all political ads transparent, informing
users why they were specifically targeted
3. Provide more privacy for personal data
4. Remove verifiably & provably false information
5. Get media organizations to agree not to use stolen information
from hacks or cyber-meddling
6. Require campaigns to disclose foreign contacts & persons seeking
to influence elections
7. Appoint a senior government office to address disinformation &
election interference
* Richard Stengel was Under Secretary of State for Public Diplomacy in 2016, started
Ukraine Task Force => Russian Information Group =>Global Engagement Center
Congressional Discussions
• Honest Ads Act &
Deceptive Practices and Voter Intimidation Act
• Various Privacy protection initiatives
• Discussion of anti-trust actions against big
tech companies
• Discussion of first amendment implications
• Regulating social media companies
such as removing some “section 230”
protections from responsibility for content
What’s Next?
• Real time monitoring, by AI technology to determine
what is effective at influencing each individual’s
behavior
• AI “Nudging” … but perhaps with highly effective
results
• Obvious application: selling a product
• Next: selling a political candidate
• Then: brainwashing a sufficient amount of the
population
You can fool some of the people all of the time, and all of
the people some of the time … but can you fool a
sufficient number of the people enough of the time?
Some resources https://is.gd/AIPolitics
• McNamee, Roger; Zucked: Waking Up to the Facebook Catastrophe; (2019)
• Singer & Brooking; LikeWar: The Weaponization of Social Media; (2018)
• Friedman, Thomas; Thank You for Being Late; (2016)
• Harari, Yuval Noah; 21 Lessons for the 21st Century; (2018)
• Linked from website:
– Presentations on Cambridge Analytica
– Presentations on Facebook & Psychographics
– Jan 2019 IEEE Institute - emergence of Persuasive Technology
– US Senate Reports from Dec. 2018 on Russian interference
– NYU 2019 study and social media recommendations for 2020,
and paper analyzing US online political advertising
– Sept 2019 CSPAN/Georgetown U. panel on Social Media & First Amendment

MIMA Summit Social Marketing 101 presentation
 
Follow the money: Innovativ bruk av sosiale medier i den amerikanske valgkampen
Follow the money: Innovativ bruk av sosiale medier i den amerikanske valgkampenFollow the money: Innovativ bruk av sosiale medier i den amerikanske valgkampen
Follow the money: Innovativ bruk av sosiale medier i den amerikanske valgkampen
 
Getting Fresh…Socially. A Social Fresh EAST Recap.
Getting Fresh…Socially. A Social Fresh EAST Recap.Getting Fresh…Socially. A Social Fresh EAST Recap.
Getting Fresh…Socially. A Social Fresh EAST Recap.
 
An Introduction to Maskirovka aka Information Operations
An Introduction to Maskirovka aka Information OperationsAn Introduction to Maskirovka aka Information Operations
An Introduction to Maskirovka aka Information Operations
 
New Voices: Local online participation trends and opportunities
New Voices: Local online participation trends and opportunitiesNew Voices: Local online participation trends and opportunities
New Voices: Local online participation trends and opportunities
 
Politics and social media
Politics and social mediaPolitics and social media
Politics and social media
 
Online Politics 101
Online Politics 101Online Politics 101
Online Politics 101
 
Amplification and Personalization: The impact of metrics, analytics, and algo...
Amplification and Personalization: The impact of metrics, analytics, and algo...Amplification and Personalization: The impact of metrics, analytics, and algo...
Amplification and Personalization: The impact of metrics, analytics, and algo...
 
SURENDER SINGH Senior Prosecutor NIA
SURENDER SINGH Senior Prosecutor NIA   SURENDER SINGH Senior Prosecutor NIA
SURENDER SINGH Senior Prosecutor NIA
 
What's Next: The World of Fake News
What's Next: The World of Fake NewsWhat's Next: The World of Fake News
What's Next: The World of Fake News
 
Social Media for NPO's
Social Media for NPO'sSocial Media for NPO's
Social Media for NPO's
 

More from Jim Isaak

The future - 2038
The future - 2038The future - 2038
The future - 2038Jim Isaak
 
Olli big data_andai
Olli big data_andaiOlli big data_andai
Olli big data_andaiJim Isaak
 
Zen and the Art of Motorcycle Maintainence
Zen and the Art of Motorcycle MaintainenceZen and the Art of Motorcycle Maintainence
Zen and the Art of Motorcycle MaintainenceJim Isaak
 
CyberAttack -- Whose side is your computer on?
CyberAttack -- Whose side is your computer on?CyberAttack -- Whose side is your computer on?
CyberAttack -- Whose side is your computer on?Jim Isaak
 
Spies, Lies and Sunken Subs
Spies, Lies and Sunken SubsSpies, Lies and Sunken Subs
Spies, Lies and Sunken SubsJim Isaak
 
Taking Control of your Future
Taking Control of your FutureTaking Control of your Future
Taking Control of your FutureJim Isaak
 

More from Jim Isaak (8)

The future - 2038
The future - 2038The future - 2038
The future - 2038
 
Olli big data_andai
Olli big data_andaiOlli big data_andai
Olli big data_andai
 
About time
About timeAbout time
About time
 
Zen and the Art of Motorcycle Maintainence
Zen and the Art of Motorcycle MaintainenceZen and the Art of Motorcycle Maintainence
Zen and the Art of Motorcycle Maintainence
 
CyberAttack -- Whose side is your computer on?
CyberAttack -- Whose side is your computer on?CyberAttack -- Whose side is your computer on?
CyberAttack -- Whose side is your computer on?
 
Spies, Lies and Sunken Subs
Spies, Lies and Sunken SubsSpies, Lies and Sunken Subs
Spies, Lies and Sunken Subs
 
Taking Control of your Future
Taking Control of your FutureTaking Control of your Future
Taking Control of your Future
 
2010 isaak
2010 isaak2010 isaak
2010 isaak
 

Recently uploaded

KSHARA STURA .pptx---KSHARA KARMA THERAPY (CAUSTIC THERAPY)————IMP.OF KSHARA ...
KSHARA STURA .pptx---KSHARA KARMA THERAPY (CAUSTIC THERAPY)————IMP.OF KSHARA ...KSHARA STURA .pptx---KSHARA KARMA THERAPY (CAUSTIC THERAPY)————IMP.OF KSHARA ...
KSHARA STURA .pptx---KSHARA KARMA THERAPY (CAUSTIC THERAPY)————IMP.OF KSHARA ...M56BOOKSTORE PRODUCT/SERVICE
 
Introduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher EducationIntroduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher Educationpboyjonauth
 
Interactive Powerpoint_How to Master effective communication
Interactive Powerpoint_How to Master effective communicationInteractive Powerpoint_How to Master effective communication
Interactive Powerpoint_How to Master effective communicationnomboosow
 
Full Stack Web Development Course for Beginners
Full Stack Web Development Course  for BeginnersFull Stack Web Development Course  for Beginners
Full Stack Web Development Course for BeginnersSabitha Banu
 
History Class XII Ch. 3 Kinship, Caste and Class (1).pptx
History Class XII Ch. 3 Kinship, Caste and Class (1).pptxHistory Class XII Ch. 3 Kinship, Caste and Class (1).pptx
History Class XII Ch. 3 Kinship, Caste and Class (1).pptxsocialsciencegdgrohi
 
Hierarchy of management that covers different levels of management
Hierarchy of management that covers different levels of managementHierarchy of management that covers different levels of management
Hierarchy of management that covers different levels of managementmkooblal
 
Meghan Sutherland In Media Res Media Component
Meghan Sutherland In Media Res Media ComponentMeghan Sutherland In Media Res Media Component
Meghan Sutherland In Media Res Media ComponentInMediaRes1
 
internship ppt on smartinternz platform as salesforce developer
internship ppt on smartinternz platform as salesforce developerinternship ppt on smartinternz platform as salesforce developer
internship ppt on smartinternz platform as salesforce developerunnathinaik
 
Crayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon ACrayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon AUnboundStockton
 
How to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptxHow to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptxmanuelaromero2013
 
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdfEnzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdfSumit Tiwari
 
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions  for the students and aspirants of Chemistry12th.pptxOrganic Name Reactions  for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions for the students and aspirants of Chemistry12th.pptxVS Mahajan Coaching Centre
 
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdfFraming an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdfUjwalaBharambe
 
Biting mechanism of poisonous snakes.pdf
Biting mechanism of poisonous snakes.pdfBiting mechanism of poisonous snakes.pdf
Biting mechanism of poisonous snakes.pdfadityarao40181
 
Pharmacognosy Flower 3. Compositae 2023.pdf
Pharmacognosy Flower 3. Compositae 2023.pdfPharmacognosy Flower 3. Compositae 2023.pdf
Pharmacognosy Flower 3. Compositae 2023.pdfMahmoud M. Sallam
 
Proudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxProudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxthorishapillay1
 
Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17Celine George
 
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdfssuser54595a
 

Recently uploaded (20)

KSHARA STURA .pptx---KSHARA KARMA THERAPY (CAUSTIC THERAPY)————IMP.OF KSHARA ...
KSHARA STURA .pptx---KSHARA KARMA THERAPY (CAUSTIC THERAPY)————IMP.OF KSHARA ...KSHARA STURA .pptx---KSHARA KARMA THERAPY (CAUSTIC THERAPY)————IMP.OF KSHARA ...
KSHARA STURA .pptx---KSHARA KARMA THERAPY (CAUSTIC THERAPY)————IMP.OF KSHARA ...
 
Introduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher EducationIntroduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher Education
 
Interactive Powerpoint_How to Master effective communication
Interactive Powerpoint_How to Master effective communicationInteractive Powerpoint_How to Master effective communication
Interactive Powerpoint_How to Master effective communication
 
Full Stack Web Development Course for Beginners
Full Stack Web Development Course  for BeginnersFull Stack Web Development Course  for Beginners
Full Stack Web Development Course for Beginners
 
History Class XII Ch. 3 Kinship, Caste and Class (1).pptx
History Class XII Ch. 3 Kinship, Caste and Class (1).pptxHistory Class XII Ch. 3 Kinship, Caste and Class (1).pptx
History Class XII Ch. 3 Kinship, Caste and Class (1).pptx
 
Hierarchy of management that covers different levels of management
Hierarchy of management that covers different levels of managementHierarchy of management that covers different levels of management
Hierarchy of management that covers different levels of management
 
Meghan Sutherland In Media Res Media Component
Meghan Sutherland In Media Res Media ComponentMeghan Sutherland In Media Res Media Component
Meghan Sutherland In Media Res Media Component
 
internship ppt on smartinternz platform as salesforce developer
internship ppt on smartinternz platform as salesforce developerinternship ppt on smartinternz platform as salesforce developer
internship ppt on smartinternz platform as salesforce developer
 
Crayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon ACrayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon A
 
How to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptxHow to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptx
 
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdfEnzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
 
9953330565 Low Rate Call Girls In Rohini Delhi NCR
9953330565 Low Rate Call Girls In Rohini  Delhi NCR9953330565 Low Rate Call Girls In Rohini  Delhi NCR
9953330565 Low Rate Call Girls In Rohini Delhi NCR
 
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
 
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions  for the students and aspirants of Chemistry12th.pptxOrganic Name Reactions  for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
 
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdfFraming an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
 
Biting mechanism of poisonous snakes.pdf
Biting mechanism of poisonous snakes.pdfBiting mechanism of poisonous snakes.pdf
Biting mechanism of poisonous snakes.pdf
 
Pharmacognosy Flower 3. Compositae 2023.pdf
Pharmacognosy Flower 3. Compositae 2023.pdfPharmacognosy Flower 3. Compositae 2023.pdf
Pharmacognosy Flower 3. Compositae 2023.pdf
 
Proudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxProudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptx
 
Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17
 
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
 

AI, Social Media Polarizing Our Democracy

  • 1. AI, Social Media and Political Polarization: Is Our Democracy at Risk? Dave Hess & Jim Isaak Sept 27 & Oct 4th 2019 OLLI - Concord
  • 2. Social media and artificial intelligence (AI) are reshaping our political and electoral processes – and not for the better. Increasingly sophisticated algorithms searching through massive data sets enable political parties, candidates, and nation-states to micro-target advertising with messages tailored to small, carefully selected segments of the population that are most susceptible to a particular message, thereby reinforcing the political dispositions of the recipients. Even more perniciously, this technology provides insights into how to tailor that message to most effectively influence those recipients. Online “bots” (automated accounts) can disseminate such messages – both real and indistinguishably fake ones – to millions of online recipients instantaneously, all known only to those who are receiving them. And these methodologies will become even more sophisticated and undetectable as AI advances to “deep learning”. This technology is contributing to, if not to a significant extent causing, the further political polarization of our population, as people increasingly receive only political messages tailored to and reinforcing their own political predispositions, to the exclusion of other perspectives and points of view. We will ask: What are algorithms? Bots? Machine learning? Deep learning? How do they work? How do they shape, influence, manipulate, distort, and polarize political discourse? And how have they been used historically to shape political thought and influence the outcome of elections – for example, by ISIS in Iraq, in the Brexit vote in the United Kingdom, and in our own 2016 presidential election (to mention but a few)? And what does the future hold? 9/30/2019 https://is.gd/AIPolitics
  • 3. A few Technology Concepts • Algorithms? • Big Data? • Digital Footprints? • Deep learning? • Trolls, Sockpuppets and Bots (oh my!) And why do you care?
  • 4. What are algorithms? Recipes Jim's Fudge • In a double boiler: – Melt 6 Tbsp butter – Stir in • ½ cup cocoa • 4 Tbsp milk • 1 tsp vanilla • 1 lb powdered sugar until smooth – Pour into buttered pan – Cool in fridge until hardened
  • 5. Actual Political Algorithm circa 1970 • For each precinct(I) in town – R=0, D=0, U=0 – For each voter in precinct • If voter is Republican then R=R+1 • If voter is Democrat then D=D+1 • Else U=U+1 – Next Voter – Precinct(I) = {R,D,U} • Next Precinct • Sort Precincts (highest R first, lowest D second) • Print priority list for republican_voter_contact • Candidate objective: prioritize precincts to visit door-to-door with maximum Republican contact density
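The precinct-sorting algorithm on the slide above can be sketched in a few lines of modern Python. The precinct names and voter records below are invented purely for illustration:

```python
# Sketch of the circa-1970 precinct-prioritization algorithm:
# tally voters by party, then sort precincts with the highest
# Republican count first, fewest Democrats as the tiebreaker.

def prioritize_precincts(precincts):
    """precincts: dict mapping precinct name -> list of party labels."""
    tallies = []
    for name, voters in precincts.items():
        r = sum(1 for v in voters if v == "R")
        d = sum(1 for v in voters if v == "D")
        u = len(voters) - r - d  # everyone else is unaffiliated
        tallies.append((name, r, d, u))
    # Highest R first; lowest D breaks ties.
    tallies.sort(key=lambda t: (-t[1], t[2]))
    return tallies

# Hypothetical town data:
town = {
    "Precinct 1": ["R", "R", "D", "U"],
    "Precinct 2": ["D", "D", "R"],
    "Precinct 3": ["R", "R", "R", "D"],
}
for name, r, d, u in prioritize_precincts(town):
    print(f"{name}: R={r} D={d} U={u}")
```

The output is the door-to-door priority list the slide describes: precincts with the densest concentration of the candidate's own party come first.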
  • 6. But, a typical program • Contains thousands or millions of instructions • Often broken into “subroutines” or “methods” which can be reused in many programs [send_email(emailaddr, msg) might be an example] • An associate tells me that there are 100,000+ identifiable “flaws” in MS Windows, but in sections of code so old that no one is prepared to fix them (or perhaps the source code is lost)
  • 7. Software/Program objectives • Traditional – programmer implements a model that solves the problem • Emerging AI – Deep Blue (chess, 1996) (“brute force learning”) • Trained Artificial Intelligence – (2010) programmer provides data set, examples, and evaluation criteria, and the program develops the model (AlphaGo, language recognition/translation); Watson (Jeopardy 2011; medical diagnosis, X-ray evaluation 2017); Google/Facebook ad insertion, headline selection, recommendations, job candidate selection, credit evaluation, real estate ad presentation • Goal-driven AI – 2017 – programmer provides model for evaluation and defines output set, program develops model (AlphaZero, facial analytics) • General AI – ??? – program revises actions, and perhaps goals, independently (a significant % of people think it is human or superhuman)
  • 8. Big Data? • Disk storage is now cheap – often measured in terabytes even on consumer computers • 15 terabytes = total Library of Congress (text) • Tools exist to process diverse data formats – so queries can be made that span different databases, and even “unstructured” data • Facebook collects 500+ terabytes/day (2017) • YouTube uploads 500 hours of video/minute (2019) Don’t delete anything, you never know…
  • 9. Big Data + Deep Learning • Can infer information about individuals that they might not “know” or admit* – Political orientation, property holdings – Sexual orientation, credit card transactions • Key “hot buttons” that will motivate them to act (buy, vote, …) or not act (voter suppression) *Key papers by Michal Kosinski, 2012+, working with Facebook data; now at Stanford
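The core idea behind this kind of trait inference can be sketched very simply: each page a user "likes" carries a learned weight toward one side of a trait, and the weights are summed. Everything below – the page names, the weights, and the trait labels – is invented for illustration; real systems fit these weights from millions of profiles.

```python
# Hedged sketch of Kosinski-style inference from "likes".
# Hypothetical learned weights: positive values are evidence toward
# trait A, negative values toward trait B.
WEIGHTS = {
    "page_hunting": +0.9,
    "page_nascar":  +0.6,
    "page_npr":     -0.8,
    "page_yoga":    -0.5,
}

def predict_score(likes):
    """Sum the weights of a user's likes; the sign suggests the trait."""
    return sum(WEIGHTS.get(page, 0.0) for page in likes)

user_likes = ["page_hunting", "page_nascar", "page_yoga"]
score = predict_score(user_likes)  # 0.9 + 0.6 - 0.5 = 1.0
print("leans A" if score > 0 else "leans B")
```

The unsettling point from the slide is that such a score can be computed about someone from incidental behavior – without ever asking them the question directly.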
  • 10. Digital Footprints? • When you connect to the web/net, that is your first “footprint” – User from this IP address, MAC id, date/time – Store cookie #123 on user’s system • Go to a web site/service/server – #123 at site/url, date/time/query string … duration • Email content (written, read), actions • “click thru”, “like”, even “hover” in some cases
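Concretely, each such footprint is just a small record a server can log per page view. The field names and values below are hypothetical illustrations (real trackers log far more):

```python
# Illustrative sketch of a single "footprint" record, echoing the
# items on the slide: cookie id, IP address, URL, action, timestamp.
import json
import datetime

event = {
    "cookie_id": "123",                        # stored on the user's system
    "ip": "203.0.113.7",                       # documentation-range address
    "url": "https://example.com/news?item=42", # site/url plus query string
    "referrer": "https://example.com/",
    "action": "click",                         # click-thru, like, hover...
    "timestamp": datetime.datetime(2019, 9, 30, 14, 5).isoformat(),
    "dwell_seconds": 37,                       # duration on the page
}
print(json.dumps(event, indent=2))
```

Joined across sites and over time, records like this one are the raw material of the profiles discussed in the following slides.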
  • 11. “But I never use Facebook” • Even if you never use Facebook, they can track you … initially they may not know your name, phone number, address, personality, age, income level, political orientation, sexual orientation, facial image … but “we only share your personal data with our trusted business partners” (typical privacy statement) • Which either means a contract, or simply the sale of your data (what is a business partner?)
  • 12. Advertising Business Model • Facebook, Google, et al use this model • Revenue comes from selling ad placements, and views – Advertisers are the customers/revenue source We are the product • The greatest revenue comes from – Maximum selectivity – “Micro targeting” – Maximum attention opportunities (“engagement” = Getting you to stay on their site)
  • 13. Channels • A new “ad” or “news feed” item can be placed on each page as you click through the web • A “campaign” targeting you (or users like you) can span from web site to web site • In theory it can span from the web to cable TV “ad insertion” directed to your TV • Certainly it can touch the paid content inserted in streaming videos, online “radio”, etc.
  • 14. Fake net users/sockpuppets • Part of the allure of the web is anonymity • Folks often do not use their real names • And can have many accounts with various services –Facebook, Gmail, Hotmail, Instagram • Some of this may be “legitimate” • Others may be malicious I could have 500 folks who “Like” me on Facebook, 498 are my own fake accounts: “sockpuppets”
  • 15. Trolls (not just under bridges) • Persons or organizations operating a number of fake user ID’s pushing an agenda, independent of Truth, Justice or the American Way • Often using outrage and fear to gain attention and get their content to go viral (i.e. shared by initial users to their lists, and from those folks to their lists, etc.) • Create fake user “pages”, also fake “news sites”, feed content to blogs/news-sites that are not trying to be objective/accurate/etc.
  • 16. Not every abuser is a troll • You can get paid by Google, Facebook, etc. for putting their ads on your site • So, if you create a “fake news site”, and get some good outrage/fear going, you can get rich – some kids in Macedonia have done this • “The Pope has endorsed Donald Trump” (a real instance of a fake news meme that went viral) • And the real trolls can point to this as part of their campaigns
  • 17. Bots? “On the Internet, no one knows you are a dog” This is half true – it is fairly easy to create “fake accounts” (gmail, Facebook, etc.) … particularly if you have an army of folks working at this (literally) Facebook has tried to purge fake accounts (millions!), but many have years of apparent activity Bots are captive computers that can be triggered to convert a troll post into a “trending” item by clicks, LIKEs, sharing, re-posting, etc. (turning off your computer reduces the risk of your computer participating in a zombie bot-net) Click farms are large numbers of accounts, with either automated or humans that “click” on targeted content to trigger “trending”, etc.
  • 18. The IRA is not Irish • Internet Research Agency is a Russian troll farm (lots of employees, millions of fake accounts) used to promote their agenda over a number of years. • One way to build web-credibility is to have a long established site/id that gradually builds a network … may use “real names” and bios also
  • 19. IRA Trolls triggering US events (NYU) In 2016, the technique entailed phony IRA social media personas or groups first announcing events, such as dueling anti-Muslim and pro-Muslim demonstrations that the Russians successfully organized in May 2016 in Houston
  • 20. LikeWar (2018) – some notes • When Trump announced, 58% of his followers were overseas; many were click farms or bots • 30+% of the tweets in the Brexit window were from botnets; ditto the 2016 US election; ditto the Brazilian election • “Tennessee GOP” was one fake user id managed by the Russian trolls; it trended at #7 just before the 2016 election • “Angie Dixson” is a Russian troll who inflamed tension prior to the Charlottesville white supremacist rally “And the beat goes on”
  • 21. The really sophisticated promoters • Combine many components – Web sites for (fake) organizations – Sock puppet “Friends” – Email lists and postings – Real world “events” – Fake news stories and sites – Paid placements with micro targeting – “Clickbait” ideally trending and going viral Trolls, sockpuppets and bots – oh my!
  • 22. Psych Traits Social Media Exploits 1. Confirmation bias 2. Availability bias 3. Emotive bias 4. Repetition bias 5. Visual bias 6. Belonging/“herd” bias 7. Social acceptability
  • 23. Attention “grabbers” (Emotive bias) Fear and Outrage • “Race to the bottom of the brainstem” (Tristan Harris, ex-Google ethics manager) – Likely to trigger a “click thru” – Likely to be shared • The “paid placements” (ads, “news feed positioning”, etc.) headlines are selected knowing this • “Trending” is used to present current “hot” content to target communities to build engagement Truth, accuracy, credibility, etc. are NOT a factor
  • 24. Implementing a Social Media Campaign 1. ID macro audience 2. Goals (participation, persuasion, activation, GOTV, discouragement) 3. Collect “big data” 4. In-depth surveys (10,000+) – “computational linguistics” 5. Segmentation 6. Mine big data – “Lookalike builder” (Facebook) 7. Micro-target segments – tailored, cheap, confidential, “under the radar” 8. Conversion Rate Optimization – most effective slogan; A/B testing, constantly fine-tuned (200,000+ posts)
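Step 8, Conversion Rate Optimization, is mechanically simple: serve competing message variants, measure click rates, and shift traffic toward whatever performs best. The sketch below uses an epsilon-greedy strategy as one plausible way to do this; the ad names and audience click rates are entirely invented:

```python
# Hedged sketch of A/B-style Conversion Rate Optimization.
import random

random.seed(0)  # fixed seed so the simulation is reproducible

variants = {"ad_fear": [0, 0], "ad_hope": [0, 0]}  # name -> [clicks, views]

def rate(name):
    clicks, views = variants[name]
    return clicks / views if views else 0.0

def pick_variant(eps=0.1):
    """Epsilon-greedy: mostly exploit the best click rate, sometimes explore."""
    if random.random() < eps:
        return random.choice(list(variants))
    return max(variants, key=rate)

# Simulated audience response; these underlying rates are hypothetical.
true_rate = {"ad_fear": 0.30, "ad_hope": 0.10}
for _ in range(2000):
    name = pick_variant()
    variants[name][1] += 1                      # one more view
    if random.random() < true_rate[name]:
        variants[name][0] += 1                  # simulated click

print({k: round(rate(k), 2) for k in variants})
```

After a few hundred impressions the campaign automatically concentrates spend on the higher-converting message – which is why, as the slide notes, a campaign can "constantly fine-tune" across 200,000+ post variants with no human judging any of them.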
  • 25. “This ad uses dynamic creative, a process whereby advertisers upload multiple image and text options and the best performing combination for the audience is automatically created.” • Facebook Ad Archive note (on a Frosted Flakes ad, listed as a national issue) • i.e., you have no idea which version is being presented to any specific individual or community
  • 26. Micro Targeting • The more specific the target individual(s): – Fewer # to “buy” (lower cost) – More likely the desired impact • Ideal: a single individual as he/she approaches their decision (currently “looking”) --- which requires – real time monitoring of “everyone” and – knowing who they are in detail This is now automated
  • 27. Looking Forward to 2020? Are we there yet? Slides posted, along with syllabus, Resource list and links to related content https://is.gd/AIPolitics
  • 28. Perspective Difference • How did specific initiatives impact the 2016 Trump Campaign? … – Cambridge Analytica – not a factor – Russia – minimal impact • Impact on national discourse – Cambridge Analytica – created visibility for issues – Russia – Driving polarization at every opportunity with “Strategic Intent” and significant financial, staffing and expertise invested
  • 29. Voter Suppression (NYU) Facebook has said that in the weeks before the 2018 midterm election it found and removed 45,000 pieces of voter-suppression content. “Few behaviors strike as directly at the heart of democracy as confusing or bullying people who are entitled to vote. The social media companies will have to remain vigilant on this front.” [An interesting First Amendment question]
  • 30. 2020 Political Algorithm • Select set of critical precincts in swing states (or targeted congressional districts) • Identify all persons likely to vote • For each voter: – If on-our-side, then flood with ads that maximize the probability of voting (terrible things “they” will do – tied to the individual’s hot-button issues and matched to their personality) – If not on-our-side, then flood to minimize the probability of voting (at least for a viable opponent) – Track each paid placement/click-bait to see which individuals are reading/sharing; learn what works; more finely profile individuals for the next point of placement/contact – in real time, per target individual
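The branching logic of the slide above – mobilize supporters, demobilize everyone else – can be written as a tiny function. The voter records, issue names, and ad labels are invented for illustration; the point is how little code the decision itself requires once the profiling data exists:

```python
# Minimal sketch of the per-voter targeting decision described above.

def choose_message(voter):
    """Return (goal, ad_theme) for one hypothetical voter profile."""
    if voter["supports_us"]:
        # Mobilize: tie a turnout ad to the voter's own hot-button issue.
        return ("turn_out", f"fear_ad:{voter['top_issue']}")
    # Demobilize: discourage voting for a viable opponent.
    return ("suppress", "cynicism_ad:both_sides_are_bad")

voters = [
    {"id": 1, "supports_us": True,  "top_issue": "taxes"},
    {"id": 2, "supports_us": False, "top_issue": "healthcare"},
]
for v in voters:
    goal, ad = choose_message(v)
    print(v["id"], goal, ad)
```

In a real campaign this loop would run continuously, with the "Track each paid placement" step feeding click data back into each voter's profile before the next contact.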
  • 31. Social Media makes Ads Visible (sort of) • Facebook / Instagram Ad Archive • Google / Youtube ad Transparency • Twitter Transparency site
  • 32.
  • 33. Google Transparency Report, May 2018 to Sept 2019:
– TRUMP MAKE AMERICA GREAT AGAIN COMMITTEE: $6,800,800
– SENATE LEADERSHIP FUND: $5,124,600
– CONGRESSIONAL LEADERSHIP FUND: $4,159,200
– Need to Impeach: $3,596,500
– NRCC: $3,453,900
– DEDICATEDEMAILS.COM: $2,800,900
– NRSC: $2,695,000
– PRIORITIES USA ACTION & SMP: $2,569,200
– DONALD J. TRUMP FOR PRESIDENT, INC.: $2,412,700
– TOM STEYER 2020: $2,255,900
– WARREN FOR PRESIDENT, INC.: $2,223,500
– INDEPENDENCE USA PAC: $2,208,900
– PETE FOR AMERICA, INC: $2,029,200
– BETO FOR TEXAS: $1,944,000
– ONE NATION: $1,740,100
– KAMALA HARRIS FOR THE PEOPLE: $1,430,600
  • 34.
  • 35.
  • 36. Analysis and Reports • 2018: the US Senate received 2 reports on 2016 Russian interference: The IRA and Political Polarization in the United States (from the Oxford (UK) Computational Propaganda Project, Dec 2018, 47 pages); The Tactics & Tropes of the Internet Research Agency (from New Knowledge, a U.S. web security firm; Dec 2018, 101 pages) • Aug. 2019 – NYU did a report based on these, the Mueller Report, and subsequent analysis: Disinformation and the 2020 Election: How the Social Media Industry Should Prepare. Links on website: https://is.gd/AIPolitics
  • 37. From NYU report In its 2019 Worldwide Threat Assessment, the Office of the Director of National Intelligence predicted that next year, Russia and other American adversaries “almost certainly will use online influence operations to try to weaken democratic institutions, undermine U.S. alliances and partnerships, and shape policy outcomes.”
  • 38. NYU recommendations for 2020 – additional steps social media companies should take to prepare for 2020, including: • Detect and remove deepfake videos: Realistic but fraudulent videos have the potential to undermine political candidates and exacerbate voter cynicism. • Remove provably false content in general: The platforms already remove hate speech, voter suppression, and other categories of content; the report recommends that they add one more. • Hire a senior content overseer: Each company needs an executive with clout to supervise the process of guarding against disinformation. • Improve industry-wide collaboration on disinformation: For example, when one platform takes down abusive accounts, others should do the same with affiliated accounts. • Teach social media literacy in a more direct, sustained way: Users have to take responsibility for recognizing false content, but they need more help to do it.
  • 40. NYU Report – Expectations • Deep-fake videos of “candidates” • Digital voter suppression – major partisan goal • Solicitation into “real world” rallies/demonstrations • For-profit firms hired to generate disinformation • Domestic dis-info more prevalent than foreign • Abuse via Instagram – images playing a bigger role than simple text • WhatsApp – encrypted messaging, hard to monitor – WhatsApp has served as a powerful vehicle for disseminating false information during recent presidential elections in both Brazil and India.
  • 41. From NYU The March 2019 Mueller report fleshed out our understanding of how IRA operatives, posing as grassroots U.S. activists, mobilized Americans to participate in dozens of rallies in 2016 and thereafter. By expanding on this strategy in 2020, the Russians would accomplish one of their main goals—translating influence online into real-world discord
  • 42. 2019 • Instagram has 1 billion users – images & video. That Instagram has outperformed Facebook as a Russian engagement machine may ‘indicate its strength as a tool in image-centric memetic warfare,’ according to a report commissioned by the Senate Intelligence Committee • 2.38 billion for Facebook • 2 billion for YouTube – video • 330 million for Twitter • WhatsApp has 1.5 billion users globally, 300 million of them in India and another 120 million in Brazil. In the U.S. it has fewer than 60 million adult users.
  • 43. From NYU – re Iran • Some of the phony Iranian accounts impersonated real Republican political candidates who ran for office in 2018. • The Iranian network promoted anti-Saudi, anti-Israeli, and pro-Palestinian themes. It also expressed support for the 2015 nuclear deal, from which President Trump unilaterally withdrew. Some of the phony personas – using names such as “John Turner” and “Ed Sullivan” – had letters published on these themes in the New York Daily News, Los Angeles Times, and other mainstream media outlets.
  • 44. • Misleading content created by U.S. citizens can be difficult to distinguish from ordinary political expression, which enjoys First Amendment protection from government regulation or prosecution • Private LLCs, protected by Citizens United, are buying ad placements; unlike PACs, they do not have to disclose revenue sources (opening opportunities for foreign and domestic abuse)
  • 45. Removing Fake Accounts Facebook uses improved AI to delete automated fake accounts by the billions: 2.19 billion in just the first three months of 2019. Most of these accounts were blocked within minutes of their creation, preventing them from doing any harm, according to the company. (NYU) Twitter has challenged and taken down millions of fake accounts for similar reasons. And YouTube has done so as well, if to a lesser degree.
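The slide does not describe how Facebook's detection actually works. As an illustration only, here is a toy rule-based screen that flags accounts showing bot-like behavior (posting at inhuman rates right after creation, mass-following). All thresholds, feature names, and accounts are hypothetical, not any platform's real system:

```python
# Toy illustration of rule-based fake-account screening.
# Thresholds and features are hypothetical, not Facebook's actual detector.
from dataclasses import dataclass

@dataclass
class Account:
    name: str
    seconds_since_creation: int
    posts_per_minute: float
    followers: int
    following: int

def looks_automated(acct: Account) -> bool:
    """Flag accounts that post at inhuman rates, especially right after creation."""
    burst_signup = acct.seconds_since_creation < 300      # created minutes ago
    inhuman_rate = acct.posts_per_minute > 10             # faster than any human types
    follow_spam = acct.following > 0 and acct.followers / acct.following < 0.01
    return (burst_signup and inhuman_rate) or (inhuman_rate and follow_spam)

suspects = [
    Account("new_bot_01", 60, 25.0, 2, 5000),    # brand new, 25 posts/min
    Account("human_user", 86400, 0.02, 150, 200),
]
flags = [looks_automated(a) for a in suspects]
print(flags)  # [True, False]
```

In practice the platforms report using machine-learned models over far richer signals; the point of the sketch is only that "blocked within minutes of creation" implies scoring behavioral features at signup time.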
  • 46. Demoting/flagging untrue content • Facebook is considering banning and removing all deepfakes • Generally, the platforms don’t remove content deemed to be false; they typically reduce the prominence of the untrue material, based on the notion that their executives and employees shouldn’t be “arbiters of the truth.” • Facebook, for example, “down-ranks” falsehoods in users’ News Feeds by 80%, labels the content as untrue, and provides context on the topic. (NYU)
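The reported "80% down-ranking" suggests a multiplicative demotion of a ranking score rather than removal. A minimal sketch of that idea, where the scores, field names, and the 0.2 multiplier are illustrative assumptions, not Facebook's actual ranking code:

```python
# Sketch of multiplicative down-ranking for content rated false.
# The 0.2 multiplier models the reported "80% reduction" in prominence;
# field names and scores are hypothetical.
FALSE_RATING_MULTIPLIER = 0.2  # keep 20% of the original score = 80% demotion

def rank_score(base_score: float, rated_false: bool) -> float:
    return base_score * (FALSE_RATING_MULTIPLIER if rated_false else 1.0)

feed = [
    {"id": "viral_falsehood", "base": 0.90, "rated_false": True},
    {"id": "ordinary_post",   "base": 0.50, "rated_false": False},
]
ranked = sorted(feed, key=lambda p: rank_score(p["base"], p["rated_false"]),
                reverse=True)
print([p["id"] for p in ranked])  # ['ordinary_post', 'viral_falsehood']
```

Note the design choice this implies: the falsehood stays on the platform and can still be found, it just loses the algorithmic amplification that would otherwise put it at the top of feeds.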
  • 47. Removing Content (NYU) • Facebook: – Risk of physical violence – Voter suppression – Census interference • YouTube: – Denying events (Holocaust, Sandy Hook, Moon landing) – Demoting miracle cures. “No obvious principle underpins these” (NYU)
  • 48. Archiving Ads for Analysis • Facebook, Twitter and Google/YouTube now require that ad purchasers confirm their identities and locations within the U.S. • All three have created searchable political-advertising databases, which allow anyone to look for patterns in this kind of spending. • The bulk of the distribution is via unpaid channels – shares, retweets, etc. • Fake assertions of “paid for by” are a problem
  • 49. Account suspended on Twitter • Goldstein, who is making her second run for Congress, posted a statement on Twitter that linked Cleveland's 66 percent illiteracy rate to Tuesday's passage of the legislation meant to grant equal access to employment, housing, and public accommodations to lesbian, gay, bisexual, transgender, and queer people. • "If CCC had LITERATE inner-city church attending Black voters following this issue=entirely opposite outcome," her tweet said.
  • 50. Oxford research 2018 • During a 30-day period before the U.S. midterms, 25% of Facebook and Twitter shares related to the election contained “junk news” • Disinformation sharing is down as a percentage, but not in absolute numbers • Individual junk news stories can “hugely outperform even the best, most important professionally produced stories, drawing as much as four times the volume of shares, likes, and comments.”
  • 51. Information Wars: How We Lost the Global Battle Against Disinformation and What We Can Do About It (2019 – R. Stengel*) 1. Prohibit foreign entities from buying online advertising to influence U.S. elections 2. Make the source & buyers of all political ads transparent, informing users why they were specifically targeted 3. Provide more privacy for personal data 4. Remove verifiably & provably false information 5. Get media organizations to agree not to use stolen information from hacks or cyber-meddling 6. Require campaigns to disclose foreign contacts & persons seeking to influence elections 7. Appoint a senior government official to address disinformation & election interference * Richard Stengel was Under Secretary of State for Public Diplomacy in 2016, started Ukraine Task Force => Russian Information Group => Global Engagement Center
  • 52. Congressional Discussions • Honest Ads Act & Deceptive Practices and Voter Intimidation Act • Various privacy protection initiatives • Discussion of anti-trust actions against big tech companies • Discussion of First Amendment implications • Regulating social media companies, e.g., by removing some of the Section 230 protections that shield platforms from liability for user content
  • 53. What’s Next? • Real-time monitoring, by AI technology, to determine what is effective at influencing each individual’s behavior • AI “nudging” … but perhaps with highly effective results • Obvious application: selling a product • Next: selling a political candidate • Then: brainwashing a sufficient portion of the population You can fool some of the people all of the time, and all of the people some of the time … but can you fool a sufficient number of the people enough of the time?
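The "monitor what influences each individual, then optimize" loop described above is, mechanically, a multi-armed bandit problem. A generic epsilon-greedy sketch of how an optimizer could learn which message framing draws the most engagement; the framings, click rates, and parameters are all hypothetical, and this is not any platform's actual system:

```python
# Epsilon-greedy bandit: a generic sketch of engagement optimization.
# Variants, click rates, and epsilon are illustrative assumptions only.
import random

random.seed(0)

variants = ["fear_framing", "hope_framing", "anger_framing"]
shows = {v: 0 for v in variants}
clicks = {v: 0 for v in variants}
# Simulated per-variant click probabilities (unknown to the optimizer):
true_click_rate = {"fear_framing": 0.05, "hope_framing": 0.02, "anger_framing": 0.09}

def choose(eps: float = 0.1) -> str:
    """Mostly exploit the best-performing variant, occasionally explore."""
    if random.random() < eps or not any(shows.values()):
        return random.choice(variants)
    return max(variants, key=lambda v: clicks[v] / shows[v] if shows[v] else 0.0)

for _ in range(20000):  # simulated ad impressions
    v = choose()
    shows[v] += 1
    if random.random() < true_click_rate[v]:  # simulated user reaction
        clicks[v] += 1

best = max(variants, key=lambda v: clicks[v] / shows[v] if shows[v] else 0.0)
print(best)  # with enough impressions, concentrates on the highest-click framing
```

The worrying step the slide points at is swapping the simulated user for a real one: the same loop that optimizes product ads can, unchanged, optimize political persuasion per individual.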
  • 54. Some resources https://is.gd/AIPolitics • McNamee, Roger; Zucked: Waking Up to the Facebook Catastrophe (2019) • Singer & Brooking; LikeWar: The Weaponization of Social Media (2018) • Friedman, Tom; Thank You for Being Late (2016) • Harari, Yuval Noah; 21 Lessons for the 21st Century (2018) • Linked from website: – Presentations on Cambridge Analytica – Presentations on Facebook & psychographics – Jan 2019 IEEE Institute - emergence of Persuasive Technology – US Senate reports from Dec. 2018 on Russian interference – NYU 2019 study and social media recommendations for 2020, and paper analyzing US online political advertising – Sept 2019 CSPAN/Georgetown U. panel on Social Media & the First Amendment