Big Brother is manipulating you: a stochastic
model of opinion control in social networks

Paolo Bolzern (Politecnico di Milano)
Patrizio Colaneri (Politecnico di Milano)
Giuseppe De Nicolao (Università di Pavia)
Outline
• Facebook: facts & figures and a quick primer
• The emotional contagion experiment
• Newsfeed control: from EdgeRank to ...
• A stochastic model of opinion dynamics
• The Leopard Theorem
• Simulation examples
• What’s next?
Basic ingredients
• The Newsfeed: the posts that you see when you log in
• Your personal page: you can publish your posts, and they might enter the Newsfeeds of your friends
• You can ask for friendship and admission to groups
• You can write on the pages of friends and groups
• You can put “like”, “angry”, “love” on posts
• ...
• Facebook’s News Feed—the main list of status updates, messages, and photos you see when you open Facebook on your computer or phone—is not a perfect mirror of the world.
• But few users expect that Facebook would change their News Feed in order to manipulate their emotional state.
• We now know that’s exactly what happened two years ago. For one week in January 2012, data scientists skewed what almost 700,000 Facebook users saw when they logged into its service. Some people were shown content with a preponderance of happy and positive words; some were shown content analyzed as sadder than average. And when the week was over, these manipulated users were more likely to post either especially positive or negative words themselves.
• This tinkering was just revealed as part of a new study, published in the prestigious Proceedings of the National Academy of Sciences.
“Two parallel experiments were conducted for positive
and negative emotion: One in which exposure to
friends’ positive emotional content in their News Feed
was reduced, and one in which exposure to negative
emotional content in their News Feed was reduced. In
these conditions, when a person loaded their News
Feed, posts that contained emotional content of the
relevant emotional valence, each emotional post had
between a 10% and 90% chance (based on their User
ID) of being omitted from their News Feed for that
specific viewing.”
Remarkable and unremarkable ...
A basic notion (that should be) taught in every statistics course: the difference between statistical and practical significance.
the false belief that [statistically] significant results are automatically big and important
The significance fallacy
eliminating a substantial proportion of emotional content from
a user’s feed had the monumental effect of shifting that user’s
own emotional word use by two hundredths of a standard
deviation. In other words, the manipulation had a negligible
real-world impact on users’ behavior. To put it in intuitive terms,
the effect of condition in the Facebook study is roughly
comparable to a hypothetical treatment that increased the
average height of the male population in the United States by
about one twentieth of an inch (given a standard deviation of
~2.8 inches). Theoretically interesting, perhaps, but not very
meaningful in practice.
Tal Yarkoni
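The height analogy above is a quick unit conversion; a minimal sanity check, assuming the figures quoted in the passage (an effect of ~0.02 standard deviations and an SD of ~2.8 inches for US male height):

```python
# Convert the reported effect size into the height analogy from the quote.
# Figures are those quoted above: effect ~0.02 SD, male-height SD ~2.8 inches.
effect_in_sd = 0.02
height_sd_inches = 2.8

# The equivalent shift in average height: about 0.056 inches,
# i.e. roughly one twentieth of an inch, as the quote says.
shift_inches = effect_in_sd * height_sd_inches
```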
if the idea that Facebook would actively try to manipulate your
behavior bothers you, you should probably stop reading this
right now and go close your account. You also should
definitely not read this paper suggesting that a single social
message on Facebook prior to the last US presidential election
may have single-handedly increased national voter turnout
by as much as 0.6%.
Tal Yarkoni
“Our results suggest that the Facebook social message increased turnout directly by about 60,000 voters and indirectly through social contagion by another 280,000 voters, for a total of 340,000 additional votes”
Markov chain model of a Facebook user
[Figure: an agent node linked to Friend #1–Friend #4]
Node = Agent = Markov Chain
arc = friendship
State of the chain = opinion
The stochastic social network
[Figure: a network of agents, each linked to its friends]
Node = Agent = Markov Chain
State of the chain = opinion
The	stand-alone	model
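The formulas on this slide did not survive text extraction; as a hedged sketch (the symbols below are my notation, not necessarily the authors’), the stand-alone agent is a continuous-time Markov chain over the M opinions:

```latex
\dot{\pi}(t) = \pi(t)\,\Lambda, \qquad
\Lambda_{ij} \ge 0 \ (i \ne j), \qquad \sum_{j} \Lambda_{ij} = 0,
```

where π(t) is the row vector of the agent’s opinion probabilities and Λ is the transition-rate matrix; the steady-state stand-alone distribution referenced later is the stationary π̄ satisfying π̄Λ = 0.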
The interaction model
Linear emulative interaction: the instantaneous transition rates to opinion j undergo an increase which is proportional, through the constant λ_j, to the fraction of neighbors that share opinion j.
The case when all λ_j are equal goes under the name of unbiased influence.
[Figure: influenced agent with four friends, three of whom hold opinion 2]
influence = λ_2 × 3/4
[Figure: influenced agent with four friends, one of whom holds opinion 1]
influence = λ_1 × 1/4
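The two worked examples above can be checked with a small helper (the function name and the dict of intensities are illustrative, not from the slides): the emulative boost toward opinion j is λ_j times the fraction of friends currently holding opinion j.

```python
def emulative_rate_boost(neighbor_opinions, lam, j):
    """Extra transition rate toward opinion j induced by neighbors.

    Linear emulative interaction: the boost equals lam[j] (the
    intensity lambda_j) times the fraction of neighbors holding j.
    """
    share = sum(1 for o in neighbor_opinions if o == j) / len(neighbor_opinions)
    return lam[j] * share

# The slide examples: four friends, three holding opinion 2, one opinion 1.
lam = {1: 1.0, 2: 1.0}  # hypothetical intensities
assert emulative_rate_boost([2, 2, 2, 1], lam, 2) == lam[2] * 3 / 4
assert emulative_rate_boost([2, 2, 2, 1], lam, 1) == lam[1] * 1 / 4
```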
Markov property
• The aggregate of all agents that forms the stand-alone model is itself a (bigger) Markov chain, whose state is the Cartesian product of the agents’ states.
• What happens if we assume linear emulative interaction?
• Answer: the overall network is still a time-homogeneous Markov chain (good news)
• Bad news: the state space of the master Markov model with M opinions has dimension M^N, which soon renders its use prohibitive for the evaluation of the probability distribution as the number N of agents increases.
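The M^N blow-up is easy to quantify (a minimal illustration; the helper name is mine):

```python
def master_state_space_size(M, N):
    """Number of joint opinion configurations: each of N agents picks one of M opinions."""
    return M ** N

# Even binary opinions (M = 2) become intractable quickly:
assert master_state_space_size(2, 10) == 1_024
assert master_state_space_size(2, 30) == 1_073_741_824  # ~1e9 joint states
```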
The “Leopard Theorem”
“everything changes so that nothing changes”
(quote from the Italian novel “The Leopard” by G. Tomasi di Lampedusa)
The “Leopard Theorem”
Theorem: For the unbiased-influence linear emulative model with an arbitrary network topology and identical agents, the probability distributions of the agents’ opinions asymptotically reach a steady-state consensus given by the steady-state stand-alone distribution, independently of the initial probability distribution.
Moreover, if the initial probability distributions of all agents’ opinions are equal, the equality with the stand-alone probabilities also holds in the transient.
nothing changes?
Not quite: w.r.t. the stand-alone model, the opinions of the agents are now correlated.
The expectation of polls does not change ...
... but the variance changes
The peer assembly
A special case: the peer assembly
• N identical individuals
• binary opinions (M = 2)
• interconnected by a complete graph (each individual communicates with all others)
• the initial opinions of the agents are independent and identically distributed random variables
• linear emulative interaction is assumed
Such a model can be “lumped” into a birth-death Markov process, so that formulas are available for:
– marginal distributions
– mean and variance
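A minimal sketch of the lumping, under stated assumptions (the binary stand-alone rates q01, q10 and the single unbiased intensity lam are my notation, not the authors’): the lumped state k counts the agents holding opinion 1, giving a birth-death chain whose stationary distribution follows from detailed balance. The final loop illustrates the Leopard Theorem: the stationary fraction of opinion-1 agents equals the stand-alone probability q01/(q01+q10), whatever the value of lam.

```python
from math import isclose

def peer_assembly_stationary(N, q01, q10, lam):
    """Stationary distribution of the lumped birth-death chain.

    State k = number of agents holding opinion 1 (k = 0..N), for a
    complete graph of N identical binary agents with unbiased linear
    emulative interaction of intensity lam.
    """
    def birth(k):   # one of the N-k agents in opinion 0 switches to 1
        return (N - k) * (q01 + lam * k / (N - 1))
    def death(k):   # one of the k agents in opinion 1 switches to 0
        return k * (q10 + lam * (N - k) / (N - 1))

    # Detailed balance for birth-death chains: pi_k ∝ prod_{i<=k} birth(i-1)/death(i)
    w = [1.0]
    for k in range(1, N + 1):
        w.append(w[-1] * birth(k - 1) / death(k))
    Z = sum(w)
    return [x / Z for x in w]

# Leopard Theorem check: the stationary mean fraction of opinion-1 agents
# equals the stand-alone stationary probability q01/(q01+q10) for any lam.
N, q01, q10 = 20, 2.0, 1.0
for lam in (0.0, 0.5, 5.0):
    pi = peer_assembly_stationary(N, q01, q10, lam)
    mean_frac = sum(k * p for k, p in enumerate(pi)) / N
    assert isclose(mean_frac, q01 / (q01 + q10))
```

The variance of the lumped state, by contrast, does depend on lam, consistent with the “expectation does not change, variance does” remark above.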
[Simulation plots: “Unbiased influence”, “Herd behaviour” (unilateral), “Arm wrestling”, “Up and down”]
Beyond the peer assembly: other topologies
[Simulation plots for: noninteracting, complete, star, small-world]
Conclusions 1/2
• Agents as interacting Markov chains
• The overall system is still a Markov chain
• Unbiased influence on identical agents: affects the variance but not the mean
• Analytical results for the peer assembly
• Easy to simulate other topologies
• Next: agents’ models drawn from a distribution?
Conclusions 2/2
• Filters controlling users’ newsfeeds act as influence intensities λ
• Invisible bias on interactions
• Power to shape public opinion
• Control by Facebook is already justified as a countermeasure against fake news and “influence operations” by governments (Russia?) and non-state organizations
• Lack of transparency
Information Operations
and Facebook
By Jen Weedon, William Nuland and Alex Stamos
April 27, 2017
Version 1.0
As our CEO, Mark Zuckerberg, wrote in February 2017:
“It is our responsibility to amplify the good effects and mitigate the bad -- to
continue increasing diversity while strengthening our common understanding so
our community can create the greatest positive impact on the world.”
«In brief, we have had to expand our security
focus from traditional abusive behavior, such
as account hacking, malware, spam and
financial scams, to include more subtle and
insidious forms of misuse, including attempts
to manipulate civic discourse and deceive
people. These are complicated issues and
our responses will constantly evolve, but we
wanted to be transparent about our
approach. The following sections explain our
understanding of these threats and
challenges and what we are doing about
them.»
Thanks!