Introduction to the Theory of Complex Systems
Stefan Thurner
Thurner Intro to the theory of CS KCL london mar 20, 2023 1 / 55
About this course
This course is an introduction to the theory of complex systems. We
learn about principles and technical tools to understand, deal with, and
manage CS. What are CS? The answer is not unique; we will try to give a
practical and useful one.
The science of CS is about 30 years old. Its state of development is maybe
best compared to the state of quantum mechanics in 1920, when no
comprehensive literature on QM existed and it was just a collection of
bits and pieces. There has not yet been a 'Copenhagen' for CS.
However, a pattern is emerging of what a theory of CS must include.
This lecture is an attempt to show this pattern. It is a bit like discovering
a Greek mosaic that is not yet fully excavated: parts are missing, but the
main characters are already clearly visible and one understands the
picture.
Course outline
Introduction
Scaling and driven systems
Statistical mechanics and information theory for CS
Systemic risk in financial networks and supply chains
Literature
This course will follow
S. Thurner, P. Klimek, R. Hanel
Introduction to the Theory of Complex Systems
Oxford University Press, 2018
Graphics, unless mentioned otherwise, were produced by the authors
What are CS?
What are CS?
What are CS?
To provide a feeling for what complex systems are, let us first remind
ourselves of three other questions:
• what is physics?
• what are biology and the life sciences?
• what are the social sciences?
By describing what these sciences are – and what they are not – we should
develop a feeling for what complex systems are
The aim is to make clear that the science of CS is a non-trivial combination of
three disciplines, and that this mix becomes something like a discipline in
itself
What are CS?
What is physics?
Physics is the experimental, quantitative, and predictive science of matter and
its interactions
quantitative statements are made with numbers: less ambiguous than
words
predictive means that statements are given in the form of predictions which
can be experimentally tested
Basically one asks nature specific questions in the form of experiments – and,
in fact, one gets answers. This methodology is unique in world history: the
scientific method
What are CS?
What is physics?
Matter                  Relevant interaction types      Length scale
Macroscopic matter      gravity, electromagnetism       all ranges
Molecules               electromagnetism                all ranges
Atoms                   electromagnetism, weak force    ∼ 10^-18 m
Hadrons & leptons       electromagnetism, weak/strong   10^-18 – 10^-15 m
Quarks & gauge bosons   electromagnetism, weak/strong   10^-18 – 10^-15 m
All interactions in the physical world are mediated by the exchange of gauge
bosons. For gravity the situation is not yet experimentally established.
What are CS?
What is physics?
The nature of fundamental forces
Typically the four fundamental forces act homogeneously and
isotropically in space and time. There are famous exceptions, however:
the strong interaction acts as if it were confined to a 'string' – similar
to type II superconductivity.
These interactions work on very different scales, from light years to
femtometers: this means that one typically has to consider only one force for a
given phenomenon of interest. One force can be 100, 10^6, or up to
10^39 times stronger than another.
Traditionally physics does not specify which particles interact with each
other. Usually they all interact equally; the interaction strength depends
on the interaction type and the form of the potential.
What are CS?
What is physics?
What does predictive mean?
Assume you can do an experiment over and over again, e.g. drop a stone.
The theoretical task is to predict the reproducible results.
Since Newton, physics follows this recipe:
Find the equations of motion that encode your understanding of a dynamical
system,
dp/dt = F(x) ,   p = m dx/dt
Predictive means: once F is specified, the problem is solved, provided the
initial and/or boundary conditions are known. The result is x(t)
Compare the result with your experiments
Note: fixing initial and boundary conditions means taking the
system out of its context – out of the rest of the universe. This is why physics
works!
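The Newtonian recipe above can be sketched numerically. A minimal example, under illustrative assumptions (constant gravity, no air resistance, a simple symplectic Euler integrator); the function name and parameters are hypothetical choices, not from the slides:

```python
# Minimal numerical sketch of the Newtonian recipe: specify the force F,
# fix the initial conditions, and integrate dp/dt = F(x), p = m dx/dt.
# Assumptions (illustrative): constant gravity, no air resistance,
# symplectic Euler integration.

def drop_stone(x0=0.0, v0=0.0, g=-9.81, m=1.0, dt=1e-4, t_end=1.0):
    """Return the position x(t_end) of the stone."""
    x, p = x0, m * v0
    for _ in range(round(t_end / dt)):
        p += m * g * dt       # dp/dt = F = m*g
        x += (p / m) * dt     # dx/dt = p/m
    return x

# Compare with the exact solution x(t) = x0 + v0*t + g*t**2/2:
# after 1 s the stone has fallen about 4.905 m.
```

Fixing x0, v0, and F completely determines x(t): this is prediction in the Laplace-Newton sense.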
What are CS?
What is physics?
The same philosophy holds for arbitrarily complicated systems. Assume a
vector X(t) represents the state of a system (for example, all positions and
momenta); then we get a set of equations of motion of the form
dX(t)/dt = G(X(t))
Predictive means that in principle you can solve these equations. However,
they can be hard to solve.
Already for three bodies this becomes a hard task – the famous three-body
problem (sun, earth, moon)
Laplace: a demon knowing all the initial conditions and able to solve all
equations could predict everything
The problem is that this demon is hard to find. In fact, the Newton-Laplace
program becomes completely useless for most systems. So are these
systems not predictable?
What are CS?
What is physics?
Consider water and repeat the following experiment over and over again:
cool it to 0 °C – it freezes
heat it to 100 °C – it boils, almost certainly (under standard conditions)
One can maybe measure the velocity of a single gas molecule at a point in
time, but not of all O(10^23) at the same time. But one can compute the
probability that a gas molecule has a velocity v,
p(v) ∝ exp( − m v^2 / 2kT )
For many non-interacting particles these probabilities become extremely
precise, and one can make predictions about aggregates of particles.
Necessary conditions: interactions are weak and the number of particles is large.
Note that computing the freezing temperature of water was impossible
before simulations.
Note: the word prediction now has a much weaker meaning than in the
Laplace-Newton sense. The concept of determinism is diluted.
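The probabilistic prediction above can be illustrated by sampling. A minimal sketch, in units where m = kT = 1 (an illustrative convention), checking equipartition; the function name is hypothetical:

```python
# Sketch: the probabilistic prediction becomes sharp for many particles.
# Sample n one-dimensional velocities from p(v) ∝ exp(-m v^2 / 2kT)
# (a Gaussian with variance kT/m) and check equipartition: the mean
# kinetic energy per component approaches kT/2.
# Units chosen so that m = kT = 1 (illustrative).
import random

def mean_kinetic_energy(n=200_000, m=1.0, kT=1.0, seed=42):
    rng = random.Random(seed)
    sigma = (kT / m) ** 0.5   # standard deviation of one velocity component
    return sum(0.5 * m * rng.gauss(0.0, sigma) ** 2 for _ in range(n)) / n

# For large n the estimate concentrates around kT/2 = 0.5.
```

The statement about a single molecule is only probabilistic, but the aggregate quantity is predicted with high precision.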
What are CS?
What is physics?
The triumph of statistical mechanics
The idea of statistical mechanics is to understand the macroscopic
properties of a system from its microscopic components: relate the
micro- with the macro world
Typically in physics the macroscopic description is simple and
corresponds to the phase the system is in (solid,
liquid, gaseous).
Physical systems often have very few phases. A system is often
prepared in one macro state (e.g. temperature and pressure are given).
There are usually many possible microstates that are related to that
macro state.
In statistical mechanics the main task is to compute the probabilities of
the many microstates that lead to that single macro state
What are CS?
What is physics?
Traditional physics works fine for a few particles (Newton-Laplace) and for
many non-interacting particles (Boltzmann-Gibbs).
In other words, the class of systems that can be understood by physics is not
so big.
There were further, more severe shifts in the concept of predictability
What are CS?
What is physics?
What does prediction mean? The crises of physics
Prediction in the 18th century is quite different from the concept of prediction
in the 21st:
classical physics: exact prediction of trajectories
→ crisis 1900: too many particles →
statistical physics: laws of probability allow stochastic predictions of the
macro (collective) behavior of gases, assuming trajectories are
predictable in principle
→ crisis 1920s: concept of determinism evaporates completely →
QM and non-linear dynamics: unpredictable components – collective
phenomena remain predictable
→ crisis 1990s: cannot deal with strong interactions in statistical systems →
Complex Systems: the situation can be worse than in QM: unpredictable
components and complicated interactions – the hope is that the collective is still
predictable
What are CS?
What is physics?
Physics is analytic, complex systems are algorithmic
Physics largely follows an analytical paradigm. Knowledge of phenomena is
expressed in analytical equations that allow us to make predictions. This is
possible because interactions in physics do not change over time.
This is radically different for complex systems: interactions themselves can
change over time. In that sense, complex systems change their internal
interaction structure as they evolve: co-evolution.
Systems that change their internal structure dynamically can be viewed as
machines and are best described as algorithms: a list of rules for how the
system updates its states and future interactions.
Many complex systems work like this: the states of components and the
interactions between them are updated simultaneously, which leads to
tremendous mathematical difficulties. Whenever it is possible to ignore the
changes in the interactions of a dynamical system, analytic descriptions
become meaningful.
Physics is analytic; complex systems are algorithmic. Experimentally testable
quantitative predictions can be made with analytic or algorithmic descriptions
What are CS?
What is physics?
What are complex systems from a physics point of view?
many particle systems (as in statistical physics)
may have stochastic components and elements (as e.g. in QM)
interactions may be specific between two objects (networks)
interactions may be complicated and co-evolving: states and interactions
they are often chaotic and driven systems
interacting bodies are not limited to matter
interactions are not limited to the 4 fundamental forces
they can show very rich phase structure
they can have many macro states, even simultaneously realized
Most physicists will not have a problem calling these extensions of traditional
physics CS. For them it is still physics. The prototype models of CS in physics
are the so-called spin glasses.
What are CS?
What is physics?
This is not what we call Complex Systems yet – there are crucial concepts
missing
However, due to more specific interactions and the increased variety of
types of interactions, the variety of macroscopic states changes
drastically. These emerge from the properties of the system’s
components and the interactions. The phenomenon that a priori
unexpected properties may arise as a consequence of the generalized
interactions is sometimes called emergence. Such CSs can have an
extremely rich phase structure.
When there is a plurality of macro states in a system, this
leads to entirely new questions one can ask of the system:
• What is the number of macro states?
• What are their co-occurrence rates?
• What are the typical sequences of occurrence?
• What are their lifetimes?
• What are the transition probabilities? etc.
What are CS?
What is physics?
What we definitely want to keep from physics
CS is the experimental, quantitative and predictive science of generalized
matter with generalized interactions
Generalized interactions are described by the interaction type α and by who
interacts with whom. If there are more than 2 objects involved, interactions
are conveniently encoded in networks,
M^α_ij(t)
Interactions themselves remain based on the concept of exchange. For example,
think of communication, where messages are exchanged; trade, where goods
and services are exchanged; friendships, where wine bottles are exchanged,
etc.
For many CS the framework of physics is incomplete. What are the missing
concepts? Equilibrium, co-evolution, the adjacent possible, ...
What are CS?
A note on chemistry
A note on chemistry – the science of equilibria
In chemistry, interactions between atoms and molecules can already be
quite specific. So why is chemistry usually not a CS?
Classically, chemistry is based on the law of mass action – many
particles interact so as to reach an equilibrium.
αA + βB ⇌ σS + τT
α, β, σ, τ are the stoichiometric coefficients; k+, k− are the reaction rates
forward reaction rate: k+ {A}^α {B}^β   ({.} denotes reactive mass)
backward reaction rate: k− {S}^σ {T}^τ
In equilibrium the two are equal, which gives
K = k+ / k− = {S}^σ {T}^τ / ( {A}^α {B}^β )
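The equilibrium condition above can be checked numerically. A minimal sketch for the toy reaction A + B ⇌ S (all stoichiometric coefficients 1; rate constants and initial concentrations are illustrative values, not from the slides):

```python
# Numerical sketch of the law of mass action for the toy reaction
# A + B ⇌ S. Euler-integrating the rate equations to their fixed point
# should reproduce K = k+/k-.

def equilibrate(kp=2.0, km=1.0, A=1.0, B=1.0, S=0.0, dt=1e-3, steps=100_000):
    """Integrate dA/dt = dB/dt = -kp*A*B + km*S, dS/dt = +kp*A*B - km*S."""
    for _ in range(steps):
        dA = (km * S - kp * A * B) * dt
        A += dA
        B += dA
        S -= dA
    return A, B, S

A, B, S = equilibrate()
K_measured = S / (A * B)   # at the fixed point this equals k+/k- = 2
```

The fixed point of the rate equations is exactly the equilibrium of the law of mass action; driven systems lack such a fixed point.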
What are CS?
A note on chemistry
Many CS are characterized by the fact that they are out of equilibrium
(driven). This means that there are no fixed-point-type equations that can
be used to solve the problem.
In this situation the help from statistical mechanics becomes very
limited. It is hard to handle systems that are out of equilibrium.
Even so-called stationary non-equilibrium is hard to understand –
even computationally (thermostats)
On the positive side: many CS are self-organized critical
Let's keep from chemistry:
Many CS are out of equilibrium
Many CS are non-ergodic
Note: as soon as one focuses on, e.g., cyclical catalytic reactions on networks,
chemistry quickly becomes a CS
What are CS?
What is Biology?
What are CS?
What are CS?
What is Biology?
What is biology?
Life science is the experimental science of living matter.
What is living matter?
What are the minimal conditions for living matter to exist?
According to S.A. Kauffman, living matter
has to be self-replicating
has to run through at least one Carnot cycle
has to be localized
What are CS?
What is Biology?
Living matter is a self-sustained sequence of genetic activity over a lifetime.
It uses energy and performs work. It is constantly out of equilibrium.
Genetic activity has to do with chemical reactions which take place e.g.
within cells (compartments). Unfortunately, this is only partly true:
chemical reactions usually involve billions of atoms or molecules, while
what happens in the cell is chemistry with few molecules. With only a
few molecules, problems arise:
• the law of large numbers becomes inappropriate
• the laws of diffusion become inadequate
• the concept of equilibrium becomes shaky
• without equilibrium, what is the law of mass action?
If we do not have a law of mass action, how is chemistry to be done?
Consequently, traditional chemistry is often rather inadequate for living matter
What are CS?
What is Biology?
More complications in the cell:
Molecules may be transported from their site of production to where they are
needed. This changes the law of diffusion even more; it becomes
anomalous diffusion,
d/dt p(x,t) = D d^(2+ν)/dx^(2+ν) p(x,t)
Chemical binding depends on the 3D structure of the molecules.
Chemical binding depends on the 'state' of the molecules, e.g. whether they are
phosphorylated or not.
'Reaction rates' – if one still wants to use this term – depend on the
statistical mechanics of small systems, i.e. fluctuation theorems might
become important.
What are CS?
What is Biology?
Biological interactions happen on networks – almost exclusively
Genetic regulation governs the temporal sequence of abundance of
proteins, nucleic material and metabolites within a living organism
Genetic regulation can be viewed as a discrete interaction
Protein-protein binding is discrete, for example complex formation
Discrete interactions are described by networks:
gene-regulatory network (e.g. Boolean)
metabolic network
protein-protein network
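Gene-regulatory networks of the Boolean kind mentioned above can be sketched in a few lines. The three 'genes' and their update rules here are purely illustrative, not a real regulatory circuit:

```python
# Sketch of a Boolean gene-regulatory network, a common discrete model
# of genetic regulation. Genes are on/off; each synchronous update
# applies an illustrative Boolean rule to the current states.

def step(state):
    a, b, c = state
    return (
        not c,       # gene A is repressed by C
        a,           # gene B is activated by A
        a and b,     # gene C requires both A and B
    )

def trajectory(state, n):
    """Synchronously update the network n times; return all visited states."""
    states = [state]
    for _ in range(n):
        state = step(state)
        states.append(state)
    return states

# With only 2^3 = 8 possible states, every trajectory must end up on an
# attractor cycle: the discrete analogue of a macro state. This
# particular rule set returns to (True, False, False) after 5 steps.
```

The attractors of such discrete dynamics are often interpreted as stable expression patterns, e.g. cell types.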
What is Biology?
Evolution
Nothing in biology makes sense except in the light of evolution. (Dobzhansky)
Genetic material and the process of replication involve several stochastic
components that lead to variations in the copies. Replication and variation are
two of the three main ingredients of all evolutionary processes.
What is evolution? Remember the Darwinian story:
Consider a population of some kind. The offspring of this population show some
random variations (e.g. mutations). Individuals with optimal variations (given
a certain environment) have a selection advantage, i.e. fitness. This fitness
manifests itself in these individuals having more offspring, and thus
passing the particular variation on to a new generation. In this way 'optimal'
variations get selected over time.
Is this definition predictive science? Or is it just a convincing story?
Is the Darwinian story falsifiable? How can we measure fitness? It is an a
posteriori concept. Survival of the fittest ≡ survival of those who survive.
What is Biology?
Evolution
Evolution is a three-step process:
1 A new thing comes into being in a given environment.
2 The new thing has the chance to interact with its environment. The result of
this interaction: the possibility of being selected or destroyed.
3 If the new thing gets selected (survives) in this environment, it becomes
part of this environment – part of the new environment for all
future, newly arriving things.
Evolution happens simultaneously on various scales (time & space): cells –
organisms – populations
What is Biology?
Evolution
Evolution is not physics.
If you think of this three-step process in terms of equations of motion:
1 Write down the dynamics of the system in the form of equations of
motion.
2 The boundary conditions depend on these equations – you cannot fix them.
3 Consequently, you cannot solve the equations.
4 The Newtonian recipe breaks down; the program becomes a
mathematical monster once the boundary conditions are dynamically
coupled to the dynamical system.
The task is: try to solve it nevertheless. We will see that multi-scale methods
can be used to address this type of problem.
What is Biology?
Evolution
The concept of evolution is fundamentally different from physics.
It is immediately evident that we are confronted with two huge problems:
Boundary conditions cannot be fixed.
Phase space is not well defined – it changes over time. New elements
may emerge that change the environment substantially.
The evolutionary aspect is essential for many CS; it cannot be neglected.
What is Biology?
Evolution
Evolution is a natural phenomenon.
Evolution is a process that increases diversity; it looks as if it is a
'creative' process.
Evolution has many forms of appearance – it is 'universal': biological
evolution, technological innovation, economics, financial markets, history,
etc.
Evolution follows patterns, regardless of its form of appearance.
These patterns are surprisingly robust. For example: wherever you look,
there is life. Ecological niches that can get filled will get filled. Power
laws.
As a natural phenomenon it deserves a scientific explanation
(quantitative & predictive).
What is Biology?
Evolution
A side note on evolution as a natural phenomenon.
Natural phenomena are to a certain degree predictable, i.e. one can trust
patterns: drop a stone and it will fall down; if water gets cold it freezes, etc.
Credo in the natural sciences: one can trust in the fact that natural
phenomena work. If this is the case, there must be basic laws on
scientific grounds.
With almost certainty: no ecological niche stays empty; life is creative and
keeps going regardless of catastrophes, etc. Certain statistical
patterns in evolutionary time series are repeated over and over again
– regardless of the specific system.
It will not be possible to predict what animals will live on earth in 500,000
years. But it should be possible to make falsifiable predictions on
systemic quantities such as diversity, diversification rates, robustness,
resilience, and adaptability. See our discussion on predictability.
Missing: ways to predict these systemic features quantitatively.
What is Biology?
Evolution
A language for evolutionary processes – the adjacent possible: an
algorithmic view
This 'science' is at its very beginning. Maybe not even the right language has
been established so far.
The adjacent possible (AP) is the set of all possible worlds that could
potentially exist in the next timestep. The adjacent possible depends
strongly on the present state of the world.
With this definition, evolution is a process that continuously fills the AP.
This is different from physics, where a given state determines the next state
and all potential states are known.
In evolution, a given state determines the realization of a (huge) set of
possible states. The future states may not even be known at all –
a 'creative' process.
The filling of the adjacent possible determines the next adjacent possible.
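The adjacent possible can be made concrete with a toy model, under the illustrative assumption that 'things' are combinations of primitive parts and that one pairwise combination step defines the AP of the current world; all names and the 3-part universe are hypothetical:

```python
# Toy sketch of the adjacent possible (AP): the world is a set of
# elements, and the AP is everything producible by combining two
# existing elements. Realizing part of the AP changes the next AP.
from itertools import combinations

def adjacent_possible(world):
    """All new elements reachable by combining two existing elements."""
    return {a | b for a, b in combinations(world, 2)} - world

world = {frozenset({1}), frozenset({2}), frozenset({3})}
ap_sizes = []
for _ in range(3):
    ap = adjacent_possible(world)
    world |= ap               # filling the AP ...
    ap_sizes.append(len(ap))  # ... changes the next AP

# With 3 primitive parts the AP first grows the world (3 new pairs,
# then 1 new triple) and finally saturates: ap_sizes == [3, 1, 0].
```

In this toy universe the AP eventually saturates; in open-ended evolution, new primitives keep appearing and the AP never empties.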
What is Biology?
Evolution
What we have learned about evolutionary processes:
One cannot fix the boundary conditions of evolutionary systems. This
means that it is impossible to take the system apart without possibly
losing critical properties. Here the triumphant concept of reductionism
starts to become inadequate.
Evolutionary CS feel their boundary conditions.
Evolutionary CS change their boundary conditions.
In physics the adjacent possible is very small. For example, imagine a
falling stone: the AP is that it is on the floor in two seconds. There are
practically no other options.
Lagrangian trajectories are completely specified by initial & end points. In
evolution the AP evolves: AP(t) → AP(t + 1) → AP(t + 2) → · · ·
In physics, the realization of the AP does (almost) not influence the next AP.
What are CS?
What is Biology?
Biological systems are adaptive and robust – the concept of the edge of chaos
The possibility to adapt and robustness seem to exclude each other.
However, living systems are clearly adaptive and robust at the same time. To
explain how this is possible one can take a picturesque view:
Every dynamical system has a maximal Lyapunov exponent. It measures
how two initially infinitesimally close trajectories diverge over time.
The exponential rate of divergence is the Lyapunov exponent λ,
|δX(t)| ∼ e^(λt) |δX(0)|
where δX(0) is the initial separation
If the exponent is positive, the system is called chaotic, or strongly mixing
If the exponent λ is negative, the system approaches an attractor – two
initially infinitesimally adjacent trajectories converge. The system is
periodic
If the exponent is zero, the system is called quasi-periodic, or 'at the
edge of chaos'
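The Lyapunov exponent above can be estimated numerically. A minimal sketch for the logistic map, a standard textbook example not taken from the slides, using the orbit average of log|f'(x)|:

```python
# Numerical sketch: estimate the maximal Lyapunov exponent of the
# logistic map x -> r*x*(1-x) as the orbit average of log|f'(x)|,
# with f'(x) = r*(1-2x). Initial condition and iteration counts are
# illustrative choices.
import math

def lyapunov_logistic(r, x0=0.3, n=100_000, burn_in=1_000):
    x = x0
    for _ in range(burn_in):              # discard the transient
        x = r * x * (1.0 - x)
    acc = 0.0
    for _ in range(n):
        acc += math.log(abs(r * (1.0 - 2.0 * x)))
        x = r * x * (1.0 - x)
    return acc / n

# At r = 4 the map is chaotic with lambda = ln 2 ≈ 0.693 > 0;
# at r = 3.2 the orbit is a periodic attractor and lambda < 0.
```

Scanning r shows how rare the λ = 0 points are: they sit at the bifurcation values, a set of measure zero in parameter space.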
What are CS?
What is Biology?
How does nature find the edge of chaos?
The set of points where the Lyapunov exponents are zero is usually of
measure zero. However, evolution seems to find and select these points.
What are CS?
What is Biology?
How does nature find the edge of chaos?
How can a mechanism of evolution detect something of measure zero?
One explanation is self-organized criticality, where systems organize
themselves to operate at a critical point between order and randomness.
SOC can emerge from interactions in different systems, including sand
piles, precipitation, heartbeat, avalanches, forest fires, earthquakes, etc.
Or is this set simply not of measure zero?
Learn more about these questions in the chapter 'Evolution' of the book.
What are CS?
What is Biology?
Biological systems are self-organized and critical
Self-organized systems are dynamical systems that have a critical point
as an attractor. Very often these systems – at a macroscopic scale – are
characterized by scale invariance, the absence of a characteristic
(length) scale.
A critical point of a system is reached at conditions (temperature,
pressure, slope in a sandpile, etc.) where the characteristic length scale
(e.g. the correlation length) diverges.
This is typical of slowly driven systems, where driven means that they are
pushed softly away from equilibrium.
Applications cover all the sciences:
Physics: particle, geo-, plasma-, and solar physics, cosmology, quantum gravity, ...
Biology: evolutionary biology, ecology, neurobiology, ...
Social sciences: economics, sociology, ...
What are CS?
What is Biology?
An intuition for self-organized critical systems—sandpile models
Imagine a pile of sand.
If the slope is too steep, avalanches go off → the slope becomes flatter
If the slope is too flat, sand gets deposited → the slope becomes steeper
The pile self-organizes toward a critical slope. The system is robust and
adaptive at the same time.
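The sandpile picture can be sketched as a minimal Bak-Tang-Wiesenfeld model; lattice size, grain count, toppling threshold, and the seed are illustrative choices:

```python
# Sketch of the Bak-Tang-Wiesenfeld sandpile on an n x n lattice:
# grains are dropped at random sites; any site holding >= 4 grains
# topples, giving one grain to each neighbour (grains fall off at the
# open boundary). Parameters are illustrative.
import random

def btw_sandpile(n=20, grains=5_000, seed=1):
    """Return the avalanche size (number of topplings) for each grain."""
    rng = random.Random(seed)
    z = [[0] * n for _ in range(n)]
    sizes = []
    for _ in range(grains):
        i, j = rng.randrange(n), rng.randrange(n)
        z[i][j] += 1
        size = 0
        unstable = [(i, j)]
        while unstable:
            x, y = unstable.pop()
            if z[x][y] < 4:
                continue
            z[x][y] -= 4
            size += 1
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                u, v = x + dx, y + dy
                if 0 <= u < n and 0 <= v < n:
                    z[u][v] += 1
                    unstable.append((u, v))
        sizes.append(size)
    return sizes

# After a transient the pile self-organizes to the critical state:
# avalanches of very different sizes occur, with a heavy-tailed
# size distribution.
```

The critical slope is reached without tuning any parameter: this is the hallmark of self-organized criticality.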
What are CS?
What is Biology?
Let us collect the components for CS that we get from the life sciences
Interactions take place on networks
Out of equilibrium – they are driven
Evolutionary dynamics
Core components are discrete – e.g. Boolean networks
Adaptive and robust: edge of chaos – self-organized critical
Most evolutionary complex systems are path-dependent and have
memory (non-ergodic or non-Markovian). The adjacent possible is a way
of conceptualizing the evolution of the 'reachable' phase space.
We learn how to do statistics of driven, out-of-equilibrium systems in the
courses Scaling and Statistical Mechanics.
What are CS?
What is Biology?
We conclude this section by listing a few aims of current life science:
Understand basic mechanisms (molecular, geno-phenotype relations,
epigenetics)
Manipulate and manage living matter (medicine, smart matter)
Create living matter from scratch (artificial life)
Does this sound too optimistic? It is to a certain extent, but several prerequisites
for these tasks currently look good. The components
are understood and basically under experimental control:
Genetic sequences of hundreds of organisms are available.
Proteins of thousands of organisms are available.
Proteins that nature does not produce can be synthesized.
Arbitrary RNA sequences can be synthesized on demand.
All metabolites of organisms are available.
Molecules can be counted, basic transport mechanisms are understood, etc.
What are CS?
What is social science?
What are CS?
What are CS?
What is social science?
Social science is the science of dynamical social interactions and their
implications for society.
Traditionally it is neither quantitative nor predictive, nor does it produce
experimentally testable predictions. Why is that so?
Lack of detailed data
Lack of reproducibility / repeatability
The queen of the social sciences is economics, i.e. the science of the
invention, production, distribution, consumption, and disposal of goods and
services.
All of these components happen on networks, i.e. the associated interactions
are very much directed and interconnected. Maybe network aspects are most
illustrative when studied in the social sciences, even more so than in the life
sciences.
What are CS?
What is social science?
What are societies?
People, goods or institutions are represented by nodes of a network.
Interactions are represented by links in networks of various types.
One node may be engaged in several types of interaction.
Nodes are characterized by ‘states’: wealth, opinion, age, etc.
Nodes and links change over time.
What are CS?
What is social science?
What are societies?
Societies are co-evolving multiplex networks, M^α_ij(t)
A multiplex network is a collection of networks on the same set of nodes
Links: α = 1, communication (full line); α = 2, trading (dashed line); α = 3, friendship (dotted line).
States: black – votes for Hillary; grey – votes for Trump.
Nodes i (humans or institutions) are characterized by states σ_i(t)
What are CS?
What is social science?
Let us collect the components for CS that we get from the social sciences
Interactions happen on a collection of networks (Multiplex networks).
Networks may interact with themselves.
Networks show a rich variety in growth and re-structuring.
Networks are evolutionary and co-evolutionary objects.
What are CS?
What is co-evolution?
The concept of co-evolution.
What is an interaction? In general, an interaction can change the state of
the interacting objects, or the environment. For example, the collision of
two particles changes their momenta. The magnetic interaction of two
spins may change their orientations. An economic interaction changes
the portfolios of the participants involved.
The interaction partners (network or multiplex) of a node can be seen as
the 'environment' (space) of that node. The environment determines the
future state of the node.
Interactions can change over time. For example, people establish new
friendships or economic links; countries terminate diplomatic relations.
The states of the nodes determine the future state of a link, i.e. whether it
exists in the future or not.
The state (topology) of the network determines the future states of the nodes.
The state of the nodes determines the future state of the links of the network.
What are CS?
A pictorial view
Co-evolving multiplex networks – more formally:
d σ^α_i(t) / dt ∼ F( M^α_ij(t), σ^β_j(t) )
and
d M^α_ij(t) / dt ∼ G( M^α_ij(t), σ^β_j(t) )
This is maybe the simplest way to illustrate what a CS is, or looks like.
Networks are observable (big data)
Collections of networks are manageable
States of individual nodes are (or will be) observable
From a practical point of view this is useless, because G and F are not
specified; they can be stochastic. However, it should be possible to express
most CS of the kind we discussed in this form.
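The pair of equations above can be illustrated with a minimal one-layer sketch: binary opinion states play the role of σ, a set of links plays the role of M, and two illustrative rules stand in for F (a state follows a link) and G (a link is rewired depending on states). All names, rules, and parameters are hypothetical choices:

```python
# Minimal one-layer sketch of co-evolution: states update from
# neighbours (the role of F) and links rewire depending on states
# (the role of G). Parameters are illustrative.
import random

def coevolve(n=50, steps=20_000, rewire_p=0.3, seed=7):
    rng = random.Random(seed)
    state = [rng.choice([0, 1]) for _ in range(n)]
    edges = {tuple(sorted(rng.sample(range(n), 2))) for _ in range(2 * n)}
    for _ in range(steps):
        i, j = rng.choice(sorted(edges))    # pick a random link
        if state[i] == state[j]:
            continue                        # concordant link: nothing happens
        if rng.random() < rewire_p:         # G: the link follows the states
            edges.discard((i, j))
            k = rng.randrange(n)
            if k != i:
                edges.add((min(i, k), max(i, k)))
        else:                               # F: the state follows the link
            state[j] = state[i]
    return state, edges

state, edges = coevolve()
# States and topology adapt to each other: discordant links (neighbours
# holding different states) are progressively eliminated.
```

Even this toy version shows the defining feature: neither the states nor the network can be understood in isolation, because each drives the other.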
What are CS? A summary
1 Complex systems are composed of many elements (latin indices i).
2 Elements interact through one or more interaction types (greek indices α).
3 Interactions are not static but change over time, M^α_ij(t).
4 Elements are characterized by states, σ_i(t).
5 States and interactions often evolve together by mutually updating each
other: they co-evolve.
6 The dynamics of co-evolving multilayer networks is usually non-linear.
7 CS are context-dependent. Multilayer networks provide that context and
thus offer the possibility of a self-consistent description of CS.
8 CS are algorithmic.
9 CS are path-dependent and consequently often non-ergodic.
10 CS often have memory. Information about the past can be stored in
nodes or in the network structure of the various layers.
Thurner Intro to the theory of CS KCL london mar 20, 2023 51 / 55
What are CS?
The role of the computer
The computer is the game changer.
CS have never been accessible to analysis and control because of computational
limits. The analytic tools available until the 1980s were simply too limited to
address problems beyond simple physics problems and equilibrium solutions in
(evolutionary) biology and economics. In the latter, the most famous concept
has been game theory.
Thurner Intro to the theory of CS KCL london mar 20, 2023 52 / 55
What are CS?
The role of the computer
The computer is the game changer.
The computer changed the path of science and opened the way to CS.
Mimic systems through simulation. Agent based models model
interactions by brute force and study the consequences for the collective.
In many systems there is only a single history, especially in the social
sciences, or in evolution. The computer allows us to create artificial
histories of statistically equivalent copies – ensemble pictures help us
understand systemic properties that would otherwise not be handleable.
This solves the repeatability problem of CS.
Develop intuition by building the models. It forces you to systematically
think problems through. Maybe one can then condense the intuition gained
this way into traditional formula-language.
Computation itself profited immensely from CS in the past 3 decades.
Computational limits are very often no longer the limit for understanding –
i.e. controlling and managing – CS. Also, data issues become less and less
relevant in present times. Science is experiencing an unheard-of revolution.
Thurner Intro to the theory of CS KCL london mar 20, 2023 53 / 55
What are CS?
Some triumphs of the science of CS:
Network theory
Genetic regulatory networks and Boolean networks
Self-organized criticality
Genetic algorithms
Auto-catalytic networks
Theory of increasing returns
Origin and statistics of power laws
Mosaic vaccines
Statistical mechanics of complex systems
Network models in epidemiology
Complexity economics and systemic risk
Allometric scaling in biology
Science of cities
Thurner Intro to the theory of CS KCL london mar 20, 2023 54 / 55
What are CS?
Aim of the course
Clarify the origin of power laws, especially in the context of driven
non-equilibrium systems.
Derive a framework for the statistics of driven systems.
Categorize probabilistic complex systems into equivalence classes that
characterize their statistical properties.
Present a generalization of statistical mechanics, and information theory,
so that they become useful for CS. In particular, we derive an entropy for
complex systems.
The overarching theme is to understand the co-evolutionary dynamics of
states and interactions.
Thurner Intro to the theory of CS KCL london mar 20, 2023 55 / 55
Where do scaling laws come from?
stefan thurner
KCL london mar 21 2023
work done together with
Bernat Corominas-Murtra, Rudolf Hanel and Murray
Gell-Mann
RH, ST, MGM, PNAS 111 (2014) 6905-6910
BCM, RH, ST, PNAS 112 (2015) 5348-5353
BCM, RH, ST, New J Physics 18 (2016) 093010
BCM, RH, ST, J Roy Soc Interface 12 (2016) 20150330
ST, BCM, RH, Phys Rev E 96 (2017) 032124
BCM, RH, ST, Sci Rep (2017) 11223
BCM, RH, LZ, ST, Sci Rep 8 (2018) 10837
RH, ST, Entropy 20 (2018) 838
KCL london mar 21 2023 1
1809
KCL london mar 21 2023 2
1809
Gauss’ theory of errors
KCL london mar 21 2023 3
1810
1810 Laplace: “central limit theorem”
add independent random numbers drawn from an identical
source → the sum is a random number from a normal distribution
• simple systems: constituents don’t interact (or only weakly)
• entered physics through Maxwell half a century later
• handle for variables localized around an average
KCL london mar 21 2023 4
e^{−x²}
x = 1 → e^{−1} ≈ 0.3679
x = 10 → e^{−100} ≈ 3.72 × 10^{−44}
KCL london mar 21 2023 5
x^{−α}
x = 1 → x^{−α} = 1 (for any α)
x = 10 → x^{−α} = 0.1 (for α = 1)
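The difference between the two tails can be checked in a couple of lines:

```python
import math

# Gaussian weight e^{-x^2} versus a power law x^{-1}, at x = 1 and x = 10
gauss = [math.exp(-x**2) for x in (1, 10)]
power = [x**-1.0 for x in (1, 10)]

print(gauss[0])   # 0.36787944117144233
print(gauss[1])   # ~3.7e-44: an event ten units out is essentially impossible
print(power[1])   # 0.1: ten units out is still entirely ordinary
```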
KCL london mar 21 2023 6
complex systems
CS are networked, dynamical, (co)evolutionary
• elements are not independent
• networks link dynamic and stochastic components
• often many sources of randomness
variables are neither independent, nor do the errors come from identical
sources
→ no reason to believe that Gaussian statistics holds for CS
KCL london mar 21 2023 7
statistics of CS is
statistics of power laws
KCL london mar 21 2023 8
statistics of complex systems
CS variables are not localized around mean → tremendous
difficulties
• that is expressed by power laws
• often not exact power laws – but “fat tailed”
→ statistics of CS is the statistics of outliers
→ outliers are the norm rather than the exception
KCL london mar 21 2023 9
examples for (approximate)
power laws
KCL london mar 21 2023 10
city size
MEJ Newman (2005)
multiplicative
KCL london mar 21 2023 11
rainfall
SOC
KCL london mar 21 2023 12
landslides
SOC
KCL london mar 21 2023 13
hurricane damages
secondary (multiplicative) ???
KCL london mar 21 2023 14
financial interbank loans
multiplicative / preferential
KCL london mar 21 2023 15
forest fires in various regions
SOC
KCL london mar 21 2023 16
moon crater diameters
MEJ Newman (2005)
fragmentation
KCL london mar 21 2023 17
Gamma rays from solar wind
MEJ Newman (2005)
KCL london mar 21 2023 18
movie sales
SOC
KCL london mar 21 2023 19
healthcare costs
multiplicative ???
KCL london mar 21 2023 20
particle physics
???
KCL london mar 21 2023 21
words in books
MEJ Newman (2005)
preferential / random / optimization
KCL london mar 21 2023 22
citations of scientific articles
MEJ Newman (2005)
preferential
KCL london mar 21 2023 23
website hits
MEJ Newman (2005)
preferential
KCL london mar 21 2023 24
book sales
MEJ Newman (2005)
preferential
KCL london mar 21 2023 25
telephone calls
MEJ Newman (2005)
preferential
KCL london mar 21 2023 26
earthquake magnitude
MEJ Newman (2005)
SOC
KCL london mar 21 2023 27
seismic events
SOC
KCL london mar 21 2023 28
war intensity
MEJ Newman (2005)
???
KCL london mar 21 2023 29
killings in wars
???
KCL london mar 21 2023 30
size of wars
???
KCL london mar 21 2023 31
wealth distribution
MEJ Newman (2005)
multiplicative
KCL london mar 21 2023 32
family names
MEJ Newman (2005)
multiplicative, ???
KCL london mar 21 2023 33
systemic risk in supply chain
KCL london mar 21 2023 34
more power laws ...
• networks: literally thousands of scale-free networks
• allometric scaling in biology
• dynamics in cities
• fragmentation processes
• random walks
• crackling noise
• growth with random times of observation
• blackouts
• fossil record
• bird sightings
• terrorist attacks
• fluvial discharge, contact processes
• anomalous diffusion ...
KCL london mar 21 2023 35
how do power laws arise?
KCL london mar 21 2023 36
basic routes to power laws
• criticality
• self-organized criticality
• multiplicative processes with constraints
• self-reinforcing / preferential processes
KCL london mar 21 2023 37
I criticality: power laws at phase transitions
• statistical physics: power laws emerge at phase transition
• this happens at the critical point
• power laws in various quantities: critical exponents
• various materials have the same critical exponents →
→ they behave identically → universality
KCL london mar 21 2023 38
II power laws through self-organised criticality
• systems find critical points themselves – no tuning necessary
• since they are at a critical point → power laws
• examples: sandpiles, earthquakes, collapse, economy, ...
(Bak, Tang & Wiesenfeld)
KCL london mar 21 2023 39
III power laws through multiplicative processes
• Gaussian distribution: add random numbers (same source)
• power law: multiply random numbers and impose constraints,
e.g. the minimum number cannot be smaller than X
• examples: wealth distribution, city size, double Pareto,
language, ...
KCL london mar 21 2023 40
IV power laws through preferential processes
• events repeat with probability proportional to how often they occurred before
• examples: network growth models, language formation, ...
KCL london mar 21 2023 41
V other mechanisms
• Levy stable processes
• constraint optimisation
• extreme value statistics
• return times
• generalized entropies
KCL london mar 21 2023 42
one phenomenon—many explanations
Zipf law in word frequencies
• Simon: preferential attachment
• Mandelbrot: constraint optimization
• Miller: monkeys produce random texts
• Solé: information theoretic: sender and receiver
KCL london mar 21 2023 43
motivation I
KCL london mar 21 2023 44
is there a unique principle?
KCL london mar 21 2023 45
motivation II
KCL london mar 21 2023 46
complex systems are driven systems
KCL london mar 21 2023 47
driven system = driving + relaxing process
KCL london mar 21 2023 48
would be nice to have:
statistics for driven, (stationary) non-equilibrium
systems that explains the abundance of power laws
and variations
KCL london mar 21 2023 49
many complex systems are path-dependent
• future events depend on history of past events
• often past events constrain possibilities for the future
→ sample-space of processes reduces as they unfold
KCL london mar 21 2023 50
sample-space reducing processes (SSR)
KCL london mar 21 2023 51
example: history-dependent SSR processes
(figure: panels 1)–6): a history-dependent SSR process sampling ever smaller
values, e.g. 13 → 9 → 7 → 5 → 3 → 1; each outcome restricts the set of
possible next outcomes)
KCL london mar 21 2023 52
phase spaces are nested
Ω1 ⊂ Ω2 ⊂ Ω3 · · · ⊂ ΩN
KCL london mar 21 2023 53
restart means
Ω1 ≡ ΩN
KCL london mar 21 2023 54
sentence-formation is SSR
(figure: panels a)–d): word clouds for sentence formation; once the sentence
starts with ‘wolf’, only grammatically and semantically consistent words such
as ‘howls’, ‘bites’, ‘runs’, ‘eats’, ‘grey’, ‘strong’ remain available, so the
sample space of usable words shrinks with every word added)
KCL london mar 21 2023 55
(figure: occupation probabilities p(i) of sites i = 1, ..., 10 for the SSR
process; the histogram follows Zipf’s law, p(i) ∝ 1/i)
KCL london mar 21 2023 56
SSR leads to exact Zipf’s law
p(i) = 1/i

p(i) is the probability to visit site i
KCL london mar 21 2023 57
why?
Clearly, p(i) = Σ_{j=1}^N P(i|j) p(j) holds, with

P(i|j) = 1/(j−1)   for i < j       (SSR)
P(i|j) = 1/N       for i ≥ j = 1   (restart)

We get p(i) = (1/N) p(1) + Σ_{j=i+1}^N p(j)/(j−1)

→ recursive relation p(i+1) − p(i) = −(1/i) p(i+1)

p(i) = ... = p(1) i^{−1}
alternative proof by induction
Let N = 2. There are two sequences φ: either φ directly generates a 1 with
p = 1/2, or it first generates a 2 with p = 1/2 and then a 1 with certainty.
Both sequences visit 1 but only one visits 2. As a consequence,
P_2(2) = 1/2 and P_2(1) = 1.
Now suppose P_{N−1}(i) = 1/i holds. The process starts with die N, and the
probability to hit i in the first step is 1/N. Also, any other j, N ≥ j > i,
is reached with probability 1/N. If we get j > i, we reach i in the next step
with probability P_{j−1}(i), which leads to a recursive scheme for i < N:

P_N(i) = (1/N) [ 1 + Σ_{i<j≤N} P_{j−1}(i) ]

Since by assumption P_{j−1}(i) = 1/i holds for i < j ≤ N, some algebra
yields P_N(i) = 1/i.
KCL london mar 21 2023 59
true for all systems with shrinking
sample-space over time
KCL london mar 21 2023 60
note: process is slowly driven
KCL london mar 21 2023 61
what if we restart the SSR before it is fully relaxed?
restarting or driving rate is (1 − λ) →
p(i) = i^{−λ}
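A sketch of the driven process (with probability λ an SSR jump, otherwise a uniform restart; the parameters below are arbitrary choices), with the exponent estimated from mid-range states where finite-size corrections are small:

```python
import math
import random
from collections import Counter

def ssr_driven(lam, N=100, steps=500_000, seed=3):
    """SSR with driving: with probability lam do an SSR jump to some i < j,
    otherwise restart uniformly on {1,...,N} (state 1 always restarts)."""
    rng = random.Random(seed)
    visits = Counter()
    j = N
    for _ in range(steps):
        if j == 1 or rng.random() > lam:
            j = rng.randint(1, N)          # driving / restart
        else:
            j = rng.randint(1, j - 1)      # relaxation: sample space shrinks
        visits[j] += 1
    return visits

v = ssr_driven(0.5)
exponent = math.log(v[10] / v[40]) / math.log(4)   # slope of p(i) ~ i^{-lambda}
print(round(exponent, 2))   # close to lambda = 0.5
```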
KCL london mar 21 2023 62
arbitrary driving

p^(λ)(i) = Σ_{j=1}^N P(i|j) p^(λ)(j) , with

P(i|j) = λ/(j−1) + (1−λ)/N   for i < j        (SSR)
P(i|j) = (1−λ)/N             for i ≥ j > 1    (RW)
P(i|j) = 1/N                 for i ≥ j = 1    (restart)

We get p^(λ)(i) = (1−λ)/N + (1/N) p^(λ)(1) + Σ_{j=i+1}^N [λ/(j−1)] p^(λ)(j)

→ recursive relation p^(λ)(i+1) − p^(λ)(i) = −λ (1/i) p^(λ)(i+1)

p^(λ)(i)/p^(λ)(1) = Π_{j=1}^{i−1} (1 + λ/j)^{−1} = exp[ −Σ_{j=1}^{i−1} log(1 + λ/j) ]
∼ exp( −Σ_{j=1}^{i−1} λ/j ) ∼ exp( −λ log i ) = i^{−λ}
KCL london mar 21 2023 63
side note: valid also for the continuous case

if p(x) ∼ ∫_0^x [p(y)/y] dy → p(x) is a power law
KCL london mar 21 2023 64
true for all systems with driving rate 1 − λ
(shrinking sample space with probability λ)
KCL london mar 21 2023 65
SSR processes with fast driving
same convergence speed as CLT for iid processes
KCL london mar 21 2023 66
Berry-Esseen convergence speed
(figure: a) rank distributions of site visits for λ = 0.5, 0.7, 1.0 with
slopes −0.5, −0.7, −1.0; b) distance to the limiting distribution vs. number
of jumps T, matching the Berry-Esseen convergence rate of the CLT)
KCL london mar 21 2023 67
SSR-based Zipf law is extremely robust
KCL london mar 21 2023 68
proof: same as before

p(i|j) = λ q(i)/g(j−1) + (1−λ) q(i)   if i < j
p(i|j) = (1−λ) q(i)                   otherwise

...

→ p_λ(i) ∼ [ p_λ(1)/q(1)^{1−λ} ] · q(i)/g(i)^λ , where g(j) = Σ_{i≤j} q(i)

polynomial priors: q(i) ∼ i^α, (α > −1) → p(i) ∼ i^{α(1−λ)−λ}
polynomial priors: q(i) ∼ i^α, (α < −1) → p(i) ∼ q(i)
exponential priors: q(i) ∼ e^{βi} → uniform distribution
KCL london mar 21 2023 69
prior probabilities are practically irrelevant!
KCL london mar 21 2023 70
driven system = driving + relaxation part
KCL london mar 21 2023 71
relaxation = sample space reducing process
KCL london mar 21 2023 72
details of relaxing process do NOT matter
KCL london mar 21 2023 73
Zipf-law is extremely robust—accelerated SSR
KCL london mar 21 2023 74
not all transitions p(i|j) must exist—diffusion on
networks
KCL london mar 21 2023 75
SSR and diffusion on networks
KCL london mar 21 2023 76
SSR is a random walk on directed ordered NW
(figure: a) SSR as a random walk on a fully connected, directed, ordered
network of five nodes; node occupation probability vs. node rank follows
1/2, 1/4, 1/8, ...; b) path probabilities on an acyclic graph with exit
probability p_exit = 0.3; path rank distributions show slopes −0.65 and −1
for p_exit = 0.3 and p_exit = 1; note the graph is fully connected)
KCL london mar 21 2023 77
SSR = targeted random walk on networks
simple example Directed Acyclic Graph (no cycles)
KCL london mar 21 2023 78
KCL london mar 21 2023 79
KCL london mar 21 2023 80
KCL london mar 21 2023 81
simple routing algorithm
• take a directed acyclic network and fix it
• pick a start-node
• perform a random walk from the start-node to the end-node (1)
• repeat many times from other start-nodes
• prediction: the visiting frequency of nodes follows Zipf’s law
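The routing algorithm above can be sketched as follows, assuming an Erdős–Rényi graph with edges oriented from higher to lower node label (link density and walk count are arbitrary choices):

```python
import random
from collections import Counter

def dag_walk(N=100, p_link=0.5, walks=50_000, seed=11):
    """Sample an ER graph, orient every edge from higher to lower label
    (a DAG), then random-walk from a random start node down to a sink."""
    rng = random.Random(seed)
    down = {i: [j for j in range(1, i) if rng.random() < p_link]
            for i in range(1, N + 1)}
    visits = Counter()
    for _ in range(walks):
        j = rng.randint(1, N)
        visits[j] += 1
        while down[j]:                     # stop at nodes with no out-links
            j = rng.choice(down[j])
            visits[j] += 1
    return visits

counts = sorted(dag_walk().values(), reverse=True)
# ranked visit frequencies fall off roughly like Zipf, 1/rank
```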
KCL london mar 21 2023 82
all diffusion processes on DAGs are SSR
sample an ER graph → direct it → pick start and end nodes → diffuse
KCL london mar 21 2023 83
Exponential NW HEP Co-authors
KCL london mar 21 2023 84
prior probabilities are practically irrelevant!
KCL london mar 21 2023 85
what happens if we introduce weights on links?
ER Graph
poisson weights power weights
KCL london mar 21 2023 86
prior probabilities are practically irrelevant!
KCL london mar 21 2023 87
what happens if we allow for cycles?
ER → direct it → change link to random direction with 1 − λ
“driving” λ = 0.8 λ = 0.5
KCL london mar 21 2023 88
Zipf’s law is an immense attractor!
KCL london mar 21 2023 89
Zipf’s law is an attractor
• no matter what the network topology is → Zipf
• no matter what the link weights are → Zipf
• if cycles are present → the exponent is less than one
KCL london mar 21 2023 90
all good search is SSR
KCL london mar 21 2023 91
what is good search?
a search process is good if ...
• ... at every step you eliminate more possibilities than you
actually sample
• ... every step you take eliminates branches of possibilities
if eliminate fast enough → power law in visiting times
if eliminate too little → sample entire space (exhaustive search)
if no cycles: expect Zipf’s law
KCL london mar 21 2023 92
clicking on web page is often result of search process
KCL london mar 21 2023 93
Adamic & Huberman (2002)
KCL london mar 21 2023 94
Breslau et al. (1999)
KCL london mar 21 2023 95
what about exponents > 1?
KCL london mar 21 2023 96
(figure: a cascading SSR process on states 1–20; each jump multiplies into
µ new processes)

Multiplication factor µ → p(i) = i^{−µ}

Ansatz: n(j → i) = µ p(i|j) (slow driving limit)
KCL london mar 21 2023 97
(figure: relative state frequencies for multiplication factors
µ = 1, 1.5, 2, 2.5, with slopes following i^{−µ}; right panel: energy
distributions for µ = 2.5, 3.5, 4.5)
KCL london mar 21 2023 98
what if we introduce
conservation laws?
KCL london mar 21 2023 99
conservation laws in SSR processes
assume you have duplication at every jump, µ = 2
if you are at i → duplicate → one jumps to j, the other to k
conservation means: i = j + k, or in general
f(i) = f(state_1) + f(state_2) + · · · + f(state_µ)
→ p(i) = i^{−2} for all µ
the same result was found by E. Fermi for particle cascades
(figure: relative frequency vs. energy (arbitrary units) for energy-conserving
cascades with µ = 2.5, 3.5, 4.5; all collapse onto a power law with slope −2)
KCL london mar 21 2023 101
complex systems are driven –
always!
KCL london mar 21 2023 102
complex systems are driven non-equilibrium
systems
• only driven systems produce non-trivial structures
• without driving: just ground state or equilibrium
• every driven system: relaxing part + driving part
• every relaxing part is a SSR
KCL london mar 21 2023 103
KCL london mar 21 2023 104
example: inelastic gas in a box
KCL london mar 21 2023 105
example: inelastic gas in a box
ρ(E′) = ∫_{E′}^{E*} dE ρ(E′|E, c_r) [ (1−ξ) ρ(E) + ξ ρ_charge(E) ]
KCL london mar 21 2023 106
inelastic gas: transition probability
ρ(E′₁|E₁, c_r) = [1/Z(E₁)] ∫_0^π dζ g(ζ) ∫_0^π dα f(α) ∫_0^π dφ r(φ)
× ∫_0^∞ dE₂ ρ(E₂) θ(E₁ − E₂) δ( E′₁ − F(E₁, E₂, α, ζ, φ; c*_r) )

with

F(E₁, E₂, α, ζ, φ; c*_r) = E₁₂ [ (1+c_r²)/4 + ((1−c_r²)/4) q cos ζ
+ (c_r/2) √(1 − (q cos ζ)²) (cos ζ cos 2α − sin ζ sin 2α cos φ) ]

with q = 2 √(E₁/E₁₂) √(E₂/E₁₂) and c_r(α)² = 1 − (1 − c*_r²) |sin α|

use this in the eigenvalue equation
KCL london mar 21 2023 107
inelastic gas: energy distribution
KCL london mar 21 2023 108
inelastic gas: dependence on driving rate
KCL london mar 21 2023 109
where do specific distributions come from?
KCL london mar 21 2023 110
assume that driving rate depends on state λ(i)
KCL london mar 21 2023 111
→ λ(x) = −x (d/dx) log p(x)   or   p(x) ∼ exp( −∫^x [λ(y)/y] dy )
simple proof
KCL london mar 21 2023 112
proof
transition probabilities from state k to i are

p_SSR(i|k) = λ(k) q_i/g(k−1) + (1−λ(k)) q_i   if i < k
p_SSR(i|k) = (1−λ(k)) q_i                     if i ≥ k

g(k) is the cdf of q_i, g(k) = Σ_{i≤k} q_i. Observing that

[ p_{λ,q}(i+1)/q_{i+1} ] [ 1 + λ(i+1) q_{i+1}/g(i) ] = p_{λ,q}(i)/q_i

we get

p_{λ,q}(i) = (q_i/Z_{λ,q}) Π_{1<j≤i} [ 1 + λ(j) q_j/g(j−1) ]^{−1}
∼ (q(i)/Z_{λ,q}) e^{ −Σ_{j≤i} λ(j) q(j)/g(j−1) }

Z_{λ,q} is the normalisation constant. For uniform priors, taking logs and
going to continuous variables gives the result, λ(x) = −x (d/dx) log p_λ(x).
KCL london mar 21 2023 113
the driving process determines distribution
KCL london mar 21 2023 114
special cases of λ(x) = −x (d/dx) log p(x)
• Zipf: slow driving (λ = 1) → p(x) = x^{−1}
• Power law: constant driving, λ(x) = α → p(x) = x^{−α}
• Exponential: λ(x) = βx → p(x) = e^{−β(x−1)}
• Power law + cut-off: λ(x) = α + βx → p(x) = x^{−α} e^{−βx}
• Gamma: λ(x) = 1 − α + βx → p(x) = x^{α−1} e^{−βx}
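The relation λ(x) = −x d/dx log p(x) can be verified numerically for two of these cases (the β and α values below are arbitrary):

```python
import math

def lam_from_p(p, x, h=1e-6):
    """lambda(x) = -x * d/dx log p(x), via a central difference."""
    return -x * (math.log(p(x + h)) - math.log(p(x - h))) / (2 * h)

beta, alpha = 0.7, 1.3
p_exp = lambda x: math.exp(-beta * (x - 1))   # exponential case
p_pow = lambda x: x ** (-alpha)               # power-law case

print(round(lam_from_p(p_exp, 3.0), 6))   # beta * x = 2.1
print(round(lam_from_p(p_pow, 3.0), 6))   # constant alpha = 1.3
```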
KCL london mar 21 2023 115
special cases of λ(x) = −x (d/dx) log p(x)
• Normal: λ(x) = 2βx² → p(x) = e^{−β(x²−1)}
• Stretched exp: λ(x) = αβ|x|^α → p(x) = e^{−β(x^α−1)}
• Log-normal: λ(x) = 1 − β/σ² + (log x)/σ² → p(x) = (1/x) e^{−(log x−β)²/(2σ²)}
• Gompertz: λ(x) = (αe^{βx} − 1) βx → p(x) = e^{βx − αe^{βx}}
• Weibull: λ(x) = αβ^{−α} x^α + 1 − α → p(x) = (x/β)^{α−1} e^{−(x/β)^α}
• Tsallis: λ(x) = βx/(1 − βx(1−Q)) → p(x) = (1 − (1−Q)βx)^{1/(1−Q)}
driving determines statistics of driven systems
slow → Zipf’s law
constant → power law
extreme driving → prior distribution (uniform)
driving state dependent → any distribution (depends on driving)
(all true for well-behaved priors only)
KCL london mar 21 2023 117
systems of SSR–nature
• any driven system (with stationary distributions)
• self-organized critical systems
• search
• fragmentation
• propagation of information in language: sentence formation
• sequences of human behavior
• games: go, chess, life ...
• record statistics
KCL london mar 21 2023 118
the 3 entropies of SSR processes
information production rate:   S_IT ∼ 1 + (1/2) log W
extensive entropy:             S_EXT = S_{1,1} = −Σ_i p_i log p_i
MaxEnt functional (SD limit):  S_MEP = −Σ_{i=2}^W [ p_i log(p_i/p_1) + (p_1 − p_i) log(1 − p_i/p_1) ]

expressions valid for large N (iterations) and W (states)
KCL london mar 21 2023 119
do we now have a ”CLT” for fat tailed
distributions?
• criticality: SSR picture applies for some
• self-organized criticality: subset of SSR
• multiplicative processes with constraints: partly applies
• self-reinforcing processes: for some SSR applies
KCL london mar 21 2023 120
conclusions
• complex systems are driven and show fat tailed distributions
• processes of SSR type abound in nature
• SSR has extremely robust attractors – priors don’t matter
• relaxation is usually SSR
• details of relaxation do not matter – note CLT !
• details of driving + SSR → explain statistics
KCL london mar 21 2023 121
Statistics of complex systems II:
entropy for complex systems
stefan thurner
KCL london mar 22 2023
with R. Hanel and M. Gell-Mann
RH, ST, Europhysics Letters 93 (2011) 20006
RH, ST, Europhysics Letters 96 (2011) 50003
RH, ST, MGM, PNAS 108 (2011) 6390-6394
RH, ST, MGM, PNAS 109 (2012) 19151-19154
RH, ST, MGM, PNAS 111 (2014) 6905-6910
KCL london mar 22 2023 1
Why Statistics of Complex Systems ?
Understand macroscopic system behavior on the basis of microscopic
elements and their interactions → entropy
• Hope: ’thermodynamical’ relations of CS → phase diagrams
• Hope: understand distribution functions: power laws,
stretched exp, ...
• Dream: way to reduce number of parameters → handle CS
• Dream: max entropy principle for CS: predict distributions
KCL london mar 22 2023 2
What is Entropy?
Entropy has to do with ...
(1) ... quantifying information production of source
(2) ... thermodynamics: for example dU = TdS − pdV
(3) ... statistical inference: given data what is most likely
distribution?
For simple systems all three are related
S = −k Σ_{i=1}^W p_i log p_i
This is no longer true for complex, non-ergodic systems!
KCL london mar 22 2023 3
Information production
KCL london mar 22 2023 4
KCL london mar 22 2023 5
Information theory
• Receiver wants to know how correct the received message is →
add redundancy
• Adding redundancy → send more info through the channel →
info transmission rate reduces
• What is the transmission rate of information?
KCL london mar 22 2023 6
Adding redundancy vs. error probability
Message → encoded → received → decoded
”Hi there” 100100111 → 100100101 ”Hi thor”
with redundancy (send three copies):
           100100111 → 100100111
”Hi there” 100100111 → 100100101   ”Hi there”
           100100111 → 100100111
KCL london mar 22 2023 7
KCL london mar 22 2023 8
Shannon’s first theorem
one can encode a message such that it can be transmitted error-free
through a noisy channel, if the capacity of the channel is higher
than the information production rate of the source, S
what is S?
S is a property of the source (the thing that produces the message)
if one finds a code that can correct more errors than are produced
→ error free
KCL london mar 22 2023 9
What is the source? Example 1
random source: produces letters A, B, C, D, E with probability
pA = 0.4, pB = 0.1, pC = 0.2, pD = 0.2, pE = 0.1
experiment AAACDCBDCEAADADACEDAEADCABEDADDCECAAAAD
at what average rate is information produced in this source?
S = −pA ln pA − pB ln pB − pC ln pC − pD ln pD − pE ln pE
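Evaluating this expression for the given source (natural log, i.e. nats):

```python
import math

p = {'A': 0.4, 'B': 0.1, 'C': 0.2, 'D': 0.2, 'E': 0.1}
S = -sum(pi * math.log(pi) for pi in p.values())
print(round(S, 4))   # 1.4708 nats per symbol
```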
KCL london mar 22 2023 10
Example 2
KCL london mar 22 2023 11
Example 2
produces letters A, B, C with p_A = 9/27, p_B = 16/27, p_C = 2/27
Successive symbols are not independent; transition probabilities depend on
the preceding letter:

          A     B     C    (following letter)
from A:   0     4/5   1/5
from B:   1/2   1/2   0
from C:   1/2   2/5   1/10

experiment ABBABABABABABABBBABBBBBABABABABABBBACACABBABBBBABBABACBBBABA

S = −Σ_{i,j} p_i p(j|i) ln p(j|i)
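The entropy rate of this Markov source can be computed directly from the table (the stationary letter frequencies are the p_i given above):

```python
import math

p = {'A': 9/27, 'B': 16/27, 'C': 2/27}           # stationary letter frequencies
T = {'A': {'B': 4/5, 'C': 1/5},                  # T[i][j]: prob. that j follows i
     'B': {'A': 1/2, 'B': 1/2},
     'C': {'A': 1/2, 'B': 2/5, 'C': 1/10}}

S = -sum(p[i] * pij * math.log(pij) for i in T for pij in T[i].values())
print(round(S, 4))   # 0.6474 nats: correlations push the rate below -sum p ln p
```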
KCL london mar 22 2023 12
A source must be ...
• ... stationary
• ... ergodic (objective probabilities same as in experiment)
KCL london mar 22 2023 13
stationary and ergodic
KCL london mar 22 2023 14
No no no
KCL london mar 22 2023 15
No no no
KCL london mar 22 2023 16
Why this funny form, S = −Σ_i p_i ln p_i ?
Appendix 2, Theorem 2
C.E. Shannon, The Bell System Technical Journal 27, 379-423, 623-656,
1948.
KCL london mar 22 2023 17
Entropy
S[p] = Σ_{i=1}^W g(p_i)

p_i ... probability for finding (micro) state i, Σ_i p_i = 1
W ... number of states
g ... some function. What does it look like?
KCL london mar 22 2023 18
The Shannon-Khinchin axioms
Measure for the amount of uncertainty S
• SK1: S depends continuously on p
• SK2: S maximum for equi-distribution pi = 1/W
• SK3: S(p1, p2, · · · , pW ) = S(p1, p2, · · · , pW , 0)
• SK4: S(AB) = S(A) + S(B|A)
Theorem (uniqueness theorem)
If SK1–SK4 hold, the only possibility is S[p] = −Σ_{i=1}^W p_i ln p_i
Proof crucially depends on the use of SK4 and SK1
KCL london mar 22 2023 19
Interested in complex systems
KCL london mar 22 2023 20
What are Complex Systems?
• CS are made up from many elements
• These elements are in strong contact with each other
• Elements and combinations of them constrain themselves
As a consequence
• CS are intrinsically non-ergodic: not all states reachable
KCL london mar 22 2023 21
CS are ...
• evolutionary
• path-dependent
• long-memory
• long-range
• non-ergodic
all of this violates SK4
KCL london mar 22 2023 22
What is the IT entropy of CS?
KCL london mar 22 2023 23
Remember Shannon-Khinchin axioms
• SK1: S depends continuously on p → g is continuous
• SK2: S maximal for equi-distribution pi = 1/W → g concave
• SK3: S(p1, p2, · · · , pW ) = S(p1, p2, · · · , pW , 0) → g(0) = 0
• SK4: S(AB) = S(A) + S(B|A)
note: S[p] = Σ_i^W g(p_i). If SK1–SK4 → g(x) = −kx ln x
KCL london mar 22 2023 24
Shannon-Khinchin axiom 4 is nonsense for CS
SK4 corresponds to ergodic sources
→ SK4 violated for non-ergodic systems
→ nuke SK4
KCL london mar 22 2023 25
The ‘Complex Systems axioms’
• SK1 holds
• SK2 holds
• SK3 holds
• S_g = Σ_i^W g(p_i) , W ≫ 1

Theorem: All systems for which these axioms hold
(1) can be uniquely classified by 2 numbers, c and d
(2) have the entropy

S_{c,d} = [ e/(1 − c + cd) ] [ Σ_{i=1}^W Γ(1 + d, 1 − c ln p_i) − c/e ]

e ... Euler’s number
KCL london mar 22 2023 26
Γ(a, b) = ∫_b^∞ dt t^{a−1} e^{−t}
KCL london mar 22 2023 27
The argument: generic properties of g
• scaling transformation W → λW: how does entropy change?
KCL london mar 22 2023 28
Mathematical property I: a scaling law!

lim_{W→∞} S_g(λW)/S_g(W) = ... = λ^{1−c}

Define f(z) ≡ lim_{x→0} g(zx)/g(x), with 0 < z < 1

Theorem 1: For systems satisfying SK1, SK2, SK3: f(z) = z^c , 0 < c ≤ 1
KCL london mar 22 2023 29
Theorem 1
Let g be a continuous, concave function on [0, 1] with g(0) = 0
and let f(z) = lim_{x→0+} g(zx)/g(x) be continuous; then f is
of the form f(z) = z^c with c ∈ (0, 1].

Proof. Note

f(ab) = lim_{x→0} g(abx)/g(x) = lim_{x→0} [ g(abx)/g(bx) ] [ g(bx)/g(x) ] = f(a) f(b)

c > 1 explicitly violates SK2, c < 0 explicitly violates SK3.
KCL london mar 22 2023 30
Mathematical property II: yet another one!!

lim_{W→∞} S(W^{1+a}) / [ S(W) W^{a(1−c)} ] = ... = (1 + a)^d

Theorem 2: Define h_c(a) = lim_{x→0} g(x^{a+1}) / [ x^{ac} g(x) ] ...
KCL london mar 22 2023 31
Theorem 2
Let g be as before and f(z) = z^c; then h_c(a) = (1 + a)^d for some
constant d.

Proof. We determine h_c(a) again by a similar trick as we used for f:

h_c(a) = lim_{x→0} g(x^{a+1}) / [ x^{ac} g(x) ]
       = [ g( (x^b)^{((a+1)/b − 1)+1} ) / ( (x^b)^{((a+1)/b − 1)c} g(x^b) ) ] · [ g(x^b) / ( x^{(b−1)c} g(x) ) ]
       = h_c( (a+1)/b − 1 ) h_c( b − 1 )

for some constant b. By a simple transformation of variables,
a = bb′ − 1, one gets h_c(bb′ − 1) = h_c(b − 1) h_c(b′ − 1). Setting
H(x) = h_c(x − 1) one again gets H(bb′) = H(b)H(b′). So H(x) = x^d
for some constant d and consequently h_c(a) = (1 + a)^d.
KCL london mar 22 2023 32
Summary
CS are non-ergodic systems → SK1–SK3 hold

→ lim_{W→∞} S_g(λW)/S_g(W) = λ^{1−c} ,  0 < c ≤ 1
→ lim_{W→∞} S(W^{1+a}) / [ S(W) W^{a(1−c)} ] = (1 + a)^d ,  d real
KCL london mar 22 2023 33
What functions S fulfil these 2 scaling laws?

S_{c,d} = r e Σ_{i=1}^W Γ(1 + d, 1 − c ln p_i) − rc
KCL london mar 22 2023 34
Which distribution maximizes S_{c,d}?

p_{c,d}(x) = e^{ −(d/(1−c)) [ W_k( B(1 + x/r)^{1/d} ) − W_k(B) ] }

r = 1/(1 − c + cd) ,  B = ((1−c)/(cd)) exp( (1−c)/(cd) )

Lambert-W: solution to x = W(x) e^{W(x)}
KCL london mar 22 2023 35
Universality classes
all SK 1-3 systems are characterized by 2 exponents: (c, d)
KCL london mar 22 2023 36
Examples
• S_{1,1} = Σ_i g_{1,1}(p_i) = −Σ_i p_i ln p_i + 1  (BGS entropy)
• S_{q,0} = Σ_i g_{q,0}(p_i) = (1 − Σ_i p_i^q)/(q − 1) + 1  (Tsallis entropy)
• S_{1,d>0} = Σ_i g_{1,d}(p_i) = (e/d) Σ_i Γ(1 + d, 1 − ln p_i) − 1/d
• ...
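As a sanity check, for (c, d) = (1, 1) the incomplete gamma reduces to Γ(2, b) = (1 + b)e^{−b}, and S_{1,1} collapses to the Shannon entropy plus one:

```python
import math

def S_cd_11(p):
    """S_{c,d} at (c,d) = (1,1): prefactor e/(1-c+cd) = e, and
    Gamma(2, 1 - ln p_i) = (2 - ln p_i) * p_i / e in closed form."""
    total = sum((2 - math.log(pi)) * pi / math.e for pi in p)
    return math.e * (total - 1 / math.e)

p = [0.5, 0.25, 0.25]
shannon = -sum(pi * math.log(pi) for pi in p)
print(round(S_cd_11(p), 6), round(1 + shannon, 6))   # both 2.039721
```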
KCL london mar 22 2023 37
Classification of entropies: order in the zoo

entropy                                                          c           d
S_BG = Σ_i p_i ln(1/p_i)                                         1           1
• S_{q<1} = (1 − Σ p_i^q)/(q − 1)   (q < 1)                      c = q < 1   0
• S_κ = Σ_i p_i (p_i^κ − p_i^{−κ})/(−2κ)   (0 < κ ≤ 1)           c = 1 − κ   0
• S_{q>1} = (1 − Σ p_i^q)/(q − 1)   (q > 1)                      1           0
• S_b = Σ_i (1 − e^{−b p_i}) + e^{−b} − 1   (b > 0)              1           0
• S_E = Σ_i p_i (1 − e^{(p_i−1)/p_i})                            1           0
• S_η = Σ_i Γ((η+1)/η, −ln p_i) − p_i Γ((η+1)/η)   (η > 0)       1           d = 1/η
• S_γ = Σ_i p_i ln^{1/γ}(1/p_i)                                  1           d = 1/γ
• S_β = Σ_i p_i^β ln(1/p_i)                                      c = β       1
S_{c,d} = Σ_i e r Γ(d + 1, 1 − c ln p_i) − cr                    c           d
KCL london mar 22 2023 38
Associated distribution functions
• p_{(1,1)} → exponentials (Boltzmann distribution), p ∼ e^{−ax}
• p_{(q,0)} → power laws (q-exponentials), p ∼ 1/(a + x)^b
• p_{(1,d>0)} → stretched exponentials, p ∼ e^{−ax^b}
• p_{(c,d)} all others → Lambert-W exponentials, p ∼ e^{a W(x^b)}
NO OTHER POSSIBILITIES if only SK4 is violated
KCL london mar 22 2023 39
(figure: maximum-entropy distributions p(x) on log-log scales;
b) q-exponentials (power laws) for d = 0.025, r = 0.9/(1−c) and
c = 0.2, 0.4, 0.6, 0.8; c) Lambert-exponentials for r = exp(−d/2)/(1−c)
and (c, d) = (0.3, ±2), (0.3, ±4), (0.7, ±2), (0.7, ±4))
KCL london mar 22 2023 40
The Lambert-W: a reminder
• solves x = W(x) e^{W(x)}
• inverse of p ln p: [W(p)]^{−1} = p ln p
• delayed differential equations: ẋ(t) = α x(t − τ) → x(t) = e^{(1/τ) W(ατ) t}
Ansatz: x(t) = x_0 exp[ (1/τ) f(ατ) t ] with f some function
KCL london mar 22 2023 41
(figure: the (c, d) phase diagram of admissible entropies; BG entropy at
(1, 1); q-entropy, 0 < q < 1, along d = 0 between (0, 0) and (1, 0);
(c, d)-entropy with d > 0: stretched exponentials, asymptotically stable
Lambert-W_0 exponentials; (c, d)-entropy with d < 0: Lambert-W_{−1}
exponentials with compact support of the distribution function; the region
c > 1 violates K2, the region c < 0 violates K3)
KCL london mar 22 2023 42
Relaxing ergodicity (kill SK4) opens door to...
• ... order the zoo of entropies through universality classes
• ... understand ubiquity of power laws (and extremely similar
functions)
• ... understand where Tsallis entropy comes from
KCL london mar 22 2023 43
Thermodynamics?
KCL london mar 22 2023 44
Thermodynamic entropy is extensive quantity
if not: thermodynamic relations are nonsense, e.g. dU = TdS
extensive entropy means: S(W_{A+B}) = S(W_A) + S(W_B)
don’t confuse with additive: S(W_A · W_B) = S(W_A) + S(W_B)
KCL london mar 22 2023 45
If we require entropy to be extensive

extensive: S(W_{A+B}) = S(W_A) + S(W_B) = · · ·

one can prove: W(N) = exp[ (d/(1−c)) W_k( μ(1−c) N^{1/d} ) ]

c = lim_{N→∞} [ 1 − (1/N) W(N)/W′(N) ]

d = lim_{N→∞} log W(N) [ (1/N) W(N)/W′(N) + c − 1 ]

the growth of the phase-space volume determines the entropy
KCL london mar 22 2023 46
Examples
• W(N) = 2^N → (c, d) = (1, 1) → system is BGS-type
• W(N) = N^b → (c, d) = (1 − 1/b, 0) → system is Tsallis-type
• W(N) = exp(λN^γ) → (c, d) = (1, 1/γ)
• ...
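The two limit formulas can be evaluated numerically for these examples. Note the limiting c must be inserted into the formula for d, since the bracket vanishes identically when the finite-N estimate of c is used:

```python
import math

def cd_estimate(W, c_limit, N, h=1e-4):
    """Evaluate c = 1 - (1/N) W(N)/W'(N) and
    d = log W(N) * ((1/N) W(N)/W'(N) + c - 1) at large finite N,
    inserting the limiting value of c into the formula for d."""
    dW = (W(N * (1 + h)) - W(N * (1 - h))) / (2 * N * h)   # central difference
    ratio = W(N) / (N * dW)
    return 1 - ratio, math.log(W(N)) * (ratio + c_limit - 1)

# W(N) = N^4: Tsallis class, expect (c, d) = (1 - 1/4, 0)
print(cd_estimate(lambda n: n**4, 0.75, N=10**6))
# W(N) = 2^N: Boltzmann class, expect (c, d) -> (1, 1)
print(cd_estimate(lambda n: 2.0**n, 1.0, N=500))
```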
KCL london mar 22 2023 47
A note on phasespace volume
imagine tossing coins
1 coin: ↑, ↓ 2 states: W(1) = 2
2 coins: ↑↑, ↑↓, ↓↑, ↓↓ 4 states: W(2) = 4
N coins: 2^N states: W(N) = 2^N

now imagine a process: generate a sequence of N ↑ and ↓ symbols with equal probability, BUT after the first ↑, all subsequent symbols must be ↑
1 coin: ↑, ↓ 2 states: W(1) = 2
2 coins: ↑↑, ↓↑, ↓↓ 3 states: W(2) = 3
3 coins: ↑↑↑, ↓↑↑, ↓↓↑, ↓↓↓ 4 states: W(3) = 4
N coins: N + 1 states: W(N) = N + 1
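The constrained coin process can be brute-force checked; a minimal sketch (encoding ↓ as 0 and ↑ as 1):

```python
from itertools import product

# count admissible sequences: after the first 1 (↑), every later symbol must be 1
def W(N):
    def ok(seq):
        first_up = seq.index(1) if 1 in seq else N
        return all(s == 1 for s in seq[first_up:])
    return sum(ok(seq) for seq in product((0, 1), repeat=N))

print([W(N) for N in range(1, 6)])   # -> [2, 3, 4, 5, 6], i.e. W(N) = N + 1
```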
What does this imply?
• Information Theory: complex source (non-ergodic) →
  S_{c,d} = r e Σ_{i=1}^{W} Γ(1 + d, 1 − c ln p_i) − rc
• Thermodynamics: if we want extensivity, c and d follow from the phasespace volume W(N)
Examples
Example I: super-diffusion: accelerating RWs
[Figure: (a) trajectory x vs. N of an accelerating random walk, free decisions 1-9 marked, segment length ΔN ∝ N^β, β = 0.5; (b) long trajectories for β = 0.5, 0.6, 0.7]
• up-down decision of walker is followed by [N^β]_+ steps in the same direction
• k(N) ... number of u-d decisions up to step N → k(N) ∼ N^{1−β}
• possible sequences: W(N) ∼ 2^{N^{1−β}} → (c, d) = (1, 1/(1−β))
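A minimal simulation sketch of such an accelerating walk (function name and parameters are illustrative, not from the lecture):

```python
import random

# accelerating random walk: after each free up/down decision at step N the
# walker keeps that direction for ~N^beta steps, so segments grow as it ages
def accelerating_walk(n_steps, beta=0.5):
    x, traj, N = 0, [], 0
    while N < n_steps:
        direction = random.choice((-1, 1))   # free decision
        run = max(1, int(N ** beta))         # [N^beta]_+ steps in same direction
        for _ in range(run):
            x += direction
            traj.append(x)
            N += 1
    return traj

random.seed(1)
traj = accelerating_walk(10_000, beta=0.6)
```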
Example II: Join-a-club spin system
• NW growth: new node links to αN(t) random neighbors, α ≪ 1
  constant-connectivity network A (e.g. person joining a club)
• each node i has 2 states: s_i = ±1 ; YES / NO (e.g. opinion)
• each node i has initial ('kinetic') energy ε_i (e.g. free will)
• interaction H_ij = −J A_ij s_i s_j
• spin-flip of a node can occur if the node has enough energy for it (microcanonical)
→ Can show extensive entropy is Tsallis entropy (c, d) = (q, 0), S_{c,d} = S_{q,0}
Example III: XY ferromagnet with transverse H field
H = Σ_{i=1}^{N−1} [ (1+γ) σ^x_i σ^x_{i+1} + (1−γ) σ^y_i σ^y_{i+1} + 2λ σ^z_i ]
|γ| = 1 → Ising ferromagnet
γ = 0 → isotropic XY ferromagnet
0 < γ < 1 → anisotropic XY ferromagnet
L ... length of a block in the spin chain, N → ∞
→ Can show extensive entropy for the block is the (c, 0)-entropy
F. Caruso, C. Tsallis, PRE 78, 021102 (2008)
Example IV: Black hole entropy
log W_black-hole ∝ area
→ Can show extensive entropy is the (c, d) = (1, 3/2)-entropy
C. Tsallis, L.J.L. Cirto, arXiv:1202.2154 [cond-mat.stat-mech]
Statistical inference —
Maximum entropy principle
What is the probability to find a histogram?
Sequence x = (x1, x2, · · · , xN) with xn ∈ {1, 2} (coin tosses)

N = 0:  1
N = 1:  q1 + q2
N = 2:  q1^2 + 2 q1 q2 + q2^2
N = 3:  q1^3 + 3 q1^2 q2 + 3 q1 q2^2 + q2^3
N = 4:  q1^4 + 4 q1^3 q2 + 6 q1^2 q2^2 + 4 q1 q2^3 + q2^4

q1 and q2 ... probabilities to toss 1 (heads) or 2 (tails) in a trial
k = (k1, k2) ... histogram of x
Probability to find a histogram? Binomial
The probability to find the histogram k is
P(k; q) = [ N! / (k1! k2!) ] q1^{k1} q2^{k2}
• N ... length of sequence x
• 2 states
• xn ∈ {1, 2}
Probability to find a histogram? Multinomial
The probability to find the histogram k is
P(k; q) = M(k) G(k; q)
with multiplicity M(k) = N! / (k1! k2! · · · kW!) and probability G(k; q) = q1^{k1} q2^{k2} · · · qW^{kW}
• N ... length of sequence x
• W states
• xn ∈ {1, 2, · · · , W}
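In code the factorization P(k; q) = M(k) G(k; q) is one line each; a minimal sketch:

```python
from math import factorial, prod

# P(k; q) = M(k) * G(k; q): multiplicity times per-sequence probability
def histogram_prob(k, q):
    N = sum(k)
    M = factorial(N) // prod(factorial(ki) for ki in k)   # multiplicity M(k)
    G = prod(qi ** ki for ki, qi in zip(k, q))            # probability G(k; q)
    return M * G

# fair coin, N = 4, histogram k = (2, 2): 6 * (1/2)^4 = 0.375
print(histogram_prob((2, 2), (0.5, 0.5)))
```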
Two observations
Observation 1: The factorization property
• Multiplicity M(k) does not depend on q
• All dependencies on q are captured by G(k; q)
Factorization
P(k; q) = M(k) G(k; q)
(1/N) log P(k; q) = (1/N) log M(k) + (1/N) log G(k; q)
Observation 2: Log of multiplicity is Shannon entropy
(1/N) log M(k) = (1/N) log[ N! / (k1! k2! · · · kW!) ]
∼ (1/N) log[ N^N / (k1^{k1} k2^{k2} · · · kW^{kW}) ]   · · · Stirling
= log N − (1/N) Σ_{i=1}^{W} ki log ki
= −Σ_{i=1}^{W} (ki/N) log(ki/N) = −Σ_{i=1}^{W} pi log pi = S_Shannon[p]

(1/N) log P(k; q) = −Σ_{i=1}^{W} pi log pi − α Σ_{i=1}^{W} pi − β Σ_{i=1}^{W} pi εi
(first term: constraint independent; the α- and β-terms: constraints)
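Observation 2 can be verified numerically; a sketch (using log-gamma in place of factorials to avoid overflow):

```python
from math import lgamma, log

# check: (1/N) log M(k) -> -sum_i p_i log p_i for large N, with p = (0.3, 0.7)
def log_multiplicity(k):
    """log[ N! / (k_1! ... k_W!) ] via log-gamma."""
    N = sum(k)
    return lgamma(N + 1) - sum(lgamma(ki + 1) for ki in k)

p = (0.3, 0.7)
shannon = -sum(pi * log(pi) for pi in p)
for N in (10, 1_000, 100_000):
    k = [round(pi * N) for pi in p]
    print(N, log_multiplicity(k) / N, shannon)
```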
What is the most likely histogram k?
0 = (∂/∂ki) log P(k; q)   for fixed N  ⇒
0 = (∂/∂pi) [ −Σ_{i=1}^{W} pi log pi − α Σ_{i=1}^{W} pi − β Σ_{i=1}^{W} pi εi ]
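Carrying out the derivative completes the step; this is the standard maximum entropy calculation:

```latex
0 \;=\; \frac{\partial}{\partial p_i}\!\left[-\sum_{i=1}^{W} p_i \log p_i
      \;-\;\alpha \sum_{i=1}^{W} p_i \;-\;\beta \sum_{i=1}^{W} p_i \varepsilon_i\right]
  \;=\; -\log p_i - 1 - \alpha - \beta\varepsilon_i
\qquad\Rightarrow\qquad
p_i \;=\; e^{-1-\alpha}\, e^{-\beta \varepsilon_i}\;\propto\; e^{-\beta \varepsilon_i}
```

the Boltzmann distribution: α is fixed by normalization, β by the constraint on the average ε.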
Crazy !
Some non-ergodic CS can be factorized too!
If a factorization P(k; q) = M(k) G(k; q) exists then a MEP exists, with
(1/φ(N)) log P(k; q) = (1/φ(N)) log M(k) + (1/φ(N)) log G(k; q)
(relative entropy = generalized entropy + cross entropy)
When does factorization exist? → read RH, ST, MGM (2014)
Theorem: for processes of type p(x_{N+1}|k; q) (explicitly path-dependent) factorization exists (SK1!) iff the multinomial M(k) → becomes a deformed multinomial M_{u,T}(k)
Deformed multinomials → S_{c,d}
M_{u,T}(k) ≡ N!_u / Π_i [N T(k_i/N)]!_u   (deformed multinomial),   N!_u ≡ Π_{n=1}^{N} u(n)   (deformed factorial)
T monotonic with T(0) = 0, T(1) = 1
deformed multinomials lead to trace-form (c, d)-entropies:
S[p] = (1/φ(N)) log M_{u,T}(k) = ... = −Σ_{i=1}^{W} ∫_0^{p_i} dz Λ(z)
Λ(z) = (1/a) [ T′(z) (N/φ(N)) log u(N T(z)) − log λ ]
Λ(z) does not depend on N → separation of variables with separation constant ν (derivatives w.r.t. z and N) →
General solution for deformed multinomials → (c,d)-entropy
Λ(z) = [ T′(z) T(z)^ν − T′(1) ] / [ T″(1) + ν T′(1)² ]
u(N) = (λ^{N^ν} − 1)/(λ − 1)   ... q-shifted factorials (ν = 1)
φ(N) = N^{1+ν}
simplest case T(z) = z:   Λ(z) = [ T′(z) T(z)^ν − T′(1) ] / [ T″(1) + ν T′(1)² ] = (z^ν − 1)/ν   →
S[p] = Σ_{i=1}^{W} ∫_0^{p_i} dz Λ(z) = K (1 − Σ_{i=1}^{W} p_i^Q)/(Q − 1)
Q ≡ 1 + ν and K constants
This is Sc,d-entropy !!!
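The resulting Q-entropy is easy to evaluate; a small sketch, with the Q → 1 limit recovering Shannon entropy as a sanity check:

```python
from math import log

def tsallis(p, Q):
    """S_Q = (1 - sum_i p_i^Q) / (Q - 1); tends to -sum_i p_i log p_i as Q -> 1."""
    return (1 - sum(pi ** Q for pi in p)) / (Q - 1)

p = (0.1, 0.2, 0.3, 0.4)
shannon = -sum(pi * log(pi) for pi in p)
print(tsallis(p, 1.0001), shannon)   # nearly equal
```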
Is extensive entropy the same as MEP-entropy?
No! they are NOT the same – but related
Both are of (c, d)-entropy type
(c, d)_TD ≠ (c, d)_MEP
Example: MEP for path-dependent process
imagine a history-dependent process of the following kind
P(k|θ) = Σ_{i=1}^{W} p(i|k − e_i, θ) P(k − e_i|θ)   with
p(i|k, θ) = (θ_i / Z(k)) Π_{j=i+1}^{W} f(k_j)   with f(x) = λ^{x^ν}
use the simplest priors θ_i = 1/W; we have u(y) = λ^{y^ν} and T(y) = y
gives entropy S = (1 − Σ_{i=1}^{W} p_i^Q)/(Q − 1)
we get as the prediction from the maximum entropy principle
p_i = [1 − ν(1 − Q)(α + β ε_i)]^{1/(1−Q)}   with ε_i = (i − 1)
Compare with simulation!
[Figure: p_i vs. ε_i = i (log-log), simulation and theory for N = 1000, 5000, 10000; inset: fitted α, β vs. N; parameters ν = 0.25, λ = 1.1]
Which processes have a non-Shannon type MEP?
• Answer: processes whose sample-space reduces as they unfold
Remarkable
• Start with microscopic path-dependent update-equations → compute entropy → maximise it → distribution at any time in the future
• Process is non-ergodic, path-dependent, non-stationary – yet the MEP works
The message
complex systems (out-of-equilibrium, non-ergodic, non-multinomial) break the degeneracy of the entropy −Σ_i pi log pi
→ entropy has 3 faces
• thermodynamic face
• information-theoretic face
• maxent face
Formulas look different for different systems and processes
Example 1: Pólya process
[Figure: (a), (b) illustration of a Pólya urn process]
Example 2: Sample space reducing process
[Figure: a sample space reducing process, panels 1)-6): after each step only states below the current one (13, 9, 7, 5, 3, 1) remain accessible]
The 3 entropies of Pólya and SSR processes

        S_IT                              S_EXT                  S_MEP
Pólya   ((1−q_j)/γ) (1/N) log N           S_{1−γ/(1−q_j), 0}     −Σ_i log p_i
SSR     1 + (1/2) log W                   S_{1,1}                −Σ_{i=2}^{W} [ p_i log(p_i/p_1) + (p_1 − p_i) log(1 − p_i/p_1) ]

expressions valid for large N (iterations) and W (states)
• every system has its entropy
• every entropy has three faces
Conclusions I
• complex systems are non-ergodic by nature
• all statistical SK1-3 systems classified by 2 exponents (c, d)
• a single entropy covers all systems:
  S_{c,d} = r e Σ_i Γ(1 + d, 1 − c ln p_i) − rc
• all known entropies of SK1-SK3 systems are special cases
• distribution functions of all systems are Lambert-W exponentials. There are no exceptions!
• phasespace determines the entropy (c, d)
• maxent principle: exists for path-dependent processes
Conclusions II
• information-theoretic approach leads to (c, d)-entropies
• maxent approach leads to (c, d)-entropies
• extensive and maxent entropies are both (c, d)-entropies but NOT the same
A note on Rényi entropy
• relax Khinchin axiom 4:
  S(A + B) = S(A) + S(B|A) → S(A + B) = S(A) + S(B) → Rényi entropy
• S_R = (1/(1−α)) ln Σ_i p_i^α violates our trace form S = Σ_i g(p_i)
but: our above argument also holds for Rényi-type entropies S = G( Σ_{i=1}^{W} g(p_i) ):
lim_{W→∞} S(λW)/S(W) = lim_{R→∞} G( λ f_g(1/λ) G^{−1}(R) ) / R = [for G ≡ ln] = 1
The Lambert-W: a reminder
• solves x = W(x) e^{W(x)}
• inverse of p ln p: [W(p)]^{−1} = p ln p
• delayed differential equations: ẋ(t) = α x(t − τ) → x(t) = e^{(1/τ) W(ατ) t}
  Ansatz: x(t) = x0 exp[(1/τ) f(ατ) t], with f some function
Amount of information production in a process
Markov chain with states A1, A2, ... An
transition probabilities p_ik
probability for being in state l: P_l = Σ_k P_k p_kl
if the system is in state A_i then we have the scheme
( A1   A2   ...  An  )
( pi1  pi2  ...  pin )
Then H_i = Σ_k p_ik log p_ik is the information obtained when the Markov chain moves from A_i one step to the next

average this over all initial states:
H = −Σ_i P_i H_i = −Σ_i Σ_k P_i p_ik log p_ik
this is the information production if the Markov chain moves one step ahead
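The information production rate can be computed directly; a minimal sketch (stationary distribution via power iteration; assumes strictly positive p_ik so that log p_ik is finite):

```python
import numpy as np

# information production per step of a Markov chain:
# H = -sum_i P_i sum_k p_ik log p_ik, with P the stationary distribution
p = np.array([[0.9, 0.1],
              [0.5, 0.5]])          # row-stochastic transition matrix p_ik

P = np.ones(len(p)) / len(p)
for _ in range(10_000):             # power iteration: P = P p
    P = P @ p

H = -np.sum(P[:, None] * p * np.log(p))
print(P, H)                         # P -> [5/6, 1/6]
```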
Complex systems and systemic risk
KCL london mar 23 2023
Aim
• See systems as networks
• Discuss SR, resilience, robustness as network
properties
• Detect weak spots in systems
• Understand idea to quantify systemic risk
• Discuss applications in a few examples:
  finance, economy, credit, global trading, food supply, global spreading
Why are we interested in SR?
• Complex systems collapse
• When do they?
• Where do they break?
• Can we understand tipping points?
• Can we identify weak spots?
Remember: we see the world as networks
What is collapse?
• Systems perform functions
• Functions depend on how networks form and
restructure (co-evolving)
• Collapse is rapid restructuring of networks
Collapse
• Nodes can vanish
• Nodes can change
• Links can vanish / appear
• Links can change
• Relinking can go wrong
Systemic Risk / Resilience / Robustness
What is systemic risk?
• Probability that system can no longer perform its
function(s)
• Size of system-wide damage (often huge)
• SR is a network property
• SR can, so far, practically not be quantified
• SR can, so far, practically not be managed
Where will this system break?
Understanding SR is necessary to …
• … make systems more resilient
• … make complex systems manageable
• … understand what transition means
To understand SR we need to know …
… networks and their dynamics
Complex systems are unstable
• most complex systems are stochastic
• statistics of complex systems is statistics of power laws
– many large outliers – outliers are the norm
– non-manageability
• details matter
Breaking complexity
if we understand co-evolution, we might break complexity
• by controlling every component: tame complexity → cut off power-law tails
• if we forget a detail → we might lose control
Example 1: Financial SR
Nodes are banks
Links are financial contracts
It is dynamic and co-evolutionary
The three types of financial risk
• economic risk: investment in business idea does
not pay off
• credit-default risk: you don't get back what you
have lent to others
• systemic risk: system stops functioning due to
local defaults and subsequent (global) cascading
Financial systemic risk
• risk that significant fraction of financial network
defaults
• systemic risk is not the same as credit-default risk
• banks care about credit-default risk
• banks have no means to manage systemic risk
• role of regulator: manage systemic risk
• incentivise banks to reduce SR
Systemic risk created on multi-layer networks
• layer 1: lending–borrowing loans
• layer 2: derivative networks
• layer 3: collateral networks
• layer 4: securities networks
• layer 5: cross-holdings
• layer 6: overlapping pfolios
• layer 7: liquidity: over-night loans
• layer 8: FX transactions
Quantification of SR
• Wanted: systemic risk-value for every financial institution
• Input: network
Google has a similar problem: a value for the importance of web pages: a page is important if many important pages point to it
• number for importance → PageRank
Analogously: an institution is systemically risky if systemically risky institutions lend to it
Systemic risk factor – DebtRank R
• Is a ‘different Google’ – adapted to context of SR
(S. Battiston et al. 2012)
• superior to: eigenvector centrality, page-rank, Katz rank ...
Why?
• economic value in the network that is affected by a node's default
• capitalization/leverage of banks taken into account
• cycles taken into account: no multiple defaults
How to compute it?
Basic idea is to kick one node out of the NW and see
what happens:
• How much damage is caused?
• R is percentage of damage wrt total value
• Repeat for all nodes – one after the other, then in
pairs, triples, …
Systemic risk of nodes
• Input: network of contracts between banks
• Output: all banks i get ‘damage value’ Ri
(% of total damage)
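A minimal sketch of the kick-one-node-out idea (this is not the published DebtRank algorithm of Battiston et al.; the exposure matrix W and values v are hypothetical, and each node propagates distress only once, mimicking the no-multiple-defaults rule):

```python
import numpy as np

def damage_fraction(W, v, seed):
    """W[i, j]: fraction of i's equity lost if j defaults (capped at 1).
    v[i]: economic value of node i.  Returns the damage caused by the
    default of `seed`, as a fraction of total value, beyond the seed itself."""
    n = len(v)
    h = np.zeros(n)                  # distress level of each node, in [0, 1]
    h[seed] = 1.0
    active, visited = {seed}, set()
    while active:
        nxt = set()
        for j in active:
            for i in range(n):
                if i not in visited and i not in active and W[i, j] > 0:
                    h[i] = min(1.0, h[i] + W[i, j] * h[j])
                    nxt.add(i)
            visited.add(j)
        active = nxt - visited       # each node propagates distress only once
    return (h @ v - v[seed]) / v.sum()

W = np.array([[0.0, 0.5, 0.0],
              [0.0, 0.0, 0.0],
              [0.3, 0.0, 0.0]])      # 0 is exposed to 1, 2 is exposed to 0
v = np.array([1.0, 1.0, 1.0])
print(damage_fraction(W, v, seed=1))   # ~ 0.2167
```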
Systemic risk spreads by borrowing
Small bank becomes VERY risky
DebtRank Austria Sept 2009
size is not proportional to systemic risk
note: core-periphery structure
Systemic risk profile
[Figure: banks ranked by systemic risk factor: (a) Austria, about 20 banks; (b) Mexico, about 40 banks, combined]
How big is the next financial crisis?
Expected Systemic Loss = Σ_i p_default(i) · DebtRank(i)
Expected loss(i) = Σ_j p_default(j) · Loss-given-default(j) · Exposure(i, j)
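With a DebtRank value and a default probability per institution, the expected systemic loss is a single sum (all numbers here are purely illustrative):

```python
# Expected Systemic Loss = sum_i p_default(i) * DebtRank(i),
# expressed as a fraction of total system value per year
p_default = [0.010, 0.020, 0.005]   # yearly default probability of bank i
debtrank  = [0.40,  0.10,  0.25]    # fraction of system value lost if i defaults
esl = sum(p * R for p, R in zip(p_default, debtrank))
print(esl)   # 0.00725, i.e. 0.725% of system value expected to be lost per year
```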
How big is the next financial crisis?
[Figure: expected systemic loss EL_syst [$/year] for Mexico, 2007-2013, compared with ^MXGV5YUSAC and ^VIX; marked events: subprime crisis, Lehman Brothers collapse, Mexican GDP fell by more than 10%, loss on derivatives of Mexican companies, uncertainty about the rescue of Greece, international alarm over the Eurozone crisis]
ESL: Quantify quality of policy
• expected losses per year within country in case of
severe default and NO bailout
• rational decision on bailouts
– allows us to compare countries
– allows us to compare the situation of a country over time
• are policy measures working in Spain? in Greece?
Systemic risk of links
SR of banks changes with every transaction
All interbank loans: Austria
[Figure: systemic risk increase vs. loan size, all interbank loans in Austria (log-log)]
How to make financial systems better?
i.e. more resilient
Systemic risk is an externality
Management of systemic risk
• Systemic risk is a network property
• Manage systemic risk: re-structure financial networks
• s.t. cascading failure becomes unlikely / impossible
Systemic risk management
=
re-structure networks
Systemic risk elimination
• systemic risk spreads by borrowing from risky agents
• how risky is a transaction?
• ergo: restrict transactions with high systemic risk
→ tax those transactions that increase systemic risk
Incentive to change node-linking strategy:
Systemic risk tax
• tax transactions according to their systemic risk contribution
• banks look for deals with agents with low systemic risk
  → liability networks re-arrange → eliminate cascading
• no-one should pay the tax – the tax serves as an incentive to re-structure networks
• size of tax ∝ expected systemic loss of the transaction (society is neutral)
• if the system is risk free: no tax
• credit volume MUST not be reduced by the tax
Test the efficacy of the tax: ABM
Rebuild the system in the computer: simulator = agent-based model
[Schematic: banks, firms, households, connected by loans, deposits, consumption, and wages / dividends]
The agents
Firms: ask banks for loans
– firms sell products to households: realize profit/loss
– if surplus: deposit it in bank accounts
– firms are bankrupt if insolvent, or if capital is below a threshold
– if a firm is bankrupt, the bank writes off the outstanding loans
Banks try to provide firm-loans. If they do not have enough:
– approach other banks for an interbank loan at interest rate r_ib
– bankrupt if insolvent or equity capital below zero
– bankruptcy may trigger other bank defaults
Households: single aggregated agent: receives cash from firms (through firm-loans) and re-distributes it randomly in banks (household deposits), and among other firms (consumption)
For comparison: implement Tobin tax
• tax on all transactions regardless of their risk
contribution
• 0.2% of transaction (5% of interest rate)
Comparison of three schemes
• No systemic risk management
• Systemic Risk Tax (SRT)
• Tobin-like tax
Model results: Systemic risk profile
[Figure: systemic risk factor per bank, banks ranked: (a) Austria model; (b) under no tax, Tobin tax, systemic risk tax]
Model results: SR of individual loans
[Figure: systemic risk increase vs. loan size (log-log), Austria and model, under no tax, Tobin tax, systemic risk tax]
Model results: Distribution of losses
[Figure: frequency distribution of total losses to banks under no tax, Tobin tax, systemic risk tax]
SRT eliminates systemic risk
How?
Model results: Cascading is suppressed
[Figure: frequency of cascade sizes of defaulting banks under no tax, Tobin tax, systemic risk tax]
SR elimination at no economic cost
[Figure: frequency distribution of transaction volume in the interbank market under no tax, Tobin tax, systemic risk tax]
Tobin tax reduces risk
by reducing credit volume
Policy testing
Once you have a simulator: can implement
policies and see outcomes
Implement current financial regulation: Basel III
Basel III does not reduce SR
Example 2: European Bond NWs
Data for relevant EU banks
European stress testing data 2016 (EBA)
• 51 relevant European banks (49 included in analysis)
• 44 sovereign bond investment categories (36
included)
Asset-bank → bank exposure NW
Exposure networks
[Figure: original vs. optimal network]
Minimal SR in banking networks
Example 3: Bank-firm NWs
Combine it with interbank NW
Combine interbank NW with firm-bank network
to get a ‘complete’ credit network of a nation
then compute the DebtRank again for all banks AND firms
Systemic risk of Austrian banks and firms
[Figure: DebtRank of companies and banks, ranked (values up to ~0.70); named companies include BRAU UNION AG, ÖBB-Infrastruktur AG, OMV AG, VERBUND AG, voestalpine AG, and Telekom Austria AG, among others]
Sector codes: M Services, K Finance & Insurance, F Construction, L Real estate, N Other services, H Logistics, G Automobile sector, D Energy, I Gastronomy, Q Health
Message
more than half of the total financial SR
comes from companies
Example 4: Trading NWs
NW of international trade: Data
• Trade networks between all countries are
available
• Compute effect if supply or demand in one
country changes. What are the implications in
the networks?
• Compute primary and secondary effects
Shock spreading in trade NWs
[Figure: USA 2014: (A) shock ΔX_i to one sector; (B) response functions ΔY_k(t) over t = 0-10 years for ~30 NACE sectors, from public administration, real estate, and finance to manufacturing and construction; (C) affected sectors at t = 0y, 1y, 6y; sector groups: agriculture, mining, manufacturing, electricity & water, construction, trade, transport, accommodation, information & communication, finance, research, administration, other]
Secondary effects: Trump's EU tax on steel and aluminum
Example 5: Supply chain-NWs
Data
• Production networks within countries become
visible through national VAT data
• Assume production functions
• Assume how firms can substitute for lost
suppliers
• Compute SR index: ESRI for every company
Flows of goods and services
Every company has production function
Systemic relevance / resilience of economy
Systemic core of economy (HUN)
Firms in NACE are not input-similar
How to reconstruct supply chains?
Reconstruction for Austria: communication NW
Results from reconstructed supply NW
Example 6: Food supply NWs
Supply networks on product level
Example pork
production
Systemic Risk Index – same idea
Supply chains on product level: monitor
Example 7: Global flows of SR
Use an international SC network (highly incomplete)
What happens in country B if a firm in country A defaults?
Compute an inter-country SR flow index
International flow of systemic risks
The poorer – the more imported SR
Computing resilience points
Economy as a collection of dynamical networks
Scenario: flood destroys capital in Austria
[Figure: cumulative change in GDP growth rate [pp] vs. direct losses in percent of capital stock [%], for 2014, 2015, 2016; 100-year, 250-year, and 1,500-year flood events marked; inflection point w.r.t. economic growth; the maximum marks the threshold at which a natural disaster becomes a systemic event]
Flood-resilience of Austrian economy
Transformation
slide1_merged.pdf
slide1_merged.pdf
slide1_merged.pdf
slide1_merged.pdf
slide1_merged.pdf
slide1_merged.pdf
slide1_merged.pdf
slide1_merged.pdf
slide1_merged.pdf
slide1_merged.pdf

More Related Content

Similar to slide1_merged.pdf

Analisis Dimencional de Ain A. Sonin
Analisis Dimencional de Ain A. SoninAnalisis Dimencional de Ain A. Sonin
Analisis Dimencional de Ain A. Sonin
Gustavo Salazar
 

Similar to slide1_merged.pdf (20)

Tarasov
TarasovTarasov
Tarasov
 
Hamiltonian formulation project Sk Serajuddin.pdf
Hamiltonian formulation project Sk Serajuddin.pdfHamiltonian formulation project Sk Serajuddin.pdf
Hamiltonian formulation project Sk Serajuddin.pdf
 
Da unified
Da unifiedDa unified
Da unified
 
análisis dimensional
análisis dimensionalanálisis dimensional
análisis dimensional
 
Analisis Dimencional de Ain A. Sonin
Analisis Dimencional de Ain A. SoninAnalisis Dimencional de Ain A. Sonin
Analisis Dimencional de Ain A. Sonin
 
Fractals
Fractals Fractals
Fractals
 
Scientific Method
Scientific MethodScientific Method
Scientific Method
 
Waves_Quantum.ppt and Pdf
Waves_Quantum.ppt and Pdf Waves_Quantum.ppt and Pdf
Waves_Quantum.ppt and Pdf
 
Math, applied math, and math in physics
Math, applied math, and math in physicsMath, applied math, and math in physics
Math, applied math, and math in physics
 
Gradu.Final
Gradu.FinalGradu.Final
Gradu.Final
 
¿Cuál es el valor de la gravedad?
¿Cuál es el valor de la gravedad?¿Cuál es el valor de la gravedad?
¿Cuál es el valor de la gravedad?
 
String theory of particle physics
String theory of particle physicsString theory of particle physics
String theory of particle physics
 
Beyond and across space: entanglement
Beyond and across space: entanglementBeyond and across space: entanglement
Beyond and across space: entanglement
 
Route in search of roots forum 3
Route in search of roots forum 3Route in search of roots forum 3
Route in search of roots forum 3
 
Toward A Computational Theory of Everything
Toward A Computational Theory of EverythingToward A Computational Theory of Everything
Toward A Computational Theory of Everything
 
The birth of quantum mechanics canvas
The birth of quantum mechanics canvasThe birth of quantum mechanics canvas
The birth of quantum mechanics canvas
 
Relativity and Quantum Mechanics Are Not "Incompatible"
Relativity and Quantum Mechanics Are Not "Incompatible"Relativity and Quantum Mechanics Are Not "Incompatible"
Relativity and Quantum Mechanics Are Not "Incompatible"
 
Extended Project
Extended ProjectExtended Project
Extended Project
 
Lorentz Length Contraction (More Discussion)
Lorentz Length Contraction (More Discussion)Lorentz Length Contraction (More Discussion)
Lorentz Length Contraction (More Discussion)
 
Consciousness universe vs simulation hypothesis - are we living in a compute...
Consciousness universe vs simulation hypothesis  - are we living in a compute...Consciousness universe vs simulation hypothesis  - are we living in a compute...
Consciousness universe vs simulation hypothesis - are we living in a compute...
 

Recently uploaded

一比一原版(MQU毕业证书)麦考瑞大学毕业证成绩单原件一模一样
一比一原版(MQU毕业证书)麦考瑞大学毕业证成绩单原件一模一样一比一原版(MQU毕业证书)麦考瑞大学毕业证成绩单原件一模一样
一比一原版(MQU毕业证书)麦考瑞大学毕业证成绩单原件一模一样
aqwaz
 
Sun day thang 4 sun life team trung dai
Sun day thang 4 sun life team trung daiSun day thang 4 sun life team trung dai
Sun day thang 4 sun life team trung dai
GiangTra20
 
一比一原版(CCSF毕业证书)旧金山城市学院毕业证原件一模一样
一比一原版(CCSF毕业证书)旧金山城市学院毕业证原件一模一样一比一原版(CCSF毕业证书)旧金山城市学院毕业证原件一模一样
一比一原版(CCSF毕业证书)旧金山城市学院毕业证原件一模一样
basxuke
 
obat aborsi Cibitung wa 082223595321 jual obat aborsi cytotec asli di Cibitung
obat aborsi Cibitung wa 082223595321 jual obat aborsi cytotec asli di Cibitungobat aborsi Cibitung wa 082223595321 jual obat aborsi cytotec asli di Cibitung
obat aborsi Cibitung wa 082223595321 jual obat aborsi cytotec asli di Cibitung
siskavia916
 
Short film analysis.pptxdddddddddddddddddddddddddddd
Short film analysis.pptxddddddddddddddddddddddddddddShort film analysis.pptxdddddddddddddddddddddddddddd
Short film analysis.pptxdddddddddddddddddddddddddddd
LeonBraley
 
prodtion diary updated.pptxyyghktyuitykiyu
prodtion diary updated.pptxyyghktyuitykiyuprodtion diary updated.pptxyyghktyuitykiyu
prodtion diary updated.pptxyyghktyuitykiyu
LeonBraley
 
Norco College - M4MH Athlete Pilot - 4.30.24 - Presentation.pdf
Norco College - M4MH Athlete Pilot - 4.30.24 - Presentation.pdfNorco College - M4MH Athlete Pilot - 4.30.24 - Presentation.pdf
Norco College - M4MH Athlete Pilot - 4.30.24 - Presentation.pdf
RebeccaPontieri
 
obat aborsi Klaten wa 082135199655 jual obat aborsi cytotec asli di Klaten
obat aborsi Klaten wa 082135199655 jual obat aborsi cytotec asli di Klatenobat aborsi Klaten wa 082135199655 jual obat aborsi cytotec asli di Klaten
obat aborsi Klaten wa 082135199655 jual obat aborsi cytotec asli di Klaten
siskavia95
 
Captain america painting competition -- 13
Captain america painting competition -- 13Captain america painting competition -- 13
Captain america painting competition -- 13
Su Yan-Jen
 
一比一原版(Drexel毕业证书)美国芝加哥城市学院毕业证如何办理
一比一原版(Drexel毕业证书)美国芝加哥城市学院毕业证如何办理一比一原版(Drexel毕业证书)美国芝加哥城市学院毕业证如何办理
一比一原版(Drexel毕业证书)美国芝加哥城市学院毕业证如何办理
Fir
 
obat aborsi wonogiri wa 081336238223 jual obat aborsi cytotec asli di wonogir...
obat aborsi wonogiri wa 081336238223 jual obat aborsi cytotec asli di wonogir...obat aborsi wonogiri wa 081336238223 jual obat aborsi cytotec asli di wonogir...
obat aborsi wonogiri wa 081336238223 jual obat aborsi cytotec asli di wonogir...
yulianti213969
 

Recently uploaded (20)

一比一原版(MQU毕业证书)麦考瑞大学毕业证成绩单原件一模一样
一比一原版(MQU毕业证书)麦考瑞大学毕业证成绩单原件一模一样一比一原版(MQU毕业证书)麦考瑞大学毕业证成绩单原件一模一样
一比一原版(MQU毕业证书)麦考瑞大学毕业证成绩单原件一模一样
 
batwhls
batwhlsbatwhls
batwhls
 
Sun day thang 4 sun life team trung dai
Sun day thang 4 sun life team trung daiSun day thang 4 sun life team trung dai
Sun day thang 4 sun life team trung dai
 
Neighborhood Guide To Atlanta’s Awe-Inspiring Art Galleries
Neighborhood Guide To Atlanta’s Awe-Inspiring Art GalleriesNeighborhood Guide To Atlanta’s Awe-Inspiring Art Galleries
Neighborhood Guide To Atlanta’s Awe-Inspiring Art Galleries
 
Hosewife Bangalore Just VIP Btm Layout 100% Genuine at your Door Step
Hosewife Bangalore Just VIP Btm Layout 100% Genuine at your Door StepHosewife Bangalore Just VIP Btm Layout 100% Genuine at your Door Step
Hosewife Bangalore Just VIP Btm Layout 100% Genuine at your Door Step
 
K_ E_ S_ Retail Store Scavenger Hunt.pptx
K_ E_ S_ Retail Store Scavenger Hunt.pptxK_ E_ S_ Retail Store Scavenger Hunt.pptx
The aim is to make clear that the science of CS is a non-trivial combination of three disciplines, and that this mix becomes something like a discipline in itself.
What is physics?

Physics is the experimental, quantitative and predictive science of matter and its interactions.
• quantitative: statements are made with numbers – less ambiguous than words
• predictive: statements are given in the form of predictions which can be experimentally tested
Basically one asks specific questions of nature in the form of experiments – and, in fact, one gets answers. This methodology is unique in world history: the scientific method.
Matter                   Relevant interaction types       Length scale
Macroscopic matter       gravity, electromagnetism        all ranges
Molecules                electromagnetism                 all ranges
Atoms                    electromagnetism, weak force     ~ 10^-18 m
Hadrons & leptons        electromagnetism, weak/strong    10^-18 – 10^-15 m
Quarks & gauge bosons    electromagnetism, weak/strong    10^-18 – 10^-15 m

All interactions in the physical world are mediated by the exchange of gauge bosons. For gravity the situation is not yet experimentally founded.
The nature of fundamental forces

Typically the four fundamental forces act homogeneously and isotropically in space and time. There are famous exceptions, however: the strong interaction acts as if it were limited to a 'string' – similar to type II superconductivity.
These interactions work on very different scales, from light years down to femtometres: this means that one typically has to consider only one force for a given phenomenon of interest, since one force can be 100, 10^6, or even up to 10^39 times stronger than another.
Traditionally physics does not specify which particles interact with each other. Usually they all interact equally; the interaction strength depends on the interaction type and the form of the potential.
What does predictive mean?

Assume you can do an experiment over and over again, e.g. drop a stone. The theoretical task is to predict the reproducible results. Since Newton, physics follows this recipe:
• Find the equations of motion that encode your understanding of a dynamical system:
    dp/dt = F(x),   p = m dx/dt
• Predictive means: once F is specified the problem is solved, provided the initial and/or boundary conditions are known. The result is x(t).
• Compare the result with your experiments.
Note: fixing initial and boundary conditions means taking the system out of its context – out of the rest of the universe. This is why physics works!
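The recipe can be tried out numerically in a few lines. A minimal sketch, not from the slides: the force, mass, timestep and integrator below are my own illustrative choices. We specify a constant gravitational force on a 1 kg stone, fix the initial conditions, and integrate dp/dt = F(x), p = m dx/dt.

```python
# Minimal sketch of the Newtonian recipe: specify F, fix initial
# conditions, integrate the equations of motion.
def integrate(F, m, x0, v0, dt, steps):
    """Symplectic Euler integration of dp/dt = F(x), p = m dx/dt."""
    x, v = x0, v0
    for _ in range(steps):
        v += F(x) / m * dt   # update the velocity from the force
        x += v * dt          # update the position from the velocity
    return x

g = 9.81                      # gravitational acceleration (m/s^2)
F = lambda x: -g * 1.0        # constant force on a 1 kg stone
# integrate for t = 1 s; the analytic result is x(1) = -g/2 = -4.905 m
x_final = integrate(F, m=1.0, x0=0.0, v0=0.0, dt=1e-4, steps=10_000)
```

The point of the sketch: once F and the initial conditions are fixed, x(t) follows, and the numerical trajectory can be compared with the analytic solution x(t) = -g t^2 / 2.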
The same philosophy holds for arbitrarily complicated systems. Assume a vector X(t) represents the state of a system (for example all positions and momenta); then we get a set of equations of motion of the form
    dX(t)/dt = G(X(t))
Predictive means that in principle you can solve these equations. However, they can be hard to solve. Already for three bodies this becomes a hard task – the famous three-body problem (sun, earth, moon).
Laplace: a demon knowing all the initial conditions and able to solve all the equations could predict everything.
The problem is that this demon is hard to find. In fact, the Newton-Laplace program becomes completely useless for most systems. So are these systems not predictable?
Consider water and make the following experiment over and over again:
• cool it to 0 °C – it freezes
• heat it to 100 °C – it boils
almost certainly (under standard conditions).
One can maybe measure the velocity of a single gas molecule at a point in time, but not of all O(10^23) molecules at the same time. But one can compute the probability that a gas molecule has a velocity v,
    p(v) ∝ exp(− m v^2 / (2kT))
For many non-interacting particles these probabilities become extremely precise, and one can make predictions about aggregates of particles. Necessary: interactions are weak and the number of particles is large. Note that computing the freezing temperature of water was impossible before simulations.
Note: the word prediction now has a much weaker meaning than in the Laplace-Newton sense. The concept of determinism is diluted.
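The claim that aggregate predictions become extremely precise for large numbers can be checked directly. A hedged sketch (the value of kT/m and the sample size are arbitrary choices of mine): one velocity component under p(v) ∝ exp(−m v²/(2kT)) is Gaussian with variance kT/m, so we sample many molecules and compare the sample variance with the prediction.

```python
import math
import random

# One velocity component follows p(v) ~ exp(-m v^2 / (2kT)),
# i.e. a Gaussian with mean 0 and variance kT/m.
kT_over_m = 2.0                       # arbitrary illustrative value
random.seed(1)
n = 200_000
vs = [random.gauss(0.0, math.sqrt(kT_over_m)) for _ in range(n)]
sample_var = sum(v * v for v in vs) / n
# For large n the aggregate prediction becomes extremely precise:
rel_err = abs(sample_var - kT_over_m) / kT_over_m
```

For a single molecule the velocity is unpredictable; for 2·10^5 of them the variance already matches the prediction to well below a percent, which is the diluted sense of "prediction" the slide refers to.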
The triumph of statistical mechanics

The idea of statistical mechanics is to understand the macroscopic properties of a system from its microscopic components: relate the micro world with the macro world.
Typically in physics the macroscopic description is simple and corresponds to the phase the system is in (solid, gaseous, liquid). Physical systems often have very few phases.
A system is often prepared in one macro state (e.g. temperature and pressure are given). There are usually many possible microstates that are related to that macro state. In statistical mechanics the main task is to compute the probabilities for the many microstates that lead to that single macro state.
Traditional physics works fine for a few particles (Newton-Laplace) and for many non-interacting particles (Boltzmann-Gibbs). In other words, the class of systems that can be understood by physics is not so big. And there were further, more severe shifts to the concept of predictability.
What does prediction mean? The crises of physics

Prediction in the 18th century is quite different from the concept of prediction in the 21st:
• classical physics: exact prediction of trajectories
• crisis 1900: too many particles → statistical physics: laws of probability allow stochastic predictions of the macro (collective) behavior of gases, assuming trajectories are predictable in principle
• crisis 1920s: the concept of determinism evaporates completely → QM and non-linear dynamics: unpredictable components – collective phenomena remain predictable
• crisis 1990s: cannot deal with strong interactions in statistical systems → Complex Systems: the situation can be worse than in QM: unpredictable components and complicated interactions – the hope is that the collective is still predictable
Physics is analytic, complex systems are algorithmic

Physics largely follows an analytical paradigm. Knowledge of phenomena is expressed in analytical equations that allow us to make predictions. This is possible because interactions in physics do not change over time.
This is radically different for complex systems: the interactions themselves can change over time. In that sense, complex systems change their internal interaction structure as they evolve – co-evolution.
Systems that change their internal structure dynamically can be viewed as machines and are best described as algorithms – a list of rules for how the system updates its states and future interactions. Many complex systems work like this: the states of components and the interactions between them are updated simultaneously, which leads to tremendous mathematical difficulties.
Whenever it is possible to ignore the changes in the interactions of a dynamical system, analytic descriptions become meaningful. Physics is analytic; complex systems are algorithmic. Experimentally testable quantitative predictions can be made with analytic or algorithmic descriptions.
What are complex systems from a physics point of view?

• many-particle systems (as in statistical physics)
• may have stochastic components and elements (as e.g. in QM)
• interactions may be specific between two objects (networks)
• interactions may be complicated and co-evolving: states and interactions
• they are often chaotic and driven systems
• interacting bodies are not limited to matter
• interactions are not limited to the 4 fundamental forces
• they can show a very rich phase structure
• they can have many macro states, even simultaneously realized
Most physicists will have no problem calling these extensions to traditional physics CS. For them it is still physics. The prototype model of CS in physics is the so-called spin glass.
This is not what we call Complex Systems yet – crucial concepts are still missing.
However, due to more specific interactions and the increased variety of types of interactions, the variety of macroscopic states changes drastically. These emerge from the properties of the system's components and the interactions. The phenomenon that a priori unexpected properties may arise as a consequence of the generalized interactions is sometimes called emergence. Such CS can have an extremely rich phase structure.
When there is a plurality of macro states in a system, this leads to entirely new questions one can ask of the system:
• What is the number of macro states?
• What are their co-occurrence rates?
• What are the typical sequences of occurrence?
• What are their lifetimes?
• What are the transition probabilities?
etc.
What we definitely want to keep from physics

CS is the experimental, quantitative and predictive science of generalized matter with generalized interactions.
Generalized interactions are described by the interaction type α, and by who interacts with whom. If there are more than 2 objects involved, interactions are conveniently encoded in networks M^α_ij(t).
Interactions themselves remain based on the concept of exchange. For example, think of communication, where messages are exchanged; trade, where goods and services are exchanged; friendships, where wine bottles are exchanged, etc.
For many CS the framework of physics is incomplete. What are the missing concepts? Equilibrium, co-evolution, the adjacent possible, ...
A note on chemistry – the science of equilibria

In chemistry, interactions between atoms and molecules can already be quite specific. So why is chemistry usually not a CS?
Classically, chemistry is based on the law of mass action – many particles interact in a way that reaches equilibrium:
    αA + βB ⇌ σS + τT
α, β, σ, τ are the stoichiometric constants; k+ and k- are the reaction rates.
• forward reaction rate: k+ {A}^α {B}^β   ({.} denotes reactive mass)
• backward reaction rate: k- {S}^σ {T}^τ
These are equal in equilibrium:
    K = k+/k- = {S}^σ {T}^τ / ({A}^α {B}^β)
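The equilibrium condition can be verified by integrating the kinetics directly. A minimal sketch with all stoichiometric constants set to 1 and rate constants chosen by me: the net flux k+{A}{B} − k−{S}{T} is integrated until it vanishes, and the resulting concentration ratio reproduces K = k+/k−.

```python
# Sketch: the reaction A + B <-> S + T (all stoichiometric constants 1),
# integrated with forward Euler until forward and backward rates balance.
kp, km = 2.0, 1.0                     # forward / backward reaction rates
A = B = 1.0                           # initial reactive masses
S = T = 0.0
dt = 1e-3
for _ in range(100_000):              # integrate to t = 100
    flux = kp * A * B - km * S * T    # net forward reaction flux
    A -= flux * dt
    B -= flux * dt
    S += flux * dt
    T += flux * dt
K_measured = (S * T) / (A * B)        # should approach kp/km = 2
```

This is exactly the fixed-point character the slide refers to: the dynamics relaxes to the state where the law of mass action holds. Driven CS lack such a fixed-point equation.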
Many CS are characterized by the fact that they are out of equilibrium (driven). This means that there are no fixed-point-type equations that can be used to solve the problem. In this situation the help from statistical mechanics becomes very limited. It is hard to handle systems that are out of equilibrium. Even so-called stationary non-equilibrium is hard to understand – even computationally (thermostats).
On the positive side: many CS are self-organized critical.
Let's keep from chemistry:
• Many CS are out of equilibrium
• Many CS are non-ergodic
Note: as soon as one focuses on e.g. cyclical catalytic reactions on networks, chemistry very quickly becomes a CS.
What is biology?

Life science is the experimental science of living matter. What is living matter? What are the minimal conditions for living matter to exist?
According to S.A. Kauffman, living matter
• has to be self-replicating
• has to run through at least one Carnot cycle
• has to be localized
Living matter is a self-sustained sequence of genetic activity over a lifetime. It uses energy and performs work. It is constantly out of equilibrium.
Genetic activity has to do with chemical reactions which take place e.g. within cells (compartments). Unfortunately, this is only partly true. Chemical reactions usually involve billions of atoms or molecules; what happens in the cell is chemistry with few molecules. If you have only a few molecules, problems arise:
• the law of large numbers becomes inappropriate
• the laws of diffusion become inadequate
• the concept of equilibrium becomes shaky
• without equilibrium, what is the law of mass action?
If we do not have a law of mass action, how is chemistry to be done? Consequently, traditional chemistry is often rather inadequate for living matter.
More complications in the cell:
• Molecules may be transported from the site of production to where they are needed. This changes the law of diffusion even more; it becomes anomalous diffusion,
    d/dt p(x,t) = D d^(2+ν)/dx^(2+ν) p(x,t)^µ
• Chemical binding depends on the 3D structure of molecules.
• Chemical binding depends on the 'state' of molecules, e.g. whether they are phosphorylated or not.
• 'Reaction rates' – if one still wants to use this term – depend on the statistical mechanics of small systems, i.e. fluctuation theorems might become important.
Biological interactions happen on networks – almost exclusively

Genetic regulation governs the temporal sequence of the abundances of proteins, nucleic material and metabolites within a living organism.
• Genetic regulation can be viewed as a discrete interaction.
• Protein-protein binding is discrete, for example complex formation.
Discrete interactions are described by networks:
• gene-regulatory network (e.g. Boolean)
• metabolic network
• protein-protein network
Evolution

Nothing in biology makes sense except in the light of evolution. (Dobzhansky)
Genetic material and the process of replication involve several stochastic components and lead to variations in copies. Replication and variation are two of the three main ingredients of all evolutionary processes.
What is evolution? Remember the Darwinian story:
Consider a population of some kind. The offspring of this population carry some random variations (e.g. mutations). Individuals with optimal variations (given a certain surrounding) have a selection advantage, i.e. fitness. This fitness manifests itself in these individuals having more offspring, and thus passing the particular variation on to a new generation. In this way 'optimal' variations get selected over time.
Is this definition predictive science? Or is it just a convincing story? Is the Darwinian story falsifiable? How can we measure fitness? It is an a posteriori concept. Survival of the fittest ≡ survival of those who survive.
Evolution is a three-step process:
1. A new thing comes into being in a given environment.
2. The new thing has the chance to interact with the environment. The result of this interaction: the possibility to get selected or destroyed.
3. If the new thing gets selected (survives) in this environment, it becomes part of this environment – it becomes part of the new environment for all future, newly arriving things.
Evolution happens simultaneously on various scales (time & space): cells – organisms – populations.
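The three steps can be put into a toy simulation. All modelling choices below (trait values, mutation size, survival rule) are invented here purely for illustration: each round a variant of an established trait appears, its survival depends on the current environment, and survivors join the environment that later variants will face.

```python
import random

# Toy sketch of the three-step evolutionary process: a new variant
# appears, interacts with the current environment, and - if selected -
# becomes part of the environment for later arrivals.
random.seed(7)
environment = [0.5]                              # traits already established
for _ in range(500):
    parent = random.choice(environment)
    new = parent + random.gauss(0.0, 0.1)        # step 1: variation
    # step 2: interaction with the environment; here survival probability
    # simply grows with the trait value (an invented selection rule)
    if random.random() < max(0.0, min(1.0, new)):
        environment.append(new)                  # step 3: selected -> part
                                                 # of the new environment
mean_trait = sum(environment) / len(environment)
```

Note the essential feedback: the pool from which parents are drawn is itself the outcome of earlier selections, so the "boundary conditions" of the process change as it runs.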
Evolution is not physics. Think of this three-step process in terms of equations of motion:
1. Write down the dynamics of the system in the form of equations of motion.
2. The boundary conditions depend on these equations – you cannot fix them.
3. Consequently, you cannot solve the equations.
4. The Newtonian recipe breaks down; you fail – the program becomes a mathematical monster if you think of boundary conditions dynamically coupled to the dynamical system.
The task is: try to solve it nevertheless. We will see that multi-scale methods can be used to address this type of problem.
The concept of evolution is fundamentally different from physics. It is immediately evident that we are confronted with two huge problems:
• Boundary conditions cannot be fixed.
• Phase space is not well defined – it changes over time. New elements may emerge that change the environment substantially.
The evolutionary aspect is essential for many CS; it cannot be neglected.
Evolution is a natural phenomenon.
• Evolution is a process that increases diversity; it looks as if it is a 'creative' process.
• Evolution has many forms of appearance – it is 'universal': biological evolution, technological innovation, the economy, financial markets, history, etc.
• Evolution follows patterns, regardless of its form of appearance. These patterns are surprisingly robust. For example: wherever you look, there is life. Ecological niches that can get filled will get filled. Power laws.
• As a natural phenomenon it deserves a scientific explanation (quantitative & predictive).
A side note on evolution as a natural phenomenon.
Natural phenomena are to a certain degree predictable, i.e. one can trust patterns: drop a stone and it will fall down; if water gets cold, it freezes; etc. The credo in the natural sciences: one can trust in the fact that natural phenomena work. If this is the case, there must be basic laws on scientific grounds.
With almost certainty: no ecological niche is empty; life is creative and keeps on going regardless of catastrophes etc. Certain statistical patterns in evolutionary timeseries are repeated over and over again – regardless of the specific system.
It will not be possible to predict what animals live on earth in 500,000 years. But it should be possible to make falsifiable predictions on systemic quantities such as diversity, diversification rates, robustness, resilience, adaptability. See our discussion on predictability.
Missing: ways to predict these systemic features quantitatively.
A language for evolutionary processes – the adjacent possible

This 'science' is at its very beginning. Maybe not even the right language has been established so far.
The adjacent possible (AP) is the set of all possible worlds that could potentially exist in the next timestep. The adjacent possible depends strongly on the present state of the world. With this definition, evolution is a process that continuously fills the AP.
Different from physics, where a given state determines the next state and all potential states are known → nature is algorithmic. In evolution, a given state determines the realization of a (huge) set of possible states. The future states may not even be known at all – a 'creative' process. The filling of the adjacent possible determines the next adjacent possible.
We have learned for evolutionary processes:
• One cannot fix the boundary conditions of evolutionary systems. This means that it is impossible to take the system apart without possibly losing critical properties. Here the triumphal concept of reductionism starts to become inadequate.
• Evolutionary CS feel their boundary conditions.
• Evolutionary CS change their boundary conditions.
In physics the adjacent possible is very small. For example, imagine a falling stone: the AP is that it is on the floor in two seconds; there are practically no other options. Lagrangian trajectories are completely specified by initial & end point.
In evolution the AP evolves: AP(t) → AP(t+1) → AP(t+2) → ···
In physics, the realization of the AP does (almost) not influence the next AP.
Biological systems are adaptive and robust – the concept of the edge of chaos

The possibility to adapt and robustness seem to exclude each other. However, living systems are clearly adaptive and robust at the same time. To explain how this is possible one can take the picturesque view:
Every dynamical system has a maximal Lyapunov exponent. It measures how two initially infinitesimally close trajectories diverge over time. The exponential rate of divergence is the Lyapunov exponent λ,
    |δX(t)| ∼ e^(λt) |δX(0)|,   where δX(0) is the initial separation
• If the exponent is positive, the system is called chaotic, or strongly mixing.
• If the exponent λ is negative, the system approaches an attractor – two initially infinitesimally adjacent trajectories converge. The system is periodic.
• If the exponent is zero, the system is called quasi-periodic, or 'at the edge of chaos'.
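The Lyapunov exponent can be estimated numerically for a one-dimensional map. A sketch using the logistic map x → rx(1−x), my choice of example rather than the slides': for such a map λ is the trajectory average of log|f′(x)|. At r = 4 the map is fully chaotic with λ = ln 2 > 0; at r = 3.2 it settles on a stable 2-cycle with λ < 0.

```python
import math

# Numerical Lyapunov exponent of the logistic map f(x) = r x (1 - x),
# estimated as the trajectory average of log |f'(x)|, f'(x) = r (1 - 2x).
def lyapunov(r, x=0.3, transient=1_000, steps=100_000):
    for _ in range(transient):            # discard the transient
        x = r * x * (1.0 - x)
    s = 0.0
    for _ in range(steps):
        s += math.log(abs(r * (1.0 - 2.0 * x)))
        x = r * x * (1.0 - x)
    return s / steps

lam_chaotic = lyapunov(4.0)    # fully chaotic regime: lambda = ln 2 > 0
lam_periodic = lyapunov(3.2)   # stable 2-cycle: lambda < 0
```

Scanning r between these two values, λ crosses zero at the bifurcation points – the "edge of chaos" points of measure zero that the next slide asks about.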
How does nature find the edge of chaos?

The set of points where the Lyapunov exponents are zero is usually of measure zero. However, evolution seems to find and select exactly these points.
How can a mechanism of evolution detect something of measure zero? One explanation is self-organized criticality (SOC), where systems organize themselves to operate at a critical point between order and randomness. SOC can emerge from interactions in different systems, including sand piles, precipitation, heartbeats, avalanches, forest fires, earthquakes, etc.
Or is this set simply not of measure zero? Learn more about these questions in the chapter 'Evolution' in the book.
Biological systems are self-organized and critical

Self-organized systems are dynamical systems that have a critical point as an attractor. Very often these systems – at a macroscopic scale – are characterized by scale invariance: the absence of a characteristic (length) scale.
A critical point of a system is reached at conditions (temperature, pressure, slope of a sandpile, etc.) where a characteristic length scale (e.g. the correlation length) becomes divergent. This is typical of slowly driven systems, where driven means that they are pushed softly away from equilibrium.
Applications cover all the sciences:
• Physics: particle, geo-, plasma-, solar physics, cosmology, quantum gravity, ...
• Biology: evolutionary biology, ecology, neurobiology, ...
• Social sciences: economics, sociology, ...
An intuition for self-organized critical systems – sandpile models

Imagine a pile of sand.
• If the slope is too steep, avalanches go off → the slope becomes flatter.
• If the slope is too flat, sand gets deposited → the slope becomes steeper.
The pile self-organizes toward a critical slope. The system is robust and adaptive.
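This intuition corresponds to the Bak-Tang-Wiesenfeld sandpile model, which can be simulated in a few lines. A minimal sketch (the grid size and number of grains are my own choices; the toppling threshold of 4 is the model's standard one): grains are dropped on random sites, any site holding 4 or more grains topples and passes one grain to each neighbour, and grains fall off at the boundary.

```python
import random

# Bak-Tang-Wiesenfeld sandpile: slow driving (one grain at a time)
# punctuated by fast relaxation (avalanches of topplings).
random.seed(0)
L = 10
grid = [[0] * L for _ in range(L)]
avalanche_sizes = []
for _ in range(2_000):
    i, j = random.randrange(L), random.randrange(L)
    grid[i][j] += 1                          # slow driving: add one grain
    size = 0
    unstable = [(i, j)] if grid[i][j] >= 4 else []
    while unstable:                          # relax until all sites stable
        x, y = unstable.pop()
        if grid[x][y] < 4:
            continue
        grid[x][y] -= 4                      # topple
        size += 1
        if grid[x][y] >= 4:                  # may need to topple again
            unstable.append((x, y))
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = x + dx, y + dy
            if 0 <= nx < L and 0 <= ny < L:  # grains fall off the edge
                grid[nx][ny] += 1
                if grid[nx][ny] >= 4:
                    unstable.append((nx, ny))
    avalanche_sizes.append(size)
max_height = max(max(row) for row in grid)
```

The pile self-organizes to a stationary critical state; a histogram of `avalanche_sizes` shows the broad, roughly power-law distribution characteristic of SOC.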
Let us collect the components for CS that we get from the life sciences:
• Interactions take place on networks.
• Out of equilibrium – they are driven.
• Evolutionary dynamics.
• Core components are discrete – e.g. Boolean networks.
• Adaptive and robust: edge of chaos – self-organized critical.
• Most evolutionary complex systems are path-dependent and have memory (non-ergodic or non-Markovian).
• The adjacent possible is a way of conceptualizing the evolution of the 'reachable' phase space.
We learn how to do statistics of driven, out-of-equilibrium systems in the courses on Scaling and Statistical Mechanics.
We conclude this section by listing a few aims of current life science:
• Understand basic mechanisms (molecular, geno-phenotype relations, epigenetics).
• Manipulate and manage living matter (medicine, smart matter).
• Create living matter from scratch (artificial life).
Does this sound too optimistic? It is to a certain extent, but several prerequisites for these tasks certainly look good at the moment. The components are understood and basically under experimental control:
• Genetic sequences of hundreds of organisms are available.
• Proteins of thousands of organisms are available.
• We can synthesize proteins that nature does not produce.
• We can synthesize arbitrary RNA sequences on demand.
• All metabolites of organisms are available.
• We can count molecules, understand basic transport mechanisms, etc.
What is social science?

Social science is the science of dynamical social interactions and their implications for society. Traditionally it is neither quantitative nor predictive, nor does it produce experimentally testable predictions. Why is that so?
• Lack of detailed data.
• Lack of reproducibility / repeatability.
The queen of the social sciences is economics, i.e. the science of the invention, production, distribution, consumption, and disposal of goods and services. All of these components happen on networks; the associated interactions are very much directed and interconnected. Maybe network aspects are most illustrative when studied in the social sciences, even more so than in the life sciences.
What are societies?

• People, goods or institutions are represented by the nodes of a network.
• Interactions are represented by links in networks of various types.
• One node may be engaged in several types of interaction.
• Nodes are characterized by 'states': wealth, opinion, age, etc.
• Nodes and links change over time.
What are societies?

Societies are co-evolving multiplex networks, M^α_ij(t). A multiplex network is a collection of networks on the same set of nodes.
Links: α = 1: communication (full line); α = 2: trading (dashed line); α = 3: friendship (dotted line).
States: black – votes for Hillary; grey – votes for Trump.
Nodes i (humans or institutions) are characterized by states σ_i(t).
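A multiplex network M^α_ij(t) is straightforward to represent in code. A sketch with invented node names and layers: one adjacency matrix per interaction type α, all defined on the same node set, plus a state σ_i for every node.

```python
# A multiplex network: one adjacency matrix per interaction layer alpha,
# all on the same set of nodes, plus a state per node.
nodes = ["Alice", "Bob", "Carol", "Dave"]     # illustrative node set
n = len(nodes)
layers = ["communication", "trade", "friendship"]
M = {a: [[0] * n for _ in range(n)] for a in layers}

def link(layer, i, j):
    """Add an undirected link between nodes i and j on one layer."""
    M[layer][i][j] = M[layer][j][i] = 1

link("communication", 0, 1)
link("trade", 0, 1)            # one pair may interact on several layers
link("friendship", 2, 3)
state = {i: "A" for i in range(n)}            # node states, e.g. opinions

# number of layers on which nodes 0 and 1 are linked:
overlap_01 = sum(M[a][0][1] for a in layers)
```

The point of the structure: the same pair of nodes can be linked on some layers and not on others, which is exactly what M^α_ij encodes.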
Let us collect the components for CS that we get from the social sciences:
• Interactions happen on collections of networks (multiplex networks).
• Networks may interact with themselves.
• Networks show a rich variety in growth and re-structuring.
• Networks are evolutionary and co-evolutionary objects.
The concept of co-evolution

What is an interaction? In general, an interaction can change the state of the interacting objects, or of the environment. For example, the collision of two particles changes their momenta; the magnetic interaction of two spins may change their orientations; an economic interaction changes the portfolios of the participants involved.
The interaction partners (network or multiplex) of a node can be seen as the 'environment' (space) of that node. The environment determines the future state of the node.
Interactions can change over time. For example, people establish new friendships or economic links; countries terminate diplomatic relations. The states of the nodes determine the future state of a link, i.e. whether it exists in the future or not.
The state (topology) of the network determines the future states of the nodes. The states of the nodes determine the future states of the links of the network.
  • 50. What are CS? A pictorial view Co-evolving multiplex networks – more formally. (d/dt) σ^α_i(t) ∼ F(M^α_ij(t), σ^β_j(t)) and (d/dt) M^α_ij(t) ∼ G(M^α_ij(t), σ^β_j(t)). This is maybe the simplest way to illustrate what a CS is, or looks like. Networks are observable (big data). Collections of networks are manageable. States of individual nodes are (or will be) observable. From a practical point of view it is useless, because G and F are not specified. They can be stochastic. However, it should be possible to express most CS of the form we discussed in this form. Thurner Intro to the theory of CS KCL london mar 20, 2023 50 / 55
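The schematic equations above can be animated with a toy co-evolving network. The following is a minimal sketch of a coevolutionary voter-type dynamics; the rewiring probability `phi`, the update rule, and all parameter values are illustrative assumptions, not part of the lecture:

```python
import random

def coevolving_network(n=100, m=300, phi=0.3, steps=300000, seed=1):
    """Toy co-evolution: states sigma_i and links M_ij update each other.
    A discordant link is either resolved by copying a neighbour's state
    (states follow links, the 'F' part) or, with probability phi, rewired
    to a like-minded node (links follow states, the 'G' part)."""
    rng = random.Random(seed)
    sigma = [rng.choice([-1, 1]) for _ in range(n)]
    links = set()
    while len(links) < m:                      # random initial topology
        i, j = rng.randrange(n), rng.randrange(n)
        if i != j:
            links.add((min(i, j), max(i, j)))
    links = list(links)
    for _ in range(steps):
        k = rng.randrange(m)
        i, j = links[k]
        if sigma[i] == sigma[j]:
            continue                           # concordant link: nothing happens
        if rng.random() < phi:                 # G: rewire the discordant link
            cands = [v for v in range(n) if v != i and sigma[v] == sigma[i]]
            if cands:
                v = rng.choice(cands)
                links[k] = (min(i, v), max(i, v))
        else:                                  # F: node i adopts j's state
            sigma[i] = sigma[j]
    return sigma, links

sigma, links = coevolving_network()
frac_discordant = sum(sigma[i] != sigma[j] for i, j in links) / len(links)
print(frac_discordant)  # co-evolution drives the fraction of discordant links down
```

The point of the sketch is only the feedback loop: links select which states interact, and states decide which links survive.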
  • 51. What are CS? A summary 1 Complex systems are composed of many elements (Latin indices i). 2 Elements interact through one or more interaction types (Greek indices α). 3 Interactions are not static but change over time, M^α_ij(t). 4 Elements are characterized by states, σ_i(t). 5 States and interactions often evolve together by mutually updating each other: they co-evolve. 6 The dynamics of co-evolving multilayer networks is usually non-linear. 7 CS are context-dependent. Multilayer networks provide that context and thus offer the possibility of a self-consistent description of CS. 8 CS are algorithmic. 9 CS are path-dependent and consequently often non-ergodic. 10 CS often have memory. Information about the past can be stored in nodes or in the network structure of the various layers. Thurner Intro to the theory of CS KCL london mar 20, 2023 51 / 55
  • 52. What are CS? The role of the computer The computer is the game changer. CS have never been accessible to control because of computational limits. The analytic tools available until the 1980s were simply too limited to address problems beyond simple physics problems and equilibrium solutions in (evolutionary) biology and economics. In the latter, the most famous concept has been game theory. Thurner Intro to the theory of CS KCL london mar 20, 2023 52 / 55
  • 53. What are CS? The role of the computer The computer is the game changer. The computer changed the path of science and opened the way to CS. Mimic systems through simulation. Agent-based models model interactions by brute force and study the consequences for the collective. In many systems there is only a single history, especially in the social sciences, or in evolution. The computer allows us to create artificial histories of statistically equivalent copies – ensemble pictures help to understand systemic properties that would otherwise not be tractable. This solves the repeatability problem of CS. Develop intuition by building the models. It forces you to systematically think problems through. Maybe one can condense the intuition gained this way into traditional formula-language. Computation itself profited immensely from CS in the past 3 decades. Computational limits are very often no longer the limit for understanding – i.e. controlling and managing – CS. Also data issues become less and less relevant in present times. Science experiences an unheard-of revolution. Thurner Intro to the theory of CS KCL london mar 20, 2023 53 / 55
  • 54. What are CS? Some triumphs of the science of CS: Network theory Genetic regulatory networks and Boolean networks Self-organized criticality Genetic algorithms Auto-catalytic networks Theory of increasing returns Origin and statistics of power laws Mosaic vaccines Statistical mechanics of complex systems Network models in epidemiology Complexity economics and systemic risk Allometric scaling in biology Science of cities Thurner Intro to the theory of CS KCL london mar 20, 2023 54 / 55
  • 55. What are CS? Aim of the course Clarify the origin of power laws, especially in the context of driven non-equilibrium systems. Derive a framework for the statistics of driven systems. Categorize probabilistic complex systems into equivalence classes that characterize their statistical properties. Present a generalization of statistical mechanics, and information theory, so that they become useful for CS. In particular, we derive an entropy for complex systems. The overarching theme is understand co-evolutionary dynamics of states and interactions. Thurner Intro to the theory of CS KCL london mar 20, 2023 55 / 55
  • 56. Where do scaling laws come from? stefan thurner KCL london mar 21 2023
  • 57. work done together with Bernat Corominas-Murtra, Rudolf Hanel and Murray Gell-Mann RH, ST, MGM, PNAS 111 (2014) 6905-6910 BCM, RH, ST, PNAS 112 (2015) 5348-5353 BCM, RH, ST, New J Physics 18 (2016) 093010 BCM, RH, ST, J Roy Soc Interface 12 (2016) 20150330 ST, BCM, RH, Phys Rev E 96 (2017) 032124 BCM, RH, ST, Sci Rep (2017) 11223 BCM, RH, LZ, ST, Sci Rep 8 (2018) 10837 RH, ST, Entropy 20 (2018) 838 KCL london mar 21 2023 1
  • 58. 1809 KCL london mar 21 2023 2
  • 59. 1809 Gauss’ theory of errors KCL london mar 21 2023 3
  • 60. 1810 Laplace: “central limit theorem” add independent random numbers that come from an identical source → the sum is a random number from a normal distribution • simple systems: constituents don’t interact (or only weakly) • entered physics through Maxwell half a century later • a handle for variables localized around an average KCL london mar 21 2023 4
  • 61. e^{−x²}: x = 1 → e^{−1} ≈ 0.3679; x = 10 → e^{−100} ≈ 3.72 × 10^{−44} KCL london mar 21 2023 5
  • 62. x^{−α}: x = 1 → x^{−α} = 1 (for any α); x = 10 → x^{−α} = 0.1 (for α = 1) KCL london mar 21 2023 6
  • 63. complex systems CS are networked, dynamical, (co)evolutionary • elements are not independent • networks link dynamic and stochastic components • often many sources of randomness. variables are neither independent nor do errors come from identical sources → no reason to believe that Gaussian statistics holds for CS KCL london mar 21 2023 7
  • 64. statistics of CS is statistics of power laws KCL london mar 21 2023 8
  • 65. statistics of complex systems CS variables are not localized around mean → tremendous difficulties • that is expressed by power laws • often not exact power laws – but “fat tailed” → statistics of CS is the statistics of outliers → outliers are the norm rather than the exception KCL london mar 21 2023 9
  • 66. examples for (approximate) power laws KCL london mar 21 2023 10
  • 67. city size MEJ Newman (2005) multiplicative KCL london mar 21 2023 11
  • 70. hurricane damages secondary (multiplicative) ??? KCL london mar 21 2023 14
  • 71. financial interbank loans multiplicative / preferential KCL london mar 21 2023 15
  • 72. forest fires in various regions SOC KCL london mar 21 2023 16
  • 73. moon crater diameters MEJ Newman (2005) fragmentation KCL london mar 21 2023 17
  • 74. Gamma rays from solar wind MEJ Newman (2005) KCL london mar 21 2023 18
  • 75. movie sales SOC KCL london mar 21 2023 19
  • 78. words in books MEJ Newman (2005) preferential / random / optimization KCL london mar 21 2023 22
  • 79. citations of scientific articles MEJ Newman (2005) preferential KCL london mar 21 2023 23
  • 80. website hits MEJ Newman (2005) preferential KCL london mar 21 2023 24
  • 81. book sales MEJ Newman (2005) preferential KCL london mar 21 2023 25
  • 82. telephone calls MEJ Newman (2005) preferential KCL london mar 21 2023 26
  • 83. earthquake magnitudes MEJ Newman (2005) SOC KCL london mar 21 2023 27
  • 85. war intensity MEJ Newman (2005) ??? KCL london mar 21 2023 29
  • 86. killings in wars ??? KCL london mar 21 2023 30
  • 87. size of wars ??? KCL london mar 21 2023 31
  • 88. wealth distribution MEJ Newman (2005) multiplicative KCL london mar 21 2023 32
  • 89. family names MEJ Newman (2005) multiplicative, ??? KCL london mar 21 2023 33
  • 90. systemic risk in supply chain KCL london mar 21 2023 34
  • 91. more power laws ... • networks: literally thousands of scale-free networks • allometric scaling in biology • dynamics in cities • fragmentation processes • random walks • crackling noise • growth with random times of observation • blackouts • fossil record • bird sightings • terrorist attacks • fluvial discharge, contact processes • anomalous diffusion ... KCL london mar 21 2023 35
  • 92. how do power laws arise? KCL london mar 21 2023 36
  • 93. basic routes to power laws • criticality • self-organized criticality • multiplicative processes with constraints • self-reinforcing / preferential processes KCL london mar 21 2023 37
  • 94. I criticality: power laws at phase transitions • statistical physics: power laws emerge at phase transitions • this happens at the critical point • power laws in various quantities: critical exponents • various materials have the same critical exponents → they behave identically → universality KCL london mar 21 2023 38
  • 95. II power laws through self-organised criticality • systems find critical points themselves – no tuning necessary • since they are at a critical point → power laws • examples: sandpiles, earthquakes, collapse, economy, ... (Wiesenfeld) KCL london mar 21 2023 39
  • 96. III power laws through multiplicative processes • Gaussian distribution: add random numbers (same source) • power law: multiply random numbers and impose constraints, e.g. the value cannot fall below some minimum X • examples: wealth distribution, city size, double Pareto, language, ... KCL london mar 21 2023 40
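A minimal sketch of this route: multiply random numbers and impose a lower bound. The lognormal noise, the reflecting floor, and all parameter values are illustrative assumptions for this sketch:

```python
import random

def multiplicative_with_floor(steps=200000, x_min=1.0, mu=-0.1, s=0.5, seed=2):
    """Multiplicative random walk x -> x * eta with a reflecting floor x_min.
    eta is lognormal with E[ln eta] = mu < 0; without the floor x dies out,
    with the floor the stationary distribution develops a power-law tail."""
    rng = random.Random(seed)
    x, xs = x_min, []
    for _ in range(steps):
        x *= rng.lognormvariate(mu, s)
        if x < x_min:
            x = x_min          # the constraint: x cannot fall below x_min
        xs.append(x)
    return xs

xs = sorted(multiplicative_with_floor())
median, p99 = xs[len(xs) // 2], xs[int(0.99 * len(xs))]
print(p99 / median)  # fat tail: far larger than for a Gaussian-like variable
```

For additive noise the 99th percentile sits a few standard deviations above the median; here it sits orders of magnitude above it, the signature of the fat tail.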
  • 97. IV power laws through preferential processes • events repeat proportional to how often they occurred before • examples: network growth models, language formation, ... KCL london mar 21 2023 41
  • 98. V other mechanisms • Levy stable processes • constraint optimisation • extreme value statistics • return times • generalized entropies KCL london mar 21 2023 42
  • 99. one phenomenon—many explanations Zipf law in word frequencies • Simon: preferential attachment • Mandelbrot: constraint optimization • Miller: monkeys produce random texts • Sole: information theoretic: sender and receiver KCL london mar 21 2023 43
  • 100. motivation I KCL london mar 21 2023 44
  • 101. is there a unique principle? KCL london mar 21 2023 45
  • 102. motivation II KCL london mar 21 2023 46
  • 103. complex systems are driven systems KCL london mar 21 2023 47
  • 104. driven system = driving + relaxing process KCL london mar 21 2023 48
  • 105. would be nice to have: statistics for driven, (stationary) non-equilibrium systems that explains the abundance of power laws and variations KCL london mar 21 2023 49
  • 106. many complex systems are path-dependent • future events depend on history of past events • often past events constrain possibilities for the future → sample-space of processes reduces as they unfold KCL london mar 21 2023 50
  • 107. sample-space reducing processes (SSR) KCL london mar 21 2023 51
  • 108. example: history-dependent SSR processes [figure: six steps 1)–6) of a dice example; after each throw the number of accessible faces shrinks, e.g. 13 → 9 → 7 → 5 → 3 → 1] KCL london mar 21 2023 52
  • 109. phase spaces are nested Ω1 ⊂ Ω2 ⊂ Ω3 · · · ⊂ ΩN KCL london mar 21 2023 53
  • 110. restart means Ω1 ≡ ΩN KCL london mar 21 2023 54
  • 111. sentence-formation is SSR [figure, panels a)–d): as a sentence is formed word by word (e.g. starting with ‘wolf’), grammar and context successively eliminate admissible next words from the vocabulary, so the sample space of possible continuations shrinks] KCL london mar 21 2023 55
  • 112. [figure: site-visiting probability p(i) of the SSR process versus site i, on linear and log-log scales; inset: p(n)] KCL london mar 21 2023 56
  • 113. SSR leads to exact Zipf’s law p(i) = i^{−1}, where p(i) is the probability to visit site i KCL london mar 21 2023 57
  • 114. why? Clearly, p(i) = Σ_{j=1}^N P(i|j) p(j) holds, with P(i|j) = 1/(j−1) for i < j (SSR) and P(i|j) = 1/N for i ≥ j = 1 (restart). We get p(i) = (1/N) p(1) + Σ_{j=i+1}^N p(j)/(j−1) → recursive relation p(i+1) − p(i) = −(1/i) p(i+1) → p(i) = ... = p(1) i^{−1} KCL london mar 21 2023 58
  • 115. alternative proof by induction Let N = 2. There are two sequences φ: either φ directly generates a 1 with p = 1/2, or it first generates a 2 with p = 1/2, and then a 1 with certainty. Both sequences visit 1 but only one visits 2. As a consequence, P_2(2) = 1/2 and P_2(1) = 1. Now suppose P_{N−1}(i) = 1/i holds. The process starts with dice N, and the probability to hit i in the first step is 1/N. Also, any other j, N ≥ j > i, is reached with probability 1/N. If we get j > i, we reach i later with probability P_{j−1}(i), which leads to a recursive scheme for i < N, P_N(i) = (1/N) [1 + Σ_{i<j≤N} P_{j−1}(i)]. Since by assumption P_{j−1}(i) = 1/i for i < j ≤ N holds, some algebra yields P_N(i) = 1/i. KCL london mar 21 2023 59
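The 1/i law also shows up directly in a simulation of the chain from the first proof; the state count N, the number of steps, and the seed are arbitrary choices for this sketch:

```python
import random

def ssr_visits(N=100, steps=500000, seed=3):
    """SSR chain: from state j jump uniformly to one of 1..j-1;
    from state 1 restart uniformly on 1..N. Returns visit counts."""
    rng = random.Random(seed)
    visits = [0] * (N + 1)
    j = N
    for _ in range(steps):
        visits[j] += 1
        j = rng.randint(1, N) if j == 1 else rng.randint(1, j - 1)
    return visits

v = ssr_visits()
print(v[1] / v[2], v[1] / v[10])  # ≈ 2 and ≈ 10, i.e. p(i) ∝ 1/i
```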
  • 116. true for all systems with shrinking sample-space over time KCL london mar 21 2023 60
  • 117. note: process is slowly driven KCL london mar 21 2023 61
  • 118. what if we restart the SSR before it is fully relaxed? restarting or driving rate is (1 − λ) → p(i) = i^{−λ} KCL london mar 21 2023 62
  • 119. arbitrary driving p^{(λ)}(i) = Σ_{j=1}^N P(i|j) p^{(λ)}(j), with P(i|j) = λ/(j−1) + (1−λ)/N for i < j (SSR); (1−λ)/N for i ≥ j > 1 (RW); 1/N for i ≥ j = 1 (restart). We get p^{(λ)}(i) = (1−λ)/N + (1/N) p^{(λ)}(1) + Σ_{j=i+1}^N λ p^{(λ)}(j)/(j−1) → recursive relation p^{(λ)}(i+1) − p^{(λ)}(i) = −λ (1/i) p^{(λ)}(i+1), so p^{(λ)}(i)/p^{(λ)}(1) = Π_{j=1}^{i−1} (1 + λ/j)^{−1} = exp[−Σ_{j=1}^{i−1} log(1 + λ/j)] ∼ exp(−Σ_{j=1}^{i−1} λ/j) ∼ exp(−λ log i) = i^{−λ} KCL london mar 21 2023 63
  • 120. side note: valid also in the continuous case: if p(x) ∼ ∫_x^{x_max} (p(y)/y) dy → p(x) is a power law KCL london mar 21 2023 64
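Spelled out, the continuum argument is a one-line differentiation (a sketch, assuming an upper cutoff $x_{\max}$ and the uniform SSR kernel $p(y)/y$):

```latex
p(x) \;=\; \int_x^{x_{\max}} \frac{p(y)}{y}\,\mathrm{d}y
\quad\Longrightarrow\quad
p'(x) \;=\; -\,\frac{p(x)}{x}
\quad\Longrightarrow\quad
p(x) \;\propto\; x^{-1},
```

and with driving rate $1-\lambda$ the kernel carries a factor $\lambda$, giving $p'(x) = -\lambda\, p(x)/x$ and hence $p(x) \propto x^{-\lambda}$.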
  • 121. true for all systems with driving rate 1 − λ (shrinking sample space with probability λ) KCL london mar 21 2023 65
  • 122. SSR processes with fast driving same convergence speed as CLT for iid processes KCL london mar 21 2023 66
  • 123. Berry-Esseen convergence speed [figure: a) number of site visits vs rank for driving rates λ = 0.5, 0.7, 1.0 (slopes −0.5, −0.7, −1.0); b) distance to the asymptotic distribution vs number of jumps T] KCL london mar 21 2023 67
  • 124. SSR-based Zipf law is extremely robust KCL london mar 21 2023 68
  • 125. proof: same as before p(i|j) = λ q(i)/g(j−1) + (1 − λ) q(i) if i < j; (1 − λ) q(i) otherwise. ... → p_λ(i) ∼ (p_λ(1)/q(1)^{1−λ}) q(i)/g(i)^λ, where g(j) = Σ_{i<j+1} q(i). polynomial priors: q(i) ∼ i^α (α > −1) → p(i) ∼ i^{α(1−λ)−λ}; polynomial priors: q(i) ∼ i^α (α < −1) → p(i) ∼ q(i); exponential priors: q(i) ∼ e^{βi} → uniform distribution KCL london mar 21 2023 69
  • 126. prior probabilities are practically irrelevant! KCL london mar 21 2023 70
  • 127. driven system = driving + relaxation part KCL london mar 21 2023 71
  • 128. relaxation = sample space reducing process KCL london mar 21 2023 72
  • 129. details of relaxing process do NOT matter KCL london mar 21 2023 73
  • 130. Zipf-law is extremely robust—accelerated SSR KCL london mar 21 2023 74
  • 131. not all transitions p(i|j) must exist—diffusion on networks KCL london mar 21 2023 75
  • 132. SSR and diffusion on networks KCL london mar 21 2023 76
  • 133. SSR is a random walk on a directed, ordered network [figure: fully connected network on nodes 1–5, Start → Stop, with exit probability p_exit; a) node occupation probability vs node rank (slope −1; values 1/2, 1/4, 1/8, ...); b) path probability vs path rank for the acyclic case with p_exit = 0.3 (slope −0.65)] note: fully connected KCL london mar 21 2023 77
  • 134. SSR = targeted random walk on networks simple example Directed Acyclic Graph (no cycles) KCL london mar 21 2023 78
  • 135. KCL london mar 21 2023 79
  • 136. KCL london mar 21 2023 80
  • 137. KCL london mar 21 2023 81
  • 138. simple routing algorithm • take a directed acyclic network and fix it • pick a start-node • perform a random walk from the start-node to the end-node (1) • repeat many times from other start-nodes • prediction: the visiting frequency of nodes follows Zipf’s law KCL london mar 21 2023 82
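A sketch of the routing experiment; the network size, edge probability, and walk count are arbitrary assumptions, and edges are directed from higher to lower node index so that low-index nodes act as targets:

```python
import random
from collections import Counter

def dag_walks(n=60, p_edge=0.5, walks=20000, seed=4):
    """Random walks on a random DAG: every ER edge is directed from the
    higher- to the lower-indexed node; a walk runs from a random start
    node until it hits a node with no outgoing links."""
    rng = random.Random(seed)
    out = {i: [j for j in range(i) if rng.random() < p_edge] for i in range(n)}
    visits = Counter()
    for _ in range(walks):
        node = rng.randrange(1, n)
        visits[node] += 1
        while out[node]:                  # stop at a sink (no outgoing links)
            node = rng.choice(out[node])
            visits[node] += 1
    return visits

visits = dag_walks()
low = sum(visits[i] for i in range(10))       # 10 lowest-index nodes
high = sum(visits[i] for i in range(50, 60))  # 10 highest-index nodes
print(low, high)  # low-index nodes dominate; ranked counts approximate Zipf
```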
  • 139. all diffusion processes on DAGs are SSR sample ER graph → direct it → pick start and end → diffuse KCL london mar 21 2023 83
  • 140. Exponential NW HEP Co-authors KCL london mar 21 2023 84
  • 141. prior probabilities are practically irrelevant! KCL london mar 21 2023 85
  • 142. what happens if we introduce weights on the links? ER graph: poisson weights, power weights KCL london mar 21 2023 86
  • 143. prior probabilities are practically irrelevant! KCL london mar 21 2023 87
  • 144. what happens if we allow for cycles? ER → direct it → switch each link to a random direction with probability 1 − λ (“driving”) λ = 0.8 λ = 0.5 KCL london mar 21 2023 88
  • 145. Zipf’s law is an immense attractor! KCL london mar 21 2023 89
  • 146. Zipf’s law is an attractor • no matter what the network topology is → Zipf • no matter what the link weights are → Zipf • if there are cycles → the exponent is less than one KCL london mar 21 2023 90
  • 147. all good search is SSR KCL london mar 21 2023 91
  • 148. what is good search? a search process is good if ... • ... at every step you eliminate more possibilities than you actually sample • ... every step you take eliminates branches of possibilities. if you eliminate fast enough → power law in visiting times. if you eliminate too little → you sample the entire space (exhaustive search). if there are no cycles: expect Zipf’s law KCL london mar 21 2023 92
  • 149. clicking on web page is often result of search process KCL london mar 21 2023 93
  • 150. adamic hubermann 2002 KCL london mar 21 2023 94
  • 151. breslau et al 99 KCL london mar 21 2023 95
  • 152. what about exponents > 1? KCL london mar 21 2023 96
  • 153. [figure: cascading process on states 1–20 in which the number of walkers multiplies at each jump] multiplication factor µ → p(i) = i^{−µ}. Ansatz: n(j → i) = µ p(i|j) (slow driving limit) KCL london mar 21 2023 97
  • 155. what if we introduce conservation laws? KCL london mar 21 2023 99
  • 156. conservation laws in SSR processes assume you have duplication at every jump, µ = 2. if you are at i → duplicate → one walker jumps to j, the other to k. conservation means: i = j + k. in general, conservation means: f(i) = f(state_1) + f(state_2) + · · · + f(state_µ) → p(i) = i^{−2} for all µ. the same result was found by E. Fermi for particle cascades KCL london mar 21 2023 100
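A sketch of the conservative cascade, assuming the split j + k = i is drawn uniformly (an illustrative choice). For this discrete uniform-split version the visiting distribution works out to p(i) ∝ 1/(i(i+1)) ≈ i^{−2}, so the ratio p(5)/p(10) is (10·11)/(5·6) ≈ 3.7 rather than exactly 4:

```python
import random
from collections import Counter

def cascade_visits(N=100, runs=3000, seed=5):
    """Conservative SSR cascade: a walker at i splits into walkers at j and
    i - j (j uniform on 1..i-1), conserving the total 'energy' at each step.
    Walkers stop at i = 1, so every cascade ends in exactly N ones."""
    rng = random.Random(seed)
    visits = Counter()
    for _ in range(runs):
        stack = [N]
        while stack:
            i = stack.pop()
            visits[i] += 1
            if i > 1:
                j = rng.randint(1, i - 1)
                stack += [j, i - j]   # conservation: j + (i - j) = i
    return visits

v = cascade_visits()
print(v[5] / v[10])  # ≈ 3.7, i.e. exponent −2 up to discrete corrections
```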
  • 158. complex systems are driven – always! KCL london mar 21 2023 102
  • 159. complex systems are driven non-equilibrium systems • only driven systems produce non-trivial structures • without driving: just ground state or equilibrium • every driven system: relaxing part + driving part • every relaxing part is a SSR KCL london mar 21 2023 103
  • 160. KCL london mar 21 2023 104
  • 161. example: inelastic gas in a box KCL london mar 21 2023 105
  • 162. example: inelastic gas in a box ρ(E′) = ∫_{E′}^{E*} dE ρ(E′|E, c_r) [(1 − ξ)ρ(E) + ξ ρ_charge(E)] KCL london mar 21 2023 106
  • 163. inelastic gas: transition probability ρ(E′_1|E_1, c_r) = (1/Z(E_1)) ∫_0^π dζ g(ζ) ∫_0^π dα f(α) ∫_0^π dφ r(φ) × ∫_0^∞ dE_2 ρ(E_2) θ(E_1 − E_2) δ(E′_1 − F(E_1, E_2, α, ζ, φ; c*_r)) with F(E_1, E_2, α, ζ, φ; c*_r) = E_12 [(1 + c_r²)/4 + ((1 − c_r²)/4) q cos ζ + (c_r/2) √(1 − (q cos ζ)²) (cos ζ cos 2α − sin ζ sin 2α cos φ)] with q = 2√((E_1/E_12)(E_2/E_12)) and c_r(α)² = 1 − (1 − c*_r²)|sin α|. use this in the eigenvalue equation KCL london mar 21 2023 107
  • 164. inelastic gas: energy distribution KCL london mar 21 2023 108
  • 165. inelastic gas: dependence on driving rate KCL london mar 21 2023 109
  • 166. where do specific distributions come from? KCL london mar 21 2023 110
  • 167. assume that driving rate depends on state λ(i) KCL london mar 21 2023 111
  • 168. → λ(x) = −x (d/dx) log p(x), or p(x) ∼ exp(−∫^x (λ(y)/y) dy) simple proof KCL london mar 21 2023 112
  • 169. proof transition probabilities from state k to i are p_SSR(i|k) = λ(k) q_i/g(k−1) + (1 − λ(k)) q_i if i < k, and (1 − λ(k)) q_i if i ≥ k. g(k) is the cdf of the q_i, g(k) = Σ_{i≤k} q_i. Observing that (p_{λ,q}(i+1)/q_{i+1}) [1 + λ(i+1) q_{i+1}/g(i)] = p_{λ,q}(i)/q_i we get p_{λ,q}(i) = (q_i/Z_{λ,q}) Π_{1<j≤i} [1 + λ(j) q_j/g(j−1)]^{−1} ∼ (q_i/Z_{λ,q}) exp[−Σ_{j≤i} λ(j) q_j/g(j−1)], where Z_{λ,q} is the normalisation constant. For uniform priors, taking logs and going to continuous variables gives the result, λ(x) = −x (d/dx) log p_λ(x). KCL london mar 21 2023 113
  • 170. the driving process determines distribution KCL london mar 21 2023 114
  • 171. special cases λ(x) = −x (d/dx) log p(x) • Zipf: slow driving (λ = 1) → p(x) = x^{−1} • Power law: constant driving λ(x) = α → p(x) = x^{−α} • Exponential: λ(x) = βx → p(x) = e^{−β(x−1)} • Power law + cut-off: λ(x) = α + βx → p(x) = x^{−α} e^{−βx} • Gamma: λ(x) = 1 − α + βx → p(x) = x^{α−1} e^{−βx} KCL london mar 21 2023 115
  • 172. special cases λ(x) = −x (d/dx) log p(x) • Normal: λ(x) = 2βx² → p(x) = e^{−β(x²−1)} • Stretched exp: λ(x) = αβx^α → p(x) = e^{−β(x^α−1)} • Log-normal: λ(x) = 1 − β/σ² + (log x)/σ² → p(x) = (1/x) e^{−(log x − β)²/(2σ²)} • Gompertz: λ(x) = (αe^{βx} − 1)βx → p(x) = e^{βx − αe^{βx}} • Weibull: λ(x) = αβ^{−α}x^α + 1 − α → p(x) = (x/β)^{α−1} e^{−(x/β)^α} • Tsallis: λ(x) = βx/(1 − βx(1−Q)) → p(x) = (1 − (1−Q)βx)^{1/(1−Q)} KCL london mar 21 2023 116
  • 173. driving determines statistics of driven systems slow → Zipf’s law constant → power law extreme driving → prior distribution (uniform) driving state dependent → any distribution (depends on driving) (all true for well-behaved priors only) KCL london mar 21 2023 117
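These cases can be checked numerically from the product formula in the proof: with uniform priors it reads p(i) ∝ Π_{1<j≤i} [1 + λ(j)/(j−1)]^{−1}. The state-space size W and the probe indices below are arbitrary choices:

```python
import math

def p_from_lambda(lam, W=10000):
    """Visiting distribution of a driven SSR process with uniform priors:
    p(i) proportional to prod_{j=2..i} (1 + lam(j)/(j-1))^(-1)."""
    p = [0.0, 1.0]                      # p[0] unused; start from p(1) = 1
    for j in range(2, W + 1):
        p.append(p[-1] / (1.0 + lam(j) / (j - 1)))
    Z = sum(p)
    return [x / Z for x in p]

pz = p_from_lambda(lambda j: 1.0)       # slow driving, lambda = 1
print(pz[1] / pz[100])                  # = 100 up to rounding: exact Zipf

pc = p_from_lambda(lambda j: 0.5)       # constant driving, lambda = 0.5
slope = math.log(pc[10] / pc[1000]) / math.log(100)
print(slope)                            # ≈ 0.5, i.e. p(i) ~ i^(-0.5)
```

For λ = 1 the product telescopes, Π (j−1)/j = 1/i, so the Zipf case is exact rather than asymptotic.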
  • 174. systems of SSR–nature • any driven system (with stationary distributions) • self-organized critical systems • search • fragmentation • propagation of information in language: sentence formation • sequences of human behavior • games: go, chess, life ... • record statistics KCL london mar 21 2023 118
  • 175. the 3 entropies of SSR processes information production rate: S_IT ≃ 1 + (1/2) log W extensive entropy: S_EXT = S_{1,1} = −Σ_i p_i log p_i MaxEnt functional (slow-driving limit): S_MEP = −Σ_{i=2}^W [p_i log(p_i/p_1) + (p_1 − p_i) log(1 − p_i/p_1)] expressions valid for large N (iterations) and W (states) KCL london mar 21 2023 119
  • 176. do we now have a ”CLT” for fat tailed distributions? • criticality: SSR picture applies for some • self-organized criticality: subset of SSR • multiplicative processes with constraints: partly applies • self-reinforcing processes: for some SSR applies KCL london mar 21 2023 120
  • 177. conclusions • complex systems are driven and show fat tailed distributions • processes of SSR type abound in nature • SSR has extremely robust attractors – priors don’t matter • relaxation is usually SSR • details of relaxation do not matter – note CLT ! • details of driving + SSR → explain statistics KCL london mar 21 2023 121
  • 178. Statistics of complex systems II: entropy for complex systems stefan thurner KCL london mar 22 2023
  • 179. with R. Hanel and M. Gell-Mann RH, ST, Europhysics Letters 93 (2011) 20006 RH, ST, Europhysics Letters 96 (2011) 50003 RH, ST, MGM, PNAS 108 (2011) 6390-6394 RH, ST, MGM, PNAS 109 (2012) 19151-19154 RH, ST, MGM, PNAS 111 (2014) 6905-6910 KCL london mar 22 2023 1
  • 180. Why Statistics of Complex Systems ? Understand macroscopic system behavior on basis of micro- scopic elements and their interactions → entropy • Hope: ’thermodynamical’ relations of CS → phase diagrams • Hope: understand distribution functions: power laws, stretched exp, ... • Dream: way to reduce number of parameters → handle CS • Dream: max entropy principle for CS: predict distributions KCL london mar 22 2023 2
  • 181. What is Entropy? Entropy has to do with ... (1) ... quantifying the information production of a source (2) ... thermodynamics: for example dU = TdS − pdV (3) ... statistical inference: given data, what is the most likely distribution? For simple systems all three are related S = −k Σ_{i=1}^W p_i log p_i This is no longer true for complex, non-ergodic systems! KCL london mar 22 2023 3
  • 183. KCL london mar 22 2023 5
  • 184. Information theory • The receiver wants to know how correct the received message is → add redundancy • Adding redundancy → more bits must be sent through the channel → the information transmission rate reduces • What is the transmission rate of information? KCL london mar 22 2023 6
  • 185. Adding redundancy vs. error probability Message encoded → received decoded ”Hi there” 100100111 → 100100101 ”Hi thor” with redundancy 100100111 → 100100111 ”Hi there” 100100111 → 100100101 ”Hi there” 100100111 → 100100111 KCL london mar 22 2023 7
  • 186. KCL london mar 22 2023 8
  • 187. Shannon’s first theorem one can encode a message such that it can be transmitted error-free through a noisy channel, if the capacity of the channel is higher than the information production rate S of the source. what is S? S is a property of the source (the thing that produces a message). if one finds a code that can correct for more errors than are produced → error free KCL london mar 22 2023 9
  • 188. What is the source? Example 1 random source: produces letters A, B, C, D, E with probability pA = 0.4, pB = 0.1, pC = 0.2, pD = 0.2, pE = 0.1 experiment AAACDCBDCEAADADACEDAEADCABEDADDCECAAAAD at what average rate is information produced by this source? S = −pA ln pA − pB ln pB − pC ln pC − pD ln pD − pE ln pE KCL london mar 22 2023 10
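Plugging in the numbers of Example 1 (natural logs, so S comes out in nats):

```python
import math

# Example 1: memoryless source over the letters A..E
probs = {'A': 0.4, 'B': 0.1, 'C': 0.2, 'D': 0.2, 'E': 0.1}
S = -sum(p * math.log(p) for p in probs.values())
print(S)  # ≈ 1.4708 nats per symbol (use log base 2 for bits)
```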
  • 189. Example 2 KCL london mar 22 2023 11
  • 190. Example 2 produces letters A, B, C with pA = 9/27, pB = 16/27, pC = 2/27. Successive symbols are not independent; probabilities depend on the preceding letter, p(j|i): from A: p(A|A) = 0, p(B|A) = 4/5, p(C|A) = 1/5; from B: p(A|B) = 1/2, p(B|B) = 1/2, p(C|B) = 0; from C: p(A|C) = 1/2, p(B|C) = 2/5, p(C|C) = 1/10. experiment ABBABABABABABABBBABBBBBABABABABABBBACACABBABBBBABBABACBBBABA S = −Σ_{i,j} p_i p(j|i) ln p(j|i) KCL london mar 22 2023 12
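The same computation in code, including a check that the stated letter probabilities are indeed stationary under the transition matrix (exact rational arithmetic makes the stationarity check exact):

```python
import math
from fractions import Fraction as F

# stationary letter probabilities and transition matrix p(j|i) of Example 2
p = {'A': F(9, 27), 'B': F(16, 27), 'C': F(2, 27)}
T = {'A': {'A': F(0),    'B': F(4, 5), 'C': F(1, 5)},
     'B': {'A': F(1, 2), 'B': F(1, 2), 'C': F(0)},
     'C': {'A': F(1, 2), 'B': F(2, 5), 'C': F(1, 10)}}

# stationarity: p(j) = sum_i p(i) p(j|i)
for j in p:
    assert sum(p[i] * T[i][j] for i in p) == p[j]

# entropy rate S = -sum_{i,j} p(i) p(j|i) ln p(j|i)
S = -sum(float(p[i] * T[i][j]) * math.log(float(T[i][j]))
         for i in p for j in p if T[i][j] > 0)
print(S)  # ≈ 0.647 nats per symbol, less than the memoryless 1.47 nats
```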
  • 191. A source must be ... • ... stationary • ... ergodic (objective probabilities same as in experiment) KCL london mar 22 2023 13
  • 192. stationary and ergodic KCL london mar 22 2023 14
  • 193. No no no KCL london mar 22 2023 15
  • 194. No no no KCL london mar 22 2023 16
  • 195. Why this funny form, S = − P i pi ln pi ? Appendix 2, Theorem 2 C.E. Shannon, The Bell System Technical Journal 27, 379-423, 623-656, 1948. KCL london mar 22 2023 17
  • 196. Entropy S[p] = Σ_{i=1}^W g(p_i) p_i ... probability for finding (micro)state i, Σ_i p_i = 1 W ... number of states g ... some function. What does it look like? KCL london mar 22 2023 18
  • 197. The Shannon-Khinchin axioms Measure for the amount of uncertainty S • SK1: S depends continuously on p • SK2: S is maximal for the equi-distribution p_i = 1/W • SK3: S(p_1, p_2, · · · , p_W) = S(p_1, p_2, · · · , p_W, 0) • SK4: S(AB) = S(A) + S(B|A) Theorem (uniqueness theorem) If SK1–SK4 hold, the only possibility is S[p] = −Σ_{i=1}^W p_i ln p_i The proof crucially depends on the use of SK4 and SK1 KCL london mar 22 2023 19
  • 198. Interested in complex systems KCL london mar 22 2023 20
  • 199. What are Complex Systems? • CS are made up from many elements • These elements are in strong contact with each other • Elements and combinations of them constrain themselves As a consequence • CS are intrinsically non-ergodic: not all states reachable KCL london mar 22 2023 21
  • 200. CS are ... • evolutionary • path-dependent • long-memory • long-range • non-ergodic all of this violates the composition axiom SK4 KCL london mar 22 2023 22
  • 201. What is the IT entropy of CS? KCL london mar 22 2023 23
  • 202. Remember Shannon-Khinchin axioms • SK1: S depends continuously on p → g is continuous • SK2: S maximal for equi-distribution p_i = 1/W → g concave • SK3: S(p_1, p_2, · · · , p_W) = S(p_1, p_2, · · · , p_W, 0) → g(0) = 0 • SK4: S(AB) = S(A) + S(B|A) note: S[p] = Σ_i^W g(p_i). If SK1–SK4 → g(x) = −kx ln x KCL london mar 22 2023 24
  • 203. Shannon-Khinchin axiom 4 is nonsense for CS SK4 corresponds to ergodic sources → SK4 is violated for non-ergodic systems → nuke SK4 KCL london mar 22 2023 25
  • 204. The ‘Complex Systems axioms’ • SK1 holds • SK2 holds • SK3 holds • S_g = Σ_i^W g(p_i), W ≫ 1 Theorem: All systems for which these axioms hold (1) can be uniquely classified by 2 numbers, c and d (2) have the entropy S_{c,d} = (e/(1 − c + cd)) [Σ_{i=1}^W Γ(1 + d, 1 − c ln p_i) − c/e] (e · · · Euler’s constant) KCL london mar 22 2023 26
  • 205. Γ(a, b) = ∫_b^∞ dt t^{a−1} e^{−t} KCL london mar 22 2023 27
  • 206. The argument: generic properties of g • scaling transformation W → λW: how does entropy change? KCL london mar 22 2023 28
  • 207. Mathematical property I: a scaling law! lim_{W→∞} S_g(λW)/S_g(W) = ... = λ^{1−c} Define f(z) ≡ lim_{x→0} g(zx)/g(x) with 0 < z < 1 Theorem 1: For systems satisfying SK1, SK2, SK3: f(z) = z^c, 0 < c ≤ 1 KCL london mar 22 2023 29
  • 208. Theorem 1 Let g be a continuous, concave function on [0, 1] with g(0) = 0 and let f(z) = lim_{x→0+} g(zx)/g(x) be continuous; then f is of the form f(z) = z^c with c ∈ (0, 1]. Proof. note f(ab) = lim_{x→0} g(abx)/g(x) = lim_{x→0} [g(abx)/g(bx)] [g(bx)/g(x)] = f(a)f(b). c > 1 explicitly violates SK2, c ≤ 0 explicitly violates SK3. KCL london mar 22 2023 30
  • 209. Mathematical property II: yet another one !! lim_{W→∞} S(W^{1+a})/(S(W) W^{a(1−c)}) = ... = (1 + a)^d Theorem 2: Define h_c(a) = lim_{x→0} g(x^{a+1})/(x^{ac} g(x)) ... KCL london mar 22 2023 31
  • 210. Theorem 2 Let g be as before and f(z) = z^c; then h_c(a) = (1 + a)^d for d constant. Proof. We determine h_c(a) again by a similar trick as we have used for f: h_c(a) = lim_{x→0} g(x^{a+1})/(x^{ac} g(x)) = [g((x^b)^{(a+1)/b}) / ((x^b)^{((a+1)/b − 1)c} g(x^b))] · [g(x^b)/(x^{(b−1)c} g(x))] = h_c((a+1)/b − 1) h_c(b − 1) for some constant b. By a simple transformation of variables, a = bb′ − 1, one gets h_c(bb′ − 1) = h_c(b − 1) h_c(b′ − 1). Setting H(x) = h_c(x − 1) one again gets H(bb′) = H(b)H(b′). So H(x) = x^d for some constant d and consequently h_c(a) = (1 + a)^d KCL london mar 22 2023 32
  • 211. Summary CS are non-ergodic systems → SK1–SK3 hold → lim_{W→∞} S_g(λW)/S_g(W) = λ^{1−c}, 0 < c ≤ 1 → lim_{W→∞} S(W^{1+a})/(S(W) W^{a(1−c)}) = (1 + a)^d, d real KCL london mar 22 2023 33
  • 212. What functions S fulfil these 2 scaling laws? S_{c,d} = r e Σ_{i=1}^W Γ(1 + d, 1 − c ln p_i) − rc KCL london mar 22 2023 34
  • 213. Which distribution maximizes S_{c,d}? p_{c,d}(x) = exp[−(d/(1−c)) (W_k(B(1 + x/r)^{1/d}) − W_k(B))] with r = 1/(1 − c + cd), B = ((1−c)/(cd)) exp((1−c)/(cd)) Lambert-W: solution to x = W(x) e^{W(x)} KCL london mar 22 2023 35
  • 214. Universality classes all SK 1-3 systems are characterized by 2 exponents: (c, d) KCL london mar 22 2023 36
  • 215. Examples • S_{1,1} = Σ_i g_{1,1}(p_i) = −Σ_i p_i ln p_i + 1 (BGS entropy) • S_{q,0} = Σ_i g_{q,0}(p_i) = (1 − Σ_i p_i^q)/(q − 1) + 1 (Tsallis entropy) • S_{1,d>0} = Σ_i g_{1,d}(p_i) = (e/d) Σ_i Γ(1 + d, 1 − ln p_i) − 1/d • ... KCL london mar 22 2023 37
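The first identity can be verified numerically: for (c, d) = (1, 1) one has r = 1/(1 − c + cd) = 1, and the incomplete gamma has the closed form Γ(2, x) = (x + 1)e^{−x}, so S_{1,1} = e Σ_i Γ(2, 1 − ln p_i) − 1 collapses to 1 − Σ p_i ln p_i. The example distribution below is an arbitrary choice:

```python
import math

def S_11(probs):
    """S_{c,d} at (c, d) = (1, 1), using Gamma(2, x) = (x + 1) e^(-x):
    S = e * sum_i Gamma(2, 1 - ln p_i) - 1."""
    gamma2 = lambda x: (x + 1.0) * math.exp(-x)
    return math.e * sum(gamma2(1.0 - math.log(pi)) for pi in probs) - 1.0

probs = [0.5, 0.2, 0.2, 0.1]
shannon = -sum(pi * math.log(pi) for pi in probs)
print(S_11(probs), 1.0 + shannon)  # identical: S_{1,1} = 1 - sum p_i ln p_i
```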
  • 216. Classification of entropies: order in the zoo
entropy — (c, d):
S_BG = Σ_i p_i ln(1/p_i) — c = 1, d = 1
• S_{q<1} = (1 − Σ p_i^q)/(q − 1) (q < 1) — c = q < 1, d = 0
• S_κ = Σ_i p_i(p_i^κ − p_i^{−κ})/(−2κ) (0 < κ ≤ 1) — c = 1 − κ, d = 0
• S_{q>1} = (1 − Σ p_i^q)/(q − 1) (q > 1) — c = 1, d = 0
• S_b = Σ_i (1 − e^{−bp_i}) + e^{−b} − 1 (b > 0) — c = 1, d = 0
• S_E = Σ_i p_i(1 − e^{(p_i−1)/p_i}) — c = 1, d = 0
• S_η = Σ_i Γ((η+1)/η, −ln p_i) − p_i Γ((η+1)/η) (η > 0) — c = 1, d = 1/η
• S_γ = Σ_i p_i ln^{1/γ}(1/p_i) — c = 1, d = 1/γ
• S_β = Σ_i p_i^β ln(1/p_i) — c = β, d = 1
all are special cases of S_{c,d} = Σ_i [e r Γ(d + 1, 1 − c ln p_i) − cr] with the corresponding (c, d) KCL london mar 22 2023 38
  • 217. Associated distribution functions • p^{(1,1)} → exponentials (Boltzmann distribution) p ∼ e^{−ax} • p^{(q,0)} → power-laws (q-exponentials) p ∼ 1/(a + x)^b • p^{(1,d>0)} → stretched exponentials p ∼ e^{−ax^b} • p^{(c,d)} all others → Lambert-W exponentials p ∼ e^{−aW(x^b)} NO OTHER POSSIBILITIES if only SK4 is violated KCL london mar 22 2023 39
  • 218. [figure: q-exponentials (power laws), p(x) for c = 0.2, 0.4, 0.6, 0.8 with d = 0.025, r = 0.9/(1−c); Lambert-exponentials, p(x) for (c, d) = (0.3, ±2), (0.3, ±4), (0.7, ±2), (0.7, ±4) with r = exp(−d/2)/(1−c)] KCL london mar 22 2023 40
  • 219. The Lambert-W: a reminder • solves x = W(x) e^{W(x)} • inverse of p ln p: [W(p)]^{−1} = p ln p • delayed differential equations: ẋ(t) = α x(t − τ) → x(t) = e^{(1/τ) W(ατ) t}, Ansatz: x(t) = x_0 exp[(1/τ) f(ατ) t] with f some function
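Since the Lambert-W function keeps reappearing, here is a small numerical sketch (our own Newton iteration, restricted to the principal branch W_0 and x ≥ 0) that checks both the defining property and the delayed-differential-equation claim on this slide:

```python
import math

def lambert_w(x, tol=1e-12):
    """Principal branch W_0: solves w * exp(w) = x for x >= 0, by Newton iteration."""
    w = math.log(1.0 + x)              # reasonable starting point
    for _ in range(100):
        e = math.exp(w)
        step = (w * e - x) / (e * (w + 1.0))
        w -= step
        if abs(step) < tol:
            break
    return w

# defining property: x = W(x) e^{W(x)}
w = lambert_w(2.0)
assert abs(w * math.exp(w) - 2.0) < 1e-9

# delayed differential equation x'(t) = a x(t - tau):
# the Ansatz x(t) = exp(W(a*tau) t / tau) indeed solves it
a, tau = 0.5, 1.0
r = lambert_w(a * tau) / tau           # growth rate (1/tau) W(a*tau)
f = lambda t: math.exp(r * t)
t = 3.0
deriv = (f(t + 1e-6) - f(t - 1e-6)) / 2e-6   # numerical x'(t)
assert abs(deriv - a * f(t - tau)) < 1e-4
```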
  • 220. [Figure: the (c, d) phase diagram — BG entropy at (1, 1); q-entropy with 0 < q < 1 on the line (c, 0); (c, d)-entropy with d > 0: Lambert-W_0 exponentials, with stretched exponentials as the asymptotically stable case; (c, d)-entropy with d < 0: Lambert-W_{−1} exponentials, compact support of the distribution function; regions violating K2 and K3 are marked]
  • 221. Relaxing ergodicity (kill SK4) opens the door to... • ... order the zoo of entropies through universality classes • ... understand the ubiquity of power laws (and extremely similar functions) • ... understand where Tsallis entropy comes from
  • 223. Thermodynamic entropy is an extensive quantity; if not, thermodynamic relations are nonsense, e.g. dU = TdS. Extensive entropy means: S(W_{A+B}) = S(W_A) + S(W_B). Don't confuse with additive: S(W_A · W_B) = S(W_A) + S(W_B)
  • 224. If we require entropy to be extensive — extensive: S(W_{A+B}) = S(W_A) + S(W_B) = · · · — one can prove: W(N) = exp[ (d/(1−c)) W_k( μ(1−c) N^{1/d} ) ], with c = 1 − lim_{N→∞} (1/N) W(N)/W′(N) and d = lim_{N→∞} log W(N) [ (1/N) W(N)/W′(N) + c − 1 ]. The growth of the phasespace volume determines the entropy
  • 225. Examples • W(N) = 2^N → (c, d) = (1, 1) → system is BGS-type • W(N) = N^b → (c, d) = (1 − 1/b, 0) → system is Tsallis-type • W(N) = exp(λN^γ) → (c, d) = (1, 1/γ) • ...
  • 226. A note on phasespace volume: imagine tossing coins. 1 coin: ↑, ↓ — 2 states: W(1) = 2. 2 coins: ↑↑, ↑↓, ↓↑, ↓↓ — 4 states: W(2) = 4. N coins: 2^N states: W(N) = 2^N. Now imagine a process: generate a sequence of N ↑ and ↓ symbols with equal probability, BUT after the first ↑, all subsequent symbols must be ↑. 1 coin: ↑, ↓ — 2 states: W(1) = 2. 2 coins: ↑↑, ↓↑, ↓↓ — 3 states: W(2) = 3. 3 coins: ↑↑↑, ↓↑↑, ↓↓↑, ↓↓↓ — 4 states: W(3) = 4. N coins: N + 1 states: W(N) = N + 1
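The counting on this slide is easy to verify by brute force — a small sketch enumerating all 2^N sequences and keeping only the admissible ones:

```python
from itertools import product

def allowed(seq):
    """Constraint of the second process: after the first 'up' (1), only 'up' may follow."""
    seen_up = False
    for s in seq:
        if seen_up and s == 0:      # a 'down' (0) after an 'up' is forbidden
            return False
        if s == 1:
            seen_up = True
    return True

def W(n):
    """Number of admissible sequences of length n."""
    return sum(allowed(seq) for seq in product((0, 1), repeat=n))

# unconstrained coins have 2^N states; the constrained process has W(N) = N + 1
assert [W(n) for n in (1, 2, 3, 4, 5)] == [2, 3, 4, 5, 6]
```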
  • 227. What does this imply? • Information theory: complex source (non-ergodic) → S_{c,d} = r e Σ_{i=1}^W Γ(1 + d, 1 − c ln p_i) − rc • Thermodynamics: if we want extensivity, c and d follow from the phasespace volume W(N)
  • 228. Examples
  • 229. Example I: super-diffusion — accelerating random walks [Figure: walker position x vs N; step lengths grow as ΔN ∝ N^β; trajectories for β = 0.5, 0.6, 0.7] • each up–down decision of the walker is followed by [N^β] steps in the same direction • k(N) = number of up–down decisions up to step N → k(N) ∼ N^{1−β} • possible sequences: W(N) ∼ 2^{N^{1−β}} → (c, d) = (1, 1/(1−β))
  • 230. Example II: Join-a-club spin system • NW growth: a new node links to αN(t) random neighbors, α < 1 constant connectivity; network A (e.g. a person joining a club) • each node i has 2 states: s_i = ±1; YES / NO (e.g. opinion) • each node i has an initial ('kinetic') energy ε_i (e.g. free will) • interaction H_ij = −J A_ij s_i s_j • a spin-flip of a node can occur if the node has enough energy for it (microcanonical) → Can show: the extensive entropy is the Tsallis entropy, (c, d) = (q, 0), S_{c,d} = S_{q,0}
  • 231. Example III: XY ferromagnet with transverse field H = Σ_{i=1}^{N−1} [ (1 + γ) σ^x_i σ^x_{i+1} + (1 − γ) σ^y_i σ^y_{i+1} ] + 2λ Σ_i σ^z_i. |γ| = 1 → Ising ferromagnet; γ = 0 → isotropic XY ferromagnet; 0 < γ < 1 → anisotropic XY ferromagnet. L ... length of a block in the spin chain, N → ∞ → Can show: the extensive entropy for the block is a (c, 0)-entropy. F. Caruso, C. Tsallis, PRE 78, 021102 (2008)
  • 232. Example IV: Black hole entropy. log W_black-hole ∝ area → Can show: the extensive entropy is the (c, d) = (0, 3/2)-entropy. C. Tsallis, L.J.L. Cirto, arXiv:1202.2154 [cond-mat.stat-mech]
  • 233. Statistical inference — Maximum entropy principle
  • 234. What is the probability to find a histogram? Sequence x = (x_1, x_2, · · · , x_N) with x_n ∈ {1, 2} (coin tosses). [Table: Pascal-triangle of terms q_1^{k_1} q_2^{k_2} with binomial multiplicities, for N = 0, ..., 4] q_1 and q_2 ... probabilities to toss 1 (heads) or 2 (tails) in a trial; k = (k_1, k_2) ... histogram of x
  • 235. Probability to find a histogram? Binomial. The probability to find the histogram k is P(k; q) = [N! / (k_1! k_2!)] q_1^{k_1} q_2^{k_2} • N ... length of sequence x • 2 states • x_n ∈ {1, 2}
  • 236. Probability to find a histogram? Multinomial. The probability to find the histogram k is P(k; q) = [N! / (k_1! k_2! · · · k_W!)] · [q_1^{k_1} q_2^{k_2} · · · q_W^{k_W}], where the first factor is the multiplicity M(k) and the second the probability G(k; q) • N ... length of sequence x • W states • x_n ∈ {1, 2, · · · , W}
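The factorization into multiplicity M(k) and sequence probability G(k; q) can be checked directly for a tiny example (W = 3 states, N = 4 tosses); the helper names below are ours:

```python
import math
from collections import Counter
from itertools import product

def P_hist(k, q):
    """Multinomial P(k; q) = M(k) * G(k; q)."""
    N = sum(k)
    M = math.factorial(N)
    for ki in k:
        M //= math.factorial(ki)                       # multiplicity M(k)
    G = math.prod(qi ** ki for qi, ki in zip(q, k))    # probability G(k; q)
    return M * G

q = (0.5, 0.3, 0.2)
N = 4
# brute force: sum the probabilities of all sequences sharing the same histogram
brute = Counter()
for seq in product(range(len(q)), repeat=N):
    hist = tuple(Counter(seq).get(i, 0) for i in range(len(q)))
    brute[hist] += math.prod(q[s] for s in seq)

for k, pk in brute.items():
    assert abs(P_hist(k, q) - pk) < 1e-12
assert abs(sum(brute.values()) - 1.0) < 1e-12   # histogram probabilities sum to 1
```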
  • 237. Two observations. Observation 1: The factorization property • Multiplicity M(k) does not depend on q • All dependencies on q are captured by G(k; q). Factorization: P(k; q) = M(k) G(k; q), so (1/N) log P(k; q) = (1/N) log M(k) + (1/N) log G(k; q)
  • 238. Observation 2: Log of multiplicity is Shannon entropy. (1/N) log M(k) = (1/N) log [N! / (k_1! k_2! · · · k_W!)] ∼ (1/N) log [N^N / (k_1^{k_1} k_2^{k_2} · · · k_W^{k_W})] (Stirling) = log N − (1/N) Σ_{i=1}^W k_i log k_i = −Σ_{i=1}^W (k_i/N) log(k_i/N) = −Σ_{i=1}^W p_i log p_i = S_Shannon[p]. Then (1/N) log P(k; q) = −Σ_{i=1}^W p_i log p_i (constraint independent) − α Σ_{i=1}^W p_i − β Σ_{i=1}^W p_i ε_i (constraints)
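The Stirling step can be checked numerically: (1/N) log M(k) approaches the Shannon entropy of p = k/N as N grows. A sketch, using log-Gamma to avoid huge factorials:

```python
import math

def log_multiplicity(k):
    """log M(k) = log N! - sum_i log k_i!, computed via lgamma."""
    N = sum(k)
    return math.lgamma(N + 1) - sum(math.lgamma(ki + 1) for ki in k)

p = (0.5, 0.3, 0.2)
S = -sum(pi * math.log(pi) for pi in p)   # Shannon entropy of p

errors = []
for N in (10**2, 10**4, 10**6):
    k = [round(pi * N) for pi in p]
    errors.append(abs(log_multiplicity(k) / N - S))

assert errors[0] > errors[1] > errors[2]   # the O(log N / N) correction vanishes
assert errors[2] < 1e-4
```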
  • 239. What is the most likely histogram k? 0 = (∂/∂k_i) log P(k; q) for fixed N ⇒ 0 = (∂/∂p_i) [ −Σ_{i=1}^W p_i log p_i − α Σ_{i=1}^W p_i − β Σ_{i=1}^W p_i ε_i ]
  • 240. Crazy! Some non-ergodic CS can be factorized too! If a factorization P(k; q) = M(k) G(k; q) exists, then a MEP exists, with (1/φ(N)) log P(k; q) [relative entropy] = (1/φ(N)) log M(k) [generalized entropy] + (1/φ(N)) log G(k; q) [cross entropy]. When does the factorization exist? → read RH, ST, MGM (2014). Theorem: for processes of type p(x_{N+1}|k; q) (explicitly path-dependent) the factorization exists (SK1!) iff the multinomial M(k) becomes a deformed multinomial M_{u,T}(k)
  • 241. Deformed multinomials → S_{c,d}. M_{u,T}(k) ≡ N!_u / Π_i (N T(k_i/N))!_u (deformed multinomial), with N!_u ≡ Π_{n=1}^N u(n) (deformed factorial); T monotonic with T(0) = 0, T(1) = 1. Deformed multinomials lead to trace-form (c, d)-entropies: S[p] = (1/φ(N)) log M_{u,T}(k) = ... = −Σ_{i=1}^W ∫_0^{p_i} dz Λ(z), Λ(z) = (1/a) [ T′(z) (N/φ(N)) log u(N T(z)) − log λ ]. Λ(z) does not depend on N → separation of variables with separation constant ν (derivatives w.r.t. z and N) →
  • 242. General solution for deformed multinomials → (c, d)-entropy: Λ(z) = [ T′(z) T(z)^ν − T′(1) ] / [ T″(1) + ν T′(1)^2 ], u(N) = λ^{N^ν} (q-shifted factorials for ν = 1), φ(N) = N^{1+ν}. Simplest case T(z) = z: Λ(z) = (z^ν − 1)/ν → S[p] = Σ_{i=1}^W ∫_0^{p_i} dz Λ(z) = K (1 − Σ_{i=1}^W p_i^Q)/(Q − 1), with Q ≡ 1 + ν and K constants
  • 243. This is the S_{c,d}-entropy !!!
  • 244. Is the extensive entropy the same as the MEP-entropy? No! They are NOT the same – but related. Both are of (c, d)-entropy type, yet (c, d)_TD ≠ (c, d)_MEP
  • 245. Example: MEP for a path-dependent process. Imagine a history-dependent process of the following kind: P(k|θ) = Σ_i^W p(i|k − e_i, θ) P(k − e_i|θ), with p(i|k, θ) = (θ_i/Z(k)) Π_{j=i+1}^W f(k_j) and f(x) = λ^{x^ν}. Using the simplest priors θ_i = 1/W, we have u(y) = λ^{y^ν} and T(y) = y, which gives the entropy S = (1 − Σ_{i=1}^W p_i^Q)/(Q − 1). We get as the prediction from the maximum entropy principle: p_i = [1 − ν(1 − Q)(α + β ε_i)]^{1/(1−Q)}, with ε_i = (i − 1)
  • 246. Compare with simulation! [Figure: p_i vs ε_i = i (log–log) for ν = 0.25, λ = 1.1; simulation for N = 1000, 5000, 10000 vs theory; inset: fitted α, β as functions of N]
  • 247. Which processes have a non-Shannon type MEP? • Answer: processes whose sample-space reduces as they unfold. Remarkable: • Start with microscopic path-dependent update equations → compute the entropy → maximise it → distribution at any time in the future • The process is non-ergodic, path-dependent, non-stationary – yet the MEP works
  • 248. The message: complex systems (out-of-equilibrium, non-ergodic, non-multinomial) break the degeneracy of the entropy −Σ_i p_i log p_i → entropy has 3 faces • thermodynamic face • information-theoretic face • maxent face. The formulas look different for different systems and processes
  • 249. Example 1: Pólya process [Figure panels a), b)]
  • 250. Example 2: Sample space reducing (SSR) process [Figure: staircase picture, steps 1)–6), through states 13, 9, 7, 5, 3, 1]
  • 251. The 3 entropies of Pólya and SSR processes (columns S_IT | S_EXT | S_MEP):
  Pólya: ((1 − q_j)/γ) (1/N) log N | S_{1−γ/(1−q_j), 0} | −Σ_i log p_i
  SSR: 1 + (1/2) log W | S_{1,1} | −Σ_{i=2}^W [ p_i log(p_i/p_1) + (p_1 − p_i) log(1 − p_i/p_1) ]
  expressions valid for large N (iterations) and W (states)
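The SSR process is simple to simulate: a 'ball' starts in the highest state W and jumps to a uniformly random lower state until it reaches state 1, then restarts. Its visiting distribution follows Zipf's law, p_i ∝ 1/i. A sketch (our own implementation, seeded for reproducibility):

```python
import random

def ssr_visits(W, runs, rng):
    """Count visits to states 1..W-1 of the sample-space reducing process."""
    counts = [0] * W                     # counts[i] = visits to state i (index 0 unused)
    for _ in range(runs):
        n = W
        while n > 1:
            n = rng.randint(1, n - 1)    # jump uniformly to a strictly lower state
            counts[n] += 1
    return counts

rng = random.Random(42)
W, runs = 10, 200_000
counts = ssr_visits(W, runs, rng)
total = sum(counts)
zipf_norm = sum(1.0 / i for i in range(1, W))
for i in range(1, W):
    # empirical visiting probability vs the Zipf prediction (1/i) / normalization
    assert abs(counts[i] / total - (1.0 / i) / zipf_norm) < 0.005
```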
  • 252. • every system has its entropy • every entropy has three faces
  • 253. Conclusions I • complex systems are non-ergodic by nature • all statistical SK1–SK3 systems are classified by 2 exponents (c, d) • a single entropy covers all systems: S_{c,d} = r e Σ_i Γ(1 + d, 1 − c ln p_i) − rc • all known entropies of SK1–SK3 systems are special cases • distribution functions of all systems are Lambert-W exponentials. There are no exceptions! • phasespace determines the entropy (c, d) • maxent principle: exists for path-dependent processes
  • 254. Conclusions II • the information-theoretic approach leads to (c, d)-entropies • the maxent approach leads to (c, d)-entropies • extensive entropy and maxent entropy are both (c, d)-entropies but NOT the same
  • 255. A note on Rényi entropy • relax Khinchin axiom 4: S(A + B) = S(A) + S(B|A) → S(A + B) = S(A) + S(B) → Rényi entropy • S_R = (1/(α − 1)) ln Σ_i p_i^α violates our trace form S = Σ_i g(p_i); but our above argument also holds for Rényi-type entropies S = G(Σ_{i=1}^W g(p_i)): lim_{W→∞} S(λW)/S(W) = lim_{R→∞} G( (f_g(z)/z) G^{−1}(R) )/R = [for G ≡ ln] = 1
  • 256. The Lambert-W: a reminder • solves x = W(x) e^{W(x)} • inverse of p ln p: [W(p)]^{−1} = p ln p • delayed differential equations: ẋ(t) = α x(t − τ) → x(t) = e^{(1/τ) W(ατ) t}, Ansatz: x(t) = x_0 exp[(1/τ) f(ατ) t] with f some function
  • 257. Amount of information production in a process. Markov chain with states A_1, A_2, ..., A_n and transition probabilities p_ik; probability of being in state l: P_l = Σ_k P_k p_kl. If the system is in state A_i we have the scheme A_1 A_2 ... A_n with probabilities p_i1 p_i2 ... p_in. Then H_i = −Σ_k p_ik log p_ik is the information obtained when the Markov chain moves from A_i one step to the next
  • 258. Average this over all initial states: H = Σ_i P_i H_i = −Σ_i Σ_k P_i p_ik log p_ik. This is the information production when the Markov chain moves one step ahead
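For a concrete 3-state chain the information production is easy to compute (the transition matrix below is a toy example of our choosing; the stationary distribution is obtained by power iteration):

```python
import math

# toy transition matrix p[i][k]: probability to go from state i to state k
p = [[0.8, 0.1, 0.1],
     [0.2, 0.6, 0.2],
     [0.3, 0.3, 0.4]]

# stationary distribution: P_l = sum_k P_k p_kl, via power iteration
P = [1.0 / 3.0] * 3
for _ in range(1000):
    P = [sum(P[k] * p[k][l] for k in range(3)) for l in range(3)]

# per-state information H_i = -sum_k p_ik log p_ik, averaged over initial states
H_i = [-sum(p[i][k] * math.log(p[i][k]) for k in range(3)) for i in range(3)]
H = sum(P[i] * H_i[i] for i in range(3))

assert abs(sum(P) - 1.0) < 1e-9
assert 0.0 < H < math.log(3)   # bounded by the maximal entropy of 3 states
```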
  • 259. Complex systems and systemic risk KCL london mar 23 2023
  • 260. Aim • See systems as networks • Discuss SR, resilience, robustness as network properties • Detect weak spots in systems • Understand the idea of quantifying systemic risk • Discuss applications in a few examples: finance, economy, credit, global trading, food supply, global spreading
  • 261. Why are we interested in SR? • Complex systems collapse • When do they? • Where do they break? • Can we understand tipping points? • Can we identify weak spots? Remember: we see world as networks
  • 262. What is collapse? • Systems perform functions • Functions depend on how networks form and restructure (co-evolving) • Collapse is rapid restructuring of networks
  • 263. Collapse • Nodes can vanish • Nodes can change • Links can vanish / appear • Links can change • Relinking can get wrong
  • 264. Systemic Risk / Resilience / Robustness
  • 266. Robust
  • 267. What is systemic risk? • Probability that the system can no longer perform its function(s) • Size of the system-wide damage (often huge) • SR is a network property • SR can, so far, practically not be quantified • SR can, so far, practically not be managed
  • 268. Where will this system break?
  • 269. SR is necessary to ... • ... make systems more resilient • ... make complex systems manageable • ... understand what a transition means
  • 270. To understand SR we need to know ... networks and their dynamics
  • 271. Complex systems are unstable • most complex systems are stochastic • statistics of complex systems is statistics of power laws – many large outliers – outliers are the norm – non-manageability • details matter
  • 272. Breaking complexity: if we understand co-evolution, we might break complexity • by controlling every component we tame complexity → cut off power-law tails • if we forget a detail → we might lose control
  • 273. Example 1: Financial SR Nodes are banks Links are financial contracts It is dynamic and co-evolutionary
  • 274. The three types of financial risk • economic risk: investment in business idea does not pay off • credit-default risk: you don't get back what you have lent to others • systemic risk: system stops functioning due to local defaults and subsequent (global) cascading
  • 275. Financial systemic risk • risk that significant fraction of financial network defaults • systemic risk is not the same as credit-default risk • banks care about credit-default risk • banks have no means to manage systemic risk • role of regulator: manage systemic risk • incentivise banks to reduce SR
  • 276. Systemic risk created on multi-layer networks • layer 1: lending–borrowing loans • layer 2: derivative networks • layer 3: collateral networks • layer 4: securities networks • layer 5: cross-holdings • layer 6: overlapping portfolios • layer 7: liquidity: over-night loans • layer 8: FX transactions
  • 277. Quantification of SR • Wanted: a systemic risk value for every financial institution • Input: network. Google has a similar problem: a value for the importance of web-pages: a page is important if many important pages point to it • number for importance → PageRank
  • 278. Web page is important if many important pages point to it
  • 279. Institution systemically risky if systemical risky institutions lend to it
  • 280. Systemic risk factor – DebtRank R • Is a 'different Google' – adapted to the context of SR (S. Battiston et al. 2012) • superior to: eigenvector centrality, PageRank, Katz rank ... Why? • economic value in the network that is affected by a node's default • capitalization/leverage of banks taken into account • cycles taken into account: no multiple defaults
  • 281. How to compute it? The basic idea is to kick one node out of the NW and see what happens: • How much damage is caused? • R is the percentage of damage w.r.t. total value • Repeat for all nodes – one after the other, then in pairs, triples, ...
  • 282. Systemic risk of nodes • Input: network of contracts between banks • Output: all banks i get ‘damage value’ Ri (% of total damage)
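A much-simplified sketch of this idea (NOT the full DebtRank algorithm of Battiston et al. 2012, which defines the impact weights and the two-state propagation more carefully): stress h_j ∈ [0, 1] spreads from a defaulted bank to its creditors in proportion to exposure over equity, each bank propagating only once (no multiple defaults); R is the fraction of total economic value affected.

```python
def damage_rank(exposure, equity, value, start):
    """Toy DebtRank-style measure: relative damage caused by the default of `start`.
    exposure[i][j] = amount bank i has lent to bank j (lost if j defaults)."""
    n = len(equity)
    h = [0.0] * n                  # relative equity loss of each bank
    h[start] = 1.0
    state = ["U"] * n              # U undistressed, D distressed, I inactive
    state[start] = "D"
    while "D" in state:
        for j in [x for x in range(n) if state[x] == "D"]:
            for i in range(n):
                if i != j and exposure[i][j] > 0 and state[i] != "I":
                    h[i] = min(1.0, h[i] + exposure[i][j] / equity[i] * h[j])
                    if state[i] == "U":
                        state[i] = "D"
            state[j] = "I"         # each node propagates only once
    return sum(h[i] * value[i] for i in range(n)) / sum(value)

# 3 banks: bank 0 lent 50 to bank 1, bank 1 lent 80 to bank 2
exposure = [[0, 50, 0], [0, 0, 80], [0, 0, 0]]
equity = [100.0, 100.0, 100.0]
value = [1.0, 1.0, 1.0]
R = [damage_rank(exposure, equity, value, s) for s in range(3)]
# default of bank 2 cascades upstream: (1.0 + 0.8 + 0.4)/3; bank 0 harms only itself
assert abs(R[2] - (1.0 + 0.8 + 0.4) / 3) < 1e-12
assert R[0] < R[1] < R[2]
```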
  • 283. Systemic risk spreads by borrowing
  • 284. Small bank becomes VERY risky
  • 285. DebtRank Austria, Sept 2009 [Figure (a): interbank network; node size is not proportional to systemic risk; note: core–periphery structure]
  • 286. Systemic risk profile Austria [Figure (a): systemic risk factor vs bank rank, ~20 banks]
  • 287. Systemic risk profile Mexico [Figure (b): combined systemic risk factor vs bank rank, ~40 banks]
  • 288. How big is the next financial crisis? Expected Systemic Loss = Σ_i p_default(i) · DebtRank(i). Expected loss(i) = Σ_j p_default(j) · Loss-given-default(j) · Exposure(i, j)
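As a pure arithmetic illustration of the first formula (all numbers below are hypothetical, not data from the lecture):

```python
# hypothetical inputs: per-bank default probabilities and DebtRank values
p_default = [0.02, 0.01, 0.05]    # annual default probability of bank i
debt_rank = [0.40, 0.25, 0.10]    # fraction of total economic value affected by i's default
total_value = 1.0e12              # total economic value in the network, in $

expected_systemic_loss = total_value * sum(
    p * r for p, r in zip(p_default, debt_rank)
)
# 1e12 * (0.02*0.40 + 0.01*0.25 + 0.05*0.10) = 1.55e10 $/year
assert abs(expected_systemic_loss - 1.55e10) < 1e3
```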
  • 290. How big is the next financial crisis? [Figure: expected systemic loss EL_syst [$/year], 2007–2013, for ^MXGV5YUSAC and ^VIX; annotated events: subprime crisis, Lehman Brothers collapse, uncertainty about the rescue of Greece, international alarm over the Eurozone crisis, loss on derivatives of Mexican companies, Mexican GDP fell by more than 10%]
  • 291. ESL: Quantify the quality of policy • expected losses per year within a country in case of severe default and NO bailout • rational decisions on bailouts – allows comparing countries – allows comparing the situation of a country over time • are policy measures working in Spain? in Greece?
  • 292. Systemic risk of links: the SR of banks changes with every transaction
  • 293. All interbank loans: Austria [Figure: systemic risk increase vs loan size (log–log)]
  • 294. How to make financial systems better? i.e. more resilient
  • 295. Systemic risk is an externality
  • 296. Management of systemic risk • Systemic risk is a network property • Manage systemic risk: re-structure financial networks • s.t. cascading failure becomes unlikely / impossible
  • 298. Systemic risk elimination • systemic risk spreads by borrowing from risky agents • how risky is a transaction? • ergo: restrict transactions with high systemic risk → tax those transactions that increase systemic risk
  • 299. Incentive to change node-linking strategy: Systemic risk tax • tax transactions according to their systemic risk contribution • banks look for deals with agents with low systemic risk → liability networks re-arrange → eliminate cascading • no-one should pay the tax – the tax serves as an incentive to re-structure networks • size of tax ∝ expected systemic loss of the transaction (society is neutral) • if the system is risk free: no tax • credit volume MUST not be reduced by the tax
  • 300. Test the efficacy of the tax: ABM. Rebuild the system in the computer: simulator = agent-based model [Diagram: Banks – Firms – Households; loans, deposits, consumption, wages/dividends]
  • 301. The agents. Firms: ask banks for loans; sell products to households and realize profit/loss; if there is a surplus, deposit it in bank accounts; firms are bankrupt if insolvent or if capital falls below a threshold; if a firm is bankrupt, the bank writes off the outstanding loans. Banks: try to provide firm loans; if they do not have enough, they approach other banks for interbank loans at interest rate r_ib; bankrupt if insolvent or equity capital below zero; a bankruptcy may trigger other bank defaults. Households: a single aggregated agent; receives cash from firms (through firm loans) and re-distributes it randomly to banks (household deposits) and among firms (consumption)
  • 302. For comparison: implement a Tobin tax • tax on all transactions regardless of their risk contribution • 0.2% of the transaction (5% of the interest rate)
  • 303. Comparison of three schemes • No systemic risk management • Systemic Risk Tax (SRT) • Tobin-like tax
  • 304. Model results: Systemic risk profile [Figure: (a) Austria data, (b) model – systemic risk factor vs bank rank for no tax, Tobin tax, systemic risk tax]
  • 305. Model results: SR of individual loans [Figure: Austria data and model – systemic risk increase vs loan size for no tax, Tobin tax, systemic risk tax]
  • 306. Model results: Distribution of losses [Figure (a): frequency of total losses to banks for no tax, Tobin tax, systemic risk tax] The SRT eliminates systemic risk. How?
  • 307. Model results: Cascading is suppressed [Figure (b): frequency of cascade sizes of defaulting banks for no tax, Tobin tax, systemic risk tax]
  • 308. SR elimination at no economic cost [Figure (c): frequency of transaction volume on the IB market for no tax, Tobin tax, systemic risk tax] The Tobin tax reduces risk by reducing credit volume
  • 309. Policy testing Once you have a simulator: can implement policies and see outcomes Implement current financial regulation: Basel III
  • 310. Basel III does not reduce SR
  • 311. Example 2: European Bond NWs
  • 312. Data for relevant EU banks European stress testing data 2016 (EBA) • 51 relevant European banks (49 included in analysis) • 44 sovereign bond investment categories (36 included)
  • 313. Asset–bank → bank exposure NW
  • 315. Minimal SR in banking networks
  • 317. Combine it with the interbank NW: combine the interbank NW with the firm–bank network to get a 'complete' credit network of a nation, then compute the DebtRank again for all banks AND firms
  • 318. Systemic risk of Austrian banks and firms [Figure: DebtRank of companies and banks, ranked by DebtRank; companies labeled by NACE sector; top entries include BRAU UNION AG, ÖBB-Infrastruktur AG, OMV AG, OÖ Landesholding GmbH, VERBUND AG, voestalpine AG, Telekom Austria AG, and others. Sector legend: M services, K finance/insurance, F construction, L real estate, N other services, H logistics, G automobile sector, D energy, I gastronomy, Q health]
  • 319. Message more than half of the total financial SR comes from companies
  • 321. NW of international trade: Data • Trade networks between all countries are available • Compute effect if supply or demand in one country changes. What are the implications in the networks? • Compute primary and secondary effects
  • 322. Shock spreading in trade NWs [Figure: (A) shock X_i applied to US sectors (2014); (B) response functions Y_k(t) of individual sectors (construction, manufacturing, trade, finance, public administration, etc.) over 10 years; (C) sector-level responses at t = 0y, 1y, 6y]
  • 323. Secondary effects: Trump's EU tax on steel and aluminum
  • 324. Example 5: Supply chain-NWs
  • 325. Data • Production networks within countries become visible through national VAT data • Assume production functions • Assume how firms can substitute for lost suppliers • Compute SR index: ESRI for every company
  • 327. Flows of goods and services
  • 328. Every company has production function
  • 329. Systemic relevance / resilience of economy
  • 330. Systemic core of economy (HUN)
  • 331. Firms in NACE are not input-similar
  • 332. How to reconstruct supply chains?
  • 333. Reconstruction for Austria: communication NW
  • 336. Example 6: Food supply NWs
  • 337. Supply networks on product level Example pork production
  • 338. Systemic Risk Index – same idea
  • 339. Supply chains on product level: monitor
  • 341. Use an international SC network (highly incomplete). What happens in country B if a firm in country A defaults? Compute an inter-country SR flow index
  • 342. International flow of systemic risks
  • 343. The poorer – the more imported SR
  • 345. Economy as a collection of dynamical networks
  • 346. Scenario: flood destroys capital in Austria
  • 347. Flood-resilience of the Austrian economy [Figure: cumulative change in GDP growth rate [pp] vs direct losses in percentage of capital stock [%], for 2014–2016; 100-, 250-, and 1,500-year events marked; inflection point w.r.t. economic growth; maximum: threshold at which a natural disaster becomes a systemic event]