Presentation of a paper by LMC & OKW: "Devil in the Details: Analysis of a Coevolutionary Model of Language Evolution via Relaxation of Selection". Advances in Artificial Life, ECAL 2011: Proceedings of the Eleventh European Conference on the Synthesis and Simulation of Living Systems.
ECAL11 Paris - August 2011
1. Devil in the Details: Analysis of a Coevolutionary
Model of Language Evolution via Relaxation of
Selection
Luke McCrohon & Olaf Witkowski
University of Tokyo
Japan
2. A Model of Linguistic
Gene-Culture Coevolution
<source:Y&H2010>
Yamauchi & Hashimoto 2010
3. Yamauchi & Hashimoto 2010:
The Baldwin Effect
• Learned behavior is gradually assimilated into the agents' genetic repertoire
[Figure: fitness landscape with learning vs. no learning]
4. Yamauchi & Hashimoto 2010:
Cultural Masking
• Culture compensating for genetically maladaptive traits
• e.g. dietary compensation for the loss of vitamin C biosynthesis
5. Yamauchi & Hashimoto 2010:
Why Do We Care?
• Shows cyclic repetition of stages
• Biological selection is masked, innate behavior degrades
• Selection is unmasked, behaviors are nativized (Baldwin effect)
• We initially wanted to investigate the influence of the rate of cultural change
6. Yamauchi & Hashimoto 2010
• Agents
• Chromosome: length-12 array (values 0 or 1)
• Grammar: length-12 array (values 0, 1, or null)
• Learning resource: integer (initially 24)
• Fitness: integer (initially 1)
[Figure: an agent's chromosome and grammar arrays]
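The agent record on this slide can be sketched as a small data structure. This is a hypothetical Python rendering for illustration; the field names and the `None`-as-null convention are mine, not from the paper.

```python
from dataclasses import dataclass, field
import random

L = 12  # alleles per chromosome/grammar

@dataclass
class Agent:
    # Innate component: 12 binary alleles
    chromosome: list = field(
        default_factory=lambda: [random.randint(0, 1) for _ in range(L)])
    # Learned component: starts entirely unset (None plays the role of null)
    grammar: list = field(default_factory=lambda: [None] * L)
    # Budget available during the learning phase
    learning_resource: int = 24
    # Base fitness before communication
    fitness: int = 1

a = Agent()
assert len(a.chromosome) == L and all(g is None for g in a.grammar)
```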
7. Yamauchi & Hashimoto 2010
• Every generation, for each agent:
• Learning: exposed to the utterances of the previous generation, with a chance to learn them
• Invention: if the grammar still contains null values, the agent has a chance to invent new ones
• Communication: interaction with neighbors in the current generation to determine its fitness
• Reproduction: a new generation is created, replacing the previous one
Figure 3: The spatial organization of the population. <source:Y&H2010>
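The four phases above can be sketched as a toy generation loop. This is a minimal rendering under my own assumptions, not the authors' code: the exposure count, invention probability, neighborhood choice, and mutation rate are placeholder values.

```python
import random

L = 12  # alleles per agent

def new_agent():
    return {"chromosome": [random.randint(0, 1) for _ in range(L)],
            "grammar": [None] * L,
            "resource": 24,
            "fitness": 1}

def learn(agent, teachers, exposures=60):
    # Learning: exposure to utterances of the previous generation. A value
    # matching the learner's chromosome is cheap (1 resource), a mismatching
    # one expensive (4 resources), per the costs given in the slides.
    for _ in range(exposures):
        teacher = random.choice(teachers)
        i = random.randrange(L)
        v = teacher["grammar"][i]
        if v is None:
            continue
        cost = 1 if v == agent["chromosome"][i] else 4
        if agent["resource"] >= cost:
            agent["grammar"][i] = v
            agent["resource"] -= cost

def invent(agent, p=0.1):
    # Invention: remaining null alleles may be filled with a random value.
    for i in range(L):
        if agent["grammar"][i] is None and random.random() < p:
            agent["grammar"][i] = random.randint(0, 1)

def communicate(agent, neighbors):
    # Communication: fitness grows with identical, non-null grammar alleles.
    for other in neighbors:
        agent["fitness"] += sum(a is not None and a == b
                                for a, b in zip(agent["grammar"],
                                                other["grammar"]))

def reproduce(population, mutation=0.001):
    # Reproduction: fitness-proportional selection; children inherit the
    # parent's chromosome with a small per-bit mutation rate.
    weights = [a["fitness"] for a in population]
    children = []
    for _ in population:
        parent = random.choices(population, weights=weights)[0]
        child = new_agent()
        child["chromosome"] = [b if random.random() > mutation else 1 - b
                               for b in parent["chromosome"]]
        children.append(child)
    return children

# One generation over a toy population of 10 agents
pop = [new_agent() for _ in range(10)]
for agent in pop:
    learn(agent, pop)  # in the real model, teachers are the previous generation
    invent(agent)
for agent in pop:
    communicate(agent, random.sample(pop, 3))
pop = reproduce(pop)
```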
8. Yamauchi & Hashimoto 2010
• An agent learning an uttered grammar value pays in learning resources:
• value matching its chromosome: cost 1
• value mismatching its chromosome: cost 4
[Figure: teacher grammar, utterance, learner grammar]
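The asymmetric cost rule on this slide is small enough to state directly (function name is mine):

```python
def learning_cost(chromosome_allele, uttered_allele):
    """Resources spent to acquire one uttered grammar value: a value that
    matches the learner's innate allele costs 1, a mismatch costs 4
    (the costs stated in the slides)."""
    return 1 if uttered_allele == chromosome_allele else 4

assert learning_cost(1, 1) == 1
assert learning_cost(1, 0) == 4
```

This asymmetry is what couples genes to culture: a chromosome aligned with the ambient language stretches the fixed budget of 24 resources much further.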
13. Yamauchi & Hashimoto 2010: The Original Results
• Stage 1: Baldwin Effect, Niche Construction
• Agents go from no culturally transmitted language to a highly uniform language shared between agents, and consequently a high fitness
[Figure: learning intensity (0-24) and gene-grammar match (0-12) over 5000 generations, with stages 1, 2, 3 and 2* marked]
15. Yamauchi & Hashimoto 2010: The Original Results
• Stage 2: Functional Redundancy
• Cultural transmission masks biological selection; the correlation between the gene pool and the environment progressively drops
[Figure: the same learning intensity / gene-grammar match plot, stage 2 region]
16. Yamauchi & Hashimoto 2010: The Original Results
• Stage 3: Unmasking of Natural Selection
• Agents converge on different languages; the gene-grammar match has deteriorated so much that biological selection is no longer masked
• A biological assimilatory process like in stage 1 follows, leading to cycles between stages 2 and 3
[Figure: the same learning intensity / gene-grammar match plot, stage 3 region]
19. Analysis: Genetic Diversity
• Is "Stage 1" really showing a Baldwin effect?
• We ran the simulation with neutral biological selection, ignoring fitness
• The same reduction of genetic diversity is observed
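The point that diversity collapses even without selection can be illustrated with a neutral Wright-Fisher sketch. This is my own toy model, not the paper's simulation; population size, genome length, and mutation rate are placeholder values chosen for a quick run.

```python
import random

def neutral_drift(pop_size=60, length=12, mutation=0.001, generations=1000):
    """Neutral resampling: each child copies a random parent's chromosome
    (no fitness involved) with a small per-bit mutation rate. Returns the
    number of distinct genotypes after each generation."""
    pop = [tuple(random.randint(0, 1) for _ in range(length))
           for _ in range(pop_size)]
    counts = []
    for _ in range(generations):
        pop = [tuple(b if random.random() > mutation else 1 - b
                     for b in random.choice(pop))
               for _ in range(pop_size)]
        counts.append(len(set(pop)))
    return counts

d = neutral_drift()
# Drift alone collapses the initial diversity to a handful of genotypes.
print("after 1 generation:", d[0], " after 1000:", d[-1])
```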
20. Analysis: Genetic Diversity
• The diversity wanders between 5 and 10 distinct genotypes, because of genetic drift
• Fast-forward simulation results:
22. Analysis: Genetic Diversity
• Phenotypes (grammars) are even fewer
(10 separate runs, over 20000 generations)
23. Analysis: Masked Genetic Selection
• We observe no significant drop below a match of 8. Why?
• 24 learning resources = 4 mismatching alleles × cost 4 + 8 matching alleles × cost 1
• Any agent dropping below a match of 8 cannot afford a full grammar, and its fitness is penalized
(10 separate runs, over 10000 generations)
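The budget argument above can be checked directly: with costs of 1 per matching allele and 4 per mismatching allele, a complete 12-allele grammar fits into 24 resources only when at least 8 alleles match.

```python
def full_grammar_cost(matches, length=12, match_cost=1, mismatch_cost=4):
    """Total resources needed to learn all `length` grammar alleles,
    given how many of them match the chromosome."""
    return matches * match_cost + (length - matches) * mismatch_cost

# cost(m) = m + 4*(12 - m) = 48 - 3m, so cost <= 24 exactly when m >= 8
affordable = [m for m in range(13) if full_grammar_cost(m) <= 24]
assert affordable == [8, 9, 10, 11, 12]
```

This explains the floor at 8: below it, agents exhaust their budget before acquiring a full grammar, and selection re-engages.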
24. Analysis: Coevolutionary Attractors
• A few values of gene-grammar match occur more frequently than others, indicating potential local attractors
• This is caused by language uniformity and lack of genetic variation
• Attractors sit around integer values
(5 separate runs, over 5000 generations)
25. Analysis: Coevolutionary Attractors
• State transitions graph and density plot

Many different underlying gene-culture states result in agents exhibiting the same gene-grammar match value. However, as nothing in the agent's learning algorithm changes the probability of learning individual grammatical alleles due to a particular set of genetic biases (only the number of matches ultimately influences learning), these different model states behave identically. Because of this it is safe to view the integer-value gene-grammar matches as attractor states in the simulation, despite them potentially representing a number of different underlying gene-culture states.

         to 12  to 11  to 10  to 9  to 8
from 12    .55    .05      –     –     –
from 11    .01    .52    .07   .01     –
from 10      –    .02    .42   .08     –
from 9       –      –    .02   .61   .03
from 8       –      –      –   .04   .78

We calculated the likelihood of the simulation jumping between each of these attractor states (±0.2 units) over a period of 200 generations. The transition probability matrix is presented in the table above and in the transition diagram in figure 8. We tested these results against transitions between equally sized intervals positioned directly between the attractor states, and obtained probabilities of the simulation staying in those intervals approximately 5 times lower than in the case of the attractors. This indicates that the attractors are significantly more stable.
Figure 8: State Transition Diagram [Seed=1303037425613, Runs=50, Generations=20000]
• Shape of the attractors
• attractors are not symmetrical: deviation downward is more probable
• a result of biological change: a random change from the optimum is usually sub-optimal
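The transition table can be treated as a partial Markov chain over the attractor states (rows need not sum to 1; the remaining mass corresponds to time spent off the attractors). A small sketch, using my own encoding rather than the authors' code, makes the claimed downward bias explicit:

```python
# Transition probabilities between attractor states over 200 generations,
# transcribed from the table (rows: from-state, keys: to-state).
P = {12: {12: .55, 11: .05},
     11: {12: .01, 11: .52, 10: .07, 9: .01},
     10: {11: .02, 10: .42, 9: .08},
      9: {10: .02, 9: .61, 8: .03},
      8: {9: .04, 8: .78}}

for s in sorted(P, reverse=True):
    down = sum(p for t, p in P[s].items() if t < s)
    up = sum(p for t, p in P[s].items() if t > s)
    print(f"state {s}: stay={P[s].get(s, 0):.2f} down={down:.2f} up={up:.2f}")

# Downward moves dominate upward moves at every state except the floor (8),
# consistent with lower attractors being favored in the long run.
assert all(sum(p for t, p in P[s].items() if t < s)
           >= sum(p for t, p in P[s].items() if t > s)
           for s in (12, 11, 10, 9))
```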
26. Analysis: Steady State in the Long Run
• The transient can be ignored
• Lower attractors are favored in the long run
27. Analysis: Sensitivity to Initial Conditions
• Changing the population size changes the density of gene-grammar matches
• The drift effect is masked for larger populations
• Qualitatively different behavior
[Figure: gene-grammar match densities for populations of 50, 400, and 1000 agents]
28. Conclusions
• The model from Yamauchi & Hashimoto 2010 does capture some of the intended phenomena (e.g. Cultural Masking, Niche Construction)
• "Stage 2" does not show the claimed degradation of gene-grammar matches, but rather a random walk between a set of attractors
• Observed model behavior is the result of limited Cultural and Biological Diversity, which are themselves the result of the small agent population
• Population Structure and Organization are factors that can potentially increase diversity without the computational costs of a larger agent population
30. Devil in the Details: Analysis of a Coevolutionary
Model of Language Evolution via Relaxation of
Selection
Luke McCrohon & Olaf Witkowski
University of Tokyo
Japan