Cognitive Impairment and Neuroplasticity in Deaf Children
1. The risk of cognitive impairment associated with
congenital deafness in children: the critical role of
neuroplasticity in development
Dr. Marit Lobben
Department of Psychology, University of Oslo
2. Types of cognitive and motor impairment observed
in congenitally deaf children
• Deprivation of sensory input affects neurological development.
• Sensory perception is a determinant of neurological development. It has been suggested that hearing may contribute to several other cognitive functions, including:
• Clinical development of spatial integration
• Visuospatial perception
• Executive function
• Visual memory
• Sequence learning
• Attention
• Language development
Schlumberger et al., 2004
• In addition, lack of auditory input may result in a number of bodily deficiencies:
• Motor control (slower development of coordination; speed of repetitive, alternating and sequential movement of limbs).
• The kinesthetic and vestibular systems: the sense of spatial orientation for coordinating movement with balance.
• Especially, a delay in the development of complex motor sequences.
• Also, deafness is associated with lower scores in visual gnosopraxic tasks (knowledge of how to coordinate hand movement with vision), although scores are not pathological.
• Cognitive and behavioral skills of children with deafness start to fall behind from a very young age (e.g. mean age 2 yrs in Kutz et al., 2003).
3. Why are all these deficiencies simultaneously present?
• Overall poor performance of cognitive and behavioral skills is a predictor of poor language skills and vice versa: we need to realise that language, perception, thought, and problem-solving constitute a continuum of interdependent functions.
• Some refer to this as polymodal development of the sensory apparatus, the developing trend in typically developing individuals (Bailey, 2002).
• Sensory input participates not only in a simple additive way but also has a reciprocal influence, in that it synergistically modulates the unfolding of neural networks.
• This is probably why children with deafness exhibit subtle differences from hearing subjects even in functions that seem far removed from audition, such as visuomotor integration or abstract thinking.
• From a neuroscientific perspective, this is an interesting demonstration of the extent of hearing's role in the building of neural networks that result from sensorimotor exposure and practice.
4. • So, from a neuroscientific perspective, we need to see these deficiencies in relation to two, or maybe three, factors:
• 1) Neuroplasticity. Modalities other than the auditory one become compensatorily developed.
• 2) Neural networks that normally overlap with hearing (i.e. that are in use for several functions) may be impaired because of the lack of auditory stimuli. Overlapping networks do not develop sufficiently.
• 3) Abilities that are normally strengthened by spoken language practice will be less fully developed.
• There are also confounds here, since there is a risk of interpreting comorbidity as an indicator of cognitive impairment.
• On the first point: if the cause of cognitive impairment in deaf individuals is reorganisation of the brain, all modalities need to be considered in the context of each other. We will shortly observe a core compensatory mechanism of cross-modal plasticity.
5. 1) The flip side of the coin: cognitive enhancement resulting from reorganization of the brain in deaf individuals
• Visuospatial abilities may actually be enhanced as a consequence of deafness (Bavelier et al., 2006). However, visual enhancement is thought by some to be restricted to those aspects of vision that would normally benefit from auditory-visual convergence.
• Although this has been attributed by some to the training of individuals in sign language (deaf native signers), a more plausible explanation of visual enhancement is that it results from the critical neuroplasticity during development.
• Compared to hearing individuals, congenitally deaf individuals detect moving targets in the visual periphery faster and more accurately when these are attended to while central motion is ignored; that is, under conditions that involve selective attention (e.g. detecting targets at unpredictable locations in the periphery, or finding a target among distractors). EEG studies show an increased N1 component in such tasks compared to hearing people. NB: This effect is not observed in hearing signers, so deafness must be driving it.
• Observed behavioral changes are accompanied by a reorganization of multisensory areas. In neuroimaging studies, cortical changes in deaf individuals were found in the secondary auditory area (A2), the posterior superior temporal sulcus (pSTS), the posterior parietal area (AG?) and, finally, in the motion-sensitive visual areas MT and V5.
6. • Scott and colleagues (2014) later confirmed the proposal that peripheral vision is typically enhanced in congenitally deaf individuals; i.e. that they show better motion detection, visual orienting, and selective attention in peripheral, but not central, visual fields.
• Scott and colleagues demonstrated that for peripheral vs. perifoveal visual processing, there were differences in activation level between hearing and deaf individuals in extrastriate visual cortex, including A1, motion-sensitive areas MT+/V5, superior-temporal auditory cortex, as well as in multisensory and supramodal regions.
• These results may be best understood as a change in the spatial distribution of visual spatial attention, which is altered in deaf compared to hearing individuals.
• Supported by behavioural data: while hearing persons exhibit greater distractibility from central than from peripheral distractors, deaf individuals are more easily distracted by peripheral distractors.
• These facts should be considered in deaf education; when deaf children appear to be more easily distracted and cannot stay on task, educators need to understand that their sensory apparatus actually works differently. This is NOT, then, an indication of a deficient visual attention system in the deaf population.
Figure 1. The proposal that Deaf individuals have greater attentional resources in the visual periphery predicts that peripheral distractors should be more distracting to Deaf than to hearing individuals. (a) The spatial distribution of attention as a function of eccentricity was measured by comparing the extent to which peripheral and central distractors interfere with target performance in Deaf and hearing individuals. (b) Hearing individuals exhibit greater distractibility from central than from peripheral distractors, in line with the view of heightened central attention in the hearing population. By contrast, Deaf individuals exhibit greater distractibility from peripheral distractors, supporting the view that Deaf individuals have enhanced peripheral attention.
Bavelier et al., Trends Cogn Sci. Author manuscript; available in PMC 2010 June 15.
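The interference measure described in the Figure 1 caption can be sketched as follows. All reaction times and the group pattern below are invented for illustration only; they merely reproduce the qualitative result, not the study's data:

```python
# Sketch of the distractor-interference measure from Figure 1 (toy data).
# Interference = mean RT with incongruent distractors minus mean RT with
# congruent distractors, computed separately for central and peripheral
# distractor locations and for each group.
from statistics import mean

def interference(rt_incongruent, rt_congruent):
    """Distractor interference in ms: positive = distractors slowed responses."""
    return mean(rt_incongruent) - mean(rt_congruent)

# Toy reaction times (ms) illustrating the reported pattern:
# hearing: more interference from central distractors;
# deaf: more interference from peripheral distractors.
hearing = {
    "central": interference([640, 655, 648], [600, 610, 605]),
    "peripheral": interference([615, 620, 610], [602, 608, 600]),
}
deaf = {
    "central": interference([612, 618, 615], [601, 607, 604]),
    "peripheral": interference([650, 662, 655], [603, 609, 606]),
}

print(hearing["central"] > hearing["peripheral"])  # True: heightened central attention
print(deaf["peripheral"] > deaf["central"])        # True: enhanced peripheral attention
```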
• Levänen & Hamdorf (2001) similarly discovered enhanced tactile sensitivity to vibratory stimuli in the congenitally deaf, which relates to the fact that the auditory cortex may start to process tactile information in humans as well.
Excerpt from Levänen & Hamdorf (2001), describing the task and results:
"[…] deviant stimuli (probability 0.2) embedded in a sequence of 250-Hz 'standard' stimuli (probability 0.8). Both tests were performed once with the right and once with the left hand (four blocks) in a randomized order across subjects. Because the performance did not depend on the hand used, the results obtained with each hand were averaged and used for group evaluations. In the change-detection task, the probability of correct responses to deviant stimuli and of incorrect responses to standard stimuli were calculated for each subject. On the basis of these probabilities, an unbiased measure of the performance, d′, was calculated [5] and used for group evaluations.
The smallest frequency difference detectable to the deaf subjects (mean ± SEM: 21 ± 3 Hz; Fig. 1) was clearly smaller than that of the hearing subjects (28 ± 4 Hz), but the difference was non-significant (t-test; P < 0.2), possibly due to the small number of subjects in each group. The deaf were, however, significantly better than the hearing controls in detecting suprathreshold frequency changes randomly occurring within the monotonous sequence of standard stimuli (mean ± SEM d′: 2.3 ± 0.1 vs. 1.1 ± 0.3; P < 0.003, t-test; Fig. 1).
These results imply that tactile sensitivity is enhanced in the congenitally deaf. However, only the ability to detect infrequent suprathreshold changes in the sequence of frequent stimuli was significantly enhanced when compared to the hearing subjects. On the other hand, the auditory noise used to prevent the hearing subjects from hearing the stimuli might have interfered with their performance and could partly explain the obtained results. However, since auditory noise seems to have very little or no effect on auditory change detection (see Ref. [12]), it seems unlikely that it would critically decrease performance on tactile change detection. Behavioral training can significantly improve performance on a variety of sensory discrimination tasks."
Fig. 1. Left: individual frequency-discrimination performances. Solid and dashed lines indicate the group means for deaf and hearing subjects, respectively.
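The sensitivity index d′ used in the excerpt above can be sketched as follows. The trial counts are hypothetical, and the clipping convention is an assumption (the excerpt cites its ref. [5] for the exact procedure):

```python
# Sketch of the signal-detection measure d' = z(hit rate) - z(false-alarm rate),
# where z is the inverse of the standard normal CDF. Rates of exactly 0 or 1
# are clipped to keep z finite (a common convention; the paper's exact
# correction is not given in the excerpt).
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    z = NormalDist().inv_cdf
    n_signal = hits + misses
    n_noise = false_alarms + correct_rejections
    hit_rate = min(max(hits / n_signal, 0.5 / n_signal), 1 - 0.5 / n_signal)
    fa_rate = min(max(false_alarms / n_noise, 0.5 / n_noise), 1 - 0.5 / n_noise)
    return z(hit_rate) - z(fa_rate)

# Hypothetical counts: 36/40 deviants detected, 8/160 false alarms.
print(round(d_prime(36, 4, 8, 152), 2))  # → 2.93
```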
7. Auditory-visual neurons in the superior temporal sulcus multisensory area (STS-MS)
• That secondary and multisensory brain regions are affected by the absence of one sensory modality is perhaps less surprising, in view of the fact that the brain contains multiple convergence zones, like the angular gyrus, which contains neurons that respond to two types of modality: the bimodal neurons.
• These are higher-cognition areas where modalities are integrated.
• A timely question, however, is whether congenital deafness also affects the low-level, primary cortices, or whether these quite specialised regions are exempt from neuroplasticity.
Beauchamp et al., 2004
8. Indications that primary auditory cortex is reorganised for new functions in congenitally deaf individuals
• Neuroimaging studies have shown that individually defined, anatomical regions of interest (ROIs) in Heschl's gyri (A1) respond to both touch and vision in congenitally deaf persons (Karns et al., 2012). Interestingly, some animal studies of deafness (in cats) show that primary auditory cortex responds to vision and somatosensation.
• This may, if not explain, then at least support neuroimaging data showing that enhanced peripheral vision in the congenitally deaf correlates with responses in primary auditory cortex of congenitally deaf persons (Scott et al., 2014).
• In contrast to hearing participants, deaf participants recruited, in response to peripheral visual stimuli, higher-order visual areas (contralateral middle temporal+/V5), contralateral auditory cortex, and multisensory superior temporal cortex, besides attention-related brain regions (left posterior parietal cortex, left frontal eye field).
• In ROI analyses, peripheral visual attention recruited multimodal areas involved in multisensory integration and attention (STS, PPC, anterior cingulate cortex/supplementary eye fields, and frontal eye fields).
Scott et al., Visual neuroplasticity in deafness.
FIGURE 3 | Differences between deaf and hearing for peripheral vs. perifoveal stimulation. Inset shows a schematic location of stimuli included in the contrasts (11–15° vs. 2–7°). (A) Deaf > Hearing and (B) Deaf alone. See Tables 1, 2 for a summary of significant clusters and corresponding atlas-based descriptions.
"[…regions] supporting higher order visual processing (MT+/V5), as well as multimodal areas implicated in multisensory integration and attention (STS, PPC, anterior cingulate/SEF, and FEF) (Levänen et al., 1998; McCullough et al., 2005), which showed greater signal change for peripheral visual processing in deaf participants. Taken together, these data suggest that neuroplasticity supporting enhanced peripheral visual processing in congenital deafness involves recruitment of low-level sensory cortex that has been deprived of its default sensory modality, as well as network-level recruitment of cortices involved in […]"
[Further column text from the source page is cut off at the margin; it discusses whether Heschl's gyrus responds to cross-modal stimulation in deafness and the need for ROI analyses to determine the extent of cross-modal neuroplasticity.]
Excerpt from the methods and results of Scott et al. (2014):
"…to all spatially smoothed EPI volumes to standardize functional data. Group analyses were performed using FLAME mixed effects error propagation with statistical thresholding reported for Z > 2.3 and corrected for multiple comparisons using a cluster probability thresholding of p = 0.05 (Worsley et al., 1992; Beckmann et al., 2003; Woolrich et al., 2004). To test whether differences between deaf and hearing increased as eccentricity increased, the contrasts for each experiment were 11–15° vs. 2–7°. Further analyses were conducted in each group alone.
RESULTS
HESCHL'S GYRUS REGION OF INTEREST
Percent signal change in anatomically-defined Heschl's gyrus was measured for the 11–15° vs. 2–7° contrast. Heschl's gyrus was divided into an anterior and posterior subregion (Figure 2A) to approximate primate primary-auditory areas A1 and R respectively (Da Costa et al., 2011). As shown in Figure 2B, hearing individuals had decreased signal in Heschl's gyrus for peripheral compared to perifoveal stimulation, represented as a negative signal change. In contrast, deaf individuals showed a positive signal increase for peripheral visual stimulation relative to perifoveal. Differences between deaf and hearing manifested as a Subregion × Group Interaction [F(1, 15) = 6.6, p = 0.02]. Follow-up t-tests indicated that the deaf had a larger signal difference between peripheral and perifoveal locations than the hearing in Contralateral Anterior Heschl's Gyrus [T(15) = 1.81, p = 0.045, one-tailed] and Contralateral Posterior Heschl's Gyrus [T(15) = 1.83, p = 0.044], and tended to be larger in the Ipsilateral Posterior subregion [T(15) = 1.66, p = 0.059]. These results indicate an increase in HG signal with increasing visual eccentricity from 2–7° to 11–15° in the deaf but not hearing participants.
GROUP ANALYSES
Group-level analyses were performed to identify regions that showed differential recruitment to peripheral vs. perifoveal visual presentation. As shown in Figure 3A and detailed in Table 1, for peripheral 11–15° vs. perifoveal 2–7° stimuli, deaf participants showed greater activation than hearing participants in left superior-temporal auditory and multisensory cortex as well as brain regions that have also been associated with attention [left posterior parietal cortex (PPC), and anterior cingulate/SEF] (Z > 2.3, p < 0.05 corrected).
In cluster-corrected analyses performed separately within each group, the hearing participants did not show any regions with a larger response to peripheral (11–15°) vs. perifoveal (2–7°) stimuli. In contrast, deaf participants recruited higher order visual areas (contralateral MT+/V5), contralateral auditory and multisensory superior temporal cortex, and attention-related brain regions (left PPC, left FEF) as shown in Figure 3B and Table 2. Within each group, we did not observe significant signal increase in anterior calcarine sulcus.
DISCUSSION
The present study reveals a network of brain regions exhibiting enhanced responsiveness to peripheral visual stimuli in profoundly, genetically, and congenitally deaf adults. It is particularly noteworthy, given existing controversies (see Bavelier and Neville, 2002; Karns et al., 2012), that we found that individual anatomically-defined Heschl's gyrus regions, the site of human primary auditory cortex, showed a reliable increased response to peripheral vs. perifoveal visual stimulation in deaf participants. Greater signal in Heschl's gyrus occurred for both the anterior and posterior division (putative analog to primate A1 and R, respectively). Additionally, we found that regions […]"
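The logic of the ROI comparison above (per-subject signal differences, peripheral minus perifoveal, compared between groups with a t-test) can be sketched as follows. The data are made up to illustrate the reported pattern (deaf: positive difference; hearing: negative), not the study's values:

```python
# Sketch of a between-group comparison of per-subject percent-signal-change
# differences, using a pooled-variance two-sample t statistic (pure stdlib).
from statistics import mean, variance
from math import sqrt

def pooled_t(a, b):
    """Student's two-sample t statistic with pooled variance."""
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    return (mean(a) - mean(b)) / sqrt(sp2 * (1 / na + 1 / nb))

# Toy per-subject differences (peripheral - perifoveal % signal change):
deaf_diff    = [0.12, 0.08, 0.15, 0.10, 0.09, 0.13, 0.11, 0.07]
hearing_diff = [-0.05, -0.02, -0.08, -0.04, -0.06, -0.03, -0.07, -0.01, -0.05]

t = pooled_t(deaf_diff, hearing_diff)
print("t =", round(t, 1), "df =", len(deaf_diff) + len(hearing_diff) - 2)
```

The sign of t tells you which group has the larger peripheral-vs-perifoveal difference; the degrees of freedom are n₁ + n₂ − 2, as in the excerpt's T(15) statistics.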
9. 2) Overlapping networks: spoken language and executive function
• Executive function development is important for behavioural self-regulation and emotional competence, depending on excitatory and inhibitory functioning. It is therefore a cognitive function that is integrated in many processes in our daily lives.
• It is well established that language and executive functions are highly interrelated, although the direction of influence is unclear.
• If language mediates EF, then "rules derived from language learning enable manipulation of cognitive processes via internal representations" (Zelazo et al., 2003).
• Botting and colleagues (2017) used deafness to disentangle the direction of influence, considering that other populations with suboptimal EF functioning are also cognitively deprived in other ways (ADHD, autism, dyslexia). They found that deaf children (n = 108; 86% born deaf; 64% wore hearing aids; mean age 8;10 years) performed significantly worse than hearing controls (n = 125) on executive function tasks, after non-verbal intelligence and processing speed had been controlled for. In other words, a major study.
• The key question was "Does atypical language experience affect performance on nonverbal executive function tasks?"
• NB: The language tasks were primarily naming tasks to picture stimuli (vocabulary; nouns and verbs).
• The deaf children scored below their hearing peers on the majority of EF tasks.
• The authors concluded from a series of regression analyses that language skills affected executive function but not vice versa.
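The "controlling for" logic in analyses like Botting and colleagues' can be sketched with a partial correlation: the language–EF association after removing variance shared with nonverbal ability. All scores below are invented for illustration; the study itself used a larger regression battery, not this exact computation:

```python
# Sketch of a partial correlation: correlation of language and EF scores
# after partialling out a control variable (here, nonverbal IQ).
from statistics import mean, stdev
from math import sqrt

def pearson(x, y):
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)
    return cov / (stdev(x) * stdev(y))

def partial_corr(x, y, z):
    """Correlation of x and y after removing the variance shared with z."""
    rxy, rxz, ryz = pearson(x, y), pearson(x, z), pearson(y, z)
    return (rxy - rxz * ryz) / sqrt((1 - rxz ** 2) * (1 - ryz ** 2))

language = [4, 7, 5, 9, 6, 8, 3, 10, 5, 7]                 # toy vocabulary scores
ef       = [3, 6, 5, 8, 5, 7, 2, 9, 4, 6]                  # toy EF composite
nviq     = [90, 100, 95, 105, 98, 102, 92, 108, 96, 101]   # toy nonverbal IQ

print(round(partial_corr(language, ef, nviq), 2))
```

If the language–EF association survives partialling out the control variables (as it did in the study's regressions), the association is not attributable to general nonverbal ability.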
10. What language functions are dependent on the prefrontal lobe?
• Are Botting and colleagues comparing apples and oranges?
• Primary function of the prefrontal lobes: executive function and goal-directed control.
• Neurons in the PFL fire differentially to the same type of stimuli, depending on the type of context.
• Has a role in how context alters meaning.
• Abstract thinking (patients with PFL damage depend more on concrete and superficial clues).
• Likely linked to the ability to understand the speaker's attitude to the utterance.
• The type of deficit has to be comparable in its basic role across the two fields: language competence and executive function.
• Vocabulary, by contrast, is processed in the temporal lobes (primarily left), contextualised in Broca's area (inferior frontal gyrus), visually represented in the ventral streams (inferior temporal lobes), made abstract in the anterior temporal poles/lobes, and, if embodied, in the sensory areas corresponding to the meaning of the verbs/nouns. IN OTHER WORDS, NOT IN THE PREFRONTAL LOBE.
• Be critical. Note that this study was published in 2017; fMRI was available!
[Figure: distributed-only view vs. distributed-plus-hub view of semantic representation, with convergent and gating architectures; nodes for action, shape, colour, motion, sound, and word names; task-dependent vs. task-independent representations. The accompanying column text, cut off at the margin, discusses the distributed-plus-hub view as the first to argue for unified representations that abstract away from individual modalities, and "convergence zones" that associate different attribute types.]
11. Six overlapping systems that involve the prefrontal lobes, all of which participate in the language function
• Theory of mind: the ability to ascribe mental states to conspecifics, and make judgments about what they think.
• Emotional processing: the ability to understand and distinguish affective states.
• Social knowledge schemas: guide judgements and behaviour.
• NB: If we suspect that congenitally deaf children are at risk of underdeveloped EF, which is subserved by prefrontal lobe structures, we should also test them for these functions!
• WORKING MEMORY is necessary to make inferences.
• Information processing speed is also related to WM.
• INHIBITORY CONTROL is important for comprehension, in order to suppress irrelevant associations. Poor IC also overloads an already impaired WM.
• ATTENTION: the ability to flexibly shift attention between attributes in the environment.
Duffau et al., 2014
12. The neurobiology of sign language
• Language production: IFG, including Broca's area (BA44/45). This area is equally important for sign language as for spoken language.
• BA44 is involved in the production of complex movements of the manual articulators (i.e. "phonological encoding").
• BA45/47 are engaged in modality-independent lexical search and retrieval (i.e. lexical-semantic processes).
• Superior parietal lobule (not in spoken language). General function: online control and programming of reach movements to target locations.
• Right parietal cortex: involved in describing spatial relationships with fixed hand postures.
Excerpt (Emmorey, 2015): "…processing heard speech and therefore is not engaged during sign language processing. In the absence of auditory input, deaf signers recruit auditory regions for processing sign language, as well as other nonlinguistic visual stimuli (e.g., Finney, Clementz, Hickok, & Dobkins, 2003). It is also notable that auditory cortex does not atrophy for congenitally deaf signers (Emmorey, Allen, Bruss, Schenker, & Damasio, 2003)…"
[A further passage, cut off at the margin, discusses Cardin et al. (2013), who found that left STS activation was driven by sign language rather than by auditory deprivation; deaf non-signers ("oral deaf") did not show increased left STS activation compared with hearing nonsigners.]
Figure 1 (a) Example of an American Sign Language classifier construction that describes the location of a clock with respect to a table (English translation: "The clock is above and behind the table"). (b) The contrast between expressing various locations of a single object (above, under, and beside the table) and expressing distinct object types in a single location (on top of the table) reveals greater activation in superior parietal cortex for expressing spatial locations. (c) The contrast between expressing distinct object types (e.g., a long object such as a hammer and a cylindrical object such as a bottle) and expressing various locations of a single object reveals greater activation in […] cortex when retrieving classifier handshapes that express object type. Reproduced from Emmorey, K., McCullough, S., Meh[ta, S.], & Grabowski, T. J. (2013). The biology of linguistic expression impacts neural correlates for spatial language. Journal of Cognitive Neuroscience, 517–533.
• Language comprehension: surprisingly, phonological processing of signs engages the superior temporal cortex, often bilaterally.
• Posterior STS was also involved in phonological processing (of pseudo-signs).
• In the absence of auditory stimuli, deaf signers recruit auditory cortex for producing sign language, as well as for other visual stimuli (Emmorey, 2015). Auditory cortex does not atrophy in congenitally deaf signers.
• Left IFG for phonological, semantic and syntactic functions.
• Bilateral middle temporal cortices.
• Possibly more bilateral than spoken language.
13. 3) Cognitive abilities that are normally strengthened by spoken language correspond to deficiencies in hearing-impaired children
• First, we need to emphasise that the use of a sign language may improve certain higher cognitive skills, such as performance on tasks of mental rotation of objects, image generation, face perception and short-term memory capacity (Bavelier et al., 2006).
• The type of approach laid out in point 3) was first tried out by Conway and colleagues (2011) in the area of phonology. The auditory scaffolding hypothesis proposes that audition may participate in domain-general knowledge; specifically, that early auditory deprivation of phonological sequences affects the general ability of sequence learning. In other words, whether sequence learning is affected by the lack of speech-sound stimulation.
• Sequence learning is an ability that is present in all mammals, since it is basic also to motor cognition. However, there are great differences in how complex the sequences acquired across species can be. In humans, it has developed to be crucial to linguistic processing.
• von Koss Torkildsen and colleagues (2018) examined this possible link between early deafness and later visual sequence learning but, in contrast to some former studies, used unfamiliar visual stimuli.
• Both groups showed significant learning; however, the null hypothesis was supported: there was no significant difference in visual sequence learning between prelingually deaf children and normally hearing children. In addition, the results could not be explained by age of cochlear implantation.
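The logic of visual sequence-learning paradigms of this kind can be sketched as follows: a familiarization stream is built from fixed "triplets" of shapes, so within-triplet transition probabilities are high and between-triplet transitions low, and learning shows up as a preference for real triplets over foils. The triplet contents and the scoring function below are invented for illustration, not taken from any of the cited studies:

```python
# Sketch of a statistical sequence-learning measure: estimate transition
# probabilities from a familiarization stream, then compare transition scores
# of real triplets vs. foils assembled from different triplets.
import random
from collections import Counter

triplets = [("A", "B", "C"), ("D", "E", "F"), ("G", "H", "I")]

random.seed(0)
stream = [s for _ in range(200) for s in random.choice(triplets)]

# Estimate P(next | current) from adjacent pairs in the stream.
pair_counts = Counter(zip(stream, stream[1:]))
elem_counts = Counter(stream[:-1])

def transition_score(seq):
    """Mean transition probability across a candidate triplet."""
    return sum(pair_counts[(a, b)] / elem_counts[a]
               for a, b in zip(seq, seq[1:])) / (len(seq) - 1)

real = transition_score(("A", "B", "C"))
foil = transition_score(("A", "E", "I"))   # shapes drawn from different triplets

print(real > foil)  # True: a learner tracking statistics prefers real triplets
```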
[Figure: panels (A)–(L) showing the stimuli used in visual sequence learning (Arciuli & Simpson, 2011).]
14. • For discussion: hearing and motor skills (like rhythmic movement) are connected in the brain (they share neural circuits), but visual rhythms are not linked to the motor modality. (E.g., we do not feel like clapping our hands or stamping our feet when we see rhythmic sequences.)
• In that sense, the study by von Koss Torkildsen et al. confirms not that there is no difference between hearing and non-hearing children, but that the visual modality is not linked to sequence learning in the same way that motor sequence learning is.
• Do studies that depend on finger-tapping sequences get more affirmative results here?
15. Effects of intervention: normalization after 7 years of age
• Schlumberger et al. (2004) found that in a number of cognitive tests, there was a significant difference between children with no CI on the one hand, and hearing children and CI recipients (implanted at 26–39 months) on the other:
• Raven test
• Copy-drawing test
• An intelligence test for children (mazes) [delayed only, until 7 years]
• A visual perception test
• In favour of cochlear implants (CI) and early intervention, therefore, is a better development of these abilities, besides a good verbal development.
• Why the change at seven years of age?
This is the age when the pruning processes start in the brain. Prior to 7 yrs, synaptic connections have been steadily increasing. At 7, a selection process starts in which synaptic connections that are not strengthened by learning wane and disappear, up until about 20 yrs.
16. Summary and conclusion
• We have looked at the idea that lack of auditory sensory experience may lead to the weakening of other, cognitively related abilities, possibly related via domain-general abilities.
• Further, we have reasoned that, if this were the case, an underlying cause may be that these abilities depend on the same or overlapping neural networks.
• None of the studies we looked at could confirm this, for different reasons, although it is a scientifically very appealing thought. Can methods and hypotheses be improved?
• We also looked at the flip side of the "collateral cognitive impairment hypothesis": that lack of auditory stimulation may cause other brain systems to develop differently, put the auditory cortices to new use, and reorganise higher-cognition regions as a function of plasticity in congenitally deaf individuals. For the studies that we selected, there was more support for this latter hypothesis.
17. The reason we should care: emotional and behavioral challenges
• Congenitally deaf children who learn sign language (i.e. a fully functional language in all respects) seem to lack nothing when functioning with other signers. Why should we intervene?
• Hearing-impaired children and adolescents (up to 21 yrs) may experience a higher level of emotional and behavioural difficulties than hearing children, as rated by parents and teachers (Stevenson et al., 2015).
• They encounter challenges in communication, which may lead to negative effects on their social-emotional development and mental health, including:
• Depression, aggression, oppositional defiant disorder, conduct disorder, anxiety, somatization and delinquency. Possibly, hyperactivity (school-based only) and ADHD are also overrepresented among the hearing impaired.
Stevenson et al., 2015