Direct instruction is a hot topic in schools, but discussions about it often end with people talking past each other, because the term can mean several things. In this talk I look at different ways of conceptualising 'direct instruction': for example, scripted Direct Instruction (with capitals) from the seminal Project Follow Through, lecturing, and more interactive teaching such as Rosenshine's 'explicit instruction' and 'active learning'. I also highlight the strengths and limitations of their respective evidence bases, and frame them more generally as a 'guidance dilemma': how much guidance do we use in teaching and learning on learners' journey from novice towards expert? I finish with some concrete recommendations.
From novice to expert: A critical evaluation of direct instruction
1. FROM NOVICE TO EXPERT
A critical evaluation of direct instruction
Dr Christian Bokhove
Associate Professor in Mathematics Education
University of Southampton
22 January 2020 4.30-6.00
Disclaimer: I have referenced all sources and used/cited them in ‘fair use’.
If, however, you feel I have to change or remove a source, please let me know
via C.Bokhove@soton.ac.uk
2. Who am I?
• Dr Christian Bokhove
• 1998-2012: teacher of mathematics & computer science and head of
ICT at a secondary school in the Netherlands
• National projects at Utrecht University
• PhD ‘Use of ICT for acquiring, practicing and
assessing algebraic expertise’ with
Prof. Van Maanen and Prof. Drijvers
• Associate Professor at University of
Southampton
• Mathematics education
• Technology use
• Large-scale assessment (PISA/TIMSS)
• Research methods
3. What I intend to do
• Why is it so popular now?
• What forms of direct instruction are there?
• What sources are there for the efficacy of instruction?
• Research, PISA
• Strengths and limitations of these sources
• What it boils down to: guidance dilemma
4. Context
• Discussions regarding teaching strategies have been
around for decades.
• But recently, at least in policy and (social) media they
have flared up again.
https://www.gov.uk/government/speeches/nick-gibb-the-importance-of-an-evidence-informed-profession
7. What is direct instruction?
• “Five overlapping meanings of direct instruction.
1. Instructional procedures used in the Distar (Direct
Instruction Systems in Arithmetic and Reading)
programs.
2. Academic instruction that is led by a teacher
regardless of the quality of instruction.
3. The instructional procedures that were used by
effective teachers in the teacher effects research.
4. Instructional procedures used by teachers when they
taught cognitive strategies to students.
5. Instruction where direct instruction is portrayed in
negative terms such as settings where the teacher
lectures and the students sit passively.”
http://www.centerii.org/search/Resources/FiveDirectInstruct.pdf
8. Direct Instruction (Capitals)
• Instructional procedures used in the Distar (Direct
Instruction Systems in Arithmetic and Reading) programs.
• Siegfried Engelmann
• 1969-1972, with Becker, Project Head Start: “provide a
comparison of the effectiveness of different 'models' of
early childhood programs with disadvantaged children”
• 1967: President Johnson's War on Poverty – ‘follow
through’ on temporary gains
https://psych.athabascau.ca/open/engelmann/bio.php
9. Project Follow Through
• “Social program whose primary goal was to provide
disadvantaged school-age children with compensatory
education.”
• Planned variation: not random, parents of children in a
geographical area could choose models.
• Sponsored model approach
• Affective models
• Cognitive models
• Basic Skills models
• Measured: basic academic skills, problem solving skills
and self-esteem
10. Features of Distar
• Small group instruction by ability
• Attention focused on the teacher.
• Scripted presentation of carefully designed instruction.
• Active responding as a group and individually.
• Responding is cued by the teacher.
• Frequent feedback and correction.
• High pace
• Parental engagement/involvement
11. Principles
• Engelmann & Carnine “Theory of instruction: Principles
and applications”
• Instructional analysis: nurture, the environment is the
primary variable accounting for learning
• Relationship between the environment and the learner:
can’t keep learner constant so controlled instruction.
• Faultless communication: concepts, examples, non-examples
(note the overlap with ‘variation theory’)
• Only look at learner behaviour if learning is unsuccessful
https://psych.athabascau.ca/open/engelmann/theory.php
12. But what does it look like? (1)
https://www.youtube.com/watch?v=j9SjFsimywA
14. But…not adopted
• “theory and methods espoused by Engelmann were
inconsistent with the dominant thinking of American
educators”
• “the American Federation of Teachers asserts that
Engelmann's programs were criticized for being too rigid
and for emphasizing basic skills.”
• “Schools of Education in universities, boards of education,
the Ford Foundation and commercial publishers argued
against the research and the data, and they won. Opinion
triumphed over data “
https://psych.athabascau.ca/open/engelmann/direct-evid.php
15. Critique of the project
• House critique (1978): classification, measurement and analysis
(e.g. site level). But it rather confirms the original Abt
Associates analysis.
• Bereiter and Kurland (1981):
• “Two models, Direct Instruction and Behavior Analysis, were at or
near the top on every achievement variable regardless of the
covariates used.”
• “not find variability among sites to be so great that it overshadows
variability among models. “
• Kennedy (1978): “structured approaches to instruction.”
Some more information from page 35 in Watkins (1997).
16. But what does it look like?
• “15 minutes of small group instruction and 30 minutes of self-
directed practice, plus strong parental support, as per the
DISTAR model is highly effective. I'm also sure there are many
ways to frustrate such effective programs.”
• 'Ease of implementation' also a variable.
“It was found that the variance in student
achievement was larger within programs
than it was between programs. No program
could produce consistency of effects
across sites. Each local context was
different, requiring differences in
programs, personnel, teaching methods,
budgets, leadership, and kinds of
community support.” (Berliner, 2002)
17. However…
• Need to remember that PFT started as a social service
programme.
• Administrators sought to represent all educational interests.
• By definition therefore very political with lots of interest groups.
• PFT has been undervalued, but can it still tell us something today?
• And also…would the approach work per se in a non-
disadvantaged, secondary context, for example?
• I would say: maybe, but can’t simply assume…
There are lessons to be learned from this…
18. Recent review
• By Stockard et al. (2018).
• Overall very effective.
• But quite large differences
between measurements,
designs.
• Also some strange
inclusion criteria.
19. Rosenshine
• Originally published by the International Bureau of Education,
UNESCO, in 2010
• But popularised by the 2012 American Educator article
‘Principles of Instruction’
https://www.aft.org/sites/default/files/periodicals/Rosenshine.pdf
20.
21. Critique of Rosenshine
• Correlational. Depends heavily on the work of Brophy and Good:
process-product research. Risk of survivorship bias?
• References a bit dated.
But…
• Can’t go much wrong with the recommendations
• ‘I do, we do, you do’
(e.g. https://education.msu.edu/irt/PDFs/OccasionalPapers/op073.pdf)
22. Risk of survivorship bias
• “the logical error of concentrating on the people or things
that made it past some selection process and overlooking
those that did not, typically because of their lack of
visibility.”
• Example WW2 (Wald)
• Successful teachers
• Successful schools
• Column
23. What we can take from this…
Rosenshine: “Guided practice, active student participation,
and fading of teacher-directed activities appear in all three
meanings. Scaffolds, modelling by the teacher, and
coaching of students also appear in all three.”
But “rather than a ‘whole system’ it should best be seen as
a set of principles derived from various sources, within
which decisions ultimately have to be left to a teacher’s
professional judgement. “ (Bokhove & Campbell, to appear)
24. Teacher-directed instruction?
• PISA is another source that popularised talk about instruction.
• PISA 2015 → Science results.
https://oecdedutoday.com/working-hard-and-being-kind/
https://www.theaustralian.com.au/commentary/evidence-key-if-were-to-turn-education-fortunes-around/news-story/d314f7dfd631524f1b716a3c49bf084e
25. PISA
• Programme for International Student Assessment
• Organisation for Economic Co-operation and
Development (OECD) funded
• 15 year olds
• 2012: mathematics, 2015: science, 2018: reading
• 70+ countries
• Sample students in schools, not classrooms
• http://www.oecd.org/pisa/
26. But we have to look deeper
• Other scales like ‘cognitive activation’ (PISA 2012) and
‘adaptive instruction’ (PISA 2015) have instruction
elements as well.
• Some scales, like the enquiry scale, ask about other things.
• Are they opposite? No.
Oliver et al. (2019)
27. • In PISA 2012 74% of students in Japan agreed or strongly
agreed that 'The teacher tells us what we have to learn'.
Sounds like quite a lot, but not when we know the OECD
average was 80% and the United Kingdom, for example,
87%.
• That is only one of the questions that contributed to the
teacher-directed instruction scale in PISA 2012. For completeness:
Japan scored -0.26 (but sd 0.90) and the UK 0.15 (sd 0.94), so
some might conclude 'more teacher-directed instruction in the UK'.
28. 0.15 (sd 0.94) in PISA 2012
In PISA 2015 there was also a teacher-directed instruction
scale: 0.09 (sd 0.94). But can they be compared?
29. So I took into account the complex sampling design
(student weights, 10 plausible values, etc.) and looked at
that scale for the UK. It seems 0.03 (sd 0.99).
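A rough sketch of this kind of analysis (not the exact code behind the talk): weight every student by the final student weight and, when relating a scale to achievement, repeat the estimate for each of the 10 plausible values and average the results. The column names here (W_FSTUWT for the final student weight, PV1SCIE…PV10SCIE for the science plausible values, TDTEACH for the teacher-directed index) follow the public PISA 2015 student files, but treat them as assumptions; a full analysis would also use the replicate weights to get correct standard errors.

```python
import numpy as np
import pandas as pd

def weighted_mean(x, w):
    """Weighted mean of a scale, ignoring missing values."""
    mask = x.notna() & w.notna()
    return float(np.average(x[mask], weights=w[mask]))

def pv_weighted_corr(df, scale, pv_cols, w_col):
    """Relate a scale to achievement: compute the weighted
    correlation for each plausible value, then average the
    point estimates across plausible values."""
    corrs = []
    for pv in pv_cols:
        sub = df[[scale, pv, w_col]].dropna()
        w = sub[w_col]
        # Weighted covariance divided by weighted standard deviations.
        mx = np.average(sub[scale], weights=w)
        my = np.average(sub[pv], weights=w)
        cov = np.average((sub[scale] - mx) * (sub[pv] - my), weights=w)
        sx = np.sqrt(np.average((sub[scale] - mx) ** 2, weights=w))
        sy = np.sqrt(np.average((sub[pv] - my) ** 2, weights=w))
        corrs.append(cov / (sx * sy))
    return float(np.mean(corrs))

# Hypothetical usage with a loaded PISA 2015 student file:
# pvs = [f"PV{i}SCIE" for i in range(1, 11)]
# m = weighted_mean(df["TDTEACH"], df["W_FSTUWT"])
# r = pv_weighted_corr(df, "TDTEACH", pvs, "W_FSTUWT")
```

This illustrates why single-country scale means (like the UK's 0.03 above) only make sense once the design weights are applied: unweighted averages over-represent over-sampled groups of students.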
30. What about PISA 2018
• Focus on Reading
• Also a ‘teacher-directed instruction’
scale
• But a negative association (a quick analysis of my own)
It remains risky to use PISA to infer effective teaching
strategies without taking into account context (e.g.
subjects) and the nature of the data (student self-report).
32. And when you dive deeper
PISA 2015
- Teacher directed
- Adaptive instruction
- Enquiry scale
33. Critique of PISA
Among the limitations regarding such strong conclusions:
• PISA samples students from schools, which makes claims
about classroom practices difficult to substantiate.
• Self-report
• Answer options changed over time
• Comparison with other teaching strategies
• Comparison over time
• Correlation/causation
• And many more….
34. So….
• What we mean by instruction varies widely
• How we ask about it varies widely
• The contexts (target audience, countries, time period)
vary widely
• The conclusions about it vary widely
Is there nothing we can say?
35. What it all boils down to: guidance
• ‘Guidance dilemma’ or
‘assistance dilemma’
• First formulated in the
context of Intelligent
Tutoring Systems
• Even when we read
reviews of Inquiry we
see the role of
guidance.
36.
37. How do we help learners go
from Novice to Expert?
• De Groot’s and Simon’s famous
research with chess players.
• Chi: how experts can use their
knowledge structures to
assimilate new concepts and
make decisions about familiar
concepts.
• Ericsson’s deliberate practice:
(i) it aims specifically to improve
performance,
(ii) it can be repeated a lot,
(iii) feedback on results is
continuously available,
(iv) it is highly demanding
mentally, and
(v) it isn't much fun.
(Ericsson & Pool, 2016)
39. The key to all of this: prior knowledge
• To judge how much guidance is necessary, we need to know
learners’ prior knowledge.
• Can gauge this in several ways, for example quizzes,
place in curriculum, questioning etc.
• One part of teachers’ craft knowledge, gained through
experience, is to anticipate this.
• But there are plenty of things that can be prepared:
faultless communication, examples, clear problem
statements etc.
41. Conclusion
• Different models of instruction. Not all perfect, not all bad.
Instruction is perfectly fine and can be very effective.
• Boils down to guidance and how this is organised.
• This also depends on learning goals: “abandoning the
rigid explicit instruction versus minimal guidance
dichotomy and replacing it with a more flexible approach
based on differentiating specific goals of various learner
activities in complex learning.” (Kalyuga & Singh, 2016)
• Modify the amount of guidance based on an estimate of
prior knowledge.
• (Experienced teachers often do this as ‘craft knowledge’)
42. Thank you
• Any questions?
• C.Bokhove@soton.ac.uk
• Twitter: @cbokhove
• Interested in this topic or other specialisms? Let me know.
43. Selected references
Berliner, D. C. (2002). Comment: Educational Research: The Hardest Science of All. Educational Researcher, 31(8), 18–20.
https://doi.org/10.3102/0013189X031008018
Bokhove, C., & Drijvers, P. (2012). Effects of feedback in an online algebra intervention. Technology, Knowledge and Learning,
17(1-2), 43-59. doi:10.1007/s10758-012-9191-8
Brophy, J., & Good, T.L. (1984). Teacher behaviour and student achievement. Retrieved from
https://education.msu.edu/irt/PDFs/OccasionalPapers/op073.pdf
Engelmann, S., Becker, W., Carnine, D., & Gersten, R. (1988). The Direct Instruction Follow Through Model: Design and
outcomes. Education and Treatment of Children, 11(4), 303-317. Retrieved January 21, 2020, from www.jstor.org/stable/42899079
Ericsson, A., & Pool, R. (2016). Peak: Secrets from the new science of expertise. Boston: Houghton Mifflin Harcourt.
Kalyuga, S., & Singh, A.-M. (2016). Rethinking the boundaries of cognitive load theory in complex learning. Educational
Psychology Review, 28(4), 831–852. https://doi.org/10.1007/s10648-015-9352-0
Lazonder, A. W., & Harmsen, R. (2016). Meta-Analysis of Inquiry-Based Learning: Effects of Guidance. Review of Educational
Research, 86(3), 681–718. https://doi.org/10.3102/0034654315627366
Oliver, M., McConney, A., & Woods-McConney, A. (2019). The Efficacy of Inquiry-Based Instruction in Science: a Comparative
Analysis of Six Countries Using PISA 2015. Research in Science Education. doi:10.1007/s11165-019-09901-0
Renkl, A., Atkinson, R. K., & Große, C. S. (2004). How fading worked solution steps works – A cognitive load perspective.
Instructional Science, 32, 59-82.
Rosenshine, B., (2010). Principles of Instruction. Educational Practices Series-21. UNESCO International Bureau of Education.
Retrieved from http://www.ibe.unesco.org/fileadmin/user_upload/Publications/Educational_Practices/EdPractices_21.pdf
Stockard, J., Wood, T. W., Coughlin, C., & Rasplica Khoury, C. (2018). The Effectiveness of Direct Instruction Curricula: A Meta-
Analysis of a Half Century of Research. Review of Educational Research, 88(4), 479–507.
https://doi.org/10.3102/0034654317751919
Watkins, C. L. (1997). Project Follow Through: A case study of contingencies influencing instructional practices of the educational
establishment. Retrieved from http://www.behavior.org/resources/901.pdf
Editor's Notes
Disclaimer: I try to cover both ‘sides’. All too often, not being enormously pro or con, leads to ‘othering’.