
Five things every teacher needs to know about research


  1. ‘FIVE THINGS EVERY TEACHER NEEDS TO KNOW ABOUT RESEARCH’, 8 February 2019, 16:00–16:30, Christian Bokhove, JUSCO. This is the web-version of the slides. If you feel I haven’t referenced a source appropriately, or you otherwise object to the use of some content, please let me know.
  2. Who am I? • Christian Bokhove • Was a maths and computer science teacher from 1998–2012 in a secondary school in the Netherlands • PhD in 2011 at Utrecht University • Now Associate Professor at the University of Southampton, United Kingdom • Mathematics education • Large-scale assessment (TIMSS, PISA) • Research methods • Social media contrarian
  3. Today • I’ve had many discussions about research on social media. • It sometimes seems as if we are talking past each other rather than with each other. • I think this is related to underpinning visions of education. • Such underpinning ideas are crucial for mutual understanding. • This presentation tries to contribute to this. • Based on my writings in the Times Educational Supplement, lovingly augmented with ‘the best of Twitter’. https://is.gd/5researchthings
  4. Education Research • ‘Evidence-informed’ v ‘evidence-based’: still evidence, but taking into account the nature of the social sciences. • But what is it, anyway? Multidisciplinary… • Sociology • Economics • Psychology • Pedagogy • … • Every discipline has its own set of assumptions and methods (and paradigms) • “dealing with so many variables that are extremely hard to (all) control.” (Neelen & Kirschner’s blog, 2018) https://3starlearningexperiences.wordpress.com/2018/06/26/working-in-an-evidence-informed-way/
  5. Five age-old discussions… 1. Education research creates a ‘lesser form of knowledge’? 2. Cause and effect 3. One swallow does not make a summer: paradigms 4. Context 5. Quantification and measurement This requires a knowledge base…
  6. 1. ‘LESSER FORM OF KNOWLEDGE’
  7. Labaree (1998) • Challenges of education research • Humans are unpredictable • Interaction between the researcher and what he/she studies • Labaree: ‘soft knowledge’.
  8. As a result… Negatives: • Lower status in academia • Less authority with policy makers • Pressure to be more like the ‘hard sciences’ • No hard truths: a few years later there can be other insights (related to ‘publish or perish’) Positives: • Inherently ‘in the middle of society’ • Fewer consumer pressures So Labaree says: let’s utilise those positives…
  9. 2. CAUSE AND EFFECT
  10. David Hume (1711–1776)
  11. Page 156, “A Treatise of Human Nature” • Causality is a product of our experiences • Causality is a ‘habit of mind’ • It’s fiction • Problem: spurious correlations. How can we ever know causality?
  12. Kant’s reaction… • Hume’s work made Kant wake up from his ‘dogmatic slumber’. • Wrote the Prolegomena. • Kant could not agree with the conclusion that causality is just fiction. • Kant reflected on this and thought about how to formulate causality a priori. • No space here for an in-depth discussion, but the point is not who is right: these topics have been discussed for centuries…
  13. So we need to be cautious… http://tylervigen.com/spurious-correlations
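A minimal sketch of why such caution is needed, using purely synthetic, invented numbers (the ice-cream and reading-score names are placeholders echoing the Economist example on the next slide): any two series that merely share an upward trend over time will produce a large Pearson correlation, with no causal link whatsoever.

```python
# Synthetic illustration of a spurious correlation: two unrelated series that
# both trend upward over time end up strongly correlated.
import numpy as np

rng = np.random.default_rng(seed=1)
years = np.arange(2000, 2020)

# Hypothetical quantities: independent noise, shared time trend.
ice_cream_sales = 100 + 5.0 * (years - 2000) + rng.normal(0, 4, size=years.size)
reading_scores = 60 + 0.8 * (years - 2000) + rng.normal(0, 1, size=years.size)

r = np.corrcoef(ice_cream_sales, reading_scores)[0, 1]
print(f"Pearson r = {r:.2f}")  # typically well above 0.9, with no causal link

# Removing the shared trend (correlating year-on-year changes instead) makes
# the apparent 'relationship' largely vanish.
r_diff = np.corrcoef(np.diff(ice_cream_sales), np.diff(reading_scores))[0, 1]
print(f"Pearson r of changes = {r_diff:.2f}")
```

The shared driver here is simply time; once that common trend is removed, the apparent relationship disappears, which is exactly what makes the Tyler Vigen examples look so convincing at first glance.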
  14. But sometimes common sense… https://www.economist.com/graphic-detail/2016/04/01/ice-cream-and-iq
  15. Example: achievement – motivation • If you are good at something, you will enjoy doing it more… • But a challenge is ‘what if you are not so good at it?’ (and this can also be relative in a classroom)… • If you like doing something, you will probably be or become better at it. It is reasonable to say the relation is bidirectional.
  16. 3. PARADIGMS
  17. Logical positivism: seen by many as ‘standard’ • Two sources of knowledge: empirical data and logical reasoning. • Science is cumulative, ‘progressive’. • Scientism: only science provides true knowledge. • Science should be free from values (objectivism). • The task of the philosophy of science is normative. https://uk.sagepub.com/sites/default/files/upm-binaries/57753_Chapter_3.pptx No, don’t think this is a pejorative: it’s a perfectly normal term in the philosophy of science.
  18. Critics of positivism: Quine, Sellars, Wittgenstein, Hanson, Kuhn, Popper, Lakatos, Feyerabend. https://uk.sagepub.com/sites/default/files/upm-binaries/57753_Chapter_3.pptx
  19. Critics of positivism Early criticism was epistemological: it emphasised that “scientific objectivity cannot exist by virtue of neutral observation of alleged pure data out of the outside world; objective observations in this sense are not possible at all.” • Quine: observation statements are part of whole theories. • Wittgenstein II: the meaning of a word is dependent on the language game of which it is a part; therefore its meaning is shown by how it is used. • Hanson: observation is theory-laden; to see is to see as. https://uk.sagepub.com/sites/default/files/upm-binaries/57753_Chapter_3.pptx
  20. Karl Popper (1902–1994) Logical positivism: • induction; • generalisation; • verification. A generalisation can’t be verified, but it can be falsified (the black/white swan). Science is risk-taking; verification is not interesting. https://uk.sagepub.com/sites/default/files/upm-binaries/57753_Chapter_3.pptx
  21. Imre Lakatos (1922–1974) Dogmatism/conservatism within research programs, and progress and rational choice between programs: relativism can be avoided. • Degenerating research program: just more and more ad-hoc hypotheses; • Progressive research programs: ad-hoc hypotheses lead to new predictions, data, applications; • Competition: rational choice, not just mob psychology (contra Kuhn); • Post-hoc, no a priori demarcation (contra falsificationism). https://uk.sagepub.com/sites/default/files/upm-binaries/57753_Chapter_3.pptx
  22. 4. CONTEXT
  23. The hardest science of them all • Berliner (2002) • Local conditions make generalisations hard • Need context to put findings in perspective • This is not relativism • Example: Project Follow Through • ‘Planned variation’ • Direct Instruction on average very effective • But the variation between different implementations of the same approach was perhaps larger than the variation between different approaches • We also see this in curriculum projects.
  24. 5. QUANTIFICATION AND MEASUREMENT
  25. • Labaree again, now in 2011 in “The Lure of Statistics for Educational Researchers” • History of ‘statistics’ • From German Statistik, from New Latin statisticum (“of the state”) and Italian statista (“statesman, politician”). Statistik, introduced by Gottfried Achenwall (1749), originally designated the analysis of data about the state. • The aim was to improve credibility and stature, and policy influence.
  26. • Forcing a rectangular grid on a spherical world. Two problems, according to Labaree (2011): • It can affect local practical knowledge. • What are we measuring anyway? (fMRI, SES, mindset, load) This might detract from what can’t be measured easily. Maybe it is better to combine qualitative and quantitative methods, and let the research questions lead.
  27. [Complex diagram or table: Muijs & Bokhove, 2017]
  28. What is being measured? • Teacher-directed • Perceived feedback • Adaptive instruction • Enquiry-based instruction Source: PISA
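To make the “what is being measured?” question concrete: an index such as ‘teacher-directed instruction’ is typically built from a handful of Likert-type questionnaire items. The sketch below uses invented item responses, a simple mean composite, and Cronbach’s alpha as a rough internal-consistency check; it is only an illustration of the idea, not PISA’s actual procedure (PISA derives its questionnaire indices with IRT scaling rather than simple item means).

```python
# Toy composite for a construct like 'teacher-directed instruction':
# invented Likert responses (1 = never ... 4 = every lesson), a mean score per
# student, and Cronbach's alpha as a crude reliability check.
import numpy as np

# Rows = students, columns = questionnaire items (all hypothetical).
items = np.array([
    [4, 3, 4, 4],
    [2, 2, 3, 2],
    [3, 3, 3, 4],
    [1, 2, 1, 2],
    [4, 4, 3, 4],
    [2, 3, 2, 3],
], dtype=float)

composite = items.mean(axis=1)  # simple mean composite per student

# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of the sum)
k = items.shape[1]
alpha = (k / (k - 1)) * (1 - items.var(axis=0, ddof=1).sum()
                         / items.sum(axis=1).var(ddof=1))

print("Composite scores:", np.round(composite, 2))
print(f"Cronbach's alpha = {alpha:.2f}")
```

Even a composite with a respectable alpha only measures what the items actually ask about; whether that matches the label on the scale is a separate validity question, which is exactly the point of this slide.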
  29. (Paas, 1992)
  30. https://www.frontiersin.org/articles/10.3389/feduc.2017.00037/full
  31. Example: ‘learning’ • “change in long-term memory” (Kirschner, Sweller & Clark, 2006) • But how to measure it? Performance versus learning (Soderstrom & Bjork, 2015) • Willingham (2017): it depends on your theory. https://is.gd/onlearning
  32. Many definitions… • Product (end result) or process? (Lachman, 1997) • Learning as the processing of information or experience • Learning defined as behavioural change • Learning defined as changes in behavioural mechanisms These views complement each other (Barron et al., 2015).
  33. CONCLUSION
  34. What to take away from this? 1. Education research creates a ‘lesser form of knowledge’? Not much point in comparing education research with the natural sciences. Research into education (multidisciplinary) has its own strengths.
  35. What to take away from this? 2. Cause and effect Correlation is not causation, but it’s good to probe a little bit deeper. Sometimes only one direction is plausible, sometimes both directions.
  36. What to take away from this? 3. One swallow does not make a summer: paradigms Your view of science (your paradigm) determines whether one swallow makes a summer or not. My view would be that one study does not negate other bodies of research. So if you notice that one particular study is cited all the time, have a further look as well.
  37. What to take away from this? 4. Context We can carefully generalise over contexts, but there is always a context. You will better understand the strengths and limitations of a study by exploring its context as well.
  38. What to take away from this? 5. Quantification and measurement Measurement gives us important information, but you need to know what you are measuring. Try to find out what information actually led to the conclusions in a piece of research.
  39. So make sure… • You read, read, read (unfortunately this can take quite some time). • Be critical. • Postpone hard conclusions. • Steel-manning: try to construct the best argument for the position opposite to the one you believe in. • Be alert to terminology (definitions). • You can disagree politely.
  40. Thank you • C.Bokhove@soton.ac.uk • Twitter: @cbokhove • Website: www.bokhove.net There are only two types of people in the world: those who believe in false dichotomies, and penguins.
  41. References
Barron, A.B., Hebets, E.A., Cleland, T.A., Fitzpatrick, C.L., Hauber, M.E., & Stevens, J.R. (2015). Embracing multiple definitions of learning. Trends in Neurosciences, 38(7), 405-407. Open access version at https://digitalcommons.unl.edu/cgi/viewcontent.cgi?article=1058&context=bioscihebets
Berliner, D.C. (2002). Educational research: The hardest science of all. Educational Researcher, 31(8), 18-20.
Kirschner, P.A., Sweller, J., & Clark, R.E. (2006). Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educational Psychologist, 41, 75-86.
Labaree, D.F. (1998). Educational researchers: Living with a lesser form of knowledge. Educational Researcher, 27(8), 4-12.
Labaree, D.F. (2011). The lure of statistics for educational researchers. Educational Theory, 61(6), 621-632.
Lachman, S.J. (1997). Learning is a process: Toward an improved definition of learning. The Journal of Psychology: Interdisciplinary and Applied, 131(5), 477-480.
Muijs, D., & Bokhove, C. (2017). Postgraduate student satisfaction: A multilevel analysis of PTES data. British Educational Research Journal, 43(5), 904-930. https://doi.org/10.1002/berj.3294
Paas, F. (1992). Training strategies for attaining transfer of problem-solving skill in statistics: A cognitive-load approach. Journal of Educational Psychology, 84(4), 429-434.
Soderstrom, N.C., & Bjork, R.A. (2015). Learning versus performance: An integrative review. Perspectives on Psychological Science, 10(2). https://doi.org/10.1177/1745691615569000
Willingham, D.T. (2017). On the definition of learning… Available at http://www.danielwillingham.com/daniel-willingham-science-and-education-blog/on-the-definition-of-learning
