Research Paradigms: Ontologies, Epistemologies & Methods

Speaker notes:
  • Evidence-based decision-making was developed by a group of clinical epidemiologists at McMaster University in Canada (Sackett et al., 1985).
  • But what if the results had shown significant differences in favor of either mode of delivery? Would they have informed our practice? I think the answer would be a resounding “not very likely”. The meta-analysis tells us nothing about the critical context in which the learning took place. What learner support services were in place? What was the quality of the teaching or of the content? What was the condition of the home study or the class environment? The list of contextual factors goes on and on. Thus, one can conclude that this gold standard (randomly assigned comparison-group research and subsequent meta-analysis) is of only limited use to practicing distance educators. These results may be useful in persuading reluctant colleagues or funders of the efficacy of distance education, but they tell us little that will help us improve our practice.

    1. 1. Research Paradigms: Ontologies, Epistemologies & Methods – Terry Anderson, PhD Seminar
    2. 2. Research Paradigms
    3. 3. Paradigm • “a philosophical and theoretical framework of a scientific school or discipline within which theories, laws, and generalizations and the experiments performed in support of them are formulated” (Merriam-Webster Dictionary, 2007) • “the set of common beliefs and agreements shared between scientists about how problems should be understood and addressed” (Kuhn, 1962)
    4. 4. • Ontology: ways of constructing reality, “how things really are” and “how things really work” (Denzin & Lincoln, 1998, p. 201) • Epistemology: the forms of knowledge of that reality; what is the nature of the relationship between the inquirer and the inquired? How do we know? • Methodology: what tools do we use to know that reality?
    5. 5. Research Paradigm
    6. 6. Research Paradigms • Positivism (quantitative) ~ discovery of the laws that govern behavior • Constructivist (qualitative) ~ understandings from an insider perspective • Critical (postmodern) ~ investigate and expose power relationships • Pragmatic ~ interventions, interactions and their effects in multiple contexts
    7. 7. Paradigm 1: Positivism – Quantitative Research • Ontology: there is an objective reality and we can come to understand it through the laws by which it is governed. • Epistemology: employs a scientific discourse derived from the epistemologies of positivism and realism. • Method: experimental, deductive.
    8. 8. • “those who are seeking the strict way of truth should not trouble themselves about any object concerning which they cannot have a certainty equal to arithmetic or geometrical demonstration” – (Rene Descartes) • Inordinate support and faith in randomized controlled studies
    9. 9. Typical Positivist Research Questions • What? • How much? • What is the relationship between? • What causes this effect? Such questions are best answered with numerical precision and are often formulated as hypotheses.
    10. 10. • Reliability: the same results at different times and with different researchers • Validity: the results accurately measure what they claim to measure and answer the research questions • “Without reliability, there is no validity.” • Can you think of a positivist measurement that is reliable, but not valid?
    11. 11. Examples: Positivist 1 – Community of Inquiry – Content Analysis • Garrison, Anderson & Archer, 1997-2003 – http://communitiesofinquiry.com – 9 papers reviewing results, focusing on reliable, quantitative analysis – Identified ways to measure teaching, social and cognitive ‘presence’ – The most reliable methods are beyond the current time constraints of busy teachers – Questions of validity – Serves as basic research grounding for AI methods and major survey work – Serves as a qualitative heuristic for teachers and course designers
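The “reliable, quantitative analysis” in transcript-coding studies like the Community of Inquiry work is usually reported as inter-rater agreement. Below is a minimal sketch, assuming two hypothetical coders each assign one of the three presences to the same set of messages; the formula is standard Cohen's kappa, but the coder labels and toy data are invented for illustration and are not taken from the CoI studies.

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders assigning one category per unit."""
    n = len(coder_a)
    # Observed agreement: proportion of units both coders labelled identically.
    p_observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance agreement, from each coder's marginal category frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_chance = sum(freq_a[c] * freq_b[c] for c in set(coder_a) | set(coder_b)) / n ** 2
    return (p_observed - p_chance) / (1 - p_chance)

# Toy example: two coders label ten messages as teaching (T), social (S) or cognitive (C) presence.
coder_1 = ["T", "S", "S", "C", "T", "C", "S", "T", "C", "S"]
coder_2 = ["T", "S", "C", "C", "T", "C", "S", "T", "S", "S"]
print(f"kappa = {cohens_kappa(coder_1, coder_2):.2f}")  # 0.70 for this toy data
```

Values around 0.7 and above are commonly treated as acceptable agreement in this literature, which hints at why the slide calls the most reliable coding methods too time-consuming for busy teachers: reaching that level typically requires trained coders and a negotiated coding scheme.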
    12. 12. Quantitative – Meta-Analysis • Aggregates many effect sizes, creating large Ns and more powerful results • Ungerleider and Burns (2003): systematic review of the effectiveness and efficiency of online versus face-to-face education • The interventions studied were extraordinarily diverse; the only criterion was the presence of a comparison group • “Only 10 of the 25 studies included in the in-depth review were not seriously flawed, a sobering statistic given the constraints that went into selecting them for the review.”
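To make “aggregating effect sizes” concrete, here is a minimal fixed-effect sketch: each study's standardized mean difference is weighted by the inverse of its variance, and a Q statistic summarizes heterogeneity. The formulas are the standard inverse-variance ones; the three studies and their numbers are invented for illustration and are not drawn from Ungerleider and Burns.

```python
import math

def fixed_effect_meta(effects, variances):
    """Inverse-variance weighted mean effect size, its standard error, and heterogeneity Q."""
    weights = [1 / v for v in variances]
    pooled = sum(w * g for w, g in zip(weights, effects)) / sum(weights)
    se_pooled = math.sqrt(1 / sum(weights))
    # Q: weighted sum of squared deviations from the pooled effect (df = k - 1).
    q = sum(w * (g - pooled) ** 2 for w, g in zip(weights, effects))
    return pooled, se_pooled, q

# Invented example: three DE-versus-classroom comparisons with effect sizes g and their variances.
g_values = [0.30, 0.10, 0.45]
variances = [0.02, 0.05, 0.04]
pooled, se, q = fixed_effect_meta(g_values, variances)
print(f"pooled g = {pooled:.3f}, SE = {se:.3f}, Q = {q:.2f} on {len(g_values) - 1} df")
```

The pooled estimate inherits whatever flaws the primary studies carry, which is the point of the “only 10 of the 25 studies … were not seriously flawed” caveat: a precise average of weak comparisons is still a weak basis for practice.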
    13. 13. Achievement in Online versus Classroom
    14. 14. Is DE Better than Classroom Instruction? Project 1: 2000-2004 • Question: How does distance education compare to classroom instruction? (inclusive dates 1985-2002) • Total number of effect sizes: k = 232 • Measures: achievement, attitudes and retention (the opposite of drop-out) • Divided into asynchronous and synchronous DE. Bernard, R. M., Abrami, P. C., Lou, Y., Borokhovski, E., Wade, A., Wozney, L., Wallet, P. A., Fiset, M., & Huang, B. (2004). How does distance education compare to classroom instruction? A meta-analysis of the empirical literature. Review of Educational Research, 74(3), 379-439.
    15. 15. Equivalency: Are all types of Interaction necessary? Anderson, 2003 IRRODL
    16. 16. Anderson’s Equivalency Theorem (2003) Moore’s (1989) distinctions: • Three types of interaction: o student-student interaction o student-teacher interaction o student-content interaction. Anderson’s (2003) hypotheses: • High levels of any one of the three interactions will produce a satisfying educational experience • Increasing satisfaction through teacher and learner interaction may not be as time- or cost-effective as student-content interactive learning sequences
    17. 17. Do the three types of interaction differ? (Moore’s distinctions) Achievement and attitude outcomes by interaction category:
        Student-Student: achievement k = 10, g+ adj. = 0.342; attitudes k = 6, g+ adj. = 0.358
        Student-Teacher: achievement k = 44, g+ adj. = 0.254; attitudes k = 30, g+ adj. = 0.052
        Student-Content: achievement k = 20, g+ adj. = 0.339; attitudes k = 8, g+ adj. = 0.136
        Total: achievement k = 74, g+ adj. = 0.291; attitudes k = 44, g+ adj. = 0.090
        Between-class Q: achievement 2.437; attitudes 6.892*
        Moore’s distinctions seem to apply for achievement (equal importance), but not for attitudes (however, samples are low for student-student and student-content).
    18. 18. Does strengthening interaction improve achievement and attitudes? (Anderson’s hypotheses) Achievement and attitude outcomes by interaction strength:
        Achievement: low strength k = 30, g+ adj. = 0.163 (SE 0.043); medium k = 29, g+ adj. = 0.418 (SE 0.044); high k = 15, g+ adj. = 0.305 (SE 0.062); total k = 74, g+ adj. = 0.291 (SE 0.027); between-class Q = 17.582*
        Attitudes: low strength k = 21, g+ adj. = 0.071 (SE 0.042); medium k = 18, g+ adj. = 0.170 (SE 0.043); high k = 5, g+ adj. = -0.173 (SE 0.091); total k = 44, g+ adj. = 0.090 (SE 0.029); between-class Q = 12.060*
        Anderson’s first hypothesis, about achievement, appears to be supported. The second hypothesis, about satisfaction (attitude), appears to be supported only to an extent (there are only 5 studies in the high-strength category).
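The asterisked “between-class” values in the two tables above are heterogeneity statistics comparing subgroups of studies. As a rough check on how such a value arises, here is a minimal sketch of the standard fixed-effect subgroup (analog-to-ANOVA) Q, applied to the achievement figures in the interaction-strength table; this is an illustrative reconstruction, not code from Bernard et al.

```python
def q_between(subgroup_means, subgroup_ses):
    """Between-class heterogeneity Q for fixed-effect subgroups (df = number of subgroups - 1)."""
    weights = [1 / se ** 2 for se in subgroup_ses]
    pooled = sum(w * m for w, m in zip(weights, subgroup_means)) / sum(weights)
    return sum(w * (m - pooled) ** 2 for w, m in zip(weights, subgroup_means))

# Achievement subgroups from the interaction-strength table: low, medium and high.
means = [0.163, 0.418, 0.305]
ses = [0.043, 0.044, 0.062]
print(f"Q_between = {q_between(means, ses):.2f}")
# Roughly 17.2 with the rounded table values, close to the reported 17.582;
# the gap comes from rounding in the tabled means and standard errors.
```

A significant Q_between (the asterisk) indicates that the subgroup means differ by more than sampling error alone would explain, which is what supports the slide's reading that strengthening interaction is associated with better achievement.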
    19. 19. Bernard, Abrami, Borokhovski, Wade, Tamim & Surkes (2009). Examining Three Forms of Interaction in Distance Education: A Meta-Analysis of Between-DE Studies. Review of Research in Education.
    20. 20. Quantitative Research Summary • Can be useful, especially when fine-tuning well-established practice • Provides incremental gains in knowledge, not revolutionary ones • The need to “control” context often makes results of little value to practicing professionals • In times of rapid change, premature quantitative testing may mask beneficial capacity • Will we ever be able to afford blind-reviewed, random-assignment studies?
    21. 21. Paradigm 2: Interpretivist or Constructivist Paradigm • Many different varieties • Generally answers the question ‘why’ rather than ‘what’, ‘when’ or ‘how much’ • Presents special challenges in distributed contexts due to the distance between participants and researchers • Currently the most common type of DE research (Rourke & Szabo, 2002)
    22. 22. Interpretivist Paradigm • Ontology: the world and knowledge are created by social and contextual understanding. • Epistemology: how do we come to understand a unique person’s worldview? • Methodology: qualitative methods – narrative, interviews, observations, ethnography, case study, phenomenology, etc.
    23. 23. Dora Maar by Picasso
    24. 24. Picasso: Mother with Dead Child II, Postscript to Guernica
    25. 25. A phenomenological viewpoint diagram by Martin Parker
    26. 26. Typical Interpretive Research Questions • Why? • How does the subject understand? • What is the “lived experience”? • What meaning does the artifact or intervention have?
    27. 27. Qualitative Example – Dearnley (2003) Student support in open learning: Sustaining the Process – Practicing Nurses, weekly F2F tutorial sessions – Phenomenological study using grounded theory discourse
    28. 28. Core category to emerge was “Finding the professional voice” Dearnley and Matthew (2003 and 2004)
    29. 29. Qualitative example 2 • Mann, S. (2003). A personal inquiry into an experience of adult learning on-line. Instructional Science, 31 • Conclusions: – The need to facilitate the presentation of learner and teacher identities in a way that takes account of the loss of the normal channels – The need to make explicit the development of operating norms and conventions – With reduced communicative media there is the potential for greater misunderstanding – The need to consider ways in which the developing learning community can be open to the other of uncertainty, ambiguity and difference
    30. 30. 3rd Paradigm: Critical Research • Asks: who gains in power? • David Noble’s critique of ‘digital diploma mills’ is the most prominent Canadian example • Are profits generated from user-generated content exploitative? • Confronting the “net changes everything” mantra of many social software proponents • Who is being excluded from social software? • Are MOOCs really free?
    31. 31. Critical Research Paradigm • Ontology: reality exists and has been created by directed social bias. • Epistemology: understand the view of the oppressed by uncovering the “contradictory conditions of action which are hidden or distorted by everyday understanding” (Comstock) and work to help change social conditions. • Methodology: critical analysis, historical review, participation in programs of action
    32. 32. Typical Critical Paradigm Questions • How can this injustice be rectified? • Can the exploited be helped to understand the oppression that undermines them? • Who benefits from or exploits the current situation?
    33. 33. See Norm Friesen’s book: Friesen, N. (2009). Re-thinking e-learning research: Foundations, methods, and practices. Peter Lang.
    34. 34. Sample Critical Questions • Why does Facebook own all the content that we supply? • Does the power of the net further marginalize the non-connected? • Who benefits from voluntary disclosure? • Why did the One Laptop Per Child project fail? • Does learning analytics exploit student vulnerabilities and the right to privacy?
    35. 35. Does Positivist, Interpretive or Critical Research Meet the Real Needs of Practicing Educators?
    36. 36. But what type of research has most effect on practice? – Kennedy (1999): teachers rated the relevance and value of results from each of the major paradigms – No consistent results; teachers are not a homogeneous group of consumers, but they do find research of value – “The studies that teachers found to be most persuasive, most relevant, and most influential to their thinking were all studies that addressed the relationship between teaching and learning.”
    37. 37. But what type of research has most effect on Practice? – “The findings from this study cast doubt on virtually every argument for the superiority of any particular research genre, whether the criterion for superiority is persuasiveness, relevance, or ability to influence practitioners’ thinking.” Kennedy, (1999)
    38. 38. Paradigm #4 Pragmatism • “To a pragmatist, the mandate of science is not to find truth or reality, the existence of which are perpetually in dispute, but to facilitate human problem-solving” (Powell, 2001, p. 884).
    39. 39. Pragmatic Paradigm • Developed out of frustration with the lack of impact of educational research on educational systems • Key features: – an intervention – empirical research in a natural context – partnership between researchers and practitioners – development of theory and ‘design principles’
    40. 40. Pragmatic Paradigm • Ontology: Reality is the practical effects of ideas. • Epistemology: Any way of thinking/doing that leads to pragmatic solutions is useful. • Methodology: Mixed Methods, design-based research, action research
    41. 41. Typical Pragmatic Research Questions • What can be done to increase the literacy of adult learners? • Can collaborative learning online increase student satisfaction and completion rates? • Do blog activities increase student satisfaction and learning outcomes? • How can we encourage teachers to use more Web 2.0 tools in their classrooms?
    42. 42. Design Tradition • “Learning and productivity are the results of the designs (the structures) of complex systems of people, environments, technology, beliefs and texts” (New London Group, 2000) • Design-based research opens the door for teachers, researchers and learners to become designers, not merely consumers, bosses or …
    43. 43. 4th Pragmatic Paradigm Design Based Research Method • Related to engineering and architectural research • Focuses on the design, construction, implementation and adoption of a learning initiative in an authentic context • Related to ‘Development Research’ • Closest educators have to a “home grown” research methodology
    44. 44. Design-Based Research Studies – iterative, – process focused, – interventionist, – collaborative, – multileveled, – utility oriented, – theory driven and generative • (Shavelson et al., 2003)
    45. 45. Critical characteristics of design experiments • According to Reeves (2000:8), Ann Brown (1992) and Alan Collins (1992): – addressing complex problems in real contexts in collaboration with practitioners, – integrating known and hypothetical design principles with technological affordances to render plausible solutions to these complex problems, and – conducting rigorous and reflective inquiry to test and refine innovative learning environments as well as to define new design-principles.
    46. 46. Integrative Learning Design (Bannan-Ritland, 2003)
    47. 47. • “design-based research enables the creation and study of learning conditions that are presumed productive but are not well understood in practice, and the generation of findings often overlooked or obscured when focusing exclusively on the summative effects of an intervention” Wang & Hannafin, 2003
    48. 48. • Iterative because “innovation is not restricted to the prior design of an artifact, but continues as artifacts are implemented and used” and implementations are “inevitably unfinished” (Stewart and Williams, 2005) • Intertwined goals of (1) designing learning environments and (2) developing theories of learning (DBRC, 2003)
    49. 49. Amiel, T., & Reeves, T. C. (2008).
    50. 50. Design-Based Research and the Science of Complexity • Complexity theory studies the emergence of order in multifaceted, changing and previously unordered contexts • This emerging order becomes the focus of iterative interventions and evaluations • Order emerges at the “edge of chaos” in response to rapid change and the failure of previous organizational models
    51. 51. DBR Examples: Call Centres at Athabasca • Answer 80% of student inquiries • Savings of over $100,000/year. Anderson, T. (2005). Design-based research and its application to a call center innovation in distance education. Canadian Journal of Learning and Technology, 31(2), 69-84.
    52. 52. • Need to study usability, scalability and innovation adoption within bureaucratic systems • Allow knowledge tools to evolve in a natural context through supportive nourishment of staff. Conducting Educational Design Research by Susan McKenney and Thomas C. Reeves
    53. 53. Summary
        Positivism – Ontology: hidden rules govern the teaching and learning process. Epistemology: focus on reliable and valid tools to uncover those rules. Question: what works? Method: quantitative.
        Interpretive/constructivist – Ontology: reality is created by individuals in groups. Epistemology: discover the underlying meaning of events and activities. Question: why do you act this way? Method: qualitative.
        Critical – Ontology: society is rife with inequalities and injustice. Epistemology: help uncover injustice and empower citizens. Question: how can I change this situation? Method: ideological review, civil actions.
        Pragmatic – Ontology: truth is what is useful. Epistemology: the best method is one that solves problems. Question: will this intervention improve learning? Method: mixed methods, design-based research.
    54. 54. Summary • 4 educational research paradigms • Choice of paradigm based on: – personal views – research questions – access, support and resources – supervisor(s)’ attitudes! • There is no single “best way” to do research • Arguing paradigm perspectives is not productive
    55. 55. Questions and Comments??
