Research Paradigms: Ontologies, Epistemologies & Methods
Terry Anderson
PhD Seminar
Research Paradigms
Paradigm
• “a philosophical and theoretical framework of a
scientific school or discipline within which theories,
laws, and generalizations and the experiments
performed in support of them are formulated”
(Merriam-Webster Dictionary, 2007)
• “the set of common beliefs and agreements shared
between scientists about how problems should be
understood and addressed” (Kuhn, 1962)
• Ontology: ways of constructing reality, “how things really are” and “how things
really work” (Denzin & Lincoln, 1998, p. 201)
• Epistemology: the forms of knowledge of that reality – what is the nature of the
relationship between the inquirer and the inquired? How do we know?
• Methodology: What tools do we use to know that reality?
Research Paradigms
• Positivism - Quantitative ~ discovery of the laws that govern behavior
• Constructivist - Qualitative ~ understandings from an insider perspective
• Critical - Postmodern ~ investigate and expose the power relationships
• Pragmatic ~ interventions, interactions and their effect in multiple contexts
Paradigm 1
Positivism - Quantitative Research
• Ontology: There is an objective reality and we
can understand it through the laws by which it
is governed.
• Epistemology: employs a scientific discourse
derived from the epistemologies of positivism
and realism.
• Method: Experimental, deductive
• “those who are seeking the strict
way of truth should not trouble
themselves about any object
concerning which they cannot have
a certainty equal to arithmetic or
geometrical demonstration”
– René Descartes
• Inordinate support and faith in
randomized controlled studies
Typical Positivist Research Questions
• What?
• How much?
• What is the relationship between variables? What causes this effect?
• Best answered with numerical precision
• Often formulated as hypotheses
• Reliability: Same results at different times,
from different researchers
• Validity: results accurately measure what they
are intended to measure and answer the research questions.
• “Without reliability, there is no validity.”
• Can you think of a positivist measurement
that is reliable, but not valid?
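One hedged numeric illustration (a possible answer, not from the slides): a consistently miscalibrated instrument is reliable but not valid. A minimal Python sketch with simulated readings:

```python
import random

random.seed(1)
true_score = 100.0
# A biased but precise instrument: readings cluster tightly (reliable)
# around a value 15 units above the truth (not valid).
readings = [true_score + 15 + random.gauss(0, 0.5) for _ in range(10)]
mean = sum(readings) / len(readings)
spread = (sum((r - mean) ** 2 for r in readings) / len(readings)) ** 0.5
print(f"mean = {mean:.1f} (truth is {true_score}), spread = {spread:.2f}")
# Tiny spread => consistent (reliable); mean far from truth => not valid
```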
Positivist Example 1 –
Community of Inquiry: Content Analysis
• Garrison, Anderson, Archer 1997-2003
– http://communitiesofinquiry.com - 9 papers reviewing results
focusing on reliable, quantitative analysis (see the reliability sketch after this list)
– Identified ways to measure teaching, social and cognitive
‘presence’
– Most reliable methods are beyond current time constraints of
busy teachers
– Questions of validity
– Serves as basic research grounding for AI methods and major
survey work.
– Serves as qualitative heuristic for teachers and course designers
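The CoI project’s emphasis on reliable coding is typically checked with inter-rater agreement statistics. Below is a minimal, hedged sketch of Cohen’s kappa, a statistic commonly used for such transcript coding; the coders’ labels here are hypothetical, not the project’s data:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Chance-corrected agreement between two coders' category labels."""
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance agreement from each coder's marginal category frequencies
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical presence codes for eight conference messages
a = ["social", "cognitive", "teaching", "cognitive", "social", "social", "teaching", "cognitive"]
b = ["social", "cognitive", "teaching", "social", "social", "social", "teaching", "cognitive"]
print(round(cohens_kappa(a, b), 2))  # 0.81 here; values near 0.80 are usually read as strong agreement
```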
Quantitative – Meta-Analysis
• Aggregates many effect sizes, creating large Ns and more
powerful results (see the pooling sketch below)
• Ungerleider and Burns (2003)
• Systematic review of the effectiveness and efficiency of
online education versus face-to-face
• The types of interventions studied were extraordinarily
diverse – the only criterion was a comparison group
• “Only 10 of the 25 studies included in the in-depth
review were not seriously flawed, a sobering statistic
given the constraints that went into selecting them for
the review.”
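To make “aggregates many effect sizes” concrete, here is a minimal fixed-effect pooling sketch: each study is weighted by the inverse of its variance, so precise studies count more. The inputs are hypothetical:

```python
def fixed_effect_pool(effects, std_errors):
    """Inverse-variance weighted mean effect size and its pooled standard error."""
    weights = [1 / se ** 2 for se in std_errors]
    g_plus = sum(w * g for w, g in zip(weights, effects)) / sum(weights)
    pooled_se = (1 / sum(weights)) ** 0.5
    return g_plus, pooled_se

# Three hypothetical comparison studies: effect size g and its standard error
g_plus, se = fixed_effect_pool([0.34, 0.25, 0.33], [0.10, 0.05, 0.08])
print(f"g+ = {g_plus:.3f}, SE = {se:.3f}")  # pooled SE is smaller than any single study's
```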
Achievement in Online versus Classroom
Is DE Better than Classroom Instruction?
Project 1: 2000 – 2004
• Question: How does distance education compare
to classroom instruction? (inclusive dates 1985-2002)
• Total number of effect sizes: k = 232
• Measures: Achievement, Attitudes and Retention
(opposite of drop-out)
• Divided into Asynchronous and Synchronous DE
Bernard, R. M., Abrami, P. C., Lou, Y., Borokhovski, E., Wade, A., Wozney, L.,
Wallet, P. A., Fiset, M., & Huang, B. (2004). How does distance education
compare to classroom instruction? A meta-analysis of the empirical literature.
Review of Educational Research, 74(3), 379-439.
Equivalency: Are all types of interaction necessary?
(Anderson, 2003, IRRODL)
Anderson’s Equivalency Theorem (2003)
Moore’s (1989) distinctions:
• Three types of interaction:
– student-student interaction
– student-teacher interaction
– student-content interaction
Anderson’s (2003) hypotheses state:
• High levels of any one of the three types of interaction will produce a
satisfying educational experience
• Increasing satisfaction through teacher-learner interaction may not be as
time- or cost-effective as student-content interactive learning sequences
Do the three types of interaction
differ? Moore’s distinctions
Achievement and Attitude Outcomes

                        Achievement        Attitudes
Interaction Category      k    g+ adj.      k    g+ adj.
Student-Student          10    0.342        6    0.358
Student-Teacher          44    0.254       30    0.052
Student-Content          20    0.339        8    0.136
Total                    74    0.291       44    0.090
Between-class (Q)              2.437             6.892*

Moore’s distinctions seem to apply for achievement (roughly equal importance), but not for
attitudes (however, samples are low for student-student and student-content)
Does strengthening interaction improve achievement
and attitudes? Anderson’s hypotheses
Anderson’s first hypothesis, about achievement, appears to be supported.
Anderson’s second hypothesis, about satisfaction (attitude), appears to be
supported, but only to an extent (only 5 studies in the High category).
Achievement and Attitude Outcomes

                         Achievement               Attitudes
Interaction Strength      k    g+ adj.    SE        k    g+ adj.    SE
Low Strength             30    0.163    0.043      21    0.071    0.042
Med Strength             29    0.418    0.044      18    0.170    0.043
High Strength            15    0.305    0.062       5   -0.173    0.091
Total                    74    0.291    0.027      44    0.090    0.029
Between-class (Q)             17.582*                   12.060*
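The asterisked Q values test whether subgroup mean effects differ by more than sampling error would predict. A minimal sketch of the between-class Q statistic; fed the achievement rows above, it roughly reproduces the published 17.582 (the small gap reflects rounding of g+ and SE in the table):

```python
def q_between(subgroup_g, subgroup_se):
    """Between-class heterogeneity Q for subgroup mean effect sizes."""
    w = [1 / se ** 2 for se in subgroup_se]  # inverse-variance weights
    g_bar = sum(wi * gi for wi, gi in zip(w, subgroup_g)) / sum(w)
    return sum(wi * (gi - g_bar) ** 2 for wi, gi in zip(w, subgroup_g))

# Achievement rows (Low/Med/High interaction strength) from the table above
q = q_between([0.163, 0.418, 0.305], [0.043, 0.044, 0.062])
print(round(q, 2))  # ~17.24, close to the table's 17.582*; significant at df = 2
```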
Bernard, Abrami, Borokhovski, Wade, Tamim,
& Surkes (2009). Examining three forms of
interaction in distance education: A meta-
analysis of between-DE studies. Review of
Research in Education
Quantitative Research Summary
• Can be useful, especially when fine-tuning well-
established practice
• Provides incremental gains in knowledge, not
revolutionary ones
• The need to “control” context often makes results of
little value to practicing professionals
• In times of rapid change, premature quantitative
testing may mask beneficial capacity
• Will we ever be able to afford blind-reviewed,
random-assignment studies?
Paradigm 2
Interpretivist or Constructivist Paradigm
• Many different varieties
• Generally answer the question ‘why’ rather
than ‘what’, ‘when’ or ‘how much’
• Presents special challenges in distributed
contexts due to distance between participants
and researchers
• Currently most common type of DE research
(Rourke & Szabo, 2002)
Interpretivist Paradigm
• Ontology: World and knowledge are created
by social and contextual understanding.
• Epistemology: How do we come to
understand a unique person’s worldview?
• Methodology: Qualitative methods –
narrative, interviews, observations,
ethnography, case study, phenomenology etc.
Dora Maar by Picasso
Picasso: Mother with Dead Child II,
Postscript to Guernica
A phenomenological viewpoint diagram by Martin Parker
Typical Interpretive Research Questions
• Why?
• How does the subject understand?
• What is the “lived experience”?
• What meaning does the artifact or
intervention have?
Qualitative Example
– Dearnley (2003), Student support in
open learning: Sustaining the process
– Practicing nurses, weekly F2F tutorial
sessions
– Phenomenological study using
grounded theory discourse
The core category to emerge was “Finding the
professional voice”
Dearnley and Matthew (2003, 2004)
Qualitative example 2
• Mann, S. (2003) A personal inquiry into an experience of
adult learning on-line. Instructional Science 31
• Conclusions:
– The need to facilitate the presentation of learner and teacher
identities in such a way that takes account of the loss of the normal
channel
– The need to make explicit the development of operating norms and
conventions
– With reduced communicative media there is the potential for greater
misunderstanding
– The need to consider ways in which the developing learning
community can be open to the other of uncertainty, ambiguity and
difference
3rd Paradigm
Critical Research
• Asks who gains in power?
• David Noble’s critique of ‘digital diploma mills’ is the most
prominent Canadian example
• Are profits generated from user generated content
exploitative?
• Confronting the “net changes everything” mantra of
many social software proponents.
• Who is being excluded from social software?
• Are MOOCs really free?
Critical Research Paradigm
• Ontology: Reality exists and has been created
by directed social bias.
• Epistemology: Understand the view of the
oppressed by uncovering the “contradictory
conditions of action which are hidden or
distorted by everyday understanding” (Comstock)
and work to help change social conditions
• Methodology: Critical analysis, historical review,
participation in programs of action
Typical Critical Paradigm Questions
• How can this injustice be rectified?
• Can the exploited be helped to understand
the oppression that undermines them?
• Who benefits from or exploits the current
situation?
See Friesen, N. (2009). Re-thinking e-learning
research: Foundations, methods, and
practices. Peter Lang.
Sample Critical Questions
• Why does Facebook own all the content that
we supply?
• Does the power of the net further marginalize
the non-connected?
• Who benefits from voluntary disclosure?
• Why did the One Laptop Per Child fail?
• Does learning analytics exploit student
vulnerabilities and the right to privacy?
Do Positivist, Interpretive or Critical
Research Meet the Real Needs of
Practicing Educators?
But what type of research has most
effect on practice?
– Kennedy (1999) had teachers rate the relevance
and value of results from each of the major
paradigms.
– No consistent results – teachers are not a
homogeneous group of consumers, but they
do find research of value
– “The studies that teachers found to be most
persuasive, most relevant, and most
influential to their thinking were all studies
that addressed the relationship between
teaching and learning.”
But what type of research has most
effect on practice?
– “The findings from this study cast doubt on
virtually every argument for the superiority
of any particular research genre, whether the
criterion for superiority is persuasiveness,
relevance, or ability to influence practitioners’
thinking.” Kennedy, (1999)
Paradigm #4
Pragmatism
• “To a pragmatist, the mandate of science is
not to find truth or reality, the existence of
which are perpetually in dispute, but to
facilitate human problem-solving” (Powell,
2001, p. 884).
Pragmatic Paradigm
• Developed from frustration with the lack of
impact of educational research on educational
systems.
• Key features:
– An intervention
– Empirical research in a natural context
– Partnership between researchers and practitioners
– Development of theory and ‘design principles’
Pragmatic Paradigm
• Ontology: Reality is the practical effects of
ideas.
• Epistemology: Any way of thinking/doing that
leads to pragmatic solutions is useful.
• Methodology: Mixed Methods, design-based
research, action research
Typical Pragmatic Research Questions
• What can be done to increase literacy of adult
learners?
• Can collaborative learning online increase
student satisfaction and completion rates?
• Do blog activities increase student satisfaction
and learning outcomes?
• How can we encourage teachers to use more
Web 2.0 tools in their classrooms?
Design Tradition
• “Learning and productivity are the results of
the designs (the structures) of complex
systems of people, environments, technology,
beliefs and texts” (New London Group, 2000)
• Design-based research opens the door
for teachers, researchers and learners
to become designers, not merely
consumers, bosses or observers.
4th Pragmatic Paradigm
Design-Based Research Method
• Related to engineering and architectural
research
• Focuses on the design, construction,
implementation and adoption of a learning
initiative in an authentic context
• Related to ‘Development Research’
• The closest thing educators have to a “home grown”
research methodology
Design-Based Research Studies
–iterative,
–process focused,
–interventionist,
–collaborative,
–multileveled,
–utility oriented,
–theory driven and generative
• (Shavelson et al., 2003)
Critical characteristics of
design experiments
• According to Reeves (2000:8), Ann Brown (1992)
and Alan Collins (1992):
– addressing complex problems in real contexts in
collaboration with practitioners,
– integrating known and hypothetical design principles
with technological affordances to render plausible
solutions to these complex problems, and
– conducting rigorous and reflective inquiry to test and
refine innovative learning environments as well as to
define new design principles.
Integrative Learning Design
(Bannan-Ritland, 2003)
• “design-based research enables the creation
and study of learning conditions that are
presumed productive but are not well
understood in practice, and the generation of
findings often overlooked or obscured when
focusing exclusively on the summative effects
of an intervention” (Wang & Hannafin, 2003)
• Iterative because:
– “Innovation is not restricted to the prior design of
an artifact, but continues as artifacts are
implemented and used”
– Implementations are “inevitably unfinished”
(Stewart and Williams, 2005)
• Intertwined goals of (1) designing learning
environments and (2) developing theories of
learning (DBRC, 2003)
Amiel, T., & Reeves, T. C. (2008).
Design-Based Research and the Science of
Complexity
• Complexity theory studies the emergence of
order in multifaceted, changing and previously
unordered contexts
• This emerging order becomes the focus of
iterative interventions and evaluations
• Order emerges at the “edge of chaos” in
response to rapid change and the failure of
previous organizational models
DBR Examples
Call Centres at Athabasca:
• Answer 80% of student inquiries
• Savings of over $100,000/year
Anderson, T. (2005). Design-based research and its application to a call center
innovation in distance education. Canadian Journal of Learning and Technology,
31(2), 69-84.
• Need to study usability, scalability and
innovation adoption within bureaucratic
systems
• Allow knowledge tools to evolve in a natural
context through supportive nourishment of
staff
Conducting Educational Design Research by
Susan McKenney and Thomas C. Reeves
Summary: Paradigm, Ontology, Epistemology, Question, Method

Positivism
• Ontology: Hidden rules govern the teaching and learning process
• Epistemology: Focus on reliable and valid tools to uncover the rules
• Typical question: What works?
• Method: Quantitative

Interpretive/Constructivist
• Ontology: Reality is created by individuals in groups
• Epistemology: Discover the underlying meaning of events and activities
• Typical question: Why do you act this way?
• Method: Qualitative

Critical
• Ontology: Society is rife with inequalities and injustice
• Epistemology: Help uncover injustice and empower citizens
• Typical question: How can I change this situation?
• Method: Ideological review, civil actions

Pragmatic
• Ontology: Truth is what is useful
• Epistemology: The best method is one that solves problems
• Typical question: Will this intervention improve learning?
• Method: Mixed methods, design-based research
Summary
• 4 educational research paradigms
• Choice for research based on
– Personal views
– Research questions
– Access, support and resources
– Supervisor(s)’ attitudes!
• There is no single, “best way” to do research
• Arguing paradigm perspectives is not productive
Questions and Comments??
Editor's Notes

• #7 (Positivism slide): Evidence-based practice was developed at McMaster – the group of clinical epidemiologists who developed evidence-based decision-making worked at McMaster University in Canada (Sackett et al., 1985).
• #20 (Quantitative Research Summary slide): But what if the results had shown very significant results in favor of either mode of delivery? Would they have informed our practice? I think the answer would be a resounding “Not very likely”. The meta-analysis tells us nothing about the critical context in which the learning took place. What learner support services were in place? What was the quality of the teaching or of the content? What was the condition of the home study or the class environment? The list of contextual factors goes on and on. Thus, one can conclude that this gold standard – the use of randomly assigned comparison-group research and subsequent meta-analysis – is of only limited use to practicing distance educators. These results may be useful in persuading reluctant colleagues or funders about the efficacy of distance education, but they tell us little that will help us to improve our practice.