Research Methods in Distance Education: Design Based Research Terry Anderson PhD Seminar  Nova University, Lisbon March 2011
Tuesday’s Agenda: Lecture – Research methods in DE; Design-Based Research; Break; Design-Based Research in Action – Athabasca’s Elgg Project
The context of Distance Education implementation: Disruptive innovation (Christensen, 2008) – simpler, not wanted by mainstream customers; rapid gains in functionality; cheaper; adaptive; moving from peripheral to mainstream (blended and online for full-time students)
Good Research Good Theory (see  http://www.learning-theories.com ) Good Question(s) Good methodology Brave conclusions Important applications Spans multiple iterations
Because e-learning tools and contexts are so emergent, there is a need for:
Demographic studies – who is using what for doing what? How are they using multi-use tools?
Visualization of activity and relationships
Network analysis – e.g. SNAPP
Connection between context and type of use
Data mining/analytics
Sharing and making visible research and practitioner results
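Tools such as SNAPP derive these network views from forum reply data. A minimal sketch of the underlying idea – participants ranked by connectedness in a reply network – using invented names and data, not SNAPP’s actual input format:

```python
from collections import defaultdict

# Hypothetical forum reply log: (author, replied_to) pairs
replies = [("ana", "ben"), ("ben", "ana"), ("carla", "ana"),
           ("ana", "carla"), ("dan", "ana"), ("ben", "carla")]

degree = defaultdict(int)
for author, target in replies:
    degree[author] += 1   # out-degree: replies the participant wrote
    degree[target] += 1   # in-degree: replies the participant received

# Rank participants by overall connectedness (total degree)
ranking = sorted(degree.items(), key=lambda kv: kv[1], reverse=True)
print(ranking)
```

Even this crude total-degree count surfaces central and peripheral participants; real SNA tools add directionality, weighting and visualization on top of the same reply-pair data.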
Research Paradigms
Research Paradigms
Quantitative ~ discovery of the laws that govern behavior
Qualitative ~ understandings from an insider perspective
Critical ~ investigating and exposing power relationships
Design-based ~ interventions, interactions and their effects in multiple contexts
Paradigm 1 Quantitative Research employs a scientific discourse derived from the epistemologies of positivism and realism.
“Those who are seeking the strict way of truth should not trouble themselves about any object concerning which they cannot have a certainty equal to arithmetic or geometrical demonstration” (René Descartes). Inordinate support and faith in randomized controlled studies.
Quantitative 1 – CMC Content Analysis. Anderson, Garrison, Rourke, 1997–2003; http://communitiesofinquiry.com – 9 papers reviewing results, focusing on reliable, quantitative analysis. Identified ways to measure teaching, social and cognitive ‘presence’. The most reliable methods are beyond the current time constraints of busy teachers, and questions of validity remain. Serves as basic research grounding for AI methods and major survey work of the future, and as a qualitative heuristic for teachers and course designers.
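Reliability in this kind of content analysis is usually reported as inter-rater agreement between coders. A minimal sketch – the messages, labels and the choice of Cohen’s kappa as the statistic are illustrative assumptions, not the coding scheme of the studies above:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Chance-corrected agreement between two coders' category labels."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected chance agreement from each coder's marginal label frequencies
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Two coders label the same 8 forum messages by type of 'presence'
a = ["social", "cognitive", "teaching", "cognitive", "social", "teaching", "cognitive", "social"]
b = ["social", "cognitive", "teaching", "social", "social", "teaching", "cognitive", "social"]
print(round(cohens_kappa(a, b), 2))  # prints 0.81
```

Correcting for chance matters: here raw agreement is 7/8 = 0.875, but kappa is lower because some agreement is expected from the label frequencies alone.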
Quantitative – Meta-Analysis: aggregates many effect sizes, creating large Ns and more powerful results. Ungerleider and Burns (2003): systematic review of the effectiveness and efficiency of online education versus face-to-face. The interventions studied were extraordinarily diverse – the only criterion was a comparison group. “Only 10 of the 25 studies included in the in-depth review were not seriously flawed, a sobering statistic given the constraints that went into selecting them for the review.”
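The aggregation step can be illustrated with a fixed-effect, inverse-variance weighted mean of effect sizes – a minimal sketch with invented study values, not data from the reviews cited here:

```python
# Hypothetical effect sizes (g) and their variances from five comparison studies
studies = [(0.10, 0.04), (-0.05, 0.02), (0.20, 0.05), (0.00, 0.03), (0.08, 0.06)]

weights = [1.0 / v for g, v in studies]          # inverse-variance weights:
                                                 # precise studies count for more
g_plus = sum(w * g for (g, v), w in zip(studies, weights)) / sum(weights)
se = (1.0 / sum(weights)) ** 0.5                 # standard error of the pooled effect
print(round(g_plus, 3), round(se, 3))
```

Because the weights sum across studies, the standard error of the pooled g+ shrinks as studies accumulate – the “large N, more powerful results” point above. Random-effects models (used when effects are heterogeneous, as in the tables that follow) add a between-study variance term to each weight.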
 
Is DE Better than Classroom Instruction? Project 1: 2000 – 2004 Question:  How does distance education compare to classroom instruction?  (inclusive dates 1985-2002) Total number of effect sizes:  k = 232 Measures:  Achievement, Attitudes and Retention (opposite of drop-out) Divided into  Asynchronous  and  Synchronous  DE Bernard, R. M., Abrami, P. C., Lou, Y. Borokhovski, E., Wade, A., Wozney, L., Wallet, P.A., Fiset, M., & Huang, B. (2004). How does distance education compare to classroom instruction? A meta-analysis of the empirical literature. Review of Educational Research, 74(3), 379-439.
Primary findings DE and CI are essentially  equal  (g+ ≈ 0.0 to low average effect) on all measures Effect size distributions are heterogeneous; some DE >> CI, some DE << CI Generally poor methodological quality Pedagogical study features account for more variation than media study features (Clark, 1994) Interactive DE an important variable* * Lou, Y., Bernard, R.M., & Abrami, P.C. (2006). Media and pedagogy in undergraduate distance education: A theory-based meta-analysis of empirical literature. Educational Technology Research & Development, 54(2), 141-176.
Summary of results: Achievement Outcomes
Type of DE     k     g+        Sig.
Combined       318   0.013*    p > 0.05
Synchronous    92    –0.102*   p < 0.05
Asynchronous   174   0.053*    p < 0.05
*Significantly heterogeneous average effect
Summary of results: Attitude Outcomes
Type of DE     k     g+        Sig.
Combined       154   –0.081*   p < 0.05
Synchronous    83    –0.185*   p < 0.05
Asynchronous   71    –0.034*   p > 0.05
*Significantly heterogeneous average effect
Summary of results: Retention Outcomes
Type of DE     k     g+        Sig.
Combined       103   –0.057*   p < 0.05
Synchronous    17    0.005     p > 0.05
Asynchronous   53    –0.093*   p < 0.05
*Significantly heterogeneous effect sizes
Equivalency: Are all types of Interaction necessary? Anderson,  2003 IRRODL
Anderson’s Equivalency Theorem (2003). Moore (1989) distinguishes three types of interaction: student–student, student–teacher and student–content. Anderson’s (2003) hypotheses state that: (1) high levels of any one of the three interactions will produce a satisfying educational experience; (2) increasing satisfaction through teacher–learner interaction may not be as time- or cost-effective as student–content interactive learning sequences.
Do the three types of interaction differ? Moore’s distinctions Achievement and Attitude Outcomes Moore’s distinctions seem to apply for achievement (equal importance), but not for attitudes (however, samples are low for SS and SC)
Does strengthening interaction improve achievement and attitudes? Anderson’s hypotheses Anderson’s first hypothesis about achievement appears to be supported Anderson’s second hypothesis about satisfaction (attitude) appears to be supported, but only to an extent (i.e., only 5 studies in High Category) Achievement and Attitude Outcomes
Bernard, Abrami, Borokhovski, Wade, Tamin, & Surkes, (in press). Examining Three Forms of Interaction in Distance Education: A Meta-Analysis of Between-DE Studies.  Review of Research in Education
Because much web 2.0 tech use is so emergent need for:  Demographic studies Who is using what for doing what? How are they using multi-use tools? Visualization of activity and relationships Network analysis Connection between context and type of use Data mining
Quantitative Research Summary: Can be useful, especially when fine-tuning well-established practice. Provides incremental gains in knowledge, not revolutionary ones. The need to “control” context often makes results of little value to practicing professionals. In times of rapid change, premature quantitative testing may mask beneficial capacity. Will we ever be able to afford blind-reviewed, random-assignment studies?
Paradigm 2 – Qualitative. Many different varieties. Generally answers the question ‘why’ rather than ‘what’, ‘when’ or ‘how much’. Presents special challenges in distributed contexts due to the distance between participants and researchers. Currently the most common type of DE research (Rourke & Szabo, 2002).
Qualitative study of Social Software. Critically important: in early stages of adoption; to track effects of user competence and efficacy; as contexts are personalized and tools are appropriated by users for entirely different tasks than those intended by developers.
Qualitative Example Dearnley (2003)  Student support in open learning: Sustaining the Process Practicing Nurses, weekly F2F tutorial sessions Phenomenological study using grounded theory discourse
Core category to emerge was “Finding the professional voice” Dearnley and Matthew (2003 and 2004)
Qualitative example 2: Mann, S. (2003). A personal inquiry into an experience of adult learning on-line. Instructional Science, 31. Conclusions: the need to facilitate the presentation of learner and teacher identities in a way that takes account of the loss of the normal channel; the need to make explicit the development of operating norms and conventions – in reduced communicative media there is the potential for greater misunderstanding; the need to consider ways in which the developing learning community can be open to the other of uncertainty, ambiguity and difference.
3rd Paradigm: Critical Research. Asks who gains in power? David Noble’s critique of ‘digital diploma mills’ is the most prominent Canadian example. Are profits generated from user-generated content exploitative? Confronting the “net changes everything” mantra of many social software proponents. Who is being excluded from social software?
See Friesen, N. (2009). Re-thinking e-learning research: Foundations, methods, and practices. Peter Lang.
Is the extraction of information from the masses exploitative or empowering?
Why does Facebook own all the content that we supply? Does the power of the net further marginalize the non-connected? Who benefits from voluntary disclosure? Why did One Laptop Per Child fail?
Quantitative vs. Qualitative: Paradigm Wars Rekindled. Current research “more resembles the pendulum swings characteristic of art or fashion, rather than the progressive improvements characteristic of science and technology” (Slavin, 2002, p. 16, in Educational Researcher). Slavin’s solution: embrace “evidence-based learning” – projected to increase from 5% to 75% of US government funding by 2007 for “research that addresses causal questions and uses random assignments ….” (Slavin, 2002, p. 15).
Do Either Qualitative or Quantitative Methods Meet Real Needs of Practicing  Distance Educators?
But what type of research has most effect on practice? Kennedy (1999) - teachers rate relevance and value of results from each of major paradigms. No consistent results – teachers are not a homogeneous group of consumers but they do find research of value “ The studies that teachers found to be most persuasive, most relevant, and most influential to their thinking were all studies that addressed the relationship between teaching and learning.”
But what type of research has most effect on Practice? “ The findings from this study cast doubt on virtually every argument for the superiority of any particular research genre, whether the criterion for superiority is persuasiveness, relevance, or ability to influence practitioners’ thinking.” Kennedy, (1999)
4th Paradigm Design-Based Research Related to engineering and architectural research Focuses on the design, construction, implementation and adoption of a learning initiative in an authentic context Related to ‘Development Research’ Closest educators have to a “home grown” research methodology
Design-based research studies are iterative, process-focused, interventionist, collaborative, multileveled, utility-oriented, theory-driven and generative (Shavelson et al., 2003).
Critical characteristics of  design experiments  According to Reeves (2000:8), Ann Brown (1992) and Alan Collins (1992): addressing complex problems in real contexts in collaboration with practitioners, integrating known and hypothetical design-principles with technological affordances to render plausible solutions to these complex problems, and conducting rigorous and reflective inquiry to test and refine innovative learning environments as well as to define new design-principles.
Design-based research: a methodology developed by educators for educators, growing out of American pragmatism – Dewey (Anderson, 2005). Recent theme issues: The Journal of the Learning Sciences (13(1), 2004), Educational Researcher (32(1), 2003) and Educational Psychologist (39(4), 2004). See the bibliography at http://cider.athabascau.ca/CIDERSIGs/DesignBasedSIG/ and my article at www.cjlt.ca/abstracts.html
Integrative Learning Design (Bannan-Ritland, 2003)
“ design-based research enables the creation and study of learning conditions that are presumed productive but are not well understood in practice, and the generation of findings often overlooked or obscured when focusing exclusively on the summative effects of an intervention” Wang & Hannafin, 2003
Iterative because “innovation is not restricted to the prior design of an artifact, but continues as artifacts are implemented and used”; implementations are “inevitably unfinished” (Stewart & Williams, 2005). Intertwined goals of (1) designing learning environments and (2) developing theories of learning (DBRC, 2003).
Amiel, T., & Reeves, T. C. (2008).
Design-Based Research and the Science of Complexity. Complexity theory studies the emergence of order in multifaceted, changing and previously unordered contexts. This emerging order becomes the focus of iterative interventions and evaluations. Order emerges at the “edge of chaos” in response to rapid change and the failure of previous organizational models.
Call Centres At Athabasca: Answer 80% of student inquiries Savings of over $100,000 /year Anderson, T. (2005). Design-based research and its application to a call center innovation in distance education. Canadian Journal of Learning and Technology, 31(2), 69-84
D-B Research examples Design-Based Research Strategies for Studying Situated Learning in a Multi-user Virtual Environment Chris Dede, 2004
Graduate Student Resource Hub in Design Research in Education http://www.lkl.ac.uk/projects/designresearch/
Need to study usability, scalability and innovation adoption within bureaucratic systems Allow knowledge tools to evolve in natural context through supportive nourishment of staff
Conclusion Education research is grossly under-resourced to meet the magnitude of opportunity and demand Paradigm wars are unproductive Design-based research offers a promising new research design model It can be used for Doctoral dissertations see Herrington, J., McKenney, S., Reeves, T., & Oliver, R. (2007).  Design-based research and doctoral students: Guidelines for preparing a dissertation proposal.
Design-Based Research in Action. Phase 1: Exploration – surveys, talking to faculty and tutors, investigating open source tools, setting research questions. Phase 2: Building the intervention – Elgg through two versions and 85 plugins (ongoing).
Phase 3: Evaluation – before-and-after surveys; see Anderson, T., Poelhuber, B., & McKerlich, R. (2010). Self-paced learners meet social software. Online Journal of Distance Learning Administration, 13. http://www.westga.edu/~distance/ojdla/Fall133/anderson_poellhuber_mcKerlich133.html Doctoral students – use of past student archives. Ongoing iterations and development of tools. Phase 4: Testing in multiple contexts; development of design principles/patterns.
Survey Results –Anderson et al 2004 78% indicated they would interact with other students if they were also able to proceed through the course at their own pace .
Survey Results Anderson et al 2004 Only 29% of the student respondents had participated in the optional (credit and non credit) interactive computer conferences
Undergrad Survey, Sept. 2009 – Draft Results. AU Unpaced Learners Social Software Survey (Anderson, Sept. 2009), sent to 3,763 undergrad students who enrolled in AU undergrad courses in Aug. 2009; 24.7% response rate, N = 820.
[Chart slides: Draft Results, AU Unpaced Learners Social Software Survey, Anderson, Sept. 2009, N = 820]
Lots of Support “ Not networking with other students, and not having peers is one drawback in doing individualized studies through Athabasca, with these technologies available could solve this problem.”  “ I think that hearing other people's opinions is a great way to spark new thoughts of your own. I also think that it is a great way to ask questions rather than emailing back and forth or making long distance phone calls.”
Lots of Concerns “ People have other commitments and might not be able to join in, they like to do things on their own time.” “ I am not part of a social network due to the fact that I work in mental health, I am concerned about my privacy.” “ I'm scared as a first time user of  e-learning, that I may miss something”
Survey Conclusions: We have a very heterogeneous population of net users and non-users. Many of our learners “don’t know” about web 2.0 tool use in formal education – are they literate?
Challenges to AU: Moving to connectivist pedagogy; personal competence, literacy and tools; dealing effectively with disruptive technologies; crystallized ways of thinking about our educational development and delivery model; developing tutor networks; union contracts?
How to ensure we all are learning professionals?
My Personal Learning Network Professional, Hobby, Personal News Produsage, networks Personal Hosting: Blogs, E-portfolios, Presentations, Profile Bookmarks Tags Resources Collections Photos Books Formal Education Provider(s) Production Tools Email  Social Networks I PLE Identity
Open Net Athabasca University Athabasca Landing E-Portfolios Profiles Networks Bookmarks Blogs Media lab Secondlife campus AUspace AlFresco CMS Moodle Library Course Development ELGG MY AU Login Registry OERs, YouTUBE Discovery Read & Comment  Single Sign on CIDER Research/Community Networks Sample CC  Course units and  Branded OERs Passwords Passwords
Network Tool Set (example) – Stepanyan, Mather & Payne, 2007
Access Controls in Elgg
Design Based research in Practice Athabasca Landing Elgg based Started in 2008 1600 users (2011) Unpaced  Paced Courses Informal Learning
The Demo! Elgg
Questions and Comments

Design based for lisbon 2011

  • 1.
    Research Methods inDistance Education: Design Based Research Terry Anderson PhD Seminar Nova University, Lisbon March 2011
  • 2.
    Tuesday’s Agenda Lecture– Research methods and in DE Design Based research Break Design Based research in Action Athabasca’s Elgg Project
  • 3.
    The context ofDistance Education Implementation Disruptive innovation (Christensen, 2008) simpler, not wanted by main stream customers Rapid gains in functionality Cheaper Adaptive Moving from peripheral to mainstream (blended and online for full time students)
  • 4.
    Good Research GoodTheory (see http://www.learning-theories.com ) Good Question(s) Good methodology Brave conclusions Important applications Spans multiple iterations
  • 5.
    Because much e-learningtools and context is so emergent - need for Demographic studies Who is using what for doing what? How are they using multi-use tools? Visualization of activity and relationships Network analysis - SNAPP Connection between context and type of use Data mining/analytics Sharing and making visible research and practitioner results
  • 6.
  • 7.
  • 8.
    Research Paradigms Quantitative ~ discovery of the laws that govern behavior Qualitative ~ understandings from an insider perspective Critical ~ Investigate and expose the power relationships Design-based ~ interventions, interactions and their effect in multiple contexts
  • 9.
    Paradigm 1 QuantitativeResearch employs a scientific discourse derived from the epistemologies of positivism and realism.
  • 10.
    “ those whoare seeking the strict way of truth should not trouble themselves about any object concerning which they cannot have a certainty equal to arithmetic or geometrical demonstration” (Rene Descartes) Inordinate support and faith in randomized controlled studies
  • 11.
    Quantitative 1 – CMC Content Analysis Anderson, Garrison, Rourke 1997-2003 http://communitiesofinquiry.com - 9 papers reviewing results focusing on reliable , quantitative analysis Identified ways to measure teaching, social and cognitive ‘presence’ Most reliable methods are beyond current time constraints of busy teachers Questions of validity Serves as basic research as grounding for AI methods and major survey work of the future Serves as qualitative heuristic for teachers and course designers
  • 12.
    Quantitative – Meta-Analysis Aggregates many effect sizes creating large N’s more powerful results. Ungerleider and Burns (2003) Systematic review of effectiveness and efficiency of Online education versus Face to face The type of interventions studied were extraordinary diverse –only criteria was a comparison group “ Only 10 of the 25 studies included in the in-depth review were not seriously flawed, a sobering statistic given the constraints that went into selecting them for the review.”
  • 13.
  • 14.
    Is DE Betterthan Classroom Instruction? Project 1: 2000 – 2004 Question: How does distance education compare to classroom instruction? (inclusive dates 1985-2002) Total number of effect sizes: k = 232 Measures: Achievement, Attitudes and Retention (opposite of drop-out) Divided into Asynchronous and Synchronous DE Bernard, R. M., Abrami, P. C., Lou, Y. Borokhovski, E., Wade, A., Wozney, L., Wallet, P.A., Fiset, M., & Huang, B. (2004). How does distance education compare to classroom instruction? A meta-analysis of the empirical literature. Review of Educational Research, 74(3), 379-439.
  • 15.
    Primary findings DEand CI are essentially equal (g+ ≈ 0.0 to low average effect) on all measures Effect size distributions are heterogeneous; some DE >> CI, some DE << CI Generally poor methodological quality Pedagogical study features account for more variation than media study features (Clark, 1994) Interactive DE an important variable* * Lou, Y., Bernard, R.M., & Abrami, P.C. (2006). Media and pedagogy in undergraduate distance education: A theory-based meta-analysis of empirical literature. Educational Technology Research & Development, 54(2), 141-176.
  • 16.
    Summary of results:Achievement Achievement Outcomes *Significantly heterogeneous average effect Type of DE k g+ Sig. Combined 318* 0.013 * > 0.05 Synchronous 92 – 0.102 * < 0.05 Asynchronous 174 0.053 * < 0.05
  • 17.
    Summary of results:Attitudes Attitude Outcomes *Significantly heterogeneous average effect Type of DE k g+ Sig. Combined 154 – 0.081 * < 0.05 Synchronous 83 – 0.185 * < 0.05 Asynchronous 71 – 0.034 * > 0.05
  • 18.
    Summary of results:Retention Retention Outcomes *Significantly heterogeneous effect sizes Type of DE k g+ Sig. Combined 103 – 0.057 * < 0.05 Synchronous 17 0.005 > 0.05 Asynchronous 53 – 0.093 * < 0.05
  • 19.
    Equivalency: Are alltypes of Interaction necessary? Anderson, 2003 IRRODL
  • 20.
    Anderson’s Equivalency Theorem(2003) Moore (1989) distinctions are: Three types of interaction student-student interaction student-teacher interaction Student-content interaction Anderson (2003) hypotheses state: High levels of one out of 3 interactions will produce satisfying educational experience Increasing satisfaction through teacher and learner interaction interaction may not be as time or cost-effective as student-content interactive learning sequences
  • 21.
    Do the threetypes of interaction differ? Moore’s distinctions Achievement and Attitude Outcomes Moore’s distinctions seem to apply for achievement (equal importance), but not for attitudes (however, samples are low for SS and SC)
  • 22.
    Does strengthening interactionimprove achievement and attitudes? Anderson’s hypotheses Anderson’s first hypothesis about achievement appears to be supported Anderson’s second hypothesis about satisfaction (attitude) appears to be supported, but only to an extent (i.e., only 5 studies in High Category) Achievement and Attitude Outcomes
  • 23.
    Bernard, Abrami, Borokhovski,Wade, Tamin, & Surkes, (in press). Examining Three Forms of Interaction in Distance Education: A Meta-Analysis of Between-DE Studies. Review of Research in Education
  • 24.
    Because much web2.0 tech use is so emergent need for: Demographic studies Who is using what for doing what? How are they using multi-use tools? Visualization of activity and relationships Network analysis Connection between context and type of use Data mining
  • 25.
    Quantitative Research SummaryCan be useful especially when fine tuning well established practice Provides incremental gains in knowledge, not revolutionary ones The need to “control” context often makes results of little value to practicing professionals In times of rapid change too early quantitative testing may mask beneficial positive capacity Will we ever be able to afford blind reviewed, random assignment studies?
  • 26.
    Paradigm 2 Qualitative Paradigm Many different varieties Generally answer the question ‘why’ rather then ‘what’, ‘when’ or ‘how much’? Presents special challenges in distributed contexts due to distance between participants and researchers Currently most common type of DE research (Rourke & Szabo, 2002)
  • 27.
    Qualitative study ofSocial Software Critically important: In early stages of adoption To track effects of user competence and efficacy As contexts are personalized as tools are appropriated by users for entirely different tasks than those intended by developers
  • 28.
    Qualitative Example Dearnley(2003) Student support in open learning: Sustaining the Process Practicing Nurses, weekly F2F tutorial sessions Phenomenological study using grounded theory discourse
  • 29.
    Core category toemerge was “Finding the professional voice” Dearnley and Matthew (2003 and 2004)
  • 30.
    Qualitative example 2Mann, S. (2003) A personal inquiry into an experience of adult learning on-line. Instructional Science 31 Conclusions: The need to facilitate the presentation of learner and teacher identities in such a way that takes account of the loss of the normal channel The need to make explicit the development of operating norms and conventions reduced communicative media there is the potential for greater misunderstanding The need to consider ways in which the developing learning community can be open to the other of uncertainty, ambiguity and difference
  • 31.
    3rd Paradigm CriticalResearch Asks who gains in power? David Noble’s critique of ‘digital diploma Mills’ most prominent Canadian example Are profits generated from user generated content exploitative? Confronting the “net changes everything” mantra of many social software proponents. Who is being excluded from social software
  • 32.
    See Norm Friesen’sFriesen, N. (2009) Re-thinking e-learning research: foundations, methods, and practices. Peter Lang Publishers
  • 33.
    Is the extractionof information from the masses exploitative or empowering?
  • 34.
    Why does Facebookown all the content that we supply? Does the power of the net further marginalize the non connected? Who benefits from voluntary disclosure? Why did the One Laptop Per Child fail?
  • 35.
    Quantitative vs. QualitativeParadigm Wars Rekindled Current research “more resembles the pendulum swings characteristic of art or fashion, rather than the progressive improvements characteristic of science and technology” (p. 16). Slavin (2002) in Educational Researcher Solution to embrace “evidence based learning” Projected to increase from 5% to 75% of US Gov. funding by 2007 for “research that addresses causal questions and uses random assignments ….” Slavin, 2002 p. 15
  • 36.
    Do Either Qualitativeor Quantitative Methods Meet Real Needs of Practicing Distance Educators?
  • 37.
    But what typeof research has most effect on practice? Kennedy (1999) - teachers rate relevance and value of results from each of major paradigms. No consistent results – teachers are not a homogeneous group of consumers but they do find research of value “ The studies that teachers found to be most persuasive, most relevant, and most influential to their thinking were all studies that addressed the relationship between teaching and learning.”
  • 38.
    But what typeof research has most effect on Practice? “ The findings from this study cast doubt on virtually every argument for the superiority of any particular research genre, whether the criterion for superiority is persuasiveness, relevance, or ability to influence practitioners’ thinking.” Kennedy, (1999)
  • 39.
    4th Paradigm Design-BasedResearch Related to engineering and architectural research Focuses on the design, construction, implementation and adoption of a learning initiative in an authentic context Related to ‘Development Research’ Closest educators have to a “home grown” research methodology
  • 40.
    Design-Based ResearchStudies iterative, process focused, interventionist, collaborative, multileveled, utility oriented, theory driven and generative (Shavelson et al, 2003)
  • 41.
    Critical characteristics of design experiments According to Reeves (2000:8), Ann Brown (1992) and Alan Collins (1992): addressing complex problems in real contexts in collaboration with practitioners, integrating known and hypothetical design-principles with technological affordances to render plausible solutions to these complex problems, and conducting rigorous and reflective inquiry to test and refine innovative learning environments as well as to define new design-principles.
  • 42.
    Design-based research Methodologydeveloped by educators for educators Developed from American pragmatism – Dewey (Anderson, 2005) Recent Theme Issues: The Journal of the Instructional Sciences, ( 13 , 1, 2004), Educational Researcher ( 32 , 1, 2003) and Educational Psychologist (39, 4, 2004) See bibliography at http://cider.athabascau.ca/CIDERSIGs/DesignBasedSIG/ My article at www.cjlt.ca/abstracts.html
  • 43.
    Integrative Learning Design(Bannan-Ritland, 2003)
  • 44.
    “ design-based researchenables the creation and study of learning conditions that are presumed productive but are not well understood in practice, and the generation of findings often overlooked or obscured when focusing exclusively on the summative effects of an intervention” Wang & Hannafin, 2003
  • 45.
    Iterative because ‘Innovation is not restricted to the prior design of an artifact, but continues as artifacts are implemented and used” Implementations are “inevitably unfinished” (Stewart and Williams (2005) intertwined goals of (1) designing learning environments and (2) developing theories of learning (DBRC, 2003)
  • 46.
    Amiel, T., &Reeves, T. C. (2008).
  • 47.
    Design Based researchand the Science of Complexity Complexity theory studies the emergence of order in multifaceted, changing and previously unordered contexts This emerging order becomes the focus of iterate interventions and evaluations Order emerges at the “edge of chaos” in response to rapid change, and failure of previous organization models
  • 48.
    Call Centres AtAthabasca: Answer 80% of student inquiries Savings of over $100,000 /year Anderson, T. (2005). Design-based research and its application to a call center innovation in distance education. Canadian Journal of Learning and Technology, 31(2), 69-84
  • 49.
    D-B Research examplesDesign-Based Research Strategies for Studying Situated Learning in a Multi-user Virtual Environment Chris Dede, 2004
  • 50.
    Graduate Student ResourceHub in Design Research in Education http://www.lkl.ac.uk/projects/designresearch/
  • 51.
    Need to studyusability, scalability and innovation adoption within bureaucratic systems Allow knowledge tools to evolve in natural context through supportive nourishment of staff
  • 52.
    Conclusion Education researchis grossly under-resourced to meet the magnitude of opportunity and demand Paradigm wars are unproductive Design-based research offers a promising new research design model It can be used for Doctoral dissertations see Herrington, J., McKenney, S., Reeves, T., & Oliver, R. (2007). Design-based research and doctoral students: Guidelines for preparing a dissertation proposal.
  • 53.
    Design-Based Research in Action. Phase 1: Exploration – surveys, talking to faculty and tutors, investigating open source tools, setting research questions. Phase 2: Building the intervention – Elgg through two versions and 85 plugins (ongoing)
  • 54.
    Phase 3: Evaluation – before-and-after surveys; see Anderson, T., Poelhuber, B., & McKerlich, R. (2010). Self-Paced Learners Meet Social Software. Online Journal of Distance Learning Administration, 13. http://www.westga.edu/~distance/ojdla/Fall133/anderson_poellhuber_mcKerlich133.html Doctoral students – use of past student archives. Ongoing iterations and development of tools. Phase 4: Testing in multiple contexts; development of design principles/patterns
  • 55.
    Survey Results – Anderson et al., 2004: 78% indicated they would interact with other students if they were also able to proceed through the course at their own pace.
  • 56.
    Survey Results – Anderson et al., 2004: Only 29% of the student respondents had participated in the optional (credit and non-credit) interactive computer conferences
  • 57.
    Undergrad Survey, Sept. 2009, Draft Results. AU Unpaced Learners Social Software Survey, Anderson, Sept. 2009. Sent to 3,763 undergraduate students who enrolled in AU undergraduate courses in Aug. 2009; 24.7% response rate; N = 820
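A quick sanity check on the scale figures reported on this slide. Note that 820 of 3,763 invitees is about 21.8%, so the reported 24.7% presumably applies to a smaller delivered pool (e.g. excluding undeliverable invitations); that interpretation is an assumption, not stated on the slide.

```python
# Sanity-check the figures from the slide: 3,763 invitations,
# N = 820 responses, 24.7% reported response rate. The idea that the
# reported rate excludes undeliverable invitations is an assumption.
invited = 3763
responded = 820
reported_rate = 0.247

raw_rate = responded / invited            # rate over all invitees (~21.8%)
implied_pool = responded / reported_rate  # delivered pool the 24.7% implies

print(f"rate over all invitees: {raw_rate:.1%}")        # 21.8%
print(f"pool implied by 24.7%: {implied_pool:.0f}")     # 3320
```

This kind of check is worth running before quoting survey percentages, since "response rate" can be computed against invited, delivered, or valid replies.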
  • 58.
    Draft Results, AU Unpaced Learners Social Software Survey, Anderson, Sept. 2009. N = 820
  • 59.
    Draft Results, AU Unpaced Learners Social Software Survey, Anderson, Sept. 2009. N = 820
  • 60.
    Draft Results, AU Unpaced Learners Social Software Survey, Anderson, Sept. 2009
  • 61.
    Draft Results, AU Unpaced Learners Social Software Survey, Anderson, Sept. 2009.
  • 62.
    Draft Results, AU Unpaced Learners Social Software Survey, Anderson, Sept. 2009.
  • 63.
    Draft Results, AU Unpaced Learners Social Software Survey, Anderson, Sept. 2009. 25.12%; N = 820
  • 64.
    Draft Results, AU Unpaced Learners Social Software Survey, Anderson, Sept. 2009. N = 820
  • 65.
    Draft Results, AU Unpaced Learners Social Software Survey, Anderson, Sept. 2009. 47.93%
  • 66.
    Draft Results, AU Unpaced Learners Social Software Survey, Anderson, Sept. 2009. 61.95% / 31.47% / 6.59%
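The three percentages on this slide behave like a complete three-way breakdown of a single question: applied to N = 820 (the N reported on neighbouring slides; assuming it applies to this chart too), they recover whole-number response counts that sum back to exactly 820.

```python
# Check that 61.95% / 31.47% / 6.59% form a complete breakdown of N = 820.
# N = 820 is taken from the adjacent slides; applying it here is an assumption.
n = 820
shares = [0.6195, 0.3147, 0.0659]

counts = [round(p * n) for p in shares]
print(counts)               # [508, 258, 54] - counts per option
print(sum(counts))          # 820 - recovers N exactly
print(f"{sum(shares):.2%}") # 100.01% - shares sum to ~100% (rounding)
```

Recovering integer counts that total N is a useful consistency check when only percentages survive from a chart.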
  • 67.
    Draft Results, AU Unpaced Learners Social Software Survey, Anderson, Sept. 2009.
  • 68.
    Draft Results, AU Unpaced Learners Social Software Survey, Anderson, Sept. 2009.
  • 69.
    Lots of Support: “Not networking with other students, and not having peers is one drawback in doing individualized studies through Athabasca, with these technologies available could solve this problem.” “I think that hearing other people's opinions is a great way to spark new thoughts of your own. I also think that it is a great way to ask questions rather than emailing back and forth or making long distance phone calls.”
  • 70.
    Lots of Concerns: “People have other commitments and might not be able to join in, they like to do things on their own time.” “I am not part of a social network due to the fact that I work in mental health, I am concerned about my privacy.” “I'm scared as a first time user of e-learning, that I may miss something”
  • 71.
    Survey Conclusions: We have a very heterogeneous population of net users and non-users. Many of our learners “don't know” about Web 2.0 tool use in formal education – are they literate?
  • 72.
    Challenges to AU in Moving to Connectivist Pedagogy: personal competence, literacy, and tools; dealing effectively with disruptive technologies; crystallized ways of thinking about our educational development and delivery model; developing tutor networks; union contracts?
  • 73.
    How do we ensure we are all learning professionals?
  • 74.
    My Personal Learning Network (diagram): a PLE centred on “I”/identity, linked to: news and produsage networks (professional, hobby, personal); personal hosting – blogs, e-portfolios, presentations, profile; bookmarks, tags, resources, collections, photos, books; formal education provider(s); production tools; email; social networks
  • 75.
    Open Net at Athabasca University (diagram): Athabasca Landing (Elgg) – e-portfolios, profiles, networks, bookmarks, blogs; media lab; Second Life campus; AUspace; Alfresco CMS; Moodle; library; course development; MyAU login/registry with single sign-on (passwords); OERs, YouTube, discovery, read & comment; CIDER research/community networks; sample CC course units and branded OERs
  • 76.
    Network Tool Set (example) – Stepanyan, Mather & Payne, 2007
  • 77.
  • 78.
    Design-Based Research in Practice: Athabasca Landing. Elgg-based; started in 2008; 1,600 users (2011). Unpaced and paced courses; informal learning
  • 79.
  • 80.

Editor's Notes

  • #10 Evidence-based decision-making was developed by a group of clinical epidemiologists at McMaster University in Canada (Sackett et al., 1985)
  • #26 But what if the results had shown very significant results in favor of either mode of delivery? Would they have informed our practice? I think the answer would be a resounding “Not very likely”. The meta-analysis tells us nothing about the critical context in which the learning took place. What learner support services were in place? What was the quality of the teaching or of the content? What was the condition of the home study or the class environment – the list of contextual factors goes on and on. Thus, one can conclude that this gold standard – the use of randomly assigned comparison-group research and subsequent meta-analysis – is of only limited use to practicing distance educators. These results may be useful in persuading reluctant colleagues or funders of the efficacy of distance education, but they tell us little that will help us to improve our practice.
  • #36 Despite this problem, many very influential policy makers now argue that unless education adopts this type of “scientific and evidence based research”, we will never make progress in the discipline and will be subject to fads and superstitions forever. The famous American education researcher Robert Slavin (2002) recently contributed to a major revival of the paradigm wars of the 1980s when he argued that educational researchers need to embrace “evidence based learning” rather than a process that “more resembles the pendulum swings characteristic of art or fashion, rather than the progressive improvements characteristic of science and technology” (p. 16). This plea has fallen on fertile ground in many government circles.
  • #61 Can you change this to % of valid replies.