Design-Based Research for Lisbon 2011


Slides for PhD seminar at Nova University, Lisbon Portugal

  • Evidence-based practice was developed at McMaster: the group of clinical epidemiologists who developed evidence-based decision-making at McMaster University in Canada (Sackett et al., 1985)
  • But what if the results had shown very significant results in favor of either mode of delivery? Would they have informed our practice? I think the answer would be a resounding “Not very likely”. The meta-analysis tells us nothing about the critical context in which the learning took place. What learner support services were in place? What was the quality of the teaching or of the content? What was the condition of the home study or the class environment? The list of contextual factors goes on and on. Thus, one can conclude that this gold standard – the use of randomly assigned comparison-group research and subsequent meta-analysis – is of only limited use to practicing distance educators. These results may be useful in persuading reluctant colleagues or funders of the efficacy of distance education, but they tell us little that will help us improve our practice.
  • Despite this problem, many very influential policy makers now argue that unless education adopts this type of “scientific and evidence-based research”, we will never make progress in the discipline and will be subject to fads and superstitions forever. The American education researcher Robert Slavin (2002) recently contributed to a major revival of the paradigm wars of the 1980s when he argued that educational researchers need to embrace “evidence-based learning” rather than a process that “more resembles the pendulum swings characteristic of art or fashion, rather than the progressive improvements characteristic of science and technology” (p. 16). This plea has fallen on fertile ground in many government circles.
  • Can you change this to % of valid replies.

    1. 1. Research Methods in Distance Education: Design Based Research Terry Anderson PhD Seminar Nova University, Lisbon March 2011
    2. 2. Tuesday’s Agenda <ul><li>Lecture – Research methods in DE </li></ul><ul><ul><li>Design-Based research </li></ul></ul><ul><li>Break </li></ul><ul><ul><li>Design-Based research in Action </li></ul></ul><ul><ul><li>Athabasca’s Elgg Project </li></ul></ul>
    3. 3. The context of Distance Education Implementation <ul><li>Disruptive innovation (Christensen, 2008): simpler, not wanted by mainstream customers </li></ul><ul><li>Rapid gains in functionality </li></ul><ul><li>Cheaper </li></ul><ul><li>Adaptive </li></ul><ul><li>Moving from peripheral to mainstream (blended and online for full-time students) </li></ul>
    4. 4. Good Research <ul><li>Good Theory (see ) </li></ul><ul><li>Good Question(s) </li></ul><ul><li>Good methodology </li></ul><ul><li>Brave conclusions </li></ul><ul><li>Important applications </li></ul><ul><li>Spans multiple iterations </li></ul>
    5. 5. Because e-learning tools and contexts are so emergent, we need <ul><li>Demographic studies </li></ul><ul><ul><li>Who is using what for doing what? </li></ul></ul><ul><ul><li>How are they using multi-use tools? </li></ul></ul><ul><li>Visualization of activity and relationships </li></ul><ul><li>Network analysis – SNAPP </li></ul><ul><li>Connection between context and type of use </li></ul><ul><li>Data mining/analytics </li></ul><ul><li>Sharing and making visible research and practitioner results </li></ul>
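The network-analysis bullet above can be made concrete. Tools such as SNAPP build a who-replies-to-whom graph from discussion-forum posts and highlight central and isolated participants. The sketch below is a minimal illustration of that idea with invented names and data, not SNAPP's actual implementation.

```python
# Minimal sketch of forum social-network analysis: count reply links
# and rank participants by degree (links in + links out).
# All names and reply pairs are hypothetical.
from collections import Counter

replies = [  # (author, replied_to) pairs from a hypothetical forum
    ("ana", "bruno"), ("carla", "ana"), ("bruno", "ana"),
    ("carla", "bruno"), ("ana", "carla"),
]

degree = Counter()
for author, target in replies:
    degree[author] += 1   # out-degree: replies posted
    degree[target] += 1   # in-degree: replies received

for person, links in degree.most_common():
    print(person, links)
```

A real analysis would also visualize the graph and compute betweenness or clustering, but degree counts alone already expose who is connecting a cohort and who is isolated.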
    6. 6. Research Paradigms
    7. 7. Research Paradigms
    8. 8. Research Paradigms <ul><li>Quantitative ~ discovery of the laws that govern behavior </li></ul><ul><li>Qualitative ~ understandings from an insider perspective </li></ul><ul><li>Critical ~ Investigate and expose the power relationships </li></ul><ul><li>Design-based ~ interventions, interactions and their effect in multiple contexts </li></ul>
    9. 9. Paradigm 1 Quantitative Research <ul><li>employs a scientific discourse derived from the epistemologies of positivism and realism. </li></ul>
    10. 10. <ul><li>“Those who are seeking the strict way of truth should not trouble themselves about any object concerning which they cannot have a certainty equal to arithmetic or geometrical demonstration” </li></ul><ul><ul><li>(René Descartes) </li></ul></ul><ul><li>Inordinate support for and faith in randomized controlled studies </li></ul>
    11. 11. Quantitative 1 – CMC Content Analysis <ul><li>Anderson, Garrison, Rourke 1997-2003 </li></ul><ul><ul><li>9 papers reviewing results, focusing on reliable, quantitative analysis </li></ul></ul><ul><ul><li>Identified ways to measure teaching, social and cognitive ‘presence’ </li></ul></ul><ul><ul><li>Most reliable methods are beyond the current time constraints of busy teachers </li></ul></ul><ul><ul><li>Questions of validity </li></ul></ul><ul><ul><li>Serves as basic research grounding for AI methods and major survey work of the future </li></ul></ul><ul><ul><li>Serves as a qualitative heuristic for teachers and course designers </li></ul></ul>
    12. 12. Quantitative – Meta-Analysis <ul><li>Aggregates many effect sizes, creating large Ns and more powerful results. </li></ul><ul><li>Ungerleider and Burns (2003) </li></ul><ul><li>Systematic review of the effectiveness and efficiency of online education versus face-to-face </li></ul><ul><li>The types of interventions studied were extraordinarily diverse – the only criterion was a comparison group </li></ul><ul><li>“Only 10 of the 25 studies included in the in-depth review were not seriously flawed, a sobering statistic given the constraints that went into selecting them for the review.” </li></ul>
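The aggregation step described above can be sketched numerically. In a fixed-effect meta-analysis, each study's effect size (e.g. Hedges' g) is weighted by the inverse of its variance, so larger, more precise studies dominate the pooled g+. The numbers below are invented for illustration; they are not values from the Ungerleider and Burns review or the Bernard et al. meta-analysis.

```python
# Fixed-effect (inverse-variance) pooling of study effect sizes.
# Study data are hypothetical: (effect size g, variance of g).
import math

studies = [
    (0.20, 0.04),
    (-0.10, 0.02),
    (0.05, 0.01),
]

weights = [1 / v for _, v in studies]            # precise studies get more weight
pooled_g = sum(w * g for (g, _), w in zip(studies, weights)) / sum(weights)
se = math.sqrt(1 / sum(weights))                 # standard error of pooled effect
z = pooled_g / se                                # z-test of H0: g+ = 0
print(f"g+ = {pooled_g:.3f}, SE = {se:.3f}, z = {z:.2f}")
```

This is why meta-analysis yields "more powerful results": pooling shrinks the standard error, though, as the slides argue, it says nothing about the context behind each study's effect.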
    13. 14. Is DE Better than Classroom Instruction? Project 1: 2000 – 2004 <ul><li>Question: How does distance education compare to classroom instruction? (inclusive dates 1985-2002) </li></ul><ul><li>Total number of effect sizes: k = 232 </li></ul><ul><li>Measures: Achievement, Attitudes and Retention (opposite of drop-out) </li></ul><ul><li>Divided into Asynchronous and Synchronous DE </li></ul>Bernard, R. M., Abrami, P. C., Lou, Y. Borokhovski, E., Wade, A., Wozney, L., Wallet, P.A., Fiset, M., & Huang, B. (2004). How does distance education compare to classroom instruction? A meta-analysis of the empirical literature. Review of Educational Research, 74(3), 379-439.
    14. 15. Primary findings <ul><li>DE and CI are essentially equal (g+ ≈ 0.0 to low average effect) on all measures </li></ul><ul><li>Effect size distributions are heterogeneous; some DE >> CI, some DE << CI </li></ul><ul><li>Generally poor methodological quality </li></ul><ul><li>Pedagogical study features account for more variation than media study features (Clark, 1994) </li></ul><ul><li>Interactive DE an important variable* </li></ul>* Lou, Y., Bernard, R.M., & Abrami, P.C. (2006). Media and pedagogy in undergraduate distance education: A theory-based meta-analysis of empirical literature. Educational Technology Research & Development, 54(2), 141-176.
    15. 16. Summary of results: Achievement Outcomes (* significantly heterogeneous average effect)
Type of DE     k     g+       Sig.
Combined       318   0.013*   p > 0.05
Synchronous     92  –0.102*   p < 0.05
Asynchronous   174   0.053*   p < 0.05
    16. 17. Summary of results: Attitude Outcomes (* significantly heterogeneous average effect)
Type of DE     k     g+       Sig.
Combined       154  –0.081*   p < 0.05
Synchronous     83  –0.185*   p < 0.05
Asynchronous    71  –0.034*   p > 0.05
    17. 18. Summary of results: Retention Outcomes (* significantly heterogeneous effect sizes)
Type of DE     k     g+       Sig.
Combined       103  –0.057*   p < 0.05
Synchronous     17   0.005    p > 0.05
Asynchronous    53  –0.093*   p < 0.05
    18. 19. Equivalency: Are all types of Interaction necessary? Anderson, 2003 IRRODL
    19. 20. Anderson’s Equivalency Theorem (2003) <ul><li>Moore’s (1989) distinctions are: </li></ul><ul><li>Three types of interaction </li></ul><ul><ul><li>student–student interaction </li></ul></ul><ul><ul><li>student–teacher interaction </li></ul></ul><ul><ul><li>student–content interaction </li></ul></ul><ul><li>Anderson’s (2003) hypotheses state: </li></ul><ul><li>High levels of one of the three interactions will produce a satisfying educational experience </li></ul><ul><li>Increasing satisfaction through teacher–learner interaction may not be as time- or cost-effective as student–content interactive learning sequences </li></ul>
    20. 21. Do the three types of interaction differ? Moore’s distinctions – Achievement and Attitude Outcomes. Moore’s distinctions seem to apply for achievement (equal importance), but not for attitudes (however, samples are small for SS and SC)
    21. 22. Does strengthening interaction improve achievement and attitudes? Anderson’s hypotheses – Achievement and Attitude Outcomes. Anderson’s first hypothesis, about achievement, appears to be supported. The second, about satisfaction (attitude), appears to be supported, but only to an extent (only 5 studies in the High category)
    22. 23. <ul><li>Bernard, Abrami, Borokhovski, Wade, Tamim, & Surkes (in press). Examining three forms of interaction in distance education: A meta-analysis of between-DE studies. Review of Research in Education </li></ul>
    23. 24. Because much Web 2.0 technology use is so emergent, we need: <ul><li>Demographic studies </li></ul><ul><ul><li>Who is using what for doing what? </li></ul></ul><ul><ul><li>How are they using multi-use tools? </li></ul></ul><ul><li>Visualization of activity and relationships </li></ul><ul><li>Network analysis </li></ul><ul><li>Connection between context and type of use </li></ul><ul><li>Data mining </li></ul>
    24. 25. Quantitative Research Summary <ul><li>Can be useful, especially when fine-tuning well-established practice </li></ul><ul><li>Provides incremental gains in knowledge, not revolutionary ones </li></ul><ul><li>The need to “control” context often makes results of little value to practicing professionals </li></ul><ul><li>In times of rapid change, premature quantitative testing may mask beneficial capacity </li></ul><ul><li>Will we ever be able to afford blind-reviewed, random-assignment studies? </li></ul>
    25. 26. Paradigm 2 Qualitative Paradigm <ul><li>Many different varieties </li></ul><ul><li>Generally answers the question ‘why’ rather than ‘what’, ‘when’ or ‘how much’ </li></ul><ul><li>Presents special challenges in distributed contexts due to distance between participants and researchers </li></ul><ul><li>Currently the most common type of DE research (Rourke & Szabo, 2002) </li></ul>
    26. 27. Qualitative study of Social Software <ul><li>Critically important: </li></ul><ul><ul><li>In early stages of adoption </li></ul></ul><ul><ul><li>To track effects of user competence and efficacy </li></ul></ul><ul><ul><li>As contexts are personalized </li></ul></ul><ul><ul><li>as tools are appropriated by users for entirely different tasks than those intended by developers </li></ul></ul>
    27. 28. Qualitative Example <ul><ul><li>Dearnley (2003) Student support in open learning: Sustaining the Process </li></ul></ul><ul><ul><li>Practicing Nurses, weekly F2F tutorial sessions </li></ul></ul><ul><ul><li>Phenomenological study using grounded theory discourse </li></ul></ul>
    28. 29. Core category to emerge was “Finding the professional voice” Dearnley and Matthew (2003 and 2004)
    29. 30. Qualitative example 2 <ul><li>Mann, S. (2003). A personal inquiry into an experience of adult learning on-line. Instructional Science, 31 </li></ul><ul><li>Conclusions: </li></ul><ul><ul><li>The need to facilitate the presentation of learner and teacher identities in a way that takes account of the loss of the normal channel </li></ul></ul><ul><ul><li>The need to make explicit the development of operating norms and conventions </li></ul></ul><ul><ul><li>With reduced communicative media there is the potential for greater misunderstanding </li></ul></ul><ul><ul><li>The need to consider ways in which the developing learning community can be open to the other of uncertainty, ambiguity and difference </li></ul></ul>
    30. 31. 3rd Paradigm Critical Research <ul><li>Asks: who gains in power? </li></ul><ul><li>David Noble’s critique of ‘digital diploma mills’ is the most prominent Canadian example </li></ul><ul><li>Are profits generated from user-generated content exploitative? </li></ul><ul><li>Confronting the “net changes everything” mantra of many social software proponents </li></ul><ul><li>Who is being excluded from social software? </li></ul>
    31. 32. See Friesen, N. (2009). Re-thinking e-learning research: Foundations, methods, and practices. Peter Lang Publishers
    32. 33. Is the extraction of information from the masses exploitative or empowering?
    33. 34. <ul><li>Why does Facebook own all the content that we supply? </li></ul><ul><li>Does the power of the net further marginalize the non connected? </li></ul><ul><li>Who benefits from voluntary disclosure? </li></ul><ul><li>Why did the One Laptop Per Child fail? </li></ul>
    34. 35. Quantitative vs. Qualitative Paradigm Wars Rekindled <ul><li>Current research “more resembles the pendulum swings characteristic of art or fashion, rather than the progressive improvements characteristic of science and technology” (p. 16). </li></ul><ul><ul><li>Slavin (2002) in Educational Researcher </li></ul></ul><ul><li>His solution: embrace “evidence-based learning” </li></ul><ul><li>Projected to increase from 5% to 75% of US government funding by 2007 for “research that addresses causal questions and uses random assignments ….” (Slavin, 2002, p. 15) </li></ul>
    35. 36. Do Either Qualitative or Quantitative Methods Meet Real Needs of Practicing Distance Educators?
    36. 37. But what type of research has most effect on practice? <ul><ul><li>Kennedy (1999) – teachers rated the relevance and value of results from each of the major paradigms. </li></ul></ul><ul><ul><li>No consistent results – teachers are not a homogeneous group of consumers, but they do find research of value </li></ul></ul><ul><ul><li>“The studies that teachers found to be most persuasive, most relevant, and most influential to their thinking were all studies that addressed the relationship between teaching and learning.” </li></ul></ul>
    37. 38. But what type of research has most effect on practice? <ul><ul><li>“The findings from this study cast doubt on virtually every argument for the superiority of any particular research genre, whether the criterion for superiority is persuasiveness, relevance, or ability to influence practitioners’ thinking.” (Kennedy, 1999) </li></ul></ul>
    38. 39. 4th Paradigm Design-Based Research <ul><li>Related to engineering and architectural research </li></ul><ul><li>Focuses on the design, construction, implementation and adoption of a learning initiative in an authentic context </li></ul><ul><li>Related to ‘Development Research’ </li></ul><ul><li>Closest educators have to a “home grown” research methodology </li></ul>
    39. 40. Design-Based Research Studies <ul><ul><li>iterative, </li></ul></ul><ul><ul><li>process focused, </li></ul></ul><ul><ul><li>interventionist, </li></ul></ul><ul><ul><li>collaborative, </li></ul></ul><ul><ul><li>multileveled, </li></ul></ul><ul><ul><li>utility oriented, </li></ul></ul><ul><ul><li>theory driven and generative </li></ul></ul><ul><ul><ul><li>(Shavelson et al, 2003) </li></ul></ul></ul>
    40. 41. Critical characteristics of design experiments <ul><li>According to Reeves (2000:8), Ann Brown (1992) and Alan Collins (1992): </li></ul><ul><ul><li>addressing complex problems in real contexts in collaboration with practitioners, </li></ul></ul><ul><ul><li>integrating known and hypothetical design-principles with technological affordances to render plausible solutions to these complex problems, and </li></ul></ul><ul><ul><li>conducting rigorous and reflective inquiry to test and refine innovative learning environments as well as to define new design-principles. </li></ul></ul>
    41. 42. Design-based research <ul><li>Methodology developed by educators for educators </li></ul><ul><li>Developed from American pragmatism – Dewey (Anderson, 2005) </li></ul><ul><li>Recent theme issues: </li></ul><ul><ul><li>The Journal of the Learning Sciences ( 13 , 1, 2004), </li></ul></ul><ul><ul><li>Educational Researcher ( 32 , 1, 2003) and </li></ul></ul><ul><ul><li>Educational Psychologist (39, 4, 2004) </li></ul></ul><ul><ul><li>See bibliography at </li></ul></ul><ul><li>My article at </li></ul>
    42. 43. Integrative Learning Design (Bannan-Ritland, 2003)
    43. 44. <ul><li>“ design-based research enables the creation and study of learning conditions that are presumed productive but are not well understood in practice, and the generation of findings often overlooked or obscured when focusing exclusively on the summative effects of an intervention” Wang & Hannafin, 2003 </li></ul>
    44. 45. <ul><li>Iterative because </li></ul><ul><li>“Innovation is not restricted to the prior design of an artifact, but continues as artifacts are implemented and used” </li></ul><ul><li>Implementations are “inevitably unfinished” (Stewart & Williams, 2005) </li></ul><ul><li>Intertwined goals of (1) designing learning environments and (2) developing theories of learning (DBRC, 2003) </li></ul>
    45. 46. Amiel, T., & Reeves, T. C. (2008).
    46. 47. Design-Based Research and the Science of Complexity <ul><li>Complexity theory studies the emergence of order in multifaceted, changing and previously unordered contexts </li></ul><ul><li>This emerging order becomes the focus of iterative interventions and evaluations </li></ul><ul><li>Order emerges at the “edge of chaos” in response to rapid change and the failure of previous organizational models </li></ul>
    47. 48. Call Centres at Athabasca: answer 80% of student inquiries; savings of over $100,000/year. Anderson, T. (2005). Design-based research and its application to a call center innovation in distance education. Canadian Journal of Learning and Technology, 31(2), 69-84
    48. 49. D-B Research examples Design-Based Research Strategies for Studying Situated Learning in a Multi-user Virtual Environment Chris Dede, 2004
    49. 50. Graduate Student Resource Hub in Design Research in Education <ul><li> </li></ul>
    50. 51. <ul><li>Need to study usability, scalability and innovation adoption within bureaucratic systems </li></ul><ul><li>Allow knowledge tools to evolve in natural context through supportive nourishment of staff </li></ul>
    51. 52. Conclusion <ul><li>Education research is grossly under-resourced relative to the magnitude of opportunity and demand </li></ul><ul><li>Paradigm wars are unproductive </li></ul><ul><li>Design-based research offers a promising new research design model </li></ul><ul><li>It can be used for doctoral dissertations; see </li></ul><ul><ul><li>Herrington, J., McKenney, S., Reeves, T., & Oliver, R. (2007). Design-based research and doctoral students: Guidelines for preparing a dissertation proposal. </li></ul></ul>
    52. 53. Design-Based research in Action <ul><li>Phase 1: Exploration – surveys, talking to faculty and tutors, investigating open source tools, setting research questions </li></ul><ul><li>Phase 2: Building the intervention – Elgg through two versions and 85 plugins (ongoing) </li></ul>
    53. 54. <ul><li>Phase 3: Evaluation – before-and-after surveys; see: </li></ul><ul><ul><li>Anderson, T., Poelhuber, B., & McKerlich, R. (2010). Self-paced learners meet social software. Online Journal of Distance Learning Administration, 13 </li></ul></ul><ul><ul><li>Doctoral students – use of past student archives </li></ul></ul><ul><ul><li>Ongoing iterations and development of tools </li></ul></ul><ul><li>Phase 4: Testing in multiple contexts </li></ul><ul><ul><li>Development of design principles/patterns </li></ul></ul>
    54. 55. Survey Results – Anderson et al. 2004 <ul><ul><li>78% indicated they would interact with other students if they were also able to proceed through the course at their own pace. </li></ul></ul>
    55. 56. Survey Results – Anderson et al. 2004 <ul><li>Only 29% of the student respondents had participated in the optional (credit and non-credit) interactive computer conferences </li></ul>
    56. 57. Undergrad Survey, Sept. 2009 – Draft Results. AU Unpaced Learners Social Software Survey (Anderson, Sept. 2009), sent to 3,763 undergraduate students who enrolled in AU undergraduate courses in Aug. 2009; 24.7% response rate; N = 820
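The headline figures on this slide can be cross-checked with a few lines of arithmetic. Note that two different rates are in play: the slide reports a 24.7% response rate, while N = 820 valid replies works out to roughly 21.8% of the 3,763 students invited; the gap presumably reflects incomplete or invalid returns. This is a back-of-envelope sketch of that distinction, not an analysis of the actual survey data.

```python
# Cross-checking the survey slide's figures (values taken from the slide;
# the breakdown into "returned vs. valid" is an assumption for illustration).
invited = 3763         # undergrads invited
valid_n = 820          # N reported on the results slides
response_rate = 0.247  # response rate reported on the slide

returned = round(invited * response_rate)        # approximate total returns
valid_pct_of_invited = 100 * valid_n / invited   # % of invitees with valid replies

print(f"~{returned} returns; {valid_pct_of_invited:.1f}% of invitees gave valid replies")
```

Reporting the percentage of valid replies alongside the raw response rate makes the denominators explicit for readers of the draft results.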
    57. 58. – 67. 68. Draft Results, AU Unpaced Learners Social Software Survey, Anderson, Sept. 2009 (N = 820): a series of chart slides; legible values include 25.12%, 47.93%, and 61.95% / 31.47% / 6.59%
    68. 69. Lots of Support <ul><li>“Not networking with other students, and not having peers is one drawback in doing individualized studies through Athabasca, with these technologies available could solve this problem.” </li></ul><ul><li>“I think that hearing other people's opinions is a great way to spark new thoughts of your own. I also think that it is a great way to ask questions rather than emailing back and forth or making long distance phone calls.” </li></ul>
    69. 70. Lots of Concerns <ul><li>“People have other commitments and might not be able to join in, they like to do things on their own time.” </li></ul><ul><li>“I am not part of a social network due to the fact that I work in mental health, I am concerned about my privacy.” </li></ul><ul><li>“I'm scared as a first time user of e-learning, that I may miss something” </li></ul>
    70. 71. Survey Conclusions <ul><li>We have a very heterogeneous population of net users and non-users </li></ul><ul><li>Many of our learners answered “don’t know” about Web 2.0 tool use in formal education – are they literate? </li></ul>
    71. 72. Challenges to AU Moving to Connectivist Pedagogy <ul><li>Personal competence, literacy and tools </li></ul><ul><li>Dealing effectively with disruptive technologies </li></ul><ul><li>Crystallized ways of thinking about our educational development and delivery model </li></ul><ul><li>Developing tutor networks </li></ul><ul><li>Union contracts? </li></ul>
    72. 73. How do we ensure we are all learning professionals?
    73. 74. My Personal Learning Network (diagram): Identity; PLE; Formal Education Provider(s); Personal Hosting (blogs, e-portfolios, presentations, profile); bookmarks, tags, resources, collections, photos, books; professional, hobby and personal news; produsage networks; production tools; email; social networks
    74. 75. Athabasca University on the Open Net (architecture diagram): Athabasca Landing (Elgg) with e-portfolios, profiles, networks, bookmarks, blogs; single sign-on via MyAU login, registry and passwords; Moodle, AUspace, Alfresco CMS, Library, course development, media lab, Second Life campus; CIDER research/community networks; OERs, YouTube, discovery, read & comment; sample CC course units and branded OERs
    75. 76. Network Tool Set (example) – Stepanyan, Mather & Payne, 2007
    76. 77. Access Controls in Elgg
    77. 78. Design Based research in Practice <ul><li>Athabasca Landing </li></ul><ul><ul><li>Elgg based </li></ul></ul><ul><ul><li>Started in 2008 </li></ul></ul><ul><ul><li>1600 users (2011) </li></ul></ul><ul><ul><li>Unpaced </li></ul></ul><ul><ul><li>Paced Courses </li></ul></ul><ul><ul><li>Informal Learning </li></ul></ul>
    78. 79. <ul><li>The Demo! </li></ul><ul><li>Elgg </li></ul>
    79. 80. Questions and Comments