The document summarizes a seminar on research methods in distance education, with a focus on design-based research. It discusses four main research paradigms - quantitative, qualitative, critical, and design-based research. Design-based research is presented as a methodology developed by educators that focuses on iterative design, testing, and evaluation of learning innovations in authentic contexts. Examples of design-based research studies and results from a survey on social software use among distance learners are also summarized.
21. Do the three types of interaction differ? (Moore’s distinctions; Achievement and Attitude Outcomes.) Moore’s distinctions seem to apply for achievement (equal importance), but not for attitudes (however, sample sizes are small for SS and SC).
22. Does strengthening interaction improve achievement and attitudes? (Anderson’s hypotheses; Achievement and Attitude Outcomes.) Anderson’s first hypothesis, about achievement, appears to be supported. Anderson’s second hypothesis, about satisfaction (attitude), appears to be supported, but only to an extent (i.e., only 5 studies in the High category).
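The kind of comparison summarized above can be made concrete with a small, purely illustrative sketch. The effect sizes and category counts below are invented for demonstration, not the actual meta-analysis data; only the overall pattern (stronger interaction treatments tend to show larger mean effects, but few studies fall in the High category) mirrors the finding reported on the slide.

```python
# Illustrative sketch (hypothetical data): mean effect size by
# interaction-strength category. Real values would come from the
# coded study-level data of the meta-analysis.
from statistics import mean

effect_sizes = {  # category -> per-study effect sizes (invented)
    "Low":    [0.05, -0.02, 0.10, 0.00, 0.08, 0.03],
    "Medium": [0.15, 0.20, 0.12, 0.18],
    "High":   [0.30, 0.25, 0.40, 0.22, 0.35],  # note: only a handful of studies
}

for category, ds in effect_sizes.items():
    print(f"{category:>6}: k = {len(ds)}, mean d = {mean(ds):.2f}")
```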
29. The core category to emerge was “Finding the professional voice” (Dearnley and Matthew, 2003 and 2004).
32. See Norm Friesen’s book: Friesen, N. (2009). Re-thinking e-learning research: Foundations, methods, and practices. Peter Lang Publishers.
33. Is the extraction of information from the masses exploitative or empowering?
36. Do Qualitative or Quantitative Methods Meet the Real Needs of Practicing Distance Educators?
48. Call Centres at Athabasca: answer 80% of student inquiries; savings of over $100,000/year. Anderson, T. (2005). Design-based research and its application to a call center innovation in distance education. Canadian Journal of Learning and Technology, 31(2), 69-84.
49. D-B Research examples: “Design-Based Research Strategies for Studying Situated Learning in a Multi-user Virtual Environment” (Chris Dede, 2004).
57. Undergrad Survey, Sept. 2009, Draft Results. AU Unpaced Learners Social Software Survey, Anderson, Sept 2009; sent to 3763 undergrad students who enrolled in AU undergrad courses in Aug. 2009; 24.7% response rate; N = 820.
58.–68. Draft Results, AU Unpaced Learners Social Software Survey, Anderson, Sept 2009 (N = 820): a series of results charts; reported percentages include 25.12%, 47.93%, 61.95%, 31.47%, and 6.59%.
74. My Personal Learning Network (diagram): formal education provider(s); personal hosting (blogs, e-portfolios, presentations, profile); bookmarks, tags, resources, collections, photos, books; production tools; email; social networks (professional, hobby, personal, news; produsage networks); identity; the PLE.
75. Open Net, Athabasca University (diagram): Athabasca Landing (ELGG), with e-portfolios, profiles, networks, bookmarks, blogs, media lab, and Second Life campus; AUspace; Alfresco CMS; Moodle; library; course development; MY AU login/registry with single sign-on (passwords); OERs, YouTube discovery, read and comment; CIDER research/community networks; sample CC course units and branded OERs.
76. Network Tool Set (example). Stepanyan, Mather & Payne, 2007.
Evidence-based practice developed at McMaster. Evidence-based decision-making was developed by a group of clinical epidemiologists at McMaster University in Canada (Sackett et al., 1985).
But what if the results had shown a very significant difference in favor of either mode of delivery? Would they have informed our practice? I think the answer would be a resounding “Not very likely”. The meta-analysis tells us nothing about the critical context in which the learning took place. What learner support services were in place? What was the quality of the teaching or of the content? What was the condition of the home study or classroom environment? The list of contextual factors goes on and on. Thus, one can conclude that this gold standard (the use of randomly assigned comparison-group research and subsequent meta-analysis) is of only limited use to practicing distance educators. These results may be useful in persuading reluctant colleagues or funders about the efficacy of distance education, but they tell us little that will help us to improve our practice.
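For readers unfamiliar with how such a pooled estimate is produced, here is a minimal sketch of fixed-effect inverse-variance pooling; the study effect sizes and variances are invented purely for illustration. The point is structural: the calculation reduces every study to a number and a weight, and none of the contextual factors listed above ever enters it.

```python
# Minimal sketch (hypothetical data): fixed-effect inverse-variance pooling of
# standardized mean differences from comparison-group studies.
studies = [
    # (effect size d, variance of d) -- illustrative values only
    (0.10, 0.04),
    (-0.05, 0.02),
    (0.20, 0.05),
]

weights = [1.0 / var for _, var in studies]      # inverse-variance weights
pooled_d = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
pooled_se = (1.0 / sum(weights)) ** 0.5

# A single pooled number, stripped of all context about how the learning occurred.
print(f"Pooled effect size d = {pooled_d:.3f} (SE = {pooled_se:.3f})")
```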
Despite this problem, many very influential policy makers are now arguing that unless education adopts this type of “scientific and evidence based research”, we will never make progress in the discipline and will be subject to fads and superstitions forever. The famous American education researcher Robert Slavin (2002) recently contributed to a major revival of the paradigm wars of the 1980s when he argued that educational researchers need to embrace “evidence based learning” rather than a current process that “more resembles the pendulum swings characteristic of art or fashion, rather than the progressive improvements characteristic of science and technology” (p. 16). This plea has fallen on fertile ground in many government circles.