Myths And Misperceptions About Online Learning2
Invited Session featuring researchers who have conducted reviews of online learning published in Review of Educational Research. The session includes a conceptual, traditional, and meta-analytic review of this topic.

Usage Rights

© All Rights Reserved

Presentation Transcript

  • Myths and Misperceptions about Online Learning. Jorge Larreamendy-Joerns, Universidad de los Andes; Mary Tallent-Runnels, Texas Tech University; Julie A. Thomas, Oklahoma State University; Robert Bernard & Phil Abrami, Concordia University. Facilitator: Peter Shea, University at Albany, State University of New York
  • Myths and Misperceptions
    • Longstanding confusion regarding online learning – its purpose, history, quality, interaction, outcomes, the no-significant-difference (NSD) debate, etc.
    • Need for high-quality research, e.g.:
    • Review of Educational Research (RER)
    • #1 scholarly journal in the field of educational research
    • Opportunity for more nuanced views
    • Three kinds of reviews...
  • Conceptual Review
    • Larreamendy-Joerns, J., & Leinhardt, G. (2006). Going the distance with online education. Review of Educational Research, 76(4), 567-606.
  • Traditional Review
    • Tallent-Runnels, M., Thomas, J., et al. (2006). Teaching courses online: A review of the research. Review of Educational Research, 76(1), 93-135.
  • Empirical/Meta-analytic review
    • Bernard, R.M., Abrami, P.C., et al. (2009). A meta-analysis of three types of interaction treatments in distance education. Review of Educational Research, 79(3), 1243-1289.
    • Bernard, R.M., Abrami, P.C., et al. (2004). How does distance education compare to classroom instruction? A meta-analysis of the empirical literature. Review of Educational Research, 74(3), 379-439.
  • Order
    • Jorge Larreamendy-Joerns: “Online education: Promises and Concerns with a Past”
    • Mary Tallent-Runnels & Julie Thomas: “Myths and Misperceptions”
    • Robert Bernard & Philip Abrami: “Distance Education and Online Learning Research: Past, Present and Future”
  • Online education: Promises and concerns with a past Jorge Larreamendy-Joerns Universidad de los Andes, Colombia
    • “Our classroom can be worldwide. Its present bounds are marked by Western China in the far East, Dawson, Fairbanks, and Kodiak Island to the North, Chile to the South, and Senegal and the Union of South Africa in the near East. Our constituency embraces college presidents, government officials, representatives of most of the professions and vocations, and those who by stress of circumstance are debarred from the ordinary means of education.”
  • A sense of history
    • Larreamendy-Joerns, J., & Leinhardt, G. (2006). Going the distance with online education. Review of Educational Research, 76(4), 567-606.
    • Continuity between past (traditional distance education) and present (online education)
  • Some questions
    • What to expect from online education?
    • What to be concerned about?
    • Can online education overcome educational limitations?
    • Can it be an instrument of democratization?
    • Does it amount to marketizing quality education?
    • What is the goal? Deep learning? Flashy technology? Reaching out to audiences?
  • Charting promises and concerns
    • Major visions in the current scene from a learning perspective:
      • Presentational view
      • Performance-tutoring view
      • Epistemic engagement view
    • These views need not be exclusive, but they signal different emphases
    • Urgency to undertake analyses of underlying conceptions of learning, knowledge, practice, and expertise.
  • Going back to the past
    • Online education has inherited many of the concerns and promissory notes of distance education.
      • Faculty resistance
      • Pedagogical imperialism
      • Learning and autonomy
      • Democratization
      • Liberal education
      • Educational quality
  • Democratization
    • Increasing either access to higher education for populations that would otherwise be excluded, or expanding the range of people who might be served by elite institutions.
    • Anna Ticknor’s Society to Encourage Studies at Home (1873).
    • The legacy of William Rainey Harper, president of the University of Chicago.
    • The Department of Home-Study
  • Concerns
    • Distance education as dispensable.
    • Misalignments between expectations and actualities. Underestimation and overestimation.
    • The decline of liberal arts education
    • The issue of quality
  • (Some) lessons learned
    • A nonlinear path
    • Instructional quality and media limitations
    • The equivalence of faculty
  • Pentimento
  • Thanks
  • Mary Tallent-Runnels and Julie Thomas
    • Males and females report similar learning benefits with online learning.
    • Faculty spend the same amount of time teaching online as they do face-to-face.
    • Online classes are taken by many existing students.
    • Universities provide support for faculty teaching online. This includes both instructional and technical assistance.
    • Technical difficulties always lower student achievement.
    • The more that students experience technical difficulties in online classes the lower they rate their instructors.
    • Online classes do not have as much interaction between students and their instructors and among students as in face-to-face classes.
    • Student interest in online classes has leveled off.
    • Taking online classes is more expensive than taking face-to-face classes.
    • Synchronous online classes (where students can work together online with the instructor) are more popular than asynchronous online classes.
    • Formats of online course evaluation are more limited than in face-to-face class evaluation.
    • Online-students have less access to lectures (and other help from instructors) than do face-to-face students.
  • Meta-Analysis of Distance Education and Online Learning Research: Past, Present and Future. Robert M. Bernard & Philip C. Abrami, Centre for the Study of Learning and Performance, Concordia University, Montreal, Canada. 15th Sloan-C International Conference on Online Learning
  • Characteristics of a Systematic Review
    • Systematic—step-by-step approach
    • Comprehensive—encompasses all relevant research
    • Objective—minimizes bias at each step
    • Transparent—reveals all important details
    • Repeatable—other researchers can duplicate it
    • Integrative—reaches general conclusions
    • Explanatory—relates findings to theory/practice
    • Relevant—adds value to practitioners/policymakers
  • Overview of Meta-Analysis
    • Quantitative synthesis of experimental research literature
    • Effect size expresses the standardized difference between treatment and control conditions
    • Estimates the magnitude of the average effect size and explores variability within effect size distributions
    • Explores moderating influences of coded study features
    • Findings generalize to populations of learners better than any single primary study can
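The standardized effect size the slides describe is commonly Hedges' g: the mean difference between treatment and control divided by the pooled standard deviation, with a small-sample bias correction. A minimal sketch of that computation; the summary statistics below are hypothetical, not drawn from any study in the reviews:

```python
import math

def hedges_g(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Standardized mean difference (Hedges' g) between a
    treatment and a control condition."""
    # Pooled standard deviation across both groups
    sd_pooled = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2)
                          / (n_t + n_c - 2))
    d = (mean_t - mean_c) / sd_pooled            # Cohen's d
    correction = 1 - 3 / (4 * (n_t + n_c) - 9)   # small-sample bias correction
    return d * correction

# Hypothetical study: treatment group outscores control by 4 points
g = hedges_g(mean_t=78.0, mean_c=74.0, sd_t=10.0, sd_c=10.0, n_t=30, n_c=30)
```

Because g is in standard-deviation units, it can be averaged across studies that used different outcome measures, which is what makes the syntheses below possible.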
  • The Past and Present: DE and Online Learning Compared to Classroom Instruction
    • Between 2000 and the present, 15 meta-analyses were conducted assessing the difference between DE/Online Learning and Classroom Instruction
    • What have we learned?
    • All forms of DE/OL are at least equal to CI
    • Wide variability surrounds the average effect sizes, with individual studies ranging from highly positive to highly negative
    • Quality of research studies is questionable
  • What About Online Learning? Results of Meta-Analyses of OL vs. CI
  • What Does an ES+ of 0.12-0.15 Mean?
  • What About Online Learning Compared to Asynchronous Distance Education? Note: Asynchronous DE studies are primarily “correspondence, video-based or broadcast” applications
  • The Future: Comparisons Between Different DE/OL Conditions
    • “… the next form of progress to advance theory will be made as researchers begin to examine how structural and technological treatments differ between DE conditions, not between DE and CI.” (Bernard, Abrami, Wade, et al., 2009)
    • “We need studies clarifying when to use e-learning (studies exploring strengths and weaknesses) and how to use it effectively (head-to-head comparisons of e-learning interventions).” (Cook, 2009)
  • How Studies Between DE/OL and DE/OL Might be Meta-Analyzed
    • Problem: Which condition is the “treatment” and which is the “control”?
    • One solution: Choose a construct (e.g., interactivity) and establish “greater presence” as the treatment and “lesser presence” as the control
    • Effect sizes with a (+) valence favor the treatment (greater presence) and those with a (–) valence favor the control (lesser presence)
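Under this sign convention, per-study effect sizes can be pooled with inverse-variance weights, so a positive average favors the greater-presence condition overall. A minimal fixed-effect sketch with made-up study data (the published analyses use more elaborate random-effects models, so this is illustrative only):

```python
def pooled_effect(studies):
    """Fixed-effect (inverse-variance weighted) mean effect size.
    Each study is a (g, variance) pair, with g signed so that (+)
    favors the greater-presence condition."""
    weights = [1.0 / var for _, var in studies]
    weighted_sum = sum(w * g for (g, _), w in zip(studies, weights))
    return weighted_sum / sum(weights)

# Hypothetical studies coded with "greater presence" as the treatment
studies = [(0.50, 0.04), (0.20, 0.02), (-0.10, 0.05)]
g_plus = pooled_effect(studies)  # positive value => greater presence favored
```

Note how the third (negative) study pulls the pooled estimate down without flipping its sign: the valence of each g, not its magnitude alone, carries the comparison.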
  • We Chose Interaction as a Researchable Construct
    • Interaction is commonly understood as actions between/among individuals but here it includes individual or collective interactions with curricular content (Moore, 1989)
      • Student-student (S-S) interaction refers to interaction among individual students or among students working in small groups
      • Student–teacher (S-T) interaction focuses on dialogue between students and the instructor (e.g., clarification, feedback)
      • Student-content (S-C) interaction refers to students interacting with the subject matter to construct meaning, relate it to personal knowledge, and apply it to problem solving
  • What is an “Interaction Treatment”?
    • Definition: ITs are instructional or technology conditions implemented to enable S-S, S-T or S-C interaction
      • S-S Example: students working online collaboratively to complete a project
      • S-T Example: availability of teacher-provided feedback to students
      • S-C Example: students watching streamed video of lectures or any other content
  • Meta-Analysis of Three Interaction Treatments in Distance Education From Bernard, Abrami, Borokhovski, Wade et al. (2009). RER.
  • Summary of Results
    • Overall treatment strength was g+ = 0.38 (k = 74)
    • S-S and S-C treatments produced larger effects than S-T treatments
    • Higher- and medium-strength ITs outperformed low-strength ITs
    • Levels of strength differed (higher > lower) across the combinations S-S + S-C and S-T + S-C
    • Synchronous, asynchronous, and mixed DE did not differ
  • One Big Caveat in this Research
    • We examined the nature of interaction treatments: “instructional arrangements intended to foster interaction”
    • We did not examine levels of interaction directly (how much actual interaction the treatments produced)
    • Measuring interaction represents a challenge for the next generation of primary researchers interested in interaction in DE/OL
  • Evidence-Based Principles for Increasing Achievement Through Interactivity
    • First Principle: Increasing interaction of all kinds (S-S, S-T, S-C) leads to higher achievement gains
    • Second Principle:
      • Support the content; increase learners’ access to and engagement with whatever is intended to be learned
      • Increase access to the teacher or other students depending on the orientation to teacher- or student-centeredness and desired learning activities
      • Provide tertiary levels of support as resources (e.g., personnel, time, technology) allow
  • What We Need to Push the Field Forward
    • More research that compares at least one OL treatment to another OL treatment (no need to get too fancy)
    • Better research designs (if not RCTs, designs that at least control for selection bias)
    • More studies across the grade levels (K-12) and in higher education settings of all types
    • Better-quality descriptions of treatments
    • Better-quality measures
    • Better-quality reporting, preferably with full descriptive statistics
  • References
    • Anderson, T. (2003). Getting the mix right again: An updated and theoretical rationale for interaction. International Review of Research in Open and Distance Learning, 4(2), 9-14.
    • Bernard, R.M., Abrami, P.C., Lou, Y., Borokhovski, E., Wade, A., et al. (2004). How does distance education compare to classroom instruction? A meta-analysis of the empirical literature. Review of Educational Research, 74(3), 379-439.
    • Bernard, R.M., Abrami, P.C., Borokhovski, E., Wade, A., Tamim, R., et al. (2009). A meta-analysis of three interaction treatments in distance education. Review of Educational Research, 79(3), 1243-1289. doi:10.3102/0034654309333844v1
    • Cook, D.A. (2009). The failure of e-learning research to inform educational practice, and what we can do about it. Medical Teacher, 31(2), 158-162.
    • Cook, D.A., Levinson, A.J., Garside, S., Dupras, et al. (2008). Internet-based learning in the health professions: A meta-analysis. Journal of the American Medical Association, 300(10), 1181-1196. doi:10.1001/jama.300.10.1181
    • U.S. Department of Education, Office of Planning, Evaluation, and Policy Development (2009). Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies. Washington, D.C.
    • Moore, M.G. (1989). Three types of interaction. American Journal of Distance Education, 3(2), 1-6.
    • Sitzmann, T., Kraiger, K., Stewart, D., & Wisher, R. (2006). The comparative effectiveness of web-based and classroom instruction: A meta-analysis. Personnel Psychology, 59, 623-664.
  • Examples of Categorization Decisions: Student-Student Interaction
    • Bell, Hudson and Heinan (2004) provided two methods of online learning to Physician Assistant students in a medical terminology course.
    • Both versions of the course used the same materials, but some students worked independently on the Web (Group B), while others (Group A) received 12 case studies in an online conference setting, which they then discussed through asynchronous messaging.
    • We applied the comparative outcomes to “student-student interaction,” counting the case-based discussion participants (Group A) as the experimental group.
  • Examples of Categorization Decisions: Student-Teacher Interaction
    • Rovai (2001) compared fully online instruction in education, delivered via an asynchronous learning network (Group A), with blended instruction that included monthly face-to-face meetings with the instructor (Group B).
    • Students in Group B were given extra chances to communicate (interact) with their teacher (as the means for communication online were equal in both conditions).
    • Group B was designated as the experimental group for “student–teacher interaction.”
  • Examples of Categorization Decisions: Student-Content Interaction
    • Bernard and Naidu (1994) offered nursing students two different strategies for studying in a DE course on Community Mental Health.
    • Participants either created and studied concept maps (Group A) or answered predesigned study post-questions (Group B).
    • Group A interacted with the course materials to a greater extent than Group B and, thus, constituted the experimental condition.