The Achievement Gap in Online Courses through a Learning Analytics Lens


Presentation at San Diego State University on April 12, 2013.

Educational researchers have found that students from under-represented minority families and other disadvantaged demographic backgrounds achieve less in online (or hybrid) courses than in face-to-face course sections (Slate, Manuel, & Brinson Jr., 2002; Xu & Jaggars, 2013). However, these studies treat the "online course" as a homogeneous entity and assume that student participation is uniform. The content and activity of the course remain an opaque "black box", which leads to conclusions that are speculative at best and may further marginalize the very populations they intend to advocate for.

The emerging field of Learning Analytics promises to break open this black box: to understand how students use online course materials and how that use relates to achievement. In this presentation, we will explore the contours of Learning Analytics, look at current applications of analytics, and discuss research applying a Learning Analytics research method to students from at-risk backgrounds. The findings of this research challenge stereotypes of these students as technologically unsophisticated and identify concrete learning activities that can support their success.


    1. The Achievement Gap in Online Courses through a Learning Analytics Lens. John Whitmer, Ed.D., Academic Technology Services, California State University, Office of the Chancellor. San Diego State University, April 12, 2013
    2. Motivating Questions: (1) Do our current uses of academic technologies (such as online learning) decrease or exacerbate the achievement gap? (2) If we don't believe our current uses serve these students, how do we know? What can we do about it?
    3. Agenda: (1) Context: CSU Achievement Gap & Conceptual Framework; (2) Recent Conventional Research in Online Courses; (3) Research using Learning Analytics & Course Redesign; (4) Next Steps & Discussion
    5. Increasing Access to Higher Education in the U.S. (enrollment in thousands; table adapted from data in NCES Digest of Educational Statistics, 2011)

       Group                     1976 Enrollment   % (1976)   2010 Enrollment   % (2010)   Increase
       All Students                    10,986                       21,016                     91%
       White                            9,076         83%           12,723         61%        40%
       Asian/Pacific Islander             198          2%            1,282          6%       548%
       URM Students                     1,493         14%            5,977         28%       300%
         American Indian                   76          1%              196          1%       158%
         Black                          1,033          9%            3,039         14%       194%
         Hispanic                         384          3%            2,741         13%       614%
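The percent-increase column in the table above is simple arithmetic: (2010 value minus 1976 value) divided by the 1976 value. A quick sketch checking a few rows (figures in thousands, taken from the table):

```python
# Enrollment in thousands for selected groups from the table above:
# (1976 value, 2010 value).
enrollment = {
    "All Students": (10_986, 21_016),
    "White": (9_076, 12_723),
    "Hispanic": (384, 2_741),
}

# Percent increase = (new - old) / old * 100, rounded as in the table.
increase = {
    group: round((y2010 - y1976) / y1976 * 100)
    for group, (y1976, y2010) in enrollment.items()
}
print(increase)  # {'All Students': 91, 'White': 40, 'Hispanic': 614}
```

The computed values match the table's "Increase" column, which is the point of the next slide's chart.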
    6. Increased Access to Higher Education. [Bar chart: Enrollment Increase by Race/Ethnicity in Higher Education (1976-2010), 0% to 700%, for the groups in the preceding table. Adapted from data in NCES Digest of Educational Statistics (2011)]
    7. CSU Achievement Gap
    8. Baseline 6-Year Graduation Rate: 46%. Target 6-Year Graduation Rate: 54%. Baseline Achievement Gap: 11%. Target Achievement Gap: 5.5%
    9. No Significant Difference: Framing Academic Technology. [Academic technologies] are "mere vehicles that deliver instruction but do not influence student achievement any more than the truck that delivers our groceries causes changes in our nutrition" (Clark, 1983). [Image courtesy bsabarnowl @ Flickr] Index of studies:
    11. Adaptability to Online Learning: Differences Across Types of Students and Academic Subject Areas (Xu & Jaggars, 2013)
        • Compares persistence and grade between online and f2f courses, for the same student
        • Washington Community and Technical Colleges (2-year)
        • Studied 500,000 course enrollments (10% online), 40,000 individuals
        • Data from 2004-2009
    12. Overall Findings. Table adapted from Xu & Jaggars, 2013
    13. Major Finding by Population (table adapted from Xu & Jaggars, 2013)
        Overall course results:
          Online GPA Average    2.77
          F2F GPA Average       2.98
          Entire Population    -0.215
          Effect by Subject    -0.267
        Select populations:
          Black                                                -0.394
          Males                                                -0.288
          Academic Preparedness (F2F GPA < 3.0 first term)     -0.314
          Age < 25                                             -0.300
          Cohort Effect (courses with 75%+ online at-risk
            students vs. less than 25%)                        -0.359
    14. Major Finding by Subject (table adapted from Xu & Jaggars, 2013)
        Overall course results:
          Online GPA Average    2.77
          F2F GPA Average       2.98
          Entire Population    -0.215
          Effect by Subject    -0.267
        Select subjects:
          English              -0.394
          Applied Knowledge    -0.322
          Social Science       -0.308
    15. Traditional Experimental Design. [Diagram: Treatments: Online Course vs. F2F Course]
    16. [Image courtesy bsabarnowl @ Flickr]
    17. What's your experience with Online or Hybrid Course Design? Who has designed a fully online or hybrid course? (raise hands) Of those who have, how many think it's harder to create online/hybrid materials than to create face-to-face activities? (keep hands raised)
    18. Peering into the blue box. [Diagram: components of online course development: course design; faculty development & training (reassigned time, incentives, etc.); specialist support (instructional designers, etc.); student support; student use of online materials]
    19. Do we need *students* to adapt ... or do we need to change how *we* approach creating & evaluating our technology-enhanced instructional materials?
    21. Learner Analytics: "... measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs." (Siemens, 2011)
    22. Case Study: Intro to Religious Studies
        • Undergraduate, introductory, high demand
        • Redesigned to a hybrid delivery format through the "Academy eLearning" program
        • Enrollment: 373 students (54% increase on largest section)
        • Highest LMS (Vista) usage on the entire campus, Fall 2010 (>250k hits)
        • Bimodal outcomes: 10% increase on the final exam, but 7% & 11% increases in DWF
        • Why? Can't tell with aggregated data
    23. Variables examined
        LMS Use Variables:
        • Administrative Activities (calendar, announcements)
        • Assessment Activities (quiz, homework, assignments, grade center)
        • Content Activities (web hits, PDF, content pages)
        • Engagement Activities (discussion, mail)
        Student Characteristic Variables:
        • Enrollment Status
        • First in Family to Attend College
        • Gender
        • HS GPA
        • Major-College
        • Pell Eligible
        • URM and Pell-Eligibility Interaction
        • Under-Represented Minority
        • URM and Gender Interaction
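The LMS use variables above are per-student aggregations of raw tool-level hit counts. A minimal sketch of that aggregation step, using an invented event log and an invented tool-to-category mapping (the talk does not show the actual Vista export schema, so all names here are illustrative):

```python
from collections import Counter

# Hypothetical tool-to-category mapping mirroring the variable groups above.
# Tool names are assumptions, not the real LMS (Vista) export schema.
CATEGORY = {
    "calendar": "administrative", "announcements": "administrative",
    "quiz": "assessment", "homework": "assessment", "grade_center": "assessment",
    "content_page": "content", "pdf": "content",
    "discussion": "engagement", "mail": "engagement",
}

# Hypothetical raw event log: (student_id, tool) pairs as an LMS might record.
events = [
    ("s1", "quiz"), ("s1", "pdf"), ("s1", "discussion"),
    ("s2", "quiz"), ("s2", "quiz"), ("s2", "calendar"),
]

# Hits per (student, category): the per-student counts that the later
# correlation and regression slides treat as the LMS use variables.
hits = Counter((student, CATEGORY[tool]) for student, tool in events)
print(hits[("s2", "assessment")])  # 2
```

Each (student, category) count then becomes one column in the dataset alongside the student characteristic variables.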
    24. Correlation: Student Characteristics w/ Final Grade. [Scatterplot of HS GPA vs. Course Grade]
    25. Predict the trend: the correlation between LMS use and final grade is _______ compared to student characteristics and final grade: a) 50% smaller, b) 25% smaller, c) the same, d) 200% larger, e) 400% larger
    26. (Repeats the question and options from slide 25.)
    27. Correlation: LMS Use w/ Final Grade. [Scatterplot of Assessment Activity Hits vs. Course Grade]
    28. Combined Variables Regression: Final Grade by LMS Use & Student Characteristic Variables. LMS use variables explain 25% of the change in final grade (r² = 0.25); adding student characteristic variables explains 10% more (combined r² = 0.35).
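The comparison on this slide is a hierarchical regression: fit one block of predictors, then add the second block and see how much r² rises. A sketch with ordinary least squares on synthetic data; the predictor names, coefficients, and noise level are invented for illustration and are not the study's actual model:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Invented predictors: one student characteristic (HS GPA) and one LMS-use
# measure (standardized assessment-activity hits).
hs_gpa = rng.normal(3.0, 0.5, n)
lms_use = rng.normal(0.0, 1.0, n)

# Synthetic final grade, weighting LMS use more heavily than the student
# characteristic, as the slide's r-squared comparison suggests.
grade = 0.2 * hs_gpa + 0.6 * lms_use + rng.normal(0.0, 0.5, n)

def r_squared(predictors, y):
    """r^2 from an ordinary-least-squares fit with an intercept term."""
    X = np.column_stack([np.ones(len(y)), predictors])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    residuals = y - X @ beta
    return 1.0 - residuals.var() / y.var()

r2_student = r_squared(hs_gpa, grade)                               # block 1
r2_combined = r_squared(np.column_stack([hs_gpa, lms_use]), grade)  # blocks 1+2

print(f"student characteristic alone: r^2 = {r2_student:.2f}")
print(f"plus LMS use:                 r^2 = {r2_combined:.2f}")
```

Because the models are nested on the same data, the combined r² can never be lower than the single-block r²; the interesting quantity is how much it rises, which is the 10% increment the slide reports for student characteristics on top of LMS use.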
    29. Correlation: LMS & Student Characteristic Variables w/ Final Grade
    30. At-Risk Students: "Over-Working Gap"
    31. Activities by Pell and Grade. [Chart shows extra effort in content-related activities]
    32. Next Generation Learning Analytics. [Graphic courtesy Sasha Dietrichson, X-Ray Research SRL]
    33. COURSE REDESIGN
        • Flagship: Program in Course Redesign, led by Carol Twigg (1999-2004), Pew-funded: 30 grants, $8.8M budget to redesign courses for improved outcomes & lower costs (institutionalization)
        • Result: 25 of 30 courses reported increased learning outcomes, 5 no change (none worse!)
        • 17 reported reduced DWF (10-20%)
        • Cost reductions of 20-77%, $3M annual savings
        • Adopted (with modifications) by CSU, SDSU, Chico State
        • Chico State evaluations:
    34. Sample Improvements
        • DWF (drop-failure-withdrawal) rates at Drexel were consistently reduced 10-12 percent in the redesigned course.
        • At OSU, withdrawals were reduced by 3 percent, failures by 4 percent, and incompletes by 1 percent. As a result, 248 more students successfully completed the course compared to the traditional course.
        • At TCC, students in redesigned sections had a 68.4 percent success rate compared to 60.7 percent for traditional sections. Success rates were higher for all groups of students regardless of ethnicity, gender, disability, or original placement. The overall success rate for all composition students was 62 percent for the 2002-2003 year compared to 56 percent for the 1999-2000 year prior to redesign.
        • In the traditional course at USM, faculty-taught sections typically retained about 75 percent of students while adjunct- and TA-taught sections retained 85 percent. In the redesign, the retention rate was 87 percent. The rate of D and F grades dropped from 37 percent in the traditional course to 27 percent in the redesigned course.
        • DFW rates dropped from 26 percent in the traditional course to 22 percent in the redesign.
        Source: Program in Course Redesign Round III: Lessons Learned
    35. Underserved Student Experiences
        • More comfort, higher participation in online forums
        • Appreciate ability to anonymously rewind / repeat / review materials
        • English language learners: decreased social anxiety
        Source: Twigg, C. (2005). Increasing Success for Underserved Students: Redesigning Introductory Courses.
    36. Proven PCR Techniques for Underserved Students
        • Interactive online tutorials (not PPT decks)
        • Continuous assessment / feedback
        • Increased student interaction
        • Individualized, on-demand support
        • Undergraduate learning assistants
        • Structural supports that encourage student engagement / progress
    37. 4. DISCUSSION
    38. Discussion
        • Do you think there is an achievement gap at SDSU for under-served students in online / hybrid / tech-enhanced courses? What evidence do you have for those beliefs?
        • What is SDSU doing about any existing achievement gap w/ academic technology? What supports are in place or could be developed?
    39. Feedback? Questions? John Whitmer, jwhitmer@calstate.edu, Twitter: johncwhitmer. Learning Analytics resources: