
Whitmer, Fernandes, Kodai CSU Chico Learner Analytics



  1. 1. Learner Analytics: Beyond the Buzz. DETCHE Conference 2011. Kathy Fernandes, Scott Kodai, John Whitmer. Download presentation at:
  2. 2. “But everything we know about cognition suggests that a small group of people, no matter how intelligent, simply will not be smarter than the larger group. ... Centralization is not the answer. But aggregation is.” - J. Surowiecki, The Wisdom of Crowds, 2004
  3. 3. Ambitious Outline
     1. Situating Analytics
     2. Academic Analytics – Case Study: CSU Data Dashboard
     3. Learner Analytics – Case Study: CSU Chico
     4. Promising Efforts & Resources
     5. Q & A
  5. 5. Steve Lohr, NY Times, August 5, 2009
  6. 6. The Economist. (2010, November 4). Augmented business: Smart systems will disrupt lots of industries, and perhaps the entire economy.
  7. 7. Source: jisc_infonet @
  8. 8. What’s the promise of Analytics for Academic Technologists?
     1. Decision-making (and service evaluation) based on practices (not just perceptions) and performance outcomes
     2. If we’re moving into a strategic role re: teaching and learning, analytics can:
        – demonstrate the link between technology and learning
        – distinguish our role from a technology service provider
     (PS - anyone else concerned about the validity of student evaluations and self-reported data? e.g., “Rate your level of technology expertise (novice, intermediate, expert)”)
  9. 9. Academic Analytics“Academic Analytics marries large data sets with statistical techniques and predictive modeling to improve decision making” (Campbell and Oblinger 2007, p. 3)
  10. 10. Academic Analytics
     1. Term adopted in 2005 ELI research report (Goldstein & Katz, 2005)
        – Response to widespread adoption of ERP systems and the desire to use the data collected for improved decision making
        – 380 respondents; 65% planned to increase capacity in the near future
     2. Call to move from transactional/operational reporting to what-if analysis, predictive modeling, and alerts
     3. LMS identified as potential domain for future growth
  12. 12. CSU Graduation Initiative
     1. System commitment to raise the freshman graduation rate 8% by 2015-2016
     2. Cut the achievement gap for under-represented minority students by 50%
     3. Each CSU campus created its own plan & activities to meet the goals
     More info:
  13. 13. DD Screenshot
  14. 14. Learner Analytics:“ ... measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.” (Siemens, 2011)
  15. 15. Learner Analytics
     1. Assess the relationship between learning context (aka educational technology usage) and student learning and/or achievement
     2. Most research to date: LMS for fully online courses
     3. More complex than Academic Analytics, considering:
        – Variation in LMS usage by course
        – LMS learning actions are patterns, not clicks
        – No-significant-difference literature: it’s not what technology is used, it’s how it’s used, who uses it, and for what purpose
  16. 16. Academic technologists have unique knowledge to design and conduct learner analytics (it’s our magic, a la Richard Katz!)
  24. 24. Learner Analytics on Chico Vista Usage
     1. What is the relationship between LMS usage and student achievement?
     2. What is the relationship between the number of LMS tools used (aka ‘breadth of faculty LMS adoption’) and student achievement?
     3. Perform analysis within courses
     4. Ultimate goal: provide administrators and faculty with what-if modeling tools, building on reports in the data warehouse
  25. 25. CSU Practice
  26. 26. Call to Action
     1. Metrics reporting is the foundation for Analytics
     2. No need to wait for student performance data; good metrics can inspire access to performance data
     3. You’re *not* behind the curve; this is a rapidly emerging area that we can (and should) lead ...
  27. 27. Promising Efforts & Directions
     1. WCET “Predictive Analytics Framework” (
        – Participants: American Public University System, Colorado CCS, University of Hawaii System, University of Illinois at Springfield, Rio Salado College, University of Phoenix
     2. Building Organizational Capacity for Analytics Survey (
     3. Educause Analytics “Capacity Building” initiative (
     Note: each of these efforts is supported by Linda Baer, Gates Foundation
  28. 28. Resources to move forward with Analytics at your campus
     – Learner Analytics bibliography:
     – Visualizing Data: Essential Collection of Resources:
     – Moodle Custom SQL queries report:
     – Bb Stats:
     – Bb Project Astro:
  29. 29. Q&A and Contact Info
     • Kathy Fernandes (
     • Scott Kodai (
     • John Whitmer (
     Download presentation at:
  30. 30. Works Cited
     Arnold, K. E. (2010). Signals: Applying Academic Analytics. EDUCAUSE Quarterly, 33(1).
     California State University Office of the Chancellor. (2010). CSU Graduation Initiative. Retrieved 10/18, 2010, from
     Campbell, J. P. (2007). Utilizing student data within the course management system to determine undergraduate student academic success: An exploratory study. Unpublished Ph.D. dissertation, Educational Studies, United States - Indiana.
     Campbell, J. P., DeBlois, P. B., & Oblinger, D. G. (2007). Academic Analytics: A New Tool for a New Era. EDUCAUSE Review, 42(4), 17.
     Goldstein, P. J., & Katz, R. N. (2005). Academic analytics: The uses of management information and technology in higher education. Washington, DC.
     Macfadyen, L. P., & Dawson, S. (2010). Mining LMS data to develop an “early warning system” for educators: A proof of concept. Computers & Education, 54, 11.
     Offenstein, J., Moore, C., & Shulock, N. (2011). Advancing by Degrees: A Framework for Increasing College Completion.
     Siemens, G. (2011, 8/5). Learning and Academic Analytics.
     Surowiecki, J. (2004). The Wisdom of Crowds. New York: Anchor Books.
  31. 31. BONUS SLIDES!
  32. 32. Academic Analytics Levels & Frequency (table and chart adapted from Goldstein & Katz, 2005)
     Analytics Level                                               Respondents
     Level 1: Extraction and reporting of transaction-level data       263
     Level 2: Analysis and monitoring of operational performance        51
     Level 3: What-if decision support                                   6
     Level 4: Predictive modeling/simulation                             7
     Level 5: Automated triggers/alerts                                 17
     N/A                                                                32
  33. 33. Research Findings
     1. There is not a relationship between sophistication of technology and sophistication of application/deployment
        – The largest raw number of advanced users had simple transactional reporting tools
     2. Factors leading to higher levels of application:
        – Leadership commitment to evidence-based decision making
        – Staff skills
        – Effective end-user training
  35. 35. Data Dashboard Theoretical Framework & Guiding Questions
     1. What percentage of students reach each of the leading indicators?
     2. What is the impact of reaching each of the leading indicators on success rate?
     3. Does meeting any of the indicators reduce or eliminate gaps between student groups?
     Source: Advancing by Degrees: A Framework for Increasing College Completion - Institute for Higher Education Leadership and Policy and The Education Trust
  36. 36. DD Screenshot
  37. 37. DD Screenshot
  39. 39. JP Campbell Dissertation Study (2007): Utilizing student data within the course management system to determine undergraduate student academic success: An exploratory study
     1. LMS usage for the entire university for 1 semester (70,000 records, 27,000 students)
     2. 15 demographic variables, 20 Vista variables
     3. Outcome variable: student grade
     4. Multivariate regression to create a predictive model from the significant variables
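The modeling step described on this slide can be sketched in pure Python: fit an ordinary-least-squares regression on demographic variables alone, then refit with an LMS-usage variable added, and compare R². Everything below is a minimal illustration with synthetic data; the variable names (GPA, units, logins) are placeholders, not Campbell's actual predictors.

```python
# Sketch of the slide's approach: regress grade on demographics only,
# then on demographics + LMS usage, and compare explained variance (R^2).
# All records are synthetic and illustrative, not Campbell's data.

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def ols_r2(X, y):
    """Fit y ~ X (with intercept) via the normal equations; return R^2."""
    Xi = [[1.0] + row for row in X]
    n, p = len(Xi), len(Xi[0])
    XtX = [[sum(Xi[i][a] * Xi[i][b] for i in range(n)) for b in range(p)] for a in range(p)]
    Xty = [sum(Xi[i][a] * y[i] for i in range(n)) for a in range(p)]
    beta = solve(XtX, Xty)
    yhat = [sum(b * x for b, x in zip(beta, row)) for row in Xi]
    ybar = sum(y) / n
    ss_res = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))
    ss_tot = sum((yi - ybar) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

# Synthetic records: [high_school_gpa, units_attempted], plus LMS logins/week.
demo = [[3.1, 12], [2.5, 15], [3.8, 12], [2.9, 9], [3.4, 15], [2.2, 12], [3.6, 14], [2.8, 10]]
logins = [4, 1, 6, 2, 5, 1, 6, 3]
grade = [3.0, 2.0, 3.9, 2.3, 3.5, 1.8, 3.7, 2.5]

r2_demo = ols_r2(demo, grade)
r2_full = ols_r2([d + [l] for d, l in zip(demo, logins)], grade)
print(f"R^2 demographics only: {r2_demo:.3f}")
print(f"R^2 with LMS usage:    {r2_full:.3f}")  # in-sample R^2 never drops when predictors are added
```

The gap between the two R² values is the kind of "increase in predictive accuracy" the quiz on the next slide asks about; Campbell's actual study used far more variables and a much larger sample.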
  40. 40. How much do Vista usage variables increase predictive accuracy compared to predictions based on student characteristics only? a) 0.3% b) 5% c) 12% d) 25% e) 54%
  41. 41. How much do Vista usage variables increase predictive accuracy compared to predictions based on student characteristics only? a) 0.3% b) 5% c) 12% d) 25% e) 54% Prediction rate: 62.4%
  42. 42. Why such a small increase?
     1. Variation in usage creates “missing data” for tools not used in other courses
     2. Lesson learned: perform analysis relative to students within the same course
     3. Next-generation implementation: Purdue Biology course using “Signals” early warning system with students (Arnold, 2010)
        – D/F grades reduced 14%
        – B/C grades increased 12%
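The "lesson learned" above, comparing a student's LMS activity to peers in the *same* course rather than to the whole campus, can be sketched as a within-course z-score. The course names and session counts below are invented for illustration.

```python
# Within-course normalization: a student's LMS activity is scored relative
# to classmates, so a heavy-LMS course and a light-LMS course are comparable.
# Courses, students, and counts are synthetic placeholders.
from statistics import mean, stdev

usage = {  # course -> {student_id: weekly LMS sessions}
    "BIOL101": {"s1": 12, "s2": 8, "s3": 20, "s4": 4},   # heavy-LMS course
    "HIST210": {"s5": 2, "s6": 1, "s7": 3, "s8": 2},     # light-LMS course
}

def within_course_z(usage):
    """Return each student's z-score computed against their own course."""
    z = {}
    for course, students in usage.items():
        vals = list(students.values())
        m, s = mean(vals), stdev(vals)
        for sid, v in students.items():
            z[sid] = (v - m) / s
    return z

z = within_course_z(usage)
# s4 (4 sessions) and s6 (1 session) both score below their course means,
# even though s4's raw count exceeds every student's in HIST210.
```

A campus-wide model would flag every HIST210 student as "inactive"; the within-course score removes that artifact, which is exactly the missing-data problem point 1 describes.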
  43. 43. Macfadyen and Dawson (2010): In a fully online biology course at the University of British Columbia (n=118, 5 sections, 3 semesters), found that:
     1. 33% of student grade variability could be explained by 3 variables (discussion messages posted, mail messages sent, and assessments completed)
     2. 13 variables (out of 22 studied) had significant correlations with final student grade (R² values from .05 to .27)
        – Significant variables included number of online sessions, total time online, and activities within the content, mail, assessment, and discussion areas
        – Variables that were not significant included some predictable items, such as visits to MyGrades, uses of search, ‘who is online’, and the ‘compile’ tool. They also included surprising items, such as the number of assignments read, the time spent on assignments, and announcement views
     3. 73.7% of students were correctly classified as at-risk (i.e. final grade of D or F) through predictions based on these three variables
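The variable screen in point 2 above, correlating each candidate LMS variable with final grade and reporting R² (shared variance), can be sketched as follows. The grades and per-variable counts are synthetic, chosen only to show a strong predictor next to a weak one; the variable names echo the slide but are not Macfadyen and Dawson's data.

```python
# Correlation screen in the spirit of Macfadyen & Dawson (2010): Pearson's r
# between each LMS variable and final grade, reported as r^2. Synthetic data.
from math import sqrt

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

grades = [55, 62, 70, 48, 81, 90, 66, 75, 58, 84]
lms_vars = {
    "discussion_posts": [2, 3, 5, 1, 8, 9, 4, 6, 2, 7],   # built to track grades
    "mail_sent":        [1, 2, 3, 0, 5, 6, 2, 4, 1, 5],
    "mygrades_visits":  [5, 2, 7, 4, 6, 3, 8, 1, 5, 4],   # built to be noise
}

r2_by_var = {name: pearson_r(xs, grades) ** 2 for name, xs in lms_vars.items()}
for name, r2 in sorted(r2_by_var.items(), key=lambda kv: -kv[1]):
    print(f"{name:18s} r^2 = {r2:.2f}")
```

In the actual study a threshold on such R² values (significance testing, in their case) separated the 13 useful variables from the 9 that looked predictive but were not, such as MyGrades visits.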