Learner Analytics and the “Big Data” Promise for Course & Program Assessment

Presentation delivered at the San Diego State University "One Day in May" conference on May 22, 2012, by John Whitmer, Hillary Kaplowitz, and Thomas J. Norman.

Universities archive massive amounts of data about students and their activities. Students also generate significant amounts of “digital exhaust” as they use academic technologies. How can faculty and administrators use automated analysis of this data to save time and conduct targeted interventions to improve student learning?

The emerging discipline of Learner Analytics conducts analysis of this data to learn about student behaviors, predict students at-risk of failure, and identify potential interventions to help those students. In this presentation, we will discuss the contours of this discipline and review the state of research conducted to date. We will then look at several examples of Learner Analytics services and hear from California State University educators who are using these tools to help their students. Finally, we will suggest some immediate ways that Analytics can be conducted at San Diego State.

John Whitmer, California State University, Chico
Hillary Kaplowitz, California State University, Northridge
Thomas J. Norman, CSU Dominguez Hills

  • Kathy
  • Comparison of 8 robo-graders to human grading: < .10 difference. Mark Shermis, University of Akron; Ben Hamner, Kaggle, Inc. Supported by the Hewlett Foundation.
  • Redundant?
  • Here is the oldest excuse in the book – “The dog ate my homework”
  • But now we have new excuses – the electronic dog ate my electronic homework… the computer messed up. I uploaded it. Or they upload the wrong file. Or an empty one. Or the wrong format… or… or….
  • So here is an email I got from one of my students
  • I want to believe him. He’s an A student, but that’s not fair…
  • The Moodle report by activity and student showed me he accessed the assignment before the deadline, but there was no upload, so there was no way to know whether he did the work or not.
  • But it was a Google Docs assignment, so I could go into the revision history and verify that he indeed did the work before the deadline!
  • He used data to his advantage!
  • They say justice is blind, but in this case it is not. I had another student tell me that her grade was missing on Moodle and that she knew she had done the work. I checked her activity on Google Docs: she did do it, but she finished at 12:22 a.m., 22 minutes late. I gave her credit for the assignment but marked it down for being late; when I explained this to her and how I checked it, she understood.
  • Next story – students complain the work is too hard! Or… in this case
  • Economics class converted to hybrid. Students met only once a week and were given this schedule to follow, a carefully designed sequence to help them learn difficult material that takes time and practice: first watch the lectures; then read the book; then do the online activities; post questions and take the practice quiz; then come to class **with questions and problems to discuss**; then take the graded quiz online.
  • Facebook status updates are best at 4 p.m. – what if we had data about the best time to reach our students?
  • Bb Learn tools: reports by each content item, course reports, Performance Dashboard, Early Warning System, Grade Center.


  • 1. Learner Analytics and the “Big Data” Promise for Course & Program Assessment. San Diego State University “Day in May,” 22 May 2012. John Whitmer, CSU Chico (& Office of the Chancellor); Hillary Kaplowitz, CSU Northridge; Thomas Norman, CSU Dominguez Hills. Download slides at: http://bit.ly/Kb6gsV
  • 2. Outline: 1. Promise of Learner Analytics; 2. Case Studies: a) Analytics at work in the classroom (Hillary), b) Improving classroom discussion and mastery of program-level outcomes (Thomas), c) Evaluating course redesign (John); 3. Analytics Tools @ SDSU; 4. Q & A
  • 4. John Goodlad’s Place-Based Research Classroom-based research: “What is schooling?” 1,000 classrooms, 27,000 individuals 14 foundations needed to support Fundamental changes to understanding of educational practice
  • 5. Steve Lohr, NY Times, August 5, 2009
  • 6. Economist. (2010, 11/4/2010). Augmented business: Smart systems will disrupt lots of industries, and perhaps the entire economy. The Economist.
  • 7. Current GPA: 3.3. First in family to attend college. SAT score: 877. Hasn’t taken college-level math. No declared major. Source: jisc_infonet @ Flickr.com http://slidesha.re/IgKSTX
  • 8. Academic Analytics: “Academic Analytics marries large data sets with statistical techniques and predictive modeling to improve decision making” (Campbell and Oblinger 2007, p. 3)
  • 9. DD Screenshot
  • 10. Learner Analytics“ ... measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.” (Siemens, 2011)
  • 11. Fundamental Questions behind Learner Analytics: 1. What are students doing (or not doing)? Which students are we talking about? 2. Does it matter (re: achievement, engagement, learning)? 3. What should we do? Changes in student behavior? Changes in faculty/program?
  • 12. SIGNALSPurdue Signals Project http://www.itap.purdue.edu/studio/signals/
  • 13. Wordcloud of student evaluations of Course Signals (Arnold, 2010)
  • 14. Signals Course Outcomes Fall 2009 Compare same course (w/Signals v. w/o Signals) -6.41% DWF +10.97% A/B (Arnold, 2010)
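The Signals outcome figures on this slide are simply differences in grade-band rates between the two offerings. A minimal sketch of that comparison, using invented grade lists rather than Purdue's actual data:

```python
# Hypothetical grade lists for illustration only; not the Signals study's data.
def rate(grades, targets):
    """Fraction of students whose grade falls in the `targets` set."""
    return sum(g in targets for g in grades) / len(grades)

without_signals = ["A", "B", "C", "D", "F", "W", "B", "C", "D", "A"]
with_signals    = ["A", "B", "B", "C", "A", "B", "C", "D", "A", "B"]

dwf_change = rate(with_signals, {"D", "W", "F"}) - rate(without_signals, {"D", "W", "F"})
ab_change  = rate(with_signals, {"A", "B"}) - rate(without_signals, {"A", "B"})
print(f"DWF change: {dwf_change:+.2%}, A/B change: {ab_change:+.2%}")
```

The same two-line computation applies to any before/after course comparison, provided the populations are comparable.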
  • 15. KHANKhan Academy http://www.khanacademy.org/
  • 16. Source: wpclipart.com
  • 17. Will Learner Analytics replace educators with computer algorithms? By Peter Nowak, Macleans.ca
  • 18. “Robots are grading your papers!” - Marc Bousquet, 4/18/2012, blog post, Chronicle of Higher Education
  • 19. Contrasting State-of-the-Art Automated Scoring of Essays: Analysis. Mark D. Shermis, University of Akron (funded by the Hewlett Foundation). Compared 8 robo-graders to human grading on standardized essay questions from 6 states. Outcome: very small difference in results (.03 to .12 score difference). Conclusion: are scoring algorithms sophisticated… or are standardized essays simplistic… or do we need to stop the dichotomy of computers and people?
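A common way to quantify how closely automated scores track human ones is an agreement statistic; quadratic weighted kappa, the metric used in the related Hewlett-sponsored Kaggle competition, is sketched below on invented scores (not the study's data):

```python
from collections import Counter

def quadratic_weighted_kappa(rater_a, rater_b, min_score, max_score):
    """Agreement between two integer score lists; 1.0 = perfect agreement."""
    n = max_score - min_score + 1
    total = len(rater_a)
    # Observed counts of each (score_a, score_b) pair
    observed = [[0] * n for _ in range(n)]
    for a, b in zip(rater_a, rater_b):
        observed[a - min_score][b - min_score] += 1
    # Expected counts under chance agreement: outer product of marginals
    count_a, count_b = Counter(rater_a), Counter(rater_b)
    numerator = denominator = 0.0
    for i in range(n):
        for j in range(n):
            weight = (i - j) ** 2 / (n - 1) ** 2  # quadratic penalty
            expected = count_a[i + min_score] * count_b[j + min_score] / total
            numerator += weight * observed[i][j]
            denominator += weight * expected
    return 1.0 - numerator / denominator

human = [2, 3, 4, 4, 1, 2, 3]  # invented human scores
robo  = [2, 3, 4, 3, 1, 2, 4]  # invented machine scores
print(round(quadratic_weighted_kappa(human, robo, 1, 4), 3))
```

Kappa corrects for chance agreement, so it is a more honest comparison than the raw score differences quoted on the slide.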
  • 20. -- Tom Vander Ark, Gettingsmart.com
  • 21. Or analytics can support faculty by …1. Providing behavioral data to investigate student performance2. Informing faculty about students succeeding or at risk of failing a course3. Warning students that they are likely to fail a course – before it’s too late4. Helping faculty evaluate the effectiveness of practices and course designs5. Customizing content and learning activities
  • 23. How can data help teachers and students work better together? Hillary Kaplowitz, Instructional Designer, Faculty Technology Center; Part-Time Faculty, Cinema and Television Arts Department, California State University, Northridge
  • 24. Case #1: “I’m not upset that you lied to me, I’m upset that from now on I can’t believe you.” Friedrich Nietzsche
  • 25. “Hey Professor, I just looked at my assignments and realized that my Chapter 11 summary did not get submitted, which I’m having trouble believing, that I didn’t submit it... especially because I see that I did it, and I always submit my assignments as soon as I finish them.”
  • 26. Now the hard part… Do I believe him? If only I could check…
  • 27. And it was all his idea… The student suggested that I check Moodle, and if that didn’t work, told me how to check the Revision History in Google Docs with step-by-step directions!
  • 28. Case #2: “Life isn’t fair. It’s just fairer than death, that’s all.” William Golding
  • 29. “The quiz is unfair”
  • 30. Hybrid Course Weekly Structure: 1. Watch lectures; 2. Read textbook; 3. Online chat and tutoring; 4. Post questions and take practice quiz; 5. Class meets; 6. Aplia quiz
  • 31. But the story was not that simple…• Reports on Moodle painted a different picture• Student was watching the lectures at 10:00 p.m.• Then immediately taking quiz
  • 32. Enabled constructive feedback… Advised the student how the structure of the course was designed to enhance learning Student revised their study habits Improved grades and thanked the instructor!
  • 33. What we can do with data now Use Reports in Moodle to verify student claims Review participant list to see last access time Empower students to review their own reports Analyze usage and advise students how to study better Review quiz results to find common misconceptions
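Most of the checks in the slide above reduce to reading activity logs. A minimal sketch, assuming a hypothetical CSV export with user/action/timestamp columns (real Moodle reports are richer and formatted differently):

```python
# Hypothetical LMS activity export; column names and log format are assumptions.
import csv
import io
from datetime import datetime

log_csv = """user,action,timestamp
alice,view_assignment,2012-04-30 21:10
alice,submit,2012-05-01 00:22
bob,view_assignment,2012-04-30 20:05
"""

deadline = datetime(2012, 5, 1, 0, 0)  # assignment due at midnight
last_access, submissions = {}, {}
for row in csv.DictReader(io.StringIO(log_csv)):
    t = datetime.strptime(row["timestamp"], "%Y-%m-%d %H:%M")
    last_access[row["user"]] = max(last_access.get(row["user"], t), t)
    if row["action"] == "submit":
        submissions[row["user"]] = t

for user in last_access:
    sub = submissions.get(user)
    if sub is None:
        print(f"{user}: accessed, no submission on record")
    else:
        minutes_late = (sub - deadline).total_seconds() / 60
        status = f"{minutes_late:.0f} min late" if minutes_late > 0 else "on time"
        print(f"{user}: submitted ({status})")
```

This mirrors the stories above: one student's 12:22 a.m. submission shows as 22 minutes late, and another shows access with no submission on record.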
  • 34. Could we help improve student learning outcomes if we knew the effect of… coffee amount, Facebook, sequencing of LMS activities, attendance, mobile LMS access, the textbook?
  • 35. Using Learner Analytics to ImproveClassroom Discussion and Mastery of Program Level Outcomes Thomas J. Norman, Ph. D. California State University, Dominguez Hills tnorman@csudh.edu
  • 36. Solving the Student Effort Challenge • Prior surveys revealed that a majority of Management students were reading 5 chapters or less of the assigned 15 chapters • The course average on the cumulative final was around 70% • Using online assessments has boosted these scores 7-8 percentage points! • These are tools made available by McGraw Hill and Aplia that you can use too: McGraw Hill $39-$99 with eBook; Cengage Aplia $99 with eBook
  • 37. Benefits of Online Assignments • Assignments are due Sunday at 11:45 or 11:59 p.m. • They ensure students have read and begun working with the concepts BEFORE classroom discussion and activities • Provide immediate feedback • Automatically graded
  • 38. Aplia Real-Time Metrics: Progress and Mastery; At Risk
  • 39. LearnSmart: Tale of 3 Students. Student 1 was warned to keep up, ignored the warning, and failed the course. Student 2 knew the material and completed the homework in 6 hours: an A student. Student 3 struggled early but caught up and did well: an A- student.
  • 40. Analysis by AACSB Categories
  • 41. Performance by Learning Objective/Difficulty Why do my students do better at medium difficulty questions? Not working! This is working!
  • 43. LMS Learner Analytics @ Chico State: Evaluation for Program Assessment. Academy e-Learning course redesign. Intro to Religious Studies: increased enrollment from 80 to 327 students in the first semester. Outcome: increased mastery of course concepts AND increased number of D/W/F students. Why? (And for whom? And what did they do?) What is the relationship between LMS actions, student background characteristics, and student academic achievement? (The 6-million-dollar question.)
  • 44. Grade Distribution
  • 45. Grades by LMS “Dwell Time”
  • 46. Grades by Dwell & Tool
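A first cut at the dwell-time question behind these slides is a simple correlation between time spent in the LMS and final grade. The numbers below are invented for illustration, not Chico State's data:

```python
# Pearson correlation on hypothetical dwell-time and grade data.
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

dwell_hours = [2, 5, 8, 12, 15, 20]            # hours logged in the LMS (invented)
grade_points = [1.0, 2.0, 2.3, 3.0, 3.3, 3.7]  # course grade points (invented)
print(round(pearson_r(dwell_hours, grade_points), 2))
```

Correlation alone cannot answer the "why, and for whom" questions on the previous slide; it only flags whether dwell time is worth modeling alongside student background characteristics.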
  • 48. Blackboard Learn Tools
  • 49. BLACKBOARD LEARN: Activity Stats Reports
  • 50. Evaluation: Course Reports
  • 51. Performance Dashboard
  • 52. Evaluation: Early Warning System
  • 53. Grade Center: “Smart Views”
  • 54. Student View: “Report Card”
  • 55. Call to Action: 1. You’re *not* behind the curve; this is a rapidly emerging area that we can (and should) lead… 2. Metrics reporting is the foundation for Analytics. 3. We don’t need to wait for student characteristics and detailed database information; LMS data can provide significant insights. 4. If there are any ed-tech software folks in the audience, please help us with better reporting!
  • 56. Draft DOE Report, released April 12: http://1.usa.gov/GDFpnI
  • 57. Q&A and Contact Info. Resources Google Doc: http://bit.ly/HrG6Dm. Contact info: John Whitmer (jwhitmer@csuchico.edu); Hillary C. Kaplowitz (hillary.kaplowitz@csun.edu); Thomas Norman (tnorman@csudh.edu). Download presentation at: http://bit.ly/Kb6gsV
  • 58. Works Cited
    Adams, B., Bienkowski, M., Feng, M., & Means, B. (2012). Enhancing Teaching and Learning through Educational Data Mining and Learning Analytics: An Issue Brief. Washington, D.C.: U.S. Department of Education, Office of Educational Technology.
    Arnold, K. E. (2010). Signals: Applying Academic Analytics. EDUCAUSE Quarterly, 33(1).
    Bousquet, M. (2012). Robots Are Grading Your Papers. Retrieved from http://chronicle.com/blogs/brainstorm/robots-are-grading-your-papers/45833
    Campbell, J. P., DeBlois, P. B., & Oblinger, D. G. (2007). Academic Analytics: A New Tool for a New Era. EDUCAUSE Review, 42(4), 17.
    Economist. (2010, 11/4/2010). Augmented business: Smart systems will disrupt lots of industries, and perhaps the entire economy. The Economist.
    LaValle, S., Hopkins, M., Lesser, E., Shockley, R., & Kruschwitz, N. (2010). Analytics: The new path to value. Findings from the 2010 New Intelligent Enterprise Global Executive Study and Research Project. IBM Institute for Business Value and MIT Sloan Management Review.
    Manyika, J., Chui, M., Brown, B., Bughin, J., Dobbs, R., Roxburgh, C., & Hung Byers, A. (2011). Big data: The next frontier for innovation, competition, and productivity.
    Parry, M. (2012, 5/14/2012). Me.edu: Debating the Coming Personalization of Higher Ed. Chronicle of Higher Education. Retrieved from http://chronicle.com/blogs/wiredcampus/me-edu-debating-the-coming-personalization-of-higher-ed/36057
    Siemens, G. (2011, 8/5). Learning and Academic Analytics. Retrieved from http://www.learninganalytics.net/