In this presentation, I discuss the technical requirements for collecting learning analytics data in an open environment, describe the analytics system we have created to facilitate real-time data collection, show screenshots of our student and instructor dashboards, and present some statistical analyses conducted to improve our dashboards.
Visit BobBodily.com for more information about my research.
Using real-time dashboards to improve student engagement in virtual learning environments
1. Using real-time dashboards to improve student engagement in virtual learning environments
Robert Bodily, Steven Wood
Brigham Young University
2. Research questions
Dashboard Development
• R1: What technical requirements are needed for an online learning system to collect and provide students with personalized information in a student dashboard?
• R2: What functionality do students want in a dashboard, and how should it be visually represented?
Data Mining Analysis
• R3: What is predictive of student success in a first-year general chemistry course?
• R4: Can we develop an early-alert warning system using clickstream data?
4. Challenges with current LMS
• A lot of learning is not occurring within the Learning Management System (LMS)
  • Interoperability standards
• No access to real-time data
  • Canvas data (1 day old)
  • API (rate limiting factors)
• Do not track enough data
  • No information on how students interact with a page
5. Our analytics system
• LTI (Learning Tools Interoperability): single sign-on standard for connecting learning applications and learning management systems
• xAPI (Experience API/Tin Can API): data format specification for interoperability across learning systems
• LRS (Learning Record Store): database that stores xAPI statements and provides real-time data access (see the sketch below)
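Because the LRS exposes statements over the standard xAPI REST interface, a learning tool can record events with a plain HTTP POST. Below is a minimal Python sketch; the LRS endpoint, credentials, and activity IDs are hypothetical, but the actor/verb/object structure and the X-Experience-API-Version header follow the xAPI specification.

    # Minimal sketch of recording one learning event in an LRS. The endpoint
    # URL and credentials are hypothetical; the statement structure follows
    # the xAPI specification (actor / verb / object).
    import requests

    LRS_ENDPOINT = "https://lrs.example.edu/xapi/statements"   # hypothetical LRS
    AUTH = ("lrs_key", "lrs_secret")                           # hypothetical credentials

    statement = {
        "actor": {"mbox": "mailto:student@example.edu", "name": "Example Student"},
        "verb": {
            "id": "http://adlnet.gov/expapi/verbs/answered",
            "display": {"en-US": "answered"},
        },
        "object": {
            "id": "https://chem.example.edu/quiz/week-03/question-5",
            "definition": {"name": {"en-US": "Week 3 quiz, question 5"}},
        },
        "result": {"success": True, "response": "B"},
    }

    resp = requests.post(
        LRS_ENDPOINT,
        json=statement,
        auth=AUTH,
        headers={"X-Experience-API-Version": "1.0.3"},
    )
    resp.raise_for_status()   # on success the LRS returns the stored statement ID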
10. Context
• Class
  • First-year general chemistry course
  • Blended: in-person class 3x per week
• Resources
  • 150 videos (avg. 2 min long; supplemental resources)
  • 15 weekly quizzes (unlimited question attempts)
• Participants
  • 200 students (online interaction data)
  • 96 students took the self-report resource use survey
11. Data collected
• Quiz
  • Confidence in answer (just a guess, pretty sure, very sure)
  • Time spent on quiz
  • Correct/incorrect
  • Number of attempts per question
  • Leave tab (still open, but inactive) and return to tab (active again)
• Video
  • Play, pause, skip forward/backward, change play rate, change volume
• Dashboard
  • Number of times students follow recommendations given in the dashboard
  • Number of clicks within the dashboard
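To give a concrete (hypothetical) picture of how these raw events become model inputs, the sketch below aggregates an exported event log into per-student features with pandas. The file name and column names are assumptions, not the actual schema used in the study.

    # Sketch: aggregate raw clickstream events into per-student features.
    # "lrs_export.json", "student_id", "event_type", and "session_id" are
    # hypothetical names, not the study's real schema.
    import pandas as pd

    events = pd.read_json("lrs_export.json")

    features = events.groupby("student_id").agg(
        n_events=("event_type", "size"),
        n_sessions=("session_id", "nunique"),
        n_video_plays=("event_type", lambda s: (s == "video_play").sum()),
        n_quiz_attempts=("event_type", lambda s: (s == "quiz_attempt").sum()),
    )
    print(features.head())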
12. What course elements are predictive of student success?

Model 1 – regressing online interaction data on final exam score:

Variable                          Beta     P-value
Online homework score              0.366   0.000
In-class iClicker scores           0.154   0.024
# of attempts/question            -0.411   0.000
Amount of question navigation     -0.206   0.040
# of online activity sessions     -0.195   0.020

Model 2 – regressing self-report resource use on final exam score:

Variable                          Beta     P-value
Read the textbook                  2.443   0.059
Ask professor questions in class   7.363   0.000
Watch Khan Academy                -2.738   0.051
Use the internet                  -3.199   0.010
Skip recitation                   -4.820   0.041
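A minimal sketch of how a model in the spirit of Model 1 could be fit with statsmodels; the data file and column names are placeholders for the study's actual variables.

    # Sketch of an OLS fit in the spirit of Model 1: regress final exam score
    # on online interaction features. File and column names are hypothetical.
    import pandas as pd
    import statsmodels.api as sm

    df = pd.read_csv("course_features.csv")

    X = df[[
        "homework_score",
        "iclicker_score",
        "attempts_per_question",
        "question_navigation",
        "n_activity_sessions",
    ]]
    y = df["final_exam_score"]

    model = sm.OLS(y, sm.add_constant(X)).fit()
    print(model.summary())   # reports beta coefficients and p-values, as in the tables above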
14. Develop an early course prediction of student achievement

(Figure: weekly model performance for two models – one using online student interaction data only, and one using online student interaction data AND exam scores.)

Both models improve significantly until week 3 or 4, so that seems to be a good time to make predictions for instructors and students.
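One way to run that week-by-week comparison, sketched under heavy assumptions: a single cumulative event count stands in for the study's real feature set, and the file and column names are hypothetical.

    # Sketch of the early-alert analysis: refit the model using only events
    # observed through week k and track how fit quality changes. The single
    # "n_events" feature is a stand-in for the study's real feature set.
    import pandas as pd
    import statsmodels.api as sm

    events = pd.read_csv("weekly_events.csv")                      # hypothetical: one row per event, with a "week" column
    outcomes = pd.read_csv("final_exams.csv", index_col="student_id")

    for week in range(1, 15):
        so_far = events[events["week"] <= week]
        X = so_far.groupby("student_id").size().rename("n_events").to_frame()
        df = X.join(outcomes, how="inner")
        fit = sm.OLS(df["final_exam_score"], sm.add_constant(df[["n_events"]])).fit()
        print(f"week {week:2d}: R^2 = {fit.rsquared:.3f}")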