
Examining the Effect of a Real-Time Student Dashboard on Student Behavior and Student Achievement


In this presentation we describe a randomized controlled trial conducted to determine the effect of a real-time student dashboard on student behavior and student achievement. We also discuss design changes we made to increase student use of our dashboards.

Visit BobBodily.com for more information about my research.



  1. Examining the Effect of a Real-Time Student Dashboard on Student Behavior and Student Achievement
     Robert Bodily, Charles Graham, Tarah Kerr, and Ben Mackley, Brigham Young University
  2. Why do student dashboards matter?
     ● Most learning analytics systems are concerned with collecting data, but then what?
     ● Provide concept-, assignment-, unit-, or course-level feedback to help students identify knowledge gaps
     ● Help students develop metacognitive or self-regulation strategies
     ● Provide information that can be understood at a glance
  3. Review of Research: Dashboard Effect on Behavior
     1. 21% of students accepted the system recommendation to view additional content
     2. Students participating in courses using the system were more likely to continue taking classes than those who did not enroll in these courses
     3. Students who enabled notifications (on 2 out of 3 systems) showed increased contributions in the social network space
     4. Students visited the discussion space more frequently but did not post more frequently
     5. The percentage of posts viewed increased for all students, but there were few sustained changes
     6. The number of students completing assignments increased and LMS use increased
     7. About 50% of students accepted recommendations from the system
     8. There was an 83.3% increase in student interaction after recommendations were given
     9. Students completed assignments more quickly and were able to complete the entire course more quickly
     10. *For two of the three visualizations, student post quantity increased; for one of the three, it decreased
     11. *Students logged in more frequently, completed their coursework more quickly, completed more questions, and answered more questions correctly on assignments
     12. *There were no significant differences between the treatment and control groups in terms of learning efficiency
     *Studies with a sample size greater than 150 that conducted an actual experiment (randomized controlled trial or other equivalent group mean difference test)
  4. Review of Research: Dashboard Effect on Achievement
     1. No significant achievement differences
     2. Increased A’s and B’s; decreased C’s and D’s
     3. No significant achievement differences
     4. Students received more passing grades
     5. Frequency and quality of posts were affected both positively and negatively
     6. Students performed significantly better on the evaluation task
     7. Treatment group performed significantly better on the final exam
     8. *No significant differences between treatment and control
     9. *No significant achievement differences
     10. *No significant achievement differences, but one course showed an effect for Pell-eligible students
     11. *Treatment group performed significantly better on the final exam
     *Studies with a sample size greater than 150 that conducted an actual experiment (randomized controlled trial or other equivalent group mean difference test)
  5. Review of Research: Dashboard Effect on Skills
     1. Significant increase in student self-awareness accuracy
     2. Female students had increased interest when they had a choice to use the system; male students reported higher interest with mandatory notifications
  6. Context 1: Fall 2015, Design
     ● Instructor advocated for video use
     ● Quizzes were short (3-5 questions), easy, and based on the videos
     ● Dashboard was accessed in the LMS next to the videos
     ● Access was given after the first exam
     ● Design: content recommender dashboard
  7. Context 1: Fall 2015, Methods
     ● Students were randomly assigned to the dashboard treatment group or the control group
     ● T-tests were used to verify that the groups were equivalent across all covariates (Exam 1 score, quiz scores, video use); a minimal sketch of this balance check follows
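As an illustration of this kind of covariate balance check, here is a minimal Python sketch using Welch t-tests. The DataFrame layout and column names (`treatment`, `exam1_score`, `quiz_score_mean`, `video_views`) are hypothetical, chosen for illustration; they are not the study's actual data pipeline.

```python
import pandas as pd
from scipy import stats

# Hypothetical covariate column names (the study used Exam 1 score,
# quiz scores, and video use).
covariates = ["exam1_score", "quiz_score_mean", "video_views"]

def balance_check(df: pd.DataFrame) -> pd.DataFrame:
    """Welch t-test comparing treatment vs. control on each covariate."""
    rows = []
    for cov in covariates:
        treated = df.loc[df["treatment"] == 1, cov].dropna()
        control = df.loc[df["treatment"] == 0, cov].dropna()
        t, p = stats.ttest_ind(treated, control, equal_var=False)
        rows.append({"covariate": cov,
                     "treatment_mean": treated.mean(),
                     "control_mean": control.mean(),
                     "t_stat": t,
                     "p_value": p})
    return pd.DataFrame(rows)

# Non-significant p-values (e.g., all p > 0.05) suggest the
# randomization produced groups that are equivalent on these covariates.
```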
  8. Context 1: Fall 2015, Results
     ● Randomized controlled trial showed no significant differences between treatment and control for student behavior or student achievement
     ● Low student use (see the sketch below for how such usage metrics can be computed)
       ○ 11.5 clicks per session, 2 sessions per student, 42% of students accessed the dashboard
       ○ Data quality of dashboard?
       ○ Not useful to students?
     ● Evaluation showed many students:
       ○ Did not know they had access to the dashboard
       ○ Did not know how to use the system
       ○ Did not think it would be useful
       ○ Did not have time
       ○ Had a lot of other resources
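A minimal sketch of how dashboard-use metrics like these (percent access, sessions per student, clicks per session) can be derived from a raw clickstream. The event-log schema (`user_id`, `timestamp` columns) and the 30-minute session timeout are assumptions for illustration, not the authors' actual pipeline.

```python
import pandas as pd

SESSION_TIMEOUT = pd.Timedelta(minutes=30)  # assumed session boundary

def usage_metrics(events: pd.DataFrame, n_enrolled: int) -> dict:
    events = events.sort_values(["user_id", "timestamp"]).copy()
    # A new session starts at a user's first click, or whenever the gap
    # since that user's previous click exceeds the timeout.
    gap = events.groupby("user_id")["timestamp"].diff()
    events["session_id"] = (gap.isna() | (gap > SESSION_TIMEOUT)).cumsum()

    users = events["user_id"].nunique()
    sessions = events["session_id"].nunique()
    clicks = len(events)
    return {
        "percent_access": 100.0 * users / n_enrolled,
        "sessions_per_student": sessions / users,
        "clicks_per_session": clicks / sessions,
    }
```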
  9. Context 2: Winter 2016, Design
     ● Put the videos within a videos tab in the dashboard to increase use (helping students see the data visualizations more frequently)
     ● Longer quizzes, but still formative (unlimited attempts)
     ● Everyone had access
     ● Design: scatterplot dashboard
  10. Context 2: Winter 2016, Results
     ● Low student use
       ○ Data quality in dashboard?

                             Fall    Winter
     Percent access          42%     48%
     Sessions per student    1.98    1.89
     Clicks per session      11.5    14.2
  11. Context 3: Spring 2016, Design
     ● Everyone had access
     ● Demo the dashboard for everyone at the beginning of the semester
     ● Quizzes are graded and attempts are limited to 3 (high stakes)
     ● Teach the TAs about the dashboard and have them encourage its use
     ● Have the instructor mention the benefits of the system
     ● Provide more resources (videos, practice quizzes, and web resources)
     ● Design: content recommender 2 dashboard
  12. Context 3: Spring 2016, Design
  13. Context 3: Spring 2016, Results
     ● Increased frequency of use
     ● Decreased clicks per session (students are more efficient)
     ● Perceptions of dashboard

                             Fall    Winter   Spring
     Percent access          42%     48%      80%
     Sessions per student    1.98    1.89     3.28
     Clicks per session      11.5    14.2     10.6
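Whether the Spring jump in percent access (42% to 80%) is statistically meaningful depends on enrollment counts, which the slides do not report. A hedged sketch of how one could test the difference in access rates, using hypothetical class sizes of 200:

```python
from scipy.stats import chi2_contingency

# Hypothetical enrollments; the slides report percentages only.
n_fall, n_spring = 200, 200
fall_accessed = round(0.42 * n_fall)       # 84 students
spring_accessed = round(0.80 * n_spring)   # 160 students

# 2x2 contingency table: accessed vs. did not access, Fall vs. Spring.
table = [[fall_accessed, n_fall - fall_accessed],
         [spring_accessed, n_spring - spring_accessed]]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4g}")   # small p: access rates differ
```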
  14. Principles Learned
     ● Students need to be aware of and trained in a new system
       ○ Send students reminders
       ○ Instructor and teaching assistants can discuss benefits of the system
       ○ Demo the system for the students
     ● Systems need to be directly related to helping students achieve their goals
       ○ Unit-level feedback helped with the test review process
       ○ Help students identify knowledge gaps
       ○ Remediate knowledge gaps with videos, text resources, and practice questions
     ● Usability tests and system evaluations are necessary
       ○ We changed a lot after our usability and evaluation tests
  15. Future Research
     ● How can we support students as they engage with online feedback?
     ● More rigorous evaluations and measured-effects research (randomized controlled trials, quasi-experimental methods)
