Visualizing Activities for Self-reflection and Awareness

The slides of my talk at ICWL 2010 in Shanghai, China.



  1. VISUALIZING ACTIVITIES FOR SELF-REFLECTION AND AWARENESS. Sten Govaerts, Katrien Verbert, Joris Klerkx, Erik Duval, Katholieke Universiteit Leuven (Belgium). http://www.role-project.eu
  2. OVERVIEW • Problem statement • Objectives • Design & Implementation • Evaluation • Conclusion
  3.–6. PROBLEM (image-only slides)
  7. OBJECTIVES • self-monitoring for learners • awareness for teachers • time tracking • learning resource recommendation
  8. DEMO http://ariadne.cs.kuleuven.be/monitorwidget-cgiar
  9.–12. STUDENT ACTIVITY MONITOR (image-only slides)
  13. DATA & DEPLOYMENTS • Contextualized Attention Metadata (CAM) data • deployed in: ROLE PLE, RWTH Aachen engineering, tim3track3r, Moodle (a sketch of a CAM-style event record follows after the slide list)
  14. EVALUATION DATA • no students using a CAM-controlled PLE yet... • need time tracking data • use tim3track3r (next slide)
  15. TIM3TRACK3R http://tim3track3r.appspot.com
  16. EVALUATION • usability and user satisfaction evaluation • 12 CS students • 2 evaluation sessions: a task-based interview with think-aloud (after 1 week of tracking) and a user satisfaction questionnaire (SUS & MSDT) (after 1 month)
  17. LEARNABILITY, ERRORS & EFFICIENCY • in general, people understand the visualizations well! • some issues were uncovered...
  18.–25. LEARNABILITY, ERRORS & EFFICIENCY (image-only slides)
  26. USER SATISFACTION • average SUS score: 73% (stdev: 9.35) (a sketch of standard SUS scoring follows after the slide list)
  27.–30. USER SATISFACTION (image-only slides)
  31. CONCLUSION & FUTURE WORK • people can use and understand the tool, but we still need to evaluate its usefulness • evaluate the real usefulness of the tool with more students • we have already done a study with teachers
  32. USE IT!? • I can put your data into the tool! • I would like to use your course for evaluation!
  33. THANK YOU! QUESTIONS? Slides will appear on http://www.slideshare.net/stengovaerts
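
Slide 13 names Contextualized Attention Metadata (CAM) as the data source for the Student Activity Monitor. As a rough illustration only (the field names, event structure, and aggregation below are my own assumptions, not the actual CAM schema or the tool's code), a minimal sketch of such usage data and a per-tool time summary:

```python
from dataclasses import dataclass
from datetime import datetime
from collections import defaultdict

@dataclass
class AttentionEvent:
    """One CAM-like usage event (illustrative fields, not the official CAM schema)."""
    user: str          # learner identifier
    tool: str          # application or widget that produced the event
    action: str        # e.g. "open", "edit", "close"
    resource: str      # URL or id of the learning resource
    timestamp: datetime
    duration_sec: int  # time spent, if the logger records it

def time_per_tool(events):
    """Aggregate total time spent per tool: the kind of summary an
    activity monitor could visualize per learner or per course."""
    totals = defaultdict(int)
    for e in events:
        totals[e.tool] += e.duration_sec
    return dict(totals)

# Example with made-up data:
events = [
    AttentionEvent("alice", "wiki", "edit", "http://example.org/page1",
                   datetime(2010, 12, 9, 10, 0), 600),
    AttentionEvent("alice", "forum", "read", "http://example.org/thread7",
                   datetime(2010, 12, 9, 10, 15), 300),
]
print(time_per_tool(events))  # {'wiki': 600, 'forum': 300}
```

Summaries of this shape (time per tool, per resource, or per day) are the kind of input the visualizations shown in slides 9–12 could plot.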
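Slide 26 reports a mean SUS score of 73% (stdev 9.35) over the 12 participants. For readers unfamiliar with the questionnaire, this is how a single respondent's System Usability Scale score is conventionally computed from the ten 1–5 Likert answers; it is the generic SUS formula, not code from the evaluation itself:

```python
def sus_score(responses):
    """Compute the System Usability Scale score (0-100) for one respondent.

    `responses` holds the ten answers, in questionnaire order, on a
    1 (strongly disagree) to 5 (strongly agree) scale.
    """
    if len(responses) != 10:
        raise ValueError("SUS has exactly 10 items")
    total = 0
    for i, r in enumerate(responses):
        if i % 2 == 0:      # odd-numbered items (1, 3, 5, 7, 9)
            total += r - 1
        else:               # even-numbered items (2, 4, 6, 8, 10)
            total += 5 - r
    return total * 2.5

# Example: a fairly positive respondent
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # 75.0
```

Averaging these per-respondent scores across all participants gives a mean of the kind reported on slide 26.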
