Developing an Evaluation Framework of Quality Indicators for Learning Analytics
Short paper presentation at LAK2015 by Maren Scheffel (Open Universiteit Nederland).
1. Developing an Evaluation Framework of Quality Indicators for Learning Analytics
   Maren Scheffel, Open Universiteit Nederland
   maren.scheffel@ou.nl, @m_a_s_c
2. Last year at LAK…
   •  Group Concept Mapping study about quality indicators for learning analytics
      –  brainstorming phase for idea collection
      –  sorting phase for clustering
      –  rating phase for scaling of importance and feasibility
   Scheffel, M., Drachsler, H., Stoyanov, S., & Specht, M. (2014). Quality indicators for learning analytics. Educational Technology & Society, 17(4), 117–132.
3. Evaluation Study about the Framework
   •  Question: Is the framework applicable to evaluate learning analytics tools, or are changes needed?
   •  Methodology:
      –  turn the framework into an applicable tool, i.e. a questionnaire
      –  compile learning analytics tools with which to evaluate the framework
   •  Eight tools analysed by eight people (members and associated partners from the LACE project):
      –  every person analysed two tools
      –  every tool was analysed twice (one rotation satisfying these constraints is sketched below)
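The slide fixes the counts but not how reviewers were paired with tools; the deck does not state the assignment procedure. As a purely illustrative sketch (all names are hypothetical placeholders, not from the study), a simple rotation is one way to satisfy all three constraints:

```python
# Illustrative sketch, not the study's actual assignment: 8 evaluators,
# 8 tools, each person reviews 2 tools, each tool is reviewed exactly twice.
# A rotation does this: evaluator i gets tool i and tool (i + 1) mod 8.

from collections import Counter

evaluators = [f"evaluator_{i}" for i in range(8)]
tools = [f"tool_{i}" for i in range(8)]

# Rotation assignment: each evaluator takes "their" tool plus the next one.
assignment = {
    evaluators[i]: [tools[i], tools[(i + 1) % len(tools)]]
    for i in range(len(evaluators))
}

# Sanity checks matching the design described on the slide.
reviews = Counter(t for pair in assignment.values() for t in pair)
assert all(len(pair) == 2 for pair in assignment.values())  # 2 tools per person
assert all(count == 2 for count in reviews.values())        # 2 reviews per tool

for person, pair in assignment.items():
    print(person, "->", pair)
```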
4. Example question
   http://bit.ly/EFqiLA
5. Results – Objectives
6. Results – Objectives
   •  rather easy to judge
   •  difference between actually fostering something and having the intention to foster something
   •  difference between directly and indirectly fostering something
   •  user of the tool should be indicated
7. Results – Learning Support
8. Results – Learning Support
   •  take user type into account
   •  different types of indicators: goal of a tool vs. functionality of a tool
   •  a simple yes/no is not always sufficient; the amount might make a difference
   •  rephrasing or redefinition of some indicators might be necessary
9. Results – Learning Measures & Output
10. Results – Learning Measures & Output
   •  overall difficult to judge
   •  criterion title is confusing
   •  definition of some indicators needed
   •  differentiation of some indicators needed
   •  clarify user type
11. Results – Data Aspects
12. Results – Data Aspects
   •  hard to judge if one is not a user
   •  add an “I don’t know” answer option (see the sketch below)
   •  user type should be clarified
   •  differentiation of indicators needed
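The deck only names the recommendation; as a hedged illustration (hypothetical names and data, not the study's actual instrument), one way to handle an “I don’t know” escape option is to treat it as missing data when aggregating, so non-users who cannot judge a data aspect do not skew the indicator score:

```python
# Minimal sketch (hypothetical, not the study's questionnaire): encode an
# "I don't know" option alongside a yes/no indicator rating and exclude it
# from aggregation as missing data.

DONT_KNOW = "I don't know"

responses = ["yes", "no", DONT_KNOW, "yes", DONT_KNOW]  # made-up sample answers

# Keep only substantive answers; "I don't know" counts as missing.
substantive = [r for r in responses if r != DONT_KNOW]

excluded = len(responses) - len(substantive)
print(f"{excluded} 'I don't know' answers excluded as missing data")

if substantive:
    share_yes = substantive.count("yes") / len(substantive)
    print(f"share of 'yes' among substantive answers: {share_yes:.2f}")
else:
    print("no substantive answers for this indicator")
```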
13. Results – Organisational Aspects
14. Results – Organisational Aspects
   •  easy to rate but hard to judge when one is not a user
   •  tools often do not provide enough information
   •  differentiation between indicators needed
15. Issue Categorisation
   •  concept definitions
      –  rephrasing or defining criteria or indicators more clearly
   •  differentiations
      –  often needed between pairs of indicators
   •  framework structure
      –  inter-criterion homogeneity (goal vs. functionality)
      –  indicators should tend to be concept-driven, not feature-driven
   •  questionnaire adaptation
      –  different questionnaires for different user types
      –  intention of the tool
      –  answer options (e.g. “I don’t know”)
   General problem: sparsity of information
16. Please Give Your Feedback
   •  Your feedback is important to us:
      –  online feedback form available at http://bit.ly/lace-eval-form
      –  short form (3 pages)
   •  Please tell us:
      –  your overall views on today’s session
      –  the things you particularly liked
      –  the areas which could be improved
17. Keep in Touch!
   Complete the form or visit http://www.laceproject.eu/join-community/ to be added to the LACE mailing list.
18. Thank you!
   “Developing an Evaluation Framework of Quality Indicators for Learning Analytics” by Maren Scheffel (OUNL) was presented at LAK15 on March 18, 2015.
   www.laceproject.eu, @laceproject
   This work was undertaken as part of the LACE Project, supported by the European Commission Seventh Framework Programme, grant 619424.
   maren.scheffel@ou.nl, @m_a_s_c
