Evaluation of Linked Data tools for Learning Analytics


Presentation given in the tutorial on 'Using Linked Data for Learning Analytics' at LAK13.

http://portal.ou.nl/documents/363049/ca242534-8996-4fc7-8e42-073cc194c763

http://creativecommons.org/licenses/by-nc-sa/3.0/

Drachsler, H., Herder, E., d'Aquin, M., & Dietze, S. (2013). Presentation given in the tutorial on 'Using Linked Data for Learning Analytics' at LAK2013, the Third Conference on Learning Analytics and Knowledge, Leuven, Belgium.


Transcript

  1. Using Linked Data in Learning Analytics (LAK 2013 tutorial)
     Evaluation of Linked Data tools for Learning Analytics
     Hendrik Drachsler (@hdrachsler, drachsler.de), CELSTEC, Open Universiteit Nederland, NL
     Eelco Herder, L3S Research Center, DE
     Mathieu d'Aquin (@mdaquin, mdaquin.net), Knowledge Media Institute, The Open University, UK
     Stefan Dietze, L3S Research Center, DE
  2. Example of scientific competitions
     What are the evaluation criteria of Robot Wars?
     Criteria:
     •  Damage
     •  Aggression
     •  Control
     •  Applause
     Probabilistic combination of:
     •  Item-based method
     •  User-based method
     •  Matrix factorization
     •  (Maybe) a content-based method
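As a rough illustration of the "probabilistic combination" named on this slide, the sketch below blends scores from an item-based, a user-based and a matrix-factorisation component with fixed weights. The component scorers, weights and IDs are placeholders, not the tutorial's actual code.

```python
# Hypothetical weighted hybrid of recommender scores (all components are stand-ins).
from typing import Callable, Dict

Scorer = Callable[[str, str], float]  # (user_id, item_id) -> predicted relevance

def hybrid_score(user: str, item: str,
                 components: Dict[str, Scorer],
                 weights: Dict[str, float]) -> float:
    """Weighted (convex) combination of component recommender scores."""
    total_w = sum(weights.values())
    return sum(weights[name] * scorer(user, item)
               for name, scorer in components.items()) / total_w

# Dummy component scorers standing in for real trained models:
components = {
    "item_based": lambda u, i: 0.7,
    "user_based": lambda u, i: 0.5,
    "matrix_fact": lambda u, i: 0.9,
}
weights = {"item_based": 0.4, "user_based": 0.2, "matrix_fact": 0.4}
print(hybrid_score("learner_42", "resource_7", components, weights))  # -> 0.74
```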
  3. RecSysTEL evaluation criteria
     1. Accuracy
     2. Coverage
     3. Precision
     4. Recall
     Combined approach (Drachsler et al., 2008):
     1. Effectiveness of learning
     2. Efficiency of learning
     3. Drop-out rate
     4. Satisfaction
     Kirkpatrick model (Manouselis et al., 2010):
     1. Reaction of the learner
     2. Learning improved
     3. Behaviour
     4. Results
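A minimal sketch of the system-centred metrics listed above (precision, recall and a simple notion of coverage) for a single top-N recommendation list; the item IDs and catalogue size are invented for illustration, and coverage is computed per list rather than over all users.

```python
# Toy computation of precision, recall and catalogue coverage for one recommendation list.
def precision_recall_coverage(recommended, relevant, catalogue_size):
    """Precision/recall for one top-N list, plus the share of the catalogue it touches."""
    rec, rel = set(recommended), set(relevant)
    hits = len(rec & rel)
    precision = hits / len(rec) if rec else 0.0
    recall = hits / len(rel) if rel else 0.0
    coverage = len(rec) / catalogue_size
    return precision, recall, coverage

print(precision_recall_coverage(
    recommended=["r1", "r2", "r3", "r4"],
    relevant={"r2", "r4", "r9"},
    catalogue_size=100))  # -> (0.5, 0.666..., 0.04)
```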
  4. TEL RecSys::Review study
     Conclusions: Half of the systems (11/20) were still at the design or prototyping stage; only 9 systems were evaluated through trials with human users.
     Manouselis, N., Drachsler, H., Vuorikari, R., Hummel, H. G. K., & Koper, R. (2011). Recommender Systems in Technology Enhanced Learning. In P. B. Kantor, F. Ricci, L. Rokach, & B. Shapira (Eds.), Recommender Systems Handbook (pp. 387-415). Berlin: Springer.
  5. TEL recommender research is a bit like this... We need to design, for each domain, an appropriate recommender system that fits its goals and tasks.
  6. TEL recommender experiments lack transparency and standardization. They need to be repeatable to test:
     •  Validity
     •  Verification
     •  Compare results
     "The performance results of different research efforts in recommender systems are hardly comparable." (Manouselis et al., 2010)
     Image credit: Kaptain Kobold, http://www.flickr.com/photos/kaptainkobold/3203311346/
  7. Data-driven Research and Learning Analytics
     Hendrik Drachsler (a), Katrien Verbert (b)
     (a) CELSTEC, Open University of the Netherlands
     (b) Dept. of Computer Science, K.U.Leuven, Belgium
     (EATEL)
  8. (no text on this slide)
  9. TEL RecSys::Evaluation/datasets
     Drachsler, H., Bogers, T., Vuorikari, R., Verbert, K., Duval, E., Manouselis, N., Beham, G., Lindstaedt, S., Stern, H., Friedrich, M., & Wolpers, M. (2010). Issues and Considerations regarding Sharable Data Sets for Recommender Systems in Technology Enhanced Learning. Presentation at the 1st Workshop on Recommender Systems in Technology Enhanced Learning (RecSysTEL), in conjunction with the 5th European Conference on Technology Enhanced Learning (EC-TEL 2010): Sustaining TEL: From Innovation to Learning and Practice. September 28, 2010, Barcelona, Spain.
  10. dataTEL evaluation model: 5. Dataset Framework
      Datasets range from formal (Data A, Data B) to informal (Data C). Each dataset is evaluated with its own combination of:
      •  Algorithms (Algorithm A, B, C for Data A; Algorithm D, E for Data B; Algorithm B, D for Data C)
      •  Learner models (Learner Model A, B; Learner Model C, E; Learner Model A, C)
      •  Measured attributes (Attribute A, B, C in each case)
  11. dataTEL evaluation model: 5. Dataset Framework
      In LinkedUp we have the opportunity to apply a structured approach to develop a community-accepted evaluation framework:
      1.  Top-down, by a literature study
      2.  Bottom-up, by Group Concept Mapping (GCM) with experts in the field
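A hedged sketch of how the dataset framework on the two preceding slides could be represented in code: each dataset entry records its educational setting plus the algorithms, learner models and measured attributes it was evaluated with. The entries simply mirror the placeholder labels from the slides.

```python
# Illustrative data structure for the dataTEL dataset framework (labels are placeholders).
from dataclasses import dataclass, field
from typing import List

@dataclass
class DatasetEntry:
    name: str
    setting: str                      # "formal" or "informal"
    algorithms: List[str] = field(default_factory=list)
    learner_models: List[str] = field(default_factory=list)
    measured_attributes: List[str] = field(default_factory=list)

framework = [
    DatasetEntry("Data A", "formal",
                 algorithms=["Algorithm A", "Algorithm B", "Algorithm C"],
                 learner_models=["Learner Model A", "Learner Model B"],
                 measured_attributes=["Attribute A", "Attribute B", "Attribute C"]),
    DatasetEntry("Data C", "informal",
                 algorithms=["Algorithm B", "Algorithm D"],
                 learner_models=["Learner Model A", "Learner Model C"],
                 measured_attributes=["Attribute A", "Attribute B", "Attribute C"]),
]

# e.g. list all datasets a given algorithm has been evaluated on:
print([d.name for d in framework if "Algorithm B" in d.algorithms])  # ['Data A', 'Data C']
```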
  12. (no text on this slide)
  13. Development of the Evaluation Framework
      •  P1: Initialisation (M0-M6: preparation): literature review, Group Concept Mapping, Cognitive Mapping, EF proposal, expert validation, draft
      •  P2: Establishment and Evaluation (M7-M18: competition cycle, run 3x): review of the EF, refinement of the EF, new version, practical experiences and refinement
      •  P3: Exit and Sustainability (M18-M24: finalising the competition): final release of the EF, documentation, dissemination
  14. Group Concept Mapping
      •  Group Concept Mapping (GCM) resembles the Post-it notes problem-solving technique and the Delphi method.
      •  GCM involves participants in a few simple activities (generating, sorting and rating ideas) that most people are used to. GCM differs in two substantial ways:
      1.  Robust analysis (MDS and HCA): GCM takes the participants' original contributions and aggregates them quantitatively to show their collective view (as thematic clusters), as sketched below.
      2.  Visualisation: GCM presents the results of the analysis as conceptual maps and other graphical representations (pattern matching and go-zones).
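The "robust analysis" step names multidimensional scaling (MDS) and hierarchical cluster analysis (HCA). Below is an illustrative sketch under an assumed toy co-sorting matrix, where entry (i, j) counts how many experts placed statements i and j in the same pile; it is not the actual GCM tooling used in the project.

```python
# Toy MDS + HCA over an assumed expert co-sorting matrix (4 statements, 5 sorters).
import numpy as np
from sklearn.manifold import MDS
from scipy.cluster.hierarchy import linkage, fcluster

co_sort = np.array([
    [5, 4, 1, 0],
    [4, 5, 1, 1],
    [1, 1, 5, 4],
    [0, 1, 4, 5],
])
distance = (co_sort.max() - co_sort).astype(float)  # more co-sorting => smaller distance
np.fill_diagonal(distance, 0)

# Point map: 2-D multidimensional scaling of the distance matrix
points = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(distance)

# Cluster map: hierarchical cluster analysis on the MDS coordinates
clusters = fcluster(linkage(points, method="ward"), t=2, criterion="maxclust")
print(points.round(2))
print(clusters)  # e.g. statements 1-2 and 3-4 end up in separate clusters
```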
  15. Group Concept Mapping: brainstorm
      •  innovations in way network is delivered
      •  (investigate) corporate/structural alignment
      •  assist in the development of non-traditional partnerships (Rehab with the Medicine Community)
      •  expand investigation and knowledge of PSN'S/PSO's
      •  continue STHCS sponsored forums on public health issues (medicine managed care forum)
      •  inventory assets of all participating agencies (providers, Venn Diagrams)
      •  access additional funds for telemedicine expansion
      •  better utilization of current technological bridge
      •  continued support by STHCS to member facilities
      •  expand and encourage utilization of interface programs to strengthen the viability and to improve the health care delivery system (ie teleconference)
      •  discussion with CCHN
      [Figure: example sorting and rating sheets in which statements such as "Decide how to manage multiple tasks", "Manage resources effectively", "Work quickly and effectively under pressure" and "Organize the work when directions are not specific" are sorted and rated to organize the issues]
  16. Group Concept Mapping (no text on this slide)
  17. Group Concept Mapping (no text on this slide)
  18. Group Concept Mapping (no text on this slide)
  19. Group Concept Mapping: sort and "map" the issues
      The brainstormed statements from slide 15 are sorted by each participant and then mapped.
      [Figure: sorting sheets and the resulting map, with clusters such as Information Services, Technology, Community & Consumer Views, Regionalization, Management, Financing, and STHCS as model]
  20. Group Concept Mapping: rate and prioritize the issues
      The mapped statements are then rated to prioritize the issues.
      [Figure: rating map over the clusters Information Services, Technology, Community & Consumer Views, Regionalization, Financing, Management, Mission & Ideology, and STHCS as model]
  21. Group Concept Mapping: D2.1 Evaluation Criteria and Methods
      •  Invited 122 external experts
      •  56 experts contributed 212 indicators for the evaluation framework
      •  After cleaning, 108 indicators remained
      •  26 experts sorted the indicators on similarity in meaning
      •  26 experts rated the indicators on priority and applicability
  22. Plus-Minus-Interesting (PMI) rating
      Look at and listen to the presentation of the Evaluation Framework. Meanwhile, create notes on:
      P: Plus
      M: Minus
      I: Interesting
      Write down everything that comes to your mind, generate as many ideas as possible, and do not filter your ideas.
  23. Group Concept Mapping: a point map
  24. Group Concept Mapping: a cluster map (15 clusters)
  25. Group Concept Mapping: a cluster map (6 clusters)
  26. Group Concept Mapping: the clusters' labels
  27. Group Concept Mapping: rating map for priority
  28. Group Concept Mapping: rating map for applicability
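A small sketch of the go-zone style analysis behind these priority and applicability rating maps: statements rated above the mean on both dimensions are the first candidates to act on. The statement names and rating values below are made-up examples, not the project's actual ratings.

```python
# Toy go-zone analysis: keep statements above the mean on both rating dimensions.
ratings = {                      # statement -> (mean priority, mean applicability)
    "learner satisfaction":         (4.3, 4.1),
    "precision of recommendations": (3.9, 4.4),
    "privacy compliance":           (4.6, 2.8),
    "novelty of resources":         (2.9, 3.2),
}
mean_priority = sum(p for p, _ in ratings.values()) / len(ratings)
mean_applicability = sum(a for _, a in ratings.values()) / len(ratings)

go_zone = [s for s, (p, a) in ratings.items()
           if p > mean_priority and a > mean_applicability]
print(go_zone)  # statements above average on both priority and applicability
```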
  29. Group Concept Mapping (no text on this slide)
  30. Group Concept Mapping (no text on this slide)
  31. WP2: Literature review
  32. WP2: Literature review
      1. Literature review of suitable evaluation approaches and criteria
      2. Review of related initiatives such as LinkedEducation, MULCE, E3FPLE and the SIG dataTEL
  33. WP2: Literature review (no further text on this slide)
  34. Many thanks for your attention!
      This slide deck is available at: http://www.slideshare.com/Drachsler
      Email: hendrik.drachsler@ou.nl
      Skype: celstec-hendrik.drachsler
      Blogging at: http://www.drachsler.de
      Twittering at: http://twitter.com/HDrachsler
