
Scalable Learning Analytics and Interoperability – an assessment of potential, limitations, and evidence


A presentation given at the 2015 EUNIS Congress, held at Abertay University in Dundee, June 2015.

Learning analytics is now moving from being a research interest to a topic for adoption. As this happens, the challenge of efficiently and reliably moving data between systems becomes of vital practical importance. In this context, “scalable learning analytics” does not refer to infrastructural throughput, but to the feasibility of a combination of: a) pervasive system integration, and b) efficient analytical and data management practices. A number of considerations are of particular relevance to learning analytics, in addition to elements that are generic to analytics. This contribution to EUNIS 2015 seeks to clarify, by argument and through evidence, where there are both potential benefits and limitations to applying interoperability specifications (and standards) in the service of scalable learning analytics.


Transcript

  1. Scalable Learning Analytics and Interoperability – an assessment of potential, limitations, and evidence. Adam Cooper, Cetis. EUNIS 2015, Abertay University, Dundee, June 12th 2015. #laceproject
  2. Outline
     • Scene setting
       – Meaning of “learning analytics”
       – Why learning analytics => interoperability
     • Break down the question
       – Low-hanging fruit
       – Partial solutions
       – Outstanding issues
     • What to do about it
       – Call to action
       – Further reading
  3. What Do I Mean by “Learning Analytics”?
     “Analytics is the process of developing actionable insights through problem definition and the application of statistical models and analysis against existing and/or simulated future data.” – Adam Cooper, Cetis Analytics Series
     “Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.” – Society for Learning Analytics Research
  4. Ergo: it is not merely trivial in-application graphical visualisations of data.
  5. Interoperability is VITAL for Institutional* Learning Analytics
     Data sources (focus = what is meaningful):
       • Student activity traverses multiple kinds of application
       • Cross-cohort activity spans multiple equivalent applications
     Analytics process (focus = timeliness and efficiency + audit and accountability)
     * as opposed to learning analytics/science research
  6. The Data Isn’t All in Blackboard/Moodle/...
     And even if it were, this shouldn’t be a black box.
  7. Is the Fruit Low-Hanging or High Up?
  8. The Benefits Balance
     (Slide graphic; the legible caption fragment reads “many and similar”.)
  9. Analytics Process = Low-Hanging?
     • Where do we have scale?
     • ... and where consistency? (Where could we?)
     • Where is there evidence of interoperability?
  10. This can be quite generic.
  11. Prime Candidate: Predictive Model Markup Language (PMML)
      Purpose – encode in XML:
        • Pre-processing
        • Predictive models
        • Results
      Evidence: http://www.dmg.org
      (See the PMML sketch after the transcript.)
  12. Emerging: JSON-stat
      Purpose:
        • Data cube
        • Minimal metadata
        • Web-dev-friendly JSON
      “JSON-stat is a simple lightweight JSON dissemination format best suited for data visualization, mobile apps or open data initiatives...” http://json-stat.org/
      Evidence:
        • Adopted by statistics agencies in the UK, Norway, Sweden, Catalunya, and Galicia
        • JavaScript toolkit and R package
      (See the JSON-stat sketch after the transcript.)
  13. More Examples and Evidence – Alphabet Soup:
      • ARFF
      • CSV+
      • DSPL
      • OData
      • RDF Data Cube
      • SDMX
      • VTL
      http://bit.ly/wp7-ssqrg
  14. What About Source Data? Low or High?
      Issue: learning analytics requires data from learning activities – user-centred, not application-centred, and maybe highly contextualised.
      (Slide diagram axes: Consistency & Scale vs. Specificity & Context.)
  15. A Simple Example of the Issue – Video/Audio Playback
      • Are pause and stop different events?
      • Is “frame number” a useful index?
      • Can a video be “completed”?
      • Can engagement be determined?
      • When can it be estimated?
      (See the playback-event sketch after the transcript.)
  16. Where is the process domain-specific?
      (Slide diagram label: “Domain-specific Enquiry”.)
  17. More Generic is Lower Down
      • Data integration is problematic because of the lack of a common language at domain level.
      • BUT: recognise that modelling activity in the teaching and learning domain is work in progress.
        – There are some deeper domain-level vocabularies, but negligible evidence of their use for learning analytics.
        – Click-counts are wholly inadequate.
      • => work at the “meso-level”
      • This is NOT the end-game!
  18. Meso-Level: Benefits Here =>
  19. The Lowest Fruit? Experience API (xAPI)
      Purpose:
        • Common model for expressing activity as Actor + Verb + Object
        • Plus some domain-specifics (SCORM heritage)
        • Verbs (etc.) defined elsewhere
      Evidence:
        • Widely used as an “I/O format”
        • Lacking evidence of cross-application learning analytics
      (See the xAPI statement sketch after the transcript.)
  20. ADL Experience API is Not the Only Game
      • IMS Caliper
      • Contextualised Attention Metadata
      • PSLC TMF
  21. Key Messages
      • There are mature interoperability specifications for analytics:
        – Not learning-specific
        – With implementations in production-grade software, e.g. PMML
        – And some more emerging, e.g. JSON-stat
      • General-purpose specifications from the education domain are emerging:
        – Most often identified in discussions
        – Partial evidence
        – E.g. ADL xAPI, IMS Caliper... + XES
      • Vocabularies for teaching and learning activities are lacking:
        – Existing vocabularies are rooted in the software domain
        – ... or not designed with analytics in mind, or unproven
        – Finding shared vocabularies requires collaboration
        – BUT vital to the long-term vision for learning analytics
  22. Three Calls to Action
      Help build evidence (“meso-level”):
        • Start adopting
        • Specify in procurement
        • Share experience, build collective expertise
      Domain models:
        • Gap analysis
        • Share models
        • Collaborate
      PMML etc.:
        • Good practices
        • Issues
  23. Read more...
      http://bit.ly/LAI-BigPicture
      http://bit.ly/wp7-ssqrg
      Get involved: http://www.laceproject.eu/join-community/
  24. “Scalable Learning Analytics and Interoperability – an assessment of potential, limitations, and evidence” by Adam Cooper, Cetis, was presented at EUNIS 2015 at Abertay University in Dundee on June 12th 2015. adam@cetis.org.uk
      This work was undertaken as part of the LACE Project, supported by the European Commission Seventh Framework Programme, grant 619424.
      These slides are provided under the Creative Commons Attribution Licence: http://creativecommons.org/licenses/by/4.0/. Some images used may have different licence terms.
      www.laceproject.eu @laceproject
  25. Credits
      • Learn Course At-a-Glance – Blackboard Analytics for Learn, http://www.blackboard.com/Platforms/Analytics/Products/Blackboard-Analytics-for-Learn.aspx
      • Fruit tree – CC0 Public Domain
      • Dogs tug of war – CC BY-NC-ND Emo Skunken, http://eli-ri.deviantart.com/art/Emo-Tug-of-War-73721711
      • Prism – from BBC Bitesize, http://www.bbc.co.uk
      • Caliper diagram – (c) IMS GLC, http://www.imsglobal.org/caliper/index.html
      Logos used are understood to be trademarks and are used without the owner’s endorsement of this presentation.
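
To make the PMML slide (11) concrete, here is a minimal sketch in Python, standard library only, that parses a tiny hand-written PMML document and lists its data fields and model. The field names, threshold, and one-split tree are invented for illustration; real PMML files are produced by the tools catalogued at dmg.org.

```python
# A minimal, illustrative PMML document: a one-split decision tree
# flagging a student as "at risk" when weekly logins fall below 1.
# All field names and values here are invented for illustration.
import xml.etree.ElementTree as ET

PMML_NS = "http://www.dmg.org/PMML-4_2"

pmml_doc = """<?xml version="1.0"?>
<PMML version="4.2" xmlns="http://www.dmg.org/PMML-4_2">
  <Header description="illustrative example"/>
  <DataDictionary numberOfFields="2">
    <DataField name="logins_per_week" optype="continuous" dataType="double"/>
    <DataField name="at_risk" optype="categorical" dataType="string"/>
  </DataDictionary>
  <TreeModel modelName="risk_flag" functionName="classification">
    <MiningSchema>
      <MiningField name="logins_per_week"/>
      <MiningField name="at_risk" usageType="target"/>
    </MiningSchema>
    <Node score="no">
      <True/>
      <Node score="yes">
        <SimplePredicate field="logins_per_week" operator="lessThan" value="1"/>
      </Node>
    </Node>
  </TreeModel>
</PMML>"""

root = ET.fromstring(pmml_doc)
# Any PMML-aware consumer can discover the model's inputs and type
# without knowing which tool produced the file -- the interoperability
# point of the slide.
for field in root.iter(f"{{{PMML_NS}}}DataField"):
    print("field:", field.get("name"), "/", field.get("optype"))
for model in root.iter(f"{{{PMML_NS}}}TreeModel"):
    print("model:", model.get("modelName"), "->", model.get("functionName"))
```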
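Slide 12’s data-cube idea is easiest to see by decoding one cell. The sketch below, again plain Python, follows the JSON-stat convention that the cube is flattened into a single value array in row-major order (the last dimension in id varies fastest); the dataset itself, cohorts and weekly login counts, is invented.

```python
import json

# An invented two-dimensional JSON-stat dataset: 2 cohorts x 3 weeks.
dataset = json.loads("""
{
  "version": "2.0",
  "class": "dataset",
  "label": "Weekly VLE logins (illustrative)",
  "id": ["cohort", "week"],
  "size": [2, 3],
  "dimension": {
    "cohort": {"category": {"index": {"A": 0, "B": 1}}},
    "week":   {"category": {"index": {"w1": 0, "w2": 1, "w3": 2}}}
  },
  "value": [120, 95, 80, 150, 140, 130]
}
""")

def cell(ds, **coords):
    """Look up one cell of a JSON-stat cube.

    JSON-stat stores the cube as a flat 'value' array in row-major
    order: the last dimension listed in 'id' varies fastest.
    """
    index = 0
    for dim_id, size in zip(ds["id"], ds["size"]):
        pos = ds["dimension"][dim_id]["category"]["index"][coords[dim_id]]
        index = index * size + pos
    return ds["value"][index]

print(cell(dataset, cohort="B", week="w2"))  # -> 140
```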
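The questions on slide 15 are not rhetorical: two applications can log the “same” playback action incompatibly. The following sketch invents two hypothetical player schemas (nothing here comes from a real product) and shows the per-source mapping rules an analytics pipeline is forced to write when no shared vocabulary exists.

```python
import json
import time

# Hypothetical player A: treats pause and stop as distinct events and
# indexes position by frame number.
event_a = {"type": "STOP", "frame": 4312, "ts": time.time()}

# Hypothetical player B: only ever emits "pause", indexes position in
# seconds, and adds its own notion of "completed".
event_b = {"event": "pause", "position_s": 172.5, "completed": False}

def normalise(raw):
    """Map a source-specific event onto a common shape.

    Every decision below answers one of the slide's questions by fiat:
    STOP is treated as a pause, and frame numbers are converted to
    seconds by assuming 25 fps -- exactly the kind of fragile guess a
    shared vocabulary would make unnecessary.
    """
    if "type" in raw:  # player A's schema
        verb = "paused" if raw["type"] in ("PAUSE", "STOP") else raw["type"].lower()
        return {"verb": verb, "position_s": raw["frame"] / 25.0}
    return {"verb": raw["event"], "position_s": raw["position_s"]}

print(json.dumps([normalise(event_a), normalise(event_b)], indent=2))
```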
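Finally, a sketch of the Actor + Verb + Object model from slide 19, expressed as a single xAPI statement for the video-pause case above, built as a plain Python dict. The learner and video IRIs are invented; the verb and extension IRIs follow the community xAPI video profile as best I recall it, so treat them as assumptions to verify rather than normative references.

```python
import json
import uuid
from datetime import datetime, timezone

# A minimal xAPI statement: Actor + Verb + Object, as on the slide.
# Learner and activity identifiers are invented; the verb/extension
# IRIs are assumed to come from the xAPI video profile.
statement = {
    "id": str(uuid.uuid4()),
    "timestamp": datetime.now(timezone.utc).isoformat(),
    "actor": {
        "objectType": "Agent",
        "name": "Example Learner",
        "mbox": "mailto:learner@example.org",
    },
    "verb": {
        "id": "https://w3id.org/xapi/video/verbs/paused",
        "display": {"en-US": "paused"},
    },
    "object": {
        "objectType": "Activity",
        "id": "https://example.org/courses/ana101/videos/intro",
        "definition": {"name": {"en-US": "Introduction to Analytics"}},
    },
    # 'result' and 'context' are optional; a video profile would record
    # the playback position here.
    "result": {"extensions": {
        "https://w3id.org/xapi/video/extensions/time": 172.5}},
}

# A Learning Record Store would receive this via HTTP POST; here we
# just serialise it to show the wire format.
print(json.dumps(statement, indent=2))
```

Note how the verb definition lives at a dereferenceable IRI rather than inside the statement: this is what the slide means by “verbs (etc.) defined elsewhere”, and it is where the shared-vocabulary gap from slide 21 bites.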
