Krakow presentation speak_appsmngm_final

  1. Considering Learning Analytics: SpeakApps and the Application of a Learning Analytics Reference Model Mairéad Nic Giolla Mhichíl Dublin City University www.speakapps.org
  2. Project Overview SpeakApps 2 speakapps.eu Lifelong Learning Programme Nov 2013 – Oct 2014 KA2 LANGUAGES, Accompanying Measures Development of tools and pedagogical tasks for oral production and spoken interaction Partners / Associated Partners • Institut Obert de Catalunya • University of Southern Denmark • University of Nice • University of Jyväskylä • Ruhr-Universität Bochum • Polskie Towarzystwo Kulturalne "Mikolaj Kopernik" • Fundació Pere Closa
  3. SpeakApps... • Lifelong Learning Project, 2011–2012 & 2013–2014 • Partners and Associate Partners: – Universitat Oberta de Catalunya, Catalan, English – Rijksuniversiteit Groningen, Dutch – University of Jyväskylä, Swedish, Finnish – Jagiellonian University Krakow, Polish – Dublin City University, Irish • Development of tools and pedagogical tasks for oral production and spoken interaction • Open Educational Resource using the CEFR • SpeakApps Moodle-based platform
  4. Introduction to Analytics  The pervasiveness of technology has facilitated the collection of data and the creation of a variety of data sets, including big data  The application of analytics has spread across domains and prompted business and societal applications  Google Analytics – AdWords etc. (http://www.web2llp.eu/)  Smart cities – policing using predictive models to prevent crime (Santa Cruz Police Department - http://edition.cnn.com/2012/07/09/tech/innovation/police-tech/)  The ultimate aim is to inform decision-making, from resource allocation to improved services etc. How? By using a variety of data mining techniques for the discovery of patterns and/or the validation of hypotheses/claims
  5. Educational Analytics  Data is available in education from a variety of sources  LMS  Institutional systems  Google for Education  User-generated content, social networks  Ferguson (2012)* provides a useful overview of the educational analytics field and suggests the following divergence in focus:  Educational data mining focuses on the technical challenge: How can we extract value from these big sets of learning-related data?  Learning analytics focuses on the educational challenge: How can we optimise opportunities for online learning?  Academic analytics focuses on the political/economic challenge: How can we substantially improve learning opportunities and educational results at national or international levels?
  6. Educational Analytics…  Long and Siemens (2011:32) aptly describe the challenge: But using analytics requires that we think carefully about what we need to know and what data is most likely to tell us what we need to know. (http://net.educause.edu/ir/library/pdf/ERM1151.pdf) * See: Ferguson, Rebecca (2012). Learning analytics: drivers, developments and challenges. International Journal of Technology Enhanced Learning, 4(5/6), pp. 304–317.
  7. Student or Learner Data** Demographics • Age, home/term address, commuting distance, socio-economic status, family composition, school attended, census information, home property value, sibling activities Online Behaviour • Mood and emotional analysis of Facebook, Twitter and Instagram activities, friends and their actual social network, access to the VLE (Moodle) Physical Behaviour • Library access, sports centre, clubs and societies, network access yielding co-location with others and peer groupings, lecture/lab attendance… Academic Performance • Second-level performance, university exams, course preferences, performance relative to peers in school Slide reproduced and adapted with permission from presentation Glynn, M. (2014) Using the Data from eAssessment, eAssessment 2014: Final Answer: The question of summative eAssessment, 05th September 2014, University of Dundee, Scotland, available from: http://www.slideshare.net/enhancingteaching/optimising-knowledge-assessment-data
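To make the grouping above concrete, here is a minimal Python sketch of a learner record that combines the four categories into one structure. The field names are illustrative assumptions only, not fields from any DCU or SpeakApps system.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical learner record combining the four data categories above.
# Field names are illustrative assumptions, not an actual institutional schema.

@dataclass
class Demographics:
    age: Optional[int] = None
    commuting_distance_km: Optional[float] = None
    school_attended: Optional[str] = None

@dataclass
class OnlineBehaviour:
    vle_logins_last_30_days: int = 0
    forum_posts: int = 0

@dataclass
class PhysicalBehaviour:
    library_visits: int = 0
    lecture_attendance_rate: float = 0.0   # proportion, 0.0-1.0

@dataclass
class AcademicPerformance:
    entry_points: Optional[int] = None
    module_grades: List[float] = field(default_factory=list)

@dataclass
class LearnerRecord:
    student_id: str
    demographics: Demographics
    online: OnlineBehaviour
    physical: PhysicalBehaviour
    academic: AcademicPerformance
```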
  8. Levels / Objectives of Analytics** Descriptive • What has happened? Diagnostic • Why did it happen? Predictive • What will happen? Prescriptive • What should we do? Slide reproduced and adapted with permission from presentation Glynn, M. (2014) Using the Data from eAssessment, eAssessment 2014: Final Answer: The question of summative eAssessment, 05th September 2014, University of Dundee, Scotland, available from: http://www.slideshare.net/enhancingteaching/optimising-knowledge-assessment-data
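The four levels can be illustrated on a single toy data set. The sketch below is a minimal, assumed example (invented click counts and a naive threshold model), not the analysis used in any of the systems discussed here.

```python
import pandas as pd

# Toy weekly Moodle engagement data; column names and values are assumptions
# made for illustration only.
df = pd.DataFrame({
    "student": ["a", "b", "c", "d"],
    "weekly_clicks": [120, 15, 80, 5],
    "passed": [1, 0, 1, 0],
})

# Descriptive: what has happened?
print("Mean weekly clicks:", df["weekly_clicks"].mean())

# Diagnostic: why did it happen? (a simple correlation as a stand-in)
print("Clicks/pass correlation:", df["weekly_clicks"].corr(df["passed"]))

# Predictive: what will happen? (naive threshold model)
threshold = 50
df["predicted_pass"] = (df["weekly_clicks"] >= threshold).astype(int)

# Prescriptive: what to do? (flag students for an engagement prompt)
at_risk = df[df["predicted_pass"] == 0]["student"].tolist()
print("Prompt these students to engage:", at_risk)
```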
  9. Open University Analytics Principles** • Learning analytics is a moral practice which should align with core organisational principles • The purpose and boundaries regarding the use of learning analytics should be well defined and visible • Students should be engaged as active agents in the implementation of learning analytics • The organisation should aim to be transparent regarding data collection and provide students with the opportunity to update their own data and consent agreements at regular intervals • Modelling and interventions based on analysis of data should be free from bias and aligned with appropriate theoretical and pedagogical frameworks wherever possible • Students are not wholly defined by their visible data or our interpretation of that data • Adoption of learning analytics within the organisation requires broad acceptance of the values and benefits (organisational culture) and the development of appropriate skills • The organisation has a responsibility to all stakeholders to use and extract meaning from student data for the benefit of students where feasible Slide reproduced and adapted with permission from presentation Glynn, M. (2014) Using the Data from eAssessment, eAssessment 2014: Final Answer: The question of summative eAssessment, 05th September 2014, University of Dundee, Scotland, available from: http://www.slideshare.net/enhancingteaching/optimising-knowledge-assessment-data
  10. Analytics Process*  Establish the objective or claim of the LA exercise  Three-stage iterative process 1. Data collection and pre-processing  Data preparation and cleaning, removal of redundant data, handling of time stamps, etc.  Application of established and evolving data mining techniques to complete this 2. Analytics and action  Explore/analyse the data to discover patterns  Data visualisation and representation 3. Post-processing  Adding new data from additional data sources  Refining the data set  Identifying new indicators/metrics  Modifying variables of analysis  Choosing a new analytics method * See: Chatti, M.A., Dyckhoff, A.L., Schroeder, U. and Thüs, H. (2012) ‘A reference model for learning analytics’, Int. J. Technology Enhanced Learning, Vol. 4, Nos. 5/6, pp.318–331
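A minimal sketch of how the three iterative stages might be wired together in code, assuming raw Moodle-style log rows with user, action and timestamp fields; the field names and cleaning rules are assumptions for illustration, not part of Chatti et al.'s model.

```python
from collections import Counter
from datetime import datetime

# 1. Data collection and pre-processing: parse timestamps, drop redundant rows.
def preprocess(raw_logs):
    seen, cleaned = set(), []
    for row in raw_logs:
        key = (row["user"], row["action"], row["timestamp"])
        if key in seen:
            continue  # remove exact duplicates (redundant data)
        seen.add(key)
        row["timestamp"] = datetime.fromisoformat(row["timestamp"])
        cleaned.append(row)
    return cleaned

# 2. Analytics and action: a simple pattern summary per user and action.
def analyse(cleaned):
    return Counter((row["user"], row["action"]) for row in cleaned)

# 3. Post-processing: merge in a new data source (here, survey scores)
#    so the indicators can be refined on the next iteration.
def postprocess(summary, survey_scores):
    return {user: (count, survey_scores.get(user))
            for (user, _action), count in summary.items()}

logs = [{"user": "u1", "action": "view", "timestamp": "2014-09-05T10:00:00"},
        {"user": "u1", "action": "view", "timestamp": "2014-09-05T10:00:00"}]
print(postprocess(analyse(preprocess(logs)), {"u1": 4.5}))
```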
  11.  What? What kind of data does the system gather, manage, and use for the analysis?  Who? Who is targeted by the analysis?  Why? Why does the system analyse the collected data?  How? How does the system perform the analysis of the collected data? * See: Chatti, M.A., Dyckhoff, A.L., Schroeder, U. and Thüs, H. (2012) ‘A reference model for learning analytics’, Int. J. Technology Enhanced Learning, Vol. 4, Nos. 5/6, pp.318–331 11 Learning Analytics Reference Model*
  12.  What? Data and Environment:  Which systems  Structured and/or unstructured data  Who? Stakeholders:  Teachers  Students  Instructional designers  Institutional stakeholders 12 Learning Analytics Reference Model*
  13.  Why? Objective  Monitoring and analysis  Prediction and intervention  Tutoring and mentoring  Assessment and feedback  Adaptation  Personalisation and recommendation  Reflection  Challenge to identify the appropriate indicators/metrics 13 Learning Analytics Reference Model*
  14.  How? Method  Statistics: most LMSs produce statistics based on behavioural data  Data mining techniques and others (a long list)  Classification (categories known in advance; many different techniques from data mining)  Clustering (categories not known in advance; they are created from the data, with similar records grouped together based on similar attributes)  Association rule mining, which leads to the discovery of interesting associations and correlations within data  Social network analysis … 14 Learning Analytics Reference Model*
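As an illustration of the clustering idea (categories not known in advance), the sketch below applies scikit-learn's KMeans to two assumed behavioural features; the data and features are invented, and this is not the SpeakApps or DCU implementation.

```python
import numpy as np
from sklearn.cluster import KMeans

# Assumed behavioural features per student:
# [logins per week, minutes of audio recorded]. Values invented for illustration.
X = np.array([
    [12, 30], [10, 25], [11, 40],   # frequent, active recorders
    [2, 5],   [1, 0],   [3, 2],     # low engagement
])

# Clustering: categories are NOT known in advance; similar students are grouped
# together purely on the similarity of their attributes.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print("Cluster labels:", kmeans.labels_)
print("Cluster centres:", kmeans.cluster_centers_)
```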
  15. 15 Dublin City University’s Moodle Analytics* What? What kind of data does the system gather, manage, and use for the analysis? Using the generic Moodle log data, i.e. behavioural data Who? Who is targeted by the analysis? Students and lecturers Why? Why does the system analyse the collected data? Students – adapt their behaviour Teachers – review the course Institutional – monitor engagement, particularly of first-year students How? How does the system perform the analysis of the collected data? - Model created from c. six years of user participation data within courses – identify which modules are suitable, i.e. rules must possess a strong confidence level of roughly 0.6–0.7 - Students who are not engaging with the module are provided with feedback on their engagement – student-centred
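The "strong confidence level" above is the standard association-rule measure. The sketch below computes support and confidence for a hypothetical rule ("accessed week-1 resources → passed") over invented records; DCU's actual model is built from several years of real participation data, not this toy example.

```python
# Hypothetical per-student records: did they access week-1 resources, did they pass?
# The data and the rule are invented for illustration only.
records = [
    {"accessed_week1": True,  "passed": True},
    {"accessed_week1": True,  "passed": True},
    {"accessed_week1": True,  "passed": False},
    {"accessed_week1": False, "passed": False},
    {"accessed_week1": False, "passed": True},
]

antecedent = [r for r in records if r["accessed_week1"]]
both = [r for r in antecedent if r["passed"]]

support = len(both) / len(records)          # P(accessed AND passed)
confidence = len(both) / len(antecedent)    # P(passed | accessed)

print(f"support={support:.2f}, confidence={confidence:.2f}")
# A module would only be deemed suitable if rules like this reach a
# confidence of roughly 0.6-0.7.
```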
  16. DCU Moodle Analytics Slide reproduced and adapted with permission from presentation Glynn, M. (2014) Using the Data from eAssessment, eAssessment 2014: Final Answer: The question of summative eAssessment, 05th September 2014, University of Dundee Scotland available from: http://www.slideshare.net/enhancingteaching/optimising-knowledge-assessment-data
  17.  The claim is that student and teacher oral & video recordings should be time-limited to maintain the attention of the listener  Currently we recommend a maximum of one minute for learner recordings and two minutes for teacher recordings  At present this claim is based on experience  Evidence is needed to support decision-making which will impact:  Resource allocation to refine the tool – time limitation  Learner agency  Instructional and task design 17 SpeakApps Pilot
  18. 18 SpeakApps Pilot & LA Reference Model What? What kind of data does the system gather, manage, and use for the analysis? LMS data: technical information (device, browser, versions etc.), behavioural data (time stamps, click tracking) and user-generated content such as surveys and peer feedback Who? Who is targeted by the analysis? Students, teachers, instructional designers and developers Why? Why does the system analyse the collected data? Students – adapt Teachers – tutoring Instructional designers – adapt the task Developers – interface adaptation How? How does the system perform the analysis of the collected data? Statistics based on behavioural data and the analysis of user-generated data – possible qualitative follow-up
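One way the pilot could use simple statistics on behavioural data is to compare how much of a recording listeners actually play back, for recordings within and beyond the recommended one-minute limit. The sketch below assumes hypothetical playback-log fields; the field names and values are invented, not actual SpeakApps data.

```python
import statistics

# Hypothetical playback events: length of the recording and how many seconds
# the listener actually played. Field names and values are assumptions.
events = [
    {"recording_s": 55,  "played_s": 55},
    {"recording_s": 58,  "played_s": 50},
    {"recording_s": 130, "played_s": 70},
    {"recording_s": 150, "played_s": 65},
]

def completion(e):
    return e["played_s"] / e["recording_s"]

short = [completion(e) for e in events if e["recording_s"] <= 60]
long_ = [completion(e) for e in events if e["recording_s"] > 60]

# Descriptive statistics comparing listener attention for recordings within
# and beyond the recommended one-minute limit.
print("Mean completion <= 60s:", round(statistics.mean(short), 2))
print("Mean completion  > 60s:", round(statistics.mean(long_), 2))
```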
  19. Data Types and Sources  Aggregate and integrate data produced by students from multiple sources  Challenge to source, combine and manipulate data from a wide variety of sources and in many formats  Avoid over-reliance on behavioural data from the LMS; draw on varied data sources  Structured data, e.g. data from the LMS, other institutional systems and connected devices  Unstructured data, e.g. user-generated content/data from other sources such as Facebook (social network modelling), online dictionaries, translation tools, thesauri etc. 19 Concluding Remarks
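A minimal sketch of aggregating two such sources, assuming a structured LMS export and a voluntary survey joined on a shared student identifier; the column names are assumptions for illustration.

```python
import pandas as pd

# Structured LMS data (behavioural) and user-generated survey data, merged on a
# shared student identifier. Column names are illustrative assumptions.
lms = pd.DataFrame({
    "student_id": ["s1", "s2", "s3"],
    "recordings_made": [5, 1, 3],
    "minutes_online": [220, 40, 150],
})

survey = pd.DataFrame({
    "student_id": ["s1", "s3"],
    "self_reported_confidence": [4, 2],   # 1-5 Likert scale
})

# A left join keeps every LMS row even when no survey response exists,
# which is the common case when combining voluntary user-generated data.
combined = lms.merge(survey, on="student_id", how="left")
print(combined)
```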
  20. Student Agency in LA  Students as active agents – voluntarily collaborate in providing and accessing data  Designing interventions (if appropriate in the context) and the agency of the student:  Student at the centre of interpretation  Data representation to facilitate interpretation  Requires specific skills of interpretation 20 Concluding Remarks…
  21. Ethical and Educational Concerns  Use of data based on transparent opt-in permission of students following established research principles  Students understand that data is collected about them and actively buy-in  Privacy and stewardship of data  Emphasis on learning as a moral practice resulting in understanding rather than measuring (Reeves, 2011) 21 Concluding Remarks…
  22.  Challenging to realise the specific objectives of stakeholders  Teachers vs instructional designers in online education  Institutions vs funders  Designing and focusing indicators  Necessary skills for interpretation and communicating outcomes  Representation in clear and usable formats for stakeholders  DCU to research the impact on students 22 Concluding Remarks…

Editor's Notes

  1. More information on Friday 16.30-17.00, A902
  2. See also the EUROCALL presentation (Bart’s keynote), which is based on learner behaviour as opposed to learning
  3. Additional info 1. The 8 principles specific to learning analytics as set out by the OU were followed, as well as general ethical principles (gaining consent, no deception of participants, correct handling of data, nothing that will cause harm to participants, no under-18s participating, gaining ethical approval from the REC before commencing research) 2. Students are made fully aware of what is involved in the study, how it will affect them, and the risks and benefits of taking part. 3. Students are actively involved with the intervention on a weekly basis 4. All visible data is changeable by students. Students are informed that data will change based on their interaction with Moodle, and they can change their data point on a weekly basis by engaging more with the platform. 5. Data is free from bias as it is calculated by a computer program. All data is calculated the same way for each student. All data is displayed the same way for each student. 6. The only data in use here is Moodle interaction data. This in no way defines the student. All participants are informed that our prediction of their pass/fail is based on Moodle interaction data, which is not the only factor in student results. 7. All researchers are trained in data handling and contracts have been signed to prevent the passing of data to other parties. 8. Should this model be successful in predicting student scores and improving engagement, this tool could be used on a wider scale to improve student retention and progression through education.
  4. Lovely “repeatability” emerges when looking at VLE activity through the years, but thinking about it, it makes perfect sense – students are the same year in, year out, cramming at key times. So, as good as this graph is, it should not be surprising