
Data informed decision making - Yaz El Hakim


Presented at LILAC 2016



  1. Data-informed decision making: Research data that can inform library procurement and investment in the student learning experience. By Yaz El Hakim and Emma Warren-Jones
  2. Session Objectives • To consider the history and future of data within HE, e.g. the Internet of Things (IoT) • To discuss the concept and benefits of learner analytics • To outline and discuss some of the ethical considerations and suggested good-practice principles • To collect some raw data and illustrate the insights that could be drawn from these data at both the library and individual levels.
  3. Workshop Outline 0-15 mins: Introduction and presentation of key concepts around learner analytics, big data and ethics. 15-35 mins: The group will be asked to participate in the collection of some raw data in order to consider ways in which currently uncollected and under-utilised data sets may facilitate future decisions within the library and other departments. What may be defined as engagement, as opposed to monitoring, will also be discussed in relation to the JISC publication (Sclater, 2014). 35-50 mins: Discussion and concluding points. 50-60 mins: Q&A
  4. Code of practice for Learning Analytics • Responsibility of educational institutions to ensure learning analytics are conducted responsibly, appropriately and effectively • Transparency, privacy and validity of data • Monitoring vs improving the learning experience
  5. So why are learning analytics so important? • Their value… • Not in the monetary sense (though that too) but in their ability to improve the student’s learning experience. • To realise the learning potential of a student within the context of mass higher education, learning analytics can enhance the quantity and depth of feedback that teaching staff can provide. • Students who had opportunities to seek and give peer feedback were clearer about the expected standards and had greater self-awareness of their performance.
  6. Ice Breakers… • What types of data do you currently collect in your libraries? • What are you doing with these data (analysed in isolation or in combination)? • What decisions are these data being used to make? • Are there better datasets that you could benefit from? • What datasets do you currently use to make library investment decisions?
  7. Types of data currently being collected institutionally: what’s the problem with these data?
  8. The Teaching Excellence Framework (Chapter 3), common metrics • 12. After informal discussions with the sector, we believe at present there are three common metrics (suitably benchmarked) that would best inform TEF judgements. We propose initially to base the common metrics on existing data collections: • Employment/destination: from the Destination of Leavers from Higher Education survey (outcomes) and, from early 2017, the results of the HMRC data match. • Retention/continuation: from the UK Performance Indicators, which are published by the Higher Education Statistics Agency (HESA) (outcomes). • Student satisfaction indicators: from the National Student Survey (teaching quality and learning environment).
  9. NSSE is close, attendance is closer, but very little actual learning data is utilised. • Ironically, the library, or more specifically what the library provides and how students engage with it, is the central point of any degree and arguably the most explicit learning data available! • So why haven’t we been leading the way with real learning analytics…
  10. Actual Data Collection… More actual data flowing into the analytics will improve predictive modelling in future. Let’s collect some data: 1. Provide email addresses. 2. Add your favourite book to the RefME project (the ISBN is the most accurate source; see the lookup sketch below). 3. If you remember a specific quote, please add it.
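Why the ISBN matters for this exercise: it uniquely identifies an edition, so a single lookup returns unambiguous bibliographic metadata with no fuzzy matching on titles or author names. The snippet below is a minimal sketch, not RefME’s actual API; it resolves an ISBN via Open Library’s public books endpoint, and the function name and returned fields are assumptions made for illustration.

```python
import json
import urllib.request


def lookup_isbn(isbn: str) -> dict:
    """Resolve an ISBN to basic bibliographic metadata via Open Library.

    Illustrative only: RefME resolves ISBNs through its own data sources;
    Open Library is used here simply as a freely available example endpoint.
    """
    url = (
        "https://openlibrary.org/api/books"
        f"?bibkeys=ISBN:{isbn}&format=json&jscmd=data"
    )
    with urllib.request.urlopen(url, timeout=10) as response:
        payload = json.load(response)
    record = payload.get(f"ISBN:{isbn}", {})
    return {
        "title": record.get("title"),
        "authors": [a.get("name") for a in record.get("authors", [])],
        "publisher": (record.get("publishers") or [{}])[0].get("name"),
        "published": record.get("publish_date"),
    }


if __name__ == "__main__":
    # The ISBN pins down one edition, so the record comes back unambiguous.
    print(lookup_isbn("9780140328721"))
```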
  11. Real Learning Data • But the tradition of a degree is that you read for it… Therefore what you read matters; even more importantly, what you cite, use, remember and allow to inform and transform your worldview matters more!
  12. Data Link Structure (diagram) • Tools: citation generator, reference manager, API/widget • Citation data: citation style, access time, page, edits, annotation, quote • Source data: author, title, edition, publisher, publication, full text, date • User data: user, course, institution, level of study, field • Uses: library resource management, learner analytics, research impact, user analytics/bibliometrics, driving content usage (via RefME to the discovery service). A data-model sketch follows below.
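To make the linkage concrete, here is a minimal data-model sketch with field names inferred from the diagram above rather than taken from RefME’s actual schema: each citation event ties a user (course, institution, level and field of study) to a source, plus the context of the act of citing itself (style, time, page, quote, annotation).

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional


@dataclass
class User:
    # Learner context attached to every citation event (names are illustrative)
    user_id: str
    institution: str
    course: Optional[str] = None
    level_of_study: Optional[str] = None
    field_of_study: Optional[str] = None


@dataclass
class Source:
    # Bibliographic record that the citation points at
    title: str
    authors: list[str]
    edition: Optional[str] = None
    publisher: Optional[str] = None
    publication_date: Optional[str] = None


@dataclass
class Citation:
    # One citation event: links a user to a source at a moment in time,
    # together with the context of the act of citing
    user: User
    source: Source
    citation_style: str
    accessed_at: datetime
    page: Optional[str] = None
    quote: Optional[str] = None
    annotation: Optional[str] = None
```

Grouped by institution, such records would support the library-level reports; grouped by user, they would support the individual-level view described in the session objectives.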
  13. RefME+ Institute dashboard (screenshot): sign-ups, projects created, references created, user activity, field of study
  14. Journal Article dashboard (screenshot): publishers, container titles, reference types, reference creation
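The two dashboards above are essentially roll-ups of citation events. As a hedged sketch (the real RefME+ dashboards are produced by RefME’s own backend, not this code), the counts they display could be derived from event records carrying the fields sketched in the data model above:

```python
from collections import Counter
from typing import Iterable, Mapping


def institution_dashboard(events: Iterable[Mapping]) -> dict:
    """Roll citation events up into the kinds of counts the dashboards show.

    Each event is assumed to carry user_id, field_of_study, publisher and
    citation_style; these field names are illustrative, not RefME's schema.
    """
    events = list(events)
    return {
        "references_created": len(events),
        "active_users": len({e["user_id"] for e in events}),
        "by_field_of_study": Counter(e.get("field_of_study") or "unknown" for e in events),
        "by_publisher": Counter(e.get("publisher") or "unknown" for e in events),
        "by_reference_style": Counter(e.get("citation_style") or "unknown" for e in events),
    }


# Example roll-up over two toy events
print(institution_dashboard([
    {"user_id": "u1", "field_of_study": "History", "publisher": "Penguin", "citation_style": "harvard"},
    {"user_id": "u2", "field_of_study": "History", "publisher": "OUP", "citation_style": "apa"},
]))
```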
  15. Conclusion • Institutional libraries and librarians have a huge role to play in the leadership of learning analytics. • The use of data is delicate and needs to be well prepared for, in both its collection and its utilisation. • Development of reports and interventions at institutional and individual levels will transform much of how we support learning and research journeys.
  16. References
Data intelligence notes - estates management (2011). Available from: https://www.hesa.ac.uk/intel?name=bds_emr [Accessed: 1.02.16].
King, J.H. & Richards, N.M. (2014) Big data ethics. Available from: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2384174 [Accessed: 1.02.16].
Kuh, G.D. (2003) What we’re learning about student engagement from NSSE: Benchmarks for effective educational practices. Change: The Magazine of Higher Learning, 35(2), 24–32.
Sclater, N. (2014) Code of practice for learning analytics: A literature review of the ethical and legal issues. Jisc. Available from: http://repository.jisc.ac.uk/5661/1/Learning_Analytics_A-_Literature_Review.pdf.
  17. www.refme.com hello@refme.com @GetRefME /RefME
