Durham University’s first institution-wide implementation of eXplorance Blue - Malcolm Murray
Joint presentation with Julie Mulvey (Blue Admin).
Details how we used features in Blue such as the institutional hierarchy, DIG, the Question Bank and Role-Based Dynamic Access to simplify the process. Also discusses a bespoke data-manipulation tool (the QuBE) used to prepare data for evaluations.
Implicit bias and assumptions in user recruitment - UX in the City Manchester - Jessica Lewes
A lightning talk for UX in the City Manchester 2018, starting a conversation around the implicit bias and assumptions that affect user recruitment and research.
Reinventing Discovery: An Analysis of Big Data - Andrew Nagy
As the late Steve Jobs would say, don’t listen to your customers; they don’t know what they want. Usability studies and live user analysis provide valuable feedback about your product or website in terms of how the tool is used, but listening to users about what they want out of the tool can result in a “whack-a-mole” scenario where you solve a problem for one user but create new problems for other users. Analyzing usage data can provide a very different perspective on how live users actually use the tool and allow you to identify different personas and use cases. This talk will share how Serials Solutions collects and analyzes a dataset of queries and clicks generated by millions of users at hundreds of libraries around the world to find behaviors, patterns, successes and failures in the interface design and search algorithms, and then how we leverage that to improve and redesign. We will share the details of our custom-developed data warehouse system and how we leverage these tools to perform our analysis. We will also share before-and-afters that were developed based on the results of the ongoing analysis.
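As a toy illustration of the kind of query-and-click analysis described, assuming only a log of (query, clicked-anything) events; the data, field names and 50% threshold below are invented, not Serials Solutions' actual schema:

```python
from collections import defaultdict

# Invented toy log: each entry is (search query, whether any result was clicked).
log = [("jstor", True), ("jstor", True), ("nature", True),
       ("nature", False), ("heart surgery", False), ("heart surgery", False)]

counts = defaultdict(lambda: [0, 0])  # query -> [clicks, searches]
for query, clicked in log:
    counts[query][0] += int(clicked)
    counts[query][1] += 1

# Click-through rate per query: persistently low CTR flags searches where
# the ranking or interface may be failing users.
ctr = {q: clicks / total for q, (clicks, total) in counts.items()}
failing = sorted(q for q, rate in ctr.items() if rate < 0.5)
print(failing)  # → ['heart surgery']
```

At warehouse scale the same aggregation would run over millions of events, but the principle (aggregate behaviour, then rank by a failure signal) is unchanged.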
Test Fest: Catching up on Your Usability Testing Backlog - Sarah Joy Arnold
Presentation at North Carolina Librarians' Association Biennial 2017 in Winston-Salem, NC. Part of "So Many Users, Not Enough Time: Large Scale Usability Testing Methods" with Chad Haefele and Scott Goldstein.
This presentation will discuss the process that Appalachian State University Libraries used to measure and test website usability during its recent redesign and migration to a new Drupal theme. It will emphasize how we recruited a large number of users and how large sample sizes promote better design decisions. While web usability research is well known for its flexibility in needing only about a dozen users to discover most problems, robust data-driven decisions are best supported by datasets that are large and significant. Attendees will learn techniques for surveying and testing more users without greatly compromising the richness of data collected.
At UNC-Chapel Hill, the User Experience and Assessment department regularly runs usability tests to inform our decision-making and prioritize our users' perspective as we make changes. But there are more things to test than there are hours in the day. Our projects have a variety of stakeholders who are very interested in improving their services, and we found ourselves with a long list of tests we wanted to run. To catch up, we adapted Harvard Libraries' Test Fest model: five tests run simultaneously, with five participants rotating through the set of tests. Over a span of two hours, we completed 25 individual usability tests. In this one event, we caught up on much of our testing backlog. This session will outline how we planned and executed Test Fest, how we recruited participants, and what we learned from using this approach. We'll also discuss our methodologies and briefly look at the results of each test.
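The logistics described (five tests, five participants rotating, 25 sessions in two hours) amount to a Latin-square rotation. This is a hedged sketch of that scheduling idea, not the event's actual schedule; the participant and test labels are invented:

```python
def rotation_schedule(n=5):
    """Round r, participant p -> test station (p + r) mod n: a Latin square,
    so every participant completes every test exactly once."""
    return [[(p + r) % n for p in range(n)] for r in range(n)]

schedule = rotation_schedule()
for r, round_ in enumerate(schedule, 1):
    # e.g. "Round 1: P1->T1, P2->T2, ..."
    print(f"Round {r}: " + ", ".join(f"P{p + 1}->T{s + 1}"
                                     for p, s in enumerate(round_)))
# 5 rounds x 5 participants = 25 individual usability tests in one event.
```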
Collection of Resources for growing Open Science - Anton Osika
This project was created during the CERN Webfest Hackathon 2013.
http://pear.ly/cg09j
Resources:
http://en.wikipedia.org/wiki/Science_2.0
http://scienceresources.wikidot.com/start
Keynote talk given at the Learning Analytics Summer Institute 2016 (LASI16) at the University of Deusto, Bilbao, Spain in June 2016 by Rebecca Ferguson.
What does the future hold for learning analytics? In terms of Europe’s priorities for learning and training, analytics will need to support relevant and high-quality knowledge, skills and competences developed throughout lifelong learning. More specifically, it should improve the quality and efficiency of education and training, enhance creativity and innovation, and focus on learning outcomes in areas such as employability, active citizenship and well-being. This is a tall order and, in order to achieve it, we need to consider how our work fits into the larger picture. Drawing on the outcomes of two recent European studies, Rebecca will discuss how we can avoid potential pitfalls and develop an action plan that will drive the development of analytics that enhance both learning and teaching.
Deriving value from analytics requires much more than purchasing technology. The University of Kentucky's analytics journey combined fostering a bottom-up, emergent community of practice with top-down organizational maneuvers. This presentation shares different aspects of the University of Kentucky story.
SDAL/AIR Education Workforce Analytics Workshop, Jan. 7, 2014 - kimlyman
The American Institutes for Research (AIR) and Virginia Tech are collaborating to explore and develop new approaches to combining, manipulating and understanding big data. The two are also looking at how big data analytics can help answer questions critical to solving issues in education, workforce, health, and human and social development. They held two workshops on January 7 and 27, 2014: the first on Education and Workforce Analytics and the second on Health and Social Development Analytics.
Presentation on learning analytics given by Rebecca Ferguson at the Nordic Learning Analytics Summer Institute (Nordic LASI), organised by the SLATE Centre, in Bergen, Norway, 29 September 2017.
Presentation by Rebecca Ferguson (IET, The Open University, UK) at the Learning Analytics Summer Institute event (LASI Asia) run in Seoul, South Korea, in September 2016. This presentation, on Visions of the Future of learning analytics, is based on work carried out by the European consortium working on the Learning Analytics Community Exchange (LACE) project.
Learning analytics and Moodle: So much we could measure, but what do we want to measure? A presentation to the USQ Math and Sciences Community of Practice May 2013
Learning Analytics – Ethical questions and dilemmas - Tore Hoel
Workshop presentation using the Potter Box model of ethical reasoning to discuss concerns and dilemmas of Learning analytics - Open Discovery Space and Learning Analytics Community Exchange projects #laceproject #ods_eu
The ethics of MOOC research: why we should involve learners - Rebecca Ferguson
Presentation given by Rebecca Ferguson at the FutureLearn Academic Network (FLAN) meeting at the University of Southampton, UK, on 2 December 2015. #flnetwork
Learning Analytics: What it is, where we are, and where we could go - Doug Clow
Talk given at the Computers and Learning Research Group (CALRG) annual conference, 12 June 2013, at The Open University, UK.
This presentation briefly reviews learning analytics, using some key examples. It then assesses what the OU is doing, and then sets out some ideas for what the OU could do in future to harness the potential of data about our learners to improve their learning.
Scaling up Innovation: Why Theories of Change Matter - Brandon Muramatsu
by Elaine Seymour, University of Colorado at Boulder. Presented at the Workshop on Disseminating CCLI Innovations: Arlington, VA, February 18-19, 2010. Workshop organized by Joe Tront, Flora McMartin and Brandon Muramatsu.
Using Experiments and Cognitive Science Research to Improve the Design of Onl... - Joseph Jay Williams
The recent explosion of online educational resources has the potential to reorganize how we learn – from K-12 and university to the workplace and the informal learning we do every day. It also raises new questions and opportunities for research that crosses the many disciplines relevant to designing computer programs that help people learn. For example, HCI and cognitive science can provide complementary perspectives in investigating how to design the content and instructional features of an online course, such that a person processes and stores that information in a way that successfully guides their future behavior. Online educational environments provide new optimism in tackling challenges like these because they can be instrumented to collect an unprecedented scale and diversity of data, and allow iterative sequences of experiments to be embedded in authentic educational contexts with real students.
This talk presents one approach to this kind of research, using experimental comparisons to test the effects of modifying online mathematics exercises to include motivational messages and question prompts for people to explain, the design of which is guided by the psychological literature on motivation and learning. A combination of laboratory experiments and experiments embedded in real-world online education platforms (like www.KhanAcademy.org) reveal that prompting people to explain “why?” facts are true drives them beyond memorization to uncover underlying principles and patterns, and that teaching such self-questioning strategies may accelerate student learning. Motivational messages appear to have limited benefits if they are simply encouraging or aimed at raising confidence, but do increase how much effort students invest if the messages emphasize that aptitude is malleable and can be improved through persistence. Several planned experiments are presented which also use this paradigm of adding minimal but effective textual changes to online exercises to achieve practical impact and explore basic cognitive science questions about learning.
What Actually Is Artificial Intelligence? - Doug Clow
Talk for MK Geek Night, 23 Sep 2021
AI means more hype, more technology, more future - and more money! But what actually is it? In this talk, Doug will explain what people mean by artificial intelligence and machine learning, what sort of problems they can solve, and how they do it. We'll see a range of examples where they're being used, and look at how it goes well and how it goes wrong, from entertaining AI weirdness to serious algorithmic bias. You won't end up being able to implement techniques like Support Vector Machines or Generative Adversarial Networks (unless you already could) but you should end up with a better idea of what the people who can are up to.
How to get to Runter End: Generating English placenames with a neural network - Doug Clow
These are slides for a talk at MK Geek Night, Thu 7 March 2019. Doug trained a neural network on the official database of placenames in England, then got it to generate its own suggestions. Some were convincing, some were funny, and some even turned out to be real places. Doug will give a bit of an explanation of how he did it, and show some of the best results.
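The talk used a neural network trained on the official placename database; neither the network nor that data is reproduced here. As a much simpler stand-in for the same generate-from-examples idea, this sketch trains a character-level Markov chain on a tiny invented sample of names:

```python
import random
from collections import defaultdict

# Tiny invented training sample (the real talk used the full official
# database of English placenames, and a neural network rather than this).
names = ["Milton Keynes", "Bletchley", "Buckingham", "Brackley", "Bicester"]

ORDER = 2  # each next character is predicted from the previous two

# Transition table: 2-character state -> list of observed next characters.
model = defaultdict(list)
for name in names:
    padded = "^" * ORDER + name.lower() + "$"  # ^ = start, $ = end
    for i in range(len(padded) - ORDER):
        model[padded[i:i + ORDER]].append(padded[i + ORDER])

def generate(rng, max_len=40):
    """Walk the chain from the start state until the end marker (or a cap)."""
    state, out = "^" * ORDER, []
    for _ in range(max_len):
        nxt = rng.choice(model[state])
        if nxt == "$":
            break
        out.append(nxt)
        state = state[1:] + nxt
    return "".join(out).title()

print(generate(random.Random(0)))  # a blend of the training names
```

With a large corpus and a higher-capacity model (such as the neural network in the talk), outputs become far more convincing, but the mechanism of sampling the next character given recent context is the same.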
A partial history of Educational Technology at the Open UniversityDoug Clow
This is a talk given at the OU's Computers and Learning Research Group, on 17 Jan 2019. In it I give a very partial history of educational technology at the Open University, since its founding in 1969 to the present day. It’ll be partial in multiple senses. A full history would take far longer than a single session. If I gave a comprehensively synoptic account, it’d be too broad-brush to be interesting. So I’ll be selecting elements to focus on, and I’ll be unashamedly partial in picking the ones that appeal particularly to me. We’ve always been pioneers in using technology to help our students learn. What that means has changed profoundly in some ways, and is much the same in others. As Santayana said, “Those who cannot remember the past are condemned to repeat it.” Come along to hear the digital equivalent of “I remember when all this was fields”!
Where is the evidence? A call to action for learning analytics - Doug Clow
Keynote presentation at LASI-Rocky Mountains online conference, 12 June 2017, based on a similar talk at LAK17, Learning Analytics and Knowledge Conference 2017, Vancouver. An analysis of the nature of evidence, the state of the evidence in the field of learning analytics, and some suggestions for ways to improve, based on work from the LACE project's Evidence Hub.
Trains and Balloons: An Introduction to Learning Analytics - Doug Clow
Slides for a talk given at the Institute of Physics Higher Education Group meeting on Concept Inventories and Learning Analytics, Tue 4 April 2017, Open University, UK
A Whistle-stop Tour of Theories for TEL Research - Doug Clow
Presentation to postgraduate students at the Institute of Educational Technology, The Open University, UK, 28 Feb 2017. A very brief overview of some of the theories that are often referenced in TEL research.
LAEP Visions of the Future of Learning Analytics - Doug Clow
Presentation on the LACE project's Visions of the Future of Learning Analytics work from the LAEP project's expert workshop in Amsterdam, 15-16 March 2016.
How can universities scale up learning analytics beyond small-scale pilots to seriously use data to improve student learning? This interactive workshop was designed to help you think this through for your institution.
Universities are hard to change. Having good data and analytics is a good start, but is only one part of success. This session will provide tools and frameworks to help you analyse what else is needed, building on experiences of successful large-scale learning analytics activity at the Open University and the University of Technology, Sydney, and from the pan-European Learning Analytics Community Exchange project.
Slides for a talk at Bett, London, 20 January 2016.
Visions of the Future of Learning Analytics - Doug Clow
Eight visions of the future of learning analytics, created as a way of exploring possible futures by the LACE (Learning Analytics Community Exchange) Project, and presented at Bett 2016, London, 20 January 2016
Moving through MOOCs: Pedagogy, Learning and Patterns of Engagement.
Presentation at EC-TEL 2015, September, 2015, Toledo, Spain.
[This is the shorter, more visual version. The detailed version is available at http://www.slideshare.net/R3beccaF/moving-through-moocs-pedagogy-learning-and-patterns-of-engagement.]
Massive open online courses (MOOCs) are part of the lifelong learning experience of people worldwide. Many of these learners participate fully. However, the high levels of dropout on most of these courses are a cause for concern. Previous studies have suggested that there are patterns of engagement within MOOCs that vary according to the pedagogy employed. The current paper builds on this work and examines MOOCs from different providers that have been offered on the FutureLearn platform. A cluster analysis of these MOOCs shows that engagement patterns are related to pedagogy and course duration. Learners did not work through a three-week MOOC in the same ways that learners work through the first three weeks of an eight-week MOOC.
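The summary above does not specify which clustering method was used, so as a hedged illustration only, here is a minimal k-means sketch on invented weekly-activity vectors showing how engagement patterns might be grouped:

```python
import random

# Synthetic data: each learner is a vector of the fraction of course steps
# completed in weeks 1-3. Two engagement patterns are planted: learners who
# keep going, and learners who sample week 1 and then drop out.
learners = [
    [0.9, 0.8, 0.9],
    [1.0, 0.9, 0.8],
    [0.8, 0.1, 0.0],
    [0.9, 0.0, 0.0],
    [0.7, 0.2, 0.1],
]

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means: assign each point to its nearest centroid, recompute
    centroids as group means, repeat."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            dists = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centroids]
            groups[dists.index(min(dists))].append(p)
        centroids = [[sum(col) / len(g) for col in zip(*g)] if g else centroids[i]
                     for i, g in enumerate(groups)]
    return groups

clusters = kmeans(learners, k=2)
print(sorted(len(g) for g in clusters))  # → [2, 3], the planted split
```

Real MOOC analyses would use one vector per learner per course week and validate the cluster count, but the grouping step works the same way.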
Creating an action plan for learning analytics - Doug Clow
Slides for a talk at Bett 2015, London, on Friday 23 January at Excel.
Learning analytics has great potential. By using data more effectively, we can understand and improve learning and the learning environment. Trail-blazing projects, exciting demonstrations and earnest strategy papers set out a compelling vision for data in HE.
That vision can sometimes seem far from institutional reality. How can we get some of those benefits for our learners?
This interactive workshop will help participants assess their institution’s current capability for making use of learning analytics, and help them plan for action. The facilitators will draw on a wide range of practical experience, including from the pan-European Learning Analytics Community Exchange project.
Learning Analytics: Making learning better? - Doug Clow
Slides for a talk at Bett 2015, London, Fri 23 January, as part of the LACE project (www.laceproject.eu)
This panel discussion starts with a short introduction to learning analytics and educational data mining, highlighting how European schools are using different types of data to help support, manage and predict learning outcomes. It includes viewpoints from national school networks in the Nordic countries and the Netherlands, a research input from the European Commission-supported LACE project highlighting research on the use of learning analytics, and an expert input on ethical and privacy issues in the application of learning analytics. Participants will be encouraged to share their views and, where interested, to join the growing LACE community.
Learning Analytics Examples from the UK, Australia and North America - Doug Clow
Examples of Learning Analytics from the UK, Australia and North America, aimed at schools level. Slides from a talk at a pre-conference seminar on learning analytics at the EMINENT conference, European Schoolnet, Pädagogishe Hochschule Zürich, 12 November 2014.
What is Learning Analytics? Slides from a talk at a pre-conference seminar on learning analytics at the EMINENT conference, European Schoolnet, Pädagogishe Hochschule Zürich, 12 November 2014.
Learning Analytics: A General Introduction and Perspectives from the UK - Doug Clow
A presentation at a seminar on learning analytics for schools held at Skolverket, the Swedish National Agency for Schools, in Stockholm, Sweden, in collaboration with the Norwegian Centre for ICT in Education, on 9 October 2014. Part of the LACE project #laceproject www.laceproject.eu
http://lanyrd.com/2014/seminar-on-learning-analytics-for-schools-in-sto-2/
Model Attribute Check Company Auto Property - Celine George
In Odoo, the multi-company feature allows you to manage multiple companies within a single Odoo database instance. Each company can have its own configurations while still sharing common resources such as products, customers, and suppliers.
Exploiting Artificial Intelligence for Empowering Researchers and Faculty, In... - Dr. Vinod Kumar Kanvaria
Exploiting Artificial Intelligence for Empowering Researchers and Faculty,
International FDP on Fundamentals of Research in Social Sciences
at Integral University, Lucknow, 06.06.2024
By Dr. Vinod Kumar Kanvaria
How to Make a Field invisible in Odoo 17 - Celine George
It is possible to hide some fields in Odoo, commonly by using the “invisible” attribute in the field definition. This slide will show how to make a field invisible in Odoo 17.
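As a minimal sketch of the technique the slide describes, here is a hypothetical Odoo 17 view definition; the record id and the `internal_notes` / `discount` field names are invented for illustration:

```xml
<!-- Hypothetical Odoo 17 form view: record id and the internal_notes /
     discount field names are invented for illustration. -->
<record id="view_partner_form_demo" model="ir.ui.view">
    <field name="name">res.partner.form.demo</field>
    <field name="model">res.partner</field>
    <field name="arch" type="xml">
        <form>
            <!-- Unconditionally hidden -->
            <field name="internal_notes" invisible="1"/>
            <!-- Conditionally hidden: Odoo 17 accepts an expression
                 directly in the invisible attribute. -->
            <field name="discount" invisible="not is_company"/>
        </form>
    </field>
</record>
```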
A Strategic Approach: GenAI in Education - Peter Windle
Artificial Intelligence (AI) technologies such as Generative AI, image generators and Large Language Models have had a dramatic impact on teaching, learning and assessment over the past 18 months. The most immediate threat AI posed was to academic integrity, with Higher Education Institutes (HEIs) focusing their efforts on combating the use of GenAI in assessment. Guidelines were developed for staff and students, and policies were put in place. Innovative educators have forged paths in the use of Generative AI for teaching, learning and assessment, leading to pockets of transformation springing up across HEIs, often with little or no top-down guidance, support or direction.
This Gasta posits a strategic approach to integrating AI into HEIs to prepare staff, students and the curriculum for an evolving world and workplace. We will highlight the advantages of working with these technologies beyond the realm of teaching, learning and assessment by considering prompt engineering skills, industry impact, curriculum changes, and the need for staff upskilling. In contrast, not engaging strategically with Generative AI poses risks, including falling behind peers, missed opportunities and failing to ensure our graduates remain employable. The rapid evolution of AI technologies necessitates a proactive and strategic approach if we are to remain relevant.
Read: The latest issue of The Challenger is here! We are thrilled to announce that our school paper has qualified for the NATIONAL SCHOOLS PRESS CONFERENCE (NSPC) 2024. Thank you for your unwavering support and trust. Dive into the stories that made us stand out!
2024.06.01 Introducing a competency framework for language learning materials ... - Sandy Millin
http://sandymillin.wordpress.com/iateflwebinar2024
Published classroom materials form the basis of syllabuses, drive teacher professional development, and have a potentially huge influence on learners, teachers and education systems. All teachers also create their own materials, whether a few sentences on a blackboard, a highly-structured fully-realised online course, or anything in between. Despite this, the knowledge and skills needed to create effective language learning materials are rarely part of teacher training, and are mostly learnt by trial and error.
Knowledge and skills frameworks, generally called competency frameworks, for ELT teachers, trainers and managers have existed for a few years now. However, until I created one for my MA dissertation, there wasn’t one drawing together what we need to know and do to be able to effectively produce language learning materials.
This webinar will introduce you to my framework, highlighting the key competencies I identified from my research. It will also show how anybody involved in language teaching (any language, not just English!), teacher training, managing schools or developing language learning materials can benefit from using the framework.
The French Revolution, which began in 1789, was a period of radical social and political upheaval in France. It marked the decline of absolute monarchies, the rise of secular and democratic republics, and the eventual rise of Napoleon Bonaparte. This revolutionary period is crucial in understanding the transition from feudalism to modernity in Europe.
For more information, visit www.vavaclasses.com
Biological screening of herbal drugs: introduction and the need for phyto-pharmacological screening; new strategies for evaluating natural products; in vitro evaluation techniques for antioxidant, antimicrobial and anticancer drugs; in vivo evaluation techniques for anti-inflammatory, antiulcer, anticancer, wound healing, antidiabetic, hepatoprotective, cardioprotective, diuretic and antifertility drugs; toxicity studies as per OECD guidelines.
Operation “Blue Star” is the only event in the history of independent India where the state went to war with its own people. Even after about 40 years it is not clear whether it was the culmination of the state's anger towards the people of the region, a political game of power, or the start of a dictatorial chapter in the democratic setup.
The people of Punjab felt alienated from the mainstream due to the denial of their just demands during a long democratic struggle since independence. As happens all over the world, this led to a militant struggle with great loss of lives among military, police and civilian personnel. The killing of Indira Gandhi and the massacre of innocent Sikhs in Delhi and other Indian cities were also associated with this movement.
1. Evidence Hub activity
LACE SoLAR Flare, 24 October 2014
The Open University, Milton Keynes, UK
Co-Chairs: Doug Clow, Rebecca Ferguson, Simon Cross
2. “Big data is like teenage sex: everyone talks about it, nobody really knows how to do it, everyone thinks everyone else is doing it, so everyone claims they are doing it…” – Dan Ariely, Facebook, 6 Jan 2013
3. “Big data is like teenage sex: everyone talks about it, nobody really knows how to do it, everyone thinks everyone else is doing it, so everyone claims they are doing it…” – Dan Ariely, Facebook, 6 Jan 2013
… and the world of education seems obsessed about it, but the little that does go on is often done badly, and leaves people disillusioned.
4. Nobody seems to know what they’re talking about … Let’s change that!
Juvenile boat-tailed grackles, Quiscalus major
Photo (CC)-BY-SA Andrea Westmoreland https://www.flickr.com/photos/andrea_pauline/4768171665
5. What do we know, collectively? What do we not know?
“[T]here are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns - the ones we don't know we don't know.” – Donald Rumsfeld
Photo public domain http://en.wikipedia.org/wiki/File:Donald_Rumsfeld_Tommy_Franks.jpg
6. Evidence is for or against …
• hypotheses
• claims
• research questions
• propositions
Photo (CC)-BY Brian Hillegas https://www.flickr.com/photos/seatbelt67/502255276
7. Hypothesis A: Learning
Learning analytics improves learning outcomes.
Learning is at the heart of learning analytics. Do we see real improvements in learning outcomes for learners? We might be able to see patterns in learners’ data, but can we take action based on those patterns that improves their learning? We might be able to personalise learning based on learners’ data, but does that make any difference to how much they learn?
This hypothesis is about improved learning outcomes: e.g. cognitive gains, improved assessment marks, better scores on tests, attainment results.
Example positive evidence: a study showing measurable improvements in scores on a test among learners who received study prompts from a learning analytics system compared to the usual teaching approach.
Example negative evidence: a study showing no significant difference in assessment results before and after the introduction of a learning analytics dashboard.
8. Hypothesis B: Teaching
Learning analytics improves learning support and teaching, including retention, completion and progression.
Does learning analytics optimise the learning process? We would expect that to lead to more efficient processes, allow resources to be better targeted, and save money and time. We’d also expect performance metrics (other than attainment) to improve. Does learning analytics lead to improvements in retention, completion and progression?
This hypothesis is about improvements to teaching and learning that are not direct learning gains by the learner.
Example positive evidence: a study showing improved retention among student cohorts whose tutors were prompted to contact at-risk learners identified by a predictive model compared to other cohorts.
Example negative evidence: a learning analytics project that increased costs but resulted in no improvements in efficiency or performance.
9. Hypothesis C: Uptake
Learning analytics are taken up and used widely, including deployment at scale.
Is learning analytics a fad that will never really get off the ground at real scale? Will it ever move beyond pilot projects and demos? If a system is deployed across a whole organisation, do the teachers and learners actually use it?
This hypothesis looks at the level of usage of learning analytics, and is concerned with institutional and policy perspectives.
Example positive evidence: a survey of usage of learning analytics dashboards across the university sector in one country finds them in use in more than half of organisations.
Example negative evidence: a predictive modelling project is rolled out across a group of schools, but usage is low and the project is discontinued.
10. Hypothesis D: Ethics
Learning analytics is used in an ethical way.
Learning analytics raises many ethical issues, around privacy, transparency, surveillance, data ownership and control, and data protection. Can these real concerns be addressed effectively, or will they prove to be barriers?
This hypothesis is about the ‘should we’ questions, rather than the ‘can we’ ones addressed by the other hypotheses.
Example positive evidence: an organisation develops an ethics policy for learning analytics that is warmly received by learners and other stakeholders.
Example negative evidence: a large-scale project to gather analytics data across multiple schools is shut down because of concerns about privacy.
12. Your mission
• Engage with these questions
• Pool our expertise
• Inform, discuss, debate
(CC) Mike Licht on Flickr http://www.flickr.com/photos/notionscapital/4436135087/
13. Your task: build a SoLAR system
• You each have two hypotheses
• What evidence do you know in favour? Against?
– Wide definition: from “this systematic review of RCTs in Science proved it” to “I’ve heard of someone who is doing this and it seems to be OK”
• Write on Post-Its:
– Positive = yellow, Negative = green
Photo (CC)-BY Image Editor https://www.flickr.com/photos/11304375@N07/2818891443
14. Go to your room
• Lists on the board outside
• Finish at 3.30 and come back here.
• I’ll circulate, take photos, and summarise.
15. www.laceproject.eu | @laceproject
“Evidence Hub Activity” by Doug Clow, Institute of Educational Technology, The Open University, was presented at the LACE SoLAR Flare, The Open University, Milton Keynes, UK, on 24 October 2014.
@dougclow | dougclow.org | doug.clow@open.ac.uk
This work was undertaken as part of the LACE Project, supported by the European Commission Seventh Framework Programme, grant 619424.
These slides are provided under the Creative Commons Attribution Licence: http://creativecommons.org/licenses/by/4.0/. Some images used may have different licence terms.
Editor's Notes
It is my great pleasure to welcome you all, here and online.
And this is a terrible shame because with the right techniques and the right context, where there’s trust and enthusiastic consent, it can be pretty good.
Avoid unknown knowns – things we know but don’t realise we know, like whether there were WMD in Iraq.