1. CRICOS No. 00213J
ELEARNING SERVICES
DIVISION OF TECHNOLOGY, INFORMATION AND LIBRARY SERVICES
Getting value out of evaluation
Elizabeth Greener, Online Learning Solutions Manager
Tabetha Bozin, Online Learning Solutions Production Manager
2.
QUT at a glance
• Two inner-city campuses
• Nearly 50,000 students
7.
QUT since 2016
Big Data Analytics Program
Four courses, two runs each
Social Media Analytics
Three runs
Business Process Management
Two runs
Introducing Robotics
Three courses, one run each
Predictive Analytics
One run
Kickstart your career
One run
15 courses (18 runs)
14.
Evaluation cycle
Design evaluation: eLearning lifecycle of the Robotics MOOC project, 2013–2015.
Based on Phillips, McNaught and Kennedy (2012)
16.
Evaluation cycle
BEFORE
Agreement on key learning experiences, innovation, performance indicators
Plus
Orientation for educators and mentors, processes for support
17.
Evaluation cycle
DURING
Monitoring
• Weekly meetings, shared communications (issues log, Google Doc, analytics, FAQs)
• Data snapshots (enrolments, step activity, comments)
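The weekly data snapshots above lend themselves to a small script. Here is a minimal sketch of the idea, assuming CSV-style exports; the `snapshot` function, the file layout and the column names (`learner_id`, `step`, `completed_at`) are illustrative assumptions, not FutureLearn's actual export schema.

```python
import csv
import io
from collections import Counter

def snapshot(enrolments_csv, step_activity_csv):
    """Summarise one weekly snapshot: total enrolments and
    per-step completion counts. Column names are illustrative."""
    enrolments = list(csv.DictReader(io.StringIO(enrolments_csv)))
    steps = list(csv.DictReader(io.StringIO(step_activity_csv)))
    # Count a step as completed only when a completion timestamp is present
    completed = Counter(r["step"] for r in steps if r["completed_at"])
    return {
        "enrolments": len(enrolments),
        "steps_completed": dict(completed),
    }

# Tiny made-up data in lieu of a real export
enrol = "learner_id,enrolled_at\nu1,2016-01-01\nu2,2016-01-02\n"
activity = ("learner_id,step,completed_at\n"
            "u1,1.1,2016-01-03\nu2,1.1,\nu2,1.2,2016-01-04\n")
print(snapshot(enrol, activity))
# → {'enrolments': 2, 'steps_completed': {'1.1': 1, '1.2': 1}}
```

Taking the same snapshot shape every week makes the run-over-run benchmarking described in the "after" stage a simple comparison of like with like.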
18.
Evaluation cycle
AFTER
• Snapshot report, benchmarking, lessons learned
• Innovation register
• Next-run priorities
21.
Innovation as part of the evaluation process
Program Design
Course redesign
MATLAB LTI integration
Live Events
Engage - Enrol Design
23.
Value in Evaluation
Design one process to meet the needs of many
Establish clear roles and provide training
Part of design and delivery processes
Inform strategy, architecture, teaching practice
Build into reporting processes
Create templates
24.
Summary
• Evaluation is multifaceted; its outcomes have different uses for different stakeholders; it needs to be dynamic and action-oriented
Future
• How best to collect, transform, visualise and interpret data; how to analyse it in a way that is meaningful for teaching staff; coping with changes to the FutureLearn platform, business models and datasets. How to scale up?
25.
Each course evaluation builds knowledge and understanding of online learning and improves practice.
Good morning.
To answer the "why?" (or "what is the value in evaluation?"), I will talk about what we do and how we do it. To start, and briefly, here is our remit in the university. To fulfil it, we need to know whether what we are doing is working as part of an ecosystem. We want to inform teaching practice and the development of online learning for our students, and to contribute to enabling an ambitious transformational agenda.
To explain the ecosystem (at least the FutureLearn part): it is made up of pink blocks.
FutureLearn now enables a range of course models, and we explain to educators how each model is made up of blocks, or courses. We evaluate courses at this level, the pink-block level if you like!
And we show how this is part of an even bigger picture in an engage-and-enrol model.
We apply an evaluative process around each pink block, or course. And we recognise that evaluation happens at all stages, as an important part of an iterative course design cycle.
Our evaluation processes have been streamlined over time, but were built on the work we did on our first MOOCs. That was a much deeper analysis and gave us the theoretical underpinnings for our current processes. We now need something that is replicable, scalable and consistent across course developers, so we can inform a range of stakeholders.
So this is easier: before, during and after!
So what do we do? A range of things, too many to mention here, but I am happy to share later.
We have a role allocated to establish the process, which ultimately will be more self-service and templated. This lines up with other work here on dashboards and so on. We know it is not just about providing data and processes; we also need to build capacity for mentors and educators, firstly in being able to teach online, and secondly in understanding what the data means and then what to do with it.
Here is an example of a template we developed.
I have included this because it is a common requirement at many institutions: the need to consider innovative teaching approaches for online learning. We have formalised this into an innovation register; here are some examples. This way, once we have recognised an innovation in a course, we can focus on its efficacy and then make recommendations, either for the next delivery or for scaling up. It also helps establish credibility for what we do, and allows other parts of the university to see value beyond, for example, revenue.
Here are some more examples. I have included these because we are presenting them tomorrow in a webinar on digital literacies. That is the power of having the register: it can be multipurposed.
Value
Redelivery of courses
New models of online learning
Inform strategy – architecture, transformation of teaching practices