Joint presentation with Julie Mulvey, Blue Administrator.
Details how we used Blue features such as the institutional hierarchy, DIG, the Question Bank, and Role-Based Report Access to simplify the process. Also discusses a bespoke data-preparation tool (the QuBE) used to get data ready for evaluations.
2. @juliemulvey @malcolmmurray
Who we are
Julie Mulvey
Learning Technologist
Worked with online Evaluations since 2005
Certified Blue Administrator 2017
Dr Malcolm Murray
Head of Digital Learning
Developed a bespoke Evaluation system for Blackboard (2004-2011)
In 2018 developed a bespoke data prep tool for Blue: the QuBE
3. Session Structure
• Our initial pilot
• Institutional rethink
• Work with Professional Support Services
• Evaluation data gathering
• Preparing the data for Blue and DIG
• Launching the evaluations
• Response rates
• Lessons learned
Bluenotes GLOBAL 2018 Conference
4. @juliemulvey @malcolmmurray
Institutional Hierarchy
DIG – Data Integrity Gateway
Automated Processing – the QuBE
Question Bank
Role-Based Report Access
Novelty
Photo by Braydon Anderson, shared via Unsplash
6. @juliemulvey @malcolmmurray
Two phases:
• 60 postgraduate courses
• 100 undergraduate courses
Experience:
• Staff & students found it easy
• Data a “Spreadsheet nightmare”
• Low return – poor timing
• Reports delivered quickly
• Consistent format
7. Institutional Rethink
Education Committee tasked a group with looking at evaluations across the University
Findings
• MEQ data valued by staff
• High degree of variability in process
• A lot of manual report generation
• Aggregation beyond modules difficult
8. Institutional Rethink
Recommendations
• Reduce the number of questions students are asked
• Standardize evaluations using a mixture of core institutional questions, core department questions, and module-specific questions – allowing aggregation
• Reduce the administrative overhead by automatically producing reports designed for defined stakeholders
10. Working with Professional Support Services
We purchased Professional Support for this, our first large-scale implementation.
• Helped us make decisions by providing context at the points where we had choices to make
• Kept us on target
• Helped us adapt to changing institutional scope
• Helped design evaluations and reports
11. Evaluation Data Gathering
Process owned by our Academic Support Office
They produced a spreadsheet and sent it to departments:
• Choose 3 core Department questions from a list
• Write 3 individual Module questions per module
12. Setting Questions
23 / 25 Departments engaged and provided questions
• One department did their own thing
• Another department declined because they did not have their questions approved in time
Time-consuming process
Late process change – to add a Staff Rating question
13. Staff Rating Question
Went back to ask departments if they wanted to rate staff’s teaching
• 50% said Yes, 50% said No
• If Yes, need to list staff to rate
No institutional database of who teaches what, or who the module leader is
Delayed the survey launch – to just before exams
34. DIG: Data Integrity Gateway
Data imported into DIG so that designated admins could approve it over a two-week period (an illustrative feed-file sketch follows this list):
• Check all required modules were listed
• Confirm the Module Leader (for reporting)
• Check the list of Teachers to be rated
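To make the approval step concrete, here is a minimal sketch of the kind of per-module feed file we prepared for import. The column names (MODULE_CODE, MODULE_LEADER_ID, TEACHER_IDS) and the layout are our assumptions for this illustration – the real import format is defined by Blue/Explorance.

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Arrays;
import java.util.List;

/**
 * Illustrative sketch only: writing a module feed file for DIG import.
 * Column names and layout are our assumptions for this example; the
 * real import format is defined by Blue/Explorance.
 */
public class ModuleFeedWriter {
    public static void main(String[] args) throws IOException {
        // Each row: module code, module leader user id, teacher ids to rate.
        List<String[]> modules = Arrays.asList(
            new String[] {"CHEM1012", "abcd12", "abcd12;efgh34"},
            new String[] {"HIST2041", "ijkl56", "ijkl56"}
        );

        StringBuilder csv = new StringBuilder("MODULE_CODE,MODULE_LEADER_ID,TEACHER_IDS\n");
        for (String[] row : modules) {
            csv.append(String.join(",", row)).append('\n');
        }
        Files.write(Paths.get("module_feed.csv"),
                    csv.toString().getBytes(StandardCharsets.UTF_8));
    }
}
```

Packing the teacher IDs into one semicolon-delimited cell keeps the sketch to one row per module, which is the shape a DIG admin would most easily scan when confirming module leaders and teachers.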
35. @juliemulvey @malcolmmurray
Our Implementation in Figures
1,009 taught Modules to evaluate
23 Departments
8,837 staff roles fed into the system
1,806 Questions added to the Bank
36. @juliemulvey @malcolmmurray
14,663 Students surveyed
63,096 Evaluations deployed
> 4,177 Reports released to staff & students
33,524 Evaluations completed (53%)
All* in a two-week evaluation period
47. @juliemulvey @malcolmmurray
Reports
Created a range of reports:
• Students
• Individual Teachers
• Module Leaders
• Heads of Department
• University-Wide Summary
• Discrimination Report
49. Demographic Data
We wanted to analyze student responses using demographic data.
• Not possible, as the dataset was not encrypted at rest
• Also wanted to restrict access to decrypted sensitive data by role
Explorance staff have proved responsive and have already engineered the first phase of solutions
50. Discrimination Question
This year in this module, were there any circumstances connected with your studies at Durham University which caused you to feel disadvantaged as a result of age, disability, gender, maternity/paternity, pregnancy, race, religion or sexual orientation?
• Yes
• No
Photo by Nathan Dumlao, shared via Unsplash
51. Areas for Improvement
API to allow us to display live completion rates (a hypothetical client is sketched below)
Course Progress Leader Boards?
Photo by Leslie Jones, shared via Unsplash
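As a thought experiment, this is the sort of client such an API would enable. Everything here is hypothetical – the endpoint URL, the JSON shape, and the existence of the API itself were on our wish list, not in Blue at the time.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

/**
 * Hypothetical sketch: polling a (non-existent) Blue endpoint for live
 * completion figures to display on an institutional dashboard.
 */
public class CompletionRatePoller {
    public static void main(String[] args) throws Exception {
        // Entirely made-up endpoint, for illustration only.
        URL url = new URL("https://blue.example.ac.uk/api/deployments/2018/completion");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestProperty("Accept", "application/json");

        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            StringBuilder body = new StringBuilder();
            String line;
            while ((line = in.readLine()) != null) {
                body.append(line);
            }
            // e.g. {"completed": 33524, "deployed": 63096} -> render on a leaderboard
            System.out.println("Live completion data: " + body);
        }
    }
}
```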
52. Better support for repeating questions
Photo by Jakob Owens, shared via Unsplash
Overall I am satisfied with the quality of teaching by Dr Jekyll
• Definitely Agree
• Mostly Agree
• Neither Agree nor Disagree
• Mostly Disagree
• Definitely Disagree
Overall I am satisfied with the quality of teaching by Mr Hyde
• Definitely Agree
• Mostly Agree
• Neither Agree nor Disagree
• Mostly Disagree
• Definitely Disagree
53. @juliemulvey @malcolmmurray
Matrix Format
Overall I am satisfied with the quality of teaching by:
(columns: Definitely Agree | Mostly Agree | Neither Agree nor Disagree | Mostly Disagree | Definitely Disagree)
Dr Jekyll ◎ ◎ ◎ ◎ ◎
Mr Hyde ◎ ◎ ◎ ◎ ◎
Prof Plum ◎ ◎ ◎ ◎ ◎
54. Metrics
Report access statistics
Time spent by students to complete an evaluation
Photo by Mitchel Boot, shared via Unsplash
55. Next Steps
Complete the feedback loop
• Use Blue to capture planned responses/actions, then share these later via Reports
• Stretching the concept of a Project in Blue
• Need the process approved
56. Durham Centre for Academic Development
www.durham.ac.uk/dcad/
Education by Design
Julie Mulvey
julie.mulvey@durham.ac.uk
Dr Malcolm Murray
malcolm.murray@durham.ac.uk
Thank you
@juliemulvey @malcolmmurray
Editor's Notes
CLICK to show 5 areas we think will be of interest – three come with Blue 7
High profile, influential department. Not happy with existing LMS-based MEQ solution.
Ran alongside the Business School pilot
Combined with the DUBS trial, University decided to sign with Blue for 5 years, to invest in delivering an institution-wide solution
So this is where the hierarchy is important and allows us to use Blue’s Question bank.* One of these was a potential discrimination question – sensitive data
High stakes - this project was sponsored by our PVC (Education) – like a Provost in the US. Professional Support has been invaluable - with us every step of the way
Amended the spreadsheet and added more fields – one for module leader user ids and one for staff rating user ids
Question Bank Engine. Bespoke application: Java 8, Glassfish server, MAMP (MySQL). Follow the process and thinking, rather than concentrating on the tool itself
Modelling the University – a copy of this is taken for each deployment
University -> Faculty - > Department
Modules will be associated with Depts later
Access roles defined globally, took the decision to keep it super simple
Question Pool – standard questions used for every MEQ – we will select the ones we want to use for a given deployment later
Deployments – at the heart of the system – one diet of evaluations = 1 deployment
Question Template Blocks – gives you flexibility to support central and dept/module-specific questions – some link to Question Pool questions.
Deployment Nodes – based on the Organisational Hierarchy defined earlier. You can add or delete entries as the evaluation required. When populated with questions you can preview the results at the Module level…
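The notes above outline the QuBE's core ideas: an organisational hierarchy (University -> Faculty -> Department), snapshotted per deployment, with question blocks attached to nodes. A minimal sketch of a data model along these lines – the class names and structure are our simplification for this write-up, not the QuBE's actual code:

```java
import java.util.ArrayList;
import java.util.List;

/**
 * Simplified illustration of the concepts in the notes above: a
 * University -> Faculty -> Department hierarchy, copied for each
 * deployment (one diet of evaluations = one deployment).
 * Names and structure are our own sketch, not the QuBE's schema.
 */
public class HierarchySketch {

    static class Node {
        final String name;
        final List<Node> children = new ArrayList<>();
        Node(String name) { this.name = name; }

        Node addChild(String childName) {
            Node child = new Node(childName);
            children.add(child);
            return child;
        }

        // Deep copy, mirroring "a copy of this is taken for each deployment".
        Node copy() {
            Node c = new Node(name);
            for (Node child : children) {
                c.children.add(child.copy());
            }
            return c;
        }
    }

    static class Deployment {
        final String title;
        final Node hierarchySnapshot;
        Deployment(String title, Node liveHierarchy) {
            this.title = title;
            this.hierarchySnapshot = liveHierarchy.copy();
        }
    }

    public static void main(String[] args) {
        Node university = new Node("University");
        Node science = university.addChild("Science Faculty");
        science.addChild("Chemistry");
        science.addChild("Physics");

        Deployment diet = new Deployment("Easter Term MEQs", university);
        System.out.println(diet.title + ": snapshot of "
                + diet.hierarchySnapshot.children.size() + " faculty node(s)");
    }
}
```

Snapshotting the hierarchy per deployment means later organisational changes cannot retroactively alter how a past diet of evaluations was structured.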
Define individuals who can access DIG – it generates a feed file of DIG_ADMINS
Finally we can import Department choices…
CLICK to mimic file upload
Success receipt (or red if something goes wrong)
You can then preview the MEQ (albeit with a different look)
And optionally check the data fed to us by the Department – e.g. the custom questions and any list of staff
Export options
CLICK: even more options…
Run some sanity checks
Quite a few… even so, some duff data slipped through
See entries for the University placeholder questions, then the relevant departmental ones for each module
Actual Questions on the second worksheet tab
Spreadsheet listing details about the courses, mapping them to nodes in the hierarchy
Module Leader Mapping – used to grant them access to the full module report, similar ones for Teachers and Students
Think we might be able to make more use of this tool in the future.
Would like the Lecturer and Module Leader auto expanded to save time
Would like a link to the associated Evaluation for checking for DIG Admins
* OK some of the reports took a bit longer
Students could access evaluations via email or via a custom module on our LMS (Blackboard)
Staff could see live response rates
Julie’s sanity threshold
992 (98.3%) Acceptable – 25% and above;
720 (71.4%) Good – 50% and above;
127 (12.5%) Excellent – 75% and above
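Those bands are cumulative counts over the 1,009 modules (e.g. 992/1,009 ≈ 98.3% reached at least a 25% response rate). A trivial sketch of the same banding, with made-up rates, just to pin down the thresholds:

```java
import java.util.Arrays;

/**
 * Illustrative only: reproducing the cumulative response-rate banding
 * from the notes. Rates are completed/surveyed per module; the sample
 * data here is made up.
 */
public class ResponseBands {
    public static void main(String[] args) {
        double[] rates = {0.81, 0.53, 0.27, 0.12, 0.76};
        System.out.printf("Acceptable (>=25%%): %d%n", count(rates, 0.25));
        System.out.printf("Good       (>=50%%): %d%n", count(rates, 0.50));
        System.out.printf("Excellent  (>=75%%): %d%n", count(rates, 0.75));
    }

    static long count(double[] rates, double threshold) {
        return Arrays.stream(rates).filter(r -> r >= threshold).count();
    }
}
```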
Greater overview of completion rates across the university than ever before
We sent an initial email to notify students of the launch which included a link to the surveys
Standard reminders sent on the Thursday and the Monday
A final reminder on the Thursday just before close at midnight on Sunday
Students could always access their surveys from within Blackboard
Staff were sent one initial email – they could view the progress of return rates from a module that sits within our Blackboard. They also accessed reports from there
An unexpected peak – looked at the channel data – where were students accessing evaluations from?
Significantly more evaluations were completed via Blackboard (the portal) than via email, unlike on previous days.
This continued to play a part on subsequent days throughout the evaluation period.
A: the addition of this bespoke portal module at 9am on Thursday, showing students the current MEQ completion rate after they’d logged in to Blackboard
Some just have the results from Likert scale questions (e.g. students, teachers) others have the full text responses too.
We have strict Governance over data security and the use of cloud services
This was a first for us – 325 responses. Many simply allowed students to say they had a disability or difficulty but that it was being dealt with
This data in the hands of the Academic Office to evaluate and decide whether to approach departments to see if any issues or trends can be addressed
Also branching logic – e.g. select tutors, TAs they have had, then only rate the ones they can remember
Would be nice to know if staff and students are looking at the reports. Knowing how long students spend completing MEQs could help optimise survey length.
We think the workflow is possible and hold much of the required data in the project. Some interesting exceptions to consider – e.g. if the module leader changes in the interim.