
Hearing the online student voice: Addressing perceptions of technology enhanced learning


  1. SOLSTICE Conference 2015 4th & 5th June 2015 Hearing the Online Student Voice: Addressing student perceptions of technology enhanced learning adoption Emily Webb, Rod Cullen, Irfan Mulla, Hannah Palin, Susan Gregory and Osman Javaid
  2. Background • EQAL - ‘step-change improvement in student satisfaction’ (Stubbs, 2014) – Core+ Model – Moodle - 44 million hits – 40,000 unique users – 12,000 programmes and units • Learning Innovation team • MMU Moodle - wrapping the institution around the learner
  3. Wrapping the institution around the learner megamashup
  4. Wrapping the institution around the learner Student ID Timetable Sync to personal device
  5. Student ID (+ Unit code) Deadlines / extensions / feedback return dates / provisional marks Personalised submission sheet Wrapping the institution around the learner
  6. Unit code Resource list Relevant resource Wrapping the institution around the learner
  7. Unit code Past exam papers Past exam paper Wrapping the institution around the learner
  8. Adoption of Technology Enhanced Learning (TEL) at MMU • Student Voice – Improving teaching & assessment practice – Strategic planning/improvement – University Advancement/Progression
  9. ISS (Internal Student Survey) • ISS - Institutional approach to information, data collection, management and analysis – Bi-annual – Student experience on teaching and learning – Two free text questions: • Best things about my course? • Things I would most like improved on my course? ISS December 2014 • 47,800 comments (best & in need of improvement) • 2072 comments – relating to Moodle and learning technologies • Best - 746 Comments / Improve - 1326 Comments
  10. Feedback-action cycle Harvey (2011) ‘The Nexus of feedback and improvement’
  11. Phase 1 - Data Collection, Management and Analysis Thematic analysis: 1. Familiarisation 2. Identify Thematic Framework 3. Indexing (Coding) 4. Charting 5. Mapping and Interpreting
  12. Institutional: Best features Content well organised Provision of audio/video Effective communications Content up-to-date “The wide resources taught e.g. solving questions with software packages as well as by hand. The lectures and tutorials are organised well and a good way to learn the content in the course.” “The organisation of the unit is one of the best things about this unit. The use of podcasts and video lectures alongside traditional lectures and seminars were a good combination. The content was not too heavy and was all digestible in appropriate chunks.” “The communication between groups of the course is improving and suggestions have been implemented to help share knowledge from the course tutors (Facebook/Moodle groups).” “The quality of teaching was amazing and with all the lecture Powerpoints being on Moodle in advanced I was able to start my essay in advanced as I had basic notes already on it.”
  13. Institutional: Areas to improve Better organisation Better communication Content uploaded to Moodle ‘’This unit is very unorganised and I feel like the tasks that we are completing in the class do not relate to the exam that we had to complete. The slides that were uploaded to Moodle were not in any specific order and it was very confusing to revise from for the exam. Last year in product awareness it was very informative and exciting this year it just feels like we aren’t progressing.’’ ‘’Course organisers need to develop effective communication skills. Eg last week information about a room change was given by email 30 minutes before the lecture. As it turned out the original information was incorrect. The staff are too laid back and give very little information and what they do give is often contradictory. Information is often hidden in obscure places in noodle or is it muddle.’’ “Would be nice if X would make his lecture slides available on Moodle for the sake of note taking. Often what is being said is more important than the notes, but the notes are also needed.”
  14. Phase 2 - Action • Map comments to individual Moodle areas • Present findings from Thematic Analysis to FEG and TEL Group • Each of the 8 faculties different approach – no one size fits all Example – HLSS • Awareness raising sessions – ‘The Student View’ • TEL Coffee Club • Moodle Template – Programme Area Example – HPSC • Moodle Essentials • Try it on sessions – (In conjunction with Academic Staff) • Moodle Template – Programme Area
  15. Phase 3 – Feedback to students • Feedback to students through existing networks – programme committee ‘…monitoring and evaluation of the programme and in particular evaluating its operation, its delivery and standards, its teaching methods, its curriculum aims and students’ needs’ – Student support officers
  16. Next Steps • Longitudinal piece of work – future ISS comparisons • Cross course and faculty focus groups • Moodle Audit Data • CMI (Continuous Monitoring Improvement) Dashboard
  17. QUESTIONS

Editor's Notes

  • I am going to start by giving you some background information on MMU infrastructure

    To start with I am going to talk about EQAL, an institutional-level project initiated in 2010 to transform the student experience.
    The goal of EQAL, which stands for Enhancing Quality and Assessment for Learning, was to ‘make a step-change improvement in student satisfaction’. Integral to transforming the student learning experience was the introduction of Moodle as the institution’s new VLE, which has been at the centre of MMU’s Core+ managed learning environment since 2011.
    As an indication of how well Moodle is utilised by students, in this academic year alone Moodle has had over 44 million hits from 40,000 unique users across 12,000 programmes and units.
    From these statistics alone, it is evident that Moodle is central to the student experience at MMU.

    Learning Innovation Team
    The Learning Innovation team is responsible for the pedagogical application of Moodle and learning technologies, and sits within the wider department of Learning and Research Technologies, who build MMU’s Core+ managed learning environment.
    It consists of 4 senior members of the team who are based centrally and 8 Technology Enhanced Learning Advisers who are based in each faculty and work closely with academic staff on a day-to-day basis.

    Wrapping institution around the learner
    I am now going to talk briefly about Moodle and how it is currently set up within the institution.
    Moodle at MMU is tightly integrated with the other institutional IT systems, a design intended to create a seamless, personalised learning experience and to wrap the institution around the learner.
    What this means is that each student has access to a Moodle area for each unit and programme they are studying, as well as access to their personal timetables, assignments, reading resources and online learning resources.
    Although there is a high level of automation and personalization for each Moodle area, academics are still responsible for the main teaching, learning and assessment content, pedagogic design and day to day management of activities and resources within the Moodle areas for the units that they teach.
  • The adoption of TEL is an important part of the learning and teaching strategy. It is part of the university wide vision.

    Other factors which influence the adoption of TEL are:
    Student and staff digital literacy
    Support infrastructure
    Organisational culture

    The factor we are considering today is the student voice, which has been critical to our research. There are many ways that the student voice can influence the adoption of TEL.
    By listening carefully to the positive and negative TEL experiences of students, we can derive a practical agenda for improving teaching and assessment practice.
    This in turn can lead to strategic planning and improvement; for example, if a unit has received positive comments on course organisation and structure, a similar structure could then be used across other units in the same programme area.
    Collectively these can promote university advancement.
  • So as we have established the Student Voice is important to the TEL strategy at MMU. How is this data gathered? This is done via an Internal Student Survey known as the ISS.

    The ISS is an institutional data set that helps inform the quality enhancement of MMU’s Core+ managed learning environment. It is one of the many institutional data sets that are used to improve the adoption of TEL. The focus of our research was based on the ISS.

    The ISS is currently run twice a year.

    Students share their experiences of learning, teaching and assessment on each unit within their programme of study by stating what they like best about their course and what they would like to see improved. From this ISS data we are able to get a good picture of what students are thinking and it gives a good indicator of satisfaction with their course.

    Students in the December 2014 ISS posted 47,800 free text comments.
    We filtered these comments using a set of 24 keywords and extracted 2072 comments related to the student experience of Moodle and other aspects of technology enhanced learning, teaching and assessment.  

    This provided a data-set of 746 comments relating to the “Best” aspects of programmes/units and 1326 comments relating to the aspects of programmes/units that students would like to see improved.

    *It is important to mention that the ISS is a survey covering the entire student experience, so any mention of Moodle, either positive or negative, means that students felt strongly enough to raise it without having been questioned about it specifically.
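
    As a loose sketch of the keyword-filtering step described above (the keywords and comments below are invented placeholders, not the actual 24-keyword list or real ISS data):

```python
# Hypothetical keyword filter for pulling TEL-related comments out of the
# free-text ISS responses. This keyword set is an invented subset for
# illustration, not the real set of 24 keywords used in the study.
KEYWORDS = {"moodle", "vle", "podcast", "online"}

def is_tel_related(comment: str) -> bool:
    """True if the comment mentions any TEL keyword (case-insensitive)."""
    words = comment.lower().split()
    return any(kw in words for kw in KEYWORDS)

comments = [
    "Lecture slides are always on moodle in advance",
    "The library is too noisy at exam time",
    "podcast recaps helped me revise",
]
tel_comments = [c for c in comments if is_tel_related(c)]  # keeps 2 of 3
```

    A whole-word match like this avoids false positives from substrings, though the real filter may well have worked differently.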

  • Thanks for that…

    Providing opportunities for student voice and feedback has become established practice within Higher Education Institutions.
    With this in mind we adopted an Action Research approach, taking the student voice into account to help improve student provision and our own practice.

    We used a model put forward by Harvey, which proposes that a feedback-action cycle is an effective method for ensuring that student feedback feeds into action and, consequently, that those actions are fed back to the students.

    In the case of the action research we carried out:
    In Phase 1 the ISS data is collected and analysed.
    In Phase 2 we look at how this analysis can lead to tangible actions.
    Finally, in Phase 3 these actions are fed back to the students to close the feedback loop.
  • Let’s first consider Phase 1.
    We carried out a thematic analysis on the ISS data outlined previously, using a variation of Ritchie and Spencer’s ‘Framework’ approach, which outlines the five key stages of thematic analysis.

    Familiarisation - Two members of the Learning Innovation team read through the text responses independently and began to pick out repeating themes.

    Identifying a thematic framework - The same two team members compared and contrasted notes and agreed a common set of coded themes. Eighteen themes were established relating to the “Best” aspects of programmes/units and twenty-five themes relating to aspects “in need of improvement”.

    Indexing – Each Faculty has a dedicated TELA who indexed (coded) the text comments against the agreed thematic framework.

    Charting - Produced graphs and tables of the frequency of the coded themes (institution and faculty).

    Mapping and interpreting – The team worked together to further explore the relationships between themes.
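
    The indexing and charting stages can be sketched roughly as follows (the theme codes and coded comments are invented examples, not the team’s actual framework or data):

```python
from collections import Counter

# Each comment, once indexed (coded), carries one or more themes from the
# agreed framework. These records are invented purely for illustration.
coded_comments = [
    {"faculty": "HLSS", "themes": ["well_organised", "audio_video"]},
    {"faculty": "HLSS", "themes": ["well_organised"]},
    {"faculty": "HPSC", "themes": ["effective_comms"]},
]

# Charting: frequency of each coded theme across the institution...
theme_counts = Counter(t for c in coded_comments for t in c["themes"])

# ...and broken down by faculty, as in the faculty-level graphs and tables.
faculty_counts = Counter((c["faculty"], t)
                         for c in coded_comments
                         for t in c["themes"])
```

    A theme-by-department table of counts like `faculty_counts` is also the kind of raw material behind the hot spot diagrams: rows of themes, columns of departments, shaded by count.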
  • The graph shows the results of the thematic analysis and we can see here the ‘best’ aspects identified by students.
    So this is the institution as a whole
    This is a really good way to show staff what students like, and to get buy-in based on evidence rather than anecdote.

    1) Of the themes relating directly to Moodle, the most popular was for units to be well organised. Students wanted areas that made it clear what materials are available and how to find them.
    You can see here one of the ISS student comments showing the value of having content organised

    2) The next highest rated theme related to the use of audio and video materials. Students valued the interactivity of these resources, which promote engagement and appeal to a variety of learning styles.

    3) Students also valued effective communications. The appropriate use of Announcements within Moodle provided clear communication and encouraged students to take responsibility for checking their Moodle units on a regular basis.
    It is important to note that this encompasses all aspects of communication.

    4) Students also appreciated Moodle being up to date, including materials being uploaded in advance of lectures so that they could print them out and annotate them with their own notes.
  • So here are the areas that students would like to see improved:

    1) The highest rated theme is the organisation and quality of resources in Moodle. At first glance it seems that the students are being contradictory about the same units.
    However, it suggests that well organised Moodle units with high quality resources are praised by students,
    and disorganised units with poor quality resources are criticised.

    Students like consistency across units.
    If a student accesses one unit that is well organised and contains high quality resources,
    and then logs into another that is poorly populated with disorganised content,
    the result is an inconsistent student experience.
    The best examples effectively become the criteria by which students judge the others.

    2) Communication (clearer information/faster responses)
    Students did not appreciate instances where staff used Announcements inappropriately.
    Unclear information and untimely or excessive messages could promote student disengagement.
    Students also want timely responses to their own messages. This relates back to consistency: if students have emails answered the same day in some areas, they will come to expect that for other units within the same programme. It is important to establish clear expectations on all communications.

    3) Upload teaching materials to Moodle
    The next highest rated theme relates to the availability of teaching materials in Moodle. Again, students like to see all of their units in Moodle contain teaching materials, providing a consistent Moodle experience.

    It is interesting to note that these results correlate with some of the findings from the literature around student satisfaction with VLEs, where students were expecting
    easy to navigate areas
    with relevant and timely content ,
    delivered by responsive staff (Naveh et al, 2012).

    Overall we found that
    Student expectations are fairly modest
    Students work within their own ‘good practice’ frameworks
    Staff need training and support to recognise the value of a VLE
    VLEs send out strong messages on what is valued in terms of teaching and learning
    Content without context is not helpful

  • Having completed the first phase of analysing the data, the next step was to devise an action plan based on the analysis. We drilled the data down further into departmental areas and mapped the comments to each area. Hot spot diagrams were created to clearly reflect these mappings.

    I am just going to show you an example of the Hot Spot Diagram

    We have the themes on the left and you have each department across the top.

    The darker the spot – the more comments were received about this issue. As you can see Department A received good comments on the Use of Audio/Visual materials.
    A similar diagram was done relating to comments on areas students would like to see improved.
    Again the darker the spot the more comments were received about the issue.

    The hot spot diagrams gave a quick and clear view of which areas were doing well and which needed to improve, and we received positive comments from faculty-based staff.

    We then presented the findings to the Faculty Executive Group, which is made up of the Dean of the faculty and the heads of each department, and also to the Technology Enhanced Learning (TEL) group in each faculty, which contains learning and teaching champions from each department, and devised a plan of action.

    Each faculty had a different approach to the findings, as there was not one single solution that would suit all faculties, although there were many common positive approaches across faculties.
    An example of what was done in HLSS was to arrange awareness-raising sessions called ‘The Student View’ to present some of the findings of the analysis to academic staff, covering what students perceive as important to them and what they value in a Moodle area. This was done in a number of ways: some departments had sessions which staff could book onto, whereas others preferred me to go into department meetings and present the findings.

    TEL Coffee Clubs were also organised to present and demonstrate learning technologies, where staff could come in, experiment, and talk to learning and teaching professionals. These learning technologies related directly to what students valued in a Moodle area, such as audio/video.

    The uptake on this was really good, as staff preferred coming in for 10-15 minutes to experiment rather than being tied to hour-long staff development sessions.
    One of the key actions from the TEL group was to create Moodle templates specific to programme areas, as students valued consistency across their units. This would ensure that students see a consistent approach across all the units of the programme they are studying.

    In Health, Psychology and Social Care they organised Moodle Essentials staff development sessions, which allowed staff to book on and learn about the essentials of Moodle. Again this was in direct response to what students wanted in their Moodle areas, such as good course organisation, good use of audio/video and good use of communication.

    ‘Try it on’ sessions were also organised in conjunction with academic staff to demonstrate learning technologies, and a similar approach to HLSS was taken in terms of a Moodle template to achieve consistency across units in a programme area.
  • MMU has established networks for feeding back to students, which we have utilised to feed back on this work.

    One of the key ways is via the programme committee: students can raise any issue relating to the programme via an elected student rep, and conversely staff are able to relay information to students. It happens twice a year and is a useful two-way communication channel.

    This is an excellent mechanism to promote the action we have taken as a result of student comments, whether that be targeted staff dev sessions or a more standardised consistent approach to Moodle for that programme

    As student support officers have direct access to students and support them in a range of activities, this can also be a useful channel to relay anything relating to TEL.

    This phase particularly aids our own Action Research approach, as it directly informs our practice and how we can improve what we are doing to bring it in line with student expectations.
  • So moving forward, as mentioned, the ISS is conducted twice a year, the latest round being March of this year. A similar analysis will be carried out on this and those in the future. This will then form the basis for a longitudinal piece of work where we can identify trends and areas of concern.

    This can be supplemented by cross-faculty focus groups: sharing this information with selected students and getting further clarification on the survey data, making it a richer data source as well as conveying our actions.

    At MMU there is an agreed threshold of what should be contained in a Moodle area, which includes reading lists, unit handbooks, assessment information etc., as well as other basic content provision.
    We’re able to check this using an audit tool that automatically looks at which areas comply with the threshold standards
    This compliance analysis can then be compared to our analysis on satisfaction to see if there is any correlation between the two

    The CMI Dashboard is a very powerful tool: staff enter actions based on the ISS comments.
    Moving forward we are planning to make this available in Moodle so that students are able to see what actions have been taken as a result of their comments.

    So just to conclude, this work has focused on the student voice. We have looked at how to take direct action in response to comments, whether that be targeted staff development sessions or creating a more consistent approach to online learning. An important final step, which we found lacking in many other studies, is to convey this back to the students to close the feedback loop and use it as a means of further improving our practice in an ongoing action research cycle.
  • To build on the excellent work analyzing the ISS comments (and to wear a QAA HE Review auditor hat for a moment), I thought it’d be useful to look at the extent to which threshold standards set by the Student Experience Committee (SEC) are being met.
    SEC said every unit should have some Moodle content, a reading list on Talis Aspire, and hand-in dates. Even though the rise in co-taughts complicates things, it is apparent from Chris’ recent audit that not all units are meeting that threshold standard.
    In the attached, I’ve created a pivot table that groups the unit data by faculty, level and department. I’ve focused on the SEC thresholds, and added a crude calculation of the extent to which the thresholds have been met in the %complete column. On the assumption that co-taughts share reading lists, the calculation is percent of units with moodle content * (percent of units that are co-taught – percent with reading lists) * percent with hand-ins.  
    If it does make sense, we’ll need to devise a plan for securing FEG buy-in to bring the units up to the threshold standards agreed by SEC. We’ll need to work with library colleagues to ensure unit leaders populate the missing Talis Aspire reading lists.
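
    A minimal version of that per-department roll-up might look like the following. The unit records and field names are invented, and for simplicity this computes the plain “all three thresholds met” percentage rather than the crude co-taught-adjusted formula described above:

```python
# Hypothetical audit records; in reality these would come from the Moodle
# audit export, with co-taught units needing special handling for reading
# lists as noted above.
units = [
    {"dept": "A", "content": True,  "reading_list": True,  "hand_ins": True},
    {"dept": "A", "content": True,  "reading_list": False, "hand_ins": True},
    {"dept": "B", "content": False, "reading_list": True,  "hand_ins": False},
]

def pct_complete(dept_units):
    """% of a department's units meeting every SEC threshold."""
    met = sum(u["content"] and u["reading_list"] and u["hand_ins"]
              for u in dept_units)
    return 100 * met / len(dept_units)

# Group units by department, then compute the %complete column per group.
by_dept = {}
for u in units:
    by_dept.setdefault(u["dept"], []).append(u)

complete = {d: pct_complete(us) for d, us in by_dept.items()}
```

    A summary like `complete` makes it easy to see which departments need the most attention when seeking FEG buy-in.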
  • I am just going to show you the Hot Spot Diagram again, with the themes on the left and each department across the top; the darker the spot, the more comments were received about that issue. As you can see, Department A received good comments on the use of audio/visual materials.

    This shows what is happening in individual departments, and (drilling down) in the future we can go further, to individual unit level.

    The hot spot diagrams are very accessible and have received positive comments from faculty-based people (associate deans/SLTF).
  • A similar diagram was done relating to comments on areas students would like to see improved.
    Again the darker the spot the more comments were received about the issue.
  • I am just going to show you what the Core+ Model looks like. As you can see at the core you have the VLE and other learning technologies which surround it in 3 different categories. The first category closest to the core is the arranged category. We expect the majority of staff to able to utilise these within their practise. The next category is recommended. There is no expectation that all staff members would use all of these in their practice. The last category is recognised and in this category we encourage tutors to experiment with these but staff members are responsible for account creation and management.