Introduction: We are civil servant librarians with the Scottish Government (SG). Along with other library work, the Library Services team delivers information literacy (IL) training to staff of varied grades, approx. 5,300 across Scotland. (We are not the Scottish Parliament library, which serves MSPs much as the House of Commons Library serves MPs.) Until now we have not evaluated or measured our training courses: we simply asked delegates to complete a 'happy sheet' and took their comments on board using no particular method.
Why are we doing this? Background: the hooks to hang this research on; setting the scene.
- SG 'Working Smarter' objective (one of the five SG strategic objectives, which include smarter, greener, and safer & stronger).
- SG Skills for Scotland: Lifelong Learning Strategy (written by Corporate Learning Services (CLS), this is the corporate learning strategy for the SG; CLS delivers the corporate training).
- Information Literacy Strategy.
- Information Literacy Training.
We need to prove to senior management and CLS that IL training is worthwhile – that it gives a return on investment. One method of proof is to evaluate and measure our training courses, and until now we have not done this in any depth. NB: CLS has added our IL training to its Skills for Success framework, which plays a key role in identifying the skills the organisation needs to function effectively. Incidentally, CLS does not evaluate or measure the success of its own training.
As a result of the findings of the research undertaken by the Scottish Information Literacy Project (SILP) and Library Services, the Scottish Government has developed and is rolling out an information literacy skills programme. See below.
1. Library Services runs a variety of courses to improve information literacy skills in the SG:
- Internet Skills in the Workplace
- Web 2.0 Workshop (new)
- Fun hour-long seminars, e.g. the Google Treasure Hunt (Advanced Google), for seminars and away days
- Discover Web 2.0 – a short course for Learning at Work
- Drop-ins or 'one to ones' at people's desks for an hour, so they don't need to move!
- We are also involved in the monthly Corporate Induction sessions, where Library Services is advertised and promoted.
2. We decided to concentrate on two IL training courses for the purposes of our measuring and evaluation: Internet Skills in the Workplace and the new Web 2.0 Workshop. (We amalgamated the basic and advanced internet skills courses because no one was booking onto the basic course any more – everyone thought they were advanced. But our results bear out the premise that users have 'false confidence' (Sheila Corrall) when assessing their own searching skills.)
Aims of our research: Not rocket science – it's pretty obvious, we need proof. Does our IL training meet the business need? We need to measure impact in a meaningful way: it is not enough to say that our IL programme has succeeded. Where is the evidence? How have behaviours changed? Have people 'worked smarter' as a result of attending an IL course? What is happening in the real world – are delegates actually using their new skills? We need to follow up personally. Statistics are useful for showing impact, but it is often the human stories that get more recognition, showing what individuals have achieved as a result of attending a course. It is important to go and talk to people and make contact: it demonstrates purpose and focus for our evaluation project, and the narrative brings the findings to life. Examples of behavioural change are recognisable to others, who then see the real and practical benefits gained. Statistics are one-dimensional, whereas stories are multi-dimensional, and individuals relate better to the human aspects of practical use. When we communicate our findings to the DAP and CLS, it is the human stories that make an impact.
1. Immediate evaluation at the end of the course. 'Happy' or 'smile' sheets collected at the end of each training course give immediate self-evaluation from delegates: what they did and didn't like about the course. Participants were informed that in 2–3 weeks the librarians would be undertaking course content evaluation and would be contacting them. Librarians collated the happy-sheet evidence (statistics and comments) into an Excel spreadsheet. Latterly the Librarian introduced herself to delegates at the end of the course, which increased the number of interview replies when she phoned.
2. Interview-based research. We told the delegates: it is not intrusive and not a test; they are helping us to improve our materials (feedback for us, not on them); it doesn't matter if they haven't used any of the tools back at the office (that is still relevant feedback!); it is fine to attend a course for general professional development (i.e. for CPD, not necessarily for a particular work-related reason); and they can ask Library Services questions back at the workplace and get in touch at any time. The Librarian sought examples of a workplace task, the benefit of applying knowledge and skills gained (if any) from the course to that task, and the outcome or consequence of the task.
3. The Librarian sent a follow-up questionnaire.
4. We contacted Analytical Services (brief explanation of who they are) for advice on how to collate our evidence into something meaningful. They advised us on our collecting of statistics and comments.
5. We also spoke with Corporate Learning Services, who are responsible for all training in the SG apart from Library Services courses. We wanted to know if and how they evaluate their courses and whether they could give us any general guidance. Janet Saunders suggested the Kirkpatrick model.
6. Advice from library colleagues.
We made use of the experience and knowledge within our team, especially an experienced librarian with an educational and statistical background. Post-course interviewing is essentially retrospective content analysis; it was really the only option, as no a priori theoretical or methodological structure was in place.
The advice from Janet Saunders at CLS was to use this model. We bought the book and signed up to the Kirkpatrick Evaluation Group blog, from which we receive regular information. The Kirkpatrick model provides one technique for appraising the evidence for any training programme: it is used to evaluate whether a programme is likely to meet the needs and requirements of both the organisation implementing the training and the staff who participate. It has four basic levels:
1. Reaction – what the student thought and felt about the training. This is the first stage of assessment, where attendees complete 'happy sheets' (sometimes called smile sheets because, in their simplest form, they measure how well students liked the training).
2. Learning – the resulting increase in knowledge or capability. Did the students actually learn the knowledge, skills and attitudes the programme was supposed to teach? For us: the new skills and benefits gained from attending an IL session, and recalling specific skills from the course to source information or aid communication. To show achievement, students can complete a pre-test and post-test, with test items written to the learning objectives; by summarising the scores of all students, trainers can accurately see the impact the training intervention had.
3. Behaviour – the extent of behaviour and capability improvement, and its implementation or application. The real question is whether any of the new knowledge and skills are retained and transferred back on the job: how has the learning been applied to the task, and how has the individual's practice changed as a result?
4. Results – the effects on the business or environment resulting from the trainee's performance. After allowing some time to pass, participants have the opportunity to implement new skills and retention rates can be checked. Are staff within the SG working 'smarter' as a result?
All four measures are recommended for full and meaningful evaluation of learning in organisations.
The Librarian sent a questionnaire seeking evidence of the practical use of the course content back at the workplace. Participants were asked to identify, if possible, a practical example of a task, the benefit, and the outcome of putting the skills gained on the course into practice:
- The task – work-specific, e.g. searching for reports; the delegate now uses Advanced Google.
- The benefit – the advantage of using the tool, e.g. able to search more efficiently, saving time.
- The outcome – the consequence, e.g. working more efficiently.
Mapped to the Kirkpatrick levels:
1. Reaction – what delegates thought and felt about the training; the impact of the individual learning experience.
2. Learning – the increase in knowledge or capability: evidence of new skills and benefits gained from attending an IL session, and of recalling specific skills from the course to source information.
3. Behaviour – implementation/application: how has this been applied to the task, and how has the individual's practice changed as a result?
4. Results – effects on the business resulting from the trainee's performance: are staff within the SG working 'smarter' as a result? Is there any proof that training has positive results?
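One way to keep the questionnaire evidence consistent across delegates is to record each reply as a structured task/benefit/outcome triple. This is a minimal sketch only: the record type and the example entry are invented for illustration, not taken from the team's actual spreadsheet.

```python
from dataclasses import dataclass

@dataclass
class InterviewReply:
    course: str   # which IL course the delegate attended
    task: str     # the work-specific task (maps to Kirkpatrick behaviour)
    benefit: str  # advantage of applying the new skill (learning)
    outcome: str  # consequence for the business (results)

# Invented example, echoing the searching-for-reports scenario above.
reply = InterviewReply(
    course="Internet Skills in the Workplace",
    task="Searching for published reports using Advanced Google",
    benefit="More efficient searching, saving time",
    outcome="Works more efficiently",
)
print(reply.benefit)
```

Keeping the three fields separate makes it straightforward to pull out all the "benefit" statements when reporting at Kirkpatrick level 2, and all the "outcome" statements at level 4.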
Evaluation period: six months, October 2009 – March 2010. The figures below are the delegates' subjective estimates of their average skills improvement, taken from the happy-sheet statistics (previously we just read the sheets and put them in a box). There was little verbal feedback at the end of the courses.

Oct 2009 – Mar 2010   Internet Skills course   Web 2.0 course
No. of delegates      63                       73
Replies (%)           38%                      25%
Mean/average          1.44                     1.57
Median                1.45                     2
Mode                  0                        2

Results analysis. Internet Skills course: of 63 delegates, 38% replied when the Librarian contacted them for interview-based research. Most delegates did not give verbal feedback on the way out of the classroom, and most brought some internet searching knowledge/skills to the course. Happy-sheet statistics: mean/average (by how much the delegate thought they improved) = 1.44 (one stage = 20%); median (where the delegate sat on the 1–5 scale) = 1.45 (just under the 2 spot); mode (most frequent level of improvement) = 0 – a quandary, since most happy-sheet feedback showed new skills adopted.
Web 2.0 course: more people gave verbal feedback on the way out of the classroom than on the Internet Skills course. Of 73 delegates, 25% replied when the Librarian contacted them for interview-based research. We think the lower response rate has a number of causes: Web 2.0 tools are less integrated into the SG workplace than searching skills (which have more immediate use), and most delegates did not arrive with knowledge or skills in using Web 2.0 for the workplace. Happy-sheet statistics: mean/average = 1.57; median = 2; mode = 2 (since most delegates had little usage or understanding of how to use Web 2.0 tools in the workplace).
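The happy-sheet summary statistics can be reproduced with Python's standard library. The ratings list below is invented for illustration (chosen to give a mean of about 1.44 with mode 0, like the Internet Skills pattern); it is not the real survey data. It shows how the mode can be 0 even though the mean suggests improvement – the quandary noted in the results analysis.

```python
from statistics import mean, median, mode

# Hypothetical self-assessed improvement scores, one per delegate.
# Invented for illustration -- not the real happy-sheet data.
ratings = [0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4]

print(f"mean   = {mean(ratings):.2f}")  # 1.44: average improvement
print(f"median = {median(ratings)}")    # middle delegate on the scale
print(f"mode   = {mode(ratings)}")      # 0: most common single answer
```

Because seven of the eighteen hypothetical delegates reported no improvement, the mode is 0 while the mean is 1.44 – a reminder that a single summary statistic can hide the shape of the distribution.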
Evidence-based interview feedback, collated following the advice from Analytical Services. Internet Skills course – the main points are as follows.
Positive (three main points):
- Advanced Google: 'my searching is more specific, I get better results', e.g. restricting searches to a domain such as gov.uk, or searching by time.
- Time saving.
- Evaluation: 'didn't quite realise the importance of "evaluating" information'; do not take information at face value (one delegate told colleagues this at a directorate meeting). The searching and evaluation parts were the most helpful. 'I notice whether info is up to date and in a lot of cases found links to other information and websites' (admin post). The manual is very useful. 'There were three particular times when, almost immediately, I gained benefit from the course.'
Negative (very little negative feedback):
- Slow progression through the course / 'primary school style' – format of delivery not right. (We have organised train-the-trainer in response.)
Recommendations/improvements:
- Delegates: they would like to receive the manual electronically so that all the links are there. We now do this.
Our recommendations (three main):
- Revamp the course to include subject-specific databases for individual areas; it needs to be more tailored. Look at the attendee list (as with the drop-ins), identify each delegate's work area, and send a pre-course email asking what they are working on. This will entail extra library staff time and effort.
- eLearning tutorials could be tailored for each subject area, e.g. health, justice, education, environment and rural areas.
- We could pre-test delegates (via an eLearning tutorial or other simple method). The drawback is staff resourcing, as this would involve a lot of staff time.
Stories: one woman was so enthusiastic about Advanced Google that she went back to work and immediately trained all her office colleagues in how to use it. Overall, delegates have a false confidence in their ability to find information, which is borne out by their comments on Internet Skills in the Workplace.
"I thought I was OK using the internet but discovered there's a lot I didn't know."
Web 2.0 course:
Positive – main points from delegate interview feedback:
- Using the Web 2.0 tools to network and to seek advice on what to use.
- Having practical exercises is really useful and gives a better understanding of the tools.
- Using Web 2.0 for the workplace as opposed to 'banal chit chat'.
- Able to pass this new practical information on to colleagues.
- Knowing whom to contact about Web 2.0 tools.
- Links and tips in the manual (the electronic version much appreciated).
- C-band staff noted that this has made [social networking] partnerships more accessible.
- A good overview of using Web 2.0 in the workplace, the benefit being to keep up with professional developments when budget and time constraints mean attending in person isn't possible.
- 'It took my decent "in theory" knowledge of the products to using them in practice.'
Negative:
- Too rushed; too much packed into too short a time.
- There are too many IT security restrictions in the SG, so no point in using Web 2.0 tools in the workplace.
Recommendations/improvements:
- Find best-practice examples and exemplars of the tools being used in the workplace, and build up a bank of good practice and examples for others to share.
- Link up with Whitehall colleagues to find and share their practices.
- Get some of the IT security restrictions lifted so that it is easier to set up and use more Web 2.0 tools.
What have we learned?
1. Implementing IL training in the workplace: it has to have meaning for the individual, tailored to their needs and relevant to the user. Most people were resistant to using the table (task, benefit, outcome) that Morag emailed them; most wanted to speak their thoughts over the phone, which was quicker for them under work pressure. The higher the grade, the more likely the delegate was to fill in the pro forma. When seeking feedback it is best to meet the user group initially, as that encourages more responses. Once collected, you must act on the feedback. For the Web 2.0 Workshop: try out, test, and then edit the course and materials if necessary. In the happy sheets at the end of the course, delegates assessed only what they had done on the course; in the interviews Morag assessed what they had put into practice in the workplace – the benefit of the training.
2. Training material needs to be kept up to date.
3. Use varied delivery methods, e.g. classroom, eLearning tutorials.
4. It is time consuming. We need resources, staff and time to develop our skills.
5. It is difficult to get people to reply and give feedback: pressure of work, lack of interest, etc.
Next steps:
1. Rewriting our IL course materials to be more tailored to the individual needs of SG staff, and producing a specific IL course for our social researchers, 'Information into Evidence'.
2. Organising a train-the-trainer event, as all the librarians feel the need to brush up their skills.
3. Rethinking our evaluation process: rewriting our feedback form so that we gather the evidence we require to evaluate and measure our courses. Are we getting it right? We will run the in-depth interview evaluation again, but due to resourcing we are not sure when – perhaps over a similar six-month period, October – March 2011?
4. We will continue to seek advice and guidance from Analytical Services and CLS on measuring and evaluating our courses, as well as ensuring that our course material is fit for purpose and helps SG staff improve their skills.
We have recently set up a blog so that people can access our course material if interested: http://sglibraryservices.wordpress.com. The IDeA Communities of Practice group 'Creating an information literate Scotland' was set up in 2009 for librarians, information professionals and anyone interested in joining to discuss IL, in Scotland in particular: http://www.communities.idea.gov.uk/welcome.do
Scottish Government information literacy in the workplace - measuring impact. Foreman & Higginson
Information Literacy in the
workplace – measuring impact
Why are we doing this?
• SG ‘Working Smarter’ objective
• SG Lifelong Learning Strategy
• Information Literacy Strategy
• Information Literacy Training
• Need proof IL training worthwhile.
Information Literacy Training
• Internet Skills in the Workplace
• Web 2.0 Workshop
• Web 2.0 for Policymakers
• Information ‘one to ones’
• Google Treasure Hunt
• Discover Web 2.0.
Measuring success & impact
• Measure impact in a meaningful way
• Advice from SG colleagues
• Have behaviours changed after training?
• Statistical data good but stories better?
• Collected ‘happy sheets’
• Individual telephone interviews
• Sent follow-up short questionnaire
• Advice from Analytical Services
• Advice from Corporate Learning
• Advice from library colleagues.
The Kirkpatrick model
1. Reaction – of the student
2. Learning – increase in skills
3. Behaviour – application
4. Results – effects and consequences.
Interview based research
• A task – work specific
• The benefit – what knowledge was gained
• The outcome – what behaviour changed
Oct 09 – Mar 10   Internet Skills   Web 2.0
Delegate nos.     63                73
Replies           38%               25%
Internet Skills & Web 2.0 courses
• Positive comments
• Negative comments
What have we learned?
• Relevant and tailored to user
• Needs constant revising
• Varied delivery methods
• Time consuming. Need resources.
What are we doing next?
• Revising Information Literacy course
• Organising ‘train the trainer’ event
• Rethinking our evaluation process
• Ongoing advice from SG colleagues.
• ‘Creating an information literate
Scotland’ IDeA Community of Practice.