External Evaluation of the Peer Evaluation Process undertaken by the University of Greenwich and Linking London Credit Accumulation and Transfer Scheme Pilot Projects

Viki Faulkner
July 2013
Table of Contents

1. Introduction
2. Overview of the Peer Evaluation Process
3. Strengths of the Peer Evaluation Process
4. Challenges Within the Peer Evaluation Process
5. Additional Monitoring and Evaluation Activities
6. Learning Points
7. Best Practice and Recommendations
8. Conclusions
Appendix 1: CATS Pilot Project Peer Evaluation Process
1. Introduction

The University of Greenwich and Linking London each led one of the six credit accumulation and transfer scheme (CATS) pilot projects funded by the Association of Colleges (AoC), the Department for Business, Innovation and Skills (BIS) and the Skills Funding Agency (SFA). The projects arose as a response to 'New Challenges New Chances'[1], which set out the Government's vision to reform the further education and skills system. This recognised that further and higher education institutions would need to find new ways of working together, providing clear and transparent progression routes that would support workforce development and help learners to reach their full potential.

The University of Greenwich project, 'Building on Competence in the Workplace', focused on work between the university and five members of its partner college network. The Linking London 'CATS Project in Management' focused on work between two universities, Birkbeck and Middlesex, and four further education colleges. Both projects also had a strong level of engagement from professional bodies and provided mechanisms to capture and integrate the input from learners and employer representatives. The projects both commenced on 1 July 2012 and ran for one year, concluding on 30 June 2013.

Both projects have undertaken extensive monitoring and evaluation activities and these are the subject of individual submissions to AoC. Additionally, they have taken part in the overarching evaluation of the CATS projects undertaken by AoC. This report, commissioned by the University of Greenwich, does not therefore cover ground already well reported, but instead provides an external evaluation of the effectiveness and impact of an added quality enhancement aspect: the peer evaluation process.

2. Overview of the Peer Evaluation Process

The peer evaluation process was identified at the time of project submission as an activity that both teams felt would add value, and it was built into the project plans from the start. The model selected was based on one previously developed by the Kent and Medway Lifelong Learning Network[2] and had already been trialled by the University of Greenwich to measure the impact of their LLN-funded projects. The peer evaluation process consisted of two elements: an impact evaluation model, to ensure consistency and provide a common framework, and a monitoring and evaluation plan.

[1] New Challenges, New Chances: Further Education and Skills Reform Plan: Building a World-Class Skills System (BIS, 1 December 2011)
[2] Perrin, L. Time to Make a Difference: achieving sustainability through projects (2010) KMLLN
The process was described in detail in a paper attached to the University of Greenwich CATS project interim report (Feb 2013); this is attached as Appendix 1 to minimise repetition.

The University of Greenwich and Linking London selected each other as peer evaluation 'buddies'. They were logical partners in the process as both had previous experience of peer evaluation and were partnerships with similar roots, each born out of the HEFCE-funded lifelong learning networks initiative. This proved very effective in ensuring common purpose and a significant level of shared understanding.

3. Strengths of the Peer Evaluation Process

The Impact Evaluation Model, adopted as a mechanism to guide the peer evaluation process, set a clear focus and structure for the activity. This was a real strength of the process, ensuring that peer evaluation was a constructive and productive activity and did not become an unfocused 'talking shop'. The project manager of the Linking London project, who was introduced to the model for the first time through this activity, felt it was an example of good practice and found it very useful in helping to shape her initial thinking around evidence gathering.

The project teams worked together at an early stage to identify inputs, outputs, intermediate outcomes and final outcomes for each of their projects. They then used this information to populate the model, identifying evidence needs and evidence sources. This meant that from a very early phase they were already thinking about what impact they wanted their project to achieve, not just focusing on what they were planning to do in the short term. The model stresses that final outcomes will extend beyond the lifespan of a one-year funded project, so consideration needs to be given at this stage to how longer-term outcomes can be captured and evidenced. It is a strength of the model that it forces a mindset shift towards consideration of longer-term behavioural changes rather than a focus on delivery of short-term targets.

Both partners clearly articulated the benefits they felt they gained from having a 'critical friend' that helped them to keep focused and added another layer under the formal reporting structure and network set up by the AoC. Although both projects had numerous contacts with other CATS projects, they each felt that the relationship they established with their peer evaluation partner was much stronger and more constructive. Comments from members of the project teams included: "it was nice sometimes to not feel you are alone"; "we were more prepared to share issues and problems, it was more constructive, helpful, not just about reporting on progress"; "the peer evaluation was more process based – it was useful". These comments demonstrate that they had been able to establish an open, trusting relationship with each other, and this is when peer support can be most effective. One team member described how much they valued the fact that discussions with the peer evaluation partner did not focus only on the achievement of project aims and objectives, but also allowed them to "talk around the edges of the project". This was a wonderful phrase, illustrated by several examples in which project team members described how they had shared experiences of working with partners, strategies for managing the challenges this presents, and ways of working within the complexities of a steering group 'management' structure.
Dropbox provided an innovative tool that enhanced communication between multiple partners. Project teams at Linking London and the University of Greenwich have all been able to access Dropbox and deposit up-to-date versions of documentation there. As a cloud-based storage facility it is accessible wherever you are, so users always have access to the most up-to-date version of any document – no more relying on someone to have sent an email, or risking files being blocked by over-cautious spam filters. The ease of access is a real advantage and has also enabled a very effective mechanism for ongoing external evaluation. The external evaluator also had access to Dropbox and was therefore able to regularly 'dip in' to sample the evidence available, effectively moving external evaluation from a twice-per-project snapshot activity into a continuous monitoring process, which is a much more rigorous exercise.

A further strength of the Dropbox facility was the way in which the folders had been set up. It was structured from the start with four folders that aligned with the elements of the Impact Evaluation Model: inputs, outputs, intermediate outcomes and final outcomes. A great benefit of this structure was that, from the very beginning, the project teams had to engage critically with every piece of evidence they collected, differentiating clearly between inputs, outputs and outcomes.

4. Challenges within the Peer Evaluation Process

Although both projects focused on credit accumulation and transfer schemes, there were substantial differences between the two, with very different outputs. At times this made it difficult for each partner to see where lessons could be directly transferable. This was quite noticeable at the interim project phase: the University of Greenwich had been able to provide clear support by supplying the impact evaluation model and sharing their experience of setting up clear reporting schedules and monitoring cycles for managing projects with partners, yet they were less able, at that stage, to articulate tangible benefits or transferable learning that they had gained from the peer evaluation process. However, the value of the activity as a whole was clear, and both project teams valued the reflective process, which they said kept them focused and helped them each to clarify their own thoughts about their project.

The peer evaluation process was not a requirement of the original AoC project call. It was built into both proposals because the teams recognised the importance of assessing impact to justify value for money, and it was felt this would be an effective and efficient way to achieve it. According to Appendix 1 of the University of Greenwich Interim Report (Feb 2013), 'It was agreed that the process should be simple, clear and sufficiently robust but not so onerous as to divert time and resources away from other important project activities.' In this respect, the peer evaluation process achieved its objectives. The process was clear and simple; it had a strong focus on impact, reinforced by the methodology; and the Impact Evaluation Model adopted provided a robust mechanism for planning and capturing evidence on which to base sound judgements. The peer evaluation also provided a cost-effective mechanism that minimised funding drain. However, this was largely down to the smart selection of partnering organisations.
Linking London and the University of Greenwich are geographically close, which meant that the face-to-face meetings planned within the monitoring and evaluation plan were quite achievable. As mentioned earlier, the Linking London and University of Greenwich project teams had previous experience of each other's work; they shared common features and a common language, having worked closely together as lifelong learning networks. This enabled the project teams to move quickly past
the 'getting to know you' phase and move directly into the 'getting down to business' phase. Where project teams do not already share a common bond, establishing a mutually supportive, trusting relationship may take longer than one meeting. Where partners are also geographically distant, developing a relationship could come under even more strain, requiring a greater investment of time to get it established. In such a case, the peer evaluation process may quickly begin to look less cost-effective and could easily come to resemble an extra layer of monitoring and evaluation, diverting resources from delivery of core activities. It is therefore suggested that smart selection of a peer evaluation partner is one of the critical success factors for this aspect of activity, and any further adoption of the process should include clear guidance on partner selection criteria.

Dropbox, a cloud-based document repository, provided a very effective mechanism for multiple partners to share documentation and to organise evidence collection. Its use was one of the distinctive features of the peer evaluation model, but unfortunately it was not in place from the start of the project. This was disappointing; it would have been beneficial for it to be available from the beginning. With hindsight, the team at the University of Greenwich responsible for setting up the Dropbox facility agree, and feel that in future it would be one of the first things put in place as part of a peer evaluation information strategy.

Timescales provided a further area of challenge. Although both projects were working towards an overarching timetable set by the AoC, within that there were minor variations in each project, such as the dates set for their final dissemination events. This did not cause any major challenges, but an early deadline for the Linking London dissemination event meant that the final peer evaluation meeting had to be held before the University of Greenwich had been able to complete all of its final evidence collection. Hence evaluation reports were written before case studies were collected, and these could not be included in the available evidence base.

5. Additional Monitoring and Evaluation Activities within the University of Greenwich Project

The peer evaluation process within the University of Greenwich project was supplemented with monitoring and evaluation activities undertaken by a specially devised steering group. This group comprised senior managers from the colleges, the project team and employers representing SMEs and large companies in SE London. It was an appropriately constructed group that enabled effective oversight of activities and provided a useful forum for raising areas of concern. Steering group meetings were hosted by partners in turn. This built collective awareness of each other's campuses and facilities and was an effective way of strengthening relationships between partners. Steering group meetings not only provided a good vehicle for monitoring project progress but also a forum for staff development activities, enabling all partners to gain a shared understanding of credit and accreditation of prior learning. Greenwich has identified these as 'active meetings' – ones that blend a business agenda with a development session, encouraging interaction and participation from all members.
6. Learning Points

Both projects agreed, as part of the peer evaluation process, that it was important to maintain a flexible approach towards project aims and outcomes and to recognise that plans will need to be
adapted as a project progresses. As both projects began in July, they shared a common concern in the early phase about making steady progress at a time when most academic institutions run on skeleton staffing. The University of Greenwich project was in a more advantageous position here, as it set out with a tightly focused proposal based on a pre-existing relationship. It was also well timed, building on the colleges' validation to offer the Applied Professional Studies programme. The tight bid with a clear focus certainly helped the University of Greenwich to achieve its aims and objectives in a timely fashion. The Linking London project found the summer break more problematic, as it had to establish a number of sub-projects and, in several cases, build new relationships between colleges and universities. Both projects had to respond flexibly and adapt to staff changes in what seems to have been an especially turbulent time for FE college partners.

The Project Manager of the Linking London project found the first meeting of the peer evaluation group especially useful, not only in helping her to situate their project, but also in helping her to realise that they were not alone in their concerns over timescales. For a project manager new to this type of working, the peer evaluation process was especially beneficial, enabling her to work closely with the experienced project manager of the University of Greenwich project.

Although the projects faced different challenges and had very different outcomes, as well as taking on the role of 'critical friend' they were able to provide mutual support, and this is an aspect not to be underestimated. Although there were a number of meetings of all CATS pilots, facilitated by AoC, both projects described the relationship that formed between them as "deeper" and "richer" than those they had with the other projects. Neither project was afraid to ask their peer evaluation partner for advice and support, and neither felt they were being judged in any way. There are some indications here that relationships which form outside the formal oversight of funding agencies can enable a greater level of openness and are therefore more likely to be mutually beneficial.

The concept of 'active meetings' at the University of Greenwich, blending a business agenda for the steering group with a development session, appeared to enhance the level of engagement that partners had with the project collectively. All college partners commented in their interim reports on the "closer working relationships", "group collaborations" and "stronger relationships" that had been built in this way, citing it as a real strength of the project. This was an aspect picked up through the peer evaluation process by Linking London, and it is a methodology they have already identified as worth trialling as a future way of working.

The University of Greenwich noted that Linking London were very good at disseminating the findings of their project work and took advantage of every available opportunity to do so, drawing on a number of well-established practices they had already put in place. Greenwich felt that most of their own dissemination work had been internally focused, ensuring that they maximised the spread of information amongst staff groups within the partner organisations.
As a result of the peer evaluation process, they are now looking at putting on a number of roadshows, adopting a model used by Linking London for wider dissemination, and creating a toolkit as additional outputs to those identified in the original project plan.
7. Best Practice Identified and Recommendations

Consistency: The peer evaluation process was based on both projects using the Impact Evaluation Model originally created by the University of Greenwich. Both projects worked together to identify their inputs, outputs and outcomes and used this information to populate the model. This exercise provided the projects with a shared language and a common level of understanding. Although the outputs and outcomes varied between the two projects, and by necessity so did the evidence needs and evidence sources they would require, each was clear about the process they would be undertaking. This gave a commonality that enabled them to evaluate each other's project against identified, negotiated areas. Without a common model, the differences between the projects might have become too distracting, making it hard to bring any consistency to the peer evaluation process. It is therefore recommended that, for peer evaluation to be effective, a consistent model is adopted against which both projects can be measured.

Clear focus on impact: The peer evaluation process had a clear focus on impact, led by the model adopted. Impact measures, if they are to be effective, need to be embedded into a project from the outset. The University of Greenwich had clearly considered the long-term impact of their project and had built into the evaluation model final outcomes that stretched well beyond the one-year funded project. This is important. With a project such as the CATS pilot, within the space of a year one can realistically measure only project outputs, such as the publishing of the prospectus, and some intermediate outcomes, such as focus group feedback. The true success of the project will only be revealed through increasing progression rates, and it will take a number of years before a clear picture emerges. It is recommended that funded projects should not feel constrained to set evaluation measures limited only to the timeframe of the funding attached, but should also consider longer-term impact measures and look at how these may be captured in the future.

Dropbox: Digital, cloud-based storage facilities can provide a cost-efficient and very effective mechanism for sharing large amounts of documentation amongst multiple users who may or may not share a common employing organisation and are likely to be based in multiple locations. In this case, Dropbox was used very effectively to improve ease of access to documentation and to improve lines of communication between the project teams and the external evaluator. It is recommended that any complex, multi-partner project seriously consider using digital cloud-based document storage as an integral part of an effective communication strategy.

Cost effective: The peer evaluation process in this project provided a well-integrated and cost-effective mechanism for monitoring and evaluation. To ensure robust project oversight, this was supplemented (at the University of Greenwich) by a rigorous system which included the regular submission and review of partner progress reports, monitoring of progress at steering group meetings and a formal reporting structure set in place by the AoC. It was also supplemented by an overarching evaluation of the CATS projects commissioned by AoC.
Where extensive formal monitoring and evaluation processes are already established, it is recommended that projects consider peer evaluation as a useful quality enhancement mechanism.
8. Conclusions

The peer evaluation process is a real strength of this project. It has been an integral component from project conception through to completion and has provided genuine added value to both projects whilst placing very limited demands on project resources. Subject to the caveats raised above regarding partner selection and cost effectiveness, it is a model worthy of consideration for adoption in future national project initiatives.

The University of Greenwich Impact Evaluation Model selected to form the basis for the peer evaluation activity has now been tested in a range of situations. It has been used effectively by both CATS projects as well as by smaller-scale LLN projects, and has demonstrated its transferability and flexibility. It has a strong emphasis on impact, which is critically important within any project, and it ensures that longer-term outcomes stretching beyond the funded term of the project are built in from the start. Both of these aspects build sustainability and maximise the long-term impact gained from pump-priming funding. The model is worthy of further dissemination on a wider stage.
Appendix 1: CATS Pilot Project Peer Evaluation Process
(from Building on Competence in the Workplace, Interim Report, 1 February 2013, University of Greenwich)

Background

The University of Greenwich and Linking London CATS Pilot Project teams recognise the importance of assessing impact to justify value for money, to determine the direction and consequences of projects and to show outcomes in achievements, products and processes. They have therefore developed and agreed a Peer Evaluation Process by which both project teams will monitor the process and outcomes of the other's project through a shared 'virtual' portfolio of evidence and produce an evaluative commentary to accompany the Project Final Report. It was agreed that the process should be simple, clear and sufficiently robust but not so onerous as to divert time and resources away from other important project activities.

The Peer Evaluation Process

The Peer Evaluation Process consists of two elements:
1. An Impact Evaluation Model, which ensures consistency and provides a common framework to enable each project team to monitor the other's progress against their work plan
2. A monitoring and evaluation plan, which underpins the IEM by confirming the process by which each project team will monitor and evaluate

Impact Evaluation Model (IEM)

The Impact Evaluation Model is based on a model previously developed by the Kent and Medway Lifelong Learning Network[3], led by the University of Greenwich to help its partner institutions measure the impact of their LLN-funded projects. In turn based on a model developed by the Training and Development Agency for Schools (TDA)[4], it did not claim to offer a brand new approach to evaluation, but attempted to provide an accessible and user-friendly framework to assist their activity. The fact that it brings together qualitative and quantitative evidence and allows us to look not only at data but also at the real people involved, and at the impact the project could have on communities (work-based learners, employers and learning providers), makes it particularly applicable to the CATS project.

The model is designed to help the project team build up a picture of how they expect the project to work and to be clear about the outcomes of the project.

[3] Perrin, L. Time to Make a Difference: achieving sustainability through projects (2010) KMLLN
[4] The impact evaluation model (IEM) (2009) Training and Development Agency for Schools
Initially, the project teams worked together to identify the inputs, outputs, intermediate outcomes and final outcomes that their project was trying to achieve and to populate the model with these. This enabled them to articulate how they expected their project to have an impact. The model was then used to identify the evidence needed and, from there, the potential sources of evidence. (See the end of this appendix for the Linking London and UoG IEMs.)

By working through the model, each project team was able to demonstrate the links between the various stages of project delivery, from planning all the way through to the impact on individual users of the activities and products being delivered, and on the overall project aims. From past experience we were aware that the model is most effective when put in place early in the project. By completing the model at this stage, we knew what data and evidence we needed to collect at the outset and were able to clarify for each other any uncertainties about the aims of our projects. Articulating the underlying assumptions of the projects and identifying the potential evidence sources at an early stage have informed the project work plans and are being explored as delivery progresses.

Monitoring and evaluation plan

The following methods are being used to monitor the progress of each other's project:

Face-to-Face Meetings
Purpose: (1) to update and question each other on progress; (2) to agree strengths and weaknesses; (3) to share knowledge and best practice; (4) to discuss issues and propose solutions.
Frequency: Beginning (Sep 12), mid-term (Jan 13) and end (May 13).

Dropbox
Purpose: to store and share records, documents and other evidence and make them available to each other.
Frequency: Ongoing, as records and documents are produced.

Emails
Purpose: (1) to update each other on events, Dropbox updates and developments; (2) to share ideas and deal with issues arising between meetings.
Frequency: As and when required between meetings.

Evaluative Commentary
Purpose: to provide an evaluative commentary to inform the final report.
Frequency: Once, at the end of the project.
Face-to-Face Meetings

The principal method supporting the monitoring and evaluation process: three meetings of Project Directors and Managers will be held over the course of the project as follows:
• 27 September 2012 (Birkbeck) – to agree the impact evaluation model and agree the process
• 14 January 2013 (UoG) – to report progress to date and agree outcomes for the draft interim report
• 17 May 2013 (Birkbeck) – to report progress to date and agree outcomes for the final report

Dropbox

Dropbox will be used as a virtual repository for all documents, publications, records and other evidence identified in the Impact Evaluation Model. Each project will have its own folder, each containing four sub-folders – Inputs, Outputs, Intermediate Outcomes and Final Outcomes (as per the IEM). Evidence will be filed in the appropriate sub-folder. Folders and sub-folders will be shared and accessible to each other. In addition, the UoG folder will be accessible to Viki Faulkner of the Sussex Lifelong Learning Network, who has been appointed to undertake an external evaluation of the Peer Evaluation Process.

Emails

Probably the easiest and most effective way of keeping in touch between meetings, these will be used to maintain contact on an ongoing basis.

Evaluative Commentary

Each project team will provide a brief evaluative commentary (approx. 500 words) on the other's project to support the final report, confirming:
1. To what extent did the project meet its aims and objectives?
2. What, if any, are the additional benefits delivered by the project?
3. What lessons did we learn that we could apply to future projects?

CATS Pilot Project Teams

Linking London
Project Director: Sue Betts
Project Manager: Pamela Calabro

University of Greenwich
Project Director: Hugh Joslin
Project Manager: Lindsay Perrin
Impact Evaluation Models – Linking London and University of Greenwich
