  • 2007 – Audit (old system); 2008 – NZQA Pilot (new system); 2009 – Self Assessment Trials – evaluative conversations, KEQs; 2010 – APERs – introduction of SA at CPIT; 2011 – SSERs, APERs; 2012 – EER; 2013 – ??? The future – embedding the process, new ideas, new processes
  • 2007 – Audit (old system); 2008 – NZQA Pilot (new system); 2009 – Self Assessment Trials – evaluative conversations, KEQs
  • This evaluation-based and outcomes-focused system came into force in January 2009. What is clear is that the new system is more focused on teaching, learning and outcomes for students and other stakeholders. The self-assessment approach involves staff more fully in analysis, reflection and use of the results of self-assessment in decision making for improvement (as compared with the checking of audit). The system has two major components: a continuous self-assessment component, where ITPs use an evaluative approach to assess their capability and performance, and periodic external evaluation and review. The latter will occur every four years and will focus on capability in self-assessment as well as on performance.
  • Self-assessment, for a tertiary provider, is about evaluating how well the organisation is doing in terms of the core business of learning and teaching. Self-assessment comprises the processes a provider uses to establish evidence of its own effectiveness. The results are used to inform the organisation's future planning and its actions to bring about improvements in outcomes. An evaluative approach to self-assessment is a systematic investigation of the outcomes achieved by providers and the key processes contributing to those outcomes.
  • In evaluating their performance, providers need to ask: What (outcomes) are we (our organisation) trying to achieve? How do we know we have achieved these (or not)? What do we know about what contributes to (or inhibits) achievement of those outcomes? What are we doing, and what can we do, to improve? And (after time) what improvement have we made? In summary, evaluative self-assessment may ask: What is? (What is happening?) So what? (Why is that important; why does it matter?) Now what? (What do we do now to improve things?)
  • Both self-assessment and external evaluation and review look for the answers to key evaluation questions. These questions provide a common framework for exploring the quality, value and importance of what is being achieved in any type of TEO, and are designed to explore the most important dimensions of educational quality. They address: Course/programme content and design – How well do programmes and activities match the needs of learners and other stakeholders? Delivery – How effective is the teaching? Researchers like John Hattie tell us that good teaching is likely to be the single most important contributor to student achievement that is under the control or influence of the provider. How well are learners guided and supported? Are learners' needs identified, and are learners engaged? Outcomes – How well do learners achieve? This calls for evidence of actual learner achievement: how do you measure progress or value added? What is the value of the outcomes for key stakeholders, including learners? Who are the important stakeholders, and how do you know that their needs are being met? How effective are governance and management in supporting educational achievement? This question encourages a whole-of-organisation approach.
  • The processes of self-assessment are evolving over time and are beginning to be embedded as 'business as usual' in much of the institution. SA focuses on the things that have the greatest impact on student learning and outcomes, and on valued outcomes for key stakeholders. For SA, we decided that we would collect only the data and information relevant to its purpose, which can be used for decision making and improvement. The transition from an audit approach to SA has been guided by simplification and consolidation, a collegial approach, and building staff capability in self-assessment. There has been consultation during the SA implementation phase with a variety of groups, including: CPSA, Faculty Boards, Academic Board, Management Team, Council, HOS groups, Division Managers and the Strategic Leadership Group.
  • We developed a generic self assessment and evaluation process which is used for a variety of self assessment activities. This set the framework to begin the implementation.
  • Stocktake: the SA team and senior managers (e.g. HOS, Dean) identify the current challenges and what is already known (information, evidence), describe the characteristics of the area, and plan the activity meetings.
  • Evaluative questions are asked to establish the effectiveness of the activity in relation to student learning and key stakeholders. Engage in evaluative conversations informed by the data/information available; identify information gaps; engage stakeholders for advice and information; explore options for action. Activity Team – the SA team plus the people involved in the activity (staff, external stakeholders, support/education specialists, students; people who can provide advice or assistance).
  • Talk about SSER
  • Decide actions (Activity team and senior managers) – reported to Faculty Board, Academic Board and/or Management Team.
  • Survey Results – survey results showed overwhelming support for the new processes. Feedback was very positive about the evaluative conversations held and the wide variety of people involved. We received some very good constructive feedback that we have added into the process for 2011. Trends Report – each cluster report was analysed and a trends report developed for the whole institution, which was presented to Academic Board and Council. This report highlights best practice and issues across the institution, and also sets out actions for the year ahead. Closing the Loops – each cluster report is endorsed through its Faculty Board; then, twice a year, each Faculty Board reviews its progress and reports through to Academic Board. Continuous Improvement – each cluster is now looking back at last year's actions and moving to close them or roll them over into this year.
  • 5 September 2010, 22 February 2011, 13 June 2011
  • Improving education quality and performance is an integral driver for the evaluative approach to quality assurance. In order for tertiary providers to make these improvements, NZQA will be applying incentives on the basis of External Evaluation and Review results. The expectation is that NZQA is able to invest high levels of trust in the information provided by high-performing tertiary education providers. Where EER demonstrates high levels of confidence in education quality, providers will have greater freedom, lower compliance costs and performance-responsive quality assurance processes. For providers who have not demonstrated this level of quality, or where there are concerns, NZQA sanctions will increase the level of external scrutiny and limit the provider's activities until there is evidence of improvement. Talk about CPIT expectations.
  • Self-assessment is an integral part of a tertiary provider's business. Carried out in an ongoing manner, it is a powerful tool for organisational improvement. It identifies areas that are working well as well as those needing more development. In turn, this allows an organisation to set priorities for further development, evaluate the impact of actions on outcomes for learners and key stakeholders, and promote innovation. When the new model is functioning at CPIT as intended, self-assessment will apply as a matter of course to all those activities which ultimately impact on student learning, and the process will be applied instinctively by those closest to the activity. At CPIT the principles of reflective self-assessment are already being applied to many key academic processes, which makes the transition through to 2012 and beyond smoother than may be the case at other institutions. It is hoped the evolution will include the use of more and stronger data and information, testing the effectiveness of changes, and focussing more deliberately on how to improve student learning.
  • Let the journey continue

    1. Doing it to yourself... 15 August 2011, Deborah Young, Christchurch Polytechnic Institute of Technology
    2. Where from... Where we are… Where to... 2007 – Audit; 2008 – NZQA Pilot; 2009 – Self Assessment Trials; 2010 – APERs; 2011 – APERs, SSERs; 2012 – EER; 2013 – ????
    3. Where from...
    4. What is Self Assessment and External Evaluation and Review?
       - Self Assessment and External Evaluation and Review
       - NZQA led
       - Focus is on teaching, learning and outcomes for students and stakeholders
       - Involves all staff in self assessment
    5. What is Organisational Self-Assessment?
       - Definition: the evaluative processes an organisation uses to establish evidence of its own effectiveness.
       - Purpose: organisational improvement and accountability.
    6. Evaluating What Matters
       - The outcomes of learning and teaching, and
       - The processes contributing to these outcomes
       - Evaluating what matters in order to improve outcomes
    7. Key Evaluation Questions
       - How well do learners achieve?
       - What are the valued outcomes for key stakeholders, including learners?
       - How well do programmes and activities match the needs of learners and other stakeholders?
       - How effective is the teaching?
       - How well are learners guided and supported?
       - How effective are governance and management in supporting educational achievement?
    8. Where we are...
    9. CPIT Principles for introduction of Self Assessment and Evaluation
       - Embed in business as usual
       - Focus on student learning and outcomes
       - Focus on stakeholder outcomes
       - Only collect information used for decision making and improvement
       - Simplification and consolidation; collegial approach; building staff capability in SAE
       - Consultation with key groups
    10. Before we started
       - Facilitation: Registration of Interest; Selection; Training
       - Staff Capability: face-to-face meetings; faculty-wide open sessions; Email/Moodle
       - Clusters
    11. Generic SA Process
       - Identification of areas
       - Stocktake meeting
       - Evaluative Conversation
       - Reporting
       - Closing the Loops
    12. Stocktake
       - What? Identify current challenges; characteristics of the area
       - Who? SAE team and senior managers
    13. Cluster statistics template:
        Cluster | Faculty | Prog Codes | Programme title/s | Head of School | Programme Leader/Manager | Date | Signed
        EFTS summary (2010 / 2009 / 2008): Domestic SAC – enrolled EFTS; ITO funded – enrolled EFTS; International – enrolled EFTS; Total enrolled EFTS; Actual Staff/Student Ratio
        *Refer to Appendix 1 for detailed information by programme
        Domestic SAC EFTS summary (2010 / 2009 / 2008): Domestic SAC – Māori EFTS; Domestic SAC – Pasifika EFTS; Domestic SAC – under 25 EFTS
        *Refer to Appendix 1 for detailed information by programme
        Characteristics of cluster(s): e.g. Foundation / International / Pasifika / Māori / Sustainability
        Current Challenges
        Actions from last Annual Programme Evaluation
    14. Evaluative Conversation
       - What? Questions asked to establish the effectiveness of the activity in relation to student learning and key stakeholders; exploring options for actions
       - Who? Activity team
    15. Examples of questions: How do you know that you engage with students? How do you know the learning context and environment match the needs of the learners? What changes have been made in subject content or graduate profiles as a result of stakeholder engagement, and how have these been implemented? How do you know that your engagement with industry is effective? How do you know teaching is effective?
    16. Examples of questions: Who are your customers? How do you know you are meeting their needs? How do you know that your engagement with your customers is effective? How does your individual role impact on student achievement? What do faculties and other divisions value about your service? How do you know?
    17. Reports
       - What? Decide actions/recommendations
       - Who? Activity team
    18. Self-assessment report template:
        Identified area (including programme codes and titles if appropriate)
        Stocktake Team members | Activity Team members | Facilitator | Date self-assessment completed
        Best Practice Identified (listed under the relevant KEQs: KEQ1–KEQ6)
        Issues Identified (listed under the relevant KEQs: KEQ1–KEQ6) | Actions | Person Responsible | Success Criteria | Timeframe for Completion | Progress November 2011 | Completed (Yes/No, date)
        Signed by HOS after Activity Team Meeting (date) | Signed by HOS at 6-Monthly Progress Report (date)
    19. After Round 1…
       - Survey Results
       - Trends Report to Academic Board
       - Closing the Loops
       - Continuous Improvement
    20. Then…
    21. Where to…
    22. External Evaluation and Review – EER
    23. Improving Performance
    24. Use and adaptation of existing tools and processes
       - Evaluative vs. descriptive commentary
       - Needs to be evidence based
       - Some survey tools/questionnaires being adapted
       - Recognition that better tools are needed for student evaluations and graduate destinations
       - Use of AUSSE – student engagement survey
    25. Engagement with stakeholders/industry
       - Self Assessment has led to a re-think of engagement
       - Programme vs. institutional engagement
    26. Evaluating the Outcomes for Learners
       - Difficulties tracking graduates and destinations
       - Exploring use of social networking
       - Long-term employment outcomes and contributions to communities hard to measure
    27. Challenges
       - Evaluation of actions and monitoring
       - Benchmarking
       - Maintaining the momentum
    28. Let the journey continue…