The document provides guidance on developing a student growth measures (SGM) plan for teacher and principal evaluations in accordance with Ohio's evaluation systems. It reviews the SGM framework and defines student growth. Districts are instructed to conduct an inventory of assessment data and resources, to categorize educators based on the types of data available, and to determine appropriate SGM measures and percentages. Categories are defined by whether value-added or approved vendor assessment data exists. Considerations are also provided for educators who are new to an assessment and for principals' data sources.
The document provides guidance on designing a local student growth measures plan for teacher evaluations in Ohio. It outlines the steps to take, which include conducting an inventory of teachers, assessments, and student growth measure categories. Districts must determine what percentage of evaluations will be attributed to value-added data, approved vendor assessments, and local measures such as SLOs and shared attribution. The document also discusses special considerations for categorizing teachers and setting default percentages for student growth measures.
This document provides guidance on designing a local student growth measures plan for teacher evaluations in Ohio. It outlines the steps to take, which include conducting an inventory of teachers, assessments, and student growth categories; analyzing available assessments and growth data; and setting default percentages for how student growth will be calculated and measured in teacher evaluations. Key aspects that must be determined are which assessments will be used to measure student growth, how to categorize teachers based on available data, and what percentages will be attributed to value-added data, approved assessments, and local measures in each category. The document reviews considerations for each step and category to help local districts develop their own student growth measures plan.
This document summarizes NWEA's Assessment Summit held in South Carolina in March 2013. It discusses NWEA's ongoing process of aligning its Measures of Academic Progress (MAP) assessments to the Common Core State Standards, including the release of new aligned test versions and technology-enhanced item types. It also addresses how the transition to Common Core may impact data reports and the use of MAP assessments for teacher evaluation.
The document analyzes evidence from multiple sources to evaluate the claim that graduates of the MAT teacher preparation program design and teach effective lessons, including assessments. Data is presented from the Academic Teaching Skills exam, Teaching and Curriculum final grades, and the Clinical Observation Rubric (COR) for three cohorts from 2006-2010. The analysis found that MAT students scored well above passing averages on the exam and received high marks in the specified COR categories, with higher exam scores correlating with better grades. It was concluded that the evidence supports the initial claim about MAT graduates' pedagogical skills and abilities.
The document discusses upcoming legislation around student growth models for evaluating teachers and schools. It notes that the legislation will require districts to use state assessments to measure student growth and develop local growth models to supplement the state ratings. The document provides an overview of different types of growth models, how they work, and how they can be used to set growth targets and evaluate educators. It emphasizes that growth models are most effective when goals are aligned across the district and based on a shared definition of student success.
Catherine Wreyford - Reforms to Primary Assessment and Accountability (LamptonLWA)
This document summarizes reforms to primary school assessment and accountability in England without the use of levels. It explains that levels were removed because they had unintended consequences and led to a disproportionate focus on pupils near boundaries. The reforms include new national curriculum tests reported as scaled scores, interim frameworks for teacher assessment, an optional reception baseline, and assessment freedoms for schools. It also outlines new measures for statutory accountability including a higher school floor standard and a new progress measure based on value-added scores. A new category of "coasting schools" is also introduced.
The document discusses REIL-TNG, a teacher performance evaluation system, in its third year of use. It has either three or four components, depending on the number of students with assessment data. The components are observations, student progress, entity growth based on transition assessments, and possibly individual growth based on transition assessments. A teacher's performance in each area is converted to a score on a 1-5 scale. The scores are weighted and combined to calculate an overall REIL-TNG score out of 500 points, which determines the teacher's performance classification.
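The summary above describes a weighted composite: each component is rated on a 1-5 scale, weighted, and the result is expressed out of 500 points. A minimal sketch of that kind of calculation, using illustrative component names and weights (assumptions for this example, not the actual REIL-TNG weights):

```python
def composite_score(scores, weights, max_points=500):
    """Combine 1-5 component scores into a single score out of max_points."""
    if set(scores) != set(weights):
        raise ValueError("scores and weights must cover the same components")
    total_weight = sum(weights.values())
    # Weighted average on the 1-5 scale...
    weighted_avg = sum(scores[c] * weights[c] for c in scores) / total_weight
    # ...then rescaled so a perfect 5.0 average maps to max_points.
    return weighted_avg / 5 * max_points

# A three-component case (no individual-growth component), with
# hypothetical weights:
scores = {"observation": 4.0, "student_progress": 3.5, "entity_growth": 4.5}
weights = {"observation": 0.5, "student_progress": 0.3, "entity_growth": 0.2}
print(round(composite_score(scores, weights), 1))  # 395.0
```

The four-component case would simply add an "individual_growth" entry to both dictionaries, with the weights adjusted accordingly.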
This document outlines the requirements for educator evaluations in districts and schools. It includes:
1. Eight evaluation components that assess standards like content knowledge, assessment, learning environment, and student growth.
2. A process for evaluations that incorporates observations, input from parents/students, and measures of student growth.
3. Performance levels of exemplary, proficient, basic, and unsatisfactory that are used to rate educators on each standard and determine their overall rating.
4. Results and actions like improvement plans or professional growth plans for educators who do not meet performance standards. Districts must ensure at least 50% of ratings are based on student growth measures.
EDUtech customer training effectiveness analysis, by Peng (Will) Wu
This business operations analytics study was sponsored by a fast-growing SaaS education technology startup in California. In this exercise, I was provided real operations data consisting of 260 records to analyze how customer training influences SaaS platform usage and how to optimize the training practice in the future.
OECD Reviews of Evaluation and Assessment in Education: SWEDEN - Stockholm – ... (EduSkills OECD)
The document summarizes an OECD review of evaluation and assessment frameworks in Sweden. It finds that Sweden has a well-established approach built on trust and school autonomy, but could better integrate components like student assessment, teacher appraisal, and school and system evaluation. It recommends Sweden develop a strategic framework, increase the reliability of national assessments, strengthen teacher appraisal and its link to school evaluation, and improve system monitoring by mobilizing existing data.
8.6.13 assessment policy media briefing (chipubschools)
The document summarizes the findings from focus groups held to examine the Chicago Public Schools (CPS) assessment system and identify areas for streamlining. Major findings included that test preparation crowded out instructional time, assessments were not well-aligned to Common Core standards, and practice testing requirements varied widely. The proposed changes for the 2014 school year and beyond aim to limit high-stakes testing, reduce test preparation activities, ensure appropriate measures for diverse learners, and phase out district assessments as new state assessments are implemented. Specific changes include limiting required assessments, building common benchmark assessments aligned to standards, and increasing flexibility, transparency and support around assessments.
The document discusses changes to the national curriculum and assessment requirements in UK schools. Key points include:
- The national curriculum must be followed by maintained schools but not academies/free schools, though all schools have the same assessment requirements.
- Assessment levels will be removed and schools will be responsible for their own curriculum and assessment frameworks. Accountability will be based on pupil progress and achievement.
- The NAHT recommends assessing pupils against objective criteria rather than rankings and developing consistent assessment criteria based on the new national curriculum.
- An assessment model is proposed using key performance indicators and performance standards to assess pupils termly and annually against the national curriculum criteria. Schools will track achievement and create exemplars.
The document discusses updates to Washington State's Teacher and Principal Evaluation Project (TPEP). It outlines a timeline for implementation of new evaluation models between 2010-2014. All districts will use the new models starting in 2013-14. The new models will include 4 tiers of evaluation (exemplary, proficient, basic, unsatisfactory). Teacher and principal evaluation criteria are aligned around areas like instruction, data use, culture and community. Student growth data from multiple measures must be incorporated. The document recommends various resources and task forces to help with training and implementation of the new evaluation systems.
The Common Core State Standards were developed by the National Governors Association and the Council of Chief State School Officers to provide consistent, clear educational standards across states. They are designed to ensure students are prepared for college and careers. The standards focus on developing critical thinking, problem solving, research, and writing skills. Assessments will be administered throughout the school year via the Partnership for Assessment of Readiness for College and Careers to provide feedback on student progress. States adopting the Common Core Standards will work together on common assessments and performance standards.
The document summarizes evaluation data collected for an educational program over multiple years, including implementation, improvement and effectiveness data. Feedback from professional learning sessions was highly positive. Teacher self-assessment scale results showed teachers who participated in program professional learning had higher efficacy scores than those who did not participate. Trend lines were analyzed at the school, domain and item levels.
This document provides guidance for districts on developing and implementing student growth measures for teacher evaluations under the Performance Evaluation Reform Act (PERA). It discusses the requirements around establishing a joint committee, identifying appropriate assessments including Type I, II, and III, determining student growth targets, developing student learning objectives, and assigning summative ratings. Key points include establishing equal representation on the joint committee, using at least two assessments per teacher category with one being Type I or II, considering student characteristics when setting growth targets, and employing state default models if the committee cannot reach agreement.
Building Institutional Research Capacity in a K-12 Unified District (Christopher Kolar)
In higher education, Institutional Research (IR) offices function to audit the academic output of the institution, evaluate program efficacy, and monitor student success. Effective institutional research supports the understanding, planning, and operation of programs informed by a recognition that different functions of an institution are interrelated and dependent. This session will outline practices by the Department of Research, Evaluation, and Assessment in the Palo Alto Unified School District – a division designed and staffed using an IR model.
[Appendix 1 b] rpms tool for highly proficient teachers sy 2021 2022 in the t... (GlennOcampo)
The document is an RPMS tool for highly proficient teachers (Master Teacher I-IV) for the 2021-2022 school year that outlines their position, qualifications, duties and responsibilities, and performance evaluation. It contains details on the education, experience, and training requirements for each level of Master Teacher. It also lists key result areas (KRAs) related to content knowledge and pedagogy, learning environment, and learner development and engagement. Under each KRA are objectives with corresponding performance indicators and means of verification for evaluation.
The document summarizes the accomplishments of Dunlap Community Unit School District 323 in the 2012-2013 school year. Key accomplishments include implementing a new certified staff evaluation tool using iObservation software, developing metrics to measure human resources initiatives, revising principal and administrator evaluation processes, transitioning the support staff structure, and beginning implementation of the HR portion of a new HR/finance system. It also outlines major district initiatives for the 2013-2014 school year such as continuing the focus on data-driven decision making, improving evaluation processes, and transitioning support staff structures.
The document provides information about New Jersey's teacher evaluation system, which uses multiple measures including student growth objectives (SGOs) and classroom observations. SGOs are academic goals for student groups that are aligned to state standards and can be measured over time. Effective SGOs are specific, measurable, ambitious, results-driven and timed. Teachers of tested grades have their evaluation weighted based on student growth percentiles and one SGO, while other teachers have evaluations weighted based on classroom observations and two SGOs.
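The weighting scheme described above combines several measures into one summative number, with the mix of measures depending on whether the teacher is in a tested grade. A small sketch of such a multiple-measures weighted sum, using illustrative weights rather than New Jersey's official percentages:

```python
def summative_rating(measures, weights):
    """Weighted sum of measure scores (each assumed to be on a 1-4 scale)."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(measures[m] * weights[m] for m in weights)

# Tested-grade teacher: student growth percentile (SGP) + one SGO
# + observations. Weights here are hypothetical.
tested = summative_rating(
    {"sgp": 3.0, "sgo": 3.5, "observation": 3.2},
    {"sgp": 0.3, "sgo": 0.15, "observation": 0.55},
)

# Non-tested teacher: two SGOs + observations, again with
# hypothetical weights.
non_tested = summative_rating(
    {"sgo_1": 3.5, "sgo_2": 3.0, "observation": 3.2},
    {"sgo_1": 0.1, "sgo_2": 0.1, "observation": 0.8},
)
```

The design point is that the same summative scale applies to both groups; only the set of measures and their weights differ by teacher category.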
The Teacher Self-Assessment Scales (TSAS). Presented to school participants at the 2015 Fall ALD4ALL Kick-Off on September 22, 2015 by Dr. Joseph P. Martinez.
The document provides an overview of capital markets and security markets. It discusses how capital is raised in capital markets through various financial instruments like bonds, stocks, and funds. It also describes the key participants in capital markets like households, corporations, and government entities. The security markets are organized into various submarkets and exchanges. Over time, markets have become more electronic and integrated through technological advances. Regulations aim to make markets fair, transparent, and protect investors.
This document outlines the steps for designing a local student growth measures plan, as presented at an Ohio education conference. It discusses defining student growth, analyzing the student growth measures framework, conducting an inventory of teachers and assessments, determining default percentages for value-added data and other measures, and communicating the plan. The presentation provides guidance on categorizing teachers, using student learning objectives and shared attribution, approving measures, and providing training on the implementation.
Google Apps is a suite of collaboration and productivity tools including Gmail, Calendar, Drive, Docs, Sheets, Slides, and Sites. Gmail provides 25GB of storage and integrates with Calendar and Drive. Calendar allows scheduling of lessons and meetings across devices and with sharing controls. Drive provides cloud storage and sharing of files across devices. Docs, Sheets, and Slides are online versions of word processing, spreadsheet, and presentation software that can be collaboratively edited. Sites enables easy website creation without coding. The suite of tools works across operating systems and devices and provides capabilities for communication, organization, collaboration, and content creation for education.
The document discusses current asset management, including cash management, marketable securities, accounts receivable, and inventory management. It covers topics such as cash flow cycles, float, credit policies, inventory levels, and inventory decision models. The goal of current asset management is to balance liquidity needs with maximizing returns through techniques like minimizing cash balances and actively managing receivables, marketable securities, and inventory levels.
1. Improve Your District’s
Student Growth Measures Plan
Ohio’s Spring Education Symposium
March 28, 2014
Presented By:
Mindy Schulz, Allen County ESC
Director of Curriculum
eTPES, OTES, OPES State Trainer
1
2. Intended Outcomes
Participants will:
• Review the OTES & OPES student growth
measures (SGM) framework.
• Analyze their district’s current SGM
assessments for quality and relevant use.
• Evaluate and refine their district’s
current SGM plan for potential revisions
and considerations.
2
3. Session Format
• I do
• Presenter will provide
information, demonstrate, and model refining a
SGM plan
• We do
• Participants will practice applying SGM plan
revisions
• You do
• Participants will refine their district SGM plan
using their own district and school information
3
4. Definition of Student Growth
For the purpose of use in Ohio’s
evaluation systems, student
growth is defined as the change in
student achievement for an
individual student between two or
more points in time.
4
7. What is a Student Growth
Measures Plan?
“Teacher evaluation as required by
O.R.C.3319.111 relies on two key evaluation
components: a rating of teacher
performance and a rating of student
academic growth, each weighted at 50
percent of each evaluation. The following
guidance speaks to the student growth
measures component, specifically
addressing determinations to be made for
using student growth measures within
teacher evaluation.” – K. Harper
7
8. Let’s Review: ODE Steps for
Designing a Local SGM Plan
1) Conduct an inventory of needs and
resources.
2) Determine and create (if necessary)
student growth measures to be
used.
3) Communicate expectations and
refine the entire process.
8
10. Who is required to be evaluated
by the new evaluation systems?
Any person who is employed under a
teaching license or under a professional or
permanent teacher’s certificate and who
spends at least 50 percent of his/her time
employed providing student instruction. This
does not apply to a teacher employed as a
substitute. (O.R.C. 3319.111)
This usually excludes:
Speech pathologists, occupational therapists
Teachers on assignment
Nurses, psychologists, guidance counselors
10
11. Who is required to be evaluated
by the new evaluation systems?
O.R.C. 3319.02 D(1)
The procedures for the evaluation of
principals and assistant principals shall be
based on principles comparable to the
teacher evaluation policy adopted by the
board under section 3319.111 of the Revised
Code, but shall be tailored to the duties and
responsibilities of principals and assistant
principals and the environment in which
they work.
11
12. Categorize Educators
• Who has teacher-level EVAAS Value-Added
data, grades 4-8 reading and/or math? Category
A1 & A2
• Which principals are assigned to buildings with
Value-Added data? Category A
• Who does not have teacher-level Value-Added
data (or building-level for principals), but has data
from assessments on the ODE approved vendor
assessment list? Category B
• Who has no Value-Added or approved vendor
data? Category C
12
13. Value Added Data Timeline
13
Spring: 4-8 Reading & Math OAAs administered.
Fall: Teacher-level Value-Added reports released (data from spring of the prior school year: 4-8 Reading & Math OAAs).
2nd Semester: Teacher- and building-level Value-Added reports (from the prior school year) uploaded into eTPES by ODE.
May: Prior school year teacher-level value-added reports (actually received in fall of the current school year) for Category A1 & A2 teachers will be used in calculating the SGM % based on the LEA's default percentages. Building-level value-added results for Category A principals will be used in calculating the SGM % based on the LEA's default percentages.
14. Who is a Value-Added
Teacher for 2013-2014?
If a teacher received a value-added report in fall
2013 from course(s) instructed in 2012-2013, that
teacher is considered a value-added teacher for the
spring 2014 evaluation. Of those, determine:
• Who instructed all value added courses for
2012-2013 (exclusively)?
• Who instructed some value added course(s)
for 2012-2013, but not exclusively? What
percent of time was spent instructing in value
added course(s)?
14
15. Who instructed all value
added courses for 2012-2013?
Category Determination
Previous School Year (2012-2013) | Current School Year (2013-2014) | Assigned Category in eTPES (2013-2014) | SGM Percentage
All VA | All VA | A1 | 26-50% (2013-2014 only)
All VA | Some VA | A2 | 26-50%, because the previous year had full VA
All VA | No VA | A2 | 26-50%, because the previous year had full VA
Category A1 teachers must use their teacher-
level Value-Added report as the full 50% student
growth measure beginning July 2014.
15
16. Who instructed some value
added course(s) for 2012-2013?
Category Determination
Previous School Year (2012-2013) | Current School Year (2013-2014) | Assigned Category in eTPES (2013-2014) | SGM Percentage
Some VA | All VA | A2 | Proportionate to schedule; 10-50%
Some VA | Some VA | A2 | Proportionate to schedule; 10-50%
Some VA | No VA | A2 | Proportionate to schedule; 10-50%
16
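The determination tables for Value-Added teachers (slides 15-16) reduce to a small decision rule. The sketch below is illustrative only, not eTPES logic; the function name and the "all"/"some"/"none" values are assumptions:

```python
# Hypothetical sketch of the Category A determination in slides 15-16.
# Values for each year: "all", "some", or "none" Value-Added courses.

def assign_va_category(prev_year_va: str, current_year_va: str) -> str:
    """Assign an eTPES category from a teacher's Value-Added schedule."""
    if prev_year_va == "none":
        # No teacher-level VA report from the prior year:
        # the Category B/C rules apply instead (see slide 25).
        raise ValueError("Not a Value-Added teacher; use the B/C rules")
    if prev_year_va == "all" and current_year_va == "all":
        return "A1"  # instructed VA courses exclusively both years
    return "A2"      # any mixed or partial VA schedule
```

For example, a teacher who taught only VA courses in 2012-2013 but none in 2013-2014 still lands in A2, because the prior-year report exists.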
17. Principal in Building with Value-
Added Data
# of Years in Same Building | SGM Decisions
4 or more | Category A
3 | Category A
• Enter the verified principal composite provided
by ODE when available*
• If the principal composite is not verified or is
incorrect, use the 2-year building average
available in the EVAAS report
• May have local measures
17
Currently, building-level Value-Added data is based on a 3-year composite.
*If a principal's 3-year composite is incorrect, SAS and ODE are working to create principal
composite reports which will be available in the spring.
A special thanks to Dr. Kathy Harper, Greene Co. ESC, for contributing to the contents of this slide.
18. Principal in Building with Value-
Added Data Cont.
# of Years in Same Building | SGM Decisions
2 | Category A
• Enter the verified principal composite provided by ODE
when available*
• If the principal composite is not verified or is incorrect, use the
1-year data available in the EVAAS report
• May have local measures
1 – YES, previous principal experience
• Category A if the previous assignment was in a building with
Value-Added. **Use the decision process for principals with 2,
3, or 4+ years in a Value-Added building.
• Category B or C if the previous assignment was in a building
without Value-Added data. Use data from the current
assignment to determine the category.
18
19. Principal in Building with Value
Added Data Cont.
# of Years in Same Building | SGM Decisions
2 | Category A
• Override the default tied to the IRN; enter the 1-year
building score from the EVAAS report
• May have local measures
1 – YES, previous principal experience
• Category A if the previous assignment was in a building
with Value-Added. **Use the decision process for
principals with 2, 3, or 4+ years in a Value-Added
building.
• Category B or C if the previous assignment was in a
building without Value-Added data. Use data from the
current assignment to determine the category.
• May have local measures from the current assignment
19
20. Special Considerations: Principals
20
Are any principals in a building with no Value-
Added data, but were previously assigned to a
building with Value-Added data?
• Category A
• Override the pre-loaded default percentage data
and enter data from the previous school; **use the
decision process for principals with 2, 3, or 4+ years in
a Value-Added building.
• May have local measures from current assignment
21. Who has data from assessments on the
ODE approved vendor assessment list?
1) What ODE-approved vendor assessments did we use for 2013-
2014? Will our LEA use any of the newly added ODE-approved
vendor assessments for 2014-2015?
2) Will our LEA continue using the 2013-2014 vendor
assessments?
3) LEA Considerations:
• Does the manner in which our LEA is using the ODE vendor assessment
meet the definition of student growth?
• Have we secured the vendor assessment growth reports?
• Which assessments are not on the ODE approved vendor assessment list,
but could be used in SLOs?
• Are any Category A2 teachers using an ODE-approved vendor
assessment? If an A2 teacher uses an approved ODE vendor assessment,
that assessment becomes a local measure.
21
22. Who has No Value-Added or
ODE-Approved Vendor Data?
Inventory educators with
no value-added or
ODE-approved vendor
assessment data.
(Category C)
22
23. Special Considerations:
Teachers
Who is new to Value Added assignment for the
current year?
• Inventory teachers who did not receive a value-
added report from the previous year, but have been
assigned to a value-added course for the current
year.
This may include:
New teachers, e.g. Year One Resident Educators,
new hires
Any teacher whose assignment changed from the prior
year to the current year, e.g. a teacher who instructed 3rd
grade in the previous year and currently instructs 6th
grade math
23
24. Special Considerations:
Teachers
For teachers new to value-added assignment and
not receiving a teacher-level value-added report in
the fall:
• Determine current year SGM
category, dependent upon available data.
Are there ODE-approved vendor assessments
available? (Category B)
If there are no ODE-approved vendor assessments
available, LEA measures will be used. (Category C)
24
25. Categorizing Teachers New to
Value-Added
Category Determination
Previous School Year | Current School Year | Assigned Category in eTPES | SGM Percentage
No VA | All VA | B or C | B if an ODE-approved vendor assessment is used; all others are Category C
No VA | Some VA | B or C | B if an ODE-approved vendor assessment is used; all others are Category C
No VA | No VA | B or C | B if an ODE-approved vendor assessment is used; all others are Category C
25
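The rule for teachers new to Value-Added is even simpler and can be sketched in one line. As before, this is an illustrative sketch and the names are assumptions, not an eTPES API:

```python
# Illustrative sketch of the slide-25 rule for teachers with no prior-year
# Value-Added report: Category B if an ODE-approved vendor assessment
# is used, Category C otherwise.

def assign_new_to_va_category(uses_approved_vendor_assessment: bool) -> str:
    return "B" if uses_approved_vendor_assessment else "C"
```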
26. Special Considerations: Principals
Are any 1st-year principals assigned to a building with Value-Added data?
# of Years in Same Building | SGM Decisions
1 – NO previous principal experience
• Category B if the building has ODE-
approved vendor data (may also have
local measures)
• Category C if no ODE-approved vendor
data
26
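Taken together, the principal rules on slides 17-18 and 26 form one decision process. The following is a hedged sketch of that process under stated assumptions; the parameter names are illustrative and the edge cases (e.g. a principal who moved from a VA building, slide 20) are only partially covered:

```python
# Hypothetical combination of the principal categorization rules.
# Assumptions: years_in_building counts the current assignment; a
# first-year principal in a VA building without prior principal
# experience falls back to the B/C rules (slide 26).

def principal_category(years_in_building: int,
                       building_has_va: bool,
                       prior_building_had_va: bool = False,
                       has_prior_principal_experience: bool = False,
                       has_vendor_data: bool = False) -> str:
    if building_has_va and years_in_building >= 2:
        return "A"  # use composite or 1/2-year EVAAS data per slides 17-18
    if (years_in_building == 1 and has_prior_principal_experience
            and prior_building_had_va):
        return "A"  # apply the 2/3/4+ year decision process
    return "B" if has_vendor_data else "C"
```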
27. Step Two:
Designing a Local SGM Plan
Determine and create
(if necessary) student growth
measures to be used.
27
28. Determine LEA Default Percentages
What percentages will your LEA attribute to:
• Value-Added data (Category A1 and A2)?
• Assessments from the ODE approved vendors
(Category B)?
• Local Measures within each category?
(Local Measures may also apply to Category A1
(2013-2014 only), Category A2 teachers, Category A
principals, and Category B educators)
28
29. SB229 Legislative Update
29
(a) One factor shall be student academic growth, which shall
account for thirty-five per cent (struck from fifty per cent) of each evaluation.
A school district may attribute an additional percentage to the
academic growth factor, not to exceed fifteen per cent of each
evaluation. However, a school district may instead attribute that
additional percentage to any of the factors set forth in
division (A)(1)(b) of this section.
(b) The remainder of each evaluation may include a combination
of the following factors:
(i) Formal observations and reviews as required by division (A)(3)
of this section;
(ii) Student surveys;
(iii) Any other factors the board determines necessary and
appropriate.
SB229 Proposed Changes to Current O.R.C. 3319.111
30. SB229 Legislative Update Cont.
2013-2014
Any proposed changes to ORC3319.111 will not alter
the 2013-2014 OTES or OPES framework and
requirements. The proposed effective date for SB229 is
July 1, 2014.
Disclaimer
Information presented in this session is what is available
today. Anything can change this afternoon or
tomorrow. New and/or revised legislative mandates
can be proposed and passed, even after a
requirement has been implemented.
To check the status of SB229:
http://www.legislature.state.oh.us/bills.cfm?ID=130_SB_229
30
31. How much will our LEA attribute to
Teacher-Level Value-Added Data?
2013-2014
O.R.C. 3319.111, O.R.C. 3319.112
A1. Teacher Instructs Value-Added Subjects Exclusively:
• Teacher Value-Added: 26-50%
• LEA Measures: 0-24%
A2. Teacher Instructs Value-Added Subjects, but Not Exclusively:
• Teacher-Level Value-Added: proportional to teaching schedule, 10-50%
• LEA Measures: proportional to teaching schedule, 0-40%
31
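The 2013-2014 splits above lend themselves to a quick consistency check when an LEA sets its defaults. This sketch uses the ranges from the slide; the function names and the whole-percent assumption are mine:

```python
# Consistency checks for the 2013-2014 Category A default splits.
# The total SGM component is fixed at 50% of the evaluation.

def valid_a1_split_2013_14(va_pct: int, lea_pct: int) -> bool:
    """A1: teacher Value-Added 26-50%, LEA measures 0-24%, total 50%."""
    return 26 <= va_pct <= 50 and 0 <= lea_pct <= 24 and va_pct + lea_pct == 50

def valid_a2_split_2013_14(va_pct: int, lea_pct: int) -> bool:
    """A2: Value-Added 10-50% (proportional), LEA 0-40%, total 50%."""
    return 10 <= va_pct <= 50 and 0 <= lea_pct <= 40 and va_pct + lea_pct == 50
```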
32. How much will our LEA attribute to
Teacher-Level Value-Added Data?
2014-2015
O.R.C. 3319.111, O.R.C. 3319.112
A1. Teacher Instructs Value-Added Subjects Exclusively:
• Teacher Value-Added: 50%
A2. Teacher Instructs Value-Added Subjects, but Not Exclusively:
• Teacher-Level Value-Added: proportional to teaching schedule, 10-50%
• LEA Measures: proportional to teaching schedule, 0-40%
32
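One plausible reading of "proportional to teaching schedule" for A2 teachers is to scale the 50% SGM weight by the share of the schedule spent in Value-Added courses, clamped to the 10-50% range on the slide. The rounding and clamping below are assumptions for illustration, not ODE policy:

```python
# Hypothetical computation of an A2 teacher's Value-Added percentage
# from the fraction of the schedule spent in VA courses (0.0-1.0).

def a2_va_percentage(va_schedule_fraction: float) -> int:
    raw = round(50 * va_schedule_fraction)  # share of the 50% SGM weight
    return max(10, min(50, raw))            # clamp to the slide's 10-50% range
```

A teacher with half a VA schedule would land at 25%, with the remainder covered by LEA measures.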
33. How much will our LEA attribute to
Building-Level Value-Added Data
for Principals?
33
34. How much will our LEA attribute to
the assessments from the ODE
Approved Vendor List?
B: Approved Vendor Assessment data available:
• Vendor Assessment: 10-50%
• LEA Measures: 0-40%
C: No teacher-level Value-Added or Approved Vendor Assessment data available:
• LEA Measures: 50%
34
35. Category B:
Special Considerations
• How many years has the
assessment(s) been administered?
• Is there trend data to analyze?
• Are there variations in the number of
vendor assessments available by
course and/or grade level?
35
36. What LEA measures will be
used for teachers?
1. Student Learning Objective (SLO) process for
measures that are specific to relevant subject matter.
Measures must be district-approved and may
include:
• Other vendor assessments not on the ODE Approved List
• Career Technical Education assessments
• Locally developed assessments
• Performance-based assessments
• Portfolios.
2. Teacher Category A2 (with Value-Added) also may
use Vendor assessments as an LEA-determined
measure proportionate to the teacher’s schedule for
non-Value-Added courses/subjects.
3. Shared attribution
36
37. What LEA Measures will be
used for principals?
1. An average of all teachers' student growth ratings in the
building
2. Building-Based Student Learning Objectives (SLOs) process
for using measures that are specific to relevant building
goals and priorities and aligned with Ohio Improvement
Process. Measures for SLOs must be district-approved and
may include both direct and indirect measures such as:
• Student achievement trends
• Locally developed assessments
• Progress on school improvement plans
• Student course taking patterns, e.g. more students taking
advanced courses, PSEO, etc.
3. Shared Attribution
37
38. What is Shared Attribution for
Teachers?
• Shared attribution is a collective measure.
• The LEA determines which measure of
shared attribution it would like to use.
• Shared attribution could be:
• A building or district value-added score
• Recommended if available
• Building team composite value-added score
(e.g. the 5th grade VAM score or the middle
school ELA team’s combined VAM score)
• Building-level or district-level SLOs
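The pool-size reasoning behind recommending a building or district value-added score can be shown with a toy calculation. This sketch is illustrative only: the simple mean and the sample ratings are assumptions for the example, not ODE's actual composite method.

```python
# Toy illustration (NOT ODE's method): averaging growth ratings (1-5 scale)
# shows how pool size changes one teacher's influence on a shared score.

def composite(scores):
    """Simple mean of a group's growth ratings."""
    return sum(scores) / len(scores)

team = [4, 4, 1]                   # 3-teacher grade-level team, one low rating
building = [4, 4, 1] + [4] * 17   # same three teachers in a 20-teacher building

print(round(composite(team), 2))      # 3.0
print(round(composite(building), 2))  # 3.85
```

The low rating pulls the three-person composite down a full point but barely moves the building-wide one, which is one reason the larger pool is the recommended choice when available.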
39. What is Shared Attribution for
Principals?
• Shared attribution is a collective measure.
• The LEA determines which measure of
shared attribution it would like to use.
• Shared attribution could be:
• District Value-Added is recommended if
available
• Groups of schools (such as grade level
buildings or regional areas within a district) may
utilize an average Value-Added score
• District-based SLOs
40. What Default Percentages will
your LEA Set for 2013-14?
[Chart: eTPES entry screen for teacher default student growth measure percentages]
*For Category A, teachers with Value-Added may also include
ODE-Approved Vendor Assessment data in the LEA Measures.
41. What Default Percentages will your
LEA Set for 2014-15?
*This information may appear differently in eTPES 2014-2015.
Educator Category | Value-Added | Vendor Assessment | LEA Measures (SLOs/Other*, Shared Attribution) | Total = 50%
• A1 (Value-Added, exclusive): Value-Added 50%; Total = 50%
• A2 (Value-Added, non-exclusive): Value-Added 10% or greater; remaining % may be split among SLOs and Shared Attribution; Total = 50%
• B (Approved Vendor Assessment): Vendor Assessment 10% or greater; remaining % may be split among SLOs and Shared Attribution; Total = 50%
• C (LEA Measures): the full 50% may be split among SLOs and Shared Attribution; Total = 50%
*For Category A2, teachers with Value-Added may also include
ODE-Approved Vendor Assessment data in the LEA Measures.
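The table's constraints lend themselves to a quick consistency check. The helper below is hypothetical (not an ODE or eTPES tool); the function name and rule encoding are assumptions based on the 2014-15 defaults above: each category's components must total 50%, A1 must be entirely Value-Added, and A2 and B carry a 10% floor.

```python
# Hypothetical checker for a district's 2014-15 default SGM percentages.
# Encodes the rules shown in the slide's table; not an official validation.

def validate_sgm_defaults(category, value_added=0, vendor=0, slo=0, shared=0):
    """Return a list of rule violations for a proposed percentage split."""
    problems = []
    total = value_added + vendor + slo + shared
    if total != 50:
        problems.append(f"components total {total}%, must equal 50%")
    if category == "A1" and value_added != 50:
        problems.append("A1: all 50% must come from teacher Value-Added")
    if category == "A2" and value_added < 10:
        problems.append("A2: teacher Value-Added must be 10% or greater")
    if category == "B" and vendor < 10:
        problems.append("B: vendor assessment must be 10% or greater")
    return problems

print(validate_sgm_defaults("A1", value_added=50))                     # []
print(validate_sgm_defaults("A2", value_added=30, slo=10, shared=10))  # []
print(validate_sgm_defaults("B", vendor=5, slo=45))                    # one violation
```

A district could run a sketch like this over its planned defaults for each category before entering them in eTPES.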
42. Special Considerations
If a district decides to allow variation
from the default percentages, it must
make manual adjustments within eTPES.
• Districts should be as consistent as
possible when setting percentages.
• Percentages should not be set by
individual teachers or based on
individuals’ past results.
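For Category A2 teachers the adjustment is arithmetic: the Value-Added share scales with the portion of the schedule spent in Value-Added courses. The helper below is a hedged sketch of one way a district might compute it; the rounding and clamping choices are assumptions for illustration, not ODE rules, and any actual entry still happens manually in eTPES.

```python
# Hypothetical helper: scale the 50% SGM share by the fraction of a teacher's
# schedule in Value-Added courses, clamped to the 10-50% range for Category A2.

def a2_value_added_percent(va_periods, total_periods):
    share = 50 * va_periods / total_periods  # proportional share of the 50%
    return max(10, min(50, round(share)))    # keep within the A2 10-50% band

# A teacher with 3 of 6 periods in Value-Added courses:
print(a2_value_added_percent(3, 6))   # 25
# A teacher with only 1 of 10 periods still gets the 10% minimum:
print(a2_value_added_percent(1, 10))  # 10
```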
43. What Default Percentages
Will Your LEA Set for Principals?
[Chart: eTPES entry screen for principal default student growth measure percentages]
*For Category A principals, this could also include ODE-Approved
Vendor Assessment data or the average of all teachers’ growth ratings.
44. Determine how the LEA will implement
the local measures process.
• Will shared attribution measures be used?
• Who is required to create SLOs?
• Within the guidelines of 2-4 SLOs, how many
SLOs are required for each teacher?
• Who will be approving the SLOs?
• How will SLOs be tracked, through revisions,
and to final approval?
• What guidance, training, and support will be
provided to teachers and evaluators?
45. Will Shared Attribution
Measures be Used?
• What shared attribution measures are
we using?
• Have we secured the proper reports?
• Will the same shared attribution
measures be used for all educators
within each SGM category?
Note: Only one shared attribution
measure may be used per educator.
46. Will SLOs be Used?
• Who is required to create SLOs?
Which categories of teachers will have LEA measures?
Did we select SLOs as an LEA measure?
Which SGM categories will this include?
• Within the guidelines of 2-4 SLOs, how many SLOs
are required for each teacher?
• What assessments will be used?
Refer to the LEA’s “Available Assessments Inventory”
If assessments do not exist for certain grade level(s)
and/or courses, have the “SLO Guidelines for Selecting
Assessments” been followed?
Will the LEA create a district-approved list of SLO
assessments?
47. SLO Approval
• Who is approving SLOs in our LEA?
LEAs are responsible for SLO approval.
ODE recommends that this process be
completed by a committee or committees.
• Has SLO calibration been completed?
SLO calibration is the process of
ensuring a thorough and fair review of
all SLOs by systematically requiring high
quality and rigor across SLOs.
48. SLO Procedures
•How will SLOs be tracked?
Submission
Revisions
Final Approval
•What guidance, training, and
support will be provided to
teachers and evaluators?
49. SLO Tracking Form
Columns: Teacher Name | SLO Event | Date Completed
SLO Events:
• Original SLO Submission
• Committee Feedback Provided to Teacher
• SLO Approval
• Midpoint Check-In (recommended, not required)
• SLO End-of-Interval Scoring and Conference
• Final SLO Score Entered in eTPES
50. SLO Professional Development Form
(Example)
Columns: ELA | Math | Science | Soc. St. | P.E. | Art | Music | Other (__________) | Other (__________)
Rows: Grade Level K, 1, 2, 3, 4, 5, 6, 7
51. Step Three:
Designing a Local SGM Plan
Communicate expectations and
refine the entire process.
Design communication plans, training, and
professional development opportunities
around requirements and implementation
for teachers and their evaluators.
52. SGM Professional Development Form
(Example)
Columns:
• Date
• Agenda Items
• Target Audience (identify which teachers will attend the training)
• Follow-Up Training Date (if applicable)
• Follow-Up Training Agenda Items
• Target Audience (identify which teachers will attend the training)
53. Additional Session Resources
To access session resources, including
the step-by-step workbook and templates
on how to design and improve your own
LEA SGM Plan, go to:
http://bit.ly/SGMPlan
54. Works Cited
• Ohio Department of Education. (2013, May 12). Steps for Designing a Local Student Growth Measures Plan. Retrieved
from Ohio Department of Education: http://education.ohio.gov/Topics/Teaching/Educator-Evaluation-System/Ohio-s-
Teacher-Evaluation-System/Student-Groth-Measures/Additional-Information/Steps-for-Designing-a-Local-Student-
Growth-Measure
• LaWriter Ohio Laws and Rules. (2013, March 22). 3319.111 Applicability of section; evaluating teachers on limited
contracts. Retrieved from LaWriter Ohio Laws and Rules: http://codes.ohio.gov/orc/3319.111
• LaWriter Ohio Laws and Rules. (2013, September 29). 3319.112 [Effective 9/29/2013] Standards-based state framework
for the evaluation of teachers. Retrieved from LaWriter Ohio Laws and Rules: http://codes.ohio.gov/orc/3319.112v2
• Ohio Department of Education. (2013, July 26). Approved Vendor Assessments. Retrieved from Ohio Department of
Education: http://education.ohio.gov/Topics/Teaching/Educator-Evaluation-System/Ohio-s-Teacher-Evaluation-
System/Student-Growth-Measures/Approved-List-of-Assessments#approved
• Ohio Department of Education. (2013, September 13). Local Measures. Retrieved from Ohio Department of Education:
http://education.ohio.gov/getattachment/Topics/Academic-Content-Standards/New-Learning-Standards/Student-
Learning-Objective-Examples/041113-Guidance_on_Selecting_Assessments_for_SLOs.pdf.aspx
• Ohio Department of Education. (2013, September 13). Local Measures. Retrieved from Ohio Department of Education:
http://education.ohio.gov/getattachment/Topics/Academic-Content-Standards/New-Learning-Standards/Student-
Learning-Objective-Examples/112912-SLO-Requirements-and-Recommendations.pdf.aspx
• Ohio Department of Education. (2013, September 25). Student Growth Measures for Teachers. Retrieved from Ohio
Department of Education: http://education.ohio.gov/getattachment/Topics/Teaching/Educator-Evaluation-System/Ohio-
s-Teacher-Evaluation-System/Student-Growth-Measures/091913_Business-rules-for-SGM-FINAL-040913-3.pdf.aspx
• Ohio Department of Education. (2013, September 25). Student Growth Measures for Teachers. Retrieved from Ohio
Department of Education: http://education.ohio.gov/getattachment/Topics/Teaching/Educator-Evaluation-System/Ohio-
s-Teacher-Evaluation-System/Student-Growth-Measures/091913_Combining-the-SGM-scores-one-pager.pdf.aspx
55. Questions?
Improve Your District’s SGM Plan
Steps for Designing a Local SGM Plan
Mindy Schulz
Director of Curriculum, Allen County ESC
mindy.schulz@allencountyesc.org
Website info: www.allencountyesc.org
ODE Contact Information:
SGM@education.ohio.gov
Editor's Notes
Introduce self. Ask how many are implementing OTES this year. Ask how many are administrators, teachers, other. Ask how many have a SGM plan already. Share how we will go through the steps for the LOCAL DECISIONS that need to be made.
Review session outcomes. Explain time and assistance will be provided to work on their own SGM plan.
“Our session format is the same format we use in OTES, OPES, and eTPES training. Our session and time together will be using the format I do, We do, You do.”
“Please remember that the 50% of the evaluation tied to student growth measures must focus on student academic achievement and change between two or more points in time.”
Ask by a show of hands how many people this graphic looks familiar to. Review that a teacher can only be in ONE category & you always start at the top (A1) and work your way down to identify which category the teacher belongs to. Categories are determined by the TYPE of data available for each teacher. Give BROAD overview of each category: A1 = exclusively instructs value added courses; A2 = teachers with value added data and other student growth measures; B = teachers in non-tested grades & subjects with vendor assessment growth measures; C = teachers in non-tested grades and subjects w/o comparable vendor assessments.
Same rule for principals: only be in one category; start at the top & work your way down. Give BROAD overview: A = principals w/value added data & other student growth measures; B = principals w/only non-tested grades, but who have vendor assessment growth measures; C = principals in bldgs. w/non-tested grades & subjects w/o comparable vendor assessments
Acknowledge & thank Kathy for sharing definition, editing & providing feedback on PPT.
Reference this info is on ODE’s website. This session will elaborate on each step.
This is the 1st broad step in designing a local SGM plan.
Principals and assistant principals are also required to be evaluated using the state framework, which is comparable to the teacher evaluation framework. ODE business rules do provide guidance regarding those administrators who do not have the title of principal or assistant principal, but function as principals or assistant principals. Review Business Rules: Are directors/supervisors of Education Service Centers/Joint Vocational Schools and assistant principals evaluated under the Ohio Principal Evaluation System (OPES)? Administrative titles vary greatly within the state. Thus, to answer the above question, focus not on the administrative title, but rather on the role and alignment (if any) to the Principal Performance Rating Rubric, which is based on the Ohio Standards for Principals. When making determinations regarding an administrator’s requirement to be evaluated under the principal evaluation system, answer these important questions: Is the administrator serving as an instructional leader? Do the duties of the administrator fall into at least two of the five principal standards? Does the administrator evaluate multiple staff members? Only those administrators meeting all three criteria above would be evaluated under the principal evaluation system, including student growth measures. Administrators who would not be required to be evaluated under the principal evaluation system include those who do not fit all of these considerations above. These administrators may have limited contact with teachers and/or students; have narrowly defined roles and administrative responsibilities that do not directly relate to the principal standards; do not provide instructional leadership; nor do they evaluate multiple teachers.
The next step after inventorying teachers required to be evaluated under the new system, is to categorize them. Review criteria for each.
Explain that a teacher has to have the value added report “in hand” in order to be a Category A teacher. Value added is determined by the schedule the teacher instructed the prior year. Review the timeline: OAAs are administered in spring; teacher-level VA reports are released in fall and uploaded into eTPES 2nd semester; the report from the previous year follows the teacher to the current year’s evaluation.
Review slide contents.
This chart is useful in determining the category of teacher. It is important to note again that any teacher who received a value-added report this fall is either a Category A1 or A2. What makes the difference? As you can see from the chart, only those teachers who taught value added courses exclusively last school year and who are teaching value added courses exclusively this year are considered A1 teachers. ALL + ALL = A1, and these teachers will have a minimum of 26% of the evaluation derived from the individual value-added rating. All other teachers with individual value-added reports are Category A2. If the value-added report the teacher received this fall was based on a schedule of exclusive value-added teaching last year, then that teacher needs to have a minimum of 26% of the evaluation derived from the individual value-added rating this year.
All other A2 teachers need to have the student growth measure be proportionate to their schedules. This is determined at the local level. The minimum percentage of value added is 10% for these teachers.
Principals in the VA building for the 4th yr. will use the Ohio Report Card 3-year composite. The score is loaded automatically for each principal. Currently, building-level Value-Added data is based on a 3-yr. composite and is automatically loaded into eTPES. Since the 3-year composite may not be accurate for each principal due to not serving in a Value-Added building for the past three years, there is a hierarchy of data that we will discuss.
Here, we continue in the possibilities of # of yrs. principals could serve in buildings in which the 3-yr. avg. might need to be re-calculated. On this chart, notice it could be the principal’s 1st yr. in the building, but they could have previous principal experience in another district, in a building with VA data. Let’s review how the local district might approach this situation……REVIEW SLIDE CONTENTS.
Here, we continue in the possibilities of # of yrs. principals could serve in buildings in which the 3-yr. avg. might need to be re-calculated. On this chart, notice it could be the principal’s 1st yr. in the building, but they could have previous principal experience in another district, in a building with VA data. Let’s review how the local district might approach this situation……REVIEW SLIDE CONTENTS.
Just like teachers that instructed in value added courses in the previous year, but may not be teaching any value added courses in the current year…..Principals may also have been in buildings with value added data in the previous year, and be assigned to buildings with no value-added data for the current year. The info on the slide explains the decision process that will need to occur for principals in this scenario.
ORC 3319.111 requires ODE to develop a list of student assessments that may measure mastery of the course content for which the VA measure doesn’t apply. The list will be maintained & updated by ODE. Ask how many have seen the ODE-approved vendor assessment list? List is on the ODE website. Ask how many are familiar with the checklist for selecting assessments. This is also on the ODE website. Emphasize it is the LEA’s responsibility to contact the vendor to ensure the way in which they are using the assessment meets the criteria for student growth. Also, it is their responsibility to secure the growth reports. ODE has posted a 2013-2014 vendor contact information list on their website.
All other teachers and principals will be Category C: those teachers in non-tested grades and subjects without comparable vendor assessments or value-added data, and the principals of those buildings.
Once you have completed an inventory of the 3 groups (those with value added data; those w/o value added data, but with ODE approved vendor assessments; & those with neither value added nor vendor assessment data), you will need to determine if any teachers are new to a value added assignment for the current year. Review the examples on the screen. Remind participants that even though the teacher is in a value added course(s), they are not considered a value added teacher until they have the report in their hand. For situations where a value added report does not exist, LEAs will need to determine if the teacher is Cat. B or Cat. C for the current school year.
Review slide contents.
This chart is useful in determining the category of teacher. The chart explains Category B and Category C teachers.
Just like teachers new to value added, principals can also be new with no prior experience as a principal. This chart is useful in determining the category of principal. The chart explains Category B and Category C principals.
This is the 2nd broad step in designing a local SGM plan.
Once step one is completed, “conducting an inventory of needs and resources”, it is time to determine the LEA’s default percentages. The following are ?s to consider (REVIEW SLIDE CONTENTS). Remind participants to stay within the SGM framework, which will be discussed on the following slides.
As many of you might have heard, there is PROPOSED legislation that could POTENTIALLY have a future impact on the educator evaluation model. While this is NOT the focus of today’s session, it is being shared in an effort to keep you informed in the event this is passed and you deem revisions to your SGM plan are needed.
REVIEW SLIDE CONTENTS. Emphasize: ANYTHING CAN CHANGE AT ANY TIME……Stay Tuned!
“Starting in 2014-15, the value-added report must count as 50 percent of the evaluation of an A1 teacher. Again, so if I receive a value-added report for all of the courses I teach, I am considered an A1 teacher and, starting in 2014-15, 50 percent of my evaluation must be based upon those value-added results.” “The allocation of measures and weights will not change for Category A2 teachers between 2013-14 and 2014-15. In 2014-15 and beyond, the weighting of student growth measures must be proportional to the teaching schedule.
“For Category A Principals, there would be no A1 principals, since there would be teachers in the building in non-tested grades and subjects. This graphic depicts the decisions to be made regarding student growth measures for principal evaluation.”
“Category B teachers and principals are those with data from assessments on the ODE-approved vendor assessment list. You may be using wonderful vendor assessments, but only those approved are considered Category B. If you have Category B assessments given in the manner that the vendor states will provide a student growth measure, then you must use that data as part of the evaluation. (A point of note for the facilitator: If the assessment is not on the approved list, it can be used within an SLO as long as it is valid and reliable for the SLO.)” “The local board of education will need to determine the percentage of the vendor assessment to be used within the evaluation system. The local board of education will make a decision on this for all Category B teachers. This default percentage for the district will be consistent for all Category B teachers. There may be circumstances where this percentage varies; if it does, it should be for valid reasons.” “Here’s a quick question to make sure we’re on the same page: If a teacher has both value-added and approved vendor assessments, is he/she a Category B teacher? (No, the teacher has a value-added report, so the teacher is Category A. The vendor assessment data can be used as part of the local measures, but does not have to be used in this case.)” “Teachers who fall into Category B typically do not have value-added scores, but they do have an approved vendor assessment associated with their classes that can be used to measure student growth.” Note: URM data from SOAR districts will fall into Category B
Share about Bluffton using STAR Early Literacy for 13 years (began in 2000 with STAR Reading & upgraded as STAR had enhancements). Compare them with Spencerville, which just purchased STAR Early Literacy last year. When determining how much weight to place on a vendor assessment, it is important to note the length of time the assessment has been in place & the amount of trend data available; this establishes a greater level of confidence for determining the % to apply to vendor assessments. Bluffton placed a higher % on their vendor assessments because they have 13 yrs. of trend data to analyze.
What exactly are local measures? Local measures for teachers differ slightly from local measures for principals. REVIEW SLIDE CONTENTS.
REVIEW SLIDE CONTENTS.
Shared attribution is a local measure that is attributed to a GROUP OF TEACHERS. “We have named the three types of LEA measures and discussed possible weights of these measures. However, let’s take a minute now to further define shared attribution, which is a local measure. Shared attribution is a collective measure of student growth. If the LEA elects to use shared attribution, the LEA must choose which measure of shared attribution it would like to use. Shared attribution can be one of three things: (1) a building or district value-added score, which is the recommended shared attribution measure; (2) a building team composite value-added score (e.g. the 5th grade VAM score or the middle school ELA team’s combined VAM score); or (3) building-level or district-level SLOs. ODE recommends that, if a district opts to include shared attribution, it use a building or district value-added score. One reason to consider a building or district value-added score is that it expands the collective pool: the smaller the pool of teachers who contribute to the value-added score, the larger the effect one teacher could have on the composite value-added score. For example, think about a new teacher who might not be as effective as a veteran during that first year of teaching but is still doing the best she or he can. In addition, using building or district value-added scores rather than team value-added scores could create a team atmosphere as opposed to a competitive atmosphere. Like individual student growth measures, measures of shared attribution would produce a growth score of 1 to 5 that would later be included in the calculation of the teacher’s summative score.”
Like shared attribution for teachers, shared attribution for principals is also a collective measure of student growth. If the LEA elects to use shared attribution, the LEA must choose which measure of shared attribution it would like to use. Shared attribution for principals can be one of three things: a district value-added rating, which is the recommended shared attribution measure; a composite Value-Added rating for groups of schools (such as grade-level buildings or regional areas within a district); or district-based SLOs.
Here is a screen shot of the teacher default student growth measures portion of eTPES. This is provided so you can make decisions.
This is an example for 2014-2015. Keep in mind for 14-15 Cat. A1 requires all 50% of SGM to be based on teacher-level value added.
Once the LEA sets their default %s, and categories are assigned, it might be necessary to allow for variations. If there are reasons for varying the percentages within a category, the superintendent may do that for the principal. The principal may do that for the teacher. For example, for category A2 teachers, there is one default percentage for the entire district. There may be varying schedules per teacher and the principal may enter the appropriate percentages based upon the teacher’s schedule so that the percentage is proportionate to the teacher’s schedule. Refer participants to “Combining the Student Growth Measures in the Educator Evaluation Systems” on the ODE website.
Here is a screen shot of the principal default student growth measures portion of eTPES. This is provided so you can make decisions.
Once the default %s have been set, it is time to determine how the LEA will implement the local measures process. REVIEW SLIDE CONTENTS.
Remind participants of shared attribution measures covered previously (bldg. or district VA; bldg. team composite; bldg. level or district level SLOs). REVIEW SLIDE CONTENTS.
Important to let teachers know if their category is required to write SLOs. Teachers will need to know which subject areas/courses they are writing SLOs for. SLO development begins in the fall. Several ODE guidance docs are available on the ODE website. SLO FAQs: ODE requires a minimum of 2 SLOs & recommends no more than 4, which should be representative of your schedule & student population. This guideline applies to both Cat. B and A2 teachers if the LEA has determined SLOs will be used as local measures.
AIR Module 6 focuses on calibration & is available on the ODE website. Share that calibration is threaded throughout the entire OTES & OPES systems: credentialing training requires calibration on the performance rating rubrics; emphasize the need to calibrate on any type of rubric, including rubrics used in SLOs for assessments, etc.
SLO approval is not a “once & done”. It will require ongoing feedback between the SLO approval committee and teachers.
Here is an example of how a district might track SLOs throughout the approval and scoring process.
Another example of how a district might plan for SLO PD.
This is the 3rd broad step in designing a local SGM plan. Keep in mind, the educator evaluation systems will require ONGOING SUPPORT and PD.
Again, sample form for identifying professional development and support.
Explain downloadable, electronic templates of all of the inventory steps are available here, as well as a “How to Design a Local SGM Plan” workbook and today’s PPT.
These are the resources used, mostly ODE docs that are available on ODE’s website.
Review contact info. Ask if any participants have questions. Thank them for attending.