This document provides an overview of Research in Action, Inc., which provides consulting, training, and evaluation services to educational organizations. It summarizes some of the company's recent projects, including:
- Developing accountability systems, training platforms, and evaluation solutions for state departments of education and organizations like the Bureau of Indian Education.
- Providing consultation, such as designing accountability metrics, developing assessment systems, and preparing evidence for peer reviews.
- Creating online training through their Homeroom platform and conducting in-person training sessions on topics like student learning objectives and assessment literacy.
- Partnering with a range of clients over the past decade including most state departments of education.
The document provides guidance on developing high-quality Student Learning Objectives (SLOs) through a multi-step process. It outlines the three phases of designing, building, and reviewing SLOs. Key steps include composing a goal statement and targeted content standards, creating a blueprint, completing the SLO form, and conducting a quality assurance review to ensure the SLO is complete, comprehensive, and coherent. The goal is to design SLOs that accurately measure student achievement and growth to guide instruction.
This document provides an overview of Pennsylvania's Student Learning Objective (SLO) process and template. The SLO template is used to identify goals, indicators, and performance measures for teacher effectiveness evaluations. It includes sections for classroom context, the SLO goal, and performance indicators. The goal should be based on key standards and provide a rationale. Performance indicators specify measurable targets for student achievement on valid assessments. The template provides structure and guidance for teachers to set rigorous and meaningful objectives.
This document provides guidance on building student learning objectives (SLOs) and student support objectives (SSOs) for use in an educator effectiveness system. It reviews the requirements for completing the SLO and SSO forms and guides participants through creating an SLO/SSO, including selecting applicable performance measures. Key sections of the forms are explained, such as context/setting, goal, and objectives. Examples are provided for different content areas and support services. The overall aim is to help educators build technically rigorous SLOs/SSOs to guide instruction and determine student growth.
Introduction to SLO Training - Steps 1 & 2 (emilycaryn)
This document provides information about Student Learning Objectives (SLOs). It defines an SLO as a process used to measure educator effectiveness based on student achievement of content standards. The document outlines the history of SLO development in Pennsylvania and describes the three-part SLO process of design, build, and review. It also presents the SLO template and explains how to complete each section, including setting the goal statement, identifying relevant standards, and providing classroom context. Sample content is provided for an art teacher's SLO. The document guides educators through collaboratively developing an initial SLO design using student data and appropriate standards.
This document provides an overview of Pennsylvania's Student Learning Objective (SLO) process for measuring teacher effectiveness. It reviews the SLO concept, terminology, design, criteria, and template. The SLO process requires teachers to identify goals based on content standards, select performance measures to assess student achievement of those goals, and establish performance indicators and expectations. The SLO template guides teachers through documenting this process in six sections: classroom context, SLO goal and standards, performance measures, growth targets, analysis of student results, and evaluation.
This document provides an overview of a presentation on student learning objectives (SLOs). It discusses the key elements of an SLO template, including setting goals based on standards, identifying assessments and performance indicators, and setting teacher expectations. Participants worked in groups to populate sections of an SLO template based on these elements. The purpose of SLOs is to positively influence teacher effectiveness ratings by setting clear goals for student growth and achievement.
This document provides information about student learning objectives (SLOs) for teachers. It defines SLOs as a process to measure student achievement and educator effectiveness based on content standards. It states that all teachers create SLOs for their specific classes. The document also provides examples of well-written goal statements for SLOs in different subject areas and links to resources on SLOs, standards, and the PA-ETEP website for submitting SLOs. The deadline to submit SLOs via PA-ETEP is November 6.
This document provides information about Student Learning Objectives (SLOs) and their role in teacher evaluations under the Performance Evaluation Reform Act (PERA) in Illinois. It explains that PERA requires teacher evaluations to include both measures of teaching practice and student growth. Districts can choose to measure student growth using SLOs, which are academic goals that teachers set for their students at the start of a course. The document outlines the SLO process and requirements, such as selecting appropriate assessments and setting growth expectations. It also addresses common questions about implementing SLOs and using them for teacher evaluations.
The document discusses evaluation of instructional programs, outlining Kirkpatrick's four-level model of evaluation that assesses reaction, learning, behavior, and results. It provides details on each level, including example evaluation questions and methods to measure outcomes. The goal is to help instructors systematically evaluate their programs to improve learning and impact through assessment at each stage of the model.
The document discusses the analysis phase of instructional design, which involves understanding learners, contexts, tasks and needs to design effective instruction. It describes analyzing learners' skills, knowledge and motivation, as well as the teaching environment and tools. The analysis phase aims to understand the current and desired states to identify gaps and inform the design of learning activities, content and assessments.
This document provides information about student learning objectives (SLOs) to teachers at Loyalsock Township Middle School. It explains that SLOs are academic goals set by teachers for groups of students to be achieved by January 2015. The document reviews questions teachers had previously about developing SLOs and addresses how to make them specific, measurable, attainable and aligned to standards. It provides an example of an SLO and outlines next steps, which include further training in September and completing section 1 of the SLO template.
This presentation is based on the tuning process in education. The module was presented in a training for university faculty. It explains how to apply tuning at the course, degree, and programme levels.
The Systematic Design of Instruction (Dick and Carey) (Cathy Cousear)
This document outlines the key principles and steps of instructional design according to Dick and Carey's systematic design model. It discusses establishing instructional goals, analyzing the learning context, writing objectives and criterion-referenced tests, developing instructional strategies and materials, and conducting formative evaluations to revise instruction. The goal is to design effective instruction by thoroughly analyzing learning needs and contexts, developing appropriate objectives and assessments, selecting optimal instructional approaches, and refining the design through evaluation and revision.
This document discusses the development of rubrics and assessments for geoscience education materials. It begins by outlining the key components of the Materials Design Rubric, including learning goals and outcomes, assessments and measurements, resources and materials, instructional strategies, and alignment. It then focuses on defining learning outcomes and assessments. Learning outcomes should be measurable and address cognitive, affective, and behavioral domains. Assessments can be formative or summative and should measure the stated learning goals. The document discusses scoring rubrics as tools for assessment and provides examples of holistic and analytic rubric designs. The overall purpose is to help educators design effective assessments and rubrics to evaluate student learning and the quality of educational materials.
Curriculum Construction Sem I: Evaluation Models (Raj Kumar)
The document discusses various approaches to curriculum evaluation including goal-based, goal-free, responsive, decision-making, and accreditation approaches. It then examines several models of curriculum evaluation, including Tyler's objectives-based model, Stake's responsive model focusing on antecedents, transactions, and outcomes, and Stufflebeam's CIPP model evaluating context, inputs, processes, and products. The models provide different conceptual frameworks for designing curriculum evaluations.
An Introduction To The Dick & Carey Instructional Design Model (Larry Weas)
The nine basic steps (excluding Summative Evaluation) represent a set of procedures referred to as the systems approach: the model is made up of interacting components, each with its own input and output, that together produce predetermined products, consistent with the ADDIE process.
This document discusses different types of online assessment used in education. It identifies formative assessment as used early in instruction to monitor student learning and provide feedback to improve teaching. Summative assessment measures how well learning outcomes are achieved at the end of instruction. Other assessment types discussed include diagnostic pre-assessment, confirmative assessment to check ongoing success, norm-referenced assessment comparing to averages, and criterion-referenced assessment against predetermined standards. The document also lists some popular online assessment tools for teachers.
Student Learning Objectives, Mississippi Department of Education, Research in Action, Educator Effectiveness, Assessment Literacy, Assessment, Teacher Effectiveness, Policy
Designing and conducting formative evaluations (Larry Cobb)
This presentation discusses formative evaluations and how to design and conduct them. Formative evaluations are used during instructional design to improve effectiveness and identify problems. They involve subject matter experts, learning specialists, and learners. One-to-one and small group evaluations identify obvious errors. Field trials test the instruction in its intended context. Data is collected to evaluate clarity, impact, feasibility, and whether objectives are met. Concerns that influence formative evaluations include the context, learners, and outcomes. Problem solving is used to improve weak units based on evaluation data.
The document describes Loyola University Maryland's process for assessing student learning at the program level. It discusses the university's institutional context, student learning assessment committee, assessment reporting process, and the rubric used to rate program-level assessment reports. Programs receive feedback on how their assessment reports were rated in order to continuously improve the assessment of student learning.
Designing and conducting formative evaluations (cw8842)
Formative evaluation is used to improve instructional materials during the development process and has three stages: 1) one-to-one evaluation identifies obvious errors and gets initial learner feedback; 2) small group evaluation determines if changes address problems and how learners use unguided materials; 3) field trials assess if changes are effective in the intended context. The evaluation should ask questions about each material component's appropriateness, clarity, motivation, and efficiency. Specialists provide feedback on accuracy, learning enhancement, and appropriateness for outcomes.
The document describes the Dick and Carey instructional design model and its application in an online environmental science simulation called Shell Island. The Dick and Carey model is a traditional, performance-based model that breaks instruction down into components. It was applied to the Shell Island simulation to help students learn about identifying stakeholders' perspectives on a coastal development issue through online resources and debate. Educators tested and refined the simulation materials and assessments according to the Dick and Carey model.
Chapter 10: Designing and Conducting Formative Evaluations (cdjhaigler)
Formative evaluation involves iterative testing of instructional materials with target learners to identify issues and improve effectiveness. It proceeds through three main stages - one-to-one testing to identify gross errors; small group testing to assess revisions; and field testing in the intended environment. Feedback is gathered at each stage through methods like interviews and assessments. The goal is to refine the materials and ensure they clearly communicate content to learners in a way that achieves the objectives. Subject matter, learning, and learner experts also review the materials to evaluate accuracy, pedagogy and appropriateness for the audience.
The document discusses improving classroom assessment to meet requirements of continuous and comprehensive evaluation (CCE). It covers several topics related to modern assessment theory including developing assessments aligned to learning objectives, using formative and summative assessments, and locating students on a development continuum. The document also discusses cognitive levels of assessment items based on Bloom's taxonomy, with higher cognitive levels requiring more complex thought processes. Teachers can vary the difficulty of items by adjusting the vocabulary, content familiarity, format, and inclusion of diagrams or other supports.
Designing and conducting formative evaluations (Raheen26)
This document discusses formative evaluation, which involves collecting data during instructional design to improve effectiveness. It covers:
- Formative evaluation designs, and the role of subject matter experts, learning specialists, and learners.
- One-on-one, small group, and field evaluations to identify problems and ensure instructions can be used as intended.
- Evaluating instructional strategies, materials, instructor-led instruction, and the performance context.
- Important concerns like the evaluation context, learners, outcomes, and implementation.
- Using evaluations to solve problems and make decisions about instructional components.
This document provides an overview of assessment and evaluation in course design. It defines assessment as gathering information to make judgments about learner performance compared to standards to determine grades or success. Evaluation gathers information to improve teaching and learning, and can be formative (ongoing) or summative (final). Common assessment methods include tests, assignments, projects and surveys. Kirkpatrick's model outlines four levels of evaluation: reaction, learning, behavior, and results. When developing assessments, considerations include the purpose, objectives, class size, feedback, and using rubrics.
Leadership and Learning with Revisions - Dr. Lisa Bertrand - NFEASJ (William Kritsonis)
NATIONAL FORUM JOURNALS are a group of national and international refereed, blind-reviewed academic journals. NFJ publishes articles on academic intellectual diversity, multicultural issues, management, business, administration, issues focusing on colleges, universities, and schools, all aspects of schooling, special education, counseling and addiction, international issues in education, organizational behavior, theory and development, and much more. DR. WILLIAM ALLAN KRITSONIS has been Editor-in-Chief since 1982. See: www.nationalforum.com
The document summarizes Wisconsin's Educator Effectiveness system which was designed to evaluate teachers and principals using multiple measures of educator practice and student outcomes. It describes the purpose of developing the system to identify and support educator effectiveness. It outlines the key parts of the system including the standards and rubrics used, how student and educator outcomes are measured, how the evaluation process is managed through technology, and the timeline for implementing the system statewide.
Teacher Evaluations and Local Flexibility (David Black)
School Improvement Network conducted a study of 50 state department of education officials responsible for implementing teacher evaluation policy. The goal was to understand state teacher evaluation policy and how much flexibility districts and local schools have to innovate on behalf of their own teachers and students, particularly when it comes to using technology to meet their professional development needs.
Denise Bradby is a senior research associate with over 20 years of experience directing complex projects related to K-12 education. She has specialized in developing course classification systems, analyzing student data, and integrating quantitative and qualitative evaluation methods. Bradby is currently recovering from a double lung and heart transplant. Some of her most notable projects include directing the development of the School Codes for the Exchange of Data taxonomy, evaluating California's Transcript Evaluation Service, and assessing career academies through the ConnectEd initiative.
This document provides guidelines for Arizona's teacher evaluation process based on the state's framework for measuring educator effectiveness. It outlines three main components of the evaluation: teaching performance, student academic progress data, and survey data. It also describes operational definitions, the evaluation process which includes beginning and end of year conferences to set goals and determine performance classifications, as well as examples of weighting the different components. Appendices provide additional resources like evaluation rubrics and forms.
Daniel Mosunich has over 30 years of experience in K-12 education as a teacher, principal, and district administrator in California. He currently serves as the Director of Assessment, Accountability, and Program Development for the Atascadero Unified School District, where he leads the implementation of common core standards, assessment programs, and data-driven processes to improve student outcomes. Mosunich has a passion for strengthening learning for all students, especially traditionally underachieving groups, through vision, professional development, and evidence-based decision making.
This document outlines an induction training for new teachers. It includes an agenda for professional development sessions focused on establishing positive parent relations, analyzing student assessment data, and teaching organizational skills. The training will use activities like roleplaying parent communication, analyzing sample student test data, and creating lesson plans for teaching organizational skills. Formative feedback will be collected through parking lot notes, surveys, and exit slips to inform future professional development.
Volume 16, Number 1 | ASCA
SCHOOL COUNSELORS: CLOSING ACHIEVEMENT GAPS AND WRITING RESULTS REPORTS
Charged with closing the achievement gap for marginalized students, school counselors need to be able to identify gaps, develop interventions, evaluate effectiveness, and share results. This study examined 100 summary results reports submitted by school counselors after having received four days of training on the ASCA National Model. Findings indicate that school counselors were able to identify gaps and develop interventions but needed additional training to evaluate outcomes and report findings.
Charged with closing the achievement gap for underserved and marginalized students, today’s professional school counselors must demonstrate that their school counseling program is making a difference and closing gaps in achievement (American School Counselor Association [ASCA], 2012; Education Trust, 2003). School counselors are aware that not all students have the same resources; therefore, they must analyze data to discover inequities, develop programs or interventions to address these inequities, and measure their results to determine the effectiveness of the programs or interventions (ASCA, 2012). By documenting how the school counseling program is helping to narrow the achievement gap with school counseling interventions, school counselors are moving “from the periphery of the school’s mission to a position where the educational community views [school counselors] as critical to student success” (ASCA, 2005, p. 53).
School counselors must receive training in order to implement data-driven comprehensive school counseling programs (Dimmit, Carey & Hatch, 2007). Wilkerson and Eschbach (2009) found that graduate students in school counseling programs perceived themselves as better prepared to implement the ASCA National Model (ASCA, 2012) after receiving training developed by the Education Trust. School counselors need to receive training on the ASCA National Model just as teachers receive professional development when new concepts are introduced (Dahir, Burnham, & Stone, 2009). Although comprehensive developmental programs were first implemented in the 1970s, the data skills needed to implement a comprehensive program are not taught in all school counselor education programs. Consequently, the need exists for training and opportunities for professional development for practicing school counselors in the understanding and implementation of a comprehensive s ...
Julie Hartline is a school counseling and advisement consultant with Cobb County School District. E-mail: [email protected] (cobbk12.org). Debra C. Cobia is associate dean of the College of Education, University of West Georgia, Carrollton, GA.
This document provides an overview of the institutional effectiveness process used at a university. It discusses identifying outcomes and standards to measure, developing valid measurement tools like rubrics and surveys, and implementing consistent data collection. Key parts of the process include analyzing results, disseminating findings, utilizing the data to suggest necessary changes, and documenting the process. The university uses oversight groups and an annual review to evaluate effectiveness and inform strategic planning and quality improvement initiatives.
WSU District Capacity of a Well-Crafted District-Wide System of Support (WSU Cougars)
The document discusses the importance of leadership and data in building an effective district-wide system of support for student and staff success. It provides several key components of an effective district system including leadership focused on instructional improvement, aligning policies to support improvement goals, providing teacher learning resources, and using data to drive decisions. The "Data Wise" process of using data to improve teaching and learning is described. Districts should set up data systems, create incentives, support new skills, and find time to model data-driven work. High-performing schools frequently monitor learning, have high standards, collaborate, align curriculum and assessments, and involve families and communities. Multiple measures should be used to understand student performance.
This document discusses using data to improve schools and student outcomes. It provides:
1) Nine characteristics of high-performing schools that focus on clear goals, high expectations, leadership, collaboration, aligned curriculum and frequent monitoring.
2) An eight-step process called "Data Wise" for using data to identify problems, examine instruction, develop plans and assess progress.
3) The importance of considering multiple data sources, such as demographics, perceptions, programs and student learning to understand different student experiences.
This document summarizes Jackson State University's re-accreditation process with SACS. It outlines the leadership team and committees overseeing compliance certification and development of a Quality Enhancement Plan. It provides timelines for completing documentation, conducting reviews, and submitting materials to SACS. Emerging themes for the QEP include intercultural awareness, service learning, and information literacy.
The document discusses updates to Washington State's Teacher and Principal Evaluation Project (TPEP). It outlines a timeline for implementation of new evaluation models between 2010-2014. All districts will use the new models starting in 2013-14. The new models will include 4 tiers of evaluation (exemplary, proficient, basic, unsatisfactory). Teacher and principal evaluation criteria are aligned around areas like instruction, data use, culture and community. Student growth data from multiple measures must be incorporated. The document recommends various resources and task forces to help with training and implementation of the new evaluation systems.
This document summarizes research from 34 teaching school alliances on developing alternative approaches to assessment without the use of levels. It describes the priorities that emerged from the research, including developing assessment tools to provide feedback to support individual progress, capture progress, and use technology to track attainment. The report provides examples of strategies and tools developed, and concludes with recommendations for further developing assessment approaches and sharing best practices between schools.
How will I and my students utilize the results of the assessment? (LeiYah4)
The document provides guidance on how teachers can utilize assessment data from various tools to improve student learning and instruction in five key ways:
1. Plan individualized instructional interventions for struggling students based on their strengths and weaknesses.
2. Develop daily instructional strategies like grouping students based on their performance levels.
3. Set targeted goals for students and teachers to guide success.
4. Monitor student and teacher progress regularly to track improvements.
5. Discover professional development needs for teachers based on areas where students see the least growth.
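The grouping idea in item 2 above can be sketched as a small routine. This is only an illustration of the strategy; the cut scores (60 and 80) and group names are hypothetical assumptions, not values taken from the document or any real assessment.

```python
# Sketch of strategy 2: grouping students into instructional groups by
# assessment performance. Cut scores of 60 and 80 are hypothetical; a
# real system would use the assessment's published proficiency levels.

def group_students(scores):
    """Map {student: score} to intervention / on-track / enrichment groups."""
    groups = {"intervention": [], "on_track": [], "enrichment": []}
    for student, score in sorted(scores.items()):
        if score < 60:
            groups["intervention"].append(student)
        elif score < 80:
            groups["on_track"].append(student)
        else:
            groups["enrichment"].append(student)
    return groups

print(group_students({"Ana": 55, "Ben": 72, "Cal": 91}))
```

A teacher could re-run this after each benchmark assessment to see students move between groups, which also supports the progress-monitoring use in item 4.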
STEP Annual Report 2014-2015 - MANTRA's School Transformation and Empowerment... (Anoop Erakkil)
The School Transformation and Empowerment Project (STEP) is an initiative of MANTRA Social Services, Bangalore. Through STEP, we strive to promote and improve the quality of education in schools serving the socioeconomically disadvantaged population of the country.
In the current academic year (2014-2015), MANTRA engaged with 9 schools for the first stage of STEP, conducting a needs assessment and producing a report for clarifying and aligning to each school's purpose.
This report captures our work on the ground to date, highlighting our activities in Year 1 of STEP, our key learnings, and our strategic intent going forward.
The document discusses developing a growth model and data visualization system for a school district. It proposes a three-phase approach: 1) Discovering growth model requirements through interviews and observations, 2) Developing technical and design specifications for data visualization, and 3) Implementing and documenting the system. The methodology for phase 1 involves on-site interviews with district and school leaders, teachers, and parents to understand their needs. Phases 2 and 3 involve designing the data warehouse, dashboards, and visualizations to analyze student performance data and factors affecting learning based on requirements. Training and support will be provided to help users understand and utilize the system.
The document discusses best practices for assessing student learning outcomes at the institutional level. It outlines a six-step process for faculty to work through to identify, prioritize, define, map, measure, and analyze learning outcomes. Key aspects of good assessment include using results to inform decisions, having a focus on important goals, active stakeholder participation, communicating results widely, and ensuring results are used fairly and ethically. Regional accrediting bodies outline five principles for institutions around defining learning missions, documenting student learning against standards, compiling evidence from multiple sources, and involving stakeholders in the assessment process.
The document defines assessment and discusses its purpose and components. Assessment is defined as evaluating student learning and progress. It has several purposes, including informing students of their progress, motivating students, helping students set goals, informing teaching practices, and assigning grades. Assessment includes both formative assessment, which monitors student learning during instruction, and summative assessment, such as final exams. The document also discusses deficiencies in Pakistan's assessment system, such as an over-emphasis on rote learning and exams, and proposes incorporating more formative assessment approaches.
Our company is focused on providing consultative, training, and evaluative services throughout the educational community.

Consulting: We assist in designing solutions to create or refine existing accountability systems. Our experience in both school- and teacher-based effectiveness models provides sustainable alternatives for policy-makers.

Training: Our online training platform, Homeroom, provides a scalable, cost-effective solution to human capital development needs. Homeroom’s versatility tailors school- and teacher-based effectiveness training into actionable results.

Evaluating: We leverage our experience in designing and building school- and teacher-based effectiveness systems into a set of evaluation solutions, including internal auditing, peer reviews, and project evaluations. Our project-based staffing ensures the “right” expert team is customized to address the identified evaluative focus.
In the spring of 2013, the company launched its proprietary learning platform to support the large-scale development of two educator effectiveness projects. In 2014, this learning platform was tailored to create training and marketing solutions for private businesses beyond the education sector.
Consultation (cont.)
St. Tammany Parish School Board (Covington, LA)

2006 - Developed online, curriculum-based assessments in grades 2-8 for use in measuring Louisiana’s content standards.

2004 - Developed the assessment blueprint and specification tables necessary to coordinate item development by the district. Implemented the Company’s quality assurance reviews to ensure content alignment, developmental appropriateness, range of knowledge, and depth of knowledge prior to operational form construction. Created assessments for grades 2-7 in mathematics and reading.
St. John Parish School Board (Reserve, LA)

2004 - Implemented the Company’s quality assurance reviews to ensure content alignment, developmental appropriateness, range of knowledge, and depth of knowledge prior to operational form construction. Created operational assessments for grades 3-8 and conducted scoring activities.
Training (cont.)
Pennsylvania Department of Education (Harrisburg, PA)

2013 - General contractor for all student learning objectives (SLO) development and implementation tasks prior to the statewide launch.

2012 - Created the statewide SLO training series using the Homeroom learning platform. Integrated the SLO process into the greater educator effectiveness system. Created quality criteria for those performance measures used to measure student achievement within the SLO framework. Trained over 650 PA educators, principals, Intermediate Unit (IU) staff, curriculum coordinators, and PDE staff.
New Mexico Public Education Department (Santa Fe, NM)

2012 - Trained over 70 NM educators using the Assessment Literacy Series process, modifying it to produce six end-of-course assessments for use as alternate measures of competency for high school graduation in New Mexico.

2013 - Trained over 100 NM educators using the Assessment Literacy Series process, including the integration of performance measures. Developed 28 end-of-course assessments in the state’s educator effectiveness system. Quality controlled all assessments using RIA’s assessment rigor screening tool and quality assurance rubric, including evaluating the alignment of items/tasks to the New Mexico state content standards and Common Core State Standards. Provided technical recommendations to the NMPED senior staff on leveraging resources for greater access by NM educators.
Miccosukee Tribe (Miami, FL)

Spring 2013 - Provided technical assistance in the development of the MIS alternate AYP definition as authorized within 25 C.F.R. Created business rules and integrated multiple assessment data into an overall school-based index score. Trained the school improvement team and administrators using RIA’s Assessment Literacy Series.
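The index-score work described above amounts to folding several measures into one number under explicit business rules. The sketch below shows one common form, a weighted composite; the measure names and weights are invented for illustration and are not the actual business rules RIA developed.

```python
# Hypothetical sketch of combining multiple assessment results into a
# single school-based index score. Measure names and weights below are
# illustrative assumptions, not the Miccosukee business rules.

def school_index(percent_proficient, weights):
    """Weighted average of percent-proficient rates across measures."""
    total_weight = sum(weights[m] for m in percent_proficient)
    weighted_sum = sum(percent_proficient[m] * weights[m]
                       for m in percent_proficient)
    return weighted_sum / total_weight

rates = {"reading": 62.0, "math": 54.0, "attendance": 95.0}
weights = {"reading": 0.4, "math": 0.4, "attendance": 0.2}
print(round(school_index(rates, weights), 1))
```

Writing the rules down as code like this is one way to make them auditable: the same inputs always yield the same index, which matters when the score feeds an AYP determination.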
Maine Department of Education (Augusta, ME)

2011 - Operated Maine’s Summer Harvest Program/School and provided educational services to several hundred migrant students during the summers of 2011 and 2012. Developed a data validation process for multiple years necessary to support federal (CSPR) reports on migrant enrollment.
Evaluation
Delaware Department of Education (Dover, DE)

2013 - Quality controlled all locally-developed assessments (for use in the state’s educator effectiveness system) using RIA’s assessment rigor screening tool and quality assurance rubric, including evaluating the alignment of items/tasks to the Delaware state content standards and Common Core State Standards. Provided technical support to local districts meeting the quality expectations adopted by the Delaware Department of Education.

Developed and implemented the Delaware Department of Education’s Internal Measures Project, which created high-quality, summative assessments for use in Delaware’s Educator Effectiveness System.
o Trained over 700 DE educators using customized training modules that created and evaluated locally-developed assessments.
o Developed the criteria and procedures for evaluating vendor-made student achievement measures (SAM-E).
o Developed and implemented the training and quality standards for growth goals (student learning objectives, or SLOs) in 47 content areas, including non-subject educators and professional staff (e.g., nurses).

Summer 2012 - Reviewed over 550 assessments created using the assessment literacy process, including assessments in early childhood, 28 foreign languages, music, visual and performing art, physical education, health, and 95% of all courses offered in career and technical education in Delaware schools.

Summer 2013 - Refined over 550 assessments created and field-tested in 2012, including assessments in early childhood, 28 foreign languages, music, visual and performing art, physical education, health, and 95% of all courses offered in career and technical education in Delaware schools.

Developed the first accountability technical manual (2004) in the nation that outlined the operational details associated with Delaware’s accountability system. Examined data quality practices associated with the Delaware Student Testing Program (DSTP), Delaware Student Information System (DELSIS), and targeted federal programs administered by the state.

Evaluated the state’s Accountability Growth Model design during its first year of implementation (Summer 2007). Conducted internal auditing of the agency’s processes used to make AYP determinations with the aforementioned Accountability Growth Model.

Extracted, evaluated, and established Delaware’s USED Standards and Assessment Peer Review body of evidence in both 2005 and 2012. Evaluated the use of accommodations for SWD and ELL students (2006) and established screening thresholds using multi-wave data.
Evaluation (cont.)
New England Secondary School Consortium (Portland, ME)

2014 - Published the Annual Data Profile: 2012-13 and the NESSC’s Metrics: Procedural Guidebook (Fall 2014).

2013 - Created and published the NESSC’s Annual Data Profile: SY 2011-12 (September 2013). Created and published the NESSC’s Metrics: Procedural Guidebook (February 2013).
United States Department of the Interior - Bureau of Indian Education (Washington, D.C.)

Spring 2009 - Trained BIE educators in using assessment data to determine their school’s AYP status. Created computational and resource support (in a web-based structure) necessary to calculate AYP for each of the 173 BIE-funded schools using 23 different, state-level accountability models. Created technical guides for BIE-funded schools based upon their state’s accountability system.
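At its core, an AYP determination like those computed for the BIE-funded schools compares each reported subgroup's proficiency rate to a state target. A minimal sketch of that core check follows; the subgroup names and the Annual Measurable Objective (AMO) value are invented, and real state models layer on safe harbor, minimum-n rules, confidence intervals, and participation-rate checks.

```python
# Minimal sketch of an AYP-style determination: every reported subgroup
# must meet the state's Annual Measurable Objective (AMO). Subgroup
# names and the AMO below are illustrative assumptions only; the 23
# state-level models applied to BIE-funded schools each add their own
# refinements (safe harbor, confidence intervals, participation rates).

def meets_ayp(subgroup_rates, amo):
    """Return True only if every subgroup's percent proficient meets the AMO."""
    return all(rate >= amo for rate in subgroup_rates.values())

rates = {"all_students": 71.0, "econ_disadvantaged": 64.5, "swd": 58.0}
print(meets_ayp(rates, amo=60.0))
```

Because one subgroup sits below the target, the school as a whole would not make AYP under this rule, which is exactly why subgroup-level reporting is central to these models.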
United States Department of Education (Washington, D.C.)

October 2012 - Selected as a member of the ESEA Flexibility Peer Review team for “Window 3” of the ESEA Flexibility requests. Three requests for flexibility were reviewed.

December 2011 - Selected as a member of the ESEA Flexibility Peer Review team for “Window 1” of the ESEA Flexibility requests. Reviewed and facilitated the peer team for one state’s request for flexibility.

December 2008 - Selected as the 2008 Chairperson for the USED Growth Model Peer Review process. Facilitated national experts’ review of several state growth models for use in making AYP determinations. States reviewed included: Texas, Colorado, Minnesota, District of Columbia, Pennsylvania, North Dakota, and New York.

April 2008 - Participated in the technical evaluation of state growth model proposals for use in making AYP determinations in accordance with USED guidelines. States reviewed included: Missouri, New Mexico, Pennsylvania, District of Columbia, Michigan, and Minnesota.

January 2005 to 2012 - Selected in the initial cohort of Peer Reviewers (2005) to review evidence associated with state content standards and large-scale assessments. Evaluated evidence presented by state agencies against the critical elements found within the USED’s Peer Review Guidance document. These evaluations included data and information associated with alternate assessments, modified achievement standards, and science assessments. States reviewed included: Indiana, Illinois, Iowa, New Mexico, Puerto Rico, South Carolina, Tennessee, Texas, North ...
Evaluation (cont.)
Maine Department of Education (Augusta, ME)

Summer 2008 - Conducted IT audits of MEDMS data used to calculate the on-time graduation rate (i.e., as required by 34 C.F.R. 200.19). Developed a QA/QC risk management document and implemented procedures in partnership with the MDOE staff.
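The rate audited above is, to our understanding of 34 C.F.R. 200.19, an adjusted-cohort calculation: on-time graduates divided by a cohort of first-time 9th graders, plus students who transfer in, minus students who transfer out, emigrate, or are deceased. A QA/QC audit might recompute it from raw counts, as sketched below (the counts are invented for illustration):

```python
# Recomputing an on-time graduation rate from raw cohort counts, in the
# adjusted-cohort style of 34 C.F.R. 200.19. The specific counts here
# are invented; an audit would pull them from the student data system.

def on_time_grad_rate(first_time_9th, transfers_in, transfers_out, graduates):
    """Percent of the adjusted cohort graduating on time."""
    cohort = first_time_9th + transfers_in - transfers_out
    if cohort <= 0:
        raise ValueError("adjusted cohort must be positive")
    return 100.0 * graduates / cohort

print(round(on_time_grad_rate(500, 40, 60, 408), 1))
```

An audit like the MEDMS one can then compare this independently recomputed rate against the rate the state system reported, flagging schools where the two disagree.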
Nebraska Department of Education (Lincoln, NE)

Spring 2008 - Evaluated accountability business rules with end-user guidelines during initial piloting of the state’s new student information system. Supported NDE officials in developing a multi-million dollar NCES grant to implement a longitudinal data system. Developed a conceptual framework and the RFP for the NSSRS Decision Support System (DSS).
St. Tammany Parish School Board (Covington, LA)

Spring 2003 - Extracted student performance data from the statewide assessment across multiple years to produce school- and district-level Data Notebooks (SY 2004 through 2008). Data Notebooks address multi-level performance goals by providing quantitative trend data to non-technical audiences. Aligned the business rules within the Strategic Plan, School Improvement Plan, and Data Notebooks to ensure comparability.
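The trend data at the heart of a Data Notebook can be as simple as year-over-year change in percent proficient. The sketch below shows that computation; the yearly values are invented for illustration, not St. Tammany results.

```python
# Sketch of the kind of trend computation behind a Data Notebook:
# year-over-year change in percent proficient, so non-technical
# audiences can see the direction and size of each change.
# The data points below are invented for illustration.

def yearly_changes(series):
    """Given [(year, pct_proficient), ...] sorted by year, return
    [(year, change_from_prior_year), ...]."""
    return [(series[i][0], round(series[i][1] - series[i - 1][1], 1))
            for i in range(1, len(series))]

trend = [(2004, 58.2), (2005, 60.1), (2006, 59.4), (2007, 63.0)]
print(yearly_changes(trend))
```

Presenting deltas rather than raw scores is one way to make multi-year trends legible to the non-technical audiences the Data Notebooks target.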
McComb School District (McComb, MS)

Conducted a comprehensive needs assessment for each school within the district. Organized data into school- and district-level reports to understand how the district's reform initiatives were being actualized.
Summer 2014 - Evaluated, via auditing, the MDE's newly developed school accountability system. Created procedural guidelines and the data validation indicators used to ensure credible results.