This document covers assessment, scoring, and evaluation in education. It defines those key terms, describes types of assessment such as formative and summative, surveys alternative assessments such as projects, presentations, and portfolios, and reviews criteria for effective assessments: validity, difficulty, and discriminability. Evaluation can be criterion-referenced, norm-referenced, or based on a curve. The document also gives examples of assessing scientific process skills and tips for grading, handling appeals, using portfolios, and record keeping.
Grading Your Assessments: How to Evaluate the Quality of Your Exams (ExamSoft)
How satisfied are you with the last assessment you gave? Would you describe your exam as a highly effective evaluation tool? How much does it reveal about individual students’ abilities, and about the overall performance of your current class compared to previous classes? Do you trust your assessment to accurately identify which students “get it” and which do not grasp the content or meet the standards required to pass your course?
A 3-step item analysis method based on each item’s difficulty level, discrimination value, and response frequencies provides a revealing look at the quality of your assessment by focusing your attention on the effectiveness of each test item and its contribution to the exam blueprint. It saves time by identifying exactly which exam questions need editing, and how much, before you take any action; replacing an item with a brand-new question is often unnecessary. Small, statistically guided improvements to a few exam items can drastically enhance item quality and eliminate the need to spend hours rewriting the entire exam. With this method, your future assessments can accurately measure your students’ ability to apply nursing content and solve clinical problems.
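The three steps can be sketched in plain Python. This is an illustrative sketch, not ExamSoft's actual implementation: the function names are hypothetical, and the 27% upper/lower split is a common psychometric convention rather than anything stated in the source.

```python
# Illustrative sketch of a 3-step item analysis: difficulty, discrimination,
# and response frequencies. Names and thresholds are hypothetical.

def item_difficulty(responses, key):
    """Step 1 -- difficulty: proportion of students answering correctly."""
    return sum(r == key for r in responses) / len(responses)

def discrimination_index(item_correct, total_scores, frac=0.27):
    """Step 2 -- discrimination: p(upper group) minus p(lower group).

    item_correct: 1/0 per student for this item; total_scores: each
    student's overall exam score. Groups are the top and bottom 27%.
    """
    order = sorted(range(len(total_scores)), key=lambda i: total_scores[i])
    n = max(1, int(len(order) * frac))
    lower, upper = order[:n], order[-n:]
    return (sum(item_correct[i] for i in upper) -
            sum(item_correct[i] for i in lower)) / n

def response_frequencies(responses):
    """Step 3 -- how often each option was chosen. A heavily chosen wrong
    option, or a distractor nobody picks, signals the item needs editing."""
    freq = {}
    for r in responses:
        freq[r] = freq.get(r, 0) + 1
    return freq
```

An item with moderate difficulty and a positive discrimination index usually needs only distractor-level edits, which is the time saving the method promises.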
How to Create Meaningful Rubrics for Student Assessment (ExamSoft)
Presented by Aimee F. Strang, Pharm.D., MS-HPEd, Assistant Dean of Curricular Assessment, Albany College of Pharmacy and Health Sciences
Rubrics are powerful assessment tools. Well-written rubrics guide student learning and provide formative feedback for improvement. However, creating valid rubrics can be a challenge. This presentation will focus on how to create rubrics within the ExamSoft suite of software, with tips for making them as useful as possible. Topics include selecting appropriate criteria, creating differentiating descriptions, and assigning grading schemas. Emphasis will be placed on ensuring rubrics measure student learning as defined by the course objectives.
Psychometrics 101: Know What Your Assessment Data Is Telling You (ExamSoft)
Presented by Eric Ermie, Executive Director of Sales, ExamSoft Worldwide, Inc.
Keep it? Throw it out? Content or teaching issue? Bad question? Too easy? Too hard? What the heck? More than likely you have asked some or all of these questions at one point or another when trying to understand the performance of questions on an assessment. With differing opinions on how to interpret the statistics provided, how do you know what all this data is trying to tell you? Join us for a webinar on the fundamentals of item analysis: how the statistics are derived and the different ways they can be interpreted. This presentation will cover how to put the data into a useful context so you can draw your own conclusions about what it means, how to apply it, and when to ignore rules of thumb that others use for their own specific situations.
Writing exam questions is one of the most important parts of teaching nursing, and nurse educators need a clear roadmap for what to include when developing exams. This presentation provides directions on how to develop a test blueprint and how to revise questions.
From the CALPER/LARC Testing and Assessment Webinar Series
Download the handout: https://goo.gl/ce9s3r
View the recording: http://vimeo.com/79501398
Webinar Description
In their quest for accountability in assessment, teachers might forget those to whom we should first be accountable: our students. Providing students with clear, accessible, and understandable assessment materials promotes accountability. Unfortunately, assessment of student writing is one of the tasks teachers worry about and, at times, nearly dread.
During this presentation, participants will learn procedures for developing writing-assessment tools that are transparent and understandable to students and that act as both teaching and assessment tools. We will first consider assignment criteria: what is it that we want our students to do? We will then consider the rubric, a grading instrument that offers objectivity, consistency, and clarity in assessing writing, and concentrate on holistic, analytic, and, to a lesser degree, primary trait assessment. We will also consider when, and for which kinds of writing assignments, each of these rubrics is most appropriate. Additionally, we will examine the components of rubrics (the criteria, the weight, the description), the steps in creating a good rubric, and how assignment criteria inform rubric creation.
Designing Writing Assessments and Rubrics considers the issue of accountability in classroom assessment of writing. The absence of fair and transparent assessment often leads to student confusion, slowed progress, assumptions of professorial arbitrariness, and quite possibly a lack of trust in teacher-student relationships.
Webinar Date: November 14, 2013
Discusses the facets of performance assessment: definition, advantages and disadvantages, types, process, guidelines and procedures, and the types of rubrics.
Classroom Assessment and Grading, prepared by Rox Wale Equias
I call my tests “opportunities” to give students a different way of thinking about them. - Bert Moore
“CLASSROOM ASSESSMENT and GRADING”
Prepared by:
ROX WALE EQUIAS
CHAPTER OUTLINE
Learning Goals…
THE CLASSROOM AS AN ASSESSMENT CONTEXT
ASSESSMENT AS AN INTEGRAL PART OF TEACHING
ESTABLISHING HIGH-QUALITY ASSESSMENTS
VALIDITY – refers to the extent to which assessment measures what it is intended to measure and appropriateness of inferences from and uses of the information (McMillan, 2011). Inferences are conclusions that teachers draw from information.
Instructional Validity – is the extent to which the assessment is a reasonable sample of what went on in the classroom.
Establishing High – Quality Assessments
CURRENT TRENDS
TRADITIONAL TESTS
SELECTED – RESPONSE ITEMS
MULTIPLE – CHOICE ITEMS
TRUE/FALSE ITEMS
CONSTRUCTED – RESPONSE ITEMS
Definition
Strategies for Scoring Essay Questions
Suggestions for Writing Good Essay Items
ESSAY QUESTIONS
ALTERNATIVE ASSESSMENTS
Example of Alternative Assessment
Authentic Assessment
Evaluating a student’s knowledge or skill in a context that approximates the real world or real life as closely as possible
Performance Assessment
Performance Criteria
Specific behaviors that students need to perform effectively as part of an assessment
Scoring Rubric for a Report on an Invention
Strategies for Developing Scoring Rubrics
Portfolio Assessment
Using Portfolios Effectively
Your grading curve and my learning curve don’t intersect. - Dave Carpenter, from Phi Delta Kappan (1997)
GRADING AND REPORTING PERFORMANCE
Grading
Translating descriptive assessment information into letters, numbers, or other marks that indicate the quality of a student’s learning performance
The Purposes of Grading
The Components of a Grading System
Definition
Reporting Students’ Progress and Grades to Parents
Strategies for Parent – Teacher Conferences Related to Grades and Assessment
“I don’t know why you’re so surprised by his poor grades. Every day you asked him what he did at school, and every day he answered, ‘Nothing.’”
-Art Bouthillier. Reprinted with permission.
Some Issues in Grading:
Resource
Educational Psychology by John W. Santrock. 6th Edition. pages 547 - 584
Thank you!!!
2. Definitions
Assessment -- the process of measuring something with the purpose of assigning a numerical value.
Scoring -- the procedure of assigning a numerical value to an assessment task.
Evaluation -- the process of determining the worth of something in relation to established benchmarks, using assessment information.
3. Assessment Types
Formative -- for performance enhancement; Summative -- for performance assessment.
Formal -- quizzes, tests, essays, lab reports, etc.; Informal -- active questioning during and at the end of class.
Traditional -- tests, quizzes, homework, and lab reports, assessed by the teacher; Alternative -- PBLs, presentations, essays, and book reviews, assessed by peers.
4. Alternative Assessment
Alternative to what? Paper & pencil exams
Alternatives:
lab work / research projects
portfolios
presentations
research papers
essays
self-assessment / peer assessment
lab practical
classroom “clickers” or responder pads
5. More Formal Alternatives
Rube Goldberg projects
bridge building / rocketry / mousetrap cars
writing a computer program
research project
term paper
create web page
create movie
role playing
building models
academic competitions
6. Informal CATs (Classroom Assessment Techniques)
Quick-fire questions
Minute paper: 1) What did you learn today? 2) What questions do you have?
Directed paraphrasing (explain a concept to a particular audience)
The “muddiest” point (What is it about the topic that remains unclear to you?)
For additional ideas, see Angelo, T.A. & Cross, P.K. (1993). Classroom Assessment Techniques (2nd ed.). San Francisco: Jossey-Bass.
7. Authentic Assessment
The National Science Education Standards draft (1994) states, "Authentic assessment exercises require students to apply scientific information and reasoning to situations like those they will encounter in the world outside the classroom as well as situations that approximate how scientists do their work."
8. Assessment Concerns
Validity -- Is the test assessing what’s intended? Are test items based on stated objectives? Are they properly constructed?
Difficulty -- Are questions too easy or too hard? (e.g., roughly 30% to 70% of students should answer a given item correctly)
Discriminability -- Is performance on individual test items positively correlated with overall student performance? (e.g., only the best students do well on the most difficult questions)
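The difficulty and discriminability checks can be computed directly from response data. The sketch below uses only the standard library; the function names are hypothetical, and the point-biserial correlation is one standard way to quantify discriminability as the correlation the slide describes.

```python
# Sketch of the two statistical checks: difficulty as proportion correct,
# discriminability as the point-biserial correlation between 1/0 item
# scores and total exam scores. Function names are illustrative.
from statistics import mean, pstdev

def difficulty(item_scores):
    """Fraction answering correctly; ~0.30-0.70 is the range cited above."""
    return sum(item_scores) / len(item_scores)

def point_biserial(item_scores, totals):
    """Correlation between 1/0 item scores and total exam scores.

    Positive values mean stronger students tend to get the item right;
    near-zero or negative values flag the item for review.
    """
    p = difficulty(item_scores)
    s = pstdev(totals)
    if s == 0 or p in (0.0, 1.0):
        return 0.0  # no variance -> correlation undefined; treat as 0
    m_all = mean(totals)
    m_correct = mean(t for t, x in zip(totals, item_scores) if x == 1)
    return (m_correct - m_all) / s * (p / (1 - p)) ** 0.5
```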
9. Evaluation Types
Criterion-referenced evaluation -- student performance is assessed against a set of predetermined standards.
Norm-referenced evaluation -- student performance is assessed relative to the other students.
The “curve” -- sometimes a combination of criterion- and norm-referenced processes.
10. Criterion-Referenced Evaluations
Based on a predetermined set of criteria. For instance:
90% and up = A
80% to 89.99% = B
70% to 79.99% = C
60% to 69.99% = D
59.99% and below = F
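This cutoff table can be expressed as a small function. The boundaries are exactly those listed above; the function name is illustrative.

```python
# Criterion-referenced grading: a percentage maps to a letter by fixed
# cutoffs, checked from the top band down.

def letter_grade(percent):
    """Map a percentage score to a letter grade using fixed criteria."""
    if percent >= 90:
        return "A"
    if percent >= 80:
        return "B"
    if percent >= 70:
        return "C"
    if percent >= 60:
        return "D"
    return "F"
```

Because the criteria are absolute, every student can in principle earn an "A" -- the cooperative-atmosphere point made in the comparison below.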
12. Norm-Referenced Evaluation
Based upon the assumption of a standard normal (Gaussian) distribution with n > 30. Employs the z score, z = (X − X̄) / σ, where X is a student’s score, X̄ is the class mean, and σ is the standard deviation:
A = top 10% (z > +1.28)
B = next 20% (+0.53 < z < +1.28)
C = central 40% (−0.53 < z < +0.53)
D = next 20% (−1.28 < z < −0.53)
F = bottom 10% (z < −1.28)
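The z-score scheme takes only a few lines: standardize each raw score, then apply the 10/20/40/20/10 cutoffs. This sketch (hypothetical names) uses the population standard deviation to match the σ in the formula, and sends boundary values to the lower band.

```python
# Norm-referenced grading: z = (x - mean) / sigma, then fixed percentile
# cutoffs on z. Uses pstdev (population sigma) to match the slide's formula.
from statistics import mean, pstdev

def z_scores(scores):
    """Standardize every raw score against the class mean and sigma."""
    m, s = mean(scores), pstdev(scores)
    return [(x - m) / s for x in scores]

def norm_grade(z):
    """Letter grade from a z score; ties go to the lower band."""
    if z > 1.28:
        return "A"
    if z > 0.53:
        return "B"
    if z > -0.53:
        return "C"
    if z > -1.28:
        return "D"
    return "F"
```

Note how this produces the con discussed next: even in a uniformly strong class, the lowest z scores still map to "D" and "F".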
13. Norm-Referenced Evaluation
Pros:
Ensures a “spread” between the top and bottom of the class for clear grade setting.
Shows student performance relative to the group.
Cons:
In a group with great performance, some students are still ensured an “F.”
Top and bottom performances can sometimes be very close.
Dispenses with absolute criteria for performance.
Being above average does not necessarily imply “A” performance.
14. Norm and Criterion Compared
Norm-referenced: ensures a competitive classroom atmosphere; assumes a standard normal distribution; small-group statistics are a problem; assumes “this” class is like all others.
Criterion-referenced: allows for a cooperative classroom atmosphere; makes no assumptions about the form of the distribution; small-group statistics are not a problem; but it is difficult to know just where to set the criteria.
15. The “Curve”
The “curve” might represent a mixture of norm- and criterion-referenced grading. It is a highly subjective process, and it is normally applied only at the end of a term.
18. Integrated Scientific Process Skills
Identifying variables
Constructing a table of data
Constructing a graph
Describing a relationship between variables
Acquiring and processing data
Analyzing investigations
Constructing hypotheses
Defining variables operationally
Designing investigations
Experimenting
19. Enhanced Scientific Process Skills
Solving complex, real-world problems
Establishing empirical laws
Synthesizing theoretical explanations
Analyzing and evaluating scientific arguments
Constructing logical proofs
Generating principles through the process of induction
Generating predictions through the process of deduction
20. Miscellaneous Comments
Study guides can be created to set objectives.
Prepare tests from objectives.
Assess a broad spectrum: content AND skills.
Make a rubric for questions that do not have forced-choice answers.
Create an answer key for forced-choice questions.
Double-check your answer key.
Grade ASAP, providing corrective feedback.
21. Handling Appeals
Encourage students to learn from their mistakes.
Accept appeals in writing, due by a certain date.
Refuse to discuss a question if the student will be appealing the answer.
Appeals include the following: the question being appealed, the teacher’s and student’s responses, and an explanation of why the student’s response is as good as or better than the teacher’s expected response.
The teacher responds in writing.
No class-wide correction: each student must make his or her own appeal.
Benefit: students feel they are treated fairly.
22. Portfolios of Student Work
Have students prepare an ongoing, extensive portfolio of their work.
Maintain these portfolios in an open but supervised setting.
During parent-teacher conferences, have the student in attendance and have the parents go through the portfolio with the student under the watchful eyes of the teacher.
23. Record Keeping
Keep copies of your grade book or computer program in widely separated locations.
Keep records up to date.
Respect confidentiality laws.