2. The word ‘assess’ . . .
comes from the Latin verb ‘assidere’ meaning ‘to sit
with’.
Thus, in assessment one is supposed to sit with the
learner. This implies it is something we do with and for
students and not to students.
(Green, J. M. 1998, February. Constructing the way forward for all
students. A speech delivered at “Innovations for Effective Schools”
OECD/New Zealand joint follow-up conference, Christchurch, New
Zealand.)
3. Consider the McNamara Fallacy . . .
attributed to economist Charles Handy
The first step is to measure whatever can be easily measured.
This is okay as far as it goes.
The second step is to disregard that which can’t be easily
measured, or to give it an arbitrary value.
This is artificial and misleading.
The third step is to presume that what can’t be measured easily
isn’t important.
This is blindness.
The fourth step is to say that what can’t be easily measured
doesn’t exist.
This is suicide.
Bottom Line of the Robert McNamara fallacy:
“What does not get counted does not count.”
4. Edward L. Thorndike (1874-1949)
. . . his pioneering investigations in the fields of human and animal learning are among the most influential in the history of psychology.
“Whatever exists, exists in some quantity. If it exists in quantity, it can be measured.” – E. L. Thorndike (quote paraphrased)
5. So, as a teacher, how do you make sure you are counting the most important things (i.e., the most important aspects of learning) in your assessment system?
The overall intent of this class is to help you develop useful attitudes and skills in this area.
6. Four Sections of Study
I. Basic Principles of Assessments
II. Teacher-Made Assessments
III. External Assessments
IV. Special Applications
10. IV. Special Applications
Grading and Reporting (Ch 13)
Laws related to assessment (Ch 14)
Evaluating your own teaching (Ch 15)
11. What Professional Groups
Want You To Know About Assessment
Examples of Professional Statements
BCOE Conceptual Framework – Reflection in Action
Ohio Standards for the Teaching Profession
Ohio Integrated Systems Model (OISM)
Praxis II PLT Category II – Assessment
NCATE SPAs – Specialized Professional Associations
Main Points These Groups Make
1. Have clear learning objectives.
2. Develop skill with a variety of assessment methods.
3. Be familiar with standardized tests.
4. Know the technical concepts (e.g., reliability, validity).
5. Know how to grade and give feedback.
6. Be familiar with legal and ethical issues.
12. Purposes of Assessment
1. Certify Student Learning
2. Motivate Student Learning
3. Instructional Planning or Diagnosis
4. Feedback to Important Users of Assessment on how we are doing.
To the parents
To the school administrators
To the public/community
5. Research
13. Current Assessment Trends
Assessment is Built on Classical Test Theory (a brief formula sketch follows this slide)
The drive to make educational assessment as
“scientific” as possible
Standards-based Education
Its characteristics
Content Standards (Learning Goals/Objectives)
Performance Standards
Emphasis on HIGH Performance Standards
These HIGH Standards Apply to All Learners
Public Reporting
Today’s example - No Child Left Behind (NCLB)
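As a brief aside added to this transcript (not part of the original slide), here is a minimal sketch of the classical test theory decomposition the slide refers to, using the conventional notation:

```latex
% Classical test theory: an observed score X is a true score T plus random error E,
% with the error assumed to be uncorrelated with the true score.
X = T + E, \qquad \operatorname{Cov}(T, E) = 0
% Observed-score variance therefore splits into true-score and error variance:
\sigma_X^2 = \sigma_T^2 + \sigma_E^2
% Reliability is the proportion of observed-score variance that is true-score variance:
\rho_{XX'} = \frac{\sigma_T^2}{\sigma_X^2} = 1 - \frac{\sigma_E^2}{\sigma_X^2}
```

In this view, the drive to make assessment more "scientific" largely means shrinking the error variance so that observed scores track true scores more closely.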
14. Current Trends (cont.)
Accountability
schools must demonstrate their outcomes
teachers must demonstrate competence
Legal Activism
laws and courts are used to pursue social goals
concern for fairness and equity
Alternative Performance Assessment
concepts of authentic assessment
portfolios
15. Trends (cont.)
State, national, and international bodies are increasingly interested in group effects (often related to “return on investment” thinking).
State (explosive growth right now)
National - National Assessment of Educational
Progress (NAEP)
International – (increasing attention)
16. Terms
Measurement
Quantification of an observation or of a believed underlying reality
Testing
Using a selected measurement process or
measuring device
Assessment
Drawing a conclusion or making a judgment
about information from testing.
17. Terms (cont.)
Standardized Testing may involve:
Uniform, clearly specified methods and procedures for administering the test.
Scoring norms based on many (perhaps thousands of) previous cases (see the worked example after this slide).
Group-administered, machine-scored, multiple-choice tests.
“I don’t do well on standardized tests!”
Means which of the above, exactly?
Your response?
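To make the "scoring norms" idea concrete, here is a small sketch added to this transcript (not from the slides); the norm-group mean and standard deviation are hypothetical, and the normal-distribution assumption is only an approximation.

```python
import math

def percentile_from_norms(raw_score: float, norm_mean: float, norm_sd: float) -> float:
    """Place a raw score against a norm group: convert it to a z-score, then to an
    approximate percentile rank, assuming roughly normally distributed norm-group scores."""
    z = (raw_score - norm_mean) / norm_sd
    # Standard normal cumulative probability via the error function.
    return 100 * 0.5 * (1 + math.erf(z / math.sqrt(2)))

# Hypothetical norms: norm-group mean of 50 and standard deviation of 10.
print(percentile_from_norms(62, norm_mean=50, norm_sd=10))  # roughly the 88th percentile
```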
18. Terms (cont.)
Distinctions Among Types of Tests
Group vs. Individual
Speed vs. Power
Maximum vs. Typical Performance
Criterion- vs. Norm-Referenced (see the sketch after this list)
Paper-and-Pencil vs. Performance
Summative vs. Formative
Selected-Response vs. Constructed-Response
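As a quick illustration of the criterion- vs. norm-referenced distinction (added here, not part of the slides), the same score can be interpreted against a fixed standard or against other examinees; the cutoff and the norm-group scores below are hypothetical.

```python
def criterion_referenced(percent_correct: float, cutoff: float = 80.0) -> str:
    """Criterion-referenced interpretation: compare performance to a fixed standard."""
    return "mastery" if percent_correct >= cutoff else "not yet mastered"

def norm_referenced(score: float, norm_group_scores: list) -> float:
    """Norm-referenced interpretation: report standing relative to a norm group
    as the percentage of norm-group scores falling below this score."""
    below = sum(1 for s in norm_group_scores if s < score)
    return 100 * below / len(norm_group_scores)

# Hypothetical data: one student's percent-correct score and a small norm group.
student = 75.0
norm_group = [55, 60, 62, 68, 70, 72, 78, 81, 85, 90]

print(criterion_referenced(student))         # "not yet mastered" (below the 80% cutoff)
print(norm_referenced(student, norm_group))  # 60.0 -> higher than 60% of the norm group
```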
19. High-Stakes Testing
A situation in which the outcome of a test has
exceptionally important consequences for an
individual.
You fail . . . You don’t get in or get out.
You fail . . . You are nobody to somebody.
You fail . . . You wasted your time.
This can create test anxiety in the individual and could lead to non-performance or underperformance.
20. Terms/Concepts to Review and
Study on Your Own (1)
accountability
assessment
authentic assessment
classical test theory
constructed-response
criterion-referenced
formative evaluation
21. Terms/Concepts to Review and
Study on Your Own (2)
high-stakes testing
measurement
NAEP
No Child Left Behind (NCLB) Act
norm-referenced
performance assessment
portfolio assessment
22. Terms/Concepts to Review and
Study on Your Own (3)
reliability
selected-response
standardized
standards-based education
summative evaluation
test
TIMSS
validity