“Everything that can be counted does not necessarily count; everything that counts cannot necessarily be counted.”

Albert Einstein
First - the Lighter Side

Prof: Class, who can tell me what I have preserved in this jar?
Student A: It’s a pig.
Student B: Is it a baby cow?
Prof: No, it’s neither a pig nor a baby cow …
It’s the last student who got caught cheating on one of my tests.
Some Pressing Issues in the Classroom
On Effective Learning
• The tests used by teachers encourage rote and superficial learning
• The questions and methods teachers use are not shared with other teachers in the same school
• There is a tendency to emphasize quantity and presentation of work, neglecting quality in relation to learning
Some Pressing Issues in the Classroom
On the Negative Impact
• Overemphasis on the grading function, neglecting the learning function.
• Approaches are used in which students are compared with one another.
• The collection of marks to fill in records is given higher priority than the analysis of students’ work to discern learning needs.
Many Years Back … and Even Today
The Curve
[Figure: a normal (bell) curve centred on the mean – ranking and sorting]
Misuse of statistics – labeling students
The Average Child
• I don’t cause teachers trouble,
my grades have been OK.
I listen in my classes
and I’m in school every day.
• My teachers think I’m average,
my parents think so too.
I wish I didn’t know that
’cause there’s lots I’d like to do.
• I’d like to build a rocket, I have
a book that tells you how,
or start a stamp collection —
well, there’s no use in trying now.
• ’Cause since I found I’m average
I’m just smart enough, you see,
to know there is nothing special
that I should expect of me.
• I’m part of the majority,
that hump part of the bell,
who spends their life unnoticed
in an average kind of hell.

Written by a 9th-grade North American Native child, quoted by Dale Parnell
Purpose of School has Changed
from Ranking/Sorting to Learning for All

Goal: independent, self-directed learners
Key Beliefs
• All students can learn
• Schools and teachers make a difference
• If students are helped to work hard – to make an effort – they improve
• An assessment culture is central to student and school improvement
Moving the Mountain

Higher Learning for All
Some Definitions
• Test – a set of specified, uniform tasks to be performed by students, these tasks being an appropriate sample of the knowledge or skills in a broader field of content.
• From the number of tasks done correctly in the sample, the teacher makes an inference about how the student will perform in the total field.
• It is a tool whose general characteristic is that it forces responses from a student.
• Measurement – a system of observing a phenomenon, attribute, or characteristic and translating those observations into numbers according to a rule (Case, 1999).
• Evaluation - the determination of
the worth or value of an event,
object, or individual in terms of a
specified criterion.
• Educators evaluate student
progress by comparing student
performance to the criteria of
success based on instructional
objectives.
[Diagram: Test, Measurement, and Evaluation – together constituting Assessment]
“No matter how good you are, you can always do better.”

By the end of this session you will have a better understanding of how ASSESSMENTS can actually help improve learning (and teaching), and how you can better lead these processes.

This implies assessment is something we do with and for the students, and not to the students (Green, 1998).
• Assessment – refers to the full range of information gathered and synthesized about the students.
• Assessment is the process of gathering, recording, interpreting, using and reporting information about a student’s progress and achievement in developing knowledge, skills and understanding (NCCA, 2007).
• Testing – focuses on what we “do” to the learners after instruction.
• Assessment – focuses on what we do “with” the learners before, during, and after learning.
Assessment Methods
• Performances
• Projects
• Products
• Paper and pen
• Portfolios
Core Principles of
Effective Assessment
Three interrelated objectives for quality in
student assessment in higher education
1. Assessment that guides and
encourages effective approaches to
learning
2. Assessment that validly and reliably
measures expected learning outcomes,
in particular the higher-order learning
that characterises higher education;
and
3. Assessment and grading that define
and protect academic standards
Well-designed assessment should …
4. Set clear expectations;
5. Establish a reasonable workload (one that does not push students into rote, reproductive approaches to study); and
6. Provide opportunities for students to self-monitor, rehearse, practice and receive feedback.
16 INDICATORS OF EFFECTIVE ASSESSMENT IN HIGHER EDUCATION

1. Assessment is treated by faculty and students as an integral component of the entire teaching and learning process.
2. The multiple roles of assessment are recognized.
• The powerful motivating effect of assessment requirements on students is understood, and assessment tasks are designed to foster valued study habits.
16 INDICATORS OF EFFECTIVE ASSESSMENT IN HIGHER EDUCATION (cont’d)

3. There is a faculty/departmental policy that guides assessment practices.
– Subject assessment is integrated into an overall plan for course assessment.
4. There is a clear alignment between expected learning outcomes, what is taught and learned, and the knowledge and skills assessed.
16 INDICATORS OF EFFECTIVE ASSESSMENT IN HIGHER EDUCATION (cont’d)

5. Assessment tasks assess the capacity to analyse and synthesise new information and concepts rather than simply recall information which has been presented.
6. A variety of assessment methods is employed so that the limitations of particular methods are minimized.
16 INDICATORS OF EFFECTIVE ASSESSMENT IN HIGHER EDUCATION (cont’d)

7. Assessment tasks are designed to assess relevant generic skills as well as subject-specific knowledge and skills.
8. There is a steady progression in the complexity and demands of assessment requirements in the later years of courses.
9. There is provision for student choice in assessment tasks and weighting at certain times.
16 INDICATORS OF EFFECTIVE ASSESSMENT IN HIGHER EDUCATION (cont’d)

10. Student and faculty workloads are considered in the scheduling and design of assessment tasks.
11. Excessive assessment is avoided.
• Assessment tasks are designed to sample student learning.
16 INDICATORS OF EFFECTIVE ASSESSMENT IN HIGHER EDUCATION (cont’d)

12. Assessment tasks are weighted to balance the developmental (‘formative’) and judgemental (‘summative’) roles of assessment.
– Early low-stakes, low-weight assessment is used to provide students with feedback.
16 INDICATORS OF EFFECTIVE ASSESSMENT IN HIGHER EDUCATION (cont’d)

13. Grades are calculated and reported on the basis of clearly articulated learning outcomes and criteria for levels of achievement.
14. Students receive explanatory and diagnostic feedback as well as grades.
16 INDICATORS OF EFFECTIVE ASSESSMENT IN HIGHER EDUCATION (cont’d)

15. Assessment tasks are checked to
ensure there are no inherent biases
that may disadvantage particular
student groups.
16. Plagiarism is minimized through
careful task design, explicit education
and appropriate monitoring of
academic honesty.
Assessment Practices: Quality and Standards
Assessment and the assurance of
academic standards
The assurance of academic standards
embraces a wide range of university
activities beyond the assessment of student
learning.
However, assessment and grading practices
are perhaps the most important safeguard.
What can individual faculty do
about standards? — 1
Ensure …
… there are explicit learning outcomes, clear
criteria and, where possible, statements of
the various levels of achievement.
With the objective of …
students and faculty both being aware of what
is expected, what is valued, and what will be
rewarded.
What can individual faculty do
about standards? — 2
Ensure …
… a close match between the assessment
tasks
— in particular, the knowledge and skills these
tasks are capable of determining
— and the intended learning outcomes.

With the objective of …
creating assessment tasks that validly and
reliably determine the valued learning
outcomes.
What can individual faculty do
about standards? — 3
Ensure …
… the grades awarded (and other information
provided to students on their achievement)
make a direct link between the intended
learning outcomes and students’ actual
performance on assessment tasks.
With the objective of …
…awarding grades that are meaningful
representations of the level of learning.
What can individual faculty do
about standards? — 4
Ensure …
… assessment tasks are capable of detecting
the higher-order learning outcomes that
characterize higher education.
With the objective of …
developing higher education assessment that
determines and reports the highest intellectual
skills and accomplishments.
What can individual faculty do
about standards? — 5
Ensure …
… there is ongoing dialogue on learning
outcomes, assessment and grading with
people teaching in the same discipline area in
other universities.
With the objective of …
using assessment and grading practices that are
informed by the norms and values of the
discipline community.
Bloom’s Taxonomy
of Instructional Objectives / Learning Outcomes

Learning Domains
– Cognitive
• Learning outcomes related to knowledge
– Psychomotor
• Learning outcomes related to skills
– Affective
• Learning outcomes related to attitudes,
behaviors, and values
Taxonomies are based on the assumption that
different types of objectives are learned through
different mental processes.
Learning Domains
• Knowledge (HEAD) – Cognitive Domain
• Attitudes (HEART) – Affective Domain
• Skills (HANDS) – Psychomotor Domain
Bloom’s Taxonomy

Remembering: can the student recall or remember the information?
– define, duplicate, list, memorize, recall, repeat, reproduce, state
Understanding: can the student explain ideas or concepts?
– classify, describe, discuss, explain, identify, locate, recognize, report, select, translate, paraphrase
Applying: can the student use the information in a new way?
– choose, demonstrate, dramatize, employ, illustrate, interpret, operate, schedule, sketch, solve, use, write
Analyzing: can the student distinguish between the different parts?
– appraise, compare, contrast, criticize, differentiate, discriminate, distinguish, examine, experiment, question, test
Evaluating: can the student justify a stand or decision?
– appraise, argue, defend, judge, select, support, value, evaluate
Creating: can the student create a new product or point of view?
– assemble, construct, create, design, develop, formulate, write
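One way to put these verb lists to work is a small lookup that suggests the Bloom level implied by an objective’s leading verb. This is an illustrative sketch, not part of the taxonomy itself; the verb sets below are a partial, hypothetical sample of the lists above, and the example objectives are invented.

```python
# Illustrative sketch: guess the Bloom level from an objective's leading verb.
# The verb sets are a partial sample of the lists above (assumption: the
# objective starts with its action verb, as well-written objectives do).
BLOOM_VERBS = {
    "remembering":   {"define", "list", "memorize", "recall", "repeat", "state"},
    "understanding": {"classify", "describe", "explain", "identify", "paraphrase"},
    "applying":      {"demonstrate", "illustrate", "operate", "solve", "use"},
    "analyzing":     {"compare", "contrast", "differentiate", "distinguish", "examine"},
    "evaluating":    {"argue", "defend", "judge", "justify", "support"},
    "creating":      {"assemble", "construct", "design", "develop", "formulate"},
}

def bloom_level(objective):
    """Return the Bloom level suggested by the objective's first verb."""
    verb = objective.lower().split()[0]
    for level, verbs in BLOOM_VERBS.items():
        if verb in verbs:
            return level
    return "unknown"

print(bloom_level("Explain the role of ATP in photosynthesis"))   # understanding
print(bloom_level("Design an experiment to test enzyme activity"))  # creating
```

A lookup like this is only a first-pass check; many verbs (e.g. “appraise” in the full lists) legitimately appear at more than one level, so human judgment still decides.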
Taxonomy for Teaching, Learning, and Assessing
(A Revision of Bloom’s Taxonomy of Educational Objectives)

The revised taxonomy is two-dimensional, identifying both:
• the kind of knowledge to be learned (knowledge dimension), and
• the kind of learning expected from students (cognitive processes),
to help teachers and administrators improve alignment and rigor in the classroom.
Factual Knowledge
• The basic elements students must know to
be acquainted with a discipline or solve
problems in it.
• Knowledge of terminology
– technical vocabulary, musical symbols, etc.

• Knowledge of specific details and
elements
– major natural resources, reliable sources of
information
Conceptual Knowledge
• The interrelationships among the basic elements within a
larger structure that enable them to function together.

• Knowledge of classifications and
categories
– periods of geologic time

• Knowledge of principles and
generalizations
– Pythagorean theorem, law of supply and demand

• Knowledge of theories, models and
structures
– theory of evolution, structure of Congress
Procedural Knowledge
• How to do something, methods of inquiry, and criteria for
using skills, algorithms, techniques, and methods.

• Knowledge of subject-specific skills and
algorithms
– painting with watercolors, whole-number division

• Knowledge of subject-specific techniques
and methods
– interviewing techniques, scientific method

• Knowledge of criteria for determining
when to use appropriate procedures
– when to apply Newton's second law, when to use a particular
method of estimation
Metacognitive Knowledge
• Knowledge of cognition in general as well as
awareness and knowledge of one's own cognition.
• Strategic knowledge
– outlining as a means of capturing the structure of a
unit of subject matter in a textbook

• Cognitive tasks
– knowledge of the different types of tests, cognitive
demands of different tasks

• Self-knowledge
– knowledge that critiquing essays is a personal
strength, whereas writing essays is a personal
weakness; awareness of one's own knowledge level
Matching Learning Outcomes to Assessment Types

Thinking critically and making judgments
– What is required from students: development of arguments, reflection, judgment, evaluation
– Examples of assessment: essay; report; book review

Solving problems / developing plans
– What is required from students: identify problems, define problems, analyze data, review, design experiments, plan, apply information
– Examples of assessment: problem scenario; group work; work-based problem; analyze a case; conference paper (or notes for a conference paper plus annotated bibliography)
Matching Learning Outcomes to Assessment Types (cont’d)

Performing procedures and demonstrating techniques
– What is required from students: take readings, use equipment, follow laboratory procedures, follow protocols, carry out instructions
– Examples of assessment: demonstration; role play; make a video (write script and produce a video); produce a poster; lab report

Demonstrating knowledge and understanding (can be assessed in conjunction with the above types of learning)
– What is required from students: recall, describe, report, identify, recognize, recount, relate, etc.
– Examples of assessment: written examination; oral examination; MCQs; essays
Matching Learning Outcomes to Assessment Types (cont’d)

Managing/developing yourself
– What is required from students: work co-operatively and independently, be self-directed, manage time, manage tasks
– Examples of assessment: learning journal; portfolio; learning contracts; self-evaluation; group projects; peer assessment

Designing, creating, performing
– What is required from students: design, create, perform, produce, etc.
– Examples of assessment: design project; portfolio; presentation; performance
Matching Learning Outcomes to Assessment Types (cont’d)

Assessing and managing information
– What is required from students: information search and retrieval, investigate, interpret, review information
– Examples of assessment: annotated bibliographies; use of bibliographic software; library research assignment; data-based project

Communicating
– What is required from students: written, oral, visual and technical skills
– Examples of assessment: written presentation; oral presentation; discussions/debates/role plays; group work
Constructing Objective-Type Tests

Planning a Test
• First step: Outline the learning objectives or major concepts to be covered by the test.
– The test should be representative of the objectives and material covered.
– A major student complaint: tests don’t fairly cover the material they were supposed to cover.
Planning a Test
• Second step: Create a test blueprint.
• Third step: Create questions based on the blueprint.
– Match the question type with the appropriate level of learning.
• Fourth step: For each check on the blueprint, jot down 3–4 alternative question ideas and item types that will get at the same objective.
• Fifth step: Organize questions and/or ideas by item type.
Planning a Test
• Sixth step: Eliminate similar questions.
• Seventh step: Walk away from the draft for a couple of days.
• Eighth step: Reread all of the items – try doing this from the standpoint of a student.
Planning a Test
• Ninth step: Organize questions logically.
• Tenth step: Time yourself actually taking the test, then multiply that time by about 4, depending on the level of the students.
• Eleventh step: Analyze the results (item analysis).
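The item analysis in the eleventh step is not spelled out on the slide. A common classical approach computes, for each item, a difficulty index (the proportion of students answering correctly) and a discrimination index (the difference in that proportion between the top and bottom 27% of students by total score). A minimal sketch, using invented data:

```python
# Minimal sketch of a classical item analysis (hypothetical data).
# Difficulty p = proportion of students answering the item correctly.
# Discrimination D = p(upper 27% by total score) - p(lower 27%).

def item_analysis(responses):
    """responses: list of per-student lists of 0/1 item scores."""
    n = len(responses)
    totals = [sum(r) for r in responses]
    order = sorted(range(n), key=lambda i: totals[i], reverse=True)
    k = max(1, round(0.27 * n))          # size of the upper/lower groups
    upper, lower = order[:k], order[-k:]
    stats = []
    for item in range(len(responses[0])):
        p = sum(r[item] for r in responses) / n
        p_up = sum(responses[i][item] for i in upper) / k
        p_lo = sum(responses[i][item] for i in lower) / k
        stats.append({"difficulty": p, "discrimination": p_up - p_lo})
    return stats

# Example: 4 students x 3 items (invented scores)
scores = [[1, 1, 0],
          [1, 0, 0],
          [1, 1, 1],
          [0, 0, 0]]
for i, s in enumerate(item_analysis(scores), start=1):
    print(f"Item {i}: difficulty={s['difficulty']:.2f}, "
          f"discrimination={s['discrimination']:+.2f}")
```

Roughly, items with very high or very low difficulty, or with low (or negative) discrimination, are the ones to revise or drop before the next administration.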
Selecting the Right Type of Test
• How do you know what type of question to use, and when?
• It depends on the skill you are testing.
• The test should always match, as closely as possible, the actual activity you’re teaching.
– Example: if teaching speech, evaluate an oral speech.
– If testing the ability to write in Spanish, better to give an essay.
– Testing reading: MC, TF.
– You wouldn’t use MC to test creative writing.
Question Types and Cognitive Levels of Learning

Knowledge / Comprehension:
– Multiple Choice (MC), True/False (TF), Matching, Completion, Short Answer
Application:
– MC, Short Answer, Problems, Essay, Performance
Analysis / Synthesis / Evaluation:
– MC, Short Answer, Essay
Constructing the Test
• Types of test questions:
– Multiple-Choice Items
– True-False Items
– Matching Items
– Fill-In, Completion or Short-Answer Items
– Essay Questions
Selecting Objectives for Assessment
• Test all of the must-know objectives, test some need-to-know objectives, and don’t test nice-to-know objectives.
• A must-know objective is one which students have to know to go on to the next objective in the sequence.
– It is the last time the objective will be taught in the scope and sequence, and the student cannot finish the course, or go on to the next course, without knowing or being able to do what the objective requires.
• Need-to-know objectives are those where students need the content to perform within the sequence, but the content will be re-taught or reviewed before the student uses the objective again.
• Nice-to-know objectives are those that students are exposed to, but very few are expected to learn much about them.
• When the specifications have been developed, the next step is construction of the test items.
Table of Specifications or Test Blueprint
A matrix which contains:
a) the objectives being tested;
b) the levels at which those objectives should be tested, according to Bloom’s Taxonomy or others;
c) the amount of time spent teaching each objective.

• Objectives should never be tested at a higher Bloom’s level than the level at which they were presented to the students.
• The number of items used should relate to the amount of teaching time spent on the objectives: the more time spent, the greater the number of items.
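The proportionality rule above can be sketched as a small allocation routine. The objectives and minutes below are hypothetical, and largest-remainder rounding is one reasonable choice among several for handing out leftover items:

```python
# Illustrative sketch: allocate test items in proportion to teaching time,
# per the blueprint rule above. Objective names and minutes are invented.

def allocate_items(teaching_minutes, total_items):
    """Return items per objective, proportional to time taught."""
    total_time = sum(teaching_minutes.values())
    raw = {obj: total_items * t / total_time
           for obj, t in teaching_minutes.items()}
    counts = {obj: int(r) for obj, r in raw.items()}
    # Hand remaining items to the largest fractional remainders.
    remainder = total_items - sum(counts.values())
    for obj in sorted(raw, key=lambda o: raw[o] - counts[o],
                      reverse=True)[:remainder]:
        counts[obj] += 1
    return counts

time_taught = {"cell structure": 90, "photosynthesis": 60, "mitosis": 30}
print(allocate_items(time_taught, 30))
# {'cell structure': 15, 'photosynthesis': 10, 'mitosis': 5}
```

The same matrix can carry the Bloom-level column as well; the allocation here only automates the time-to-items step.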
BEFORE I END …
“For as long as assessment is viewed as something we do ‘after’ teaching and learning are over, we will fail to greatly improve student performance, regardless of how well or how poorly students are currently taught or motivated.”

– Grant Wiggins, 1998
Assessment is not enough
• We can’t consider assessment separately from teaching and learning processes. All are about informing judgment.
• Alignment between, and integration of, learning activities is needed.
• Choosing assessment practices chooses what students will learn.
FINALLY
Final thought

Why teach to testing when it is so productive to teach to learning?

Guy Bensusan
Thank You

Assessment for higher education (for biology faculty seminar)

  • 1.
    Everything that canbe counted does not necessarily count; everything that counts cannot necessarily be counted . Albert Einstein 1
  • 2.
    First - theLighter Side 2
  • 3.
  • 4.
  • 5.
  • 6.
  • 7.
    Prof: Class, whocan tell me what I have preserved in this jar? Student A: It’s pig. Student B: Is it a baby cow? Prof: No, it’s neither a pig nor a baby cow … It’s the last student who got caught cheating on one of my test.
  • 8.
    Some Pressing Issuesin the Classroom On Effective Learning • The test used by teachers encourage rote and superficial learning • Questions and methods teachers use are not shared with other teachers in the same school • Tendency to emphasize quantity and presentation of work neglecting quality in relation to learning
  • 9.
    Some Pressing Issuesin the Classroom On the negative impact  Overemphasis on grading function neglecting the learning function.  Approaches are used in which students are compared with one another.  The collection of marks to fill in records is given higher priority than the analysis of student’s work to discern learning needs.
  • 10.
    • Many YearsBack and Even Today 10
  • 11.
  • 12.
    Misuse of statistics– labeling students 12
  • 13.
    The Average Child •I don’t cause teachers trouble, my grades have been ok. I listen in my classes and I’m in school everyday. • My teachers think I’m average, my parents think so too. I wish I didn’t know that ‘cause there’s lots I’d like to do. • I’d like to build a rocket, I have a book that tells you how, or start a stamp collection —well there’s no use in trying now. •‘Cause since I found I’m average I’m just smart enough you see, to know there is nothing special that I should expect of me. I’m part of the majority, that hump part of the bell, who spends their life unnoticed in an average kind of hell. Written by 9th Grade North American Native Child quoted by Dale Parnell 13
  • 14.
    Purpose of Schoolhas Changed from Ranking/sorting to Learning for all Goal: Independent, self-directed learners
  • 15.
    Key Beliefs • Allstudents can learn • Schools and teachers make a difference • If students are assisted to work hard – make an effort – they improve • An assessment culture is central to student and school improvement
  • 16.
  • 17.
    Some Definitions • Test-a set of specified, uniform tasks to be performed by students, these tasks being appropriate sample from the knowledge or skills in a broader field of content. • From the number of tasks done correctly in the sample, the teacher makes an assumption of how student will perform in the total field. • It is a tool whose general characteristic is that it forces responses from a student
  • 18.
    • Measurement- asystem of observing phenomenon, attribute, or characteristic and translating those observations into numbers according to a rule (Case, 1999). 18
  • 19.
    • Evaluation -the determination of the worth or value of an event, object, or individual in terms of a specified criterion. • Educators evaluate student progress by comparing student performance to the criteria of success based on instructional objectives.
  • 20.
  • 21.
    “No matter howgood you are, you can always do better” By the end of this session you will have a better understanding of how ASSESSMENTS can actually help improve learning (and teaching), and how you can better lead these processes. 21
  • 22.
    This implies itis something we do with and for the students and not to the students (Green, 1998) 22
  • 23.
    • Assessment - refersto the full range of information gathered and synthesized about the students  Assessment is the process of gathering, recording, interpreting, usin g and reporting information about a student’s progress and achievement in developing knowledge skills and understanding (NCCA, 2007)
  • 24.
    • Testing –focuseson what we “do” to the learners after instruction. Assessment – focuses on what we do “with” the learners before, during, and after learning. 24
  • 25.
    assessment methods • Performances •Projects • Products • Paper and pen • Portfolios
  • 26.
  • 27.
    Three interrelated objectivesfor quality in student assessment in higher education 1. Assessment that guides and encourages effective approaches to learning 2. Assessment that validly and reliably measures expected learning outcomes, in particular the higher-order learning that characterises higher education; and 3. Assessment and grading that define and protect academic standards
  • 28.
    Well designed assessmentshould … 4.Set clear expectations; 5.Establish a reasonable workload (one that does not push students into rote reproductive approaches to study); and 6.Provide opportunities for students to self-monitor, rehearse, practice and receive feedback.
  • 29.
    16 INDICATORS OFEFFECTIVE ASSESSMENT IN HIGHER EDUCATION 1. Assessment is treated by faculty and students as an integral component of the entire teaching and learning process. 2. The multiple roles of assessment are recognized. • The powerful motivating effect of assessment requirements on students is understood and assessment tasks are designed to foster valued study habits.
  • 30.
    16 INDICATORS OFEFFECTIVE ASSESSMENT IN HIGHER EDUCATION (con’t) 3. There is a faculty/departmental policy that guides assessment practices. – Subject assessment is integrated into an overall plan for course assessment. 4. There is a clear alignment between expected learning outcomes, what is taught and learned, and the knowledge and skills assessed. 30
  • 31.
    16 INDICATORS OFEFFECTIVE ASSESSMENT IN HIGHER EDUCATION (con’t) 5. Assessment tasks assess the capacity to analyse and synthesis new information and concepts rather than simply recall information which has been presented. 6. A variety of assessment methods is employed so that the limitations of particular methods are minimized.
  • 32.
    16 INDICATORS OFEFFECTIVE ASSESSMENT IN HIGHER EDUCATION (con’t) 7. Assessment tasks are designed to assess relevant generic skills as well as subjectspecific knowledge and skills. 8. There is a steady progression in the complexity and demands of assessment requirements in the later years of courses. 9. There is provision for student choice in assessment tasks and weighting at certain times.
  • 33.
    16 INDICATORS OFEFFECTIVE ASSESSMENT IN HIGHER EDUCATION (con’t) 10. Student and faculty workloads are considered in the scheduling and design of assessment tasks. 11. Excessive assessment is avoided. • Assessment tasks are designed to sample student learning. . 33
  • 34.
    16 INDICATORS OFEFFECTIVE ASSESSMENT IN HIGHER EDUCATION (con’t) • 12. Assessment tasks are weighted to balance the developmental (‘formative’) and judgemental (‘summative’) roles of assessment. – Early low-stakes, low-weight assessment is used to provide students with feedback 34
  • 35.
    16 INDICATORS OFEFFECTIVE ASSESSMENT IN HIGHER EDUCATION (con’t) 13. Grades are calculated and reported on the basis of clearly articulated learning outcomes and criteria for levels of achievement. 14. Students receive explanatory and diagnostic feedback as well as grades. 35
  • 36.
    16 INDICATORS OFEFFECTIVE ASSESSMENT IN HIGHER EDUCATION (con’t) 15. Assessment tasks are checked to ensure there are no inherent biases that may disadvantage particular student groups. 16. Plagiarism is minimized through careful task design, explicit education and appropriate monitoring of academic honesty.
  • 37.
  • 38.
    Assessment and theassurance of academic standards The assurance of academic standards embraces a wide range of university activities beyond the assessment of student learning. However, assessment and grading practices are perhaps the most important safeguard.
  • 39.
    What can individualfaculty do about standards? — 1 Ensure … … there are explicit learning outcomes, clear criteria and, where possible, statements of the various levels of achievement. With the objective of … students and faculty both being aware of what is expected, what is valued, and what will be rewarded.
  • 40.
    What can individualfaculty do about standards? — 2 Ensure … … a close match between the assessment tasks — in particular, the knowledge and skills these tasks are capable of determining — and the intended learning outcomes. With the objective of … creating assessment tasks that validly and reliably determine the valued learning outcomes.
  • 41.
    What can individualfaculty do about standards? — 3 Ensure … … the grades awarded (and other information provided to students on their achievement) make a direct link between the intended learning outcomes and students’ actual performance on assessment tasks. With the objective of … …awarding grades that are meaningful representations of the level of learning.
  • 42.
    What can individualfaculty do about standards? — 4 Ensure … … assessment tasks are capable of detecting the higher-order learning outcomes that characterize higher education. With the objective of … developing higher education assessment that determines and reports the highest intellectual skills and accomplishments.
  • 43.
    What can individualfaculty do about standards? — 5 Ensure … … there is ongoing dialogue on learning outcomes, assessment and grading with people teaching in the same discipline area in other universities. With the objective of … using assessment and grading practices that are informed by the norms and values of the discipline community.
  • 45.
    Bloom’s Taxonomy of InstructionalObjectives / Learning Outcomes Learning Domains – Cognitive • Learning outcomes related to knowledge – Psychomotor • Learning outcomes related to skills – Affective • Learning outcomes related to attitudes, behaviors, and values Taxonomies are based on the assumption that different types of objectives are learned through different mental processes.
  • 46.
  • 48.
    Bloom’s Taxonomy Level Verb Remembering: canthe student recall or remember the information? define, duplicate, list, memorize, recall, repeat, reproduce state Understanding: can the student explain ideas or concepts? classify, describe, discuss, explain, identify, locate, recognize, report, select, translate, paraphrase Applying: can the student use the information in a new way? choose, demonstrate, dramatize, employ, illustrate, interpret, operate, schedule, sketch, solve, use, write. Analyzing: can the student distinguish between the different parts? appraise, compare, contrast, criticize, differentiate, discriminate, distinguish, examine, experiment, question, test. Evaluating: can the student justify a stand or decision? appraise, argue, defend, judge, select, support, value, evaluate Creating: can the student create new product or point of view? assemble, construct, create, design, develop, formulate, write.
  • 49.
    Taxonomy for Teaching,Learning, and Assessing (A Revision of Bloom’s Taxonomy of Educational Objectives) The revised taxonomy is two-dimensional, identifying both: • the kind of knowledge to be learned (knowledge dimension) and • the kind of learning expected from students (cognitive processes) to help teachers and administrators improve alignment and rigor in the classroom.
  • 50.
    Factual Knowledge • Thebasic elements students must know to be acquainted with a discipline or solve problems in it. • Knowledge of terminology – technical vocabulary, musical symbols , etc. • Knowledge of specific details and elements – major natural resources, reliable sources of information
  • 51.
    Conceptual Knowledge
    • The interrelationships among the basic elements within a larger structure that enable them to function together.
    • Knowledge of classifications and categories – periods of geologic time
    • Knowledge of principles and generalizations – Pythagorean theorem, law of supply and demand
    • Knowledge of theories, models, and structures – theory of evolution, structure of Congress
    Procedural Knowledge
    • How to do something; methods of inquiry; and criteria for using skills, algorithms, techniques, and methods.
    • Knowledge of subject-specific skills and algorithms – painting with watercolors, whole-number division
    • Knowledge of subject-specific techniques and methods – interviewing techniques, scientific method
    • Knowledge of criteria for determining when to use appropriate procedures – when to apply Newton's second law, when to use a particular method of estimation
    Metacognitive Knowledge
    • Knowledge of cognition in general, as well as awareness and knowledge of one's own cognition.
    • Strategic knowledge – outlining as a means of capturing the structure of a unit of subject matter in a textbook
    • Cognitive tasks – knowledge of the different types of tests, cognitive demands of different tasks
    • Self-knowledge – knowledge that critiquing essays is a personal strength whereas writing essays is a personal weakness; awareness of one's own knowledge level
    Matching Learning Outcomes to Assessment Types
    • Thinking critically and making judgments – students develop arguments, reflect, judge, evaluate. Assessment: essay, report, book review.
    • Solving problems / developing plans – students identify and define problems, analyze data, review, design experiments, plan, apply information. Assessment: problem scenario, group work, work-based problem, case analysis, conference paper (or notes for a conference paper plus annotated bibliography).
    Matching Learning Outcomes to Assessment Types (continued)
    • Performing procedures and demonstrating techniques – students take readings, use equipment, follow laboratory procedures, follow protocols, carry out instructions. Assessment: demonstration, role play, making a video (write the script and produce the video), producing a poster, lab report.
    • Demonstrating knowledge and understanding (can be assessed in conjunction with the types above) – students recall, describe, report, identify, recognize, recount, relate, etc. Assessment: written examination, oral examination, MCQs, essays.
    Matching Learning Outcomes to Assessment Types (continued)
    • Managing/developing yourself – students work co-operatively and independently, are self-directed, manage time, manage tasks. Assessment: learning journal, portfolio, learning contracts, self-evaluation, group projects, peer assessment.
    • Designing, creating, performing – students design, create, perform, produce, etc. Assessment: design project, portfolio, presentation, performance.
    Matching Learning Outcomes to Assessment Types (continued)
    • Assessing and managing information – students search for and retrieve information, investigate, interpret, review information. Assessment: annotated bibliographies, use of bibliographic software, library research assignment, data-based project.
    • Communicating – written, oral, visual, and technical skills. Assessment: written presentation, oral presentation, discussions/debates/role plays, group work.
    Planning a Test
    • First Step: Outline the learning objectives or major concepts to be covered by the test
    – The test should be representative of the objectives and material covered
    – A major student complaint is that tests don't fairly cover the material they were supposed to.
    Planning a Test
    • Second Step: Create a test blueprint
    • Third Step: Create questions based on the blueprint – match the question type with the appropriate level of learning
    • Fourth Step: For each check on the blueprint, jot down 3–4 alternative question ideas and item types that get at the same objective
    • Fifth Step: Organize questions and/or ideas by item type
    Planning a Test
    • Sixth Step: Eliminate similar questions
    • Seventh Step: Walk away from the draft for a couple of days
    • Eighth Step: Reread all of the items – try doing this from the standpoint of a student
    Planning a Test
    • Ninth Step: Organize the questions logically
    • Tenth Step: Time yourself actually taking the test, then multiply that by about 4, depending on the level of the students
    • Eleventh Step: Analyze the results (item analyses)
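The eleventh step, item analysis, is the most mechanical of these and can be sketched in a few lines. A minimal illustration of two classical indices – item difficulty and upper-minus-lower discrimination (the function names, the 0/1 score matrix, and the 27% grouping convention are assumptions of this sketch, not from the slides):

```python
# Classical item analysis: difficulty and discrimination indices.
# Assumes a score matrix: rows = students, columns = items (1 = correct).

def item_difficulty(scores, item):
    """Proportion of students answering the item correctly (p-value)."""
    return sum(row[item] for row in scores) / len(scores)

def item_discrimination(scores, item, fraction=0.27):
    """Difference in proportion correct between the top- and
    bottom-scoring groups (conventionally ~27% each). Higher values
    mean the item better separates strong from weak students."""
    ranked = sorted(scores, key=sum, reverse=True)
    n = max(1, round(len(ranked) * fraction))
    upper = sum(row[item] for row in ranked[:n]) / n
    lower = sum(row[item] for row in ranked[-n:]) / n
    return upper - lower

# Example: 5 students, 3 items.
scores = [
    [1, 1, 1],  # strongest student
    [1, 1, 0],
    [1, 0, 1],
    [0, 1, 0],
    [0, 0, 0],  # weakest student
]
print(item_difficulty(scores, 0))      # 0.6 (60% answered item 0 correctly)
print(item_discrimination(scores, 0))  # 1.0 (top group right, bottom wrong)
```

Items with very low difficulty or near-zero (or negative) discrimination are the ones to revise or drop before reusing the test.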
    Selecting the Right Type of Test
    • How do you know what type of question to use, and when? It depends on the skill you are testing.
    • The test should match as closely as possible the actual activity you are teaching.
    – Examples: if teaching speech, evaluate an oral speech; if testing the ability to write in Spanish, an essay is the better choice.
    – Reading can be tested with multiple-choice or true/false items, but multiple choice would be a poor way to test creative writing.
    Question Types and Cognitive Levels of Learning
    • Knowledge, Comprehension: multiple choice (MC), true/false (TF), matching, completion, short answer
    • Application: MC, short answer, problems, essay, performance
    • Analysis, Synthesis, Evaluation: MC, short answer, essay
    Constructing the Test
    • Types of test questions:
    – Multiple-Choice Items
    – True-False Items
    – Matching Items
    – Fill-In, Completion or Short-Answer Items
    – Essay Questions
    Selecting Objectives for Assessment
    • Test all of the must-know objectives, test some of the need-to-know objectives, and don't test the nice-to-know objectives.
    • A must-know objective is one which students have to know to go on to the next objective in the sequence: it is the last time the objective will be taught in the scope and sequence, and the student cannot finish the course, or go on to the next course, without knowing or being able to do what the objective requires.
    • Need-to-know objectives are those where students need the content to perform within the sequence, but the content will be re-taught or reviewed before the student uses the objective again.
    • Nice-to-know objectives are those that students are merely exposed to; very few are expected to learn much about them.
    • Once the specifications have been developed, the next step is constructing the test items.
    Table of Specifications or Test Blueprint
    A matrix which contains:
     a) the objectives being tested;
     b) the levels at which those objectives should be tested, according to Bloom's taxonomy or a similar scheme;
     c) the amount of time spent teaching each objective.
     Objectives should never be tested at a higher Bloom's level than the one at which they were presented to the students.
     The number of items used should relate to the amount of teaching time spent on each objective: the more time, the greater the number of items.
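The last two rules – item counts tracking teaching time, with each objective capped at its taught Bloom's level – can be sketched as a small allocation routine. This is an illustrative sketch only; the data, function name, and rounding policy are hypothetical:

```python
# Test blueprint sketch: allocate items to objectives in proportion to
# teaching time, recording the highest Bloom's level each may be tested at.

def build_blueprint(objectives, total_items):
    """objectives: list of (name, bloom_level, minutes_taught) tuples.
    Returns (name, bloom_level, n_items) tuples, with item counts
    proportional to the share of teaching time."""
    total_time = sum(minutes for _, _, minutes in objectives)
    blueprint = []
    for name, level, minutes in objectives:
        n_items = round(total_items * minutes / total_time)
        blueprint.append((name, level, n_items))
    return blueprint

objectives = [
    ("Define key terms",        "Remembering",   30),
    ("Explain the main model",  "Understanding", 60),
    ("Apply model to new case", "Applying",      90),
]
for name, level, n in build_blueprint(objectives, total_items=30):
    print(f"{name}: {n} items, tested no higher than {level}")
```

Simple rounding can leave the total a few items above or below the target, so a real blueprint would adjust one or two counts by hand.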
    BEFORE I END …
    "For as long as assessment is viewed as something we do 'after' teaching and learning are over, we will fail to greatly improve student performance, regardless of how well or how poorly students are currently taught or motivated." -- Grant Wiggins, 1998
    Assessment is not enough
    • We can't consider assessment separately from teaching and learning processes; all are about informing judgment.
    • Alignment between, and integration of, assessment and learning activities is needed.
    • Choosing assessment practices chooses what students will learn.
    Final thought
    Why teach to testing when it is so productive to teach to learning?
    Guy Bensusan
    References
    • Boud, D. (1995). Enhancing Learning through Self Assessment. London: Kogan Page.
    • Boud, D. (2000). Sustainable assessment: rethinking assessment for the learning society. Studies in Continuing Education, 22(2), 151-167.
    • Boud, D. and Falchikov, N. (2006). Aligning assessment with long term learning. Assessment and Evaluation in Higher Education, 31(4), 399-413.
    • Boud, D. and Falchikov, N. (Eds.) (2007). Rethinking Assessment in Higher Education: Learning for the Longer Term. London: Routledge.
    • Absolum, M. (2006). Clarity in the classroom. Auckland: Hodder Education. pp. 98-117.
    • Andrade, H. and Valtcheva, A. (2009). Promoting Learning and Achievement through Self Assessment. Theory into Practice, 48, 12-19.
    • Topping, K.J. (2009). Peer Assessment. Theory into Practice, 48, 20-27.
    • Wiliam, D. When is assessment learning-oriented? 4th Biennial EARLI/Northumbria Assessment Conference, Potsdam, Germany, August 2008. www.dylanwiliam.net
    • Falchikov, N. (2005). Improving Assessment through Student Involvement. London: Routledge.
    • http://www.cshe.unimelb.edu.au/assessinglearning/06/index.html
    • Gibbs, G. (2006). How assessment frames student learning. In Clegg, K. and Bryan, C. (Eds.) Innovative Assessment in Higher Education. London: Routledge.
