Classroom Assessment is a systematic approach to formative evaluation, used by instructors to determine how much and how well students are learning. CATs and other informal assessment tools provide key information during the semester regarding teaching and learning so that changes can be made as necessary. "The central purpose of Classroom Assessment is to empower both teachers and their students to improve the quality of learning in the classroom" through an approach that is "learner-centered, teacher-directed, mutually beneficial, formative, context-specific, and firmly rooted in good practice" (Angelo & Cross, 1993, p. 4).
This PowerPoint covers didactic assessment: its definitions, related concepts, types, and didactic assessment tools.
This PowerPoint by Dr. Dee McKinney & Katie Shepard was presented as a workshop for the East Georgia State College Center for Teaching & Learning for interested faculty & staff in January 2018.
3. Conceptualizing Terms
An Instrument or a Tool
• A systematic procedure for measuring a sample of learners' behavior.
• Composed of questions that require answers from learners.
• It normally aims to measure learners' acquisition of knowledge.
• An example of an instrument or a tool is A TEST.
4. Measurement:
• Administration of an instrument or tool, such as a test, to students in an examination or test venue.
• The process of administering, marking and assigning a number or grade to an individual according to specified rules.
• Scoring and assigning a grade.
• It normally tries to answer the question "How much did an individual learner score on a test?"
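The steps above (administer, mark, assign a number or grade according to specified rules) can be sketched as a small routine. This is a minimal illustration in Python; the grade boundaries are hypothetical assumptions, not an official grading scheme.

```python
def measure(raw_score, max_score):
    """Turn a raw test score into a percentage and a letter grade.

    The grade boundaries below are illustrative assumptions,
    not an official grading scheme.
    """
    percent = 100.0 * raw_score / max_score
    for cutoff, grade in [(80, "A"), (65, "B"), (50, "C"), (35, "D")]:
        if percent >= cutoff:
            return percent, grade
    return percent, "F"

# "How much did an individual learner score on a test?"
print(measure(42, 60))  # → (70.0, 'B')
```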
5. Assessment:
• The systematic and continuous process of monitoring various pieces of learning in order to evaluate students' achievement and instructional effectiveness.
• The process of collecting data or information on students' performance.
• It normally lies between measurement and evaluation.
• When do we assess?
6. Assessment cont…
• During the teaching and learning process
• At the end of a lesson or unit
• At the end of the school year, when we want to determine whether or not the objectives/goals have been attained
7. Evaluation:
• The systematic process of collecting, analyzing and interpreting information in order to determine the extent to which a student is able to achieve the stated objectives.
• It comprises measurement and assessment, for the purpose of answering the question "How good or bad is a student?" in order to make value judgments.
8. Evaluation cont…
Classification of Evaluation
• Evaluation has been classified according to the time and manner/modality of assessment,
• that is, when evaluation is done and for what purpose(s).
9. Evaluation cont…
a.) Placement Evaluation
• This type of evaluation is done at the beginning of a programme or course, before instruction begins.
• It is concerned with students' entry behavior or characteristics before an instruction or programme begins.
• For example, Form Two NECTA examinations are used in streaming students into Form Three.
10. Evaluation cont…
Purposes of Placement Evaluation
• To place students into the required categories according to their abilities or prior knowledge.
• It aims to establish students' prerequisite knowledge, ability, attitude, skills and interest before beginning a new lesson or programme.
11. Evaluation cont…
b.) Diagnostic Evaluation
• It can be done at any time when there is a need to identify learning difficulties.
• It takes place throughout the instruction, programme or course.
12. Evaluation cont…
Purposes of Diagnostic Evaluation
• It aims to identify learning difficulties that have been left unsolved.
• It is applied after several attempts to solve a problem have been unsuccessful.
• It aims at determining the cause of the problem and suggesting possible solutions.
13. Evaluation cont...
c.) Formative Evaluation – Feedback
• This type of evaluation takes place throughout the programme, course or instruction, for the purpose of monitoring learning progress and improving performance.
• It comprises the day-to-day activities that take place in an institution, whether within or outside the classroom.
• It is a continuous process in an institution. For example: questions and answers in class, tests, seminars, quizzes, assignments and so on.
14. Evaluation cont…
Purposes of Formative Evaluation
• It provides feedback to both teachers and students on learning progress.
• This feedback should be immediate in order to be effective.
15. Evaluation cont…
Purposes of Formative Evaluation
• The feedback given helps both teachers and students to seek further action(s) to improve teaching and learning, for example remedial classes.
• It acts as reinforcement for students to learn more; that is, it promotes learning among students.
16. Evaluation cont…
d) Summative Evaluation:
• This type of evaluation takes place at the end of a programme, course or unit of instruction, for the purpose of determining how well students have attained the intended instructional objectives.
• For example, the final examination (NECTA).
17. Evaluation cont…
Purposes of Summative Evaluation
• It gives information for judging the appropriateness of the materials and instructional strategies.
• It provides feedback to curriculum planners on the effectiveness of the curriculum.
• It helps to evaluate the effectiveness of the teacher and the instruction.
18. Evaluation cont...
Purposes of Summative Evaluation
• It is used for grading students at all levels of education.
• It is used for certification purposes after the completion of a programme, course or unit of instruction.
19. Steps in Planning and Constructing a Test:
• Make sure the contents are relevant.
• All content taught must be evaluated {Table of Specifications}.
• Consider the aims of the test precisely; say exactly what the test is intended to evaluate.
• Plan the type and scope of the test, e.g. essay, multiple choice.
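The table of specifications mentioned above cross-tabulates the content taught against the levels to be tested, so that every topic is covered in proportion to its weight. A minimal Python sketch; the topics, weights, levels and the `allocate_items` helper are all hypothetical illustrations.

```python
# Hypothetical table of specifications: each topic taught is given a
# weight (weights sum to 1.0) and the cognitive levels to be emphasised.
spec = {
    "Measurement": (0.30, ["knowledge", "application"]),
    "Assessment":  (0.30, ["comprehension", "application"]),
    "Evaluation":  (0.40, ["comprehension", "analysis"]),
}

def allocate_items(spec, total_items):
    """Distribute a fixed number of test items across topics in
    proportion to their weights, so all content taught is evaluated."""
    return {topic: round(weight * total_items)
            for topic, (weight, _levels) in spec.items()}

print(allocate_items(spec, 50))
# → {'Measurement': 15, 'Assessment': 15, 'Evaluation': 20}
```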
20. • Make a provisional draft of the items/questions
• Plan the length of time needed to answer these items
• Reconsider the items; delete those that are unsatisfactory
• Determine whether each question is clear and unambiguous
21. Planning for a test cont…
• Order the items from easy to difficult; make sure the difficult items are not beyond the ability of the best students
• Make sure the test instructions are clear
• Draw up a marking scheme/rubric, and determine whether the test is too long, too difficult or too easy
• Review the test accordingly
22. Classification of a Test
There are two major categories of a test, namely objective tests and subjective tests.
A: Objective Tests
• Supply items
• Selection items
23. Objective Tests cont…
• They consist of test items that are one-way oriented,
• which means each test item must have only one appropriate/correct answer.
• That is, the answers provided are straightforward, with no explanation.
• Objective tests include multiple-choice items, true/false items and matching items.
24. Objective Tests cont…
Merits of Objective Tests
• They cover a wide content area in a short time.
• They can involve many questions at once, because the items are short and require short answers too.
• They are easy to mark.
• They are very easy to score.
25. Demerits of Objective Tests
• They limit students' freedom to express their opinions, feelings and ideas.
• Students are required to cover a wide range of content in a short time.
• They are very difficult to construct, and if the examiner is not careful she/he can set complicated and ambiguous test items.
26. a.) Multiple Choice Items:
• These are test items which contain a stem and alternatives.
• The alternatives contain a correct answer and distractors.
• The correct answer is called the key; the other, incorrect alternatives are called distractors.
• Their flexibility lies in letting the student choose the correct answer.
27. Guidelines for constructing Multiple Choice Items:
• Make sure that the alternatives are grammatically consistent with the stem of the item.
• The alternatives must be equal in weight.
• Avoid clues among the alternatives, to reduce guessing.
• Make sure that there is only one correct answer among the alternatives.
• Use alternatives that relate to each other, to reduce guessing.
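The structure the two slides above describe (a stem, one key, and several distractors) can be modeled as a small record type. This is an illustrative sketch; the class name, fields and sample item are all hypothetical.

```python
import random
from dataclasses import dataclass

@dataclass
class MultipleChoiceItem:
    stem: str          # the question or incomplete statement
    key: str           # the single correct alternative
    distractors: list  # the incorrect alternatives

    def alternatives(self):
        """Return all alternatives in shuffled order, so the position
        of the key carries no clue."""
        options = [self.key] + list(self.distractors)
        random.shuffle(options)
        return options

item = MultipleChoiceItem(
    stem="Evaluation done at the end of a course is called:",
    key="summative evaluation",
    distractors=["placement evaluation", "diagnostic evaluation",
                 "formative evaluation"],
)
# Exactly one correct answer among the alternatives, as the guideline requires.
assert item.key in item.alternatives()
```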
28. b.) Matching Items:
• These are test items which contain two columns.
• The first column, termed List A, contains the questions (premises).
• The answers, found in List B, are technically known as responses.
29. Guidelines for constructing Matching Items
• The responses should be short and clear.
• Make sure that both columns are on the same page, in order to simplify matching.
• The responses should be more numerous than the items/questions; at least two related responses per item.
• Avoid clues that would facilitate guessing; use homogeneous/related materials.
30. c.) True/False Items:
• These are test items which contain declarative statements that can be either true or false.
• They can consist of both false statements and true statements.
• T/F items are effective for testing correctness of facts, definitions of terms and statements of principles.
31. Guidelines for constructing True/False Items
• Avoid long and complex statements.
• The use of negative statements is not encouraged.
• Avoid ambiguous vocabulary, for the effectiveness of the question.
• Avoid mixing two ideas in one statement.
32. B: Subjective Tests
• These are test items which have no single pre-determined correct answer; that is, they are subjective in nature.
• They measure advanced and complex outcomes of learning.
• They intend to measure competence in integrating and organizing ideas.
• They include essay items and short-answer items.
33. Subjective Tests cont…
Merits of Subjective Tests
• They permit learners to take a position and support it with facts.
• They give learners freedom to express their ideas.
• They are easy to construct, so they take little time to prepare.
• They are not prone to guessing and cheating.
34. Subjective Tests cont...
Demerits of Subjective Tests
• They cover a very small amount of content.
• Scoring is very difficult because it involves extra criteria like handwriting, grammar and neatness when marking; the marker must decide in advance how to handle these.
• They force markers to be subjective too.
• They are prone to bias.
35. Subjective Tests cont…
i) Essay Test Items
• Essay items are divided into two categories, namely restricted and non-restricted essay items.
ia) Restricted Essay Items
• They limit the students in terms of content and response.
• Consider the nature of the question asked, i.e. whether it really needs only a short answer.
36. Subjective Tests cont...
ib.) Non-Restricted/Extended Essay Items
• These items provide more freedom/opportunity for learners to express their ideas and opinions.
• They do not limit learners in how they organize their material.
• The examiner just needs to make sure that the constructed item does not limit the way learners may answer it.
37. Subjective Test Items cont…
Guidelines for constructing Essay Test Items
• Formulate questions that match the learners' intended behavior or learning outcome.
• Make sure the question indicates clearly the task to be undertaken by the students: direct and clear.
• Indicate a time limit for each question.
• Time limits that are too short may put students at a disadvantage.
• Avoid open-ended "opportunity" questions, which are difficult to mark.
38. Subjective Test Items…
Marking of Essay Test Items
• Prepare the marking rubric/marking scheme in advance, with the credits or scores to be allocated to each point.
• Decide how you will handle factors like spelling, handwriting, sentence structure, punctuation and neatness.
• Efforts should be made to keep such factors less influential in scoring.
• Mark one question before going to the next, in order to be consistent and fair to students.
• Evaluate the answers without looking at students' names.
39. Advantages of essay items
i. They are unique in measuring students' ability to select content, organize and integrate it, and present it in logical order.
ii. They present a more realistic task to students: students organize and communicate their own thoughts.
iii. They are a device for improving writing skills.
iv. They are easy to construct.
v. They are less time-consuming to construct.
40. Limitations of essay items
i. Grading is often subjective and inconsistent, depending on neatness, handwriting, spelling and grammar.
ii. They can sample only a limited portion of the content.
iii. Good writing requires time to think, organize, write and revise.
iv. They are time-consuming to correct.
v. They advantage students with good writing and verbal skills.
vi. They advantage students who are quick.
41. Validity and Reliability
Your test/exam must have two characteristics which will judge its quality:
• Validity
• Reliability
42. Reliability
• Reliability is the degree to which an assessment tool produces stable and consistent results.
• It is the degree of consistency or stability of an evaluation instrument when administered twice, at different times, or marked by different markers.
• It should give the same results.
43. Validity
• Validity is the accuracy of an assessment tool: whether or not it measures what it is supposed to measure.
• How well a test measures what it is supposed to measure.
• Note: Validity and reliability start from when a teacher is preparing the exam up to the point of evaluation.
• Test results will be appropriate, meaningful and useful only if they are valid and reliable.
44. Validity
Tests can have little or no validity for their intended use due to:
• Vocabulary and sentence structure that are too difficult for the level of the students
• Unclear directions on how to respond to the items
• Items that are too easy or too difficult, which will not provide reliable discrimination among students
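The point about items that are too easy or too difficult can be quantified with classical item analysis: the difficulty index (proportion of students answering correctly) and the discrimination index (difference in difficulty between top- and bottom-scoring groups). A sketch with made-up response data; function names are illustrative.

```python
def item_difficulty(responses):
    """Difficulty index p: proportion of students answering correctly.
    Values near 1.0 (too easy) or 0.0 (too hard) flag items that
    cannot discriminate among students."""
    return sum(responses) / len(responses)

def discrimination_index(upper, lower):
    """Discrimination index D: difficulty in the top-scoring group minus
    difficulty in the bottom-scoring group. Values near zero (or
    negative) flag poorly discriminating items."""
    return item_difficulty(upper) - item_difficulty(lower)

# 1 = correct, 0 = wrong; response data is made up for illustration.
upper_group = [1, 1, 1, 0, 1]   # students who scored best overall
lower_group = [0, 1, 0, 0, 1]   # students who scored worst overall

print(item_difficulty(upper_group + lower_group))      # → 0.6
print(discrimination_index(upper_group, lower_group))  # → 0.4
```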
45. Validity
• Poorly constructed test items, which unintentionally provide clues to the answer
• Ambiguous statements in test items, which contribute to misinterpretation and confusion, especially for better students
46. • Test items which are not appropriate for the outcomes being measured (Bloom's Taxonomy levels)
• Tests which are too short to provide a good representative sample
47. Validity
• Improper arrangement of test items, e.g. starting with difficult items
• An identifiable pattern of answers: placing correct answers in some systematic manner will enable students to guess
48. Validity
• Insufficient time to complete the test
• Unfair directions
• Cheating
• Unreliable scoring, especially in essay items
• Errors in recording/awarding marks
• Emotional disturbances, which can bother students and affect their performance
49. • Some students may be frightened by the test situation and so be unable to respond normally
• Response set
50. Curriculum Evaluation
• Curriculum evaluation is the process of gathering information about the effectiveness of a curricular programme,
• in order to determine the quality/value/merit of the programme or learning, with the aim of deciding whether to adopt, modify, reject or revise it.
51. What do we evaluate?
The value of learning, instruction and the overall program.
Did the learner improve during the educational process?
Did the learner meet certain standards set for the curriculum?
52. What do we evaluate? cont…
Did the instruction meet the expectations of the learner and the overall program?
Overall program: did the program or curriculum content/format accomplish what it set out to do?
53. Methods employed in curriculum evaluation
• Discussions
• Experiments
• Interviews (group/personal): information from stakeholders
• Observation procedures
• Questionnaires
• Practical performance and official records
• Surveys
54. Curriculum Evaluation
Why is evaluation important?
• Feedback and motivation for all stakeholders
• Improvement of the educational process
• Certification of learner competency
• Data to meet accreditation requirements
• Assessment of the cost of delivery
55. Types of Evaluation
• Formative evaluation occurs during the educational process, with the intent of improving performance through feedback.
• It is a means to assess the educational process while it is still being used or developed.
• Its intention is to improve performance.
56. Types of Evaluation
• Summative evaluation occurs at the conclusion of an educational activity, with the intent of documenting achievement or competence.
• It makes a judgement or decision at the conclusion of the educational curriculum.
• It is the final outcome, or a certification of competency at completion.
57. Note:
Evaluation of the curriculum happens in order to decide whether to accept, change or eliminate various aspects of the curriculum,
and to understand whether the curriculum is producing the desired results.
58. Stages in conducting curriculum evaluation
• Preparation of the curriculum to be evaluated
• Designing of instruments
• Conducting analysis
• Reporting and using information
59. Curriculum Evaluation Models
1. Bradley's Effectiveness Model
2. Tyler's Objectives-Centred Model
3. Stufflebeam's Context, Input, Process, Product (CIPP) Model
4. Scriven's Goal-Free Model
5. Stake's Responsive Model
6. Eisner's Connoisseurship Model
7. Roger's Needs Assessment Model
8. Kirkpatrick's Four-Levels Model
60. Why curriculum models?
To provide a conceptual framework for designing a particular evaluation, depending on the specific purpose of the evaluation.