2. BACKGROUND
• Summarizing and analyzing data obtained
from formative evaluation
• Revising materials
• The changes that are made to the content of
the materials
• The changes that are related to the
procedures employed in using the materials
3. OBJECTIVES
• Describe various methods for summarizing data
obtained from formative evaluation studies.
• Summarize data obtained from formative evaluation
studies.
• Given summarized formative evaluation data, identify
weaknesses in instructional materials and instructor-led
instruction.
• Given formative evaluation data for a set of
instructional materials, identify problems in the
materials and suggest revisions for the materials.
4. DATA
• Learner characteristics
• Entry behavior
• Direct responses to the instruction
• Learning time
• Posttest performance
• Responses to an attitude questionnaire
• Comments made directly in the materials
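The data types listed above can be gathered into one record per learner before any summarizing begins. A minimal sketch in Python; the class and field names are illustrative assumptions, not prescribed by the text:

```python
from dataclasses import dataclass, field

@dataclass
class LearnerRecord:
    # Hypothetical structure for one learner's formative evaluation data;
    # field names are assumptions chosen to mirror the bullet list above.
    learner_id: str
    entry_behaviors: dict        # entry-behavior test item -> correct?
    learning_time_min: float     # total time spent in the instruction
    posttest: dict               # posttest item -> correct?
    attitude: dict               # questionnaire item -> rating
    comments: list = field(default_factory=list)  # notes made in materials

rec = LearnerRecord("L01", {"EB1": True}, 42.5,
                    {"Q1": True, "Q2": False},
                    {"clarity": 4}, ["Example on page 3 was confusing"])
print(rec.learner_id, sum(rec.posttest.values()), "of", len(rec.posttest))
```

Keeping every data type in one record makes the later one-to-one, small-group, and field-trial summaries straightforward to compute.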
5. ONE-TO-ONE DATA
• The designer must look at the similarities and
differences among the responses of the learners, and
determine the best changes to make in the instruction.
Three Sources Of Suggestions For Changes:
• Learner suggestions
• Learner performance
• Your own reactions to the instruction
6. SMALL GROUP DATA
• The fundamental unit of analysis for all the
assessments is the individual assessment item.
Performance on each item must be scored as correct
or incorrect.
Methods For Summarizing Data:
• Item-by-objective performance
• Graphing learners’ performance
• Descriptive fashion
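An item-by-objective summary like the one named above can be computed as the percentage of learners answering each item correctly, grouped under the objective the item measures. A hedged sketch; the item-to-objective mapping and the response data are invented for illustration:

```python
# Item-by-objective performance: percent of learners correct per item,
# grouped by objective. Mapping and responses are hypothetical.
item_to_objective = {"I1": "Obj1", "I2": "Obj1", "I3": "Obj2"}

# One dict per learner: item -> 1 (correct) or 0 (incorrect).
responses = [
    {"I1": 1, "I2": 0, "I3": 1},
    {"I1": 1, "I2": 1, "I3": 0},
    {"I1": 0, "I2": 1, "I3": 1},
]

summary = {}
for item, obj in item_to_objective.items():
    pct = 100 * sum(r[item] for r in responses) / len(responses)
    summary.setdefault(obj, {})[item] = round(pct, 1)

for obj, items in summary.items():
    print(obj, items)
```

Objectives whose items show low percent-correct values point directly at the portions of the instruction to reconsider.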
7. FIELD TRIAL DATA
• Comments can be captured in one-on-one charts
where you list out comments made by each learner
• Assessment scores can be shown in charts or
hierarchies that represent your individual objectives
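A comment chart of the kind described above can be assembled by grouping each recorded comment under its learner. A minimal sketch; the learner IDs and comments are invented:

```python
# Build a field-trial comment chart: one row per learner listing
# every comment that learner made. Data below is hypothetical.
comments = [
    ("L01", "Directions for step 2 unclear"),
    ("L02", "Practice item 5 too hard"),
    ("L01", "Liked the worked example"),
]

chart = {}
for learner, note in comments:
    chart.setdefault(learner, []).append(note)

for learner in sorted(chart):
    print(learner + ":", "; ".join(chart[learner]))
```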
8. PERFORMANCE TEST
• Derive assessment instruments based on the objectives to:
• Diagnose an individual's possession of the necessary
prerequisites for learning new skills
• Check the results of student learning during the process of
a lesson
• Provide documentation of students' progress for parents or
administrators
• It is useful in evaluating the instructional system itself
(formative/summative evaluation) and for early
determination of performance measures before the
development of lesson plans and instructional materials.
9. GRAPHING PERFORMANCE
• The goal of continuous monitoring and charting of student
performance is twofold. First, it provides you, the teacher,
information about student progress on discrete, short-term
objectives. It enables you to adjust your instruction to review or
re-teach concepts or skills immediately, rather than waiting until
you've covered several topics to find out that one or more
students didn't learn a particular skill or concept. Second, it
provides your students with a visual representation of their
learning. Students can become more engaged in their learning
by charting and graphing their own performance.
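Continuous charting of the kind described above can be as simple as a text bar per assessment, which even students can maintain themselves. A minimal sketch; the weekly score series is illustrative:

```python
# Print a simple run chart of one student's weekly percent-correct
# scores so progress on short-term objectives is visible at a glance.
scores = [40, 55, 60, 80, 90]  # hypothetical scores, one per week

for week, score in enumerate(scores, start=1):
    bar = "#" * (score // 10)  # one '#' per 10 percentage points
    print(f"Week {week}: {bar} {score}%")
```

A flat or falling bar signals a concept to re-teach now, rather than after several more topics have been covered.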
10. DATA
• OBSERVATIONAL ASSESSMENT is the most common form of formative
assessment. Teachers can circulate the room to monitor students' progress.
If students are working independently or in groups, teachers should
intervene when students do not understand the material. Teachers
can also take note of students' comments and participation levels during
class discussions to gauge their learning.
• SELECTED RESPONSE ASSESSMENTS are any type of objective exam where
there is only one correct answer for each question. Multiple choice, fill-in-
the-blank, matching and true/false questions are all types of selected
response assessments. This type of assessment allows the teacher to score
exams quickly and with a large degree of reliability in scoring from one exam
to another.
• CONSTRUCTED RESPONSE ASSESSMENTS require students to generate their
own response rather than selecting a single response from several possible
ones. These exams are much more subjective as there is not a single correct
answer. Instead, teachers must grade either with a rubric or holistically to
maintain a fair degree of reliability
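Selected-response scoring against a single answer key is mechanical, which is why its reliability from one exam to the next is high. A minimal sketch; the answer key and responses are invented:

```python
# Score a selected-response exam against an answer key; each item has
# exactly one correct answer, so scoring is fully objective.
answer_key = {"Q1": "b", "Q2": "true", "Q3": "d"}

def score(responses: dict) -> float:
    """Return percent correct for one learner's responses."""
    correct = sum(responses.get(q) == a for q, a in answer_key.items())
    return 100 * correct / len(answer_key)

print(score({"Q1": "b", "Q2": "false", "Q3": "d"}))  # two of three correct
```

By contrast, constructed-response and performance assessments need a rubric precisely because no such key exists.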
11. DATA CONTINUED
• PERFORMANCE ASSESSMENTS require students to perform as a means
of showing they understand class material. The types of performances
can include actual performing, as in a class debate, or performance by
creating, as in making a brochure or TV ad. These assessments evaluate
complex cognitive processes as well as attitude and social skills, and
students often find them engaging.
• PORTFOLIO ASSESSMENTS evaluate a student's progress over the
course of the semester. It is more than a one-time picture of what a
learner has accomplished. Portfolios include all of a student's work in a
particular area. For example, a student in an English class could have a
portfolio for a research paper that includes note cards, outlines, rough
drafts, revisions and a final draft. The teacher would evaluate the
portfolio as a whole, not just the final draft, to see how the student has
grown.
12. DATA SEQUENCING
• The information on the clarity of instruction, impact on learner, and
feasibility of instruction needs to be summarized and focused.
• Particular aspects of the instruction found to be weak can then be
reconsidered in order to plan revisions likely to improve the instruction
for similar learners
13. BEHAVIORS
• A step-by-step determination of what people are doing when they
perform the goal and what entry behaviors are needed.
• Involves identification of the context in which the skills will be learned
and the context in which the skills will be used
14. PRETEST AND POSTTEST
• After the students in the one-to-one trials have completed the
instruction, they should review the posttest and attitude questionnaire in
the same fashion.
• After each item or step in the assessment, ask the learners why they
made the particular responses that they did.
• This will help you spot not only mistakes but also the reasons for the
mistakes, which can be quite helpful during the revision process.
15. INSTRUCTIONAL STRATEGY
• Instructional strategy is an overall plan of activities to achieve an
instructional goal; it includes the sequence of intermediate objectives
and the learning activities leading to the instructional goal.
• Its purpose is to identify the strategy to achieve the terminal objective
and to outline how instructional activities will relate to the
accomplishment of the objectives.
• Emphasis is given on presentation of information, practice and
feedback, and testing.
• A well-designed lesson should demonstrate knowledge of the
learners, the tasks reflected in the objectives, and the effectiveness
of teaching strategies.
17. REVISION
• Use the data, your experience, and sound learning principles as the
bases for your revision.
• The aim is to revise the instruction so as to make it as effective as
possible for a larger number of students.
• Data from the formative evaluation are summarized and interpreted to
identify difficulties experienced by learners in achieving the
objectives and to relate these difficulties to specific deficiencies in the
materials.
18. REVISING
• Omit portions of the instruction.
• Include other available materials.
• Develop supplementary instruction.
19. SUMMARY
• The final step in the design and development process (and the first step
in a repeat cycle) is revising the instruction. Data from the formative
evaluation are summarized and interpreted to identify difficulties
experienced by learners in achieving the objectives and to relate those
difficulties to specific deficiencies in the instruction. It is used to
reexamine the validity of the instructional analysis and the assumptions
about the entry behaviors and characteristics of learners. It may be
necessary to reexamine statements of performance objectives and test
items in light of the collected data.