3. • Summative Evaluation
• Expert Judgement and Impact Phase of Summative Evaluation
• Congruence Analysis
• Content Analysis
• Design Analysis
• Impact Analysis
• Existing Materials Analysis
• Comparison of Formative and Summative Evaluation
• Attention, Relevance, Confidence, Satisfaction (ARCS) Motivation Model
• Summative evaluation contains only two stages:
• Expert Judgement
• Impact Judgement
• Sample Questions from a Systematic Instructional Design Perspective
• Rating Forms for Impact Analysis
OBJECTIVE
4. SUMMATIVE EVALUATION
The process of collecting data and information to make decisions about
whether the instruction actually works as intended in the performance
context.
5. EXPERT JUDGEMENT AND IMPACT PHASE OF
SUMMATIVE EVALUATION
Congruence Analysis
Content Analysis
Design Analysis
9. IMPACT ANALYSIS
Typically includes the following parts:
1. Focusing the impact study and establishing criteria and data needs
2. Selecting respondents
3. Planning study procedures
4. Summarizing and analyzing data
5. Reporting results and negotiating resources
10. EXISTING MATERIALS ANALYSIS
The analysis of existing materials follows the same sequence as used to
analyze materials that have just been designed and developed. The analysis
begins with congruence analysis, followed by content analysis, design
analysis, and transfer feasibility analysis.
11. COMPARISON OF FORMATIVE AND SUMMATIVE
EVALUATION
The first difference is related to the purpose for conducting each type of
evaluation.
Formative evaluations are undertaken to locate weaknesses and problems
in the instruction in order to revise it.
Summative evaluations are undertaken after instruction is completed to
determine the impact of the instruction on the learners, their jobs, and
the organization.
12. ATTENTION, RELEVANCE, CONFIDENCE, SATISFACTION (ARCS)
MOTIVATION MODEL
Attention
1. Are strategies used to gain and maintain the learners’ attention (e.g., emotional or personal appeals, questions, thinking challenges, human-interest examples)?
Relevance
2. Is the instruction relevant for the given target groups?
3. Are learners informed and convinced of the relevance (e.g., information about new requirements for graduation, certification, employment, advancement, self-actualization)?
Confidence
4. Are learners likely to be confident at the outset and throughout instruction so that they can succeed?
• Informed of purposes
• Likely to possess prerequisites
• Instruction progresses from familiar to unfamiliar
• Concrete to abstract
• Vocabulary, contexts, and scope appropriate
• Challenges present but realistic
Satisfaction
5. Are learners likely to be satisfied with the learning experience?
14. EXPERT JUDGEMENT
Resembles evaluative decisions made by the designer and the context and
content experts during the design and development of materials.
15. IMPACT JUDGEMENT
Conducted with target learners after they have returned to their jobs; it
focuses on the jobsite and examines three things:
(1) whether the organization’s needs were met following use of the instruction,
(2) whether employees are able to transfer new information and skills to
the job, and
(3) whether an improvement in job performance or productivity is realized.
Outcome data are typically obtained through unobtrusive
observations, questionnaires, document analysis, and job
performance ratings in the performance context.
16. SAMPLE QUESTIONS FROM A SYSTEMATIC INSTRUCTIONAL
DESIGN PERSPECTIVE
1. How clear are the goal(s) and the main objectives of this instruction?
2. How accurate and current is the information included in the instruction?
3. How logical is the sequence of information in the instruction?
17. RATING FORMS FOR IMPACT ANALYSIS
An introduction section
A list of the learning outcomes for the instruction
Questions about the participants’ level of use of the skills taught in the
instruction
Questions about the relevance of the particular skills for their work
Questions about additional support needed for any of the skills
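A rating form built from the sections above yields, for each respondent, one level-of-use rating per taught skill. As an illustration only (the skill names, the 1–5 scale, and the data are assumptions, not from the source), the responses could be summarized with a minimal Python sketch:

```python
# Hypothetical impact-analysis rating-form responses: one dict per
# respondent, mapping each taught skill to a 1-5 level-of-use rating.
# Skill names and values are invented for illustration.
responses = [
    {"writes_objectives": 5, "designs_assessments": 3},
    {"writes_objectives": 4, "designs_assessments": 2},
    {"writes_objectives": 5, "designs_assessments": 4},
]

def mean_rating(skill):
    """Average level-of-use rating across all respondents for one skill."""
    ratings = [r[skill] for r in responses]
    return sum(ratings) / len(ratings)

for skill in ("writes_objectives", "designs_assessments"):
    print(f"{skill}: {mean_rating(skill):.2f}")
```

A low average for a skill would flag it for the "additional support needed" follow-up questions on the form.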
18. REFERENCES/ URL’S
Carey, Lou, and James O. Carey. "Developing Assessment
Instruments." The Systematic Design of Instruction. By Walter Dick. 8th
ed. N.p.: n.p., n.d. 343-368. Print.
19. CHANGE AGENT, REFLECTIVE PRACTITIONER, LIFELONG LEARNER
Reflective Practitioner: Understand their disciplines and how learners learn.
This understanding enables them to make judgments about the
knowledge level of students and to make decisions about representing
the content in ways that facilitate the students’ intellectual growth.
Change Agent: Collaborate with other members of the professional
community and take responsibility for school, curricular, and
instructional decisions which help create student-centered learning
communities.
Lifelong Learner: Continue to expand their repertoire of knowledge and
experience, update their skills, evaluate and enhance their dispositions,
and further their ability to refine and adapt their decision-making
process to more diverse and continuously changing educational settings.