This document outlines an instructional module for training educators on how to use SMART Boards. It includes the module's goals and objectives, instructional strategies, and evaluation plans. The strategies follow a sequence including an introduction, pre-test, presentations, group activities, Q&A, workbooks, and a post-test. Data would be collected from pre- and post-tests to analyze learner performance and the module's effectiveness. Revisions would modify content as needed and adapt to changes in classroom sizes and technology.
1. Implementation and Evaluation Report
Gregory S. Rooks
Dr. Desiree De Priest
Advanced Instructional Design
(EIDT – 6110 – 3)
2. Edu Tech Consultants would like to welcome you to your SMART Board training!
Jeffrey, Laurie, Lori, Jamie & Nancy, Instructional Design Project, (2012)
Edu Tech Consultants. Minneapolis, MN. Walden Press.
3. Unit Goals and Instructional Objectives
At the end of the learning module, the student will be able to:
1. Calibrate/orient whiteboards and apply troubleshooting techniques.
2. Locate the most useful tools and use them effectively.
3. Apply given techniques to create a simple word document using Ink Aware on the SMART Board.
4. Instructional Strategies
In this section, we will outline the sequence of instructional tasks and events that will take place during the instructional module, with a brief description of each.
5. Instructional Strategies
The sequence is as follows:
• A brief introduction video: explaining the company and the reason for the training module.
• Formation of groups by category: learner-centered groups for activities.
6. Instructional Strategies
• Pretesting quiz: an essential-knowledge quiz providing analysis data for module evaluation.
• Handouts and manuals: PowerPoint presentation notes and a manual with handouts for turn-in work.
7. Instructional Strategies
• PowerPoint presentation: explaining whiteboard features and benefits.
• Breakout group activities on whiteboards, based upon facilitator-led activities.
8. Instructional Strategies
• Question and answer: the facilitator will answer questions based upon the groups' activities.
• Workbook manual: instructor-led fill-ins.
9. Instructional Strategies
• Module-end assessment, in two parts:
1. Written test
2. Group whiteboard demonstrations
10.
11. [Diagram: the Dick and Carey systems approach model — Identify Instructional Goal(s); Conduct Instructional Analysis; Analyze Learners & Contexts; Write Performance Objectives; Develop Assessment Instruments; Develop Instructional Strategy; Develop & Select Instructional Materials; Design & Conduct Formative Evaluation of Instruction; Revise Instruction]
Dick, W., Carey, L., & Carey, J. O. (2009). The systematic design
of instruction (7th ed.). Upper Saddle River, NJ: Pearson.
13. [Diagram: the ADDIE cycle — Analyze, Design, Development, Implement, Evaluation]
14. Planning for Evaluation
"…evaluation is used for the purposes of making judgments about the worth or success of people or things (e.g., lessons, programs, projects). Before initiating an evaluation, you must determine its goals." (Morrison, Ross, Kalman, & Kemp, 2011, p. 272)
Morrison, G.R., Ross, S.M., Kalman, H.K., & Kemp, J.E. (2011). Designing effective instruction (6th ed.). Hoboken, NJ: John Wiley & Sons.
15. "Here are just some of the questions that might be explored during the evaluation phase:
• Do learners like the course?
• Do learners achieve the learning objectives at the end of the course?
• Do the learners change their behaviors in the workplace?
• Does the course help the company achieve its business goals?"
ADDIE Evaluation Phase, http://www.intulogy.com/addie/evaluation.html
16. Table 1.1 Entry Skill, Pretest, and Posttest Data Summarization by Percent of Total Possible Objectives

Student Number   Entry Skill Objectives   Pretest Instructional Objectives   Posttest Objectives
1                100                      75                                 100
2                75                       50                                 100
3                50                       25                                 100
Mean             75                       50                                 100
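The column means in Table 1.1 are straightforward to verify; here is a minimal Python sketch, assuming the per-student percentages shown above (the `column_mean` helper is illustrative, not part of the source report):

```python
# Per-student scores as percent of total possible objectives (Table 1.1).
students = {
    1: {"entry": 100, "pretest": 75, "posttest": 100},
    2: {"entry": 75, "pretest": 50, "posttest": 100},
    3: {"entry": 50, "pretest": 25, "posttest": 100},
}

def column_mean(scores, key):
    """Mean of one column (entry, pretest, or posttest) across all students."""
    values = [s[key] for s in scores.values()]
    return sum(values) / len(values)

for key in ("entry", "pretest", "posttest"):
    print(f"{key}: {column_mean(students, key):.0f}")
# entry: 75
# pretest: 50
# posttest: 100
```

The printed values match the Mean row of the table, confirming the summarization.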
17. Table 1.2 Student Performance on the Pretest and Posttest by Objective

Objective      ONE        TWO        THREE      FOUR
               PR   PS    PR   PS    PR   PS    PR   PS
Student 1      1    1     1    1     1    1          1
Student 2      1    1     1    1          1          1
Student 3           1          1          1     1    1
% Mastering    67   100   67   100   33   100   33   100
Difference     33         33         67         67

PR = pretest; PS = posttest; 1 = mastered
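The % Mastering and Difference rows in Table 1.2 follow from the mastery cells by simple arithmetic; here is a sketch in Python (the per-cell pretest placement is reconstructed from the percentage row, and `pct_mastering` is an illustrative name):

```python
# Mastery matrix (1 = mastered, 0 = not) for Objectives 1-4, one row per student.
pretest = [
    [1, 1, 1, 0],  # student 1
    [1, 1, 0, 0],  # student 2
    [0, 0, 0, 1],  # student 3
]
posttest = [
    [1, 1, 1, 1],
    [1, 1, 1, 1],
    [1, 1, 1, 1],
]

def pct_mastering(results):
    """Percent of students mastering each objective, rounded to a whole percent."""
    n = len(results)
    return [round(100 * sum(col) / n) for col in zip(*results)]

pre = pct_mastering(pretest)
post = pct_mastering(posttest)
diff = [b - a for a, b in zip(pre, post)]
print(pre)   # [67, 67, 33, 33]
print(post)  # [100, 100, 100, 100]
print(diff)  # [33, 33, 67, 67]
```

The difference row shows the gain per objective; Objectives 3 and 4, with the largest gains, started from the lowest pretest mastery.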
18. Table 1.3 Pretest/Posttest Graph Showing Learner Performance
[Bar chart: number of students mastering Objectives 1–4 on the pretest vs. the posttest]
19. Table 1.4 Item-by-Item Analysis Table
21. What conclusions can be drawn from the data?
"Two basic types of revisions you should consider with your materials are changes made to the content or substance of the materials to make them more accurate or more effective as a learning tool and changes related to the procedures employed in using your materials." (Dick, Carey, & Carey, 2009, p. 295)
22. What conclusions can be drawn from the data?
• Changing the content and substance of the materials for Objective 3, item 5 and Objective 4, item 9 will make them more accurate and effective as learning tools.
23. What conclusions can be drawn from the data?
• Further changes related to the procedures employed in using the materials will be needed when the SMART Boards arrive and as classes increase in size.
24. What might need further study in future implementations?
• As SMART Board technology advances, new features will be added and will need to be included in future training.
25. A brief analysis of the implications of the data gathered:
• We are cautiously excited about the initial success of the training module and know that close monitoring will ensure its effectiveness!
26. What revisions would you propose to the instructional module?
• A future needs analysis would substantiate adding more content and developing a new series of training modules that extend the initial training.
27. What specific changes or modifications would you recommend for the implementation and evaluation plans?
• Creating a facilitator exit survey would provide critical data as the program runs its course, with several instructors inputting useful data and observations.
• Surveying participants about what future training they would perceive as useful would provide data for a needs analysis.
28. What specific changes or modifications would you recommend for the evaluation materials before the client moves forward with another implementation?
• We recommend extending the participant exit survey and instituting individual skills demonstrations on actual SMART Boards, which would greatly enhance the evaluation materials of this training module.
29. Provide a rationale for your recommendations.
• Training needs will grow as employees become more familiar with current and future features that enhance their ability to perform job-related tasks, creating the need for future training modules.
30. References
ADDIE Evaluation Phase. http://www.intulogy.com/addie/evaluation.html
Dick, W., Carey, L., & Carey, J. O. (2009). The systematic design of instruction (7th ed.). Upper Saddle River, NJ: Pearson.
Jeffrey, Laurie, Lori, Jamie & Nancy. (2012). Instructional Design Project. Edu Tech Consultants. Minneapolis, MN: Walden Press. https://class.waldenu.edu/@@/B8AAD375C2F4C511D0D4E7E1FBD8892D/courses/1/USW1.43915.201260/db/_2080909_1/SMART-Board-Training-Final.doc
Morrison, G.R., Ross, S.M., Kalman, H.K., & Kemp, J.E. (2011). Designing effective instruction (6th ed.). Hoboken, NJ: John Wiley & Sons, Inc.
Walden University, EIDT 6100 Advanced Instructional Design. Course content, Week 6 Discussion: Informally Evaluating Your Implementation.
31. Thank you for viewing this presentation!
And now for the door prizes!
Editor's Notes
Edu Tech Consultants is pleased to present the evaluation of our new training program! Let’s talk about the process for those who may not be familiar with it and look at what the data tells us.
Welcome!
Our training module today is about using the SMART Board.
Let’s take a look at how it’s all put together!
It’s always a good idea to tell your participants the what and why of the training!
What do we already know?
In this initial start up phase, we had no SMART Boards to use as they are all on order at this time.
Concrete data gathering is the first step to evaluating our instructional program and to beginning the process of revising the program and its implementation from a systems perspective.
This is what a systems approach looks like. Notice that the evaluation and revision processes are critical to the success of the instructional design of any training program.
Here are some questions that guide our thoughts as to how to think about changing our training program to make it better:
• "What went well? What was challenging or needed improvement?
• What were your initial impressions upon completing the implementation?
• Did the learners seem to enjoy it?
• What anecdotal evidence do you have of student learning?
• How do you know that they will use what they have learned?" (Walden University)
This is the ADDIE Process. First, information is gathered about the overall goals of the project, the tasks to be learned, and the intended audience in the Analyze Phase. The second phase, Design, starts the process of writing learning objectives, then determines the tasks required and the activities for learning the objectives. The Development Phase is the blueprint stage where activities are created. The next stage is the Implement Phase, where the content is delivered and the materials are tested to make sure they are functional and appropriate for the selected audience. The final stage is the Evaluation Phase. This is when formative and summative assessments are created and applied to make revisions and enhance the program. Notice that revision is part of every phase of the ADDIE Process.
As you can see from these questions, we are thinking ahead about the follow through during and after the training occurs.
To begin our evaluation process, we will first look at the summarized entry skill, pretest, and posttest data presented in Table 1.1. As you can see from the data, not all participants were "tech savvy," yet all were able to achieve the learning objectives.
By separating the data on pretest and posttest results, we see in Table 1.2 that the participants in our training program achieved the desired objectives of the training with a 50% increase in knowledge.
By graphing the results of our data, we see in Table 1.3 that a flat line shows success. This is where our experience leads us to further examine the data to find areas for improvements.
Upon further evaluation of data, in Table 1.4 we find where our training program could use adjustments. Since our test group included only three participants, special attention should be placed on the evaluation process since we will get more accurate data as the program has more participants in the future. This is a chance to realign the training to maximize the learning process as we know it at the present time. Objective 3, item 5 and Objective 4, item 9 need revision at this time.
There is more than one door prize!
We do appreciate your business and look forward to serving all your training needs!