ADDIE Model Phases/ Elements
Assessment
Distinguishes current HRD gaps from systemic (non-HRD) gaps, anticipates HRD needs based on organizational strategy, and anticipates HRD needs due to changes in technology
You are to briefly tell how you would conduct an assessment. Then, based on the case, provide data from the case as well as “dummy” data you need to create to demonstrate evidence of assessment (results). Describe what the assessment looked like, as well as your interpretation of it. Address the italicized rubric statement above.
Define Purpose
Define Assessment Tools/Methods to Use
Collect and Compile Assessment Data – (if needed create dummy data for analysis)
Strategic/organizational
Task Analysis
Person Analysis
Anticipate HRD needs due to changes in technology
Provide Data Analysis & Conclusions/Prioritization
Identify system (non-HRD) issues that are preventing effective performance that cannot be effectively addressed by training and development interventions
Design
Defines strategy, objectives, method (fitted to the training target—skill, knowledge, interpersonal competency, or experiential growth), materials, and media (classroom or technological). You need to address the italicized rubric statement above. You are to have no more than 4 training objectives (Mager criteria).
Define Purpose/strategy
Write Training Objectives
Define Criteria for evaluation
Select Trainers (Criteria for selection)
Draft Lesson Plan (see text for example-p. 153; Figure 5-2)
Select Training Methods and Media (preliminary)
Draft Training Materials
Draft Schedule Program/course
Development
Organizes content assets (developed in the design phase) to plan timely and logical delivery of all learning components with proper integration.
You need to address the italicized rubric statement above.
Implementation
Determines contractor versus in-house facilitator, type of facility, use of technology, equipment, materials, scheduling/sequencing, constraints, and pilot test if feasible
Define Purpose
Decide Make or Buy: Justify
Select Instructional Methods for Training Delivery
Select Any On the Job Methods
Select Job Instruction Training
Select Classroom Instruction
Select Audiovisual Media
Select Computer Based Training (Classroom-Based)
Select Self-Paced/Computer-Based Training Media and Methods
Select Arrangements for the Physical Environment
You need to address the italicized rubric statement above.
Evaluation
Evaluates data using the four Kirkpatrick levels—reactions, learning (retention), behavior (transfer), and organization-level results
You need to address the italicized rubric statement above.
Define Purpose
Select Criteria and Methods of Evaluation
Choose Research Design
Choose Data Collection Methods
Identify Means of Assessing HRD in Monetary Terms
Present Evaluation Data and Interpretation: Was the Training Successful? Why or Why Not?
SAMPLE CASE ANALYSIS
This document consists of two parts: (1) a case study similar to
those you will be
discussing in class and analyzing in writing this semester, and
(2) a sample case analysis. Its
purpose is to assist you in satisfactorily completing the case
analysis requirement of the course.
The sample case analysis is not perfect but its content and
organization would likely earn
an “A.” It exemplifies one way of analyzing a case that is clear,
concise, and well-argued. There
are other ways that are just as effective, so this case analysis
should not be viewed as the model to
which your own analysis must conform. Use it as a stimulus
rather than as a straitjacket.
CASE: THE JOB CORPS
In September 1966 a staff member of the Office of Research,
Plans, Programs, and
Evaluation (RPP&E) within the Office of Economic Opportunity
(OEO) published a staff paper
entitled “A Framework for the Evaluation of Training
Programs.” Mr. Timothy O’Brien, an
analyst in RPP&E, had been asked to decide if the method
suggested in the paper might help
RPP&E evaluate the Job Corps, one of OEO’s manpower
training programs.
The author’s approach was to ascertain the current and
projected costs of the different
programs, their enrollee characteristics, and their expected
yearly earnings flows, and then to
calculate how much those flows would need to be altered, under
differing circumstances, to
justify the costs of the various programs.
The paper began with a discussion of the proper costs and
benefits to be considered:
There has been considerable discussion about which costs and
benefits ought to
be included in evaluating a particular training program. An
economist would include
only the real resources expended and gained, omitting any
transfer-type expenditures or
savings due to the program. However, a bureaucrat
administering a single-program
budget would be tempted to include all financial costs, since
any expenditures avoided in
other sectors as a result of the program would not be added to
his or her program’s
budget.
Because this paper is written from a multiagency viewpoint, it
will include only
real resources, which means that all trainee remuneration is
considered to be a transfer,
not a direct cost, and no estimate of reduced transfer payments
is made in computing
benefits. It also implies that an opportunity cost of removing
the trainees from the labor
force must be included, based on expected trainee earnings over
the period of program
participation. Program costs, however, will be set forth in both
real and financial terms
for anyone desiring to perform computations on other bases.
The “Framework” paper presented the following summary of
projected program and real
resource costs for the Job Corps:
Program cost ................................................. $5,625
Opportunity cost ................................................ 743
Total cost ................................................... $6,368
Less trainee remuneration .................................... (1,080)
Real resource cost ........................................... $5,288
The “program cost” of $5,625 per trainee was derived as
follows:
1. A figure of $7,500 per “slot” was used as OEO’s projection1
of average annual costs for the
Job Corps.
2. Assuming an average enrollee stay of nine months, the Job
Corps program per trainee was
taken to be 75 percent of $7,500, or $5,625.
1 Historically, as of September 1966, the Job Corps had spent
$499,923,000 and “affected” 44,531
individuals at a cost of $11,226 per “affectee.”
The “opportunity cost” was the “expected earnings of a trainee
over a nine-month span,
the usual expected program enrollment period . . . calculated
from Census data,” and represented
the cost of removing the trainee from the labor force during the
period of his or her enrollment.
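The cost summary above is simple arithmetic; as a check, a minimal sketch using the paper’s own figures:

```python
# Per-trainee cost summary for the Job Corps, using the "Framework"
# paper's figures: $7,500 annual cost per slot, a nine-month average
# stay, $743 in foregone trainee earnings, and $1,080 in trainee
# remuneration (a transfer, not a real resource cost).
annual_cost_per_slot = 7_500
avg_stay_fraction = 9 / 12          # nine-month average enrollee stay
opportunity_cost = 743              # expected earnings foregone during training
trainee_remuneration = 1_080        # transfer payment, netted out

program_cost = annual_cost_per_slot * avg_stay_fraction   # $5,625
total_cost = program_cost + opportunity_cost              # $6,368
real_resource_cost = total_cost - trainee_remuneration    # $5,288
```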
The “Framework” paper recognized that “cost per enrollee data
often do not give the
most meaningful index to program impact.” Accordingly,
figures were derived for program
costs for Job Corps trainees, graduates, placements, and
affectees. A summary of these costs is
shown below:
Cost per         Cost per         Cost per         Cost per
Trainee          Graduate         Placement        Affectee
$5,288           $7,051           $8,813           $8,135
The derivation of the costs per Job Corps graduate, placement,
and affectee were explained in the
paper as follows:
Job Corps assumes a steady state cost of $7,500 per slot or
$5,625 per enrollee based on a
nine-month stay. Various completion assumptions are made
with a 75 percent
completion factor being a rough median of the different
estimates. The Program
Memorandum assumes an 80 percent success rate for graduates,
with success defined as a
steady job, returning to school, more advanced training, or
military service. An estimated
20 percent of nongraduates are assumed to be affected by the
program. Quite clearly
these figures are sensitive to the assumptions and a more
thorough investigation will
require additional and more complete data.
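The paper does not show the division behind the per-unit figures, but they are consistent with spreading the $5,288 real resource cost per trainee over the stated fractions (75 percent completion, 80 percent success among graduates, 20 percent of nongraduates affected). A sketch of that reconstruction:

```python
# Hedged reconstruction of the per-unit cost table; the arithmetic is
# inferred from the paper's stated assumptions, not shown in it.
real_resource_cost = 5_288
completion_rate = 0.75        # fraction of enrollees who graduate
success_rate = 0.80           # fraction of graduates counted as placements
affected_nongrad_rate = 0.20  # fraction of nongraduates assumed affected

cost_per_graduate = real_resource_cost / completion_rate
cost_per_placement = real_resource_cost / (completion_rate * success_rate)
# "Affectees" = successful graduates plus affected nongraduates: 0.60 + 0.05 = 0.65
affected_fraction = (completion_rate * success_rate
                     + (1 - completion_rate) * affected_nongrad_rate)
cost_per_affectee = real_resource_cost / affected_fraction

print(round(cost_per_graduate), round(cost_per_placement), round(cost_per_affectee))
# → 7051 8813 8135, matching the table
```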
Turning to the discussion of benefits from the various training
programs being considered, the
“Framework” paper stated:
The benefits from any government program would be the
discounted increase in
the national product attributable to that program. In a
manpower retraining program, the
most obvious source of benefits is the differential earnings of
the trainees after
completion of a retraining program. Three problems, however,
make the differential
earnings of trainees a less than perfect measure of the benefits
of a federal retraining
program.
First, training takes time, and overall earnings during that time
presumably rise in
both real and nominal terms. Therefore, part of a trainee’s
earnings differential may in
reality be merely a secular increase in earnings, and thus not a
program benefit.
Second, accepting earnings differential as a benefit implicitly
assumes that the
training program has not merely "shuffled the faces in the job
queue." It must be
recognized that, to some unknown extent, graduates from
training programs do not obtain
additional jobs, but simply take jobs away from other members
of the labor force, most
likely their socioeconomic peers. For the present calculations,
no estimates are made of
the magnitude of this problem.
Third, we don't yet know whether the program provides skills
and experience that
will enhance the individual’s expected earnings for the rest of
his or her life, or whether it
merely provides a one-shot benefit and the trainee will shortly
return to his or her old
earnings pattern.
Two sets of calculations were made to account for this
difficulty. In both sets, a
5 percent social rate of discount was used. As yet, the
determination of a proper discount
rate is unresolved, but the selection of 5 percent can be justified
on the grounds of
reasonableness and workability. The conclusions of the
analysis are sensitive to the
choice of a discount rate.
The first set of calculations—the simple annuity approach—
assumed that any
earnings differentials that occurred would be constant over a
finite time period. The
annual earnings differential required to match the total cost of
the program was calculated
over two time horizons—15 and 30 years. The results for the
Job Corps program are
shown below:
                        Annual Earnings Differential
                       Per        Per        Per        Per
                       Trainee    Graduate   Success    Affectee
15-year horizon . . .  508        675        847        772
30-year horizon . . .  343        456        572        522
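The paper does not print the annuity formula it used, but an ordinary annuity at the 5 percent social discount rate reproduces the per-trainee row to within a dollar or two. A sketch under that assumption:

```python
# Constant annual earnings differential whose present value, discounted
# at `rate` over `years`, equals the program cost. The paper's exact
# method is not shown; this ordinary-annuity version gives roughly
# 509 and 344 per trainee versus the paper's 508 and 343.
def required_annual_differential(cost: float, rate: float, years: int) -> float:
    annuity_factor = (1 - (1 + rate) ** -years) / rate
    return cost / annuity_factor

per_trainee_15 = required_annual_differential(5_288, 0.05, 15)
per_trainee_30 = required_annual_differential(5_288, 0.05, 30)
```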
The second and more ambitious analysis—the enhanced
educational characteristics
approach—attempted to correlate earnings with the level of
education achieved, and then
compute the educational level enrollees must achieve in order to
realize earnings equal to
the costs of the program.
Census data were obtained showing expected annual earnings
by age and race for males
aged 18 to 24 through 55 to 64, classified according to years of
school. The paper stated:
The total expected earnings for someone with an 8th-grade
education were then
subtracted from those of someone with a 12th-grade education,
and the remainder was
divided by 4 to arrive at a rough approximation of the value (in
terms of additional
expected earnings) of a year's education at a given age level.
The 1959 numbers were
then inflated by the 1.22 coefficient to arrive at more realistic
1965 figures. Quite clearly
other things such as intelligence, motivation, and family
resources are interrelated with
additional schooling so that it is unrealistic to attribute entire
observed earnings
differentials to education. At present, several attempts have
been made to standardize for
other characteristics; and while precise calculations are clearly
untenable for the present
study, the Denison 60 percent factor will be used as the
coefficient of adjustment for
differences in education levels, meaning that 60 percent of the
observed higher income
differences by education is attributable to the education and the
other 40 percent is
explained by other factors.
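The adjustment chain in the quotation can be sketched as follows; the two lifetime-earnings inputs are hypothetical placeholders, since the actual 1959 Census values are not reproduced in the case:

```python
# Value of one year of high-school education per the paper's method:
# (12th-grade minus 8th-grade lifetime earnings) / 4, inflated to 1965
# by the 1.22 coefficient, then scaled by the Denison 60% factor.
# The earnings inputs below are HYPOTHETICAL, for illustration only.
earnings_8th_grade = 120_000    # hypothetical lifetime expected earnings
earnings_12th_grade = 160_000   # hypothetical lifetime expected earnings
inflation_coeff = 1.22          # paper's 1959 -> 1965 adjustment
denison_factor = 0.60           # share of differential attributed to education

value_per_year_1959 = (earnings_12th_grade - earnings_8th_grade) / 4
value_per_year_1965 = value_per_year_1959 * inflation_coeff
adjusted_value = value_per_year_1965 * denison_factor
```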
The increased annual earnings power of an additional year's
schooling (at high school
level) was then plotted against age for both whites and
nonwhites. Then, using statistical
correlation techniques, mathematical functions were found that
would "fit" each of the observed
plots. The value of an additional year's schooling was seen to
be less at all ages for nonwhites
than for whites. The paper made the assumption that
"nonwhites received a lower return on time
spent in school due to inferior education and not economic
discrimination." It was also assumed
that identical training would be provided to all enrollees in a
particular program, and that this
training would have the same payoff as white education for all
participants regardless of race.
Accordingly, the mathematical function for whites was adopted.
This function was refined in several respects. For example, it
was modified to enable the
analyst to compute the present value of an incremental year's
education for an individual, given
the individual's age and the age at which training occurred.
This permitted the calculation of
data such as the following:
Trained at Age    Work until Age    Present Value of One Year’s Education
      18                65                        $4,279
      30                65                         3,528
      45                65                         1,872
      55                65                         1,198
The final step was to divide the real resource program costs of
the programs by the value
of a year's education to arrive at an estimate of the degree to
which enrollee characteristics would
have to be changed to justify the costs of each program. For the
purposes of calculation, the
average entrant into the Job Corps was assumed to be 18 years
old. The results for the Job Corps
program are summarized below:
Degree of Change of Enrollee Characteristics to Justify Cost

                Per Trainee    Per Graduate    Per Placement    Per Affectee
                  A*    B*       A*    B*        A*    B*         A*    B*
                 1.24   —       1.65   —        2.06   —         1.90   —

* All adult programs are computed on the assumption of a 30-year-old and
45-year-old input, corresponding to columns A and B.
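The "degree of change" figures are not derived explicitly in the case, but they match the per-unit real resource costs divided by the $4,279 present value of one year's education for an 18-year-old:

```python
# Reconstruction (inferred, not shown in the paper): real resource unit
# cost divided by the present value of one year's education at age 18.
pv_year_education_at_18 = 4_279
unit_costs = {"trainee": 5_288, "graduate": 7_051,
              "placement": 8_813, "affectee": 8_135}

degree_of_change = {k: round(cost / pv_year_education_at_18, 2)
                    for k, cost in unit_costs.items()}
# Matches the table's column A: 1.24, 1.65, 2.06, 1.90
```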
Questions
1. Analyze the methodology of the study.
2. Suggest alternative approaches which might improve the
evaluation of anti-poverty programs
such as Job Corps.
ANALYSIS
1. ANALYZE THE METHODOLOGY OF THE STUDY.
Job Corps provides examples for examining alternative purposes
and approaches of
evaluating ongoing programs. There are two basic approaches
to the review of ongoing
programs: (1) the analysis of performance, and (2) the review of
the investment decision.
The analysis of performance focuses on operating management.
Its methodology is to
compare the actual performance of an organization or
organizational unit with its planned
performance as defined in a previously-determined budget. The
analysis compares outputs and
achievements of the organization as well as the cost of inputs.
The comparison should take into
account whether the circumstances affecting actual performance
are significantly different from
circumstances assumed in the formulation of the budget. The
focus of the analysis or
16. performance tends to be short term. Its goal is to insure that
operating managers are
implementing programs according to the plan of top
management; in addition, this type of
analysis reports to top management whether a program is
producing the expected results.
The review of the investment decision is a process aimed at
thoroughly examining all the
programs and operations of an organization over a period of
several years. This process is
sometimes called zero-based review, and it relies on the
analytical tools of capital budgeting. It
seeks to determine whether the objectives of a given program
are still consistent with the goals
and policies of the organization and whether the current form of
the program is the most
effective and efficient means of attaining these objectives.
“A Framework for the Evaluation of Training Programs”
provides the type of analysis
appropriate for investment decision making and for a review of
the investment decision, though
it does not explicitly report on the performance of antipoverty
programs. It begins with a digression
on the point of reference of the analyst, making a distinction
between the financial resources
available to a particular agency and the real resource costs to
society. Since the U.S.
Government and all its agencies exist to promote the general
welfare, the appropriate point of
reference for the analyst is one based on the real resource costs
to the society. Therefore, sound
analysis should always consider the real resource costs and
benefits in evaluating alternative
courses of action instead of taking a parochial approach that
seeks to maximize the financial
resources available to a particular agency. This study, in fact,
does use a real resource approach
in its evaluation.
Consistent with a real resource approach to analysis, an agency
must consider both out-
of-pocket costs and opportunity costs in its definition of costs.
“A Framework” underscores both
the conceptual issues involved in determining opportunity costs
and the technical problems of
measuring out-of-pocket costs.
Transfer payments are not a real cost of the OEO programs.
Transfer payments simply
redistribute the consumption of real resources within the private
sector; transfer payments do not
withdraw real resources from private use and reallocate them to
public use. The opportunity
costs to the society of the OEO programs are the wages
foregone by the enrollees during their
training period. These opportunity costs, however, may be less
than the market price of labor if
OEO programs are drawing from a chronically unemployed or
underemployed labor pool. I
believe that this study is conceptually correct in its
understanding of opportunity costs, but that it
probably overestimates their value.
The measurement of out-of-pocket costs in “A Framework,”
however, is very crude. The
study divides a project of total program costs for the Job Corps
by an estimate of the number of
annual “slots” available in the Job Corps; it then adjusts this
cost per slot for the average enrollee
stay of nine months. More information is needed to determine
the validity of these cost
estimates; in particular, the nature of the costs (fixed vs.
variable) and the capacity utilization of
the slots available would be critical information in estimating
the cost per enrollee.
Even if accurate total cost data were available, the relevant
information is not the cost per
enrollee. Rather, the cost per success would give a better
measure of the inputs utilized by these
programs. “A Framework” attempts to provide this information
by making several heroic
assumptions concerning the definition of success and the
success rate of the programs (assumed
to be 80 percent of graduates). Since these assumptions are
unsubstantiated by research and
data, they cannot be used to generate reliable unit cost data.
However, the 80 percent success
rate may be interpreted as a commitment by OEO management
to produce these results (i.e., as a
measure of output). Failure to perform according to these
results can serve as a signal of
problems within the Job Corps.
This study defines program benefits (i.e., output) in terms of
increase in earnings of
trainees. Its approach is to determine what level of benefits
will justify program costs under
different assumptions. It recognizes three problems in the
measurement of benefits:
1. It is necessary to separate out the impact of inflation and the
secular increase in
real wages from the effect of OEO training.
2. A test must be made to determine the extent to which OEO
training actually
improves the employment level of the disadvantaged or simply
redistributes an
unchanged number of jobs among the disadvantaged. I may
note that in the
strong economy of the 1960s, the reshuffling effect was
probably minimal. In the
difficult economic times of the 1980s, this reshuffling is very
likely. Some claim,
however, that the shuffling is useful, in and of itself, for people
who would
otherwise be unemployed.
3. It is necessary to project the permanence of the earning
increase and its behavior
into the future. There are no data which directly provide
information on this
question.
This study also presents methods for measuring benefits: (1)
the annuity method and (2)
the enhanced educational characteristics (EEC) method. The
annuity method presented in the
case text is simple and straightforward. The EEC method
attempts a more complex
measurement. Using census data, it estimates the effect in
terms of increased earnings of an
additional year of education at a given age level. The analyst
attempts to correct for the
influence of noneducational factors on earnings by assuming
that sixty percent of the changes in
income are due to education. The study then uses this projected
relationship between education
and earnings to estimate what change in the educational
characteristics of enrollees OEO must
cause in order to generate the required change in earnings that
will justify program costs.
The major flaw in “A Framework” is its assumptions
concerning the causal link between
inputs and results. The annuity method simply reports what
effect an OEO program must have
without offering any evidence supporting the validity of the
causal link between the program and
the change in earnings. The EEC method uses a very crude data
base and very crude methods in
order to estimate the very complex relationship between
education and earnings; the use of the
Denison Factor is a highly vulnerable assumption. Neither
approach is credible.
2. SUGGEST ALTERNATIVE APPROACHES WHICH MIGHT IMPROVE THE
IMPROVE THE
EVALUATION OF ANTI-POVERTY PROGRAMS SUCH AS
THE JOB
CORPS.
The review process in the private sector can serve as a model
for developing general
procedures for the evaluation of ongoing programs.
Conceptually there are two types of review:
(1) operational review and (2) a strategic review of the viability
of major product lines.
Operational review focuses on the performance of operating
management. It requires a
definition of standards of measure and a plan which defines
goals and inputs for achieving these
goals in terms of the standards. Operational review consists of
both comparing the actual
performance of operating management with the expected
performance according to the plan and
analyzing the causes and managerial implications of variance or
differences. In a manufacturing
environment where a considerable portion of costs are
engineered costs, highly developed
standard-cost systems can be used for the purposes of control
and evaluation. Most of the costs
in the OEO environment are comparable in nature to the
discretionary costs of a private firm.
Top managers of OEO should negotiate and develop an
agreement with program managers
concerning a level of expenditure and the appropriate goals
given this level of expenditure. An
example of such a goal would be an agreement that a certain
program (with a given budget)
would accept X number of trainees and that Y percent of these
will have jobs with a Z percent
increase in hourly wages within six months after training. This
type of planning makes
individual managers accountable for their programs and informs
top management of the progress
(success or failure) of each program. This type of information
is totally absent from the OEO
review process.
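The negotiated X/Y/Z agreement described above lends itself to a simple plan-versus-actual comparison; a hypothetical sketch (all names and figures are illustrative, not from the case):

```python
# Hypothetical plan-vs-actual check for a negotiated program agreement:
# accept X trainees; place Y percent in jobs with a Z percent hourly
# wage increase within six months of training.
plan = {"trainees": 500, "placement_rate": 0.60, "wage_increase": 0.15}
actual = {"trainees": 480, "placement_rate": 0.52, "wage_increase": 0.17}

for metric, target in plan.items():
    status = "met" if actual[metric] >= target else "MISSED"
    print(f"{metric}: plan={target}, actual={actual[metric]} [{status}]")
```

Misses against the agreed figures would then serve as exactly the kind of signal to top management that the analysis says is absent from the OEO review process.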
The strategic review of programs or product lines as a
formalized practice is not
widespread in either the public or private sector. Strategic
review analyzes the objectives of a
program to insure that they remain consistent with the overall
strategy of the organization and
examines the return in order to insure that the program is a
worthwhile undertaking. Strategic
review is an intensive evaluation of a program. It is neither
feasible nor desirable to review all
the programs of an organization within a given year. Strategic
review should be part of the
budget process and seek to evaluate several major programs
each year so that all product lines of
the organization are reviewed every four or five years.
Finally, it should be noted that there is a structural flaw in the
organization of OEO. The
Office of RPP&E has responsibility for evaluating programs as
well as planning and developing
programs. Since planners and programmers are likely to be
committed to the programs they
develop, they are not likely to produce objective evaluations of
those programs.
Case Assignment Instructions
General Instructions Overview
The outcome of this assignment is a written report/research
paper.
You are to use at least 5 ACADEMIC references to support your
paper. Include these in your Reference section.
Read the instructions very carefully. ALSO, open both Grading
rubrics (2) and read over them carefully. Prepare an outline
using only the headings of "Strategic Application" Rubric #1.
Next, go to the Harvard Publishing website and purchase and
download your case. Read it slowly and carefully the first time
through. Then, read it again making notes along the way. As
you progress through the text material, continue to review the
case and start filling in the outline, keeping in mind the case
instructions, the ADDIE model and the Rubric requirements. As
you start to write your paper, review APA formatting
requirements, grammar, punctuation, spelling, word usage,
sentence structure, etc. Your grade will be based on these
rubrics.
Assignment Details
This course focuses on the models, concepts, and phases of the
ADDIE (ADiME) Model of assessing, designing, developing,
implementing and evaluating a training and development
program. This paper is an opportunity to apply the models and
concepts to the case study entitled “A.P. Moller – Maersk
Group: Evaluating Strategic Talent Management Initiatives.”
The elements of your paper are to be helpful to Moller and
Maersk (M&M); to improve their effectiveness through the
development of a training program. A suggested way to
accomplish this project is to read and become familiar with the
case study first. As you learn about the concepts and models in
the text, see how they may apply to the situation (case) to most
benefit the organization. Do not wait to do all the writing
during the last few days before the deadline. So, the idea is --
you learn and then you write (apply) the concepts/models to the
case. If you do this week to week, writing the paper will be
much more manageable.
Using the Training and & HRD Process Model (Figure 1-7 of
text, page 27) as the roadmap, you are to develop a Leadership
Development training program for the “mission critical” group
which is part of the top 120 positions in M&M. Note the text
has chapters providing detailed information about each phase of
this model. Note that the text model is called ADiME
(assessment, design, implementation {also includes
development but not framed that way}, and evaluation), which
folds in development as part of implementation. For the
purposes of this paper, organize the paper around the ADDIE
Model (Google it).
Also note the additional components (Coaching and Performance
Management, etc.). These components are in addition to the ones
provided in the author’s model (page 27). However, the additional
components added to the model below are included in other chapters
of the text as well.
The grading rubric uses a slightly different model called
ADDIE (assessment, design, development, implementation, and
evaluation). So make sure that when you write your paper that
you address the items in the ADDIE grading rubric!
As you write your paper, present the components in the order
provided below/ next page. Headings are in bold and these must
be included in your paper in the order displayed below.
Additionally, APA formatting must be incorporated into the
paper. In writing anything you must keep the readers in mind
and write in such a way that the reader finds it easy to follow
your writing without having to read it 2-3 times in an effort to
understand what you are trying to communicate. You may use
headings interspersed within and in addition to the headings
(bold-below) that the paper requires.
After you write the introduction of your paper, you’ll need to
include headers corresponding to the grading rubric –
assessment, design, development, implementation, and
evaluation. Certainly, you may have sub headers if that helps
the organization of your paper. The statements below are from
the grading rubric. You need to address these ADDIE Model
phases as you develop a training program for M&M. These
elements are listed below/ next page and include elements from the
grading rubric.
General Outline
Assessment
Distinguishes current HRD gaps from systemic (non-HRD)
gaps, anticipates HRD needs based on organizational strategy,
and anticipates HRD needs due to changes in technology
Briefly tell how you would conduct an assessment. Then based
on the case provide data from the case
Design your Proposed Solutions
For example ... strategy, objectives, method (fitted to the
training target—skill, knowledge, interpersonal competency, or
experiential growth), materials, and media (classroom or
technological). Only use that which is applicable to your
proposals.
How Will You Develop Your