Doctor of Education in Educational Leadership
The Doctor of Education in Educational Leadership program in the College of Education prepares graduates to become effective administrators and visionary leaders. Students learn the skills required to lead organizations, manage change, and apply research and theory to real-world problems.
Executive Educational Leadership EdD program courses are
taught by faculty with both
academic credentials and experience as practitioners.
Coursework combines the
theoretical and methodological foundations of academic
research with an applied
focus, allowing students to develop the professional and
interpersonal wisdom needed
to successfully manage change in complex organizations.
Graduates are well prepared
to lead schools, school districts and organizations and possess
the skills required to
conduct, interpret and evaluate research and data, diagnose and
resolve organizational
challenges, and create programs and policies that affect learning
success.
This cohort-based executive graduate program consists of a
fixed set of courses
offered in a specific sequence, and all students in each cohort
take the same courses in
the same sequence. Courses are offered one weekend per month
to accommodate the
schedules of working professionals.
All courses within the Executive Educational Leadership EdD
program are offered at
Temple University Center City. The program is designed to be
completed on a part-time
basis; students may complete the program in three years.
Related Graduate Degrees
Supporting Materials
1. Transcripts: Submit official undergraduate and graduate transcripts from all accredited institutions you have attended and/or from which you earned credit. Official transcripts can be emailed or sent to the Office of Enrollment Management address listed above (see http://education.temple.edu/admissions/documents).
2. Goals Statement: Include an autobiographical personal statement that explains your reasons for pursuing a doctoral degree in education. The statement should address these questions:
• How have your personal, academic, and professional experiences shaped your research interests, and how might a doctoral program in Education help you explore those interests?
• What academic/professional goals would the program help you to achieve following graduation?
• How does the doctoral program at Temple fit your individual interests, needs, and future goals (including the faculty member whose research best matches your own interests)?
3. Academic Writing Sample: This should be a paper written for a course within the past five years. If applicants do not have a recent paper written for a course, they should compose an op-ed piece on an educational issue of their choosing. The op-ed should be between 400 and 1,200 words and should be the kind of piece that might appear in The New York Times.
4. Recommendations: Submit two letters of reference that provide insight into your academic competence. References from college or university faculty members are recommended. You may request recommendations through your application.
5. Résumé: A current professional résumé is required.
School & Community Partnerships
As part of its mission to give back to the surrounding
community and provide substantive
fieldwork experiences for its students,
Project 2 Program Evaluation Proposal: Project 2 involves designing a program evaluation for students' respective programs (i.e., Advocacy and Organizational Development; Educational Psychology, Applied Research and Evaluation). The proposal should address the impacts of the program, its implementation, or both. The proposal should clearly delineate a feasible evaluation plan that draws on course readings, lectures, exercises, and presentations. The proposal is worth 25% of your total grade and should address all elements described below.
Deliverable: Proposals should address the elements detailed below. Proposals should be typewritten, double-spaced, in 12pt font. Submit proposals by end of day 30 July 2022 via Canvas.
ELEMENTS OF THE PROPOSAL
The proposal shall include: (a) an introduction, (b) method, (c) proposed analysis, (d) discussion, and (e) a reference page. Consult Mertens and Wilson (2019) and the APA Manual (2010) to address all elements of the proposal. Brief descriptions of each section are provided below.
a. Introduction – Provide relevant background and focus for the evaluation proposal. What does the program seek to accomplish? Why is it important? What are the goals, objectives, and purposes of the program? What is the program's "theory of cause and effect" (i.e., why and how will the program accomplish its goals, objectives, and purposes)? What question(s) does the evaluation seek to answer? Why were these questions selected? Is there any literature that can inform the evaluation (reference it appropriately)? See Mertens and Wilson (2019), Chapters 7-8.
This should help you understand the dynamics of the program I am currently in; below are my thoughts and experiences.
The program seeks to develop well-rounded educational leaders who are ethical and prepared to run, and make big decisions for, school systems, school networks, and school districts. The program is run cohort-style: you experience all classes with a set group of classmates who become your network, your resources, and your group-presentation partners. As for the program's theory of cause and effect: the program will accomplish its goals by providing students with exposure to real-world education and school-based problems and projects that must be solved and presented in groups, and by exposing students to courses that require you to develop your understanding as a researcher and get in tune with education research and education's "wicked problems." The evaluator seeks to answer the question of whether the EdD program is being responsive to the drastic change in the needs of schools in America in a post-pandemic world where education is in crisis. Is the program being flexible with scheduling, given that it does not provide a fully remote model and still requires students to come in person? Is it providing enough practical experience through the courses offered? Is the program creating critical and analytical thinkers? How is the program providing consistency for students, considering faculty departures and students needing new advisors? Is the program reasonable considering that it costs about $30k a year, requires summers in person, and does not offer financial aid? Is the EdD worth it, given that it is considered lesser than the PhD and is not a requirement to be a leader in education in America?
Required Courses we take are listed below.

Year 1
Summer II
  EDAD 8461 Ethical Leadership (3 credit hours)
  EPSY 8627 Introduction to Research Design and Methods (3 credit hours)
  Term Credit Hours: 6
Fall
  EDAD 8635 Education Policy Analysis (3 credit hours)
  EDUC 5325 Introduction to Statistics and Research (3 credit hours)
  Term Credit Hours: 6
Spring
  EDAD 8653 Civic Leadership (3 credit hours)
  EDUC 5262 Introduction to Qualitative Research (3 credit hours)
  Term Credit Hours: 6

Year 2
Summer II
  EDAD 8636 Research for Change (3 credit hours)
  EDAD 8755 Organizational Dynamics (3 credit hours)
  Term Credit Hours: 6
Fall
  EDAD 8093 Administration Research Seminar (3 credit hours)
  EDAD 8553 Democratic, Equitable, and Ethical Leadership (3 credit hours)
  Term Credit Hours: 6
Spring
  EDUC 5010 Special Topics in Education (3 credit hours)
  EDUC 9998 Dissertation Proposal Design (3 credit hours)
  Term Credit Hours: 6

Year 3
Summer II
  AOD 5534 Group Facilitation and Consultation (3 credit hours)
  Term Credit Hours: 3
Fall
  EDUC 9999 Doctor of Education Dissertation (3 credit hours)
  Term Credit Hours: 3
Spring
  EDUC 9999 Doctor of Education Dissertation (3 credit hours)
  Term Credit Hours: 3

Total Credit Hours: 45
Chapter 9 Notes
The purpose of evaluation is to determine the merit or worth of an evaluand. That is, we want to know whether a program had the intended effect on its participants as specified by the program's theory and model. Our ability to faithfully and confidently determine the effects of a program is in part determined by the manner in which we design the evaluation. There are many ways to think about designs within the context of evaluation, and designing an evaluation is a complex endeavor. Moreover, it is important to note that different designs can be used for different types of applications. Regardless of how we conceptualize and frame the relationship between the purposes and methods of an evaluation process, there are two major questions that must be explicitly addressed:
1. To what extent are the effects we observe in participants
really due to the program and not
some other reason?
2. To what extent can the results observed in participants be
expected to generalize (extend to)
other situations?
Both of these questions pertain more formally to the concept of validity, and there are two specific forms of validity that we, as evaluators, must be concerned with:
• Internal validity – Refers to the extent to which a research design includes enough control of the conditions and experiences of participants that it can demonstrate a single, unambiguous explanation for a manipulation, that is, cause and effect.
To what extent are the effects we observe in participants really
due to the program and not
some other reason?
When we have adequately attended to issues involving internal validity within the evaluation process, it means that the evaluator has controlled the effects of variables other than the treatment, in order to say with confidence that the results are reflective of the treatment. Hence, we can confidently say that the observed effects are caused by the program and nothing else.
• External validity – Extent to which observations made in a
study generalize beyond the specific
manipulations or constraints in the study
To what extent can the results observed in participants be
expected to generalize (extend
to) other situations?
When we have adequately attended to issues involving external validity, it means that the evaluator has ensured that the participants of the program are representative of the population, and therefore that if the treatment is applied with another group of people from that population under similar circumstances, it should be effective there as well.
WHAT FACTORS DIMINISH OR THREATEN THE VALIDITY OF EVALUATIONS?
We can classify threats to the validity of our conclusions in terms of internal and external threats.
Internal Validity Threats
History: Events occurring during a study (other than the program treatment) that can influence results.
Maturation: Naturally occurring physical or psychological changes in program participants (e.g., growth, development, aging) that can influence results.
Testing: Administration of a test before and after the program might influence scores on the test independent of the program (e.g., familiarity with the test results in changes in scores).
Instrumentation: Having a pretest and posttest that differ in terms of content, structure, format, or difficulty can lead to differences in scores that are due not to the program treatment but to differences in the instruments used.
Statistical regression: Having extreme groups in the program may artificially decrease or increase scores independent of the program treatment. If all members of a group are already scoring at the highest levels and their scores can't go any higher, any observed decline in scores may be due to the test, not the program treatment, indicating a measurement error.
Differential selection: Differences between the groups compared (treatment vs. no-treatment groups) on important characteristics may account for observed differences that are not due to the program treatments.
Experimental mortality: Differential dropout of participants in the treatment and no-treatment groups yields differences in observed effects that are not a function of the program treatment but rather an artifact of attrition within the groups.
Treatment diffusion: Proximity among participants in the treatment and no-treatment groups leads to treatment exposure for the no-treatment group.
Compensatory rivalry: When the no-treatment group outperforms the treatment group, but those differences are due not to treatment effects but to competition (the John Henry effect).
Compensatory equalization of treatments: If one group receives something and the other receives nothing, then any effects on the first group may be due to the fact that this group received something, and not to the specifics of what it received.
Resentful demoralization: When members of the no-treatment group realize they did not get something the treatment group received, they may become demoralized because they are being excluded, not because they did not get the specific treatment.
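The statistical regression threat can be counterintuitive, so here is a minimal simulation of it. All numbers are fabricated for illustration: scores are modeled as a stable "true ability" plus test noise, and an extreme group is selected on the pretest. Even with no treatment at all, the group's posttest mean drifts back toward the population mean.

```python
import random

random.seed(42)

# Hypothetical simulated scores: true ability plus independent test noise.
true_ability = [random.gauss(70, 10) for _ in range(10_000)]
pretest = [a + random.gauss(0, 8) for a in true_ability]
posttest = [a + random.gauss(0, 8) for a in true_ability]  # no treatment given

# Select the "extreme group": the top 5% of pretest scorers.
cutoff = sorted(pretest)[int(0.95 * len(pretest))]
extreme = [i for i, p in enumerate(pretest) if p >= cutoff]

pre_mean = sum(pretest[i] for i in extreme) / len(extreme)
post_mean = sum(posttest[i] for i in extreme) / len(extreme)

# The extreme group's mean falls back toward the population mean (70)
# on retest, purely because of noise: regression to the mean.
print(f"extreme-group pretest mean:  {pre_mean:.1f}")
print(f"extreme-group posttest mean: {post_mean:.1f}")
```

An evaluator who saw only the extreme group's decline might wrongly conclude the program hurt its highest performers, which is exactly why this counts as an internal validity threat.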
External Validity Threats
Selection-treatment interaction: Refers to the possibility that the program results may be applicable only to the population from which the treatment and no-treatment groups were chosen; hence, results may be internally valid but not generalizable.
Testing-treatment interaction: Refers to the fact that the program results may be generalizable to other groups only when a pretest is also given.
Situation effects / experimenter effects: Refers to the existence of multiple factors associated with the program itself; results may be due to a particularly charismatic instructor rather than the content of the program.
Multiple-treatment effects: Participants are involved in multiple programs at the time of the evaluation; hence the findings may not be generalizable to other settings because of the confounding of multiple treatments.
Population validity: Extent to which results observed in a study will generalize to the population from which a sample was selected. Homogeneous attrition: rates of attrition are about the same in each group.
Ecological validity: Extent to which results observed in a study will generalize across settings or environments.
Temporal validity: Extent to which results observed in a study will generalize across time and at different points in time.
Outcome validity: Extent to which results observed in a study will generalize across different but related dependent variables (DVs).
HOW DO WE MITIGATE AGAINST THREATS TO INTERNAL AND EXTERNAL VALIDITY?
An evaluator can try to mitigate these potential threats by selecting an evaluation design that reduces the influence of the particular threat through the manner in which the design is executed. There are many ways to characterize evaluation designs: Mertens and Wilson (2012) distinguish between quantitative and qualitative data, but we can also classify designs as experimental, quasi-experimental, and non-experimental. I will use this latter classification to highlight how the various designs attempt to address the validity threats we just discussed.
Experimental research designs use methods and procedures to make observations in which the researcher fully controls the conditions and experiences of participants by applying three required elements of control: randomization, manipulation, and comparison/control.
Randomization: involves randomly selecting participants into the study so that all individuals in the population have an equal chance of being included; it also involves randomly assigning participants to the experimental conditions.
Manipulation: involves the systematic application of an experimental treatment.
Comparison/control: involves controlling who does or does not get a particular treatment and ensuring that all other aspects of the experimental process are the same except for who does or does not get that treatment.
Experimental research designs are the only research designs capable of establishing cause-and-effect relationships. To demonstrate that one factor causes changes in a dependent variable, the conditions and experiences of participants must be under the full control of the researcher. This often means that an experiment is conducted in a laboratory and not in an environment where a behavior may occur naturally. Strength: capable of demonstrating cause and effect. Limitation: behavior that occurs under controlled conditions may not be the same as behavior that occurs in a natural environment.
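The three elements of control can be sketched in a few lines of code. This is a hedged illustration with fabricated outcomes, not a real evaluation: the participant pool, the treatment effect of 5 points, and the score distribution are all assumptions made for the example.

```python
import random

random.seed(0)

# Hypothetical pool of 200 participant IDs.
participants = list(range(200))

# Randomization: shuffle, then assign half to treatment and half to control.
random.shuffle(participants)
treatment_group = participants[:100]
control_group = participants[100:]

def outcome(treated):
    """Simulated posttest score: identical baseline process for everyone
    (comparison/control), plus an effect only if treated (manipulation)."""
    baseline = random.gauss(50, 10)
    return baseline + (5 if treated else 0)  # assumed 5-point treatment effect

treat_scores = [outcome(True) for _ in treatment_group]
control_scores = [outcome(False) for _ in control_group]

diff = sum(treat_scores) / len(treat_scores) - sum(control_scores) / len(control_scores)
# Because assignment was random, the observed group difference can be
# attributed to the treatment rather than to preexisting group differences.
print(f"estimated treatment effect: {diff:.1f}")
```

The design choice to illustrate here is that randomization, not the arithmetic, does the work: it equalizes the groups on every characteristic (measured or not) in expectation, which is what licenses the causal reading of `diff`.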
We can categorize experimental research designs into one of four possible types. Box 9.4 provides an alternative way to conceptualize designs in Mertens & Wilson (p. 316). You will note that (R) designates randomization for all of those designs, (O) indicates an observation, and (X) denotes a treatment. There are five different experimental designs which we can use to evaluate the impact of a program. Each one affords particular advantages that, if relevant to the validity concerns and purpose of the evaluation, enable you to more faithfully assess the program. Whether you are able to employ these designs for evaluation depends on whether or not you can randomize, manipulate, and control. To the extent that you can randomize (randomly select/randomly assign participants to a treatment and no-treatment group), manipulate (determine which group receives a treatment and which group does not), and control (control for extraneous factors that may influence or impact participants and that may not involve the treatment itself, e.g., the effects of lighting and temperature on performance), you are able to use one of the experimental designs described (see pp. 316-319). Besides practical concerns, you have to think about ethical concerns with regard to the potential risks and benefits of randomizing, manipulating, and controlling the treatment and the participants: how ethical is it to withhold a potential treatment for cancer from a terminally ill patient?
If you cannot randomize, manipulate, or control within your evaluation design, then the alternative is to employ a quasi-experimental design. To be an experimental design, a design must meet the following three elements of control: (1) randomization, (2) manipulation, (3) comparison/control group. Quasi-experiments are similar to an experiment, except that this design does one or both of the following: it includes a quasi-independent variable, i.e., a preexisting variable that is often a characteristic inherent to an individual and that differentiates the groups or conditions being compared in a research study (e.g., gender (man, woman) or health status (lean, overweight, obese)); or it lacks an appropriate or equivalent control group. Strength: allows researchers to study factors related to the unique characteristics of participants. Limitation: cannot demonstrate cause and effect.
Again, there are many ways to classify the various types of quasi-experimental designs; what is most important is to pay attention to the design that matches and addresses the purposes and validity threats that may influence and impact the evaluation. Mertens and Wilson describe the relevant issues with regard to quasi-experimental designs in Box 9.5 (pp. 320-325).
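A quasi-independent variable can be made concrete with a small sketch. The records, the "status" grouping, and the scores below are hypothetical; the point is that the evaluator never assigns group membership, so the comparison is descriptive only.

```python
# Hypothetical records: "status" is a preexisting characteristic
# (a quasi-independent variable), not something the evaluator assigned.
records = [
    {"status": "full-time", "score": 82},
    {"status": "full-time", "score": 75},
    {"status": "part-time", "score": 68},
    {"status": "part-time", "score": 74},
    {"status": "full-time", "score": 90},
    {"status": "part-time", "score": 70},
]

def group_means(rows, group_key, value_key):
    """Average a value for each level of a preexisting grouping variable."""
    groups = {}
    for row in rows:
        groups.setdefault(row[group_key], []).append(row[value_key])
    return {g: sum(vals) / len(vals) for g, vals in groups.items()}

means = group_means(records, "status", "score")
# Without randomization, these means describe the groups but cannot
# establish that status *causes* the difference in scores.
print(means)
```

Any preexisting difference between full-time and part-time students (age, work hours, prior education) could explain the gap, which is exactly the differential-selection threat described above.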
WHAT ABOUT OTHER DESIGNS THAT DO NOT CONFORM TO THE EXPERIMENT AND QUASI-EXPERIMENT CLASSIFICATION?
The last category of designs involves what I refer to as non-experimental designs, or what Mertens and Wilson (2013) classify as qualitative designs. These designs do not share any of the characteristics that are required for experimentation (e.g., randomization, manipulation, control). These designs use methods and procedures to make observations in which the behavior or event is observed "as is," without an intervention from the researcher. Strength: can be used to make observations in settings where the behaviors and events being observed naturally operate (e.g., interactions between an athlete and coach during a game). Limitation: lacks the control needed to demonstrate cause and effect.
Correlational Designs
• Measurement of two or more factors to determine or estimate
the extent to which the values
for the factors are related or change in an identifiable pattern
• Correlation coefficient: Statistic used to measure the strength
and direction of the linear
relationship, or correlation, between two factors
• The value of r can range from -1.0 to +1.0
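The correlation coefficient r can be computed directly from its definition. The paired data below (tutoring hours vs. test scores) are invented for illustration; only the formula is standard.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Covariance term over the product of the two deviation norms.
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    norm_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    norm_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (norm_x * norm_y)

# Hypothetical paired measurements: hours of tutoring vs. test score.
hours = [1, 2, 3, 4, 5, 6]
scores = [55, 60, 58, 70, 72, 80]

r = pearson_r(hours, scores)
print(f"r = {r:.2f}")  # always falls in [-1.0, +1.0]; here strongly positive
```

A strong positive r only says the two factors change together in an identifiable pattern; as the surrounding notes stress, a correlational design by itself cannot demonstrate cause and effect.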
Naturalistic Observation
The observation of behavior in the natural setting where it is expected to occur, with limited or no attempt to overtly manipulate the conditions of the environment where the observations are made (e.g., buying behavior in a grocery store, parenting behavior in a residential home).
Generally associated with high external validity but low internal validity.
Qualitative Designs
• Use of scientific method to make nonnumeric observations,
from which conclusions are drawn
without the use of statistical analysis
• Adopts the assumption of determinism; however, it does not
assume that behavior itself is
universal
• Determinism: Assumption in science that all actions in the
universe have a cause
• Based on the holistic view, or “complete picture,” that reality
changes and behavior is dynamic
Phenomenology (Individual)
• Analysis of the conscious experiences of phenomena from the
first-person point of view
• The researcher interviews a participant then constructs a
narrative to summarize the
experiences described in the interview
• Conscious experience is any experience that a person has lived
through or performed and can
bring to memory
• The researchers must be considerate of the intentionality or
meaning of a participant’s
conscious experiences
• Identify objects of awareness, which are those things that
bring an experience to consciousness
Ethnography (Group)
• Analysis of the behavior and identity of a group or culture as
it is described and characterized by
the members of that group or culture
• A culture is a “shared way of life” that includes patterns of
interaction, shared beliefs and
understandings, adaptations to the environments, and many
more factors
• To observe a group or culture, it is often necessary to get
close up to or participate in that group
or culture
• To gain entry into a group or culture without causing participants to react or change their behavior:
• Researchers can covertly enter a group
• Researchers can announce or request entry into a group
• Participant observation: Researchers participate in or join the
group or culture they are
observing
• Researchers need to remain neutral in how they interact with
members of the group
• Common pitfalls associated with participant observation
• The “eager speaker” bias
• The “good citizen” bias
• The “stereotype” bias
Case Study
Analysis of an individual, group, organization, or event used to
illustrate a phenomenon, explore new
hypotheses, or compare the observations of many cases
Case history: An in-depth description of the history and
background of the individual, group, or
organization observed. A case history can be the only
information provided in a case study for situations
in which the researcher does not include a manipulation,
treatment, or intervention
Illustrative: Investigates rare or unknown cases
Exploratory: Preliminary analysis that explores potentially
important hypotheses
Case studies have two common applications:
1. General inquiry
2. Theory development
The level of control in a research design is directly related to internal validity, or the extent to which the research design can demonstrate cause and effect. Experimental research designs have the greatest control and therefore the highest internal validity. Non-experimental research designs typically have the least control and therefore the lowest internal validity.
Internal validity – Extent to which a research design includes enough control of the conditions and experiences of participants that it can demonstrate a single, unambiguous explanation for a manipulation, that is, cause and effect.
External validity – Extent to which observations made in a study generalize beyond the specific manipulations or constraints in the study.
Constraint: Any aspect of the research design that can limit observations to the specific conditions or manipulations in a study.
See also Mertens & Wilson (2012), Box 9.8.
Chapter 9 Notes
The purpose of evaluation is to determine the merit or worth of
an evaluand. That is, we want to know
whether a program had the intended effect on its participants as
specified by the programs theory and
model. Our ability to faithfully and confidently determine the
effects of a program are in part
determined by the manner in which we design the evaluation.
There are many ways to think about
designs within the context of evaluation and designing
evaluation is a complex endeavor. Moreover, it
is important to note that different designs can be used for
different types of applications. Regardless of
how we conceptualize and frame the relationship between the
purposes and methods of an evaluation
process, there are two major questions that we have to be
explicitly addressed:
1. To what extent are the effects we observe in participants
really due to the program and not
some other reason?
2. To what extent can the results observed in participants be
expected to generalize (extend to)
other situations?
Both of these questions pertain more formally to the concept of
validity. And there are two specific
forms of validity that as evaluators we must be concerned with:
• Internal validity – Refers to the extent to which a research
design includes enough control of
the conditions and experiences of participants that it can
demonstrate a single unambiguous
explanation for a manipulation, that is cause and effect.
To what extent are the effects we observe in participants really
due to the program and not
some other reason?
When we have adequately attended to issues involving internal
validity within the evaluation
process it means that an evaluator has controlled the effects of
variables other than the
treatment, in order to say with confidence that the results are
reflective of the treatment.
Hence, we can confidently say that the observed effects are
cause by the program and nothing
else.
• External validity – Extent to which observations made in a
study generalize beyond the specific
manipulations or constraints in the study
To what extent can the results observed in participants be
expected to generalize (extend
to) other situations?
When we have adequately attended to issues involving external
validity it means that an
evaluator has ensured that the participants of the program are
representative of the population,
and therefore that if the treatment is applied with another group
of people from that
population under similar circumstances, it should be effective
there as well
WHAT FACTORS DIMINISH OR THREATEN VALIDITY OF
EVALUATIONS?
We can classify threats to the validity of our conclusions in
terms of internal and external threats
Threat Description
History Events occurring during a study (other than the program
treatment) that can influence
results
Maturation Naturally occurring physical or psychological
changes in program participants (e.g.,
growth, development, aging) that can influence results
Testing Administration of test before and after program might
influence scores on test
independent of program (e.g., familiarity with test results in
changes in scores)
Instrumentation Having pretest and posttest that differ in terms
of content, structure, format or
difficulty can lead to differences in scores but not due to
program treatment but
differences in the instruments used
Statistical regression Having extreme groups in program may
artificially decrease or increase scores
independent of the program treatment—if all members of a
group are already scoring
at the highest levels and their scores cant go any higher, any
observed decline in
scores may be due to the test not program treatment indicating a
measurement error
Differential
Selection
Differences between groups compared (treatment vs. no-
treatment groups) on
important characteristics may account for observed differences
but these are not due
to the program treatments
Experimental
Mortality
Differential dropout of participants in treatment and no-
treatment groups yields
differences in observed effects that are not a function of the
program treatment but
rather an artifact of attrition within the groups.
Treatment Diffusion Proximity among participants in treatment
and no-treatment groups leads to
treatment exposure for the no-treatment group
Compensatory
rivalry
When no-treatment group outperforms the treatment group, but
those differences
are not due to the treatment effects, but by competition –John
Henry effect
Compensatory
Equalization of
treatments
If one group receives something and the other receives nothing,
than any effects on
the first group may be due to the fact that this group received
something, and not to
the specifics of what it received.
Resentful
Demoralization
When members of the no-treatment group realize they did not
get something that the
treatment group received they may become demoralized because
they are being
excluded but not because they did not get the specific treatment
External Validity Threats
Threat Description
Selection Treatment
Interaction
Refers to the possibility that the program results may be
applicable to only to that
population from which the treatment and no-treatment groups
were chosen—hence
results may be internally valid but not generalizable
Testing Treatment
Interaction
Refers to the fact that the program results may be generalizable
to other groups only
when a pretest is also given.
Situation effects
Experimenter
effects
Refers to the existence of multiple factors associated w ith the
program itself---results
may be due to a particularly charismatic instructor rather than
the content of the
program
Multiple treatment
effects
Participants are involved in multiple programs at the same time
of the evaluation,
hence the findings may not be generalizable to other settings
because of the
confounding of multiple treatments
Population validity Extent to which results observed in a study
will generalize to the population from
which a ample was selected. Homogeneous attrition: Rates of
attrition are about the
same in
Ecological validity Extent to which results observed in a study
will generalize across settings or
environments
Temporal validity Extent to which results observed in a study
will generalize across time and at different
points in time
Outcome validity Extent to which results observed in a study
will generalize across different but related
DVs
HOW DO WE MITIGATE AGAINST THREATS TO
INTERNAL AND EXTERNAL VALIDITY?
An evaluator can try to mitigate against these potential threats
by selecting an evaluation design that
reduces the influence of the particular threat by the manner in
which the design is executed. There are
many ways by which to characterize evaluation designs—
Mertens and Wilsons 2012 distinguish
between quantitative vs. qualitative data; but we can also
classify designs in terms of being
experimental, quasi-experiemental and non-experimental. I will
use this latter one to highlight how the
various designs attempt to address the validity threats we just
discussed.
The experimental research designs use methods and procedures
to make observations in which the
researcher fully controls the conditions and experiences of
participants by applying three required
elements of control: randomization, manipulation, and
comparison/control
—involves randomly selecting participants
into the study so that all individuals in
a study; it also involves randomly assigning participants to the
experimental conditions.
n—involves the systematic application of an
experimental treatment.
—involves controlling who gets or does not get a
particular treatment and ensuring that
all other aspects of the experimental process are the same
except for who gets or does not get a
particular treatment.
Experimental research designs are the only research designs capable of establishing cause-and-effect relationships. To demonstrate that one factor causes changes in a dependent variable, the conditions and experiences of participants must be under the full control of the researcher. This often means that an experiment is conducted in a laboratory and not in an environment where a behavior may occur naturally. Strength: Capable of demonstrating cause and effect. Limitation: Behavior that occurs under controlled conditions may not be the same as behavior that occurs in a natural environment.
We can categorize experimental research designs into one of four possible types of designs. Box 9.4 in Mertens & Wilson (p. 316) provides an alternative way to conceptualize designs. You will note that (R) designates randomization for all of those designs, (O) indicates an observation, and (X) denotes a treatment. There are five different experimental designs which we can use to evaluate the impact of a program. Each one affords particular advantages that, if relevant to the validity concerns and purpose of the evaluation, enable you to more faithfully assess the program. Whether you are able to employ these designs for evaluation depends on whether or not you can randomize, manipulate, and control. To the extent that you can randomize (randomly select/randomly assign participants to a treatment and no-treatment group), manipulate (manipulate which group receives a treatment and which group does not), and control (control for extraneous factors that may influence or impact participants and that may not involve the treatment itself—e.g., control lighting and temperature on performance), then you are able to use one of the experimental designs described (see pp. 316-319). Besides practical concerns, you have to think about ethical concerns with regard to the potential risks and benefits of randomizing, manipulating, and controlling the treatment and the participants—how ethical is it to withhold a potential treatment for cancer from a terminally ill patient?
If you cannot randomize, manipulate, or control within your evaluation design, then the alternative is to employ a quasi-experimental design. To be an experimental design, a design must meet the following three elements of control: (1) randomization, (2) manipulation, and (3) a comparison/control group. Quasi-experiments are similar to an experiment, except that this design does one or both of the following: it includes a quasi-independent variable--a preexisting variable, often a characteristic inherent to an individual, that differentiates the groups or conditions being compared in a research study (e.g., gender (man, woman), health status (lean, overweight, obese))--or it lacks an appropriate or equivalent control group. Strength: Allows researchers to study factors related to the unique characteristics of participants. Limitation: Cannot demonstrate cause and effect.
Again, there are many ways to classify the various types of quasi-experimental designs; what is most important is to pay attention to the design that matches and addresses the purposes and validity threats that may influence and impact the evaluation. Mertens and Wilson describe the relevant issues with regard to quasi-experimental designs in Box 9.5 (pp. 320-325).
WHAT ABOUT OTHER DESIGNS THAT DO NOT CONFORM TO THE EXPERIMENT AND QUASI-EXPERIMENT CLASSIFICATION?
The last category of designs involves what I refer to as non-experimental designs, or what Mertens and Wilson (2012) classify as qualitative designs. These designs do not share any of the characteristics that are required for experimentation (e.g., randomization, manipulation, control). These designs use methods and procedures to make observations in which the behavior or event is observed “as is,” without an intervention from the researcher. Strength: Can be used to make observations in the settings in which the behaviors and events being observed naturally operate (e.g., interactions between an athlete and coach during a game). Limitation: Lacks the control needed to demonstrate cause and effect.
Correlational Designs
• Measurement of two or more factors to determine or estimate
the extent to which the values
for the factors are related or change in an identifiable pattern
• Correlation coefficient: Statistic used to measure the strength
and direction of the linear
relationship, or correlation, between two factors
• The value of r can range from -1.0 to +1.0
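The correlation coefficient described above can be computed directly from its definition (covariance of the two factors divided by the product of their standard deviations). A small sketch with hypothetical data, using only the standard library:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length lists.

    Returns a value between -1.0 (perfect negative linear relationship)
    and +1.0 (perfect positive linear relationship).
    """
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    # Numerator: sum of cross-products of deviations from the means.
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    # Denominator: product of the square roots of the sums of squares.
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# A perfect positive linear relationship yields r = +1.0.
print(round(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]), 3))  # 1.0
```

Reversing the direction of one factor flips the sign of r, which is why both the strength and the direction of the relationship are captured by the single statistic.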
Naturalistic Observation
The observation of behavior in the natural setting where it is
expected to occur, with limited or
no attempt to overtly manipulate the conditions of the
environment where the observations are
made (e.g., buying behavior in a grocery store, parenting behavior in a residential home). Generally associated with high external validity, but low internal validity.
Qualitative Designs
• Use of scientific method to make nonnumeric observations,
from which conclusions are drawn
without the use of statistical analysis
• Adopts the assumption of determinism; however, it does not
assume that behavior itself is
universal
• Determinism: Assumption in science that all actions in the
universe have a cause
• Based on the holistic view, or “complete picture,” that reality
changes and behavior is dynamic
Phenomenology (Individual)
• Analysis of the conscious experiences of phenomena from the
first-person point of view
• The researcher interviews a participant, then constructs a narrative to summarize the experiences described in the interview
• Conscious experience is any experience that a person has lived
through or performed and can
bring to memory
• The researchers must be considerate of the intentionality or
meaning of a participant’s
conscious experiences
• Identify objects of awareness, which are those things that
bring an experience to consciousness
Ethnography (Group)
• Analysis of the behavior and identity of a group or culture as
it is described and characterized by
the members of that group or culture
• A culture is a “shared way of life” that includes patterns of
interaction, shared beliefs and
understandings, adaptations to the environments, and many
more factors
• To observe a group or culture, it is often necessary to get
close up to or participate in that group
or culture
• Researchers must gain entry into a group or culture without causing participants to react or change their behavior
• Researchers can covertly enter a group
• Researchers can announce or request entry into a group
• Participant observation: Researchers participate in or join the
group or culture they are
observing
• Researchers need to remain neutral in how they interact with
members of the group
• Common pitfalls associated with participant observation
• The “eager speaker” bias
• The “good citizen” bias
• The “stereotype” bias
Case Study
Analysis of an individual, group, organization, or event used to
illustrate a phenomenon, explore new
hypotheses, or compare the observations of many cases
Case history: An in-depth description of the history and
background of the individual, group, or
organization observed. A case history can be the only
information provided in a case study for situations
in which the researcher does not include a manipulation,
treatment, or intervention
Illustrative: Investigates rare or unknown cases
Exploratory: Preliminary analysis that explores potentially
important hypotheses
Case studies have two common applications:
1. General inquiry
2. Theory development
The level of control in a research design is directly related to internal validity, or the extent to which the research design can demonstrate cause and effect. Experimental research designs have the greatest control and therefore the highest internal validity. Nonexperimental research designs typically have the least control and therefore the lowest internal validity.
Internal validity – Extent to which a research design includes enough control of the conditions and experiences of participants that it can demonstrate a single unambiguous explanation for a manipulation, that is, cause and effect
External validity – Extent to which observations made in a
study generalize beyond the specific
manipulations or constraints in the study
Constraint: Any aspect of the research design that can limit
observations to the specific conditions or
manipulations in a study
See also Mertens & Wilson (2012), Box 9.8.
Reproduced with permission of the copyright owner. Further
reproduction prohibited without permission.
A Power Primer
Jacob Cohen
Psychological Bulletin, July 1992, 112(1), pg. 155
Program Evaluation Proposal 1
An evaluation of the doctoral program in applied research
psychology
Psychology 3512 Program Evaluation
Anonymous
This proposal presents an evaluation plan for the doctoral
program in Psychology
currently offered at the University of Texas at El Paso (UTEP).
The Doctor of Philosophy
(Ph.D.) program at UTEP is designed to produce bilingual/bicultural research psychologists who will be able to serve Hispanic populations in Texas. The Ph.D. program at UTEP aims “to prepare research psychologists to address questions applicable to the English-Spanish bicultural communities in the Southwest” (The University of Texas at El Paso, 1992, p. 8). Though the
program is fairly new (September of 1993), its development can
be traced back to the late 1970s
when the faculty first began to initiate efforts to pursue a Ph.D.
program for the department (J. V.
Devine, personal communication, March 27, 1998). Since its
development the program has
enjoyed support from the University and the El Paso
community.
Program Overview
The program aims to provide the local community, the state and
the nation with trained
bilingual/bicultural research psychologists. In fact, there is no
other program available to train
bilingual/bicultural research psychologists in the nation (Jones,
Keppel & Meissen, 1993). The
Ph.D. program at UTEP aims to provide local residents from
both the El Paso and Juarez
community with the opportunity to pursue Ph.D-level training in
psychology. Though there are
other Ph.D. programs in the State of Texas, there is no Ph.D. program--other than that currently offered by the UTEP department of psychology--available to the local community. Thus, the Ph.D.
program will enable the local community to pursue Ph.D. level
training and provide access to
and retain trained bilingual/bicultural psychologists. At the
state level, the Ph.D. program aims
to increase the number of psychologists trained to work with Hispanic populations in Texas. This is particularly important to Texas because it is among the states with the highest percentages of
Hispanics in their population. The Ph.D. program at UTEP can
serve to provide trained
psychologists to deal with issues faced by Hispanics living in
Texas. Nationally, the Ph.D.
program will offer specialized training in working with
Hispanic populations. This is important
since the Hispanic population is rapidly growing and is
emerging as a significant minority
population in the United States (Marin & Marin, 1991). Thus,
the Ph.D. program will serve an
important and emergent need for trained bilingual/bicultural
research psychologists nationwide.
Program Description & Objectives
To meet its objectives the program emphasizes the application
of research methods and
findings in non-academic settings. This includes a core
curriculum in experimental psychology
along with a field placement and specialization courses in either
health or human behavior in
organizations. The program assumes that preparation for
applied psychologists requires a firm
foundation in general experimental psychology including
courses in statistics, cross-cultural
research, and field research methodology as applied to
bilingual/bicultural settings. In addition,
the program requires its students to undertake a field placement
designed to meet the goals of the
program and the needs of the individual student. The field
placement is the cornerstone of the
program and is designed “to provide the practical experience in
an organizational setting that will
enhance the student’s training as an applied research
psychologist”(The University of Texas at El
Paso, 1992, p.16). Together with the academic training, the
internship will provide students with
a well rounded education and enable them to function as
bilingual/bicultural psychologists.
To be eligible for the program an applicant must have (a) a
B.A. from an accredited
university, (b) a minimum 3.0 G.P.A. in undergraduate work,
(c) a minimum of 3.0 G.P.A. in
their psychology course work, (d) a minimum score of 500 on
each of the subtests of the GRE
test, (e) satisfactory background in statistics/experimental work,
and (f) three letters of
recommendation. In addition, international students must also (g) have satisfactory scores on the TOEFL exam. These criteria are similar to those used by most other graduate programs in psychology in the US and Canada (American Psychological Association, 1994; Norcross, Hanych, & Terranova, 1996; Purdy, Reinehr, & Swartz, 1989).
In sum, the purpose of the Ph.D. program at UTEP is “to prepare research psychologists to
address questions applicable to the English-Spanish bicultural
communities in the Southwest”
(The University of Texas at El Paso, 1992, p. 8). This program
is based on the notion that there
is an increasing need for trained bilingual/bicultural
psychologists to handle the needs of the
Hispanic community at the local, state and national levels (see Table 1). Furthermore, there is no
equivalent program available in El Paso, or in the State of
Texas. This program is of timely
importance since the Hispanic population is becoming a
significant minority group in the US
(Marin & Marin, 1991) and their growth is likely to have a
significant impact on the work force
as well as other areas. This program assumes that by providing
specialized training it will be
able to prepare bilingual/bicultural research psychologists.
Purpose of the Evaluation
This proposal presents an evaluation plan for assessing the
progress of the Psychology
Ph.D. program at UTEP. At the outset it must be recognized
that this evaluation plan comprises
a formative rather than a summative evaluation. This approach
was followed for several reasons.
First, the Ph.D. program at UTEP has been in place for
approximately five years. Given that on
average most Ph.D. candidates complete their training in 6 years,
it is unreasonable to expect that
the Ph.D. program will have met its objectives during its fifth
year of implementation. Secondly,
the Ph.D. program is the first of its kind for the department of
psychology and as a result will
undergo procedural changes commonly associated with the
development of new programs in any
organization. Finally, this evaluation seeks to provide
evaluation information that can be utilized
to improve the program rather than assess its absolute merit or
worth at this point. Accordingly,
this evaluation aims to provide information that can be utilized
by program administrators,
faculty, staff and students of the UTEP doctoral program in
psychology to improve the program
and address issues of concern.
Evaluation Questions
Specifically, this evaluation proposes to examine the following
evaluation questions: (1)
To what extent has the program been implemented as proposed?
(2) To what extent has the
program progressed toward meeting its stated objective of
providing trained “bilingual/bicultural
research psychologists”? (3) What are the perceptions of the
various stakeholders regarding the
Ph.D. program? and (4) What are the main concerns for program
development and improvement
among the faculty and students of the program? These
evaluation questions have been selected
to represent broad areas of interests that all stakeholders (e.g.,
local staff and faculty, students in
the program as well as higher level university and public
officials) share in common. In
addition, these questions can provide valuable information that
can be directly fed back to the
program as well as enable the participants of the program to
voice issues of concern and make a
worthwhile contribution toward the improvement of the
graduate education offered at UTEP.
Evaluation Design
This evaluation concerns both process and implementation
issues. That is, this evaluation
will assess whether the Ph.D. program has been implemented as
proposed and whether the
program has made progress in that process. Shortell &
Richardson (1978) have argued that
“recurrent institutional cycle designs provide a flexible way of
handling some major threats to
internal validity and is particularly conducive to settings and
circumstances in which new groups
of individuals are exposed to ongoing programs” (p. 63). The
Ph.D. program at UTEP certainly
would fit this description. The UTEP program has been in place
for approximately five years
and has had a new cohort of students entering the program each
year since it began. This design
allows for the integration of data from multiple sources (e.g.,
program participants at various
stages of the programs, faculty, and staff) while allowing for
diversity in information. This
design is made up of three designs including the one-shot case
study, a static group comparison
design and the single group pre-posttest design. This design
offers several advantages in that it
controls for the effects of history, testing, instrumentation,
selection and attrition. However, it
does not rule out the effects of maturation or regression.
Because this evaluation is not
concerned with generalizing outside of UTEP at this time,
external validity issues (e.g.,
generalizing to other participants, settings, times) are not of
major concern.
Population and Sampling Process
This evaluation will be conducted with the program
participants at the University of
Texas at El Paso. The proposed evaluation will include both
paper-and-pencil assessments of participants' satisfaction with the program, supervision,
coursework, research and other areas of
the program. In addition, focus group interviews will be
conducted with each student cohort
enrolled since the program was installed as well as with both
the faculty and staff of the program.
Focus group interviews will be carried out in one of the seminar
rooms of the department. These
rooms are generally medium in size and can accommodate up to
20 people. As recommended by
focus group researchers (Krueger, 1994), the sessions will be
tape-recorded to ensure the
accuracy of the data.
Participants. Participants will include students, faculty, and staff of the Ph.D. program at the University of Texas at El Paso. Participants will be asked to
complete both a written survey
and a focus group interview designed to assess their perceptions
and attitudes about the Ph. D.
program at UTEP. In addition, individual interviews will be
conducted with various university
officials including the Dean of the Liberal Arts College,
University Vice President of Academic
affairs and the UTEP President to assess their knowledge about
the programs’ success.
Individual interviews will also be conducted with other
University officials that are presently
involved with the program or were involved in its formulation,
or development.
Measures
Participant survey. Participant perceptions about the program
will be assessed through a
survey questionnaire (see Appendix A). This survey will assess
various aspects of the program
including participants' knowledge, development, and satisfaction
(e.g., satisfaction with the
program, supervision).
Participant knowledge. Participants’ knowledge about the
program will be assessed with
a scale specifically designed for this project. The scale is
comprised of four items that are
presented in Likert format. Participants rate the extent to which
they agree or disagree with each
item on a scale ranging from one to five. Scoring is
accomplished by summing across the four
items. With a five-point response scale, scores can range from
a low of 4 to a high of 20. Higher
scores on the scale indicate greater knowledge about the
program.
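The scoring rule just described is a straightforward sum across items. A minimal sketch (the responses below are hypothetical; the actual items appear in Appendix A):

```python
def score_scale(responses, n_items=4, scale_max=5):
    """Sum Likert responses (1-5) across items.

    With 4 items on a five-point scale, totals range from 4 (all
    "strongly disagree") to 20 (all "strongly agree"); higher scores
    indicate greater knowledge about the program.
    """
    assert len(responses) == n_items, "one response per item expected"
    assert all(1 <= r <= scale_max for r in responses), "out-of-range rating"
    return sum(responses)

print(score_scale([1, 1, 1, 1]))  # 4  (lowest possible score)
print(score_scale([5, 5, 5, 5]))  # 20 (highest possible score)
```

The same summing logic applies to the six-item development scale below, where totals range from 6 to 30.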
Development. Participant development will be measured using
a scale designed
specifically for this evaluation. The scale is comprised of 6
items that are presented in Likert
format. Participants rate the extent to which they agree or
disagree with each item on a scale
ranging from one to five. Scoring is accomplished by summing
across the 6 items. With a five-
point response scale, scores can range from a low of 6 to a high
of 30. Higher scores on the scale
indicate more positive views about participant development in
the program.
Satisfaction. Satisfaction with the program, supervision, and co-workers will be measured using a scale modeled after the Job Descriptive Index (JDI) developed by Smith, Kendall, and Hulin (1969). Each scale contains an 18-item index that
presents respondents with adjective
checklists. Respondents indicate "yes" if the adjective describes their work (or co-worker, supervision, pay), "no" if it does not describe their work (or co-worker, supervision, pay), and "?" if they cannot decide.
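As a rough illustration of how such a checklist could be scored: the weights below (3 for "yes", 0 for "no", 1 for "?") and the reverse-keying rule are assumptions made for this sketch, not the proposal's procedure; consult Smith, Kendall, & Hulin (1969) for the published scoring.

```python
# Assumed weights for this sketch only; the published JDI has its own.
WEIGHTS = {"yes": 3, "no": 0, "?": 1}

def score_jdi(responses, reverse_keyed=()):
    """Sum weighted checklist responses over an adjective index.

    reverse_keyed holds the indices of negatively worded adjectives,
    whose weights are flipped (3 - w) before summing in this sketch.
    """
    total = 0
    for i, r in enumerate(responses):
        w = WEIGHTS[r]
        if i in reverse_keyed:
            w = 3 - w  # flip the weight for negatively worded items
        total += w
    return total

print(score_jdi(["yes", "no", "?"]))                     # 4
print(score_jdi(["yes", "no", "?"], reverse_keyed={1}))  # 7
```

Higher totals would indicate more adjectives endorsed in the satisfied direction, once negative items are reverse-keyed.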
Focus groups. Focus group interviews will be administered to
individual stakeholder
groups (see Appendix B). The focus group interview is
composed of four parts. Part I is designed to determine the participant's attitudes and feelings toward the program. It contains three
questions dealing with the perception about the Ph.D. program
(e.g., What is your general
perception of the program? What do you think of it?). Part II
is designed to explore participant's
knowledge of the program. It contains several questions that
assess the participant's knowledge
and experience with the program (e.g., What do you perceive as
the purposes or guiding
philosophy of the program?). Parts III and IV are designed to understand the participant's concerns about the program. They contain questions that deal with
participant's feelings about the program,
its objectives, goals and overall purposes (e.g., What are some
concerns that you have about the
program?).
In addition, the present evaluation will also collect archival data to assess the program's progress. Archival data collected will include samples of students' program plans for both
current and graduated students. These data will be used to
examine the extent to which the
program has been implemented as proposed.
Procedures
Participants will be asked to attend a discussion session with
the program evaluator. The
participants will be scheduled to attend their session with their
respective cohort (e.g., 1st. year
students, faculty, staff). At the beginning of the session the
evaluator will give a brief
introduction and describe the purpose of the session in general
terms. Participants will be
informed of their rights as participants. They will be told that
their responses are anonymous and
only the evaluator will have access to their individual
responses. Each participant will then be
given a questionnaire to complete and place into a box located
near the entrance of the room.
When the group is finished, the evaluator will then begin the
focus group discussion with the
group.
Focus group interviews will be conducted in one of the seminar
rooms in the department
of psychology building by the evaluator and an assistant.
Again, participants will be informed of
their rights and they will be told that their responses are to be
kept anonymous. Participants will
be told not to state their name or any other identifying
information during the focus group
interviews. Approximately 60-90 minutes will be required to
complete the focus group
interviews. However, should more time be needed, it will be
determined at the time of the
interview with the consent of all of the participants. Upon
completion of the interview,
participants will be given the opportunity to ask questions
regarding the purpose of the
evaluation. All responses and comments about the interview
will be recorded. The tape recorder
will be turned off by the evaluator when the last participant has
left the room.
Analysis
Appendix C outlines the evaluation questions along with the
information required for
each question, the information source, and the method of data collection. As can be seen, the
evaluation questions will be answered using a combination of
quantitative and qualitative
information. For the quantitative data, reliability indices will be
calculated to determine the
internal consistency of the items for all the scales included in
the questionnaire. Scores on each
of the scales will be used to compare across participant groups
as well as to the narrative
responses from the focus group interviews.
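One common internal-consistency index that could serve here is Cronbach's alpha; the choice of alpha and the sample responses below are illustrative assumptions, since the proposal does not name a specific index.

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha from per-participant lists of item scores.

    item_scores[p][i] is participant p's response to item i.
    alpha = k/(k-1) * (1 - sum(item variances) / variance of totals).
    """
    n_items = len(item_scores[0])

    def variance(values):
        # Sample variance (n - 1 in the denominator).
        m = sum(values) / len(values)
        return sum((v - m) ** 2 for v in values) / (len(values) - 1)

    item_vars = [variance([row[i] for row in item_scores]) for i in range(n_items)]
    total_var = variance([sum(row) for row in item_scores])
    return (n_items / (n_items - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical responses to the four knowledge items for five participants.
data = [[4, 4, 5, 4], [2, 3, 2, 2], [5, 5, 4, 5], [3, 3, 3, 3], [1, 2, 1, 2]]
print(round(cronbach_alpha(data), 2))  # 0.97
```

Values near 1.0, as in this fabricated example, would indicate that the items on a scale vary together and can reasonably be summed into a single score.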
Qualitative survey comments and focus group interview
responses will be content
analyzed and major themes will be identified for the entire
group (e.g., including all participants)
as well as for each group separately (e.g., student responses,
faculty responses etc.). The data
will be coded and analyzed by two independent judges to cross
check the reliability of the coding
procedures. Major themes and concerns will be compiled for
comparisons between the
stakeholder groups.
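One way the two judges' coding could be cross-checked is Cohen's kappa, which corrects raw agreement for chance. The statistic and the theme labels below are illustrative assumptions, not part of the proposal's stated analysis plan.

```python
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Cohen's kappa: chance-corrected agreement between two coders."""
    n = len(codes_a)
    # Observed agreement: proportion of units coded identically.
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    # Expected chance agreement from each coder's label frequencies.
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical theme codes assigned by two judges to eight comments.
judge1 = ["funding", "advising", "funding", "courses",
          "advising", "funding", "courses", "advising"]
judge2 = ["funding", "advising", "funding", "courses",
          "funding", "funding", "courses", "advising"]
print(round(cohens_kappa(judge1, judge2), 2))  # 0.81
```

A kappa of 1.0 means perfect agreement; values well above 0 indicate the coding scheme is being applied consistently rather than by chance.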
Personnel, Equipment & Logistics
The evaluation design described in the design section will be
conducted with the
program participants at the University of Texas at El Paso. The
university is located near the
Mexico-US border approximately 3-5 miles from Ciudad Juarez,
Mexico. It is a midsize
university with a predominantly Hispanic student population
from the surrounding communities.
Both the participant survey and focus group interviews will be
carried out in one of the
seminar rooms of the department (e.g., Psych. 308). These
rooms are generally medium in size
and can accommodate up to 20 people. As recommended by focus
group researchers (Krueger,
1994), the sessions will be tape-recorded to ensure the accuracy
of the data. Participants will be
asked to attend a discussion session with the program evaluator.
The participants will be
scheduled to attend their session with their respective cohort
(e.g., 1st. year students, faculty,
staff).
Evaluation time line. Approximately seven months will be
needed to conduct the full
evaluation. This time will be used to develop measures,
conduct observations, administer the
survey, analyze data, review with clients and to prepare and
present a final report. A significant
portion of time will dedicated to the gathering of focus group
interviews (e.g., 2 months) as well
as to the analysis of this data. All other aspects of the
evaluation will last no longer than one
month. Although some personnel in the department of
psychology may be qualified to assist in
the evaluation, because of the nature of the data being collected
and confidentiality issues the
expertise of individuals in the department will not be utilized.
Implications & Recommendations
The results from both the questionnaire survey and the focus
group interview will provide
data on stakeholders' assessment of the Ph.D. program at UTEP. The results will help determine
whether the program has made progress toward achieving its
goals. Furthermore, the findings
will also help to identify program strengths and weaknesses.
Specifically, the findings of the
proposed evaluation will provide information on (1) the extent
to which the program has been
implemented as proposed; (2) the extent to which the program
has made progress toward
meeting its stated objectives; (3) the perceptions of stakeholders regarding the Ph.D. program; and (4) areas of concern for the purposes of program
development and improvement. Furthermore, this evaluation
plan can provide information that
can guide the development of similar programs in other schools.
This is particularly important
since the Hispanic community is growing rapidly and the need for such
programs may become acute.
Human Subjects
The proposed evaluation plan aims to assess the Ph. D. program
at UTEP. Participants in
this project will be required to complete an anonymous survey
assessing their perceptions about
the Ph.D. program as well as a focus group interview with other
participants. Participants will
include students, faculty and staff as well as other selected
university and public officials
associated with the program. Local stakeholders (e.g., students,
faculty, staff and other UTEP
personnel) will be asked to participate in a discussion with the
program evaluator. A flyer with a
brief description of the project will be placed in individual
mailboxes as well as throughout
various information boards in the department. The flyer will
include a brief description of the
project, as well as the date, time, and location of the study.
Questionnaires will be administered to participants on the same
day as the scheduled
focus group discussions. Participants will be informed of their
rights as participants. They will be told that responses are anonymous and only the evaluator will
see the completed questionnaires.
Instructions will be read aloud and participants will be given a
pencil, an informed consent form, and the survey questionnaire. Participants will be told not to write
their name or any other
identifying information on the survey itself. Once completed,
the participants will place their
individual surveys into a box located near the entrance of the
room. After a brief break, focus
group discussion will begin.
Focus group interviews will be conducted in one of the seminar
rooms in the department
of psychology building by the evaluator and an assistant.
Again, participants will be informed of
their rights. Participants will be told not to state their name or
any other identifying information
during the focus group interviews. Approximately 45-60
minutes will be required to complete
the focus group interviews. Upon completion of the interview,
participants will be given the
opportunity to ask questions regarding the purpose of the
evaluation. All responses and
comments about the interview will be recorded. The tape
recorder will be turned off by the
evaluator when the last participant has left the room.
References
American Psychological Association. (1994). Graduate study in psychology. Washington, DC: Author.
Jones, J. M., Keppel, G., & Meissen, G. (1993, February). Site visit report: Department of Psychology, University of Texas at El Paso. Berkeley, CA.
Krueger, R. A. (1994). Focus groups (2nd ed.). Thousand Oaks, CA: Sage.
Marin, G., & Marin, B. V. (1991). Research with Hispanic populations. Newbury Park, CA: Sage.
Norcross, J. C., Hanych, J. M., & Terranova, R. D. (1996). Graduate study in psychology: 1992-1993. American Psychologist, 51, 631-643.
Purdy, J. E., Reinehr, R. C., & Swartz, J. D. (1989). Graduate admissions criteria of leading psychology departments. American Psychologist, 44, 960-961.
Ronco, S., Passmore, B., Morales, T., & Schwartz, O. (1997). The University of Texas at El Paso fact book 1996-1997. El Paso, TX: The University of Texas, Office of Institutional Studies.
Shortell, S. M., & Richardson, W. C. (1978). Health program evaluation. Saint Louis, MO: The C. V. Mosby Company.
Smith, P. C., Kendall, L. M., & Hulin, C. L. (1969). The measurement of satisfaction in work and retirement. Chicago, IL: Rand McNally.
The University of Texas at El Paso. (1992). Doctor of philosophy in psychology. El Paso, TX: University of Texas.
Table 1. Process Model of Evaluation.

Preexisting Conditions
- No Ph.D. programs exist to meet the needs of the Hispanic population.
- Ph.D. programs in Texas are beyond their estimated capacity to train additional psychologists for the state.
- There is no Ph.D. program available to the El Paso-Juarez community.

Program Components
Inputs/Objectives:
- Prepare research psychologists to serve the English-Spanish bicultural communities in the Southwest.
Resources:
- Graduate faculty
- Adequate facilities
- Administrative support
- Library resources
- Computing resources
- Extramural funding
Activities:
- External evaluations have been conducted.
- Recruitment of students has continued.

Intervening Events
Internal:
- Faculty turnover
- Budgetary changes
- Student dropout rate
- Availability of assistantships
- Student progress
External:
- Extramural funding
- Poor applicant pool
- Low visibility

Impact/Consequences
Proximal:
- Increased pool of qualified applicants
- Increased enrollment in the Ph.D. program
- Increases in applicants from the El Paso/Juarez area
- Increased fluency in the Spanish language among students
- Increases in the number of students passing the language proficiency exam
- Increases in the passing rate of students taking the first-year exam
- Increases in the rate of students reaching ABD
- Increases in extramural funding awards to faculty
Distal:
- Increase in the number of bilingual/bicultural applied research psychologists
- Greater cooperation/communication between UTEP and the El Paso-Juarez community
- Increase in the KSAs of psychologists
APPENDIX A
Participant Questionnaire
Survey Questionnaire
We are interested in learning how to improve the Ph.D. program
at UTEP. Please give your honest opinions to the
questions asked here. Information collected on this survey will
assist in the evaluation of the Ph.D. program and will be
used to make improvements. All information is confidential and
only the program evaluator will have access to the
information you provide--you as an individual will be
anonymous.
Directions: Read each statement carefully and check the box
that best represents your opinion next to each statement.
Participant’s Knowledge
(Response options: Strongly Disagree / Disagree / Neither Agree Nor Disagree / Agree / Strongly Agree)
There are clear guidelines on the program requirements.
I am informed of program changes affecting my educational goals.
Information shared with students is timely and organized.
Adequate information/resources are available to students.
Participant Development
(Response options: Strongly Disagree / Disagree / Neither Agree Nor Disagree / Agree / Strongly Agree)
Student research is taken seriously in the department.
I would like to influence decisions impacting students.
I find the coursework in the program worthwhile.
I find the coursework in the program useful.
There is good cooperation among faculty and students.
Recently, I have considered leaving the Ph.D. program.
Directions: Read each statement carefully and check the box
that best represents your opinion next to each statement.
PROGRAM IN GENERAL
Think of the Ph.D. program at UTEP. How well do the words or phrases below describe the program in general? (Yes / No / Don’t Know)
Pleasant; Bad; Ideal; Waste of time; Good; Undesirable; Worthwhile; Worse than most; Acceptable; Superior; Better than most; Disagreeable; Makes me content; Inadequate; Excellent; Rotten; Enjoyable; Poor

CURRENT POSITION
Think of the work you do at present in the Ph.D. program. How well do the words or phrases below describe what your position is like most of the time? (Yes / No / Don’t Know)
Fascinating; Routine; Satisfying; Boring; Good; Creative; Respected; Uncomfortable; Pleasant; Useful; Tiring; Healthful; Challenging; Too much to do; Frustrating; Simple; Repetitive; Gives sense of accomplishment
Directions: Read each statement carefully and check the box
that best represents your opinion next to each statement.
SUPERVISION
Think of the Ph.D. program. How well do the words or phrases below describe the supervision you receive in the program? (Yes / No / Don’t Know)
Asks my advice; Hard to please; Impolite; Praises good work; Tactful; Influential; Up-to-date; Doesn’t supervise enough; Has favorites; Tells me where I stand; Annoying; Stubborn; Knows job well; Bad; Intelligent; Poor planner; Around when needed; Lazy

COWORKERS-PEERS
Think of the Ph.D. program. How well do the words or phrases below describe your coworkers or peers? (Yes / No / Don’t Know)
Stimulating; Boring; Slow; Helpful; Stupid; Responsible; Fast; Intelligent; Easy to make enemies; Talk too much; Smart; Lazy; Unpleasant; Gossipy; Active; Narrow interests; Loyal; Stubborn
Please use this space to give us your reactions and views of the Ph.D. program at UTEP.

Why or why not?
________________________________________________________________________________
________________________________________________________________________________
________________________________________________________________________________

Why or why not?
________________________________________________________________________________
________________________________________________________________________________
________________________________________________________________________________
3. What did you like most about the training you have received thus far?
________________________________________________________________________________
________________________________________________________________________________
________________________________________________________________________________

4. What do you like least about the training you have received thus far?
________________________________________________________________________________
________________________________________________________________________________
________________________________________________________________________________
5. Are there any areas covered in the program that you feel should have been covered in more detail?
________________________________________________________________________________
________________________________________________________________________________
________________________________________________________________________________
6. Would you like additional training in any area?
If YES, in what area?
________________________________________________________________________________
________________________________________________________________________________
________________________________________________________________________________
7. Of the courses you have taken so far, indicate those which have been the most useful:
________________________________________________________________________________
________________________________________________________________________________
________________________________________________________________________________

8. Of the courses you have taken so far, indicate those which have been the least useful:
________________________________________________________________________________
________________________________________________________________________________
________________________________________________________________________________
THIS INFORMATION IS CONFIDENTIAL AND WILL BE USED ONLY TO ANALYZE THE RESULTS. IT WILL HELP US MAKE IMPROVEMENTS TO THE TRAINING.
DEMOGRAPHICS
Student Status:
Number of years in the program:
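Once completed questionnaires are returned, the Likert items above can be tabulated item by item. A minimal sketch in Python (the item keys and the 1-5 coding from Strongly Disagree to Strongly Agree are illustrative assumptions, not part of the proposal):

```python
# Tally the mean agreement score per survey item, assuming responses
# are coded 1 (Strongly Disagree) through 5 (Strongly Agree).
from statistics import mean

# Hypothetical responses: one dict per returned questionnaire,
# with shortened item names standing in for the survey statements.
responses = [
    {"clear_guidelines": 4, "informed_of_changes": 3, "info_timely": 2},
    {"clear_guidelines": 5, "informed_of_changes": 4, "info_timely": 3},
    {"clear_guidelines": 3, "informed_of_changes": 2, "info_timely": 2},
]

def item_means(responses):
    """Return {item: mean score} across all questionnaires."""
    items = responses[0].keys()
    return {item: mean(r[item] for r in responses) for item in items}

# For the sample data, clear_guidelines averages (4 + 5 + 3) / 3 = 4.0.
print(item_means(responses))
```

Item means like these would feed directly into evaluation question 3 (stakeholder perceptions), with low-scoring items flagged for follow-up in the focus groups.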
APPENDIX B
Focus group interviews
All of you have been asked to participate in this study because
you can help me understand how
people like yourselves [e.g., students, faculty, staff, or other]
feel about the Ph.D. program
offered by the department of psychology. We are interested in
finding out what you feel about
the Ph.D. program--be that positive or negative. There are no
wrong or right answers; we just
want to know what you think and feel about the program.
Remember your answers are
completely private. No one except the evaluator (myself) will have access to what you say today.
To begin, we will complete a brief survey that explores how you feel about the program. We will start our discussion as soon as that is completed. Take a few moments to answer the survey, then place it face down on your table so that I know when it is all right to begin. [WAIT FOR PARTICIPANTS TO COMPLETE THE SURVEY].
Now I am going to turn on the tape recorder and check to make sure it is functioning properly. I would like each of us to count to five to test the recorder; I will begin, and the person to my right will continue until we have gone full circle. I will then check that the recorder is functioning properly and begin the discussion. To keep your answers private, we will not use your name or any other identifying label. [REWIND TAPE AND PLAY BACK TO ENSURE THAT THE RECORDINGS ARE AUDIBLE].
Part I. Goal: Determine attitudes and feelings toward the Ph.D.
program.
1. What is your general perception of the program?
--What do you think of it? Do you think well of it?
2. What do you like about it? What do you think the program
does well?
--What makes you say that?
3. What do you think is bad about the program? What don’t
you like about the program?
--What makes you say that? Please explain.
Part II. Goal: Explore participants' knowledge of the program.
4. What do you perceive as the purposes (goals, objectives) or
guiding philosophy of the
program?
--How did you arrive at this answer?
--What kinds of things happened that told you this?
5. What do you think about this philosophy? Do you agree with these purposes or this philosophy? Do you think the problems the program addresses are severe? Important?
6. I am going to read to you the stated objectives of the
program as written in the “gray book”--
the program’s official plan that delineates the purpose and
objectives.
Read [The Doctor of Philosophy (Ph.D.) program at UTEP is
designed to
produce bilingual/bicultural research psychologists that will be
able to serve
Hispanic populations in Texas. The Ph.D. program at UTEP
aims “to prepare
research psychologists to address questions applicable to the
English-Spanish
bicultural communities in the Southwest” (The University of
Texas at El Paso,
1992, p. 8)].
7. What do you think about this philosophy? Do you agree with these purposes or this philosophy? Do you think the problems the program addresses are severe? Important? Is it different from what you expected? Why or why not?
8. What do you think the theory or model for the program is? Why/how do you think it works? How is it supposed to work? Why would the program actions lead to success on the program's objectives or criteria? Which program components are most critical to success?
9. Typically, what types of activities (academic activities) do students in the department engage in?
10. Are these activities different from those of students in other schools? Explore.
11. What does a student have to do to finish the program?
Part III. Goal: Understand participant concerns about the Ph.D.
program
13. What concerns do you have about the program? About its
outcomes? Its operations? Other
issues?
--What makes you say that?
14. If there are any changes that you would like to see occur, what would they be?
How would these help you or others in the program?
Do you think others share your concerns?
Do you think these would be easily implemented?
15. Why do you think these changes have not occurred?
Part IV. Goal: Learn about issues related to specific
participant groups
Faculty
A. Is the program being implemented as originally
conceptualized?
B. Are there any problems with the implementation process?
C. What problems do faculty perceive?
D. How satisfied are they with the way the program is implemented?
E. How satisfied are they with the quality of students being
enrolled?
F. How satisfied are they with the graduating students?
G. How well does the program prepare its students for the job
market?
H. How comparable is the curriculum to similar graduate
programs (nationally, or regionally)?
Students
A. Is the program meeting their expectations? Why or why not?
B. What are some problem areas in the program:
1. Classes: Have classes been offered regularly? Which have and which have not? Are there any additional courses that you would like to see offered?
2. Internships? How many students have gone on internships?
How satisfied were they with
their experiences? How relevant were the internships to their
career/professional goals?
3. Funding? How are funding decisions made?
4. Teaching opportunities? How many have taught classes?
How is the assignment of
courses made? What preparation do students receive to teach a
course?
5. Other?
Staff
A. Has there been added work as a result of the Ph.D. program
beyond what is possible with the
resources available to the department?
General
A. Has the program graduated the first class of students? How prepared are they for the market? How well does the program prepare its students for the job market?
B. How comparable is the curriculum to those of other graduate
programs (nationally, or
regionally)?
C. Has the program met accreditation standards (p. 29)?
16. Are there any comments or suggestions that you want to add?
APPENDIX C
Evaluation Plan Outline
1. To what extent has the program been implemented as proposed?
   Information required: Qualitative
   Sources of information: Participant interviews; archival data
   Methods: Focus group interviews; departmental records

2. To what extent has the program progressed toward meeting its stated objectives?
   Information required: Qualitative
   Sources of information: Participant interviews; archival data
   Methods: Focus group interviews; departmental records

3. What are the perceptions of stakeholders regarding the program?
   Information required: Quantitative/Qualitative
   Sources of information: Participant survey; participant interviews
   Methods: Participant survey; focus group interviews

4. What are the main concerns for program development and improvement?
   Information required: Quantitative/Qualitative
   Sources of information: Participant survey
   Methods: Focus group interviews
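The question-to-method matrix in the evaluation plan can also be kept as a small data structure so that coverage is easy to audit. A sketch (the dictionary layout is an assumption for illustration; the content mirrors the Appendix C outline):

```python
# Evaluation plan: each question maps to the data type, sources, and
# methods that will address it, mirroring the Appendix C outline.
evaluation_plan = {
    "Q1: Implemented as proposed?": {
        "data": "qualitative",
        "sources": ["participant interviews", "archival data"],
        "methods": ["focus group interviews", "departmental records"],
    },
    "Q2: Progress toward stated objectives?": {
        "data": "qualitative",
        "sources": ["participant interviews", "archival data"],
        "methods": ["focus group interviews", "departmental records"],
    },
    "Q3: Stakeholder perceptions?": {
        "data": "quantitative/qualitative",
        "sources": ["participant survey", "participant interviews"],
        "methods": ["participant survey", "focus group interviews"],
    },
    "Q4: Concerns for development and improvement?": {
        "data": "quantitative/qualitative",
        "sources": ["participant survey"],
        "methods": ["focus group interviews"],
    },
}

# Audit: every evaluation question must have at least one data source
# and at least one method before data collection begins.
for question, plan in evaluation_plan.items():
    assert plan["sources"] and plan["methods"], question
print("All evaluation questions covered.")
```

Keeping the plan in one structure makes it trivial to verify, as instruments are revised, that no evaluation question is left without a data source.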
Proposal to Establish the Doctor of Education in Higher Education and Restructure the Doctor of Education in Educational Leadership
1. Terminate the Doctor of Education in Educational Leadership
Concentrations in K-12 and in Higher Education.
2. Restructure the Doctor of Education in Educational
Leadership to a 45-Credit Doctor of Education in Educational
Leadership
3. Establish a 69-Credit Doctor of Education in Higher
Education
Executive Summary
Terminate the K-12 and Higher Education Concentrations in the
Doctor of Education in Educational Leadership and Restructure
the Curriculum as a 45-Credit Doctor of Education in
Educational Leadership
The College of Education proposes to terminate the K-12 and Higher Education concentrations in the Doctor of Education in Educational Leadership (EdDEL) and to restructure the curriculum of the EdDEL into a 45-credit cohort program. Current students will have five years to complete the current degree. In some cases, current students may opt into the new curriculum.
The program will be taught in an executive format of a fixed set
of courses that combines the theoretical and methodological
foundations of academic research with an applied focus that
helps students develop the professional and interpersonal
wisdom necessary to successfully manage change within
complex organizational structures. The EdDEL executive format
degree program will prepare its graduates to be not just
effective administrators but skillful and visionary leaders.
The EdDEL degree program is a cohort program and will consist
of a fixed set of courses offered in a specific sequence. All
students in each cohort will take the same courses in the same
sequence. Possession of a master’s degree or at least 30
graduate credits in a related field will be required for admission
to the program. The EdDEL degree program will enroll the first
cohort in summer of 2017.
Curriculum
EPSY 8627: Introduction to Research Design and Methods
EDAD 8461: Ethical Leadership
EDUC 5325: Introduction to Statistics and Research
EDAD 8635: Education Policy Analysis
EDAD 5262: Introduction to Qualitative Research
EDAD 8653: Civic Leadership
EDAD 8636: Research for Change and Program Evaluation
EDAD 8755: Organizational Dynamics
EDAD 8093: Administrative Research Seminar
EDAD 8553: Democratic, Equitable, and Ethical
Leadership
EDUC 9998: Dissertation Proposal Design
EDUC 5010: Special Topics in Education: Trends in
Special Education
AOD 5534: Group Facilitation and Consultation
EDUC 9999: Doctor of Education Dissertation
Establish a Doctor of Education in Higher Education (EdD-HE)
The College of Education proposes to establish a 69-credit Doctor of Education in Higher Education (EdD-HE) degree. The EdD-HE degree will require a rigorous
program of study that helps students develop the skills needed
to diagnose and resolve organizational challenges and to craft
and evaluate programs and policies impacting student success.
We believe that a stand-alone higher education-focused degree
will better serve our growing population of higher education-
focused students and make our already high-quality EdD
program more competitive in the regional market for doctoral
programs. We also believe that the focus and coherence of the
new program will enable us to broaden our recruiting to include
professionals working within the entire educational enterprise
that is now the full-service university, as well as those outside
of postsecondary institutions—including researchers,
administrators, policymakers, and educational support
providers.
The program features a core set of courses that reflect the
essential values of the Temple University graduate program in
higher education and the foundational knowledge, skills, and
abilities required for effective postsecondary administrative
practice. Possession of a master’s degree in a related field will
be required for admission to the program and students will be
expected to transfer in up to 30 credits as advanced standing
(with approval). With approval, students may also transfer up to
nine credits earned at the Temple College of Education prior to
matriculation into the EdD-HE. Most students will thus need to
complete 11 courses plus at least six credits in the dissertation
block (including at least two credits of EDAD 9999) in
residence in the doctoral program at Temple.
Curriculum
Core Courses (Four 3-Credit Courses):
HIED 8101: Advanced Seminar on Higher Education
Administration
HIED 8102: Higher Education Economics & Finance
HIED 8103: Equity in Higher Education Policy & Practice
HIED 8104: Seminar on Theory in Higher Education
Electives (Two 3-Credit Courses): Course suggestions will be provided; students select a two-course cognate based on dissertation interests.
Advanced Research Methods (Four 3-Credit Courses)
EPSY 8627: Introduction to Research Design & Methods
EDUC 5325: Introduction to Statistics and Research
EDUC 5262: Introduction to Qualitative Research
Plus one of the following:
EPSY 8625: Intermediate Educational Statistics
EPSY 5529: Test and Measurements
EPSY 8826: Multivariate Research Methods
HIED 8XXX: Advanced Practice-Based Qualitative Research in
Higher Education (new)
Comp Exam & Dissertation Block (9 Credits Minimum)
HIED 8XXX: Advanced Higher Education Research Seminar
(Lit. Review & Comp. Exam)
EDUC 9998: Dissertation Proposal (3 credits)
EDAD 9999: Dissertation (3-6 credits)
The EdD in Higher Education will be offered in Fall 2017.
Proposal to Restructure the Doctor of Education in Educational
Leadership Degree
Abstract. The following is a proposal to restructure the Doctor of Education in Educational Leadership (EdDEL) into a post-master's, 45-credit EdDEL and to terminate the K-12 and Higher Education concentrations within the EdDEL degree.
Market. The restructured EdDEL degree is intended for a wide
audience of individuals with experience in K-12 educational
settings who desire to advance their careers. Many educational
professionals want a defined, predictable program of study that
supports steady progress and complements the busy schedule of
a teacher and/or school leader. The proposed program will
institute a cohort-based executive format program, wherein
courses will be offered on weekends and during the summer,
accommodating the schedules of working professionals. We
believe that a stand-alone K-12 education focused EdDEL will
better serve our population of K-12 focused students and make
our already high-quality EdDEL program more competitive in
the regional market for doctoral programs. We also believe that
the focus, coherence and executive format of the new program
will enable us to broaden our recruiting efforts.
Program objectives. The EdD-EL degree will require a rigorous
program of study that helps students develop the skills needed
to diagnose and resolve organizational challenges and to craft
and evaluate programs and policies impacting student success.
The proposed curriculum combines the theoretical and
methodological foundations of academic research with an
applied focus that helps students develop the professional and
interpersonal wisdom necessary to successfully manage change
within complex organizational structures. The Temple EdDEL
degree will prepare its graduates to be not just effective
administrators but skillful and visionary leaders.
I. New Program Rationale, Context, and Demand
Rationale:
· The proposed program is consistent with those offered by the
leading schools of education and would enhance our capacity to
fulfill the university’s mission of social justice in education.
· This proposal responds to the need to support the learning of
working practitioners seeking to advance knowledge of systems
leadership by structuring learning in an executive format.
· The proposed program is designed to reflect current and emerging trends in school district and school system design, management, and leadership.
Description:
We are proposing to restructure the current EdDEL degree with two concentrations into a post-master's, 45-credit EdDEL degree program that will be cohort-based and taught in an executive format.
The program will continue to foster and reinforce Temple’s
commitment to social justice, equity, and ethical practices. The
courses, course sequence, and dissertation process are designed
in a way that the program coheres around these issues and their
implications for educational leadership. We believe that all
students will benefit from engaging with these issues, and that
it will not only contribute to students’ preparation as system
leaders, but also as citizens in their communities and in the
broader world.
The proposed cohort program will consist of a fixed course
sequence and is structured to allow students to complete their
degree in three years. Admission criteria for the program will
include evidence of scholarship and leadership activities.
Possession of a master’s degree or at least 30 graduate credits in
a related field will be required for admission to the program.
Many educational professionals want a defined, predictable
program of study that supports steady progress and
complements the busy schedule of a teacher and/or school leader.
The proposed program will institute a cohort-based executive
program, wherein courses will be offered on weekends and
during the summer, accommodating the schedules of working
professionals. Students will be admitted in cohorts of 18 – 20
students every other year. They will take all of their courses
together, and beginning with the first semester, students will
receive support in preparing for, conducting, and writing their
dissertation studies.
The courses in the proposed program are organized thematically
and incorporate elements that are uniquely supported by the
coherence of this new design. Each of the themes has a faculty
sponsor who will oversee course content to support articulation
within the program curriculum. We expect that developing
coherence among all the executive program components will
lead to increased efficiency and efficacy, and an intensive, yet
manageable experience for students.
II. Relationship to other programs in the college
The proposed restructured EdDEL degree will replace the
terminated K-12 concentrations within the current EdDEL
degree. The College of Education is also proposing to establish
a Doctor of Education in Higher Education (EdDHE) that will
provide a more focused, robust doctoral program for Higher
Education than is currently being offered. The restructured
EdDEL and the proposed EdDHE degrees allow the College of
Education the opportunity to clarify mission and market, create
more structured and coherent programs of study, and better
position the College as the regional leader in education graduate
training programs.
III. Curriculum
Program design. Completion of the cohort EdDEL degree will
require 45-credits beyond the master’s degree. Possession of a
master’s degree or at least 30 graduate credits in a related field
will be required for admission to the program. With program
approval students may also transfer in up to nine post master’s
credits earned at the Temple College of Education while not
matriculated in a graduate program. Most students will thus
need to complete 13 courses plus at least six credits in the
dissertation block (including at least two credits of EDAD
9999) in the doctoral program at Temple.
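The credit arithmetic in the paragraph above can be checked directly. A quick sanity check, assuming the standard 3 credits per course (a course weight the proposal implies but does not state explicitly):

```python
# EdDEL credit check: 13 courses at an assumed 3 credits each, plus the
# minimum 6-credit dissertation block, should equal the 45-credit total.
COURSES_IN_RESIDENCE = 13
CREDITS_PER_COURSE = 3          # assumed standard course weight
DISSERTATION_BLOCK_MIN = 6      # includes at least 2 credits of EDAD 9999
REQUIRED_TOTAL = 45             # credits beyond the master's degree

total = COURSES_IN_RESIDENCE * CREDITS_PER_COURSE + DISSERTATION_BLOCK_MIN
assert total == REQUIRED_TOTAL, total
print(f"{total} credits beyond the master's degree")
```

Under that assumption the numbers reconcile: 13 courses contribute 39 credits, and the dissertation block supplies the remaining 6.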
Core Courses & Course Sequence
The program will consist of a fixed set of courses offered in a
specific sequence, Table 1. All students in each cohort will take
the same courses in the same sequence. Consequently, course
curricula will more explicitly build upon one another, and
certain themes will bridge multiple courses. In addition, as
students proceed through the program, we will orchestrate
opportunities for them to develop deep interpersonal and
professional relationships in support of their coursework, the
dissertation process, and their practice.
The executive program will be open to students working in a
wide array of contexts. That said, throughout our courses we
will continue to emphasize issues traditionally associated with
urban school systems: disadvantages related to low socio-
economic status, institutional and individual biases based on
race and class, unequal distribution of resources, and the
political and organizational implications of pursuing social justice in schools.
We propose changing some of the current course titles to more
accurately reflect their content, as indicated below.
· EDAD 8461: Ethical Educational Leadership → Ethical Leadership
· EDAD 8635: Current Issues in Educational Policy → Education Policy Analysis
· EDAD 8653: Educational Leadership as Civic Leadership → Civic Leadership
· EDAD 8755: Theoretical Perspectives/Organizational Dynamics of Education Leadership → Organizational Dynamics
· EDAD 8553: Profiles of Democratic and Ethical Leadership → Democratic, Equitable, and Ethical Leadership
Table 1. Proposed Executive EdD in Educational Leadership
Curriculum and Pathway
Summer 1
EPSY 8627
Introduction to Research Design and Methods
EDAD 8461
Ethical Leadership
Fall 1
EDUC 5325
Introduction to Statistics and Research
EDAD 8635
Education Policy Analysis
Spring 1
EDUC 5262
Introduction to Qualitative Research
EDAD 8653
Civic Leadership
Summer 2
EDAD 8636
Research for Change and Program Evaluation
EDAD 8755
Organizational Dynamics
Fall 2 *
EDUC 8093
Administrative Research Seminar
EDAD 8553
Democratic, Equitable, and Ethical Leadership
Spring 2
EDUC 9998
Dissertation Proposal Design
EDUC 5010
Special Topics in Education: Trends in Special Education
Summer 3
AOD 5534
Group Facilitation and Consultation
Fall 3
EDUC 9999
Doctor of Education Dissertation
Spring 3
EDUC 9999
Doctor of Education Dissertation
*Comprehensive exams will take place at the end of this
semester.
Program Requirements: Dissertation
The majority of program graduates will continue to work in a
practical field, and therefore dissertation studies will involve
addressing a pressing problem of practice. This problem must be
rich enough to require a thorough examination of the relevant
practical and theoretical literature, and yet specific enough to
yield actionable recommendations. Dissertation topics must also
be responsive to the set of methodological tools at the students’
disposal.
· Dissertation proposal: In the semester immediately following
completion of the Advanced Research Seminar and successful
completion of the comprehensive exam, students will enroll in
Dissertation Proposal Design (EDUC 9998). Like the Advanced
Research Seminar, EDUC 9998 will function as a structured,
intensive, cohort-based monthly workshop in which students
will design and defend a dissertation proposal that outlines a
rigorous plan for empirical study of an issue relevant to the
student’s professional responsibilities or aspirations. The
proposal must incorporate a thorough and critical review of
literature relevant to the topic, a discussion of theoretical
approaches to understanding and studying the topic and a
conceptual or theoretical framework that will guide the study,
and a robust methodological plan (including assurances of
completing IRB review and any interview or other protocols
necessary to submit for IRB review). Dissertation proposal
defense will occur at any point during or at the end of the
semester and students will receive feedback from the faculty
adviser, other committee members, and their cohort peers during
their defense.
· Dissertation: Following successful defense of the dissertation
proposal and after securing IRB approval, students will carry
out an original research project intended to make a significant
practice-based contribution to the field. While the EdD
dissertation is meant to have practical and applied relevance,
however, it is nonetheless expected to engage rigorously with
existing literature and theory appropriate to the student’s
chosen topic and to demonstrate the student’s ability to execute
robust methods appropriate to the student’s research
question(s).
Dissertation requirements
The EdD dissertation is distinct from the PhD dissertation; the
intent of the dissertation is not to build theory but rather to
make a substantive contribution to practice-focused scholarship
in a particular domain of K-12 Educational Leadership.
· Dissertation study report. EdD students will complete as the
dissertation a standard academic manuscript (inclusive of an
introduction, literature review, conceptual/theoretical
framework, methodology, results, discussion, and references).
EdD dissertations are typically less lengthy than PhD
dissertations with a smaller scope of theorizing and data
collection, but are held to the same standards as PhD
dissertations with respect to methodological validity, data
analysis, and writing quality and clarity.
· White Paper/Executive Summary. Because of the practice
focus of the EdDEL program, in addition to completion of a
dissertation study report, students will be required to produce a
white paper/executive summary distilling the lessons of their
research for practitioners in their field.
Requirements for the dissertation will otherwise remain
consistent with those of the current EdD in Educational
Leadership and as defined by the graduate school, including the
composition of the dissertation committee and processes for
dissertation defense and submission.
Course Scheduling & Curriculum Grid
The executive program will require a total of 45 credits beyond
the earned master’s degree. During the first two years, students
will take two courses in each of the fall and spring semesters,
and two courses will be offered in the summers during an
intensive seven-day session. In the final year, students will take
one course per term.
IV. Impact on Faculty & Students
Faculty. The courses offered in the executive program are
equivalent to the traditional course offerings, and will be taught
by the faculty who currently teach them. Core program faculty
will serve as advisors for 3–5 students every other year, and
they will usually, but not always, become the chair of each of
those students’ dissertation committees. Current program
faculty include Christopher McGinley, Steve Gross, Sarah
Cordes, and John Hall.
Given the program's strong focus on practice, dissertation
committees will include two qualified faculty members from the
College of Education and a qualified practitioner who is
external to the university. The external member must
successfully complete a selection process developed by the
program faculty.
The courses offered in the program sort into four main groups,
and each group of courses will be managed by a program faculty
member. This does not mean that a group’s faculty member
teaches each of the courses, but he or she is responsible for
reviewing and adjusting the curriculum of these courses,
developing coherence among the courses, and coordinating with
other faculty members to develop coherence across the program.
The groups, and the person responsible, are as follows:
Research Methodology: Sarah Cordes; Organization and Policy:
John Hall; Equity and Ethics: Steve Gross; The Practice of
Leadership: Christopher McGinley.
Students. Students enrolled in the executive program will take
coursework scheduled to accommodate a demanding workweek,
and with support and guidance they will be able to complete the
program in three years. This approach appeals to professionals
who want a high-quality education that is concentrated,
efficient, and predictable.
We will select 18 to 20 students for each cohort, and cohorts
will be admitted every two years.
In the first two years, the program will comprise two courses
per semester, offered on weekends, and two courses per
summer, offered during an intensive seven-day session. In the
final year, students will take one course per term, unless they
are pursuing superintendent licensure, in which case they will
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E
Doctor of Education in Educational LeadershipThe Doctor of E

More Related Content

Similar to Doctor of Education in Educational LeadershipThe Doctor of E

1 PA 315 SPRING 2020 GOVERNMENT-BUSINESS .docx
      1    PA 315 SPRING 2020    GOVERNMENT-BUSINESS .docx      1    PA 315 SPRING 2020    GOVERNMENT-BUSINESS .docx
1 PA 315 SPRING 2020 GOVERNMENT-BUSINESS .docxShiraPrater50
 
1 PA 315 SPRING 2020 GOVERNMENT-BUSINESS .docx
1    PA 315 SPRING 2020    GOVERNMENT-BUSINESS .docx1    PA 315 SPRING 2020    GOVERNMENT-BUSINESS .docx
1 PA 315 SPRING 2020 GOVERNMENT-BUSINESS .docxadkinspaige22
 
1 PA 315 SPRING 2020 GOVERNMENT-BUSINESS
      1    PA 315 SPRING 2020    GOVERNMENT-BUSINESS       1    PA 315 SPRING 2020    GOVERNMENT-BUSINESS
1 PA 315 SPRING 2020 GOVERNMENT-BUSINESS hirstcruz
 
Hey Carzetta,  You did a beautiful job on your char.docx
Hey Carzetta,  You did a beautiful job on your char.docxHey Carzetta,  You did a beautiful job on your char.docx
Hey Carzetta,  You did a beautiful job on your char.docxhanneloremccaffery
 
Year 2014Summer Semester Prepared by Elena Ashley & Ahma.docx
Year 2014Summer Semester Prepared by Elena Ashley & Ahma.docxYear 2014Summer Semester Prepared by Elena Ashley & Ahma.docx
Year 2014Summer Semester Prepared by Elena Ashley & Ahma.docxjeffevans62972
 
Pgbm161+module+guide+oct+2020+starts
Pgbm161+module+guide+oct+2020+startsPgbm161+module+guide+oct+2020+starts
Pgbm161+module+guide+oct+2020+startsMefratechnologies
 
BUS 500 SyllabusMASTER IN BUSINESS ADMINISTRATION PROGRAM
BUS 500 SyllabusMASTER IN BUSINESS ADMINISTRATION PROGRAMBUS 500 SyllabusMASTER IN BUSINESS ADMINISTRATION PROGRAM
BUS 500 SyllabusMASTER IN BUSINESS ADMINISTRATION PROGRAMVannaSchrader3
 
Conceptual Curriculum Design
Conceptual Curriculum DesignConceptual Curriculum Design
Conceptual Curriculum DesignChris Lawton
 
Assignment BriefBUSI72804ToBUSI7280 StudentsFrom.docx
Assignment BriefBUSI72804ToBUSI7280 StudentsFrom.docxAssignment BriefBUSI72804ToBUSI7280 StudentsFrom.docx
Assignment BriefBUSI72804ToBUSI7280 StudentsFrom.docxbraycarissa250
 
Cnics mentoring program
Cnics mentoring programCnics mentoring program
Cnics mentoring programJames Kahn
 
The Catholic University of America Metropolitan School of .docx
The Catholic University of America Metropolitan School of .docxThe Catholic University of America Metropolitan School of .docx
The Catholic University of America Metropolitan School of .docxmattinsonjanel
 
Running head EVALUATION OF CLINICAL PRACTICE PROGRAM EVALUATIO.docx
Running head EVALUATION OF CLINICAL PRACTICE PROGRAM EVALUATIO.docxRunning head EVALUATION OF CLINICAL PRACTICE PROGRAM EVALUATIO.docx
Running head EVALUATION OF CLINICAL PRACTICE PROGRAM EVALUATIO.docxcharisellington63520
 
Scholarly Writings.docx
Scholarly Writings.docxScholarly Writings.docx
Scholarly Writings.docxwrite12
 
EE Introduction Presentation (Students) Class of 2022.pptx
EE Introduction Presentation (Students) Class of 2022.pptxEE Introduction Presentation (Students) Class of 2022.pptx
EE Introduction Presentation (Students) Class of 2022.pptxFrankAlfano6
 
Employee Goal Setting ToolkitWhat it doesThe Employee Goa.docx
Employee Goal Setting ToolkitWhat it doesThe Employee Goa.docxEmployee Goal Setting ToolkitWhat it doesThe Employee Goa.docx
Employee Goal Setting ToolkitWhat it doesThe Employee Goa.docxchristinemaritza
 
Bus module outline
Bus module outlineBus module outline
Bus module outlineAng Averllen
 
Business Module Outline
Business Module OutlineBusiness Module Outline
Business Module OutlineYung Kai
 

Similar to Doctor of Education in Educational LeadershipThe Doctor of E (20)

1 PA 315 SPRING 2020 GOVERNMENT-BUSINESS .docx
      1    PA 315 SPRING 2020    GOVERNMENT-BUSINESS .docx      1    PA 315 SPRING 2020    GOVERNMENT-BUSINESS .docx
1 PA 315 SPRING 2020 GOVERNMENT-BUSINESS .docx
 
1 PA 315 SPRING 2020 GOVERNMENT-BUSINESS .docx
1    PA 315 SPRING 2020    GOVERNMENT-BUSINESS .docx1    PA 315 SPRING 2020    GOVERNMENT-BUSINESS .docx
1 PA 315 SPRING 2020 GOVERNMENT-BUSINESS .docx
 
1 PA 315 SPRING 2020 GOVERNMENT-BUSINESS
      1    PA 315 SPRING 2020    GOVERNMENT-BUSINESS       1    PA 315 SPRING 2020    GOVERNMENT-BUSINESS
1 PA 315 SPRING 2020 GOVERNMENT-BUSINESS
 
Hey Carzetta,  You did a beautiful job on your char.docx
Hey Carzetta,  You did a beautiful job on your char.docxHey Carzetta,  You did a beautiful job on your char.docx
Hey Carzetta,  You did a beautiful job on your char.docx
 
An example IDP
An example IDPAn example IDP
An example IDP
 
Year 2014Summer Semester Prepared by Elena Ashley & Ahma.docx
Year 2014Summer Semester Prepared by Elena Ashley & Ahma.docxYear 2014Summer Semester Prepared by Elena Ashley & Ahma.docx
Year 2014Summer Semester Prepared by Elena Ashley & Ahma.docx
 
Pgbm161+module+guide+oct+2020+starts
Pgbm161+module+guide+oct+2020+startsPgbm161+module+guide+oct+2020+starts
Pgbm161+module+guide+oct+2020+starts
 
BUS 500 SyllabusMASTER IN BUSINESS ADMINISTRATION PROGRAM
BUS 500 SyllabusMASTER IN BUSINESS ADMINISTRATION PROGRAMBUS 500 SyllabusMASTER IN BUSINESS ADMINISTRATION PROGRAM
BUS 500 SyllabusMASTER IN BUSINESS ADMINISTRATION PROGRAM
 
Conceptual Curriculum Design
Conceptual Curriculum DesignConceptual Curriculum Design
Conceptual Curriculum Design
 
Assignment BriefBUSI72804ToBUSI7280 StudentsFrom.docx
Assignment BriefBUSI72804ToBUSI7280 StudentsFrom.docxAssignment BriefBUSI72804ToBUSI7280 StudentsFrom.docx
Assignment BriefBUSI72804ToBUSI7280 StudentsFrom.docx
 
Cnics mentoring program
Cnics mentoring programCnics mentoring program
Cnics mentoring program
 
The Catholic University of America Metropolitan School of .docx
The Catholic University of America Metropolitan School of .docxThe Catholic University of America Metropolitan School of .docx
The Catholic University of America Metropolitan School of .docx
 
Running head EVALUATION OF CLINICAL PRACTICE PROGRAM EVALUATIO.docx
Running head EVALUATION OF CLINICAL PRACTICE PROGRAM EVALUATIO.docxRunning head EVALUATION OF CLINICAL PRACTICE PROGRAM EVALUATIO.docx
Running head EVALUATION OF CLINICAL PRACTICE PROGRAM EVALUATIO.docx
 
SLO Training
SLO TrainingSLO Training
SLO Training
 
Nursing Question.docx
Nursing Question.docxNursing Question.docx
Nursing Question.docx
 
Scholarly Writings.docx
Scholarly Writings.docxScholarly Writings.docx
Scholarly Writings.docx
 
EE Introduction Presentation (Students) Class of 2022.pptx
EE Introduction Presentation (Students) Class of 2022.pptxEE Introduction Presentation (Students) Class of 2022.pptx
EE Introduction Presentation (Students) Class of 2022.pptx
 
Employee Goal Setting ToolkitWhat it doesThe Employee Goa.docx
Employee Goal Setting ToolkitWhat it doesThe Employee Goa.docxEmployee Goal Setting ToolkitWhat it doesThe Employee Goa.docx
Employee Goal Setting ToolkitWhat it doesThe Employee Goa.docx
 
Bus module outline
Bus module outlineBus module outline
Bus module outline
 
Business Module Outline
Business Module OutlineBusiness Module Outline
Business Module Outline
 

More from DustiBuckner14

Your new clientsThe Wagner’s – Scott and Ella are a young marri.docx
Your new clientsThe Wagner’s – Scott and Ella are a young marri.docxYour new clientsThe Wagner’s – Scott and Ella are a young marri.docx
Your new clientsThe Wagner’s – Scott and Ella are a young marri.docxDustiBuckner14
 
Writing Conclusions for Research PapersWhat is the purpose.docx
Writing Conclusions for Research PapersWhat is the purpose.docxWriting Conclusions for Research PapersWhat is the purpose.docx
Writing Conclusions for Research PapersWhat is the purpose.docxDustiBuckner14
 
What Is Septic TankSeptic or septic typically is used t.docx
What Is Septic TankSeptic or septic typically is used t.docxWhat Is Septic TankSeptic or septic typically is used t.docx
What Is Septic TankSeptic or septic typically is used t.docxDustiBuckner14
 
· You should respond to at least two of your peers by extending, r.docx
· You should respond to at least two of your peers by extending, r.docx· You should respond to at least two of your peers by extending, r.docx
· You should respond to at least two of your peers by extending, r.docxDustiBuckner14
 
You are a medical student working your way throughcollege and ar.docx
You are a medical student working your way throughcollege and ar.docxYou are a medical student working your way throughcollege and ar.docx
You are a medical student working your way throughcollege and ar.docxDustiBuckner14
 
[removed]THIS IEP INCLUDES FORMCHECKBOX Transitions.docx
[removed]THIS IEP INCLUDES     FORMCHECKBOX  Transitions.docx[removed]THIS IEP INCLUDES     FORMCHECKBOX  Transitions.docx
[removed]THIS IEP INCLUDES FORMCHECKBOX Transitions.docxDustiBuckner14
 
Using the Integrated Model of Work Motivation Figure 12.1 (Latham, 2.docx
Using the Integrated Model of Work Motivation Figure 12.1 (Latham, 2.docxUsing the Integrated Model of Work Motivation Figure 12.1 (Latham, 2.docx
Using the Integrated Model of Work Motivation Figure 12.1 (Latham, 2.docxDustiBuckner14
 
What We Can Afford” Poem By Shavar X. Seabrooks L.docx
What We Can Afford” Poem By Shavar  X. Seabrooks L.docxWhat We Can Afford” Poem By Shavar  X. Seabrooks L.docx
What We Can Afford” Poem By Shavar X. Seabrooks L.docxDustiBuckner14
 
What are the techniques in handling categorical attributesHow.docx
What are the techniques in handling categorical attributesHow.docxWhat are the techniques in handling categorical attributesHow.docx
What are the techniques in handling categorical attributesHow.docxDustiBuckner14
 
University of the CumberlandsSchool of Computer & Information .docx
University of the CumberlandsSchool of Computer & Information .docxUniversity of the CumberlandsSchool of Computer & Information .docx
University of the CumberlandsSchool of Computer & Information .docxDustiBuckner14
 
Theresa and Mike fully support creating a code of conduct for th.docx
Theresa and Mike fully support creating a code of conduct for th.docxTheresa and Mike fully support creating a code of conduct for th.docx
Theresa and Mike fully support creating a code of conduct for th.docxDustiBuckner14
 
Unit VII 1. Suppose a firm uses sugar in a product tha.docx
Unit VII         1. Suppose a firm uses sugar in a product tha.docxUnit VII         1. Suppose a firm uses sugar in a product tha.docx
Unit VII 1. Suppose a firm uses sugar in a product tha.docxDustiBuckner14
 
Title PageThis spreadsheet supports STUDENT analysis of the case .docx
Title PageThis spreadsheet supports STUDENT analysis of the case .docxTitle PageThis spreadsheet supports STUDENT analysis of the case .docx
Title PageThis spreadsheet supports STUDENT analysis of the case .docxDustiBuckner14
 
Title If a compensation system works well for one business, that .docx
Title If a compensation system works well for one business, that .docxTitle If a compensation system works well for one business, that .docx
Title If a compensation system works well for one business, that .docxDustiBuckner14
 
Review the Article Below Keller, J. G., Miller, C., LasDulce, C.docx
Review the Article Below Keller, J. G., Miller, C., LasDulce, C.docxReview the Article Below Keller, J. G., Miller, C., LasDulce, C.docx
Review the Article Below Keller, J. G., Miller, C., LasDulce, C.docxDustiBuckner14
 
Teachers reach diverse learners by scaffolding instruction in ways t.docx
Teachers reach diverse learners by scaffolding instruction in ways t.docxTeachers reach diverse learners by scaffolding instruction in ways t.docx
Teachers reach diverse learners by scaffolding instruction in ways t.docxDustiBuckner14
 
ScenarioThe HIT Innovation Steering Committee of a large.docx
ScenarioThe HIT Innovation Steering Committee of a large.docxScenarioThe HIT Innovation Steering Committee of a large.docx
ScenarioThe HIT Innovation Steering Committee of a large.docxDustiBuckner14
 
Space ... the final frontier.  So, as I am sure everyone knows, .docx
Space ... the final frontier.  So, as I am sure everyone knows, .docxSpace ... the final frontier.  So, as I am sure everyone knows, .docx
Space ... the final frontier.  So, as I am sure everyone knows, .docxDustiBuckner14
 
The Internal EnvironmentInstitutionStudent’s name.docx
The Internal EnvironmentInstitutionStudent’s name.docxThe Internal EnvironmentInstitutionStudent’s name.docx
The Internal EnvironmentInstitutionStudent’s name.docxDustiBuckner14
 
THE RESEARCH PROPOSAL BUS8100 8Chapter 2 - Literature ReviewTh.docx
THE RESEARCH PROPOSAL BUS8100 8Chapter 2 - Literature ReviewTh.docxTHE RESEARCH PROPOSAL BUS8100 8Chapter 2 - Literature ReviewTh.docx
THE RESEARCH PROPOSAL BUS8100 8Chapter 2 - Literature ReviewTh.docxDustiBuckner14
 

More from DustiBuckner14 (20)

Your new clientsThe Wagner’s – Scott and Ella are a young marri.docx
Your new clientsThe Wagner’s – Scott and Ella are a young marri.docxYour new clientsThe Wagner’s – Scott and Ella are a young marri.docx
Your new clientsThe Wagner’s – Scott and Ella are a young marri.docx
 
Writing Conclusions for Research PapersWhat is the purpose.docx
Writing Conclusions for Research PapersWhat is the purpose.docxWriting Conclusions for Research PapersWhat is the purpose.docx
Writing Conclusions for Research PapersWhat is the purpose.docx
 
What Is Septic TankSeptic or septic typically is used t.docx
What Is Septic TankSeptic or septic typically is used t.docxWhat Is Septic TankSeptic or septic typically is used t.docx
What Is Septic TankSeptic or septic typically is used t.docx
 
· You should respond to at least two of your peers by extending, r.docx
· You should respond to at least two of your peers by extending, r.docx· You should respond to at least two of your peers by extending, r.docx
· You should respond to at least two of your peers by extending, r.docx
 
You are a medical student working your way throughcollege and ar.docx
You are a medical student working your way throughcollege and ar.docxYou are a medical student working your way throughcollege and ar.docx
You are a medical student working your way throughcollege and ar.docx
 
[removed]THIS IEP INCLUDES FORMCHECKBOX Transitions.docx
[removed]THIS IEP INCLUDES     FORMCHECKBOX  Transitions.docx[removed]THIS IEP INCLUDES     FORMCHECKBOX  Transitions.docx
[removed]THIS IEP INCLUDES FORMCHECKBOX Transitions.docx
 
Using the Integrated Model of Work Motivation Figure 12.1 (Latham, 2.docx
Using the Integrated Model of Work Motivation Figure 12.1 (Latham, 2.docxUsing the Integrated Model of Work Motivation Figure 12.1 (Latham, 2.docx
Using the Integrated Model of Work Motivation Figure 12.1 (Latham, 2.docx
 
What We Can Afford” Poem By Shavar X. Seabrooks L.docx
What We Can Afford” Poem By Shavar  X. Seabrooks L.docxWhat We Can Afford” Poem By Shavar  X. Seabrooks L.docx
What We Can Afford” Poem By Shavar X. Seabrooks L.docx
 
What are the techniques in handling categorical attributesHow.docx
What are the techniques in handling categorical attributesHow.docxWhat are the techniques in handling categorical attributesHow.docx
What are the techniques in handling categorical attributesHow.docx
 
University of the CumberlandsSchool of Computer & Information .docx
University of the CumberlandsSchool of Computer & Information .docxUniversity of the CumberlandsSchool of Computer & Information .docx
University of the CumberlandsSchool of Computer & Information .docx
 
Theresa and Mike fully support creating a code of conduct for th.docx
Theresa and Mike fully support creating a code of conduct for th.docxTheresa and Mike fully support creating a code of conduct for th.docx
Theresa and Mike fully support creating a code of conduct for th.docx
 
Unit VII 1. Suppose a firm uses sugar in a product tha.docx
Unit VII         1. Suppose a firm uses sugar in a product tha.docxUnit VII         1. Suppose a firm uses sugar in a product tha.docx
Unit VII 1. Suppose a firm uses sugar in a product tha.docx
 
Title PageThis spreadsheet supports STUDENT analysis of the case .docx
Title PageThis spreadsheet supports STUDENT analysis of the case .docxTitle PageThis spreadsheet supports STUDENT analysis of the case .docx
Title PageThis spreadsheet supports STUDENT analysis of the case .docx
 
Title If a compensation system works well for one business, that .docx
Title If a compensation system works well for one business, that .docxTitle If a compensation system works well for one business, that .docx
Title If a compensation system works well for one business, that .docx
 
Review the Article Below Keller, J. G., Miller, C., LasDulce, C.docx
Review the Article Below Keller, J. G., Miller, C., LasDulce, C.docxReview the Article Below Keller, J. G., Miller, C., LasDulce, C.docx
Review the Article Below Keller, J. G., Miller, C., LasDulce, C.docx
 
Teachers reach diverse learners by scaffolding instruction in ways t.docx
Teachers reach diverse learners by scaffolding instruction in ways t.docxTeachers reach diverse learners by scaffolding instruction in ways t.docx
Teachers reach diverse learners by scaffolding instruction in ways t.docx
 
ScenarioThe HIT Innovation Steering Committee of a large.docx
ScenarioThe HIT Innovation Steering Committee of a large.docxScenarioThe HIT Innovation Steering Committee of a large.docx
ScenarioThe HIT Innovation Steering Committee of a large.docx
 
Space ... the final frontier.  So, as I am sure everyone knows, .docx
Space ... the final frontier.  So, as I am sure everyone knows, .docxSpace ... the final frontier.  So, as I am sure everyone knows, .docx
Space ... the final frontier.  So, as I am sure everyone knows, .docx
 
The Internal EnvironmentInstitutionStudent’s name.docx
The Internal EnvironmentInstitutionStudent’s name.docxThe Internal EnvironmentInstitutionStudent’s name.docx
The Internal EnvironmentInstitutionStudent’s name.docx
 
THE RESEARCH PROPOSAL BUS8100 8Chapter 2 - Literature ReviewTh.docx
THE RESEARCH PROPOSAL BUS8100 8Chapter 2 - Literature ReviewTh.docxTHE RESEARCH PROPOSAL BUS8100 8Chapter 2 - Literature ReviewTh.docx
THE RESEARCH PROPOSAL BUS8100 8Chapter 2 - Literature ReviewTh.docx
 

Recently uploaded

call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️9953056974 Low Rate Call Girls In Saket, Delhi NCR
 
Proudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxProudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxthorishapillay1
 
Planning a health career 4th Quarter.pptx
Planning a health career 4th Quarter.pptxPlanning a health career 4th Quarter.pptx
Planning a health career 4th Quarter.pptxLigayaBacuel1
 
Grade 9 Q4-MELC1-Active and Passive Voice.pptx
Grade 9 Q4-MELC1-Active and Passive Voice.pptxGrade 9 Q4-MELC1-Active and Passive Voice.pptx
Grade 9 Q4-MELC1-Active and Passive Voice.pptxChelloAnnAsuncion2
 
Introduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxIntroduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxpboyjonauth
 
AmericanHighSchoolsprezentacijaoskolama.
AmericanHighSchoolsprezentacijaoskolama.AmericanHighSchoolsprezentacijaoskolama.
AmericanHighSchoolsprezentacijaoskolama.arsicmarija21
 
Alper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentAlper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentInMediaRes1
 
EPANDING THE CONTENT OF AN OUTLINE using notes.pptx
EPANDING THE CONTENT OF AN OUTLINE using notes.pptxEPANDING THE CONTENT OF AN OUTLINE using notes.pptx
EPANDING THE CONTENT OF AN OUTLINE using notes.pptxRaymartEstabillo3
 
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...JhezDiaz1
 
Influencing policy (training slides from Fast Track Impact)
Influencing policy (training slides from Fast Track Impact)Influencing policy (training slides from Fast Track Impact)
Influencing policy (training slides from Fast Track Impact)Mark Reed
 
ROOT CAUSE ANALYSIS PowerPoint Presentation
ROOT CAUSE ANALYSIS PowerPoint PresentationROOT CAUSE ANALYSIS PowerPoint Presentation
ROOT CAUSE ANALYSIS PowerPoint PresentationAadityaSharma884161
 
Field Attribute Index Feature in Odoo 17
Field Attribute Index Feature in Odoo 17Field Attribute Index Feature in Odoo 17
Field Attribute Index Feature in Odoo 17Celine George
 
Introduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher EducationIntroduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher Educationpboyjonauth
 
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...Nguyen Thanh Tu Collection
 
Quarter 4 Peace-education.pptx Catch Up Friday
Quarter 4 Peace-education.pptx Catch Up FridayQuarter 4 Peace-education.pptx Catch Up Friday
Quarter 4 Peace-education.pptx Catch Up FridayMakMakNepo
 
DATA STRUCTURE AND ALGORITHM for beginners
DATA STRUCTURE AND ALGORITHM for beginnersDATA STRUCTURE AND ALGORITHM for beginners
DATA STRUCTURE AND ALGORITHM for beginnersSabitha Banu
 
How to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERPHow to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERPCeline George
 

Recently uploaded (20)

call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
 
Proudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxProudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptx
 
Planning a health career 4th Quarter.pptx
Planning a health career 4th Quarter.pptxPlanning a health career 4th Quarter.pptx
Planning a health career 4th Quarter.pptx
 
TataKelola dan KamSiber Kecerdasan Buatan v022.pdf
TataKelola dan KamSiber Kecerdasan Buatan v022.pdfTataKelola dan KamSiber Kecerdasan Buatan v022.pdf
TataKelola dan KamSiber Kecerdasan Buatan v022.pdf
 
Grade 9 Q4-MELC1-Active and Passive Voice.pptx
Grade 9 Q4-MELC1-Active and Passive Voice.pptxGrade 9 Q4-MELC1-Active and Passive Voice.pptx
Grade 9 Q4-MELC1-Active and Passive Voice.pptx
 
Introduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxIntroduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptx
 
AmericanHighSchoolsprezentacijaoskolama.
AmericanHighSchoolsprezentacijaoskolama.AmericanHighSchoolsprezentacijaoskolama.
AmericanHighSchoolsprezentacijaoskolama.
 
Alper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentAlper Gobel In Media Res Media Component
Alper Gobel In Media Res Media Component
 
EPANDING THE CONTENT OF AN OUTLINE using notes.pptx
EPANDING THE CONTENT OF AN OUTLINE using notes.pptxEPANDING THE CONTENT OF AN OUTLINE using notes.pptx
EPANDING THE CONTENT OF AN OUTLINE using notes.pptx
 
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...


Doctor of Education in Educational Leadership

The Doctor of Education in Educational Leadership program in the College of Education prepares graduates to become effective administrators and visionary leaders. Students learn the skills required to lead organizations, manage change, and apply research and theory to real-world problems.

Executive Educational Leadership EdD program courses are taught by faculty with both academic credentials and experience as practitioners. Coursework combines the theoretical and methodological foundations of academic research with an applied focus, allowing students to develop the professional and interpersonal wisdom needed to successfully manage change in complex organizations. Graduates are well prepared to lead schools, school districts, and organizations, and possess the skills required to conduct, interpret, and evaluate research and data; diagnose and resolve organizational challenges; and create programs and policies that affect learning success.

This cohort-based executive graduate program consists of a fixed set of courses offered in a specific sequence, and all students in each cohort take the same courses in the same sequence. Courses are offered one weekend per month to accommodate the
schedules of working professionals.

All courses within the Executive Educational Leadership EdD program are offered at Temple University Center City. The program is designed to be completed on a part-time basis; students may complete the program in three years.

Related Graduate Degrees

ion and Human Development.

Supporting Materials

1. Transcripts: Submit official undergraduate and graduate transcripts from all accredited institutions you have attended and/or from which you earned credit. Official transcripts can be emailed to or sent to the Office of Enrollment Management address listed above (http://education.temple.edu/admissions/documents).

2. Goals Statement: Include an autobiographical personal statement that explains your reasons for pursuing a doctoral degree in education. The statement should address these questions: How have your personal, academic, and professional experiences shaped your research interests, and how might a doctoral program in Education help you
explore those interests? What academic/professional goals would the program help you to achieve following graduation? How does the doctoral program at Temple fit your individual interests, needs, and future goals (including the faculty member whose research best matches your own interests)?

3. Academic Writing Sample: This should be a paper written for a course within the past five years. If applicants do not have a recent paper written for a course, they should compose an op-ed piece on an educational issue of their choosing. The op-ed should be between 400 and 1,200 words and should be the kind of piece that might appear in The New York Times.

4. Recommendations: Submit two letters of reference that provide insight into your academic competence. References from college or university faculty members are recommended. You may request recommendations through your

Résumé: A current professional résumé is required.

School & Community Partnerships

As part of its mission to give back to the surrounding community and provide substantive fieldwork experiences for its students,

Project 2 Program Evaluation Proposal: Project 2 involves
designing a program evaluation for students' respective programs (i.e., Advocacy and Organizational Development; Educational Psychology; Applied Research and Evaluation). The proposal should address the impacts of the program, its implementation, or both. The proposal should clearly delineate a feasible evaluation plan that draws on course readings, lectures, exercises, and presentations. The proposal is worth 25% of your total grade and should address all elements described below.

Deliverable: Proposals should address the elements detailed below. Proposals should be typewritten and double-spaced in 12pt font. Submit proposals by end of day 30 July 2022 via Canvas.

ELEMENTS OF THE PROPOSAL

The proposal shall include: (a) an introduction, (b) method, (c) proposed analysis, (d) discussion, and (e) a reference page. Consult Mertens and Wilson (2019) and the APA Manual (2010) to address all elements of the proposal. Brief descriptions for each section are provided below.

a. Introduction – Provide relevant background and focus for the evaluation proposal. What does the program seek to accomplish? Why is it important? What are the goals, objectives, and purposes of the program? What is the program's "theory of cause and effect" (i.e., why and how will the program accomplish its goals, objectives, and purposes)? What question(s) does the evaluation seek to answer? Why are these questions selected? Is
there any literature that can inform the evaluation (reference it appropriately)? See Mertens and Wilson (2019), Chapters 7-8.

This should help you understand the dynamics of the program I am currently in. Below are my thoughts/experiences.

The program seeks to develop well-rounded educational leaders who are ethical and prepared to run and make big decisions for school systems, school networks, and school districts. The program is run cohort-style, where you experience all classes with a set group of classmates who become your network, your resources, and your group presentation buddies.

The program theory of cause and effect: the program will accomplish its goals by providing students with exposure to real-world, education- and school-based problems and projects that have to be solved and presented in groups, and by exposing students to courses that require you to develop your understanding as a researcher and get in tune with education research and education "wicked problems."

The evaluator seeks to answer the question of whether the EdD program is being responsive to the drastic change in needs by schools in America in a post-pandemic world where education is in crisis. Is the program being flexible scheduling-wise by not providing a fully remote model and still requiring students to come in person? Is it providing enough practical experience with the courses offered? Is the program creating critical and analytical thinkers? How is the program providing consistency for students, considering faculty leaving and students needing new advisors? Is the program reasonable, considering it costs about $30k a year, requires summers in person, and does not offer
financial aid? Is the EdD worth it, since it is considered lesser than the PhD and is not a requirement to be a leader in education in America?

The required courses we take are listed below.

Year 1
Summer II
  EDAD 8461  Ethical Leadership  3
  EPSY 8627  Introduction to Research Design and Methods  3
  Term Credit Hours: 6
Fall
  EDAD 8635  Education Policy Analysis  3
  EDUC 5325  Introduction to Statistics and Research  3
  Term Credit Hours: 6
Spring
  EDAD 8653  Civic Leadership  3
  EDUC 5262  Introduction to Qualitative Research  3
  Term Credit Hours: 6

Year 2
Summer II
  EDAD 8636  Research for Change  3
  EDAD 8755  Organizational Dynamics  3
  Term Credit Hours: 6
Fall
  EDAD 8093  Administration Research Seminar  3
  EDAD 8553  Democratic, Equitable, and Ethical Leadership  3
  Term Credit Hours: 6
Spring
  EDUC 5010  Special Topics in Education  3
  EDUC 9998  Dissertation Proposal Design  3
  Term Credit Hours: 6

Year 3
Summer II
  AOD 5534  Group Facilitation and Consultation  3
  Term Credit Hours: 3
Fall
  EDUC 9999  Doctor of Education Dissertation  3
  Term Credit Hours: 3
Spring
  EDUC 9999  Doctor of Education Dissertation  3
  Term Credit Hours: 3

Total Credit Hours:

Chapter 9 Notes

The purpose of evaluation is to determine the merit or worth of an evaluand. That is, we want to know whether a program had the intended effect on its participants, as specified by the program's theory and model. Our ability to faithfully and confidently determine the effects of a program is in part determined by the manner in which we design the evaluation. There are many ways to think about designs within the context of evaluation, and designing an evaluation is a complex endeavor. Moreover, it is important to note that different designs can be used for different types of applications. Regardless of how we conceptualize and frame the relationship between the purposes and methods of an evaluation process, there are two major questions that have to be explicitly addressed:
1. To what extent are the effects we observe in participants really due to the program and not some other reason?
2. To what extent can the results observed in participants be expected to generalize (extend to) other situations?

Both of these questions pertain more formally to the concept of validity, and there are two specific forms of validity that we as evaluators must be concerned with:

• Internal validity – Refers to the extent to which a research design includes enough control of the conditions and experiences of participants that it can demonstrate a single unambiguous explanation for a manipulation, that is, cause and effect. To what extent are the effects we observe in participants really due to the program and not some other reason?

When we have adequately attended to issues involving internal validity within the evaluation process, it means that the evaluator has controlled the effects of variables other than the treatment, in order to say with confidence that the results are reflective of the treatment. Hence, we can confidently say that the observed effects are caused by the program and nothing
else.

• External validity – Extent to which observations made in a study generalize beyond the specific manipulations or constraints in the study. To what extent can the results observed in participants be expected to generalize (extend to) other situations?

When we have adequately attended to issues involving external validity, it means that the evaluator has ensured that the participants of the program are representative of the population, and therefore that if the treatment is applied with another group of people from that population under similar circumstances, it should be effective there as well.

WHAT FACTORS DIMINISH OR THREATEN VALIDITY OF EVALUATIONS?

We can classify threats to the validity of our conclusions in terms of internal and external threats.

Internal Validity Threats

History – Events occurring during a study (other than the program
treatment) that can influence results.
Maturation – Naturally occurring physical or psychological changes in program participants (e.g., growth, development, aging) that can influence results.
Testing – Administration of a test before and after the program might influence scores on the test independent of the program (e.g., familiarity with the test results in changes in scores).
Instrumentation – Having a pretest and posttest that differ in terms of content, structure, format, or difficulty can lead to differences in scores that are due not to the program treatment but to differences in the instruments used.
Statistical regression – Having extreme groups in the program may artificially decrease or increase scores independent of the program treatment. If all members of a group are already scoring at the highest levels and their scores can't go any higher, any observed decline in scores may be due to the test, not the program treatment, indicating a measurement error.
Differential Selection – Differences between the groups compared (treatment vs. no-treatment groups) on important characteristics may account for observed differences, but these are not due to the program treatments.
Experimental Mortality – Differential dropout of participants in the treatment and no-treatment groups yields differences in observed effects that are not a function of the program treatment but rather an artifact of attrition within the groups.
Treatment Diffusion – Proximity among participants in the treatment and no-treatment groups leads to treatment exposure for the no-treatment group.
Compensatory rivalry – When the no-treatment group outperforms the treatment group, but those differences are due not to the treatment effects but to competition (the John Henry effect).
Compensatory Equalization of treatments – If one group receives something and the other receives nothing, then any effects on the first group may be due to the fact that this group received something, and not to the specifics of what it received.
Resentful Demoralization – When members of the no-treatment group realize they did not get something that the
treatment group received, they may become demoralized because they are being excluded, not because they did not get the specific treatment.

External Validity Threats

Selection Treatment Interaction – Refers to the possibility that the program results may be applicable only to the population from which the treatment and no-treatment groups were chosen; hence results may be internally valid but not generalizable.
Testing Treatment Interaction – Refers to the fact that the program results may be generalizable to other groups only when a pretest is also given.
Situation effects
Experimenter effects – Refers to the existence of multiple factors associated with the program itself; results may be due to a particularly charismatic instructor rather than the content of the
program.
Multiple treatment effects – Participants are involved in multiple programs at the time of the evaluation; hence the findings may not be generalizable to other settings because of the confounding of multiple treatments.
Population validity – Extent to which results observed in a study will generalize to the population from which a sample was selected. Homogeneous attrition: rates of attrition are about the same in
Ecological validity – Extent to which results observed in a study will generalize across settings or environments.
Temporal validity – Extent to which results observed in a study will generalize across time and at different points in time.
Outcome validity – Extent to which results observed in a study will generalize across different but related DVs (dependent variables).

HOW DO WE MITIGATE AGAINST THREATS TO INTERNAL AND EXTERNAL VALIDITY?
An evaluator can try to mitigate these potential threats by selecting an evaluation design that reduces the influence of the particular threat by the manner in which the design is executed. There are many ways to characterize evaluation designs: Mertens and Wilson (2012) distinguish between quantitative vs. qualitative data, but we can also classify designs in terms of being experimental, quasi-experimental, and non-experimental. I will use this latter classification to highlight how the various designs attempt to address the validity threats we just discussed.

Experimental research designs use methods and procedures to make observations in which the researcher fully controls the conditions and experiences of participants by applying three required elements of control: randomization, manipulation, and comparison/control.

Randomization – involves randomly selecting participants into the study so that all individuals have an equal chance of being included in the study; it also involves randomly assigning participants to the experimental conditions.
Manipulation – involves the systematic application of an experimental treatment.
Comparison/control – involves controlling who gets or does not get a particular treatment and ensuring that all other aspects of the experimental process are the same except for who gets or does not get a particular treatment.
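As a sketch of how these three elements of control fit together, consider the following hypothetical Python illustration (the participant labels and group sizes are invented for the example, not drawn from any actual study):

```python
import random

# Hypothetical pool of 20 study participants (labels invented for illustration).
participants = [f"participant_{i}" for i in range(20)]

random.seed(42)  # fixed seed so the sketch is reproducible

# Randomization: shuffle the pool so that assignment to a condition is
# independent of any characteristic of the participants.
shuffled = random.sample(participants, k=len(participants))

# Manipulation + comparison/control: half the pool receives the treatment;
# the other half experiences identical conditions without it.
treatment_group = shuffled[:10]
control_group = shuffled[10:]

assert set(treatment_group).isdisjoint(control_group)
assert len(treatment_group) == len(control_group) == 10
```

Because only the treatment differs between the two groups, an observed outcome difference can plausibly be attributed to the program; this is exactly the control that quasi-experimental and non-experimental designs lack.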
Experimental research designs are the only research designs capable of establishing cause-effect relationships. To demonstrate that one factor causes changes in a dependent variable, the conditions and experiences of participants must be under the full control of the researcher. This often means that an experiment is conducted in a laboratory and not in an environment where a behavior may occur naturally.

Strength: Capable of demonstrating cause and effect.
Limitation: Behavior that occurs under controlled conditions may not be the same as behavior that occurs in a natural environment.

We can categorize experimental research designs into one of four possible types of designs. Box 9.4 in Mertens & Wilson (p. 316) provides an alternative way to conceptualize designs. You will note that (R) designates randomization for all of those designs, (O) indicates an observation, and (X) denotes a treatment.

There are five different experimental designs which we can use to evaluate the impact of a program. Each one affords particular advantages that, if relevant to the validity concerns and purpose of the evaluation, enable you to more faithfully assess the program. Whether you are able to employ these designs for evaluation depends on whether or not you can randomize, manipulate, and control. To the extent that you can randomize (randomly select/randomly assign participants to a
treatment and no-treatment group), manipulate (manipulate which group receives a treatment and which group does not), and control (control for extraneous factors that may influence or impact participants and that do not involve the treatment itself, e.g., control for the effects of lighting and temperature on performance), then you are able to use one of the experimental designs described (see pp. 316-319). Besides practical concerns, you have to think about ethical concerns with regard to the potential risks and benefits of randomizing, manipulating, and controlling the treatment and the participants: how ethical is it to withhold a potential treatment for cancer from a terminally ill patient?

If you cannot randomize, manipulate, or control within your evaluation design, then the alternative is to employ a quasi-experimental design. To be an experimental design, a design must meet the following three elements of control:
1. Randomization
2. Manipulation
3. Comparison/control group

Quasi-experiments are similar to an experiment, except that this design does one or both of the following:
• Includes a quasi-independent variable. Quasi-independent variable: a preexisting variable that is often a characteristic inherent to an individual, which differentiates the groups or conditions being compared in a research study (e.g., gender (man, woman), health status (lean, overweight, obese)).
• Lacks an appropriate or equivalent control group.

Strength: Allows researchers to study factors related to the unique characteristics of participants.
Limitation: Cannot demonstrate cause and effect.
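A minimal sketch of what a quasi-experimental comparison looks like in practice (the outcome scores below are invented for illustration): because group membership is preexisting rather than randomly assigned, the observed difference cannot be read as cause and effect.

```python
# Hypothetical posttest scores for two preexisting groups defined by a
# quasi-independent variable (e.g., program completers vs. non-completers).
# All numbers are invented for illustration.
completers = [78, 85, 90, 72, 88]
non_completers = [70, 65, 80, 60, 75]

def mean(scores):
    return sum(scores) / len(scores)

difference = mean(completers) - mean(non_completers)
print(f"Mean difference: {difference:.1f} points")  # → Mean difference: 12.6 points

# Because nothing was randomized or manipulated, this difference may reflect
# differential selection (a threat listed above) rather than the program.
```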
Again, there are many ways to classify the various types of quasi-experimental designs; what is most important is to pay attention to the design that matches and addresses the purposes and validity threats that may influence and impact the evaluation. Mertens and Wilson describe the relevant issues with regard to quasi-experimental designs in Box 9.5 (pp. 320-325).

WHAT ABOUT OTHER DESIGNS THAT DO NOT CONFORM TO THE EXPERIMENT AND QUASI-EXPERIMENT CLASSIFICATION?

The last category of designs involves what I refer to as non-experimental designs, or what Mertens and Wilson (2013) classify as qualitative designs. These designs do not share any of the characteristics that are required for experimentation (i.e., randomization, manipulation, control). These designs use methods and procedures to make observations in which the behavior or event is observed "as is," without an intervention from the researcher.

Strength: Can be used to make observations in the settings where the behaviors and events being observed naturally operate (e.g., interactions between an athlete and coach during a game).
Limitation: Lacks the control needed to demonstrate cause and effect.
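One non-experimental approach, the correlational design described next, relies on the correlation coefficient r. Here is a minimal from-scratch sketch of Pearson's r (the paired measurements are invented for illustration; real evaluations would use a statistics package):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Invented paired measurements: hours of mentoring vs. a leadership
# self-efficacy rating (hypothetical data, for illustration only).
hours = [1, 2, 3, 4, 5]
rating = [2, 4, 5, 4, 5]

r = pearson_r(hours, rating)
assert -1.0 <= r <= 1.0  # r always falls in [-1.0, +1.0]
print(round(r, 2))  # → 0.77, a fairly strong positive linear relationship
```

Note that a strong r only shows that the two factors move together; as the limitation above states, it cannot by itself demonstrate that one causes the other.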
Correlational Designs
• Measurement of two or more factors to determine or estimate the extent to which the values for the factors are related or change in an identifiable pattern.
• Correlation coefficient: statistic used to measure the strength and direction of the linear relationship, or correlation, between two factors.
• The value of r can range from -1.0 to +1.0.

Naturalistic Observation
The observation of behavior in the natural setting where it is expected to occur, with limited or no attempt to overtly manipulate the conditions of the environment where the observations are made (e.g., buying behavior in a grocery store, parenting behavior in a residential home). Generally associated with high external validity but low internal validity.

Qualitative Designs
• Use of the scientific method to make nonnumeric observations, from which conclusions are drawn without the use of statistical analysis.
• Adopts the assumption of determinism; however, it does not assume that behavior itself is
universal.
• Determinism: assumption in science that all actions in the universe have a cause.
• Based on the holistic view, or "complete picture," that reality changes and behavior is dynamic.

Phenomenology (Individual)
• Analysis of the conscious experiences of phenomena from the first-person point of view.
• The researcher interviews a participant, then constructs a narrative to summarize the experiences described in the interview.
• A conscious experience is any experience that a person has lived through or performed and can bring to memory.
• The researcher must be considerate of the intentionality, or meaning, of a participant's conscious experiences.
• Identify objects of awareness, which are those things that bring an experience to consciousness.

Ethnography (Group)
• Analysis of the behavior and identity of a group or culture as it is described and characterized by the members of that group or culture.
• A culture is a "shared way of life" that includes patterns of interaction, shared beliefs and understandings, adaptations to the environment, and many more factors.
• To observe a group or culture, it is often necessary to get close up to, or participate in, that group or culture.
• To gain entry into a group or culture without causing participants to react or change their behavior:
  • Researchers can covertly enter a group.
  • Researchers can announce or request entry into a group.
• Participant observation: researchers participate in or join the group or culture they are observing.
• Researchers need to remain neutral in how they interact with members of the group.
• Common pitfalls associated with participant observation:
  • The "eager speaker" bias
  • The "good citizen" bias
  • The "stereotype" bias

Case Study
Analysis of an individual, group, organization, or event used to illustrate a phenomenon, explore new hypotheses, or compare the observations of many cases.
Case history: An in-depth description of the history and
background of the individual, group, or organization observed. A case history can be the only information provided in a case study for situations in which the researcher does not include a manipulation, treatment, or intervention.
Illustrative: investigates rare or unknown cases.
Exploratory: preliminary analysis that explores potentially important hypotheses.

Case studies have two common applications:
1. General inquiry
2. Theory development

The level of control in a research design is directly related to internal validity, or the extent to which the research design can demonstrate cause and effect. Experimental research designs have the greatest control and therefore the highest internal validity. Non-experimental research designs typically have the least control and therefore the lowest internal validity.

Internal validity – Extent to which a research design includes enough control of the conditions and experiences of participants that it can demonstrate a single unambiguous explanation for a manipulation, that is, cause and effect.
External validity – Extent to which observations made in a study generalize beyond the specific manipulations or constraints in the study.
  • 24. Constraint: Any aspect of the research design that can limit observations to the specific conditions or manipulations in a study See also Merten & Wilson (2102) Box 9.8 Chapter 9 Notes The purpose of evaluation is to determine the merit or worth of an evaluand. That is, we want to know whether a program had the intended effect on its participants as specified by the programs theory and model. Our ability to faithfully and confidently determine the effects of a program are in part determined by the manner in which we design the evaluation. There are many ways to think about designs within the context of evaluation and designing evaluation is a complex endeavor. Moreover, it is important to note that different designs can be used for different types of applications. Regardless of how we conceptualize and frame the relationship between the purposes and methods of an evaluation process, there are two major questions that we have to be explicitly addressed: 1. To what extent are the effects we observe in participants really due to the program and not some other reason?
  • 25. 2. To what extent can the results observed in participants be expected to generalize (extend to) other situations? Both of these questions pertain more formally to the concept of validity. And there are two specific forms of validity that as evaluators we must be concerned with: • Internal validity – Refers to the extent to which a research design includes enough control of the conditions and experiences of participants that it can demonstrate a single unambiguous explanation for a manipulation, that is cause and effect. To what extent are the effects we observe in participants really due to the program and not some other reason? When we have adequately attended to issues involving internal validity within the evaluation process it means that an evaluator has controlled the effects of variables other than the treatment, in order to say with confidence that the results are reflective of the treatment. Hence, we can confidently say that the observed effects are cause by the program and nothing else. • External validity – Extent to which observations made in a study generalize beyond the specific manipulations or constraints in the study
  • 26. To what extent can the results observed in participants be expected to generalize (extend to) other situations? When we have adequately attended to issues involving external validity it means that an evaluator has ensured that the participants of the program are representative of the population, and therefore that if the treatment is applied with another group of people from that population under similar circumstances, it should be effective there as well WHAT FACTORS DIMINISH OR THREATEN VALIDITY OF EVALUATIONS? We can classify threats to the validity of our conclusions in terms of internal and external threats Threat Description History Events occurring during a study (other than the program treatment) that can influence results Maturation Naturally occurring physical or psychological changes in program participants (e.g., growth, development, aging) that can influence results
  • 27. Testing Administration of test before and after program might influence scores on test independent of program (e.g., familiarity with test results in changes in scores) Instrumentation Having pretest and posttest that differ in terms of content, structure, format or difficulty can lead to differences in scores but not due to program treatment but differences in the instruments used Statistical regression Having extreme groups in program may artificially decrease or increase scores independent of the program treatment—if all members of a group are already scoring at the highest levels and their scores cant go any higher, any observed decline in scores may be due to the test not program treatment indicating a measurement error Differential Selection Differences between groups compared (treatment vs. no- treatment groups) on important characteristics may account for observed differences but these are not due to the program treatments Experimental Mortality Differential dropout of participants in treatment and no- treatment groups yields differences in observed effects that are not a function of the
  • 28. program treatment but rather an artifact of attrition within the groups. Treatment Diffusion Proximity among participants in treatment and no-treatment groups leads to treatment exposure for the no-treatment group Compensatory rivalry When no-treatment group outperforms the treatment group, but those differences are not due to the treatment effects, but by competition –John Henry effect Compensatory Equalization of treatments If one group receives something and the other receives nothing, than any effects on the first group may be due to the fact that this group received something, and not to the specifics of what it received. Resentful Demoralization When members of the no-treatment group realize they did not get something that the treatment group received they may become demoralized because they are being excluded but not because they did not get the specific treatment
  • 29. External Validity Threats Threat Description Selection Treatment Interaction Refers to the possibility that the program results may be applicable to only to that population from which the treatment and no-treatment groups were chosen—hence results may be internally valid but not generalizable Testing Treatment Interaction Refers to the fact that the program results may be generalizable to other groups only when a pretest is also given. Situation effects Experimenter effects Refers to the existence of multiple factors associated w ith the program itself---results may be due to a particularly charismatic instructor rather than the content of the program Multiple treatment effects Participants are involved in multiple programs at the same time
  • 30. of the evaluation, hence the findings may not be generalizable to other settings because of the confounding of multiple treatments Population validity Extent to which results observed in a study will generalize to the population from which a ample was selected. Homogeneous attrition: Rates of attrition are about the same in Ecological validity Extent to which results observed in a study will generalize across settings or environments Temporal validity Extent to which results observed in a study will generalize across time and at different points in time Outcome validity Extent to which results observed in a study will generalize across different but related DVs HOW DO WE MITIGATE AGAINST THREATS TO INTERNAL AND EXTERNAL VALIDITY? An evaluator can try to mitigate against these potential threats by selecting an evaluation design that reduces the influence of the particular threat by the manner in which the design is executed. There are many ways by which to characterize evaluation designs— Mertens and Wilsons 2012 distinguish
  • 31. between quantitative vs. qualitative data; but we can also classify designs in terms of being experimental, quasi-experiemental and non-experimental. I will use this latter one to highlight how the various designs attempt to address the validity threats we just discussed. The experimental research designs use methods and procedures to make observations in which the researcher fully controls the conditions and experiences of participants by applying three required elements of control: randomization, manipulation, and comparison/control —involves randomly selecting participants into the study so that all individuals in a study; it also involves randomly assigning participants to the experimental conditions. n—involves the systematic application of an experimental treatment. —involves controlling who gets or does not get a particular treatment and ensuring that all other aspects of the experimental process are the same except for who gets or does not get a particular treatment. Experimental research designs are the only research design capable of establishing cause—effect relationships. To demonstrate that one factor causes changes in a dependent variables, the conditions and experiences of participants must be under the full control of the research. This often means that an
  • 32. experiment is conducted in a laboratory and not in an environment where a behavior may occur naturally. Strength: Capable of demonstrating cause and effect. Limitation: Behavior that occurs under controlled conditions may not be the same as behavior that occurs in a natural environment. We can categorize experimental research designs into several possible types; Box 9.4 in Mertens & Wilson (p. 316) provides an alternative way to conceptualize designs. You will note that (R) designates randomization for all of those designs, (O) indicates an observation, and (X) denotes a treatment. Each of the experimental designs we can use to evaluate the impact of a program affords particular advantages that, if relevant to the validity concerns and purpose of the evaluation, enable you to more faithfully assess the program. Whether you are able to employ these designs for evaluation depends on whether or not you can randomize, manipulate, and control. To the extent that you can randomize (randomly select/randomly assign participants to a treatment and a no-treatment group), manipulate (manipulate which group receives a treatment and which group does not), and control (control for extraneous factors, apart from the treatment itself, that may influence or impact participants, e.g., the effects of lighting and temperature on performance), then
  • 33. you are able to use one of the experimental designs described (see pp. 316-319). Besides practical concerns, you have to think about ethical concerns regarding the potential risks and benefits of randomizing, manipulating, and controlling the treatment and the participants; how ethical is it to withhold a potential treatment for cancer from a terminally ill patient? If you cannot randomize, manipulate, or control within your evaluation design, then the alternative is to employ a quasi-experimental design. To be an experimental design, a design must meet the following three elements of control: 1. Randomization 2. Manipulation 3. Comparison/control group. Quasi-experiments are similar to an experiment, except that this design does one or both of the following: it includes a quasi-independent variable (a preexisting variable that is often a characteristic inherent to an individual, which differentiates the groups or conditions being compared in a research study, e.g., gender (man, woman) or health status (lean, overweight, obese)), or it lacks an appropriate or equivalent control group. Strength: Allows researchers to study factors related to the unique characteristics of participants. Limitation: Cannot demonstrate cause and effect. Again, there are many ways to classify the various types of
  • 34. quasi-experimental designs; what is most important is to pay attention to the design that matches and addresses the purposes and validity threats that may influence and impact the evaluation. Mertens and Wilson describe the relevant issues with regard to quasi-experimental designs in Box 9.5 (pp. 320-325). WHAT ABOUT OTHER DESIGNS THAT DO NOT CONFORM TO THE EXPERIMENT AND QUASI-EXPERIMENT CLASSIFICATION? The last category of designs involves what I refer to as non-experimental designs, or what Mertens and Wilson (2012) classify as qualitative designs. These designs do not share any of the characteristics that are required for experimentation (e.g., randomization, manipulation, control). These designs use methods and procedures to make observations in which the behavior or event is observed "as is," without an intervention from the researcher. Strength: Can be used to make observations in the settings where the behaviors and events being observed naturally operate (e.g., interactions between an athlete and coach during a game). Limitation: Lacks the control needed to demonstrate cause and effect. Correlational Designs
  • 35. • Measurement of two or more factors to determine or estimate the extent to which the values for the factors are related or change in an identifiable pattern • Correlation coefficient: Statistic used to measure the strength and direction of the linear relationship, or correlation, between two factors • The value of r can range from -1.0 to +1.0 Naturalistic Observation The observation of behavior in the natural setting where it is expected to occur, with limited or no attempt to overtly manipulate the conditions of the environment where the observations are made (e.g., buying behavior in a grocery store, parenting behavior in a residential home). Generally associated with high external validity, but low internal validity. Qualitative Designs • Use of the scientific method to make nonnumeric observations, from which conclusions are drawn without the use of statistical analysis • Adopts the assumption of determinism; however, it does not assume that behavior itself is universal • Determinism: Assumption in science that all actions in the universe have a cause • Based on the holistic view, or "complete picture," that reality changes and behavior is dynamic
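The correlation coefficient r described under Correlational Designs above can be computed directly from two samples; a minimal Python sketch (the data below are invented purely for illustration):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# A perfect positive linear pattern yields r = +1.0;
# a perfect negative pattern yields r = -1.0.
hours = [1, 2, 3, 4, 5]
score = [10, 20, 30, 40, 50]
print(round(pearson_r(hours, score), 2))
```

In practice one would use a library routine (e.g., `scipy.stats.pearsonr`, which also returns a p-value), but the hand-rolled version makes the "strength and direction of a linear relationship" definition concrete.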
  • 36. Phenomenology (Individual) • Analysis of the conscious experiences of phenomena from the first-person point of view • The researcher interviews a participant, then constructs a narrative to summarize the experiences described in the interview • Conscious experience is any experience that a person has lived through or performed and can bring to memory • The researcher must be considerate of the intentionality, or meaning, of a participant's conscious experiences • Identify objects of awareness, which are those things that bring an experience to consciousness Ethnography (Group) • Analysis of the behavior and identity of a group or culture as it is described and characterized by the members of that group or culture • A culture is a "shared way of life" that includes patterns of interaction, shared beliefs and understandings, adaptations to the environment, and many more factors • To observe a group or culture, it is often necessary to get
  • 37. close up to or participate in that group or culture • To gain entry into a group or culture without causing participants to react or change their behavior: • Researchers can covertly enter a group • Researchers can announce or request entry into a group • Participant observation: Researchers participate in or join the group or culture they are observing • Researchers need to remain neutral in how they interact with members of the group • Common pitfalls associated with participant observation: • The "eager speaker" bias • The "good citizen" bias • The "stereotype" bias Case Study Analysis of an individual, group, organization, or event used to illustrate a phenomenon, explore new hypotheses, or compare the observations of many cases. Case history: An in-depth description of the history and background of the individual, group, or organization observed. A case history can be the only information provided in a case study for situations in which the researcher does not include a manipulation, treatment, or intervention
  • 38. Illustrative: Investigates rare or unknown cases. Exploratory: Preliminary analysis that explores potentially important hypotheses. Case studies have two common applications: 1. General inquiry 2. Theory development. The level of control in a research design is directly related to internal validity, or the extent to which the research design can demonstrate cause and effect. Experimental research designs have the greatest control and therefore the highest internal validity. Nonexperimental research designs typically have the least control and therefore the lowest internal validity. Internal validity: Extent to which a research design includes enough control of the conditions and experiences of participants that it can demonstrate a single unambiguous explanation for a manipulation, that is, cause and effect. External validity: Extent to which observations made in a study generalize beyond the specific manipulations or constraints in the study. Constraint: Any aspect of the research design that can limit observations to the specific conditions or manipulations in a study. See also Mertens & Wilson (2012), Box 9.8
  • 39. Reproduced with permission of the copyright owner. Further reproduction prohibited without permission. A Power Primer. Jacob Cohen. Psychological Bulletin [PsycARTICLES]; July 1992; 112, 1; pg. 155
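Cohen's (1992) power conventions can be made concrete in code. A minimal Python sketch of required per-group sample size for a two-sample comparison, using the standard normal approximation (exact t-based tables such as Cohen's give slightly larger values; the conventions assumed here are his: alpha = .05, power = .80, and small/medium/large effects of .20/.50/.80):

```python
import math
from statistics import NormalDist

def n_per_group(d, alpha=0.05, power=0.80):
    """Approximate per-group n for a two-tailed two-sample test
    at effect size d, via the normal approximation."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # critical z for alpha
    z_b = NormalDist().inv_cdf(power)          # z for desired power
    return math.ceil(2 * ((z_a + z_b) / d) ** 2)

# Cohen's conventional effect sizes for the mean-difference case
for d in (0.20, 0.50, 0.80):
    print(d, n_per_group(d))
```

For a medium effect (d = .50) this yields roughly 63 participants per group, close to the 64 in Cohen's exact tables; the point is how quickly required n grows as the expected effect shrinks.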
  • 40. Program Evaluation Proposal. An evaluation of the doctoral program in applied research psychology. Psychology 3512 Program Evaluation. Anonymous. An evaluation of the doctoral program in applied research psychology. This proposal presents an evaluation plan for the doctoral program in Psychology currently offered at the University of Texas at El Paso (UTEP). The Doctor of Philosophy (Ph.D.) program at UTEP is designed to produce bilingual/bicultural research psychologists that
  • 41. will be able to serve Hispanic populations in Texas. The Ph.D. program at UTEP aims "to prepare research psychologists to address questions applicable to the English-Spanish bicultural communities in the Southwest" (The University of Texas at El Paso, 1992, p. 8). Though the program is fairly new (September of 1993), its development can be traced back to the late 1970s when the faculty first began to initiate efforts to pursue a Ph.D. program for the department (J. V. Devine, personal communication, March 27, 1998). Since its development the program has enjoyed support from the University and the El Paso community. Program Overview The program aims to provide the local community, the state, and the nation with trained bilingual/bicultural research psychologists. In fact, there is no other program available to train bilingual/bicultural research psychologists in the nation (Jones, Keppel & Meissen, 1993). The Ph.D. program at UTEP aims to provide local residents from both the El Paso and Juarez communities with the opportunity to pursue Ph.D.-level training in
  • 42. psychology. Though there are other Ph.D. programs in the State of Texas, there are no Ph.D. programs--other than that currently offered by the UTEP department of psychology--available to the local community. Thus, the Ph.D. program will enable the local community to pursue Ph.D.-level training and provide access to and retain trained bilingual/bicultural psychologists. At the state level, the Ph.D. program aims to increase the number of psychologists trained to work with Hispanic populations in Texas. This is particularly important to Texas because it is among the states with higher percentages of Hispanics in their population. The Ph.D. program at UTEP can serve to provide trained psychologists to deal with issues faced by Hispanics living in Texas. Nationally, the Ph.D. program will offer specialized training in working with Hispanic populations. This is important since the Hispanic population is rapidly growing and is emerging as a significant minority
  • 43. population in the United States (Marin & Marin, 1991). Thus, the Ph.D. program will serve an important and emergent need for trained bilingual/bicultural research psychologists nationwide. Program Description & Objectives To meet its objectives the program emphasizes the application of research methods and findings in non-academic settings. This includes a core curriculum in experimental psychology along with a field placement and specialization courses in either health or human behavior in organizations. The program assumes that preparation for applied psychologists requires a firm foundation in general experimental psychology including courses in statistics, cross-cultural research, and field research methodology as applied to bilingual/bicultural settings. In addition, the program requires its students to undertake a field placement designed to meet the goals of the program and the needs of the individual student. The field placement is the cornerstone of the program and is designed “to provide the practical experience in an organizational setting that will enhance the student’s training as an applied research
  • 44. psychologist" (The University of Texas at El Paso, 1992, p. 16). Together with the academic training, the internship will provide students with a well-rounded education and enable them to function as bilingual/bicultural psychologists. To be eligible for the program an applicant must have (a) a B.A. from an accredited university, (b) a minimum 3.0 G.P.A. in undergraduate work, (c) a minimum 3.0 G.P.A. in their psychology course work, (d) a minimum score of 500 on each of the subtests of the GRE, (e) a satisfactory background in statistics/experimental work, and (f) three letters of recommendation. In addition, international students must also (g) have satisfactory scores on the TOEFL exam. These criteria are similar to those used by most other graduate programs in psychology in the US and Canada (American Psychological Association, 1994; Norcross, Hanych, & Terranova, 1996; Purdy, Reinehr, & Swartz, 1989). In sum, the purpose of the Ph.D. program at UTEP is "to
  • 45. prepare research psychologists to address questions applicable to the English-Spanish bicultural communities in the Southwest" (The University of Texas at El Paso, 1992, p. 8). This program is based on the notion that there is an increasing need for trained bilingual/bicultural psychologists to handle the needs of the Hispanic community at the local, state, and national levels (see Table 1). Furthermore, there is no equivalent program available in El Paso, or in the State of Texas. This program is of timely importance since the Hispanic population is becoming a significant minority group in the US (Marin & Marin, 1991) and its growth is likely to have a significant impact on the work force as well as other areas. This program assumes that by providing specialized training it will be able to prepare bilingual/bicultural research psychologists. Purpose of the Evaluation This proposal presents an evaluation plan for assessing the progress of the Psychology Ph.D. program at UTEP. At the outset it must be recognized that this evaluation plan comprises
  • 46. a formative rather than a summative evaluation. This approach was followed for several reasons. First, the Ph.D. program at UTEP has been in place for approximately five years. Given that on average most Ph.D. candidates complete their training in 6 years, it is unreasonable to expect that the Ph.D. program will have met its objectives during its fifth year of implementation. Second, the Ph.D. program is the first of its kind for the department of psychology and as a result will undergo the procedural changes commonly associated with the development of new programs in any organization. Finally, this evaluation seeks to provide evaluation information that can be utilized to improve the program rather than assess its absolute merit or worth at this point. Accordingly, this evaluation aims to provide information that can be utilized by program administrators, faculty, staff, and students of the UTEP doctoral program in psychology to improve the program and address issues of concern.
  • 47. Evaluation Questions Specifically, this evaluation proposes to examine the following evaluation questions: (1) To what extent has the program been implemented as proposed? (2) To what extent has the program progressed toward meeting its stated objective of providing trained "bilingual/bicultural research psychologists"? (3) What are the perceptions of the various stakeholders regarding the Ph.D. program? and (4) What are the main concerns for program development and improvement among the faculty and students of the program? These evaluation questions have been selected to represent broad areas of interest that all stakeholders (e.g., local staff and faculty, students in the program, as well as higher-level university and public officials) share in common. In addition, these questions can provide valuable information that can be directly fed back to the program as well as enable the participants of the program to voice issues of concern and make a worthwhile contribution toward the improvement of the graduate education offered at UTEP. Evaluation Design
  • 48. This evaluation concerns both process and implementation issues. That is, this evaluation will assess whether the Ph.D. program has been implemented as proposed and whether the program has made progress in that process. Shortell & Richardson (1978) have argued that "recurrent institutional cycle designs provide a flexible way of handling some major threats to internal validity and is particularly conducive to settings and circumstances in which new groups of individuals are exposed to ongoing programs" (p. 63). The Ph.D. program at UTEP certainly fits this description. The UTEP program has been in place for approximately five years and has had a new cohort of students entering the program each year since it began. This design allows for the integration of data from multiple sources (e.g., program participants at various stages of the program, faculty, and staff) while allowing for diversity in information. This design is made up of three designs including the one-shot case
  • 49. study, a static group comparison design, and the single-group pre-posttest design. This design offers several advantages in that it controls for the effects of history, testing, instrumentation, selection, and attrition. However, it does not rule out the effects of maturation or regression. Because this evaluation is not concerned with generalizing outside of UTEP at this time, external validity issues (e.g., generalizing to other participants, settings, times) are not of major concern. Population and Sampling Process This evaluation will be conducted with the program participants at the University of Texas at El Paso. The proposed evaluation will include paper-and-pencil assessments of participants' satisfaction with the program, supervision, coursework, research, and other areas of the program. In addition, focus group interviews will be conducted with each student cohort enrolled since the program was installed as well as with both the faculty and staff of the program. Focus group interviews will be carried out in one of the seminar rooms of the department. These
  • 50. rooms are generally medium in size and can accommodate up to 20 people. As recommended by focus group researchers (Krueger, 1994), the sessions will be tape-recorded to ensure the accuracy of the data. Participants. Participants will include students, faculty, and staff of the Ph.D. program at the University of Texas at El Paso. Participants will be asked to complete both a written survey and a focus group interview designed to assess their perceptions and attitudes about the Ph.D. program at UTEP. In addition, individual interviews will be conducted with various university officials including the Dean of the Liberal Arts College, the University Vice President of Academic Affairs, and the UTEP President to assess their knowledge about the program's success. Individual interviews will also be conducted with other University officials who are presently involved with the program or were involved in its formulation or development.
  • 51. Measures Participant survey. Participant perceptions about the program will be assessed through a survey questionnaire (see Appendix A). This survey will assess various aspects of the program including participants' knowledge, development, and satisfaction (e.g., satisfaction with the program, supervision). Participant knowledge. Participants' knowledge about the program will be assessed with a scale specifically designed for this project. The scale is comprised of four items presented in Likert format. Participants rate the extent to which they agree or disagree with each item on a scale ranging from one to five. Scoring is accomplished by summing across the four items. With a five-point response scale, scores can range from a low of 4 to a high of 20. Higher scores on the scale indicate greater knowledge about the program. Development. Participant development will be measured using a scale designed specifically for this evaluation. The scale is comprised of 6
  • 52. items presented in Likert format. Participants rate the extent to which they agree or disagree with each item on a scale ranging from one to five. Scoring is accomplished by summing across the 6 items. With a five-point response scale, scores can range from a low of 6 to a high of 30. Higher scores on the scale indicate more positive views about participant development in the program. Satisfaction. Satisfaction with the program, supervision, and co-workers will be measured using a scale modeled after the Job Descriptive Index (JDI) developed by Smith, Kendall, & Hulin (1969). Each scale contains an 18-item index that presents respondents with adjective checklists. Respondents indicate "yes" if the adjective describes their work (or co-worker, supervision, pay), "no" if it does not, and "?" if they cannot decide. Focus groups. Focus group interviews will be administered to
  • 53. individual stakeholder groups (see Appendix B). The focus group interview is composed of four parts. Part I is designed to determine the participant's attitudes and feelings toward the program. It contains three questions dealing with perceptions about the Ph.D. program (e.g., What is your general perception of the program? What do you think of it?). Part II is designed to explore the participant's knowledge of the program. It contains several questions that assess the participant's knowledge of and experience with the program (e.g., What do you perceive as the purposes or guiding philosophy of the program?). Parts III and IV are designed to understand participants' concerns about the program. They contain questions that deal with participants' feelings about the program, its objectives, goals, and overall purposes (e.g., What are some concerns that you have about the program?). In addition, the present evaluation will also collect archival data to assess the program's progress. Archival data collected will include samples of student program plans for both
  • 54. current and graduated students. These data will be used to examine the extent to which the program has been implemented as proposed. Procedures Participants will be asked to attend a discussion session with the program evaluator. The participants will be scheduled to attend their session with their respective cohort (e.g., 1st-year students, faculty, staff). At the beginning of the session the evaluator will give a brief introduction and describe the purpose of the session in general terms. Participants will be informed of their rights as participants. They will be told that their responses are anonymous and only the evaluator will have access to their individual responses. Each participant will then be given a questionnaire to complete and place into a box located near the entrance of the room. When the group is finished, the evaluator will then begin the focus group discussion with the
  • 55. group. Focus group interviews will be conducted in one of the seminar rooms in the department of psychology building by the evaluator and an assistant. Again, participants will be informed of their rights and they will be told that their responses are to be kept anonymous. Participants will be told not to state their name or any other identifying information during the focus group interviews. Approximately 60-90 minutes will be required to complete the focus group interviews. However, should more time be needed it will be determined at the time of the interview with the consent of all of the participants. Upon completion of the interview, participants will be given the opportunity to ask questions regarding the purpose of the evaluation. All responses and comments about the interview will be recorded. The tape recorder will be turned off by the evaluator when the last participant has left the room. Analysis Appendix C outlines the evaluation questions along with the information required for
  • 56. each question, the information source, and the method of data collection. As can be seen, the evaluation questions will be answered using a combination of quantitative and qualitative information. For the quantitative data, reliability indices will be calculated to determine the internal consistency of the items for all the scales included in the questionnaire. Scores on each of the scales will be used to compare across participant groups as well as to the narrative responses from the focus group interviews. Qualitative survey comments and focus group interview responses will be content analyzed, and major themes will be identified for the entire group (i.e., including all participants) as well as for each group separately (e.g., student responses, faculty responses, etc.). The data will be coded and analyzed by two independent judges to cross-check the reliability of the coding procedures. Major themes and concerns will be compiled for comparisons between the
  • 57. stakeholder groups. Personnel, Equipment & Logistics The evaluation described in the design section will be conducted with the program participants at the University of Texas at El Paso. The university is located near the Mexico-US border, approximately 3-5 miles from Ciudad Juarez, Mexico. It is a midsize university with a predominantly Hispanic student population from the surrounding communities. Both the participant survey and focus group interviews will be carried out in one of the seminar rooms of the department (e.g., Psych. 308). These rooms are generally medium in size and can accommodate up to 20 people. As recommended by focus group researchers (Krueger, 1994), the sessions will be tape-recorded to ensure the accuracy of the data. Participants will be asked to attend a discussion session with the program evaluator. The participants will be scheduled to attend their session with their respective cohort (e.g., 1st-year students, faculty, staff).
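The internal-consistency reliability indices mentioned in the Analysis section are commonly computed as Cronbach's alpha. A minimal Python sketch, using invented responses to a hypothetical four-item Likert scale (the data are for illustration only, not actual evaluation results):

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha for a table of scores:
    rows = respondents, columns = scale items."""
    k = len(item_scores[0])          # number of items

    def var(values):                 # sample variance (n - 1 denominator)
        m = sum(values) / len(values)
        return sum((v - m) ** 2 for v in values) / (len(values) - 1)

    item_vars = [var([row[i] for row in item_scores]) for i in range(k)]
    total_var = var([sum(row) for row in item_scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical responses from five participants to four 1-5 Likert items
responses = [
    [4, 4, 5, 4],
    [2, 3, 2, 2],
    [5, 4, 4, 5],
    [3, 3, 3, 4],
    [1, 2, 2, 1],
]
print(round(cronbach_alpha(responses), 2))
```

Values above roughly .70 are conventionally taken as acceptable internal consistency for newly constructed scales such as those proposed here.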
  • 58. Evaluation time line. Approximately seven months will be needed to conduct the full evaluation. This time will be used to develop measures, conduct observations, administer the survey, analyze data, review with clients, and prepare and present a final report. A significant portion of time will be dedicated to the gathering of focus group interviews (e.g., 2 months) as well as to the analysis of these data. All other aspects of the evaluation will last no longer than one month. Although some personnel in the department of psychology may be qualified to assist in the evaluation, because of the nature of the data being collected and confidentiality issues the expertise of individuals in the department will not be utilized. Implications & Recommendations The results from both the questionnaire survey and the focus group interviews will provide data on stakeholders' assessment of the Ph.D. program at UTEP. The results will help determine
  • 59. whether the program has made progress toward achieving its goals. Furthermore, the findings will also help to identify program strengths and weaknesses. Specifically, the findings of the proposed evaluation will (1) provide information on the extent to which the program has been implemented as proposed; (2) indicate the extent to which the program has made progress toward meeting its stated objectives; (3) provide information about the perceptions of stakeholders regarding the Ph.D. program; and (4) help identify areas of concern for the purposes of program development and improvement. Furthermore, this evaluation plan can provide information that can guide the development of similar programs in other schools. This is particularly important since the strength of the Hispanic community is rapidly increasing and the need for such programs may become acute. Human Subjects The proposed evaluation plan aims to assess the Ph.D. program at UTEP. Participants in this project will be required to complete an anonymous survey assessing their perceptions about
  • 60. the Ph.D. program as well as a focus group interview with other participants. Participants will include students, faculty, and staff as well as other selected university and public officials associated with the program. Local stakeholders (e.g., students, faculty, staff, and other UTEP personnel) will be asked to participate in a discussion with the program evaluator. A flyer with a brief description of the project will be placed in individual mailboxes as well as on various information boards throughout the department. The flyer will include a brief description of the project, as well as the date, time, and location of the study. Questionnaires will be administered to participants on the same day as the scheduled focus group discussions. Participants will be informed of their rights as participants. They will be told that responses are anonymous and only the evaluator will see the completed questionnaires. Instructions will be read aloud and participants will be given a pencil, an informed consent form,
  • 61. and the survey questionnaire. Participants will be told not to write their name or any other identifying information on the survey itself. Once finished, participants will place their individual surveys into a box located near the entrance of the room. After a brief break, the focus group discussion will begin. Focus group interviews will be conducted in one of the seminar rooms in the department of psychology building by the evaluator and an assistant. Again, participants will be informed of their rights. Participants will be told not to state their name or any other identifying information during the focus group interviews. Approximately 45-60 minutes will be required to complete the focus group interviews. Upon completion of the interview, participants will be given the opportunity to ask questions regarding the purpose of the evaluation. All responses and comments about the interview will be recorded. The tape recorder will be turned off by the evaluator when the last participant has left the room.
  • 62. References American Psychological Association. (1994). Graduate study in psychology. Washington, DC: Author. Jones, J. M., Keppel, G., & Meissen, G. (1993, February). Site visit report: Department of Psychology, University of Texas at El Paso. Berkeley, CA. Krueger, R. A. (1994). Focus groups (2nd ed.). Thousand Oaks, CA: Sage. Marin, G., & Marin, B. V. (1991). Research with Hispanic populations. Newbury Park, CA: Sage. Norcross, J. C., Hanych, J. M., & Terranova, R. D. (1996). Graduate study in psychology: 1992-1993. American Psychologist, 51, 631-643. Purdy, J. E., Reinehr, R. C., & Swartz, J. D. (1989). Graduate admissions criteria of leading psychology departments. American Psychologist, 44, 960-961. Ronco, S., Passmore, B., Morales, T., & Schwartz, O. (1997).
  • 63. The University of Texas at El Paso fact book 1996-1997. El Paso, TX: The University of Texas, Office of Institutional Studies. Shortell, S. M., & Richardson, W. C. (1978). Health program evaluation. Saint Louis, MO: The C. V. Mosby Company. Smith, P. C., Kendall, L. M., & Hulin, C. L. (1969). The measurement of satisfaction in work and retirement. Chicago, IL: Rand McNally. The University of Texas at El Paso. (1992). Doctor of philosophy in psychology. El Paso, TX: University of Texas.
  • 64. Table 1. Process Model of Evaluation
Preexisting Conditions: No Ph.D. programs exist to meet the needs of the Hispanic population; Ph.D. programs in Texas are beyond their estimated capacity to train additional psychologists for the state; there is no Ph.D. program available to the El Paso-Juarez community.
Program Components: Inputs/Objectives (prepare research psychologists to serve the English-Spanish bicultural communities in the Southwest); Resources (graduate faculty, adequate facilities, administrative support, library resources, computing resources, extramural funding); Activities (external evaluations have been conducted; recruitment of students has continued).
Intervening Events: Internal (faculty turnover, budgetary changes, student dropout rate, availability of assistantships, student progress); External (extramural funding, poor applicant pool, low visibility).
Impact/Consequences: Proximal (increased pool of qualified applicants; increased enrollment in the Ph.D. program; increases in applicants from the El Paso/Juarez area; increased fluency in the Spanish language among students; increases in the number of students passing the language proficiency exam; increases in the passing rate of students taking the first-year exam; increases in the rate of students reaching ABD; increases in extramural funding awards to faculty); Distal (increase in the number of bilingual/bicultural applied research psychologists; greater cooperation/communication between UTEP and the El Paso-Juarez community; increase in the KSAs of psychologists).
APPENDIX A: Participant Questionnaire
Survey Questionnaire

We are interested in learning how to improve the Ph.D. program at UTEP. Please give your honest opinions to the questions asked here. Information collected on this survey will assist in the evaluation of the Ph.D. program and will be used to make improvements. All information is confidential, and only the program evaluator will have access to the information you provide; you as an individual will remain anonymous.

Directions: Read each statement carefully and check the box that best represents your opinion next to each statement. (Response options: Strongly Disagree, Disagree, Neither Agree Nor Disagree, Agree, Strongly Agree)

Participant's Knowledge
- There are clear guidelines on the program requirements.
- I am informed of program changes affecting my educational goals.
- Information shared with students is timely and organized.
- There are adequate information/resources available to students.
Participant Development (Response options: Strongly Disagree, Disagree, Neither Agree Nor Disagree, Agree, Strongly Agree)
- Student research is taken seriously in the department.
- I would like to influence decisions impacting students.
- I find the coursework in the program worthwhile.
- I find the coursework in the program useful.
- There is good cooperation among faculty and students.
- Recently, I have considered leaving the Ph.D. program.

Directions: Read each word or phrase carefully and check the box (Yes, No, Don't Know) that best represents your opinion next to each one.

PROGRAM IN GENERAL. Think of the Ph.D. program at UTEP. How well do the words or phrases describe the program in general?
Pleasant; Bad; Ideal; Waste of time; Good; Undesirable; Worthwhile; Worse than most; Acceptable; Superior; Better than most; Disagreeable; Makes me content; Inadequate; Excellent; Rotten; Enjoyable; Poor

CURRENT POSITION. Think of the work you do at present in the Ph.D. program. How well do the words or phrases describe what your position is like most of the time?
Fascinating; Routine; Satisfying; Boring; Good; Creative; Respected; Uncomfortable; Pleasant; Useful; Tiring; Healthful; Challenging; Too much to do; Frustrating; Simple; Repetitive; Gives sense of accomplishment
Directions: Read each word or phrase carefully and check the box (Yes, No, Don't Know) that best represents your opinion next to each one.

SUPERVISION. Think of the Ph.D. program. How well do the words or phrases describe the supervision you receive in the program?
Asks my advice; Hard to please; Impolite; Praises good work; Tactful; Influential; Up-to-date; Doesn't supervise enough; Has favorites; Tells me where I stand; Annoying; Stubborn; Knows job well; Bad; Intelligent; Poor planner; Around when needed; Lazy

COWORKERS-PEERS. Think of the Ph.D. program. How well do the words or phrases describe your coworkers or peers?
Stimulating; Boring; Slow; Helpful; Stupid; Responsible; Fast; Intelligent; Easy to make enemies; Talk too much; Smart; Lazy; Unpleasant; Gossipy; Active; Narrow interests; Loyal; Stubborn

Please use this space to give us your reactions and views of the Ph.D. program at UTEP.

Why or why not? ________________________________________
Why or why not? ________________________________________

3. What did you like most about the training you have received thus far?
________________________________________

4. What do you like least about the training you have received thus far?
________________________________________

5. Are there any areas covered in the program that you feel should have been covered in more detail?
________________________________________

6. Wou… If YES, in what area?
________________________________________

7. Of the courses you have taken so far, indicate those which have been the most useful:
________________________________________

8. Of the courses you have taken so far, indicate those which have been the least useful:
________________________________________

THIS INFORMATION IS CONFIDENTIAL AND WILL ONLY BE USED TO ANALYZE THE RESULTS. IT WILL HELP US MAKE IMPROVEMENTS ON THE TRAINING.

DEMOGRAPHICS
Student Status: ____________
Number of years in the program: ____________
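The instrument above combines 5-point Likert items with adjective checklists answered Yes/No/Don't Know, in the style of the Job Descriptive Index (Smith, Kendall, & Hulin, 1969). A minimal scoring sketch, assuming the conventional JDI weighting (endorsing the scale-consistent answer scores 3, rejecting it 0, "Don't Know" 1) and hypothetical item labels; the proposal itself does not specify a scoring scheme:

```python
# Illustrative scoring for the survey above; the weighting and the
# reverse-coded item are assumptions, not part of the original instrument.

LIKERT = {"Strongly Disagree": 1, "Disagree": 2,
          "Neither Agree Nor Disagree": 3, "Agree": 4, "Strongly Agree": 5}

def score_likert(response, reverse=False):
    """Map a Likert response to 1-5; reverse-code negatively worded items
    (e.g., 'Recently, I have considered leaving the Ph.D. program')."""
    value = LIKERT[response]
    return 6 - value if reverse else value

def score_adjective(response, favorable):
    """JDI-style weighting: agreeing with a favorable adjective (or
    disagreeing with an unfavorable one) scores 3, the opposite 0,
    and 'Don't Know' scores 1."""
    if response == "Don't Know":
        return 1
    endorsed = (response == "Yes")
    return 3 if endorsed == favorable else 0

# Example: two checklist items from the PROGRAM IN GENERAL list.
total = score_adjective("Yes", favorable=True)    # "Pleasant" endorsed -> 3
total += score_adjective("No", favorable=False)   # "Bad" rejected -> 3
print(total)  # 6
```

A subscale total would then sum these item scores across each adjective list, with higher totals indicating more favorable attitudes.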
APPENDIX B
Focus Group Interviews

All of you have been asked to participate in this study because you can help me understand how people like yourselves [e.g., students, faculty, staff, or other] feel about the Ph.D. program offered by the Department of Psychology. We are interested in finding out what you feel about the Ph.D. program, be that positive or negative. There are no right or wrong answers; we just want to know what you think and feel about the program. Remember, your answers are completely private. No one except the evaluator (myself) will have access to what you say today.

To begin, we are going to start by completing a brief survey that explores how you feel about the program. We will begin our discussion as soon as that is completed. Take a few moments to answer the survey, then place it face down on your table so that I know when it is all right to begin. [WAIT FOR PARTICIPANTS TO COMPLETE THE SURVEY.]

Now I am going to turn on the tape recorder and check to make sure it is functioning properly. I would like all of us to count to five to check the tape recorder; I will begin, and each person to my right will continue until we have gone full circle. I will then check to make sure the recorder is functioning properly and begin the discussion. To keep your answers private, we will not use your name or any other type of identifying label. [REWIND TAPE AND PLAY BACK TO ENSURE THAT THE RECORDINGS ARE AUDIBLE.]

Part I. Goal: Determine attitudes and feelings toward the Ph.D. program.
1. What is your general perception of the program? --What do you think of it? Do you think well of it?
2. What do you like about it? What do you think the program does well? --What makes you say that?
3. What do you think is bad about the program? What don't you like about the program? --What makes you say that? Explain.

Part II. Goal: Explore participants' knowledge of and about the program.

4. What do you perceive as the purposes (goals, objectives) or guiding philosophy of the program? --How did you arrive at this answer? --What kinds of things happened that told you this?
5. What do you think about this philosophy? Do you agree with these purposes or philosophy? Do you think the problems the program addresses are severe? Important?
6. I am going to read to you the stated objectives of the program as written in the "gray book," the program's official plan that delineates its purpose and objectives. [Read: The Doctor of Philosophy (Ph.D.) program at UTEP is designed to produce bilingual/bicultural research psychologists who will be able to serve Hispanic populations in Texas. The Ph.D. program at UTEP aims "to prepare research psychologists to address questions applicable to the English-Spanish bicultural communities in the Southwest" (The University of Texas at El Paso, 1992, p. 8).]
7. What do you think about this philosophy? Do you agree with these purposes or philosophy? Do you think the problems the program addresses are severe? Important? Is it different from what you expected? Why or why not?
8. What do you think the theory or model for the program is? Why/how do you think it works? How is it supposed to work? Why would the program actions lead to success on the program's objectives or criteria? Which program components are most critical to success?
9. Typically, what types of activities do students in the department engage in (academic activities)?
10. Are these activities different from those of other students in other schools? --Explore.
11. What does a student have to do to finish the program?

Part III. Goal: Understand participant concerns about the Ph.D. program.
13. What concerns do you have about the program? About its outcomes? Its operations? Other issues? --What makes you say that?
14. If there are any changes that you would like to see occur, what would they be? How would these help you or others in the program? Do you think others share your concerns? Do you think these would be easily implemented?
15. Why do you think these changes have not occurred?

Part IV. Goal: Learn about issues related to specific participant groups.

Faculty
A. Is the program being implemented as originally conceptualized?
B. Are there any problems with the implementation process?
C. What problems do faculty perceive?
D. How satisfied are they with the way the program is implemented?
E. How satisfied are they with the quality of students being enrolled?
F. How satisfied are they with the graduating students?
G. How well does the program prepare its students for the job market?
H. How comparable is the curriculum to similar graduate programs (nationally or regionally)?

Students
A. Is the program meeting their expectations? Why or why not?
B. What are some problem areas in the program?
1. Classes: Have classes been offered regularly? Which have and which have not? Are there any additional courses that you would like to see offered?
2. Internships: How many students have gone on internships? How satisfied were they with their experiences? How relevant were the internships to their career/professional goals?
3. Funding: How are funding decisions made?
4. Teaching opportunities: How many have taught classes? How is the assignment of courses made? What preparation do students receive to teach a course?
5. Other?

Staff
A. Has there been added work as a result of the Ph.D. program beyond what is possible with the resources available to the department?

General
A. Has the program graduated its first class of students? How prepared are they for the market? How well does the program prepare its students for the job market?
B. How comparable is the curriculum to those of other graduate programs (nationally or regionally)?
C. Has the program met accreditation standards (p. 29)?

16. Are there any comments or suggestions that you want to add?

APPENDIX C
Evaluation Plan Outline

Evaluation Question | Information Required | Source of Information | Method
1. To what extent has the program been implemented as proposed? | Qualitative | Participant interviews; archival data | Focus group interviews; departmental records
2. To what extent has the program progressed toward meeting its stated objectives? | Qualitative | Participant interviews; archival data | Focus group interviews; departmental records
3. What are the perceptions of stakeholders regarding the program? | Quantitative/Qualitative | Participant survey; participant interviews | Participant survey; focus group interviews
4. What are the main concerns for program development and improvement? | Quantitative/Qualitative | Participant survey | Focus group interviews

Proposal to Establish the Doctor of Education in Higher Education and Restructure the Doctor of Education in Educational Leadership

1. Terminate the Doctor of Education in Educational Leadership concentrations in K-12 and in Higher Education.
2. Restructure the Doctor of Education in Educational Leadership into a 45-credit Doctor of Education in Educational Leadership.
3. Establish a 69-credit Doctor of Education in Higher Education.

Executive Summary

Terminate the K-12 and Higher Education Concentrations in the Doctor of Education in Educational Leadership and Restructure the Curriculum as a 45-Credit Doctor of Education in Educational Leadership

The College of Education proposes to terminate the K-12 and the Higher Education concentrations in the Doctor of Education in Educational Leadership (EdDEL) and restructure the curriculum of the EdDEL into a 45-credit cohort program. Current students will have five years to complete the current degree. In some cases, current students may opt into the new
curriculum. The program will be taught in an executive format with a fixed set of courses that combines the theoretical and methodological foundations of academic research with an applied focus, helping students develop the professional and interpersonal wisdom necessary to successfully manage change within complex organizational structures. The EdDEL executive-format degree program will prepare its graduates to be not just effective administrators but skillful and visionary leaders.

The EdDEL degree program is a cohort program and will consist of a fixed set of courses offered in a specific sequence. All students in each cohort will take the same courses in the same sequence. Possession of a master's degree or at least 30 graduate credits in a related field will be required for admission to the program. The EdDEL degree program will enroll its first cohort in summer 2017.

Curriculum
EPSY 8627: Introduction to Research Design and Methods
EDAD 8461: Ethical Leadership
EDUC 5325: Introduction to Statistics and Research
EDAD 8635: Education Policy Analysis
EDAD 5262: Introduction to Qualitative Research
EDAD 8653: Civic Leadership
EDAD 8636: Research for Change and Program Evaluation
EDAD 8755: Organizational Dynamics
EDAD 8093: Administrative Research Seminar
EDAD 8553: Democratic, Equitable, and Ethical Leadership
EDUC 9998: Dissertation Proposal Design
EDUC 5010: Special Topics in Education: Trends in Special Education
AOD 5534: Group Facilitation and Consultation
EDUC 9999: Doctor of Education Dissertation
Establish a Doctor of Education in Higher Education (EdD-HE)

The College of Education proposes to establish a 69-credit Doctor of Education in Higher Education (EdD-HE) degree. The EdD-HE degree will require a rigorous program of study that helps students develop the skills needed to diagnose and resolve organizational challenges and to craft and evaluate programs and policies impacting student success. We believe that a stand-alone higher education-focused degree will better serve our growing population of higher education-focused students and make our already high-quality EdD program more competitive in the regional market for doctoral programs. We also believe that the focus and coherence of the new program will enable us to broaden our recruiting to include professionals working within the entire educational enterprise that is now the full-service university, as well as those outside of postsecondary institutions, including researchers, administrators, policymakers, and educational support providers.

The program features a core set of courses that reflect the essential values of the Temple University graduate program in higher education and the foundational knowledge, skills, and abilities required for effective postsecondary administrative practice. Possession of a master's degree in a related field will be required for admission to the program, and students will be expected to transfer in up to 30 credits as advanced standing (with approval). With approval, students may also transfer up to nine credits earned at the Temple College of Education prior to matriculation into the EdD-HE. Most students will thus need to complete 11 courses plus at least six credits in the dissertation block (including at least two credits of EDAD 9999) in residence in the doctoral program at Temple.

Curriculum

Core Courses (Four 3-Credit Courses):
HIED 8101: Advanced Seminar on Higher Education Administration
HIED 8102: Higher Education Economics & Finance
HIED 8103: Equity in Higher Education Policy & Practice
HIED 8104: Seminar on Theory in Higher Education

Electives (Two 3-Credit Courses): Course suggestions provided; students select a two-course cognate based on dissertation interests.

Advanced Research Methods (Four 3-Credit Courses):
EPSY 8627: Introduction to Research Design & Methods
EDUC 5325: Introduction to Statistics and Research
EDUC 5262: Introduction to Qualitative Research
Plus one of the following:
EPSY 8625: Intermediate Educational Statistics
EPSY 5529: Test and Measurements
EPSY 8826: Multivariate Research Methods
HIED 8XXX: Advanced Practice-Based Qualitative Research in Higher Education (new)

Comp Exam & Dissertation Block (9 Credits Minimum):
HIED 8XXX: Advanced Higher Education Research Seminar (Lit. Review & Comp. Exam)
EDUC 9998: Dissertation Proposal (3 credits)
EDAD 9999: Dissertation (3-6 credits)

The EdD in Higher Education will be offered in Fall 2017.

Proposal to Restructure the Doctor of Education in Educational Leadership Degree

Abstract. The following is a proposal to restructure the Doctor of Education in Educational Leadership (EdDEL) into a post-master's 45-credit EdDEL and to terminate the K-12 and Higher Education concentrations within the EdDEL degree.
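The credit totals stated for the two programs can be checked with simple arithmetic. A minimal sketch, assuming each listed course carries 3 credits (consistent with the "3-Credit Courses" headings, though not stated explicitly for every course) and using the stated minimum dissertation credits:

```python
# Hypothetical check of the stated credit totals; the 3-credits-per-course
# figure is an assumption drawn from the curriculum headings.

def program_credits(courses, dissertation_block):
    """Total credits: 3-credit courses plus the minimum dissertation block."""
    return 3 * courses + dissertation_block

# Restructured EdDEL: 13 courses + at least 6 dissertation credits = 45.
eddel = program_credits(courses=13, dissertation_block=6)
print(eddel)  # 45

# EdD-HE: up to 30 transfer credits as advanced standing, plus 11 courses
# and at least 6 dissertation credits in residence = 69 total.
edd_he = 30 + program_credits(courses=11, dissertation_block=6)
print(edd_he)  # 69
```

Both totals match the proposal's figures, which supports the internal consistency of the "13 courses" and "11 courses" residency statements.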
Market. The restructured EdDEL degree is intended for a wide audience of individuals with experience in K-12 educational settings who desire to advance their careers. Many educational professionals want a defined, predictable program of study that supports steady progress and complements the busy schedule of a teacher and/or school leader. The proposed program will institute a cohort-based executive-format program, wherein courses will be offered on weekends and during the summer, accommodating the schedules of working professionals. We believe that a stand-alone K-12 education-focused EdDEL will better serve our population of K-12 focused students and make our already high-quality EdDEL program more competitive in the regional market for doctoral programs. We also believe that the focus, coherence, and executive format of the new program will enable us to broaden our recruiting efforts.

Program objectives. The EdDEL degree will require a rigorous program of study that helps students develop the skills needed to diagnose and resolve organizational challenges and to craft and evaluate programs and policies impacting student success. The proposed curriculum combines the theoretical and methodological foundations of academic research with an applied focus that helps students develop the professional and interpersonal wisdom necessary to successfully manage change within complex organizational structures. The Temple EdDEL degree will prepare its graduates to be not just effective administrators but skillful and visionary leaders.

I. New Program Rationale, Context, and Demand

Rationale:
· The proposed program is consistent with those offered by the leading schools of education and would enhance our capacity to fulfill the university's mission of social justice in education.
· This proposal responds to the need to support the learning of working practitioners seeking to advance knowledge of systems leadership by structuring learning in an executive format.
· The proposed program is designed to reflect current and emerging trends in school district and school system design, management, and leadership.

Description: We are proposing to restructure the current EdDEL degree with two concentrations into a post-master's 45-credit EdDEL degree program that will be cohort based and taught in an executive format. The program will continue to foster and reinforce Temple's commitment to social justice, equity, and ethical practices. The courses, course sequence, and dissertation process are designed so that the program coheres around these issues and their implications for educational leadership. We believe that all students will benefit from engaging with these issues, and that doing so will contribute to students' preparation not only as system leaders but also as citizens in their communities and in the broader world.

The proposed cohort program will consist of a fixed course sequence and is structured to allow students to complete their degree in three years. Admission criteria for the program will include evidence of scholarship and leadership activities. Possession of a master's degree or at least 30 graduate credits in a related field will be required for admission to the program.

Many educational professionals want a defined, predictable program of study that supports steady progress and complements the busy schedule of a teacher or school leader. The proposed program will institute a cohort-based executive program, wherein courses will be offered on weekends and during the summer, accommodating the schedules of working
professionals. Students will be admitted in cohorts of 18-20 students every other year. They will take all of their courses together, and beginning with the first semester, students will receive support in preparing for, conducting, and writing their dissertation studies. The courses in the proposed program are organized thematically and incorporate elements that are uniquely supported by the coherence of this new design. Each of the themes has a faculty sponsor who will oversee course content to support articulation within the program curriculum. We expect that developing coherence among all the executive program components will lead to increased efficiency and efficacy, and an intensive yet manageable experience for students.

II. Relationship to other programs in the college

The proposed restructured EdDEL degree will replace the terminated K-12 concentrations within the current EdDEL degree. The College of Education is also proposing to establish a Doctor of Education in Higher Education (EdDHE) that will provide a more focused, robust doctoral program for higher education than is currently being offered. The restructured EdDEL and the proposed EdDHE degrees allow the College of Education the opportunity to clarify mission and market, create more structured and coherent programs of study, and better position the College as the regional leader in education graduate training programs.

III. Curriculum

Program design. Completion of the cohort EdDEL degree will require 45 credits beyond the master's degree. Possession of a master's degree or at least 30 graduate credits in a related field
will be required for admission to the program. With program approval, students may also transfer in up to nine post-master's credits earned at the Temple College of Education while not matriculated in a graduate program. Most students will thus need to complete 13 courses plus at least six credits in the dissertation block (including at least two credits of EDAD 9999) in the doctoral program at Temple.

Core Courses & Course Sequence

The program will consist of a fixed set of courses offered in a specific sequence (Table 1). All students in each cohort will take the same courses in the same sequence. Consequently, course curricula will more explicitly build upon one another, and certain themes will bridge multiple courses. In addition, as students proceed through the program, we will orchestrate opportunities for them to develop deep interpersonal and professional relationships in support of their coursework, the dissertation process, and their practice.

The executive program will be open to students working in a wide array of contexts. That said, throughout our courses we will continue to emphasize issues traditionally associated with urban school systems: disadvantages related to low socioeconomic status, institutional and individual biases based on race and class, unequal distribution of resources, and the political and organizational implications of pursuing social justice in schools.

We propose changing some of the current course titles to more accurately reflect their content, as indicated below (current title followed by proposed title):
· EDAD 8461: Ethical Educational Leadership becomes Ethical Leadership
· EDAD 8635: Current Issues in Educational Policy becomes Education Policy Analysis
· EDAD 8653: Educational Leadership as Civic Leadership becomes Civic Leadership
· EDAD 8755: Theoretical Perspectives/Organizational Dynamics of Education Leadership becomes Organizational Dynamics
· EDAD 8553: Profiles of Democratic and Ethical Leadership becomes Democratic, Equitable, and Ethical Leadership

Table 1. Proposed Executive EdD in Educational Leadership Curriculum and Pathway

Summer 1: EPSY 8627 Introduction to Research Design and Methods; EDAD 8461 Ethical Leadership
Fall 1: EDUC 5325 Introduction to Statistics and Research; EDAD 8635 Education Policy Analysis
Spring 1: EDUC 5262 Introduction to Qualitative Research; EDAD 8653 Civic Leadership
Summer 2: EDAD 8636 Research for Change and Program Evaluation; EDAD 8755 Organizational Dynamics
Fall 2*: EDUC 8093 Administrative Research Seminar; EDAD 8553 Democratic, Equitable, and Ethical Leadership
Spring 2: EDUC 9998 Dissertation Proposal Design; EDUC 5010 Special Topics in Education: Trends in Special Education
Summer 3: AOD 5534 Group Facilitation and Consultation
Fall 3: EDUC 9999 Doctor of Education Dissertation
Spring 3: EDUC 9999 Doctor of Education Dissertation
*Comprehensive exams will take place at the end of this semester.

Program Requirements: Dissertation

The majority of program graduates will continue to work in a practical field, and therefore dissertation studies will involve addressing a pressing problem of practice. This problem must be rich enough to require a thorough examination of the relevant practical and theoretical literature, and yet specific enough to yield actionable recommendations. Dissertation topics must also be responsive to the set of methodological tools at the students' disposal.

· Dissertation proposal: In the semester immediately following completion of the Advanced Research Seminar and successful completion of the comprehensive exam, students will enroll in Dissertation Proposal Design (EDUC 9998). Like the Advanced Research Seminar, EDUC 9998 will function as a structured, intensive, cohort-based monthly workshop in which students will design and defend a dissertation proposal that outlines a
rigorous plan for empirical study of an issue relevant to the student's professional responsibilities or aspirations. The proposal must incorporate a thorough and critical review of literature relevant to the topic; a discussion of theoretical approaches to understanding and studying the topic, along with a conceptual or theoretical framework that will guide the study; and a robust methodological plan (including assurances of completing IRB review and any interview or other protocols necessary to submit for IRB review). The dissertation proposal defense may occur at any point during or at the end of the semester, and students will receive feedback from the faculty adviser, other committee members, and their cohort peers during their defense.

· Dissertation: Following successful defense of the dissertation proposal and after securing IRB approval, students will carry out an original research project intended to make a significant practice-based contribution to the field. While the EdD dissertation is meant to have practical and applied relevance, it is nonetheless expected to engage rigorously with existing literature and theory appropriate to the student's chosen topic and to demonstrate the student's ability to execute robust methods appropriate to the student's research question(s).

Dissertation requirements

The EdD dissertation is distinct from the PhD dissertation; the intent of the dissertation is not to build theory but rather to make a substantive contribution to practice-focused scholarship in a particular domain of K-12 educational leadership.

· Dissertation study report. EdD students will complete as the dissertation a standard academic manuscript (inclusive of an introduction, literature review, conceptual/theoretical framework, methodology, results, discussion, and references).
EdD dissertations are typically shorter than PhD dissertations, with a smaller scope of theorizing and data collection, but they are held to the same standards as PhD dissertations with respect to methodological validity, data analysis, and writing quality and clarity.

· White Paper/Executive Summary. Because of the practice focus of the EdDEL program, in addition to completing a dissertation study report, students will be required to produce a white paper/executive summary distilling the lessons of their research for practitioners in their field.

Requirements for the dissertation will otherwise remain consistent with those of the current EdD in Educational Leadership and as defined by the graduate school, including the composition of the dissertation committee and the processes for dissertation defense and submission.

Course Scheduling & Curriculum Grid

The executive program will require a total of 45 credits beyond the earned master's degree. During the first two years, students will take two courses in each of the fall and spring semesters, and two courses will be offered in the summers during an intensive seven-day session. In the final year, students will take one course per term.

IV. Impact on Faculty & Students

Faculty. The courses offered in the executive program are equivalent to the traditional course offerings and will be taught by the faculty who currently teach them. Core program faculty will serve as advisors for 3-5 students every other year, and they will usually, but not always, become the chair of each of those students' dissertation committees. Current program faculty include Christopher McGinley, Steve Gross, Sarah Cordes, and John Hall. Given the program's strong focus on practice, dissertation committees will include two qualified faculty members from the College of Education and a qualified practitioner who is external to the university. The external member must successfully complete a selection process developed by the program faculty.

The courses offered in the program sort into four main groups, and each group of courses will be managed by a program faculty member. This does not mean that a group's faculty member teaches each of the courses, but he or she is responsible for reviewing and adjusting the curriculum of these courses, developing coherence among the courses, and coordinating with other faculty members to develop coherence across the program. The groups, and the persons responsible, are as follows: Research Methodology: Sarah Cordes; Organization and Policy: John Hall; Equity and Ethics: Steve Gross; The Practice of Leadership: Christopher McGinley.

Students. Students enrolled in the executive program will take coursework that accommodates an intense work-week schedule, and with support and guidance they will be able to complete the program in three years. This approach appeals to professionals who want a high-quality education that is concentrated, efficient, and predictable. We will select 18 to 20 students for each cohort, and cohorts will be admitted every two years. In the first two years, the program will comprise two courses per semester, offered on weekends, and two courses per summer, offered during an intensive seven-day session. In the final year, students will take one course per term, unless they are pursuing superintendent licensure, in which case they will