Running head: INSTRUCTIONAL PROGRAM EVALUATION 1
Instructional Program Evaluation Plan: Education for Democratic Citizenship
Cynthia Crump
June 20, 2011
Instructional Program Evaluation Plan: Education for Democratic Citizenship
The presentation is a systematic and comprehensive formative evaluation plan to
investigate the implementation of social studies education for Democratic citizenship (SSEDC)
in the mature stage. The lead evaluator will select a team to guide and conduct key actions
throughout the evaluation process. The plan will begin with the Grades K-6 program
description, followed by the theoretical framework, including the research questions that will
guide the project over a 12-week period. The methodology will be a mixed method survey design,
using multiple methods to collect quantitative and qualitative data. The sampled target group
will include various stakeholders in the school community, including the implementers and
others as the need arises. Content and descriptive data analyses will be the suggested methods to
extract themes and concepts and highlight possible findings influenced by (a) teachers'
understanding of the SSEDC goal; (b) methods used by teachers; and (c) problems the teachers are
experiencing during the implementation process. The evidence will form the basis for findings
and conclusions, and for recommending strategies for improvement of SSEDC. The evaluation
team will put measures in place to promote accurate results and efficient reporting
procedures that the internal stakeholders – designers and implementers – will respect.
Some motivating factors influencing the evaluation of the SSEDC instructional program
include the need of designers and supervisors to (a) influence program improvement or
strengthen planning and delivery; (b) identify areas of challenge; and (c) foster accountability
(Chen, 2005; Gard, Flannigan, & Cluskey, 2004; Jason, 2008; Posavac & Carey, 2007). The
proposed evaluation plan will include:
1. Committee Selection
2. Part I: Background of the SSEDC Program
a. Determination of the Current State of the Program
b. Evaluation Goal
3. Part II: Theoretical Framework
a. Standards
b. Program Theory
c. Stakeholders
d. Methodology
e. Target Population
4. Part III: Mixed Design
a. Rationale
b. Data Collection Methods
c. Validity and Reliability
5. Part IV: Analysis of Data
a. Possible Findings
b. Recommendation
c. Reporting Considerations
The preceding list is an outline of the main sections of the evaluation plan. The
committee selection is important to ensure key personnel selected are competent to lead and
perform major roles. Part I will be a brief description of the program and a statement of the
evaluation goal and the questions to guide the evaluation. Part II is a guide to the design, the
standards and criteria that will be used as a checklist to judge the quality of the evaluation, and
will have a description of the individuals who will affect or be affected by the implementation of
the program, including the beneficiaries. Part III, the mixed design, emphasizes the plan to
conduct a detailed evaluation by means of multiple data collection tools, from multiple sources.
In Part IV, possible findings from the data analysis will provide the basis for suggestions to
improve the program. The presentation will form the basis to conduct a formative process
evaluation and provide recommendations for improvement of teachers' implementation of
SSEDC.
Committee Selection
The director, supervisor, one other education officer, teachers, principals, a community
member, a parent, a private school principal, and a school mentor will comprise the evaluation
committee. The responsibility of the committee members is to respond to the strengths and
challenges of the program in order to refine it. Gard, Flannigan, and Cluskey (2004) noted that
the evaluation committee has the responsibility “to use data to identify strengths and weaknesses
of the program” (p. 176).
The coordinator of the development and revision processes and the supervisor coupled
with the stakeholders in the school community “… are vital to the survival and success of the
[program]” (Gard et al., 2004, p. 4). Collaboration with external evaluators will ensure a
supportive environment (Chen, 2005). The director must guard against bias and conflict of
interests (Posavac & Carey, 2007) because of involvement in all stages of the program. Ethics
and values are two elements necessary to plan, conduct, and evaluate a program to ensure
accuracy of results. Using external and internal evaluators would help to lessen or eliminate
perceived internal bias while empowering internal and external stakeholders (Chen).
Part I: Background of the SSEDC Program
Before 2006, the last attempt at social studies curriculum review and renewal was in the
late 1980s, supported by USAID curriculum specialists. After a quarter of a century, rebranding
of the social studies curriculum was necessary, including renaming the program to social studies
education for Democratic citizenship (SSEDC). Besides datedness, factors affecting the social
behaviors of citizens, especially among the youth, influenced the development of SSEDC.
In 2007, a team, including the Director of Curriculum as expert, a core of teachers, and
representatives from the environment ministry, completed a first draft of SSEDC. After several
reviews, SSEDC was piloted among a sample of schools and classes (K–9), over a period of 12
weeks in 2008. At various review sessions, all grade teachers had the opportunity to input
changes, based on the results and recommendations of the pilot implementation data.
Implementation of the revised instructional program took place in September 2009. Familiarity
seminars and training workshops were actions to develop teacher competence and support the
implementation. From 2009 to the present, the director, the supervisor, education officers,
principals, and senior teachers have continued to monitor the SSEDC.
Goals of SSEDC
The following is a section of the rationale of SSEDC (Ministry of Education, 2009)
outlining several reasons that influenced program development.
First, in Antigua and Barbuda [is] a Democratic state; independent from Britain
since 1981; Education for Democratic Citizenship (EDC) would mean that the
main outcome of schooling should be citizens with civic consciousness; not only
equipped with knowledge but[also have] the ability to demonstrate skills
appropriate to such a citizen, who also exhibit democratic values. Second, there
appears to be a democratic deficit. A high percentage of individuals (youth) do
not vote or even show much interest in politics. SSEDC should help to improve
individuals' levels of understanding of their lives and how they interact within
society. Third, [since the mid-2000s] there has been an upsurge of crime and
violence. Of particular interest are the negative activities among the youth. These
include school violence, drug-related violence, increases in cases of HIV/AIDS,
home invasions coupled with robbery and rape, murders, and other gun-related
crimes. Fourth, [a] surge in migration of Caribbean neighbors and an influx of other
migrants from China has opened up the avenue for the focus on themes, such as
civic ideals and practices, identity, traditions, multiculturalism, cultural diversity
and tolerance. All citizens need to tolerate peoples from other places, and also to
tolerate their differences. (p. 1)
The focus of SSEDC is on relationships to promote (i) understanding the role and
responsibility of citizens in a democratic society and (ii) awareness of the links and
interdependence locally, regionally, and globally. The overarching goal of SSEDC is citizenship,
achievable through:
1. Knowledge of social issues and concerns;
2. Skill development;
3. Development of values and attitudes; and
4. Social participation (p. 3)
Teachers should provide the preceding experiences. The program's rationale and goal
emphasize the outcome capabilities including knowledge, skills, values, and dispositions the
students should achieve. Students should also receive opportunities to participate in the society
by transferring classroom learning to perform the role of productive citizens. These long-term
outcomes should drive lesson objectives as well as the teaching learning experiences.
The director introduced the instructional guide with the following statement adapted from
the Organization of Eastern Caribbean States Educational and Research Unit (OERU):
The [program] offers a range of ideas and suggestions to help teachers
organize participatory learning experiences designed to prepare students
for lifelong learning…. Social studies classrooms place major emphasis
on student-centered learning through the acquisition and development of
specific cognitive skills and competencies. The focus is on learning
through activities, practice, and participation…. These skills are expected
to produce the ultimate outcomes of SSEDC: students as citizens,
acquiring and demonstrating social understanding and civic efficacy.
(Ministry of Education, 2009, p. 2)
Social and Contextual Factors
The SSEDC instructional program is a part of the private and public school curriculum.
The pilot implementation findings highlighted some gaps and the intent of the review was to
improve on the program. Currently, the curriculum unit personnel conduct support and
monitoring evaluation to provide feedback information on a regular basis to facilitate supervision
of the program. The qualitative and quantitative reports obtained from observation of teaching
using a rating scale, reflections, the classroom environment, students' work, and the interactions
reveal areas that mentors could assist with on a continuous basis.
Determination of the Status of the Program
The monitoring in public schools revealed that variations exist in the teaching-learning
contexts within and across schools and classes, resulting in differentiated delivery and students'
learning experiences. The nature of school leadership and support, supporting materials, and out-
of-class experiences could have differentiated effects on students in achieving the goals of the
curriculum. The public-private dichotomy could also be an influential factor on the teaching-
learning process of SSEDC, because the monitoring of the SSEDC is a feature of public schools
only. The information is important to recommend a more in-depth process evaluation.
Evaluation Goal
The end of the school year in July 2011 will mark two years of implementation. Therefore,
2011-2012 is the year of mature implementation. The purpose of the evaluation is formative, to
inform ways to improve the SSEDC program. The proposed plan will therefore outline a
development-oriented process evaluation to examine perceived problems and recommend
the way forward (Chen, 2005; Posavac & Carey, 2007). Formative evaluation is ongoing,
relevant to address the purpose of the evaluation. Throughout the implementation process, the
team would collect data as the program is in effect. The team will be able to identify strengths
and limitations, and intermediate results during implementation, rather than waiting for the one-
time outcomes evaluation (Posavac & Carey). The central question to guide the evaluation is:
1. How well is SSEDC implemented?
Sub-questions:
1. Is the focus for democratic citizenship clear to the teachers?
2. What methods are teachers using to prepare students?
3. What problems are teachers experiencing?
The response to the questions should help in identifying the sources of problems and the role of
stakeholders to improve the program.
The preceding section sets the stage for the proposed evaluation of SSEDC. The selected
committee will collaborate with the users and implementers at the Grades K-6 levels at private
and public schools. The main purpose is to investigate the implementation process to identify the
strengths and weaknesses and suggest improvements.
Part II: Theoretical Framework
Part II is a discussion of the theoretical basis of the plan, including criteria, standards,
program theory, and model. The aim is to (a) discuss how standards and stakeholders will
influence the evaluation plan; (b) provide a rationale for the selected evaluation model; and (c)
identify the design. The purpose of the proposed process evaluation will be to examine the quality
of the implementation focusing on the following criteria:
1. Teachers' understanding of the SSEDC goal;
2. Student-centered instruction and assessment congruent with the experiential learning,
behaviorist, and constructivist theories; and
3. Social and contextual factors.
Standards
Standards are necessary to “identify and define evaluation quality and guide evaluators”
(Yarbrough, 2011, p. xxii). Attention to attributes of quality such as utility, feasibility, propriety,
and accuracy promote accountability. Evaluation accountability is important to foster program
improvement, improve decision making, and create reflective practitioners. For the
purpose of the proposed evaluation, the following standards could help to define the quality
necessary for a successful evaluation (Yarbrough, 2011).
1. Utility
a. Evaluator credibility
i. Clarify which individuals will be responsible for the various elements of
the evaluation.
ii. Provide assurance that each has the expertise or support required to
complete the work.
2. Feasibility
a. Practical procedures
i. Implement practical and responsive procedures aligned with the
operation of the program.
3. Propriety
a. Human rights and respect
i. Design and conduct evaluation to protect human and legal rights and
maintain the dignity of participants and stakeholders.
4. Accuracy
a. Sound designs and analyses
i. Employ technically adequate designs and analyses appropriate for the
purpose of the evaluation.
The description of the standards supports the importance of the stakeholders developing trust in
the expertise of the evaluator to plan and implement appropriate procedures and designs to
promote successful and valid evaluation. Stakeholders must also feel protected and respected.
The following discussion will support how the standards will influence the plan in the choice of
theory, stakeholders, model, design, and human rights and respect.
Program Theory
Chen (2005) supported the view that program theory is useful in “improving the
generalizability of evaluation results, contributing to social science theory, uncovering
unintended effects,… achieving consensus in evaluation planning…[and providing] …early
indications of program effectiveness” (p. 15). Chen (2005) identified program theories as
causative or normative. Normative stakeholder theory highlights the input of designers,
directors, and staff in an organization. Normative theory is different from the scientific theory
that controls evaluation conducted by academics (outsider or expert interest). The leader of the
evaluation, the director, will perform the role of the internal evaluator, guiding the staff and
selected users and implementers during the evaluation. The activities of the program are
ongoing and information on the process is necessary to determine strengths and weaknesses to
promote improvement, to enable achievement of the goals. An external reviewer could be an
expert in another government department.
Stakeholders
McCawley (2001) defined stakeholders to include a wide cross-section of actors:
individuals who contribute to or benefit from the inputs, resources, and activities that result
in short-, medium-, and long-term outcomes. In the primary institutions, the main
beneficiaries are the students; the other important stakeholders are the principals, teachers,
parents, and individuals in the community. Corporate citizens, other government and
nongovernmental partners collaborate with schools to promote learning (Beaumier, Marchand,
Simoneau, & Savard, 2000; Chen, 2005; Eaton, 2009).
Yarbrough (2011) described several stakeholders generic to program evaluation.
Stakeholders include evaluators, designers, implementers, participants, intended users, and other
respondents. For the purpose of SSEDC program, evaluation stakeholders include individuals
from the administrative center or the Ministry of Education (MOE), other government ministries,
school community, and the wider community as in Table 1.
Table 1

Stakeholders

School Personnel: Teachers; Students; Principals or administrators; External evaluator
Community: Parents; Others who could contribute information
Ministry: Teacher trainers; Internal evaluator; Supervisor; Education officers
Other Stakeholders: Government departments; Non-governmental organizations; Local specialist
Normative theory depicted by the stakeholder model would influence or prescribe the
components and activities considered necessary for the success of the SSEDC program
implementation and evaluation. Table 2 shows overlap of stakeholders' responsibilities in some
areas; however, some individuals have roles more dominant than others. The working group
format would be an important strategy to build consensus on tasks, roles, and issues affecting
relationships during the process, thereby obtaining buy-in (Chen, 2005).
Table 2
Stakeholders and their Responsibilities
Evaluators (external and internal experts): Plan, guide, and conduct the evaluation; review;
decide on strategy; act as facilitator or consultant; and give technical assistance.

Designers (evaluator, supervisor, and stakeholders from the school community and wider
community): Work together to plan the purpose and objectives.

Implementers (ministry personnel, specialists, evaluators, school personnel): Collaborate to
manage, oversee, and ensure the quality of the evaluation; share information on the program's
implementation. Teachers will provide feedback and data related to their pedagogies, including
teaching, learning, and assessment, as well as the challenges, benefits, and suggestions.
Principals will provide feedback on the program in their school.

Evaluation participants: Provide information or data.

Others (in the community or school): Provide additional information about the program.
Teachers had input during the development and review stages. Feedback during the
monitoring stage allowed them to participate and have a voice in identifying strengths and
shortcomings. Problems experienced with principals' support for training and review sessions for
teachers are evidence that principals needed to buy in, claim ownership, and better understand the
goals and objectives of SSEDC. In the mature implementation stage evaluation, the evaluator
must ensure continuous communication with the users and other implementers, allowing them to
share their concerns and doubts, and receive clarification on the goals of the program and their
role as leaders or implementers (Yarbrough, 2011).
The Logic Model
Program theory depicted by the logic model focuses on causal assumptions – a systems
approach linking program resources, activities, and intended outcomes. Table 3 is a
representation of the SSEDC logic model (Cojocaru, 2009; Jason, 2008; McCawley, 2001).
Table 3
SSEDC Logic Model
Inputs (Resources): Prescriptive curriculum (guiding philosophy; sample lessons; rubrics);
supporting teacher material; student text; mentors

Activities (Programs): Curriculum preparation and review; training workshops; seminars;
monitoring

Outputs (Products): Teachers, students, and others in the community

Outcomes (Benefits):
Short-term changes – knowledge, skills, attitudes, awareness, motivation
Medium-term changes – behaviors, practices
Long-term changes – environmental and social changes
The logic model provides a framework to examine the implementation of SSEDC in the primary
grades. The formative process evaluation will provide evidence on whether teachers understand
their role in preparing learners to become Democratic citizens. The preparation should include
using the resources and applying the suggested experiential, student-centered activities or
methods. Analysis of the data could reveal limitations needing attention to foster adjustments to
the program. The recommendations could focus on clarifying rationale, training, and retraining
of teachers, providing additional resources, all in preparation for evaluating the effectiveness or
outcome of implementing SSEDC.
Methodology
The proposed program evaluation will be a mixed method survey design, placing priority
on collection of qualitative data. Qualitative data may describe the ongoing process of the
SSEDC activities and strategies in the form of words from open-ended questionnaire and
observations. Quantitative data might result from close-ended form of questionnaire and
observation schedules (Neuman, 2003).
Target Group
Population
The target group will be teachers of Grades K-6 (primary or elementary classes) at public
and private schools. Public (32) and private (30) schools, located in four districts or zones,
provide education for 10,801 students, taught by 807 teachers. Table 4 is a breakdown of the
number of schools and teachers in the four districts/zones.
Table 4

Target Population: Number of Schools and Teachers

                          Zone 1   Zone 2   Zone 3   Zone 4   Total
Public primary schools       7        7        9        9       32
Public primary teachers                                        467
Private primary schools      7        7        8        8       30
Private primary teachers                                       340

Sampling
Units of sampling will be the schools, teachers, and principals.
1. Simple random sampling
a. Select one school of each type from each zone by putting the names in a bag and
choosing 4 public and 4 private primary schools (n = 8 schools). All the teachers
(and classes) in each of the eight schools will be participants.
2. Purposive sampling
a. The social studies supervisor
b. Other participants as the need arises
Simple random sampling is useful to promote generalizability of findings to all schools because
each has a chance of selection, and the sample is representative of the population (Neuman,
2003). Purposive sampling of supervisors and others is necessary to obtain information in the
selected school communities.
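The names-in-a-bag draw described above amounts to a simple random selection within each zone and school type. The following is a minimal Python sketch under stated assumptions: the rosters and school names are hypothetical placeholders, since the real lists would come from the Ministry of Education.

```python
import random

# Hypothetical rosters: one list of school names per (type, zone) pair.
# Seven placeholder names per cell; real rosters vary by zone (see Table 4).
rosters = {
    (school_type, zone): [f"{school_type}-{zone}-{i}" for i in range(1, 8)]
    for school_type in ("public", "private")
    for zone in (1, 2, 3, 4)
}

def draw_sample(rosters, seed=None):
    """Draw one school of each type from each zone (n = 8),
    mimicking the names-in-a-bag procedure."""
    rng = random.Random(seed)
    return {key: rng.choice(schools) for key, schools in rosters.items()}

sample = draw_sample(rosters, seed=2011)
assert len(sample) == 8  # 4 public + 4 private, one per zone
```

Passing a fixed seed makes the draw reproducible for the evaluation record; omitting it gives a fresh random draw each time.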
Human Subject Consideration
Involvement in the evaluation will be voluntary. The evaluator must ensure
confidentiality, stating to stakeholders that the information collected is for the purpose of program
evaluation only. All information will be secure; the evaluator will put measures in place to
maintain the confidentiality, for example:
1. assigning code numbers to participants;
2. reminding participants not to write names on data collection instruments; and
3. asking personnel in the unit, or whoever handles the data, to keep the information confidential.
The evaluator must assure prospective participants that their rights are protected and
respected; participants will sign consent forms agreeing to take part in the evaluation.
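The confidentiality measure of assigning numbers to participants rather than names can be sketched as a simple code-assignment step. This is an illustrative sketch only; the participant names are hypothetical.

```python
import random

def assign_codes(participants, seed=None):
    """Assign each participant a unique four-digit code so that
    completed instruments never carry names; the name-to-code key
    is stored securely and separately from the data."""
    rng = random.Random(seed)
    codes = rng.sample(range(1000, 10000), k=len(participants))
    return dict(zip(participants, codes))

# Hypothetical participant list, for illustration only
key = assign_codes(["Teacher A", "Teacher B", "Principal C"], seed=7)
assert len(set(key.values())) == len(key)  # every code is unique
```

Using `random.sample` rather than repeated random draws guarantees that no two participants ever share a code.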
The framework links the input of stakeholders, the activities, the resources at different
stages – planning, implementation, and evaluation. Standards that describe utility, feasibility,
propriety, and accuracy define the quality against which to judge the merit of the evaluation.
Various stakeholders in the school and wider communities, the government departments, and
other groups should collaborate to promote buy-in and to conduct an effective evaluation and
promote improvement. The SSEDC logic model is applicable because, by looking first at the
possible outcomes, the stakeholders can identify factors that might influence the implementation
process. The mixed method survey design is appropriate to collect ongoing qualitative and
quantitative data from implementers at public and private schools, and from other participants as
the need arises. Appropriate sampling methods will promote generalizability to the target
population and seek input from suitable stakeholders. The human rights of the subjects are a
major consideration.
Part III: Methodology
The proposed program evaluation of social studies education for democratic citizenship
(SSEDC) will be a mixed method survey design, using various methods to collect quantitative
and qualitative data but placing priority on collection of qualitative data. Qualitative data could
describe the ongoing process of the SSEDC activities and strategies in the form of words from
open-ended questionnaire and observations. Quantitative or numerical data could result from
close-ended form of questionnaire and observation schedules (Neuman, 2003). Triangulation of
mixed data could provide valid and reliable evaluation instruments and promote understanding of
(analysis) results (Grammatikopoulos, Zachopoulou, Tsangaridou, Liukkonen, & Pickup, 2008).
The presentation is a discussion of the rationale of the method and design of the proposed
evaluation. The development of the rationale will include (a) an outline of the proposed data
collection instruments, including how and why stakeholders will contribute to the decision-
making process; (b) a discourse on the importance of putting mechanisms in place to promote
validity and reliability, followed by (c) a simple plan to analyze the data. The presentation ends
with a conclusion.
Mixed Method Design: A Rationale
Formative evaluation requires a flexible methodology that provides quick feedback
(Chen, 2005). The program is in the implementation stage, and a survey
would be most useful among smaller samples. The mixed method provides a comprehensive
scope to promote sound program evaluation and program improvement (Jason, 2008).
Grammatikopoulos et al. (2008) promoted mixed design as a method to improve or increase the
degrees of validity and reliability while criticizing the use of one data source to make decisions
during an evaluation.
At the interpretation stage, the usefulness of mixed design
is "providing a versatile and more complete picture of the procedures under evaluation”
(Grammatikopoulos et al., 2008, p. 5). Qualitative or quantitative data used in isolation will not
provide the same insight as using both. The voices of participants can be most convincing as
they tell their stories, useful to corroborate results of quantitative data (Creswell, 2005;
Grammatikopoulos et al., 2008). Participants' comments and open responses can influence how readers
accept or reject a program. The themes interpreted from such presentations could corroborate
quantitative measures.
Instruments or Data Collection Methods
Process (formative) data, collected through triangulation, are necessary to obtain information
on how the program is working and to identify the influencing factors – to examine the
implementation fidelity of SSEDC. Several evaluation studies (Burnstein, Hutton, & Curtis,
2006; Gallavan, 2008) identified different instrumentation approaches useful for mixed qualitative
and quantitative data analyses. The examples relevant to the evaluation of SSEDC include:
Qualitative Data
1. Schedule for systematic observation (Appendix A), including comments from class
observations of teaching, learning, and assessment, to obtain firsthand information
on the process;
2. Teachers' responses to open-ended questionnaire items (Appendix B), as they share
freely their views of the program in process;
3. Interviews (Appendix C) with parents, principals, and teachers concerning how the
curriculum is used and the impact of the context;
4. Weekly reflections written by teachers about their classroom practice and student
participation, highlighting challenges, difficulties, and ease of delivery;
5. Document analyses, including analysis of the curriculum and supporting documents
such as texts and guides; and
6. Focus groups composed of students, teachers, and parents of students to obtain
additional information on how students respond to the curriculum.
Quantitative Data
1. Teachers' ratings of close-ended teacher questionnaire (Appendix A) (Likert-type)
items to obtain data about the process, and principals' responses to obtain data on the
context and views of the implementation.
The preceding examples show that evaluators could collect in-depth data from the users in the
school community, parents, community personnel, and administrators. The stakeholders or
implementers are an important group (in the context) that could influence the results of the
evaluation. They represent the outputs or people who benefit from or influence the processes (or
outcomes), directly or indirectly (Fontaine, Haarman, & Schmid, 2006).
Validity and Reliability
Validity and reliability of research are major considerations. Researchers must examine
several important aspects of instruments; the appropriateness; the measurement properties; and
the process of administering and scoring (Borg & Gall, 1998). The structure and contents of the
researcher-designed instruments could affect the responses provided by the participants,
influencing interpretation of data (Creswell, 2005; Neuman, 2003). Using themes evolving from
the qualitative data or the literature review, dividing into appropriate subsets, training observers,
and testing the instrument will promote content validity and reliable results (VanTassel-Baska et
al., 2008).
Simple random sampling may be more appropriate to select units and participants when
evaluating SSEDC. Probability sampling promotes generalizability of findings to all schools
because each has a chance for selection, and the sample is representative of the population
(Neuman, 2003). Retraining teachers to deliver SSEDC and using the multiple methods of data
collection will promote triangulation and improve validity and reliability (Amadeo & Cepeda,
2007).
Data Analysis
The results of the evaluation will guide decision-making about the program. A
combination of the qualitative and quantitative data at the interpretation stage results in deeper
understanding of the issues and promotes validity and reliability. Ethical evaluators must put
measures in place to ensure accuracy and consistency of results and consequently the
conclusions (Grammatikopoulos et al., 2008).
Content analysis of qualitative data could determine themes that evolve from the
comments or responses of participants. Quantitative data in the form of means and percentages
could complement the data from interviews, open-ended responses, or documents. Pictorial
representations could be in the form of tables and graphs (Neuman, 2003).
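The means and percentages mentioned above can be computed with a few lines of Python. This is a minimal sketch; the ratings shown are invented for illustration and stand in for responses to one Likert-type questionnaire item.

```python
from collections import Counter

# Invented ratings for one Likert-type item
# (1 = strongly disagree ... 5 = strongly agree)
ratings = [4, 5, 3, 4, 2, 5, 4, 3, 4, 5]

# Mean rating for the item
mean = sum(ratings) / len(ratings)

# Percentage of respondents at each rating level
percentages = {
    level: 100 * count / len(ratings)
    for level, count in sorted(Counter(ratings).items())
}

print(f"mean = {mean:.2f}")  # mean = 3.90
for level, pct in percentages.items():
    print(f"rating {level}: {pct:.0f}%")
```

The resulting level-by-level percentages feed directly into the tables and graphs the plan proposes as pictorial representations.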
The mixed method design is the preferred design to guide data collection and analysis
during the SSEDC evaluation. The survey approach using a variety of data collection tools will
provide qualitative and quantitative data from multiple sources and methods. The statistical
methods appropriate to making meaning of mixed data will provide greater insight into the
issues. Triangulation is one component in mixed method design that increases the degree of
validity and reliability, promoting generalizability of findings to the population
(Grammatikopoulos et al., 2008). Presenting data in different forms such as words, or in tables
and graphs, is advantageous, supporting several means to interpret, analyze, and report findings.
Part IV: Analysis of Data
This section is a presentation of the statistical approach to mixed data analysis. Content
analysis of qualitative data will determine themes that evolve from the comments or responses of
participants. Quantitative data in the form of means and percentages will complement the data
from interviews, open-ended responses, or documents. Pictorial representations of the data
will include tables and graphs (Neuman, 2003). Triangulation and integration of
multiple data and sources could reveal possible findings and conclusions about the
implementation of SSEDC. Finally, the recommendations will be presented to develop an
improvement plan.
Methodology
The procedures to analyze qualitative data will have two phases; the steps include:
Phase 1
(i) transcribing data collected from observations, interviews, focus groups, and
document analysis;
(ii) perusing the information to identify themes;
(iii) coding words and phrases using an iterative approach to allow additional phrases
applicable to the analysis;
(iv) coding for frequency;
(v) formulating categories and generalizing based on similarity of words or phrases;
(vi) coding relationships between words;
Phase 2
(vii) applying themes based on theory in order to explain the data and answer the research
questions; and
(viii) presenting data in the form of narrative, tables, and graphs.
Content analysis procedures promote the discovery of themes, patterns, and characteristics from
the data. The procedures include transcribing, perusing, and extracting words and concepts to be
able to apply coding to identify the themes. In Phase 2, applying theory from past research can
also support the development of themes and findings. Numerical percentages and means of words
and themes could result from the qualitative analysis and be presented graphically. The steps of
the qualitative analysis could be done manually or by using qualitative analysis software.
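As a minimal sketch, assuming the analysis is done with software rather than manually, the coding and frequency steps (iii) to (v) of Phase 1 might look like the following Python fragment. The category names and indicator terms are hypothetical illustrations, not codes drawn from actual SSEDC data:

```python
from collections import Counter
import re

# Hypothetical coding scheme: each category lists indicator words or
# phrases. In practice the codes would emerge from perusing transcripts.
CODES = {
    "ssedc_goal": ["citizenship", "democratic", "participation"],
    "strategies": ["role-play", "group work", "project", "discussion"],
    "difficulties": ["time", "resources", "training"],
}

def code_transcript(text):
    """Count occurrences of each category's indicator terms (coding for frequency)."""
    normalized = " ".join(re.findall(r"[a-z\-]+", text.lower()))
    counts = Counter()
    for category, terms in CODES.items():
        counts[category] = sum(normalized.count(term) for term in terms)
    return counts

sample = ("Teachers said group work and role-play support democratic "
          "participation, but time and resources are a problem.")
print(code_transcript(sample))
```

Such counts could then feed the category and relationship coding of steps (v) and (vi), or be exported to a qualitative analysis package for further work.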
Quantitative
Quantitative data will derive from ratings on the Likert-type questionnaire and
observation schedule, and from the frequency of themes and subthemes in the qualitative data.
Presentation of data will be in the form of means and percentages represented in graphs and
tables. Data from the teacher observation schedule (Appendix A) will demonstrate teachers'
competence in planning, teaching, and assessment by the frequency of Yes or No selected by the
observer. Data from the teacher questionnaire will represent teachers' perception of the quality of
the curriculum, the strategies, and the assessments most frequently used.
Tracking concepts stated most frequently would help to identify the patterns that emerge.
Specific statements written or spoken by the participants will give a voice to the issues in relation
to the research questions. Further analysis will show the most frequently used approaches to
teaching and learning.
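The descriptive statistics described above can be computed directly. The sketch below, using invented response data, shows how Yes/No frequencies from the observation schedule and Likert-type ratings from the questionnaire would be summarized as percentages and means:

```python
# Invented illustration data: observer ticks for one schedule item across
# classrooms, and one teacher's Likert-type ratings (1 = never ... 5 = always).
observation_ticks = ["Yes", "Yes", "No", "Yes", "No", "Yes", "Yes"]
likert_ratings = [4, 3, 5, 2, 4, 3]

def percent_yes(ticks):
    """Percentage of observations ticked Yes for an item."""
    return 100 * ticks.count("Yes") / len(ticks)

def mean_rating(ratings):
    """Mean rating, rounded to two decimals for presentation in tables."""
    return round(sum(ratings) / len(ratings), 2)

print(f"Yes responses: {percent_yes(observation_ticks):.1f}%")
print(f"Mean rating: {mean_rating(likert_ratings)}")
```

These summaries map directly onto the tables and graphs proposed for reporting the findings.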
The central question to guide the evaluation is: How well is SSEDC implemented?
Sub-questions:
1. Is the focus of democratic citizenship clear to the teachers?
2. What methods are teachers using to prepare students?
3. What problems are teachers experiencing?
Possible Findings
The responses of the participants from multiple sources should help in identifying the
sources of problems and the roles of stakeholders in improving the program. The findings would
suggest ways to improve the program as necessary. The data collected using the observation
schedules, questionnaires, interviews, focus groups, weekly reflections, and document analyses
will be the basis of the following discussion. The possible findings will reflect the quality of the
implementation focusing on the stated criteria, resulting in themes such as (a) social studies
goals, (b) teacher-centered and student-centered teaching, learning, and assessment strategies,
and (c) difficulties and challenges. The research questions will be the basis for the discussion of
the possible findings of how well teachers are implementing SSEDC.
Sub-question 1. Is the focus of democratic citizenship clear to the teachers?
The designers expected the articulation of the goal of SSEDC in the program document,
during training, and through monitoring by supervisors to help teachers demonstrate a clear
understanding of the goal of SSEDC. Griffith and Barth (2006) explained that teachers might
voice the social studies (SSEDC) goal from various perspectives:
1. disciplines or content necessary for nation building;
2. transmission of knowledge about what a citizen should know or do to be productive;
and
3. active engagement to develop competencies to function effectively as a productive
citizen.
The extent to which teachers can explain the importance of SSEDC in the national curriculum
will reflect whether the goal of SSEDC, to develop competence to function effectively as a
productive citizen, is clear to them.
Sub-question 2. What methods are teachers using to plan, teach, and assess SSEDC?
Comments written by the observers would indicate the extent to which teachers are
applying the teaching, learning, and assessment strategies outlined in the program, or other
strategies applicable to meet the needs of the students. The SSEDC teachers' guide (Ministry of
Education, 2009) provides examples to engage students in various ways. For example, students
“can engage in research projects, cooperative group work, drama, and role-play, discussions, and
community service learning” (p. 3).
Experiential learning is the main philosophical basis of the curriculum,
complemented by behaviorist and constructivist theories. The purpose of the experiential
approach is "to increase knowledge, develop skills and clarify values" (Roberts, 2005, p. 13).
Further, the goal of SSEDC is to promote democratic citizenship competencies, subsumed
under the experiential approach. It follows, therefore, that teachers need to have a clear focus on
this goal. The delivery of the lesson should build students' experience, reflection, concept
development, and active experimentation on previous experience, incorporating direct and
indirect instruction (Hunt et al., 2009).
Teachers should provide opportunities for students to engage in traditional
paper-and-pencil tests and performance-based assessment. Service learning, coined in the early
1960s, is becoming an important component of experiential learning. Community service
learning (CSL) is a summative approach to link teaching, learning, and assessment. Schneller
(2008) noted service learning as an offshoot of Dewey's theory of experience, describing the
strategy as
“pedagogy, curriculum, activities and programmes that embrace organized, hands-on community
service and volunteerism to enhance student learning and the schooling experience” (p. 294).
This aspect of experiential learning culminates a period of learning, giving opportunity for the
learners to demonstrate transfer of learning competencies in similar or new situations in the
school environment and in the community.
The documents will include the program document and teachers' lesson plans with their
reflections. The documents could bear evidence of the teaching, learning, and assessment
strategies that teachers use in planning and delivery of lessons. Numerous instructional
strategies are available for use in the classroom (See Teacher questionnaire, Appendix B).
Planning is important since “without a careful plan for presenting content, [students] experience
may be akin to a jigsaw puzzle” (Gunter, Estes, & Schwab, 2003, p. 39). Planning the
procedures portion of lesson plans requires teachers to select appropriate strategies to meet
identified needs, interests, motivations, and dispositions of learners. Teachers should consider
learner characteristics and learning styles when choosing an instructional strategy (Hunt,
Wiseman, & Touzel, 2009).
Sub-question 3. What problems are teachers experiencing?
The questionnaire should give further insight into the delivery of the curriculum.
Although the curriculum is prescriptive, some variables might yet affect the implementation
process, including the allotted periods, the scope of the topics, the available resources, the
appropriateness of the content for the prescribed grade level, and the learning environment. The
evaluation could highlight the difficulty and ease with which teachers were able to implement
the objectives and content of the program. The analysis should demonstrate the teaching,
learning, and assessment strategies used most to achieve the goals, and those used less or not at
all.
Interview data (including focus groups) from teachers and parents could corroborate the
possible findings above and support improvement of the SSEDC program. Difficulties and
challenges teachers experience may result in gaps that influence the achievement of the SSEDC
program goal and objectives. This would be evident in teachers' knowledge of the SSEDC goal,
the application of student-centered experiential activities, and traditional and performance-based
assessment strategies. The preceding analysis, including the identification of difficulties teachers
are experiencing and their suggestions for improvement, will provide the evidence to design and
implement the improvement plan. This evidence should be the basis on which to recommend
strategies to improve the following:
1. teachers' understanding of the goal of SSEDC;
2. assistance from personnel in the curriculum development unit, including the director and
supervisors;
3. teacher preparation procedures for planning, delivery, and assessment of SSEDC; and
4. involvement of teachers in further revision of the SSEDC program.
Reporting
Collaboration is important, and communication even more so, especially for the
evaluator to communicate to the implementers the goal, theory, and procedures of the
evaluation, and to share and discuss findings (Jason, 2008). Reporting of evaluation findings
requires the development of a communication plan outlining a schedule of presentations. A draft
report to the heads of the Ministry of Education, such as the Minister and the Director of
Education, will clarify misconceptions or provoke responses to any conflicting results (Llosa &
Slayton, 2009). The evaluator should add a personal touch by making the live presentation,
using supporting aids to clarify points and keep the interest of the other stakeholders (Posavac
& Carey, 2007). Llosa and Slayton emphasized that "…findings be communicated appropriately
and convincingly to stakeholders, so the recommendations would be considered and not
dismissed" (p. 37).
Program evaluation is iterative, ongoing, and cyclical to achieve different purposes,
contributing to communication, collaboration, and learning in an organization (Jason, 2008). The
main goal of SSEDC is to foster democratic citizenship competencies, practices, and social
change. The causal relationship between resources, activities, and outputs should influence the
change observed in the users over a period of time. This proposed evaluation plan was a
formative process evaluation of social studies education for democratic citizenship (SSEDC).
The purpose was to examine how teachers were implementing SSEDC at Grades K-6 at private
and public primary schools. The participants of interest were (a) teachers randomly selected
from four schools of each type, and (b) supervisors and other influential stakeholders. Mixed
method design supported the collection and analysis of qualitative and quantitative data through
questionnaires, observations, and interviews. Content analysis and descriptive analysis were the
statistical approaches applied to identify possible findings. The evidence supported negative and
positive findings about SSEDC goals; teaching, learning, and assessment strategies; and
difficulties experienced during implementation. These findings, consequent conclusions, and
participants' suggestions form the basis for the recommendation of improvement strategies. The
strategies included stakeholder involvement, continuous training, and accountability measures.
Reporting evaluation findings required effective planning, collaboration, communication, and
presentation strategies to ensure the client or stakeholders see merit in the evaluation as stated in
the purpose.
References
Amadeo, J., & Cepeda, A. (2007). National policies on Education for Democratic Citizenship in
the Americas. Draft analytic Report. Washington D.C.: Inter-American Program on
Education for Democratic Values and Practices. Organization of American States (OAS).
Beaumier, J-P., Marchand, C., Simoneau, C., & Savard, D. (2000). The institutional evaluation
guide. Commission D'évaluation de L'enseignement Collegial at its 103rd meeting in
Québec City.
Borg, W. R., & Gall, M. D. (1998). Educational research: An introduction (5th ed.). New York:
Longman.
Burnstein, J. H., Hutton, L. A., & Curtis, R. (2006). The state of elementary social studies
teaching in one urban district. Journal of Social Studies Research, 30(1), 15-20.
Chen, H. (2005). Practical program evaluation: Assessing and improving planning,
implementation, and effectiveness. Thousand Oaks, CA: Sage.
Cojocaru, S. (2009). Clarifying the theory-based evaluation. Review of Research and Social
Intervention, 26, 76-86. Retrieved from
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1448390
Creswell, J. (2005). Educational research: Planning, conducting, and evaluating quantitative
and qualitative research (2nd ed.). New Jersey: Pearson, Merrill Prentice Hall.
Eaton, J. S. (2009). Accreditation in the United States. New Directions for Higher Education,
(145), 79-86.
Fontaine, C., Haarman, A., & Schmid, S. (2006). Stakeholder theory. Retrieved from
http://www.edalys.fr/documents/Stakeholders%20theory.pdf
Gallavan, N. P. (2008). Examining teacher candidates' views on teaching world citizenship.
Social Studies, 99(6), 249-254.
Gard, C.L., Flannigan, P. N., & Cluskey, M. (2004). Program evaluation: An ongoing
systematic process. Nursing Education Perspectives, 25(4), 176-179.
Grammatikopoulos, V., Zachopoulou, E., Tsangaridou, N., Liukkonen, J., & Pickup, I. (2008).
Applying a mixed method design to evaluate training seminars within an early childhood
education project. Evaluation & Research in Education, 21(1), 4-17.
Gunter, M.A., Estes, T. H., & Schwab, J. (2003). Instruction: A models approach (4th ed.).
Boston: Pearson Education Inc.
Hunt, G. H., Wiseman, D. G., & Touzel, T. J. (2009). Effective teaching: Preparation and
implementation (4th ed.). Illinois: Charles C. Thomas – Publisher Ltd
Jason, M.H. (2008). Evaluating programs to increase student achievement (2nd ed.). Thousand
Oaks, CA: Sage.
Llosa, L., & Slayton, J. (2009). Using program evaluation to inform and improve the education
of young English language learners in US schools. Language Teaching Research 13(1),
35–54.
McCawley, F. (2001). The logic model for program planning and evaluation. Retrieved from
http://www.uiweb.uidaho.edu/extension/LogicModel.pdf
Ministry of Education. (2009). Social studies teachers’ guide: Social studies education for
democratic citizenship. St. John's, Antigua & Barbuda: Curriculum Development Unit.
Ministry of Education. (2009). Draft national curriculum policy framework. St. John's, Antigua
& Barbuda: Curriculum Development Unit.
Neuman, L. W. (2003). Social research methods. Qualitative and quantitative approaches (5th
ed.). Boston: Allyn & Bacon.
Posavac, E. J., & Carey. R. G. (2007). Program evaluation: Methods and case studies (7th ed.).
Upper Saddle River, N.J.: Pearson/Prentice Hall.
Roberts, J. W. (2005). Disney, Dewey, and the death of experience in education. Education and
Culture, 21(2), 12-30.
VanTassel-Baska, J., Feng, A., MacFarlane, B., Heng, M., Wong, M. L., Quek, C.G., & Khong,
B. C. (2008). A cross-cultural study of teachers' instructional practices in Singapore and
the United States. Journal for the Education of the Gifted, 31(3), 338-363.
Yarbrough, D. B., Shulha, L. M., Hopson, R. K., & Caruthers, F. L. (2011). The program evaluation
standards: A guide for evaluators and evaluation users (3rd ed.). Thousand Oaks, CA:
Sage.
Appendix A
Classroom Observation Schedule
Date _______________________________ Grade _______________________________
School: ______________________________
Teacher _____________________________ Topic _______________________________
A. Tick the appropriate column (YES or NO)
1. Teacher is using the Social Studies Curriculum ____ ____
2. Teacher has a completed lesson plan ____ ____
3. Lesson plan incorporates a variety of teaching learning activities ____ ____
4. Activities are in keeping with those suggested in curriculum ____ ____
5. Teacher seems comfortable with activities suggested in curriculum ____ ____
6. Materials are appropriate to the lesson ____ ____
7. Materials are used appropriately during the lesson ____ ____
8. Students respond positively to lesson activities ____ ____
9. Students are active participants in the lesson ____ ____
10. Activities are planned to cater for students' individual differences ____ ____
11. Lesson objectives are achieved ____ ____
12. Students are assessed in a variety of ways ____ ____
13. Methods of assessment are clear and appropriate ____ ____
B. Elaborate or comment on any of your observations. Suggest support that could help the teacher
improve.
_____________________________________________________________________________________________
_____________________________________________________________________________________________
_____________________________________________________________________________________________
_____________________________________________________________________________________________
Appendix B
Teacher Questionnaire
Teacher: ________________________________________________________________
School: _________________________________________________________________
Class:
Term: __________________________________________________________________
Section A
i. How often do you use the social studies education for democratic citizenship (SSEDC) instructional guide?
Always Sometimes Never
ii. Lessons contain realistic teaching time frames. Yes _____ No ______
iii. Number of teaching lessons/activities. Sufficient ____ Insufficient ____
iv. Number of available resources listed. Sufficient ____ Insufficient ____
v. Content for the level of teaching? Appropriate ____ Inappropriate_____
Section B
1. What objectives did you cover this term?
[use unit & objective numbers]
2. What content was difficult to teach?
3. What content was easy to teach?
Section C: Strategies/methods
1. Which teaching-learning strategies or activities do you use?
Research ____
Grouping ____
Peer teaching ____
Investigation ____
Simulations ____
Role Play ____
Dramatization ____
Community Service Learning ____
Lecture ____
Reading textbook ____
Project ____
Poster ____
Chart ____
Poem/song ____
Displays ____
Exhibitions ____
Questioning ____
Field trip ____
Journal ____
Discussion ____
Vocabulary development ____
Presentation ____
Notes ____
Class work ____
2. Which assessment methods do you use?
Journals ____
Investigation & Projects ____
Observation ____
Oral assessment ____
Pencil & Paper Tests/exercises ____
Worksheets ____
Practical / Performance Assessment ____
Portfolio Assessment ____
Peer assessment ____
Questionnaires ____
Community Service Learning ____
Section D
Respond to the following:
1. Describe TWO main difficulties you encounter in using the curriculum/program guide
2. State THREE suggestions for improving the curriculum
3. Explain the importance of SSEDC in the national curriculum.
4. Other comments. [e.g. your feelings, your practice, and students' responses]
Appendix C
Interview
1. How can the curriculum development unit assist with the teaching of SSEDC?
2. Is the focus for democratic citizenship clear to the teachers?
3. Is the goal of democratic citizenship clearly outlined in the guide?
4. How helpful is the interaction with supervisors?
5. What comments do you have about preparation of teachers for planning and delivering SSEDC?
6. What comments do you have about your involvement in the development of the program?
Running Head Target of Program Evaluation Plan, Part 11TARG.docx
toltonkendal
 
1SIP BEDP 2030 by DepEd Planning Service Director Roger Masapol.pptx
1SIP BEDP 2030 by DepEd Planning Service Director Roger Masapol.pptx1SIP BEDP 2030 by DepEd Planning Service Director Roger Masapol.pptx
1SIP BEDP 2030 by DepEd Planning Service Director Roger Masapol.pptx
beriniaedeno
 
MD8Assgn: A8: Course Project—Program Proposal
MD8Assgn: A8: Course Project—Program ProposalMD8Assgn: A8: Course Project—Program Proposal
MD8Assgn: A8: Course Project—Program Proposal
eckchela
 

Similar to Program evaluation plan (20)

curriculum development process-1.pptx
curriculum development process-1.pptxcurriculum development process-1.pptx
curriculum development process-1.pptx
 
Leadership and learning with revisions dr. lisa bertrand-nfeasj (done)
Leadership and learning with revisions dr. lisa bertrand-nfeasj (done)Leadership and learning with revisions dr. lisa bertrand-nfeasj (done)
Leadership and learning with revisions dr. lisa bertrand-nfeasj (done)
 
2014 Montgomery College Diversity Report
2014 Montgomery College Diversity Report2014 Montgomery College Diversity Report
2014 Montgomery College Diversity Report
 
An Evaluation Of One District S Think Through Math Program And Curriculum Imp...
An Evaluation Of One District S Think Through Math Program And Curriculum Imp...An Evaluation Of One District S Think Through Math Program And Curriculum Imp...
An Evaluation Of One District S Think Through Math Program And Curriculum Imp...
 
Lesson 4 Curriculum Development 2023.pptx
Lesson 4 Curriculum  Development 2023.pptxLesson 4 Curriculum  Development 2023.pptx
Lesson 4 Curriculum Development 2023.pptx
 
An Evaluation Of One District U27S Think Through Math Program And Curriculum ...
An Evaluation Of One District U27S Think Through Math Program And Curriculum ...An Evaluation Of One District U27S Think Through Math Program And Curriculum ...
An Evaluation Of One District U27S Think Through Math Program And Curriculum ...
 
Situation analysis
Situation analysisSituation analysis
Situation analysis
 
Curriculum determimants
Curriculum determimantsCurriculum determimants
Curriculum determimants
 
Elements Of Curriculum Development
Elements Of Curriculum DevelopmentElements Of Curriculum Development
Elements Of Curriculum Development
 
Designing an evaluation of a tertiary preparatory program sounds
Designing an evaluation of a tertiary preparatory program soundsDesigning an evaluation of a tertiary preparatory program sounds
Designing an evaluation of a tertiary preparatory program sounds
 
Designing an evaluation of a tertiary preparatory program sounds
Designing an evaluation of a tertiary preparatory program soundsDesigning an evaluation of a tertiary preparatory program sounds
Designing an evaluation of a tertiary preparatory program sounds
 
Learning experiences
Learning experiences Learning experiences
Learning experiences
 
A guide for comprehensive needs assessment
A guide for comprehensive needs assessmentA guide for comprehensive needs assessment
A guide for comprehensive needs assessment
 
A guide for comprehensive needs assessment
A guide for comprehensive needs assessmentA guide for comprehensive needs assessment
A guide for comprehensive needs assessment
 
Comprehensive Guidance And Counseling Program
Comprehensive Guidance And Counseling ProgramComprehensive Guidance And Counseling Program
Comprehensive Guidance And Counseling Program
 
DEDP orientation.pptx
DEDP orientation.pptxDEDP orientation.pptx
DEDP orientation.pptx
 
The curriculum framework
The curriculum frameworkThe curriculum framework
The curriculum framework
 
Running Head Target of Program Evaluation Plan, Part 11TARG.docx
Running Head Target of Program Evaluation Plan, Part 11TARG.docxRunning Head Target of Program Evaluation Plan, Part 11TARG.docx
Running Head Target of Program Evaluation Plan, Part 11TARG.docx
 
1SIP BEDP 2030 by DepEd Planning Service Director Roger Masapol.pptx
1SIP BEDP 2030 by DepEd Planning Service Director Roger Masapol.pptx1SIP BEDP 2030 by DepEd Planning Service Director Roger Masapol.pptx
1SIP BEDP 2030 by DepEd Planning Service Director Roger Masapol.pptx
 
MD8Assgn: A8: Course Project—Program Proposal
MD8Assgn: A8: Course Project—Program ProposalMD8Assgn: A8: Course Project—Program Proposal
MD8Assgn: A8: Course Project—Program Proposal
 

More from Cynthia Crump-Russell

Meaningful test and assignments crumpcla
Meaningful test and assignments crumpclaMeaningful test and assignments crumpcla
Meaningful test and assignments crumpcla
Cynthia Crump-Russell
 
Accountability
AccountabilityAccountability
Accountability
Cynthia Crump-Russell
 
Tpd models presentation 2010
Tpd models presentation 2010Tpd models presentation 2010
Tpd models presentation 2010
Cynthia Crump-Russell
 
Day in a School Teacher Support (DiaS-TS) Change Model
Day in a School Teacher Support (DiaS-TS) Change ModelDay in a School Teacher Support (DiaS-TS) Change Model
Day in a School Teacher Support (DiaS-TS) Change Model
Cynthia Crump-Russell
 
National Curriculum Design
National Curriculum DesignNational Curriculum Design
National Curriculum Design
Cynthia Crump-Russell
 
Curriculum Development Overview / FOUNDATIONS
Curriculum Development Overview / FOUNDATIONSCurriculum Development Overview / FOUNDATIONS
Curriculum Development Overview / FOUNDATIONS
Cynthia Crump-Russell
 
National Curriculum Development Process (Plan)
National Curriculum Development Process (Plan)National Curriculum Development Process (Plan)
National Curriculum Development Process (Plan)
Cynthia Crump-Russell
 
C&i experiential appraisal model
C&i experiential appraisal modelC&i experiential appraisal model
C&i experiential appraisal model
Cynthia Crump-Russell
 

More from Cynthia Crump-Russell (8)

Meaningful test and assignments crumpcla
Meaningful test and assignments crumpclaMeaningful test and assignments crumpcla
Meaningful test and assignments crumpcla
 
Accountability
AccountabilityAccountability
Accountability
 
Tpd models presentation 2010
Tpd models presentation 2010Tpd models presentation 2010
Tpd models presentation 2010
 
Day in a School Teacher Support (DiaS-TS) Change Model
Day in a School Teacher Support (DiaS-TS) Change ModelDay in a School Teacher Support (DiaS-TS) Change Model
Day in a School Teacher Support (DiaS-TS) Change Model
 
National Curriculum Design
National Curriculum DesignNational Curriculum Design
National Curriculum Design
 
Curriculum Development Overview / FOUNDATIONS
Curriculum Development Overview / FOUNDATIONSCurriculum Development Overview / FOUNDATIONS
Curriculum Development Overview / FOUNDATIONS
 
National Curriculum Development Process (Plan)
National Curriculum Development Process (Plan)National Curriculum Development Process (Plan)
National Curriculum Development Process (Plan)
 
C&i experiential appraisal model
C&i experiential appraisal modelC&i experiential appraisal model
C&i experiential appraisal model
 

Recently uploaded

Digital Artefact 1 - Tiny Home Environmental Design
Digital Artefact 1 - Tiny Home Environmental DesignDigital Artefact 1 - Tiny Home Environmental Design
Digital Artefact 1 - Tiny Home Environmental Design
amberjdewit93
 
A Independência da América Espanhola LAPBOOK.pdf
A Independência da América Espanhola LAPBOOK.pdfA Independência da América Espanhola LAPBOOK.pdf
A Independência da América Espanhola LAPBOOK.pdf
Jean Carlos Nunes Paixão
 
RPMS TEMPLATE FOR SCHOOL YEAR 2023-2024 FOR TEACHER 1 TO TEACHER 3
RPMS TEMPLATE FOR SCHOOL YEAR 2023-2024 FOR TEACHER 1 TO TEACHER 3RPMS TEMPLATE FOR SCHOOL YEAR 2023-2024 FOR TEACHER 1 TO TEACHER 3
RPMS TEMPLATE FOR SCHOOL YEAR 2023-2024 FOR TEACHER 1 TO TEACHER 3
IreneSebastianRueco1
 
Executive Directors Chat Leveraging AI for Diversity, Equity, and Inclusion
Executive Directors Chat  Leveraging AI for Diversity, Equity, and InclusionExecutive Directors Chat  Leveraging AI for Diversity, Equity, and Inclusion
Executive Directors Chat Leveraging AI for Diversity, Equity, and Inclusion
TechSoup
 
writing about opinions about Australia the movie
writing about opinions about Australia the moviewriting about opinions about Australia the movie
writing about opinions about Australia the movie
Nicholas Montgomery
 
বাংলাদেশ অর্থনৈতিক সমীক্ষা (Economic Review) ২০২৪ UJS App.pdf
বাংলাদেশ অর্থনৈতিক সমীক্ষা (Economic Review) ২০২৪ UJS App.pdfবাংলাদেশ অর্থনৈতিক সমীক্ষা (Economic Review) ২০২৪ UJS App.pdf
বাংলাদেশ অর্থনৈতিক সমীক্ষা (Economic Review) ২০২৪ UJS App.pdf
eBook.com.bd (প্রয়োজনীয় বাংলা বই)
 
Walmart Business+ and Spark Good for Nonprofits.pdf
Walmart Business+ and Spark Good for Nonprofits.pdfWalmart Business+ and Spark Good for Nonprofits.pdf
Walmart Business+ and Spark Good for Nonprofits.pdf
TechSoup
 
Pride Month Slides 2024 David Douglas School District
Pride Month Slides 2024 David Douglas School DistrictPride Month Slides 2024 David Douglas School District
Pride Month Slides 2024 David Douglas School District
David Douglas School District
 
How to Setup Warehouse & Location in Odoo 17 Inventory
How to Setup Warehouse & Location in Odoo 17 InventoryHow to Setup Warehouse & Location in Odoo 17 Inventory
How to Setup Warehouse & Location in Odoo 17 Inventory
Celine George
 
Pollock and Snow "DEIA in the Scholarly Landscape, Session One: Setting Expec...
Pollock and Snow "DEIA in the Scholarly Landscape, Session One: Setting Expec...Pollock and Snow "DEIA in the Scholarly Landscape, Session One: Setting Expec...
Pollock and Snow "DEIA in the Scholarly Landscape, Session One: Setting Expec...
National Information Standards Organization (NISO)
 
DRUGS AND ITS classification slide share
DRUGS AND ITS classification slide shareDRUGS AND ITS classification slide share
DRUGS AND ITS classification slide share
taiba qazi
 
Liberal Approach to the Study of Indian Politics.pdf
Liberal Approach to the Study of Indian Politics.pdfLiberal Approach to the Study of Indian Politics.pdf
Liberal Approach to the Study of Indian Politics.pdf
WaniBasim
 
What is Digital Literacy? A guest blog from Andy McLaughlin, University of Ab...
What is Digital Literacy? A guest blog from Andy McLaughlin, University of Ab...What is Digital Literacy? A guest blog from Andy McLaughlin, University of Ab...
What is Digital Literacy? A guest blog from Andy McLaughlin, University of Ab...
GeorgeMilliken2
 
Hindi varnamala | hindi alphabet PPT.pdf
Hindi varnamala | hindi alphabet PPT.pdfHindi varnamala | hindi alphabet PPT.pdf
Hindi varnamala | hindi alphabet PPT.pdf
Dr. Mulla Adam Ali
 
ISO/IEC 27001, ISO/IEC 42001, and GDPR: Best Practices for Implementation and...
ISO/IEC 27001, ISO/IEC 42001, and GDPR: Best Practices for Implementation and...ISO/IEC 27001, ISO/IEC 42001, and GDPR: Best Practices for Implementation and...
ISO/IEC 27001, ISO/IEC 42001, and GDPR: Best Practices for Implementation and...
PECB
 
How to Fix the Import Error in the Odoo 17
How to Fix the Import Error in the Odoo 17How to Fix the Import Error in the Odoo 17
How to Fix the Import Error in the Odoo 17
Celine George
 
S1-Introduction-Biopesticides in ICM.pptx
S1-Introduction-Biopesticides in ICM.pptxS1-Introduction-Biopesticides in ICM.pptx
S1-Introduction-Biopesticides in ICM.pptx
tarandeep35
 
Exploiting Artificial Intelligence for Empowering Researchers and Faculty, In...
Exploiting Artificial Intelligence for Empowering Researchers and Faculty, In...Exploiting Artificial Intelligence for Empowering Researchers and Faculty, In...
Exploiting Artificial Intelligence for Empowering Researchers and Faculty, In...
Dr. Vinod Kumar Kanvaria
 
PCOS corelations and management through Ayurveda.
PCOS corelations and management through Ayurveda.PCOS corelations and management through Ayurveda.
PCOS corelations and management through Ayurveda.
Dr. Shivangi Singh Parihar
 
Film vocab for eal 3 students: Australia the movie
Film vocab for eal 3 students: Australia the movieFilm vocab for eal 3 students: Australia the movie
Film vocab for eal 3 students: Australia the movie
Nicholas Montgomery
 

Recently uploaded (20)

Digital Artefact 1 - Tiny Home Environmental Design
Digital Artefact 1 - Tiny Home Environmental DesignDigital Artefact 1 - Tiny Home Environmental Design
Digital Artefact 1 - Tiny Home Environmental Design
 
A Independência da América Espanhola LAPBOOK.pdf
A Independência da América Espanhola LAPBOOK.pdfA Independência da América Espanhola LAPBOOK.pdf
A Independência da América Espanhola LAPBOOK.pdf
 
RPMS TEMPLATE FOR SCHOOL YEAR 2023-2024 FOR TEACHER 1 TO TEACHER 3
RPMS TEMPLATE FOR SCHOOL YEAR 2023-2024 FOR TEACHER 1 TO TEACHER 3RPMS TEMPLATE FOR SCHOOL YEAR 2023-2024 FOR TEACHER 1 TO TEACHER 3
RPMS TEMPLATE FOR SCHOOL YEAR 2023-2024 FOR TEACHER 1 TO TEACHER 3
 
Executive Directors Chat Leveraging AI for Diversity, Equity, and Inclusion
Executive Directors Chat  Leveraging AI for Diversity, Equity, and InclusionExecutive Directors Chat  Leveraging AI for Diversity, Equity, and Inclusion
Executive Directors Chat Leveraging AI for Diversity, Equity, and Inclusion
 
writing about opinions about Australia the movie
writing about opinions about Australia the moviewriting about opinions about Australia the movie
writing about opinions about Australia the movie
 
বাংলাদেশ অর্থনৈতিক সমীক্ষা (Economic Review) ২০২৪ UJS App.pdf
বাংলাদেশ অর্থনৈতিক সমীক্ষা (Economic Review) ২০২৪ UJS App.pdfবাংলাদেশ অর্থনৈতিক সমীক্ষা (Economic Review) ২০২৪ UJS App.pdf
বাংলাদেশ অর্থনৈতিক সমীক্ষা (Economic Review) ২০২৪ UJS App.pdf
 
Walmart Business+ and Spark Good for Nonprofits.pdf
Walmart Business+ and Spark Good for Nonprofits.pdfWalmart Business+ and Spark Good for Nonprofits.pdf
Walmart Business+ and Spark Good for Nonprofits.pdf
 
Pride Month Slides 2024 David Douglas School District
Pride Month Slides 2024 David Douglas School DistrictPride Month Slides 2024 David Douglas School District
Pride Month Slides 2024 David Douglas School District
 
How to Setup Warehouse & Location in Odoo 17 Inventory
How to Setup Warehouse & Location in Odoo 17 InventoryHow to Setup Warehouse & Location in Odoo 17 Inventory
How to Setup Warehouse & Location in Odoo 17 Inventory
 
Pollock and Snow "DEIA in the Scholarly Landscape, Session One: Setting Expec...
Pollock and Snow "DEIA in the Scholarly Landscape, Session One: Setting Expec...Pollock and Snow "DEIA in the Scholarly Landscape, Session One: Setting Expec...
Pollock and Snow "DEIA in the Scholarly Landscape, Session One: Setting Expec...
 
DRUGS AND ITS classification slide share
DRUGS AND ITS classification slide shareDRUGS AND ITS classification slide share
DRUGS AND ITS classification slide share
 
Liberal Approach to the Study of Indian Politics.pdf
Liberal Approach to the Study of Indian Politics.pdfLiberal Approach to the Study of Indian Politics.pdf
Liberal Approach to the Study of Indian Politics.pdf
 
What is Digital Literacy? A guest blog from Andy McLaughlin, University of Ab...
What is Digital Literacy? A guest blog from Andy McLaughlin, University of Ab...What is Digital Literacy? A guest blog from Andy McLaughlin, University of Ab...
What is Digital Literacy? A guest blog from Andy McLaughlin, University of Ab...
 
Hindi varnamala | hindi alphabet PPT.pdf
Hindi varnamala | hindi alphabet PPT.pdfHindi varnamala | hindi alphabet PPT.pdf
Hindi varnamala | hindi alphabet PPT.pdf
 
ISO/IEC 27001, ISO/IEC 42001, and GDPR: Best Practices for Implementation and...
ISO/IEC 27001, ISO/IEC 42001, and GDPR: Best Practices for Implementation and...ISO/IEC 27001, ISO/IEC 42001, and GDPR: Best Practices for Implementation and...
ISO/IEC 27001, ISO/IEC 42001, and GDPR: Best Practices for Implementation and...
 
How to Fix the Import Error in the Odoo 17
How to Fix the Import Error in the Odoo 17How to Fix the Import Error in the Odoo 17
How to Fix the Import Error in the Odoo 17
 
S1-Introduction-Biopesticides in ICM.pptx
S1-Introduction-Biopesticides in ICM.pptxS1-Introduction-Biopesticides in ICM.pptx
S1-Introduction-Biopesticides in ICM.pptx
 
Exploiting Artificial Intelligence for Empowering Researchers and Faculty, In...
Exploiting Artificial Intelligence for Empowering Researchers and Faculty, In...Exploiting Artificial Intelligence for Empowering Researchers and Faculty, In...
Exploiting Artificial Intelligence for Empowering Researchers and Faculty, In...
 
PCOS corelations and management through Ayurveda.
PCOS corelations and management through Ayurveda.PCOS corelations and management through Ayurveda.
PCOS corelations and management through Ayurveda.
 
Film vocab for eal 3 students: Australia the movie
Film vocab for eal 3 students: Australia the movieFilm vocab for eal 3 students: Australia the movie
Film vocab for eal 3 students: Australia the movie
 

Program evaluation plan

(Chen, 2005; Gard, Flannigan, & Cluskey, 2004; Jason, 2008; Posavac & Carey, 2007).

The proposed evaluation plan will include:

1. Committee Selection
2. Part I: Background of the SSEDC Program
   a. Determination of the Current State of the Program
   b. Evaluation Goal
3. Part II: Theoretical Framework
   a. Standards
   b. Program Theory
   c. Stakeholders
   d. Methodology
   e. Target Population
4. Part III: Mixed Design
   a. Rationale
   b. Data Collection Methods
   c. Validity and Reliability
5. Part IV: Analysis of Data
   a. Possible Findings
   b. Recommendation
   c. Reporting Considerations

The preceding list outlines the main sections of the evaluation plan. Committee selection is important to ensure that the key personnel selected are competent to lead and perform major roles. Part I is a brief description of the program and a statement of the evaluation goal and the questions that will guide the evaluation. Part II guides the design, including the standards and criteria that will serve as a checklist to judge the quality of the evaluation, and describes the individuals who will affect or be affected by the implementation of the program, including its beneficiaries. Part III, the mixed design, emphasizes the plan to conduct a detailed evaluation by means of multiple data collection tools and multiple sources. In Part IV, possible findings from the data analysis will provide the basis for suggestions to improve the program. The presentation will form the basis for conducting a formative process evaluation and providing recommendations to improve teachers' implementation of SSEDC.

Committee Selection

The director, the supervisor, one other education officer, teachers, principals, a community member, a parent, a private school principal, and a school mentor will comprise the evaluation committee. The responsibility of the committee members is to respond to the strengths and challenges of the program in order to refine it. Gard, Flannigan, and Cluskey (2004) noted that the evaluation committee has the responsibility "to use data to identify strengths and weaknesses of the program" (p. 176). The coordinator of the development and revision processes and the supervisor, together with the stakeholders in the school community, "are vital to the survival and success of the [program]" (Gard et al., 2004, p. 4). Collaboration with external evaluators will ensure a supportive environment (Chen, 2005). The director must guard against bias and conflicts of interest (Posavac & Carey, 2007) arising from involvement in all stages of the program. Ethics and values are two elements necessary to plan, conduct, and evaluate a program so as to ensure accuracy of results. Using both external and internal evaluators would help to lessen or eliminate perceived internal bias while empowering internal and external stakeholders (Chen, 2005).

Part I: Background of the SSEDC Program

Before 2006, the last attempt at social studies curriculum review and renewal was in the late 1980s, supported by USAID curriculum specialists. After a quarter of a century, rebranding of the social studies curriculum was necessary, including renaming the program social studies education for Democratic citizenship (SSEDC). Besides datedness, factors affecting the social behaviors of citizens, especially among the youth, influenced the development of SSEDC. In 2007, a team, including the Director of Curriculum as expert, a core of teachers, and representatives from the environment ministry, completed a first draft of SSEDC. After several reviews, SSEDC was piloted in a sample of schools and classes (K-9) over a period of 12 weeks in 2008. At various review sessions, teachers of all grades had the opportunity to propose changes based on the results and recommendations of the pilot implementation data. Implementation of the revised instructional program took place in September 2009. Familiarity seminars and training workshops were conducted to develop teacher competence and support the implementation. From 2009 to the present, the director, the supervisor, education officers, principals, and senior teachers have continued to monitor SSEDC.

Goals of SSEDC

The following section of the rationale of SSEDC (Ministry of Education, 2009) outlines several reasons that influenced program development:

First, Antigua and Barbuda [is] a Democratic state, independent from Britain since 1981; Education for Democratic Citizenship (EDC) would mean that the main outcome of schooling should be citizens with civic consciousness, not only equipped with knowledge but [also having] the ability to demonstrate skills appropriate to such a citizen, who also exhibit democratic values. Second, there appears to be a democratic deficit. A high percentage of individuals (youth) do not vote or even show much interest in politics. SSEDC should help to improve individuals' levels of understanding of their lives and how they interact within society. Third, [since the mid-2000s] there has been an upsurge of crime and violence. Of particular interest are the negative activities among the youth. These include school violence, drug-related violence, increases in cases of HIV/AIDS, home invasions coupled with robbery and rape, murders, and other gun-related crimes. Fourth, a surge in migration from Caribbean neighbors and an influx of other migrants from China have opened up the avenue for a focus on themes such as civic ideals and practices, identity, traditions, multiculturalism, cultural diversity, and tolerance. All citizens need to tolerate peoples from other places, and also to tolerate their differences. (p. 1)

The focus of SSEDC is on relationships, to promote (i) understanding of the role and responsibility of citizens in a democratic society and (ii) awareness of local, regional, and global links and interdependence. The overarching goal of SSEDC is citizenship, achievable through:

1. Knowledge of social issues and concerns;
2. Skill development;
3. Development of values and attitudes; and
4. Social participation. (p. 3)

Teachers should provide the preceding experiences. The program's rationale and goal emphasize the outcome capabilities, including the knowledge, skills, values, and dispositions the students should achieve. Students should also receive opportunities to participate in society by transferring classroom learning to the role of productive citizens. These long-term outcomes should drive lesson objectives as well as the teaching-learning experiences.
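The content and descriptive analyses the plan proposes can be sketched in miniature. The Python sketch below tallies a frequency table and mean for a hypothetical Likert item on teachers' understanding of the SSEDC goal, and counts illustrative problem themes in open-ended responses. All ratings, response excerpts, theme names, and keywords are invented for illustration, not actual SSEDC data.

```python
from collections import Counter
from statistics import mean

# Hypothetical Likert ratings (1 = strongly disagree ... 5 = strongly agree) for a
# survey item such as "The goal of SSEDC is clear to me"; values are illustrative.
ratings = [4, 5, 3, 4, 2, 5, 4, 3, 4, 5, 2, 4]

frequencies = Counter(ratings)                     # descriptive analysis: frequency table
average = mean(ratings)                            # central tendency
agreeing = sum(1 for r in ratings if r >= 4) / len(ratings)  # share rating 4 or 5

# Hypothetical open-ended responses about implementation problems, coded against a
# simple a-priori theme frame (theme -> signal keywords); all names are assumptions.
responses = [
    "not enough materials for group work",
    "more training on assessment is needed",
    "materials arrived late this term",
    "training workshops helped but time is short",
    "time for social participation activities is limited",
]
themes = {
    "resources": ["materials"],
    "professional development": ["training", "workshop"],
    "instructional time": ["time"],
}
theme_counts = Counter(
    theme
    for text in responses
    for theme, keywords in themes.items()
    if any(k in text for k in keywords)
)

print(f"rating frequencies: {dict(sorted(frequencies.items()))}")
print(f"mean rating: {average:.2f}; agreeing (4 or 5): {agreeing:.0%}")
print(f"problem themes: {theme_counts.most_common()}")
```

In practice the evaluation team would draw such tallies from the actual survey and interview instruments; the point of the sketch is only that the descriptive frequencies and theme counts the plan calls for fall out of very little computation.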
  • 7. Instructional Program Evaluation Plan 7 The director introduced the instructional guide with the following statement adapted from the Organization of Eastern Caribbean States Educational and Research Unit (OERU): The [program] offers a range of ideas and suggestions to help teachers organize participatory learning experiences designed to prepare students for lifelong learning”…. Social studies classrooms place major emphasis on student-centered learning through the acquisition and development of specific cognitive skills and competencies. The focus is on learning through activities, practice, and participation…. These skills are expected to produce the ultimate outcomes of SSEDC: students as citizens, acquiring and demonstrating social understanding and civic efficacy” (Ministry of Education, 2009, p. 2) Social and Contextual Factors The SSEDC Instructional program is a part of private and public schools curriculum. The pilot implementation findings highlighted some gaps and the intent of the review was to improve on the program. Currently, the curriculum unit personnel conduct support and monitoring evaluation to provide feedback information on a regular basis to facilitate supervision of the program. The qualitative and quantitative reports obtained from observation of teaching using a rating scale, reflections, the classroom environment, students‟ work, and the interactions reveal areas that mentors could assist with on a continuous basis. Determination of the Status of the Program The monitoring in public schools revealed variations exist in the teaching-learning contexts within and across schools and classes that result in differentiated delivery and students‟ learning experiences. The nature of school leadership and support, supporting materials, and out
  • 8. Instructional Program Evaluation Plan 8 of class experiences could have differentiated effects on students in achieving the goals of the curriculum. The public-private dichotomy could also be an influential factor on the teaching- learning process of SSEDC, because the monitoring of the SSEDC is a feature of public schools only. The information is important to recommend a more in-depth process evaluation. Evaluation Goal The end of July 2011 school year will make two years of implementation. Therefore, 2011-2012 is the year of mature implementation. The purpose of the evaluation is formative, to inform ways to improve SSEDC the program. The proposed plan will therefore outline a development-oriented process evaluation to examine perceived problems and recommend the way forward (Chen, 2005, Posavak & Carey, 2007). Formative evaluation is ongoing, relevant to address the purpose of the evaluation. Throughout the implementation process, the team would collect data as the program is in effect. The team will be able to identify strengths and limitations, and intermediate results during implementation, rather than waiting for the one- time outcomes evaluation (Posavak & Carey). The central question to guide the evaluation is: 1. How well is SSEDC implemented? Sub-questions: 1. Is the focus for democratic citizenship clear to the teachers? 2. What methods are teachers to prepare students? 3. What problems are teachers experiencing? The response to the questions should help in identifying the sources of problems and the role of stakeholders to improve the program. The preceding section sets the stage for the proposed evaluation of SSEDC. The selected committee will collaborate with the users and implementers at the Grades K-6 levels at private
  • 9. Instructional Program Evaluation Plan 9 and public schools. The main purpose is to investigate the implementation process to identify the strengths and weaknesses and suggest improvements. Part II: Theoretical Framework Part II is a discussion of the theoretical basis of the plan, including criteria, standards, program theory, and model. The aim is to (a) discuss how standards and stakeholders will influence the evaluation plan; (b) provide a rationale for the selected evaluation model; and (c) identify the design. The purpose of proposed process evaluation will be to examine the quality of the implementation focusing on the following criteria: 1. teachers‟ understanding of SSEDC goal 2. student-centered instruction and assessment congruent with the experiential learning, behaviorist, and constructivist theories; and 3. Social and contextual factors Standards Standards are necessary to “identify and define evaluation quality and guide evaluators” (Yarbrough, 2011, p. xxii). Attention to attributes of quality such as utility, feasibility, propriety, and accuracy promote accountability. Evaluation accountability is important to foster program improvement, and improved decision making, and create reflective practitioners. For the purpose of the proposed evaluation, the following standards could help to define the quality necessary for a successful evaluation (Yarbrough, 2011). 1. Utility a. Evaluator credibility i. Clarify that individuals will be responsible for the various elements of the evaluation
ii. Provide assurance that each has the expertise or support required to complete the work.
2. Feasibility
a. Practical procedures
i. Implement practical and responsive procedures aligned with the operation of the program.
3. Propriety
a. Human rights and respect
i. Design and conduct the evaluation to protect human and legal rights and maintain the dignity of participants and stakeholders.
4. Accuracy
a. Sound designs and analyses
i. Employ technically adequate designs and analyses appropriate for the purpose of the evaluation.

The description of the standards supports the importance of the stakeholders developing trust in the expertise of the evaluator to plan and implement appropriate procedures and designs to promote a successful and valid evaluation. Stakeholders must also feel protected and respected. The following discussion will support how the standards will influence the plan in the choice of theory, stakeholders, model, design, and human rights and respect.

Program Theory

Chen (2005) supported the view that program theory is useful in "improving the generalizability of evaluation results, contributing to social science theory, uncovering unintended effects,… achieving consensus in evaluation planning…[and providing] …early
indications of program effectiveness" (p. 15). Chen (2005) identified program theories as causative or normative. Normative stakeholder theory highlights the input of designers, directors, and staff in an organization. Normative theory differs from the scientific theory that controls evaluation conducted by academics (outsider or expert interest). The leader of the evaluation, the director, will perform the role of the internal evaluator, guiding the staff and selected users and implementers during the evaluation. The activities of the program are ongoing, and information on the process is necessary to determine strengths and weaknesses, promote improvement, and enable achievement of the goals. An external reviewer could be an expert in another government department.

Stakeholders

McCawley (2001) defined stakeholders to include a wide cross-section of actors: individuals who contribute to or benefit from the inputs, resources, and activities that result in short-, medium-, and long-term outcomes. In the primary institutions, the main beneficiaries are the students; the other important stakeholders are the principals, teachers, parents, and individuals in the community. Corporate citizens and other governmental and nongovernmental partners collaborate with schools to promote learning (Beaumier, Marchand, Simoneau, & Savard, 2000; Chen, 2005; Eaton, 2009). Yarbrough (2011) described several stakeholders generic to program evaluation. Stakeholders include evaluators, designers, implementers, participants, intended users, and other respondents. For the purpose of the SSEDC program evaluation, stakeholders include individuals from the administrative center or the Ministry of Education (MOE), other government ministries, the school community, and the wider community, as shown in Table 1.

Table 1
Stakeholders

School Personnel: Teachers; Students; Principals or administrators
Community: Parents; Others who could contribute information
Ministry: Teacher trainers; Internal evaluator; Supervisor; Education officers
Other Stakeholders: External evaluator; Government departments; Non-governmental organizations; Local specialist

Normative theory depicted by the stakeholder model would influence or prescribe the components and activities considered necessary for the success of the SSEDC program implementation and evaluation. Table 2 shows overlap of stakeholders' responsibilities in some areas; however, some individuals have roles more dominant than others. The working group format would be an important strategy to build consensus on tasks, roles, and issues affecting relationships during the process, obtaining buy-in (Chen, 2005).

Table 2
Stakeholders and their Responsibilities

Evaluators (external and internal experts): Plan, guide, and conduct the evaluation; review; decide on strategy; act as facilitator or consultant; and give technical assistance.
Designers (evaluator, supervisor, and stakeholders from the school community and wider community): Work together to plan the purpose and objectives.
Implementers (ministry personnel, specialists, evaluators, school personnel): Collaborate to manage, oversee, and ensure the quality of the evaluation; share information on how the program is implemented.
Teachers: Provide feedback and data related to their pedagogies, including teaching, learning, and assessment; the challenges, benefits, and suggestions.
Principals: Provide feedback on the program in their schools.
Evaluation participants: Provide information or data.
Others (in the community or school): Provide additional information about the program.

Teachers had input during the development and review stages. Feedback during the monitoring stage allowed them to participate and have a voice in identifying strengths and shortcomings. Problems experienced with principals' support for training and review sessions for teachers are evidence that principals needed buy-in to claim ownership and better understand the goals and objectives of SSEDC. In the mature implementation stage evaluation, the evaluator must ensure continuous communication with the users and other implementers, allowing them to share their concerns and doubts and receive clarification on the goals of the program and their roles as leaders or implementers (Yarbrough, 2011).

The Logic Model

Program theory depicted by the logic model focuses on causal assumptions: a systems approach linking program resources, activities, and intended outcomes. Table 3 is a representation of the SSEDC logic model (Cojocaru, 2009; Jason, 2008; McCawley, 2001).

Table 3
SSEDC Logic Model

Inputs (Resources): Prescriptive curriculum: guiding philosophy, sample lessons, rubrics; supporting teacher material; student text; mentors
Activities (Programs): Curriculum preparation and review; training workshops and seminars; monitoring of teachers, students, and others in the community
Outputs (Products) and Outcomes (Benefits):
- Short-term changes: knowledge, skills, attitude, awareness, motivation
- Medium-term changes: behaviors, practices
- Long-term changes: environmental and social changes

The logic model provides a framework to examine the implementation of SSEDC in the primary grades. The formative process evaluation will provide evidence on whether teachers understand their role in preparing learners to become democratic citizens. The preparation should include using the resources and applying the suggested experiential, student-centered activities or methods. Analysis of the data could reveal limitations needing attention to foster adjustments to the program. The recommendations could focus on clarifying the rationale, training and retraining teachers, and providing additional resources, all in preparation for evaluating the effectiveness or outcomes of implementing SSEDC.

Methodology

The proposed program evaluation will be a mixed method survey design, placing priority on the collection of qualitative data. Qualitative data may describe the ongoing process of the SSEDC activities and strategies in the form of words from open-ended questionnaires and observations. Quantitative data might result from closed-ended questionnaire items and observation schedules (Neuman, 2003).

Target Group Population
The target group will be teachers of Grades K-6, primary or elementary classes, at public and private schools. Public (32) and private (30) schools, located in four districts or zones, provide education for 10,801 students, taught by 807 teachers. Table 4 is a breakdown of the number of schools and teachers in the four districts/zones.

Table 4
Target Population: Number of Schools and Teachers

                          Zone 1  Zone 2  Zone 3  Zone 4  Total
Public primary schools       7       7       9       9      32
Public primary teachers                                    467
Private primary schools      7       7       8       8      30
Private primary teachers                                   340

Sampling

Units of sampling will be the schools, teachers, and principals.
1. Simple random sampling
a. Select one of each school type from each zone by putting the names in a bag and choosing 4 public and 4 private primary schools; n = 8 schools. All the teachers (and classes) in each of the eight schools are participants.
2. Purposive sampling
a. social studies supervisor
b. other participants.

Simple random sampling is useful to promote generalizability of findings to all schools because each has a chance of selection, and the sample is representative of the population (Neuman, 2003). Purposive sampling of supervisors and others is necessary to obtain information in the selected school communities.
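The simple random selection described above can be illustrated with a short script. This is a sketch only: the school names are placeholders, and the per-zone listings are generic stand-ins for the Ministry of Education's school register. The draw could equally be done by pulling names from a bag, as the plan states.

```python
import random

# Hypothetical listing of primary schools by zone and type; a real draw
# would use the Ministry of Education's actual school register.
zones = ["Zone 1", "Zone 2", "Zone 3", "Zone 4"]
schools = {
    zone: {
        "public": [f"{zone} Public School {i}" for i in range(1, 8)],
        "private": [f"{zone} Private School {i}" for i in range(1, 8)],
    }
    for zone in zones
}

def draw_sample(schools, seed=None):
    """Select one public and one private primary school per zone at random."""
    rng = random.Random(seed)
    sample = []
    for zone, by_type in schools.items():
        for school_type in ("public", "private"):
            sample.append(rng.choice(by_type[school_type]))
    return sample

# Eight schools in all: four public and four private, one of each per zone.
sample = draw_sample(schools, seed=1)
```

Because every school in a zone has an equal chance of selection, the draw preserves the representativeness that the plan relies on for generalizability.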
Human Subject Consideration

Involvement in the evaluation will be voluntary. The evaluator must ensure confidentiality, stating to stakeholders that the information collected is for the purpose of program evaluation only. All information will be secure; the evaluator will put measures in place to maintain confidentiality, for example:
1. assigning code numbers to participants;
2. reminding participants not to write names on data collection instruments; and
3. asking personnel in the unit who handle the data to keep the information confidential.
Once the evaluator convinces prospective participants that their rights are protected and respected, they will sign consent forms agreeing to take part in the evaluation.

The framework links the input of stakeholders, the activities, and the resources at different stages: planning, implementation, and evaluation. Standards that describe utility, feasibility, propriety, and accuracy define the quality against which to judge the merit of the evaluation. Various stakeholders in the school and wider communities, the government departments, and other groups should collaborate to promote buy-in, conduct an effective evaluation, and promote improvement. The SSEDC logic model is applicable because, by looking first at the possible outcomes, the stakeholders can identify factors that might influence the implementation process. The mixed method survey design is appropriate to collect ongoing qualitative and quantitative data from implementers at public and private schools, and from other participants as becomes necessary. Appropriate sampling methods will promote generalizability to the target population and seek input from suitable stakeholders. The human rights of the subjects are a major consideration.

Part III: Methodology
The proposed program evaluation of social studies education for democratic citizenship (SSEDC) will be a mixed method survey design, using various methods to collect quantitative and qualitative data but placing priority on the collection of qualitative data. Qualitative data could describe the ongoing process of the SSEDC activities and strategies in the form of words from open-ended questionnaires and observations. Quantitative or numerical data could result from closed-ended questionnaire items and observation schedules (Neuman, 2003). Triangulation of mixed data could provide valid and reliable evaluation instruments and promote understanding of (analysis) results (Grammatikopoulos, Zachopoulou, Tsangaridou, Liukkonen, & Pickup, 2008).

The presentation is a discussion of the rationale for the method and design of the proposed evaluation. The development of the rationale will include (a) an outline of the proposed data collection instruments, including how and why stakeholders will contribute to the decision-making process; (b) a discourse on the importance of putting mechanisms in place to promote validity and reliability; followed by (c) a simple plan to analyze the data. The presentation ends with a conclusion.

Mixed Method Design: A Rationale

Formative evaluation requires flexible methodology that provides quick feedback using mixed methodology (Chen, 2005). The program is in the implementation stage, and a survey would be most useful among smaller samples. The mixed method provides a comprehensive scope to promote sound program evaluation and program improvement (Jason, 2008). Grammatikopoulos et al. (2008) promoted the mixed design as a method to improve or increase the degrees of validity and reliability while criticizing the use of one data source to make decisions during an evaluation.
At the interpretation stage, the usefulness of the mixed design
is "providing a versatile and more complete picture of the procedures under evaluation" (Grammatikopoulos et al., 2008, p. 5). Qualitative or quantitative data used in isolation will not provide the same insight as using both. The voices of participants can be most convincing as they tell their stories, useful to corroborate the results of quantitative data (Creswell, 2005; Grammatikopoulos et al., 2008). Participants' comments and open responses can influence how readers accept or reject a program. The themes interpreted from such presentations could corroborate quantitative measures.

Instruments or Data Collection Methods

Process (formative) data, collected through the triangulation method, are necessary to obtain information on how the program is working and to identify the influencing factors, that is, to examine the implementation fidelity of SSEDC. Several evaluation studies (Burnstein, Hutton, & Curtis, 2006; Gallavan, 2008) identified different instrumentation approaches useful for mixed qualitative and quantitative data analyses. The examples relevant to the evaluation of SSEDC include:

Qualitative Data
1. A schedule for systematic observation (Appendix A), including comments from class observations of teaching, learning, and assessment, to obtain firsthand information on the process;
2. Teachers' responses to open-ended questionnaire items (Appendix B), in which they share freely their views of the program in process;
3. Interviews (Appendix C) with parents, principals, and teachers concerning how the curriculum is used and the impact of the context;
4. Weekly reflections written by teachers about their classroom practice and student participation, highlighting challenges, difficulties, and ease of delivery;
5. Document analyses, including analyzing the curriculum and supporting documents, such as texts and guides; and
6. Focus groups composed of students, teachers, and parents of students, to obtain additional information on how students respond to the curriculum.

Quantitative Data
1. Teachers' ratings of closed-ended, Likert-type teacher questionnaire items (Appendix A) to obtain data about the process, and principals' responses to obtain data on the context and views of the implementation.

The preceding examples show evaluators could collect in-depth data from the users in the school community, parents, community personnel, and administrators. The stakeholders or implementers are an important group (in the context) that could influence the results of the evaluation. They represent the people who benefit from or influence the processes (or outcomes), directly or indirectly (Fontaine, Haarman, & Schmid, 2006).

Validity and Reliability

Validity and reliability of research are major considerations. Researchers must examine several important aspects of instruments: the appropriateness, the measurement properties, and the process of administering and scoring (Borg & Gall, 1998). The structure and contents of the researcher-designed instruments could affect the responses provided by the participants, influencing the interpretation of data (Creswell, 2005; Neuman, 2003). Using themes evolving from the qualitative data or the literature review, dividing instruments into appropriate subsets, training observers, and testing the instruments will promote content validity and reliable results (VanTassel-Baska et al., 2008).
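As a simple illustration of how the closed-ended, Likert-type ratings could be summarized, the sketch below computes a per-item mean and the percentage of respondents choosing each rating. The item names and the ratings are invented for illustration; the real items would come from the teacher questionnaire.

```python
from collections import Counter
from statistics import mean

# Hypothetical Likert ratings (1-5) from seven teachers on two
# questionnaire items; invented data for illustration only.
responses = {
    "curriculum_quality": [4, 5, 3, 4, 4, 2, 5],
    "assessment_clarity": [3, 3, 4, 2, 3, 4, 3],
}

def summarize(item_ratings):
    """Return the mean rating and the percentage choosing each rating, per item."""
    summary = {}
    for item, ratings in item_ratings.items():
        counts = Counter(ratings)
        summary[item] = {
            "mean": round(mean(ratings), 2),
            "percent": {rating: round(100 * count / len(ratings), 1)
                        for rating, count in sorted(counts.items())},
        }
    return summary

summary = summarize(responses)
```

Tables and graphs of these means and percentages would then complement the themes drawn from the open-ended responses, as the data analysis section proposes.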
Simple random sampling may be the more appropriate method to select units and participants when evaluating SSEDC. Probability sampling promotes generalizability of findings to all schools because each has a chance of selection, and the sample is representative of the population (Neuman, 2003). Retraining teachers to deliver SSEDC and using multiple methods of data collection will promote triangulation and improve validity and reliability (Amadeo & Cepeda, 2007).

Data Analysis

The results of the evaluation will guide decision making about the program. A combination of the qualitative and quantitative data at the interpretation stage results in a deeper understanding of the issues and promotes validity and reliability. Ethical evaluators must put measures in place to ensure accuracy and consistency of results and, consequently, of the conclusions (Grammatikopoulos et al., 2008). Content analysis of qualitative data could determine themes that evolve from the comments or responses of participants. Quantitative data in the form of means and percentages could complement the data from interviews, open-ended responses, or documents. Pictorial representations could be in the form of tables and graphs (Neuman, 2003).

The mixed method design is the preferred design to guide data collection and analysis during the SSEDC evaluation. The survey approach, using a variety of data collection tools, will provide qualitative and quantitative data from multiple sources and methods. The statistical methods appropriate for making meaning of mixed data will provide greater insight into the issues. Triangulation is one component of the mixed method design that increases the degree of validity and reliability, promoting generalizability of findings to the population
(Grammatikopoulos et al., 2008). Presenting data in different forms, such as words or tables and graphs, is advantageous, supporting several means to interpret, analyze, and report findings.

Part IV: Analysis of Data

This section is a presentation of the statistical approach to mixed data analysis. Content analysis of qualitative data will determine themes that evolve from the comments or responses of participants. Quantitative data in the form of means and percentages will complement the data from interviews, open-ended responses, or documents. Pictorial representations of data will highlight the possible use of tables and graphs (Neuman, 2003). Triangulation and integration of multiple data and sources could reveal possible findings and conclusions about the implementation of SSEDC. Finally, the recommendations will be presented to develop an improvement plan.

Methodology

The procedures to analyze qualitative data will have two phases; the steps include:
Phase 1
(i) transcribing data collected from observations, interviews, focus groups, and document analysis;
(ii) perusing the information to identify themes;
(iii) coding words and phrases using an iterative approach to allow additional phrases applicable to the analysis;
(iv) coding for frequency;
(v) formulating categories and generalizing based on the similarity of words or phrases;
(vi) coding relationships between words;
Phase 2
(vii) applying themes based on theory in order to explain the data and answer the research questions; and
(viii) presenting data in the form of narrative, tables, and graphs.

Content analysis procedures promote the discovery of themes, patterns, and characteristics from the data. The procedures include transcribing, perusing, and extracting words and concepts in order to apply coding to identify the themes. In phase two, applying theory from past research can also support the development of themes and findings. Numerical percentages and means of words and themes could result from the qualitative analysis and be presented graphically. The steps of the qualitative analysis could be performed manually or by using qualitative analysis software.

Quantitative

Quantitative data will derive from ratings on the Likert-type questionnaire and observation schedule and from the frequency of themes and subthemes in the qualitative data. Presentation of data will be in the form of means and percentages represented in graphs and tables. Data from the teacher observation schedule (Appendix A) will demonstrate teachers' competence in planning, teaching, and assessment by the frequency of Yes or No selected by the observer. Data from the teacher questionnaire will represent teachers' perceptions of the quality of the curriculum, the strategies, and the assessments most frequently used. Tracking the concepts stated most frequently would help to identify the patterns that emerge. Specific statements written or spoken by the participants will give a voice to the issues in relation to the research questions. Further analysis will show the most frequently used approaches to teaching and learning.

The central question to guide the evaluation is:
1. How well is SSEDC implemented?
Sub-questions:
1. Is the focus for democratic citizenship clear to the teachers?
2. What methods are teachers using to prepare students?
3. What problems are teachers experiencing?

Possible Findings

The responses of the participants from multiple sources should help in identifying the sources of problems and the role of stakeholders in improving the program. The findings would give ideas to improve the program as necessary. The data collected using the observation schedules, questionnaires, interviews, focus groups, weekly reflections, and document analyses will be the basis of the following discussion. The possible findings will reflect the quality of the implementation, focusing on the stated criteria and resulting in themes such as (a) social studies goals; (b) teacher-centered and student-centered teaching, learning, and assessment strategies; and (c) difficulties and challenges. The research questions will be the basis for the discussion of the possible findings of how well teachers are implementing SSEDC.

Sub-question 1. Is the focus of democratic citizenship clear to the teachers?

The designers expected the articulation of the goal of SSEDC in the program document, during training, and through the monitoring by supervisors to help teachers demonstrate a clear understanding of the goal of SSEDC. Griffith and Barth (2006) explained that teachers might voice the social studies (SSEDC) goal from various perspectives:
1. disciplines or content necessary for nation building;
2. transmission of knowledge about what a citizen should know or do to be productive; and
3. active engagement to develop competencies to function effectively as a productive citizen.

The extent to which teachers can explain the importance of SSEDC in the national curriculum will be a reflection of whether the goal of SSEDC, to develop the competence to function effectively as a productive citizen, is clear or not.

Sub-question 2. What methods are teachers using to plan, teach, and assess SSEDC?

Comments written by the observers would indicate the extent to which teachers are applying the teaching, learning, and assessment strategies outlined in the program, or other strategies applicable to meet the needs of the students. The SSEDC teachers' guide (Ministry of Education, 2009) provides examples to engage students in various ways. For example, students "can engage in research projects, cooperative group work, drama, and role-play, discussions, and community service learning" (p. 3). Experiential learning is the main philosophical basis of the curriculum, complemented by the behaviorist and constructivist theories. The purpose of the experiential approach is "to increase knowledge, develop skills and clarify values" (Roberts, 2005, p. 13). Further, the goal of SSEDC is to promote democratic citizenship competencies, subsumed under the experiential approach. It follows, therefore, that teachers need to have a clear focus on this goal.

The delivery of the lesson plan should build on students' experience, reflection, concept development, and active experimentation on previous experience, incorporating direct and indirect instruction (Hunt et al., 2009). Teachers should provide opportunities for students to engage in traditional paper-and-pencil tests and performance-based assessment. Service learning, coined in the early 1960s, is becoming an important component of experiential learning. Community service learning (CSL)
is a summative approach to link teaching, learning, and assessment. Schneller (2008) noted service learning as an offshoot of Dewey's theory of experience, describing the strategy as "pedagogy, curriculum, activities and programmes that embrace organized, hands-on community service and volunteerism to enhance student learning and the schooling experience" (p. 294). This aspect of experiential learning culminates a period of learning, giving the learners an opportunity to demonstrate the transfer of learning competencies in similar or new situations in the school environment and in the community.

The documents will include the program document and teachers' lesson plans with their reflections. The documents could bear evidence of the teaching, learning, and assessment strategies that teachers use in planning and delivering lessons. Numerous instructional strategies are available for use in the classroom (see Teacher questionnaire, Appendix B). Planning is important since "without a careful plan for presenting content, [students'] experience may be akin to a jigsaw puzzle" (Gunter, Estes, & Schwab, 2003, p. 39). Planning the procedures portion of lesson plans requires teachers to select appropriate strategies to meet the identified needs, interests, motivations, and dispositions of learners. Teachers should consider learner characteristics and learning styles when choosing an instructional strategy (Hunt, Wiseman, & Touzel, 2009).

Sub-question 3. What problems are teachers experiencing?

The questionnaire responses should give further insight into the delivery of the curriculum. Although the curriculum is prescriptive, some variables might still affect the implementation process, including the allotted periods, the scope of the topics, the available resources, the appropriateness of the content for the prescribed grade level, and the learning environment.
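The identification of problems could draw on a simple frequency count of the challenge themes coded from teachers' open-ended responses and weekly reflections, showing which difficulties occur most and least often. The theme labels below are hypothetical codes an evaluator might assign during content analysis, not actual findings.

```python
from collections import Counter

# Each inner list holds the challenge themes coded from one teacher's
# open-ended response; the labels are invented examples.
coded_responses = [
    ["insufficient_time", "scarce_resources"],
    ["scarce_resources"],
    ["insufficient_time", "content_too_broad"],
    ["insufficient_time"],
]

# Tally how often each coded theme appears across all responses.
theme_counts = Counter(theme for response in coded_responses for theme in response)

# Rank themes from most to least frequently reported.
ranked = theme_counts.most_common()
```

The ranked frequencies could then be presented as percentages in a table or graph, complementing the participants' own words when reporting the difficulties teachers are experiencing.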
The evaluation could highlight the difficulty and ease with which teachers were able to implement the objectives and
content of the program. The analysis should demonstrate the teaching, learning, and assessment strategies used most to achieve the goals, and those used less or not at all. Interview data (including focus group data) from teachers and parents could support the possible findings above and support improvement of the SSEDC program. Difficulties and challenges teachers experience may result in gaps that influence the achievement of the SSEDC program goal and objectives. These would be evident in teachers' knowledge of the SSEDC goal and in the application of student-centered experiential activities and traditional and performance-based assessment strategies.

The preceding analysis, including the identification of difficulties teachers are experiencing and their suggestions for improvement, will provide the evidence to design and implement the improvement plan. This evidence should be the basis on which to recommend strategies to improve the following:
1. teachers' understanding of the goal of SSEDC;
2. assistance from personnel in the curriculum development unit, including the director and supervisors;
3. teacher preparation procedures for planning, delivery, and assessment of SSEDC; and
4. involvement of teachers in further revision of the SSEDC program.

Reporting

Collaboration is important, and communication even more so, especially for the evaluator to communicate to the implementers the goal, theory, and procedures of the evaluation and to share and discuss findings (Jason, 2008). Reporting of evaluation findings requires the development of a communication plan outlining a schedule of presentations. A draft report to the heads of the Ministry of Education, such as the Minister and the Director of Education, will clarify misconceptions or provoke responses to any conflicting results (Llosa &
Slayton, 2009). The evaluator should add a personal touch by making a live presentation, using supporting aids to clarify points and keep the interest of the other stakeholders (Posavac & Carey, 2007). Llosa and Slayton (2009) emphasized that "…findings be communicated appropriately and convincingly to stakeholders, so the recommendations would be considered and not dismissed" (p. 37). Program evaluation is iterative, ongoing, and cyclical to achieve different purposes, contributing to communication, collaboration, and learning in an organization (Jason, 2008).

The main goal of SSEDC is to foster democratic citizenship competencies, practices, and social change. The causal relationship between resources, activities, and outputs should influence the change observed in the users over a period. This proposed evaluation plan was a formative process evaluation of social studies education for democratic citizenship (SSEDC). The purpose was to examine how teachers were implementing SSEDC at Grades K-6 at private and public primary schools. The participants of interest were (a) teachers at the randomly selected schools, four of each type, and (b) the supervisor and other influential stakeholders. The mixed method design supported the collection and analysis of qualitative and quantitative data, including questionnaires, observations, and interviews. Content analysis and descriptive analysis were the applied statistical approaches to identify possible findings. The evidence supported negative and positive findings concerning SSEDC goals; teaching, learning, and assessment strategies; and difficulties experienced during implementation. These findings, the consequent conclusions, and participants' suggestions form the basis for the recommendation of improvement strategies. The strategies included stakeholder involvement, continuous training, and accountability measures.
Reporting evaluation findings required effective planning, collaboration, communication, and
presentation strategies to ensure the client or stakeholders see merit in the evaluation as stated in the purpose.

References
Amadeo, J., & Cepeda, A. (2007). National policies on education for democratic citizenship in the Americas: Draft analytic report. Washington, DC: Inter-American Program on Education for Democratic Values and Practices, Organization of American States (OAS).
Beaumier, J-P., Marchand, C., Simoneau, C., & Savard, D. (2000). The institutional evaluation guide. Commission D'évaluation de L'enseignement Collegial at its 103rd meeting in Québec City.
Borg, W. R., & Gall, M. D. (1998). Educational research: An introduction (5th ed.). New York: Longman.
Burnstein, J. H., Hutton, L. A., & Curtis, R. (2006). The state of elementary social studies teaching in one urban district. Journal of Social Studies Research, 30(1), 15-20.
Chen, H. (2005). Practical program evaluation: Assessing and improving planning, implementation, and effectiveness. Thousand Oaks, CA: Sage.
Cojocaru, S. (2009). Clarifying the theory-based evaluation. Review of Research and Social Intervention, 26, 76-86. Retrieved from http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1448390
Creswell, J. (2005). Educational research: Planning, conducting, and evaluating quantitative and qualitative research (2nd ed.). New Jersey: Pearson, Merrill Prentice Hall.
Eaton, J. S. (2009). Accreditation in the United States. New Directions for Higher Education, (145), 79-86.
Fontaine, C., Haarman, A., & Schmid, S. (2006). Stakeholder theory. Retrieved from http://www.edalys.fr/documents/Stakeholders%20theory.pdf
Gallavan, N. P. (2008). Examining teacher candidates' views on teaching world citizenship. Social Studies, 99(6), 249-254.
Gard, C. L., Flannigan, P. N., & Cluskey, M. (2004). Program evaluation: An ongoing systematic process. Nursing Education Perspectives, 25(4), 176-179.
Grammatikopoulos, V., Zachopoulou, E., Tsangaridou, N., Liukkonen, J., & Pickup, I. (2008). Applying a mixed method design to evaluate training seminars within an early childhood education project. Evaluation & Research in Education, 21(1), 4-17.
Gunter, M. A., Estes, T. H., & Schwab, J. (2003). Instruction: A models approach (4th ed.). Boston: Pearson Education.
Hunt, G. H., Wiseman, D. G., & Touzel, T. J. (2009). Effective teaching: Preparation and implementation (4th ed.). Illinois: Charles C. Thomas Publisher, Ltd.
Jason, M. H. (2008). Evaluating programs to increase student achievement (2nd ed.). Thousand Oaks, CA: Sage.
Llosa, L., & Slayton, J. (2009). Using program evaluation to inform and improve the education of young English language learners in U.S. schools. Language Teaching Research, 13(1), 35-54.
McCawley, F. (2001). The logic model for program planning and evaluation. Retrieved from http://www.uiweb.uidaho.edu/extension/LogicModel.pdf
Ministry of Education. (2009). Social studies teachers' guide: Social studies education for democratic citizenship. St. John's, Antigua & Barbuda: Curriculum Development Unit.
Ministry of Education. (2009). Draft national curriculum policy framework. St. John's, Antigua & Barbuda: Curriculum Development Unit.
Neuman, L. W. (2003). Social research methods: Qualitative and quantitative approaches (5th ed.). Boston: Allyn & Bacon.
Posavac, E. J., & Carey, R. G. (2007). Program evaluation: Methods and case studies (7th ed.). Upper Saddle River, NJ: Pearson/Prentice Hall.
Roberts, J. W. (2005). Disney, Dewey, and the death of experience in education. Education and Culture, 21(2), 12-30.
VanTassel-Baska, J., Feng, A., MacFarlane, B., Heng, M., Wong, M. L., Quek, C. G., & Khong, B. C. (2008). A cross-cultural study of teachers' instructional practices in Singapore and the United States. Journal for the Education of the Gifted, 31(3), 338-363.
Yarbrough, D. B., Shulha, L. M., Hopson, R. K., & Caruthers, F. L. (2011). The program evaluation standards: A guide for evaluators and evaluation users (3rd ed.). Thousand Oaks, CA: Sage.
Appendix A

Classroom Observation Schedule

Date: _______________________  Grade: _______________________
School: _____________________  Teacher: _____________________
Topic: ______________________

A. Tick the appropriate column.
                                                                          YES    NO
1. Teacher is using the Social Studies Curriculum.                        ____   ____
2. Teacher has a completed lesson plan.                                   ____   ____
3. Lesson plan incorporates a variety of teaching-learning activities.    ____   ____
4. Activities are in keeping with those suggested in the curriculum.      ____   ____
5. Teacher seems comfortable with activities suggested in the curriculum. ____   ____
6. Materials are appropriate to the lesson.                               ____   ____
7. Materials are used appropriately during the lesson.                    ____   ____
8. Students respond positively to lesson activities.                      ____   ____
9. Students are active participants in the lesson.                        ____   ____
10. Activities are planned to cater for students' individual differences. ____   ____
11. Lesson objectives are achieved.                                       ____   ____
12. Students are assessed in a variety of ways.                           ____   ____
13. Methods of assessment are clear and appropriate.                      ____   ____

B. Elaborate or comment on any of your observations. Suggest support that could help the teacher improve.
_____________________________________________________________________________________
_____________________________________________________________________________________
_____________________________________________________________________________________
_____________________________________________________________________________________
Appendix B

Teacher Questionnaire

Teacher: ________________________________________________________________
School: _________________________________________________________________
Class: ____________________  Term: ____________________

Section A
i. How often do you use the social studies education for democratic citizenship (SSEDC) instructional guide? Always ____ Sometimes ____ Never ____
ii. Lessons contain realistic teaching time frames. Yes ____ No ____
iii. Number of teaching lessons/activities: Sufficient ____ Insufficient ____
iv. Number of available resources listed: Sufficient ____ Insufficient ____
v. Content for the level of teaching: Appropriate ____ Inappropriate ____

Section B
1. What objectives did you cover this term? [Use unit and objective numbers.]
2. What content was difficult to teach?
3. What content was easy to teach?

Section C: Strategies/Methods
1. Which teaching-learning strategies or activities do you use?
Research ____  Grouping ____  Peer teaching ____  Investigation ____
Simulations ____  Role play ____  Dramatization ____  Community service learning ____
Lecture ____  Reading textbook ____  Project ____  Poster ____
Chart ____  Poem/song ____  Displays ____  Exhibitions ____
Questioning ____  Field trip ____  Journal ____  Discussion ____
Vocabulary development ____  Presentation ____  Notes ____  Class work ____
2. Which assessment methods do you use?
Journals ____  Investigations & projects ____  Observation ____  Oral assessment ____
Pencil & paper tests/exercises ____  Worksheets ____  Practical/performance assessment ____
Portfolio assessment ____  Peer assessment ____  Questionnaires ____
Community service learning ____

Section D
Respond to the following:
1. Describe TWO main difficulties you encounter in using the curriculum/program guide.
2. State THREE suggestions for improving the curriculum.
3. Explain the importance of SSEDC in the national curriculum.
4. Other comments. [E.g., your feelings, your practice, and students' responses.]
Appendix C

Interview

1. How can the Curriculum Development Unit assist with the teaching of SSEDC?
2. Is the focus of democratic citizenship clear to the teachers?
3. Is the goal of democratic citizenship clearly outlined in the guide?
4. How helpful is the interaction with supervisors?
5. What comments do you have about the preparation of teachers for planning and delivering SSEDC?
6. What comments do you have about your involvement in the development of the program?