This document discusses concepts, needs, goals and tools related to program evaluation. It defines evaluation as a systematic process to determine the merit, worth and significance of a program or intervention using set standards and criteria. The primary purposes of evaluation are to gain insight and enable reflection to identify future changes. Some key goals of program evaluation include improving program design, assessing progress towards goals, and determining effectiveness and efficiency. Common tools for program evaluation discussed include interviews, observations, questionnaires, and case studies.
Concept, need, goals and tools
Introduction
Evaluation is a systematic determination of a subject's merit, worth and
significance, using criteria governed by a set of standards. It can assist an organization,
program, project or any other intervention or initiative to assess any aim, realisable
concept/proposal, or any alternative, to help in decision-making; or to ascertain the degree of
achievement or value in regard to the aim and objectives and results of any such action that
has been completed. The primary purpose of evaluation, in addition to gaining insight into
prior or existing initiatives, is to enable reflection and assist in the identification of future
change.
Meaning of evaluation:
It is a process of information gathering, information processing, judgement forming and decision making. It is a process of finding out the present status of the skills, knowledge and functioning of an individual or group, which can be used purposefully in the future.
Evaluation is an integral component of all systems of education at every stage. It is what enables educators, teachers, administrators, policy makers and the community to have an idea of what is missing and what is available.
Definition of evaluation:
According to Anthony J. Nitko (1986), “Evaluation involves judging the value or worth of a pupil, of an instructional method, or of an educational programme.”
According to Prof. N.M. Dandekar (1971), evaluation may be defined as a “systematic process of determining the extent to which educational objectives are achieved by pupils.”
Programme evaluation
Program evaluation is a systematic method for collecting, analyzing, and using
information to answer questions about projects, policies and programs, particularly about
their effectiveness and efficiency. Program evaluations can involve both quantitative and
qualitative methods of social research. People who do program evaluation come from many
different backgrounds, such as sociology, psychology, economics, social work, and public
policy. Some graduate schools also have specific training programs for program evaluation.
Definition of programme evaluation:
Programme evaluation is the systematic application of scientific methods to assess the design, implementation, improvement or outcomes of a program (Rossi & Freeman, 1993; Short, Hennessy, & Campbell, 1996).
Program evaluation is “…the systematic assessment of the operation and/or outcomes of a program or policy, compared to a set of explicit or implicit standards, as a means of contributing to the improvement of the program or policy…”
In everyday language, program evaluation is a systematic method for collecting, analysing, and using information to answer questions about projects, policies and programs, particularly about their effectiveness and efficiency (Wikipedia).
Need of programme evaluation:
- Program evaluation is a valuable tool for program managers who are seeking to strengthen the quality of their programs and improve outcomes for the children and youth they serve.
- Program evaluation answers basic questions about a program's effectiveness, and evaluation data can be used to improve program services.
- It improves program design and implementation: it is important to periodically assess and adapt your activities to ensure they are as effective as they can be.
- Evaluation can help you identify areas for improvement and ultimately help you realize your goals more efficiently.
- Program evaluation is critical to assessing progress and maintaining alignment with your organization's mission and community needs.
- Program evaluations can also provide input for future program plans.
- Evaluation determines whether the program is efficient (using resources wisely to perform the needed work), effective (meeting the performance measures or objectives set), and implemented as stated.
- The efficiency and effectiveness of a program can help make decisions, fix accountability problems, and aid in planning.
- It can also improve operations, the reallocation of resources and contract monitoring (Lane, 1999).

To understand this importance, knowledge of the components of program evaluation is required.
Goals of programme evaluation
1. The information should support current decisions about the product or program.
2. As much information as possible should be collected and analyzed in a low-cost and practical manner, e.g., using questionnaires, surveys and checklists.
3. The information should be accurate (reference the table above for the disadvantages of each method).
4. The methods should capture all of the needed information.
5. Additional methods should and could be used if additional information is needed.
6. The information should appear credible to decision makers, e.g., funders or top management.
7. The methods should suit the nature of the audience, e.g., will they fill out questionnaires carefully, engage in interviews or focus groups, or let you examine their documentation?
8. The evaluator should be able to administer the methods now, or training should be arranged.
9. It should be possible to analyse the information.
Tools of programme evaluation
1. End-of-term course evaluation form completed by students in the course
2. Reflective memo, completed by undergraduate instructors for each course taught,
discussed with the Associate Department Head in charge of teaching improvement
3. Interview questions asked of groups of senior students just prior to program completion
Conducting a programme evaluation typically involves five steps:
1. Define the key questions: Clearly define what questions the evaluation will be
designed to answer. Will program participants be compared to a control group of
nonparticipants, or will two different program model variations be compared to each
other? Interviewing and selecting a third-party evaluator (e.g., university researchers,
individual experts, or firms such as MDRC) can help raise and clarify key questions
for the evaluation to answer.
2. Design the evaluation: Together with the evaluator, design a rigorous study that will
answer your key questions as efficiently and affordably as possible. Different
questions and program models lend themselves to different evaluation methods (e.g.,
randomly assigning participants to different groups, or doing pre/post comparisons).
Longer study duration and larger sample sizes will allow higher levels of confidence
in the results, but also increase the expense of the study.
3. Conduct the study: Conduct the evaluation according to the design. The evaluator
may collect and track all necessary data during the study period, or the non-profit’s
internal data systems and staff may be part of the process.
4. Analyze the results: Analyze the data to answer the key questions and reveal any
additional key insights about the program that may emerge from the evaluation
process. If the program evaluation showed high levels of effectiveness and impact,
seek ways to build upon this success (e.g., strengthening or expanding the program,
publicizing results to seek additional funding). If the results were unclear or negative,
discuss potential causes and remedies (e.g., evaluation design changes, program
model changes).
5. Improve: Begin implementing changes to strengthen the program and the non-profit
as a whole.
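The pre/post comparison mentioned in step 2 can be made concrete with a short Python sketch. The scores below are invented for illustration only (they do not come from any real evaluation), and the simple paired-difference effect size is just one of several ways an evaluator might summarize the results:

```python
from statistics import mean, stdev

# Hypothetical pre- and post-program scores for the same eight participants
# (illustrative numbers only, not real data)
pre = [62, 55, 70, 58, 64, 61, 59, 66]
post = [68, 60, 75, 57, 70, 67, 63, 71]

# Paired differences: how much each participant changed over the program
diffs = [after - before for before, after in zip(pre, post)]

avg_gain = mean(diffs)                 # average improvement per participant
effect_size = avg_gain / stdev(diffs)  # rough standardized effect (paired d)

print(f"Average gain: {avg_gain:.2f}")
print(f"Effect size: {effect_size:.2f}")
```

A larger sample size, as noted above, would narrow the uncertainty around these figures; a control group or random assignment would be needed to attribute the gain to the program itself rather than to outside factors.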
There are other program evaluation tools, such as:
1. Interview
2. Observation
3. Questionnaire
4. Case study
1. Interview
An interview is a conversation where questions are asked and answers are given. In
common parlance, the word "interview" refers to a one-on-one conversation with one
person acting in the role of the interviewer and the other in the role of the interviewee.
The interviewer asks questions and the interviewee responds, with the participants
taking turns. Interviews usually involve a transfer of information from interviewee to
interviewer, which is usually the primary purpose of the interview, although
information transfers can happen in both directions simultaneously. One can contrast
an interview which involves bi-directional communication with a one-way flow of
information, such as a speech or oration.
2. Observation
Observation is a tool for viewing and recording the actions and behaviors of
participants. It is a systematic method, meaning the observation techniques follow
sensible, replicable procedures so that the research can be reproduced.
Here are some different types of observation methods that can be used
to observe a child:
Anecdotal Records. This observation is usually recorded after the event has occurred
and written in the past tense.
Running Records.
Learning Stories.
Jottings.
Sociograms.
Time Samples.
Event Samples.
Photographs.
3. Questionnaire
A questionnaire is a research instrument consisting of a series of questions (or other
types of prompts) for the purpose of gathering information from respondents. The
questionnaire was invented by the Statistical Society of London in 1838.
A questionnaire consists of a number of questions that the respondent has to answer
in a set format. A distinction is made between open-ended and closed-ended
questions. An open-ended question asks the respondent to formulate their own answer,
whereas a closed-ended question has the respondent pick an answer from a given
number of options. The response options for a closed-ended question should be
exhaustive and mutually exclusive. Four types of response scales for closed-ended
questions are distinguished:
Dichotomous, where the respondent has two options
Nominal-polytomous, where the respondent has more than two unordered options
Ordinal-polytomous, where the respondent has more than two ordered options
(Bounded) continuous, where the respondent is presented with a continuous scale
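The four response-scale types above can be illustrated with a small sketch. The question wordings and option lists below are invented examples for demonstration, not items from any standard instrument:

```python
# One illustrative closed-ended question per response-scale type
questions = {
    "dichotomous": {
        "text": "Did you complete the program?",
        "options": ["Yes", "No"],               # exactly two options
    },
    "nominal_polytomous": {
        "text": "Which campus did you attend?",
        "options": ["North", "South", "East"],  # more than two, unordered
    },
    "ordinal_polytomous": {
        "text": "How satisfied were you?",
        "options": ["Low", "Medium", "High"],   # more than two, ordered
    },
    "bounded_continuous": {
        "text": "Rate the program from 0 to 10.",
        "options": (0.0, 10.0),                 # endpoints of a continuous scale
    },
}

def is_valid(scale: str, answer) -> bool:
    """A closed-ended answer must come from the fixed, exhaustive option set."""
    opts = questions[scale]["options"]
    if scale == "bounded_continuous":
        low, high = opts
        return low <= answer <= high
    return answer in opts

print(is_valid("dichotomous", "Yes"))          # True
print(is_valid("bounded_continuous", 7.5))     # True
print(is_valid("ordinal_polytomous", "Huge"))  # False: not in the option set
```

Checking answers against the option set in this way enforces the rule stated above: the response options for a closed-ended question should be exhaustive and mutually exclusive.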
4. Case study
A case study is a research method involving an up-close, in-depth, and detailed
examination of a subject of study (the case), as well as its related contextual
conditions.
Case studies can be produced by following a formal research method. These case
studies are likely to appear in formal research venues, such as journals and
professional conferences, rather than in popular works. The resulting body of 'case study research'
has long had a prominent place in many disciplines and professions, ranging from
psychology, anthropology, sociology, and political science to education, clinical
science, social work, and administrative science.
Conclusion:
Evaluation is a process of information gathering, information processing, judgement forming,
and decision making. Program evaluation is a systematic method for collecting, analysing,
and using information to answer questions about projects; it answers basic questions about a
program's effectiveness. There are several tools for program evaluation, such as observation,
case study, and interview. Observation is a tool for viewing and recording the actions and
behaviors of participants. A case study is a research method involving an up-close, in-depth,
and detailed examination of a subject of study. An interview is a one-on-one conversation with
one person acting in the role of the interviewer and the other in the role of the interviewee.
References:
1. http://www.deakin.edu.au/data/assets/pdffile/0005/268511/programme-evaluation.pdf
2. https://www.dickinson.edu/download/downloads/id/3583/writingsamples
3. https://en.wikipedia.org/wiki/Program_evaluation
4. http://www.mrc-ewl.cam.ac.uk/communications/evaluation-tools/
5. https://www.opm.gov/wiki/uploads/docs/Wiki/OPM/training/Program%20Evaluation%20Beginners%20Guide.pdf