Plan Evaluation & Implementation
ME 201 Strategic Management of Engineering Enterprise

Presentation Transcript

• HENRY JOHN N. NUEVA
  ME 201: Strategic Management of Engineering Enterprise
  Master in Management Engineering
  Pangasinan State University
• At the end of this lecture and presentation, we will be able to:
  - Verify the importance of Plan Evaluation and its concept as applied to every project proposal;
  - Be aware of the procedures and composition of the Evaluation Study Committee;
  - Be knowledgeable in identifying activities as applied in the conduct of Plan Evaluation.
• Topics covered:
  - The Concept of Evaluation
  - The Plan Evaluation
  - Primary Purpose
  - Reasons for neglectful conduct
  - Plan, Program & Project Evaluation
  - Organization of the Evaluation Committee
  - Preparing the Evaluation Proposal
  - Implementing the Evaluation
  - Data Gathering and Processing
  - Presentation and Analysis of Data
  - Findings & Conclusions
  - Plan Update
• Evaluation, as applied to a project proposal and planning management, is described as the "PROCESS OF ANALYZING PROJECT INPUTS, TRANSFORMATION TECHNIQUES AND THE EFFECT & IMPACT OF OUTPUTS AGAINST DEFINITE STATED GOALS AND OBJECTIVES." Hence, evaluation in its simplest terms is defined as the "SYSTEMATIC DETERMINATION OF WORTH AND SIGNIFICANCE OF SOMETHING USING CRITERIA AGAINST A SET OF STANDARDS."
• A quality determination of Strategic Planning. For the last twenty-five years, why has Plan Evaluation been so grossly disregarded by planners and managers? Three causes of neglectful conduct:
  1. Planners' and managers' general opinion is that their main task is simply to put the project in place, with the hope that expected results will follow;
  2. They are reluctant to have their projects evaluated by an outsourced group, because their motivation, integrity, and competence would be placed under scrutiny;
  3. They presume that evaluation has no practical value, because whatever results become known would not be put to effective use in any way.
• CONTROLLING BUDGET - During the planning process, the production manager is forced to create a production plan that fits the budget set by the company. The budget should be evaluated and assessed each time a decision is made in the planning stages, since each decision may cost money, whether in labor costs or equipment fees. Examining the changing budget in the planning stages helps the production manager stay in control of spending.
• ADDRESS RISKS UPFRONT - Another important reason evaluation and assessment should be done during any planning stage is the set of risks associated with a given project, such as a lack of operational equipment, sick or absent employees, or an inflexible budget. Each risk the production manager faces should have a corresponding set of solutions, so that the risks are prevented upfront.
• TIME FRAME - Without a steady and solid plan for a project, the timeline can be extremely flexible; it may have no deadline or monthly goals if a plan or schedule has not been created. Once a deadline has been set by the board of executives or the business owner, the production manager must evaluate the tasks to determine whether the project can be completed within the time frame provided, and assess them to ensure the schedule is realistic.
• QUALITY CONTROL - Once the planning is complete, the manager must go back and assess the schedule in terms of the quality that can be produced in the given time. If the schedule is too packed, it may affect the quality of the production. The assessment and evaluation of the planning process is important to ensure the quality of the product, as the manager will be held responsible if it is not satisfactory.
• PRIMARY PURPOSE: "To determine the quality of a program by formulating a judgment."
• Two types of evaluation: FORMATIVE EVALUATIONS and SUMMATIVE EVALUATIONS (each discussed below).
• FORMATIVE EVALUATIONS - An evaluation that is performed during the development or implementation of a project to help developers modify or improve the project. Example: formative evaluation includes mid-term surveys and focus groups asking students about a newly introduced technology. During a class, an evaluator gave the students a brief survey and engaged them in a discussion on specific topics. The results from this research led to an increased awareness of some of the problems students were having, and suggested some short-term changes in how the programs were implemented as well as some larger, longer-term changes in the way the courses were designed.
• SUMMATIVE EVALUATIONS - An evaluation that is performed near, at, or after the end of a project or a major section of a project. Examples of summative evaluations range from relatively simple and common IDQ (Instructor Designed Questionnaire) results to a careful study comparing two sections of the same class taught with different methods of instruction. Careful planning is key to successful summative evaluations, because determining the possible outcomes and developing criteria need to occur well in advance of the completion of the project to be evaluated. ATL can provide support for planning and implementing this type of evaluation.
• "Plan Evaluation, as a component of the strategic plan, provides the key to making strategic planning a cyclical and continuous process" (on an annual project basis).
• [Timeline diagram: annual project/program timeline from Q1 to Q4, with a Mid-Period Evaluation marked at mid-year and a Year-End point.] In reference to this model, the results of the Mid-Period Evaluation have the specific program or project as their frame of reference.
• [Timeline diagram: annual project/program timeline from Q1 to Q4, with the Final Evaluation marked at year end.] In reference to this model, the results of the Final Plan Evaluation partly tell whether the mission and vision of the plan are achieved or not. Accomplishment reports are then integrated and consolidated by the planner in reference to the set objectives.
• The Final Plan Evaluation:
  - identifies conclusively whether program/project objectives are adequately and responsively attained or not;
  - conclusively resolves whether the plan's mission and vision are realized or not.
• What is the use of, and what will happen to, the results of outputs and outcomes in terms of effects and impacts?
  1. Planners can use these results for research and study, since they are eventually recycled.
  2. They may be used as feedback, or as an input to the planning process.
• Since a medium- or long-term strategic development plan requires periodic evaluation, an evaluation committee is highly recommended, tasked to perform in-depth reviews of selected evaluation issues, strategies, and methodologies. The Evaluation Committee also discusses selected evaluation reports in order to suggest evaluations of particular interest for the annual work program. It is also suggested that the committee be composed of individuals with a multi-disciplinary orientation, or of experts in fields parallel to the project in focus.
• OVERVIEW OF THE EVALUATION.
  1. All experts are briefed orally or in writing before the evaluation, in order to inform them of the general evaluation guidelines and the objectives of the research area under consideration.
  2. Each proposal is evaluated against the applicable criteria independently by experts, who fill in individual evaluation forms giving marks and providing comments.
  3. For each proposal a consensus report is prepared. The report faithfully reflects the views of the independent experts referred to in Step 2.
  4. A panel discussion may be convened, if necessary, to examine and compare the consensus reports and marks in a given area, to review the proposals with respect to each other, and to make recommendations on a priority order and/or on possible clustering or combination of proposals.
• THE EVALUATION CRITERIA.
  - In all circumstances, proposals are evaluated against the criteria for the instrument under which they are submitted. In clear-cut cases a proposal may be ruled out of scope by the Commission without referring it to experts.
  - Any proposal for an indirect action which contravenes fundamental ethical principles, or which does not fulfil the conditions set out in the call, shall not be selected and may be excluded from the evaluation and selection procedure at any time.
  - Any particular interpretations of the criteria to be used for evaluation are set out in the work programme, in particular the way in which they translate into the issues to be examined.
• PROPOSAL MARKING.
  - Evaluators examine the individual issues comprising each block of evaluation criteria and in general mark the blocks on a six-point scale from 0 to 5 (or any other marking scheme). An example of score markings is as follows:
    0 - the proposal fails to address the issue under examination, or cannot be judged against the criterion due to missing or incomplete information
    1 - poor
    2 - fair
    3 - good
    4 - very good
    5 - excellent
  - Where appropriate, half marks may be given. If appropriate, evaluators may also be asked to give a mark to each of the individual issues comprising the blocks of criteria. Only the marks for the blocks of criteria are taken into account (after applying any weightings) in the overall score for the proposal (see the sketch below).
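
As a minimal sketch of the weighted block-scoring rule just described (the block names, marks, and weights below are invented for illustration; the slide prescribes no implementation), the overall score could be computed as a weighted average:

```python
# Hypothetical sketch of weighted block scoring; block names and weights
# are illustrative assumptions, not part of the original procedure.

def overall_score(block_marks: dict, weights: dict) -> float:
    """Weighted average of block marks; half marks (e.g., 3.5) are allowed."""
    total_weight = sum(weights.values())
    return sum(block_marks[b] * weights[b] for b in block_marks) / total_weight

marks = {"relevance": 4.0, "methodology": 3.5, "impact": 4.5}   # 0-5 scale
weights = {"relevance": 1.0, "methodology": 2.0, "impact": 1.0}
print(round(overall_score(marks, weights), 2))  # 3.88
```
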
• THRESHOLDS AND WEIGHTINGS.
  - Thresholds may be set for some or all of the blocks of criteria, such that any proposal failing to achieve the threshold marks will be rejected. The thresholds to be applied to each block of criteria, as well as any overall threshold, are set out in the call. If a proposal fails to achieve the threshold for a block of criteria, the evaluation of the proposal may be stopped; the reasons will be detailed in the consensus report. It may be decided to divide the evaluation into several steps, with the possibility of different experts examining different aspects. Where the evaluation is carried out in several successive steps, any proposal failing a threshold mark may not progress to the next step; such proposals may immediately be categorised as rejected (a sketch follows this list).
  - According to the specific nature of the instruments and the call, it may be decided to weight the blocks of criteria. The weightings to be applied to each block of criteria are set out in the call.
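
Continuing the same invented example, the threshold rule reads as a gate applied block by block before the overall score is considered; all threshold values here are assumptions:

```python
# Illustrative sketch of the threshold rule: a proposal that fails any
# block threshold is rejected, regardless of its weighted overall score.
# All threshold values are invented for the example.

def passes_thresholds(block_marks: dict, block_thresholds: dict,
                      overall: float, overall_threshold: float) -> bool:
    for block, minimum in block_thresholds.items():
        if block_marks.get(block, 0.0) < minimum:
            return False  # evaluation may stop here; reasons go in the consensus report
    return overall >= overall_threshold

marks = {"relevance": 4.0, "methodology": 2.5, "impact": 4.5}
print(passes_thresholds(marks, {"methodology": 3.0}, overall=3.9,
                        overall_threshold=3.0))  # False: methodology below 3.0
```
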
• "Implementation of the Evaluation is set and ready if and only if an organized team, an approved proposal, a released budget, and validated evaluation instruments are all prepared." "Evaluation is EVIDENCE in the Program Development Process."
• [Process diagram: Evaluation Focus → Collect Data → Analyze & Interpret → Report]
• EVALUATION FOCUS. The following guidelines support utility considerations in determining the correct evaluation focus.
  - What is the purpose of the evaluation? Purpose refers to the general intent of the evaluation. A clear purpose serves as the basis for the evaluation questions, design, and methods. Some common purposes:
    - Gain new knowledge about program activities
    - Improve or fine-tune existing program operations (e.g., program processes or strategies)
    - Determine the effects of a program by providing evidence of the program's contributions to a long-term goal
    - Affect program participants by acting as a catalyst for self-directed change (e.g., teaching)
• EVALUATION FOCUS (continued).
  - Who will use the evaluation results? Users are the individuals or organizations that will employ the evaluation findings in some way. The users will likely have been identified during Step 1, in the process of engaging stakeholders. In this step, you need to secure their input into the design of the evaluation and the selection of evaluation questions. Support from the intended users will increase the likelihood that the evaluation results are used for program improvement.
• EVALUATION FOCUS (continued).
  - How will they use the evaluation results? Uses describe what will be done with what is learned from the evaluation, and many insights on use will have been identified in Step 1. Information collected may have varying uses, which should be described in detail when designing the evaluation. Some examples of uses of evaluation information:
    - To document the level of success in achieving objectives
    - To identify areas of the program that need improvement
    - To decide how to allocate resources
    - To mobilize community support
    - To redistribute or expand the locations where the intervention is carried out
    - To improve the content of the program's materials
    - To focus program resources on a specific population
    - To solicit more funds or additional partners
• COLLECT DATA. Collecting data is a major part of any evaluation, but note that the method follows the purpose.
  SOURCES OF EVALUATION INFORMATION: A variety of information sources exist from which to gather evaluative data, and a major program evaluation may need more than one source. The source selected will depend on what is available and what answers the evaluation questions effectively. The most common sources of evaluative information fall into three categories:
  1. EXISTING INFORMATION
  2. PEOPLE
  3. PICTORIAL RECORDS AND OBSERVATIONS
• The three source categories in more detail:
  - EXISTING INFORMATION: program documents, log-books, minutes of meetings, accomplishment reports, media releases, local statistics, agency data, etc.
  - PEOPLE: think about who can best answer the questions, via participants or beneficiaries (directly or indirectly); nonparticipants, proponents, critics, victims; experts and specialists; collaborators and policy makers.
  - PICTORIAL RECORDS & OBSERVATIONS: data collection via visual accounts, pictures and photographs, and direct observation of situations, behaviors, program activities, and outcomes.
• Major methods for collecting evaluation information. SURVEY - collecting standardized information through structured questionnaires to generate quantitative data. Surveys may be mailed or administered online through web pages, completed on-site, or conducted through interviews, whether face to face, by telephone, or electronically. Sample surveys use probability sampling, which allows us to generalize findings to a larger population; informal surveys do not.
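
To make the probability-sampling point concrete, here is a minimal sketch (the sampling frame and sample size are invented) of drawing a simple random sample of survey respondents, the property that justifies generalizing to the larger population:

```python
import random

# Illustrative only: every member of the (hypothetical) sampling frame
# has an equal chance of selection, which is what permits generalization.
population = [f"respondent_{i}" for i in range(1, 501)]

random.seed(42)                           # fixed seed so the sketch repeats
sample = random.sample(population, k=50)  # 10% simple random sample
print(len(sample), sample[:3])
```
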
• CASE STUDY - an in-depth examination of a particular case: a program, a group of participants, a single individual, a site, or a location. Case studies rely on multiple sources of information and multiple methods to provide as complete a picture as possible.
• INTERVIEWS - information collected by talking with and listening to people. Interviews range on a continuum from those that are tightly structured (as in a survey) to those that are free-flowing and conversational.
• GROUP & PEER ASSESSMENT - collecting evaluation information through group processes such as the nominal group technique, focus groups, brainstorming, and community forums.
• ANALYZE & INTERPRET. What does it mean to analyze data?
  - Analyzing data involves examining it in ways that reveal the relationships, patterns, trends, etc. that can be found within it.
  - That may mean subjecting it to statistical operations that can tell you not only what kinds of relationships seem to exist among variables, but also to what level you can trust the answers you are getting.
  - It may mean comparing your information to that from other groups to help draw conclusions from the data. The point, in terms of evaluation, is to get an accurate assessment in order to better understand the work, its effects, and the overall situation.
• Two types of data, and how to analyze them as applied to planning: QUANTITATIVE DATA and QUALITATIVE DATA (each discussed below).
• QUANTITATIVE DATA - information that is collected as, or can be translated into, numbers, which can then be displayed and analyzed mathematically. Examples include frequencies, test scores, survey results, and numbers or percentages. These data allow us to compare changes to one another, to changes in another variable, or to changes in another population. They can tell us, at a particular degree of reliability, whether those changes are likely to have been caused by the intervention or program, or by another factor, known or unknown. And they can identify relationships among different variables, which may or may not mean that one causes another.
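
For instance (a minimal sketch with invented counts), raw survey tallies become comparable across populations once expressed as percentages:

```python
# Invented counts: respondents satisfied with a program at two
# hypothetical sites; percentages make the two sites comparable.
results = {
    "site_A": {"satisfied": 34, "total": 50},
    "site_B": {"satisfied": 51, "total": 60},
}

for site, r in results.items():
    pct = 100 * r["satisfied"] / r["total"]
    print(f"{site}: {pct:.1f}% satisfied")
# site_A: 68.0% satisfied
# site_B: 85.0% satisfied
```
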
• QUALITATIVE DATA - data collected as descriptions, anecdotes, opinions, quotes, interpretations, etc., which generally either cannot be reduced to numbers or are considered more valuable or informative if left as narratives. The challenge of translating qualitative into quantitative data lies in the human factor: even if most people agree on what 1 (lowest) or 5 (highest) means in rating "satisfaction" with a program, ratings of 2, 3, and 4 may mean very different things to different people.
• How to analyze and interpret gathered data:
  - Record data in the agreed-upon ways. These may include pencil and paper; a computer (using a laptop or handheld device in the field, entering numbers into a program, etc.); audio or video; journals; etc.
  - Score any tests and record the scores appropriately.
  - Sort your information in ways appropriate to your interest. This may include sorting by category of observation, by event, by place, by individual, by group, by time of observation, or by a combination or some other standard.
  - When possible, necessary, and appropriate, transform qualitative into quantitative data. This might involve, for example, counting the number of times specific issues were mentioned in interviews, or how often certain behaviors were observed (see the sketch after this list).
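
As a toy illustration of that last step (the interview notes and issue keywords below are invented), counting how often specific issues are mentioned turns narrative data into counts:

```python
from collections import Counter

# Hypothetical interview notes; in practice these would be full transcripts.
notes = [
    "budget was unclear and the schedule slipped",
    "schedule pressure hurt quality",
    "budget cuts delayed equipment",
]
issues = ["budget", "schedule", "quality", "equipment"]

# Count how many notes mention each issue keyword.
mentions = Counter(issue for note in notes
                   for issue in issues if issue in note.lower())
print(mentions)  # Counter({'budget': 2, 'schedule': 2, 'quality': 1, 'equipment': 1})
```
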
• How to analyze and interpret gathered data (continued):
  - Simple counting, graphing, and visual inspection of frequencies or rates of behavior, events, etc., over time.
  - Calculating the mean (average), median (midpoint), and/or mode (most frequent value) of a series of measurements or observations.
  - Finding patterns in qualitative data. If many people refer to similar problems or barriers, these may be important in understanding the issue and determining what works or does not work, and why.
  - Comparing actual results to previously determined goals or benchmarks. One measure of success might be meeting a goal for planning or program implementation, for example (see the sketch below).
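
A short sketch of the mean/median/mode and benchmark comparisons above, using Python's standard statistics module (the scores and the 80-point benchmark are invented):

```python
import statistics

scores = [72, 85, 85, 90, 78, 88, 85, 95]  # hypothetical test scores

print(statistics.mean(scores))    # 84.75  (average)
print(statistics.median(scores))  # 85.0   (midpoint)
print(statistics.mode(scores))    # 85     (most frequent)

# Comparing the actual result to a previously determined benchmark,
# as the slide suggests; the 80-point goal is an invented example.
BENCHMARK = 80
print(statistics.mean(scores) >= BENCHMARK)  # True: goal met
```
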
• REPORT - the final stage of Evaluation Implementation. Depending on the nature of the research or project, results may be statistically significant, or simply important or unusual; they may or may not be socially significant. Once we have organized the results and run them through whatever statistical or other analysis we planned, it is time to figure out what they mean for the evaluation. Probably the most common question evaluation research is directed toward is whether the program being evaluated works or makes a difference.
• "What were the effects of the independent variable (the program, intervention, etc.) on the dependent variable(s) (the behavior, conditions, or other factors it was meant to change)?"
  - Findings in the report should be stated in a clear, straightforward, and objective fashion.
  - They should also be in agreement with the facts presented, briefly stated in answer to the questions raised, and preferably arranged sequentially in accordance with the order of the problems or objectives of the project.
  - In the report, conclusions should be presented in more detail, following directly from the findings or tested hypotheses, if any.
• Recommendations advanced and proposed should be further verified and substantiated in light of the study findings and conclusions. Once validated, these recommendations provide useful inputs to planners and managers in the planning and decision-making processes. Such inputs not only update the plan but also make the programs and projects more responsive and relevant.
• References:
  en.wikipedia.org/wiki/Evaluation
  http://www.ehow.com/info_8013194_importance-planning-evaluation-assessments.html
  http://www.er.undp.org/Procurement/docs/undp_procurement_evaluation.pdf
  http://cordis.europa.eu/documents/documentlibrary/66623291EN6.pdf
  http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html
  http://learningstore.uwex.edu/assets/pdfs/G3658-4.pdf
  http://ctb.ku.edu/en/tablecontents/chapter37/section5.aspx
  http://webxtc.extension.ualberta.ca/research/evaluation//evalModel3a.cfm?&subsectionid=3&sectionid=1&level3=6&sublevel3=17
• HENRY JOHN N. NUEVA
  PLAN EVALUATION & IMPLEMENTATION
  Master in Management Engineering