Plan Evaluation & Implementation


ME 201 Strategic Management of Engineering Enterprise



  1. 1. HENRY JOHN N. NUEVA | ME 201: Strategic Management of Engineering Enterprise | Master in Management Engineering | Pangasinan State University
  2. 2. At the end of this lecture and presentation, we will be able to:
     • Verify the importance of Plan Evaluation and its concept as applied to every project proposal;
     • Be aware of the procedures and composition of the Evaluation Study Committee;
     • Be knowledgeable in identifying the activities involved in the conduct of Plan Evaluation.
  3. 3. • The Concept of Evaluation
     • The Plan Evaluation
     • Primary Purpose
     • Reasons for Neglectful Conduct
     • Plan, Program & Project Evaluation
     • Organization of the Evaluation Committee
     • Preparing the Evaluation Proposal
     • Implementing the Evaluation
     • Data Gathering and Processing
     • Presentation and Analysis of Data
     • Findings & Conclusions
     • Plan Update
  5. 5. Plan Evaluation: a quality determination of strategic planning. Why, for the last twenty-five years, has Plan Evaluation been so grossly disregarded by planners and managers?
     1. Planners' and managers' general opinion was that their main task is simply to put the project in place, in the hope that the expected results will follow;
     2. They are reluctant to have their projects evaluated by an outsourced group, because their motivation, integrity, and competence would be placed under scrutiny;
     3. They presume that evaluation has no practical value, because whatever results become known would not in any way be put to effective use.
  6. 6. CONTROLLING THE BUDGET - During the planning process, the production manager is forced to create a production plan that fits the budget offered by the company. The budget should be evaluated and assessed each time a decision is made in the planning stages. Each decision may cost money, whether in labor costs or equipment fees. Examining the changing budget in the planning stages helps the production manager stay in control of spending.
  7. 7. ADDRESS RISKS UPFRONT - Another important reason why evaluation and assessment should be done during any planning stage is the risk associated with a given project. Each project performed by a company may carry a set of risks, such as a lack of operational equipment, sick or absent employees, or the lack of a flexible budget. Each risk the production manager faces should have a set of solutions, so that the risks are prevented upfront.
  8. 8. TIME FRAME - Without a steady and solid plan for a project, the timeline can be extremely flexible. The timeline may have no deadline or monthly goals if a plan or schedule has not been created. Once a deadline has been set by the board of executives or the business owner, the production manager must evaluate the tasks to determine whether the project can be completed within the time frame provided. The tasks must be assessed to ensure the schedule is realistic.
  9. 9. QUALITY CONTROL - Once the planning is complete, the manager must go back and assess the schedule in terms of the quality that can be produced in the given time. If the schedule is too packed, it may affect the quality of the production. Assessment and evaluation of the planning process is important to ensure the quality of the product, as the manager will be held responsible if it is not satisfactory.
  10. 10. PRIMARY PURPOSE: “To determine the quality of a program by formulating a judgment”
  12. 12. FORMATIVE EVALUATIONS - An evaluation performed during the development or implementation of a project to help developers modify or improve the project.
     Example: Formative evaluation includes mid-term surveys and focus groups asking students about a new technology that has been introduced. During a class, an evaluator gave the students a brief survey and engaged them in a discussion on specific topics. The results of this research led to an increased awareness of some of the problems students were having, and suggested some short-term changes in how the programs were implemented and some larger, longer-term changes in the way the courses were designed.
  13. 13. SUMMATIVE EVALUATIONS - An evaluation performed near, at, or after the end of a project or a major section of a project.
     Examples of summative evaluations range from relatively simple and common IDQ (Instructor Designed Questionnaire) results to a careful study that compares two sections of the same class using different methods of instruction. Careful planning is key to successful summative evaluations, because determining the possible outcomes and developing criteria need to occur well in advance of the completion of the project to be evaluated. ATL can provide support for planning and implementation of this type of evaluation.
  14. 14. “Plan Evaluation, as a component of the strategic plan, provides the key to making strategic planning a cyclical and continuous process” (annual project basis).
  15. 15. ANNUAL PROJECT/PROGRAM TIMELINE: Q1 → Q2 → Q3 → Q4 → YEAR END, with a MID-PERIOD evaluation at mid-year. In reference to the above model, the results of the Mid-Period Evaluation would have the specific program or project as their frame of reference.
  16. 16. ANNUAL PROJECT/PROGRAM TIMELINE: Q1 → Q2 → Q3 → Q4 → YEAR END, with a FINAL evaluation at year end. In reference to the above model, the results of the Final Plan Evaluation would partly tell whether the mission and the vision of the plan are achieved or not. Accomplishment reports are therefore integrated and consolidated by the planner in reference to the set objectives.
  17. 17. • It would identify conclusively whether program/project objectives are adequately and responsively attained or not.
     • It conclusively resolves whether the plan's mission and vision are realized or not.
  18. 18. What is the use of the results (outputs and outcomes) in terms of effects and impacts, and what will happen to them?
     1. Planners can use these results for research and study, since they are eventually recycled.
     2. They may be used as feedback, or as an input to the planning process.
  19. 19. Since a medium- or long-term strategic development plan requires periodic evaluation, a committee of evaluators is highly recommended, tasked to perform in-depth reviews of selected evaluation issues, strategies, and methodologies. The Evaluation Committee also discusses selected evaluation reports and makes suggestions for including evaluations of particular interest in the annual work program. It is also suggested that the committee be composed of individuals with a multi-disciplinary orientation, or of experts aligned with the focus of the project.
  20. 20. OVERVIEW OF THE EVALUATION.
     • All experts are briefed orally or in writing before the evaluation, in order to inform them of the general evaluation guidelines and the objectives of the research area under consideration.
     • Each proposal is evaluated against the applicable criteria independently by experts, who fill in individual evaluation forms giving marks and providing comments.
     • For each proposal a consensus report is prepared. The report faithfully reflects the views of the independent experts referred to in Step 2.
     • A panel discussion may be convened, if necessary, to examine and compare the consensus reports and marks in a given area, to review the proposals with respect to each other, and to make recommendations on a priority order and/or on possible clustering or combination of proposals.
  21. 21. THE EVALUATION CRITERIA.
     • In all circumstances, proposals are evaluated against the criteria for the instrument for which they are submitted. In clear-cut cases a proposal may be ruled out of scope by the Commission without referring it to experts.
     • Any proposal for an indirect action which contravenes fundamental ethical principles, or which does not fulfil the conditions set out in the call, shall not be selected and may be excluded from the evaluation and selection procedure at any time.
     • Any particular interpretations of the criteria to be used for evaluation are set out in the work programme, in particular the way in which they translate into the issues to be examined.
  22. 22. PROPOSAL MARKING.
     • Evaluators examine the individual issues comprising each block of evaluation criteria and in general mark the blocks on a six-point scale from 0 to 5 (or another marking scheme). An example of score markings is as follows:
       0 - the proposal fails to address the issue under examination, or cannot be judged against the criterion due to missing or incomplete information
       1 - poor
       2 - fair
       3 - good
       4 - very good
       5 - excellent
     • Where appropriate, half marks may be given. If appropriate, evaluators may also be asked to give a mark to each of the individual issues comprising the blocks of criteria. Only the marks for the blocks of criteria are taken into account (after applying any weightings) for the overall score of the proposal.
  23. 23. THRESHOLDS AND WEIGHTINGS.
     • Thresholds may be set for some or all of the blocks of criteria, such that any proposal failing to achieve the threshold marks will be rejected. The thresholds to be applied to each block of criteria, as well as any overall threshold, are set out in the call. If the proposal fails to achieve a threshold for a block of criteria, the evaluation of the proposal may be stopped; the reasons will be detailed in the consensus report. It may be decided to divide the evaluation into several steps, with the possibility of different experts examining different aspects. Where the evaluation is carried out in several successive steps, any proposal failing a threshold mark may not progress to the next step. Such proposals may immediately be categorised as rejected.
     • According to the specific nature of the instruments and the call, it may be decided to weight the blocks of criteria. The weightings to be applied to each block of criteria are set out in the call.
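As a rough sketch (not part of the original deck), the marking, weighting, and threshold rules above can be expressed in a few lines of Python. The block names, weights, and thresholds here are invented for illustration only:

```python
# Hypothetical blocks of criteria: each maps to (weight, block threshold).
CRITERIA = {
    "relevance":      (0.3, 3.0),
    "methodology":    (0.4, 3.0),
    "implementation": (0.3, 2.5),
}
OVERALL_THRESHOLD = 3.0  # assumed overall threshold set out in the call

def score_proposal(marks):
    """Return (overall_score, accepted) for marks, a dict of block -> 0-5 mark.

    Evaluation stops as soon as any block falls below its threshold,
    mirroring "the evaluation of the proposal may be stopped".
    """
    for block, (_, threshold) in CRITERIA.items():
        if marks[block] < threshold:
            return None, False  # rejected at a failed block threshold
    # Weighted sum of block marks gives the overall score.
    overall = sum(marks[b] * w for b, (w, _) in CRITERIA.items())
    return overall, overall >= OVERALL_THRESHOLD

print(score_proposal({"relevance": 4.0, "methodology": 3.5, "implementation": 4.5}))
```

Real calls publish their own weights and thresholds; the point of the sketch is only the two-stage logic (per-block thresholds first, then a weighted overall score).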
  24. 24. “Implementation of the evaluation is set and ready if and only if an organized team, an approved proposal, a released budget, and validated evaluation instruments are all prepared.” Evaluation serves as evidence in the program development process.
  25. 25. Evaluation Focus → Collect Data → Analyze & Interpret → Report
  26. 26. EVALUATION FOCUS - Guidelines for utility consideration in determining the correct evaluation focus:
     • What is the purpose of the evaluation? Purpose refers to the general intent of the evaluation. A clear purpose serves as the basis for the evaluation questions, design, and methods. Some common purposes:
       - Gain new knowledge about program activities
       - Improve or fine-tune existing program operations (e.g., program processes or strategies)
       - Determine the effects of a program by providing evidence concerning the program's contributions to a long-term goal
       - Affect program participants by acting as a catalyst for self-directed change (e.g., teaching)
  27. 27. EVALUATION FOCUS - Guidelines for utility consideration in determining the correct evaluation focus:
     • Who will use the evaluation results? Users are the individuals or organizations that will employ the evaluation findings in some way. The users will likely have been identified during Step 1, in the process of engaging stakeholders. In this step, you need to secure their input into the design of the evaluation and the selection of evaluation questions. Support from the intended users will increase the likelihood that the evaluation results will be used for program improvement.
  28. 28. EVALUATION FOCUS - Guidelines for utility consideration in determining the correct evaluation focus:
     • How will they use the evaluation results? Uses describe what will be done with what is learned from the evaluation, and many insights on use will have been identified in Step 1. Information collected may have varying uses, which should be described in detail when designing the evaluation. Some examples of uses of evaluation information:
       - To document the level of success in achieving objectives
       - To identify areas of the program that need improvement
       - To decide how to allocate resources
       - To mobilize community support
       - To redistribute or expand the locations where the intervention is carried out
       - To improve the content of the program's materials
       - To focus program resources on a specific population
       - To solicit more funds or additional partners
  29. 29. COLLECT DATA - Collecting data is a major part of any evaluation, but we need to take note that the method follows the purpose.
     • SOURCES OF EVALUATION INFORMATION: A variety of information sources exist from which to gather your evaluative data; in a major program evaluation, we may need more than one source. The information source we select will depend on what is available and what answers the evaluation questions effectively. The most common sources of evaluative information fall into three categories, namely:
       1. EXISTING INFORMATION
       2. PEOPLE
       3. PICTORIAL RECORDS AND OBSERVATIONS
  30. 30. COLLECT DATA - Sources of evaluative information:
     • EXISTING INFORMATION: program documents; log-books; minutes of meetings; accomplishment reports; media releases; local statistics; agency data; etc.
     • PEOPLE (think about who can best answer the questions): participants or beneficiaries (directly or indirectly); nonparticipants, proponents, critics, victims; experts and specialists; collaborators and policy makers.
     • PICTORIAL RECORDS & OBSERVATIONS (data collection via): visual accounts; pictures and photographs; direct observation of situations, behaviors, program activities and outcomes.
  31. 31. COLLECT DATA - Major methods for collecting information for an evaluation. SURVEY - collecting standardized information through structured questionnaires to generate quantitative data. Surveys may be mailed or online through web pages, completed on-site, or administered through interviews conducted face to face, by telephone, or electronically. Sample surveys use probability sampling, which allows us to generalize findings to a larger population, while informal surveys do not.
  32. 32. COLLECT DATA - Major methods for collecting information for an evaluation. CASE STUDY - an in-depth examination of a particular case: a program, a group of participants, a single individual, a site, or a location. Case studies rely on multiple sources of information and methods to provide as complete a picture as possible.
  33. 33. COLLECT DATA - Major methods for collecting information for an evaluation. INTERVIEWS - information collected by talking with and listening to people. Interviews range on a continuum from those which are tightly structured (as in a survey) to those that are free-flowing and conversational.
  34. 34. COLLECT DATA - Major methods for collecting information for an evaluation. GROUP & PEER ASSESSMENT - collecting evaluation information through the use of group processes such as the nominal group technique, focus groups, brainstorming, and community forums.
  35. 35. ANALYZE & INTERPRET - What does it mean to ANALYZE DATA?
     • Analyzing data involves examining it in ways that reveal the relationships, patterns, trends, etc. that can be found within it.
     • That may mean subjecting it to statistical operations that can tell you not only what kinds of relationships seem to exist among variables, but also to what level you can trust the answers you are getting.
     • It may mean comparing your information to that from other groups to help draw some conclusions from the data. The point, in terms of evaluation, is to get an accurate assessment of the work and its effects, in order to better understand the overall situation.
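As a small illustration of the "statistical operations" on relationships among variables mentioned above (not part of the original slides), a Pearson correlation coefficient summarizes how strongly two variables move together. The paired observations here are invented for the example:

```python
import statistics

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series (-1 to 1)."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical paired observations: training hours vs. test scores.
hours = [1, 2, 3, 4, 5]
scores = [52, 57, 61, 68, 72]

print(round(pearson(hours, scores), 3))
```

A value near +1 or -1 suggests a strong linear relationship; a value near 0 suggests none. Correlation alone does not establish that one variable causes the other.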
  36. 36. ANALYZE & INTERPRET - Two types of data, and how to analyze each as applied to planning: QUANTITATIVE DATA and QUALITATIVE DATA.
  37. 37. QUANTITATIVE DATA - Information that is collected as, or can be translated into, numbers, which can then be displayed and analyzed mathematically. Examples include: frequencies, test scores, survey results, numbers or percentages. These data allow us to compare changes to one another, to changes in another variable, or to changes in another population. They can tell us, to a particular degree of reliability, whether those changes are likely to have been caused by your intervention or program, or by another factor, known or unknown. And they can identify relationships among different variables, which may or may not mean that one causes another.
  38. 38. QUALITATIVE DATA - Data collected as descriptions, anecdotes, opinions, quotes, interpretations, etc., which generally either cannot be reduced to numbers, or are considered more valuable or informative if left as narratives. The challenge of translating qualitative into quantitative data has to do with the human factor: even if most people agree on what 1 (lowest) or 5 (highest) means in rating “satisfaction” with a program, ratings of 2, 3, and 4 may mean very different things to different people.
  39. 39. ANALYZE & INTERPRET - How to analyze and interpret gathered data:
     • Record data in the agreed-upon ways. These may include pencil and paper, computer (using a laptop or handheld device in the field, entering numbers into a program, etc.), audio or video, journals, etc.
     • Score any tests and record the scores appropriately.
     • Sort your information in ways appropriate to your interest. This may include sorting by category of observation, by event, by place, by individual, by group, by the time of observation, or by a combination or some other standard.
     • When possible, necessary, and appropriate, transform qualitative into quantitative data. This might involve, for example, counting the number of times specific issues were mentioned in interviews, or how often certain behaviors were observed.
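The last step above, turning qualitative data into counts, can be sketched in a few lines of Python. The interview excerpts and issue keywords below are invented for illustration:

```python
from collections import Counter

# Hypothetical interview excerpts (invented for the example).
interviews = [
    "The budget was too tight and the schedule kept slipping.",
    "Training helped, but the budget limited what we could buy.",
    "The schedule was unrealistic; more training would have helped.",
]

def count_mentions(texts, keywords):
    """Count how many excerpts mention each keyword (case-insensitive)."""
    counts = Counter()
    for text in texts:
        lowered = text.lower()
        for kw in keywords:
            if kw in lowered:
                counts[kw] += 1
    return counts

print(count_mentions(interviews, ["budget", "schedule", "training"]))
```

In a real evaluation the keyword list would come from a coding scheme agreed on by the evaluators, and each mention would typically be traced back to its source.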
  40. 40. ANALYZE & INTERPRET - How to analyze and interpret gathered data:
     • Simple counting, graphing, and visual inspection of the frequency or rates of behavior, events, etc., over time.
     • Calculating the mean (average), median (midpoint), and/or mode (most frequent value) of a series of measurements or observations.
     • Finding patterns in qualitative data. If many people refer to similar problems or barriers, these may be important in understanding the issue, determining what works or doesn't work and why, and more.
     • Comparing actual results to previously determined goals or benchmarks. One measure of success might be meeting a goal for planning or program implementation, for example.
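The mean/median/mode step above maps directly onto Python's standard library. The rating series below is invented for illustration:

```python
import statistics

# Hypothetical participant satisfaction ratings on a 1-5 scale.
ratings = [3, 4, 4, 5, 2, 4, 3, 5, 4, 1]

mean = statistics.mean(ratings)      # average of the series
median = statistics.median(ratings)  # midpoint of the sorted series
mode = statistics.mode(ratings)      # most frequent rating

print(mean, median, mode)
```

Reporting all three together is useful because a few extreme ratings can pull the mean away from the median, while the mode shows the single most common response.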
  41. 41. REPORT - The final stage of evaluation implementation. Depending on the nature of the research or project, results may be statistically significant, or simply important or unusual; they may or may not be socially significant. Once we have organized the results and run them through whatever statistical or other analysis we have planned, it is time to figure out what they mean for the evaluation. Probably the most common question that evaluation research is directed toward is whether the program being evaluated works or makes a difference.
  42. 42. REPORT - “What were the effects of the independent variable (the program, intervention, etc.) on the dependent variable(s) (the behavior, conditions, or other factors it was meant to change)?”
     • Findings in the report should be stated in a clear, straightforward, and objective fashion.
     • They should also be in agreement with the facts presented, briefly stated in answer to the questions raised, and preferably arranged sequentially in accordance with the order of the problems or objectives of the project.
     • In the report, conclusions should be presented in more detail, resulting directly from the findings or the tested hypotheses, if any.
  43. 43. Recommendations advanced and proposed should be further verified and substantiated in the light of the study's findings and conclusions. Once validated, said recommendations provide useful inputs to planners and managers in the planning and decision-making processes. Said inputs not only update the plan but also make the programs and projects more responsive and relevant.
  45. 45. HENRY JOHN N. NUEVA | PLAN EVALUATION & IMPLEMENTATION | Master in Management Engineering