SDPM - Lecture 7 - Project monitoring and control


  1. Leiden Institute of Advanced Computer Science
     Systems Development and Project Management – Project Monitoring and Control
     Prof. Dr. Thomas Bäck
  2. Dates
     Feb. 1    14:45 – 17:30   Introduction, Project Description
     Feb. 2    13:45 – 16:30   STEP WISE Approach to Project Planning
     Feb. 9    13:10 – 15:45   STEP WISE Approach to Project Planning, SAVE ENERGY Case
     Feb. 15   14:45 – 17:30   Selecting an Appropriate Software Dev. Approach
     Feb. 16   15:15 – 18:00   Activity Planning and Resource Allocation
     Feb. 22   14:45 – 17:30   Software Effort Estimation
     Feb. 23   13:15 – 15:45   Risk Management, Project Escalation
     Mar. 1    14:45 – 17:00   Exam
     Mar. 2    13:45 – 16:30   Risk Management, Project Monitoring and Control
     Mar. 8    14:45 – 17:30   Software Quality Assurance
     Mar. 9    13:45 – 16:30   Managing People; Contract Management
     Mar. 18   15:00 – 17:00   Trade Fair
  3. Project control – Motivation
     • Project under way …
     • Attention is on ensuring progress:
       • Monitoring
       • Comparison
       • Revision of plans and schedules
  4. Project control – Overview
     • The project control life-cycle
     • What's going on? Collecting control information
     • Excuses, excuses … reporting upwards
     • Doing something about it: corrective action
     • A continual process:
       • Monitoring against the plan
       • Revising the plan, if necessary
  5. Project control – Responsibility
     • Overall: Project Steering Committee
     • Day-to-day: Project Manager
     [Organization chart: the client and the steering committee above the project manager, who oversees four team leaders for the analysis/design, programming, quality control, and user documentation sections]
  6. The project control life-cycle
     [Control-loop diagram: define objectives → collect data → process data into information → model → make decisions → implement decisions as actions in the real world]
  7. The project control life-cycle
     [Flowchart: Start → publish initial plan → gather project information → compare progress against targets → satisfactory? If no: take remedial action, publish a revised plan, and gather information again. If yes: project completed? If no, continue gathering information; if yes, review the project, document conclusions, and end.]
  8. What needs controlling
     • Technical integrity
       • What tasks have been completed
     • Business integrity
       • Costs of the project must be less than its benefits
       • Delays in implementation reduce benefits
     • A project may be on time, but only because more resources have been used than were originally budgeted
     • Conversely, a project may be late because planned resources have not been used
  9. What needs controlling (cont'd)
     • Quality
       • A task has not really been finished unless the product of that task is satisfactory
       • An activity reported as finished could need to be re-worked
       • Testing is difficult to control: it depends on an unknown number of errors
  10. The bug chain
      [Diagram: errors introduced at each phase accumulate – requirements gathering (errors) → design (more errors) → build (even more errors) → test (HELP!)]
  11. Data collection
      • Partial completion reporting
        • It is common to enhance existing accounting data collection systems (e.g. time sheets) to meet the needs of project control
        • Asking for estimated completion time
  12. Time sheets
      Rechargeable hours                Name: ______    Week ending: 30/03/05

      | Project | Activity code | Description      | Hours this week | % complete | Scheduled completion | Estimated completion |
      | P21     | A243          | Code mod A3      | 12              | 30         | 24/4/05              | 24/4/05              |
      | P34     | B771          | Document take-on | 20              | 90         | 6/4/05               | 4/4/05               |
      Total: 32
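One use of the estimated-completion column on such a time sheet is to spot slipping activities early. A minimal sketch, using the two rows from the example sheet (dates are day/month/year 2005):

```python
from datetime import date

# Rows mirror the time-sheet columns:
# (project, activity_code, description, hours, pct_complete, scheduled, estimated)
rows = [
    ("P21", "A243", "Code mod A3",      12, 30, date(2005, 4, 24), date(2005, 4, 24)),
    ("P34", "B771", "Document take-on", 20, 90, date(2005, 4, 6),  date(2005, 4, 4)),
]

def slipping(rows):
    """Return activity codes whose estimated completion is later than scheduled."""
    return [r[1] for r in rows if r[6] > r[5]]

total_hours = sum(r[3] for r in rows)
print(total_hours, slipping(rows))  # 32 [] : both tasks on or ahead of schedule
```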
  13. Data collection (cont'd)
      • Risk reporting – the traffic-light method
        • Identify key elements for assessment
        • Break these into constituent (second-level) elements
        • Assess each second-level element as Red / Amber / Green (traffic light)
        • Produce an overall assessment:
          • Combine all second-level assessments into a first-level assessment
          • Review first-level assessments for an overall estimate
  14. Risk reporting: red, amber, green
      • Red – not on plan; recoverable only with difficulty
      • Amber – not on plan; recoverable
      • Green – on schedule
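The traffic-light roll-up (combining second-level assessments into a first-level one) can be sketched in code. The slides leave the combination rule to the reviewer; the worst-status-wins rule and the "staffing" element used here are assumptions for illustration:

```python
from enum import IntEnum

class Rag(IntEnum):
    """Traffic-light status; a higher value is worse."""
    GREEN = 0
    AMBER = 1
    RED = 2

def roll_up(second_level: dict) -> Rag:
    """Combine second-level assessments into a first-level status.

    Worst-status-wins is one common convention; the slides do not
    prescribe a specific rule.
    """
    return max(second_level.values(), default=Rag.GREEN)

# Hypothetical first-level element "staffing" with three constituents:
staffing = {"turnover": Rag.GREEN, "sickness": Rag.AMBER, "recruitment": Rag.GREEN}
print(roll_up(staffing).name)  # AMBER
```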
  15. Some problems with controlling projects
      • The '99% completion' syndrome
        • A job is reported as 'on time' until its last scheduled week
        • It is then reported as '99% complete' for each remaining week until the task is completed
      • Solution?
        • Control on deliverables
  16. A further problem
      • Scope creep
        • The tendency for a system to increase in size during development
      • Solution?
        • Re-estimating
        • Change control
  17. Progress checklist
      • Tasks completed
      • Staffing
      • Scope (more requirements)
      • Cost of quality
      • Finance
      • Risk analysis
        • Can identify sensitive factors that need monitoring
      • External dependencies
  18. Levels of control
      [Diagram of control levels:
       • Project board – end-stage assessments (event-driven)
       • Project manager (stage manager) – mid-stage assessments and checkpoint reports (time-driven, e.g. monthly)
       • Project team – checkpoint meetings, e.g. weekly]
  19. Levels of control (cont'd)
      [Diagram: information and reporting on actions flow upwards; control and decision-making flow downwards]
      • As information goes to higher levels it becomes more summarized
      • General directives are filled in with operational details as they filter down
      • Danger of 'information overload'
  20. Collecting project information
      • Sources:
        • Checkpoint meetings
        • Time sheets
        • Machine-generated statistics, e.g. connect time
        • Internal documents, e.g. error reports
  21. Progress report
      • Avoid 'information overload'
      • Focus on real problems – exceptions to planned activity
      • Some approaches:
        • Graphical representation
        • Highlight problem cases, e.g. RAG (red/amber/green) indicators
  22. Progress report (cont'd)
      • Achievements in the reporting period
        • Tasks that should have been finished
        • Tasks that should have been started
      • Costs – actual costs compared to budget
      • Staffing – joiners, leavers, sickness, etc.
      • Risk monitoring – status of identified risks
      • Outlook
        • How things are likely to progress in the next period
  23. Earned value analysis (EVA)
      1. Identify 'modules'
         • Good if users can recognize these
      2. Identify 'checkpoints'
         • When a phase finishes – should be specific and measurable
      3. Identify percentage durations, e.g.
         • Design 30%, code 25%, test 45%
      4. Estimate size/effort for each module
      5. When a phase is completed for a module, that percentage of the project has been 'earned'
  24. Earned value:
      • 0/100 technique: EV = 0% until the task is completed, then EV = 100%
      • 50/50 technique: EV = 50% as soon as the task is started, 100% when completed
      • Milestone technique: based on the achievement of milestones with assigned values
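The three crediting techniques can be sketched as small functions; the task states and milestone weights below are invented for illustration:

```python
def ev_0_100(completed: bool) -> float:
    """0/100 technique: nothing is earned until the task is done."""
    return 1.0 if completed else 0.0

def ev_50_50(started: bool, completed: bool) -> float:
    """50/50 technique: half on start, the rest on completion."""
    if completed:
        return 1.0
    return 0.5 if started else 0.0

def ev_milestone(achieved: list, values: dict) -> float:
    """Milestone technique: sum the values assigned to milestones
    achieved so far (values total 1.0 for the whole task)."""
    return sum(values[m] for m in achieved)

# Hypothetical milestone weights, matching the slide's phase percentages:
weights = {"design": 0.30, "code": 0.25, "test": 0.45}
print(ev_0_100(False), ev_50_50(True, False),
      ev_milestone(["design", "code"], weights))
```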
  25. Earned value analysis – example
      (phase weights: design 30%, code 25%, test 45%; the % figures are of the 210-hour total)

      | Module | Est. hours | % of total | Design hrs | %  | Done | Code hrs | %  | Done | Test hrs | %  | Done |
      | A      | 100        | 47.6       | 30         | 14 | y    | 25       | 12 | y    | 45       | 21 | n    |
      | B      | 50         | 23.8       | 15         | 7  | y    | 12.5     | 6  | y    | 22.5     | 11 | y    |
      | C      | 60         | 28.6       | 18         | 9  | y    | 15       | 7  | n    | 27       | 13 | n    |
      | Total  | 210        | 100        | earned: 30%|    |      | 18%      |    |      | 11%      |    |      |

      Earned value so far: 30% + 18% + 11% = 59% of the total 210 hours
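The example's earned-value calculation can be reproduced in a few lines. Module sizes, phase weights, and completion flags are taken directly from the table; each completed phase earns its weight of the module's estimated effort:

```python
# Phase weights from the slides: design 30%, code 25%, test 45%.
PHASE_WEIGHTS = {"design": 0.30, "code": 0.25, "test": 0.45}

modules = {  # module -> (estimated hours, {phase: completed?})
    "A": (100, {"design": True, "code": True,  "test": False}),
    "B": (50,  {"design": True, "code": True,  "test": True}),
    "C": (60,  {"design": True, "code": False, "test": False}),
}

def earned_hours(modules):
    """Credit each completed phase with its weight of the module's effort."""
    return sum(
        hours * PHASE_WEIGHTS[phase]
        for hours, done in modules.values()
        for phase, finished in done.items()
        if finished
    )

total = sum(hours for hours, _ in modules.values())  # 210 hours
ev = earned_hours(modules)                           # 123 hours
print(f"Earned value: {ev:.0f} of {total} hours = {ev / total:.0%}")
```

The result, 123 of 210 hours, matches the table's 59% earned to date.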
  26. Accumulative chart
      [Chart: cumulative % complete against week number (1–11), with 'planned' and 'actual' curves]
  27. EVA indicators – cost
      • BCWP: budgeted cost of work performed
        • = earned value (EV)
        • = the planned (not actual) cost to complete the work that has been done
      • ACWP: actual cost of work performed, i.e. what it actually cost to achieve BCWP
        • = actual cost (AC)
        • = the cost incurred to accomplish the work that has been done to date
      • Cost variance = BCWP – ACWP = EV – AC
  28. EVA indicators – schedule performance
      • BCWS: budgeted cost of work scheduled
        • The BCWP that would be achieved if all work had been finished on time
        • BCWS = planned value (PV)
        • = the planned cost of the total amount of work scheduled to be performed by the milestone date
      • Budget variance = ACWP – BCWS = AC – PV
      • Schedule variance = BCWP – BCWS = EV – PV
  29. EVA performance indices
      • Cost performance index: CPI = EV/AC
        • = BCWP/ACWP
      • Schedule performance index: SPI = EV/PV
        • = BCWP/BCWS
      • CPI = 1 – right on track
      • CPI > 1 – ahead of plan; costing less than budgeted
      • CPI < 1 – falling behind; costing more than budgeted
      • These are 'value for money' indices
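The variances and indices from the last three slides can be computed together from EV, AC, and PV. The sample figures below are invented for illustration:

```python
def eva_indicators(ev: float, ac: float, pv: float) -> dict:
    """EVA indicators from earned value (EV/BCWP), actual cost (AC/ACWP),
    and planned value (PV/BCWS), as defined on the slides."""
    return {
        "cost_variance": ev - ac,      # EV - AC: negative means over budget
        "schedule_variance": ev - pv,  # EV - PV: negative means behind schedule
        "budget_variance": ac - pv,    # AC - PV
        "CPI": ev / ac,                # < 1 is bad (poor cost performance)
        "SPI": ev / pv,                # < 1 is bad (behind schedule)
    }

# Invented example: 120 hours earned, 150 spent, 140 planned by now.
ind = eva_indicators(ev=120.0, ac=150.0, pv=140.0)
print(ind)  # CPI = 0.8 and SPI < 1: over budget and behind schedule
```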
  30. Accumulative chart with EVA indicators
      [Chart: cumulative cost (%) against time, showing the baseline budget (planned value PV/BCWS), the actual cost to date (AC/ACWP), and the earned value (EV/BCWP) up to 'now'. Marked on the chart: budget variance AC – PV, cost variance EV – AC (< 0), schedule variance EV – PV, and the schedule variance expressed in time.]
      • Less work was performed than scheduled, and its actual cost was higher than budgeted!
      • CPI = BCWP/ACWP < 1 – not good!
      • SPI = BCWP/BCWS < 1 – not good!
  31. Graphical representation
      • Gantt charts
        • Activity bar chart indicating scheduled activity dates and durations (and floats)
        • Shading for schedule completion, plus a 'today' cursor
      • Slip charts
        • A Gantt chart plus a slip line (bending = bad)
      • Ball charts
        • Circles indicate estimated and actual start and completion points for activities
        • Green and red shading
      • Timeline charts
        • Record and display changed targets
        • Slippage is more clearly visualized!
  32. Graphical representation (cont'd)
      [Slip chart for activities SA, SD1, SD2, CDR1, CDR2]
      • The slip-chart red line indicates the position as of today
      • A very uneven line suggests a need for rescheduling
  33. Monitoring priorities
      • Critical-path activities
      • Activities with no free float
      • Activities with less than a specified float
      • High-risk activities
      • Activities using critical resources
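These priorities amount to a filter over the activity list. A sketch, in which the activity records, field names, and float threshold are all invented:

```python
# Hypothetical activity records; floats are in days.
activities = [
    {"name": "spec",  "free_float": 0, "total_float": 0, "high_risk": False},
    {"name": "build", "free_float": 2, "total_float": 3, "high_risk": True},
    {"name": "docs",  "free_float": 5, "total_float": 8, "high_risk": False},
]

def monitoring_priority(act, float_threshold=4):
    """True if the activity deserves close monitoring: on the critical path
    (zero total float), no free float, less float than the specified
    threshold, or high-risk."""
    return (
        act["total_float"] == 0
        or act["free_float"] == 0
        or act["total_float"] < float_threshold
        or act["high_risk"]
    )

watch = [a["name"] for a in activities if monitoring_priority(a)]
print(watch)  # ['spec', 'build']
```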
  34. Corrective action
      • Tolerance
        • Acceptable margins of overshoot may be specified in the plan
      • Contingency
        • Not owned by the activity but by the project: give and take between activities
      • Exception plans
        • Drawn up when the original plan needs major change, especially a change to scope or costs
        • Require project board authority
  35. Some possible actions to recover a project
      • Re-schedule, e.g. by revisiting precedence requirements
      • Make more resources available
      • Redefine scope
      • Modify quality requirements
      • Enhance productivity, e.g. through training or tools
  36. Change control – task changes!
      • Identification of all items that are subject to change control
      • A central repository of master copies, project documentation, and software products
      • A formal set of procedures to deal with change
      • Maintenance of access rights and library item status
  37. Change control – example
      1. Users perceive a need for system modification
      2. User management considers the change request, approves it, and passes it to development management
      3. Development management delegates a staff member to look at the request and report on its practicality and cost
      4. Development management reports back to user management
      5. User management decides whether they want to go ahead
      6. Developers are authorized to go ahead
      7. The code is modified
      8. User management is notified on completion, and the software is released for user acceptance testing
      9. Operational release when the users are satisfied
