Experience Predictability in Software Project Delivery
TCS, 18 June 2013

Pranabendu Bhattacharyya, CFPS, PMP
Tata Consultancy Services Ltd, Plot C, Sector V, Salt Lake Electronics Complex, Kolkata - 700091
+91-33 6636 6068, pranabendu.bhattacharyya@tcs.com

Parag Saha
Tata Consultancy Services Ltd, Plot C, Sector V, Salt Lake Electronics Complex, Kolkata - 700091
+91-33 6636 6248, parag.saha@tcs.com

Sanghamitra Ghoshbasu
Tata Consultancy Services Ltd, Plot C, Sector V, Salt Lake Electronics Complex, Kolkata - 700091
+91-33 6636 6064, sanghamitra.ghoshbasu@tcs.com

Sudipta Mohan Ghosh
Tata Consultancy Services Ltd, Plot C, Sector V, Salt Lake Electronics Complex, Kolkata - 700091
+91-33 6636 6309

Sayantan Roy
Tata Consultancy Services Ltd, Plot C, Sector V, Salt Lake Electronics Complex, Kolkata - 700091
+91-33 6636 6066
CONTENTS

1. ABSTRACT
2. INTRODUCTION
3. POTENTIAL RISKS IN ABSENCE OF A STANDARD ESTIMATION PROCESS
4. ESTIMATION APPROACH
5. ESTIMATION FRAMEWORK DRIVING STANDARDIZATION
   5.1 Size Estimator
   5.2 Effort Estimator
   5.3 Schedule Calculator
   5.4 Phase-Wise Distributor
   5.5 FTE Calculator
   5.6 Cost Calculator
   5.7 Feedback Adaptor
   5.8 Governance Umbrella
6. MODEL SELECTION DRIVING ACCURACY
7. CONTINUOUS FEEDBACK DRIVING IMPROVEMENT
8. CASE STUDY
   8.1 Determine
   8.2 Design & Develop
   8.3 Deploy
   8.4 Deliver
9. CONCLUSION
10. ACKNOWLEDGMENTS
11. REFERENCES
12. AUTHORS' PROFILE

KEY WORDS
Software Estimation, Predictability, Estimation Framework, Standardization, Decision Matrix, Estimation Model, Productivity, Estimation Metrics, Estimation Effectiveness, Process Deployment
1. ABSTRACT

Unrealistic expectations based on inaccurate estimates have been identified as the single largest root cause of software project failures. Given the average yearly spend of $3.76 trillion (source: Gartner, March 2013) by IT customers worldwide, it is essential to eliminate the impediments of delivery uncertainty and non-predictability. Estimation is the binding force of all project metrics related to scope of work, effort, schedule, resource budget and quality. Thus, if collective estimation accuracy can be increased even by a minimal percentage, it will translate to multi-billion-dollar savings for the worldwide IT business.

The objective of this paper is to present the risks that most often occur in the absence of a standard, scientific estimation process, and then outline the key requirements to minimise uncertainties, if not fully eliminate them. The scope includes defining the estimation approach for multiple IT project types, discussed here with focus on the following broad categories:

  • Estimation framework driving standardisation - size, productivity, effort, schedule, cost and their dynamic behaviour
  • Model selection driving accuracy - project-type based estimation model selection, configured for organisation, geography, industry/domain and technology
  • Measurement and continuous feedback driving improvement - measuring productivity, with refinement based on effectiveness, data currency and lessons learnt

Adopting such a streamlined arrangement has resulted in the much sought-after predictability and repeatability of estimates, eliminating the worry of incurring huge monetary losses. This is a paradigm shift from traditional methods of estimation that have very little bearing on the actuals. The case study presented at the end of this document reinforces this fact.

2. INTRODUCTION

The single most important task of a project is setting up realistic expectations. This is possible through the use of a well-crafted, scientific, logical and self-refining Estimation Framework which can help predict cost and schedule and control envisaged risk. Most organisations today face multiple challenges while estimating software projects. These include a lack of standardised rules and guidelines for estimation, a dearth of governance around the estimation process, limited reuse of past organisational experience in estimation and the unavailability of an organisational baselined productivity (resulting in the absence of benchmarking and improvement measurement). Over and above these, software projects can be of multiple 'types', such as bespoke application development, large functional enhancement, minor technical enhancement, testing, package implementation and so on. The methodologies of estimation for these project types are varied. All these challenges, in turn, result in issues either at a project level (such as incorrect budgeting, incorrect resource loading and issues in tracking/monitoring) or at an organisation level (such as increased 'scrap value' of projects, incorrect forecasting of IT budgets and incorrect build-buy decisions). One of the key requirements in overcoming these challenges is the availability of a robust, standard yet flexible framework of estimation. This, in turn, can aid project managers in selecting a best-fit estimation approach depending on project characteristics and achieving predictability in estimates.

3. POTENTIAL RISKS IN ABSENCE OF A STANDARD ESTIMATION PROCESS

Effective software project estimation is one of the most challenging and important activities in project execution. Proper project planning and control is not possible without a sound and reliable estimate. The risks associated with incorrect estimation are:

  • Underestimating a project leads to under-staffing it, resulting in staff burnout.
  • Under-scoping the quality assurance effort leads to the risk of low quality of deliverables.
  • Setting too short a schedule due to underestimation leads to loss of credibility as deadlines are missed.
  • Overestimating a project leads to it being allocated more resources than it really needs, without sufficient scope controls. The project is then likely to cost more than it should and have a negative impact on the bottom line.
  • Overestimating also causes a project to take longer than necessary to deliver, resulting in lost opportunities and delayed use of resources on other projects.
  • Non-availability of standard estimation techniques across the organisation for a given project type results in incorrect comparison among projects, inaccurate productivity measurements and inaccurate, person-dependent estimates.

All these risks, in turn, result in a lack of predictability in estimates and impact downstream activities like planning, staffing, monitoring and tracking.
4. ESTIMATION APPROACH

Figure 1: Estimation Approach

At the outset, a comprehensive estimation approach has the following levers:

1. A standard Estimation Framework - consists of standard estimation techniques for sizing, effort estimation, schedule estimation, Full Time Equivalent (FTE) determination and cost derivation. It also includes the necessary guidelines, checklists and so on.
2. A defined decision matrix - helps determine the suitable estimation techniques based on parameters such as technology, engagement type, estimation stage, SDLC model and so on.

On the foundation of a standard Estimation Framework and decision matrix, appropriate estimation models, guidelines and processes can be obtained. These models can be applied to the selected projects to obtain a valid estimate. Different KPIs should be defined to enable measurement and to understand the current situation of the organisation, so that the necessary plans and roadmap can be formed to achieve the desired organisational goals. Throughout this process, feedback, lessons learnt and other relevant inputs are recorded and used to refine the Estimation Framework as well as the decision matrix for more predictable results. Thus, the entire process is a self-sustaining, evolving ecosystem.

5. ESTIMATION FRAMEWORK DRIVING STANDARDIZATION

There are four major facets of any estimate: size, effort, schedule and cost. Apart from these, the Estimation Framework must be scalable enough to estimate projects of different sizes and types. The estimation framework being proposed broadly consists of the components described in the following sections.

5.1 Size Estimator

In general, estimation is associated with deriving the number of person-hours or dollars required to deliver a project. It is often not apparent that effort and cost do not indicate the 'work volume' that the project entails. The Size Estimator component defines this 'work volume' in terms of a size unit. There are multiple techniques for estimating size, both deterministic and probabilistic. A few such techniques are:

  • Function Point Analysis
  • Use Case Points
  • Story Points
  • Lines of Code
  • Feature Points
  • Technical Components
  • COSMIC
  • FISMA
  • NESMA
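To make the notion of 'work volume' concrete, the sketch below shows how one of the listed techniques, Function Point Analysis, can turn counted components into a size figure. It is a minimal Python illustration using the standard IFPUG complexity weights; the component counts are hypothetical and the framework's own Size Estimator is not reproduced here.

```python
# A minimal sketch of unadjusted Function Point counting (IFPUG style).
# The weights follow the published IFPUG complexity table; the component
# counts below are hypothetical and not taken from this paper.

FP_WEIGHTS = {
    #       low  avg  high
    "EI":  (3, 4, 6),    # External Inputs
    "EO":  (4, 5, 7),    # External Outputs
    "EQ":  (3, 4, 6),    # External Inquiries
    "ILF": (7, 10, 15),  # Internal Logical Files
    "EIF": (5, 7, 10),   # External Interface Files
}
COMPLEXITY = {"low": 0, "average": 1, "high": 2}

def unadjusted_function_points(counts):
    """counts: iterable of (component_type, complexity, number_of_components)."""
    return sum(FP_WEIGHTS[ctype][COMPLEXITY[cplx]] * n for ctype, cplx, n in counts)

# Example: 10 average-complexity inputs, 5 high-complexity outputs, 3 average ILFs
size_fp = unadjusted_function_points([
    ("EI", "average", 10),
    ("EO", "high", 5),
    ("ILF", "average", 3),
])
print(size_fp)  # 10*4 + 5*7 + 3*10 = 105 unadjusted FP
```

The resulting size feeds the Effort Estimator described next, where it is divided by a productivity figure to obtain effort.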
5.2 Effort Estimator

Estimation of effort is one of the most important aspects of project management because, unlike software and hardware resources, staffing resources are very difficult to manage. Effort has a direct relationship with staffing cost. The Effort Estimator comprises two distinct components:

a. Base Effort Estimator: This component derives the effort required to perform the activities of the given software life cycle. The method of deriving this effort can be parametric or heuristic. Parametric techniques include:
  • Productivity-based effort estimation
  • COCOMO
Heuristic techniques include:
  • Wideband Delphi
  • Monte Carlo simulation
  • Estimation by analogy

b. Effort Adjustor: Over and above the effort needed to perform the software life cycle activities, additional effort may be required in a project to cater to other activities. The Effort Adjustor adjusts the base effort with this other effort, which can either be expressed as a percentage of the base effort (this may increase or reduce the overall effort) or as a static value. Examples of such adjustment factors are:
  • Project-specific factors, such as the availability of reusable components or documentation
  • Geography- and domain-specific factors, such as conformance to regulatory compliance
  • Organisation-specific factors
  • Team-specific factors, such as niche skill or SME availability
  • Risks

5.3 Schedule Calculator

The project schedule is dependent on the effort estimates for the project. It can be calculated using the following:

a. COCOMO II
b. Gantt charts
c. Critical Path Method (CPM)
d. PERT

Adjustments to the schedule can be made manually to ensure compliance with client-mandated schedules.

5.4 Phase-Wise Distributor

The overall effort and schedule derived can be distributed across the phases of a project. The phases may vary depending on the software life cycle considered, such as Waterfall or RUP. The framework provides guidance for effort and schedule distribution for multiple SDLC types.

5.5 FTE Calculator

The FTE Calculator has two variants:

a. For maintenance/support projects: This variant calculates the number of FTEs required for support functions, including incident management, outage management, release management and administration, depending on the effort required to resolve incidents and perform minor enhancements to the application(s) under support. It takes into account whether the application under consideration is in a steady or transient state, the working hours of FTEs, shift requirements and so on.
b. For other types of projects: Once the effort and schedule have been distributed across phases, the manpower loading for each phase is derived by the FTE Calculator.

5.6 Cost Calculator

The Cost Calculator derives the overall cost of a project based on the overall effort and schedule needed for the project. The cost can be broadly classified into two components:

a. Staffing (consultancy) cost: This cost is derived from the FTE Calculator's output on the number of resources required for each phase. Factors such as the role/designation of the resource in each phase, location and the effort spent by the resource in each phase determine the staffing cost.
b. Other cost: This category includes the estimated cost of hardware, travel, communication and other miscellaneous items.
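The following minimal sketch shows how the components in Sections 5.2 through 5.6 could chain together: a productivity-based base effort with percentage adjustors, a PERT-style schedule, phase-wise distribution, FTE derivation and a staffing-cost roll-up. Every number used (productivity, adjustor percentages, phase split, hours and blended monthly rate) is an illustrative assumption, not an organisational baseline from the framework.

```python
# Minimal sketch chaining the framework components described above.
# All figures (productivity, adjustors, phase split, rates) are illustrative
# assumptions, not calibrated baselines from the Estimation Framework.

def base_effort(size_fp, productivity_fp_per_pm):
    """Productivity-based base effort in person-months."""
    return size_fp / productivity_fp_per_pm

def adjusted_effort(base, adjustors_pct):
    """Apply Effort Adjustor factors expressed as +/- percentages of base effort."""
    return base * (1 + sum(adjustors_pct) / 100.0)

def pert_schedule(optimistic, most_likely, pessimistic):
    """PERT three-point expected duration, in calendar months."""
    return (optimistic + 4 * most_likely + pessimistic) / 6.0

def phase_fte(phase_effort_pm, phase_duration_months):
    """FTEs needed to deliver the phase effort within the phase window."""
    return phase_effort_pm / phase_duration_months

def staffing_cost(phase_plan, blended_monthly_rate):
    """phase_plan maps phase -> (effort_pm, duration_months); flat blended rate assumed."""
    return sum(effort * blended_monthly_rate for effort, _ in phase_plan.values())

effort = adjusted_effort(base_effort(105, productivity_fp_per_pm=15),
                         adjustors_pct=[10, -5])      # e.g. regulatory overhead, reuse credit
schedule = pert_schedule(4, 5, 8)                     # optimistic / most likely / pessimistic months
phase_plan = {                                        # phase-wise distribution of effort and schedule
    "Design": (effort * 0.30, schedule * 0.35),
    "Build":  (effort * 0.50, schedule * 0.40),
    "Test":   (effort * 0.20, schedule * 0.25),
}
ftes = {phase: round(phase_fte(e, d), 1) for phase, (e, d) in phase_plan.items()}
cost = staffing_cost(phase_plan, blended_monthly_rate=8000)
print(effort, schedule, ftes, cost)
```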
5.7 Feedback Adaptor

The Feedback Adaptor takes the actual effort utilised, the actual size delivered at project end, and the best practices and lessons learnt from projects, and feeds them into the 'Continuous Improvement Cycle' for continued refinement of the Estimation Framework.

5.8 Governance Umbrella

The Governance Umbrella ensures that every estimate from the framework is reviewed and vetted by a competent authority. The roles defined within the Estimation Framework to enable this are:

  • Initiator: Initiates the process of estimation and owns the entire estimate.
  • Estimator: Estimates size, effort, schedule and cost.
  • Reviewer: A certified authority who can review estimates.
  • Approver: Signs off on the bottom line and vets the estimate before it is submitted.

The overall framework can be digitised as a tool and used to perform estimates.

6. MODEL SELECTION DRIVING ACCURACY

The concept of an 'estimation model' is closely linked to the Estimation Framework. For different project types, there are different techniques that can be adopted to estimate size, effort, schedule and cost. A combination of these methodologies and techniques constitutes a model. A single estimation model can be used to estimate multiple project types. The real crux lies in selecting the right model to ensure the much-needed predictability in estimation.

The TCS Estimation Framework is accessorised with a 'Decision Matrix' which enables 'first time right' model selection. To use the framework effectively, one should use the Decision Matrix enabler, which consists of the following four dimensions:

  • Estimation stage: This could be the concept/early stage, where requirements are not yet formulated and only a concept of the project is available; the proposal stage, where some requirements are available; or the project stage, where the entire gamut of requirements is available.
  • Technology area and platform: These range from mainframes using COBOL/DB2, to web-based applications using Java/.NET, to package implementations using SAP/Oracle Apps.
  • Project type: Projects can be of different types, viz. bespoke development, maintenance, support, assurance (testing), package development/customisation/upgrade and so on.
  • Software life cycle used: The life cycle methodology used in delivering these projects may range from Waterfall and RUP to Iterative/Agile.

Based on any combination of the decision matrix dimensions, the framework performs the following:

a. Determines which components of the framework (Size Estimator, Effort Estimator, Schedule Calculator and so on) are applicable to the specific case and which components may not be relevant. For example, the Size Estimator component may not be relevant for package upgrade projects.
b. Determines which specific methodology/technique is applicable to each framework component, for example 'Function Point' from the Size Estimator and 'COCOMO' from the Effort Estimator for project-stage estimation of a bespoke development project using Java/J2EE technology and a Waterfall execution method.
c. Suggests to the estimator which estimation model(s) can be used. Depending on the project type, more than one model may be suggested.
d. Suggests the best-fit model based on the organisational history of success (least variance) for the given input matrix.

Based on the model chosen, the framework selects the organisational baseline productivity for the given technology area/platform and helps the estimator arrive at the effort, schedule and cost estimates.
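One simple way such a decision matrix could be represented and queried is sketched below. The dimension values and the recommended techniques in each entry are illustrative examples assembled from the lists above; they are not TCS's actual calibrated matrix.

```python
# Minimal sketch of a decision matrix keyed on the four dimensions described
# above. The entries shown are illustrative examples, not the real TCS matrix.

DECISION_MATRIX = {
    # (stage, technology, project_type, sdlc) -> recommended techniques per component
    ("project", "java_web", "development", "waterfall"):
        {"size": "Function Point", "effort": "Productivity based", "schedule": "COCOMO II"},
    ("early", "java_web", "development", "waterfall"):
        {"size": "Early-stage FP", "effort": "Estimation by analogy", "schedule": "PERT"},
    ("project", "sap", "package_upgrade", "waterfall"):
        {"size": None, "effort": "Wideband Delphi", "schedule": "CPM"},  # Size Estimator not relevant
}

def select_model(stage, technology, project_type, sdlc):
    """Return the best-fit model for the given dimension combination."""
    key = (stage, technology, project_type, sdlc)
    model = DECISION_MATRIX.get(key)
    if model is None:
        raise KeyError(f"No calibrated model for {key}; fall back to expert judgement")
    return model

print(select_model("project", "java_web", "development", "waterfall"))
```

In practice the best-fit entry would also carry the organisational baseline productivity and the historical variance used to rank alternatives, as described above.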
7. CONTINUOUS FEEDBACK DRIVING IMPROVEMENT

Figure 2: Continuous Improvement

The Estimation Framework is completed by a closed feedback loop which helps integrate best practices and lessons learnt back into the framework, enabling further refinement and maturity as utilisation increases. The Feedback Adaptor of the framework is the inception point of this 'Closed Feedback Loop', or Continuous Improvement Cycle. It takes the actual effort utilised, the actual size delivered, the schedule and cost at project end, and the best practices and lessons learnt from projects as its inputs, and returns them to the loop. The Closed Feedback Loop operates as a Plan-Do-Check-Act cycle (as depicted in Figure 2) and pumps worthwhile data back into the framework to advance the following inherent aspects, thereby establishing a self-evolving framework:

a. Estimation effectiveness of models: Fine-tuning the estimation process to ensure that size and effort variance remain within control limits.
b. Productivity: Deriving and baselining productivity, benchmarking productivity and identifying levers to improve it.
c. Core reference repository: Building and enriching the organisation's historical estimation repository to perform better estimates.
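As an illustration of the kind of metrics the Feedback Adaptor could push back into the loop, the sketch below computes an effort variance, checks it against a control limit and derives delivered productivity from project actuals. The field names, figures and the ±10% limit are assumptions made for the example, not values prescribed by the framework.

```python
# Minimal sketch of feedback metrics computed at project closure.
# Field names, sample values and the +/-10% control limit are illustrative.

def effort_variance_pct(estimated_pm, actual_pm):
    """Positive value means an overrun against the estimate."""
    return (actual_pm - estimated_pm) / estimated_pm * 100.0

def delivered_productivity(actual_size_fp, actual_effort_pm):
    """FP delivered per person-month, used to refresh the productivity baseline."""
    return actual_size_fp / actual_effort_pm

def within_control_limits(variance_pct, limit_pct=10.0):
    return abs(variance_pct) <= limit_pct

closed_project = {"estimated_pm": 7.35, "actual_pm": 8.1, "actual_size_fp": 112}
variance = effort_variance_pct(closed_project["estimated_pm"], closed_project["actual_pm"])
print(round(variance, 1),
      within_control_limits(variance),
      round(delivered_productivity(closed_project["actual_size_fp"],
                                   closed_project["actual_pm"]), 1))
```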
8. CASE STUDY

To demonstrate a successful implementation of this proven and predictable Estimation Framework, the case study of a North America based financial institution is described here.

Estimation Framework Implementation - Our Approach and Results

Premise: TCS was one of the vendor partners for this financial organisation, a leader in financial planning with more than 110 years of history. It was the largest mutual fund advisory program provider in terms of assets, with more than $400 billion under management. After a recent spin-off from the parent conglomerate, the organisation was facing numerous challenges in the IT space, for which it sought TCS's expertise and help. The key problems were as follows:

  • Regular cost and effort overruns in most projects (~150%-200%)
  • Increased project management effort (>40%) due to poor estimates and re-estimates
  • Recurring losses (amounting to millions of dollars) due to the scrapping of projects
  • Huge expenditure due to the induction of resources at higher rates at later stages of projects to complete them on time

These problems led to the following outcomes:

  • Poor return on investment (ROI)
  • Low productivity in projects, as evident from the due diligence exercise
  • Unsatisfied clients
  • No vendor performance comparison to support outsourcing
  • Difficult decision-making for the right investment opportunities, which requires a reasonable assessment of cost early in the life cycle
  • No scope for validating the estimates prepared by project teams, who in turn depended on vendors and subcontractors

TCS applied a four-phased approach to process improvement, described as follows:

a. Determine: Identify the gaps and plan accordingly.
b. Design and Develop: Tailor, pilot and set up an Estimation Framework to establish processes and estimation techniques aligned to the needs. Configure the estimation repository.
c. Deploy: Integrate with organisational processes.
d. Deliver: Demonstrate estimation effectiveness using metrics.

8.1 Determine

Our first step was to determine the gaps and understand the project portfolio. A two-week due diligence programme was conducted to understand the current operational mode. One-on-one interactions with business process owners were held to understand their current processes and key business drivers. A standard checklist embedded with TCS project management experience was used to assess the gaps and the portfolio.

Gap analysis: The gaps identified were as follows:

  • Only effort, cost and/or schedule estimation was performed; there was no identification of the 'volume of work'.
  • There was no framework for productivity measurement and improvement, or initiatives to drive 'cost saves'.
  • Project estimation practices were inconsistent across the IT organisation, with no standard models for estimation.
  • During the early/concept stage, only ballpark figures were provided, without any formal technique.
  • Cost drivers such as infrastructure unavailability and cross-commits, which impacted a project's total cost, were not tracked or taken into consideration.
  • There was no periodic risk monitoring or identification of risks; a flat 20% contingency reserve was kept constant for all projects irrespective of their characteristics.
  • Actual effort data classification was not realisable and the effort consumption pattern was not captured.

Portfolio analysis: The portfolio analysis of the customer organisation's projects brought to light the following spectrum of parameters needed for customising the Estimation Framework:

Engagement types
  • Development and Small Initiative Path
  • Conversion
  • Assurance
  • Project Support
  • Package Customisation
  • Package Upgrade

Stage
  • Qualify
  • Requirements
  • Design
  • Final/Actual

Technology
  • Web
  • Mainframe
  • COTS products

SDLC: Waterfall, Agile, Spiral/Iterative

Based on this analysis, the TCS consultants concluded that the deployment of a befitting Estimation Framework would be the answer to the whole gamut of the organisation's issues. The well-proven and predictable Estimation Framework was recommended for deployment, with adequate tailoring to the customer organisation's needs.

8.2 Design & Develop

When an organisation wants to adopt the Estimation Framework, there is first a need to understand which services of the framework it will need. A due diligence is required to understand the services already present that can be accommodated within the framework, the components that will have to be newly built, and the overall digitisation requirements. This was accomplished through a fit-gap analysis exercise, a sample snapshot of which is shown in the following table.

Sl No. | Identified Gaps | Framework Segments Used for Solution
1 | Only effort, cost and/or schedule estimation was performed; no 'volume of work' identification | Size Estimator component
2 | No framework for productivity measurement and improvement, or initiatives to drive 'cost saves' | Feedback Adaptor component
3 | Inconsistent project estimation practices across the IT organisation; no standard models for estimation | Multi-dimensional Decision Matrix
4 | During the early/concept stage only ballpark figures were provided, without any formal technique | Multi-dimensional Decision Matrix
5 | Cost drivers, such as infrastructure unavailability and cross-commits, which impacted a project's total cost, were not tracked or taken into consideration | Cost Estimator component
7 | No periodic risk monitoring and identification of risks; a flat 20% contingency reserve was kept constant for all projects irrespective of characteristics | Effort Estimator component
8 | Actual effort data classification not realisable and effort consumption pattern not captured | Feedback Adaptor component
Table 1: Fit-Gap Analysis

Uncertainty management and feedback of learning are significant for any estimation framework. The Estimation Framework in question was enabled to manage uncertainty, calibrate and configure models based on history, accommodate past learning and bring predictability to size, effort, schedule and cost estimation. The Estimation Framework came with a complete set of documents, tools, methods and best practices that addressed the customer's needs on the basis of the current 'types' of projects, technology/platforms and 'stage' of estimation. In line with the primary objectives of standardisation, accuracy and continuous improvement, the following activities were performed in each of these areas to establish the Estimation Framework in collaborative mode.

FRAMEWORK ADOPTION

  • Identify the Estimation Framework components which need to be adopted for the organisation as per the Decision Matrix (guided by the framework itself).
  • Adopt the Size Estimator, Effort Estimator, Schedule Calculator, Phase Distributor and FTE Calculator components for defining the models:
    o The Size Estimator component helped provide a fact-based input to support the comparison of vendor bids.
    o The Size Estimator component provided a foundation for measuring productivity to track improvement.
    o The Effort Adjustor component allowed risks to be accommodated in the estimate to account for contingency.
  • Adopt the Feedback Adaptor component to establish a mechanism for actual data collection, reporting and sizing, enabling productivity baselining and estimation effectiveness computation.
  • Adopt the historical repository of guidelines and best practices embedded in the framework to publish estimation guidelines for all types of projects:
    o The Estimation Framework, with its collection of models (with built-in help), extensive guidelines and tailored training programmes conducted by the consultants, minimised the degree of change required of the user community and helped provide best-in-class estimation capabilities.
  • Digitise the framework, enabling the creation of a central repository.
  • Adopt the Governance Umbrella through the digitised mode.

MODEL SELECTION

Based on the Decision Matrix, the Size Estimator framework component was preferentially adopted to instantiate size estimation models for the relevant project types. For the Development, COTS Implementation and Conversion project types, standard techniques such as Function Points and Package Points, as well as customised techniques (extensions to the Function Point approach, early-stage FP, COCOTS and so on), were adopted to create the models.
Figure 3: Sample Decision Matrix

For effort and schedule estimation, COCOMO, COCOTS, and productivity- and resource-based approaches were fed into the Decision Matrix and the best-fitting ones deployed for each type of project. Alternative paths were also provided for performing 'what if' analysis.

MEASUREMENT AND CONTINUOUS FEEDBACK

  • Metrics were identified based on goals and objectives (for example, estimation effectiveness, process compliance, SLA compliance and productivity improvement).
  • A mechanism for performing root-cause analysis was implemented to determine the drivers behind estimation variance and to gain insight into productivity 'influencers'.
  • An improvement-driver identification mechanism was implemented through the amalgamation of 'lessons learnt' and 'best practices' documents (supported by the framework's central repository).
  • A mechanism for metrics collection and for reporting analysis results through the digitised Feedback Adaptor enablers was established. A process to compare periodically against industry benchmarks was also established.

8.3 Deploy

After the processes and techniques were finalised, the focus shifted to deploying them at an organisation level and handling the associated change management effectively. The following were some of the challenges faced during deployment:
  • It was difficult to convince PMs and other stakeholders to shift from their existing, non-standard ways of estimating.
  • Since most of the benefits were long term, short-duration projects were not ready to comply.
  • Initial infrastructural issues were faced in establishing the governance mechanism in the absence of any tool.

The following steps were taken to deploy the developed processes and overcome these challenges:

  • Identified the business units and analysed the project profiles of each. An easy-to-deploy set of projects was identified as low-hanging fruit.
  • Identified key points of contact for business units and projects. Conducted multiple model awareness sessions (brown-bag sessions, town halls and so on) for projects of a similar nature (which would use the same model and guidelines), along with live estimation demonstrations.
  • Prioritised projects with challenges in adopting the Estimation Framework and conducted hand-holding sessions with the project teams.
  • Created case studies on high-impact projects that successfully adopted the framework (termed 'golden samples') and showcased them to the next set of identified projects.
  • Arranged experience-sharing sessions (and circulated the content afterwards) with target PMs, where the speakers came from deployed projects. The focus was on articulating the benefits.
  • Designed trainings for various levels of executives (CIOs, PMs, DMs, TMs and so on) and conducted them with custom content and duration.
  • Showcased and recognised the early adopters.
  • Institutionalised the estimation deployment approach for continuous adoption by newer projects as they start.

8.4 Deliver

The deployment of this predictable Estimation Framework brought about the desired results within a few quarters. The potential of the framework was best highlighted through the following:

  • Best-in-class estimation capabilities
  • Improved predictability of project costs and schedules
  • Measured and baselined productivity levels
  • Improvement in business-IT estimation alignment
  • Reduced cost of estimation/re-estimation, idle time, unplanned induction of staff, project scrapping and so on
  • Improved vendor management and efficient outsourcing practices
  • A repository of historical estimation data
  • High estimation awareness within the practitioner community
  • Cost, effort and time recording against defined implementation scopes
  • Estimation traceability to business requirements
  • Quantitative risk analysis and an understanding of the confidence of an estimate
  • Fact-based inputs for vendor bid negotiations
  • Decomposed estimates, which helped in understanding how estimates relate through the different stages of a project
RESULTS

Figure 4: Improvement in Scrap Value Reduction
Figure 5: Year-on-Year Improvement in Productivity
Figure 6: Year-on-Year Improvement in Estimation Effectiveness

  • Reduced cost per function point by 41 percent for web-based projects
  • Reduced cost per function point by 15 percent for mainframe projects
9. CONCLUSION

This paper has shown that the development and implementation of a standard Estimation Framework, based on historically proven techniques and data, helps bring the much-desired efficiency to project management. It lets projects harness the estimation experience of executed projects to achieve the desired predictability, and it also provides feedback for further improvements and refinements in future. To conclude, let us look at the key differentiators of this framework and the lessons learnt while deploying it.

LESSONS LEARNT

  • 'One solution does not fit all.' A separate prescription has to be provided for each organisation depending on its exact problems.
  • 'Reuse' of existing artefacts is essential to get buy-in from the practitioner community.
  • 'High-impact projects' need to be identified for the initial deployment. This creates success stories and drives deployment to a larger audience.
  • The success of the deployment should be communicated to all stakeholders in a timely manner for benefits management.

KEY DIFFERENTIATORS

  • End-to-end estimation process definitions with guidelines
  • Plug-and-play framework components based on organisational maturity
  • Well-established, cohesive collaboration between components, enabling easy propagation of changes
  • A framework validated and calibrated against a large volume of project data across domains and technologies, ensuring consistent predictability with acceptable confidence
  • An analytical approach (through an up-to-date decision matrix) to obtain the best-fit prescription for different types of projects
  • Knowledge and understanding of various estimation techniques captured through framework components
  • A streamlined, integrated approach to generating key metrics such as variance, productivity, and schedule and effort slippage
  • A self-sustaining framework capable of capturing end results and incorporating them for continuous improvement

10. ACKNOWLEDGMENTS

We thank the TCS Global Delivery Excellence Group, the TCS Techcom Group, and Sharmila Das from the TCS Estimation Center of Excellence for all the help and support provided in writing this paper. Special thanks to Ms. Aarthi Subramanian (Head, TCS Global Delivery Excellence Group) for her support and guidance.

11. REFERENCES

[1] Project Management Institute (PMI), A Guide to the Project Management Body of Knowledge (PMBOK® Guide), Fifth Edition, 2013.
[2] Tata Consultancy Services (TCS), Estimation Guidelines, Version 7.2, April 2013.
[3] Futrell, Shafer and Shafer, Quality Software Project Management.

12. AUTHORS' PROFILE

Pranabendu Bhattacharyya, CFPS, PMP
Tata Consultancy Services Ltd.
pranabendu.bhattacharyya@tcs.com
Pranabendu has more than 20 years of IT experience and has headed the TCS Estimation Center of Excellence for the last 8 years. He holds an M.Tech (IIT Kharagpur) and has been the chief consultant for many estimation consulting engagements. He is one of the core members of the ITPC (IFPUG) guiding committee and has presented papers at various international colloquiums.
Parag Saha
Tata Consultancy Services Ltd.
parag.saha@tcs.com
Parag has over 15 years of industry experience spanning multiple domains, including Transportation, Logistics, Government, Insurance and Telecom-RAFM. He is currently part of the Estimation Center of Excellence at TCS and has been involved in defining and refining estimation models and in deploying these standardised models across multiple domains in TCS.

Sanghamitra Ghoshbasu
Tata Consultancy Services Ltd.
sanghamitra.ghoshbasu@tcs.com
Sanghamitra has 13 years of experience in software delivery and project management. She has around 9 years of experience in software estimation and has been instrumental in defining, developing and deploying estimation models for multiple engagement types.

Sudipta Mohan Ghosh
Tata Consultancy Services Ltd.
sm.ghosh@tcs.com
Sudipta has worked with the TCS Estimation Center of Excellence for around 8 years and is involved in various estimation and project management engagements. He is also a TCS internally certified estimator.

Sayantan Roy
Tata Consultancy Services Ltd.
sayantan.roy@tcs.com
Sayantan has 8 years of IT experience in the fields of application development, business analysis and estimation process consulting. He is an IBM certified Requirements Manager with Use Cases (RMUC).
