[StepTalks2011] Team Software Process (TSP): High Performance Individuals, High Performance Teams - Alan Willett
Published in: Business, Technology, Education

  • 1. Team Software Process © 2011 Carnegie Mellon University
  • 2. Team Software Process
  • 3. Team Software Process: High Performance Individuals, High Performance Teams. Alan Willett, Software Engineering Institute, Carnegie Mellon University, Pittsburgh, PA 15213. E-mail: ; Phone: 001 607-592-7279. March 2011.
  • 4. Frustrated Executive. Executive needs: • New feature set to meet increasing demands of customers • On-time delivery • High-quality software deliverables to lower the cost of maintenance. Executive wish list: • Improved architecture to enable more product lines, new markets, and more revenue • Higher productivity
  • 5. Software Industry Project Performance (Standish Group 2009 CHAOS report; 2000-2008 project performance). Many software projects fail to meet key technical and business objectives. Only 30% of projects satisfy their planned cost, schedule, and feature commitments; nearly 50% are challenged. Challenged projects average: • 43% cost overrun • 83% schedule overrun • only 52% of planned features completed. [Chart: percentages of successful, challenged, and failed projects.]
  • 6. Leading Causes of Project Failure. The leading causes of project failure are: • Unrealistic schedules, staffing, and plans • Changing or misunderstood requirements • Inadequate or poorly implemented architecture • Test-in quality • Undisciplined practice
  • 7. Why? Management expectations: “I want to see the most aggressive schedule possible!” “Hurry up and get it to test so we can start finding the defects.” “Never slip a date until the day you miss it.” Developer skill set: results from over 3000 developers writing a controlled sample set of programs show that developers on average remove 40 defects/KLOC in unit test, delivering an expected 10-20 defects/KLOC to system test. Estimation error.
  • 8. SEI Engineering Solutions: integrates and leverages SEI technologies, including the Team Software Process, a rapid deployment strategy, Architecture-Centric Engineering, the Six Sigma toolkit, CMMI, and SCAMPI.
  • 9. Case Study. Background: • Bolsa Mexicana de Valores (BMV) operates the Mexican financial markets under license from the federal government. • Bursatec is the technology arm of the BMV. • BMV desired a new trading engine to replace the existing stock market engine and integrate the options and futures markets. • The BMV performed a build-vs-buy analysis and decided to replace their three existing trading engines with one in-house developed system.
  • 10. The Project. Bursatec committed to deliver a trading engine in 8-10 quarters: • High performance (as fast or faster than anything out there) • Reliable and of high quality (the market cannot go down) • Scalable (able to handle both spikes and long-term growth in trading volume). Complicating factors: • Pressure: managers are replaced when commitments are not met • Inexperience: available staff talented but young • Large project: beyond the organization’s recent experience • Key implementation technologies never used together formally • Constant stream of new requirements/changes to business rules
  • 11. TSP+ACE: Summary of Operational Best Practices. Team Software Process: TSP is a disciplined, agile process for software development teams. TSP builds teams that build high-quality products at lower cost while meeting planned costs and schedules. • Self-managed team management model • TSP metrics framework • Team launch process • Estimating method (PROBE) • Quality management model • Personal Software Process • Project-based deployment strategy. Architecture-Centric Engineering: the discipline of using architecture as the focal point for performing ongoing analyses to assure systems will support their missions. • Quality Attribute Workshop • Business Thread Workshop • Attribute-Driven Design • Views and Beyond • Architecture Trade-off Analysis Method • Active Reviews for Intermediate Design
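Of the TSP practices listed above, the PROBE estimating method is the most mechanical: it regresses historical actuals against proxy-size estimates and projects the new job from the fitted line. A minimal sketch under that description (all historical sizes and hours below are hypothetical, and real PROBE also computes prediction intervals and selects among regression variants):

```python
# Hedged sketch of the core of PROBE (PROxy-Based Estimating): fit an
# ordinary-least-squares line of historical actual effort against estimated
# proxy size, then project effort for a new estimate.

def fit_line(xs, ys):
    """Ordinary least squares: returns (beta0, beta1) for y = beta0 + beta1*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    beta1 = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    beta0 = my - beta1 * mx
    return beta0, beta1

# Hypothetical history: estimated proxy size (LOC) vs. actual effort (hours).
est_size = [120, 250, 400, 600, 850]
actual_hours = [10, 22, 34, 52, 70]

b0, b1 = fit_line(est_size, actual_hours)
new_estimate = 300  # proxy-size estimate for the next component
print(round(b0 + b1 * new_estimate, 1))  # projected effort in hours
```

The point of regressing against one's own history, rather than using raw productivity rules of thumb, is that the fitted line absorbs each developer's systematic estimation bias.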
  • 12. Personal Software Process (PSP) Training. [Chart: average defects/KLOC removed in test for C++, C, C#, Java, and VB, plotted across the ten PSP course programs.]
  • 13. The TSP Launch Process. 1. Establish product and business goals. 2. Assign roles and define team goals. 3. Produce development strategy. 4. Build top-down and next-phase plans. 5. Develop the quality plan. 6. Build bottom-up and consolidated plans. 7. Conduct risk assessment. 8. Prepare management briefing and launch report. 9. Hold management review. (Launch postmortem.) The TSP launch process produces necessary planning artifacts, e.g. goals, roles, estimates, task plan, milestones, quality plan, risk mitigation plan, etc. The most important outcome is a committed team.
  • 14. Transparency. Visible artifacts from every team, every six weeks. On-going reviews of all artifacts ensure technical quality. Project status: • To-date effort and schedule • Predicted effort and schedule • Resource utilization • Process and plan fidelity • Pre-test, post-test, and release quality • Cycle and phase entry and exit criteria. [Charts: cumulative earned value (planned, actual, and predicted), planned vs. actual hours per week, percent defect free, defects removed by phase, and defect density by phase, December 2002 through May 2003.]
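The earned-value charts summarized above rest on a simple rule: each task's planned value is its share of total planned hours, and value is earned only when the task is complete. A minimal sketch (the task names and hours are hypothetical):

```python
# Minimal sketch of TSP-style earned value: a task's planned value is its
# share of total planned hours, and that value is earned only on completion.

def earned_value(tasks):
    """tasks: list of (name, planned_hours, done) tuples.
    Returns (planned value per task in percent, total earned value percent)."""
    total_planned = sum(hours for _, hours, _ in tasks)
    pv = {name: 100.0 * hours / total_planned for name, hours, _ in tasks}
    ev = sum(pv[name] for name, _, done in tasks if done)
    return pv, ev

tasks = [
    ("design", 40, True),
    ("design review", 10, True),
    ("code", 30, False),
    ("unit test", 20, False),
]
pv, ev = earned_value(tasks)
print(round(ev, 1))  # 50.0: design (40h) + design review (10h) of 100 planned hours
```

Because partially finished tasks earn nothing, the measure resists the "90% done" reporting problem the slide's tracking charts are meant to expose.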
  • 15. Experience and Results. The ACE methods and architecture coaching, coupled with the discipline of the TSP, built a competent architecture team quickly. The project objectives were met: • Schedule: finished early • Quality: early trials and quality metrics suggest that reliability and quality goals were met • Performance: a day’s worth of transactions can be processed in seconds • Total test effort was less than 15% of total development effort • Architecture development effort was less than 15% of total development effort
  • 16. AIM and Case Study #2. SEI developed AIM (Accelerated Improvement Method) to meet the goals and challenges of: • Rapid achievement of maturity levels • High-performance projects • Rapid return on investment. [Diagram: AIM integrates CMMI, SCAMPI, the Team Software Process, and a rapid deployment strategy.]
  • 17. AIM Components Overview. AIM provides the following components (high-level view): • TSP + scripts (68), forms (46), specifications (5), and roles (20) • Introduction and sustaining guidance • Role-based training (executives, managers, team leaders, developers, more) • Training for instructors and coaches • Tools. [Chart: AIM coverage of CMMI; percentage of specific practices directly addressed, supported, partially addressed, not addressed, or unrated, by maturity level (2 through 5 and all levels).]
  • 18. Accelerated Improvement Method (AIM) Implementation Timeline. [Figure: the initial gap analysis (Class B appraisal dated 1/22/2010) rates each specific and generic practice red, yellow, or green across the CMMI process areas (Req M, PP, PMC, M&A, PPQA, CM, RD, TS, PI, Ver, Val, IPM, Rsk M, DAR, OPF, OPD, OT). The timeline runs from August 2009 to October 2010: TSP training, organizational tailoring of TSP processes, TSP cycles 1 through 4 with an EPG launch, a SCAMPI B appraisal, and a SCAMPI A appraisal with an ML 3 rating.]
  • 19. The Results! Level 3 achieved in 13 months with great performance results (Class B appraisal dated 1/22/10; CGI Federal, TPG, SEI). Project performance today vs. pre-TSP: • Productivity increased by 35% • Estimated time-on-task variance reduced from 18% to 7% • Defects found in validation testing reduced by 50% • Schedule variance reduced to less than 10%. [Figure: the follow-up appraisal rates every specific and generic practice green across all CMMI process areas.]
  • 20. TSP Results Across Many Projects. Typical performance (mean / min / max / industry category result): • Effort estimation error: 5% / -24% / 25% / >40% • Schedule estimation error: 6% / -20% / 27% / >40% • System test effort**: 4% / 2% / 7% / >30% • Cost of quality: 17% / 4% / 38% / >30% • Product quality*: 0.06 / 0.0 / 0.5 / 1.0 to 7.0 • Process yield: >98% • Productivity improvement: 30% • Return on investment: typically less than one year for each project. *Post-release defects reported per thousand new or modified lines of code. **System test effort as a percentage of total development effort. Source: Davis, N. & Mullaney, J., The Team Software Process (TSP) in Practice: A Summary of Recent Results (CMU/SEI-2003-TR-014).
  • 21. Team Software Process
  • 22. Expectations of Excellence. Those who do the work should: • build a product the customer will truly appreciate • focus on quality throughout the project • make commitments based on realistic plans • track plans daily to ensure no late surprises • improve the process based on data
  • 23. For More Information. Contact SEI: • Jim Over; +1 412-268-7624; FAX: +1 412-268-5800 • Dave Scherb; +1 412-268-3946; FAX: +1 412-268-5800 • Greg Such; +1 412-268-6973; FAX: +1 412-268-5800; gsuch@sei.cmu.edu. For information on user experience and results:
  • 24. Additional Topics: • ACE methods and practices • TSP methods and practices • Getting started
  • 25. QAW/BTW: Building Quality Attribute Scenarios. The Quality Attribute Workshop (QAW) and Business Thread Workshop (BTW): • bring together important internal and external stakeholders • develop and validate key quality attribute scenarios that quantitatively define the most important non-functional requirements. The QAW focuses on developing quality attribute scenarios; the BTW focuses on business context to validate the scenarios.
  • 26. Attribute-Driven Design (ADD) Method. ADD uses quality attribute scenarios to drive architectural design. The process was time-boxed two ways. • Six-week boxes to focus on: an initial architecture (v1) while training the architect team; a refined architecture (v2) for early review or ATAM [1]; a “complete” (not final) architecture (v3) for use by developers [2]. • Two-week boxes that focused on: developing the architecture; preparing for and performing ATAM-based peer reviews with the “architecture coach.” [1] The development team was launched at this point. [2] The ATAM actually occurred at this point.
  • 27. Views and Beyond for Architecture Documentation. “Views and Beyond is not a method, but a collection of techniques: 1. Find out what architecture information stakeholders need. 2. Provide that information to satisfy the needs. 3. Capture the information in views, plus beyond-view information. 4. Package the information in a form useful to its stakeholders. 5. Review the result to see if it satisfies stakeholders’ needs.” From the SEI class Documenting Software Architectures.
  • 28. Architecture Trade-off Analysis Method (ATAM). ATAM: • brings together a system’s stakeholders • evaluates the existing architecture with respect to the quality attribute scenarios • focuses on surfacing architectural risks • promotes and requires adequate documentation of the architecture. As mentioned previously, two-day ATAM-based peer reviews were used by the architecture coach during development: • on-the-job training for the architecture team • forced adequate documentation from the start • fewer risks surfaced at the formal ATAM than expected for the size/scope of the project.
  • 29. Active Review of Intermediate Designs (ARID). An ARID was held in conjunction with a TSP relaunch. The purpose of ARID is to: • put the architectural documents into the hands of developers • ensure that the documents are fit for development use (the right information recorded at a sufficient level of detail) • provide early “live” feedback to the architecture team.
  • 30. TSP Self-Directed Team Management Model. [Diagram: a traditional team, in which the leader sits between the customer or representative and the team members and plans, directs, and tracks the work, contrasted with a TSP self-directed team, in which the team (TL, PM, and team members), supported by a TSP coach, plans, directs, and tracks its own work, with the leadership team, management, and the customer or representative alongside.]
  • 31. TSP Measurement Framework. You can’t manage what you can’t measure. To help teams know where they stand, every TSP project uses four base measures: size, time, defects, and schedule. Teams use these measures for planning, and team members gather these same measures as they work. From these data: • status is generated • improvements are identified.
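The four base measures combine into the derived metrics the framework reports; a sketch of a few common ones (all numbers are hypothetical illustrations):

```python
# Sketch of deriving common project metrics from the four TSP base measures
# (size, time, defects, schedule). The values below are hypothetical.

size_loc = 4000          # size: new and changed lines of code
effort_hours = 320       # time: task hours logged by the team
defects_in_test = 8      # defects: found in test phases
weeks_elapsed = 10       # schedule: calendar weeks so far

productivity = size_loc / effort_hours                 # LOC per task hour
defect_density = 1000.0 * defects_in_test / size_loc   # defects per KLOC
hours_per_week = effort_hours / weeks_elapsed          # avg task hours/week

print(productivity)    # 12.5 LOC/hour
print(defect_density)  # 2.0 defects/KLOC
print(hours_per_week)  # 32.0 task hours/week
```

Because the same four measures are collected by every team member, these ratios are comparable across plan and actual, which is what makes the status and improvement analyses possible.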
  • 32. Put a Quality Product into Test. IBM’s Dr. Harlan Mills asked: “How do you know that you’ve found the last defect in system test?” “You never find the first one.” If you want a quality product out of test, you must put a quality product into test. How do you put a quality product into test? You must estimate, measure, track, and manage quality at every step. [Chart: defects removed by phase, across design review and inspection, code review and inspection, unit test, and system test.]
  • 33. Economics of Quality. Efficiency (avg. removal rate, defects/hr), effectiveness (phase yield, % of defects removed), and predictability of estimated effort by phase: • Design review: 1.5 defects/hr; 50% to 70% yield • Design inspection: 0.5; 50% to 70% • Code review: 4; 50% to 70% • Code inspection: 1; 50% to 70% (effort for these phases has low variability, based on product size) • Unit test: 0.2; 35% to 50% • Integration test: 0.1; 35% to 50% • System test: 0.05; 35% to 50% (effort for these phases has high variability, based on time to find and fix defects).
  • 34. TSP Yield Management. The focus of yield management is to produce high-quality output: • at every step of the process • for all work products. TSP uses two yield measures. • Phase yield is the percentage of defects that were found and fixed in a given phase. • Process yield is the percentage of all the defects that were found and fixed up to a given phase. [Charts: phase yield from REQ inspection through build and integration test, and process yield before compile, before unit test, before build and integration test, and before system test.]
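The two yield definitions on this slide translate directly into arithmetic; a sketch with hypothetical defect counts:

```python
# Sketch of the two TSP yield measures defined on this slide, applied to
# hypothetical defect counts.

def phase_yield(found_in_phase, escaped_from_phase):
    """Percentage of the defects present in a phase found and fixed there."""
    total = found_in_phase + escaped_from_phase
    return 100.0 * found_in_phase / total if total else 0.0

def process_yield(removed_before_phase, total_defects):
    """Percentage of all defects found and fixed before entering a phase."""
    return 100.0 * removed_before_phase / total_defects

# Hypothetical: 70 defects found in code review, 30 escape to later phases.
print(phase_yield(70, 30))     # 70.0

# Hypothetical: 95 of 100 total defects removed before system test.
print(process_yield(95, 100))  # 95.0
```

Note the caveat on the next slide: since total defects are not fully known until the product is retired, these figures are late-arriving, which is why early correlated indicators are needed.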
  • 35. TSP Yield Indicators. At test entry: • high yield in early phases is likely to be a positive indicator • low yield is almost certainly an indicator of poor quality. Yield can’t be accurately calculated until the product is used and retired, so early indicators that are correlated with high yield are needed: • Review rates • Quality profile • Process quality index.
  • 36. Getting Started. TSP/ACE is introduced into an organization on a project-by-project basis. TSP introduction steps: 1. Start by identifying candidate projects, architects, and internal TSP coach candidates. 2. Train senior management. 3. Train the selected teams and their managers, then launch the project. 4. Monitor the projects and make adjustments as needed. 5. Expand the scope to include additional projects and teams. 6. Create or expand the pool of available SEI-authorized architects, instructors, and coaches. 7. Repeat starting at step 3.
  • 37. Training Participants. • Executives and senior management: TSP Executive Strategy Seminar (1 day + optional half-day strategic planning session). • Middle and first-line managers: Leading Development Teams (3 days). • Software developers: PSP Fundamentals (5 days; EDS uses the Inner Workings version) and PSP Advanced (5 days, optional); or, as an alternative, PSP I (5 days) and PSP II (5 days). • Other team members: TSP Team Member Training (2.5 days). • Instructors: PSP Instructor Training (5 days; prerequisite training: PSP Fundamentals and PSP Advanced, or PSP I and PSP II). • Coaches: TSP Coach Training (5 days; prerequisite training: PSP Fundamentals and PSP Advanced, or PSP I and PSP II).
  • 38. Selecting Pilot Projects. Pick 3 to 5 medium- to large-sized pilot projects: • 8 to 15 team members • 4 to 18 month schedule • Software-intensive new development or enhancement • Representative of the organization’s work • Important projects. Select teams with members and managers who are willing to participate. Consider the group relationships: • Contractors • Organizational boundaries • Internal conflicts.
  • 39. ACE Training. [Table: ACE certificate programs and certification (Software Architecture Professional, ATAM Evaluator, ATAM Leader) mapped to their course requirements: Software Architecture: Principles and Practices; Documenting Software Architectures; Software Architecture Design and Analysis; Software Product Lines; the Software Architecture: Principles and Practices exam; ATAM Evaluator Training; ATAM Leader Training; and ATAM observation.]
  • 40. TSP Training. • TSP Executive Strategy Seminar: building a “winning” organization; managing with facts and data (one-day course). • Leading a Development Team: building self-directed teams; motivating and leading self-directed teams (three-day course). • Coaching Development Teams: launching teams; coaching teams (five-day course). • PSP for Software Developers: using a defined and measured personal process; planning, tracking, design, quality management (five-day course).
  • 41. Team Software Process
  • 42. STRONGSTEP - INNOVATION IN SOFTWARE QUALITYEmail: geral@strongstep.ptWeb: www.strongstep.ptTelefone: +351 22 030 15 85Rua Actor Ferreira da Silva, UPTEC4200-298 PortoPortugal Team Software Process 42 © 2011 Carnegie Mellon University