Saving a Runaway Project
(a case study for project turnaround)
International Quality Conference, Toronto, October 05, 2005
© Copyright 2005 – CGI
Yury Makedonov, CGI
Agenda
 Introduction of the case study
 Challenges of this project
 Description of the process actually used to complete this project successfully. The following critical success factors will be discussed:
  Active involvement of the project sponsor, fast decision making
  Active user involvement, instant feedback
  Co-location, effective communication
  Iterative and incremental development in a customer-selected order
 Comparison of this process to current agile thinking
Story of a real life software project
 It was a new application for a sales department
(~100 people) of a division of a big company.
 The goal of this department was to configure and sell
telecommunication products.
 The new application was to replace an existing
legacy Client/Server application.
 The replacement was to be a modern web application with a more attractive interface and more functionality.
 This department did not have an established software
acquisition/implementation process.
 The vendor was a start-up company trying to establish itself. Initially, 10-15 developers were working on this project.
History
This project followed a classic review and sign-off
approach:
 Requirements were reviewed and signed off
 Design of the system was reviewed and signed off
 Specifications were reviewed and signed off
 Development was finished and tested by the vendor
The first attempt to deliver the application to the customer failed. The customer’s reaction was:
 “This is not what we wanted!”
Customer: “This is not what we wanted!”
 How was this possible despite the rigorous process
of walkthroughs and sign-offs?
Why did this happen?
 The following assumptions are paramount to the
success of the review process:
 All steps are relatively defect free.
 Efficiency of defect detection by these reviews is
high.
 Defect removal efficiency is high.
 Unfortunately, these assumptions are not always valid; the back-of-the-envelope sketch below illustrates what happens when they break down.
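To illustrate why these assumptions matter, here is a minimal back-of-the-envelope sketch. The detection rates are assumed for illustration only, not measured on this project: when reviewers share the Business Analyst's misunderstanding, the per-review detection rate for those defects is close to zero, so almost all of them survive every sign-off gate.

```python
# Illustrative only: how review-gate efficiency compounds.
# The rates below are assumed numbers, not measurements from this project.

def escape_rate(detection_rates):
    """Fraction of defects that survive every review gate."""
    surviving = 1.0
    for rate in detection_rates:
        surviving *= (1.0 - rate)
    return surviving

# Ordinary defects: each gate (requirements, design, spec review) catches ~60%.
print(escape_rate([0.6, 0.6, 0.6]))     # ~0.064 -> about 6% reach delivery

# Misunderstanding defects: reviewers share the BA's misreading, so each
# gate catches almost nothing.
print(escape_rate([0.05, 0.05, 0.05]))  # ~0.857 -> most reach delivery
```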
Root cause of this problem
Let’s take a look at what really happened:
 The Business Analyst (BA) talked to users and
stakeholders.
 Users and stakeholders explained what they
needed.
 The Business Analyst came away with the impression that he understood these users.
In reality the users and the Business Analyst were
speaking different languages:
 they used different meanings of the same English
words,
 considered different contexts,
 made different assumptions.
As a result the Business Analyst significantly
misinterpreted the application users’ needs and
created a flawed requirements document.
Actual review and sign-off process
 This requirements document was then sent back to
the users for review:
 The business users came away with the impression that they understood the requirements document.
 They provided some comments and, after their
concerns were addressed, they signed off on the
requirements document.
 But remember, the business users and the Business
Analyst were speaking different languages!
 So the resulting modifications did little to improve the quality of the requirements document, and the reviewers missed its major flaws.
 In this case the review/sign-off process was useless and only wasted valuable time.
History
Customer: “This is not what we wanted!”
Episode II - One more attempt
 The requirements were modified using the same
review/sign off approach.
 The vendor implemented these updated
requirements.
 To make the acceptance process more organized, the customer brought in an external Test Manager to supervise the user acceptance testing.
Development of acceptance test cases
 End users were brought into the project team to develop acceptance test cases to verify the requirements.
 Some of them worked on commission, so special arrangements about their salaries were required.
 They started working together with “professional”
testers and were trained on how to develop test
cases.
 Users were asked to use their own language when writing these test cases. They ranked all existing requirements and started developing test cases for the most important ones.
 The first version of the test cases was incomprehensible to professional testers and developers.
 The test cases were revised, and the second version could be understood by both testers and developers.
 From then on, these acceptance test cases came to be used instead of the original requirements (a hypothetical example of such a scenario-style test case is sketched below).
 These test cases still contained many errors but were nevertheless much better than the original and revised “requirements.”
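As an illustration only (the project kept its test cases in ordinary documents; the product, steps, and expected results below are invented), a scenario-style acceptance test case of the kind the users wrote can be thought of as an ordered list of business actions with expected results, ranked so that the most important requirements get test cases first:

```python
# Hypothetical sketch of a scenario-style acceptance test case written in the
# users' own vocabulary. All names, steps, and expected results are invented.
from dataclasses import dataclass, field

@dataclass
class AcceptanceTestCase:
    case_id: str
    title: str
    priority: int                              # 1 = most important requirement
    steps: list = field(default_factory=list)  # (user action, expected result)

order_entry_case = AcceptanceTestCase(
    case_id="ATC-017",
    title="Configure and order a basic long-distance bundle",
    priority=1,
    steps=[
        ("Log in as a sales rep and open a new order", "A blank order form is shown"),
        ("Add the 'Basic LD' product and pick a 2-year term", "The monthly price is calculated"),
        ("Submit the order", "An order number is returned with status 'Pending'"),
    ],
)

# Requirements were ranked, and the most important ones got test cases first.
backlog = sorted([order_entry_case], key=lambda tc: tc.priority)
```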
Start of testing
 Users started testing as soon as acceptance test
cases for the major functions were developed.
 The first round of testing was a complete disaster:
 Most test cases failed.
 Many test cases could not even be started - they depended on the successful execution of other test cases.
 Many “Severity 1” and “Severity 2” (high-severity) defects were recorded.
Status of the project
 It was already the second iteration.
 Apparently we had almost all the right pieces but could not fit the puzzle together!
Pilot and Release 1
 There was an extensive discussion with the project sponsor,
developers and end users (acceptance testers).
 A decision was made to break all functions into two releases:
 Pilot:
 Bare minimum functionality - not even all the functions
of the legacy application were implemented.
  The Pilot was inferior to the existing legacy application and could not be used in real production.
  Its goal was to show the release to a wider community of salespeople.
 Release 1:
 Fully functional product with all remaining features.
Change Control Board established
 To better manage this project, a “Change Control Board” was established. It consisted of:
 the project sponsor,
 users of the application (acceptance testers) and
 the developers’ managers.
 The Board approved a list of defects (including
several missing functions) to be fixed for the “Pilot”
release.
 Only defects of the highest severities (“Severity 1”
and “Severity 2”) were to be fixed.
Defects fixing
 Developers were not allowed to fix any defect or
implement any new feature without the approval of
this board.
 The developers were allowed to work only on defects of the highest priority:
  defects whose fixes were essential for the “Pilot” implementation, and
  defects that prevented the execution of major test cases.
 The repair of less severe defects was postponed, despite the developers’ assurances that some of them required “only 10 to 30 minutes to fix.” The Board’s triage rule is sketched below.
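A minimal sketch of the triage rule described above, under the assumption that each defect record carries a severity, a Board-assigned priority, and an approval flag. The field names and sample defects are hypothetical; the real list was maintained by the Change Control Board:

```python
# Hypothetical sketch of the Board's triage rule. Field names and sample
# defects are invented for illustration.

defects = [
    {"id": "D-101", "severity": 1, "board_priority": 1, "approved_for_pilot": True,
     "summary": "Cannot save a configured product"},
    {"id": "D-142", "severity": 2, "board_priority": 3, "approved_for_pilot": True,
     "summary": "Order total wrong for multi-year terms"},
    {"id": "D-150", "severity": 3, "board_priority": 9, "approved_for_pilot": False,
     "summary": "Misaligned label ('only 10 minutes to fix')"},
]

def pilot_work_queue(defects):
    """Only Board-approved Severity 1/2 defects, highest Board priority first."""
    eligible = [d for d in defects
                if d["approved_for_pilot"] and d["severity"] <= 2]
    return sorted(eligible, key=lambda d: d["board_priority"])

for defect in pilot_work_queue(defects):
    print(defect["id"], "-", defect["summary"])
```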
Role of end users
 The vendor delivered daily builds for testing.
 End users (acceptance testers) were available to
clarify specific details of test cases (requirements)
and defect reports for developers.
 Those clarifications were important because initially
the developers had a lot of problems interpreting the
test cases and defect reports.
 Development of new acceptance test cases
continued.
 Why were end users more efficient at developing
and executing test cases?
 They had working knowledge of their business – how to configure a product and create an order.
 They were able to apply that knowledge better when creating and executing test cases than when talking to a business analyst and trying to make sense of his gibberish.
Release of end users
 Initially, four end users worked full time for four weeks to develop these acceptance test cases and to start testing the application.
 After those initial four weeks, professional testers were able to execute the acceptance test cases, and the four business users were gradually released back to their main jobs.
 Nevertheless, one of these business users, on a rotating basis, worked with the testers and developers every day.
 The remaining business users were available for phone consultations. Unfortunately, this arrangement was not as good as having them on site: they had their own responsibilities, and as a result their response time was not as good as when they were working on site.
Change Control Board role
 New test cases, which were essentially new
requirements, were reviewed and approved by the same
Change Control Board.
 The Board discussed:
 Progress of testing,
 Fixed defects,
 Newly discovered defects,
 Defects reported as fixed by the developers but “failed” by the acceptance testers.
 The priorities of the remaining defects and of additional “new” defects were reviewed and re-evaluated every day (this daily re-prioritization cycle is sketched below).
 Developers were allowed to work only on the highest-priority defects in this stack!
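Extending the triage sketch above, the Board's daily cycle can be pictured as rebuilding the work stack each day: newly discovered defects and reopened “failed fix” defects are merged in, the Board's fresh priorities are applied, and developers pull only from the top. A hypothetical sketch, with invented defect records and priorities:

```python
# Hypothetical sketch of the daily re-prioritization cycle; the defect
# records and priority values are invented for illustration.

def rebuild_work_stack(open_defects, new_defects, failed_fixes, board_priorities):
    """Merge yesterday's open defects, newly discovered defects and reopened
    'failed fix' defects, then order them by the Board's latest priorities."""
    stack = list(open_defects) + list(new_defects) + list(failed_fixes)
    for defect in stack:
        defect["priority"] = board_priorities[defect["id"]]
    return sorted(stack, key=lambda d: d["priority"])

open_defects = [{"id": "D-101", "summary": "Cannot save a configured product"}]
new_defects = [{"id": "D-160", "summary": "Discount not applied to bundles"}]
failed_fixes = [{"id": "D-142", "summary": "Order total wrong for multi-year terms"}]
board_priorities = {"D-101": 1, "D-142": 2, "D-160": 3}

work_stack = rebuild_work_stack(open_defects, new_defects, failed_fixes, board_priorities)
print([d["id"] for d in work_stack])  # developers take only the top of this stack
```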
Change Control Board
 During the first week, the Control Board was
meeting every day under the direction of the project
sponsor.
 Initially the project sponsor said that he had no time
to attend these daily meetings, but was told that
without his involvement he would not get the project
implemented.
 The project sponsor’s involvement allowed all required decisions to be made very quickly.
 After the first week the Control Board continued to meet every day.
 The project sponsor chaired it twice a week, and later once a week, until the “Pilot” was implemented.
Pilot, Release 1 and Release 2
 After the end of the Pilot, a decision was made to split “Release 1” into:
  Release 1:
   about the same functionality as the legacy application plus a couple of new features.
  Release 2:
   the remaining features of the original “Release 1”.
 For two weeks the Pilot was used by a wider community of users concurrently with the legacy application.
Release 1
 “Release 1” was developed using about the same
approach.
 The frequency of Change Control Board meetings
was changed to twice a week.
 The project sponsor chaired this Change Control Board once a week.
 “Release 1” was finally implemented almost two years after project initiation instead of the less than a year originally planned.
Release 2
 The scope of “Release 2” was reviewed based on the first results of “Release 1” in production.
  Some features were removed.
  Other new features were added.
 Within four months, “Release 2” was developed and implemented using about the same approach.
 For the last three years this application has been in production with only minor modifications. The project was recognised as a success despite all the original problems.
Summary
 What allowed an efficient turnaround of this project:
  Active involvement of the project sponsor, fast decision making
  Active user involvement, instant feedback
  Co-location, effective communication
  Iterative and incremental development in a customer-selected order
 These are:
  core principles of modern agile software development methods, and at the same time
  common-sense principles that successful software managers began using many years ago.
 The same approach is recommended for any project of similar complexity.
Q & A
 Questions?
Contact information
Yury Makedonov, CSTE, CSQA
Senior Technical Consultant
Centre for Testing and Quality
CGI services to BCE
125 Commerce Valley Drive West, Floor 6B,
Markham, ON, L3T7W4
(905) 762-2705 – office,
(647) 224-6619 – cell
yury.makedonov@cgi.com
http://www.cgi.com
Appendix - literature
 You can find more information on agile processes,
for example, in:
 Kevin J. Aguanno (editor), “Managing Agile Projects”.
 http://www.agilealliance.org/articles/index
Appendix - Test team organization
How the whole test team was organized:
 Business users developed “scenario-like” test cases, which required their intimate knowledge of the business processes.
 Professional testers concentrated on classical
valid/invalid test cases, boundary conditions, etc.
 There was even one person working on GUI test automation. He focused on automating the “Smoke testing” or “Acceptance into testing” tests - a small set of test cases executed as soon as a new build was delivered for testing (a minimal sketch of such a suite follows below).
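As a minimal sketch of such an “acceptance into testing” suite (the project itself used a commercial GUI test automation tool; the check names and check_* functions below are invented placeholders), the idea is a small, ordered set of checks run against every new build, with the build rejected at the first failure:

```python
# Hypothetical sketch of an "acceptance into testing" (smoke) suite: a small,
# ordered set of checks run against every new build before deeper testing.
# The check names and check_* functions are invented placeholders.

def check_login(build):
    return build.get("login_works", False)

def check_open_order_form(build):
    return build.get("order_form_opens", False)

def check_configure_basic_product(build):
    return build.get("basic_product_configurable", False)

SMOKE_CHECKS = [
    ("User can log in", check_login),
    ("Order form opens", check_open_order_form),
    ("Basic product can be configured", check_configure_basic_product),
]

def accept_build_into_testing(build):
    """Reject the build at the first failed smoke check."""
    for name, check in SMOKE_CHECKS:
        if not check(build):
            print(f"Build rejected: '{name}' failed")
            return False
    print("Build accepted into testing")
    return True

accept_build_into_testing({"login_works": True, "order_form_opens": True,
                           "basic_product_configurable": True})
```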