Project Management Tips to Improve Test Planning
Ricki Henry, PMP, CSM
Who am I?
• Started professional life as a programmer, then Sr. programmer and Lead programmer
• As a lead programmer at American Express, I began managing projects and have been a project manager ever since
• PMP (Project Management Professional) certified and a Certified Scrum Master (Scrum is an agile methodology)
• Formed the PMO (Project Management Office) at Clark County, Nevada, where I work and also run the Project Management Workshops
• Currently the VP of Education for my local PMI (Project Management Institute) chapter
• Frequent presenter at conferences and on webinars
Successful Test Managers
Testing is most effective when considered to be a sub-project of the larger development project.
Successful Test Managers
Apply project management processes such as risk management and scope management.
Successful Test Managers
• Understand stakeholder goals and benefits
• Work with the customer to formulate testing goals and approaches
• Contribute to the overall project planning
Agenda
• Stakeholder Expectations
• Test Scope
• Test Planning
• Test Risks
Customer Expectations
• Project success is measured against project objectives
• What objectives do stakeholders have for testing?
• What objectives will you be able to deliver?
Customer Expectations
• What if expectations are that you’ll test everything and have zero defects?
 What CAN we do?
 Prioritize testing and devote more time to priority testing (aka priority coverage)
 Change the expectation to ‘Show the software works’
 What are examples of testing that might be a priority?
Customer Expectations
Priority testing/focus examples:
• Highest volume transactions
• Features related to revenue
• Functions that drive customer retention
• Functions that drive customer satisfaction
• Performance
• Usability
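The prioritization above can be sketched as a simple weighted scoring of candidate test areas. This is a minimal illustration, not from the presentation; the area names, criteria, and weights are hypothetical.

```python
# Rank candidate test areas by a weighted sum of priority criteria (0-5 each).
# Criteria mirror the slide: volume, revenue, retention, satisfaction.

def priority_score(area):
    """Weighted sum of illustrative priority criteria."""
    weights = {"volume": 0.3, "revenue": 0.3, "retention": 0.2, "satisfaction": 0.2}
    return sum(weights[k] * area[k] for k in weights)

areas = [
    {"name": "checkout", "volume": 5, "revenue": 5, "retention": 3, "satisfaction": 4},
    {"name": "reporting", "volume": 2, "revenue": 1, "retention": 2, "satisfaction": 3},
]

# Highest-scoring areas get the most test time (priority coverage).
ranked = sorted(areas, key=priority_score, reverse=True)
```

The weights would in practice come from the stakeholder-expectation discussions described above, not from the test team alone.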
Customer Expectations
• Using the priority testing/focus you
identified, formulate goal statements for
testing
• Include these goals in the Project Charter
Customer Expectations
• A goal is what you want to achieve
• It must be accomplishable and measurable
Customer Expectations
• Goal example:
 Goals: Perform quality control testing to
  o Verify the product works as specified with an acceptable level of quality
  o Determine when the product meets ‘Go/No Go’ criteria and can be deployed
 Constraints:
  o The hard due date constrains the amount of testing possible for the project
  o The availability of a key SME constrains the amount of scenario testing that will be done
Customer Expectations
• Sample Beta Testing Goals
• Get feedback from the users on customer
experience
• Assess stability and reliability of the application
• Assess full deployment readiness
• Assess performance
• Collect testimonials or case studies
Test Scope
• List what’s in and out of scope for testing
 An example of in scope would be
  o The most common logic paths
  o All APIs
  o IE9, IE10, Chrome and Firefox browsers
 An example of out of scope would be
  o Logic paths not on the list of commonly used paths
  o Safari browser
  o Regression testing for modules not being modified
Test Scope
• Work with the project team to determine all of the testing that will be needed
 Will the test team have a role?
 Will that role be to test, or to
  o Provide a test environment
  o Supervise internal or external personnel
  o Schedule time in the test lab
  o Provide test scripts to subject matter experts
(Avoid the ‘easy to test’ and ‘we’ve always done it that way’ traps)
Test Scope
• The testing scope is to test the goals
• Document assumptions, such as: the code will be fully unit/component tested before the test team accepts the deliverable for their testing
• The test scope is included in the Project Scope Definition document
Tools
Once you determine what testing tools you will be using on this project, determine:
• What training is needed for new tool users?
• Will training be internal, online, or instructor-led?
• What are the licensing costs?
• What infrastructure is needed?
What are some examples of tools?
Testing Budget
Submit your budget request to the PM:
• What will new tools cost?
• How much will training cost?
• What do tool licenses cost?
• Will vendors do any verification?
• What resources are needed?
QA Included Early
Include testers in the early phases to give input on:
• Number of instances needed
• Use of same test tools as developers
• Project risk identification
• “What if” conditions during requirements
• Participate in requirements
QA Included Early
• Help define defect severity
• Help formulate ‘Go/No Go’ criteria
• Help define the defect/fix/retest process
• Help define how change control will work
QA Included Early
• Promote continuous integration of code
• Help with decision on whether or not to
have a pilot
• Promote go-live war rooms
• Included in lessons learned
QA Included Early
• Perception – we don’t have time to participate in early phases
• Reality – preventing rework pays off
Planning
• Nothing new for most of you
• Test plans are included in project plans
• Determine types of testing
Planning
• Document the business value of each type
of testing in your plan
• Document the approach to testing, e.g.
 Test cycles
 Who will test (you, SMEs, developers, BAs?)
 Test early and often (code is delivered often)
NOTE: we’re planning now, not writing test
scripts
Planning
• Include anything that reduces test time
without reducing quality of testing, e.g.
 Multiple instances for simultaneous testing
 Check QA team for ideas from lessons learned
• Send completed plan to the PM for
inclusion in the Project Management Plan
Planning
• Human Resource planning
 What skills and knowledge do you need to
perform the tests you identified?
 Who will be assigned to the project?
 What is their availability?
• Write test cases as soon as possible in the
project [See V-model]
Early Test Preparation
The V-model pairs each development phase on the left side with the test level on the right side that verifies it, and the corresponding test assets are prepared as soon as each left-side phase completes:
• Business Case → Benefit realization checks
• Requirements → Acceptance Testing (Acceptance Criteria, UAT test scripts)
• System Specification → System Testing (System test scripts)
• System Design → Integration Testing (Integration test scripts)
• Component Design → Component Testing (Component test scripts)
• Component Build sits at the base of the V
Planning
Test Status Reporting Plan
• Select initial reporting approach. Compare:
 Tracking of all bugs – metrics based, or
 Tracking test case/test activity completion – goal/scope based
 Which method illustrates progress better?
• Final testing phase – after all tests have been done once, switch to bug tracking (with bug criticality)
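The goal/scope-based approach above boils down to a simple completion metric. A minimal sketch; the test-case data, area names, and status values are hypothetical.

```python
# Goal/scope-based status: percent of planned test cases that have been executed.
# A case counts as executed whether it passed or failed; "not run" does not count.

def completion_percent(test_cases):
    """test_cases: list of dicts with a 'status' of 'passed', 'failed', or 'not run'."""
    executed = sum(1 for t in test_cases if t["status"] != "not run")
    return round(100 * executed / len(test_cases), 1)

cases = [
    {"area": "login",  "status": "passed"},
    {"area": "login",  "status": "failed"},
    {"area": "search", "status": "not run"},
    {"area": "search", "status": "passed"},
]
print(f"{completion_percent(cases)}% of planned tests executed")
```

Once every planned case has run at least once, the same data can be filtered by failures to switch to the bug-tracking view the slide describes.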
Planning
• Estimate due date for each planned activity
• Estimate Go-live date
• Include lessons learned sessions
• Quality must be planned in, not inspected in
Scheduling
• Develop testing schedule
• Planning is input to testing schedule
• Identify tasks, dependencies, resources
• Add effort, duration, start and end dates
Scheduling
• Identify all touchpoints
 Marketing, training, user documentation all need final screens/reports to do their job
 How does your test schedule align with the needs of these departments?
• Your test schedule needs to be integrated into the overall project schedule
 Work with the PM if there are any resource conflicts (often happens with use of SMEs)
Scheduling
• Scheduling is iterative: once it’s integrated into the overall project plan, you may find some timelines don’t work, for example
 People are unavailable
 Touchpoints require a different testing order
• Not all scheduling issues can be resolved by reordering work. You may need to change what testing you had planned
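Ordering test tasks so each runs after the tasks it depends on is a topological-sort problem. A minimal sketch using Python's standard `graphlib` (3.9+); the task names and dependencies are illustrative, not from the presentation.

```python
# Order testing tasks so that every task comes after its dependencies.
from graphlib import TopologicalSorter

# Each key maps a task to the set of tasks that must finish before it starts.
deps = {
    "integration tests": {"component tests"},
    "system tests":      {"integration tests"},
    "UAT":               {"system tests", "train SMEs"},
}

# static_order() yields a valid execution order (and raises CycleError
# if the dependencies are circular - a sign the plan needs rework).
order = list(TopologicalSorter(deps).static_order())
```

In practice the same dependency data would also carry effort and resource assignments, which is what surfaces the SME conflicts the slide mentions.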
Risk Management
The next thing we want to do is identify and manage test risks.
Risk Management
What’s a risk?
“An uncertain event or condition that, if it occurs, has a positive or negative effect on one or more project objectives.” – PMBOK
Risk Management
• Risk Management is about events, not
effects
• There can be multiple events that have the
same effect.
• Each event is a risk that needs to be
managed
Risk Management
• For example, if I said: “There is a risk that the project will not complete on the planned due date”
• Is that an event or the effect?
• What are examples of events that would cause that effect?
Risk Management
• To do Risk Management
 Identify/mitigate/plan for test risks
 Calculate risk reserve for test risks to the
schedule and test risks to the budget
• Plan to do the riskiest processes first
Risk Reduction
• Integrate
• Test Often
• V-Model
• Risky Requirements First
Risk Management
• Step 1: Risk Identification
• Step 2: Qualitative Analysis
• Step 3: Quantitative Analysis
• Step 4: Mitigation and Response
• Step 5: Monitor and Manage
Risk Identification
• Risk Identification tools
 Checklist of inherent risks (e.g. it’s inherently risky to not have an instance that mirrors production)
 Lessons learned from prior projects
 Problem root cause analysis techniques
 Mind mapping
• Determine critical success factors, then see if they present any risks
Qualitative Risk Analysis
• Qualitative risk analysis
 How likely is the risk to occur?
 What impact will it have if it occurs?
 How critical of a risk is it?
• Use subject matter expertise and a common
scale to determine the qualities of each risk
Qualitative Risk Analysis
• For each risk determine the probability that
the risk will become reality (sample criteria)
 Almost Certain = .9 (90%)
 Likely = .7 (70%)
 Possible = .5 (50%)
 Unlikely = .3 (30%)
 Rare = .1 (10%)
Qualitative Risk Analysis
• For each risk, determine the impact to the testing schedule, testing cost, or deliverable quality (sample scale):
 1 = Insignificant impact
 2 = Minor impact
 3 = Moderate impact
 4 = Major impact
 5 = Extreme impact
Qualitative Risk Analysis
• Sample Scale
 1 = Insignificant impact: little to no schedule or cost variance, and quality degradation barely noticeable
 2 = Minor impact: < 5% schedule or cost impact, or only minor quality degradation
 3 = Moderate impact: 5–10% increase in time or cost, or quality reduction requires OK from the project risk approver(s)
Qualitative Risk Analysis
• Sample Scale, continued
 4 = Major impact: 10–20% increase in time or cost, or quality unacceptable to the project risk approver(s)
 5 = Extreme impact: the entire project needs to be reevaluated and probably terminated
Qualitative Risk Analysis
• Risk Score = Impact × Probability

5×5 Probability/Impact matrix:

Probability ↓ / Impact →   Insignificant (1)  Minor (2)  Moderate (3)  Major (4)  Extreme (5)
Almost Certain (90%)       0.9                1.8        2.7           3.6        4.5
Likely (70%)               0.7                1.4        2.1           2.8        3.5
Possible (50%)             0.5                1.0        1.5           2.0        2.5
Unlikely (30%)             0.3                0.6        0.9           1.2        1.5
Rare (10%)                 0.1                0.2        0.3           0.4        0.5
Qualitative Risk Analysis
• Criticality
The matrix shows all of the possible risk scores and categorizes them as:
 Risk Category 1 (Red) = Score 2.0–4.5
 Risk Category 2 (Yellow) = Score 1.0–1.8
 Risk Category 3 (Green) = Score 0.1–0.9
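The score formula and criticality bands above translate directly into code. A minimal sketch; the thresholds follow the sample matrix, and the function names are my own.

```python
# Risk Score = Impact x Probability, categorized per the sample 5x5 matrix.

def risk_score(probability, impact):
    """probability from the sample scale (0.1-0.9), impact from 1-5."""
    return round(probability * impact, 2)

def criticality(score):
    """Map a score onto the sample Red/Yellow/Green categories."""
    if score >= 2.0:
        return "Red (Category 1)"
    if score >= 1.0:
        return "Yellow (Category 2)"
    return "Green (Category 3)"

s = risk_score(0.7, 3)  # Likely probability, Moderate impact
print(s, criticality(s))
```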
Quantitative Risk Analysis
• Look at the risk impact of each event you will manage
 For schedule overruns, can you guestimate how much of a delay the risk event will cause?
 For budget overruns, can you guestimate how much additional cost the risk event will cause?
• Sometimes it’s not possible to guess
Quantitative Risk Analysis
• When it is possible to guestimate
 Multiply each risk’s impact guestimate by that risk event’s probability
• Risk reserve for schedule and budget
 For identified risks: sum all of the calculations
 For unknown risks: add a percentage based on experience
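The reserve calculation above can be sketched as follows. The delay estimates, probabilities, and the 10% unknown-risk percentage are hypothetical values for illustration.

```python
# Schedule risk reserve: sum of (impact estimate x probability) over identified
# risks, plus a percentage for unknown risks. The same shape works for budget.

def risk_reserve(risks, unknown_pct=0.10):
    """risks: list of (days_of_delay_if_it_occurs, probability) tuples."""
    identified = sum(days * p for days, p in risks)
    return identified * (1 + unknown_pct)

# Two illustrative risks: a 10-day delay at 50%, and a 4-day delay at 30%.
reserve = risk_reserve([(10, 0.5), (4, 0.3)])  # 5 + 1.2 = 6.2, +10% for unknowns
```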
Risk Mitigation
• Mitigating: actions you can take now, before the risk occurs, to reduce likelihood, impact, or both
• Example risk: estimation of the time to learn the new automated testing tool will likely be inaccurate
• Mitigation: we will hire a contractor with expertise in the new tool who will help us with the coding and also teach us how to use the tool
Risk Response
Risk Response planning defines the plan that will be activated should a risk come to fruition:
 What are the tasks that will be needed?
 What are the resources, both human and non-human, needed to execute the plan?
 Who will be the risk owner? The risk owner must monitor the risk and let us know when to activate the response plan.
• If nothing can be done – buy insurance?
Risk Monitoring
• Risk logs are great tools for monitoring
your risks, for example keep a log with this
information:
 ID, Title, Description, Event
 Impact, Probability, Score, Criticality
 Owner, Status, Mitigation, Response Plan
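A risk log with the fields listed above might be sketched as a simple record type. A minimal illustration; the class name and sample values are hypothetical.

```python
# One entry in a risk log, carrying the fields from the slide.
from dataclasses import dataclass

@dataclass
class RiskLogEntry:
    id: str
    title: str
    description: str
    event: str
    impact: int          # 1-5 sample scale
    probability: float   # 0.1-0.9 sample scale
    owner: str
    status: str = "open"
    mitigation: str = ""
    response_plan: str = ""

    @property
    def score(self):
        """Risk Score = Impact x Probability, used to derive criticality."""
        return round(self.impact * self.probability, 2)

r = RiskLogEntry("R-01", "SME availability", "Key SME may be pulled away",
                 "SME reassigned mid-project", impact=4, probability=0.3,
                 owner="Test Lead")
```

Keeping the score derived rather than stored means the log stays consistent when impact or probability estimates are revised.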
Continuous Process Improvement
• Don’t wait until the end of testing to hold lessons learned sessions
• Identify problems and opportunities early
• Apply lessons learned immediately
Conclusion
• Software testing is a critical element in the
software development life cycle
• Everyone involved should be familiar with
the testing goals, planned approaches, and
constraints
• By adopting project management processes
you will set yourself apart – a cut above the
rest
Check this out
• A good tool to help with planning is from HARBORLIGHT Management Services: Project Management for Software Testers
http://www.harborlightmanagement.com/Presentations/EuroSTAR%20PM%20workbook.pdf
Resources
• “How to Avoid 7 Testing Project Problems”, www.automatedqa.com/support
• “Common Testing Problems: Pitfalls to Prevent and Mitigate”, Donald Firesmith, September 12, 2012
• “Fundamental Challenges in Software Testing”, Cem Kaner, April 2003
• “Quality Department Involvement in Project Life Cycle”, Part 1, Marcin Zreda, March 15, 2010
• “Quality Department Involvement in Project Life Cycle”, Part 2, Marcin Zreda, March 24, 2010
Resources
• “Plan your Projects’ Testing – or Plan to Fail”, James F. York, May 26, 2009
• “Software Testing – Goals, Principles, and Limitations”, S.M.K. Quadri, September 2010
• “Beta Testing Objectives”, centercode.com