Software Testing Process, Testing Automation and Software Testing Trends
This is the slide deck in which KMS Technology's experts shared useful information about the latest achievements in the software testing field with lecturers of HCMC University of Industry.


Transcript

  • 1. © 2012 KMS Technology
  • 2. SOFTWARE TESTING PROCESS, TECHNOLOGY & TREND
    April 2013
    KMS Technology - http://kms-technology.com
    QA Symphony - http://www.qasymphony.com
  • 3. AGENDA
    • KMS Technology Overview (10’)
    • Software Testing Process & Trends (50’)
    • Software Testing Estimation (20’)
    • Break (15’)
    • Automation Testing & Tools (60’)
    • Future of Software Testing (20’)
    • Q&A (20’)
  • 4. © 2012 KMS Technology
    KMS TECHNOLOGY OVERVIEW
    Vu Pham
  • 5. KMS TECHNOLOGY OVERVIEW
    US Company / Passionate Global Workforce
    • 400 resources & growing in Vietnam and the US
    • 160 testers ~ 50% of the workforce
    • Proven leadership team
    World-Class Infrastructure
    • Built for ISO 27001, planned certification in 2013
    Best-in-Class SDLC Practices
    • CMMI and Agile focus
    • QASymphony - commercial Agile testing solutions recognized by Forrester, with over 4,500 users
    Best Clients – Raving Fans
    • 100% referenceable and ecstatic
    • 100% in long-term dedicated teams
  • 6. KMS SOFTWARE TESTING SERVICES
    KMS Testing Services: Testing Consulting Services, Life-cycle Testing Services, Automation Testing Services, Performance & Load Testing Services, Mobile and Specialty Testing Services, Flexible Staffing Options
    Testing Tools: Proprietary tools, Commercial tools, Open source tools, Automation & Performance Testing frameworks
    Test Processes: Process Assessment, Best Practice Implementation, Continuous Process Improvement, Quality and Project Management Metrics
    Pillars: Streamlined Processes & Frameworks, Tools & Automation, Strategic Solutions & Best Practices
    Lifecycle: Test Planning & Estimation → Test Design & Implementation → Test Execution → QA Metrics-Driven Monitoring → QA Metrics-Driven Process Improvements
  • 7. OTHER KMS SERVICES
    APPLICATION DEVELOPMENT
    • J2EE and .NET expertise
    • Full lifecycle product development
    • Application modification and customization
    APPLICATION SUPPORT
    • Perform defect resolution and ongoing maintenance of existing applications
    APPLICATION REENGINEERING
    • Re-engineer and migrate to a different technology and platform such as SaaS or Mobile
    DATA WAREHOUSE / BUSINESS INTELLIGENCE
    • Develop and deploy Data Warehouse solutions
    • Data migration services
    • Report writing services
    MOBILE DEVELOPMENT
    • Apple iOS, Android SDK, and Windows 8
    • Mobile gaming
    • Enterprise mobile apps
  • 8. © 2012 KMS Technology
    SOFTWARE TESTING PROCESS & TRENDS
    Vu Pham
  • 9. AGENDA
    • Testing Process Evolution
    • Fundamental Testing Process
    • Components of Testing Process Framework
    • Best Practices in Testing
  • 10. DEVELOPMENT PROCESS EVOLUTION
    60’s: Waterfall → 70’s: V-Model → 80’s: RUP → 00’s: Agile
  • 11. DEVELOPMENT PROCESS EVOLUTION (CONT.)
    Waterfall
    • Advantages: simple model and easy to manage; applicable for small software
    • Disadvantages: "Big Design Up Front"; defects detected at late phases; high amounts of risk and uncertainty
    V-Model
    • Advantages: early testing involvement; clear relationship between test phases and development phases
    • Disadvantages: still possesses the limitations of a sequential model; requires a high amount of documentation; duplication of testing effort
    RUP
    • Advantages: risk and uncertainty are managed; testing activities and process are managed
    • Disadvantages: heavy documentation; late customer involvement – only at UAT
    Agile
    • Advantages: adaptable to changes; early client involvement avoids unrealistic requirements; avoids spending time on useless activities
    • Disadvantages: requires highly capable people; needs a representative from the client; problems scaling up the architecture
  • 12. SO HOW HAS TESTING CHANGED?
    60’–80’: Nice To Have
    • Black-box testing
    • System testing
    • Functional testing
    • Part-time tester
    90’: Should Have
    • Grey-box testing
    • System/Integration testing
    • Functional testing
    • Full-time tester
    00’: Must Have
    • White-box testing
    • System-system testing
    • Non-functional testing
    • Fit-for-use
    • Professional tester
  • 13. AGENDA
    • Testing Process Evolution
    • Fundamental Testing Process
    • Components of Testing Process Framework
    • Best Practices in Testing
  • 14. PHASES IN TESTING PROCESS
  • 15. COMPONENTS OF TESTING PROCESS
    Templates
    • TM - Test Plan Template
    • TM - Test Strategy Template
    • TM - Test Case Template
    • TM - Test Estimation Template
    • TM - Test Metrics Dashboard Template
    • TM - Defect Tracking Report Template
    • TM - Requirement to Test TM Template
    • TM - Test Daily / Weekly / Summary Report Template
    Checklists
    • CK - Test Readiness Checklist
    • CK - Test Plan Review Checklist
    • CK - Test Case Review Checklist
    • CK - User Acceptance Test Checklist
    • …
    Guidelines
    • GD - Defect Tracking Guidelines
    • GD - Test Metrics Guidelines
    • GD - KPI Metrics Guidelines
    • GD - Test Estimation Guidelines
    • GD - User Acceptance Test Guidelines
    Process
    • PR - Testing Process (Detail)
    • PR - Testing Process Diagram (Xmind)
  • 16. RELATIONSHIP WITH OTHER PROCESSES
    Testing relates to: Requirement, Design, Implementation, Deployment, Software Quality Assurance, Risk Management, Project Management, CM
  • 17. AGENDA
    • Testing Process Evolution
    • Fundamental Testing Process
    • Components of Testing Process Framework
    • Best Practices in Testing
  • 18. WHAT ELSE DO WE NEED FOR THE PROCESS?
    Plan Test → Design Test → Execute Test → Close Test
    Actual testing needs more than just the fundamental process:
    • Solutions
    • Best Practices
    • Standards
    • Tools
    …and more, to become a “Test Center of Excellence”
  • 19. TESTING CENTER OF EXCELLENCE
    TCoE = Processes + Practices + Solutions
    Test Solutions
    • Automation Testing
    • Performance Testing
    • Mobile Testing
    • Specialty Testing
    Best Practices
    • Process Assessment
    • Testing Estimation
    • Continuous Process Improvement
    • Exploratory/Risk-based Testing
    Foundation: Quality Policy, Guidelines & Templates, Fundamental Testing Process (Plan Test → Design Test → Execute Test → Close Test), Quality Metrics & Standards
  • 20. WHY TEST SOLUTIONS?
    About the Client
    • Clearleap was the first data streaming company to offer a complete platform that makes "TV everywhere" possible
    Business Challenges
    • Simulate a high volume of concurrent users (100,000+)
    • Complete within a tight schedule
    • Limited budget for tools
    KMS's Solutions
    • Tool Evaluation: execute a proof of concept to evaluate both commercial and open source tools
    • Planning: determine a test strategy and approaches
    • Test Design and Development: design and develop a scalable load testing architecture
    • Execution and Reporting: perform load testing and analyze/report test results
    Achievements
    • Developed a scalable solution based on JMeter
    • Dramatically reduced the cost of testing and increased ROI
    • Found critical performance issues
  • 21. WHY TEST SOLUTIONS? (CONT.)
    • It takes months to build up a solution from scratch
    • Cost of commercial tools vs. open source tools
    • Effective solutions differentiate us from other vendors
    Typical Testing Solutions:
    – Automation testing (web, desktop, mobile)
    – Performance/Load testing
    – Security testing
    – Database/ETL testing …
  • 22. WHY BEST PRACTICES?
    About the Client
    • Global company supporting clinical trials in 67 countries. The client offers services that include behavioral science, information technology, and clinical research
    Business Challenges
    • 100% on-time delivery with zero critical bugs
    • Complicated paper process following FDA regulations
    • Various testing platforms for both mobile devices and desktop
    KMS's Solution
    • Process Establishment: identify gaps in the current process; leverage state-of-the-art practices
    • Process Improvement: define and measure performance/quality metrics
    • Lifecycle Testing: perform all lifecycle testing activities
    • Test Automation: develop an automation framework to shorten the test cycle
    Achievements
    • The new process helped reduce testing effort by 60%
    • No critical defects identified during 1 year of engagement
    • Moved the paper-based process to a test management system, opening a new trend in the clinical trial industry
  • 23. WHY BEST PRACTICES? (CONT.)
    • A best practice improves the outcome of activities
    • A best practice has proven its effectiveness
    • The more practices we use, the higher our maturity
    Typical Testing Best Practices:
    – Review and Lessons Learned
    – Root Cause Analysis
    – Risk-based/Exploratory Testing
    – Estimation Method, ROI Model
    – Quality Metrics Dashboard
  • 24. AGENDA
    • Testing Process Evolution
    • Fundamental Testing Process
    • Components of Testing Process Framework
    • Best Practices in Testing
  • 25. CONTINUOUS PROCESS IMPROVEMENT
    Definition: CPI is an ongoing effort to improve the quality of products, services, or processes
    In software testing, CPI seeks improvement of:
    • Quality
    • Productivity
    • Cost of Quality
    • Time to Market …
    Cycle: Assess → Plan → Implement → Evaluate
  • 26. QUALITY METRICS
    "Metrics are standards of measurement by which efficiency, performance, progress, or quality of a plan, process, project or product can be assessed with the aim to support continuous improvement" (Wikipedia)
    Three metric categories in practice:
    – Product Quality Metrics – how good the overall quality of the product is
    – Process Effectiveness Metrics – how well the delivery processes are performed
    – Testing and Test Automation Metrics – detailed status of testing activities and test outcomes
  • 27. QUALITY METRICS (CONT.)
    Product Quality Metrics
    • Defects by Status
    • Open Defects by Severity
    • Open Defects by Severity & Functional Area
    • Open Defects by Severity & Release
    • Open Defect Aging …
    Process Effectiveness Metrics
    • Defect Identification in Pre-Prod / Prod
    • Weekly Defect Rates per Environment
    • Defect Escape Ratio
    • Defects by Phase Found / Functional Area
    • Defects by Origin / Functional Area …
    Testing Metrics
    • Test Coverage Planning
    • Execution Status / Execution Rate by Functional Area/Cycle
    • Defect Rejection Ratio
    • Test Productivity …
    Test Automation Metrics
    • Percent Automatable
    • Automation Progress
    • Percent of Automated Testing Coverage …
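    To make a few of these metrics concrete, here is a small calculation sketch in Python. It assumes common definitions (Defect Escape Ratio as the share of defects found in production, Defect Rejection Ratio as the share of reported defects that were rejected, Percent Automatable as the share of test cases that can be automated); these definitions and all figures are assumptions for illustration, not taken from the deck.

```python
# Sketch of a few testing metrics, using commonly assumed definitions (hypothetical data).
defects_found_pre_prod = 180
defects_found_in_prod  = 20
defects_rejected       = 15          # reported defects later rejected as invalid/duplicate
total_reported_defects = 215
automatable_test_cases = 640
total_test_cases       = 800

# Defect Escape Ratio: defects that "escaped" to production vs. all defects found
defect_escape_ratio = defects_found_in_prod / (defects_found_pre_prod + defects_found_in_prod)

# Defect Rejection Ratio: rejected defect reports vs. all reported defects
defect_rejection_ratio = defects_rejected / total_reported_defects

# Percent Automatable: test cases that can be automated vs. all test cases
percent_automatable = automatable_test_cases / total_test_cases

print(f"Defect Escape Ratio:    {defect_escape_ratio:.1%}")
print(f"Defect Rejection Ratio: {defect_rejection_ratio:.1%}")
print(f"Percent Automatable:    {percent_automatable:.1%}")
```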
  • 28. RISK-BASED TESTING
    Definition: Risk-based testing is a testing method that relies on identified risks to:
    – determine the "right level" of quality
    – prioritize the tests and testing effort
    – focus on the most important testing areas first
    with the aim of being clear about the current quality status and getting the best return by the time testing completes
  • 29. EXPLORATORY TESTING
    "A style of testing in which you explore the software while simultaneously designing and executing tests, using feedback from the last test to inform the next." – Elisabeth Hendrickson
    This type of testing helps:
    • Discover unknown and undetected bugs
    • Testers learn new methods and test strategies, and think outside the box
  • 30. WHAT IS A GOOD TESTING PROCESS?
  • 31. WHAT IS A GOOD TESTING PROCESS? (CONT.)
    1. Quality Gates/Checkpoints
    2. Peer Review
    3. Metrics-driven Management
    4. Root Cause Analysis
    5. Defect Prevention
  • 32. CHALLENGES IN ADOPTING A NEW PROCESS
    1. Fear of change
    2. Lack of management support
    3. Lack of supporting tools/solutions
    4. Not a long-term solution
    5. Takes time to bring value
  • 33. © 2012 KMS Technology
    SOFTWARE TESTING ESTIMATION
    Vu Pham
  • 34. AGENDA
    • Importance of Software Estimation
    • qEstimate - Test Case Point Analysis
    • Effort Estimation Methods using qEstimate
  • 35. IMPORTANCE OF SOFTWARE ESTIMATION
    • Software estimation
    – the process of determining the size, cost, and time of software projects, often before work is performed
    • Estimation is important to the success or failure of software projects. It provides input for:
    – Making investment decisions
    – Budget and staff allocation
    – Stakeholder/client negotiation …
  • 36. WHY IS TESTING ESTIMATION IMPORTANT?
    • Testing may consume up to 50% of project effort
    – ~70% of effort in mission-critical systems
    • Current problems
    – No estimation for testing
    – Estimation is done for the whole project rather than for testing
  • 37. POPULAR SOFTWARE ESTIMATION METHODS
    • Sizing Methods
    – Source Lines of Code (SLOC)
    – Function Point Analysis …
    • Effort Estimation Methods
    – Expert Judgment/Experience
    – Productivity Index …
    • "Guesstimate" Estimation Method
    – Using a test distribution percentage (e.g., testing is 30% of total effort)
  • 38. AGENDA
    • Importance of Software Estimation
    • qEstimate - Test Case Point Analysis
    • Effort Estimation Methods using qEstimate
  • 39. QESTIMATE – TESTING ESTIMATION
    • qEstimate (Test Case Point Analysis) estimates the size of testing using test cases as input
    • Test case complexity is based on 4 elements:
    – Checkpoints
    – Precondition
    – Test Data
    – Type of Test
    qEstimate: http://www.qasymphony.com/media/2012/01/Test-Case-Point-Analysis.pdf
  • 40. QESTIMATE – TESTING ESTIMATION (CONT.)
    Test Cases → Count Checkpoints → Determine Precondition Complexity → Determine Test Data Complexity → Unadjusted TCP → Adjust with Test Type → TCP
  • 41. AGENDA
    • Importance of Software Estimation
    • qEstimate - Test Case Point Analysis
    • Effort Estimation Methods using qEstimate
  • 42. ESTIMATE TESTING EFFORT (CONT.)
    Typically, testing effort is distributed into phases as shown below.
  • 43. PRODUCTIVITY INDEX
    • Effort is computed using the Productivity Index of similar completed projects
    • Productivity Index is measured as TCP per person-hour
    PI = Average(TCP / Actual Effort)
    Effort (hrs) = TCP / Productivity Index
    A simple method
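    For example, a minimal sketch of the Productivity Index method in Python; the historical TCP and effort figures below are hypothetical, used only to show the calculation.

```python
# Illustrative sketch of the Productivity Index (PI) method.
# Historical (TCP, actual effort in person-hours) pairs are hypothetical.
historical = [(320, 400), (150, 210), (480, 575)]

# PI = Average(TCP / Actual Effort) -- test case points delivered per person-hour
pi = sum(tcp / effort for tcp, effort in historical) / len(historical)

# Effort (hrs) = TCP / Productivity Index for the new project
new_project_tcp = 260
estimated_effort = new_project_tcp / pi

print(f"Productivity Index: {pi:.2f} TCP/person-hour")
print(f"Estimated effort:   {estimated_effort:.0f} person-hours")
```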
  • 44. REGRESSION ANALYSIS
    • Estimate the effort of new projects using the size and effort of completed projects
    y = Ax + B, where A and B are calculated from historical data
    (Chart: Effort (PM) vs. Adjusted TCP)
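    A minimal sketch of the regression step in Python, fitting y = Ax + B to hypothetical historical (Adjusted TCP, effort) pairs with numpy's least-squares polynomial fit; numpy is assumed to be available.

```python
import numpy as np

# Hypothetical historical data: adjusted TCP (x) and effort in person-months (y)
adjusted_tcp = np.array([100, 250, 400, 600, 850])
effort_pm    = np.array([12,  25,  38,  55,  78])

# Fit y = A*x + B by least squares (degree-1 polynomial fit)
A, B = np.polyfit(adjusted_tcp, effort_pm, 1)

# Predict effort for a new project from its adjusted TCP
new_tcp = 500
predicted_effort = A * new_tcp + B
print(f"y = {A:.3f}x + {B:.2f}; predicted effort for {new_tcp} TCP: {predicted_effort:.1f} PM")
```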
  • 45. © 2012 KMS Technology
    AUTOMATION TESTING & TOOLS
    Thao Vo
  • 46. AGENDA
    • Software Test Automation
    • Software Performance Testing
    • Tools Supporting Testing
  • 47. THINKING OF AUTOMATION
    Test automation is… the use of software and tools to perform the testing
    • Code-Driven – testing at the source code level with a variety of input arguments
    • GUI-Driven – testing at the GUI level via keystrokes, mouse clicks, etc.
    Business values of automation
    • Greater Coverage – more time for QA to do manual exploratory/risk-based testing
    • Improved Testing Productivity – test suites can be run earlier and nightly
    • Reduced Testing Cycle – helps shorten time-to-market
    • Doing what manual testing cannot – e.g., load testing
    • Using Testers Effectively – automation reduces tedium and improves team morale
    • Increased Reusability – tests can be run across different platforms and environments
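    As an illustration of the code-driven style, here is a minimal sketch in Python using the standard unittest module; the apply_discount function under test is a hypothetical example, not part of the deck.

```python
import unittest

def apply_discount(price, percent):
    """Hypothetical function under test: apply a percentage discount to a price."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class ApplyDiscountTest(unittest.TestCase):
    def test_typical_discount(self):
        self.assertEqual(apply_discount(200.0, 25), 150.0)

    def test_zero_discount_returns_original_price(self):
        self.assertEqual(apply_discount(99.99, 0), 99.99)

    def test_invalid_percent_is_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)

if __name__ == "__main__":
    unittest.main()
```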
  • 48. THINKING OF RETURN ON INVESTMENT
    ROI: the most important measurement for test automation
    Investment: tool, implementation, maintenance, training, etc.
    Return: saved time, early response, reliability, repeatability, etc.
    • ROI (effort): planning, development, maintenance, training, etc.
    • ROI (cost): tool license, environment, management, automation resources, etc.
    • ROI (quality): defects found, test coverage, etc.
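    A minimal sketch of a cost-based ROI comparison in Python; the model (ROI = (savings − investment) / investment) and all figures are illustrative assumptions rather than the deck's own ROI formula.

```python
# Illustrative cost-based ROI model for test automation (all numbers hypothetical).
tool_license      = 2000.0   # one-off tool cost
development_hours = 300      # hours to build the automated suite
maintenance_hours = 10       # hours per regression cycle to maintain scripts
manual_hours      = 80       # hours to execute the same suite manually, per cycle
hourly_rate       = 30.0     # blended cost per tester hour
cycles_per_year   = 12       # how often the regression suite runs

investment = tool_license + development_hours * hourly_rate
savings    = (manual_hours - maintenance_hours) * hourly_rate * cycles_per_year

roi = (savings - investment) / investment
print(f"Investment: ${investment:,.0f}, yearly savings: ${savings:,.0f}, ROI: {roi:.0%}")
```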
  • 49. END-TO-END TEST AUTOMATION PROCESS
    1. Assessment, Evaluation
    2. Pilot, Planning
    3. Design, Implementation
    4. Execution, Report
    5. Maintenance
    (Mapped onto the fundamental process: Plan Test → Design Test → Execute Test → Close Test)
  • 50. ASSESSMENT & EVALUATION
    • Assessment
    – Understand the organization's vision, priorities, process & methodology
    – Understand the application & technology
    – Identify the test requirements
    • Evaluation
    – Vendor discussion (optional)
    – Tool evaluation
    – Recommendations
    – Finalize testing tools
  • 51. PILOT & PLANNING
    • Pilot
    – Do a proof of concept
    – Define the test process
    – Finalize test approach & methodology
    – Define entry & exit criteria
    • Planning
    – Identify test requirements and test cases for automation
    – Set up the test environment
    – Define the automation framework
    – Finalize resources and test schedule
  • 52. DESIGN & IMPLEMENTATION
    • Design
    – Define standards, guidelines, pre- & post-test procedures
    – Design input and output data
    – Define monitoring tools and report metrics
    – Design the automation framework
    • Implementation
    – Build driver scripts, actions, keywords, data-driven tests
    – Build scripts
    – Validate and run against the application under test
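    To illustrate the driver script, keyword and data-driven pieces mentioned above, here is a minimal sketch in Python; the keyword names, actions and test data are hypothetical, and a real framework would dispatch these actions to a browser or API driver instead of printing.

```python
# Minimal keyword-driven driver sketch (keywords and data are hypothetical).

def open_page(url):
    print(f"[action] opening {url}")          # a real framework would call a browser driver

def enter_text(field, value):
    print(f"[action] typing '{value}' into {field}")

def verify_title(expected):
    print(f"[check] page title should be '{expected}'")

# Keyword table: maps keyword names to actions
KEYWORDS = {"open_page": open_page, "enter_text": enter_text, "verify_title": verify_title}

# Data-driven test case: a list of (keyword, arguments) steps
LOGIN_TEST = [
    ("open_page",    ["https://example.com/login"]),
    ("enter_text",   ["username", "tester01"]),
    ("enter_text",   ["password", "secret"]),
    ("verify_title", ["Dashboard"]),
]

def run(test_steps):
    """Driver script: execute each (keyword, args) step in order."""
    for keyword, args in test_steps:
        KEYWORDS[keyword](*args)

if __name__ == "__main__":
    run(LOGIN_TEST)
```

    The design choice here is that testers maintain only the step tables, while the keyword implementations stay in one place.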
  • 53. EXECUTION & MAINTENANCE
    • Execution & Report
    – Set up the environment
    – Run and schedule tests
    – Provide detailed and summary reports
    – Provide an automation handbook & training
    • Maintenance
    – Implement new change requests
    – Define new enhancements
    – Keep up to date with new functions of the application under test
  • 54. AUTOMATION TOOLS LANDSCAPE
    Commercial
    • Tools: Quick Test Professional (HP), Functional Tester (IBM), SilkTest (Micro Focus), TestComplete (SmartBear), eggPlant (TestPlant), etc.
    • Advantages: easy to use; support multiple technologies
    • Disadvantages: costly option (> $2K/license); lack of customization or limited integration with other tools
    Open Source
    • Tools: Selenium, Watir, Robotium, Cucumber, JMeter, SoapUI, etc.
    • Advantages: free; can be integrated with other open source tools
    • Disadvantages: some tools have limited community support; need customization to suit the product under test
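    For the GUI-driven category, here is a minimal Selenium WebDriver sketch in Python (assuming the selenium package and a local Chrome driver are installed; the target URL and the search field named "q" are hypothetical placeholders).

```python
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.common.keys import Keys

# GUI-driven test sketch: drive a browser the way a user would.
driver = webdriver.Chrome()           # assumes a local chromedriver is available
try:
    driver.get("https://example.com") # hypothetical target page
    assert "Example" in driver.title  # simple checkpoint on the page title

    # Interact with a hypothetical search field, then submit
    search_box = driver.find_element(By.NAME, "q")
    search_box.send_keys("software testing")
    search_box.send_keys(Keys.RETURN)
finally:
    driver.quit()                     # always release the browser session
```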
  • 55. AUTOMATION CHALLENGES
    • High up-front investment cost
    • Demand for skilled resources
    • Selection of the best testing tools and approach
    • Ineffective collaboration process
    • Persuading stakeholders to say “Yes”
  • 56. AND SOLUTIONS
    The above challenges can be resolved by investing in an effective automation solution that is:
    • Flexible enough to leverage the open-source landscape
    • Based on a high-level description language so any tester can use it
    • Able to generate useful reports and metrics
    • Tool-agnostic, so automated tests can be run with any tool
  • 57. A SAMPLE BEST PRACTICE
    About the Client
    • Smart pens revolutionize the act of writing by recording and linking speech to handwriting, fundamentally advancing the way people capture, access and share written and spoken information in the paper and digital worlds
    Business Challenges
    • Various testing methods & techniques: web service, performance and API testing
    • Multiple iterations, with an early and frequent need for regression testing and limited resources
    KMS's Solution
    • Automation Planning: define a test strategy & approach for load, performance & API testing using open source tools
    • Automation Design and Development: design and develop an effective test automation framework
    Achievements
    • 70% of all testing has been automated using open source solutions
    • The framework and API/web service testing have dramatically reduced cost and increased ROI
    • A performance automation solution has been implemented and runs regularly to identify/isolate potential bottlenecks and improve system up-time for the client's businesses
  • 58. AGENDA
    • Software Test Automation
    • Software Performance Testing
    • Tools Supporting Testing
  • 59. PERFORMANCE TESTING
    Determines… user expectations, system constraints, costs
    Focuses on… speed, scalability, stability
    To answer… How many…? How much…? What happens if…?
  • 60. PERFORMANCE TESTING OVERVIEW
    Crowd
    • How many users before crashing?
    • Do we have enough hardware?
    • Where are the bottlenecks in the system?
    Speed
    • Is the system fast enough to make customers happy?
    • Will it slow down or will it crash?
    • Did I purchase enough bandwidth from my ISP?
    Availability
    • How reliable is our system?
    • Will our system cope with the unexpected?
    • What will happen if our business grows?
    Why it matters:
    • The failure of an application can be costly
    • Locate potential problems before our customers do
    • Assess performance and functionality under real-world conditions
    • Reduce infrastructure cost
  • 61. PERFORMANCE TESTING PROCESS
    Phases: Planning → Preparation → Baseline → Execution → Report
    Activities include: assessment, plan, strategy, objectives; setup, test data, develop, validate; execute, analyze, optimize; final report
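    As a small illustration of the execution step, here is a concurrent load-generation sketch in Python using only the standard library; the target URL and user counts are hypothetical, and a real engagement would use a dedicated tool such as JMeter or LoadRunner.

```python
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

TARGET_URL = "https://example.com/"   # hypothetical system under test
VIRTUAL_USERS = 20                    # concurrent simulated users
REQUESTS_PER_USER = 5

def user_session(user_id):
    """Simulate one virtual user issuing sequential requests and record latencies."""
    latencies = []
    for _ in range(REQUESTS_PER_USER):
        start = time.perf_counter()
        with urllib.request.urlopen(TARGET_URL, timeout=10) as resp:
            resp.read()
        latencies.append(time.perf_counter() - start)
    return latencies

if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=VIRTUAL_USERS) as pool:
        results = list(pool.map(user_session, range(VIRTUAL_USERS)))
    all_latencies = sorted(t for user in results for t in user)
    p95 = all_latencies[int(len(all_latencies) * 0.95) - 1]
    print(f"{len(all_latencies)} requests, p95 latency: {p95 * 1000:.0f} ms")
```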
  • 62. PERFORMANCE TESTING CHALLENGES
    Right Test Environment
    • How to replicate the production environment as closely as possible
    • Misleading data
    • Different hardware configurations
    Testing Tool Selection
    • Commercial tools: HP LoadRunner, IBM Rational Performance Tester, Segue SilkPerformer, RadView WebLoad, NeoLoad, etc.
    • Open source tools: JMeter, OpenSTA, The Grinder, LoadUI, etc.
    Non-functional Requirements Exploration
    • Ambiguous requirements
    • Unclear, unknown requirements
    • Best practice: start with minimum load and consult the developers
    Incorrect Implementation
    • Ineffective framework
    • Selecting the correct scenarios
    • Wrong script implementation
    Monitoring & Analysis
    • Monitor the performance of each server in distributed testing
    • Collect a huge amount of output data and analyze the bottlenecks, slow spots, etc.
  • 63. THE FUTURE CHALLENGES OF AUTOMATION
  • 64. AGENDA
    • Software Test Automation
    • Software Performance Testing
    • Tools Supporting Testing
  • 65. TESTING TOOLS LANDSCAPE
    ALM – Application Lifecycle Management
    • Purpose: communicate across multiple project teams
    • Typical tools: Rally, VersionOne, HP ALM
    TMS – Test Management System
    • Purpose: manage the requirement-test matrix
    • Typical tools: HP QC, TestLink, QAComplete, qTest
    DTS – Defect Tracking System
    • Purpose: manage defects
    • Typical tools: BugZilla, Jira, Mantis
    ATT – Automation Testing Tools
    • Purpose: regression and specific tests
    • Typical tools: QTP, TestComplete, Selenium, Watir, JMeter, LoadRunner
  • 66. NEW TRENDS IN TESTING TOOLS
    Save Time & Less Work
    • Auto-sync requirements, test cases & defects
    • Import/export, integrate with other systems
    • Capture tools integrated into defect tracking tools
    Faster & Easier to Use
    • View, mark results, update test cases and defects without leaving the target test application
    • Create defects quickly
    Customization & Integration
    • Easy to customize new features
    • Integrate with many specialized tools
    More Control, Visibility
    • Control and keep track of changes and assignments
    • Track status across lifecycles
    • View real-time status, statistical data, associated trends
    Cloud Deployment
    • Flexible and low cost
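    As an example of the integration trend (creating defects quickly from other tools), here is a minimal sketch in Python that files a bug through a defect tracking system's REST interface; Jira's REST API v2 issue-creation endpoint is assumed, and the base URL, project key and credentials are placeholders, not real values.

```python
import requests

# Minimal sketch: file a defect in a DTS. Jira's REST API v2 issue endpoint is
# assumed; base URL, project key and credentials below are placeholders.
JIRA_BASE = "https://jira.example.com"
AUTH = ("automation-bot", "api-token")   # placeholder credentials

payload = {
    "fields": {
        "project": {"key": "QA"},                        # placeholder project key
        "summary": "Login page returns 500 under load",
        "description": "Observed during nightly automated regression run.",
        "issuetype": {"name": "Bug"},
    }
}

resp = requests.post(f"{JIRA_BASE}/rest/api/2/issue", json=payload, auth=AUTH)
resp.raise_for_status()
print("Created defect:", resp.json()["key"])   # e.g. QA-123
```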
  • 67. TO BE A GOOD AUTOMATION QA…
    Technical Skills:
    • Software development
    • Testing mindset, testing methodologies and types (both functional and non-functional testing)
    • Testing and development tools
    • Operating systems, networking, databases
    • Technical writing
    Others:
    • Domain knowledge
    • Soft skills
  • 68. © 2012 KMS Technology
    FUTURE OF SOFTWARE TESTING
    Vu Pham
  • 69. WHERE ARE WE?
    • Ho Chi Minh City and Hanoi have continuously ranked in the top 10 emerging IT outsourcing cities ('07 – today)
    http://www.tholons.com/Top50_article.pdf
    • What is the typical ratio of testers in a Vietnamese IT company?
  • 70. WHERE ARE WE? (CONT.)
    Ho Chi Minh City is a destination of global outsourcing in testing
  • 71. WHAT ARE OUR OPPORTUNITIES?
    Facts:
    • The testing outsourcing market value has tripled every 4 years
    • Many Vietnamese outsourcing companies are testing-focused: LogiGear, TMA, Global CyberSoft, KMS …
  • 72. FUTURE OF SOFTWARE TESTING
    1. Faster – Higher – Stronger
    • Faster releases – need value from every hour spent on testing
    • Higher quality – greater test coverage of specified and implied requirements
    • Stronger capability – not only functionality but also performance, security, usability …; ability to develop test solutions
    2. Complicated technology/application platforms
    • Cloud computing, mobile, enterprise systems …
  • 73. FUTURE OF SOFTWARE TESTING (CONT.)
    3. Global testing teams – global competition
    • Communication, crowd-sourced testing ...
    4. Automation testing is a must
    • More effective solutions are needed
    5. Less on processes, more on practices
    • Agile, exploratory, rapid testing
  • 74. SUMMARY
    1. Testing is crucial for today's business
    2. It has become a profession of choice
    3. Vietnam is a destination for testing outsourcing
    4. Automation testing is a must in the future
    5. It requires an intellectual, analytical and creative mindset
    6. It takes years to become good
    7. You can't become good just by learning from daily work
    8. It offers fast-paced career advancement
  • 75. © 2012 KMS Technology
    Q & A
  • 76. © 2012 KMS Technology
    APPENDIX
  • 77. QESTIMATE – TESTING ESTIMATION
    • qEstimate (Test Case Point Analysis) estimates the size of testing using test cases as input
    • Test case complexity is based on 4 elements:
    – Checkpoints
    – Precondition
    – Test Data
    – Type of Test
  • 78. QESTIMATE – TESTING ESTIMATION (CONT.)
    Test Cases → Count Checkpoints → Determine Precondition Complexity → Determine Test Data Complexity → Unadjusted TCP → Adjust with Test Type → TCP
  • 79. CHECKPOINT
    • Checkpoint:
    – the condition in which the tester verifies whether the result produced by the target function matches the expected criteria
    – one test case consists of one or many checkpoints
    • Counting rule: one checkpoint is counted as one Test Case Point
  • 80. PRECONDITION
    • Counting rule: each complexity level of the precondition is assigned a number of Test Case Points
  • 81. TEST DATA
    • Counting rule: each complexity level of the Test Data is assigned a number of Test Case Points
  • 82. ADJUSTED TEST CASE POINT
    • The Test Case Points counted up to this point are considered Unadjusted Test Case Points (UTCP):
      UTCP = Σ (i=1..n) TCP_i
    • UTCP is adjusted by considering the type of each test case
    – Each type of test case is assigned a weight
    – Adjusted Test Case Point (ATCP):
      ATCP = Σ (i=1..n) UTCP_i × W_i
    where:
    • UTCP_i – the number of UTCP counted for the i-th test case
    • W_i – the weight of the i-th test case, taking into account its test type
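    A small worked sketch of the UTCP/ATCP calculation in Python; the per-test-case checkpoint counts, complexity points and test-type weights are hypothetical, since the weight tables on slides 83-84 are not reproduced in this transcript.

```python
# Hypothetical test cases: (checkpoints, precondition TCP, test data TCP, type weight).
# Weights per test type would normally come from the qEstimate weight tables.
test_cases = [
    {"checkpoints": 3, "precondition": 1, "test_data": 2, "weight": 1.0},  # functional
    {"checkpoints": 5, "precondition": 2, "test_data": 3, "weight": 1.2},  # API
    {"checkpoints": 2, "precondition": 1, "test_data": 1, "weight": 1.5},  # performance
]

# Unadjusted TCP per test case: one checkpoint counts as one TCP,
# plus the points assigned for precondition and test data complexity.
utcp_per_case = [tc["checkpoints"] + tc["precondition"] + tc["test_data"] for tc in test_cases]

utcp = sum(utcp_per_case)                                                   # UTCP = Σ TCP_i
atcp = sum(u * tc["weight"] for u, tc in zip(utcp_per_case, test_cases))    # ATCP = Σ UTCP_i × W_i

print(f"UTCP = {utcp}, ATCP = {atcp:.1f}")
```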
  • 83. WEIGHT BY TYPE OF TEST
  • 84. WEIGHT BY TYPE OF TEST (CONT.)
  • 85. SUMMARY OF THE PROCESS
    Test Cases → Count Checkpoints → Determine Precondition Complexity → Determine Test Data Complexity → Unadjusted TCP → Adjust with Test Type → TCP
  • 86. AGENDA
    • Importance of Software Estimation
    • qEstimate - Test Case Point Analysis
    • Effort Estimation Methods using qEstimate
  • 87. ESTIMATE TESTING EFFORT
    • Estimate testing effort using TCP
    • Test effort is distributed into four phases (each of these phases may be performed multiple times in a project):
    – Test Planning (TP)
    – Test Analysis and Design (TAD)
    – Test Execution (TE)
    – Defect Tracking and Reporting (DTR)
    • Effort estimation methods:
    – Productivity Index (PI)
    – Regression Analysis (RA)
  • 88. ESTIMATE TESTING EFFORT (CONT.)
    • The challenge is how to estimate the effort needed for each phase of the testing process
    • Typically, testing effort is distributed into phases as shown below.
  • 89. PRODUCTIVITY INDEX
    • Effort is computed using the Productivity Index of similar completed projects
    • Productivity Index is measured as TCP per person-hour
    PI = Average(TCP / Actual Effort)
    Effort (hrs) = TCP / Productivity Index
    A simple method
  • 90. REGRESSION ANALYSIS
    • Estimate the effort of new projects using the size and effort of completed projects
    y = Ax + B, where A and B are calculated from historical data
    (Chart: Effort (PM) vs. Adjusted TCP)
  • 91. SUMMARY OF THE QESTIMATE PROCESS
    qEstimate: http://www.qasymphony.com/media/2012/01/Test-Case-Point-Analysis.pdf