Test Automation—It’s a Journey, Not a Project

Organizations implement automated testing in hopes of reducing time and cost while creating higher quality products. They invest in tools, training, and development; identify candidates for automation; and then develop and execute test scripts. Unfortunately, some of these projects are considered failures because the promised savings are not realized. Funding for future automation efforts may be reduced, or automation projects may be cancelled outright. Management support, effective test design, sound test data management, low maintenance requirements, and meaningful reports are key to successful, cost-effective automation. Paul Maddison shares how CUMIS has invested more than ten years in test automation—continually growing their investment, reducing costs, and shrinking time-to-market. Paul details their approaches for script design, data management, test candidate selection, and effective reporting to management. Learn approaches that can mitigate challenges and give you a higher return on investment—today and well into the future.

Transcript

  • 1. W7 Concurrent Session
    4/9/2014 12:45 PM
    “Test Automation‐It’s a Journey, Not a Project”
    Presented by: Paul Maddison, The CUMIS Group
    Brought to you by:
    340 Corporate Way, Suite 300, Orange Park, FL 32073
    888‐268‐8770 ∙ 904‐278‐0524 ∙ sqeinfo@sqe.com ∙ www.sqe.com
  • 2. Paul Maddison, The CUMIS Group
    Senior quality assurance analyst Paul Maddison has more than ten years of experience in automated testing using a variety of tools. Working with business analysts and testers to identify automation candidates, Paul is continually expanding testing coverage, increasing the return on investment, and reducing regression testing timeframes. He has coordinated the automation team’s development and maintenance of a regression test bed of more than 8,000 scenarios representing about 75 percent of the overall test effort. Recently, Paul designed and developed a self-serve approach to automation execution for use by business analysts and testers, allowing the automation team to focus on coding efforts to replace script execution. Contact Paul at paul.maddison@cumis.com.
  • 3. Test Automation—It’s a Journey, Not a Project
    Paul Maddison, The CUMIS Group, paul.maddison@cumis.com

    The CUMIS Group
    The CUMIS Group Limited (CUMIS) partners with credit unions to deliver competitive insurance and financial solutions. In doing so, it creates financial security and promotes the growth and success of the credit union system in Canada. As the leading provider of insurance-related products and services to the Canadian credit union system, CUMIS serves approximately 380 credit unions, with a total of more than five million members.
  • 4. Getting Started
    » Resources
      – Experienced developers.
      – Aptitude for testing.
      – Strong unit testing track record.
    » Tool Selection
      – Establish your requirements.
      – Demo on your software.
      – Take a test drive.
      – Report generation & augmentation.
      – Training availability.
    » Management Involvement
      – Visible management support.
      – Current staff may look at the project as a threat.
      – Establish development milestones.
      – Celebrate successes.
      – Communicate, communicate, communicate.
  • 5. Return On Investment
    Test Candidate Selection
    » Prerequisites
      – Reliable test environment.
      – Existence of effective manual test cases.
      – Availability of subject matter experts.
    » Failure Impact
      – Company credibility.
      – Effect on the bottom line.
    » Manual Testing Effort
      – High number of resource-intensive test cases.
      – Similar test cases with various data combinations.
  • 6. Test Candidate Selection
    » Effort Savings Formula
      – Manual effort = time per test case × number of test cases × number of test cycles.
      – e.g., 10 minutes × 50 test cases × 3 test cycles = 25 hours of manual testing effort.
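The effort-savings formula can be sketched as a small helper; the function name is an illustrative choice, not something from the presentation:

```python
# Hypothetical helper for the slide's effort-savings formula:
# manual effort = minutes per test case x test cases x test cycles.
def manual_effort_hours(minutes_per_case: int, test_cases: int, test_cycles: int) -> float:
    """Return the manual testing effort, in hours, that automation would replace."""
    return minutes_per_case * test_cases * test_cycles / 60

# The slide's example: 10 minutes x 50 test cases x 3 cycles = 25 hours.
print(manual_effort_hours(10, 50, 3))  # → 25.0
```

Candidates with a high result (many cases, many cycles, or long per-case times) repay the automation effort soonest.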
  • 7. Test Candidate Selection
    » Insurance Premium Calculations
      – Coverage
      – Term
    Script Design
    » Framework For Reusable Code
      – Flexible functions for data population and workflow.
      – Script maintenance is reduced.
      – Dynamic environment selection.
    » Coding Standards
      – Common naming conventions.
      – Internal & external documentation.
    » Importing Data
      – Allows for creation using other tools and their features.
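The "reusable code with dynamic environment selection" idea can be sketched as follows; the environment URLs, function name, and field names are illustrative assumptions, not CUMIS's actual framework:

```python
# A minimal framework-style sketch: one flexible function builds a request from
# a data record, and the target environment is chosen at run time rather than
# hard-coded into each script. All names and URLs here are hypothetical.
ENVIRONMENTS = {
    "dev": "https://dev.example.com",
    "qa": "https://qa.example.com",
}

def populate_form(env: str, fields: dict) -> dict:
    """Build one request against the selected environment from a data record."""
    base_url = ENVIRONMENTS[env]  # dynamic environment selection
    return {"url": f"{base_url}/quote", "payload": fields}

# Each test script passes only its data; the workflow logic lives in one place.
request = populate_form("qa", {"coverage": 250_000, "term_years": 20})
```

Because every script goes through the shared function, a workflow change is fixed once in the framework instead of in every script, which is where the "script maintenance is reduced" savings come from.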
  • 8. Data Management
    » Importance of Data-Driven Tests
      – Ease of expansion & maintenance.
      – Embedded formulas.
      – Automation script with multiple data sets.
    » Watch Out For Dates
      – Use day or year offsets, e.g., birthdate vs. age.
    » Formatting
      – True/False.
      – Large numeric values.
    Automated Script Reporting
    » Purpose of Reports
      – Must be adequate enough to manually reproduce failing test cases.
    » Levels of Granularity
      – Summary Report Contents
        » Description of each test scenario and the execution result.
        » Number of verifications performed.
        » Timeframe required for execution.
      – Single Scenario Report Contents
        » Data used in the test scenario, expected and generated values.
      – Execution Log Contents
        » Identification of the failing field, expected and generated values.
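The "day or year offsets" warning can be sketched like this: store an age offset in the test data rather than a fixed birthdate, so the data set does not go stale as the calendar moves. The record fields are illustrative:

```python
# Sketch of the birthdate-vs-age offset idea: derive the date at run time from
# an offset so the same data set stays valid year after year.
from datetime import date

def birthdate_from_age(age_years, today=None):
    """Derive a birthdate that always yields the requested age at run time
    (leap-day edge cases are ignored for brevity)."""
    today = today or date.today()
    return today.replace(year=today.year - age_years)

# Each row is one data set driving the same automation script (illustrative fields).
test_data = [
    {"scenario": "young insured", "age": 25, "smoker": False},
    {"scenario": "senior insured", "age": 65, "smoker": True},
]
for row in test_data:
    row["birthdate"] = birthdate_from_age(row["age"])
```

A hard-coded birthdate of 1989 means "age 25" only in 2014; the offset form keeps the scenario's intent intact indefinitely.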
  • 9. Summary Report
    Single Scenario Report
  • 10. Execution Log
    Management Reports
    » Report Generation
      – Source of Metrics
        » Test execution Summary Reports.
        » Manual testing candidate evaluations.
      – Graphics vs. Numbers
        » Use of illustrations.
        » Additional metrics can be supplied if requested.
        » Slice & dice results to generate different views.
      – Granularity
        » Differentiate between Functional & Regression testing.
        » Ensure total automation savings are included.
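The metrics side of report generation can be sketched as a small aggregation over execution results, split by testing type as the slide suggests; the record fields and categories here are illustrative assumptions:

```python
# Sketch: roll raw execution results up into summary metrics, keeping
# functional and regression testing separate. Records are hypothetical.
from collections import Counter

results = [
    {"scenario": "premium calc, term 10", "type": "regression", "passed": True, "checks": 42},
    {"scenario": "premium calc, term 20", "type": "regression", "passed": False, "checks": 40},
    {"scenario": "new rider workflow", "type": "functional", "passed": True, "checks": 12},
]

def summarize(results):
    """Group pass/fail counts and verification totals by testing type."""
    summary = {}
    for r in results:
        s = summary.setdefault(r["type"], Counter())
        s["passed" if r["passed"] else "failed"] += 1
        s["verifications"] += r["checks"]
    return summary

summary = summarize(results)
```

From this one structure the same data can be sliced into different management views (pass rate per type, total verifications automated, and so on) without re-running anything.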
  • 11. Management Reports
  • 12. Taking It Further
    » Test Data Creation
      – Manufacture data files with correct formatting for use in automated tests or for processing in other applications.
    » Data Extraction
      – Extract and save data with specified formatting.
    » Environment Smoke Testing
      – Test connectivity between applications and verify application functionality before starting a test cycle.
    » Response Metrics
      – Compile response metrics for business team review.
    Questions?
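The environment smoke-testing idea can be sketched as a pre-cycle connectivity check, assuming the applications expose HTTP-reachable health endpoints; the endpoint names and URLs are hypothetical:

```python
# A minimal environment smoke test: fail fast on connectivity problems before
# a test cycle starts, instead of discovering them mid-cycle. Endpoints are
# illustrative placeholders, not real systems.
import urllib.request

ENDPOINTS = {
    "policy-admin": "https://qa.example.com/health",
    "billing": "https://qa.example.com/billing/health",
}

def smoke_test(endpoints, timeout=5.0):
    """Return a name -> True/False map of which applications responded."""
    results = {}
    for name, url in endpoints.items():
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                results[name] = resp.status == 200
        except OSError:  # covers DNS failures, refusals, and timeouts
            results[name] = False
    return results
```

Running this before kicking off a regression cycle turns an environment outage into a one-line report rather than hundreds of misleading script failures.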