mVerify™
A Million Users in a Box™

Experience with a Profile-based Automated Testing Environment

Presented at ISSRE 2003
November 18, 2003
Robert V. Binder
mVerify Corporation

www.mVerify.com
Overview

Levels of Automated Testing

System Under Test

Approach

Observations




                     © 2003 mVerify Corporation   2
Musa’s Observation

Testing driven by an operational profile is
very efficient because it identifies failures
(and hence the faults causing them) on
average, in order of how often they occur.

This approach rapidly increases reliability …
because the failures that occur most
frequently are caused by the faulty
operations used most frequently.


                                                 IEEE Software, March 1993

                    © 2003 mVerify Corporation                               3
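To make Musa's point concrete, the sketch below (not from the original deck; operation names and probabilities are illustrative assumptions) shows the core mechanism of profile-driven input selection: operations are drawn pseudo-randomly in proportion to their usage frequency, so the most-used operations are exercised, and therefore debugged, first.

    import java.util.*;

    /** Minimal sketch of profile-driven test selection: draw operations in
     *  proportion to an operational profile. Operation names and probabilities
     *  are illustrative assumptions, not taken from the system described here. */
    public class ProfileSampler {
        private final NavigableMap<Double, String> cdf = new TreeMap<>();
        private final Random rng;
        private double total = 0.0;

        public ProfileSampler(Map<String, Double> profile, long seed) {
            rng = new Random(seed);
            for (Map.Entry<String, Double> e : profile.entrySet()) {
                total += e.getValue();                 // build a cumulative distribution
                cdf.put(total, e.getKey());
            }
        }

        /** Next operation, sampled with probability proportional to its profile weight. */
        public String next() {
            return cdf.higherEntry(rng.nextDouble() * total).getValue();
        }

        public static void main(String[] args) {
            Map<String, Double> profile = Map.of(
                    "enterOrder", 0.55, "cancelOrder", 0.20,
                    "queryBook", 0.15, "adminAction", 0.10);
            ProfileSampler sampler = new ProfileSampler(profile, 42L);
            for (int i = 0; i < 10; i++) System.out.println(sampler.next());
        }
    }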
Promise of Profile-Based Testing

 Tester's point of view versus the reliability analyst's
    Maximize reliability within fixed budget
    Measurement of reliability not primary goal

 Profile-Based Testing is optimal when
    Available information already used
    Must allocate resources to test complex SUT

 Many significant practical obstacles


                         © 2003 mVerify Corporation   4
Testing by Poking Around

Manual “Exploratory” Testing

•Not Effective
•Low Coverage
•Not Repeatable
•Can’t Scale

[Diagram: System Under Test]

                  © 2003 mVerify Corporation                       5
Manual Testing

[Diagram: Test Setup; Manual Test Design/Generation; Manual Test Input; Test Results Evaluation; System Under Test]

•1 test per hour
•Not repeatable

                        © 2003 mVerify Corporation                            6
Automated Test Script

[Diagram: Test Setup; Manual Test Design/Generation; Test Script Programming; Test Evaluation; System Under Test]

•10+ tests per hour
•Repeatable Results
•Brittle

                      © 2003 mVerify Corporation                            7
Automated Generation/Agent

[Diagram: Test Setup; Model-based Test Design/Generation; Automatic Test Execution; Test Evaluation; System Under Test]

•1000+ tests per hour
•High fidelity Results
•Evaluation limited

                    © 2003 mVerify Corporation                            8
Full Test Automation

[Diagram: Automated Test Setup; Model-based Test Design/Generation; Automatic Test Execution; Automated Test Results Evaluation; System Under Test]

•Advanced Mobile App Testing Environment
•Q3 2005

                  © 2003 mVerify Corporation                9
Application Under Test

E-commerce/securities market, screen-based
 trading over private network

3 million transactions per hour

15 billion dollars per day

3 years, version 1.0 live Q4 2001



                      © 2003 mVerify Corporation   10
Development Process/Environment

Rational Unified Process
About 90 use-cases, 600 KLOC Java
Java (services and GUI), some XML
Oracle DBMS
Many legacy interfaces
CORBA/IDL distributed object model
HA Sun server farm
Dedicated test environment

                   © 2003 mVerify Corporation   11
Profile-based Testing Approach

 Executable operational profile
 Simulator generates realistic unique test suites
 Loosely coupled automated test agents
 Oracle/Comparator automatically evaluate results
 Supports integration, functional, and stress testing




                       © 2003 mVerify Corporation    12
Model-based Testing

Profile alone insufficient
Extended Use Case
    RBSC test methodology
    Defines feature usage profile
    Input conditions, output actions
Mode Machine
Invariant Boundaries


                         © 2003 mVerify Corporation   13
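The deck names a "mode machine" but does not show one. The Java sketch below illustrates the general idea under assumed mode and operation names: an abstract state model that constrains which operations the generator may draw in each mode of the SUT, so generated suites stay realistic.

    import java.util.*;

    /** Hypothetical sketch of a "mode machine": an abstract state model that
     *  constrains which operations the generator may produce in each SUT mode.
     *  Mode and operation names are invented for illustration. */
    public class ModeMachine {
        enum Mode { PRE_OPEN, OPEN, HALTED, CLOSED }

        private static final Map<Mode, Map<String, Mode>> TRANSITIONS = Map.of(
                Mode.PRE_OPEN, Map.of("openMarket", Mode.OPEN),
                Mode.OPEN,     Map.of("enterOrder", Mode.OPEN,
                                      "cancelOrder", Mode.OPEN,
                                      "haltTrading", Mode.HALTED,
                                      "closeMarket", Mode.CLOSED),
                Mode.HALTED,   Map.of("resumeTrading", Mode.OPEN),
                Mode.CLOSED,   Map.<String, Mode>of());

        private Mode current = Mode.PRE_OPEN;

        /** Operations the operational profile may legally draw in the current mode. */
        Set<String> enabledOperations() { return TRANSITIONS.get(current).keySet(); }

        /** Apply an operation; reject anything the mode machine does not allow. */
        void apply(String operation) {
            Mode next = TRANSITIONS.get(current).get(operation);
            if (next == null)
                throw new IllegalStateException(operation + " not allowed in " + current);
            current = next;
        }

        public static void main(String[] args) {
            ModeMachine m = new ModeMachine();
            m.apply("openMarket");
            System.out.println("Enabled in OPEN: " + m.enabledOperations());
            m.apply("haltTrading");
            System.out.println("Enabled in HALTED: " + m.enabledOperations());
        }
    }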
Simulator

 Discrete event simulation
    Generate any distribution with pseudo-random sampling
 Prolog implementation (50 KLOC)
    Rule inversion
 Load Profile
    Time domain variation
    Orthogonal to operational profile
 Each event assigned a "port" and submit time


                        © 2003 mVerify Corporation   14
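The production simulator was roughly 50 KLOC of Prolog; the short Java sketch below only illustrates the generation step the slide describes: each event is drawn under an operational profile, given a submit time from a load profile (here a constant-rate Poisson arrival process, purely an assumption), and assigned a "port", i.e. a test agent. All names, weights, and rates are invented for illustration.

    import java.util.*;

    /** Hypothetical sketch of the generation step: draw events under an
     *  operational profile, give each a submit time from a load profile and a
     *  "port" (test agent). Weights, rates, and operation names are assumptions. */
    public class EventGenerator {
        record TestEvent(double submitTime, int port, String operation) {}

        public static void main(String[] args) {
            Random rng = new Random(7L);
            String[] ops = {"enterOrder", "cancelOrder", "queryBook"};
            double[] weights = {0.60, 0.25, 0.15};   // operational profile
            double eventsPerSecond = 30.0;           // load profile (could vary over time)
            int ports = 4;                           // number of test agents

            List<TestEvent> run = new ArrayList<>();
            double clock = 0.0;
            for (int i = 0; i < 20; i++) {
                // Exponential inter-arrival times give a Poisson arrival process.
                clock += -Math.log(1.0 - rng.nextDouble()) / eventsPerSecond;
                double draw = rng.nextDouble();      // pick an operation under the profile
                int op = 0;
                double acc = weights[0];
                while (draw > acc && op < ops.length - 1) { op++; acc += weights[op]; }
                run.add(new TestEvent(clock, rng.nextInt(ports), ops[op]));
            }
            run.forEach(e -> System.out.printf("t=%.3fs port=%d %s%n",
                    e.submitTime(), e.port(), e.operation()));
        }
    }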
Test Environment

 Simulator generates interface-independent content
 Adapters for each SUT interface
    Formats content for the test agent API
    Generates script code

 Test Agents execute independently
 Distributed processing/serialization challenges
    Loosely coupled, best-effort strategy
    Embed server-side serialization monitor
                         © 2003 mVerify Corporation   15
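A hypothetical sketch of the adapter layer: the simulator's interface-independent events pass through one adapter per SUT interface, which renders them as agent-specific script code or messages. The interface and class names are illustrative, not the original system's.

    /** Hypothetical sketch of the adapter idea: the simulator emits
     *  interface-independent events; one adapter per SUT interface turns each
     *  event into agent-specific script code or API calls. All names invented. */
    public class AdapterSketch {
        record TestEvent(double submitTime, int port, String operation, String payload) {}

        /** Common contract every test-agent adapter implements. */
        interface AgentAdapter {
            String render(TestEvent event);   // produce one agent-specific command/script line
        }

        /** Example adapter: format an event for a hypothetical GUI-driving agent. */
        static class GuiScriptAdapter implements AgentAdapter {
            public String render(TestEvent e) {
                return String.format("at %.3f do %s(%s)", e.submitTime(), e.operation(), e.payload());
            }
        }

        /** Example adapter: format the same event as a hypothetical wire-protocol message. */
        static class ApiAdapter implements AgentAdapter {
            public String render(TestEvent e) {
                return "{\"op\":\"" + e.operation() + "\",\"t\":" + e.submitTime()
                     + ",\"data\":\"" + e.payload() + "\"}";
            }
        }

        public static void main(String[] args) {
            TestEvent e = new TestEvent(12.5, 2, "enterOrder", "IBM,100,LIMIT,85.25");
            for (AgentAdapter a : new AgentAdapter[] { new GuiScriptAdapter(), new ApiAdapter() })
                System.out.println(a.render(e));
        }
    }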
Automated Run Evaluation
 Oracle accepts output of simulator
 About 500 unique rules
 Verification
    Splainer – result/rule backtracking tool
    Rule/Run coverage analyzer
 Comparator
    Extract transaction log
    Post-run database state
    End-to-end invariant
 Stealth requirements engineering

                         © 2003 mVerify Corporation   16
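A hypothetical sketch of the comparator step: expected results derived from the simulator's event stream are checked against what the SUT's transaction log and post-run database state actually contain, using an end-to-end invariant. Both the invariant (filled quantity never exceeds ordered quantity; every order appears exactly once) and the data are invented for illustration.

    import java.util.*;

    /** Hypothetical sketch of the comparator: check an end-to-end invariant
     *  between expected results and the SUT's transaction log / post-run state. */
    public class ComparatorSketch {
        public static void main(String[] args) {
            // Expected results, as the oracle would derive them from the simulator's event stream.
            Map<String, Integer> expectedOrderQty = Map.of("ORD-1", 100, "ORD-2", 250, "ORD-3", 50);

            // Actual results, as extracted from the SUT transaction log / database after the run.
            Map<String, Integer> filledQty = Map.of("ORD-1", 100, "ORD-2", 250, "ORD-3", 75);

            List<String> failures = new ArrayList<>();
            for (var e : expectedOrderQty.entrySet()) {
                Integer filled = filledQty.get(e.getKey());
                if (filled == null)
                    failures.add(e.getKey() + ": missing from transaction log");
                else if (filled > e.getValue())
                    failures.add(e.getKey() + ": filled " + filled + " > ordered " + e.getValue());
            }
            for (String id : filledQty.keySet())
                if (!expectedOrderQty.containsKey(id))
                    failures.add(id + ": in log but never generated");

            System.out.println(failures.isEmpty() ? "Run passed" : "Failures: " + failures);
        }
    }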
Overall Process

 Six development increments
    3 to 5 months
    Test design/implementation parallel with app dev

 Plan each day's test run
    Load profile
    Total volume
    Configuration/operational scenarios


                        © 2003 mVerify Corporation      17
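As a sketch of what "planning each day's test run" could capture, here is a hypothetical run-plan record holding the three parameters the slide lists: load profile, total volume, and configuration/operational scenarios. Field names and values are assumptions.

    import java.util.List;

    /** Hypothetical sketch of a daily run plan: the parameters chosen for each
     *  day's generated test run. Field names and values are illustrative only. */
    public class RunPlanSketch {
        record RunPlan(String loadProfile, int totalEvents, List<String> scenarios) {}

        public static void main(String[] args) {
            RunPlan today = new RunPlan(
                    "open-peak-close",                              // load profile: time-domain shape of arrivals
                    100_000,                                        // total volume for the run
                    List.of("normal-trading", "halt-and-resume"));  // configuration/operational scenarios
            System.out.println(today);
        }
    }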
Daily Test Process

 Run Simulator
     100,000 events per hour
     FTP event files to test agents
   Start SUT
   Test agents automatically start at scheduled time
   Extract results
   Run Oracle/Comparator
   Prepare bug reports


                           © 2003 mVerify Corporation   18
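A hypothetical driver that strings together the daily steps listed above. Every method is a stub standing in for tooling the deck mentions (simulator, FTP distribution to agents, Oracle/Comparator); none of the names come from the original environment.

    /** Hypothetical sketch of the daily run driver, following the slide's steps. */
    public class DailyRunDriver {
        public static void main(String[] args) {
            runSimulator(100_000);        // simulator output rate: ~100,000 events per hour (per the slide)
            distributeEventFiles();       // FTP event files to the test agents
            startSystemUnderTest();
            // Test agents start themselves at the scheduled submit times; the driver just waits.
            waitForRunToComplete();
            extractResults();             // pull transaction logs and post-run database state
            runOracleAndComparator();     // evaluate against expected results and invariants
            prepareBugReports();
        }

        static void runSimulator(int eventsPerHour) { System.out.println("generated events at " + eventsPerHour + "/hour"); }
        static void distributeEventFiles()          { System.out.println("event files sent to agents"); }
        static void startSystemUnderTest()          { System.out.println("SUT started"); }
        static void waitForRunToComplete()          { System.out.println("run complete"); }
        static void extractResults()                { System.out.println("results extracted"); }
        static void runOracleAndComparator()        { System.out.println("run evaluated"); }
        static void prepareBugReports()             { System.out.println("bug reports prepared"); }
    }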
Problems and Solutions

 One-time sample not effective, but fresh test suites too expensive  →  Simulator generates a fresh, accurate sample on demand
 Too expensive to develop expected results  →  Oracle generates expected results on demand
 Too many test cases to evaluate  →  Comparator automates checking
 Profile/requirements change  →  Incremental changes to rule base
 SUT interfaces change  →  Common agent interface


                        © 2003 mVerify Corporation                19
Technical Achievements

 AI-based user simulation generates test suites
 All inputs generated under operational profile
 High volume oracle and evaluation
 Every test run unique and realistic (about 200 runs)
 Evaluated functionality and load response with fresh tests
 Effective control of many different test agents (COTS/custom, Java/4Test/Perl/SQL/proprietary)




                          © 2003 mVerify Corporation           20
Problems

 Stamp coupling
    Simulator, Agents, Oracle, Comparator
 Refactoring rule relationships; Prolog limitations
 Configuration hassles
 Scale-up constraints
 Distributed schedule brittleness
 Horn Clause Shock Syndrome

                       © 2003 mVerify Corporation       21
Results

 Revealed about 1,500 bugs over two years
    5% showstoppers
 Five-person team; huge productivity increase
 Achieved proven high reliability
    Last pre-release test run: 500,000 events in two
     hours, no failures detected
    Bonus: 1M event run closed big affiliate deal
    No production failures


                         © 2003 mVerify Corporation     22
