Moogilu QA Case Study for Machine Vision
Moogilu International
2455 N Naglee Road, Suite 227
Tracy, CA 95391
Channagiri Jagadish
Ph: 408 884 0325
650 245 1885
Jagadish@Moogilu.com
Title: Moogilu QA Case Study for Machine Vision
Date: January 26, 2013
1 Executive Summary
The customer was shipping products into the marketplace without adequate QA.
This caused considerable problems in the field: many of the expensive machine
vision systems were either returned or their move into production was delayed
considerably.
The primary goal of the QA engagement was to ship a quality product and to
manage updates so that they did not break down the production line. The aim was
to enhance the customer's ROI and reputation in the marketplace by shipping
quality products.
The Machine Vision QA Engagement includes:
• In-depth understanding of the Machine Vision system
• Comprehensive manual test plan
• Automation
• Smoke Testing
• Regression
• Production Testing
• Reporting
• Integration with JIRA
1.1 Objectives
• Identify the product modules.
• Understand the test approach.
• Understand the test artifacts.
• Understand the automation test process.
2 Scope of Testing
The products tested are described below:
2.1 Product Overview
2.1.1 Production Quality Advisor
Production Quality Advisor is an application used to view inspection data
stored in a database.
2.1.2 Console
Console is an application used to control inspection and to view the
inspection data created by the inspection system.
2.1.3 Recipe Manager
Recipe Manager is the application used to define the parameters for
inspecting a product. The parameters used to inspect a particular product
are contained in a configuration file known as a recipe.
2.1.4 Classifier Manager
Classifier Manager is an application used to create and manage classifiers.
A classifier specifies the parameters used to determine the type of each defect
as it is detected.
2.2 Test Coverage
The table below identifies the amount and type of testing that was required.

Test Type             Covered (Yes/No)
Functional Testing    Yes
Regression Testing    Yes
Performance Testing   No
Automation Testing    Yes
Database Testing      No
Scenario Testing      Yes
3 Test Deliverables
No   Deliverable Name                      Deliverable Description
1    Test Case Documents                   Excel sheet with test scenarios
2    Test Results Documents                Excel sheets with test scenarios along with the results
3    Automation Scripts Execution Guide    List of instructions to execute the automation scripts
3.1 Test Case document structure
Each test case sheet uses the following columns: ID, Module, Test Case, Steps,
Data, Expected Result, Actual Result, Status, Weight and Remarks. Test cases are
grouped under the module name. A sample row:

ID:              001
Module:          Module Name
Test Case:       Test Case Title
Steps:           1. Test step 1
                 2. Test step 2
Expected Result: Result
Status:          Pass / Fail / Not Run
3.2 Test Case Result Updates
The same columns are used for result updates; after execution, the Status and
Actual Result fields are filled in. A sample updated row:

ID:              001
Module:          Module Name
Test Case:       Test Case Title
Steps:           1. Test step 1
                 2. Test step 2
Expected Result: Result
Status:          Passed
3.3 Test Case Execution Summary
The execution summary records the counts for each story and the same figures as
percentages of the total planned:

Story     Total Planned  Pass  Fail  On Hold  Total Executed  Pass %  Fail %  On Hold %  Executed %
Story 1   20             20    0     0        20              100.00  0.00    0.00       100.00
Story 2   67             50    10    7        60              74.63   14.93   10.45      89.55
Story 3   8              6     1     1        7               75.00   12.50   12.50      87.50
Story 4   34             30    4     0        34              88.24   11.76   0.00       100.00
Story 5   84             75    3     6        78              89.29   3.57    7.14       92.86
Total     213            181   18    14       199
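The percentage columns are each count divided by the total planned for that
story; for example, Story 2's pass rate is 50 / 67 ≈ 74.63% and its executed
rate is (50 + 10) / 67 ≈ 89.55%, since on-hold cases are planned but not
executed. A minimal sketch of that calculation (class and method names are
illustrative, not part of the spreadsheet tooling used in the engagement):

// Minimal sketch of how the summary percentages are derived.
public class ExecutionSummaryRow {

    final String story;
    final int planned;
    final int passed;
    final int failed;
    final int onHold;

    ExecutionSummaryRow(String story, int planned, int passed, int failed, int onHold) {
        this.story = story;
        this.planned = planned;
        this.passed = passed;
        this.failed = failed;
        this.onHold = onHold;
    }

    // On-hold cases are counted as planned but not executed.
    int executed() {
        return passed + failed;
    }

    double pct(int count) {
        return 100.0 * count / planned;
    }

    public static void main(String[] args) {
        // Story 2 from the table: 67 planned, 50 passed, 10 failed, 7 on hold.
        ExecutionSummaryRow story2 = new ExecutionSummaryRow("Story 2", 67, 50, 10, 7);
        System.out.printf("%s: executed=%d pass=%.2f%% fail=%.2f%% executed=%.2f%%%n",
                story2.story,
                story2.executed(),              // 60
                story2.pct(story2.passed),      // 74.63
                story2.pct(story2.failed),      // 14.93
                story2.pct(story2.executed())); // 89.55
    }
}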
4 Testing Approach
All tests were designed and executed by the Moogilu QA team.
4.1 Functional Testing Approach (Manual)
For each application, the QA team wrote test cases with the relevant test
steps, covering all functional tests with 100% test coverage. All test cases
were executed, issues were reported in JIRA, and the test cases were updated
with the results.
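Since defects were reported to JIRA, an automated run can also file issues
directly through JIRA's REST API. The sketch below is only an illustration of
that integration: the server URL, credentials, project key and summary text are
placeholders, and the engagement's actual reporting may well have been done
through the JIRA UI.

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Illustrative sketch: create a JIRA issue over the REST API (v2).
// The host, credentials, project key and issue summary are placeholders.
public class JiraReporter {

    public static void main(String[] args) throws Exception {
        String baseUrl = "https://jira.example.com";   // placeholder JIRA server
        String auth = Base64.getEncoder()
                .encodeToString("qa.user:secret".getBytes(StandardCharsets.UTF_8));

        // Minimal issue payload: project, summary and issue type.
        String payload = "{\"fields\": {"
                + "\"project\": {\"key\": \"MV\"},"     // placeholder project key
                + "\"summary\": \"Sample defect logged by the automation run\","
                + "\"issuetype\": {\"name\": \"Bug\"}"
                + "}}";

        HttpURLConnection conn = (HttpURLConnection)
                new URL(baseUrl + "/rest/api/2/issue").openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Authorization", "Basic " + auth);
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(payload.getBytes(StandardCharsets.UTF_8));
        }
        System.out.println("JIRA responded with HTTP " + conn.getResponseCode());
    }
}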
4.2 Regression Testing Approach (Manual)
When a bug-fix release was deployed to the test server, the QA team verified
all the bug fixes. If a fixed bug was still present in the system, the bug was
reopened. New issues were added as bugs as they were detected in the system.
4.3 New Feature Testing Approach (Manual)
When a new-feature release was deployed to the test server, the QA team tested
all the new features and verified their functionality. New issues were added
as bugs.
4.4 Regression Testing Approach (Automation)
When a bug-fix or new-feature release was deployed to the test server, the QA
team executed the automated regression test suite to verify the existing
functionality.
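A common way to keep the regression suite selectable on every deployment is to
tag the automated tests into groups and run only the required group. The sketch
below assumes TestNG-style groups; the class name, group names and checks are
illustrative and not taken from the actual suite.

import org.testng.Assert;
import org.testng.annotations.Test;

// Illustrative grouping: the same test classes can serve both the smoke run
// on a staging build and the full regression run on a bug-fix release.
public class ConsoleRegressionTests {

    @Test(groups = { "smoke", "regression" })
    public void consoleStartsAndConnectsToInspectionSystem() {
        // Placeholder check; real tests drive the application through the framework.
        Assert.assertTrue(true);
    }

    @Test(groups = { "regression" })
    public void previouslyFixedRecipeLoadDefectStaysFixed() {
        // Re-verifies an earlier bug fix so any regression reopens the issue.
        Assert.assertTrue(true);
    }
}

A TestNG suite file (or the runner's group setting) then includes only the
"regression" group for a bug-fix build, or only "smoke" for a quick staging
check.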
5 Automation Testing
5.1 Automation Test Requirement
Today, many IT organizations struggle to achieve quality objectives while facing
tight delivery schedules and constrained budgets. In these organizations, testing
remains primarily a resource-intensive manual effort despite increasing
workloads, aggressive deadlines and the escalating cost of skilled test engineers.
Moogilu helped the customer by architecting and executing an automation test
bed. The advantages of automation include:
• Accelerate testing cycles and release products on time
• Conduct extensive testing and increase test coverage
• Utilize test resources efficiently
• Improve test accuracy and test management
• Enhance the productivity of testing efforts
5.1.1 Moogilu Approach to Automation
• Select Appropriate Tools – We are not obliged to use only one tool. We select
the right tools from a stack of tools for each project when conducting UI tests,
performance tests, web service tests and data validation tests across web,
desktop and mobile applications on .Net, Java and PHP platforms.
• Knowledge – Over the years Moogilu has engaged in multiple test automation
projects and has an in-depth understanding of test automation processes, tools
and techniques. We have highly specialized skills in Selenium, Coded UI, SoapUI
and JMeter.
• Set Realistic Expectations – We set client expectations at the start and
ensure that they are met.
• Use a Highly Maintainable Framework – We use the Page Object design pattern
to minimize the effort of modifications. UI mapping is used to store all the
locators of the test suite in one place, and application credentials and test
data are parameterized for easy maintenance of the test suites (a sketch
follows at the end of this section).
• Use Reusable Components – We have created an automation framework to work
with UI elements such as data grids, paging and search functions. We also use
external components to read/write Excel files, databases and XML files.
Customized APIs are used to communicate with test management tools.
Moogilu's extensive knowledge in automation helped build a framework within a
month for this engagement.
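The layout described under "Use a Highly Maintainable Framework" can be
illustrated with a short Selenium WebDriver sketch in Java: a page object keeps
all of its locators in one place (the UI map), and a small helper loads
parameterized credentials and test data from a properties file. The class
names, element locators and property keys below are illustrative assumptions,
not taken from the actual framework; the desktop applications would follow the
same pattern with Coded UI.

import java.io.FileInputStream;
import java.util.Properties;

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;

// Illustrative page object: every locator for the login screen lives here,
// so a UI change is fixed in one place instead of in every test.
class LoginPage {

    // UI map for this page; in a larger suite these can be read from a shared file.
    private static final By USERNAME = By.id("username");
    private static final By PASSWORD = By.id("password");
    private static final By SIGN_IN  = By.id("signIn");

    private final WebDriver driver;

    LoginPage(WebDriver driver) {
        this.driver = driver;
    }

    // Logs in and returns the next page object so tests read as a flow.
    DashboardPage loginAs(String user, String password) {
        driver.findElement(USERNAME).sendKeys(user);
        driver.findElement(PASSWORD).sendKeys(password);
        driver.findElement(SIGN_IN).click();
        return new DashboardPage(driver);
    }
}

// Placeholder for the page reached after login.
class DashboardPage {
    private final WebDriver driver;

    DashboardPage(WebDriver driver) {
        this.driver = driver;
    }
}

// Credentials and environment data are parameterized rather than hard-coded in tests.
class TestConfig {
    static Properties load(String path) throws Exception {
        Properties props = new Properties();
        try (FileInputStream in = new FileInputStream(path)) {
            props.load(in);
        }
        return props;
    }
}

A test would then read roughly as new LoginPage(driver).loginAs(
props.getProperty("user"), props.getProperty("password")), which keeps
credentials and locators out of the test code itself.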
5.2 Tools used for the automation
Test Tool   Test Category     Features
Selenium    Functional        • Supports many languages (Java, C#, Python and Perl)
                              • Open source
Coded UI    Functional        • Automatically generates more advanced code compared to Selenium or Telerik
                              • Supports C#
TestLink    Test Management   • Maintains test cases
                              • Maintains test execution reports
JMeter      Load              • Open source
                              • In-built browser
                              • Supports both UI and web services
SoapUI      Web Service       • Functional and load testing
5.3 Automation Results Sheets
The automation results sheet is generated automatically by the TestLink test
management tool.

Test Case                 Build      Tester  Time                 Status  Description  Bugs
CGN-1: Test Case Title    Build 1.0  admin   26/01/2013 10:53:18  Passed
6 Results
The engagement is ongoing; within the first 6 months the results include:
• All products were tested, and some had 100% test coverage
• 30% of the test cases have been automated, and automation is ongoing
• Smoke testing on every staging build
• Release and production testing
• No field issues reported after product shipment to customers
• Transfer of knowledge to the company
• The customer has continued the engagement