A more successful model for multi-shore testing

The key to a successful project is being able to quickly and effectively identify the quality of the application under test
For a multi-shore project this can be achieved with automation and test frameworks, an agile integrated testing model, and visibility and communication across the process.


    Presentation Transcript

    • A more successful model for multi-shore testing. Ken McCorkell, January 28, 2010. The contents of this presentation are the sole copyrighted property of Perficient Inc. and may not be reproduced in whole or in part without written permission from Perficient Inc.
    • Agenda
      • Introduction
      • When should you build more code?
      • Test Automation Strategies
      • Building custom automation frameworks
      • Agile Testing
      • Strategies for testing in a multi-shore environment
      • Q&A
    • Fast Facts
      • Founded in 1997
      • Public, NASDAQ: PRFT
      • ~$250 million in annualized revenues
      • Locations in 19 major North American markets
      • Global Delivery Centers in Europe and China
      • 1400+ technology consultants
      • Dedicated solution practices
      • Served 600+ clients in 2007
      • Alliance partnerships with major technology vendors
      • Multiple vendor/industry technology and growth awards
    • Perficient’s Global Delivery Center – Key Facts
      • Our Strengths
      • Fully owned and operated since 2004
      • Operating at SEI CMMI Level 5
      • A worldwide leader: one of the first to achieve CMMI Level 5 using an agile methodology
      • 20-40% average organic growth strategy
      • Located in Hangzhou, China
        • About 2.5 hours from Shanghai
        • Home to prestigious universities
        • Large City - Pop. 6.5M+
        • #1 city to do business in China – Forbes (5 years running!)
        • Excellent talent pool: home to over 1,200 technology enterprises
        • All business conducted in English
      China Global Delivery Center
    • Perficient’s ‘3D’ Testing Approach: testing along three dimensions of maturity. Breadth/Depth (from 2-dimensional testing to comprehensive testing), Lifecycle (from classic waterfall to an agile approach), and Innovation (from testing as a necessity to testing as a strategic differentiator).
      • Functional testing
      • Usability testing
      • Accessibility testing
      • Compatibility testing
      • Load testing
      • Stress testing
      • Compliance testing
      • Internationalization testing
      • Localization verification
      • API / interface testing
      • Installation validation
      • Test data generation / masking
      • Elimination of traceability matrices
      • Continuous build, integration and testing
      • Test first principles (development best practice)
      • Planned regression testing
      • Early integration testing (steel threads)
      • Polymorphic development team roles
      • High degree of automation
      • Automation framework
      • Flexible COTS / open source toolkit
      • Multi-shore as a strategic advantage
      • Characteristics:
      • Based on Agile best practices where testing is integrated into the full lifecycle
      • Covers the big picture of testing instead of relegating most of it to an afterthought
      • A high degree of leading edge innovation in how testing is approached and how it can be leveraged in the organization as a critical practice
    • Agenda
      • Introduction
      • When should you build more code?
      • Test Automation Strategies
      • Building custom automation frameworks
      • Agile Testing
      • Strategies for testing in a multi-shore environment
      • Q&A
    • When should you build more code?
      • The answer: when you have successfully validated the delivered code
      • The key to a successful project is being able to quickly and effectively identify the quality of the application under test
      • For a multi-shore project this can be achieved with:
      • Automation and test frameworks, an agile integrated testing model, and visibility and communication across the process
      In an agile testing model code is verified at least every iteration
    • Technical debt and untested code
      The technical debt of untested features stacks up: “Technical debt is incurred when a team chooses an approach that's expedient in the short term but that increases complexity and is more costly in the long term.”
      (Chart: technical debt grows with the number of code modules developed without tests.)
    • Agenda
      • Introduction
      • When should you build more code?
      • Test Automation Strategies
      • Building custom automation frameworks
      • Agile Testing
      • Strategies for testing in a multi-shore environment
      • Q&A
    • Why should you use test automation?
      • Automation is essential to ensure constant communication between testing and development efforts
      • Quick visibility into code quality
      • Main reason: Automation is often the only way to generate the velocity for testing needed to release fully tested builds and potentially shippable projects in short iterations
    • Automated testing success
      • Did you know more than 50% of test automation projects fail to meet their goals?
      • Reasons for this low success rate include:
        • Expensive tools and technology are seen as the answer, but the processes are actually the critical elements
        • Automation projects are not planned as an integrated part of the development process
        • Automation coding is not planned with the same rigor as application coding
        • The proper resources and time are not allocated to building up a robust automation framework
    • How to best use Automated Testing
      • Automated tests should be used for:
        • automating the most repeated tests
        • replacing tests that are tedious to execute manually
        • testing on multiple OS or environment configurations
        • creating regression coverage for legacy applications by implementing a set of tests each iteration
      The actual goal is usually not to automate 100% of all possible tests, but to automate the highest-value tests
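As a sketch of that idea, one high-value check can be driven across an environment matrix instead of being repeated by hand. The browser and platform names and the `run_smoke_test` stub below are illustrative stand-ins, not from the presentation:

```python
import itertools

# Illustrative environment matrix; a real project would list its own
# supported browsers and platforms.
BROWSERS = ["firefox", "chrome"]
PLATFORMS = ["windows", "linux"]

def run_smoke_test(browser, platform):
    """Stand-in for a real high-value smoke test; returns True on pass."""
    return True

# One loop replaces four tedious manual runs.
results = {
    (b, p): run_smoke_test(b, p)
    for b, p in itertools.product(BROWSERS, PLATFORMS)
}

failed = [combo for combo, ok in results.items() if not ok]
print(f"{len(results)} configurations run, {len(failed)} failed")
# → 4 configurations run, 0 failed
```

The same pattern scales to any repeated or multi-configuration test: add a row to the matrix rather than another manual pass.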
    • Open Source as a Test Strategy
      • Consider using open source testing tools or a combination of open source and COTS
    • Agenda
      • Introduction
      • When should you build more code?
      • Test Automation Strategies
      • Building custom automation frameworks
      • Agile Testing
      • Strategies for testing in a multi-shore environment
      • Q&A
    • Automated Test Frameworks
      • Automated Test Frameworks are essential to providing efficient automation scripts based on the specific testing environment
      • Keep these principles in mind:
        • Do not simply record and playback tests
        • Parameterize data use for tests – Data-Driven Testing
        • Parameterize field names for easy maintenance
        • Write reusable modules for common test functions
        • Make tests atomic – tests should not depend on other tests
        • Remember: automated tests are code, and should be planned just like application code
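A minimal sketch of these principles using Python's `unittest` module. The locator names, test data, and `attempt_login` stub are hypothetical stand-ins for a real UI-driving module:

```python
import unittest

# Field locators kept in one place, so a UI rename touches a single file
# (illustrative names; a real application would supply its own).
LOCATORS = {
    "username": "login-user",
    "password": "login-pass",
}

# Test data is parameterized rather than hard-coded into each script
# (data-driven testing).
LOGIN_CASES = [
    ("alice", "secret1", True),   # valid credentials should log in
    ("bob", "", False),           # empty password should be rejected
]

def attempt_login(user, password):
    """Reusable login module; a real framework would drive the UI here
    via the locators above. This stub only models the expected rule."""
    return bool(user) and bool(password)

class LoginTests(unittest.TestCase):
    def test_login_data_driven(self):
        # Each case is atomic: no case relies on state left by another.
        for user, password, expected in LOGIN_CASES:
            with self.subTest(user=user):
                self.assertEqual(attempt_login(user, password), expected)

if __name__ == "__main__":
    unittest.main(exit=False)
```

Adding a new scenario means adding a row to `LOGIN_CASES`, not recording a new script.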
    • Agenda
      • Introduction
      • When should you build more code?
      • Test Automation Strategies
      • Building custom automation frameworks
      • Agile Testing
      • Strategies for testing in a multi-shore environment
      • Q&A
    • Agile Testing: Traditional Testing vs Agile Testing
      • Traditional Testing Model:
      • Separate test group from development
      • Tests are derived from detailed requirements instead of being part of requirements
      • Testing is conducted after development is complete
      • Defects are written to communicate with developers
      • Developers and Testers are working against each other
      • Automation is a nice-to-have instead of a must-have
      This traditional approach can work if there is sufficient time and budget for re-testing and coding cycles
    • Agile Testing: Traditional Testing vs Agile Testing
      • Agile Testing Model:
      • Testers are part of the development team
      • Team works closely with customers to define acceptance tests for each requirement
      • Testers are included from the beginning of the requirements and design
      • Test each feature early as it is completed
      • Paired testing between testers and developers
      • Provides continuous feedback to development in person and with up-to-date test metrics and automation results
    • Combining Agile with Automation
      • Use continuous integration (CI) to automatically build and test
      • Test-driven development creates failing unit tests first then writes code that makes the test pass
      • Task tracking and burn-down reports can show progress on coding and automation tasks in one system
      Popular CI tools include Hudson, CruiseControl, Maven, and Ant
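A minimal test-first sketch in Python (the discount rule is a hypothetical example, not from the presentation): the test is written before the function exists, then just enough code is added to make it pass.

```python
import unittest

# Step 1: write the test first, while discount() does not yet exist,
# so the suite starts out failing ("red").
class DiscountTest(unittest.TestCase):
    def test_ten_percent_off_orders_over_100(self):
        self.assertEqual(discount(200.0), 180.0)

    def test_no_discount_at_or_under_threshold(self):
        self.assertEqual(discount(50.0), 50.0)

# Step 2: write just enough production code to make both tests pass
# ("green"); refactoring follows with the tests as a safety net.
def discount(total):
    return total * 0.9 if total > 100 else total

if __name__ == "__main__":
    unittest.main(exit=False)
```

Under CI, this suite runs on every commit, so a regression in `discount` is visible within one build rather than one iteration.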
    • Agenda
      • Introduction
      • When should you build more code?
      • Test Automation Strategies
      • Building custom automation frameworks
      • Agile Testing
      • Strategies for testing in a multi-shore environment
      • Q&A
    • Multi-shore testing
      • How does all of this fit in with multi-shore testing?
      • Multi-shore teams should pay even more attention to agile testing and automation, because they are essential for communication and visibility across the testing process
      • The visibility provided by automation and agile approaches closes the gap between onshore and offshore teams
      • Maximize efficiency by practicing the “follow-the-sun” pattern with shared areas of responsibility
    • Multi-shore communication
      • Communication with multi-shore teams is difficult to integrate with a truly agile process
      • Reduce time between checkpoints to avoid communication gaps
      (Chart: the cross-shore communication gap between onshore and offshore teams grows with time since the last checkpoint.)
    • Visibility and communication
      • Daily Level
        • Daily stand-ups
        • Daily or hourly builds of the software, including running tests
        • Team self-organizes and discusses problems as they happen
        • Daily question and answer on the project wiki page
        • Q&A sessions on instant messenger for off-hours times
        • Daily posting of knowledge learned to the knowledge-sharing space on the wiki
    • Visibility and Communication (cont.)
      • Weekly Level
        • Weekly status report to the Project Management Organization (PMO) to escalate and communicate issues
        • Weekly test reports are sent out with testing progress and bug reports
      • Iteration Level (1-4 weeks)
        • Demo to all interested parties, including the customer (real working code demo); this gives real visibility into what the team has actually produced
        • Retrospectives (these are important!)
      • Project Level
        • Project retrospective with lessons learned and feedback to the process patterns
        • New process patterns that worked well get rolled back into the process pattern templates
      Keep to the fixed schedule for communication checkpoints
    • Summary
      • Best Practices for multi-shore testing
      • Test Automation should be used as a communication tool for product quality
      • Testers and developers should be paired on areas of functionality to share knowledge across shores
      • Code repositories and dev/test environments should be shared by cross-shore teams
      • Communication tools such as: wikis, task management systems, daily Q&A sessions, IM, and sync-up calls are essential to visibility across the team
      • Consider a first step of starting an offshore test initiative to maintain test automation scripts
      Most importantly, treat the resources all as one team!
    • Next Month: Provider Healthcare Analytics and Driving Quality Outcome Measurements. Thursday, February 25, 2010, 12:00-1:00 PM CST
      • Healthcare analytics are of great importance in terms of both providing quality of care and in meeting increasing regulatory reporting requirements. Consider the following:
      • Healthcare insurance organizations, including CMS, are implementing pay-for-performance measures to change or create better healthcare outcomes
      • ARRA stimulus dollars require providers to report on "meaningful use" of electronic health records and to report HHS quality measures
      • Physician alignment necessitates transparency in reporting outcomes and performance expectations
      • Healthcare providers must provide outcome results as part of re-inventing service delivery
      • Join Perficient as we discuss how healthcare analytics can improve your outcomes:
      • The business case for why healthcare analytics matters to progressive health systems
      • A look at some of the market-leading healthcare analytics products and their associated architectures
      • Case studies of organizations that have begun to address the value of healthcare information
      www.perficient.com/webinars