Develop & Improve Your Software Testing Strategy

Testing is an often-misunderstood aspect of quality assurance, and as a result many enterprises do not get the full benefit of their testing efforts. Use this storyboard to gain insight from Info-Tech client interviews and surveys, then develop a testing strategy and a plan for implementation.

• Assess your testing needs to determine a testing strategy
• Improve testing to make the most of your testing resources
• Develop a testing strategy, based on your resource and needs assessment, by putting together a test plan
• Understand testing, so you can effectively communicate testing principles to testers, developers, and management
The storyboard includes links to several tools and templates that will get you started on implementing a testing process regardless of your scenario.


    Presentation Transcript

    • Develop & Improve Your Software Testing Strategy: Conveying and Inspiring Confidence. Info-Tech Research Group. “Software implementation is a cozy bonfire: warm, bright, a bustle of comforting concrete activity. But beyond the flames is an immense zone of darkness. Testing is the exploration of this darkness.” (extracted from the 1992 Software Maintenance Technology Reference Guide)
    • Whether you develop it, configure it, or integrate it, you simply can’t afford to skip testing
        • How to improve your quality process to achieve a level of quality your business can be comfortable with
        • When to know that testing should stop
        • What types of testing should be done in your environment
        • How to improve your testing results
        • Why dedicated testing resources are important
        • What kind of process you need to develop to ensure quality is present
        • What the testing approach is in an Agile development environment
        • How you measure your quality effort
      This solution set is for you if…
      • you develop software
      • you configure software
      • you integrate software
      This solution set will answer the questions above. IT exists to put functioning software into the hands of businesses, their staff, and their customers. Software Quality Assurance (SQA) exists to make sure those needs are met in a useful, meaningful, and acceptable manner. All require testing to ensure that requirements are met and the application functions according to expectations.
    • Executive Summary: Your success is dependent on how well you test your applications. Whether you have developed them, or simply integrated or configured them, this solution will help you understand how and why certain testing should be completed. Software Quality Assurance (SQA) is an umbrella term for the process involved in ensuring quality in your deliverables; it encompasses both Software Quality Control (SQC) and Software Testing. Business leaders will all eagerly agree that quality is important, but deciding how to get there, what processes to implement, and what people to engage is often difficult. This solution will help you sift through the information and turn it into usable knowledge to increase your success. As any development organization matures, priorities jockey for position: for startups, quality is not the major concern, delivery is, but as the organization matures, quality ultimately ends up at the top. Attempting to turn a screw with a knife blade can work, but it works much better, faster, and more accurately with the proper tool; having trained quality professionals will help the organization achieve the level of quality your customers expect and deserve. Develop and improve your overall quality by learning and implementing a process that fits your organizational needs. Whether you develop for the web or integrate off-the-shelf software, your customers deserve your best, and you deserve their trust. This solution will guide you through creating and maintaining a process that works for you: Assess your readiness … Improve your success … Develop the means to achieve … Understand SQA, SQC, and Testing.
    • Develop & Improve Your Software Testing Strategy: Assess where you stand with software quality assurance. Success is determined by how well your customers feel toward you, your service, and your product. (Roadmap: Assess – the general situation, your environment | Improve | Develop | Understand)
    • Your approach to testing may be different depending on your development focus. Most development shops fall into one of three categories:
      • System Integrators
      • Some custom development (usually in the form of smaller dev efforts and scripting related to integration)
      • Configuration of established systems
      • Focused on improving productivity through integration/unification of existing software applications and/or systems
      • May be a professional services outsource shop
      • Independent Software Vendors (ISV)
      • Develops software for sale for mass market or niche markets (markets can be very diverse)
      • Most ISVs specialize in building software for certain markets, hardware platforms, software platforms (e.g., Windows, Unix), or larger applications (e.g., Oracle, SAP, SharePoint)
      • Typically little to no customized content
      • Product based development
      • Business to business/consumer
      • Typically single/suite applications (i.e., not enterprise systems)
      • Does not answer directly to outside clients (influenced only)
      • Takes direction directly from the business/technology leaders
      • System Builders
      • Develops enterprise systems
      • Can include any/all aspects of system development from data storage to usability
      • Typically internal/corporate development for internal use
      • Can be virtually any development (ordering, invoicing, customer management…)
      • Custom software development
      • May be an outsource development shop
    • Most businesses with development groups will be system builders or integrators, and will mature at a predictable pace:
      • Inexperienced (Top Priority -> Delivery): 1-5 Dev, no Dev Mgr, no BA, no PM, no dev standards, projects run by dev, no real planning, requirements are extremely loose, no change control, deadlines are all ASAP. Testing: no testers; ad-hoc developer testing only.
      • Mildly Experienced: 5-15 Dev, 1-2 Dev Mgr, no BA, no PM, unofficial dev standards, projects run by dev, no formalized planning, requirements are loose, no change control, deadlines are all ASAP. Testing: 0-1 testers; very little testing, some developer testing, some ad-hoc exploratory-type testing.
      • Experienced: 15-30 Dev, 2-3 Dev Mgr, 1 Sr Dev Mgr, 0-1 BA, 0-1 PM, some formalized dev standards, most projects run by dev leaders, some early formalized planning, requirements mostly known, little change control, deadlines set on rough estimates and business/client need, still largely ASAP. Testing: 1 SQA/Test Leader, 5-10 testers; some developer testing, some standardized testing, mostly regression, smoke, exploratory, acceptance.
      • Very Experienced: 30-50 Dev, 5-8 Dev Mgr, 1-2 Sr Dev Mgr, 1-2 BA, 3-5 PM, formalized development standards being followed, some projects still run by dev leaders but most by PM, formalized planning, requirements are set, some change control is attempted, deadlines set by an established estimation process with business/client needs factoring heavily. Testing: 1-2 SQA/Test Leaders, 8-20 testers; some developer testing, standardized SQA-run testing, separate test environment.
      • Seasoned Veteran (Top Priority -> Quality): 50+ Dev, 8+ Dev Mgr, 3+ Sr Dev Mgr, 2-3+ BA, 8+ PM, formalized dev standards are followed, all projects controlled by PM, standardized and formalized planning, requirements are known, change control is in place, deadlines based on well-thought-out estimates and business/client needs. Testing: 1-2 SQA/Test Leaders, 20+ testers; developer unit testing, automated build testing, standardized SQA-run testing, automated regression and other automated testing, multiple separate test environments.
      Individual business focus may differ, and the approach to testing may differ, but the goal of quality will be constant.
    • As development organizations mature, top priorities change from delivery to quality, with additional priorities along the way. (Diagram: priorities ranging from Application Delivery, Chaotic Change, React Quickly, and Technology Used at startup (inexperienced) to Standards Adoption, Process Adherence, Planning & Design, Project Management, Application Quality, and Controlled Change when established (experienced).) As an organization matures, priorities and their importance to the overall organization become closer and more tightly interwoven. As a result, a breakdown anywhere in a more mature organization will have a far more critical overall impact.
      • Distance between priorities tightens
      • New priorities are added
      • Priorities increase in importance
      • Change moves from chaotic to controlled
      • Process evolves with maturity
      • Processes refine with maturity
      • Maturity evolves with experience
    • Low-risk projects can take it slow implementing an SQA process, but if your projects are high risk, good SQA is critical for success. For best results, a quality process should, at minimum, include the following: requirements management, with a goal of clear, complete, testable requirement specifications; design and code inspections; and project post-mortems/retrospectives.
      • Solving problems is a high-visibility activity; finding people skilled at problem solving is an easy task. (SQC – reactive)
      • Preventing problems from arising is a very low-visibility activity; finding people skilled at preventing problems can be very challenging. (SQA – proactive)
      • The organization’s leaders must determine what constitutes “sufficient quality”
      • SQA must assist with risk analysis when assessing project quality and readiness for release; however, the final decision must lie with the project manager/owner.
      • For organizations with high-risk projects, management acceptance is required, and a formalized SQA process is not only necessary but critical
      • Where risk is lower, management buy-in and SQA processes may be implemented with a slower, step-by-step approach
      • For small groups or projects, a more ad-hoc process may be appropriate, depending on the type of customers and projects. A lot will depend on team leads or managers, feedback to developers, and ensuring adequate communications among customers, managers, developers, and testers. Good communication is critical.
      • 76% of survey respondents say that having experienced staff increases their project success
      SQA processes should always be balanced with productivity, to keep the proverbial red tape from getting out of hand
    • In many cases, convincing management of the need for testing resources is more difficult than you think. Read the full “Case Study: The business doesn’t care” in Appendix I of this document. Scenario: A financial-industry business develops software for the web and for internal applications. They have no testing resources, and all active testing is completed by developers. The business has not experienced any major pain from their lack of testing, and as such will not justify the expense.
      • Developers must attempt to build test plans and test their own development efforts.
      • No project methodology, no testing understanding, many problems get through to production and cause the development team to continually react to problems.
      • IT has no idea what to test.
      What Is Wrong:
      • The IT Manager is aware of what needs to be done, but is unsure how to justify testing resources to the business.
      • Risk analysis and disaster planning must be done. The business has so far dodged the bullet and is comfortable with that, but IT management is on edge.
      Lessons:
    • When times are tight, cutting testers and letting developers do the testing is common, but it is a short-sighted decision
      • Scaling back testing resources exposes businesses to dangerous levels of unacceptable risk
      • When testing resources are scaled back or cut altogether, the result is that software invariably gets released in an untested state
      • While eliminating testing resources may look prudent and frugal in tight times, it is a poor management decision. It may provide short-term relief, but can prove disastrous to clients and investors
      • Testing is not free. It has to fall into someone’s budget, and it is a task that simply has to be done. (Even though you may have eliminated your testers, the developers still need to do the job.)
      • A very old metric, which holds true today, states simply that for every $1 spent fixing a defect in design, you will spend $10 fixing it in the testing phase, and $100 fixing it after release (Barry Boehm, Software Engineering Economics)
        • What does that mean to you? If you don’t test, and test early, you will pay dearly for that decision at some later date. This is an almost-certainty!
      Development and testing: opposite disciplines, connected by purpose, distinct in orientation, unique in skills, and incapable of performing each other’s job. If you take this quick-fix route, those are the results you can expect.
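To see how quickly the 1:10:100 ratio compounds, here is a minimal sketch; the dollar figures and defect counts are hypothetical, chosen only to illustrate the arithmetic:

```python
# Illustration of the 1:10:100 defect-cost escalation cited above
# (Boehm): 1x in design, ~10x in testing, ~100x after release.
COST_MULTIPLIER = {"design": 1, "testing": 10, "production": 100}

def fix_cost(base_cost, phase):
    """Cost of fixing one defect, given the phase where it is caught."""
    return base_cost * COST_MULTIPLIER[phase]

def total_fix_cost(base_cost, defects_by_phase):
    """Total cost for a batch of defects caught across phases."""
    return sum(fix_cost(base_cost, p) * n for p, n in defects_by_phase.items())

# 20 defects at a hypothetical $100 base fix cost: catching all of them
# in design costs $2,000; letting half slip to production costs $110,000.
early = total_fix_cost(100, {"design": 20})
late = total_fix_cost(100, {"testing": 10, "production": 10})
```

Even with these toy numbers, deferring half the defects to production makes the cleanup 55 times more expensive.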
    • You know you’re cutting corners, you know it is wrong, but you’re a small shop with limited resources, so what do you do? Read the full “Case Study: Even small shops need a testing strategy” in Appendix II of this document. Scenario: A small IT shop that primarily deals with 3rd-party application integration. Internal resources are limited, and a lot of trust is imparted to the vendor. No testing resources exist, and there is often very limited time available. The manager, also the CIO, cuts corners regularly and knows it.
      • The CIO knows they are cutting corners but goes along because they are a small shop
      • No project methodology, no testing, quick to roll back is the safety net for lack of testing
      • Trust is placed in a 3rd party vendor who has no real stake in the business success (other than their own).
      What Is Wrong:
      • The IT Manager and CIO know that something must be done, but are unsure exactly what can be done within their limits
      • Risk Analysis, Disaster planning must be done. The business has so far been lucky with no major problems but … luck has a way of running out when you least expect it.
      • Bring in co-op students to take on some limited testing tasks, relieve the pressure from your developers, and bring the trust in house.
      Lessons:
    • When you develop Web apps, you need to understand that there are differences in testing requirements
      • Web sites are essentially client/server applications and must be tested for both the client side, and the more complicated server side.
      • Considerations must be given for HTML/ASP/PHP/Flash, etc. (interface), web services, communications (encrypted or not), firewalls, plug-ins, and a whole host of other potential applications that could run on the server or the client side – meaning, you have to test for occurrences of things you may not know.
      • Browser types and versions must be considered, as well as interactions between the browsers, the OS, and the coding language used.
      • You have to test for outside influence. Many browsers today allow the user to “turn off” functionality that your application may need. Make sure you allow for this, or you may find yourself up the proverbial river without a paddle!
      • Connection speeds, server performance (loads), standards and protocols can all have a dramatic effect on your application. Make sure your testing strategy allows for, and tests, these aspects.
      “With Internet apps, testing could be the last chance to ensure the safety of the data and the organization” - IT professional with 25+ years’ experience
    • Your Web app is potentially visible to the world, so your effort to ensure its quality should not be taken lightly. When building apps for the Web, you must consider and test for the following:
      • What are the expected server loads (e.g. hits/time).
      • What kind of performance is required, server response times, database query times, uptime/downtime.
      • What version of browsers will be used, and how much testing should be done per browser type.
      • Security – firewalls, encryption of data, passwords, data storage.
      • Internationalization – globalization and localization: does your site have an international or multilingual audience?
      • Internet connection reliability – backups, redundant failovers, caching.
      • Appearance and flow standards – there can never be too much emphasis put on this area; your users will stay longer if their experience is more pleasing and understandable.
      • Logging – every application should have some version of logging. As testers, you will need to “push” this one to make sure your developers create meaningful logs. The logs should include information designed to help you know what your users are/were doing when something occurs.
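As an illustration of what a “meaningful log” can look like, here is a minimal sketch using Python’s standard logging module; the logger name, fields, and values are invented for the example:

```python
import logging

# Sketch: capture who the user is and what they were doing, not just
# that an error happened. Field names here are illustrative.
logger = logging.getLogger("webapp")
logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(name)s %(message)s",
)

def format_user_action(user_id, action, **context):
    """Build a log line with enough context to reconstruct the user's steps."""
    details = " ".join(f"{k}={v}" for k, v in sorted(context.items()))
    return f"user={user_id} action={action} {details}".strip()

def log_user_action(user_id, action, **context):
    logger.info(format_user_action(user_id, action, **context))

log_user_action("u1042", "checkout", cart_items=3, total="59.97")
```

The key design choice is the structured `key=value` layout: when something goes wrong, testers can grep for a user and replay their actions instead of guessing.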
    • Testing can save a lot of people a lot of time by ensuring that every app has at least considered the basics. Developers should always follow best practices, but it is the job of testing to make sure they have.
      When testing your web applications, make sure your developers have considered the following basic requirements for web application development:
      • Update management – how are your updates going to be rolled out?
      • Maintenance requirements – tracking and controlling content; who is going to be allowed to update content? Make sure to create an “admin” tool so that every minor change request does not come back to IT.
      • Standards and specifications – conforming to HTML, XML, and other specifications will make future changes and enhancements dramatically easier and quicker. Future developers will “ramp up” much faster if your site is built using standards, and for testing, it is much more likely that automation tools will work if the code is written using standards.
      • Internal/external link verification – it is best to stay away from external links that you do not control, but if you must use them, make sure they are included in your “admin” tool so that someone outside of IT can maintain the links.
      • Every Web testing group should ensure that they have multiple test environments set up that can mimic the variety of browser and OS configurations of your users (variability of browser options, connection differences, real-world internet traffic).
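The internal/external link-verification point above can be sketched with Python’s standard html.parser; the page snippet and host name are made up for illustration (a real auditor would also normalize schemes and ports):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkAuditor(HTMLParser):
    """Collect hrefs from a page and split them into internal vs. external,
    so links you do not control can be reviewed (and tested) separately."""
    def __init__(self, own_host):
        super().__init__()
        self.own_host = own_host
        self.internal, self.external = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        host = urlparse(href).netloc
        # Relative links (no host) are treated as internal.
        if host and host != self.own_host:
            self.external.append(href)
        else:
            self.internal.append(href)

page = '<p><a href="/about">About</a> <a href="http://example.org/x">Out</a></p>'
auditor = LinkAuditor("www.mysite.com")
auditor.feed(page)
```

Running this over rendered pages in a test environment gives the “admin tool” owner a concrete list of external links to maintain.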
    • If development is Agile, there is still a place for testers, but your focus and testing approach need to change
      • Unit testing by developers is critical to ensure the software unit is functioning correctly and meeting requirements listed in the user stories
      • Regression testing – since development is iterative, it is highly likely that future releases will be built by modifying previous code. As a result, regression testing gains significant importance in Agile development
      • Automated testing also gains importance due to the short delivery timelines.
        • Note : it is not always necessary to purchase costly automation tools for testing. Test automation can be achieved by creating in-house scripts.
      • Exploratory testing is encouraged. Exploratory tests are not pre-designed or pre-defined; they are designed and executed immediately
      • Ad-hoc testing may also be encouraged. Ad-hoc testing is done based on the tester’s experience and skills
      The testing types above are those best suited for use in Agile development. Agile presents a different set of challenges for software testing, and it requires: communication between team members, which is critical for success; innovative thinking and possibly non-standard resources for testing; and close communication, which results in better clarity and understanding of the system. “With short timelines, it is critical that the tester has sufficient knowledge of the system, the objectives, and the requirements” - SQA-trained professional with over 16 years’ experience
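The note above that automation need not require costly tools can be illustrated with an in-house script: a table of cases re-run on every build. The function under test and its cases are stand-ins:

```python
# A bare-bones in-house regression harness. Any scripting language works;
# the point is a repeatable table of cases, not an expensive tool.
def apply_discount(price, percent):
    """Stand-in for a function the team ships and must not regress."""
    return round(price * (1 - percent / 100), 2)

REGRESSION_SUITE = [
    ("no discount",   lambda: apply_discount(100.0, 0),   100.0),
    ("ten percent",   lambda: apply_discount(100.0, 10),   90.0),
    ("full discount", lambda: apply_discount(100.0, 100),   0.0),
]

def run_suite(suite):
    """Run every case; return (passed, failed) names so the build can
    fail fast when a previously working behavior breaks."""
    passed, failed = [], []
    for name, case, expected in suite:
        (passed if case() == expected else failed).append(name)
    return passed, failed

passed, failed = run_suite(REGRESSION_SUITE)
```

Hooked into the build, an empty `failed` list becomes the gate for each iteration’s release.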
    • Rapid test-on-the-fly may be the norm in Agile development, but resources focused on testing are still critical to your success
      Critical test considerations for Agile development:
      • Before the stories are written, you still need an overall strategy for testing. Follow the same guidelines and principles as in a more traditional development setting. (This solution will provide you with the best practices to follow; just make any changes that fit your situation. That’s what Agile is all about.)
      • As part of the user stories, make sure you record how each requirement will be tested.
      • If possible, your testers should work very closely with your developers, while developing, to ensure that each requirement is tested as it is created. (Developers should never begin coding anything unless they know how they are going to test for its accuracy.)
      Your testers should also be the voice on the team to make sure that: the objectives of the project are clear to the entire team; at every stage the software is tested to see if it meets the requirements; every requirement is translated to a test case; and, while processes and documentation are not stressed, sufficient steps are taken to ensure the software is delivered per user expectations. This implies that each sprint release is tested thoroughly before it goes out.
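Recording how each requirement will be tested, as suggested above, can be as simple as a small structure attached to each user story; the story text and test names here are invented:

```python
# Sketch: each user story carries its own test list, so untested
# requirements are visible before the sprint starts.
stories = [
    {"id": "US-1", "story": "As a user I can reset my password",
     "tests": ["reset link emailed", "expired link rejected"]},
    {"id": "US-2", "story": "As an admin I can export reports",
     "tests": []},  # no test recorded yet -- a planning red flag
]

def untested_stories(stories):
    """Return the ids of stories that cannot yet be verified."""
    return [s["id"] for s in stories if not s["tests"]]
```

Reviewing `untested_stories` at sprint planning enforces the rule that every requirement is translated to a test case.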
    • If you’re testing an enterprise app, it is of little value how much testing has been done if it cannot be integrated. Make sure your testers are aware of these enterprise testing fundamentals: Testing is a critical requirement for making good enterprise deployment decisions. Approach your enterprise testing in phases across component, feature, and system testing. The test process for enterprise solutions can best be viewed as a reverse engineering of the product development; typically this is a 3-step process involving module verification, feature verification, and usage. Product testing can only begin once one or more of the interrelated functions have been unit tested sufficiently to allow a normal progression through each of the major areas of functionality. Info-Tech Insight: Start testing early, to give your supplier, vendor, or internal development group sufficient time to respond with fixes, and have your test team verify the fixes while still preserving your timeline.
    • Enterprise app testing can include some assumptions of previous testing, but you still need to load it and smoke it
      • Smoke testing the enterprise application to ensure it is functioning as expected will provide your team with a level of confidence before engaging in testing of the specific changes that have been made.
      • If a requirement is that the application runs on multiple servers, and OS platforms, then everything that the user can do must be tested for each different component .
      • In enterprise integration projects, it is safe to assume that most component defects will have been found and corrected in earlier tests (i.e., unit testing, vendor testing); therefore, most issues that arise at this stage will be interface- or functionality-related.
      • Most of your time in enterprise integration projects will be spent on system testing. In this phase, testers must make every attempt to mimic and replicate the users’ actions.
      • System testing requires that all of the available components and features are in place and fully implemented into the enterprise system.
      • The enterprise system must be tested for load and stress. Testing the reliability of an enterprise system in this manner should be started as early as possible, since it tends to uncover defects that could be design flaws.
      Testers should be aware of the points above. Info-Tech Insight: Since system testing will help to determine the production readiness of the enterprise system, customer acceptance testing techniques are best used during enterprise testing.
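The smoke-testing idea above can be sketched as a short harness that runs a list of basic health checks before deeper test passes begin; the checks themselves are trivial stand-ins for real probes:

```python
# Sketch of a smoke-test harness: run every "is it basically alive?"
# check, collect failures rather than stopping at the first one, so a
# single report shows everything that is broken.
def run_smoke(checks):
    failures = []
    for name, check in checks:
        try:
            ok = check()
        except Exception:
            ok = False  # a crashing check counts as a failure
        if not ok:
            failures.append(name)
    return failures

SMOKE_CHECKS = [
    ("app responds",       lambda: True),              # stand-in: ping succeeded
    ("db reachable",       lambda: 1 + 1 == 2),        # stand-in: trivial query ran
    ("login page renders", lambda: "login" in "login form"),
]
```

An empty failure list is the signal that testing of specific changes can proceed with confidence.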
    • Quality test data is a critical piece of ensuring the ultimate quality is there; without it, it’s a guessing game. Read the full “Case Study: Test data, the critical component” in Appendix III of this document. Scenario: A large organization with a large development and technology footprint: well over 100 IT employees spread across multiple disciplines. SQA has 10 dedicated resources, including a formally trained SQA Manager. The organization is responsible for custom development for internal purposes, web development, and enterprise integration.
      • Several problems exist despite a well-established process, in particular the lack of proper testable data. Considering the nature of the applications (health-related), the data structure is quite extensive; however, the independent test environment has limited testable data, which creates problems during testing, forcing certain tests to fail or never complete.
      What Is Wrong:
      • SQA Manager is aware of the challenge facing the organization. Understanding the root problem is the first step to correcting the situation.
      • The recommendation is to spend time creating an actual subset of data from production data, or to create a system in production that mirrors production data, writing to the test database using an algorithm to sufficiently change the data. This would be time-consuming and require design and development, but would be well worth it for the future testability of the overall system.
      Lessons:
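One possible shape for the “algorithm to sufficiently change the data” recommended above is deterministic pseudonymization, sketched here; the salt, field names, and record are illustrative:

```python
import hashlib

# Sketch: derive a stable pseudonym from each sensitive value, so masked
# records stay internally consistent (joins still work) while real values
# never reach the test database. Salt and fields are illustrative.
SALT = "rotate-me-per-refresh"

def pseudonymize(value):
    """Deterministic, irreversible stand-in for a sensitive value."""
    digest = hashlib.sha256((SALT + value).encode()).hexdigest()
    return digest[:12]

def mask_record(record, sensitive_fields=("name", "health_id")):
    masked = dict(record)
    for field in sensitive_fields:
        if field in masked:
            masked[field] = pseudonymize(masked[field])
    return masked

patient = {"name": "Jane Doe", "health_id": "H-4471", "age": 52}
masked = mask_record(patient)
```

Because the same input always yields the same pseudonym, referential integrity across tables survives the masking, which is what makes the test data actually testable.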
    • Develop & Improve Your Software Testing Strategy: Improve your overall approach to software quality testing. It is impossible to “test” quality into a product. (Roadmap: Assess | Improve – your testing focus, your testing strategy, your testing effectiveness, best practices | Develop | Understand)
    • Incomplete requirements, poor specifications, and unclear objectives are clearly listed as the biggest project problems. In a recent unofficial, independent poll of Software Quality Assurance professionals (LinkedIn), when asked “What do you think the top problems with development projects are?”, the results showed that poor requirements were the leading cause of failure in projects; inadequate testing was NOT the biggest concern. (Chart, approx. N=19: Poor Requirements, Feature Creep, Miscommunication, Inadequate Testing, Unrealistic Schedule.) Make sure your project team fully understands the requirements, and make sure those requirements are testable. If your team can’t test the requirement, you have a problem! Having testers involved early in projects can help prevent the number-one cause of project failure. Visionary requirements cannot be tested, but detailed, well-thought-out requirements can!
    • Improve your testing process by first understanding the 5 most common problems and how to address them:
      • Solid requirements – clear, complete, detailed, attainable, testable requirements that are agreed to by all stakeholders.
      • Realistic schedules – allow adequate time for planning, design, testing, bug fixing, re-testing, changes, and documentation; project resources should be able to complete the project without burning out.
      • Adequate testing – start testing early, re-test after fixes or changes, and plan adequate time for testing and bug-fixing. “Early” testing can include static code analysis/testing, test-first development, unit testing by developers, automated post-build testing, etc.
      • Stick to initial requirements where feasible – be prepared to defend against excessive changes and additions once development has begun, and be prepared to explain the consequences. If changes are necessary, they should be adequately reflected in related schedule changes. If possible, work closely with customers/end-users to manage expectations.
      • Communication – require walkthroughs and inspections when appropriate; make extensive use of group communication tools (groupware, wikis, bug-tracking and change-management tools, intranet capabilities); ensure that information/documentation is available and up to date, preferably electronic, not paper; promote teamwork and cooperation; use prototypes and/or continuous communication with end-users where possible to clarify expectations.
    • To ensure you maintain a good quality process, make sure you follow these basic steps to keep yourself on the upward slope:
      • Prepare test plans, with reviews and approvals by stakeholders
      • Establish test cases
      • Setup and prepare test environment
      • Obtain and install software release from development into test environment
      • Execute tests as determined by plans (test strategy)
      • Track problems, issues, bugs, and their related fixes
      • Retest as applicable (regression, smoke tests)
      • Initiate Alpha/Beta testing (if applicable)
      • Update and maintain all test plans, test cases, test environments and automated test tools throughout
      (Lifecycle phases: Strategy, Planning & Requirements -> Development -> Design -> Deployment)
      • Track down and obtain all relevant project related documentation
      • Get project budgets (time and schedule)
      • Determine resources (responsibilities, reporting structures, processes and standards)
      • Determine how project interacts with existing applications
      • Identify high risk and critical aspects of project.
      • Set testing priorities, determine scope and limitations
      • Determine type of testing to be used – unit testing, integration, functional, system, security, performance, regression, etc.
      • Determine test environment requirements
      • Determine test tool usage (automation, tracking of bugs/issues etc)
      • Identify specific test tasks, resources responsible, time estimates (overall test strategy)
      • Estimate and set schedules, timelines and milestone objectives
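The planning steps above lend themselves to a simple completeness check. Below is a minimal sketch of a test-strategy skeleton that flags planning items not yet addressed; the field names are invented for illustration, not a formal standard:

```python
# Minimal test-strategy skeleton mirroring the planning steps above.
# Field names are illustrative assumptions.
REQUIRED_FIELDS = [
    "project_documents",   # relevant project-related documentation
    "budget",              # time and schedule budgets
    "resources",           # responsibilities and reporting structures
    "integration_points",  # interaction with existing applications
    "high_risk_areas",     # high-risk / critical aspects of the project
    "priorities",          # testing priorities, scope, limitations
    "test_types",          # unit, integration, functional, system, ...
    "environments",        # test environment requirements
    "tools",               # automation, bug/issue tracking
    "tasks",               # specific test tasks, owners, time estimates
    "milestones",          # schedules, timelines, milestone objectives
]

def missing_fields(strategy: dict) -> list:
    """Return any planning fields not yet filled in."""
    return [f for f in REQUIRED_FIELDS if not strategy.get(f)]

draft = {"budget": "6 weeks", "test_types": ["unit", "functional"]}
print(missing_fields(draft))
```

A draft like the one above would be flagged as incomplete before any test execution begins.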
    • Do not limit your testing resources to just testers; include resources from other areas to improve your success Info-Tech Research Group “Just like development, to be effective SQA needs to have good processes. For good testing results, those processes need to include other people as necessary” – SQA Analyst, IT Professional Services (Financial Industry) Organizations that included resources outside of testing throughout their process proved more successful in every case. [Chart: clients’ use of additional resources (those that did vs. those that did not) – Business Analysts +38%, Business Users +33%, Developers +24%, Other IT +83%. N=72. Source: Info-Tech Research Group]
    • Use Info-Tech’s testing checklist as a guide to ensure you and your testing team have sufficient test coverage Info-Tech Research Group
      • Use Info-Tech’s “ Testing Checklist ” to quickly check off that required tasks are completed and covered
      • Test checklists are a great way to keep everything on track.
      • Testing of any software application is a critical component of the life cycle. If testing is not done with sufficient coverage, a substandard product may be deployed.
      • Use the checklist as a guide to ensure proper test coverage has been achieved
      • Are unit tests performed on all individual modules or components by the developers prior to integration?
      • Is component integration testing done?
      • Is stress testing (aka load, volume or performance testing) complete?
      • Is response time tested?
      • Are regression tests conducted?
      The checklist is a great way to remind yourself, your test team, and the project team what has been done, and what has yet to be done. If you don’t already have your own checklist, try this one out.
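A checklist like this is easy to keep machine-readable so outstanding items are never overlooked. A small sketch follows; the items paraphrase the questions above, and the tracking structure is an assumption:

```python
# Coverage checklist sketch; True means the task is complete.
checklist = {
    "unit tests on all modules before integration": True,
    "component integration testing": True,
    "stress/load/performance testing": False,
    "response time tested": False,
    "regression tests conducted": False,
}

def outstanding(items: dict) -> list:
    """Return the tasks that still need to be done."""
    return [task for task, done in items.items() if not done]

for task in outstanding(checklist):
    print("TODO:", task)
```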
    • All good projects begin with a plan. For testers, that means drafting a test strategy and a comprehensive test plan Info-Tech Research Group
      • Software identification (version & release)
      • Document Revision History
      • Document Purpose Statement – inc. intended audience
      • Testing objective statement (of product/project being tested)
      • Software product/project overview
      • Related document list (requirements, functional, design etc)
      • Standards (legislated by gov., legal, industry, or company mandated)
      • Naming Conventions (development & testing)
      • Resource responsibilities & allocation
      • Requirements traceability test matrix
      • Product/project assumptions & dependencies
      • Project testing priorities, scope of testing, risk analysis & mitigation
      • Test plan outline
      • Test environment description & production differences, locations
      • Any unique process related to customer
      • Test data setup and/or dependencies
      • Error logging and setup required throughout testing process
      • Additional software, hardware requirements
      • Test Automation tools
      • Project test metrics – what will be measured and how
      • Reporting requirements – who needs to know and when
      • Entrance and exit criteria
      • Initial project estimates
      • Test suspension and restart criteria
      • Licensing issues
      • Open issues (outstanding from previous releases)
      Test Plan – What you need to know! Project test plans are documents that describe the objectives, scope and approach for the project testing effort. The process of creating the test plan is a useful exercise that forces an evaluation of the entire validation procedure for the given project. The completed test plan document is also extremely useful for others to understand the validation process, specifically why and how certain validation will occur. Of respondents who had the most success with their projects, 95% agreed they followed a testing strategy. Initialization: Overview & Setup: Testing Details:
    • Improve & streamline your test planning by utilizing Info-Tech’s test plan template Info-Tech Research Group Use Info-Tech’s “ Test Plan Template ” to quickly lay out and streamline your project test plan. Test strategies are unique, but the categories and types of information you need to record are the same each time. This template will guide you through building a complete and comprehensive test plan/strategy document.
      • Project Factors
      • Project Objectives & Tasks
      • Testing strategy overview
      • Types of tests to be run
      • Hardware requirements
      • Software requirements
      • Project schedules
      • Resource schedule
      • Project dependencies
      • Project risks
      • Impacted stakeholders
      • Signoff and more
      The plan you produce should provide sufficient detail to permit identification of the major testing tasks and estimation of the time required to do each one. With this template you will be able to describe the overall approach to testing. For each major group of features or feature combinations, you will be able to specify the approach which will ensure that these feature groups are adequately tested, and specify the major activities, techniques, and tools which are used to test the designated groups of features.
    • Descriptive reporting of any issues found during testing is critical to ensure you continue on the upward slope of quality Info-Tech Research Group When reporting an incident, include:
      • Full and complete details about the bug so that developers can easily understand the bug and reproduce if necessary.
      • Some sense of the severity of the issue (critical to low)
      • Current status of the bug (e.g. re-test, new, etc)
      • Application name, version and release information
      • Function, feature, module, or object where the bug occurred
      • Environment details: system, platform, and hardware specifics (if Web related, include browser and client info)
      • Test case number (what test was running when bug was found)
      • Brief bug description (one line)
      • Full bug description
      • Description of steps leading to the bug occurrence (required for reproduction of the bug/issue)
      • Any specific data used during the test (e.g. files, data, messages)
      • Error messages produced (if any)
      • Whether the bug is reproducible
      • Tester Name, Date
      • Description of what caused the problem
      • Description of fix (be descriptive, “fixed” does not help anyone)
      • Section of code, module, function, class etc that was fixed
      • Date of fix
      • Software version (i.e., build number) that contains the fix
      When reporting a resolution, include: When re-testing a resolution, include:
      • Tester responsible for re-test
      • Re-test date
      • Re-test results
      • Regression testing requirements
      • Tester responsible for regression testing
      • Regression testing results
      When a bug is found in the software it needs to be communicated back to the development team so that they can fix it. After development has resolved the issue, it needs to be re-tested, and a determination made against the requirements as to whether any regression testing should be executed to check whether the fix has created problems elsewhere in the software. If a problem/issue tracking system is in place it should already handle these processes. There is a very broad and robust base of commercial issue tracking software available if you do not currently have one.
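The incident fields listed above can be captured in a simple record so every report is complete and consistent before it reaches development. A sketch follows; the field names and sample values are illustrative, not any specific tracker’s schema:

```python
from dataclasses import dataclass

# Illustrative incident record carrying the fields described above.
@dataclass
class BugReport:
    app: str                  # application name, version, release
    module: str               # function/feature/module where it occurred
    environment: str          # system, platform, hardware, browser
    test_case_id: str         # test running when the bug was found
    summary: str              # brief one-line description
    steps_to_reproduce: list  # required for reproduction of the bug
    severity: str = "low"     # critical .. low
    status: str = "new"       # new, re-test, ...
    reproducible: bool = True
    tester: str = ""

bug = BugReport(
    app="OrderEntry 2.1",
    module="checkout",
    environment="Windows / Chrome",
    test_case_id="TC-042",
    summary="Total not updated after discount applied",
    steps_to_reproduce=["Add item", "Apply discount code", "View total"],
    severity="high",
)
print(bug.severity, bug.status)
```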
    • One of the most difficult questions is knowing when enough is enough; when testing should stop Info-Tech Research Group How do you know when enough is enough; when do you stop testing?
      • Modern software can be very complex, running across multiple servers, jumping between environments, and accessing numerous data sources along the way. As such, it is virtually impossible to fully complete testing in many circumstances
      • As a guide, the following represents some common reasons to STOP testing:
        • Release deadline
        • Testing deadline
        • Test budget exhausted
        • Test coverage meets specified levels (you complete your test strategy)
        • Bug/Issue rate falls below a specified threshold
        • Alpha/Beta testing period ends
      It sounds like an easy question to answer; however, it can be one of the most difficult for SQA resources to determine. It is imperative that SQA provide a comfort rating to management to aid in the decision to release the software to customers. What do you do when software released to testing is just too buggy to continue? The best thing to do in this case is to document the most critical and blocking bugs. This type of problem can have a significant effect on project timelines and schedules, and it can point to deeper-rooted problems within the development organization: insufficient unit testing, insufficient integration testing, poor design, failure to follow requirements, or improper adherence to development build and release processes. SQA must ensure that development managers are made aware of these situations and are provided with adequate documentation.
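The stop criteria listed above can be expressed as a simple gate. The sketch below uses invented thresholds; real exit criteria belong in the test plan, not in code defaults:

```python
# Sketch of a stop-testing gate over the common criteria above.
# Thresholds are illustrative assumptions.
def should_stop(days_left, budget_left, coverage, bug_rate,
                coverage_target=0.95, bug_rate_threshold=2):
    """Return (stop?, reason) from the first criterion that is met."""
    if days_left <= 0:
        return True, "release/testing deadline reached"
    if budget_left <= 0:
        return True, "test budget exhausted"
    if coverage >= coverage_target:
        return True, "coverage meets specified level"
    if bug_rate <= bug_rate_threshold:
        return True, "bug rate below threshold"
    return False, "keep testing"

print(should_stop(days_left=5, budget_left=100, coverage=0.97, bug_rate=8))
```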
    • Despite all your good efforts, sometimes the clock just isn’t on your side. If that happens, where does that leave testing? Info-Tech Research Group You have done everything right. You planned, you strategized, you lined everything up … but the deadline is looming.
      • Talk to your stakeholders, analyze the risks to determine where the best testing efforts should be focused.
      • Risk analysis is a perfectly appropriate tool that should be used. It is rarely possible to test every aspect of an application, every event, every dependency, or everything that could possibly go wrong.
      • Risk Analysis requires good judgment skills, common sense and most of all experience in order to make sound decisions.
      Evaluating risk can be challenging, but consider this:
      • What functionality is most important to the overall project?
      • What functionality is most visible and apparent to the user/customer?
      • What functionality has the most significant safety impact?
      • What functionality is most important to the user/customer?
      • What functionality is easily tested early?
      • What functionality is most complicated and prone to errors?
      • What area of the code was developed in a rush?
      • What parts of the requirements and/or design were weak or unclear?
      • What do the developers think is high risk?
      • What kind of problem might cause negative publicity?
      • What kind of problems would cause the most customer service complaints?
      What if the project just isn’t big enough to justify extensive testing?
      • Try not to consider the size of the project but rather focus on the impact if errors occur.
      • Even the smallest projects should at least have some risk assessment completed.
      • Often ad-hoc testing, or limited testing may be sufficient to achieve a level of confidence
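One common way to apply the risk questions above is a likelihood × impact score per functional area, testing the highest-scoring areas first. The areas and scores below are invented for illustration:

```python
# Risk-based test prioritization sketch: score = likelihood * impact.
# Functional areas and scores are invented sample data.
areas = [
    {"name": "payment processing", "likelihood": 3, "impact": 5},
    {"name": "report export",      "likelihood": 2, "impact": 2},
    {"name": "login",              "likelihood": 1, "impact": 5},
]

for a in areas:
    a["risk"] = a["likelihood"] * a["impact"]

# Test the highest-risk areas first when time is short.
for a in sorted(areas, key=lambda a: a["risk"], reverse=True):
    print(f'{a["name"]}: risk {a["risk"]}')
```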
    • Improve your effectiveness by following this simple list of best practices for software quality testers Info-Tech Research Group
      1) Learn to analyze your test results thoroughly
      2) Learn to maximize your test coverage
      3) Break your application into smaller functional modules
      4) Write test cases for intended functionality
      5) Start testing the application with the intent of finding bugs
      6) Write your test cases in the requirement analysis and design phase
      7) Make your test cases available to developers prior to coding
      8) If possible, identify and group your test cases for later regression testing
      9) Applications requiring critical response time should be thoroughly tested for performance
      10) Developers should not test their own code
      11) Go beyond requirement testing
      12) While doing regression testing, use bug data
      13) Note down the new terms and concepts you learn while testing
      14) Note down all code changes made for testing
      15) Keep developers away from the test environment
      16) It’s a good practice to involve testers right from the software requirement and design phase
      17) Testing teams should share best testing practices
      18) Increase your conversations with the developers
      19) Don’t run out of time to do high-priority testing tasks
      20) Write clear, descriptive, unambiguous bug reports
      Read the full “ Testing Best Practices ” in Appendix IV of this document. Don’t forget: testing is a creative and challenging task; how you handle the challenge depends on your skill and experience.
    • Info-Tech Research Group Develop & Improve Your Software Testing Strategy Develop your testers, your process, and the means to measure Involve SQA early, but not too early … wait until the ambiguity starts to settle Developing your testing strategies Assess The General Situation Your Environment Improve Your Testing Focus Your Testing Strategy Your Testing Effectiveness Best Practices Develop Your Testers & Test Coverage Your ability to measure success Understand Program Management Concordant Bodies & Alternatives
    • Like gears in a well-oiled machine, having the right people on the job is your best guarantee of a quality product Info-Tech Research Group Software Tester – focus on testing – reaction oriented Software QA – focus on overall quality – prevention oriented
        • Test-to-Break Attitude
        • Strong desire towards quality
        • Close attention to detail
        • Capability of seeing big-picture
        • Understanding of customer point of view
        • Communicates with tact and diplomacy (necessary in communicating with developers)
        • All traits of Software Tester
        • Detailed understanding of SDLC and Business interactions
        • Capability to align tasks to company goals
        • Great communication skills
        • Ability to understand issues from all viewpoints
        • Patience, diplomacy and understanding
        • Familiarity with SDLC (Development Process)
        • Enthusiastic – team focused
        • Able to promote positive atmosphere despite working in a perceived negative environment
        • Capable of promoting teamwork to increase overall productivity
        • Able to promote cooperation between developers, testers, QA, and management
        • Diplomatic and capable of promoting improvements to QA processes
        • Ability to withstand pressures of having to say “NO” to other managers when quality is insufficient or when QA Process is not followed
        • Good judgment of people and their skills
        • Able to communicate both technically and non-technically
        • Organized, Methodical and focused
      QA Manager – focus on Quality – process & team
        • Easily communicates both technically and Non-technically
        • Development experience useful (not critical)
        • Good judgment skills
        • Problem solver
      65% of survey respondents say formal SQA training greatly improves their success
    • Having the right mix of experience and training in the right roles makes a difference to your chance of success Info-Tech Research Group Our survey shows: clients that showed greater success had 36% more experienced staff on their team; organizations that showed greater success had 45% more staff with development backgrounds; clients that showed greater success had 41% more staff with formal QA training. [Chart: Increasing Success – Relevant Work Experience +36%, Development Background +45%, Formal QA Training +41%. N=72. Source: Info-Tech Research Group] In a related unofficial poll of SQA professionals, when asked “ What is a good ratio of developers to testers? ” the response was an overwhelming and unanimous “NO” – the unanimous reasoning being simply that there are too many variables to give a reasonably accurate answer. However, the survey did show that while the minimum ratio was 0:1 and the maximum was 1:30, the most common ratio was 1:3 (testers to developers). “ Having the right resources spread between Dev, SQA, and the BA’s is like 3 legs of a stool … you need them all ” – IT Manager, Financial Industry
    • Trained SQA testers should know what type of test is best, but using these should be your minimum coverage Info-Tech Research Group Unit Testing - the most micro scale of testing; to test particular functions or code modules. Typically done by the developer and not by testers, as it requires detailed knowledge of the internal program design and code. Not always easily done unless the application has a well-designed architecture with tight code; may require developing test driver modules or test harnesses Functional Testing - black-box type testing geared to functional requirements of an application; this type of testing should be done by testers. This doesn't mean that the programmers shouldn't check that their code works before releasing it (which of course applies to any stage of testing) Integration Testing - testing of combined parts of an application to determine if they function together correctly. The 'parts' can be code modules, individual applications, client and server applications on a network, etc. This type of testing is especially relevant to client/server and distributed systems Exploratory & Ad-Hoc Testing - often taken to mean a creative, informal software test that is not based on formal test plans or test cases; testers may be learning the software as they test it Regression Testing - re-testing after fixes or modifications of the software or its environment. It can be difficult to determine how much re-testing is needed, especially near the end of the development cycle. Automated testing approaches can be especially useful for this type of testing Usability Testing & User Acceptance - testing for user-friendliness. This is the most subjective, and will depend on the end-user or customer. User interviews, surveys, video recording of user sessions, and other techniques can be used. 
Developers and testers are usually not appropriate as usability testers. “ Automated testing is not the silver bullet that everyone believes it to be, but yet another tool in the QA toolbox. An automation framework is great when it is used to enhance testing ” – SQA-trained professional with over 16 years’ experience. Read the complete list of testing types in Appendix V
    • What to, when to, or even whether to automate is a crucial decision; make sure you choose the right tool for the job
      • Testing Tools
      • Testing tools and helpers range from stand-alone applications to full-function test suites
      • Software test suites are typically packed with a wide variety of tools ranging from unit test tools to management reporting and monitoring.
      • Automated tools can help you decide whether you have sufficient test coverage
      Info-Tech Research Group
      • Design
      • Source Testing
      • Functional Testing
      • Performance & Load Testing
      • Language Specific–unit testing
      • Test management tools
      Automation comes in many flavors. Some of the popular test tools/suites available today include:
      • HP QuickTest Professional – Functional & Regression
      • IBM Rational Functional Tester – Regression, Record & Playback
      • Selenium – Testing for Web Apps
      • SilkTest – Functional Enterprise Testing
      • TestComplete – Full Suite: Functional, Regression, Unit, GUI…
      • TestPartner – GUI
      • WatiR/WatiN/WatiJ – Web, Browser-based testing
      • Requirements
      • API Testing
      • Log Analysis
      • Bug & Defect Trackers
      • And on… and on…
      • Database Testing
      • Test Case Management
      • HTML & Link Validators
      • Site Management
      • Security Testing
      • Communications
      Automated testing tools can only find what they are programmed to find. Most make this determination by comparison to a baseline; if you don’t have a good baseline, your tool will be of no use to you. “ Invest in your people first, then the tools. The tools are no magic bullet if the people don’t know the process” – IT Manager, Financial Industry
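The point about baselines is worth making concrete: at its simplest, an automated check compares current output against a stored baseline and reports every deviation. The function and sample data below are illustrative only:

```python
# Sketch of a baseline comparison, the core of many automated checks.
# Sample outputs are invented.
def compare_to_baseline(actual: str, baseline: str) -> list:
    """Return line-by-line differences from the baseline."""
    diffs = []
    for i, (a, b) in enumerate(zip(actual.splitlines(),
                                   baseline.splitlines()), start=1):
        if a != b:
            diffs.append(f"line {i}: expected {b!r}, got {a!r}")
    return diffs

baseline = "Total: 10.00\nTax: 0.50\n"
actual   = "Total: 10.00\nTax: 0.55\n"
print(compare_to_baseline(actual, baseline))
```

If the baseline itself is wrong or stale, every run either hides real defects or reports false ones, which is exactly the caution in the slide above.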
    • Get in the habit of writing test cases to validate your coverage. Follow these steps to develop a repeatable test case format Info-Tech Research Group
      • How to write test cases
        • Fields in test cases:
        • Test case id:
        • Unit to test: What needs to be verified?
        • Assumptions:
        • Test data: Variables and their values
        • Steps to be executed:
        • Expected result:
        • Actual result:
        • Pass/Fail:
        • Comments:
      • When describing the test case it is good to phrase your statement in the following form:
      • Verify (XYZ) … Using (tool) … With (conditions) … To (do what; what is returned, displayed, etc.)
      • Keep in mind while writing test cases that all your test cases should be simple and easy to understand. Don’t write explanations like essays. Be to the point.
      What is a test case? A test case describes an input, action or event and an expected response, to determine whether a feature of an application is working correctly and as expected. Use Info-Tech’s “ Defect Reporting Template ” to quickly lay out and streamline your project test cases. Test cases are unique, but the categories and types of information you need to record are the same each time. This template will guide you through building a complete and comprehensive test case.
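The fields listed above can be sketched as a reusable record, with the result captured at execution time. All names and values below are invented for illustration:

```python
# Test case record sketch using the fields described above.
test_case = {
    "id": "TC-001",
    "unit": "login form",                     # what needs to be verified
    "assumptions": "user account exists",
    "test_data": {"username": "demo", "password": "secret"},
    "steps": ["Open login page", "Enter credentials", "Click Login"],
    "expected": "dashboard displayed",
    "actual": None,                           # filled in during execution
    "pass": None,
    "comments": "",
}

def record_result(tc: dict, actual: str) -> dict:
    """Record the observed result and set the Pass/Fail field."""
    tc["actual"] = actual
    tc["pass"] = actual == tc["expected"]
    return tc

print(record_result(test_case, "dashboard displayed")["pass"])
```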
    • Develop a strategy for projects by accounting for the likelihood and impact of application failure Info-Tech Research Group Use Info-Tech’s “ Software Testing Strategy & Risk Assessment ” tool to determine risk and testing strategies by looking at these influencing factors. Testing strategies should be unique to each development project. The strategy for each project will vary depending on some of the important variables measured in Info-Tech’s Strategy Assessment tool. The strategies for each system in the application portfolio make up the organization’s overall testing strategy, leading to investments that truly reflect testing needs.
      • Project Factors
      • Project Deadlines
      • Project scope
      • Developer skills
      • Development methodology
      • Compliance
      • Application Factors
      • Complexity of code
      • Reliance on external systems and data
      • Number of users
      • Security risks
    • Assess the strengths & limitations of your development efforts by evaluating & measuring the quality of your product Info-Tech Research Group
      • Customer Satisfaction Index (periodic customer survey)
      • number of enhancement requests per year
      • number of maintenance fixes per year
      • User Friendliness: call volume to customer service or helpdesk
      • User Friendliness: training time required for new users
      • number of fix releases (patches) required per year
      • Delivered defect quantities
      • Number of known issues released with software
        • by level of severity
        • by category or cause (requirements defect, design defect, code defect, documentation/help defect, introduced by fixes, etc.)
      • Responsiveness
      • turnaround time of fixes to users by severity
      • time for enhancements (major vs. minor, actual vs. estimate)
      • Product Volatility
      • ratio of fixes vs. enhancement requests (fixes bring the product in line with predefined requirements; enhancements alter or change functionality)
      • Defect Ratio & Removal Efficiency
      • defects after delivery per functional area
      • defects found by customer after delivery, categorized by severity vs. all defects
      • defects found pre-release vs. all defects
      Assessing Product Quality
      • Test Coverage
      • functional coverage (percentage of total coverage)
      • percentage of functional areas, paths, branches, or conditions that were tested
      • ratio of number defects actually found vs. predicted faults
      • Rework
      • re-work effort – hours as a percentage of the original development hours
      • re-worked components as a percentage of total components delivered
      • Reliability
      • availability – time system actually available vs. time required to be available
      • mean time between failures and defects (MTF)
      • total person-hours to repair all defects (PHD)
      • reliability ratio – MTF/PHD
      • total number of fix releases (patches)
      “ Many development shops only have a vague sense of their overall quality; usually this comes in the form of gut feel. ” – IT professional with 25+ years’ experience
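Two of the metrics above are shown below on invented sample counts: defect removal efficiency (pre-release defects vs. all defects) and the MTF/PHD reliability ratio:

```python
# Sample product quality metric calculations; counts are invented.
defects_pre_release = 180
defects_post_release = 20
total_defects = defects_pre_release + defects_post_release

# Defect removal efficiency: defects found before release vs. all defects
dre = defects_pre_release / total_defects
print(f"Defect removal efficiency: {dre:.0%}")

# Reliability ratio: mean time between failures (MTF) over
# person-hours to repair all defects (PHD)
mtf_hours = 400.0
repair_person_hours = 80.0
print(f"Reliability ratio: {mtf_hours / repair_person_hours:.1f}")
```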
    • Evaluate the benefits of your system testing group by measuring and evaluating their costs and application coverage Info-Tech Research Group
      • Test coverage
      • Number of functional units tested / total size (functional units) of the system
      • Number of tests per functional unit
      • Number of test cases per functional unit
      • Acceptance criteria
      • Acceptance criteria tested / total acceptance criteria
      • Test Cost (%)
      • Cost of testing / total cost of project (includes resource cost)
      • Cost to locate defect
      • Cost of testing / number of defects
      • Achieving budget
      • Actual cost of testing / estimated (budget) cost of testing
      • Defects detected
      • Defects detected during testing / total system defects
      • Defects detected after release / total system defects
      • Quality of testing
      • Defects found during testing / total number of defects found
      Evaluate System Testing
      • Costs of:
      • Reviews, inspections and other preventative measures
      • Test planning and preparation
      • Test execution, defect tracking, version & change control
      • Diagnostics, debugging, and fixing
      • Tools and tool support
      • Test case library maintenance
      • Testing and SQA education for resources associated with the current project
      • Management overhead (assuming separate from development)
      Cost of Quality Effort/Activity
      • Costs of:
      • Business loss from defects (current project)
      • Interruption cost – cost of work-arounds
      • Lost sales and goodwill
      • Litigation cost resulting from defects
      Cost of Defects Doing an analysis of cost and benefits to testing can encourage better decision making and ensure that resources are allocated effectively to support the maximum level of quality for a project at the lowest cost.
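The cost metrics above reduce to simple ratios. A sketch with invented figures (test cost as a percentage of project cost, and cost to locate a defect):

```python
# Cost-of-testing metric sketch; all figures are invented sample data.
cost_of_testing = 40_000       # all testing effort, incl. resource cost
total_project_cost = 250_000
defects_found = 200            # defects found by testing

print(f"Test cost: {cost_of_testing / total_project_cost:.0%}")
print(f"Cost to locate a defect: ${cost_of_testing / defects_found:.2f}")
```

Tracking these over several projects gives the cost/benefit baseline the slide recommends for allocating testing resources.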
    • Info-Tech Research Group Develop and Improve Your Software Testing Strategy If you only do a little, you need to understand the basics Understanding is the first step toward useful knowledge Understand how it all fits together Assess The General Situation Your Environment Improve Your Testing Focus Your Testing Strategy Your Testing Effectiveness Best Practices Develop Your Testers & Test Coverage Your ability to measure success Understand If you only test a little, this is what you need to know How testing fits in the bigger SQA picture
    • Unit Testing is the 1st level of testing and the most important! Detecting and fixing bugs early helps reduce costly fixes later
      • Unit Testing
      • This is the first and the most important level of testing. As soon as the developer creates a unit of code the unit is tested for various scenarios. It is through unit testing that the earliest bugs will be found. It is for this reason that:
      • Unit testing is the most important of all the testing levels.
        • Unit Testing Tasks and Steps: Step 1: Create a Test Plan Step 2: Create Test Cases and Test Data Step 3: If applicable create scripts to run test cases Step 4: Once the code is ready, execute the test cases Step 5: Fix the bugs if any and re-test the code Step 6: Repeat the test cycle until the “unit” is free of all bugs
      • Documentation is an important consideration:
        • Documenting test cases prevents oversight
        • Documentation clearly indicates the quality of test cases
        • If the code needs to be retested you can be sure that you did not miss anything
        • It provides a level of transparency into what was really tested during unit testing. Note: this is one of the most important aspects
        • It helps in knowledge transfer in case of employee attrition
        • Sometimes unit test cases can be used to develop test cases for other levels of testing
      Info-Tech Research Group Time pressures to get the job done may result in developers cutting corners in unit testing. It helps to write scripts that automate part of unit testing; automating where necessary will help ensure that the necessary tests were done. An effective unit testing process can increase software reliability and the credibility of the developer. Many new developers take unit testing lightly and realize its importance too late in the project.
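The cycle above can be made concrete with a minimal example: a small unit of code and its developer-written tests, run before integration. The function under test is invented for illustration:

```python
import unittest

def parse_quantity(text: str) -> int:
    """Parse a positive item quantity from user input (unit under test)."""
    value = int(text.strip())        # raises ValueError on non-numeric input
    if value <= 0:
        raise ValueError("quantity must be positive")
    return value

class TestParseQuantity(unittest.TestCase):
    """Step 2 above: test cases and test data for the unit."""
    def test_valid_input(self):
        self.assertEqual(parse_quantity(" 3 "), 3)

    def test_zero_rejected(self):
        with self.assertRaises(ValueError):
            parse_quantity("0")

    def test_non_numeric_rejected(self):
        with self.assertRaises(ValueError):
            parse_quantity("abc")

# Steps 4-6 above: execute the cases; fix and re-run until all pass.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestParseQuantity)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print("all tests passed:", result.wasSuccessful())
```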
    • The size of the hamster powering the wheel is unimportant when functional testing is performed – requirements only
      • Functional Testing
      • This is usually the first test phase that the test organization is responsible for. It is one of the most powerful testing approaches which will significantly reduce the number of defects found in later stages.
      • Functional tests validate and verify that the developed system behaves according to the client/business specifications. Verifying that each component responds correctly to all conditions that may result from incoming events or data.
      • Functional testing requires well formed functional requirements, from which the testers can create definitive test cases.
      • Primary requirements for success:
      • Test planning, functional decomposition , requirements definition/verification , test case design, traceability matrix, defect management, coverage analysis.
      • Functional decomposition is the breakdown of the system into its functional areas. Often the task of creating this document falls to resources within development, however it is an important document in building a robust test strategy by the testers
      • Requirements definition is often the weakest deliverable in the development process. Many development groups go directly from concept to coding without any real preliminary design deliverables. If these documents are not delivered to the testers, then the test team must create its own set of testable requirements which can then be used in mapping against functional components of the system.
      Info-Tech Research Group The objective of functional testing is to measure the quality of the business components of the system. Functional testing can be an overwhelming task for teams with little experience; to ensure success, the scope of the testing effort must be well defined. 94% of survey respondents ranked functional testing as their most important testing
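The traceability matrix mentioned above is, at its core, a requirements-to-test-cases map. A sketch with invented IDs, flagging requirements that no test case covers:

```python
# Requirements traceability sketch: every requirement should map to
# at least one test case. IDs are invented sample data.
requirements = ["REQ-1", "REQ-2", "REQ-3"]
test_cases = {
    "TC-01": ["REQ-1"],
    "TC-02": ["REQ-1", "REQ-3"],
}

covered = {req for reqs in test_cases.values() for req in reqs}
untested = [r for r in requirements if r not in covered]
print("Untested requirements:", untested)
```

Gaps found this way are exactly the "testable requirements" the test team must chase down (or write itself) before functional coverage claims mean anything.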
    • If your application is going to be used by people, then you need to do usability testing … it’s just that simple!
      • Usability Testing
      • The primary reason for usability testing is to provide relevant and adequate feedback during the development process to ensure that the completed application will actually be easy to use, effective to use and will provide meaningful and valuable information to the user of the application.
      • Usability testing is coordinated by the test team, however neither developers nor testers should be involved in the actual testing. Testers for usability should be the users themselves.
      • Primary measurement for usability testing:
      • Ease and effectiveness of navigation - Do users find what they need easily? Is there a clear pattern to the navigation that fits easily into the users' mental model?
      • Usefulness of content - What information do your users want/need? Have you organized the content on each page in such a way that it is easy for your users to quickly find it?
      • Effectiveness of presentation - Did the design, fonts and colors highlight the navigation and content, making the site easier to use? Or did the presentation distract or create a barrier between the user and the information?
      • Task success rate - Were the users able to accomplish the key tasks they needed/wanted to accomplish? If they were able to complete a task, did they feel satisfied, neutral, or angry and frustrated?
      Usability testing tests for user-friendliness and is the most subjective of all tests performed. Developers and designers, while talented, aren't like “normal” people; designing systems that make sense to developers will often lead to a site that is not usable by the average person. While particularly necessary for web development or client application development, it should also be considered for corporate or business application development.
    • Any software change can cause existing functionality to fail. It is common for defect fixes to introduce new problems.
      • Regression Testing
      • If any component of the software has been changed, for any reason , then testing needs to be done to ensure that nothing has been negatively impacted by the change.
      • Regression testing verifies:
      • The application works as specified after changes, modifications, or enhancements have been made to it and that original functionality continues to work.
      • Changes, modifications, or enhancements to the application have not introduced new defects.
      • Successful Regression testing includes:
      • Create a Regression Test Plan : Plan should contain an overall strategy, test entry and exit criteria as well as an outline for testing which contains areas of the application affected by the change.  
      • Create Test Cases : Test cases that cover all the necessary areas of change are important. They should describe the change and what to test, steps needed to test, any necessary inputs and expected outputs.
      • Defect Tracking : As with any testing effort, it is important to track defects.
      • Test Automation : If the test cases are automated, they may be executed using scripts after each change is introduced in the system. Executing test cases this way helps eliminate human error, and may result in faster and cheaper execution. However, there is a cost involved in building and maintaining the scripts.
      • Selective Testing : Some teams choose to execute the test cases selectively. They do not execute all the test cases during regression testing. They test only what they decide is relevant. This helps reduce the testing time and effort.
      Systems tend to become more fragile with each change, requiring additional testing of the original system in addition to the changes. With each change, regression testing can become more costly. Regression testing is particularly important in software methodologies where change is embraced and occurs often, e.g. Agile.
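The test-plan and test-case points above can be sketched as a small regression suite using Python's standard `unittest` module. This is an illustrative example, not a prescribed tool: `apply_discount` stands in for any function that was recently changed, and the suite pins down its original behaviour so a change that breaks it fails fast:

```python
import unittest

def apply_discount(price, percent):
    """Hypothetical function under test; assume it was changed recently."""
    if not 0 <= percent <= 100:
        raise ValueError("percent out of range")
    return round(price * (1 - percent / 100), 2)

class RegressionSuite(unittest.TestCase):
    # Each case documents behaviour that must survive future changes.
    def test_existing_behaviour_unchanged(self):
        self.assertEqual(apply_discount(100.0, 10), 90.0)

    def test_boundaries_still_hold(self):
        self.assertEqual(apply_discount(100.0, 0), 100.0)
        self.assertEqual(apply_discount(100.0, 100), 0.0)

    def test_invalid_input_still_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(RegressionSuite)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print("all regression tests pass:", result.wasSuccessful())  # True
```

Once a suite like this exists, it can be re-run after every defect fix or enhancement, which is exactly the "selective testing" and automation trade-off the slide describes.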
    • While unit testing focuses on testing the individual pieces, integration testing tests all the pieces as they come together
      • Integration Testing
      • Integration testing is arguably the most crucial step in the software development life cycle. Different components are integrated together and tested. This can be a complicated task in enterprise applications where diverse teams build different modules and components, or when commercial software is introduced.
      • Before integration testing can occur, all components must have been independently tested and verified.
      • Effective Integration testing includes:
      • Software Configuration Management : Since integration testing focuses on the integration of components, and components can be built by different developers or development teams, it is important that the right versions of components are tested. One of the biggest problems faced in n-tier development is integrating the right versions of components. Each time the application components are integrated, you must know exactly which versions are being integrated.
      • Automate Build Process : Many errors occur because the wrong versions of components were sent for the build, or because components are missing. Write a build script to integrate and deploy the components and reduce manual errors.
      • Document : Document the integration process/build process to help eliminate the errors of omission or oversight. It is possible that the person responsible for integrating the components forgets to run a required script which could invalidate the build, or produce inaccurate results.
      • Defect Tracking : Integration testing will lose its edge if the defects are not tracked correctly. Each defect should be documented and tracked. Information should be captured as to how the defect was fixed. This is valuable information as it can help in future integration and deployment processes.
      Even if a component is successfully unit tested, it is of little value if it cannot be successfully integrated with the rest of the application. Integration test cases should focus on scenarios where one component is being called from another. It is very common for many bugs to be discovered during integration testing.
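The configuration-management point above, knowing exactly which component versions are being integrated, can be sketched as a pre-integration check run at the start of the build script. The component names and version numbers are hypothetical:

```python
# Hypothetical build manifest: the versions the build is supposed to integrate.
expected_manifest = {"auth": "2.1.0", "billing": "1.4.2", "reports": "3.0.1"}

# Versions actually found in the staging area (billing is stale here).
deployed = {"auth": "2.1.0", "billing": "1.4.1", "reports": "3.0.1"}

def version_mismatches(expected, actual):
    """Return {component: (expected, actual)} for every wrong or missing version."""
    return {
        name: (expected[name], actual.get(name))
        for name in expected
        if actual.get(name) != expected[name]
    }

bad = version_mismatches(expected_manifest, deployed)
if bad:
    print("Abort build, wrong component versions:", bad)
```

Failing the build here, before integration tests run, avoids the "wrong version was integrated" class of error the slide warns about.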
    • Automated testing tools make sense when the efficiency gained from their use is greater than the tools' cost to purchase and maintain
      • Automated Testing
      • Automated testing can never replace or reproduce everything that a person can do, however it can be extremely useful for repetitive testing such as regression testing or build verification testing.
      • Automation is generally put into place to automate a manual process that already has a well established and formalized plan.
      • Automated tests can easily be repeated, and can run through immense numbers of conditions and scenarios in a fraction of the time it would take to manually run the same tests.
      In some environments (e.g. web development) where various hardware-related differences exist, or where multiple browsers must be tested, automation can be an amazing time saver. Where manual testing may require several days, automated testing may require only hours, which translates directly into cost and time savings. Automated tests can be shared between your test group and developers, and can be triggered to run automatically every time new or changed source code is checked in. If the test fails, the developer can be notified automatically, avoiding the involvement of your testing resources until the build is successful and ready for testers to take control.
      • Automated testing can reproduce some situations that would be virtually impossible for a person to do; for example, performance load testing of a system simulating hundreds or thousands of simultaneous users.
      Automated testing can be affordable, and is recommended in order to save both time and money; however, you should be prepared to spend time setting up and maintaining automated test scripts.
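The simultaneous-users point above can be illustrated with a minimal load-test sketch using a thread pool. `handle_request` is a stand-in for a real call against the system under test, and the worker count and latency are illustrative assumptions:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def handle_request(user_id):
    """Stand-in for one user's request to the system under test."""
    time.sleep(0.01)  # simulated request latency
    return user_id

def run_load_test(n_users, workers=50):
    """Fire n_users simulated requests concurrently; return (completed, seconds)."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(handle_request, range(n_users)))
    elapsed = time.perf_counter() - start
    return len(results), elapsed

count, elapsed = run_load_test(200)
print(f"{count} simulated users completed in {elapsed:.2f}s")
```

A real load test would replace `handle_request` with an HTTP call or database query and record per-request latencies, but even this sketch shows why such a scenario is impractical to run by hand.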
    • If your testing team is not already made up of expert testers, then automation is a bad idea and a waste of time
      • The case against …. Automated Testing
      • Automated testing is not recommended if your testing team is not already an expert at testing.
      • Automating your testing can be a huge waste of time, and become a real money pit if your testing team does not know how to focus on what to test, why to test it, and when.
      • Once your test team has mastered managing risk and test coverage by applying sound principles, judgment and technique, then (and only then) should you start talking about the possibility of automating your testing.
      Certain testing can never be automated, as it requires a person. Exploratory testing and usability testing in particular require more thought and randomness and, in the case of usability, require the subjectivity of a real person. It is highly recommended that, when just learning to automate, you bring in (or hire) a technical test engineer with specific and relevant experience with the automation software. If you don't, you run the risk of reinventing the proverbial wheel, which can lead to an extremely expensive venture.
      • Automated testing and automation tools require specialized experience and knowledge. You may be further ahead to bring in outside help, or have your testing team individually trained on the tools. Either way, your team will likely require an extended period of time feeling their way around the capabilities of the tools before you truly start to see any payback from your expense.
      • Some development does not lend itself well to automated testing. If your focus is on the GUI (user interface), automation tools for this tend to be on the more expensive side and will rarely give you any great direct benefit.
      • It does not make sense to use automation tools if, during the analysis phase, it is determined that the time required to create and maintain the scripts will exceed the time that can reasonably be assigned to the overall testing of the application itself (i.e. short-duration projects).
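The script-cost rule above reduces to a back-of-envelope break-even calculation: automation pays off only after enough runs to recover the scripting investment. All hours below are illustrative assumptions:

```python
import math

def breakeven_runs(script_hours, maint_hours_per_run,
                   manual_hours_per_run, auto_hours_per_run):
    """Number of test runs after which automation is cheaper than manual,
    or None if automation never pays off."""
    saving_per_run = manual_hours_per_run - (auto_hours_per_run + maint_hours_per_run)
    if saving_per_run <= 0:
        return None  # each run costs at least as much as manual testing
    return math.ceil(script_hours / saving_per_run)

# e.g. 80h to script, 1h upkeep per run, 16h manual vs 1h automated execution
print(breakeven_runs(80, 1, 16, 1))  # pays off after 6 runs
```

For a short-duration project that will only run the suite two or three times, a break-even of six runs confirms the slide's advice: stay manual.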
    • Software Quality Assurance is not all about the testing; in fact, SQA encompasses all aspects of the Software Development Life Cycle (SDLC), from planning through support. An organization that focuses on quality management looks at all aspects from beginning to end, with a goal of total satisfaction. Software testers typically get involved during the design and development phases … but you really shouldn't wait, or stop there.
      “Getting them engaged early creates more successful projects because they have the context to plan and prepare.” - Manager of QA, Government (Health Sector)
      “If you relegate QA responsibilities to just testing the end product, you overlook an opportunity to integrate quality into the entire software development life cycle.” - SQA-trained professional with over 16 years' experience
      “To truly be effective at building quality, you can't just look at testing alone. Making sure the requirements are even testable is a crucial step towards quality.” - SQA-trained professional with over 16 years' experience
    • Often you will hear the terms SQA, SQC (QA/QC), or even testing used as if they were 100% interchangeable. It is important to understand the distinction, by the classic definitions, to avoid confusion and to add clarity to your resource roles and responsibilities. Software Quality Assurance comprehends the whole; Software Quality Control encompasses testing; testing is at the heart of Quality Control. Using any of these terms is typically sufficient to get your point across to any of the development team, including management.
      Software Quality Assurance:
      • is proactive - it is a process of planned and systematic actions required to provide sufficient confidence that the developed product will conform to established standards and requirements
      • applies to the entire organization. To be effective, SQA must be independent and should report to the same executive level that the most senior development manager reports to
      • is broadly focused on the big picture - prevention, people, process, and change
      • is a way of life, a philosophy … it never stops
      • has a long-term focus
      Software Quality Control:
      • is reactive - it is a set of actions that check, verify, and validate that the project has followed established standards and procedures and that the required deliverable has been produced
      • addresses specifically what has been found and cannot answer whether there is anything else to be found
      • is concrete
      • refines its focus to the technology - react & repair, product, components, and actions
      • has a short-term focus
      Software Testing:
      • is very focused and specific to the product and its components - action and reaction, immediate, here and now
    • Develop & Improve Your Testing Strategy - Summary & Conclusions: Assess your readiness, Improve your success, Develop the means to achieve, Understand SQA, SQC, and testing (A-I-D-U)
      • Whether you develop it, configure it, or integrate it, you simply can't afford to ignore Software Quality Assurance, and having solid testing practices is the key
      • Software Quality Assurance encompasses both Software Quality Control (SQC) and software testing. Business leaders will all eagerly agree that quality is important, but understanding how to get there, what processes to implement, and which people to engage is often a difficult decision, and a hard one to justify
      • As development organizations mature, priorities jockey for position. For inexperienced or startup groups, elusive quality is not the major concern; delivery is. But as the organization matures, quality ultimately and always ends up at the top of the pile
      • The most common problems and their solutions seem obvious, and understanding the depth of testing, the types of tests, why they are run, and who needs to run them is critical to achieving success; following a well-thought-out test plan is one of the first steps in the quality direction
      • Attempting to turn a screw with a knife blade can work, but it works much better, faster, and more accurately with the proper tool. Having trained quality professionals will help the organization achieve the level of quality that your customers expect … and deserve
      • Develop and improve your overall quality by learning and implementing proper process that fits your organizational needs. Whether you develop for the web, or integrate off the shelf software, your customers deserve your best, and you deserve their trust
      This solution has been designed to A-I-D-U you in improving your current testing practices, developing new, practical, tried-and-true processes, and providing a level of understanding of software testing that will enable you to make informed and strategic decisions for your organization in your pursuit of quality!
    • Appendix Info-Tech Research Group
      • Appendix I Case Study: Testing is limited. The business doesn't care
      • Appendix II Case Study: Even small shops need a testing strategy
      • Appendix III Case Study: Test data, the critical component
      • Appendix IV Testing best practices (x3)
      • Appendix V Understand the various testing types (x2)
      • Appendix VI Available tools & templates
      • Appendix VII Demographics
    • Appendix I Case Study: Testing is limited. The business doesn't care Info-Tech Research Group
      The Situation : Finance industry. Applications developed include the website and internal applications. There are approximately 30 people in IT, including 10 web developers (no testers), spread across an average of 10 applications and/or projects. Some project managers are involved in some of the projects and will sometimes produce a plan. Development staff typically draft a test plan (of sorts). There is no standard methodology followed; everything is very ad hoc. Attempts are made to collaborate with the business when possible, but project managers are too distant and operational managers are not equipped with the knowledge to provide feedback.
      The Challenge : One of the biggest struggles facing the group is trying to get the business included and involved. The business has no understanding of development or testing and does not support what it doesn't know. IT has no idea what it is supposed to test, and because this has not caused a major problem yet, the business does not feel enough incentive to get involved, or to support the need for testing. The only time the business gets involved in projects is when it needs to sign off for compliance reasons. Generally the business takes on the attitude of … “we just trust you.”
      Comments: “We have monitoring software for servers. Monitor but don't test. We put it out there and hope that it doesn't crash. It will go down and we're reactive about it. The IT budget is limited, so we can't invest in testing.” The business remains uninvolved because they trust the development group. There is not enough regression testing, by the manager's estimation, and as a result there are post-production issues. Since the business doesn't see it as a concern, they may never change testing practices.
      Recommendations: This business needs to take a step back and look at the risk associated with not doing testing. Trusting their developers is admirable, but it won't help the business. A serious look at the cost justifications for testing should be done. Comparing that cost against the cost of reacting to problems, and the risk they are exposing themselves to (especially considering their industry), should be enough to point them in the right direction. Many businesses fall into this pit … they trust their developers, and so testing is a cost they cannot justify. Before you write it off, as with any good business decision … do your due diligence; quality is worth it! Go Back
    • Appendix II Case Study: Even small shops need a testing strategy Info-Tech Research Group
      The Situation : Small shop profile. CIO, application manager, and QA manager at a manufacturing-industry IT shop of four. Development for their iSeries-based commercial ERP platform is done by an external partner, with whom they have had years of experience. They work with a VAR to specify and test ERP enhancements, and sometimes just don't have time to test.
      The Challenge : For big projects, they will test in conjunction with the developers. The alternative: for minor projects, and when there just isn't time, they will have the development partner test functionality.
      Comments: “No, I think that for us, because we're a smaller shop, it comes down to time. We just don't (test). I'm gonna say from a manpower perspective, we definitely take some shortcuts right now. In some cases, I have my VAR, my key contact, he will test it. He'll show me what - we'll quickly go over it and I'll say “Yeah, okay that looks good” and I just let it go. I don't test it; I don't necessarily test them myself just because I don't have the time to do it.” For vendor patch updates, the partner installs and tests patches. But they don't put all their faith in the hands of the development partner: before any patches are applied, a copy is made of the previous release. If something happens once the development partner applies the patches, they go back to the original.
      Recommendations: The manager in this case recognizes the need to test; he uses the 3rd-party VAR, whom he trusts, to take on a lot of the testing responsibilities, and trusts that if something goes wrong, they can back out of it. As in the first case study, this business needs to take a step back and look at the risk associated with not doing testing. Trusting their VAR in this case is risky, and it won't help the business; the VAR does not have the same stake if something goes wrong. A serious look at the cost for testing should be done. All may be well today, but what about tomorrow? As a small shop of only four, it is easy to understand, and easy to justify, not having dedicated testing resources. However, improved quality can be achieved by having dedicated resources that know how to plan, how to strategize, and how to test the integration of the systems. A dedicated resource can free up the development resources to work on more relevant tasks. The development resources in small shops can be an amazing asset, but they can also be very quick to move on. Keep them productive and happy doing the things they love to do … develop. Focus the developers' attention on the solutions, and bring in co-op students to handle the testing. It's your quality; don't leave it to someone else! Go Back
    • Appendix III Case Study: Test data, the critical component Info-Tech Research Group
      The Situation : 100-person IT shop with 10 QA staff and lots of custom and commercial testing. Roles include software architects, developers, e-business analysts, testers, quality-analyst-type roles, project managers, and application desk-side support analysts.
      The Challenge : For security/compliance reasons, the enterprise is limited in the real data it can test with in applications, and incomplete test data can limit testing. The result is that testing does not actually meet the targets in the test script. People need to create custom data, and sometimes they just don't create enough.
      Comments: “So I've seen one of my testers was testing maintenance for something, and there literally was not enough data to sort a column. And that project, that application, had made it through to production.” Another scenario is when data is not unique enough. “We've had this one happen - the entire test, people in there had the same birth date, so you're trying to validate reports that generate different things based on date of birth; you don't really know if it's working or not because you're always seeing the same result.”
      Recommendations: This organization has a well-established and well-formed environment. Its resource make-up is good, the processes in place are good, the testing strategy is good, and there is adequate planning and support for the testing efforts. The shortfall is with the infrastructure for the test environments: in this case, data. Time needs to be spent ensuring that the test team has sufficient data to test for conditions that emulate the real world. The recommendation would be to duplicate one of the production databases, move it to an isolated test environment, and take the time to change the data from real to test data: for example, changing names, titles, and all other facets of the data to test (and obviously fake) data. This is a tedious and time-consuming task; however, once completed, it will increase the effectiveness of the testing team. An alternative recommendation would be to create a mirror within the production system that could write (changed) data to a separate database for use in the test environment. Go Back
    • Appendix IV Testing Best Practices Info-Tech Research Group
      1) Learn to analyze your test results thoroughly. Do not ignore the test result. The final test result may be “pass” or “fail,” but troubleshooting the root cause of a “fail” will lead you to the solution of the problem. Testers will be respected if they not only log the bugs but also provide solutions.
      2) Learn to maximize the test coverage every time you test any application. Though 100 percent test coverage might not be possible, you can always try to reach it.
      3) To ensure maximum test coverage, break your application into smaller functional modules. Write test cases on individual unit modules. Also, if possible, break these modules into smaller parts. E.g.: if you have divided your website application into modules and “accepting user information” is one of them, you can break this “user information” screen into smaller parts for writing test cases: UI testing, security testing, functional testing of the user information form, etc. Apply all form-field type and size tests, and negative and validation tests on input fields, and write all the test cases for maximum coverage.
      4) While writing test cases, write test cases for the intended functionality first, i.e. for valid conditions according to the requirements. Then write test cases for invalid conditions. This will cover expected as well as unexpected behavior of the application.
      5) Think positive. Start testing the application with the intention of finding bugs/errors. Don't assume beforehand that there will not be any bugs in the application. If you test the application with the intention of finding bugs, you will definitely succeed.
      6) Write your test cases in the requirement analysis and design phases themselves. This way you can ensure all the requirements are testable.
      7) Make your test cases available to developers prior to coding. Don't keep your test cases to yourself, waiting for the final application release in the hope of logging more bugs. Let developers analyze your test cases thoroughly to develop a quality application. This will also save re-work time. Go Back
    • Appendix IV - continued Testing Best Practices Info-Tech Research Group
      8) If possible, identify and group your test cases for regression testing. This will ensure quick and effective manual regression testing.
      9) Applications requiring critical response times should be thoroughly tested for performance. Performance testing is a critical part of many applications, but in manual testing it is mostly ignored by testers. Find ways to test your application for performance. If it is not possible to create test data manually, write some basic scripts to create test data for performance testing, or ask the developers to write them for you.
      10) Programmers should not test their own code. Basic unit testing of the developed application should be enough for developers to release the application to the testers, but testers should not force developers to release the product for testing; let them take their own time. Everyone from lead to manager will know when the module/update is released for testing, and they can estimate the testing time accordingly. This is a typical situation in an agile project environment.
      11) Go beyond requirement testing. Test the application for what it is not supposed to do.
      12) While doing regression testing, use previous bug information. This can be useful to predict the most probable bug-filled parts of the application.
      13) Keep a text file open while testing an application and write down the new terms and concepts you learn. Use these notepad observations while preparing the final test release report. This good habit will help you provide a complete, unambiguous test report and release details.
      14) Many times testers or developers make changes in the code base of the application under test. This is a required step in development or testing environments to avoid execution of live transaction processing, as in banking projects. Record all such code changes done for testing purposes, and at the time of final release make sure you have removed all these changes from the final client-side deployment file resources. Go Back
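Best practice 8 (grouping test cases for regression) can be sketched with Python's standard `unittest` module: a simple decorator tags regression tests, and a loader builds a suite from just those. The test names and assertions are hypothetical:

```python
import unittest

def regression(test_method):
    """Mark a test method as part of the regression subset."""
    test_method._regression = True
    return test_method

class CheckoutTests(unittest.TestCase):
    @regression
    def test_total_includes_tax(self):
        self.assertEqual(round(100 * 1.05, 2), 105.0)

    def test_new_coupon_feature(self):  # new-feature test, not regression
        self.assertTrue(True)

def regression_suite(case):
    """Build a suite containing only the methods tagged @regression."""
    suite = unittest.TestSuite()
    for name in dir(case):
        if name.startswith("test") and getattr(getattr(case, name), "_regression", False):
            suite.addTest(case(name))
    return suite

result = unittest.TextTestRunner(verbosity=0).run(regression_suite(CheckoutTests))
print("regression tests run:", result.testsRun)  # 1
```

Teams using pytest would typically do the same thing with markers (e.g. a `regression` marker selected via `-m`); the point is simply that a tagged subset makes quick regression passes cheap to run.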
    • Appendix IV – continued (2) Testing Best Practices Info-Tech Research Group
      15) Keep developers away from the test environment. This is a required step to detect any configuration changes missing from a release or deployment document. Sometimes developers make system or application configuration changes but forget to mention them in the deployment steps. If developers don't have access to the testing environment, they will not make any of these changes accidentally; the changes must be captured in the right place.
      16) It's a good practice to involve testers right from the software requirement and design phases. This way testers can gain knowledge of the application's dependencies, resulting in detailed test coverage. If you are not being asked to be part of this development cycle, then make a request to your lead or manager to involve your testing team in all decision-making processes or meetings.
      17) Testing teams should share best testing practices and experience with other teams in their organization.
      18) Increase your conversations with developers to learn more about the product. Whenever possible, use face-to-face communication to resolve disputes quickly and to avoid any misunderstandings. But also, when you reach an understanding or resolve a dispute, make sure to communicate the same in writing. Do not leave anything strictly verbal.
      19) Don't run out of time for high-priority testing tasks. Prioritize your testing work from high to low priority and plan your work accordingly. Analyze all associated risks to prioritize your work.
      20) Write clear, descriptive, unambiguous bug reports. Do not provide only the bug symptoms; also provide the effect of the bug and all possible solutions. Go Back
    • Appendix V Understand the various testing types Info-Tech Research Group Black box testing - not based on any knowledge of internal design or code. Tests are based on requirements and functionality White box testing - based on knowledge of the internal logic of an application's code. Tests are based on coverage of code statements, branches, paths, conditions Unit Testing - the most micro scale of testing; to test particular functions or code modules. Typically done by the developer and not by testers, as it requires detailed knowledge of the internal program design and code. Not always easily done unless the application has a well-designed architecture with tight code; may require developing test driver modules or test harnesses Incremental Integration Testing - continuous testing of an application as new functionality is added; requires that various aspects of an application's functionality be independent enough to work separately before all parts of the program are completed, or that test drivers be developed as needed; done by programmers or by testers Integration Testing - testing of combined parts of an application to determine if they function together correctly. The 'parts' can be code modules, individual applications, client and server applications on a network, etc. This type of testing is especially relevant to client/server and distributed systems Functional Testing - black-box type testing geared to functional requirements of an application; this type of testing should be done by testers. 
This doesn't mean that the programmers shouldn't check that their code works before releasing it (which of course applies to any stage of testing) System Testing - black-box type testing that is based on overall requirements specifications; covers all combined parts of a system End-to-End Testing - similar to system testing; the macro end of the test scale; involves testing of a complete application environment in a situation that mimics real-world use, such as interacting with a database, using network communications, or interacting with other hardware, applications, or systems if appropriate Smoke Testing - typically an initial testing effort to determine if a new software version is performing well enough to accept it for a major testing effort. For example, if the new software is crashing systems every 5 minutes, bogging down systems to a crawl, or corrupting databases, the software may not be in a condition to warrant further testing Regression Testing - re-testing after fixes or modifications of the software or its environment. It can be difficult to determine how much re-testing is needed, especially near the end of the development cycle. Automated testing approaches can be especially useful for this type of testing Acceptance Testing - final testing based on specifications of the end-user or customer, or based on use by end-users/customers over some specified period of time Load Testing - testing an application under heavy loads, such as testing of a web site under a range of loads to determine at what point the system's response time degrades or fails Go Back
    • Appendix V – continued Understand the various testing types Info-Tech Research Group Stress Testing - often used interchangeably with load and performance testing. Also used to describe tests such as system functional testing while under unusually heavy loads, heavy repetition of certain actions or inputs, input of large numerical values, large complex queries to a database system, etc. Performance Testing - often used interchangeably with stress and load testing. Ideally, performance testing is defined in requirements documentation or QA or test plans (specified performance criteria from the customer). Usability Testing - testing for user-friendliness. This is the most subjective, and will depend on the end-user or customer. User interviews, surveys, video recording of user sessions, and other techniques can be used. Developers and testers are usually not appropriate as usability testers. Deployment Testing - testing of full, partial, or upgrade install/uninstall processes. Recovery & Failover Testing - testing how well a system recovers from crashes, hardware failures, or other catastrophic problems. Security Testing - testing how well the system protects against unauthorized internal or external access, willful damage, etc.; may require sophisticated testing techniques. Compatibility Testing - testing how well software performs in a particular hardware/software/operating system/network/etc. environment. Exploratory Testing - often taken to mean a creative, informal software test that is not based on formal test plans or test cases; testers may be learning the software as they test it. Ad-Hoc Testing - similar to exploratory testing, but often taken to mean that the testers have significant understanding of the software before testing it. User Acceptance Testing - determining if software is satisfactory to an end-user or customer. Comparison Testing - comparing software weaknesses and strengths to competing products.
Alpha/Beta Testing - testing of an application when development is nearing completion; minor design changes may still be made as a result of alpha testing. Beta testing occurs when development and testing are essentially completed and final bugs and problems need to be found before final release. Typically, alpha & beta testing is done by end-users or others, not by developers or testers. Go Back
    • Appendix VI Available Tools & Templates Info-Tech Research Group Use Info-Tech's “ Test Plan Template ” to quickly lay out and streamline your project test plan (strategy). Use Info-Tech's “ Defect Reporting Template ” to quickly record and document your project defects. Use Info-Tech's “ Software Testing Strategy & Risk Assessment ” tool to determine risk and testing strategies by looking at the key influencing factors. Use Info-Tech's “ Testing Checklist ” to help you determine if all that needs to be done is done.
    • Appendix VII Demographics Info-Tech Research Group