You're under tight time pressure and have barely enough information to proceed with testing. How do you test quickly and inexpensively, yet still produce informative, credible, and accountable results? Rapid Software Testing, adopted by context-driven testers worldwide, offers a field-proven answer to this all-too-common dilemma. In this one-day sampler of the approach, Paul Holland introduces you to the skills and practice of Rapid Software Testing through stories, discussions, and "minds-on" exercises that simulate important aspects of real testing problems. The rapid approach isn't just testing with speed or a sense of urgency; it's mission-focused testing that eliminates unnecessary work, assures that the most important things get done, and constantly asks how testers can help speed up the successful completion of the project. Join Paul to learn how rapid testing focuses on both the mindset and skill set of the individual tester, who uses tight loops of exploration and critical thinking skills to continuously re-optimize testing to match clients' needs and expectations.
A Rapid Introduction to Rapid Software Testing - TechWell
This document provides a summary of a presentation on Rapid Software Testing. The presentation was given by Michael Bolton of DevelopSense and covered the methodology and mindset of rapid software testing. It emphasizes testing software expertly under uncertainty and time pressure. The presentation defines rapid testing as testing more quickly and less expensively while still achieving excellent results. It compares rapid testing to other approaches like exhaustive, ponderous, and slapdash testing. The presentation also discusses principles of rapid testing, how to recognize problems quickly using heuristics, and testing rapidly to fulfill the mission of testing.
A test strategy is the set of ideas that guides your test design. It's what explains why you test this instead of that, and why you test this way instead of that way. Strategic thinking matters because testers must make quick decisions about what needs testing right now and what can be left alone. You must be able to work through major threads without being overwhelmed by tiny details. James Bach describes how test strategy is organized around risk but is not defined before testing begins. Rather, it evolves alongside testing as we learn more about the product. We start with a vague idea of our strategy, organize it quickly, and document as needed in a concise way. In the end, the strategy can be as formal and detailed as you want it to be. In the beginning, though, we start small. If you want to focus on testing and not paperwork, this approach is for you.
This document provides an overview and introduction to the Rapid Software Testing course. It acknowledges those who contributed to developing the course material. The document outlines some assumptions about the audience for the course, including that attendees test software and want to improve their testing process. It presents the primary goal of the course as teaching how to test under uncertainty and with scrutiny. Key themes of Rapid Testing are also summarized, including putting the tester's mind at the center and considering cost versus value in testing activities.
Michael Bolton - Heuristics: Solving Problems Rapidly - TEST Huddle
EuroSTAR Software Testing Conference 2008 presentation on Heuristics: Solving Problems Rapidly by Michael Bolton. See more at conferences.eurostarsoftwaretesting.com/past-presentations/
Using Stories to Test Requirements and Systems - Paul Gerrard
The document discusses using business stories to test requirements and systems. It explains that stories can help identify omissions, inconsistencies, and ambiguity in requirements. Stories are applicable at any stage of a project for different purposes. Structured stories follow a common format with a header, scenarios with given/when/then structures, and can have multiple scenarios to test different conditions. Stories can validate requirements by example and generate both manual and automated test cases. The document argues that a structured, disciplined approach to stories can benefit both agile and structured development approaches.
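As a sketch of the structured-story format the abstract describes, a scenario with a given/when/then shape can be expressed as an executable check. The withdrawal domain, function names, and amounts below are invented for illustration:

```python
# A hedged sketch of a structured story: a header (the story's subject)
# plus given/when/then scenarios, each written as an executable check.
# The account/withdrawal domain here is hypothetical.

def make_account(balance):
    """Given: an account with a starting balance."""
    return {"balance": balance}

def withdraw(account, amount):
    """When: the holder withdraws an amount (no overdraft allowed)."""
    if amount > account["balance"]:
        return False  # Then (alternate scenario): the withdrawal is refused
    account["balance"] -= amount
    return True

# Scenario 1: a normal withdrawal succeeds and reduces the balance.
acct = make_account(100)
assert withdraw(acct, 30) is True
assert acct["balance"] == 70

# Scenario 2: overdrawing is refused and the balance is unchanged.
assert withdraw(acct, 1000) is False
assert acct["balance"] == 70
```

Writing multiple scenarios against the same story header, as above, is one way the stories both validate the requirement by example and double as test cases.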
- The speaker proposes 16 "test axioms" that are intended to provide a framework for testing approaches and represent principles that are context-insensitive and self-evidently true.
- The axioms are grouped into three categories: stakeholders, design, and delivery. The speaker argues the axioms can help testers think critically about testing and identify flaws in arguments.
- It is argued that process improvement models are not effective for improving testing because there is no consensus on best practices and processes must be tailored to context. True improvement requires understanding why current approaches are used given the context.
This document provides an overview of exploratory testing techniques. It discusses that exploratory testing involves simultaneous learning, test design, and test execution. Exploratory testing is tester-centric and focuses on problem solving strategies like heuristics rather than scripts. The document dispels some myths about exploratory testing, including that it is unstructured and cannot involve documentation. It provides examples of how documents can be used for reflection, information sharing, and reporting in exploratory testing.
Herman-Pieter Nijhof - Where Do Old Testers Go? - TEST Huddle
EuroSTAR Software Testing Conference 2008 presentation on Where Do Old Testers Go? by Herman-Pieter Nijhof. See more at conferences.eurostarsoftwaretesting.com/past-presentations/
The Test Coverage Outline: Your Testing Road Map - TechWell
To assist in risk analysis, prioritization of testing, and test reporting (telling your testing story), you need a thorough Test Coverage Outline (TCO)—a road map of your proposed testing activities. By creating a TCO, you can prepare for testing without having to create a giant pile of detailed test cases. Paul Holland says that a comprehensive TCO helps the test team to get buy-in for the overall test strategy very early in the project and is valuable for identifying risk areas, testability issues, and resource constraints. Paul describes how to create a TCO including the use of heuristic-based checklists to help ensure you don’t overlook important elements in your testing. Learn multiple approaches for critical information gathering, the artifacts used as input for creating a TCO, and how you can use a TCO to maintain testing focus. Take back a new, lightweight tool to help you tell the testing story throughout your project.
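A TCO can be as lightweight as a risk-annotated outline rather than a pile of test cases. The sketch below shows one possible shape; the area names, risk ratings, statuses, and test ideas are all invented for illustration:

```python
# A minimal, hypothetical Test Coverage Outline: areas to cover, each with
# a risk rating, a status, and a short list of test ideas. Real TCOs are
# built from artifacts, interviews, and heuristic-based checklists.

tco = {
    "Login": {"risk": "high", "status": "in progress",
              "ideas": ["valid/invalid credentials", "lockout", "SSO"]},
    "Reports": {"risk": "medium", "status": "not started",
                "ideas": ["export formats", "large data sets"]},
    "Help pages": {"risk": "low", "status": "done",
                   "ideas": ["links", "spelling"]},
}

def testing_story(outline):
    """Summarize coverage, highest risk first: the 'road map' view."""
    order = {"high": 0, "medium": 1, "low": 2}
    ranked = sorted(outline.items(), key=lambda kv: order[kv[1]["risk"]])
    for area, info in ranked:
        yield f"{area}: risk={info['risk']}, status={info['status']}"

for line in testing_story(tco):
    print(line)
```

Because each entry is a one-line idea rather than a scripted procedure, the outline stays cheap to update as the risk picture changes, which is what makes it usable for ongoing test reporting.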
Fabian Scarano - Preparing Your Team for the Future - TEST Huddle
EuroSTAR Software Testing Conference 2008 presentation on Preparing Your Team for the Future by Fabian Scarano. See more at conferences.eurostarsoftwaretesting.com/past-presentations/
This document discusses the need to rethink the role of testers in agile and structured projects. It argues that changes in business demands and development practices are squeezing testers and that many current testing roles and skills may disappear. Specifically, it predicts that half of onshore testing roles will be eliminated in 5 years. It recommends testers focus on more strategic roles like business analysis, requirements management, and assurance rather than traditional testing tasks.
The document provides guidance for managing a team of junior testers. It discusses challenges such as lack of skills and experience in junior testers. It recommends setting clear expectations, providing frequent communication and feedback, ensuring knowledge sharing, and protecting the team to help them succeed. Patience and structure are important, as is repeating key messages, to help junior testers learn and improve. The goal is for the team to work cooperatively toward a common objective.
Rikard Edgren - Testing is an Island - A Software Testing Dystopia - TEST Huddle
This document summarizes trends in software testing that could diminish its effectiveness and enjoyment. It notes an increasing focus on verification over validation, precise measurement over subjective judgement, and short-term metrics over long-term quality. This narrowing scope risks making testers isolated and limiting their creativity, motivation and ability to consider the full context of a project. The document advocates a holistic and subjective approach that considers people and intangible factors, not just short-term quantifiable results. Subjectivity and considering the whole system, not just parts, are presented as useful for testing.
Santa Barbara Agile: Exploratory Testing Explained and Experienced - Maaret Pyhäjärvi
Exploratory Testing Explained and Experienced
- Exploratory testing is an approach to software testing that involves dynamically testing software without a fixed plan, using the results of previous tests to determine subsequent tests.
- It is a disciplined approach that finds unknown unknowns and helps testers examine software from different perspectives to uncover more bugs. Tests are performances rather than fixed artifacts.
- Exploratory testing requires testers to be able to strategically choose and defend their test approaches, explain what they have tested, and determine when they are done testing rather than just finding bugs randomly. It is a more systematic approach than unplanned testing.
Erkki Poyhonen - Software Testing - A Users Guide - TEST Huddle
EuroSTAR Software Testing Conference 2008 presentation on Software Testing - A Users Guide by Erkki Poyhonen. See more at conferences.eurostarsoftwaretesting.com/past-presentations/
This talk suggests how we might make sense of the tools landscape of the near future, where the pressure to modernise processes and automate is greatest, and what a new test process supported by tools might look like.
Takeaways:
- We need to take machine learning in testing seriously, but it won’t be taking our jobs just yet
- We don’t need more test automation tools; today we need tools that capture tester knowledge
- Tools that learn and think can't work for testers until we solve the knowledge-capture challenge.
View On-Demand Webinar: https://youtu.be/EzyUdJFuzlE
The Pursuit of Quality - Chasing Tornadoes or Just Hot Air? - Paul Gerrard
The document discusses quality and models for quality and testing. It begins by explaining that quality depends on perspective and is a relationship between systems and stakeholders. It then discusses how models are used everywhere, including in quality and testing, but that all models simplify reality and can be incomplete. The document provides examples of different types of models used for quality and testing purposes.
The document discusses how test axioms can be used to advance testing practices. It introduces 16 proposed test axioms grouped into stakeholder, design, and delivery axioms. The axioms represent critical thinking processes for testing any system. The document discusses how the axioms can help testers design test strategies, assess improvement opportunities, and define needed skills. It also proposes a "first equation of testing" that separates axioms, context, values, and thinking to allow for different valid approaches. Additionally, the concept of "quantum testing" is introduced to discuss assigning significance to tests rather than defining their value, which can only be determined by stakeholders.
Exploratory testing is an approach that emphasizes freedom and responsibility of individual testers in a process where continuous learning, test design, and execution occur simultaneously. It is a disciplined, planned, and controlled form of testing that focuses on continuous learning. Research has shown there is no significant difference in results between exploratory testing and preplanned test cases, but exploratory testing requires significantly less effort overall. Effective exploratory testing requires skills like making models, keeping an open mind, and risk-based testing approaches. Both the strengths and potential blind spots of exploratory testing are discussed.
This document discusses the need for leadership in the testing community to drive innovation and change. It provides examples of challenges facing testers at different companies and how they are addressing them through approaches like shifting testing left into development, adopting agile practices, and using analytics. It argues that testing is no longer just an end phase but must be integrated into continuous delivery. For change to happen, testers will need to embrace new approaches, challenge old ways of thinking, and stand up as leaders to define the future of testing.
Ho Chi Minh City Software Testing Conference January 2015
Software Testing in the Agile World
Website: www.hcmc-stc.org
Author: Lee Copeland
The IEEE 829 Test Documentation standard is thirty years old this year. Boris Beizer's first book on software testing also turned thirty. Testing Computer Software, the best-selling book on software testing, is twenty-five. During the last three decades, hardware platforms have evolved from mainframes to minis to desktops to laptops to tablets to smartphones. Development paradigms have shifted from waterfall to agile. Consumers expect more functionality, demand higher quality, and are less loyal to brands. The world has changed dramatically and testing must change to match it. Testing processes that helped us succeed in the past may prevent our success in the future. Lee Copeland shares his insights into the future of testing, offering his Do's and Don'ts in the areas of technology, organization, test processes, test plans, and automation. Join Lee for a thought-provoking look at creating a better testing future.
This document discusses why checklists are better than test cases for documentation in quality assurance. It argues that test cases become overcrowded and focus too much on documentation rather than core functions, while checklists save time and are easy to update. An example compares a test case to a checklist for login/registration flows. The author's company, Hipo, uses a test pad and Robot Framework, integrated with checklists, to share with clients and team members.
Ho Chi Minh City Software Testing Conference January 2015
Software Testing in the Agile World
Website: www.hcmc-stc.org
Author: Nhat Do, Vu Duong
Context-Driven Testing (CDT) rejects the notion of generalized “best practices” that apply to all projects, and instead accepts that different practices work best under different circumstances. The third of the seven principles defined in CDT states that people are the most important part of any project’s context. Less focus on processes and tools, and more emphasis on people and their collaboration, empowers testers with the freedom to make choices about how best to do their job without following a restrictive plan.
Through a workshop game and some theory shared in slides, you will gain a better understanding of Context-Driven Testing practices, principles, and benefits, and see how well Agile and Context-Driven Testing can be married.
Graham Thomas - The Testers Toolbox - EuroSTAR 2010 - TEST Huddle
EuroSTAR Software Testing Conference 2010 presentation on The Testers Toolbox by Graham Thomas. See more at: http://conference.eurostarsoftwaretesting.com/past-presentations/
Exploratory testing is an approach to testing that emphasizes the freedom and responsibility of testers to continually optimize the value of their work. It is the process of three mutually supportive activities done in parallel: learning, test design, and test execution. With skill and practice, exploratory testers typically uncover an order of magnitude more problems than when the same amount of effort is spent on procedurally scripted testing. All testers conduct exploratory testing in one way or another, but few know how to do it systematically to obtain the greatest benefits. Even fewer can articulate the process. Jon Bach looks at specific heuristics and techniques of exploratory testing that will help you get the most from this highly productive approach. Jon focuses on the skills and dynamics of exploratory testing, and how it can be combined with scripted approaches.
Tafline Murnane - The Carrot or The Whip - What Motivates Testers? - EuroSTAR 2010 - TEST Huddle
EuroSTAR Software Testing Conference 2010 presentation on The Carrot or The Whip-What Motivates Testers? by Tafline Murnane. See more at: http://conference.eurostarsoftwaretesting.com/past-presentations/
Ho Chi Minh City Software Testing Conference January 2015
Software Testing in the Agile World
Website: www.hcmc-stc.org
Author: Lee Copeland
Over the years writers have defined testing as a process of finding, a process of evaluating, a process of measuring, a process of improving. For a quarter of a century we as testers have been focused on the internal process of testing, while generally disregarding its real purpose. The real purpose of testing is to create information. James Bach nailed it when he wrote, “The ultimate reason testers exist is to provide information that others on the project use to create things of value.” That is why testing exists — to provide information of value. So, when managers complain that testing “costs too much” perhaps they are really trying to say, “I’m not getting enough valuable information to justify the cost of testing.” When testers say “my management doesn’t see the value in our work” perhaps they are really trying to say, “My management doesn’t value the information I’m providing to them.” To prove our worth, to increase the value of testing, we must first focus on testing’s purpose — providing valuable information — not its process. Join Lee as he discusses why quantifying the value of testing is difficult work — perhaps that’s why we concentrate so much on testing process—that’s much easier. But until we do this difficult work, until we prove our worth through quantifying our contribution, we should expect the bombardments to continue.
To be most effective, test managers must develop and use metrics to help direct the testing effort and make informed recommendations about the software’s release readiness and associated risks. Because one important testing activity is to “measure” the quality of the software, test managers must measure the results of both the development and testing processes. Collecting, analyzing, and using metrics is complicated because many developers and testers are concerned that the metrics will be used against them. Join Rick Craig as he addresses common metrics—measures of product quality, defect removal efficiency, defect density, defect arrival rate, and testing status. Learn the guidelines for developing a test measurement program, rules of thumb for collecting data, and ways to avoid “metrics dysfunction.” Rick identifies several metrics paradigms—including Goal-Question-Metric—and discusses the pros and cons of each. Delegates are urged to bring their metrics problems and issues for use as discussion points.
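The metrics the abstract names have standard textbook formulas. As a hedged sketch (the sample numbers below are invented), they can be computed like this:

```python
# Common test-management metrics, following their usual definitions.
# Sample figures are hypothetical, chosen only to illustrate the math.

def defect_density(defects_found, kloc):
    """Defects per thousand lines of code (KLOC)."""
    return defects_found / kloc

def defect_removal_efficiency(found_before_release, found_after_release):
    """Share of all known defects caught before release, in 0..1."""
    total = found_before_release + found_after_release
    return found_before_release / total

# Example: 120 defects total in a 40 KLOC product, 108 caught pre-release.
print(defect_density(120, 40))                 # defects per KLOC
print(defect_removal_efficiency(108, 12))      # fraction caught in test
```

Raw numbers like these only become useful inside a framework such as Goal-Question-Metric, which ties each measure back to the question it is supposed to answer; reporting them bare is one common source of the "metrics dysfunction" the abstract warns about.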
The cloud can deliver services over the Internet in three ways—software as a service (SaaS), platform as a service (PaaS), and infrastructure as a service (IaaS). Each of these approaches requires testers to focus on more than classical functional testing. Ruud Teunissen explores the new techniques and skills testers need to master for testing cloud services. Examples include testing for elasticity; testing fall back scenarios to guarantee continuity of business processes; testing for adherence to laws and regulations; and testing apps, web services, and the numerous platforms that need to be supported. Join Ruud and learn how to test these additional cloud requirements to get a grip on technical test issues, explore cloud services operations, and jump-start the broader scope of testing in the cloud. Take back practical approaches for tuning and tweaking your present test techniques to fly high in the cloud.
The Test Coverage Outline: Your Testing Road MapTechWell
To assist in risk analysis, prioritization of testing, and test reporting (telling your testing story), you need a thorough Test Coverage Outline (TCO)—a road map of your proposed testing activities. By creating a TCO, you can prepare for testing without having to create a giant pile of detailed test cases. Paul Holland says that a comprehensive TCO helps the test team to get buy-in for the overall test strategy very early in the project and is valuable for identifying risk areas, testability issues, and resource constraints. Paul describes how to create a TCO including the use of heuristic-based checklists to help ensure you don’t overlook important elements in your testing. Learn multiple approaches for critical information gathering, the artifacts used as input for creating a TCO, and how you can use a TCO to maintain testing focus. Take back a new, lightweight tool to help you tell the testing story throughout your project.
Fabian Scarano - Preparing Your Team for the FutureTEST Huddle
EuroSTAR Software Testing Conference 2008 presentation on Preparing Your Team for the Future by Fabian Scarano. See more at conferences.eurostarsoftwaretesting.com/past-presentations/
This document discusses the need to rethink the role of testers in agile and structured projects. It argues that changes in business demands and development practices are squeezing testers and that many current testing roles and skills may disappear. Specifically, it predicts that half of onshore testing roles will be eliminated in 5 years. It recommends testers focus on more strategic roles like business analysis, requirements management, and assurance rather than traditional testing tasks.
The document provides guidance for managing a team of junior testers. It discusses challenges such as lack of skills and experience in junior testers. It recommends setting clear expectations, providing frequent communication and feedback, ensuring knowledge sharing, and protecting the team to help them succeed. Patience and structure are important, as is repeating key messages, to help junior testers learn and improve. The goal is for the team to work cooperatively toward a common objective.
Rikard Edgren - Testing is an Island - A Software Testing DystopiaTEST Huddle
This document summarizes trends in software testing that could diminish its effectiveness and enjoyment. It notes an increasing focus on verification over validation, precise measurement over subjective judgement, and short-term metrics over long-term quality. This narrowing scope risks making testers isolated and limiting their creativity, motivation and ability to consider the full context of a project. The document advocates a holistic and subjective approach that considers people and intangible factors, not just short-term quantifiable results. Subjectivity and considering the whole system, not just parts, are presented as useful for testing.
Santa Barbara Agile: Exploratory Testing Explained and ExperiencedMaaret Pyhäjärvi
Exploratory Testing Explained and Experienced
- Exploratory testing is an approach to software testing that involves dynamically testingsoftware without a fixed plan, using the results of previous tests to determine subsequent tests.
- It is a disciplined approach that finds unknown unknowns and helps testers examine software from different perspectives to uncover more bugs. Tests are performances rather than fixed artifacts.
- Exploratory testing requires testers to be able to strategically choose and defend their test approaches, explain what they have tested, and determine when they are done testing rather than just finding bugs randomly. It is a more systematic approach than unplanned testing.
Erkki Poyhonen - Software Testing - A Users GuideTEST Huddle
EuroSTAR Software Testing Conference 2008 presentation on Software Testing - A Users Guide by Erkki Poyhonen. See more at conferences.eurostarsoftwaretesting.com/past-presentations/
This talk suggests how we might make sense of the tools landscape of the near future, where the pressure to modernise processes and automate is greatest, and what a new test process supported by tools might look like.
Takeaways:
- We need to take machine learning in testing seriously, but it won’t be taking our jobs just yet
- We don’t need more test automation tools; today we need tools that capture tester knowledge
- Tools that that learn and think can’t work for testers until we solve the knowledge capture challenge.
View On-Demand Webinar: https://youtu.be/EzyUdJFuzlE
The Pursuit of Quality - Chasing Tornadoes or Just Hot Air?Paul Gerrard
The document discusses quality and models for quality and testing. It begins by explaining that quality depends on perspective and is a relationship between systems and stakeholders. It then discusses how models are used everywhere, including in quality and testing, but that all models simplify reality and can be incomplete. The document provides examples of different types of models used for quality and testing purposes.
The document discusses how test axioms can be used to advance testing practices. It introduces 16 proposed test axioms grouped into stakeholder, design, and delivery axioms. The axioms represent critical thinking processes for testing any system. The document discusses how the axioms can help testers design test strategies, assess improvement opportunities, and define needed skills. It also proposes a "first equation of testing" that separates axioms, context, values, and thinking to allow for different valid approaches. Additionally, the concept of "quantum testing" is introduced to discuss assigning significance to tests rather than defining their value, which can only be determined by stakeholders.
Exploratory testing is an approach that emphasizes freedom and responsibility of individual testers in a process where continuous learning, test design, and execution occur simultaneously. It is a disciplined, planned, and controlled form of testing that focuses on continuous learning. Research has shown there is no significant difference in results between exploratory testing and preplanned test cases, but exploratory testing requires significantly less effort overall. Effective exploratory testing requires skills like making models, keeping an open mind, and risk-based testing approaches. Both the strengths and potential blind spots of exploratory testing are discussed.
This document discusses the need for leadership in the testing community to drive innovation and change. It provides examples of challenges facing testers at different companies and how they are addressing them through approaches like shifting testing left into development, adopting agile practices, and using analytics. It argues that testing is no longer just an end phase but must be integrated into continuous delivery. For change to happen, testers will need to embrace new approaches, challenge old ways of thinking, and stand up as leaders to define the future of testing.
Ho Chi Minh City Software Testing Conference January 2015
Software Testing in the Agile World
Website: www.hcmc-stc.org
Author: Lee Copeland
The IEEE 829 Test Documentation standard is thirty years old this year. Boris Beizer’s first book on software testing also turned thirty. Testing Computer Software, the best selling book on software testing, is twenty-five. During the last three decades, hardware platforms have evolved from mainframes to minis to desktops to laptops to tablets to smartphones. Development paradigms have shifted from waterfall to agile. Consumers expect more functionality, demand higher quality, and are less loyal to brands. The world has changed dramatically and testing must change to match it. Testing processes that helped us succeed in the past may prevent our success in the future. Lee Copeland shares his insights into the future of testing, sharing his Do’s and Don’ts in the areas of technology, organization, test processes, test plans, and automation. Join Lee for a thought provoking look at creating a better testing future.
This document discusses why checklists are better than test cases for documentation in quality assurance. It argues that test cases become overcrowded and focus too much on documentation rather than core functions. Checklists save time and are easier to update. An example compares a test case to a checklist for login/registration flows. The author's company, Hipo, uses TestPad and Robot Framework integrated with checklists to share with clients and team members.
Author: Nhat Do, Vu Duong
Context-Driven Testing (CDT) rejects the notion of generalized “best practices” that apply to all projects, and instead accepts that different practices work best under different circumstances. The third of the seven principles of CDT states that people are the most important part of any project’s context. Focusing less on processes and tools, and more on people and their collaboration, empowers testers with the freedom to choose how best to do their job without following a restrictive plan.
Through a workshop game and some theory shared in slides, you will gain a better understanding of Context-Driven Testing practices, principles, and benefits, as well as how Agile and Context-Driven Testing can make a happy marriage.
Graham Thomas - The Testers Toolbox - EuroSTAR 2010 - TEST Huddle
EuroSTAR Software Testing Conference 2010 presentation on The Testers Toolbox by Graham Thomas. See more at: http://conference.eurostarsoftwaretesting.com/past-presentations/
Exploratory testing is an approach to testing that emphasizes the freedom and responsibility of testers to continually optimize the value of their work. It is the process of three mutually supportive activities done in parallel: learning, test design, and test execution. With skill and practice, exploratory testers typically uncover an order of magnitude more problems than when the same amount of effort is spent on procedurally scripted testing. All testers conduct exploratory testing in one way or another, but few know how to do it systematically to obtain the greatest benefits. Even fewer can articulate the process. Jon Bach looks at specific heuristics and techniques of exploratory testing that will help you get the most from this highly productive approach. Jon focuses on the skills and dynamics of exploratory testing, and how it can be combined with scripted approaches.
Tafline Murnane - The Carrot or The Whip-What Motivates Testers? - EuroSTAR 2010 - TEST Huddle
EuroSTAR Software Testing Conference 2010 presentation on The Carrot or The Whip-What Motivates Testers? by Tafline Murnane. See more at: http://conference.eurostarsoftwaretesting.com/past-presentations/
Author: Lee Copeland
Over the years writers have defined testing as a process of finding, a process of evaluating, a process of measuring, a process of improving. For a quarter of a century we as testers have been focused on the internal process of testing, while generally disregarding its real purpose. The real purpose of testing is to create information. James Bach nailed it when he wrote, “The ultimate reason testers exist is to provide information that others on the project use to create things of value.” That is why testing exists — to provide information of value. So, when managers complain that testing “costs too much” perhaps they are really trying to say, “I’m not getting enough valuable information to justify the cost of testing.” When testers say “my management doesn’t see the value in our work” perhaps they are really trying to say, “My management doesn’t value the information I’m providing to them.” To prove our worth, to increase the value of testing, we must first focus on testing’s purpose — providing valuable information — not its process. Join Lee as he discusses why quantifying the value of testing is difficult work — perhaps that’s why we concentrate so much on testing process—that’s much easier. But until we do this difficult work, until we prove our worth through quantifying our contribution, we should expect the bombardments to continue.
To be most effective, test managers must develop and use metrics to help direct the testing effort and make informed recommendations about the software’s release readiness and associated risks. Because one important testing activity is to “measure” the quality of the software, test managers must measure the results of both the development and testing processes. Collecting, analyzing, and using metrics is complicated because many developers and testers are concerned that the metrics will be used against them. Join Rick Craig as he addresses common metrics—measures of product quality, defect removal efficiency, defect density, defect arrival rate, and testing status. Learn the guidelines for developing a test measurement program, rules of thumb for collecting data, and ways to avoid “metrics dysfunction.” Rick identifies several metrics paradigms—including Goal-Question-Metric—and discusses the pros and cons of each. Delegates are urged to bring their metrics problems and issues for use as discussion points.
The cloud can deliver services over the Internet in three ways—software as a service (SaaS), platform as a service (PaaS), and infrastructure as a service (IaaS). Each of these approaches requires testers to focus on more than classical functional testing. Ruud Teunissen explores the new techniques and skills testers need to master for testing cloud services. Examples include testing for elasticity; testing fall back scenarios to guarantee continuity of business processes; testing for adherence to laws and regulations; and testing apps, web services, and the numerous platforms that need to be supported. Join Ruud and learn how to test these additional cloud requirements to get a grip on technical test issues, explore cloud services operations, and jump-start the broader scope of testing in the cloud. Take back practical approaches for tuning and tweaking your present test techniques to fly high in the cloud.
Embracing Uncertainty: A Most Difficult Leap of Faith - TechWell
For the past couple of years, Dan North has been working with and studying teams who are dramatically more productive than any he's ever seen. In weeks they produce results that take other teams months. One of the central behaviors Dan has observed is their ability to embrace uncertainty, holding multiple contradictory opinions at the same time and deferring commitment until there is a good reason. Embracing uncertainty lies at the heart of agile delivery and is one of the primary reasons organizations struggle with agile adoption. We are desperately uncomfortable with uncertainty, so much so that we will replace it with anything, even things we know to be wrong. Dan claims we have turned our back on the original Agile Manifesto, and explains why understanding risk and embracing uncertainty are fundamental to agile delivery, and why we find it so scary. He describes how techniques like real options and deliberate discovery can expose dogma and make life more manageable. Join Dan to learn ways to face, and even embrace, uncertainty with courage and determination.
Problem Solving and Decision Making in Software Development - TechWell
This document summarizes tips from cognitive science research for improving problem solving and decision making. It suggests taking regular breaks from focused work, including short walks. It also recommends exposing oneself to natural light and nature, staying hydrated, and incorporating physical movement into the workday. Diversity of perspectives in groups and informal social interactions are highlighted as important for innovation. Thinking techniques like mind mapping, explaining problems out loud, and imagining someone else's perspective can also help approach problems in new ways.
Right-sized Architecture: Integrity for Emerging Designs - TechWell
In agile projects, design ideally "emerges" over the course of development. However, if teams focus primarily on independent user stories, they risk losing sight of the product's vision and the integrity of a well-thought-out architecture. Ken Kubo shares techniques he's used to improve the chances that a product's design will emerge into a cohesive and coherent architecture that serves its customers for many years. Join Ken to find out how you can incorporate contextual design principles and simple, visual techniques as part of his "A-Little-Before-Its-Time Design" framework. You can add these practices into your agile workflow to maintain a shared team understanding of your product's vision and the system's emerging design. Ken believes that you can only realize all the promises of agile development with a clearly and constantly communicated product vision and a set of architecture goals. Lacking these key principles leads to suboptimal system development or, much worse, failure.
Data Collection and Analysis for Better Requirements: Just the Facts, Ma'am - TechWell
According to the Standish Group, 64 percent of features in systems are rarely, or never, used. How does this happen? Today, the work of eliciting the customers' true needs, which remains elusive, can be enhanced using data-driven requirements techniques. Brandon Carlson introduces data collection approaches and analysis techniques you can employ on your projects right away. Find out how to instrument existing applications and develop new requirements based on operational profiles of the current system. Learn to use A/B testing, a technique for trying out and analyzing alternative implementations, on your current system to determine which new features will deliver the most business value. With these tools at hand, you can help users and business stakeholders decide the best approaches and best new features to meet their real needs. Now is the time to take the guesswork out of requirements and get "Just the facts, Ma'am."
Disciplined Agile Delivery: Extending Scrum to the Enterprise - TechWell
Going far beyond the limits of a team approach to agile, Scott Ambler explores a disciplined, full-lifecycle methodology for agile software delivery. In this interactive hands-on session, learn how to initiate a large-scale agile project, exploring ways to extend Scrum's value-driven development approach to include both value and risk in the equation. Discover project governance practices that will increase your team's chance of success. Explore with Scott the agile practices—Extreme Programming, Agile Modeling, Agile Data, and the Unified Process—he has found most valuable for large agile teams. Throughout the session, learn to apply the Agile Scaling Model to determine what set of agile practices and techniques will work best for you and your organization. Bring your biggest agile challenges and be prepared to dig into ways to adjust your approach for greater success.
Coaching and Leading Agility: A Discussion of Agile Tuning - TechWell
The document summarizes a presentation titled "Coaching and Leading Agility: A Discussion of Agile Tuning" given by David Hussman of DevJam. The presentation covers getting teams ready for agile, getting them productive in agile, and keeping them productive. It discusses conducting interviews and assessments, planning suggested changes, setting the stage with tools and techniques, planning discovery and delivery, using metrics to spark discussion, and moving from cycles to continuous delivery and learning. The overall focus is on coaching teams to continuously improve and adapt their agile practices.
ADC-BSC EAST 2013 Keynote: Reading the Tea Leaves: Predicting a Project’s Future - TechWell
Is a project’s fate preordained? Does a project’s past suggest its likely future? Can anything be done to influence that future when the current signs aren’t promising? Payson Hall has participated in and reviewed many projects during his thirty-year career in software development. Without claiming mystical or magical powers, Payson shares problem symptoms he has observed and discusses strategies for isolating and correcting them. He helps you learn to identify “problem seeds” that can grow into larger issues over time. For example, when a task exceeds its planned duration, questions that might help identify the cause include: Are the people assigned to the task working on something else? Has the schedule shifted the task into holidays, training, or vacations? Are tasks blocked awaiting information, materials, or approvals? Was the work clearly defined to begin with? Payson introduces a diagnostic framework that helps you determine the next steps in an investigation to identify root causes of project issues you observe and to formulate possible remedies.
Patterns in Test Automation: Issues and Solutions - TechWell
Testers often encounter problems when automating test execution. The surprising thing is that many testers encounter the very same problems, over and over again. These problems often have known solutions, yet many testers are not aware of them. Recognizing the commonality of these problems, and the patterns behind their solutions, is the first step toward avoiding them.
All testers know that we can identify many more test cases than we will ever have time to design and execute. The key problem in testing is choosing a small, “smart” subset from the almost infinite number of possibilities available. Join Lee Copeland to discover how to design test cases using formal black-box techniques, including equivalence class and boundary value testing, decision tables, state-transition diagrams, and all-pairs testing. Explore white-box techniques with their associated coverage metrics. Evaluate more informal approaches, such as random and hunch-based testing, and learn the importance of using exploratory testing to enhance your testing ability. Choose the right test case design approaches for your projects. Use the test results to evaluate the quality of both your products and your test designs.
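As a concrete illustration of the equivalence-class and boundary-value techniques mentioned above, here is a minimal sketch in Python. The validation rule, limits, and function name are illustrative assumptions, not taken from the session itself.

```python
# Hypothetical rule under test: ages 18-65 inclusive are valid.
def is_valid_age(age):
    return 18 <= age <= 65

# Equivalence classes: below range, in range, above range.
# Boundary values: a value on each edge and one just outside it,
# which is where off-by-one defects typically hide.
boundary_cases = {
    17: False,  # just below the lower bound
    18: True,   # lower bound
    65: True,   # upper bound
    66: False,  # just above the upper bound
}

for age, expected in boundary_cases.items():
    assert is_valid_age(age) == expected, f"failed for age {age}"
```

Four boundary cases plus one interior value (say, 40) give a small, "smart" subset of the effectively infinite input space, which is the point of these techniques.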
Test-driven development (TDD) is a powerful technique for combining software design, unit testing, and coding in a continuous process to increase reliability and produce better code design. Using the TDD approach, developers write programs in very short development cycles: first the developer writes a failing automated test case that defines a new function or improvement, then produces code to pass that test, and finally refactors the new code to acceptable standards. The developer repeats this process many times until the behavior is complete and fully tested. Rob Myers demonstrates the essential TDD techniques, including unit testing with the common xUnit family of open source development frameworks, refactoring as just-in-time design, plus Fake It, Triangulate, and Obvious Implementation. During this hands-on session, you’ll use exercises to practice the techniques. With many years of product development experience using TDD, Rob will address the questions that arise during your own relaxed exploration of test-driven development.
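The red-green-refactor cycle described above can be sketched in miniature with Python's unittest, one of the xUnit family of frameworks the session mentions. The `slugify` function and its behavior are illustrative assumptions, not material from the session itself.

```python
import unittest

# "Green" step: the simplest implementation that passes the tests
# written first. An earlier pass might even hard-code the result
# ("Fake It") before a second test forces a real implementation
# ("Triangulate").
def slugify(title):
    return title.strip().lower().replace(" ", "-")

# "Red" step: these tests are written before the implementation and
# fail until slugify exists and behaves as specified.
class SlugifyTest(unittest.TestCase):
    def test_spaces_become_hyphens(self):
        self.assertEqual(slugify("Rapid Software Testing"),
                         "rapid-software-testing")

    def test_surrounding_whitespace_is_trimmed(self):
        self.assertEqual(slugify("  Agile  "), "agile")
```

In practice the test class comes first and fails (red), the implementation makes it pass (green), and the code is then refactored with the tests as a safety net; run the cycle with `python -m unittest`.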
The Journey from Manager to Leader: Empowering Your Team - TechWell
As I reflect on my struggles empowering teams to become self-managing, I am amazed that I didn't understand earlier. Things that seem so obvious after the fact are often difficult to acknowledge in the moment. I failed to recognize that my extensive experience with risk mitigation was preventing the team from taking risks. Tricia Broderick shares the lessons she learned in her journey from manager to leader. Join in and expect challenging self-reflection as you work with Tricia to recognize how your past successes can create limitations for your team. Learn about assumptions and expectations surrounding self-managing teams, common misunderstandings of what you need to do to empower a team, and the reasons why so many managers, despite their good intentions, fail. Leave with a goal to let go of certain skills that helped achieve your professional success. Instead, focus on embracing the new skills required of a leader who is creating an environment for self-managing teams.
Critical thinking is the kind of thinking that specifically looks for problems and mistakes. Regular people don't do a lot of it. However, if you want to be a great tester, you need to be a great critical thinker. Critically thinking testers save projects from dangerous assumptions and ultimately from disasters. The good news is that critical thinking is not just innate intelligence or a talent—it's a learnable and improvable skill you can master. James Bach shares the specific techniques and heuristics of critical thinking and presents realistic testing puzzles that help you practice and increase your thinking skills. Critical thinking begins with just three questions—Huh? Really? and So?—that kick start your brain to analyze specifications, risks, causes, effects, project plans, and anything else that puzzles you. Join James for this interactive, hands-on session and practice your critical thinking skills. Study and analyze product behaviors and experience new ways to identify, isolate, and characterize bugs.
A Rapid Introduction to Rapid Software Testing - TechWell
You're under tight time pressure and have barely enough information to proceed with testing. How do you test quickly and inexpensively, yet still produce informative, credible, and accountable results? Rapid Software Testing, adopted by context-driven testers worldwide, offers a field-proven answer to this all-too-common dilemma. In this one-day sampler of the approach, Michael Bolton introduces you to the skills and practice of Rapid Software Testing through stories, discussions, and "minds-on" exercises that simulate important aspects of real testing problems. The rapid approach isn't just testing with speed or a sense of urgency; it's mission-focused testing that eliminates unnecessary work, assures that the most important things get done, and constantly asks how testers can help speed up the successful completion of the project. Join Michael to learn how Rapid Testing focuses on both the mindset and skill set of the individual tester, using tight loops of exploration and critical thinking skills to help continuously re-optimize testing to match clients' needs and expectations.
Testing the unknown: the art and science of working with hypothesis - Ardita Karaj
Testing what we know, or have a clear understanding of, is relatively straightforward, as is making decisions based on the expected result. But today’s world is presenting us with the Unknown and the Ambiguous, which can only be approached by hypothesizing and experimenting, a lot! This requires intentional thinking and a different strategy for observing in context.
This session will uncover how testers are helping their teams and product owners, by basing their testing on the science behind creating hypotheses and running experiments. A testing mindset and probing the context around use cases are some of the most valuable competencies testers bring to the team in order to enable decisions based on data.
This document discusses an introduction to a class on rapid software testing. It states that the class aims to make students stronger, smarter and more confident testers by challenging them to think for themselves rather than simply listening to what the instructors say. The class can be beneficial for testers of all experience levels who want to improve at their work. Heuristics are discussed as techniques that can help substitute for complete analysis and involve guidewords, triggers, reframing ideas, and procedures to help solve problems.
Things Could Get Worse: Ideas About Regression Testing - TechWell
Michael Bolton, DevelopSense
Tester, consultant, and trainer Michael Bolton is the coauthor (with James Bach) of Rapid Software Testing, a course that presents a methodology and mindset for testing software expertly in uncertain conditions and under extreme time pressure. Michael is a leader in the context-driven software testing movement with twenty years of experience testing, developing, managing, and writing about software. Currently, he leads DevelopSense, a Toronto-based consultancy.
Huib Schoots - Testing in modern times - a story about Quality and Value - Test... - FiSTB
Huib introduces himself as an experienced IT professional with over 25 years of experience in roles such as developer, tester, consultant, manager, trainer, and coach. He currently works as a managing consultant and senior consultant focused on quality and testing.
The document discusses testing and quality, noting that quality is defined by the value provided to stakeholders rather than conformance to requirements. Testing is described as evaluating a product through experience and exploration to build understanding.
It emphasizes the importance of learning for both individuals and organizations. High-performing teams and organizations are able to learn continuously and adapt their processes accordingly. Continuous learning is key for developing complex software products and keeping up with changing needs.
The document discusses how to achieve a happy marriage between context-driven and agile approaches to software development and testing. It advocates for involving testers from the start of projects and having them work closely with developers as part of integrated teams. The document also provides advice on skills needed for testers, such as domain knowledge and a willingness to learn, and emphasizes pairing with other roles like developers to facilitate collaboration and knowledge sharing.
Michael Bolton - Two Futures of Software Testing - TEST Huddle
EuroSTAR Software Testing Conference 2008 presentation on Two Futures of Software Testing by Michael Bolton. See more at conferences.eurostarsoftwaretesting.com/past-presentations/
Exploratory testing is an approach to testing that emphasizes the freedom and responsibility of testers to continually optimize the value of their work. It is the process of three mutually supportive activities done in parallel: learning, test design, and test execution. With skill and practice, exploratory testers typically uncover an order of magnitude more problems than when the same amount of effort is spent on procedurally scripted testing. All testers conduct exploratory testing in one way or another, but few know how to do it systematically to obtain the greatest benefits. Even fewer can articulate the process. James Bach looks at specific heuristics and techniques of exploratory testing that will help you get the most from this highly productive approach. James focuses on the skills and dynamics of exploratory testing, and how it can be combined with scripted approaches.
Perhaps in no other professional field is the gap between theory and practice as stark as in software testing. Researchers and thought leaders claim that testing requires a high level of cognitive and interpersonal skill in order to make judgments about the ability of software to fulfill its operational goals. In their minds, testing is about assessing and communicating the risks involved in deploying software in a specific state.
However, in many organizations testing remains a necessary evil, a cost to drive down as much as possible. Testing is treated merely as a measure of conformance to requirements, without regard to the quality of those requirements or how conformance is measured. This is certainly an important measure, but it tells an incomplete story about the value of software in support of our business goals.
We as testers often help to perpetuate the status quo. Although in many cases we realize we can add far more value than we do, we continue to perform testing in a manner that reduces our value in the software development process.
This presentation looks at the state of the art as well as the state of common practice, and attempts to provide a rationale and roadmap whereby the practice of testing can be made more exciting and stimulating to the testing professional, as well as more valuable to the product and the organization.
'How to get what you really want from Testing' with Michael Bolton - TEST Huddle
EuroSTAR Conferences, with the support of ISA Software Skillnet, Irish Software Innovation Network and SoftTest, were delighted to bring you a half-day software testing masterclass with Michael Bolton
In this session, Michael Bolton (who has extensive experience as a tester, as a programmer, and as a project manager) explained the role of skilled software testers, and why you might not want to think of testing as "quality assurance".
He presented ideas about the relationship between management and testers, and about the service that testers really provide: making quality assurance possible by lighting the way of the project. For those of you who attended this event, we really hope it was of use to you in your testing careers.
www.eurostarconferences.com
This document provides an overview of Agile software development. It begins by defining Agile as a project management process that encourages frequent inspection and adaptation. It then discusses some common Agile practices like Scrum and eXtreme Programming. The Agile Manifesto values individuals and interactions, working software, customer collaboration, and responding to change. Finally, it provides advice for different roles on how Agile can benefit them and their work.
Presented at Ford's 2017 Global IT Learning Summit (GLITS) - Ron Lazaro
Presentation Details: The best way to think about product discovery is to think about it in relation to product delivery. It's not possible to build a product without doing both discovery and delivery. Discovery encompasses all the activities that we do to decide what to build. It includes all the decisions we make to decide what to build next, whereas delivery is all the activities we do to write code, package releases, ship products. It's how we deliver value to our customers.
Key takeaway for the participants will be to help them understand the difference between Product Discovery and Product Delivery and how to apply techniques in doing both.
Dallas Education is an ISO 9001:2008, 20000:2005, and 27001:2013 certified company based in Bangalore, India, providing services in software consulting, application development, outsourcing, recruitment, and training. The company began operations in 2001. We design, build, and support customized applications for businesses large and small, and we are the market leader in training and outsourcing across various technologies. Dallas Education serves and supports IT companies in the areas of Mainframes, ERP, .NET, Java/J2EE, Data Warehousing, Business Intelligence, and more; we also train and outsource fresh talent to our clients.
The document discusses agile testing and bug prevention. It advocates for embedding testers within development teams to focus on prevention rather than detection of bugs. The ideal approach involves continuous testing parallel to development with the entire team involved in testing.
The document discusses the growth of the QA team at North from 1 person to 18 full-time employees and 2 co-ops over 3 years. It describes challenges faced, such as hiring candidates and establishing processes as the team grew rapidly. Lessons learned include starting with basic tools, focusing on lightweight processes, and tailoring interviews and challenges to the role. The future includes expanding test automation and driving quality practices earlier in development.
A test strategy is the set of ideas that guides your test design. It's what explains why you test this instead of that, and why you test this way instead of that way. Strategic thinking matters because testers must make quick decisions about what needs testing right now and what can be left alone. You must be able to work through major threads without being overwhelmed by tiny details. James Bach describes how test strategy is organized around risk but is not defined before testing begins. Rather, it evolves alongside testing as we learn more about the product. We start with a vague idea of our strategy, organize it quickly, and document as needed in a concise way. In the end, the strategy can be as formal and detailed as you want it to be. In the beginning, though, we start small. If you want to focus on testing and not paperwork, this approach is for you.
Exploratory testing is an approach to testing that emphasizes the freedom and responsibility of testers to continually optimize the value of their work. It is the process of three mutually supportive activities—learning, test design, and test execution—done in parallel. With skill and practice, exploratory testers typically uncover an order of magnitude more problems than when the same amount of effort is spent on procedurally scripted testing. All testers conduct exploratory testing in one way or another, but few know how to do it systematically to obtain the greatest benefits. Even fewer can articulate the process. Paul Holland looks at specific heuristics and techniques of exploratory testing that will help you get the most from this highly productive approach. Paul focuses on the skills and dynamics of exploratory testing, and how it can be combined with scripted approaches.
Trends in Software Testing: There has been a slow realization among top executives that simply outsourcing testing to the lowest bidder does not produce a sufficient level of quality in their software products. In this session, Paul Holland discusses how American companies are starting to reconsider “factory school” testing and are no longer satisfied with simply outsourcing their “checking.” As the development side of software continues its dramatic shift toward Agile development, what role can testers play, and how can they still add value?
Similar to A Rapid Introduction to Rapid Software Testing
Isabel Evans stopped drawing and painting after being told she was not very good at it, which led to a loss of confidence in her creative and professional abilities. However, she realized that attempting creative activities is important for cognitive and emotional development, and that making mistakes and learning from failures allows for growth. By reengaging with failure through art and with support from others, Isabel was able to regain confidence in her abilities and reboot her career. The document discusses different perspectives on failure and the importance of learning from mistakes.
Instill a DevOps Testing Culture in Your Team and Organization TechWell
The DevOps movement is here. Companies across many industries are breaking down siloed IT departments and federating them into product development teams. Testing and its practices are at the heart of these changes. Traditionally, IT organizations have been staffed with mostly manual testers and a limited number of automation and performance engineers. To keep pace with development in the new “you build it, you own it” environment, testing teams and individuals must develop new technical skills and even embrace coding to stay relevant and add greater value to the business. DevOps really starts with testing. Join Adam Auerbach as he explains what DevOps is and how it relates to testing. He describes how testing must change from top to bottom and how to access your own environment to identify improvement opportunities. Adam dives into practices like service virtualization, test data management, and continuous testing so you can understand where you are now and identify steps needed to instill a DevOps testing culture in your team and organization.
Test Design for Fully Automated Build ArchitectureTechWell
This document summarizes a half-day tutorial on test design for fully automated build architectures presented by Melissa Benua of mParticle at STAREAST 2018. The tutorial covered guiding principles for test design including prioritizing important and reliable tests, structuring automated pipelines around components, packages, and releases, and monitoring test results through code coverage, flaky test handling, and logging versus counters. It also included exercises mapping test cases to functional boundaries and categories of tests to pipeline stages.
System-Level Test Automation: Ensuring a Good StartTechWell
Many organizations invest a lot of effort in test automation at the system level but then have serious problems later on. As a leader, how can you ensure that your new automation efforts will get off to a good start? What can you do to ensure that your automation work provides continuing value? This tutorial covers both “theory” and “practice”. Dot Graham explains the critical issues for getting a good start, and Chris Loder describes his experiences in getting good automation started at a number of companies. The tutorial covers the most important management issues you must address for test automation success, particularly when you are new to automation, and how to choose the best approaches for your organization—no matter which automation tools you use. Focusing on system level testing, Dot and Chris explain how automation affects staffing, who should be responsible for which automation tasks, how managers can best support automation efforts to promote success, what you can realistically expect in benefits and how to report them. They explain—for non-techies—the key technical issues that can make or break your automation effort. Come away with your own clarified automation objectives, and a draft test automation strategy to use to plan your own system-level test automation.
Build Your Mobile App Quality and Test StrategyTechWell
Let’s build a mobile app quality and testing strategy together. Whether you have a web, hybrid, or native app, building a quality and testing strategy means (1) knowing what data and tools you have available to make agile decisions, (2) understanding your customers and your competitors, and (3) testing your app under real-world conditions. Jason Arbon guides you through the latest techniques, data, and tools to ensure the awesomeness of your mobile app quality and testing strategy. Leave this interactive session with a strategy for your very own app—or one you pretend to own. The information Jason shares is based on data from Appdiff’s next-gen mobile app testing platform, lessons from Applause/uTest’s crowd, text mining hundreds of millions of app store reviews, and in-depth discussions with top mobile app development teams.
Testing Transformation: The Art and Science for SuccessTechWell
Technologies, testing processes, and the role of the tester have evolved significantly in the past few years with the advent of agile, DevOps, and other new technologies. It is critical that we testing professionals evaluate ourselves and continue to add tangible value to our organizations. In your work, are you focused on the trivial or on real game changers? Jennifer Bonine describes critical elements that help you artfully blend people, process, and technology to create a synergistic relationship that adds value. Jennifer shares ideas on mastering politics, maneuvering core vs. context, and innovating your technology strategies and processes. She explores how new processes can be introduced in an organization, what the role of organizational culture is in determining the success of a project, and how you can know what tools will add value vs. simply adding overhead and complexity. Jennifer reviews critically needed tester skills and discusses a continual learning model to evolve your skills and stay relevant. This discussion can lead you to technologies, processes, and skills you can stake your career on.
We’ve all been there. We work incredibly hard to develop a feature and design tests based on written requirements. We build a detailed test plan that aligns the tests with the software and the documented business needs. And when we put the tests to the software, it all falls apart because the requirements were changed without informing everyone. Mary Thorn says help is at hand. Enter behavior-driven development (BDD), and Cucumber and SpecFlow, tools for running automated acceptance tests and facilitating BDD. Mary explores the nuances of Cucumber and SpecFlow, and shows you how to implement BDD and agile acceptance testing. By fostering collaboration for implementing active requirements via a common language and format, Cucumber and SpecFlow bridge the communication gap between business stakeholders and implementation teams. In this workshop, practice writing feature files with the best practices Mary has discovered over numerous implementations. If you experience developers not coding to requirements, testers not getting requirements updates, or customers who feel out of the loop and don’t get what they ask for, Mary has answers for you.
Develop WebDriver Automated Tests—and Keep Your SanityTechWell
Many teams go crazy because of brittle, high-maintenance automated test suites. Jim Holmes helps you understand how to create a flexible, maintainable, high-value suite of functional tests using Selenium WebDriver. Learn the basics of what to test, what not to test, and how to avoid overlapping with other types of testing. Jim includes both philosophical concepts and hands-on coding. Testers who haven't written code should not be intimidated! We'll pair you up to make sure you're successful. Learn to create practical tests dealing with advanced situations such as input validation, AJAX delays, and working with file downloads. Additionally, discover when you need to work together with developers to create a system that's more easily testable. This tutorial focuses primarily on automating web tests, but many of the same concepts can be applied to other UI environments. Demos and labs will be in C# and Java using WebDriver. Leave this tutorial having learned how to write high-value WebDriver tests—and stay sane while doing so.
DevOps is a cultural shift aimed at streamlining intergroup communication and improving operational efficiency for development and operations groups. Over time, inclusion of other IT groups under the DevOps umbrella has become the norm for many organizations. But even broadening the boundaries of DevOps, the conversation has been largely devoid of the business units’ place at the table. A common mistake organizations make while going through the DevOps transformation is drawing a line at the IT boundary. If that occurs, a larger, more inclusive silo within the organization is created, operating in an informational vacuum and causing operational inefficiency and goal misalignment. Sharing his experiences working on both sides of the fence, Leon Fayer describes the importance of including business units in order to align technology decisions with business goals. Leon discusses inclusion of business units in existing agile processes, benefits of cross-departmental monitoring, and a business-first approach to technology decisions.
Eliminate Cloud Waste with a Holistic DevOps StrategyTechWell
Chris Parlette maintains that renting infrastructure on demand is the most disruptive trend in IT in decades. In 2016, enterprises spent $23B on public cloud IaaS services. By 2020, that figure is expected to reach $65B. The public cloud is now used like a utility, and like any utility, there is waste. Who's responsible for optimizing the infrastructure and reducing wasted expenses? It’s DevOps. The excess expense, known as cloud waste, comprises several interrelated problems: services running when they don't need to be, improperly sized infrastructure, orphaned resources, and shadow IT. There are a few core tenets of DevOps—holistic thinking, no silos, rapid useful feedback, and automation—that can be applied to reducing your cloud waste. Join Chris to learn why you should include continuous cost optimization in your DevOps processes. Automate cost control, reduce your cloud expenses, and make your life easier.
Transform Test Organizations for the New World of DevOpsTechWell
With the recent emergence of DevOps across the industry, testing organizations are being challenged to transform themselves significantly within a short period of time to stay meaningful within their organizations. It’s not easy to plan and approach these changes considering the way testing organizations have remained structured for ages. These challenges start from foundational organizational structures and can cut across leadership influence, competencies, tools strategy, infrastructure, and other dimensions. Sumit Kumar shares his experience assisting various organizations to overcome these challenges using an organized DevOps enablement framework. The framework includes radical restructuring, turning the tools strategy upside down, a multidimensional workforce enablement supported by infrastructure changes, redeveloped collaborations models, and more. From his real world experiences Sumit shares tips for approaching this journey and explains the roadmap for testing organizations to transform themselves to lead the quality in DevOps.
The Fourth Constraint in Project Delivery—LeadershipTechWell
All too often, the triple constraints—time, cost, and quality—are bandied about as if they are the be-all, end-all. While they are important, leadership—the fourth and larger underpinning constraint—influences the first three. Statistics on project success and failure abound, and these measurements are usually taken against the triple constraints. According to the Project Management Institute, only 53 percent of projects are completed within budget, and only 49 percent are completed on time. If so many projects overrun budget and are late, we can’t really say, “Good, fast, or cheap—pick two.” Rob Burkett talks about leadership at every level of a team. He shares his insights and stories gleaned from his years of IT and project management experience. Rob speaks to some of the glaring difficulties in the workplace in general and some specifically related to IT delivery and project management. Leave with a clearer understanding of how to communicate with teams and team members, and gain a better understanding of how you can be a leader—up and down your organization.
Resolve the Contradiction of Specialists within Agile TeamsTechWell
As teams grow, organizations often draw a distinction between feature teams, which deliver the visible business value to the user, and component teams, which manage shared work. Steve Berczuk says that this distinction can help organizations be more productive and scale effectively, but he recognizes that not all shared work fits into this model. Some work is best handled by “specialists,” that is people with unique skills. Although teams composed entirely of T-shaped people is ideal, certain skills are hard to come by and are used irregularly across an organization. Since these specialists often need to work closely with teams, rather than working from their own backlog, they don’t fit into the component team model. The use of shared resources presents challenges to the agile planning model. Steve Berczuk shares how teams such as those providing infrastructure services and specialists can fit into a feature+component team model, and how variations such as embedding specialists in a scrum team can both present process challenges and add significant value to both the team and the larger organization.
Pin the Tail on the Metric: A Field-Tested Agile GameTechWell
Metrics don’t have to be a necessary evil. If done right, metrics can help guide us to make better forward-looking decisions, rather than being used for simply managing or monitoring. They can help us identify trade-offs between options for what to do next versus punitive or worse, purely managerial measures. Steve Martin won’t be giving the Top Ten List of field-tested metrics you should use. Instead, in this interactive mini-workshop, he leads you through the critical thinking necessary for you to determine what is right for you to measure. First, Steve explores why you want to measure something—whether it’s for a team, a portfolio, or even an agile transformation. Next, he provides multiple real-life metrics examples to help drive home concepts behind characteristics of good and bad metrics. Finally, Steve shows how to run his field-tested agile game—Pin the Tail on the Metric. Take back this activity to help you guide metrics conversations at your organization.
Agile Performance Holarchy (APH)—A Model for Scaling Agile TeamsTechWell
A hierarchy is an organizational network that has a top and a bottom, and where position is determined by rank, importance, and value. A holarchy is a network that has no top or bottom and where each person’s value derives from his ability, rather than position. As more companies seek the benefits of agile, leaders need to build and sustain delivery capability while scaling agile without introducing unnecessary process and overhead. The Agile Performance Holarchy (APH) is an empirical model for scaling and sustaining agility while continuing to deliver great products. Jeff Dalton designed the APH by drawing from lessons learned observing and assessing hundreds of agile companies and teams. The APH helps implement a holarchy—a system composed of interacting organizational units called holons—centered on a series of performance circles that embody the behaviors of high performing agile organizations. Jeff describes how APH provides guidelines in the areas of leadership, values, teaming, visioning, governing, building, supporting, and engaging within an all-agile organization. Join Jeff to see what the APH is all about and how you can use it in your team and organization.
A Business-First Approach to DevOps ImplementationTechWell
DevOps is a cultural shift aimed at streamlining intergroup communication and improving operational efficiency for development and operations groups. Over time, inclusion of other IT groups under the DevOps umbrella has become the norm for many organizations. But even broadening the boundaries of DevOps, the conversation has been largely devoid of the business units’ place at the table. A common mistake organizations make while going through the DevOps transformation is drawing a line at the IT boundary. If that occurs, a larger, more inclusive silo within the organization is created, operating in an informational vacuum and causing operational inefficiency and goal misalignment. Sharing his experiences working on both sides of the fence, Leon Fayer describes the importance of including business units in order to align technology decisions with business goals. Leon discusses inclusion of business units in existing agile processes, benefits of cross-departmental monitoring, and a business-first approach to technology decisions.
Databases in a Continuous Integration/Delivery ProcessTechWell
The document summarizes a presentation about including databases in a continuous integration/delivery process. It discusses treating database code like application code by placing it under version control and integrating databases into the DevOps software development pipeline. This allows databases to be built, tested, and released like other software through continuous integration, delivery, and deployment.
Mobile Testing: What—and What Not—to AutomateTechWell
Organizations are moving rapidly into mobile technology, which has significantly increased the demand for testing of mobile applications. David Dangs says testers naturally are turning to automation to help ease the workload, increase potential test coverage, and improve testing efficiency. But should you try to automate all things mobile? Unfortunately, the answer is not always clear. Mobile has its own set of complications, compounded by a wide variety of devices and OS platforms. Join David to learn what mobile testing activities are ripe for automation—and those items best left to manual efforts. He describes the various considerations for automating each type of mobile application: mobile web, native app, and hybrid applications. David also covers device-level testing, types of testing, available automation tools, and recommendations for automation effectiveness. Finally, based on his years of mobile testing experience, David provides some tips and tricks to approach mobile automation. Leave with a clear plan for automating your mobile applications.
Cultural Intelligence: A Key Skill for SuccessTechWell
Diversity is becoming the norm in everyday life. However, introducing global delivery models without a proper understanding of intercultural differences can lead to difficulty, frustration, and reduced productivity. Priyanka Sharma and Thena Barry say that in our diverse world, we need teams with people who can cross these boundaries, communicate effectively, and build the diverse networks necessary to avoid problems. We need to learn about cultural intelligence (CI) and cultural quotient (CQ). CI is the ability to relate and work effectively across cultures. CQ is the cognitive, motivational, and behavioral capacity to understand and respond to beliefs, values, attitudes, and behaviors of individuals and groups. Together, CI and CQ can help us build behavioral capacities that aid motivation, behavior, and productivity in teams as well as individuals. Priyanka and Thena show how to build a more culturally intelligent place with tools and techniques from Leading with Cultural Intelligence, as well as content from the Hofstede cultural model. In addition, they illustrate the model with real-life experiences and demonstrate how they adapted in similar circumstances.
Turn the Lights On: A Power Utility Company's Agile TransformationTechWell
Why would a century-old utility with no direct competitors take on the challenge of transforming its entire IT application organization to an agile methodology? In an increasingly interconnected world, the expectations of customers continue to evolve. From smart meters to smart phones, IoT is creating a crisis point for industries not accustomed to rapid change. Glen Morris explains that pizzas can be tracked by the minute and packages at every stop, and customers now expect this same customer service model should exist for all industries—including power. Glen examines how to create momentum and transform non-IT-focused industries to an agile model. If you are struggling with gaining traction in your pursuit of agile within your business, Glen gives you concrete, practical experiences to leverage in your pursuit. Finally, he communicates how to gain buy-in from business partners who have no idea or concern about agile or its methodologies. If your business partners look at you with amusement when you mention the need for a dedicated Product Owner, join Glen as he walks you through the approaches to overcoming agile skepticism.
Fueling AI with Great Data with Airbyte WebinarZilliz
This talk will focus on how to collect data from a variety of sources, leveraging this data for RAG and other GenAI use cases, and finally charting your course to productionalization.
Ivanti’s Patch Tuesday breakdown goes beyond patching your applications and brings you the intelligence and guidance needed to prioritize where to focus your attention first. Catch early analysis on our Ivanti blog, then join industry expert Chris Goettl for the Patch Tuesday Webinar Event. There we’ll do a deep dive into each of the bulletins and give guidance on the risks associated with the newly-identified vulnerabilities.
Unlocking Productivity: Leveraging the Potential of Copilot in Microsoft 365, a presentation by Christoforos Vlachos, Senior Solutions Manager – Modern Workplace, Uni Systems
Your One-Stop Shop for Python Success: Top 10 US Python Development Providersakankshawande
Simplify your search for a reliable Python development partner! This list presents the top 10 trusted US providers offering comprehensive Python development services, ensuring your project's success from conception to completion.
Infrastructure Challenges in Scaling RAG with Custom AI modelsZilliz
Building Retrieval-Augmented Generation (RAG) systems with open-source and custom AI models is a complex task. This talk explores the challenges in productionizing RAG systems, including retrieval performance, response synthesis, and evaluation. We’ll discuss how to leverage open-source models like text embeddings, language models, and custom fine-tuned models to enhance RAG performance. Additionally, we’ll cover how BentoML can help orchestrate and scale these AI components efficiently, ensuring seamless deployment and management of RAG systems in the cloud.
In his public lecture, Christian Timmerer provides insights into the fascinating history of video streaming, starting from its humble beginnings before YouTube to the groundbreaking technologies that now dominate platforms like Netflix and ORF ON. Timmerer also presents provocative contributions of his own that have significantly influenced the industry. He concludes by looking at future challenges and invites the audience to join in a discussion.
Building Production Ready Search Pipelines with Spark and MilvusZilliz
Spark is the widely used ETL tool for processing, indexing and ingesting data to serving stack for search. Milvus is the production-ready open-source vector database. In this talk we will show how to use Spark to process unstructured data to extract vector representations, and push the vectors to Milvus vector database for search serving.
AI 101: An Introduction to the Basics and Impact of Artificial IntelligenceIndexBug
Imagine a world where machines not only perform tasks but also learn, adapt, and make decisions. This is the promise of Artificial Intelligence (AI), a technology that's not just enhancing our lives but revolutionizing entire industries.
Climate Impact of Software Testing at Nordic Testing DaysKari Kakkonen
My slides at Nordic Testing Days 6.6.2024
Climate impact / sustainability of software testing discussed on the talk. ICT and testing must carry their part of global responsibility to help with the climat warming. We can minimize the carbon footprint but we can also have a carbon handprint, a positive impact on the climate. Quality characteristics can be added with sustainability, and then measured continuously. Test environments can be used less, and in smaller scale and on demand. Test techniques can be used in optimizing or minimizing number of tests. Test automation can be used to speed up testing.
How to Get CNIC Information System with Paksim Ga.pptxdanishmna97
Pakdata Cf is a groundbreaking system designed to streamline and facilitate access to CNIC information. This innovative platform leverages advanced technology to provide users with efficient and secure access to their CNIC details.
Have you ever been confused by the myriad of choices offered by AWS for hosting a website or an API?
Lambda, Elastic Beanstalk, Lightsail, Amplify, S3 (and more!) can each host websites + APIs. But which one should we choose?
Which one is cheapest? Which one is fastest? Which one will scale to meet our needs?
Join me in this session as we dive into each AWS hosting service to determine which one is best for your scenario and explain why!
OpenID AuthZEN Interop Read Out - AuthorizationDavid Brossard
During Identiverse 2024 and EIC 2024, members of the OpenID AuthZEN WG got together and demoed their authorization endpoints conforming to the AuthZEN API
Full-RAG: A modern architecture for hyper-personalizationZilliz
Mike Del Balso, CEO & Co-Founder at Tecton, presents "Full RAG," a novel approach to AI recommendation systems, aiming to push beyond the limitations of traditional models through a deep integration of contextual insights and real-time data, leveraging the Retrieval-Augmented Generation architecture. This talk will outline Full RAG's potential to significantly enhance personalization, address engineering challenges such as data management and model training, and introduce data enrichment with reranking as a key solution. Attendees will gain crucial insights into the importance of hyperpersonalization in AI, the capabilities of Full RAG for advanced personalization, and strategies for managing complex data integrations for deploying cutting-edge AI solutions.
Unlock the Future of Search with MongoDB Atlas_ Vector Search Unleashed.pdfMalak Abu Hammad
Discover how MongoDB Atlas and vector search technology can revolutionize your application's search capabilities. This comprehensive presentation covers:
* What is Vector Search?
* Importance and benefits of vector search
* Practical use cases across various industries
* Step-by-step implementation guide
* Live demos with code snippets
* Enhancing LLM capabilities with vector search
* Best practices and optimization strategies
Perfect for developers, AI enthusiasts, and tech leaders. Learn how to leverage MongoDB Atlas to deliver highly relevant, context-aware search results, transforming your data retrieval process. Stay ahead in tech innovation and maximize the potential of your applications.
#MongoDB #VectorSearch #AI #SemanticSearch #TechInnovation #DataScience #LLM #MachineLearning #SearchTechnology
Unlock the Future of Search with MongoDB Atlas_ Vector Search Unleashed.pdf
A Rapid Introduction to Rapid Software Testing
1.
Full-Day Tutorial
9/30/2013 8:30:00 AM
"A Rapid Introduction to Rapid
Software Testing"
Presented by:
Paul Holland
Testing Thoughts
Brought to you by:
340 Corporate Way, Suite 300, Orange Park, FL 32073
888-268-8770 ∙ 904-278-0524 ∙ sqeinfo@sqe.com ∙ www.sqe.com
2. Paul Holland
Testing Thoughts
An independent software test consultant and teacher, Paul Holland has more than sixteen years
of hands-on testing and test management experience, primarily at Alcatel-Lucent where he led a
transformation of the testing approach for two product divisions, making them more efficient and
effective. As a test manager and tester, Paul focused on exploratory testing, test automation,
and improving testing techniques.
3. A Rapid Introduction
to Rapid Software Testing
James Bach, Satisfice, Inc.
james@satisfice.com
www.satisfice.com
+1 (360) 440-1435
Michael Bolton, DevelopSense
mb@developsense.com
www.developsense.com
+1 (416) 656-5160
Paul Holland, Testing Thoughts
paul@testingthoughts.com
www.testingthoughts.com
+1 (613) 297-3468
Acknowledgements
• Some of this material was developed in collaboration with
Dr. Cem Kaner, of the Florida Institute of Technology. See
www.kaner.com and www.testingeducation.org.
• Doug Hoffman (www.softwarequalitymethods.com) has also
contributed to and occasionally teaches from this material.
• Many of the ideas in this presentation were also inspired by or
augmented by other colleagues including Jonathan Bach, Bret
Pettichord, Brian Marick, Dave Gelperin, Elisabeth Hendrickson, Jerry
Weinberg, Noel Nyman, and Mary Alton.
• Some of our exercises were introduced to us by Payson Hall, Jerry
Weinberg, Ross Collard, James Lyndsay, Dave Smith, Earl Everett,
Brian Marick, Cem Kaner and Joe McMahon.
• Many ideas were improved by students who took earlier versions of
the class going back to 1996.
4. Assumptions About You
• You test software, or any other complex human
creation.
• You have at least some control over the design of your
tests and some time to create new tests.
• You are worried that your test process is spending too
much time and resources on things that aren’t
important.
• You test under uncertainty and time pressure.
• Your major goal is to find important problems quickly.
• You want to get very good at (software) testing.
A Question
What makes testing harder or slower?
5. Premises of Rapid Testing
1. Software projects and products are relationships
between people.
2. Each project occurs under conditions of uncertainty
and time pressure.
3. Despite our best hopes and intentions, some
degree of inexperience, carelessness, and
incompetence is normal.
4. A test is an activity; it is a performance, not an artifact.
Premises of Rapid Testing
5. Testing’s purpose is to discover the status of the
product and any threats to its value, so that our
clients can make informed decisions about it.
6. We commit to performing credible, cost-effective
testing, and we will inform our clients of anything that
threatens that commitment.
7. We will not knowingly or negligently mislead our
clients and colleagues or ourselves.
8. Testers accept responsibility for the quality of their
work, although they cannot control the quality of the
product.
6. Rapid Testing
Rapid testing is a mind-set and a skill-set of testing focused on how to do testing more quickly, less expensively, with excellent results.
This is a general testing methodology. It adapts to any kind of project or product.
How does Rapid Testing compare
with other kinds of testing?
The slide plots four styles of testing on two axes: More Work & Time (Cost) versus Better Thinking & Better Testing (Value).
• Exhaustive: slow, very expensive, and difficult. Management likes to talk about exhaustive testing, but they don’t want to fund it and they don’t know how to do it.
• Ponderous: slow, expensive, and easier. When testing is turned into an elaborate set of rote tasks, it becomes ponderous without really being thorough.
• Slapdash: much faster, cheaper, and easier. You can always test quickly... but it might be poor testing.
• Rapid: faster, less expensive, still challenging. Rapid testing may not be exhaustive, but it is thorough enough and quick enough. It’s less work than ponderous testing. It might be less work than slapdash testing. It fulfills the mission of testing.
7. Excellent Rapid Technical Work
Begins with You
When the ball comes to you…
• Do you know you have the ball?
• Can you receive the pass?
• Do you know your options?
• Do you know what your role and mission are?
• Do you know where your teammates are?
• Is your equipment ready?
• Can you read the situation on the field?
• Are you aware of the criticality of the situation?
• Can you let your teammates help you?
• Are you ready to act, right now?
…but you don’t have to be great at everything.
• Rapid test teams are about diverse talents cooperating
• We call this the elliptical team, as opposed to the team of
perfect circles.
• Some important dimensions to vary:
  • Technical skill
  • Domain expertise
  • Temperament (e.g. introvert vs. extrovert)
  • Testing experience
  • Project experience
  • Industry experience
  • Product knowledge
  • Educational background
  • Writing skill
• Diversity makes exploration far more powerful
• Your team is powerful because of your unique contribution
8. What It Means To Test Rapidly
• Since testing is about finding a potentially infinite
number of problems in an infinite space in a finite
amount of time, we must…
• understand our mission and obstacles to fulfilling it
• know how to recognize problems quickly
• model the product and the test space to know where to look
for problems
• prefer inexpensive, lightweight, effective tools
• reduce dependence on expensive, time-consuming artifacts,
while getting value from the ones we’ve got
• do nothing that wastes time or effort
• tell a credible story about all that
One Big Problem in Testing
Formality Bloat
• Much of the time, your testing doesn’t need to be very formal*
• Even when your testing does need to be formal, you’ll need to do
substantial amounts of informal testing in order to figure out
how to do excellent formal testing.
  • Who says? The FDA. See http://www.satisfice.com/blog/archives/602
• Even in a highly regulated environment, you do formal testing
primarily for the auditors. You do informal testing to make sure
you don’t lose money, blow things up, or kill people.

* Formal testing means testing that must be done to verify a specific fact,
or that must be done in a specific way.
9. EXERCISE
Test the Famous Triangle
What is testing?
Serving Your Client
If you don’t have an understanding of, and an agreement on, the
mission of your testing, then doing it “rapidly” would be pointless.
10. Not Enough
Product and Project Information?
Where do we get
test information?
What Is A Problem?
A problem is…
11. How Do We Recognize Problems?
An oracle is…
a way to recognize
a problem.
Learn About Heuristics
Heuristics are fallible, “fast and frugal” methods of solving
problems, making decisions, or accomplishing tasks.
“The engineering method is
the use of heuristics
to cause the best change
in a poorly understood situation
within the available resources.”
Billy Vaughan Koen
Discussion of the Method
12. Heuristics: Generating Solutions
Quickly and Inexpensively
• Heuristic (adjective):
serving to discover or learn
• Heuristic (noun):
a fallible method for solving a problem
or making a decision
“Heuristic reasoning is not regarded as final and strict
but as provisional and plausible only, whose purpose
is to discover the solution to the present problem.”
- George Polya, How to Solve It
Oracles
An oracle is a heuristic principle or mechanism
by which we recognize a problem.
“It works!”
really means…
“...it appeared at least once to meet some
requirement to some degree”
“...uh, when I ran it”
“...that one time”
“...on my machine.”
13. Familiar Problems
If a product is consistent with problems we’ve seen before,
we suspect that there might be a problem.
Explainability
If a product is inconsistent with our ability to explain it
(or someone else’s), we suspect that there might be a problem.
14. World
If a product is inconsistent with the way the world works,
we suspect that there might be a problem.
History

If a product is inconsistent with previous versions of itself,
we suspect that there might be a problem.
(“Okay, so how the #&@ do I print now?”)
15. Image
If a product is inconsistent with an image that
the company wants to project, we suspect a problem.
Comparable Products

When a product seems inconsistent with a product that is in some way
comparable (WordPad vs. Word, for example), we suspect that there
might be a problem.
16. Claims
When a product is inconsistent with claims that important
people make about it, we suspect a problem.
User Expectations
When a product is inconsistent with expectations that a
reasonable user might have, we suspect a problem.
17. Purpose
When a product is inconsistent with its designers’ explicit
or implicit purposes, we suspect a problem.
Product
When a product is inconsistent internally—as when it
contradicts itself—we suspect a problem.
18. Statutes and Standards
When a product is inconsistent with laws or widely
accepted standards, we suspect a problem.
Consistency (“this agrees with that”)
an important theme in oracle principles
• Familiarity: The system is not consistent with the pattern of any familiar problem.
• Explainability: The system is consistent with our ability to describe it clearly.
• World: The system is consistent with things that we recognize in the world.
• History: The present version of the system is consistent with past versions of it.
• Image: The system is consistent with an image that the organization wants to project.
• Comparable Products: The system is consistent with comparable systems.
• Claims: The system is consistent with what important people say it’s supposed to be.
• Users’ Expectations: The system is consistent with what users want.
• Product: Each element of the system is consistent with comparable elements in the
same system.
• Purpose: The system is consistent with its purposes, both explicit and implicit.
• Standards and Statutes: The system is consistent with applicable laws, or relevant
implicit or explicit standards.
Consistency heuristics rely on the quality of your
models of the product and its context.
19. All Oracles Are Heuristic
An oracle doesn’t tell you that there IS a problem.
An oracle tells you that you might be seeing a problem.
An oracle can alert you to a possible problem,
but an oracle cannot tell you that there is no problem.
Consistency heuristics rely on the quality of
your models of the product and its context.
Rely solely on documented, anticipated sources of oracles,
and your testing will likely be slower and weaker.
Train your mind to recognize patterns of oracles
and your testing will likely be faster
and your ability to spot problems will be sharper.
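The comparable-products principle above can be sketched in code. This is a minimal illustration of my own, not from the deck: `product_display` and `comparable_display` are hypothetical stand-ins for the product under test and a comparable product, and the oracle reports a suspicion rather than a verdict.

```python
from decimal import Decimal, ROUND_HALF_UP

def product_display(value):
    """Hypothetical product under test: format a price for display."""
    return f"{value:.2f}"

def comparable_display(value):
    """A comparable 'product': decimal-based rounding of the same price."""
    return str(Decimal(str(value)).quantize(Decimal("0.01"),
                                            rounding=ROUND_HALF_UP))

def comparable_product_oracle(value):
    """Return a suspicion, not a verdict: a mismatch means 'we suspect
    a problem', not 'the product is broken'. The comparable product may
    be the one that is wrong, or the difference may not matter."""
    ours, theirs = product_display(value), comparable_display(value)
    if ours != theirs:
        return f"suspect a problem: {ours!r} vs comparable {theirs!r}"
    return None  # no problem noticed; NOT proof that there is no problem

print(comparable_product_oracle(0.10))   # None: the two displays agree
print(comparable_product_oracle(2.675))  # mismatch: binary float rounds down
```

Note how the mismatch on 2.675 only alerts us to a possible problem; deciding whether float rounding matters here belongs to someone whose opinion matters.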
General Examples of Oracles
things that suggest “problem” or “no problem”

People
• A person whose opinion matters.
• An opinion held by a person who matters.
• A disagreement among people who matter.
Mechanisms
• A reference document with useful information.
• A known good example output.
• A known bad example output.
• A process or tool by which the output is checked.
• A process or tool that helps a tester identify patterns.
Feelings
• A feeling like confusion or annoyance.
Principles
• A desirable consistency between related things.
20. Oracles from the Inside Out

(Diagram: oracles arranged from tacit to explicit, and from the tester
outward to other people. On the tester’s side: your feelings and mental
models, and inference from them. Moving outward, through experience,
conference, and reference: stakeholders’ feelings and mental models,
shared artifacts (specs, tools, etc.), and observable consistencies.)
Oracle Cost and Value
• Some oracles are more authoritative, but more responsive to change
• Some oracles are more consistent, but maybe not up to date
• Some oracles are more immediate, but less reliable
• Some oracles are more precise, but the precision may be misleading
• Some oracles are more accurate, but less precise
• Some oracles are more available, but less authoritative
• Some oracles are easier to interpret, but more narrowly focused
21. Feelings As Heuristic Triggers For Oracles
• An emotional reaction is a trigger to attention
and learning
• Without emotion, we don’t reason well
• See Damasio, The Feeling of What Happens
• When you find yourself mildly concerned about
something, someone else could be very
concerned about it
• Observe emotions to help overcome your
biases and to evaluate significance
An emotion is a signal; consider looking into it
All Oracles Are Heuristic
• We often do not have oracles that establish a definite correct or incorrect
result, in advance. Oracles may reveal themselves to us on the fly, or later.
That’s why we use abductive inference.
• No single oracle can tell us whether a program (or a feature) is working
correctly at all times and in all circumstances.
That’s why we use a variety of oracles.
• Any program that looks like it’s working, to you, may in fact be failing in
some way that happens to fool all of your oracles. That’s why we proceed
with humility and critical thinking.
• We never know when a test is finished.
That’s why we try to maintain uncertainty when everyone else on the
project is sure.
• You (the tester) can’t know the deep truth about any result.
That’s why we report whatever seems likely to be a bug.
22. Oracles are Not Perfect
And Testers are Not Judges
• You don’t need to know for sure if something is a bug;
it’s not your job to decide if something is a bug; it’s your
job to decide if it’s worth reporting.
• You do need to form a justified belief that it MIGHT be
a threat to product value in the opinion of someone
who matters.
• And you must be able to say why you think so; you must
be able to cite good oracles… or you will lose credibility.
MIP’ing VS. Black Flagging
Coping With Difficult Oracle Problems
• Ignore the Problem
• Ask “so what?” Maybe the value of the information doesn’t justify the cost.
• Simplify the Problem
• Ask for testability. It usually doesn’t happen by accident.
• Built-in oracle. Internal error detection and handling.
• Lower the standards. You may be using an unreasonable standard of correctness.
• Shift the Problem
• Parallel testing. Compare with another instance of a comparable algorithm.
• Live oracle. Find an expert who can tell if the output is correct.
• Reverse the function. (e.g. 2 x 2 = 4, then 4/2 = 2)
• Divide and Conquer the Problem
• Spot check. Perform a detailed inspection on one instance out of a set of outputs.
• Blink test. Compare or review overwhelming batches of data for patterns that
stand out.
• Easy input. Use input for which the output is easy to analyze.
• Easy output. Some output may be obviously wrong, regardless of input.
• Unit test first. Learn about the pieces that make the whole.
• Test incrementally. Learn about the product by testing over a period of time.
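Two of the “shift the problem” tactics above, reversing the function and parallel testing, can be sketched in a few lines. These are my illustrations, not the deck’s: base64 encoding stands in for a hard-to-verify function that has an inverse, and a naive insertion sort stands in for a comparable algorithm to run in parallel with the built-in `sorted`.

```python
import base64
import random

def encode(data):    # function under test (hypothetical stand-in)
    return base64.b64encode(data)

def decode(blob):    # its inverse
    return base64.b64decode(blob)

# Reverse the function: we may not know what encode(x) *should* be,
# but decode(encode(x)) should give x back.
rng = random.Random(1)
for _ in range(100):
    x = bytes(rng.randrange(256) for _ in range(rng.randrange(50)))
    assert decode(encode(x)) == x, f"round-trip failed for {x!r}"

# Parallel testing: compare with another instance of a comparable algorithm.
def naive_sort(items):
    out = []
    for item in items:
        i = 0
        while i < len(out) and out[i] < item:
            i += 1
        out.insert(i, item)
    return out

for _ in range(100):
    xs = [rng.randrange(100) for _ in range(rng.randrange(20))]
    assert sorted(xs) == naive_sort(xs), f"disagreement on {xs}"

print("no suspicions raised by either oracle")
```

Agreement between the two sorts is not proof of correctness; they could share a wrong assumption. It only shifts the oracle problem to a cheaper comparison.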
23. “Easy Input”
• Fixed Markers. Use distinctive fixed input patterns that are easy to
spot in the output.
• Statistical Markers. Use populations of data that have
distinguishable statistical properties.
• Self-Referential Data. Use data that embeds metadata about itself.
(e.g. counterstrings)
• Easy Input Regions. For specific inputs, the correct output may be
easy to calculate.
• Outrageous Values. For some inputs, we expect error handling.
• Idempotent Input. Try a case where the output will be the same as
the input.
• Match. Do the “same thing” twice and look for a match.
• Progressive Mismatch. Do progressively differing things over time
and account for each difference. (code-breaking technique)
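The “self-referential data (e.g. counterstrings)” bullet can be sketched as a small generator. In a counterstring, each marker records its own 1-based position in the string, so if a field silently truncates the input, the last visible marker tells you exactly how many characters survived. This is a sketch of the idea, not the canonical tool.

```python
def counterstring(length, marker="*"):
    # Build chunks like "3*", "5*", "10*" from the end of the string
    # backwards, so each marker lands exactly at the position it names.
    chunks = []
    pos = length
    while pos > 0:
        chunk = str(pos) + marker
        if len(chunk) > pos:          # no room left for digits + marker
            chunks.append(marker * pos)
            break
        chunks.append(chunk)
        pos -= len(chunk)
    return "".join(reversed(chunks))

print(counterstring(10))        # *3*5*7*10*
print(len(counterstring(256)))  # 256
```

Paste `counterstring(256)` into a text field; if what comes back ends with, say, `248*`, the field kept only 248 characters.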
Oracles Are Linked To Threats
To Quality Criteria
Capability
Scalability
Reliability
Compatibility
Usability
Performance
Charisma
Installability
Security
Development
Any inconsistency may represent diminished value.
Many test approaches focus on capability (functionality)
and underemphasize the other criteria.
24. Oracles Are Linked To Threats
To Quality Criteria
Supportability
Testability
Maintainability
Portability
Localization
Any inconsistency may represent diminished value.
Many test approaches focus on capability (functionality)
and underemphasize the other criteria.
Focusing on Preparation and Skill
Can Reduce Documentation Bloat
3.0 Test Procedures
3.1 General testing protocol.

In the test descriptions that follow, the word “verify” is used to highlight specific items
that must be checked. In addition to those items a tester shall, at all times, be alert for
any unexplained or erroneous behavior of the product. The tester shall bear in mind
that, regardless of any specific requirements for any specific test, there is the
overarching general requirement that the product shall not pose an unacceptable risk
of harm to the patient, including an unacceptable risk arising from reasonably
foreseeable misuse.

Test personnel requirements: The tester shall be thoroughly familiar with the
generator and workstation FRS, as well as with the working principles of the devices
themselves. The tester shall also know the working principles of the power test jig and
associated software, including how to configure and calibrate it and how to recognize if
it is not working correctly. The tester shall have sufficient skill in data analysis and
measurement theory to make sense of statistical test results. The tester shall be
sufficiently familiar with test design to complement this protocol with exploratory
testing, in the event that anomalies appear that require investigation. The tester shall
know how to keep test records to a credible, professional standard.
25. Remember…
For skilled testers,
good testing isn’t just about
pass vs. fail.
For skilled testers,
testing is about
problem vs. no problem.
Where Do We Look For Problems?
Coverage is…
how much of the
product has been tested.
26. What IS Coverage?
Coverage is “how much of the product we have tested.”
It’s the extent to which we have
traveled over some map of the product.
MODELS
Models
• A model is an idea, activity, or object…
(such as an idea in your mind, a diagram, a list of words, a spreadsheet,
a person, a toy, an equation, a demonstration, or a program)
• …that heuristically represents (literally, re-presents) another idea,
activity, or object…
(such as something complex that you need to work with or study)
• …whereby understanding something about the model may help you to
understand or manipulate the thing that it represents.
- A map is a model that helps to navigate across a terrain.
- 2+2=4 is a model for adding two apples to a basket that already has two apples.
- Atmospheric models help predict where hurricanes will go.
- A fashion model helps understand how clothing would look on actual humans.
- Your beliefs about what you test are a model of what you test.
27. There are as many kinds of test coverage as there are
ways to model the system.
Intentionally OR Incidentally
One Way to Model Coverage:
Product Elements (with Quality Criteria)

Product elements:
• Structure
• Function
• Data
• Interfaces
• Platform
• Operations
• Time

Quality criteria:
• Capability
• Reliability
• Usability
• Charisma
• Security
• Scalability
• Compatibility
• Performance
• Installability
• Supportability
• Testability
• Maintainability
28. To test a very simple product meticulously,
part of a complex product meticulously,
or to maximize test integrity…
1. Start the test from a known (clean) state.
2. Prefer simple, deterministic actions.
3. Trace test steps to a specified model.
4. Follow established and consistent lab procedures.
5. Make specific predictions, observations and records.
6. Make it easy to reproduce (automation may help).
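The checklist above can be sketched in code. This is my illustration, under stated assumptions: `Cart` is a hypothetical product fragment, and the pinned random seed is what makes the run easy to reproduce.

```python
import random

class Cart:
    """Hypothetical product fragment under test."""
    def __init__(self):              # 1. known (clean) state
        self.items = []
    def add(self, price):
        self.items.append(price)
    def total(self):
        return round(sum(self.items), 2)

def test_cart_total(seed=12345):
    rng = random.Random(seed)        # 6. reproducible: same seed, same test
    cart = Cart()
    prices = [round(rng.uniform(0.01, 9.99), 2) for _ in range(5)]
    for p in prices:                 # 2. simple, deterministic actions
        cart.add(p)
    expected = round(sum(prices), 2)     # 5. specific prediction
    observed = cart.total()              # 5. specific observation
    assert observed == expected, f"seed={seed}: {observed} != {expected}"
    return seed, prices, observed        # 5. a record for the test log

print(test_cart_total())
```

A failure report can then cite the seed, and anyone can rerun exactly the same test.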
General Focusing Heuristics
• use test-first approach or unit testing for better code
coverage
• work from prepared test coverage outlines and risk lists
• use diagrams, state models, and the like, and cover them
• apply specific test techniques to address particular coverage
areas
• make careful observations and match to expectations
To do this more rapidly, make preparation and artifacts fast and frugal:
leverage existing materials and avoid repeating yourself.
Emphasize doing; relax planning. You’ll make discoveries along the way!
29. To find unexpected problems,
elusive problems that occur in sustained field use,
or more problems quickly in a complex product…
(“That’s a PowerPoint bug!”)

1. Start from different states (not necessarily clean).
2. Prefer complex, challenging actions.
3. Generate tests from a variety of models.
4. Question your lab procedures and tools.
5. Try to see everything with open expectations.
6. Make the test hard to pass, instead of easy to reproduce.
General Defocusing Heuristics
• diversify your models; intentional coverage in one area can lead
to unintentional coverage in other areas—this is a Good Thing
• diversify your test techniques
• be alert to problems other than the ones that you’re actively
looking for
• welcome and embrace productive distraction
• do some testing that is not oriented towards a specific risk
• use high-volume, randomized automated tests
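The last bullet, high-volume randomized automated tests, might look like this. The function under test, `clamp`, is a hypothetical stand-in; the oracle is a property (output stays in range; in-range input passes through unchanged) rather than an exact expected value, which is what makes high volume affordable.

```python
import random

def clamp(value, low, high):
    """Hypothetical product code: constrain value to [low, high]."""
    return max(low, min(value, high))

random.seed(0)                      # reproducible randomness
for _ in range(10_000):
    low = random.randint(-100, 100)
    high = low + random.randint(0, 100)
    value = random.randint(-1000, 1000)
    result = clamp(value, low, high)
    # Property oracle: result stays in range, and values already in
    # range pass through unchanged.
    assert low <= result <= high, (value, low, high, result)
    if low <= value <= high:
        assert result == value, (value, low, high, result)

print("10,000 randomized checks raised no suspicion")
```

A clean run is not proof of correctness, only an absence of suspicion under these particular properties.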
30. DISCUSSION
How Many Test Cases?
What About Quantifying Coverage Overall?
• A nice idea, but we don’t know how to do it in a way
that is consistent with basic measurement theory
• If we describe coverage by counting test cases, we’re
committing reification error.
• If we use percentages to quantify coverage, we need to
establish what 100% looks like.
  • But we might do that with respect to some specific models.
  • Complex systems may display emergent behaviour.
31. Extent of Coverage
• Smoke and sanity
• Can this thing even be tested at all?
• Common, core, and critical
• Can this thing do the things it must do?
• Does it handle happy paths and regular input?
• Can it work?
• Complex, harsh, extreme and exceptional
• Will this thing handle challenging tests, complex data flows,
and malformed input, etc.?
• Will it work?
How Might We Organize,
Record, and Report Coverage?
• automated tools (e.g. profilers, coverage tools)
• annotated diagrams and mind maps
• coverage matrices
• bug taxonomies
• Michael Hunter’s You Are Not Done Yet list
• James Bach’s Heuristic Test Strategy Model
  • described at www.satisfice.com
  • articles about it at www.developsense.com
• Mike Kelly’s MCOASTER model
• product coverage outlines and risk lists
• session-based test management
  • http://www.satisfice.com/sbtm
See three articles here:
http://www.developsense.com/publications.html#coverage
32. What Does Rapid Testing Look Like?
Concise Documentation Minimizes Waste
(Diagram: general materials, such as testing heuristics, a risk catalog, a
coverage model, and a risk model, feed project-specific artifacts: a testing
playbook, schedule, issues, bugs, a test strategy reference, and a status
dashboard.)
Rapid Testing Documentation
• Recognize
• a requirements document is not the requirements
• a test plan document is not a test plan
• a test script is not a test
• doing, rather than planning, produces results
• Determine where your documentation is on the continuum:
product or tool?
• Keep your tools sharp and lightweight
• Obtain consensus from others as to what’s necessary and what’s
excess in products
• Ask whether reporting test results takes priority over
obtaining test results
• note that in some contexts, it might
• Eliminate unnecessary clerical work
34. Visualizing Test Progress
See “A Sticky Situation”, Better Software, February 2012
What IS Exploratory Testing?
• Simultaneous test design, test
execution, and learning.
• James Bach, 1995
But maybe it would be a good idea to underscore
why that’s important…
35. What IS Exploratory Testing?
• I follow (and to some degree contributed to) Kaner’s definition, which
was refined over several peer conferences through 2007:

Exploratory software testing is…
• a style of software testing
• that emphasizes the personal freedom and responsibility
of the individual tester
• to continually optimize the value of his or her work
• by treating test design, test execution, test result interpretation,
and test-related learning
• as mutually supportive activities
• that run in parallel
• throughout the project.

So maybe it would be a good idea to keep it brief most of the time…
See Kaner, “Exploratory Testing After 23 Years”, www.kaner.com/pdfs/ETat23.pdf
Why Exploratory Approaches?
• Systems are far more than collections of functions
• Systems typically depend upon and interact with many external systems
36. Why Exploratory Approaches?
• Systems are too complex for individuals to comprehend and describe
• Products evolve rapidly in ways that cannot be anticipated
In the future, developers will likely do more verification and validation at the
unit level than they have done before.
Testers must explore, discover, investigate, and learn about the system.
Why Exploratory Approaches?
• Developers are using tools and frameworks that
make programming more productive, but that
may manifest more emergent behaviour.
• Developers are increasingly adopting unit testing
and test-driven development.
• The traditional focus is on verification, validation,
and confirmation.
The new focus must be on exploration, discovery,
investigation, and learning.
37. Why Exploratory Approaches?
We don’t have time to waste preparing wastefully elaborate written plans
for complex products built from many parts and interacting with many
systems (many of which we don’t control… or even understand), where
everything is changing over time and there’s so much learning to be done,
and the result, not the plan, is paramount.
Questions About Scripts…
(diagram: arrows and cycles)

• Where do scripts come from?
• What happens when the unexpected happens during a script?
• What do we do with what we learn?
• Will everyone follow the same script the same way?
(task performing)
38. Questions About Exploration…
(diagram: arrows and cycles)

• Where does exploration come from?
• What happens when the unexpected happens during exploration?
• What do we do with what we learn?
• Will everyone explore the same way?
(value seeking)
Exploration is Not Just Action
(diagram: arrows and cycles)

39. You can put them together!
(diagram: arrows and cycles)
40. What Exploratory Testing Is Not
• Touring
  • http://www.developsense.com/blog/2011/12/what-exploratory-testing-is-not-part-1-touring/
• After-Everything-Else Testing
  • http://www.developsense.com/blog/2011/12/what-exploratory-testing-is-not-part-2-after-everything-else-testing/
• Tool-Free Testing
  • http://www.developsense.com/blog/2011/12/what-exploratory-testing-is-not-part-3-tool-free-testing/
• Quick Tests
  • http://www.developsense.com/blog/2011/12/what-exploratory-testing-is-not-part-4-quick-tests/
• Undocumented Testing
  • http://www.developsense.com/blog/2011/12/what-exploratory-testing-is-not-part-5-undocumented-testing/
• “Experienced-Based” Testing
  • http://www.satisfice.com/blog/archives/664
• Defined by any specific example of exploratory testing
  • http://www.satisfice.com/blog/archives/678
Exploratory Testing
The way we practice and teach it, exploratory testing…
• IS NOT “random testing” (or sloppy, or slapdash testing);
IS “ad hoc”, in the dictionary sense, “to the purpose”
• IS NOT “unstructured testing”; IS structured and rigorous
• IS NOT procedurally structured; IS cognitively structured
• IS NOT unteachable; IS highly teachable
• IS NOT unmanageable; IS highly manageable
• IS NOT scripted; IS chartered
• IS NOT a technique; IS an approach
41. Contrasting Approaches
Scripted Testing vs. Exploratory Testing

• Is directed from elsewhere / Is directed from within
• Is determined in advance / Is determined in the moment
• Is about confirmation / Is about investigation
• Is about controlling tests / Is about improving test design
• Emphasizes predictability / Emphasizes adaptability
• Emphasizes decidability / Emphasizes learning
• Like making a speech / Like having a conversation
• Like playing from a score / Like playing in a jam session
Exploratory Testing IS Structured
• Exploratory testing, as we teach it, is a structured process conducted by a
skilled tester, or by lesser-skilled testers or users working under supervision.
• The structure of ET comes from many sources:
  • Test design heuristics
  • Chartering
  • Time boxing
  • Perceived product risks
  • The nature of specific tests
  • The structure of the product being tested
  • The process of learning the product
  • Development activities
  • Constraints and resources afforded by the project
  • The skills, talents, and interests of the tester
  • The overall mission of testing

Not procedurally structured, but cognitively structured.
In other words, it’s not “random”, but systematic.
42. Exploratory Testing IS Structured
In excellent exploratory testing, one structure tends to
dominate all the others:
Exploratory testers construct a compelling story of
their testing. It is this story that
gives ET a backbone.
To test is to compose, edit, narrate, and justify
THREE stories.
A story about the status of the PRODUCT…
…about how it failed, and how it might fail...
…in ways that matter to your various clients.
A story about HOW YOU TESTED it…
…how you configured, operated and observed it…
…about what you haven’t tested, yet…
…and won’t test, at all…
A story about how GOOD that testing was…
…what the risks and costs of testing are…
…what made testing harder or slower…
…how testable (or not) the product is…
…what you need and what
you recommend.
43. What does “taking advantage of resources” mean?
• Mission
• The problem we are here to solve for our customer.
• Information
• Information about the product or project that is needed for testing.
• Developer relations
• How you get along with the programmers.
• Team
• Anyone who will perform or support testing.
• Equipment & tools
• Hardware, software, or documents required to administer testing.
• Schedule
• The sequence, duration, and synchronization of project events.
• Test Items
• The product to be tested.
• Deliverables
• The observable products of the test project.
“Ways to test…”?
General Test Techniques
• Function testing
• Domain testing
• Stress testing
• Flow testing
• Scenario testing
• Claims testing
• User testing
• Risk testing
• Automatic checking
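One of these techniques, domain testing, is easy to sketch: probe the boundaries of an input domain rather than arbitrary interior points. This is my illustration; `is_valid_age` is a hypothetical validator that accepts ages 0 through 130 inclusive.

```python
def is_valid_age(age):
    """Hypothetical product code: accept ages 0..130 inclusive."""
    return 0 <= age <= 130

# Boundary-value cases: each edge of the domain, plus its neighbors.
cases = {
    -1: False,   # just below the lower bound
     0: True,    # the lower bound
     1: True,    # just above it
   129: True,    # just below the upper bound
   130: True,    # the upper bound
   131: False,   # just above it
}
for age, expected in cases.items():
    assert is_valid_age(age) == expected, f"age {age}"

print("boundary cases behave as expected")
```

Off-by-one errors cluster at exactly these edges, which is why six well-chosen values can outperform hundreds of random interior ones.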
44. Cost as a Simplifying Factor
Try quick tests as well as careful tests
A quick test is a cheap test that has some value
but requires little preparation, knowledge,
or time to perform.
• Happy Path
• Tour the Product
• Sample Data
• Variables
• Files
• Complexity
• Menus & Windows
• Keyboard & Mouse
• Interruptions
• Undermining
• Adjustments
• Dog Piling
• Continuous Use
• Feature Interactions
• Click on Help
• Input Constraint Attack
• Click Frenzy
• Shoe Test
• Blink Test
• Error Message Hangover
• Resource Starvation
• Multiple Instances
• Crazy Configs
• Cheap Tools
45. Touring the Product:
Mike Kelly’s FCC CUTS VIDS
• Feature tour
• Complexity tour
• Claims tour
• Configuration tour
• User tour
• Testability tour
• Scenario tour
• Variability tour
• Interoperability tour
• Data tour
• Structure tour
46. Summing Up:
Themes of Rapid Testing
• Put the tester's mind at the center of testing.
• Learn to deal with complexity and ambiguity.
• Learn to tell a compelling testing story.
• Develop testing skills through practice, not just talk.
• Use heuristics to guide and structure your process.
• Replace “check for…” with “look for problems in…”
• Be a service to the project community, not an obstacle.
• Consider cost vs. value in all your testing activity.
• Diversify your team and your tactics.
• Dynamically manage the focus of your work.
• Your context should drive your choices, both of which evolve over time.