Traditional test plans are incompatible with agile software development because we don't know all the details about all the requirements up front. However, in an agile software release, you still must decide what types of testing activities will be required—and when you need to schedule them. Janet Gregory explains how to use the Agile Testing Quadrants, a model identifying the different purposes of testing, to help your team understand your testing needs as you plan the next release. Janet introduces you to alternative, lightweight test planning tools that allow you to plan and communicate your big picture testing needs and risks. Learn how to decide who does what testing—and when. Determine what types of testing to consider when planning an agile release, the infrastructure and environments needed for testing, what goes into an agile “test plan,” how to plan for acquiring test data, and lightweight approaches for documenting your tests and recording test results.
Janet Gregory presented on the mindset change needed for agile testers. She discussed how testers should focus on collaboration, communication, and adding value beyond just finding bugs. Testers need to adopt a mindset of being analytical, curious, and skeptical. Gregory also covered different levels of testing from iteration to release level and techniques like exploratory testing, using examples, and mind mapping. She emphasized the importance of imagination and adapting different testing tools to context.
Behavior Driven Development—A Guide to Agile Practices by Josh Eastman (QA or the Highway)
The document discusses Behavior Driven Development (BDD) and how it can help increase quality and prepare an organization for increased business demands. It describes BDD as an industry practice where the whole team collaborates on system testing and definition of done. BDD promotes requirements using examples, collaboration between roles, finding defects earlier and more often through automation, and keeping technical debt low.
The document discusses holistic testing in DevOps. It emphasizes testing early in collaboration with customers to help prevent defects. It also discusses testing throughout the deployment pipeline from development to production, including testing releases and using monitoring to observe and learn. The goal is to optimize the entire process from concept to delivering value to customers through continuous testing, delivery, learning and improvement.
Successfully Implementing BDD in an Agile World (SmartBear)
This document provides an overview of successfully implementing Behavior Driven Development (BDD) in an agile environment. It discusses shifting testing left by involving testers earlier in the development process. The document then covers the key aspects of a BDD process including discovery workshops to understand requirements, writing examples and scenarios in a Given/When/Then format, automating scenarios, and using continuous integration to ensure tests always pass. It emphasizes that adopting BDD requires changes to people, processes, and tools to facilitate collaboration between all teams.
Quality Jam: BDD, TDD and ATDD for the Enterprise (QASymphony)
During Quality Jam 2016 I had the privilege of presenting with one of QASymphony's earliest customers, BetterCloud, on how methodologies like BDD, TDD and ATDD scale for the enterprise. Adam Satterfield is the VP of Quality Assurance at BetterCloud and has been in QA for many years; he has taught me a lot about Behavior Driven Development, Test Driven Development, and Acceptance Test Driven Development. In the session we share a new way of testing, what Adam and I believe to be the next generation of test development.
We know that there are several ways to do testing, and we are just showing one new way to do it. If this session doesn't inspire action, hopefully it will at least give you and your team something to think about.
TDD involves writing tests before writing code to satisfy requirements. The document discusses TDD, providing:
1. An overview of the TDD process and definitions of its key steps - make a test, make it fail, make it pass.
2. An example walking through writing a test for an "easy button" and implementing the code to pass the test.
3. Reasons for using TDD, including improved code quality, design, discipline, and documentation from maintaining an automated test suite.
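The "easy button" walkthrough itself is only in the slides, but the make-a-test / make-it-fail / make-it-pass cycle described above can be sketched in a few lines of Python. The function name `press` and its message are invented for illustration, not taken from the deck:

```python
import unittest

# Step 1: make a test. Run it before press() exists and it fails
# (NameError), which is the "make it fail" step of the cycle.
class EasyButtonTest(unittest.TestCase):
    def test_press_says_that_was_easy(self):
        self.assertEqual(press(), "That was easy!")

# Step 2: write just enough production code to make the test pass.
def press():
    return "That was easy!"

if __name__ == "__main__":
    unittest.main()
```

Running the test first and watching it fail is what gives the later green bar its meaning: it proves the test can actually detect the missing behavior.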
The document discusses moving from a "gatekeeper" model of testing, where testing is done separately after development, to a "partner" model where testing is integrated into development and shared responsibility of the team. It provides tips for making this transition, such as fixing problems developers experience with testing, integrating testing into development workflows, and helping testers contribute to other parts of development to become true partners. The overall message is that testing is most effective when it is easy to do and an inherent part of the development process done collaboratively by the entire team.
This document provides an overview of SpecFlow, a behavior-driven development (BDD) tool for .NET. It introduces SpecFlow, discusses BDD, and explains how SpecFlow fits into agile practices like test-driven development (TDD) and acceptance test-driven development (ATDD). The document introduces TechTalk, the company behind SpecFlow, demonstrates SpecFlow functionality, and discusses integrations with tools like Visual Studio and build servers. It also previews upcoming SpecFlow sessions at NDC 2011 and takes questions from the audience.
Slides from the session "TDD - That Was Easy!" presented by Fadi Stephan from Kaizenko at AgileDC2019 on September 23, 2019 in Washington DC. A blog post accompanying this talk will be published soon on kaizenko.com
Abstract:
Have you tried TDD? Do you hate it? Do you have a hard time applying it in practice? Do you find it promoting bad design decisions because you must write micro tests instead of looking at the big picture? Are your tests tightly coupled to the implementation due to a lot of mocking making refactoring a pain? Do tons of tests break when a simple change is made? Do you have a hard time justifying all the time spent on writing tests vs. just focusing on development?
You are not alone. Every organization or team that I run into is supposedly Agile. Some are also applying agile engineering practices such as automated unit, integration and acceptance testing. However, many struggle with TDD. TDD is hard, seems counter-intuitive, and requires a lot of investment. Come to this session for a TDD reboot. We will look at the benefits of TDD, discuss the resistance to TDD, and uncover some common difficulties along with misconceptions. We will address these misunderstandings and explore different approaches to making TDD easier. Leave with a fresh perspective and new insights on how to become better at TDD and apply it with ease.
Learn how Acceptance Test Driven Development (ATDD) provides the process for capturing detailed requirements as acceptance criteria and turning them into test cases before development begins, using Behavior Driven Development (BDD). The BDD approach and the Gherkin format are the language used to create easy-to-understand, actionable scenarios that map from the functional level down to components and units. We will discuss the different approaches to TDD, from a pragmatic approach leveraging BDD to a purist standpoint where TDD uses the tests to drive the design of the application. Finally, understand how the tools in Visual Studio and Team Foundation Server support BDD, such as SpecFlow (Cucumber in .NET), refactoring tools, and Test Cases in MTM.
This document discusses Behavior Driven Development (BDD), which is an agile software development methodology that focuses on defining and testing business requirements through executable specifications and acceptance criteria. The document covers the key concepts of BDD, including outside-in development, pull-based planning, and defining behavior through user stories and scenarios. It also discusses how BDD compares to other techniques like test-driven development and finite state machines. The overall goal of BDD is to facilitate collaboration between developers and business stakeholders to build the right product through living documentation of desired behaviors.
Janet Gregory - Agile testing challenges, Knowit 2014 (Knowit Oy)
The document is a presentation by Janet Gregory from DragonFire Inc about agile testing. It discusses challenges with agile testing such as testers not being fully integrated into the team. It provides suggestions for overcoming these challenges, such as automating testing, having testers involved in planning, and emphasizing collaboration across the entire team. The presentation also covers topics like testing throughout development, defining what constitutes finished work, and balancing automation with exploratory testing.
This document discusses how to integrate Scrum and Behavior Driven Development (BDD). It recommends starting with refining the product backlog by splitting user stories, defining acceptance criteria through examples, and collaborating with stakeholders. Examples show how to write specifications using the Given-When-Then format. The document emphasizes starting each sprint by writing automated tests based on the specifications before writing code. This ensures the team builds the right product by focusing on delivering value through small, testable increments.
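The Given-When-Then flow described above—an example agreed with stakeholders during backlog refinement, then turned into an automated test before any production code is written—can be sketched in plain Python. The shopping-cart domain and the `SAVE10` discount rule are invented for illustration:

```python
# Hypothetical specification, written with stakeholders before coding:
#   Given a cart containing one item priced 40.00
#   When the customer applies the discount code "SAVE10"
#   Then the cart total is 36.00

class Cart:
    def __init__(self):
        self.items = []
        self.discount = 0.0

    def add_item(self, price):
        self.items.append(price)

    def apply_code(self, code):
        # Only the behavior the example demands is implemented.
        if code == "SAVE10":
            self.discount = 0.10

    def total(self):
        return round(sum(self.items) * (1 - self.discount), 2)

def test_discount_code():
    cart = Cart()                 # Given
    cart.add_item(40.00)
    cart.apply_code("SAVE10")     # When
    assert cart.total() == 36.00  # Then

test_discount_code()
```

In a real team the scenario text would live in a feature file and be bound to step definitions by a tool such as Cucumber or SpecFlow; the point here is only that each Given/When/Then clause maps directly to a line of the test.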
Agile Software Development and Test Driven Development: Agil8's Dave Putman 3... (agil8 Ltd)
David Putman of agil8’s training and consulting team discussed the anti-patterns observed in organisations introducing technical practices into their Agile software development teams, and how to avoid them.
This presentation was made at agil8’s Community Event for past students, clients, colleagues and agil8 associates on 30 October 2014.
This document discusses Acceptance Test Driven Development (ATDD). It begins by explaining that ATDD focuses on communication, collaboration, and building automated acceptance criteria to develop the right product. It then contrasts ATDD with traditional Test Driven Development, noting that while TDD ensures the code is developed correctly, ATDD ensures the correct product is developed. The document concludes by demonstrating ATDD in action using Cucumber, and discusses some challenges and anti-patterns of adopting ATDD.
Code Review for Teams Too Busy to Review Code - Atlassian Summit 2010 (Atlassian)
This document provides a summary of code review for busy teams. It begins with an introduction to the author and topics to be covered. It then discusses making code review activity and code itself more visible and accessible through tools. It emphasizes the importance of encouraging discussion around code and providing feedback in a constructive manner. The document concludes with some tips for conducting more effective code reviews, such as focusing on problems rather than solutions and embracing feedback to improve.
The document discusses Acceptance Test Driven Development (ATDD), where acceptance tests are used to define requirements and drive the development process. It describes how ATDD works through a cycle of writing examples and tests, implementing features to pass the tests, and ensuring the tests continue to pass as changes are made. The benefits of ATDD include improved collaboration, a shared understanding of requirements, and preventing defects. Various tools that can be used for ATDD are also outlined, including FIT and Robot Framework. Adopting ATDD requires training, evangelism, and addressing organizational challenges through shared understanding.
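Tools like FIT express acceptance tests as tables of customer-supplied examples checked against the system. A minimal sketch of that table-driven idea in plain Python—the shipping-cost rule and all the numbers are hypothetical, not from the document—might look like:

```python
# FIT-style column fixture idea: each row is a customer-agreed example
# of (inputs..., expected output) run against the code under test.
def shipping_cost(weight_kg, express):
    # Hypothetical rule under test: 5.00 base, 1.50 per kg,
    # and express shipping doubles the total.
    cost = 5.00 + 1.50 * weight_kg
    return cost * 2 if express else cost

acceptance_table = [
    # weight_kg, express, expected_cost
    (1, False,  6.50),
    (4, False, 11.00),
    (1, True,  13.00),
]

for weight, express, expected in acceptance_table:
    actual = shipping_cost(weight, express)
    assert actual == expected, f"{weight}kg express={express}: got {actual}"
```

The table format is the collaboration artifact: business people can read, extend, and argue about the rows without reading any test code.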
Behavior Driven Development Pros and Cons (Iosif Itkin)
The Cons of Behavior Driven Development (BDD)
Ivan Bobrov, ClubQA Co-Founder, Kostroma
The Pros of Behavior Driven Development (BDD): Business User Scenarios
Natalia Zaitseva, Exchange Functional Test Automation Lead, Innovative Trading Systems
EXTENT Conference.
October 29-30, 2011
Test Automation for Trading Systems
Renaissance Hotel Moscow
xUnit and TDD: Why and How in Enterprise Software, August 2012 (Justin Gordon)
“A comprehensive suite of JUnit tests is one of the most important aspects of a software project because it reduces bugs, facilitates adding new developers, and enables refactoring and performance tuning with confidence. Test-driven development (TDD) is the best way to build a suite of tests. And the Dependent Object Framework is the best way to test against database objects.” This presentation covers the benefits of TDD along with practical advice on how to implement TDD in complex projects.
Contents:
Behavior Driven Development (BDD)
Features of BDD
BDD Tools
BDD Framework
Examples of Cucumber/SpecFlow/BDD test
Gherkin – BDD Language
The Problem
Example of Gherkin
The Conclusion
SpecFlow Feature File
Keywords for the Feature File creation
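To make the keyword list above concrete, here is a minimal SpecFlow-style feature file showing the common Gherkin keywords (Feature, Background, Scenario, Given/When/Then/And, Scenario Outline, Examples). The login domain, user names, and messages are invented purely for illustration:

```gherkin
Feature: Account login
  As a registered user I want to log in so that I can see my dashboard

  Background:
    Given a registered user "alice" with password "s3cret"

  Scenario: Successful login
    Given I am on the login page
    When I log in as "alice" with password "s3cret"
    Then I should see my dashboard

  Scenario Outline: Rejected login
    When I log in as "<user>" with password "<password>"
    Then I should see the error "<message>"

    Examples:
      | user  | password | message             |
      | alice | wrong    | Invalid credentials |
      | bob   | s3cret   | Unknown user        |
```

Each Given/When/Then line is bound to a step definition (a C# method in SpecFlow, a Ruby or Java method in Cucumber) that automates the scenario.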
Jump Start Agile Testing with Acceptance Test Driven Development (TechWell)
Does your agile team struggle to find the right level of detail prior to beginning development? You may be suffering from “chunky” user stories—those that are too large or insufficiently defined to implement or test efficiently. Acceptance test driven development (ATDD) can help you quickly slice those user stories down to a testable size and provide the necessary detail for your developers to begin coding. Join Susan Brockley as she discusses the difference between agile user stories and traditional requirements, and why both are necessary for effective testing. Through real-world examples, Susan shows you how to apply ATDD to quickly define acceptance criteria that can be coded and tested without the heavy documentation typically associated with traditional requirements. She gives tips on when to use ATDD, its relationship to test driven development, how both can enable simultaneous testing by all team members, and how you can incorporate ATDD into your company’s agile practices.
This document discusses testing approaches in Agile development. It notes that Agile methods require discipline and sustainable practices. Agile teams value continuous testing to ensure continuous progress, with testing seen as a way of life rather than a phase. Shortening feedback loops through automated testing allows teams to detect problems quickly. The document emphasizes that testing moves the project forward by providing ongoing feedback, rather than acting as a gatekeeper. It also highlights practices like keeping code clean, using lightweight documentation, and considering work "done" only after it is implemented and tested.
Usability Test Results Xtext New Project Wizard (Sandra Schering)
A discussion on GitHub concerning the new project wizard of an Xtext project was the origin of a usability test in which three versions of the wizard were compared. The slides show the study design and the usability findings.
This document summarizes a presentation on agile testing practices. It outlines four lessons learned around testing in agile projects, including recognizing problems through symptoms, providing feedback early, automating regression tests, and preventing complacency. For each lesson, symptoms and practical ideas are provided, such as using retrospectives to identify issues, practicing acceptance test driven development, and keeping tests always running green. The importance of whole team collaboration and involvement in testing is emphasized throughout.
TDD vs. ATDD - What, Why, Which, When & Where (Daniel Davis)
This is a slide deck for a discussion about Test Driven Development (TDD) and Acceptance Test Driven Development (ATDD) that starts to explore the differences between them. Get some insight into why we use them and the advantages and disadvantages of both, as well as a better understanding of which should be used where and when. By the end of the session you should be well along the path to TDD vs. ATDD enlightenment.
This document discusses test-driven development (TDD), behavior-driven development (BDD), and acceptance test-driven development (ATDD). It explains that while they have different names, they all share the same core idea of using examples from business requirements to create automated tests. The document provides examples of how to write tests before having a user interface, and recommends abstracting from the GUI to focus on business logic. It also lists some popular tools that can be used for ATDD, BDD, and TDD.
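The recommendation above—abstract away from the GUI and test the business logic directly—can be sketched in Python. The funds-transfer rules here are a hypothetical example, not taken from the document:

```python
# The business rules live in a plain class with no UI dependency;
# the (future) GUI will be a thin layer that calls into it. Tests can
# therefore be written before any screen, button, or form exists.
class FundsTransfer:
    def __init__(self, balance):
        self.balance = balance

    def transfer(self, amount):
        if amount <= 0:
            raise ValueError("amount must be positive")
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount
        return self.balance

# The example from the business requirement drives the test directly.
account = FundsTransfer(balance=100)
assert account.transfer(30) == 70
```

Because the test exercises the domain object rather than the screen, it stays fast, stable, and independent of whichever UI framework is eventually chosen.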
This presentation is simply for motivating developers toward test automation and test-driven development. It also lightly covers unit testing, mocking, and integration testing.
The nature of exploration, coupled with the ability of testers to rapidly apply their skills and experience, make exploratory testing a widely used test approach—especially when time is short. Unfortunately, exploratory testing often is dismissed by project managers who assume that it is not reproducible, measurable, or accountable. If you have these concerns, you may find a solution in a technique called session-based test management (SBTM), developed by Jon Bach and his brother James to specifically address these issues. In SBTM, testers are assigned areas of a product to explore, and testing is time boxed in “sessions” that have mission statements called “charters” to create a meaningful and countable unit of work. Jon discusses—and you practice—the skills of exploration using the SBTM approach. He demonstrates a freely available, open source tool to help manage your exploration and prepares you to implement SBTM in your test organization.
Tune Agile Test Strategies to Project and Product Maturity (TechWell)
For optimum results, you need to tune an agile project's test strategies to fit the different stages of project and product maturity. Testing tasks and activities should be lean enough to avoid unnecessary bottlenecks and robust enough to meet your testing goals. Exploring what "quality" means for various stakeholder groups, Anna Royzman describes testing methods and styles that fit best along the maturity continuum. Anna shares her insights on strategic ways to use test automation, when and how to leverage exploratory testing as a team activity, ways to prepare for live pilots and demos of the real product, approaches to refine test coverage based on customer feedback, and techniques for designing a production "safety net" suite of automated tests. Leave with a better understanding of how to satisfy your stakeholders' needs for quality, and a roadmap for tuning your agile test strategies.
Slides from the session "TDD - That Was Easy!" presented by Fadi Stephan from Kaizenko at AgileDC2019 on September 23, 2019 in Washington DC. A blog post accompanying this talk will be published soon on kaizenko.com
Abstract:
Have you tried TDD? Do you hate it? Do you have a hard time applying it in practice? Do you find it promoting bad design decisions because you must write micro tests instead of looking at the big picture? Are your tests tightly coupled to the implementation due to a lot of mocking making refactoring a pain? Do tons of tests break when a simple change is made? Do you have a hard time justifying all the time spent on writing tests vs. just focusing on development?
You are not alone. Every organization or team that I run into is supposedly Agile. Some are also applying agile engineering practices such as automated unit, integration and acceptance testing, etc… However, many struggle with TDD. TDD is hard, seems counter-intuitive and requires a lot of investment. Come to this session for a TDD reboot. We will look at the benefits of TDD, discuss the resistance to TDD and uncover some common difficulties along with misconceptions. We will address these misunderstandings and explore different approaches to making TDD easier. Leave with a fresh perspective and new insights on how to become better at TDD and apply it with ease
Learn how Acceptance Test Driven Development (ATDD) provides the process for capturing detailed requirements as acceptance criteria and turn them into as test cases before development begins using Behavior Driven Development (BDD). The BDD approach and Gherkin format is the language used to create easy to understand and actionable scenarios that map from the functional level to the components and units. We will discuss the different approaches to TDD including a realistic approach leveraging BDD to a purest standpoint where TDD use the tests to drive the design of the application. Finally understand how the tools in Visual Studio and Team foundation Server to support BDD such as SpecFlow (Cucumber in .NET), Refactoring tools, and Test Cases in MTM.
This document discusses Behavior Driven Development (BDD), which is an agile software development methodology that focuses on defining and testing business requirements through executable specifications and acceptance criteria. The document covers the key concepts of BDD, including outside-in development, pull-based planning, and defining behavior through user stories and scenarios. It also discusses how BDD compares to other techniques like test-driven development and finite state machines. The overall goal of BDD is to facilitate collaboration between developers and business stakeholders to build the right product through living documentation of desired behaviors.
Janet Gregory - Agile testing challenges Knowit 2014Knowit Oy
The document is a presentation by Janet Gregory from DragonFire Inc about agile testing. It discusses challenges with agile testing such as testers not being fully integrated into the team. It provides suggestions for overcoming these challenges, such as automating testing, having testers involved in planning, and emphasizing collaboration across the entire team. The presentation also covers topics like testing throughout development, defining what constitutes finished work, and balancing automation with exploratory testing.
This document discusses how to integrate Scrum and Behavior Driven Development (BDD). It recommends starting with refining the product backlog by splitting user stories, defining acceptance criteria through examples, and collaborating with stakeholders. Examples show how to write specifications using the Given-When-Then format. The document emphasizes starting each sprint by writing automated tests based on the specifications before writing code. This ensures the team builds the right product by focusing on delivering value through small, testable increments.
Agile Software Development and Test Driven Development: Agil8's Dave Putman 3...agil8 Ltd
David Putman of agil8’s training and consulting team discussed the anti-patterns observed in organisations introducing technical practices into their Agile software development teams, and how to avoid them.
This presentation was made at agil8’s Community Event for past students, clients, colleagues and agil8 associates on 30 October 2014.
This document discusses Acceptance Test Driven Development (ATDD). It begins by explaining that ATDD focuses on communication, collaboration, and building automated acceptance criteria to develop the right product. It then contrasts ATDD with traditional Test Driven Development, noting that while TDD ensures the code is developed correctly, ATDD ensures the correct product is developed. The document concludes by demonstrating ATDD in action using Cucumber, and discusses some challenges and anti-patterns of adopting ATDD.
Code Review for Teams Too Busy to Review Code - Atlassian Summit 2010Atlassian
This document provides a summary of code review for busy teams. It begins with an introduction to the author and topics to be covered. It then discusses making code review activity and code itself more visible and accessible through tools. It emphasizes the importance of encouraging discussion around code and providing feedback in a constructive manner. The document concludes with some tips for conducting more effective code reviews, such as focusing on problems rather than solutions and embracing feedback to improve.
The document discusses Acceptance Test Driven Development (ATDD), where acceptance tests are used to define requirements and drive the development process. It describes how ATDD works through a cycle of writing examples and tests, implementing features to pass the tests, and ensuring the tests continue to pass as changes are made. The benefits of ATDD include improved collaboration, a shared understanding of requirements, and preventing defects. Various tools that can be used for ATDD are also outlined, including FIT and Robot Framework. Adopting ATDD requires training, evangelism, and addressing organizational challenges through shared understanding.
Behavior Driven Development Pros and ConsIosif Itkin
The Cons of Behavior Driven Development (BDD)
Ivan Bobrov, ClubQA Co-Founder, Kostroma
The Pros of Behavior Driven Development (BDD): Business User Scenarios
Natalia Zaitseva, Exchange Functional Test Automation Lead Innovative Trading Systems
EXTENT Conference.
October 29-30, 2011
Test Automation for Trading Systems
Renaissance Hotel Moscow
xUnit and TDD: Why and How in Enterprise Software, August 2012Justin Gordon
“A comprehensive suite of JUnit tests is one of the most import aspects of a software project because it reduces bugs, facilitates adding new developers, and enables refactoring and performance tuning with confidence. Test-driven development (TDD) is the best way to build a suite of tests. And the Dependent Object Framework is the best way to test against database objects.” This presentation covers the benefits of TDD along with practical advice on how to implement TDD in complex projects.
Contents:
Behavior Driven Development (BDD)
Features of BDD
BDD Tools
BDD Framework
Examples of Cucumber/SpecFlow/BDD test
Gherkin – BDD Language
The Problem
Example of Gherkin
The Conclusion
SpecFlow Feature File
Keywords for the Feature File creation
Jump Start Agile Testing with Acceptance Test Driven DevelopmentTechWell
Does your agile team struggle to find the right level of detail prior to beginning development? You may be suffering from “chunky” user stories—those that are too large or insufficiently defined to implement or test efficiently. Acceptance test driven development (ATDD) can help you quickly slice those user stories down to a testable size and provide the necessary detail for your developers to begin coding. Join Susan Brockley as she discusses the difference between agile user stories and traditional requirements, and why both are necessary for effective testing. Through real-world examples, Susan shows you how to apply ATDD to quickly define acceptance criteria that can be coded and tested without the heavy documentation typically associated with traditional requirements. She gives tips on when to use ATDD, its relationship to test driven development, how both can enable simultaneous testing by all team members, and how you can incorporate ATDD into your company’s agile practices.
This document discusses testing approaches in Agile development. It notes that Agile methods require discipline and sustainable practices. Agile teams value continuous testing to ensure continuous progress, with testing seen as a way of life rather than a phase. Shortening feedback loops through automated testing allows teams to detect problems quickly. The document emphasizes that testing moves the project forward by providing ongoing feedback, rather than acting as a gatekeeper. It also highlights practices like keeping code clean, using lightweight documentation, and considering work "done" only after it is implemented and tested.
Usability Test Results Xtext New Project Wizard – Sandra Schering
A discussion in GitHub concerning the new project wizard of an Xtext project was the origin for a usability test in which three versions of the wizard were compared. The slides show the study design and the usability findings.
This document summarizes a presentation on agile testing practices. It outlines four lessons learned around testing in agile projects, including recognizing problems through symptoms, providing feedback early, automating regression tests, and preventing complacency. For each lesson, symptoms and practical ideas are provided, such as using retrospectives to identify issues, practicing acceptance test driven development, and keeping tests always running green. The importance of whole team collaboration and involvement in testing is emphasized throughout.
TDD vs. ATDD - What, Why, Which, When & Where – Daniel Davis
This is a slide deck for a discussion about Test Driven Development (TDD) and Acceptance Test Driven Development (ATDD) and starting to explore the differences between them. Get some insight into why we use them and the advantages and disadvantages of both, as well as get a better understanding of which should be used where and when. By the end of the session you should be well along the path to TDD vs. ATDD enlightenment.
This document discusses test-driven development (TDD), behavior-driven development (BDD), and acceptance test-driven development (ATDD). It explains that while they have different names, they all share the same core idea of using examples from business requirements to create automated tests. The document provides examples of how to write tests before having a user interface, and recommends abstracting from the GUI to focus on business logic. It also lists some popular tools that can be used for ATDD, BDD, and TDD.
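The abstraction the summary recommends, testing business rules directly rather than through the GUI, can be sketched as follows. The shipping-cost rule is a made-up example, not one from the document:

```python
# Business rule kept free of any UI code, so it can be tested directly.
def shipping_cost(order_total, express=False):
    """Standard shipping is free at or above 50; otherwise a flat 5. Express is always 15."""
    if express:
        return 15.0
    return 0.0 if order_total >= 50.0 else 5.0

# Example-based tests expressed in business terms; no GUI driver is needed.
examples = [
    # (order_total, express, expected cost)
    (60.0, False, 0.0),   # free-shipping threshold reached
    (40.0, False, 5.0),   # below threshold: flat standard fee
    (60.0, True, 15.0),   # express ignores the threshold
]
for order_total, express, expected in examples:
    assert shipping_cost(order_total, express) == expected
```

Because the examples name only business concepts (order total, express, cost), the same table could drive an ATDD, BDD, or plain unit-test implementation without change.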
This presentation is simply for motivating developers towards test automation and test-driven development. It discusses lightly unit testing, mocking and integration testing, too.
The nature of exploration, coupled with the ability of testers to rapidly apply their skills and experience, make exploratory testing a widely used test approach—especially when time is short. Unfortunately, exploratory testing often is dismissed by project managers who assume that it is not reproducible, measurable, or accountable. If you have these concerns, you may find a solution in a technique called session-based test management (SBTM), developed by Jon Bach and his brother James to specifically address these issues. In SBTM, testers are assigned areas of a product to explore, and testing is time boxed in “sessions” that have mission statements called “charters” to create a meaningful and countable unit of work. Jon discusses—and you practice—the skills of exploration using the SBTM approach. He demonstrates a freely available, open source tool to help manage your exploration and prepares you to implement SBTM in your test organization.
Tune Agile Test Strategies to Project and Product Maturity – TechWell
For optimum results, you need to tune an agile project's test strategies to fit the different stages of project and product maturity. Testing tasks and activities should be lean enough to avoid unnecessary bottlenecks and robust enough to meet your testing goals. Exploring what "quality" means for various stakeholder groups, Anna Royzman describes testing methods and styles that fit best along the maturity continuum. Anna shares her insights on strategic ways to use test automation, when and how to leverage exploratory testing as a team activity, ways to prepare for live pilots and demos of the real product, approaches to refine test coverage based on customer feedback, and techniques for designing a production "safety net" suite of automated tests. Leave with a better understanding of how to satisfy your stakeholders’ needs for quality – and a roadmap for tuning your agile test strategies.
New Testing Standards Are on the Horizon: What Will Be Their Impact? – TechWell
The history of testing standards has not always been auspicious. Testing standards documents have been expensive to obtain, limited in scope, inflexible in expectations, and inconsistent. However, they contain important lessons learned from experienced practitioners—if a tester is willing to overcome the obstacles to get to the useful information. A set of new international standards is coming. These new standards are tailorable, consistent, and comprehensive in scope. In addition, they will be freely available (some are already). Claire Lohr provides a complete roadmap to all of the available—or soon-to-be-available—testing-related standards. Learn where to go for testing process guidelines, complete definitions of all test design techniques, full examples of test documentation (for both agile and traditional projects), and free international standards documents. Take away a “start-up guide” for how different types of projects can use the new standards along with valuable tips and practical lessons you can get from these standards.
Keynote: Lean Software Delivery: Synchronizing Cadence with Context – TechWell
Daily, we are told that adopting agile, PaaS, DevOps, crowdsourced testing, or any of the myriad of current buzzwords will help us deliver better software faster. However, for the majority of software development organizations, naïve agile transformations that don’t look beyond the needs of developers will fail to produce the promised results. Mik Kersten says that instead of focusing on development alone to transform our software delivery, we must acknowledge the different contexts and mismatched cadences that define the work of business analysts, developers, testers, and project managers. For example, a developer working in an agile team may deliver code every two weeks, but the performance testing group may need more time for its work, while the operations group has a planned release cycle of once per quarter. To achieve optimum flow, which is the goal of end-to-end lean delivery, we must identify the different cadences of each group and interconnect the collaborators and their work—requirements, development, testing, and deployment.
Build Your Own Performance Test Lab in the Cloud – TechWell
Many cloud-based performance and load testing tools claim to offer “cost-effective, flexible, pay-as-you-go pricing.” However, the reality is often neither cost-effective nor flexible. With many vendors, you will be charged whether or not you use the time (not cost effective), and you must pre-schedule test time (not always when you want and not always flexible). In addition, many roadblocks are thrown up—from locked-down environments that make it impossible to load test anything other than straight-forward applications, to firewall, security, and IP spoofing issues. Join Leslie Segal to discover when it makes sense to set up your own cloud-based performance test lab, either as a stand-alone or as a supplement to your current lab. Learn about the differences in licensing tools, running load generators on virtual machines, the real costs, and data about various cloud providers. Take home a road map for setting up your own performance test lab—in less than twenty-four hours.
Pay Now or Pay More Every Day: Reduce Technical Debt Now! – TechWell
Is your team missing delivery dates? Is your velocity inconsistent from sprint to sprint? Are customers complaining about defects or the time it takes to add new features? These are signs that you are mired in technical debt – a metaphor that describes the long-term costs of doing something in a quick and dirty way and not going back to clean up the mess. Fadi Stephan shares a technical debt management approach to help you make prudent decisions on how much effort to invest in reducing technical debt. Discover ways to measure the quality of your current code base and determine the cost of eventual rework hanging over your system. Learn how to engage executives and get buy-in on a debt removal plan that will improve system design, increase the quality of your code, and return your team to high productivity. If you are burdened with technical debt, the choice is to pay now or continue paying more every day – forever.
Introducing Mobile Testing to Your Organization – TechWell
Mobile is an integral part of our daily lives, and if it’s not already part of your business model, it soon will be. When that happens, will you be ready to tackle the demands of testing web and native mobile apps? From the perspective of a test lead, Eric Montgomery describes the challenges Progressive Insurance, a company with a strong web presence, recently faced—learning new technologies, transforming the approach of testers from PC-based to mobile-based, and working with testing tools in a market that has yet to see a definitive leader emerge. Learn from Eric's experiences and return to your job with ideas on training web testers to be mobile testers. Take back proven techniques for testing mobile devices, ways of choosing devices for test, methods of sharing information, developing a sense of community among testers, choosing tools from the available market, and keeping up with rapid technology changes.
Better Security Testing: Using the Cloud and Continuous Delivery – TechWell
Even though many organizations claim that security is a priority, that claim doesn’t always translate into supporting security initiatives in software development or test. Security code reviews often are overlooked or avoided, and when development schedules fall behind, security testing may be dropped to help the team “catch up.” Everyone wants more secure development; they just don’t want to spend time or money to get it. Gene Gotimer describes his experiences with implementing a continuous delivery process in the cloud and how he integrated security testing into that process. Gene discusses how to take advantage of the automated provisioning and automated deploys already being implemented to give more opportunities along the way for security testing without schedule disruption. Learn how you can incrementally mature a practice to build security into the process—without a large-scale, time-consuming, or costly effort.
In today’s competitive world, more and more HTML5 applications are being developed for mobile and desktop platforms. Spotify has partnered with world-renowned organizations to create high quality apps to enrich the user experience. Testing a single application within a few months can be a challenge. But it's a totally different beast to test multiple world-class music discovery apps every week. Alexander Andelkovic shares insights into the challenges they face coordinating all aspects of app testing to meet their stringent testing requirements. Alexander describes an agile way to use the Kanban process to help out. He shares lessons learned including the need for management of acceptable levels of quality, support, smoke tests, and development guidelines. If you are thinking of starting agile app development or want to streamline your current app development process, Alexander’s experience gives you an excellent starting point.
Testers have been taught they are responsible for all testing. Some even say “It’s not tested until I run the product myself.” Eric Jacobson thinks this old school way of thinking can hurt a tester’s reputation and—even worse—may threaten team success. Learning to recognize opportunities where you may NOT have to test can eliminate bottlenecks and make you everyone’s favorite tester. Eric shares eight patterns from his personal experiences where not testing was the best approach. Examples include patches for critical production problems that can’t get worse, features that are too technical for the tester, cosmetic bug fixes with substantial test setup, and more. Challenge your natural testing assumptions. Become more comfortable with approaches that don’t require testing. Eliminate waste in your testing process by asking, “Does this need to be tested? By me?” Take back ideas to manage not testing including using lightweight documentation for justification. Not testing may actually be a means to better testing.
Specification-by-Example: A Cucumber Implementation – TechWell
We've all been there. You work incredibly hard to develop a feature and design tests based on written requirements. You build a detailed test plan that aligns the tests with the software and the documented business needs. When you put the tests to the software, it all falls apart because the requirements were updated without informing everyone. But help is at hand. Enter business-driven development and Cucumber, a tool for running automated acceptance tests. Join Mary Thorn as she explores the nuances of Cucumber and shows you how to implement specification-by-example, behavior-driven development, and agile acceptance testing. By fostering collaboration for implementing active requirements via a common language and format, Cucumber bridges the communication gap between business stakeholders and implementation teams. If you experience developers not coding to requirements, testers not getting requirements updates, or customers who feel out of the loop and don't get what they ask for, be here!
Stakeholders always want to release when they think we’ve finished testing. They believe we have discovered “all of the important problems” and “verified all of the fixes”—and now it’s time to reap the rewards. However, as testers we still can assist in improving software by learning about problems after code has rolled live—especially if it’s a website. Jon Bach explores why and how at eBay they have a post-ship site quality mindset in which testers continue to learn from live A/B testing, operational issues, customer sentiment analysis, discussion forums, and customer call patterns—just to name a few. Jon explains what eBay’s Live Site Quality team learns every day about what they just released to production. Take away new ideas on what you can do to test and improve value—even after you’ve shipped.
It Seemed a Good Idea at the Time: Intelligent Mistakes in Test Automation – TechWell
Some test automation ideas seem very sensible at first glance but contain pitfalls and problems that can and should be avoided. Dot Graham describes five of these “intelligent mistakes”—1. Automated tests will find more bugs quicker. (Automation doesn’t find bugs, tests do.) 2. Spending a lot on a tool must guarantee great benefits. (Good automation does not come “out of the box” and is not automatic.) 3. Let’s automate all of our manual tests. (This may not give you better or faster testing, and you will miss out on some benefits.) 4. Tools are expensive so we have to show a return on investment. (This is not only surprisingly difficult but may actually be harmful.) 5. Because they are called “testing tools,” they must be tools for testers to use. (Making testers become test automators may be damaging to both testing and automation.) Join Dot for a rousing discussion of “intelligent mistakes”—so you can be smart enough to avoid them.
Throughout the years, Lightning Talks have been a popular part of the STAR conferences. If you’re not familiar with the concept, Lightning Talks consist of a series of five-minute talks by different speakers within one presentation period. Lightning Talks are the opportunity for speakers to deliver their single biggest bang-for-the-buck idea in a rapid-fire presentation. And now, lightning has struck the STAR keynotes. Some of the best-known experts in testing—James Bach, Jon Bach, Michael Bolton, Jennifer Bonine, Hans Buwalda, Bob Galen, John Fodeh, Dawn Haynes, Geoff Horne, and Griffin Jones—will step up to the podium and give you their best shot of lightning. Get ten keynote presentations for the price of one—and have some fun at the same time.
Innovations in Test Automation: It’s Not All about Regression – TechWell
Although classic test automation, which usually focuses on regression testing, has its place in testing, there is much more you can do to improve testing productivity and its value to the project and your organization. Through experience-based examples, video clips, and demonstrations, John Fodeh shares one company’s innovation journey to improve its test automation practice. John illustrates how they learned to apply automated “test monkeys” that explore the software in new ways each time a test is executed. Then, he describes how the test team uses weighted probability tables to increase each test’s “intelligence” factor. Find out how they implemented model-based testing to improve automation effectiveness and how this practice led to the even more valuable behavior-driven testing approach they employ today. With these and other alternative approaches you, too, can get more mileage from your automation efforts. Join John to get inspired and start your own journey of innovation with new ideas that enhance your test automation strategy.
Exploratory testing is an approach to testing that emphasizes the freedom and responsibility of testers to continually optimize the value of their work. It is the process of three mutually supportive activities done in parallel: learning, test design, and test execution. With skill and practice, exploratory testers typically uncover an order of magnitude more problems than when the same amount of effort is spent on procedurally scripted testing. All testers conduct exploratory testing in one way or another, but few know how to do it systematically to obtain the greatest benefits. Even fewer can articulate the process. Jon Bach looks at specific heuristics and techniques of exploratory testing that will help you get the most from this highly productive approach. Jon focuses on the skills and dynamics of exploratory testing, and how it can be combined with scripted approaches.
On traditional projects, testers usually join the project after coding has started, or even later when coding is almost finished. Testers have no role in advising the project team early regarding quality issues but focus only on finding defects. They become accustomed to this style of working and adjust their mental processes accordingly. In agile, testers must collaborate closely with customers and programmers throughout the development lifecycle, where their focus changes from finding defects to preventing them. Janet Gregory shares ways to change the tester’s mindset from “How can I break the software?” to “How can I help deliver excellent software?”—a critical mental shift on agile projects. Another facet of the mind-set change is learning how to test early and incrementally. Janet uses interactive exercises and examples to help you understand how effective this mindset change is—and how you can apply it on your agile projects.
The document is a presentation on distributed agile testing given by Janet Gregory. It discusses challenges with distributed teams such as communication difficulties due to time zone differences and lack of face-to-face interaction. It provides strategies for effective collaboration and communication when teams are distributed, including using video conferencing, pairing remotely, and integrating testing with development. The presentation emphasizes the importance of experimentation and adapting practices to overcome issues unique to distributed teams.
In her book Agile Testing: A Practical Guide for Testers and Agile Teams, Janet Gregory recommends using the automation pyramid as a model for test coverage. In the pyramid model, most automated tests are unit tests written and maintained by the programmers, and tests that execute below the user interface—API-level tests that can be developed and maintained collaboratively by programmers and testers. However, as agile becomes mainstream, some circumstances may challenge this model. Many applications are transferring logic back to the client side by using programming languages such as JavaScript. Legacy systems, using languages such as COBOL, don’t have access to unit testing frameworks. Janet shows how to adapt the model to your needs and addresses some of these automation issues. During the session, delegates are encouraged to share their challenges and success stories.
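The pyramid's two lower layers can be sketched with a hypothetical `CartService` (invented here for illustration): a unit test pins down one rule on one object, while an API-level test below the user interface drives the same workflow a screen would, without a browser:

```python
# A hypothetical service layer sitting below the user interface.
class CartService:
    def __init__(self):
        self._items = {}

    def add(self, sku, qty=1):
        if qty < 1:
            raise ValueError("quantity must be positive")
        self._items[sku] = self._items.get(sku, 0) + qty

    def count(self):
        return sum(self._items.values())

# Unit-level test (pyramid base): one rule, one object.
def test_rejects_non_positive_quantity():
    cart = CartService()
    try:
        cart.add("SKU-1", qty=0)
        assert False, "expected ValueError"
    except ValueError:
        pass

# API-level test (pyramid middle): the same workflow the UI would drive,
# exercised directly against the service layer.
def test_adding_same_sku_twice_accumulates():
    cart = CartService()
    cart.add("SKU-1")
    cart.add("SKU-1", qty=2)
    assert cart.count() == 3

test_rejects_non_positive_quantity()
test_adding_same_sku_twice_accumulates()
```

When the logic instead lives in client-side JavaScript or in a COBOL backend without a unit-testing framework, this middle layer is exactly where the model needs the adaptation Janet describes.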
Janet Gregory presents Current Testing Challenges with SoftTest Ireland – David O'Dowd
The document discusses challenges with testing agile software development and proposes collaborative solutions. It covers topics like test automation strategies, different levels to automate tests, roles of testers and developers in testing, and challenges with new technologies. The document aims to start a discussion on how testers and developers can better work together to improve testing in agile projects.
Tired of doing upfront test script creation in your testing efforts? Feeling bad for demotivating your testers? Want something to replace this sickening approach to software testing? This presentation outlines why test scripts are not useful, and how test ideas are the new way forward to better testing. Coverage, traceability, reporting, automation and skills are all covered. Take a quick look and see if you can see there is another way to do software testing that is actually pure common sense.
Patterns for Collaboration: Toward Whole-Team Quality – TechWell
A lot of talk goes on in agile about how collaboration among team members helps drive a shared responsibility for quality—and more. However, most teams don't do much more than just hold stand-up meetings and have programmers and testers sit together. Although these practices improve communications, they are not collaboration! Most teams simply don't understand how to collaborate. Janet Gregory and Matt Barcomb guide you through hands-on activities that illustrate collaboration patterns for programmers and testers, working together. They briefly review the acceptance test-driven development process, then illustrate what programmers should know about testing—and what testers should know about programming—to effectively create whole-team quality. Janet and Matt conclude with visual management techniques for joint quality activities and discuss the shift in the product owner role regarding release quality. Leave with new ideas about collaboration to take back to your organization and make whole-team responsibility for quality a reality.
This document summarizes Janet Gregory's work promoting agile testing practices. It notes that she has been involved with agile teams since 2000 and has authored books and online courses on agile testing. The document discusses how testing should be a shared responsibility of the whole team. It emphasizes that testing provides feedback to improve quality, not just find bugs, and explains practices like examples, acceptance test-driven development, and exploratory testing that involve the whole team in testing activities.
EuroSTAR Software Testing Conference 2013 presentation on Readable, Executable Requirements: Hands-On by Emily Bache.
See more at: http://conference.eurostarsoftwaretesting.com/past-presentations/
Do testers have to code... to be useful? Janet Gregory and Lisa Crispin plena... – lisacrispin
This document discusses whether testers need to code to be useful on agile teams. It notes that while development teams already have coders, testers can add unique value through specialized testing skills like exploratory testing and expertise in areas like security, performance, and user experience testing. The document advocates that testers develop technical awareness of tools and concepts used by the team, but that coding skills are not absolutely required as long as testers have strong thinking and collaboration skills. Testers should focus on competencies over prescribed roles and find ways to add value through skills like eliciting examples and communicating effectively with all team members.
Create Software Design with unit testing, build user experience with UX testing, check definition of done with functional testing – all these are my day-to-day activities. Indeed, I am a developer who has found the value of testing to deliver quality software.
In this presentation I share with you how I have come to use tests for: understanding the features, choosing the best user experience design, choosing the best technical solution, implementing the features and testing them to create a reliable system.
You will see practical examples of how tools like Jasmine, Spock, Geb are used for the above types of tests. You will see a project with test code and we will discuss how testing can effectively enhance your professional performance.
How do you know if you have too much process, too little, or just the right amount? If you ignore process completely, unpredictability and chaos can follow. If you define the process to the nth degree and follow it religiously, the work grinds to a halt. Janet Gregory shares her experiences about how to find the tastiest balance of process and creativity for your projects and organization. She proposes that a formally defined process is sometimes necessary, but that it should be the exception. Explore with Janet the many variables—team size, complexity, criticality, organization structure, and culture—you must assess to find just the right balance. Learn how to make existing processes better by adding visibility to the process, getting team members’ input, and adapting documentation you need. Learn how to transform complicated processes into simpler ones—such as reporting a simple “thumbs up” or “thumbs down”—and go home with new tools to sprinkle on just enough process.
The document discusses using unmoderated remote testing to conduct user research. It outlines benefits like being fast to set up, allowing frequent testing, and saving time on conducting sessions and analyzing data. Key aspects for effective remote testing are using realistic scenarios without a moderator's guidance, asking good specific questions, and ensuring mockups are user-ready. The document emphasizes treating remote testing as collaborative work across teams to get stakeholder buy-in and input on goals, flows, questions, and prototypes. Intentional research is about having a strategy beyond just fixing problems, exploring solutions, and evaluating concepts in full contexts.
The document summarizes an advanced agile testing workshop hosted by Lisa Crispin. The workshop aims to be collaborative and help attendees solve testing problems through experiments and discussions on topics like impact mapping, testing quadrants, skills development, tool selection, technical debt, and test automation. Attendees will identify their biggest testing challenges, prioritize them, and brainstorm experiments to address high priority problems through techniques like impact mapping and story mapping. The workshop provides resources and examples to facilitate these discussions.
Are You Building the Right Thing? - Janet G @ CMBAgileConf 2016 – ColomboCampsCommunity
In Agile Development, we focus on building in quality and preventing defects, rather than finding defects at the tail-end of a development cycle. Focusing our testing efforts on the things customers care about most is key. In this presentation, Janet explains how Testers and Agile teams can ensure they are building the right thing.
The document discusses strategies for creating engaging sprint reviews that attract stakeholders and enlighten customers. It describes iContact's journey to improve their reviews, including establishing ownership with the Product Owner, focusing on demonstrations of working software over PowerPoints, preparing the whole team, and ensuring reviews provided value to customers. Key lessons included laser focusing on customers, embracing feedback, and making every demonstration, including failures, count. The ultimate goal was to generate excitement about upcoming releases.
Agile Testing: Learning Journeys for the Whole Team - Janet G @ CMBAgileConf ... – ColomboCampsCommunity
When Agile Development first gained popularity, Agile meant collocated teams, including testers, programmers, analysts, and customers who were expected to perform many functions. As Agile methods have spread and expanded, many organizations with globally-distributed teams are facing challenges with their Agile deployment.
Janet shares her experiences and lessons learned that show how testing activities can help develop open communication, and share data and information within a team or across continents. These ideas can help create the common understanding for what the team is building so the team can work together – with fun.
We've been discussing software craftsmanship for years. But does it match the realities of a business? How?
This is a story about applying the ideas and practices of software craftsmanship for a real project in a difficult context. The conclusion? It helped us; it might help you as well.
STARCANADA 2013 Keynote: Lightning Strikes the Keynotes – TechWell
Throughout the years, Lightning Talks have been a popular part of the STAR conferences. If you’re not familiar with the concept, Lightning Talks consist of a series of five-minute talks by different speakers within one presentation period. Lightning Talks are the opportunity for speakers to deliver their single biggest bang-for-the-buck idea in a rapid-fire presentation. And now, lightning has struck the STAR keynotes. Some of the best-known experts in testing—Jon Bach, Michael Bolton, Fiona Charles, Janet Gregory, Paul Holland, Griffin Jones, Keith Klain, Gerard Meszaros, and Nate Oster—will step up to the podium and give you their best shot of lightning. Get ten keynote presentations for the price of one—and have some fun at the same time.
Isabel Evans stopped drawing and painting after being told she was not very good at it, which led to a loss of confidence in her creative and professional abilities. However, she realized that attempting creative activities is important for cognitive and emotional development, and that making mistakes and learning from failures allows for growth. By reengaging with failure through art and with support from others, Isabel was able to regain confidence in her abilities and reboot her career. The document discusses different perspectives on failure and the importance of learning from mistakes.
Instill a DevOps Testing Culture in Your Team and Organization – TechWell
The DevOps movement is here. Companies across many industries are breaking down siloed IT departments and federating them into product development teams. Testing and its practices are at the heart of these changes. Traditionally, IT organizations have been staffed with mostly manual testers and a limited number of automation and performance engineers. To keep pace with development in the new “you build it, you own it” environment, testing teams and individuals must develop new technical skills and even embrace coding to stay relevant and add greater value to the business. DevOps really starts with testing. Join Adam Auerbach as he explains what DevOps is and how it relates to testing. He describes how testing must change from top to bottom and how to access your own environment to identify improvement opportunities. Adam dives into practices like service virtualization, test data management, and continuous testing so you can understand where you are now and identify steps needed to instill a DevOps testing culture in your team and organization.
Test Design for Fully Automated Build Architecture – TechWell
This document summarizes a half-day tutorial on test design for fully automated build architectures presented by Melissa Benua of mParticle at STAREAST 2018. The tutorial covered guiding principles for test design including prioritizing important and reliable tests, structuring automated pipelines around components, packages, and releases, and monitoring test results through code coverage, flaky test handling, and logging versus counters. It also included exercises mapping test cases to functional boundaries and categories of tests to pipeline stages.
System-Level Test Automation: Ensuring a Good Start (TechWell)
Many organizations invest a lot of effort in test automation at the system level but then have serious problems later on. As a leader, how can you ensure that your new automation efforts will get off to a good start? What can you do to ensure that your automation work provides continuing value? This tutorial covers both “theory” and “practice”. Dot Graham explains the critical issues for getting a good start, and Chris Loder describes his experiences in getting good automation started at a number of companies. The tutorial covers the most important management issues you must address for test automation success, particularly when you are new to automation, and how to choose the best approaches for your organization—no matter which automation tools you use. Focusing on system level testing, Dot and Chris explain how automation affects staffing, who should be responsible for which automation tasks, how managers can best support automation efforts to promote success, what you can realistically expect in benefits and how to report them. They explain—for non-techies—the key technical issues that can make or break your automation effort. Come away with your own clarified automation objectives, and a draft test automation strategy to use to plan your own system-level test automation.
Build Your Mobile App Quality and Test Strategy (TechWell)
Let’s build a mobile app quality and testing strategy together. Whether you have a web, hybrid, or native app, building a quality and testing strategy means (1) knowing what data and tools you have available to make agile decisions, (2) understanding your customers and your competitors, and (3) testing your app under real-world conditions. Jason Arbon guides you through the latest techniques, data, and tools to ensure the awesomeness of your mobile app quality and testing strategy. Leave this interactive session with a strategy for your very own app—or one you pretend to own. The information Jason shares is based on data from Appdiff’s next-gen mobile app testing platform, lessons from Applause/uTest’s crowd, text mining hundreds of millions of app store reviews, and in-depth discussions with top mobile app development teams.
Testing Transformation: The Art and Science for Success (TechWell)
Technologies, testing processes, and the role of the tester have evolved significantly in the past few years with the advent of agile, DevOps, and other new technologies. It is critical that we testing professionals evaluate ourselves and continue to add tangible value to our organizations. In your work, are you focused on the trivial or on real game changers? Jennifer Bonine describes critical elements that help you artfully blend people, process, and technology to create a synergistic relationship that adds value. Jennifer shares ideas on mastering politics, maneuvering core vs. context, and innovating your technology strategies and processes. She explores how new processes can be introduced in an organization, what the role of organizational culture is in determining the success of a project, and how you can know what tools will add value vs. simply adding overhead and complexity. Jennifer reviews critically needed tester skills and discusses a continual learning model to evolve your skills and stay relevant. This discussion can lead you to technologies, processes, and skills you can stake your career on.
We’ve all been there. We work incredibly hard to develop a feature and design tests based on written requirements. We build a detailed test plan that aligns the tests with the software and the documented business needs. And when we put the tests to the software, it all falls apart because the requirements were changed without informing everyone. Mary Thorn says help is at hand. Enter behavior-driven development (BDD), and Cucumber and SpecFlow, tools for running automated acceptance tests and facilitating BDD. Mary explores the nuances of Cucumber and SpecFlow, and shows you how to implement BDD and agile acceptance testing. By fostering collaboration for implementing active requirements via a common language and format, Cucumber and SpecFlow bridge the communication gap between business stakeholders and implementation teams. In this workshop, practice writing feature files with the best practices Mary has discovered over numerous implementations. If you experience developers not coding to requirements, testers not getting requirements updates, or customers who feel out of the loop and don’t get what they ask for, Mary has answers for you.
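Cucumber and SpecFlow express acceptance criteria as Given/When/Then steps bound to executable code. As a rough illustration of that pattern only—the account domain, step wording, and names below are hypothetical, and this plain-Python sketch stands in for a real Cucumber or SpecFlow step binding:

```python
# A minimal sketch of the Given/When/Then style that Cucumber and SpecFlow
# formalize. The banking example is hypothetical, not taken from the talk.

class Account:
    def __init__(self, balance=0):
        self.balance = balance

    def withdraw(self, amount):
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount


def test_withdrawal_reduces_balance():
    # Given an account with a balance of 100
    account = Account(balance=100)
    # When the customer withdraws 30
    account.withdraw(30)
    # Then the remaining balance is 70
    assert account.balance == 70
```

In a real BDD tool, the Given/When/Then lines live in a plain-language feature file that stakeholders can read, and each line is matched to a step definition like the comments above.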
Develop WebDriver Automated Tests—and Keep Your Sanity (TechWell)
Many teams go crazy because of brittle, high-maintenance automated test suites. Jim Holmes helps you understand how to create a flexible, maintainable, high-value suite of functional tests using Selenium WebDriver. Learn the basics of what to test, what not to test, and how to avoid overlapping with other types of testing. Jim includes both philosophical concepts and hands-on coding. Testers who haven't written code should not be intimidated! We'll pair you up to make sure you're successful. Learn to create practical tests dealing with advanced situations such as input validation, AJAX delays, and working with file downloads. Additionally, discover when you need to work together with developers to create a system that's more easily testable. This tutorial focuses primarily on automating web tests, but many of the same concepts can be applied to other UI environments. Demos and labs will be in C# and Java using WebDriver. Leave this tutorial having learned how to write high-value WebDriver tests—and stay sane while doing so.
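One common way to keep WebDriver suites flexible and low-maintenance—not necessarily Jim's exact approach—is the Page Object pattern, where locators live in one class instead of being scattered across tests. The `LoginPage` class and element IDs below are hypothetical, and the stub driver stands in for a real `webdriver.Chrome()` so the sketch runs without a browser; with real Selenium 4, `find_element(By.ID, ...)` uses the same `"id"` locator strategy string:

```python
# Page Object sketch: tests talk to LoginPage, never to raw locators,
# so a changed element ID is fixed in exactly one place.

class LoginPage:
    USERNAME_ID = "username"   # hypothetical element IDs
    PASSWORD_ID = "password"
    SUBMIT_ID = "submit"

    def __init__(self, driver):
        self.driver = driver

    def log_in(self, user, password):
        self.driver.find_element("id", self.USERNAME_ID).send_keys(user)
        self.driver.find_element("id", self.PASSWORD_ID).send_keys(password)
        self.driver.find_element("id", self.SUBMIT_ID).click()


class _StubElement:
    """Records interactions instead of driving a browser."""
    def __init__(self, log, element_id):
        self.log = log
        self.element_id = element_id

    def send_keys(self, text):
        self.log.append((self.element_id, text))

    def click(self):
        self.log.append((self.element_id, "click"))


class StubDriver:
    """Stands in for webdriver.Chrome() so the example is runnable here."""
    def __init__(self):
        self.log = []

    def find_element(self, by, value):
        return _StubElement(self.log, value)


driver = StubDriver()
LoginPage(driver).log_in("alice", "secret")
# driver.log now records the interactions in order
```

Swapping the stub for a real WebDriver instance requires no change to the page object or the tests built on it, which is the maintainability payoff.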
Eliminate Cloud Waste with a Holistic DevOps Strategy (TechWell)
Chris Parlette maintains that renting infrastructure on demand is the most disruptive trend in IT in decades. In 2016, enterprises spent $23B on public cloud IaaS services. By 2020, that figure is expected to reach $65B. The public cloud is now used like a utility, and like any utility, there is waste. Who's responsible for optimizing the infrastructure and reducing wasted expenses? It’s DevOps. The excess expense, known as cloud waste, comprises several interrelated problems: services running when they don't need to be, improperly sized infrastructure, orphaned resources, and shadow IT. There are a few core tenets of DevOps—holistic thinking, no silos, rapid useful feedback, and automation—that can be applied to reducing your cloud waste. Join Chris to learn why you should include continuous cost optimization in your DevOps processes. Automate cost control, reduce your cloud expenses, and make your life easier.
Transform Test Organizations for the New World of DevOps (TechWell)
With the recent emergence of DevOps across the industry, testing organizations are being challenged to transform themselves significantly within a short period of time to stay meaningful within their organizations. It’s not easy to plan and approach these changes considering the way testing organizations have remained structured for ages. These challenges start from foundational organizational structures and can cut across leadership influence, competencies, tools strategy, infrastructure, and other dimensions. Sumit Kumar shares his experience assisting various organizations to overcome these challenges using an organized DevOps enablement framework. The framework includes radical restructuring, turning the tools strategy upside down, multidimensional workforce enablement supported by infrastructure changes, redeveloped collaboration models, and more. From his real-world experience, Sumit shares tips for approaching this journey and explains the roadmap for testing organizations to transform themselves and lead quality in DevOps.
The Fourth Constraint in Project Delivery—Leadership (TechWell)
All too often, the triple constraints—time, cost, and quality—are bandied about as if they are the be-all, end-all. While they are important, leadership—the fourth and larger underpinning constraint—influences the first three. Statistics on project success and failure abound, and these measurements are usually taken against the triple constraints. According to the Project Management Institute, only 53 percent of projects are completed within budget, and only 49 percent are completed on time. If so many projects overrun budget and are late, we can’t really say, “Good, fast, or cheap—pick two.” Rob Burkett talks about leadership at every level of a team. He shares his insights and stories gleaned from his years of IT and project management experience. Rob speaks to some of the glaring difficulties in the workplace in general and some specifically related to IT delivery and project management. Leave with a clearer understanding of how to communicate with teams and team members, and gain a better understanding of how you can be a leader—up and down your organization.
Resolve the Contradiction of Specialists within Agile Teams (TechWell)
As teams grow, organizations often draw a distinction between feature teams, which deliver the visible business value to the user, and component teams, which manage shared work. Steve Berczuk says that this distinction can help organizations be more productive and scale effectively, but he recognizes that not all shared work fits into this model. Some work is best handled by “specialists,” that is, people with unique skills. Although teams composed entirely of T-shaped people are ideal, certain skills are hard to come by and are used irregularly across an organization. Since these specialists often need to work closely with teams, rather than working from their own backlog, they don’t fit into the component team model. The use of shared resources presents challenges to the agile planning model. Steve Berczuk shares how teams such as those providing infrastructure services and specialists can fit into a feature+component team model, and how variations such as embedding specialists in a scrum team can both present process challenges and add significant value to both the team and the larger organization.
Pin the Tail on the Metric: A Field-Tested Agile Game (TechWell)
Metrics don’t have to be a necessary evil. If done right, metrics can help guide us to make better forward-looking decisions, rather than being used simply for managing or monitoring. They can help us identify trade-offs between options for what to do next, rather than serving as punitive or, worse, purely managerial measures. Steve Martin won’t be giving the Top Ten List of field-tested metrics you should use. Instead, in this interactive mini-workshop, he leads you through the critical thinking necessary for you to determine what is right for you to measure. First, Steve explores why you want to measure something—whether it’s for a team, a portfolio, or even an agile transformation. Next, he provides multiple real-life metrics examples to help drive home concepts behind characteristics of good and bad metrics. Finally, Steve shows how to run his field-tested agile game—Pin the Tail on the Metric. Take back this activity to help you guide metrics conversations at your organization.
Agile Performance Holarchy (APH)—A Model for Scaling Agile Teams (TechWell)
A hierarchy is an organizational network that has a top and a bottom, and where position is determined by rank, importance, and value. A holarchy is a network that has no top or bottom and where each person’s value derives from his ability, rather than position. As more companies seek the benefits of agile, leaders need to build and sustain delivery capability while scaling agile without introducing unnecessary process and overhead. The Agile Performance Holarchy (APH) is an empirical model for scaling and sustaining agility while continuing to deliver great products. Jeff Dalton designed the APH by drawing from lessons learned observing and assessing hundreds of agile companies and teams. The APH helps implement a holarchy—a system composed of interacting organizational units called holons—centered on a series of performance circles that embody the behaviors of high performing agile organizations. Jeff describes how APH provides guidelines in the areas of leadership, values, teaming, visioning, governing, building, supporting, and engaging within an all-agile organization. Join Jeff to see what the APH is all about and how you can use it in your team and organization.
A Business-First Approach to DevOps Implementation (TechWell)
DevOps is a cultural shift aimed at streamlining intergroup communication and improving operational efficiency for development and operations groups. Over time, inclusion of other IT groups under the DevOps umbrella has become the norm for many organizations. But even broadening the boundaries of DevOps, the conversation has been largely devoid of the business units’ place at the table. A common mistake organizations make while going through the DevOps transformation is drawing a line at the IT boundary. If that occurs, a larger, more inclusive silo within the organization is created, operating in an informational vacuum and causing operational inefficiency and goal misalignment. Sharing his experiences working on both sides of the fence, Leon Fayer describes the importance of including business units in order to align technology decisions with business goals. Leon discusses inclusion of business units in existing agile processes, benefits of cross-departmental monitoring, and a business-first approach to technology decisions.
Databases in a Continuous Integration/Delivery Process (TechWell)
The document summarizes a presentation about including databases in a continuous integration/delivery process. It discusses treating database code like application code by placing it under version control and integrating databases into the DevOps software development pipeline. This allows databases to be built, tested, and released like other software through continuous integration, delivery, and deployment.
Mobile Testing: What—and What Not—to Automate (TechWell)
Organizations are moving rapidly into mobile technology, which has significantly increased the demand for testing of mobile applications. David Dangs says testers naturally are turning to automation to help ease the workload, increase potential test coverage, and improve testing efficiency. But should you try to automate all things mobile? Unfortunately, the answer is not always clear. Mobile has its own set of complications, compounded by a wide variety of devices and OS platforms. Join David to learn what mobile testing activities are ripe for automation—and those items best left to manual efforts. He describes the various considerations for automating each type of mobile application: mobile web, native app, and hybrid applications. David also covers device-level testing, types of testing, available automation tools, and recommendations for automation effectiveness. Finally, based on his years of mobile testing experience, David provides some tips and tricks to approach mobile automation. Leave with a clear plan for automating your mobile applications.
Cultural Intelligence: A Key Skill for Success (TechWell)
Diversity is becoming the norm in everyday life. However, introducing global delivery models without a proper understanding of intercultural differences can lead to difficulty, frustration, and reduced productivity. Priyanka Sharma and Thena Barry say that in our diverse world, we need teams with people who can cross these boundaries, communicate effectively, and build the diverse networks necessary to avoid problems. We need to learn about cultural intelligence (CI) and cultural quotient (CQ). CI is the ability to relate and work effectively across cultures. CQ is the cognitive, motivational, and behavioral capacity to understand and respond to beliefs, values, attitudes, and behaviors of individuals and groups. Together, CI and CQ can help us build behavioral capacities that aid motivation, behavior, and productivity in teams as well as individuals. Priyanka and Thena show how to build a more culturally intelligent place with tools and techniques from Leading with Cultural Intelligence, as well as content from the Hofstede cultural model. In addition, they illustrate the model with real-life experiences and demonstrate how they adapted in similar circumstances.
Turn the Lights On: A Power Utility Company's Agile Transformation (TechWell)
Why would a century-old utility with no direct competitors take on the challenge of transforming its entire IT application organization to an agile methodology? In an increasingly interconnected world, the expectations of customers continue to evolve. From smart meters to smart phones, IoT is creating a crisis point for industries not accustomed to rapid change. Glen Morris explains that pizzas can now be tracked by the minute and packages at every stop, and customers expect the same customer service model from all industries—including power. Glen examines how to create momentum and transform non-IT-focused industries to an agile model. If you are struggling with gaining traction in your pursuit of agile within your business, Glen gives you concrete, practical experiences to leverage in your pursuit. Finally, he communicates how to gain buy-in from business partners who have no idea or concern about agile or its methodologies. If your business partners look at you with amusement when you mention the need for a dedicated Product Owner, join Glen as he walks you through the approaches to overcoming agile skepticism.
Introducing Milvus Lite: Easy-to-Install, Easy-to-Use vector database for you... (Zilliz)
Join us to introduce Milvus Lite, a vector database that can run on notebooks and laptops, share the same API with Milvus, and integrate with every popular GenAI framework. This webinar is perfect for developers seeking easy-to-use, well-integrated vector databases for their GenAI apps.
For the full video of this presentation, please visit: https://www.edge-ai-vision.com/2024/06/building-and-scaling-ai-applications-with-the-nx-ai-manager-a-presentation-from-network-optix/
Robin van Emden, Senior Director of Data Science at Network Optix, presents the “Building and Scaling AI Applications with the Nx AI Manager,” tutorial at the May 2024 Embedded Vision Summit.
In this presentation, van Emden covers the basics of scaling edge AI solutions using the Nx tool kit. He emphasizes the process of developing AI models and deploying them globally. He also showcases the conversion of AI models and the creation of effective edge AI pipelines, with a focus on pre-processing, model conversion, selecting the appropriate inference engine for the target hardware and post-processing.
van Emden shows how Nx can simplify the developer’s life and facilitate a rapid transition from concept to production-ready applications. He provides valuable insights into developing scalable and efficient edge AI solutions, with a strong focus on practical implementation.
Generative AI Deep Dive: Advancing from Proof of Concept to Production (Aggregage)
Join Maher Hanafi, VP of Engineering at Betterworks, in this new session where he'll share a practical framework to transform Gen AI prototypes into impactful products! He'll delve into the complexities of data collection and management, model selection and optimization, and ensuring security, scalability, and responsible use.
Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0! (SOFTTECHHUB)
As the digital landscape continually evolves, operating systems play a critical role in shaping user experiences and productivity. The launch of Nitrux Linux 3.5.0 marks a significant milestone, offering a robust alternative to traditional systems such as Windows 11. This article delves into the essence of Nitrux Linux 3.5.0, exploring its unique features, advantages, and how it stands as a compelling choice for both casual users and tech enthusiasts.
GraphSummit Singapore | The Art of the Possible with Graph - Q2 2024 (Neo4j)
Neha Bajwa, Vice President of Product Marketing, Neo4j
Join us as we explore breakthrough innovations enabled by interconnected data and AI. Discover firsthand how organizations use relationships in data to uncover contextual insights and solve our most pressing challenges – from optimizing supply chains, detecting fraud, and improving customer experiences to accelerating drug discoveries.
Communications Mining Series - Zero to Hero - Session 1 (DianaGray10)
This session provides an introduction to UiPath Communications Mining, its importance, and a platform overview. You will acquire a good understanding of the phases in Communications Mining as we go over the platform with you. Topics covered:
• Communication Mining Overview
• Why is it important?
• How can it help today’s business and the benefits
• Phases in Communication Mining
• Demo on Platform overview
• Q/A
A tale of scale & speed: How the US Navy is enabling software delivery from l... (sonjaschweigert1)
Rapid and secure feature delivery is a goal across every application team and every branch of the DoD. The Navy’s DevSecOps platform, Party Barge, has achieved:
- Reduction in onboarding time from 5 weeks to 1 day
- Improved developer experience and productivity through actionable findings and reduction of false positives
- Maintenance of superior security standards and inherent policy enforcement with Authorization to Operate (ATO)
Development teams can ship efficiently and ensure applications are cyber ready for Navy Authorizing Officials (AOs). In this webinar, Sigma Defense and Anchore will give attendees a look behind the scenes and demo secure pipeline automation and security artifacts that speed up application ATO and time to production.
We will cover:
- How to remove silos in DevSecOps
- How to build efficient development pipeline roles and component templates
- How to deliver security artifacts that matter for ATO’s (SBOMs, vulnerability reports, and policy evidence)
- How to streamline operations with automated policy checks on container images
Full-RAG: A modern architecture for hyper-personalization (Zilliz)
Mike Del Balso, CEO & Co-Founder at Tecton, presents "Full RAG," a novel approach to AI recommendation systems, aiming to push beyond the limitations of traditional models through a deep integration of contextual insights and real-time data, leveraging the Retrieval-Augmented Generation architecture. This talk will outline Full RAG's potential to significantly enhance personalization, address engineering challenges such as data management and model training, and introduce data enrichment with reranking as a key solution. Attendees will gain crucial insights into the importance of hyperpersonalization in AI, the capabilities of Full RAG for advanced personalization, and strategies for managing complex data integrations for deploying cutting-edge AI solutions.
Removing Uninteresting Bytes in Software Fuzzing (Aftab Hussain)
Imagine a world where software fuzzing, the process of mutating bytes in test seeds to uncover hidden and erroneous program behaviors, becomes faster and more effective. A lot depends on the initial seeds, which can significantly dictate the trajectory of a fuzzing campaign, particularly in terms of how long it takes to uncover interesting behaviour in your code. We introduce DIAR, a technique designed to speedup fuzzing campaigns by pinpointing and eliminating those uninteresting bytes in the seeds. Picture this: instead of wasting valuable resources on meaningless mutations in large, bloated seeds, DIAR removes the unnecessary bytes, streamlining the entire process.
In this work, we equipped AFL, a popular fuzzer, with DIAR and examined two critical Linux libraries -- Libxml's xmllint, a tool for parsing xml documents, and Binutil's readelf, an essential debugging and security analysis command-line tool used to display detailed information about ELF (Executable and Linkable Format). Our preliminary results show that AFL+DIAR does not only discover new paths more quickly but also achieves higher coverage overall. This work thus showcases how starting with lean and optimized seeds can lead to faster, more comprehensive fuzzing campaigns -- and DIAR helps you find such seeds.
- These are slides of the talk given at IEEE International Conference on Software Testing Verification and Validation Workshop, ICSTW 2022.
Threats to mobile devices are more prevalent and increasing in scope and complexity. Users of mobile devices want to take full advantage of the features available on those devices, but many of these features provide convenience and capability while sacrificing security. This best practices guide outlines steps users can take to better protect personal devices and information.
What do a Lego brick and the XZ backdoor have in common? (Speck&Tech)
ABSTRACT: At first glance, a Lego brick and the XZ backdoor might seem to have in common only that both are building blocks—dependencies of creative and software projects. In reality, a Lego brick and the XZ backdoor case share much more than that.
Join the presentation to dive into a story of interoperability, standards, and open formats, and then discuss the important role contributors play in a sustainable open source community.
BIO: Advocate for free software and for standard, open formats. She has been an active member of the Fedora and openSUSE projects and co-founded the LibreItalia Association, where she was involved in several LibreOffice-related events, migrations, and training efforts. She previously worked on LibreOffice migrations and training courses for several public administrations and private organizations. Since January 2020 she has worked at SUSE as a Software Release Engineer for Uyuni and SUSE Manager, and when she is not pursuing her passion for computers and for Geeko, she cultivates her curiosity about astronomy (hence her nickname, deneb_alpha).
Essentials of Automations: The Art of Triggers and Actions in FME (Safe Software)
In this second installment of our Essentials of Automations webinar series, we’ll explore the landscape of triggers and actions, guiding you through the nuances of authoring and adapting workspaces for seamless automations. Gain an understanding of the full spectrum of triggers and actions available in FME, empowering you to enhance your workspaces for efficient automation.
We’ll kick things off by showcasing the most commonly used event-based triggers, introducing you to various automation workflows like manual triggers, schedules, directory watchers, and more. Plus, see how these elements play out in real scenarios.
Whether you’re tweaking your current setup or building from the ground up, this session will arm you with the tools and insights needed to transform your FME usage into a powerhouse of productivity. Join us to discover effective strategies that simplify complex processes, enhancing your productivity and transforming your data management practices with FME. Let’s turn complexity into clarity and make your workspaces work wonders!
Sudheer Mechineni, Head of Application Frameworks, Standard Chartered Bank
Discover how Standard Chartered Bank harnessed the power of Neo4j to transform complex data access challenges into a dynamic, scalable graph database solution. This keynote will cover their journey from initial adoption to deploying a fully automated, enterprise-grade causal cluster, highlighting key strategies for modelling organisational changes and ensuring robust disaster recovery. Learn how these innovations have not only enhanced Standard Chartered Bank’s data infrastructure but also positioned them as pioneers in the banking sector’s adoption of graph technology.
1. Star Canada 2013, 1/22/2013
Janet Gregory, DragonFire Inc.
Copyright 2013 DragonFire Inc.
With material from Lisa Crispin
My experience comes …
◦ As a tester, working on agile teams
◦ Coaching and training, learning
◦ Agile Testing: A Practical Guide for Testers and Agile Teams; Addison Wesley, 2009
2. How many years experience with agile?
◦ At least 2 years
◦ Less than 2 years
◦ Less than 1 year
◦ Have done some reading, starting in a team
◦ Completely new to the concept
Are you a ...
◦ Tester
◦ Test Manager
◦ Programmer
◦ Other
10 minutes, ~1 minute / person:
◦ Name
◦ Where you are from?
◦ Why you are here?
Let everyone have a chance to introduce themselves
3. Agile teams …
• have short iterations
• encourage active customer participation
• demand whole team collaboration
• test features & stories as they are coded
• deliver business value at regular intervals
• adapt their processes based on feedback
4. Sequential, phased – eg. waterfall:
Requirements → Specifications → Code → Testing → Release (over time)
Agile: iterative, incremental
[Diagram: stories A–F flowing through iterations It 0 – It 4]
– Each story is expanded, coded and tested
– Possible release after each iteration
What concerns you about agile and test planning?
5. Test Approach – The Agile Way
Project Initiation → Release/Project Planning → Each Iteration (1 … X) → System Test / End Game → Release to Prod / Support
• Get an understanding of the project (business)
• Participate in sizing stories
• Create Test Plan
• Estimate tasks, run regression tests
• Collaborate with customers on acceptance tests
• Write, automate and execute new story tests
• Pair test with other testers, developers
• Perform exploratory testing
• Perform Final Load Test
• Complete Final Regression Test
• Perform UAT
• Perform Mock Deploy
• Participate in Release Readiness
• Participate in Release to Prod
• Participate in Retrospectives
[Diagram: Project Test Plan contains Release Test Plan contains Story Tests]
6. • Product Roadmap
◦ High level feature ideas
• Release Planning
◦ Backlog of sized and prioritized stories
• Iteration Planning
◦ Backlog of estimated tasks
◦ Estimations are in hours: how long will it take
• Types of testing needed
• Automation
• Complexity
• Size
More details later …
7. Instead of saying NO, or being the gatekeeper, be the information provider so business can make the decisions.
Questions… on agile approach to planning
8. 1/22/2013
Let’s talk a little testing…. What is enough to keep you out of trouble?
• Find bugs
• Check correctness
• Test reliability
• Check usability
• Answer “Is it done?”
• Learn about the application
• Feedback into future stories
What else???
• Each group, collaborate
• Write different types of testing that you do
• One type per sticky
• 5 minutes
• Model for classifying tests
• Looks at the purpose of the tests – the ‘why’
Agile Testing Quadrants
Brainchild of Brian Marick
• Take your tests and put them in the quadrant you think they belong in.
• Were you able to get them all classified?
• Add tests as you think about them.
[Diagram] Agile Testing Quadrants (Brian Marick)
• Can be used as a communication tool
  ◦ to explain testing in a common language
• Emphasize whole-team responsibility
  ◦ focus on collaboration
  ◦ whole-team participation
• Help plan what tests to automate
  ◦ and the tools and infrastructure needed
For each story / feature
◦ No story is done until tested
◦ Customer needs captured as passing tests
◦ Automated regression tests
For release readiness
◦ Delivers value
◦ “Doneness” in all quadrants
Unit Tests
◦ Tests developer intent – program design
◦ Tests a small piece of code
◦ Makes sure it does what it should
◦ TDD

Component Tests
◦ Tests architecture intent – system design
◦ Tests that components work together correctly

Connectivity Tests
• Builds quality in
• Focus on internal code quality
• Builds testability into code
• Go faster, do more
  ◦ Unit tests provide refactoring support
• Provides instant feedback
• TDD increases confidence in design
• Forms the foundation of automation suite
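The Q1 ideas above (unit tests express developer intent, TDD, refactoring support) can be sketched as a tiny test-first example. The `password_is_valid` rule is hypothetical, not from the slides:

```python
# A minimal TDD-style sketch (hypothetical password rule, not from the talk):
# write the failing test first, then just enough code to make it pass.

def password_is_valid(password: str) -> bool:
    """Hypothetical rule: at least 6 characters, no spaces."""
    return len(password) >= 6 and " " not in password

# Unit tests express developer intent for this small piece of code,
# and give refactoring support: change the implementation, re-run, stay green.
def test_accepts_reasonable_password():
    assert password_is_valid("Password")

def test_rejects_short_password():
    assert not password_is_valid("Abc")

def test_rejects_password_with_space():
    assert not password_is_valid("pass word")

if __name__ == "__main__":
    test_accepts_reasonable_password()
    test_rejects_short_password()
    test_rejects_password_with_space()
    print("all unit tests pass")
```

Fast, isolated tests like these form the foundation of the automation suite the slide mentions.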
• Use to elicit requirements
• Acceptance Test (or Example) Driven Development
  ◦ Allows developers to code until the tests pass
• Capture examples, express as executable tests
• User experience
  ◦ wire frames; mock-ups / prototypes
• Help customers achieve advance clarity
• Gain shared common understanding of stories
• Drive development with business-facing tests
• Obtain enough requirements to start coding
• Focus is external quality
• Customer – developer – tester collaboration
◦ Power of Three
Can you think of anything else for Quadrant 3? Discussions?
• Exploratory testing
  ◦ what didn’t we think about?
• Test for usability
  ◦ understand end users, personas
• Tours
• User Acceptance Testing (UAT)
• Iteration reviews / demos
  ◦ Builds confidence
  ◦ Quick feedback loop
• Pair test with customers
• Informal demos
  ◦ Pair exploratory testing with customer
• Provide feedback ….
  ◦ Turn learnings into tests that drive new features
  ◦ Change process as needed
• Evaluation of the product
• Recreate actual user experiences
• Realistic use
• Know your customers
• Make them real
• Plan your exploratory testing using them
• Picture from Jeff Patton’s Pragmatic Personas weekly column on StickyMinds (1/25/2010)
• Think of some personas or characters and devise exploratory scenarios that role might get into, for example:
Shopping website – Amazon
  ◦ Senior citizen who has never shopped on the web
  ◦ Hacker looking to cause trouble
  ◦ Working mom in a rush
  ◦ Internet-savvy teen
Any questions about Quadrant 3?
• Non-functional tests
• Performance, scalability, stress, load
• “ility” testing
• Automated deployments
• Memory management
• Infrastructure testing
• Security testing
  ◦ Roles & permissions, system ‘hacking’
• Data migration
• Recovery
• May be higher priority than functional req’ts
• Makes the ‘finished’ product
• Transfer specialized expertise
Consider the four quadrants
• Would you add anything?
• Consider your quality attributes
• Do you need to move any tests?
• What tests are you missing?
• What is your team not doing now?
[Diagram] Where automation happens:
• Sequential, phased (e.g. waterfall): Requirements → Specifications → Code → Testing → Release – all automation is done at the end, in the testing phase.
• Agile, iterative and incremental: automate at the task level, the story level, and the feature level – push the tests lower – so automation happens in every iteration.
Instead of saying NO, or being the gatekeeper, be the information provider so the business can make the decisions.

Questions… on types of testing?
Release Level Test Planning
• Consider scope, priorities, risks
• Document only what is useful
• Consider all types of testing
• Budget time for:
  ◦ infrastructure
  ◦ tools
  ◦ automation needs
• Strive for simplicity
• Sizing of stories
◦ think about impacts to system
• Ask questions
◦ that may impact the ‘bigness’ of the story
◦ to uncover hidden assumptions
◦ such as ‘what if?’, or “what happens if?”
• Not the time for details
• Create a project “test plan”
• Are we working with a vendor?
◦ If so, how do we coordinate testing?
• Are there dependencies between …
◦ features?
◦ teams?
◦ stories?
• Should be project specific
• Highlight critical information
◦ risks
◦ assumptions
◦ constraints
• Focus on what is really needed
• Put static information in a Quality Management Strategy document
A test matrix (one example)
◦ provides a view of the release
◦ can also be used to show visible progress
But, remember
◦ the value is in the planning
Let’s look at an example
[Example test matrix – image; Copyright 2009 Janet Gregory, DragonFire]
Let’s do some collaboration, and work together to find a solution.
[Mind map diagram] Main topic: new account – with branches such as user name and password, and sub-topics including rules, save, first time, encryption, change
Mind map testing ideas for a feature
How many ideas can you generate in 5 minutes?
Feature A
• As an on-line shopper, I want to provide my shipping address when I check out so that my order goes to the right location.
Feature B
• As an on-line shopper, I want to provide payment information when I check out so I am billed correctly.
• Create a test matrix for the stories shown above.
• Functionality down the left side
• Test conditions across the top
  ◦ Think heuristics
  ◦ Use mnemonics (ex. SFDPOT)
• Gray out the squares that are not applicable.
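The matrix exercise above can also be sketched as data. Everything concrete here is hypothetical (the slides only prescribe functionality down the side, conditions across the top, and gray squares for "not applicable"); the column names are loosely inspired by SFDPOT:

```python
# A lightweight test matrix as data (hypothetical rows and columns).
# Columns: test conditions, loosely inspired by the SFDPOT mnemonic.
conditions = ["Structure", "Function", "Data", "Platform", "Operations", "Time"]

# Rows: functionality. None marks a grayed-out (not applicable) square;
# False = planned but not yet tested; True = tested.
matrix = {
    "Shipping address entry": {"Structure": False, "Function": False, "Data": False,
                               "Platform": False, "Operations": False, "Time": None},
    "Payment information":    {"Structure": False, "Function": False, "Data": False,
                               "Platform": False, "Operations": False, "Time": False},
}

def progress(matrix):
    """Visible progress: fraction of applicable (non-gray) squares tested."""
    cells = [v for row in matrix.values() for v in row.values() if v is not None]
    return sum(cells) / len(cells)

print(f"{progress(matrix):.0%} of applicable squares tested")
```

As the slide says, the value is in the planning; a structure like this just makes the plan and its progress visible.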
• What value do you see in something like this?
• What did you learn?
• How would you use it?
Instead of saying NO, or being the gatekeeper, be the information provider so the business can make the decisions.

Questions on test planning at the release level?
Iteration Planning:
• Stories tell us about the tip of the iceberg
• What don’t we know?
• What questions should we ask to find out?
[Diagram] ATDD flow across roles:
• Create a user story – Product owner
• Write customer (Q2) tests – Product owner / Tester
• Expand tests into story tests – Tester
• Automate Q2 tests – Tester / Programmer
• Start thinking how to code; pair, “Show Me”; TDD – Programmer
• Exploratory testing
• User acceptance – Customer
• Be proactive – preplanning
• Try specifications workshops – Gojko Adzic
• Help customer achieve ‘advance clarity’ on stories
  ◦ Customers speak with “one voice”
  ◦ Testable stories
  ◦ Steel threads
  ◦ Create acceptance tests
• Define high level story tests or examples
• Find hidden assumptions
• Define and estimate testing tasks – consider:
  ◦ automation needs
  ◦ test data
  ◦ exploratory testing
  ◦ Q4 tests (‘ilities’, security, performance, etc.)
Ask questions
• What's the business goal?
• Can the user mess up?
• What’s the best thing that can happen?
• What’s the worst thing that can happen?
• Watch for scope creep or ‘bling’
• Is this story testable?
To Review
[Picture from Mike Cohn’s website]
• Express the intent of the story
• Use examples as specific instances of a scenario
• Think expected and unexpected behaviors
• Gives a shared common understanding of the story
• Feeds into TDD (Test Driven Development)
As a new user, I want to create an account with a user name and password so that only I can access my information.
• To create my account, I enter a valid user name and password; my information is saved, I am logged into the system, and I am on the home page.
• If I enter an invalid user name, I get an error message “Invalid User Name”, and I am able to try again.
• If I enter an invalid password, I get an error message “Invalid Password”, and I am able to try again.
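Acceptance criteria like those above can be expressed as executable tests against a thin model of the feature. `AccountSystem` is a hypothetical stand-in for the real application, and the password rule (at least 6 characters) is an assumption, since the story only says "rules defined":

```python
# Hypothetical stand-in for the application under test; the acceptance
# criteria drive what it must do (ATDD-style).
class AccountSystem:
    def __init__(self):
        self.accounts = {}
        self.current_page = None

    def create_account(self, user_name, password):
        if not user_name or " " in user_name:
            return "Invalid User Name"           # user may try again
        if len(password) < 6:                    # assumed rule, not from the slides
            return "Invalid Password"            # user may try again
        self.accounts[user_name] = password      # information is saved
        self.current_page = "home"               # logged in, on the home page
        return f"Access system as {user_name}"

system = AccountSystem()
assert system.create_account("JanetGregory", "Password") == "Access system as JanetGregory"
assert system.current_page == "home"
assert system.create_account("Janet Gregory", "Password") == "Invalid User Name"
assert system.create_account("JanetGregory", "Abc") == "Invalid Password"
print("acceptance tests pass")
```

When these pass, the developers know they have coded enough; that is the "code until the tests pass" idea from Quadrant 2.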
BDD – Behaviour-Driven Development

Given the user has no existing account
When she requests to create a new account
Then she enters a valid user name and valid password (rules defined)
And the information is saved upon submitting.
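One lightweight way to make that scenario executable, without committing to a particular BDD tool, is to mirror the Given/When/Then steps as functions. This is only a sketch of the shape; real teams would typically bind steps to the Gherkin text with a tool such as Cucumber or Behave, and the validity rules here (non-empty name, no spaces, 6+ character password) are assumptions:

```python
# Sketch: the BDD scenario above as plain given/when/then step functions.
accounts = {}

def given_no_existing_account(user):
    accounts.pop(user, None)

def when_she_creates_account(user, password):
    # valid user name and valid password (rules assumed, not from the slides)
    if user and " " not in user and len(password) >= 6:
        accounts[user] = password   # information saved upon submitting
        return True
    return False

def then_information_is_saved(user):
    return user in accounts

# Run the scenario end to end
given_no_existing_account("JanetGregory")
assert when_she_creates_account("JanetGregory", "Password")
assert then_information_is_saved("JanetGregory")
print("scenario passes")
```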
returnValue TestLogin(userName, password)

| User Name     | Password | Expected result               | Comments           |
| JanetGregory  | Password | Access system as JanetGregory | Valid combo saved  |
| Janet Gregory | Password | Error                         | Space in user name |
| JanetGregory  | Abc      | Error                         | Invalid password   |
• Write acceptance test(s) for one of the two stories – any format you like
• To help, try mind-mapping or draw a flow diagram
• Ask your customer for examples.
• Think “Amazon”
• (~15 min)
Story a – Part of Feature A
• As the company shipper, I need to verify the city, state and postal code so that the order goes to the right location.
Story b – Part of Feature B
• As the company accountant, I need to ensure the credit card information is correct so the on-line shopper is billed correctly.
We walk out of the iteration planning meeting…. Now what?
• Start simple with the high level acceptance tests
◦ Add boundary, edge conditions, etc.
◦ Add non-functional tests
• Experiment to find the right detail level
• Review with programmers
◦ Then automate
• Keep adding until story is complete
• Consider exploratory test scenarios
Acceptance Tests – Fit Automation Style – Third thread

| User Name     | Password | Expected result               | Comments                    |
| JanetGregory  | Password | Access system as JanetGregory | Valid combo saved           |
| Janet Gregory | Password | Error                         | Space in user name          |
| Janet#Gregory | Password | Error                         | Special char not allowed    |
|               | Password | Error                         | Blank user name             |
| JanetGregory  | Password | Error                         | User already exists         |
| JanetGregory  | Abc      | Error                         | Not enough char in password |
| JanetGregory  |          | Error                         | Blank password              |
How do you know you are done?

| # | Description                              | Completed |
| 1 | Story tests reviewed, automated and pass | Yes       |
| 2 | Exploratory testing complete             | Yes       |
| 3 | Unit tests reviewed and pass             | Yes       |
| 4 | Q4 tests complete                        | Yes       |
| 5 | Acceptance tests pass                    | Yes       |
Instead of saying NO, or being the gatekeeper, be the information provider so the business can make the decisions.

Questions on test planning at the story level?
• Understand your context
• Understand the purpose
• Consider ROI (return on investment)
• Push the tests lower
• Automate the repetitive and boring tests
• Plan, but document simply
• Plan, but plan for the appropriate level
• How much is too much?
• What is not enough?
• Why do you need them?
• What is the right information?
• Who looks at them?
• What do they use them for?
• What is the simplest way you can capture them?
• What problem are you trying to solve?
• What measurements make sense?
• What is the simplest way to gather them?
• Who needs to see them?
• What is the simplest way to make them visible?
• How are you going to use them?
• When will you know to stop?
• Be Aware!!!
  ◦ The wrong measurements can be deadly
  ◦ Metrics can be misinterpreted
• How are you going to know when you are done?
• What is enough?
• Collaborate to decide what coverage you want
• Think risk – include the customer
• Functional coverage
• Lines of code? … be careful
• What tools can you use?
  ◦ Heuristics
  ◦ Mnemonics
• Understand the level of precision you need
◦ Think big picture in release planning
◦ Think tasks during iteration planning
◦ Think tests during story test planning
• Use ATDD for feature and story testing
• Make it visible
• Make it simple
• Make it valuable
• You want to be testing, not documenting
Instead of saying NO, or being the gatekeeper, be the information provider so the business can make the decisions.

Are there still unanswered questions… concerns?
Now Available
Agile Testing: A Practical Guide for Testers and Agile Teams
By Lisa Crispin and Janet Gregory
www.agiletester.ca
My contact info
www.janetgregory.ca
http://janetgregory.blogspot.com/
Email: janet@agiletester.ca
www.lisacrispin.com
http://lisacrispin.com
Email: lisa@agiletester.ca
• www.lisacrispin.com
• agile-testing@yahoogroups.com
• http://www.exampler.com – Brian Marick’s web site
• www.testobsessed.com (heuristics cheat sheet) – Elisabeth Hendrickson
• http://www.shino.de/blog/ – Markus Gärtner’s blog
• Gojko Adzic, Bridging the Communication Gap, 2009
• Gojko Adzic, Executable Specifications, Manning, 2011
• www.mountaingoatsoftware.com – Mike Cohn’s web site (and all his books)
• http://www.developsense.com/2009/04/of-testing-tours-and-dashboards.html
• James Whittaker, Exploratory Software Testing, Addison-Wesley, 2009
• http://www.stickyminds.com/ – Pragmatic Personas, Jeff Patton’s weekly column, 1/25/2010
• Jean Tabaka, Collaboration Explained, Addison-Wesley, 2006
• Agile Manifesto: http://agilemanifesto.org/