Some people thrive on challenges, while others struggle with how to deal with them. Handled well, challenges can make us stronger in our passion, drive, and determination. Lloyd Roden describes the challenges we face today in software testing and how we can respond in a positive, constructive manner.
One of the challenges Lloyd often sees is identifying and eliminating metrics that lie. While we (hopefully) do not set out to deceive, we must endeavor to employ metrics that have significance, integrity, and operational value. Another challenge test leaders face is providing estimates that have clarity, accuracy, and meaning. Often we omit a vital ingredient when developing test estimates - the quality required in the product.
A third challenge is convincing test managers to actually test regularly to attain credibility and respect with the team they are leading. A further challenge is to see why the use of the term "best practice" can be so damaging.
EuroSTAR Software Testing Conference 2009 presentation on Spend Wisely, Test Well by John Fodeh. See more at conferences.eurostarsoftwaretesting.com/past-presentations/
Johan Jonasson - Introducing Exploratory Testing to Save the Project
EuroSTAR Software Testing Conference 2009 presentation on Introducing Exploratory Testing to Save the Project by Johan Jonasson. See more at conferences.eurostarsoftwaretesting.com/past-presentations/
Isabel Evans - Quality In Use - EuroSTAR 2011
EuroSTAR Software Testing Conference 2011 presentation on Quality In Use by Isabel Evans. See more at: http://conference.eurostarsoftwaretesting.com/past-presentations/
EuroSTAR Software Testing Conference 2010 presentation on Testing and Lean Principles by Beata Karpinska. See more at: http://conference.eurostarsoftwaretesting.com/past-presentations/
Hans-Henrik Olesen - What to Automate and What not to Automate
EuroSTAR Software Testing Conference 2009 presentation on What to Automate and What not to Automate by Hans-Henrik Olesen. See more at conferences.eurostarsoftwaretesting.com/past-presentations/
Clive Bates - A Pragmatic Approach to Improving Your Testing Process - EuroSTAR 2010
EuroSTAR Software Testing Conference 2010 presentation on A Pragmatic Approach to Improving Your Testing Process by Clive Bates. See more at: http://conference.eurostarsoftwaretesting.com/past-presentations/
EuroSTAR Software Testing Conference 2009 presentation on Serving Two Masters by Ard Kramer. See more at conferences.eurostarsoftwaretesting.com/past-presentations/
Gitte Ottosen - Agility and Process Maturity, Of Course They Mix!
EuroSTAR Software Testing Conference 2008 presentation on Agility and Process Maturity, Of Course They Mix! by Gitte Ottosen. See more at conferences.eurostarsoftwaretesting.com/past-presentations/
Andrew Goslin - TMMi, What is Not in the Text Book - EuroSTAR 2010
EuroSTAR Software Testing Conference 2010 presentation on TMMi, What is Not in the Text Book by Andrew Goslin. See more at: http://conference.eurostarsoftwaretesting.com/past-presentations/
Julian Harty - Alternatives To Testing - EuroSTAR 2010
EuroSTAR Software Testing Conference 2010 presentation on Alternatives To Testing by Julian Harty. See more at: http://conference.eurostarsoftwaretesting.com/past-presentations/
Kristian Fischer - Put Test in the Driver's Seat
EuroSTAR Software Testing Conference 2008 presentation on Put Test in the Driver's Seat by Kristian Fischer. See more at conferences.eurostarsoftwaretesting.com/past-presentations/
Jelle Calsbeek - Stay Agile with Model Based Testing revised
EuroSTAR Software Testing Conference 2009 presentation on Stay Agile with Model Based Testing revised by Jelle Calsbeek. See more at conferences.eurostarsoftwaretesting.com/past-presentations/
Derk-Jan de Grood - 9 Causes of losing valuable testing time - EuroSTAR 2010
EuroSTAR Software Testing Conference 2010 presentation on 9 Causes of losing valuable testing time by Derk-Jan de Grood. See more at: http://conference.eurostarsoftwaretesting.com/past-presentations/
EuroSTAR Software Testing Conference 2009 presentation on The Power of Risk by Erik Boelen. See more at conferences.eurostarsoftwaretesting.com/past-presentations/
James Whittaker - Pursuing Quality-You Won't Get There - EuroSTAR 2011
EuroSTAR Software Testing Conference 2011 presentation on Pursuing Quality-You Won't Get There by James Whittaker. See more at: http://conference.eurostarsoftwaretesting.com/past-presentations/
Dirk Van Dael - Test Accounting - EuroSTAR 2010
EuroSTAR Software Testing Conference 2010 presentation on Test Accounting by Dirk Van Dael. See more at: http://conference.eurostarsoftwaretesting.com/past-presentations/
Paul Gerrard - Advancing Testing Using Axioms - EuroSTAR 2010
EuroSTAR Software Testing Conference 2010 presentation on Advancing Testing Using Axioms by Paul Gerrard. See more at: http://conference.eurostarsoftwaretesting.com/past-presentations/
Thomas Axen - Lean Kaizen Applied To Software Testing - EuroSTAR 2010
EuroSTAR Software Testing Conference 2010 presentation on Lean Kaizen Applied To Software Testing by Thomas Axen. See more at: http://conference.eurostarsoftwaretesting.com/past-presentations/
Geoff Thompson - Why Do We Bother With Test Strategies
EuroSTAR Software Testing Conference 2008 presentation on Why Do We Bother With Test Strategies by Geoff Thompson. See more at conferences.eurostarsoftwaretesting.com/past-presentations/
'Test Data Management and Project Quality Go Hand In Hand' by Kristian Fischer
Traditionally, the testing community has perceived test data the same way most organisations perceive testing: boring, time-consuming, and non-value-adding. But new winds are blowing. Prompted by today's complex project and test environments, testing departments are now taking the first small steps towards recognising the importance of a focused test data management (TDM) function. Perhaps the wider testing community will too. We have long passed the good old days when a mainframe test data copy would do the trick; the challenges in implementing a TDM function in today's complex set-ups are many and insidious, and meeting them needs a well-executed plan.
This presentation draws on the experiences and hardships of a TDM optimisation project and provides a live demo, inspiration, and guidelines for implementing and optimising a TDM function. The project ran alongside a large, ongoing SOA programme at a major Danish pension fund and focused on three areas: Technical, Process, and People & Communication.
In the Technical area, the project developed a TDM Dashboard. As the main management component, the Dashboard provides a test data copy function from Production to Test and between test environments. In addition, it offers an overview of the test data in the different applications and environments.
The Process area developed a TDM strategy and optimised the test data processes in order to deliver valid, transversal test data more quickly. It focused on a wide range of areas such as production copying, data generation, handling of requirements, data cleaning, profile usage, data pools, and data re-use.
The People & Communication area focused on proactively including stakeholders in the test data process and on communicating roles and responsibilities as well as new functions and processes.
The project delivered measurable, visible results: the number of defects in Production has been reduced. This underlines that a well-implemented TDM function, with a continuous focus on optimisation, adds value and is worth the effort.
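A central piece of such a Production-to-Test copy function is masking personal data while keeping the copied records usable. As a minimal sketch of the idea (the field names, salt, and records below are hypothetical, not from the project), deterministic pseudonymisation keeps copied data consistent across tables without exposing real identities:

```python
import hashlib

# Hypothetical production records; field names and values are
# invented for illustration, not taken from the project.
production_rows = [
    {"member_id": "DK-1001", "name": "Anna Jensen", "balance": 254300},
    {"member_id": "DK-1002", "name": "Per Holm", "balance": 98750},
]

def pseudonymise(value, salt="tdm-demo"):
    """Deterministically mask an identifier: the same input always
    yields the same mask, so copied data stays consistent across
    tables, but no real identity is revealed."""
    digest = hashlib.sha256((salt + value).encode()).hexdigest()[:8]
    return f"TEST-{digest}"

def copy_to_test(rows):
    """Copy production rows into a test data set, masking personal
    fields while keeping non-sensitive values usable for testing."""
    return [
        {
            "member_id": pseudonymise(row["member_id"]),
            "name": pseudonymise(row["name"]),
            "balance": row["balance"],  # kept as-is for functional tests
        }
        for row in rows
    ]

test_rows = copy_to_test(production_rows)
```

Because the masking is deterministic, foreign-key relationships between copied tables survive the copy, which is what makes the masked data valid for transversal tests.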
Niels Malotaux - Help We Have a QA Problem!
EuroSTAR Software Testing Conference 2009 presentation on Help We Have a QA Problem! by Niels Malotaux. See more at conferences.eurostarsoftwaretesting.com/past-presentations/
Martin Kooij - Testers in the Board of Directors
EuroSTAR Software Testing Conference 2008 presentation on Testers in the Board of Directors by Martin Kooij. See more at conferences.eurostarsoftwaretesting.com/past-presentations/
Ane Clausen - Success with Automated Regression Test revised
EuroSTAR Software Testing Conference 2009 presentation on Success with Automated Regression Test revised by Ane Clausen. See more at conferences.eurostarsoftwaretesting.com/past-presentations/
There is not just one correct solution to many of the tasks we do within testing; we need a toolbox of many different tools and the pragmatism to acknowledge that we cannot use the same one every time - there is no one-size-fits-all :-)
Gustav Olsson - Agile - Common Sense with a New Name Tag revised
EuroSTAR Software Testing Conference 2009 presentation on Agile - Common Sense with a New Name Tag revised by Gustav Olsson. See more at conferences.eurostarsoftwaretesting.com/past-presentations/
Michael Bolton - Two Futures of Software Testing
EuroSTAR Software Testing Conference 2008 presentation on Two Futures of Software Testing by Michael Bolton. See more at conferences.eurostarsoftwaretesting.com/past-presentations/
Fabian Scarano - Preparing Your Team for the Future
EuroSTAR Software Testing Conference 2008 presentation on Preparing Your Team for the Future by Fabian Scarano. See more at conferences.eurostarsoftwaretesting.com/past-presentations/
'Architecture Testing: Wrongly Ignored!' by Peter Zimmerer
State-of-the-art testing approaches typically include different testing levels such as reviews, unit testing, component testing, integration testing, system testing, and acceptance testing. There is also broad agreement that unit testing is typically done by developers (who are responsible, at least to some extent, for checking the quality of their units) and system testing by professional independent testers. But who is responsible for adequately testing the architecture, one of the key artifacts in developing and maintaining flexible, powerful, and sustainable products and systems? History has shown that too many project failures and troubles are caused by deficiencies in the architecture. Furthermore, what does the term architecture testing mean, and why is the term so seldom used?
To answer these questions, Peter describes what architecture testing is all about and explains a list of pragmatic practices and experiences to implement it successfully. He offers practical advice on the required tasks and activities as well as the needed involvement, contributions, and responsibilities of software architects in the area of testing – because a close cooperation between testers and architects is the key to drive and sustain a culture of prevention rather than detection across the lifecycle.
Finally, if we claim to be in pursuit of quality then adequate architecture testing is not only a lever for success but a necessity. And this results not only in better quality but also speeds up development by facilitating change and decreasing maintenance efforts.
'Playing Around With Risks' by Jurgen Cleuren
I looked at my cards. Two aces - the best possible hand in poker on an empty board. At this point there is no risk that I can be beaten. I decide to exploit the situation: get as much value as possible without letting my opponents know I have such a good hand. I don't raise. Three cards come on the board. I wait. The fourth card comes. I still wait. The fifth and last card comes and I make my move. I put all my money on the line and, as it turns out, I get beaten by someone who has made a straight. How is this possible? I had the best hand, I evaluated the risk, and I still lost. The reason is obvious: the board changed three times, and with each extra card my risk of losing also changed. And I did not adapt. I didn't re-evaluate my risks and act accordingly.
There are quite a few games that deal with risks and risk responses; poker and Monopoly are just two examples. World championships are held in these games, and there is general consensus about who the best players in the world are. Those players have game tactics. What if we could map those tactics to risk-based testing? Can we improve our process based on successful game tactics? In this presentation, I will elaborate on a few game tactics and map them onto the risk-based testing process. I will give concrete examples of the similarities between them and demonstrate that game tactics can be adapted to improve our test process.
'Mixing Open And Commercial Tools' by Mauro Garofalo
Mixing open source and commercial software is a challenge we face today. The right combined solution offers advantages in flexibility, functionality, performance, and management that aren't available when either open source or commercial technologies are used alone. But one of the major issues is that they don't always play well together: some cannot be loaded together, or they fail to integrate properly.
In this presentation we provide a case study of successfully blending open source and commercial software for testing Java applications. The testing environment used Subversion as a versioned repository for tests, JIRA for issue management, and IBM Rational Functional Tester combined with an open framework for automated functional, regression, and GUI testing. These tools provide a rich set of high-level Java APIs that make them easy to integrate with each other, minimising integration costs. During the presentation we explore the costs (the open tools used have no fees, integration and training costs, etc.), the advantages (innovation, best-of-breed products, etc.), the risks (dedicated support, community feedback, ability to add extensions, etc.), and the product quality achieved by our solution compared with a fully open source or fully commercial approach. From our experience, we provide hints and tips to guide testers and managers in choosing a good mix of open and commercial tools based on budget, technology, know-how, and the required quality.
'Acceptance Test Driven Development Using Robot Framework' by Pekka Klärck & ...
Acceptance test driven development (ATDD) is an important agile practice merging requirement gathering with acceptance testing. At its core are concrete examples, created together with the team, that provide collaborative understanding and, as automated acceptance tests, make sure that the features are implemented correctly. There are many ways to create ATDD examples/tests, and the behavior driven development (BDD) style with its Given-When-Then format is one of the more popular ones.
Robot Framework is an open source test automation framework suitable for ATDD and acceptance testing in general. It has a flexible test data syntax that supports keyword-driven, data-driven, and BDD styles, but is still simple enough that even non-programmers can create and understand test cases. The simple test library API makes extending the framework easy, and there are several ready-made libraries that allow testing generic interfaces such as web, databases, Swing, SWT, Windows GUIs, Flex, and SSH out of the box.
This presentation gives an introduction to both ATDD and Robot Framework. It contains several demonstrations, and all the material will be freely available after the presentation.
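For readers unfamiliar with the test library API mentioned above, a minimal sketch of how Robot Framework is extended: a test library is just a Python class whose public methods become keywords. The calculator domain below is invented for illustration; only the library mechanism itself is Robot Framework's documented extension model.

```python
# A Robot Framework test library is just a Python class: every public
# method becomes a keyword usable from the test data. The calculator
# below is a hypothetical system under test.

class CalculatorLibrary:
    """Minimal keyword library for an imagined calculator under test."""

    def __init__(self):
        self._result = 0

    def add_numbers(self, a, b):
        """Available to tests as the keyword 'Add Numbers'."""
        self._result = int(a) + int(b)
        return self._result

    def result_should_be(self, expected):
        """Available as 'Result Should Be'; raising fails the test."""
        if self._result != int(expected):
            raise AssertionError(
                f"Expected {expected} but got {self._result}")

# In a .robot file the library could then be driven in BDD style
# (Robot Framework ignores Given/When/Then prefixes when matching
# keywords):
#
#   *** Settings ***
#   Library    CalculatorLibrary
#
#   *** Test Cases ***
#   Adding Two Numbers
#       When Add Numbers    2    3
#       Then Result Should Be    5
```

The same library serves keyword-driven, data-driven, and BDD-style tests, which is what lets one team mix styles without changing the automation layer.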
Ben Walters - Creating Customer Value With Agile Testing - EuroSTAR 2011
EuroSTAR Software Testing Conference 2011 presentation on Creating Customer Value With Agile Testing by Ben Walters. See more at: http://conference.eurostarsoftwaretesting.com/past-presentations/
'Continuous Quality Improvements – A Journey Through The Largest Scrum Project In Norway'
In this presentation you will learn how the testing process and continuous quality improvements are aligned with the Scrum process in a large software project. We hope that our hands-on experience will give you inspiration on how to tailor the test process in an agile environment. The project has been running for more than two years, with six successful releases to end users. We would like to share our experiences of managing test processes in a large Scrum project - our do's and don'ts, our success stories, and our lessons learned. The project is the largest Scrum project in Norway to date.
The project's scope is to implement system support for managing a new pension reform for all inhabitants of Norway who are members of the pension fund, and to replace the existing system, which is built on outdated technology. Approximately 750,000 project hours will be spent and between 100 and 180 people are involved in the project: thirteen Scrum teams, plus two project management and acceptance testing teams, and one business expert team. Each Scrum team contains all the knowledge and expertise needed for developing high-quality software: Scrum master, business expert, technical architect, UX designer, developers, build/deploy responsible, and, of course, dedicated test resources.
Each software delivery in this project comprises five sprints. Each sprint is three weeks, followed by acceptance testing before the delivery is shipped. Test-driven development is used at all levels of development, from unit tests all the way up to functional system testing. All test levels up to system integration testing are performed during the development sprint by the Scrum teams. We tried to automate UI tests, but this was not successful. However, tests at all other levels are successfully automated, and after each delivery a fully automated regression test suite is shipped with the code.
Stefaan Luckermans & Dominic Maes - Testers And Garbage Men - EuroSTAR 2011
EuroSTAR Software Testing Conference 2011 presentation on Testers And Garbage Men by Stefaan Luckermans & Dominic Maes. See more at: http://conference.eurostarsoftwaretesting.com/past-presentations/
'How To Apply Lean Test Management' by Bob van de Burgt
Cost reductions and the quest for greater efficiency are ever more evident in today's business world, and it follows that our testing processes will ultimately be affected. Introducing test techniques and methods for structured testing leads to more consistent and predictable results.
Introducing a risk-based approach to testing makes it easier for the business to determine to what extent testing is necessary and most efficient. The resulting Go/No-Go decision process may not be sufficient for all companies, so other creative methods need to be investigated. Many management theories put forward "Lean" as one of the solutions. One of the key steps in using "Lean" is identifying which steps add value for the customer and which do not. This track will give you the information to start using "Lean" within testing, and more specifically within test management.
The presenter will also look at Lean Six Sigma as one of the more popular theories that introduces the concept of "Lean" in combination with obtaining higher-quality products. This subject will also be explained in combination with testing and test management. This track will focus on applying Lean Six Sigma techniques to test management processes using practical examples from customer cases. The audience can take home a practical "Lean Test Management" overview which they can apply in their own companies.
This track is especially of interest to business managers, IT managers, QA managers and test managers who are involved in improving the quality of test management processes.
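The Lean step of separating value-adding from non-value-adding work can be made concrete with a standard Lean metric, process cycle efficiency (value-added time divided by total lead time). A small sketch, with step names and hour figures invented for the example:

```python
# Illustrative only: classify test-management process steps as
# value-adding or not, then compute process cycle efficiency (PCE),
# the standard Lean ratio of value-added time to total lead time.
# Step names and hour figures below are invented for the example.

steps = [
    ("Design risk-based test cases", 16, True),
    ("Wait for test environment", 24, False),
    ("Execute tests", 20, True),
    ("Re-enter results in a second tool", 6, False),
    ("Report to stakeholders", 4, True),
]

def process_cycle_efficiency(steps):
    """PCE = value-added time / total lead time; low values point at
    waste such as waiting and duplicated work."""
    total = sum(hours for _, hours, _ in steps)
    value_added = sum(hours for _, hours, adds_value in steps if adds_value)
    return value_added / total

pce = process_cycle_efficiency(steps)
print(f"Process cycle efficiency: {pce:.0%}")
```

The non-value-adding steps (waiting, re-entering results) are exactly the kind of waste a Lean test management exercise would target first.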
'What the top 10 Most Disruptive Technology Trends Mean for QA and Testing' by Doron Reuveni
New and emerging technologies such as mobile apps, tablets, 4G, cloud computing, and HTML5 are making big headlines and impacting software engineering and testing organizations in various industries. These technological innovations are allowing sensitive data to be accessed through the web and on mobile devices more than ever before.
With so much critical data flowing to smart phones and tablets, there is immense pressure to ensure that apps – those that a company produces for its customers and supports for employees, vendors or partners – are reliable, scalable, private and secure. And this evolution of technologies and user behavior dramatically impacts those who are responsible for developing and testing applications.
The ways web and mobile apps are designed, developed and delivered are changing dramatically, and therefore the ways these apps are being tested are being taxed and stretched to the breaking point. Using real-world examples, Doron Reuveni identifies the top ten technology trends that have transformed the software industry and outlines what they mean for the QA and testing community today.
Mieke Gevers - Performance Testing in 5 Steps - A Guideline to a Successful Load Test
EuroSTAR Software Testing Conference 2008 presentation on Performance Testing in 5 Steps - A Guideline to a Successful Load Test by Mieke Gevers. See more at conferences.eurostarsoftwaretesting.com/past-presentations/
'Model Based Test Design' by Mattias ArmholtTEST Huddle
MBT (Model Based Testing) has been used within my department in Ericsson since 2007. As an MBT tool we have been using Conformiq Modeler, which is a commercially available tool. This has been a great success, and is now our main way of working when verifying functional requirements.
Until now, MBT has neither within Ericsson nor outside, only been used very rarely for verification of non-functional requirements, such as performance testing, load testing, stability and robustness tests and characteristics measurements.
This presentation covers the work of two Master Students, who in 2010 performed a study of the possibilities to use MBT for verifying non-functional requirements. One of the results of this study was a new method, inspired by MPDE (Model Driven Performance Engineering), where non-functional requirements can be covered by test models describing the functional behavior. Test Cases can then be generated from these models with an MBT tool.
The proposed method provides different possibilities for handling the non-functional requirements. The requirements can, for example, be introduced as new dedicated states in the behavioral model, or be introduced by extending the existing state model. Another possibility is to implement the non-functional requirements in the test harness, thereby keeping the model simple. The most realistic scenario, however, is a combination of all of the above. The grouping and allocation of both functional and non-functional requirements should already be considered in the early test analysis phase.
The new method has been tried out and evaluated. It has proven useful and fully applicable, and there are clear indications that it is beneficial and that project lead time can be reduced by using it. We have therefore now started to apply this method in our new development projects.
The presentation includes examples of real cases where MBT has been used for verifying non-functional requirements.
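As a rough illustration of how non-functional requirements might be attached to a behavioural model (the model, state names, and latency budgets below are invented for illustration, not taken from the Ericsson study or the Conformiq tooling):

```python
# Hypothetical sketch: a behavioural model as a transition graph, where some
# transitions carry a non-functional annotation (a max-latency budget).
from collections import deque

# state -> list of (action, next_state, max_latency_ms or None)
MODEL = {
    "Idle":      [("connect", "Connected", 200)],
    "Connected": [("send", "Awaiting", None), ("disconnect", "Idle", None)],
    "Awaiting":  [("receive", "Connected", 500)],
}

def generate_tests(start, max_depth=4):
    """Breadth-first walk of the model; each path becomes one test case.
    Steps with a latency budget become timed assertions in the harness."""
    tests, queue = [], deque([(start, [])])
    while queue:
        state, path = queue.popleft()
        if path:
            tests.append(path)
        if len(path) < max_depth:
            for action, nxt, budget in MODEL.get(state, []):
                queue.append((nxt, path + [(action, budget)]))
    return tests

tests = generate_tests("Idle")
# Every generated test that exercises "connect" inherits the 200 ms budget.
timed = [t for t in tests if any(b is not None for _, b in t)]
print(len(tests), len(timed))
```

In this toy version the non-functional requirement rides along with the functional model, so the generated functional test cases double as performance checks.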
Stefaan Luckermans - Number for Passion, Passion for Numbers - EuroSTAR 2010 - TEST Huddle
EuroSTAR Software Testing Conference 2010 presentation on Number for Passion, Passion for Numbers by Stefaan Luckermans. See more at: http://conference.eurostarsoftwaretesting.com/past-presentations/
Dietmar Strasser - Traditional QA meets Agile Development - TEST Huddle
EuroSTAR Software Testing Conference 2008 presentation on Traditional QA meets Agile Development by Dietmar Strasser. See more at conferences.eurostarsoftwaretesting.com/past-presentations/
Bart Knaack - The Truth About Model-Based Quality Improvements - TEST Huddle
EuroSTAR Software Testing Conference 2008 presentation on The Truth About Model-Based Quality Improvements by Bart Knaack. See more at conferences.eurostarsoftwaretesting.com/past-presentations/
Michiel Vroon - Test Environment, The Future Achilles’ Heel - TEST Huddle
EuroSTAR Software Testing Conference 2008 presentation on Test Environment, The Future Achilles’ Heel by Michiel Vroon. See more at conferences.eurostarsoftwaretesting.com/past-presentations/
Isabel Evans - Route Cards to the Future - TEST Huddle
EuroSTAR Software Testing Conference 2008 presentation on Route Cards to the Future by Isabel Evans. See more at conferences.eurostarsoftwaretesting.com/past-presentations/
'Customer Testing & Quality In Outsourced Development - A Story From An Insur... - TEST Huddle
The insurance company made the decision to outsource most of its IT development and technical maintenance to suppliers. This imposed new requirements on testing and quality assurance within the company and raised a lot of questions:
- How do we ensure that suppliers perform testing that delivers a solution which is not riddled with defects?
- How are the responsibilities for test activities divided between supplier and customer?
- How do we ensure effective testing without delays due to misunderstandings between supplier and tester?
- What are the test criteria for the supplier, and how should they report against them?
- How do we ensure that test material used by one supplier for development can be re-used by another supplier for maintenance testing in the future?
- How are defect handling, test reporting, etc. best done between supplier and customer?
From this, the company created a new test model and test policy which includes setting test and quality requirements for its suppliers. The model has a defined test contract appendix which sets the requirements for the suppliers, including that suppliers must use the company’s own templates and uphold the company’s test policy. This was done to ensure that all suppliers follow the same guidelines, as many projects had more than one supplier involved in application and technical development. The model places a strong focus on test quality assurance, test reporting, and approval in each test phase according to the defined acceptance criteria.
In-house, the company focused on communicating with and educating everyone working as a tester within acceptance testing, or as a test manager. This was to ensure that they were adequately trained to perform high-quality test activities, had the competencies to verify test quality from suppliers, and could confirm that supplier deliveries met requirements. During implementation of the new model there was a specific focus on communication with, and approval by, management to ensure success.
Studies show that at least half of all software defects are rooted in poor, ambiguous, or incomplete requirements. For decades, testers have complained about the lack of solid, concrete requirements, claiming that this makes our task more difficult and in some instances impossible. Lloyd Roden challenges these beliefs and explains why having detailed requirements can be at best unhelpful and at worst actively harmful to both testing and the business. Rather than constantly complaining, Lloyd shows how testers and test managers can rise to the challenges of testing without requirements, testing with evolving requirements, testing with vague requirements, and testing with wrong requirements. To help make your testing more effective, Lloyd provides practical tips and techniques for each of these testing challenges.
Keynote: Surviving or Thriving: Top Ten Lessons for the Professional Tester - TechWell
As testers and test managers we often find ourselves struggling just to survive within our organization—sometimes with the possibility of job loss due to outsourcing looming. Often, we are told to become more “effective,” “efficient,” and do “more with less.” However, most testers and test managers are unsure of what those mandates actually mean. Today, it is not sufficient to just survive; we must take initiatives to thrive. Lloyd Roden shares ten valuable lessons on how you can become better at testing and thrive in your career. Lloyd's lessons include the importance of using modern technology in testing, using test design techniques when reviewing documentation, testing the testers with techniques such as bug seeding, reporting project waste, providing management with feedback on decisions that they made, becoming a pioneer or explorer rather than a settler or outlaw in your organization, and more. Lloyd’s advice is practical—and challenging—for all testers, test leads, and test managers.
In today's fast-paced IT world we are often told to deliver higher quality systems to our customers under challenging time schedules, with fewer resources, and reduced budgets. As test managers and team leaders, we must become more effective and efficient with the resources we are given. We should begin questioning whether all those testing processes really must be executed and whether all that documentation should be produced, or whether some, if not all, can be streamlined. Are test plans really important? Are detailed scripts really useful? How can we create highly productive teams? Based on his upcoming book, Test Management for Busy People, Lloyd reveals how the important tasks can be determined and how time can be allocated to them, while minimizing working on multiple projects simultaneously (a real time waster). Lloyd shares a flexible framework showing which tasks can and should be streamlined so that valuable time can be restored to the busy test manager.
Gerlof Hoekstra - OMG What Have We Done - EuroSTAR 2013 - TEST Huddle
EuroSTAR Software Testing Conference 2013 presentation on "OMG What Have We Done" by Gerlof Hoekstra.
See more at: http://conference.eurostarsoftwaretesting.com/past-presentations/
How to Get Stuff Right
Innovation and facts over opinions and assumptions; follow a plan that responds to change.
Project Control is the application of control theory to the development of solutions. By creating tests first, we create negative feedback that helps steer development.
Obstacle Driven Development (ODD) combines established engineering methods with software development. ODD helps identify, correct, and prevent errors as early and as efficiently as practical.
This presentation shows how ODD extends and combines ISO-compatible V-models, Test Driven Development, requirements analysis, extended specifications, and Agile.
Please see the series for further information.
The Lean Startup Method: Its Value for Testers - TechWell
A startup is an organization created to deliver a new product or service under conditions of extreme uncertainty. Approximately 40 percent of all startups will cease operation with investors losing everything; 95 percent will fall short of their financial projections. And the number one cause of startup failure? No one wants to buy their product. Eric Ries, author of The Lean Startup, learned that under conditions of extreme uncertainty, classical management methods do not bring success. Based on his and others’ experiences, he formulated the Lean Startup methodology consisting of five important principles: (1) Build-Measure-Learn (BML) loop, (2) Minimum Viable Product (MVP), (3) Validated Learning, (4) Customer Development, and (5) One Metric That Matters. Lee Copeland believes these same Lean Startup ideas have great value for testers. Come and discover how the BML loop is similar to exploratory testing, how the MVP idea suggests a Minimum Viable Set of Tests, how Customer Development suggests developing clients for your testing services, and more. Learn how to apply Lean Startup ideas in your testing organization.
Project Control is a new mathematical model and method that allows the user to create fully testable solutions at every stage of a development process.
Adapted from control theory and inspired by Test Driven Development (TDD), the models allow us to create tests first and ensure we are following a scientific method.
Using control theory models and Boolean logic for each test and solution, we have created a mathematical method for building fully testable solutions.
Used together with Obstacle Driven Development, it creates solutions for every stage of a development that are fully testable.
We need a QA team that works with development teams to help ship features quickly and safely. Traditional testers help to ship safely by doing the testing, but this can have the side effect of slowing the team down. When bugs are found during the testing stage, they take longer to fix because fixing them requires rework of existing code. The team is then put in a position where it has to decide between shipping quickly or shipping safely. To ship features both quickly and safely, we need to find defects as early in the development process as possible, preventing bugs instead of detecting them. In his presentation, Son will share how to apply an agile quality process to achieve this for product-centric teams.
Open Mastery: Let's Conquer the Challenges of the Industry! - Arty Starr
What if you could get upper management to care about your technical developer problems? Would you be willing to measure and prioritize the problems?
What if **WE** could stop the relentless business pressure that drives our software projects into the ground *across the industry*? I know this probably sounds impossible, but before you dismiss the idea entirely, let me show you that it *is* possible.
We can start a cascade of changes across the industry with only a handful of people that are willing to work together to make it happen.
Open Mastery is a peer learning network focused on codifying open decision models and standards to solve industry-wide problems. This presentation is about the obstacles, the strategy, and the business model.
Lastly, I want your help in looking for gaps in my ideas. Let's identify where the strategy might break, and figure out how to make it work. I'm launching Open Mastery in early 2016. Let's make this dream a reality.
Learn The 4 Steps To Identify, Hire & Keep High Performers – Webinar with Dr.... - Growth Institute
Apply for the interactive Topgrading Master Business Course — and learn how to identify, hire & keep A-Players from the world’s #1 hiring expert Dr. Brad Smart >> growthinstitute.com/topgrading
Outpost24 webinar - The economics of penetration testing in the new threat la... - Outpost24
Penetration testing has long been a tried and tested method of simulating an attack against a company’s IT systems to find exploitable vulnerabilities before attackers do. But is the price tag worth it?
Quality Clinic - Lean Six Sigma Fundamentals Training - Sample - Mark H. Davis
A sample of slides from our Quality Clinic training, which teaches the foundational elements of Lean Six Sigma DMAIC, with a dash of A3 and Constraints Management.
Why We Need Diversity in Testing - Accenture - TEST Huddle
In this webinar Rasa (Testing capability lead for Denmark) and Matthias (EALA Testing capability lead) will share some of their own experiences of why diversity matters, give insights into how Accenture as a global firm is promoting diversity, and explain how we are in the process of changing our attitudes and processes to make all of this sustainable.
Keys to Continuous Testing for Faster Delivery - EuroSTAR Webinar - TEST Huddle
Your business needs to deliver faster. To accommodate, Development needs to introduce fewer changes but in a much more frequent cadence. This creates a challenge for test teams to keep up with the rapid pace of change without compromising on quality. Automation is paramount to the success or failure of Continuous Delivery, and Continuous Testing enables early and frequent quality feedback throughout the CI/CD pipeline.
In this webinar, Eran & Ayal will explore how to implement Continuous Testing to ensure high quality releases in a Continuous Delivery environment; including what to test and when to automate new functionality in order to optimize your efforts.
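One way the "what to test and when" decision can look in practice is stage-based test selection (the stages, tags, and timings below are illustrative assumptions of mine, not from the webinar):

```python
# Hypothetical sketch of stage-based test selection in a CI/CD pipeline:
# every commit gets fast feedback from a smoke subset; broader, slower
# suites run in later, less frequent stages.

TESTS = [
    {"name": "login_smoke",     "tags": {"smoke"},        "seconds": 2},
    {"name": "checkout_flow",   "tags": {"smoke", "e2e"}, "seconds": 15},
    {"name": "full_regression", "tags": {"regression"},   "seconds": 600},
    {"name": "load_profile",    "tags": {"performance"},  "seconds": 1800},
]

STAGE_TAGS = {
    "commit":  {"smoke"},                                    # every push
    "nightly": {"smoke", "e2e", "regression"},               # broader feedback
    "release": {"smoke", "e2e", "regression", "performance"} # full gate
}

def select(stage):
    """Pick every test whose tags intersect the stage's wanted tags."""
    wanted = STAGE_TAGS[stage]
    return [t["name"] for t in TESTS if t["tags"] & wanted]

for stage in STAGE_TAGS:
    picked = select(stage)
    cost = sum(t["seconds"] for t in TESTS if t["name"] in picked)
    print(stage, picked, cost)
```

The point of the sketch: quality feedback arrives at every stage, but the expensive suites do not block the fast commit loop.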
In this webinar Carsten will explore the role of the tester in a Scrum team. He will examine where the tester plays an important role in Scrum and how you can contribute to a team’s performance.
Leveraging Visual Testing with Your Functional Tests - TEST Huddle
Designing and implementing (or selecting) the right automation strategy for functional testing, combined with visual testing, can give your project greater test coverage while improving test scalability.
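One way functional and visual checks can complement each other (a minimal sketch; `render_page`, the baseline digest, and all names are stand-ins of mine, not from any particular tool):

```python
# Minimal sketch, not tied to a specific tool: a functional check and a
# visual check run against the same rendered page. render_page() stands in
# for a real browser render; the digest stands in for a stored screenshot.

import hashlib

def render_page(title, body):
    # Stand-in for a real render; returns bytes like a screenshot would be.
    return f"<h1>{title}</h1><p>{body}</p>".encode()

def functional_check(page_bytes):
    # Functional: the content we asked for is actually present.
    return b"Welcome" in page_bytes

def visual_check(page_bytes, baseline_digest):
    # Visual: the rendering is byte-identical to the approved baseline.
    return hashlib.sha256(page_bytes).hexdigest() == baseline_digest

page = render_page("Welcome", "Hello, tester")
baseline = hashlib.sha256(page).hexdigest()   # captured on an approved build

# A layout-only regression slips past the functional check, not the visual one:
regressed = render_page("Welcome", "Hello,  tester")   # extra space
print(functional_check(regressed), visual_check(regressed, baseline))
```

The two checks catch different failure classes, which is why combining them raises coverage without duplicating test logic.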
This talk suggests how we might make sense of the tools landscape of the near future, where the pressure to modernise processes and automate is greatest, and what a new test process supported by tools might look like.
Takeaways:
- We need to take machine learning in testing seriously, but it won’t be taking our jobs just yet
- We don’t need more test automation tools; today we need tools that capture tester knowledge
- Tools that learn and think can’t work for testers until we solve the knowledge capture challenge.
View On-Demand Webinar: https://youtu.be/EzyUdJFuzlE
In this session, we’ll write tests and code for solving a real Star Wars problem. And we’ll discuss what we’re doing, refine our specs, as well as see what changes in the design tell us.
View On-Demand Webinar: https://huddle.eurostarsoftwaretesting.com/resource/test-management/tdd-rest-us/
Scaling Agile with LeSS (Large Scale Scrum) - TEST Huddle
In this webinar, Elad will cover the principles that the #LeSS framework has to offer in order to enable big organisations to become agile.
View webinar recording - https://huddle.eurostarsoftwaretesting.com/resource/agile-testing/scaling-agile-less-large-scale-scrum/
Creating Agile Test Strategies for Larger Enterprises - TEST Huddle
Having difficulty creating an agile test strategy for your company? Let Testing Excellence Award winner Derk-Jan de Grood show you how it’s done.
View webinar recording here - http://huddle.eurostarsoftwaretesting.com/resource/agile-testing/creating-agile-test-strategies-larger-enterprises/
3 key takeaways
- Do you know the meaning of your organisation, system, product?
- Can you deliver the important risks right away?
- How can you communicate about the (process and product) risks you’re dealing with?
View Webinar recording: https://huddle.eurostarsoftwaretesting.com/resource/test-management/is-there-a-risk/
Growing a Company Test Community: Roles and Paths for Testers - TEST Huddle
Over the past three years, our company’s test team has grown from three lonesome testers to a community of nine – with more planned. Since we don’t see testers as “click monkeys”, but as valuable and integrated project members who bring a specific skill set to the table, it’s important for us to choose testers well and to train them in various areas so that they can contribute, grow and see their own career path within testing.
To structure our internal tester training program, we have been developing role descriptions, education paths and career options for our testers, which I’d like to share with you in this webinar.
View webinar - https://huddle.eurostarsoftwaretesting.com/resource/webinar/growing-company-test-community-roles-paths-testers/
It’s the same argument again and again. One side says “team members should all be able to do everything, and the programmers should do their testing and all testers should be writing code”. The other side says “No, that can’t possibly work – programmers don’t know how to test, they don’t have the right mindset”. And on and on it goes.
http://huddle.eurostarsoftwaretesting.com/resource/webinar/need-testers-agile-teams/
In this webinar, Dave Haeffner (Elemental Selenium, USA) discusses how to:
- Build an integrated feedback loop to automate test runs and find issues fast
- Set up your own infrastructure or connect to a cloud provider
- Dramatically improve test times with parallelization
https://huddle.eurostarsoftwaretesting.com/resource/webinar/use-selenium-successfully/
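The parallelization point can be sketched generically (this is my illustration, not the webinar's code; the per-test timing is a toy value standing in for a browser-driven test):

```python
# Generic sketch: running independent test cases in a thread pool instead of
# serially. For I/O-bound browser tests the wall-clock saving is roughly the
# pool width, since threads wait on the browser rather than the CPU.

import time
from concurrent.futures import ThreadPoolExecutor

def run_test(name, seconds=0.2):
    time.sleep(seconds)          # stand-in for a browser-driven test
    return name, "passed"

tests = [f"test_{i}" for i in range(8)]

start = time.perf_counter()
serial = [run_test(t) for t in tests]
serial_time = time.perf_counter() - start

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = list(pool.map(run_test, tests))
parallel_time = time.perf_counter() - start

print(f"serial {serial_time:.1f}s, parallel {parallel_time:.1f}s")
```

With real Selenium sessions the same shape applies, except each worker would hold its own WebDriver instance (tests must not share browser state).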
Practical Test Strategy Using Heuristics - TEST Huddle
Key Takeaways
- See what makes a good test strategy
- Learn how to make a thorough test strategy
- Identify what the ‘Heuristic Test Strategy Model’ is
- Quickly develop a solid test strategy that fits your context
- Discover how diversification can help you to create a test strategy
Key Takeaways:
- A diagramming method that helps discuss roles
- A one page analysis heuristic for roles
- Why roles matter on projects
https://huddle.eurostarsoftwaretesting.com/resource/people-skills/thinking-through-your-role/
Key Takeaways:
- What will this release contain?
- What impact will it have on your test runs?
- How can you preserve your existing investment in tests using the Selenium WebDriver APIs, and your even older RC tests?
- Looking forward, when will the W3C spec be complete?
- What can we expect from Selenium 4?
https://huddle.eurostarsoftwaretesting.com/
Removing Uninteresting Bytes in Software Fuzzing - Aftab Hussain
Imagine a world where software fuzzing, the process of mutating bytes in test seeds to uncover hidden and erroneous program behaviors, becomes faster and more effective. A lot depends on the initial seeds, which can significantly dictate the trajectory of a fuzzing campaign, particularly in terms of how long it takes to uncover interesting behaviour in your code. We introduce DIAR, a technique designed to speed up fuzzing campaigns by pinpointing and eliminating uninteresting bytes in the seeds. Picture this: instead of wasting valuable resources on meaningless mutations in large, bloated seeds, DIAR removes the unnecessary bytes, streamlining the entire process.
In this work, we equipped AFL, a popular fuzzer, with DIAR and examined two critical Linux libraries -- Libxml's xmllint, a tool for parsing XML documents, and Binutils' readelf, an essential debugging and security analysis command-line tool used to display detailed information about ELF (Executable and Linkable Format) files. Our preliminary results show that AFL+DIAR not only discovers new paths more quickly but also achieves higher coverage overall. This work thus showcases how starting with lean and optimized seeds can lead to faster, more comprehensive fuzzing campaigns -- and DIAR helps you find such seeds.
- These are slides of the talk given at IEEE International Conference on Software Testing Verification and Validation Workshop, ICSTW 2022.
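The core idea of pruning uninteresting seed bytes can be sketched in a few lines (a toy reimplementation of the concept, not DIAR's actual algorithm; the target parser here is invented):

```python
# Toy illustration of the idea: bytes whose removal does not change the
# target's observed behaviour are "uninteresting" and can be trimmed from
# the seed before fuzzing, so mutations land on bytes that matter.

def target(data: bytes):
    """Stand-in parser: behaviour depends only on a magic header and one flag."""
    path = []
    if data[:2] == b"MZ":
        path.append("header_ok")
        if b"\x01" in data[2:]:
            path.append("flag_set")
    return tuple(path)

def trim_seed(seed: bytes) -> bytes:
    """Greedily drop each byte whose removal keeps coverage identical."""
    baseline = target(seed)
    i = 0
    while i < len(seed):
        candidate = seed[:i] + seed[i + 1:]
        if target(candidate) == baseline:
            seed = candidate          # byte was uninteresting: drop it
        else:
            i += 1                    # byte matters: keep it, move on
    return seed

seed = b"MZ" + b"\x00" * 40 + b"\x01" + b"\x00" * 40
lean = trim_seed(seed)
print(len(seed), "->", len(lean))
```

Real tools approximate this much more cheaply (re-running the target per byte is expensive), but the resulting lean seed illustrates why mutations become less wasteful.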
The Art of the Pitch: WordPress Relationships and Sales - Laura Byrne
Clients don’t know what they don’t know. What web solutions are right for them? How does WordPress come into the picture? How do you make sure you understand scope and timeline? What do you do if something changes?
All these questions and more will be explored as we talk about matching clients’ needs with what your agency offers, without pulling teeth or pulling your hair out. Practical tips and strategies for successful relationship building that leads to closing the deal.
Generative AI Deep Dive: Advancing from Proof of Concept to Production - Aggregage
Join Maher Hanafi, VP of Engineering at Betterworks, in this new session where he'll share a practical framework to transform Gen AI prototypes into impactful products! He'll delve into the complexities of data collection and management, model selection and optimization, and ensuring security, scalability, and responsible use.
Smart TV Buyer Insights Survey 2024 by 91mobiles.pdf - 91mobiles
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, aspects they look at on a new TV, and their TV buying preferences.
UiPath Test Automation using UiPath Test Suite series, part 4 - DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 4. In this session, we will cover Test Manager overview along with SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test management principles while learning practical approaches to optimizing testing processes in SAP environments using heatmap visualization techniques.
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
Execution from the test manager
Orchestrator execution result
Defect reporting
SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
SAP Sapphire 2024 - ASUG301 building better apps with SAP Fiori.pdf - Peter Spielvogel
Building better applications for business users with SAP Fiori.
• What is SAP Fiori and why it matters to you
• How a better user experience drives measurable business benefits
• How to get started with SAP Fiori today
• How SAP Fiori elements accelerates application development
• How SAP Build Code includes SAP Fiori tools and other generative artificial intelligence capabilities
• How SAP Fiori paves the way for using AI in SAP apps
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024 - Albert Hoitingh
In this session I delve into the encryption technology used in Microsoft 365 and Microsoft Purview. Including the concepts of Customer Key and Double Key Encryption.
Communications Mining Series - Zero to Hero - Session 1 - DianaGray10
This session provides an introduction to UiPath Communication Mining, its importance, and a platform overview. You will acquire a good understanding of the phases in Communication Mining as we go over the platform with you. Topics covered:
• Communication Mining Overview
• Why is it important?
• How can it help today’s business and the benefits
• Phases in Communication Mining
• Demo on Platform overview
• Q/A
GraphRAG is All You need? LLM & Knowledge Graph - Guy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
Climate Impact of Software Testing at Nordic Testing Days - Kari Kakkonen
My slides at Nordic Testing Days 6.6.2024
The climate impact / sustainability of software testing is discussed in the talk. ICT and testing must carry their part of global responsibility to help counter climate warming. We can minimize the carbon footprint, but we can also have a carbon handprint: a positive impact on the climate. Quality characteristics can be extended with sustainability, which can then be measured continuously. Test environments can be used less, at smaller scale, and on demand. Test techniques can be used to optimize or minimize the number of tests. Test automation can be used to speed up testing.
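As one concrete example of minimizing the number of tests (my illustration, not from the talk): a greedy pairwise covering suite exercises every pair of parameter values with far fewer runs than the full cartesian product, which directly cuts execution time and energy use.

```python
# Illustrative sketch: greedy pairwise test selection. Every pair of
# parameter values is covered, but with a fraction of the cartesian product.
from itertools import combinations, product

params = {
    "browser": ["chrome", "firefox", "safari"],
    "os":      ["linux", "windows", "macos"],
    "locale":  ["en", "de", "fi"],
}
names = list(params)

def pairs_of(test):
    """All (parameter, value) pairs a single test covers."""
    return {((a, test[a]), (b, test[b])) for a, b in combinations(names, 2)}

all_tests = [dict(zip(names, vals)) for vals in product(*params.values())]
uncovered = set().union(*(pairs_of(t) for t in all_tests))

suite = []
while uncovered:
    # Greedy: always add the test covering the most still-uncovered pairs.
    best = max(all_tests, key=lambda t: len(pairs_of(t) & uncovered))
    suite.append(best)
    uncovered -= pairs_of(best)

print(len(all_tests), "->", len(suite))
```

Greedy selection is not optimal, but even this simple version cuts the 27-test cartesian product down to roughly a third while keeping full pairwise coverage.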
PHP Frameworks: I want to break free (IPC Berlin 2024) - Ralf Eggert
In this presentation, we examine the challenges and limitations of relying too heavily on PHP frameworks in web development. We discuss the history of PHP and its frameworks to understand how this dependence has evolved. The focus will be on providing concrete tips and strategies to reduce reliance on these frameworks, based on real-world examples and practical considerations. The goal is to equip developers with the skills and knowledge to create more flexible and future-proof web applications. We'll explore the importance of maintaining autonomy in a rapidly changing tech landscape and how to make informed decisions in PHP development.
This talk is aimed at encouraging a more independent approach to using PHP frameworks, moving towards a more flexible and future-proof approach to PHP development.
Let us start by defining what a “challenge” is… there are various definitions:
* call to battle (e.g. a duel)
* contest that requires special skill (e.g. tennis competitions)
* call into question (e.g. challenge a statement)
Climbing Mount Everest: Has anyone done this? Would anyone like to do this? You need to train hard for this challenge – for most of us it would be something outside our reach or desire.
Running a marathon: Has anyone done this? Maybe for some of us it is a challenge just to… run. But if you have already done this, maybe you want to beat your time, or perhaps run 42 marathons in 52 days like Eddie Izzard! Running a marathon would certainly be a challenge for me. I have just taken up running on a treadmill… one mile a day, but even that is a challenge for me.
Cooking dinner for 20 people: Not a real problem for me, as I love to cook for dinner parties. Twenty people would be manageable for me, but some of you would prefer to climb Mount Everest.
Flying: Not literally – that would be a challenge. But for some, flying in an airplane would certainly be a challenge, as it might mean overcoming a fear. This is a different sort of challenge.
Getting out of bed: Anyone here have teenagers? Now you know what I mean! We spend the first ten years getting them into bed and the next ten years trying to get them out of bed.
It is important for us to realise that challenges can be both “bad” and “good”.
Bad challenges:
- “I challenge you to a game of squash” – not a problem if you are under 40, but what if you are 70 and haven’t played the game before!
- “I challenge you to run a marathon after this session.”
- “I challenge you to fight Mike Tyson.”
- “I challenge you to cook a meal for 20 people, all of whom are VIPs.”
- “I challenge you to get out of bed before 10am each morning” – result: grumpy teenager.
Good challenges help us improve ourselves or open our minds to other things:
- I challenge myself to run a little further each day and a little faster (but not too much).
- I challenge myself to eat more healthily – smoothies taste really nice!
- I challenge myself to try anything once when it comes to culture and national food (sushi, bone, pigs’ ears to name but a few).
Our reactions will vary with the challenges we are faced with: “Yes”, “No way”, “Boring”, or “Don’t really care”. I hope the challenges I am going to share will invoke a positive reaction…
Go through the definitions of BEST and PRACTICE. Why is it that we use these terms? STOP IT! Who are we trying to impress?
Vendors – STOP saying that your product/process/methodology is best practice.
Managers – STOP saying you have adopted best practice.
What are you trying to prove? If you have best practice then it means that you cannot improve… “best”, by definition, is the superlative. Also, are you trying to downgrade other methodologies? Ours is BEST practice – therefore yours must be inferior practice, or WORST practice.
The Dreyfus Model. In the 1970s, the Dreyfus brothers (Hubert and Stuart) began their seminal research on how people attain and master skills. They looked at highly skilled practitioners, including commercial airline pilots and world chess masters. Their research showed that quite a bit changes as you move from novice to expert. You don’t just “know more” or gain skill; you experience fundamental differences in how you perceive the world, how you approach problem solving, and the mental models you form and use.
Unlike other models or assessments that rate the whole person, the Dreyfus model is applicable per skill. In other words, it is a situational model, not a trait or talent model. You are neither “expert” nor “novice” at all things; rather, you are at one of these stages in some particular skill domain. You might be a novice cook but an expert skydiver, or vice versa.
Practices are the things we do – behaviour. Let’s look at how practices can help or hinder in the Dreyfus model… Go through the stages… Let us use the analogy of creating a test plan… (subvert = to ruin or overturn)
Novice – needs the practice/standard for test plans; they don’t know how to create one.
Advanced beginner – uses the test plan procedure on a daily basis, starting to get to know how to do them.
Competent – knows all about test plans, so defines the procedures for other people to follow.
Proficient – they don’t need the procedure; they are able to apply the test plan to different contexts. If they get stuck, they might refer to the procedure.
Expert – will challenge the use of test plans on every project; they might adapt them and remove sections that are not relevant.
So can practices ever be useful? Well, yes…
- They protect novices, who need practices because they don’t know what to do.
- They help advanced beginners, who will use the practices to guide them.
- They provide work for competent people, because they write the practices. These people write them with no contextual awareness.
BUT be warned: if you continually use “best practice” and insist it is followed, your proficient and expert people will leave.
Let’s take the concept of test scripts. Hands up, who uses, encourages the use of, or writes test scripts within your organisation? (These are step-by-step sets of instructions which need to be followed.) Competent people will usually define them, and novices and advanced beginners will use them, BUT expert and proficient people WILL be bored and will leave! It is like baking a cake and giving the recipe to an expert chef (like Gordon Ramsay).
Welcome to the Weakest Link… This is the show in which we can decide, just by asking a few questions, who the weakest link in our test team is, and then we can “fire them”… a fun game that can be played by all test managers with their staff… Let me introduce you to the testing team playing today’s Weakest Link: John… Carol… Rick… Pam…
Read the slide… Once the answer has come up say…
“That is great John, well done. You must be pleased with that result!”
Read the slide… Once the answer has come up say…
“Well Carol, that is most disappointing. I do hope you will try better in the next round!”
Read the slide… Once the answer has come up say…
“That is incredible Rick, truly amazing. You must be so pleased with that result, and so must your management. Excellent work!”
Read the slide… Once the answer has come up say… "Zero!! Pam, what have you been doing? That result is appalling; even I could have run more than that, and I don't know anything about testing. I am sorry Pam, but it is clearly obvious to everyone in this audience that YOU ARE THE WEAKEST LINK… GOODBYE!" I wonder if we sub-consciously think like this in our organisations – or maybe our management does. Now, in the above scenario we have four different definitions of a test case and four very different answers. But I bet that even if we gave the same definition to all four contestants, we would still get four different interpretations.
Without context on these test cases, the only number that has meaning is ZERO: I know that person hasn't run ANYTHING! The others have run SOMETHING, but we don't know what. Now let's have a look at the same numbers, but with context… My conclusion is that the two test cases run by Carol were quite detailed and probably tested a lot of functionality, and I now understand why Pam has not run anything. This scenario REALLY happened during a recent consultancy assignment: managers were judging the testers purely on the number of test cases being run… without knowing the context.
Let us consider the problem… This is a section of an on-line booking system for courses. If we were to test this, what would constitute a test case? Reading the booking conditions, each hyperlink could be an individual test case; testing each field for boundary conditions could each be a test case; processing an entire booking could be a test case… etc. THE ANALOGY OF A CASE IS A GOOD ONE BECAUSE we have many different sizes of case, and cases within cases, each having its own unique purpose. BUT the quality of the cases also has relevance: poor-quality cases won't last very long; they break, and they might also damage the contents.
We have a major problem in our organisations with management focusing too much on QUANTITY: How many have you run? What percentage of test cases are complete? I want our 1.5 million test-case script run tonight… all without understanding the QUALITY of what is being run.
So how can we measure the quality of the tests? Well, with two aspects. There are only two reasons for us to test systems: to gain confidence, and to find bugs. This all leads to us providing information about the QUALITY of the system so that management and stakeholders can make informed decisions. I shall leave you with one final thought…
We don't lie intentionally, but we do not spend time understanding the metrics we gather or the graphs we produce. Disraeli said there are three types of lies: lies, damned lies, and statistics. Go through the graphs. Graph 1: number of bugs being raised (increasing). Graph 2: DMR = the rate at which we are finding them (this is coming down). Graph 3: what is wrong? The scale starts at 20 – to make things look good… maybe.
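The point about the scale starting at 20 can be made with simple arithmetic. A minimal sketch (my own illustration, not from the slides; the function name and numbers are made up) showing how a truncated y-axis exaggerates the difference between two defect counts:

```python
# Sketch: how a truncated y-axis distorts the visual ratio of two bars.
def apparent_ratio(a, b, axis_start=0):
    """Ratio of the bar heights as drawn when the y-axis starts at axis_start."""
    return (b - axis_start) / (a - axis_start)

last_week, this_week = 25, 40

print(apparent_ratio(last_week, this_week))                 # honest axis: 1.6
print(apparent_ratio(last_week, this_week, axis_start=20))  # axis starts at 20: 4.0
```

The real change is 1.6x, but with the axis starting at 20 the second bar is drawn four times as tall as the first – the graph "lies" without a single number being wrong.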
Pie charts: it is said that these were created by Florence Nightingale in 1858, and it is suggested that Nightingale's reputation was built on her ability to give clear and concise presentations of data. There were earlier uses, though: Leon Lalanne used such diagrams in 1829. DDP: percentages are great, but we need the actual numbers to draw any concrete conclusions. We must have SIGNIFICANT DATA – that means a sufficient amount to draw conclusions from. This is elementary statistics, which we learnt at school. Test cases: the case against the test case. The analogy of a case is a good one. Show my suitcase… one case… open it up and we have a laptop case… open it up… another case… open it up… another case. Counting cases is meaningless; it is what is inside them that is important. This is why airlines weigh our cases rather than just count them! How many organisations count test cases? How many people encourage counting test cases? QUESTION – what does it actually mean? This was brought home to me recently when I taught an Exploratory Testing class and asked each group to estimate the number of tests they ran during a 30-minute ET session: Group 1: 35; Group 2: 50; Group 3: 90; Group 4: 1. Conclusion – group 4 should be fired!
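The DDP point above can be illustrated with a small sketch (my own example, not from the slides): DDP is commonly computed as the defects found by testing as a percentage of all defects found (testing plus those that escaped to live use), and the same percentage can rest on wildly different amounts of data.

```python
# Sketch of Defect Detection Percentage (DDP): the percentage alone
# says nothing about how much data sits behind it.
def ddp(found_in_test, found_after_release):
    """DDP = defects found in testing / total defects found, as a percentage."""
    total = found_in_test + found_after_release
    return 100.0 * found_in_test / total

print(ddp(9, 1))      # 90.0 – but based on only 10 defects: hardly significant data
print(ddp(900, 100))  # 90.0 – the same percentage, based on 1,000 defects
```

Both projects report "DDP of 90%", yet only the second figure is backed by enough data to draw a concrete conclusion – which is exactly why the percentages need the actual numbers alongside them.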
Story about little Suzie…
Here are some facts about how we are increasing complexity unnecessarily.
NOTE: Testers should improve their technical skills if they want to progress. We must remember that some people are quite happy to stay in the job they are in and not progress. There are those who treat their work as a job (9-5), those who treat their work as a career, and those who treat their work as a passion. Look at some of the excuses Test Managers give for not testing…
This is non-negotiable for me, and managers/leads who use excuses for not testing are not worthy of the title "Test Manager" or "Test Lead". Too many managers sit in their ivory towers looking down rather than rolling their sleeves up and getting stuck in. Identify and empathise: you will go through similar experiences to the team; you will be involved with them. Lead by example: William Wallace led from the front – he was not like the English gentry, who led from the back. His men were committed to their leader. Testing yourself will enable you to help with testing when overtime is required. Improve your estimation skills: when I started to test, I soon realised that certain tasks took a lot more time than I had thought. Gain credibility and trust: this is a by-product of being a tester, and you can mentor others. Look at leading chefs, pilots, craftsmen.
Good generals will not fight every battle – they will choose the ones they think they can win. This is the only piece of advice I give to parents: "choose your battles". Choose your battles… story about Hannah. THANK YOU