The document discusses different testing approaches including Acceptance Test Driven Development (ATDD), Test Driven Development (TDD), GUI automation, and exploratory testing. It explains that ATDD and TDD are design processes that help ensure software meets project needs, while testing involves asking questions of a product. GUI automation can simulate user actions but is fragile. Exploratory testing involves testing design and execution together in a flexible way. The document argues that these approaches work best in balance and that exploration is important at all levels, including with automation. It emphasizes putting the customer first and seeing the approaches as interdependent parts of an overall quality process.
Exploratory testing is an approach to testing that emphasizes the freedom and responsibility of testers to continually optimize the value of their work. It is the process of three mutually supportive activities done in parallel: learning, test design, and test execution. With skill and practice, exploratory testers typically uncover an order of magnitude more problems than when the same amount of effort is spent on procedurally scripted testing. All testers conduct exploratory testing in one way or another, but few know how to do it systematically to obtain the greatest benefits. Even fewer can articulate the process. Jon Bach looks at specific heuristics and techniques of exploratory testing that will help you get the most from this highly productive approach. Jon focuses on the skills and dynamics of exploratory testing, and how it can be combined with scripted approaches.
Chapter stage - Building and evaluating prototypes - Renner Modafares
This presentation describes how to build and evaluate prototypes using the Enterprise Design Thinking framework by IBM.
This content was presented to Hortolândia DT Chapter members in February/2020, during the weekly Thursday DT session, created to share knowledge and open space to practice the Enterprise Design Thinking (EDT) framework.
Exploratory testing is an approach to testing that emphasizes the freedom and responsibility of testers to continually optimize the value of their work. It is the process of three mutually supportive activities done in parallel: learning, test design, and test execution. With skill and practice, exploratory testers typically uncover an order of magnitude more problems than when the same amount of effort is spent on procedurally scripted testing. All testers conduct exploratory testing in one way or another, but few know how to do it systematically to obtain the greatest benefits. Even fewer can articulate the process. James Bach looks at specific heuristics and techniques of exploratory testing that will help you get the most from this highly productive approach. James focuses on the skills and dynamics of exploratory testing, and how it can be combined with scripted approaches.
Exploratory testing is a big part of 'agile' but what exactly does it mean?
How does it differ from other approaches? How do we get value from it? How does the team benefit from it?
How can the whole team participate? What are some misconceptions?
The time, effectiveness, and value of exploratory testing can be lost if the team doesn't have a decent understanding of what it is. It is one of the most powerful learning tools your team has, and the team may be losing out on its value.
During this interactive session, let's learn about the tool so we can make the best use of it, and get that time, effectiveness, and value back.
Tired of doing upfront test script creation in your testing efforts? Feeling bad about demotivating your testers? Want something to replace this sickening approach to software testing? This presentation outlines why test scripts are not useful, and how test ideas are the new way forward to better testing. Coverage, traceability, reporting, automation, and skills are all covered. Take a quick look and see whether there is another way to do software testing that is actually pure common sense.
Graham Thomas - Software Testing Secrets We Dare Not Tell - EuroSTAR 2013 - TEST Huddle
EuroSTAR Software Testing Conference 2013 presentation on Software Testing Secrets We Dare Not Tell by Graham Thomas.
See more at: http://conference.eurostarsoftwaretesting.com/past-presentations/
The Test Coverage Outline: Your Testing Road Map - TechWell
To assist in risk analysis, prioritization of testing, and test reporting (telling your testing story), you need a thorough Test Coverage Outline (TCO)—a road map of your proposed testing activities. By creating a TCO, you can prepare for testing without having to create a giant pile of detailed test cases. Paul Holland says that a comprehensive TCO helps the test team to get buy-in for the overall test strategy very early in the project and is valuable for identifying risk areas, testability issues, and resource constraints. Paul describes how to create a TCO including the use of heuristic-based checklists to help ensure you don’t overlook important elements in your testing. Learn multiple approaches for critical information gathering, the artifacts used as input for creating a TCO, and how you can use a TCO to maintain testing focus. Take back a new, lightweight tool to help you tell the testing story throughout your project.
David Hayman - Say What? Testing a Voice Activated System - EuroSTAR 2010 - TEST Huddle
EuroSTAR Software Testing Conference 2010 presentation on Say What? Testing a Voice Activated System by David Hayman. See more at: http://conference.eurostarsoftwaretesting.com/past-presentations/
A Rapid Introduction to Rapid Software Testing - TechWell
You're under tight time pressure and have barely enough information to proceed with testing. How do you test quickly and inexpensively, yet still produce informative, credible, and accountable results? Rapid Software Testing, adopted by context-driven testers worldwide, offers a field-proven answer to this all-too-common dilemma. In this one-day sampler of the approach, Paul Holland introduces you to the skills and practice of Rapid Software Testing through stories, discussions, and "minds-on" exercises that simulate important aspects of real testing problems. The rapid approach isn't just testing with speed or a sense of urgency; it's mission-focused testing that eliminates unnecessary work, assures that the most important things get done, and constantly asks how testers can help speed up the successful completion of the project. Join Paul to learn how rapid testing focuses on both the mind set and skill set of the individual tester who uses tight loops of exploration and critical thinking skills to help continuously re-optimize testing to match clients' needs and expectations.
And I thought I knew QTP - QTP Concepts Unplugged - Tarun Lalwani
How many times have you wished that there was somebody who could clear some niggling doubts about a particular aspect of QTP? Or explain some difficult-to-grasp concepts and smart workarounds? Or show you some of the lesser-known features of QTP?
Written by the author of the best-selling QTP book, “QuickTest Professional Unplugged”, this book does just that in a gripping story that will make you turn every page in anticipation. “And I Thought I knew QTP! – QTP Concepts Unplugged” is ‘different’ in the way it seeks to explain the various concepts through an interesting and innovative story-telling style (which is rarely used for technical books). Instead of following a textbook format, this book is more like a technical novel.
Whether it is to brush up your QTP concepts or simply to satiate your curiosity about how a seasoned IT veteran fared in a no-holds-barred ‘intellectual duel’ on QTP – whatever your reason to pick up this book, it is sure to leave you astounded with its pace of narration, expertise, and breadth of topics covered.
Exploratory testing is an approach to testing that emphasizes the freedom and responsibility of testers to continually optimize the value of their work. It is the process of three mutually supportive activities—learning, test design, and test execution—done in parallel. With skill and practice, exploratory testers typically uncover an order of magnitude more problems than when the same amount of effort is spent on procedurally scripted testing. All testers conduct exploratory testing in one way or another, but few know how to do it systematically to obtain the greatest benefits. Even fewer can articulate the process. Paul Holland looks at specific heuristics and techniques of exploratory testing that will help you get the most from this highly productive approach. Paul focuses on the skills and dynamics of exploratory testing, and how it can be combined with scripted approaches.
- An easy software interface to guide the user through program creation
- Intuitive and easy to use
- 5x7 full-color touch panel display
- Inviting to look at
- Most options are preset and allow toggling between options
- Testing capability through a preview screen
Coyote Teaching: A New (Old) Take on the Art of Mentorship - Michael Larsen
Coyote Teaching draws on the ideas of tribal cultures and focuses on the essential elements needed to create long-lasting memories and connections for real skills transfer. In this presentation, Harrison Lovell and Michael Larsen discuss the steps and approaches used in their mentoring relationship, and what they learned along the way.
The New Testers: Critical Skills and Capabilities to Deliver Quality at Speed - Michael Larsen
This isn’t your parents’ generation of computers and interaction, and the speed of change is only going to accelerate. Software development, and software testing with it, is undergoing a radical change. While organizations have embraced the idea of changes in development and delivery, why are we still treating old, so-called “best practices” in software testing as though we were still testing the software the previous generation wrote?
In this talk, I will discuss a variety of ways that testing is moving ahead and proving to be just as relevant as ever, and how we can equip the next generation of software testers through initiatives like SummerQAmp, PerScholas, Weekend Testing, and other environments aimed at delivering hands-on, real-world skills to up-and-coming testers. An emphasis on rapid learning, direct peer communication, and heuristics and their application can give new testers an edge, and could also help spark creativity and curiosity in established testers, too.
Big Data Testing: Ensuring MongoDB Data Quality - RTTS
You've made the move to MongoDB for its flexible schema and querying capabilities in order to enhance agility and reduce costs for your business. Shouldn't your data quality process be just as organized and efficient?
Using QuerySurge for testing your MongoDB data as part of your quality effort will increase your testing speed, boost your testing coverage (up to 100%), and improve the level of quality within your Big Data store. QuerySurge will help you keep your team organized and on track too!
To learn more about QuerySurge, visit www.QuerySurge.com
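The data-quality effort described above boils down to running systematic rule checks against the documents in a store. As a rough illustration of the idea (not QuerySurge's API; the field names and rules here are invented for the example, and a real setup would run against a live MongoDB collection), a minimal required-field check might look like:

```python
# A minimal sketch of a data-quality rule check over MongoDB-style documents.
# Field names ("customer_id", "email") are hypothetical; in practice the docs
# would come from a live collection rather than an in-memory list.

def check_documents(docs, required_fields):
    """Return (index, problem) pairs for documents that fail basic rules."""
    problems = []
    for i, doc in enumerate(docs):
        for field in required_fields:
            if field not in doc or doc[field] in (None, ""):
                problems.append((i, f"missing or empty field: {field}"))
    return problems

docs = [
    {"customer_id": 1, "email": "a@example.com"},
    {"customer_id": 2, "email": ""},   # fails: empty email
    {"email": "c@example.com"},        # fails: no customer_id
]
issues = check_documents(docs, ["customer_id", "email"])
print(issues)
```

A dedicated tool adds what this sketch lacks: scheduling, reporting, and coverage tracking across the whole data store.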
Agile Testing: The Role Of The Agile Tester - Declan Whelan
This presentation provides an overview of the role of testers on agile teams.
In essence, the differences between testers and developers should blur, so that the focus is on the whole team completing stories and delivering value.
Testers can add more value on agile teams by contributing earlier and moving from defect detection to defect prevention.
Simulating APIs for Effective Testing: (Micro)Service Virtualisation for the ... - Andrew Morgan
As we work more with distributed systems, microservices and legacy services, we introduce a web of inter-service dependencies that cause us to face many challenges across our development and deployment pipeline. Resource consumption, deployment time, our testing feedback cycle, third party service flakiness and costing can cause problems. This talk addresses these issues by demonstrating how the technique of ‘API Simulation’ (modern service virtualisation) can be used to overcome these issues. We’ll introduce the theory and practice, and use an open source tool named Hoverfly to easily produce and run third party services throughout your stack – from producing test environments, to unit testing, and to being used with custom middleware in staging environments. Come and learn about (micro)service virtualisation in the 21st century, and leave the session with practical techniques to improve your application testing.
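The core idea of API simulation is replacing a slow or flaky dependency with a fast, deterministic stand-in. The sketch below illustrates the concept generically with a tiny in-process HTTP stub; it does not use Hoverfly, and the endpoint and payload are invented for the example:

```python
# A toy illustration of API simulation: an in-process HTTP stub standing in
# for a third-party service so tests get fast, deterministic responses.
# The /rates/EUR endpoint and its payload are hypothetical.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

CANNED = {"/rates/EUR": {"base": "EUR", "USD": 1.08}}  # canned responses

class StubHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps(CANNED.get(self.path, {"error": "not simulated"})).encode()
        self.send_response(200 if self.path in CANNED else 404)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep test output quiet
        pass

server = HTTPServer(("127.0.0.1", 0), StubHandler)  # port 0: OS picks a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/rates/EUR"
with urllib.request.urlopen(url) as resp:
    data = json.loads(resp.read())
server.shutdown()
print(data)
```

Tools like Hoverfly build on this idea with recorded (captured) traffic, simulation modes, and middleware hooks, so the stub does not have to be hand-written per service.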
Introduction to Agile software testing - the fifth seminar in the public seminar series from KMS Technology, which has been delivered every two months since 2011.
This presentation was made by my dear classmate Sap; almost every presentation I have uploaded is compiled from the net and other sources. I hope this will be a little useful for students.
A Common Sense Guide to Agile Development and Testing that might just change your Agile approach forever.
Answering the 9 most common questions asked about Agile Testing:
- What is Agile Testing?
- Do we still need testers in Agile?
- What is an Agile Tester?
- What does a Software Tester Actually Do?
- Should we automate our testing?
- What tools should we use for our Agile Testing?
- How Much Should we Automate?
- How can we automate and still finish the sprint?
- How can we finish all our testing in the sprint?
A high quality download of the 9 points as a free "Print out and Keep" Poster is available at http://eviltester.com/agile
Test Automation in Agile: A Successful Implementation - TechWell
Many teams feel that they are forced to make an either/or decision when it comes to investing time to automate tests versus executing them manually. Sometimes a “silver bullet” tool is purchased, and testers are forced to use it when there may be a better option; other times unskilled team members are designated the automation engineers; and often there is a lack of good guidance on what to automate. These pitfalls cause product owners to de-prioritize those tasks when there’s a better way. Melissa Tondi shares how test teams should evaluate automated tools, both open source and commercial; areas to be aware of when traditional manual testers transition to automation engineers; and recommended priorities for automating tests. By streamlining automation tasks in your project and incorporating these recommendations, you’ll find that your automation intersection becomes a clearly marked thruway to a successfully released product.
[QE 2018] Paul Gerrard – Automating Assurance: Tools, Collaboration and DevOps - Future Processing
The Digital Transformation is real. It is having a profound effect on how business is done and the nature of the systems required to deliver productive customer experiences and consequent business benefits. The demand for flexible and rapid delivery of software and systems is there. Software development teams can deliver if they adopt the disciplines of Continuous Delivery, DevOps and in-production experimentation. The barrier to achieving success in the software delivery process is likely to be the inability of testers to align testing and automated testing in particular to the development processes. Our track record in test automation is not good enough. In order to succeed a new way of thinking about testing is required, and the New Model of Testing offers a way of identifying the elements of the test process that must be ‘shifted left’. This does not necessarily mean testers move, but rather that the thinking processes must move.
During this lecture, Paul showed that users, BAs, and developers can take some responsibility in this area. The New Model applies to all testing, whether performed in development, integration, system, or user testing, by people or tools.
Test automation and Agile software development - Bas Dijkstra
Slides for my workshop on test automation: creating realistic expectations around it, and the role of test automation in Agile software development.
Approaches to unraveling a complex test problem - Johan Hoberg
When testing a complex system you are often faced with complex test problems. Cause and effect cannot be deduced in advance, only in retrospect.
According to the Cynefin framework, the general approach to tackle complexity is probe-sense-respond. Try something, analyze the outcome, and based on that outcome, try something else. This is the basis of all my approaches to begin unraveling complex test problems. But how do I select my test scope for a specific complex test problem?
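The probe-sense-respond loop described above can be made concrete in a few lines. The sketch below is a schematic illustration (the "system under test" is a toy stand-in, and the scope-narrowing rule is one invented heuristic among many): run a probe, observe the outcome, and let failures steer where the next probes go.

```python
# A schematic probe-sense-respond loop in the Cynefin sense: try something,
# analyze the outcome, and based on that outcome, try something else.
# The system under test here is a toy function that fails for large inputs.

def system_under_test(value):
    # Toy SUT: misbehaves for inputs above 50 (the failure to be discovered).
    return "error" if value > 50 else "ok"

def probe_sense_respond(probes, max_steps=10):
    """Run probes; when one surfaces a failure, add new probes near it."""
    findings = []
    queue = list(probes)
    for _ in range(max_steps):
        if not queue:
            break
        probe = queue.pop(0)                 # probe: try something
        outcome = system_under_test(probe)   # sense: observe the outcome
        if outcome == "error":               # respond: zoom in near the failure
            findings.append(probe)
            queue = [probe - 10, probe - 1] + queue
    return findings

print(probe_sense_respond([5, 60, 20]))
```

Note how the test scope is not fixed in advance: each failing probe reshapes the queue, which is exactly why cause and effect in a complex system can only be mapped in retrospect.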
Despite the belief that a shared context and collaboration drives quality, too often, software testers and quality professionals struggle to find their place within today's integrated agile teams. This session is a practitioner’s view of testing and testing practices within an iterative/incremental development environment. We will begin with a discussion of some of the challenges of testing within an agile environment and delve into the guiding principles of Agile Testing and key enabling practices. Agile Testing necessitates a change in mindset, and it is as much, if not more, about behavior, as it is about skills and tooling, all of which will be explored.
By starting early and considering Accessibility as a core initiative of software development, organizations can develop software that is easier to use and makes information available to more people.
The Intersection of Accessibility and Inclusive Design - Michael Larsen
Accessibility and Inclusive Design are complementary initiatives. One makes information and services available to as many people as possible with the use of additional technology where needed. Inclusive Design focuses on making information and services available to as many as possible without having to use external technology. By blending these two initiatives, we can develop software that works better for everyone.
Senses Working Overtime: Improving Software Quality Through Accessibility and... - Michael Larsen
Using Inclusive Design principles, we can make the development of software applications better for everyone as well as making Accessibility easier to achieve.
My thanks to Tea Time With Testers for interviewing me as a feature story for their July 2013 issue. In this interview, I discuss discovering testing, dynamics of live blogging, technical skill development, return of investment on automation, an alternative to formal education and degrees, SummerQAmp and other topics.
There are three danger signs that testers need to look out for; Ignorance, Stupidity and Apathy. All three can derail us and keep us from learning and progressing, but Apathy is the most insidious and dangerous. Don't accept it, fight it! Here are some ideas as to how.
Accelerate your Kubernetes clusters with Varnish Caching - Thijs Feryn
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
"Impact of front-end architecture on development cost", Viktor Turskyi - Fwdays
I have heard many times that architecture is not important for the front-end. I have also seen, many times, how developers implement features on the front-end just by following the standard rules of a framework, thinking that this is enough to successfully launch the project, and then the project fails. How do we prevent this, and which approach should we choose? I have launched dozens of complex projects, and during the talk we will analyze which approaches have worked for me and which have not.
State of ICS and IoT Cyber Threat Landscape Report 2024 preview - Prayukth K V
The IoT and OT threat landscape report has been prepared by the Threat Research Team at Sectrio using data from Sectrio's cyber threat intelligence farming facilities spread across over 85 cities around the world. In addition, Sectrio also runs AI-based advanced threat and payload engagement facilities that serve as sinks to attract and engage sophisticated threat actors and newer malware, including new variants and latent threats that are at an earlier stage of development.
The latest edition of the OT/ICS and IoT security Threat Landscape Report 2024 also covers:
State of global ICS asset and network exposure
Sectoral targets and attacks as well as the cost of ransom
Global APT activity, AI usage, actor and tactic profiles, and implications
Rise in volumes of AI-powered cyberattacks
Major cyber events in 2024
Malware and malicious payload trends
Cyberattack types and targets
Vulnerability exploit attempts on CVEs
Attacks on countries – USA
Expansion of bot farms – how, where, and why
In-depth analysis of the cyber threat landscape across North America, South America, Europe, APAC, and the Middle East
Why are attacks on smart factories rising?
Cyber risk predictions
Axis of attacks – Europe
Systemic attacks in the Middle East
Download the full report from here:
https://sectrio.com/resources/ot-threat-landscape-reports/sectrio-releases-ot-ics-and-iot-security-threat-landscape-report-2024/
UiPath Test Automation using UiPath Test Suite series, part 3 - DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 3. In this session, we will cover desktop automation along with UI automation.
Topics covered:
UI automation Introduction,
UI automation Sample
Desktop automation flow
Pradeep Chinnala, Senior Consultant Automation Developer @WonderBotz and UiPath MVP
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
PHP Frameworks: I want to break free (IPC Berlin 2024)Ralf Eggert
In this presentation, we examine the challenges and limitations of relying too heavily on PHP frameworks in web development. We discuss the history of PHP and its frameworks to understand how this dependence has evolved. The focus will be on providing concrete tips and strategies to reduce reliance on these frameworks, based on real-world examples and practical considerations. The goal is to equip developers with the skills and knowledge to create more flexible and future-proof web applications. We'll explore the importance of maintaining autonomy in a rapidly changing tech landscape and how to make informed decisions in PHP development.
This talk is aimed at encouraging a more independent approach to using PHP frameworks, moving towards a more flexible and future-proof approach to PHP development.
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo...James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with PASSION for technology and making things work along with a knack for helping others understand how things work. He comes with around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations in CI/CD and application security integrated in software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
UiPath Test Automation using UiPath Test Suite series, part 4DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 4. In this session, we will cover Test Manager overview along with SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test management principles while learning practical approaches to optimize testing processes in SAP environments using heatmap visualization techniques
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
Execution from the test manager
Orchestrator execution result
Defect reporting
SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Connector Corner: Automate dynamic content and events by pushing a buttonDianaGray10
Here is something new! In our next Connector Corner webinar, we will demonstrate how you can use a single workflow to:
Create a campaign using Mailchimp with merge tags/fields
Send an interactive Slack channel message (using buttons)
Have the message received by managers and peers along with a test email for review
But there’s more:
In a second workflow supporting the same use case, you’ll see:
Your campaign sent to target colleagues for approval
If the “Approve” button is clicked, a Jira/Zendesk ticket is created for the marketing design team
But—if the “Reject” button is pushed, colleagues will be alerted via Slack message
Join us to learn more about this new, human-in-the-loop capability, brought to you by Integration Service connectors.
And...
Speakers:
Akshay Agnihotri, Product Manager
Charlie Greenberg, Host
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...Jeffrey Haguewood
Sidekick Solutions uses Bonterra Impact Management (fka Social Solutions Apricot) and automation solutions to integrate data for business workflows.
We believe integration and automation are essential to user experience and the promise of efficient work through technology. Automation is the critical ingredient to realizing that full vision. We develop integration products and services for Bonterra Case Management software to support the deployment of automations for a variety of use cases.
This video focuses on the notifications, alerts, and approval requests using Slack for Bonterra Impact Management. The solutions covered in this webinar can also be deployed for Microsoft Teams.
Interested in deploying notification automations for Bonterra Impact Management? Contact us at sales@sidekicksolutionsllc.com to discuss next steps.
GraphRAG is All You need? LLM & Knowledge GraphGuy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
Smart TV Buyer Insights Survey 2024 by 91mobiles.pdf91mobiles
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, aspects they look at on a new TV, and their TV buying preferences.
Leading Change strategies and insights for effective change management pdf 1.pdf
Get the Balance Right: Acceptance Test Driven Development, GUI Automation and Exploratory Testing
1. ATDD · GUI · ET
GET THE BALANCE RIGHT
ACCEPTANCE TEST DRIVEN DEVELOPMENT, GUI AUTOMATION AND EXPLORATORY TESTING
2. MICHAEL LARSEN
Senior Tester, Sidereel.com
Chair, Education Special Interest
Group, Association For Software Testing
Black Belt, Miagi-do School of Software Testing
Founder/Facilitator, Weekend Testing Americas
TESTHEAD: http://mkltesthead.com
Twitter: @mkltesthead
3. WHAT’S HAPPENED
SINCE 2001?
Eleven years since the Agile Manifesto
Test Driven Development (TDD) has emerged as a
standard to guide code development driven by tests.
Acceptance Test Driven Development (ATDD)
extends this model.
Proliferation of automated testing tools for a variety
of platforms and needs.
4. DO WE STILL NEED
TESTERS?
NO: Testing is dead. With the development
of these tools, developers now do the
testing earlier and up front [Savoia, 2011].
YES: Apply reason and dynamic thought to
problems. Thinking, actively engaged
testers are needed now more than ever
[Tomlinson, 2011].
6. ATDD, WHAT IS IT?
Places the focus on User Stories from the
expectations of the customers.
Is geared towards the members of the team
that are not necessarily programmers.
Uses tools that allow a more natural
language to present the requirements.
7. ATDD
Scenario: Twitter user signs in
Given I connect to twitter from the sign in page
When I log into twitter with good credentials
Then I go to the user private profile page
And I can confirm my twitter credentials
Expressed in Gherkin, the language used
with Cucumber.
Tied to actual coded methods to make the
process work.
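The "actual coded methods" behind a scenario like this are step definitions. As a hedged sketch of what sits behind those steps, here is plain Ruby standing in for a browser-driving step layer; `TwitterSession`, its method names, and the "good credentials" check are all invented for illustration, not from the talk:

```ruby
# Illustrative stand-in for the code behind the Gherkin steps above.
# TwitterSession and the credential check are invented for this sketch;
# a real step definition would drive a browser or an API client.
class TwitterSession
  def initialize
    @logged_in = false
    @user = nil
  end

  # "When I log into twitter with good credentials"
  def sign_in(user, password)
    @logged_in = !user.empty? && password == "s3cret"
    @user = user if @logged_in
    @logged_in
  end

  # "Then I go to the user private profile page"
  def private_profile_page
    raise "not signed in" unless @logged_in
    { user: @user, private: true }
  end
end

session = TwitterSession.new
session.sign_in("mkltesthead", "s3cret")
page = session.private_profile_page
puts page[:user]  # "And I can confirm my twitter credentials"
```

Each Gherkin line maps to one such method; the natural-language scenario stays readable for non-programmers while the mapping code does the work.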
8. TDD AND ATDD ARE
NOT TESTING!
TDD and ATDD are design processes.
Help develop clean, well-functioning code.
Help ensure the software being developed
meets the needs of the project.
9. WHY ISN’T THIS
TESTING?
Testing:
“The art and the act of asking questions of
a product, then developing & devising new
and more inventive questions based on the
answers we receive.”
10. WHERE DO TDD AND
ATDD EXCEL?
Disciplined approach to developing cleaner
software.
Act as a brake on "cowboy code".
New programmers can see code in context
with the tests that have been written.
11. WHERE DO TDD AND
ATDD EXCEL?
TDD and ATDD work well with Continuous Integration.
Keeps issues based on dependencies to a minimum.
Programmers focus on "just enough design" and "just
enough development".
Entire product team does design and development
work.
12. WHERE DO TDD AND
ATDD FALL SHORT?
Unrealistic expectations.
Can only be as good as the overall skill of the
developer(s).
Can be over applied.
13. WHAT IS FRONT END
GUI AUTOMATION?
Proliferation of tools that help simulate the actions
of users.
Follow standard workflows
• (logging in, navigating to pages, clicking on
links, filling in forms, etc.).
Tools range from lightweight apps like Texter up to
full-featured development suites
(Selenium/WebDriver, FitNesse, TestComplete, etc.).
14. HOW DOES GUI
AUTOMATION DIFFER
FROM TDD/ATDD?
Helps with constructing
acceptance test cases
Helps to check and demonstrate that the
acceptance criteria have been met.
19. WHERE DOES GUI
AUTOMATION EXCEL?
Tremendous blessing when dealing with
repetitious steps (set up and tear down).
Important part of Acceptance Test
verification.
A semi-proxy for human interaction.
20. GUI AUTOMATION
DEFICIENCIES
GUI tends to be the most fragile
layer of an application.
Tremendous overhead of script
maintenance.
Not good at checking context.
22. WHAT IS EXPLORATORY
TESTING?
“Scientific thinking in real-time” [Bach, 2012]
Puts test design and test execution
together.
The tests that we did before inform the tests
that we will do now. Those tests will inform
the tests we will perform later.
23. EXPLORATORY
TESTING
Concepts are defined, but they are not completely
predefined or run in a rigid sequence.
Allows for a development of test ideas that are
interesting.
Example: Automated tests can be made
exploratory by randomizing order.
Purpose: Discover where dependencies or state
conditions do exist.
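One low-cost way to apply this idea: run an existing scripted suite in a shuffled order with a recorded seed, so any order-dependent failure can be reproduced. A minimal sketch, with the three named checks and their shared state hash invented for illustration:

```ruby
# Sketch: randomizing the run order of scripted checks to expose hidden
# dependencies or leftover state. The checks below are invented examples;
# note "list accounts" silently depends on "create account" having run.
checks = {
  "create account" => ->(state) { (state[:users] ||= []) << "alice" },
  "list accounts"  => ->(state) { (state[:users] || []).length },
  "delete account" => ->(state) { (state[:users] ||= []).pop }
}

seed  = Random.new_seed                  # record it so a failure is reproducible
order = checks.keys.shuffle(random: Random.new(seed))
puts "seed=#{seed} order=#{order.inspect}"

state = {}
order.each { |name| checks[name].call(state) }
```

Printing the seed matters: a sequence that fails only in one ordering can be replayed exactly, turning a flaky surprise into a reproducible bug report.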
24. EXPLORATORY TESTING
VS. AUTOMATED
TESTING
Instead of saying "automated testing", I
prefer to use the term "computer aided
testing".
Exploratory Testing can, and often
does, use elements of computer aided
testing to help accomplish its goals.
25. AUTOMATION CANS
AND CANNOTS
Automation Can:
• Generate test data to be used in forms or as variable values.
• Parse the output of a program and use it as input for another program.
• Create a log of actions and transactions.
• Alert if an assertion is met or not met.
• Search for patterns in output, or help to reveal patterns we did not know about.

Automation Cannot:
• Create curiosity.
• Make sapient (actively thinking) decisions.
• Invent a new idea or an approach based on the output of a sequence of tests.
• Notice something unexpected (unless we have already defined what is unexpected).
• Make a judgment call as to the value or importance of a piece of functionality.
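The first "can", generating test data for forms, is often only a few lines. A hedged sketch, where the field values and the choice of edge cases are invented for illustration:

```ruby
# Sketch of automation generating form test data: unique happy-path emails
# plus a few deliberately awkward values. The values are invented examples.
require 'securerandom'

happy_path = Array.new(3) { |i| "user-#{i}-#{SecureRandom.hex(4)}@example.test" }
edge_cases = ["", " ", "no-at-sign.example", "a" * 255 + "@example.test"]

form_inputs = happy_path + edge_cases
puts form_inputs.length
```

Automation produces the values tirelessly; deciding which awkward values are worth trying is still the tester's sapient contribution from the "cannot" column.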
26. WHERE DOES
EXPLORATORY TESTING
EXCEL?
Allows the tester to choose the sequence of
steps that can provide answers about the
state or condition of a product.
Imagination and creativity open up
possibilities.
When there is little in the way of formalized
documentation.
27. EXPLORING
REQUIREMENTS
Remember this acceptance test?
Scenario: Twitter user signs in through Twitter
Given I connect to twitter from the sign in page
When I log into twitter with good credentials
Then I go to the user private profile page
And I can confirm my twitter credentials
28. SAMPLE EXPLORATION
QUESTIONS
Is there a login interface?
Does it provide for proper error handling if I can't log
in?
Does it give me feedback to let me know that I have
successfully logged in?
Can I trick the system into letting me log in with
improper credentials?
Does it present me with information that could help me
guess a login without actually having one?
Where else could I use this functionality?
29. EXPLORATORY TESTING
IS PART OF TDD/ATDD
Proposing a failing test first, and then
creating code that meets the acceptance
criteria
Programmer must consider what the
application needs to do
Determine which avenue(s) to implement
the feature(s).
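A minimal red/green illustration of that cycle, using plain Ruby assertions rather than any particular test framework; the `slugify` method is an invented example, not from the talk:

```ruby
# TDD in miniature: the failing expectation is written first, then just
# enough code to satisfy it. slugify is an invented example method.
def assert_equal(expected, actual)
  raise "expected #{expected.inspect}, got #{actual.inspect}" unless expected == actual
end

# Red: the assertion below was written before slugify existed, so it failed.
# Green: the smallest implementation that satisfies the acceptance criterion.
def slugify(title)
  title.strip.downcase.gsub(/\s+/, "-")
end

assert_equal "get-the-balance-right", slugify("  Get the Balance Right ")
puts "green"
```

The exploratory part is in choosing that first failing assertion: the programmer has to ask what the application needs to do before a line of product code exists.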
30. BENEFITS OF
EXPLORATORY TESTING
Walking down various paths
Jumping off point to look in other places
Simple Cucumber statement "And let me see that":
And /^let me see that$/ do
  puts "PAUSED - Press Enter to continue:"
  puts ""
  $stdin.gets
end
31. EXPLORATORY
TESTING’S DEFICIENCIES
Less likely to find new revelations by going
over the same ground (pesticide paradox)
Not enough time to go through all
possibilities.
Sentient humans do it. Sentient humans get
bored.
• SBTM techniques can help
32. CAN WE CREATE A
BALANCE?
TDD, ATDD, and front end GUI automation
are processes.
Exploratory Testing is a process and a
mindset.
Developing and defining acceptance is
exploratory.
Front End GUI Automation starts from
capturing discoveries made while
examining requirements.
33. THE ROLE OF
TESTERS
"Testing is dead" == testing needed to be a
part of an entire process of quality and
improvement, starting at the earliest stages
of design and development.
Exploration, automation, and testing must
come earlier in the process, and involve not
just testers, but everyone on the product
team.
34. STRIKING A BALANCE
Can we leverage the "testing
everywhere" idea, and help develop
a balance for all of these goals?
Can we see them as
interdependent, and not as separate
and standalone activities?
35. TDD AND ATDD TAKE AN
EXPLORATORY MINDSET
• Testing is part of the initial development effort.
• Testing comes first.
• Putting the emphasis on TDD/ATDD makes sure that it
meets the criteria it is being designed for.
Testers can also help guide and provide consideration for
testability in areas the programmers might not consider.
Asking "what if" questions is vital at this stage.
36. FOCUS ON DELIVERING
BUSINESS VALUE
Question 1: "is what I am doing helping
to deliver real value to my customers"?
If we put the business value first, that
will guide us in our testing efforts and
the approaches that we use
[Crispin, 2008].
37. AUTOMATION DOES NOT
HAVE TO BE PERMANENT
Make room for "throwaway automation": steps
that can be used to help explore different
avenues.
Freight or passenger trains vs. taxi cabs.
• Freight and passenger trains limited to the
rails the trains can ride on.
• Taxi cab can be called to appear anywhere
and take us anywhere.
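In the taxi-cab spirit, a throwaway script only needs to get us somewhere interesting fast; it is not maintained afterward. A sketch, where the in-memory "accounts" store is an invented stand-in for whatever real interface the product exposes:

```ruby
# "Throwaway automation" sketch: a disposable loop that drives the system to
# an interesting state so a human can explore from there. The in-memory
# accounts store is a stand-in for a real UI or API; nothing here is meant
# to be kept or maintained.
accounts = []
create_account = ->(email) { accounts << email unless accounts.include?(email) }

# Taxi-cab style: get somewhere specific fast, then discard the script.
100.times { |i| create_account.call("probe-#{i}@example.test") }
puts accounts.size  # now explore paging, search, and limits by hand
```

Because the script is disposable, there is no rail-laying cost: no page objects, no suite integration, no maintenance burden once the exploration session ends.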
38. EXPLORATION IS A
MINDSET
Exploratory Testing is a way of thinking
about questions and answers.
Exploratory Testing does not live outside
of automated testing.
Automation and Computer Aided Testing
can be central to exploration.
39. ZEN MIND,
BEGINNER'S MIND
Domain Expert Level Exploratory Testing
Novice Level Exploratory Testing
They are different.
A source of mental friction is knowing the
eccentricities of your application, and when
that knowledge can get in the way of effective
testing and exploration.
40. USE PERSONAS TO
VISUALIZE GOALS
Put yourself into the shoes of as many potential
customers of the product as you can.
That means understanding the interests, desires
and ways of interaction of many different groups of
people.
The ways they interact with the devices they
use to discover content are very different.
41. SESSION BASED
TECHNIQUES
• Developing small charters and executing
on them in set periods of time.
• Focus testing on the areas that are most
important.
• Focus on a small sub system.
Keep our testing fresh, our eyes and minds
focused, work on what's most important.
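A session charter can be as lightweight as a small structured note captured while testing. A sketch in the spirit of session-based test management; the field names and mission text are invented for illustration:

```ruby
# Sketch of a session-based test management (SBTM) charter record.
# The fields and the mission text are invented examples.
charter = {
  mission: "Explore sign-in error handling with malformed credentials",
  timebox_minutes: 45,
  notes: [],
  bugs: []
}

# Captured during the session, not written up afterward from memory.
charter[:notes] << "blank password gives a stack trace, not a message"
charter[:bugs]  << "ISSUE-1: server error on blank password"

puts "#{charter[:mission]} (#{charter[:notes].size} notes, #{charter[:bugs].size} bugs)"
```

The timebox keeps eyes fresh and bounds the boredom problem from the previous slide; the mission keeps the session accountable without scripting its steps.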
42. CONCLUSION
TDD focuses on writing correct and clean code.
ATDD focuses on making sure the functionality
being delivered makes good on the goals and
promises made to our customers.
Automation happens at many levels.
Exploration happens at all of these levels, and is
an approach and a mindset, not a methodology.
Working together, the odds of quality software are
much more heavily in our favor.
"By repeatedly running the same tests, we will not only not find new bugs, but the bugs that are left will be more resistant to the tests we do perform" [Beizer, 1990].
Developers have an exploratory mindset, but at the moment, ATDD explores to discover the real requirements. TDD is not exploratory in the same way. There is an exploratory aspect, though: the first TDD test leads you to the next test, and those small tests lead you through TDD.