In this session you will learn:
Introduction to HP Quality Center.
Release Management Module.
Test Plan Module.
Test Lab Module.
Defect Management Module.
Reports Module.
For more information: https://www.mindsmapped.com/courses/quality-assurance/qa-software-testing-training-for-beginners/
The document provides an overview and training on Test Director 7.6 for Intralinks QA team members. It describes the key components of Test Director including access, requirements tracking, test planning, test execution, and defect management. It explains how each component will be used as part of Intralinks' testing process and standard operating procedures.
The document describes Testlink processes and workflow. It includes:
1) An overview of Testlink modules including test project management, test plan management, requirement specification, build/release creation, test specification, test execution, and test report generation.
2) A high-level workflow showing the typical process from creating a test project to executing tests and generating reports.
3) Detailed steps for key Testlink functions like creating a test project, test plan, requirements, test suites, test cases, assigning tests, test execution, and reports.
TestLink is a test management tool that allows users to:
1. Create projects, test cases, test plans, and specify builds to organize testing.
2. Execute test cases and view reports on results including charts tracking progress.
3. Tag test cases with keywords and link them to requirements to ensure coverage.
This document provides an overview of Microsoft Test Manager (MTM) 2013 and how to use it for test planning, test case management, test runs, exploratory testing, and lab management. Key capabilities covered include creating test plans and test suites, managing manual and automated test cases, running tests and recording results, performing exploratory testing sessions, and setting up and using lab environments to collect diagnostic data during testing. The document demonstrates these capabilities through examples and screenshots.
The document discusses various aspects of developing a test strategy for software projects. It covers topics like test levels, roles and responsibilities, test types, test methodologies, test estimation processes, risk analysis and management. Some key points include defining the scope, risks, test priorities and approach in the strategy. It also discusses test estimation techniques like use case points, function points and test case points to estimate the testing effort.
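The test case point technique mentioned above can be sketched with a small calculation. The complexity weights and productivity figure below are assumed for illustration only; real projects calibrate them from historical data.

```python
# Illustrative test case point estimation. Weights and productivity
# are assumed example values, not figures from the document.

# Number of test cases in each complexity bucket.
test_cases = {"simple": 40, "average": 25, "complex": 10}

# Assumed points per complexity level.
weights = {"simple": 1, "average": 2, "complex": 3}

# Total test case points across all buckets.
total_points = sum(count * weights[level] for level, count in test_cases.items())

# Assumed productivity: points one tester completes per day.
points_per_day = 12

effort_days = total_points / points_per_day
print(total_points)  # 40*1 + 25*2 + 10*3 = 120
print(effort_days)   # 120 / 12 = 10.0
```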
This document outlines the test approach, scope, objectives, assumptions, and methodology for testing applications. It describes unit, integration, system, regression, and user acceptance testing. The primary objective is to ensure all requirements are met and the system functions as intended. The secondary objective is to identify and address all issues before release. Test deliverables include documents like the test approach, plan, and specifications as well as test cases, bug reports, and status reports.
The document outlines the test plan for Alfresco 4.2.1, including the scope of testing third party integrations and components, test environments on three stacks, and test schedule from January 13 to February 21. It provides links to test cases in Test Link and defect tracking in Jira. Post release tasks are defined for downloading artifacts and updating support documentation.
This document provides an overview of automation fundamentals and an introduction to QuickTest Professional (QTP) 9.2. It discusses what test automation is, its benefits, and when it is applicable. It also covers QTP concepts like the user interface, recording and running tests, checkpoints, parameters, synchronization, and the object repository. Key points include how QTP recognizes and identifies objects, how to save and view test results, and best practices for configuring options and settings in QTP.
This document provides an overview of automation fundamentals and an introduction to QuickTest Professional (QTP) 9.2. It discusses what test automation is, the benefits of automation, and factors to consider in automation planning. It also covers supported technologies and browsers in QTP, the add-in manager, and the main QTP window interface. The document provides a high-level introduction to recording and running tests in QTP.
Find out:
- what is a master test plan
- common parts of a master test plan
- master test plan in an Agile age
Full webinar recording video:
https://www.practitest.com/qa-learningcenter/webinars/master-test-plan-webinar/
The document discusses the benefits of using HP Quality Centre for testing web-based applications at IFSA. It provides an overview of Quality Centre's key features including managing requirements, test plans, test runs, and defects. It also describes how Quality Centre supports collaboration, provides reporting and dashboards, and integrates with test automation tools. The document concludes with some pros and cons of Quality Centre for IFSA's test team.
This document proposes adding test management and project management functionality to the MagicDraw modeling tool. This would allow monitoring test implementation progress by building tests to verify models, and monitoring new feature development by tracking key performance indicators. Tests would be associated with models to help ensure completeness. Dashboards would provide visibility into test and requirements coverage, relationships between releases, features and models, and metrics on automated testing. The goal is to help users deliver products with zero defects by shifting testing left into the modeling process.
The document provides an overview of software testing methodology and trends:
- It discusses the evolution of software development processes and how testing has changed and become more important. Testing now includes more automation, non-functional testing, and professional testers.
- The key components of a testing process framework are described, including test management, quality metrics, risk-based testing, and exploratory testing.
- Automation testing, performance testing, and popular testing tools are also covered.
- The future of software testing is discussed, with notes on faster release cycles, more complex applications, global testing teams, increased use of automation, and a focus on practices over processes.
This software test plan document provides details on testing for a software project called XXX. It describes the test environment, identification of planned tests, schedules, and traceability of requirements to tests. Tests are divided into phases and categories and will verify requirements from the software requirements specification document for XXX.
This document discusses simplifying test plans by removing unnecessary information and keeping them dynamic. It recommends including only essential information like test ownership, the system configuration under test, definition of done, identified risks, test activities, and a dynamic test schedule. The test plan should evolve continuously through a self-learning loop to improve test scope based on lessons learned. Static information can be moved to other documents to keep the test plan focused on guiding the test project.
Ho Chi Minh City Software Testing Conference January 2015
Software Testing in the Agile World
Website: www.hcmc-stc.org
Author: Tho Thanh Quang
As testing increasingly incurs a substantial cost in software development, many attempts have been made to automate the testing process. One notable approach is the automatic generation of test cases. Recent research has suggested generating test cases from UML-based diagrams, which are too formal to be applied effectively in industry. In this talk we introduce a framework, known as FATS (Framework for Automated Testing Scenarios), to address this problem. In FATS, user-defined use cases are represented in a markup language, so that activity graphs and test scenarios can be derived from them automatically.
Microsoft Test Manager is a test management tool that allows users to manage requirements, plan tests, execute tests, and track bugs throughout the testing lifecycle. It includes modules for requirements management, test case management, bug tracking and reporting, and risk management. Test teams can use Microsoft Test Manager as a centralized repository to store testing artifacts and control various testing activities like unit testing, integration testing, regression testing, and more.
The document outlines key QA documents used in the product development process including the Product Requirement Analysis Document (PRAD), Functional Specification, Test Strategy, Test Matrix, Test Cases, Test Results by Build, and Release Package.
The PRAD defines product requirements and is used by developers and QA. The Functional Specification details how features will be implemented and is used by QA to build test plans. The Test Strategy outlines QA's testing approach and criteria. The Test Matrix identifies test types, suites, and categories. Test Cases contain specific test steps, expected results, and status. Test Results by Build provide coverage reports. The Release Package compiles all documents and recommends release.
The document discusses software testing processes and techniques. It covers topics like test case design, validation testing vs defect testing, unit testing vs integration testing, interface testing, system testing, acceptance testing, regression testing, test management, deriving test cases from use cases, and test coverage. The key points are that software testing involves designing test cases, running programs with test data, comparing results to test cases, and reporting test results. Different testing techniques like unit testing, integration testing, and system testing address different levels or parts of the system. Test cases are derived from use case scenarios to validate system functionality.
This document provides an agenda and step-by-step instructions for teachers to create common formative assessments using the Galileo Assessment Program. The workshop aims to teach World History teachers at Cortez High School how to construct assessments, assign state standards to questions, publish and schedule assessments, analyze assessment data, and share assessments with their Professional Learning Community team. The ultimate goals are to facilitate the use of common formative assessments and data-driven instruction as part of the school's Professional Learning Community approach.
1. The document describes various testing documents created at different levels of a project testing process. Test policy, strategy, and methodology documents are created at higher levels, while test plans, cases, procedures, scripts, and reports are created at the project level.
2. It provides details on different testing documents - test policy defines testing objectives, test strategy defines the testing approach, and test methodology provides the testing approach for a specific project. It also describes how test plans are created, test cases are designed based on requirements, and the different levels of test execution.
3. The key testing documents created are test policy, strategy, methodology, plan, cases, procedures, scripts, and reports. Test cases are designed based on requirements.
In this quality assurance training session, you will learn HP ALM QC. Topics covered in this course are:
• HP ALM Overview
• HP ALM Solution
• HP ALM Segments
• QA Process
• Requirements
• Test Plan
• Test Lab
To know more, visit this link: https://www.mindsmapped.com/courses/quality-assurance/software-testing-quality-assurance-qa-training-with-hands-on-exercises/
The document discusses HP Quality Center, a test management tool. It covers the different modules in HP Quality Center including Release Management, Test Plan, Test Lab, and Defect Management. The document provides information on setting up releases and cycles in the Release Management module, designing test plans and test cases in the Test Plan module, creating and executing test sets in the Test Lab module, and tracking defects in the Defect Management module. It also discusses linking requirements to tests and generating reports.
The document discusses the phases of the Software Testing Life Cycle (STLC). It begins by introducing the group members and defining software testing as a process to find bugs by executing a program. It then outlines the six main phases of the STLC: 1) Requirements analysis to understand requirements and identify test cases, 2) Test planning to create test plans and strategies, 3) Test case development to write test cases and scripts, 4) Environment setup to prepare the test environment, 5) Test execution and bug reporting to run tests and log defects, and 6) Test cycle closure to review testing artifacts and lessons learned. Each phase is described in 1-2 sentences with its activities, deliverables, and examples provided.
In this quality assurance training session, you will learn QTP/UFT automation testing. Topics covered in this course are:
• Introduction to QTP
• Features of QTP
• Recording modes of QTP
• Object Repository
• Synchronization point
• Step Generator
• Object Spy
• Checkpoints
• Data-driven testing & Parameterization
• Working with actions
• Reporting in QTP
To know more, visit this link: https://www.mindsmapped.com/courses/quality-assurance/get-practical-training-on-software-testing-quality-assurance-qa/
The document discusses test design which includes creating test scenarios and test cases to thoroughly test all features of a system. It provides templates and guidelines for writing effective test scenarios and test cases, including elements like preconditions, test steps, and expected results. The document also discusses traceability matrices to map test cases to requirements and help determine test coverage.
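The traceability matrix described above can be sketched as a simple mapping from requirements to covering test cases. The requirement and test IDs here are hypothetical, invented for the example.

```python
# Minimal traceability-matrix sketch: map each requirement to the
# test cases that cover it, then report uncovered requirements.
# All IDs below are hypothetical.

requirements = ["REQ-1", "REQ-2", "REQ-3"]

# Each test case lists the requirements it verifies.
test_cases = {
    "TC-01": ["REQ-1"],
    "TC-02": ["REQ-1", "REQ-2"],
}

# Build the matrix: requirement -> list of covering test cases.
matrix = {req: [] for req in requirements}
for tc, reqs in test_cases.items():
    for req in reqs:
        matrix[req].append(tc)

# Requirements with no covering test case are coverage gaps.
uncovered = [req for req, tcs in matrix.items() if not tcs]
print(uncovered)  # ['REQ-3']
```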
The document discusses Application Lifecycle Management (ALM) and HP Quality Center. It provides information on what Quality Center is, its modules, how to map requirements to test cases, generate tests from requirements, and use filters. It also answers various questions related to Quality Center features, usage, libraries, databases, and more.
This document discusses continuous testing in an agile environment. It defines continuous testing as testing throughout the development process to identify bugs early. It explains that continuous testing helps control side effects, avoid defects, support multiple environments, get fast results, anticipate risks, and create reliable processes. The document provides an overview of how continuous testing works, including test environments, data management, automatic deployment, and test automation. It also discusses creating a continuous testing project, the agile test process, and how to implement effective continuous testing to improve quality and business value.
Nowadays in the IT market, most enterprises are trying to adopt agile and DevOps methodologies to meet time-to-market expectations and keep satisfying their customers. As a result, continuously deploying new applications has become a challenge for software development teams.
During the development cycles, several questions have been raised, and one of the most interesting is how to fit automated tests into agile projects, since within agile sprints there is simply not enough time to automate the full set of tests.
The Action Based Testing (ABT) methodology is emerging as a solution to help you achieve your automated test coverage goals within Agile iterations/sprints.
ABT uses a modular keyword-driven approach in which tests are organized in test modules and are built up of sequences of actions. Well-defined test modules can provide a healthy framework for teams to work with, in particular if each module has a clear and unambiguous scope, that scope is well differentiated from other test modules, and all test cases within the module reflect that scope.
A key differentiation is between business tests and interaction tests. Business tests have a business-oriented scope and should not contain UI details. Interaction tests focus on the interaction between the user (or another system) and the application.
This talk is about how to apply the ABT methodology to the SDLC, with some discussion of Hans Buwalda's three Holy Grails of test design.
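The modular keyword-driven idea behind ABT can be sketched as a test module of (action, arguments) rows dispatched to action implementations. The banking actions below are invented for illustration; this is not Hans Buwalda's actual tooling, only a minimal sketch of the approach.

```python
# Sketch of a keyword-driven test module: each row names an action
# keyword plus its arguments; a dispatcher maps keywords to functions.
# The account actions are hypothetical examples.

accounts = {}

def create_account(name, balance):
    accounts[name] = float(balance)

def deposit(name, amount):
    accounts[name] += float(amount)

def check_balance(name, expected):
    assert accounts[name] == float(expected), f"wrong balance for {name}"

ACTIONS = {
    "create account": create_account,
    "deposit": deposit,
    "check balance": check_balance,
}

# A business-level test module: domain actions only, no UI details.
test_module = [
    ("create account", "alice", 100),
    ("deposit", "alice", 50),
    ("check balance", "alice", 150),
]

for keyword, *args in test_module:
    ACTIONS[keyword](*args)
print("module passed")
```

Keeping rows at the business level like this is what separates business tests from interaction tests: UI clicks and field entry would live inside the action implementations, not in the module.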
This document provides an overview of automation fundamentals and an introduction to QuickTest Professional (QTP) 9.2. It discusses what test automation is, the benefits of automation, and factors to consider in automation planning. It also covers supported technologies and browsers in QTP, the add-in manager, and the main QTP window interface. The document provides a high-level introduction to recording and running tests in QTP.
Find out:
- what is a master test plan
- common parts of a master test plan
-master test plan in an Agile age
Full webinar recording video:
https://www.practitest.com/qa-learningcenter/webinars/master-test-plan-webinar/
The document discusses the benefits of using HP Quality Centre for testing web-based applications at IFSA. It provides an overview of Quality Centre's key features including managing requirements, test plans, test runs, and defects. It also describes how Quality Centre supports collaboration, provides reporting and dashboards, and integrates with test automation tools. The document concludes with some pros and cons of Quality Centre for IFSA's test team.
This document proposes adding test management and project management functionality to MagicDraw modeling tool. This would allow monitoring test implementation progress by building tests to verify models, and monitoring new feature development by tracking key performance indicators. Tests would be associated with models to help ensure completeness. Dashboards would provide visibility into test and requirements coverage, relationships between releases, features and models, and metrics on automated testing. The goal is to help users deliver products with zero defects by shifting testing left into the modeling process.
The document provides an overview of software testing methodology and trends:
- It discusses the evolution of software development processes and how testing has changed and become more important. Testing now includes more automation, non-functional testing, and professional testers.
- The key components of a testing process framework are described, including test management, quality metrics, risk-based testing, and exploratory testing.
- Automation testing, performance testing, and popular testing tools are also covered.
- The future of software testing is discussed, with notes on faster release cycles, more complex applications, global testing teams, increased use of automation, and a focus on practices over processes.
This software test plan document provides details on testing for a software project called XXX. It describes the test environment, identification of planned tests, schedules, and traceability of requirements to tests. Tests are divided into phases and categories and will verify requirements from the software requirements specification document for XXX.
This document discusses simplifying test plans by removing unnecessary information and keeping them dynamic. It recommends including only essential information like test ownership, the system configuration under test, definition of done, identified risks, test activities, and a dynamic test schedule. The test plan should evolve continuously through a self-learning loop to improve test scope based on lessons learned. Static information can be moved to other documents to keep the test plan focused on guiding the test project.
Ho Chi Minh City Software Testing Conference January 2015
Software Testing in the Agile World
Website: www.hcmc-stc.org
Author: Tho Thanh Quang
As testing is increasingly incurring a substantial cost in software development, there are many attempts made to automate the testing process. One notable approach is automatic generation of test cases. Recent research has suggested generating test cases from UML-based diagrams, which are over-formal to be applied effectively in industry. In this talk we introduce a framework, known as FATS (Framework for Automated Testing Scenarios), to counter this problem. In FATS, we suggest representing user-defined use-cases by a markup language, therefore activity graphs and test scenarios can be developed accordingly in an automatic manner.
Microsoft Test Manager is a test management tool that allows users to manage requirements, plan tests, execute tests, and track bugs throughout the testing lifecycle. It includes modules for requirements management, test case management, bug tracking and reporting, and risk management. Test teams can use Microsoft Test Manager as a centralized repository to store testing artifacts and control various testing activities like unit testing, integration testing, regression testing, and more.
The document outlines key QA documents used in the product development process including the Product Requirement Analysis Document (PRAD), Functional Specification, Test Strategy, Test Matrix, Test Cases, Test Results by Build, and Release Package.
The PRAD defines product requirements and is used by developers and QA. The Functional Specification details how features will be implemented and is used by QA to build test plans. The Test Strategy outlines QA's testing approach and criteria. The Test Matrix identifies test types, suites, and categories. Test Cases contain specific test steps, expected results, and status. Test Results by Build provide coverage reports. The Release Package compiles all documents and recommends release.
The document discusses software testing processes and techniques. It covers topics like test case design, validation testing vs defect testing, unit testing vs integration testing, interface testing, system testing, acceptance testing, regression testing, test management, deriving test cases from use cases, and test coverage. The key points are that software testing involves designing test cases, running programs with test data, comparing results to test cases, and reporting test results. Different testing techniques like unit testing, integration testing, and system testing address different levels or parts of the system. Test cases are derived from use case scenarios to validate system functionality.
This document provides an agenda and step-by-step instructions for teachers to create common formative assessments using the Galileo Assessment Program. The workshop aims to teach World History teachers at Cortez High School how to construct assessments, assign state standards to questions, publish and schedule assessments, analyze assessment data, and share assessments with their Professional Learning Community team. The ultimate goals are to facilitate the use of common formative assessments and data-driven instruction as part of the school's Professional Learning Community approach.
1. The document describes various testing documents created at different levels of a project testing process. Test policy, strategy, and methodology documents are created at higher levels, while test plans, cases, procedures, scripts, and reports are created at the project level.
2. It provides details on different testing documents - test policy defines testing objectives, test strategy defines the testing approach, and test methodology provides the testing approach for a specific project. It also describes how test plans are created, test cases are designed based on requirements, and the different levels of test execution.
3. The key testing documents created are test policy, strategy, methodology, plan, cases, procedures, scripts, and reports. Test cases are designed based
In this quality assurance training session, you will learn HP ALM QC. Topics covered in this course are:
• HP ALM Overview
• HP ALM Solution
• HP ALM Segments
• QA Process
• Requirements
• Test Plan
• Test LAB
To know more, visit this link: https://www.mindsmapped.com/courses/quality-assurance/software-testing-quality-assurance-qa-training-with-hands-on-exercises/
The document discusses HP Quality Center, a test management tool. It covers the different modules in HP Quality Center including Release Management, Test Plan, Test Lab, and Defect Management. The document provides information on setting up releases and cycles in the Release Management module, designing test plans and test cases in the Test Plan module, creating and executing test sets in the Test Lab module, and tracking defects in the Defect Management module. It also discusses linking requirements to tests and generating reports.
The document discusses the phases of the Software Testing Life Cycle (STLC). It begins by introducing the group members and defining software testing as a process to find bugs by executing a program. It then outlines the six main phases of the STLC: 1) Requirements analysis to understand requirements and identify test cases, 2) Test planning to create test plans and strategies, 3) Test case development to write test cases and scripts, 4) Environment setup to prepare the test environment, 5) Test execution and bug reporting to run tests and log defects, and 6) Test cycle closure to review testing artifacts and lessons learned. Each phase is described in 1-2 sentences with its activities, deliverables, and examples provided.
In this quality assurance training session, you will learn QTP/UFT automation testing. Topics covered in this course are:
• Introduction to QTP
• Features of QTP
• Recording modes of QTP
• Object Repository
• Synchronization point
• Step Generator
• Object Spy
• Checkpoints
• Data-driven testing & Parameterization
• Working with actions
• Reporting in QTP
TO know more, visit this link: https://www.mindsmapped.com/courses/quality-assurance/get-practical-training-on-software-testing-quality-assurance-qa/
The document discusses test design which includes creating test scenarios and test cases to thoroughly test all features of a system. It provides templates and guidelines for writing effective test scenarios and test cases, including elements like preconditions, test steps, and expected results. The document also discusses traceability matrices to map test cases to requirements and help determine test coverage.
The document discusses Application Lifecycle Management (ALM) and HP Quality Center. It provides information on what Quality Center is, its modules, how to map requirements to test cases, generate tests from requirements, and use filters. It also answers various questions related to Quality Center features, usage, libraries, databases, and more.
This document discusses continuous testing in an agile environment. It defines continuous testing as testing throughout the development process to identify bugs early. It explains that continuous testing helps control side effects, avoid defects, support multiple environments, get fast results, anticipate risks, and create reliable processes. The document provides an overview of how continuous testing works, including test environments, data management, automatic deployment, and test automation. It also discusses creating a continuous testing project, the agile test process, and how to implement effective continuous testing to improve quality and business value.
Nowadays in IT market, most of enterprises are trying to adopt agile and DevOps methodology to meet the time-to-market expectations and continue satisfying their customers. As the result, continue deploying new applications are become challenges for any software development teams.
During the development cycles, several questions has been identified and one of the most interesting questions is How to fit automated tests into agile projects because within agile sprints, there is simply not enough time to automate the set of tests?
Action Based Testing (ABT) methodology is becoming a solution to help you achieve your expectations on automated test coverage within the Agile iterations/sprints.
ABT uses a modular keyword-driven approach which tests are organized in test modules and are built up of sequences of actions. Well-defined test modules can provide a healthy framework for teams to work with, in particular if modules have a clear and unambiguous scope, the scope is well-differentiated from other test modules, and all test cases …within the test module reflect the scope.
A key differentiation is between business tests and interaction tests. Business tests have a business-oriented scope and should not contain UI details. Interaction tests focus on the interaction between the user (or another system) and the application.
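The idea of building tests from sequences of actions can be sketched in a few lines of Python. This is an illustrative sketch only: the action names, the registry, and the `state` dictionary are invented here and are not part of any ABT tool.

```python
# Minimal action-based testing sketch: a test module is a sequence of
# (action, arguments) rows, dispatched against a registry of action keywords.
ACTIONS = {}

def action(name):
    """Register a function as a named action keyword."""
    def register(fn):
        ACTIONS[name] = fn
        return fn
    return register

state = {"logged_in": False}  # stand-in for the application under test

@action("log in")
def log_in(user, password):
    state["logged_in"] = True  # placeholder for a real UI interaction

@action("check logged in")
def check_logged_in(expected):
    assert state["logged_in"] == expected, "login state mismatch"

def run_module(test_module):
    """Execute a test module: a list of (action_name, args) rows."""
    for name, args in test_module:
        ACTIONS[name](*args)

# A business-level test module: business actions only, no UI details.
login_module = [
    ("log in", ("demo_user", "secret")),
    ("check logged in", (True,)),
]
run_module(login_module)
print("module passed")
```

Note how the business test ("log in") carries no UI details; the UI interaction lives inside the action implementation, matching the business/interaction split described above.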
This topic is about how to apply ABT methodologies within the SDLC, with some discussion of the three holy grails of test design from Hans Buwalda.
Software test management overview for managers, by TJamesLeDoux
Software test management presentation given to the senior management of several Fortune 100 companies to aid them in planning their software development management efforts.
The document provides information on types of software testing, test strategy and planning, and test estimation techniques. It describes various types of testing including functional, system, end-to-end, load, security, and others. It also discusses test strategy, test planning, and creating test plans. Finally, it outlines several techniques for estimating testing efforts such as best guess, analogies, work breakdown structure, three-point estimation, and function point analysis.
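One of the techniques listed, three-point estimation, reduces to a single formula: the PERT expected value E = (O + 4M + P) / 6. A minimal sketch, with made-up effort figures:

```python
def three_point_estimate(optimistic, most_likely, pessimistic):
    """PERT beta-distribution estimate: E = (O + 4M + P) / 6."""
    expected = (optimistic + 4 * most_likely + pessimistic) / 6
    std_dev = (pessimistic - optimistic) / 6  # common rough spread measure
    return expected, std_dev

# Example: testing effort in person-days for one module.
e, sd = three_point_estimate(4, 6, 14)
print(f"expected effort: {e:.1f} person-days (std dev {sd:.1f})")  # 7.0, 1.7
```

The weighting toward the most likely value is what distinguishes this from a plain average of the three figures.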
Testing frameworks provide an execution environment for automated tests. The main types are modular, data-driven, and keyword-driven frameworks. Modular frameworks organize tests into independent scripts representing application modules. Data-driven frameworks store test data and expected results in external files to reduce code duplication. Keyword-driven frameworks use external files to store test actions and data. Hybrid frameworks combine advantages of the different approaches. While frameworks work with waterfall models, agile methodologies benefit more from test-driven development and behavior-driven development which integrate testing throughout development.
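The data-driven style described here can be reduced to a small sketch: test data and expected results live outside the test logic (a string standing in for an external CSV file), and one script iterates over the rows. The `add` function is a stand-in for the application under test; the column names are invented for illustration.

```python
import csv
import io

# Stand-in for the application under test.
def add(a, b):
    return a + b

# External test data: in a real framework this would be a CSV or Excel file.
TEST_DATA = io.StringIO("""a,b,expected
1,2,3
10,-4,6
0,0,0
""")

def run_data_driven(reader):
    """Run the same test logic once per data row; return pass/fail flags."""
    results = []
    for row in csv.DictReader(reader):
        actual = add(int(row["a"]), int(row["b"]))
        results.append(actual == int(row["expected"]))
    return results

print(run_data_driven(TEST_DATA))  # [True, True, True]
```

Adding a test case means adding a data row, not more code, which is the duplication-reducing property the summary describes.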
The document discusses test cases, defects (bugs), and bug reports. It provides definitions and examples of test cases, their purpose and components. Examples of test management tools and test-driven development are also presented. Defects and what constitutes a good bug report are defined. The importance of collaboration between testers and developers is emphasized.
This document provides an overview of how to use TestDirector 8.0 software for test management. It discusses setting up test requirements and cases, creating test sets, executing tests, tracking defects, and analyzing results using reports and graphs. The training objectives are to learn TestDirector functionality and features for managing the entire testing process from one central location. Instructional methods include slides, demonstrations, and hands-on exercises.
7 Tips from Siemens Energy for Success with Automation, by Worksoft
Nathan Sharp of Siemens Energy recently spoke at the SAP Project Management conference in Atlanta and shared 7 important elements for the successful adoption of automated business process validation in their organization.
Originally presented by Nathan Sharp of Siemens Energy at SAPinsider’s Project Management conference.
How Manual Testers Can Break into Automation Without Programming Skills, by Ranorex
Adoption of test automation has not happened as quickly as organizations need. As more companies adopt agile development as their software development lifecycle, more features are implemented and released more quickly. This leaves less time for full regression testing of the system, which nonetheless still needs to be done, so manual testers need to grow into test automation roles as well.
Learn how to make this jump as a manual tester and where to focus first, e.g. automation test structure, object recognition, and results interpretation.
This document provides instructions for creating a simple test in TestComplete. It describes adding the sample Orders application to the list of tested applications, planning a test to add a new order, recording user actions to perform that test, analyzing the recorded test, running the test, and analyzing the test results. The goal is to create an automated test that emulates user actions in the Orders application and verifies that a new order was added correctly.
This document provides an overview of test execution, including its purpose, entry and exit criteria, cycles, and methodologies. Test execution involves running test cases against software to find defects and assess quality. Key activities include verifying the test environment, selecting test cases, executing them, and logging any defects found. Test cycles are run as planned, with additional cycles as more defects are uncovered. Retesting and regression testing help ensure defects are closed without impacting previous functionality. Testing ends when criteria like completing test cases, reaching an acceptable defect rate, and schedule constraints are met.
The document provides step-by-step instructions for using the TestLink test management system. It explains how to create projects, test cases, test plans, assign testers, execute test cases, and view reports. Additional features covered include assigning keywords to test cases and specifying requirements. The benefits of TestLink include having structured and organized documentation, version control, and the ability to track and report on the testing process.
In this session you will learn:
Configuring Selenium - Webdriver
For more information: https://www.mindsmapped.com/courses/quality-assurance/qa-software-testing-training-for-beginners/
In this session you will learn:
Installing Selenium IDE
For more information: https://www.mindsmapped.com/courses/quality-assurance/qa-software-testing-training-for-beginners/
In this session you will learn:
FEATURES OF SELENIUM
COMPONENTS OF SELENIUM
SELENIUM IDE
SELENIUM RC
SELENIUM Web Driver
SELENIUM GRID
SELENESE
For more information: https://www.mindsmapped.com/courses/quality-assurance/qa-software-testing-training-for-beginners/
In this session you will learn:
Software Testing Tool – Overview
Advantages of Automation
Disadvantages of Automation
Grouping of Automation Tool
Functional Tool
Source Code Testing Tool
Performance Tool
Test Management Tool
Security Testing Tool
For more information: https://www.mindsmapped.com/courses/quality-assurance/qa-software-testing-training-for-beginners/
In this session you will learn:
What Exactly is JIRA?
JIRA as an Issue Tracker
JIRA as a Project Management Tool
JIRA Roles
JIRA Request Format
JIRA Workflow Model
General JIRA Structure
Browsing Project issues
Created vs Resolved issue Report
JIRA Help
For more information: https://www.mindsmapped.com/courses/quality-assurance/qa-software-testing-training-for-beginners/
In this session you will learn:
Defect Life Cycle
Defect States
Defect Content
Severity Vs Priority
Severity Levels
For more information: https://www.mindsmapped.com/courses/quality-assurance/qa-software-testing-training-for-beginners/
In this session you will learn:
Test Case Design and Techniques
Black-box: Three major approaches
Steps for Drawing a Cause-Effect Diagram
Behavior Testing
Random Testing
White Box Techniques
Path Testing
Statement Coverage
Data Flow Testing
For more information: https://www.mindsmapped.com/courses/quality-assurance/qa-software-testing-training-for-beginners/
In this session you will learn:
Test Estimation Techniques
Work Breakdown Structure (WBS)
Benefits of Work Breakdown Structure
Three Point Estimation
Functional Point Method
For more information: https://www.mindsmapped.com/courses/quality-assurance/qa-software-testing-training-for-beginners/
In this session you will learn:
Test Strategy and Planning
Test Strategy Document
Test Planning
Test Estimation Techniques
For more information: https://www.mindsmapped.com/courses/quality-assurance/qa-software-testing-training-for-beginners/
In this session you will learn:
Requirement Management
Configuration Management
Project Management
Risk Management
For more information: https://www.mindsmapped.com/courses/quality-assurance/qa-software-testing-training-for-beginners/
In this session you will learn:
Types of Testing
Start and Stop of Testing
For more information: https://www.mindsmapped.com/courses/quality-assurance/qa-software-testing-training-for-beginners/
In this session you will learn:
Overview of Testing Life Cycle
Testing Methodologies
Black Box Testing
White Box Testing
Gray Box Testing
Integration Testing
System Testing
Regression Testing
User Acceptance Testing (UAT)
For more information: https://www.mindsmapped.com/courses/quality-assurance/qa-software-testing-training-for-beginners/
In this session you will learn:
Agile Approach
What does the Agile Manifesto Mean?
12 Principles of Agile
Central: Incremental and Iterative Development
Agile Methods
Scrum Lifecycle
Agile Methods – Scrum
Scrum Values
For more information: https://www.mindsmapped.com/courses/quality-assurance/qa-software-testing-training-for-beginners/
Introduction to Software Testing - Part 2, by Sachin-QA
In this session you will learn:
Defect/Bugs in Software Testing
Quality Team Roles and Responsibilities
Career options available for a Test Engineer
Testing documentation
Testing Fundamentals
Testing Certification
For more information: https://www.mindsmapped.com/courses/quality-assurance/qa-software-testing-training-for-beginners/
In this session you will learn:
SDLC and Quality Standard
What is SDLC and Stages
Phases of SDLC
SDLC Models
For more information: https://www.mindsmapped.com/courses/quality-assurance/qa-software-testing-training-for-beginners/
In this session you will learn:
Course Overview
Introduction to Software Testing
Is Testing a Technical role
Project And Product
Quality Assurance Vs Quality Control
QC VS QA
Verification and Validation
For more information: https://www.mindsmapped.com/courses/quality-assurance/qa-software-testing-training-for-beginners/
In this session you will learn:
Introduction to Test Automation Framework
What is a Test Automation Framework?
Utility of Test Automation Framework
Sample Automation Test Framework
Types of Automation Frameworks
Data Driven Automation Framework
Keyword Driven Automation Framework
Hybrid Automation Framework
Benefits of Automation Framework Approach
For more information: https://www.mindsmapped.com/courses/quality-assurance/qa-software-testing-training-for-beginners/
Climate Impact of Software Testing at Nordic Testing Days, by Kari Kakkonen
My slides at Nordic Testing Days 6.6.2024
The talk discusses the climate impact and sustainability of software testing. ICT and testing must carry their part of the global responsibility to help counter climate warming. We can minimize the carbon footprint, but we can also have a carbon handprint, a positive impact on the climate. Quality characteristics can be extended with sustainability and then measured continuously. Test environments can be used less, at smaller scale, and on demand. Test techniques can be used to optimize or minimize the number of tests. Test automation can be used to speed up testing.
TrustArc Webinar - 2024 Global Privacy Survey, by TrustArc
How does your privacy program stack up against your peers? What challenges are privacy teams tackling and prioritizing in 2024?
In the fifth annual Global Privacy Benchmarks Survey, we asked over 1,800 global privacy professionals and business executives to share their perspectives on the current state of privacy inside and outside of their organizations. This year’s report focused on emerging areas of importance for privacy and compliance professionals, including considerations and implications of Artificial Intelligence (AI) technologies, building brand trust, and different approaches for achieving higher privacy competence scores.
See how organizational priorities and strategic approaches to data security and privacy are evolving around the globe.
This webinar will review:
- The top 10 privacy insights from the fifth annual Global Privacy Benchmarks Survey
- The top challenges for privacy leaders, practitioners, and organizations in 2024
- Key themes to consider in developing and maintaining your privacy program
Dr. Sean Tan, Head of Data Science, Changi Airport Group
Discover how Changi Airport Group (CAG) leverages graph technologies and generative AI to revolutionize their search capabilities. This session delves into the unique search needs of CAG’s diverse passengers and customers, showcasing how graph data structures enhance the accuracy and relevance of AI-generated search results, mitigating the risk of “hallucinations” and improving the overall customer journey.
Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0! by SOFTTECHHUB
As the digital landscape continually evolves, operating systems play a critical role in shaping user experiences and productivity. The launch of Nitrux Linux 3.5.0 marks a significant milestone, offering a robust alternative to traditional systems such as Windows 11. This article delves into the essence of Nitrux Linux 3.5.0, exploring its unique features, advantages, and how it stands as a compelling choice for both casual users and tech enthusiasts.
20 Comprehensive Checklist of Designing and Developing a Website, by Pixlogix Infotech
Dive into the world of Website Designing and Developing with Pixlogix! Looking to create a stunning online presence? Look no further! Our comprehensive checklist covers everything you need to know to craft a website that stands out. From user-friendly design to seamless functionality, we've got you covered. Don't miss out on this invaluable resource! Check out our checklist now at Pixlogix and start your journey towards a captivating online presence today.
Threats to mobile devices are increasingly prevalent and growing in scope and complexity. Users of mobile devices want to take full advantage of the features available on those devices, but many of those features trade security for convenience and capability. This best practices guide outlines steps users can take to better protect their personal devices and information.
In the rapidly evolving landscape of technologies, XML continues to play a vital role in structuring, storing, and transporting data across diverse systems. The recent advancements in artificial intelligence (AI) present new methodologies for enhancing XML development workflows, introducing efficiency, automation, and intelligent capabilities. This presentation will outline the scope and perspective of utilizing AI in XML development. The potential benefits and the possible pitfalls will be highlighted, providing a balanced view of the subject.
We will explore the capabilities of AI in understanding XML markup languages and autonomously creating structured XML content. Additionally, we will examine the capacity of AI to enrich plain text with appropriate XML markup. Practical examples and methodological guidelines will be provided to elucidate how AI can be effectively prompted to interpret and generate accurate XML markup.
Further emphasis will be placed on the role of AI in developing XSLT, or schemas such as XSD and Schematron. We will address the techniques and strategies adopted to create prompts for generating code, explaining code, or refactoring the code, and the results achieved.
The discussion will extend to how AI can be used to transform XML content. In particular, the focus will be on the use of AI XPath extension functions in XSLT, Schematron, Schematron Quick Fixes, or for XML content refactoring.
The presentation aims to deliver a comprehensive overview of AI usage in XML development, providing attendees with the necessary knowledge to make informed decisions. Whether you’re at the early stages of adopting AI or considering integrating it in advanced XML development, this presentation will cover all levels of expertise.
By highlighting the potential advantages and challenges of integrating AI with XML development tools and languages, the presentation seeks to inspire thoughtful conversation around the future of XML development. We’ll not only delve into the technical aspects of AI-powered XML development but also discuss practical implications and possible future directions.
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor..., by SOFTTECHHUB
The choice of an operating system plays a pivotal role in shaping our computing experience. For decades, Microsoft's Windows has dominated the market, offering a familiar and widely adopted platform for personal and professional use. However, as technological advancements continue to push the boundaries of innovation, alternative operating systems have emerged, challenging the status quo and offering users a fresh perspective on computing.
One such alternative that has garnered significant attention and acclaim is Nitrux Linux 3.5.0, a sleek, powerful, and user-friendly Linux distribution that promises to redefine the way we interact with our devices. With its focus on performance, security, and customization, Nitrux Linux presents a compelling case for those seeking to break free from the constraints of proprietary software and embrace the freedom and flexibility of open-source computing.
GraphSummit Singapore | The Future of Agility: Supercharging Digital Transfor..., by Neo4j
Leonard Jayamohan, Partner & Generative AI Lead, Deloitte
This keynote will reveal how Deloitte leverages Neo4j’s graph power for groundbreaking digital twin solutions, achieving a staggering 100x performance boost. Discover the essential role knowledge graphs play in successful generative AI implementations. Plus, get an exclusive look at an innovative Neo4j + Generative AI solution Deloitte is developing in-house.
Enhancing adoption of Open Source Libraries. A case study on Albumentations.AI, by Vladimir Iglovikov, Ph.D.
Presented by Vladimir Iglovikov:
- https://www.linkedin.com/in/iglovikov/
- https://x.com/viglovikov
- https://www.instagram.com/ternaus/
This presentation delves into the journey of Albumentations.ai, a highly successful open-source library for data augmentation.
Created out of a necessity for superior performance in Kaggle competitions, Albumentations has grown to become a widely used tool among data scientists and machine learning practitioners.
This case study covers various aspects, including:
People: The contributors and community that have supported Albumentations.
Metrics: The success indicators such as downloads, daily active users, GitHub stars, and financial contributions.
Challenges: The hurdles in monetizing open-source projects and measuring user engagement.
Development Practices: Best practices for creating, maintaining, and scaling open-source libraries, including code hygiene, CI/CD, and fast iteration.
Community Building: Strategies for making adoption easy, iterating quickly, and fostering a vibrant, engaged community.
Marketing: Both online and offline marketing tactics, focusing on real, impactful interactions and collaborations.
Mental Health: Maintaining balance and not feeling pressured by user demands.
Key insights include the importance of automation, making the adoption process seamless, and leveraging offline interactions for marketing. The presentation also emphasizes the need for continuous small improvements and building a friendly, inclusive community that contributes to the project's growth.
Vladimir Iglovikov brings his extensive experience as a Kaggle Grandmaster, ex-Staff ML Engineer at Lyft, sharing valuable lessons and practical advice for anyone looking to enhance the adoption of their open-source projects.
Explore more about Albumentations and join the community at:
GitHub: https://github.com/albumentations-team/albumentations
Website: https://albumentations.ai/
LinkedIn: https://www.linkedin.com/company/100504475
Twitter: https://x.com/albumentations
Introducing Milvus Lite: Easy-to-Install, Easy-to-Use vector database for you..., by Zilliz
Join us to introduce Milvus Lite, a vector database that can run on notebooks and laptops, share the same API with Milvus, and integrate with every popular GenAI framework. This webinar is perfect for developers seeking easy-to-use, well-integrated vector databases for their GenAI apps.
Let's Integrate MuleSoft RPA, COMPOSER, APM with AWS IDP along with Slackshyamraj55
Discover the seamless integration of RPA (Robotic Process Automation), COMPOSER, and APM with AWS IDP enhanced with Slack notifications. Explore how these technologies converge to streamline workflows, optimize performance, and ensure secure access, all while leveraging the power of AWS IDP and real-time communication via Slack notifications.
Full-RAG: A modern architecture for hyper-personalization, by Zilliz
Mike Del Balso, CEO & Co-Founder at Tecton, presents "Full RAG," a novel approach to AI recommendation systems, aiming to push beyond the limitations of traditional models through a deep integration of contextual insights and real-time data, leveraging the Retrieval-Augmented Generation architecture. This talk will outline Full RAG's potential to significantly enhance personalization, address engineering challenges such as data management and model training, and introduce data enrichment with reranking as a key solution. Attendees will gain crucial insights into the importance of hyperpersonalization in AI, the capabilities of Full RAG for advanced personalization, and strategies for managing complex data integrations for deploying cutting-edge AI solutions.
Generative AI Deep Dive: Advancing from Proof of Concept to Production, by Aggregage
Join Maher Hanafi, VP of Engineering at Betterworks, in this new session where he'll share a practical framework to transform Gen AI prototypes into impactful products! He'll delve into the complexities of data collection and management, model selection and optimization, and ensuring security, scalability, and responsible use.
What do a Lego brick and the XZ backdoor have in common? by Speck&Tech
ABSTRACT: At first glance, what a Lego brick and the XZ backdoor have in common might seem to be only that both are building blocks, or dependencies, of creative and software projects. In reality, a Lego brick and the XZ backdoor case share much more than that.
Join the presentation to dive into a story of interoperability, standards, and open formats, and then discuss the important role contributors play in a sustainable open source community.
BIO: An advocate of free software and of standard, open formats. She has been an active member of the Fedora and openSUSE projects and co-founded the LibreItalia Association, where she was involved in several LibreOffice-related events, migrations, and training efforts. She previously worked on LibreOffice migrations and training courses for several public administrations and private companies. Since January 2020 she has worked at SUSE as a Software Release Engineer for Uyuni and SUSE Manager, and when she is not pursuing her passion for computers and for Geeko she cultivates her curiosity about astronomy (the origin of her nickname, deneb_alpha).
UiPath Test Automation using UiPath Test Suite series, part 6, by DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 6. In this session, we will cover Test Automation with generative AI and Open AI.
The UiPath Test Automation with generative AI and OpenAI webinar offers an in-depth exploration of leveraging cutting-edge technologies for test automation within the UiPath platform. Attendees will delve into the integration of generative AI as a test automation solution with OpenAI's advanced natural language processing capabilities.
Throughout the session, participants will discover how this synergy empowers testers to automate repetitive tasks, enhance testing accuracy, and expedite the software testing life cycle. Topics covered include the seamless integration process, practical use cases, and the benefits of harnessing AI-driven automation for UiPath testing initiatives. By attending this webinar, testers, and automation professionals can gain valuable insights into harnessing the power of AI to optimize their test automation workflows within the UiPath ecosystem, ultimately driving efficiency and quality in software development processes.
What will you get from this session?
1. Insights into integrating generative AI.
2. Understanding how this integration enhances test automation within the UiPath platform
3. Practical demonstrations
4. Exploration of real-world use cases illustrating the benefits of AI-driven test automation for UiPath
Topics covered:
What is generative AI
Test Automation with generative AI and Open AI.
UiPath integration with generative AI
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
UiPath Test Automation using UiPath Test Suite series, part 5, by DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 5. In this session, we will cover CI/CD with devops.
Topics covered:
CI/CD with in UiPath
End-to-end overview of CI/CD pipeline with Azure devops
Speaker:
Lyndsey Byblow, Test Suite Sales Engineer @ UiPath, Inc.
2. Page 2 Classification: Restricted
Agenda
• Introduction to HP Quality Center.
• Release Management Module.
• Test Plan Module.
• Test Lab Module.
• Defect Management Module.
• Reports Module.
3. Introduction
• HP Quality Center is a test management tool.
• It offers an organized framework for testing applications.
• It is a web-based application that manages all aspects of the testing process, which is otherwise a time-consuming activity.
• It helps maintain a project database of tests that cover all aspects of application functionality.
• It can be connected to an email system so that information about a defect reaches everyone concerned, for example developers, customer support staff, and quality assurance personnel.
• It can be integrated with automation tools like WinRunner, QTP, and LoadRunner to achieve fully automated application testing.
• Graphs and reports can be generated to analyze the information.
4. Why Quality Center
• One-stop shop for all testing-related tasks.
• Coherence across different tasks.
• Better analysis and management.
• Easier tracking.
5. Quality Center Modules
Quality Center has the following basic modules:
• Releases
• Requirements
• Test Plan
• Test Lab
• Defects
Additional modules
• Business components
• Dashboard
6. Quality Center Add-ins
• At times Quality Center responds slowly due to the client-server nature of the application. The response time depends on many parameters, such as network configuration, the geographical locations of the testing team, and the load on the system.
• To work around network problems, testware can first be created in Microsoft Word or Excel and then uploaded into QC.
• To upload an MS Word document, QC needs the Microsoft Word add-in.
• To upload an Excel document, QC needs the Microsoft Excel add-in.
• To connect to QTP, QC needs the QTP add-in.
7. Quality Center – Release Management
The application testing process begins by defining a release tree in the Releases module. The release management workflow is:
1. Define releases and cycles
2. Assign requirements
3. Assign and run test sets
4. Assign defects
5. Analyze releases and cycles
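The workflow above can be pictured as a small data model: a release containing cycles, with each cycle carrying its assigned requirements, test sets, and defects. This is an illustrative sketch only; the class and field names are invented and are not the QC object model.

```python
from dataclasses import dataclass, field

@dataclass
class Cycle:
    name: str
    requirements: list = field(default_factory=list)
    test_sets: list = field(default_factory=list)
    defects: list = field(default_factory=list)

@dataclass
class Release:
    name: str
    cycles: list = field(default_factory=list)

# Define a release with four cycles, then assign work to the first cycle,
# mirroring the define -> assign steps of the workflow.
release = Release("Release 10.5", [Cycle(f"Cycle {i}") for i in range(1, 5)])
release.cycles[0].requirements.append("REQ-1: Online booking")
release.cycles[0].test_sets.append("Smoke test set")
print([c.name for c in release.cycles])
```

Because defects, test sets, and requirements hang off a cycle, the later "analyze releases and cycles" step is just a walk over this tree.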
8. Defining Releases and Cycles
For example, suppose you are defining upcoming releases for the sample Mercury Tours application. The Mercury Tours Application folder contains Release 10.5, which includes four cycles. You might define the releases and cycles in the releases tree accordingly.
9. Assigning Requirements
After defining the releases and cycles, the QA manager assigns requirements from the Requirements module to releases and cycles.
12. Assigning Defects
If an application flaw is detected while running a test set, the QA engineer can submit a defect. Quality Center automatically creates a link between the test run, the associated release and cycle, and the new defect.
13. Analyzing Releases and Cycles
• Following test runs, the QA manager reviews test progress to determine how well it meets the release goals.
• The QA manager can also determine how many defects were resolved and how many remain open. The results can be analyzed at the release level or at the cycle level.
• This also helps track the progress of the testing process in real time, by analyzing the releases tree and ensuring that it matches the release goals.
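Counting resolved versus open defects at release and cycle level, as described above, is a simple aggregation. The defect records below are made-up sample data, not output from Quality Center:

```python
from collections import Counter

# Sample defect records: (cycle, status) pairs, as might be pulled per release.
defects = [
    ("Cycle 1", "Closed"), ("Cycle 1", "Open"),
    ("Cycle 2", "Closed"), ("Cycle 2", "Closed"), ("Cycle 2", "Open"),
]

def summarize(defects):
    """Tally defect statuses per cycle and for the whole release."""
    per_cycle = {}
    for cycle, status in defects:
        per_cycle.setdefault(cycle, Counter())[status] += 1
    release_total = Counter(status for _, status in defects)
    return per_cycle, release_total

per_cycle, total = summarize(defects)
print(dict(total))                  # release-level view
print(dict(per_cycle["Cycle 2"]))   # cycle-level view
```

The same tally supports both views the slide mentions: the release-level total and the per-cycle breakdown.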
14. Test Plan Module
Test Plan
• It is a repository of test cases.
• It can be accessed through the Test Plan section in Quality Center.
• It uses a Subject (root) > folders > tests model.
• A folder or test name can be:
# Module name
# Scenario name
# Functionality name
• Test planning starts after requirements are baselined.
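The Subject > folders > tests model is just a tree. A minimal sketch, with the folder and test names taken from the examples later in this deck purely for illustration:

```python
# The test plan tree: folders map to dicts, leaf lists hold test names.
test_plan = {
    "Subject": {
        "Login": ["yahoo_Login"],
        "Search": ["Yahoo_Search"],
    }
}

def list_tests(tree):
    """Flatten the tree into 'folder/test' paths."""
    paths = []
    for folder, tests in tree["Subject"].items():
        for test in tests:
            paths.append(f"{folder}/{test}")
    return paths

print(list_tests(test_plan))  # ['Login/yahoo_Login', 'Search/Yahoo_Search']
```

Folder names here stand in for module, scenario, or functionality names, as the slide suggests.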
16. Key elements in the Test Plan Module are:
• Developing a Test Plan Tree
• Designing Tests
• Designing Test Steps
• Using parameters in tests
• Calling Tests
• Creating and Viewing Requirements Coverage
• Monitoring the status of test plans
18. Designing Tests
Add a test to the subject folder. The available test types are:
• Manual
• Business Process
• WR_Automated
• LR_Scenario
• VAPI_XP Test
• System Test
• Alt_Scenario
• Quick Test_Test (Need QTP add-in)
19. Designing Tests
MANUAL: A Quality Center manual test.
WR-AUTOMATED: A test that is executed by WinRunner, HP's functional testing tool for Microsoft Windows applications.
LR-SCENARIO: A scenario that is executed by LoadRunner, HP's load testing tool.
QUICKTEST_TEST: A test that is executed by QuickTest Professional, HP's functional enterprise testing tool. This test type is only available if you have installed the appropriate add-in from the HP Quality Center Add-ins page.
VAPI-XP-TEST: A test created using Visual API-XP, the Quality Center open test architecture API testing tool.
SYSTEM-TEST: A test that instructs Quality Center to provide system information, capture a desktop image, or restart a machine.
BUSINESS-PROCESS: A business process test.
20. Designing Tests
• The new test is added to the test plan tree under the subject folder.
• Add a test description.
• In the Details tab, you can see the test name, test designer, creation date, test status, and other information.
21. Designing Test Steps
To design test steps for the created test:
• Click the Design Steps tab.
• Click the New Step button. The Design Step Editor opens.
22. Designing Test Steps
Define a step for displaying the Yahoo login page:
Step Name: Display Yahoo Login Page.
Description: Launch a browser and enter the URL yahoomail.com.
Expected Result: The Yahoo login page should be shown.
Click OK.
23. Designing Test Steps
To create another step, click the New Step icon.
Define a step for logging in to Yahoo (example):
Step Name: User Name & Password.
Description: Enter the user name and password. Click Login.
Expected Result: The user must be logged on.
Click OK.
Repeat the same procedure to add more steps.
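Each design step carries a name, a description, and an expected result. The two steps above can be represented as plain data; this is an illustrative sketch, not QC's storage format:

```python
from dataclasses import dataclass

@dataclass
class DesignStep:
    name: str
    description: str
    expected_result: str

# The two example steps from the slides above.
yahoo_login_steps = [
    DesignStep("Display Yahoo Login Page",
               "Launch a browser and enter the URL yahoomail.com.",
               "The Yahoo login page should be shown."),
    DesignStep("User Name & Password",
               "Enter the user name and password. Click Login.",
               "The user must be logged on."),
]
print(len(yahoo_login_steps), "steps defined")
```

Keeping the expected result separate from the description is what lets a tester mark each step passed or failed during execution.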
24. Exporting Excel Data to Quality Center
• Open the Excel sheet and select all the rows that are to be exported.
• Click Tools > Export to Quality Center.
26. Exporting Excel Data to Quality Center
Select a map:
• An existing map can be selected or a new map can be created.
• The map ties each field in the Excel sheet to a corresponding field in Quality Center.
27. Exporting Excel Data to Quality Center
The list box on the left contains the fields required for logging defects.
Fields in red are mandatory; they are set up by the admin.
Select a field from the left list box and add it to the right list box, then enter
the corresponding Excel column name against that field.
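Conceptually, such a map is a dictionary from QC field names to Excel columns. The field and column names below are hypothetical; the real mapping is configured in the Export to Quality Center add-in UI.

```python
# Illustrative sketch of mapping Excel columns to QC fields. The real
# mapping is configured in the Export to Quality Center add-in;
# column letters and field names here are hypothetical examples.
excel_row = {"A": "Login fails", "B": "High", "C": "2-Urgent"}

field_map = {            # QC field <- Excel column
    "Summary": "A",      # mandatory fields appear in red in the UI
    "Severity": "B",
    "Priority": "C",
}

def map_row(row, mapping):
    """Build one QC record from one Excel row using the field map."""
    return {qc_field: row[col] for qc_field, col in mapping.items()}

print(map_row(excel_row, field_map))
```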
30. Page 30Classification: Restricted
Copying Test Steps
• Display the Design Steps tab for yahoo_Login(Example)
• Click the Design Steps tab.
• Select the steps that you want to copy.
• Copy the selected steps.
• Paste the steps into the Yahoo_Search test(Example)
31. Page 31Classification: Restricted
Linking Requirements to a Test
• Display the Yahoo_Search test.
• Display the Req Coverage tab.
• Display the requirements tree.
• Click the Select Req button and expand the requirements tree displayed on the
right.
• Add the Child1 requirement to the coverage grid.
• Hide the requirements tree. Click the Close button.
33. Page 33Classification: Restricted
Test Lab
• The Test Lab module is used to run test cases.
• The test run process begins with creating the test set tree and
running the tests.
• Initially a test set folder is created.
• Depending on the testing goals, you can add tests to the test set folder.
• Test sets can contain both manual and automated tests.
• You can include the same test in different test sets, or add several
instances of a test to the same test set.
• You can schedule a date and time for the execution of test sets.
35. Page 35Classification: Restricted
Create a Test Set
• Select the Test Lab module.
• Click the Create Folder icon at the left corner of the module and give
a name for the folder.
• Select the created folder and click Create Test Set; give the test set a
name and description.
• A new test set is created.
• Select the created test set; its details appear in the Test Set
Properties pane.
36. Page 36Classification: Restricted
Create a Test Set
Test Set Properties Window:
• The ‘Details’ tab lets you set the estimated open date and
estimated close date of the test set.
• In the ‘Attachments’ tab you can add an attachment to the test
set: a file, a URL, a snapshot of the application, an item from
the clipboard, or system information.
37. Page 37Classification: Restricted
Create Test Set
• The ‘On Failure’ tab lets you set what happens if an automated
test fails: stop the test set, rerun the failed test, rerun the
whole test set, or do nothing.
• The ‘Notifications’ tab lets you send a notification to a user when
a test finishes with Failed status, fails due to network issues, or
when execution of the test set finishes.
38. Page 38Classification: Restricted
Create Test Set
Execution Grid Window:
In the Execution Grid you select the tests to be executed from the test plan.
Tests can be either manual or automated.
Adding tests to a test set:
• Select the test set.
• Click the ‘Select Tests’ icon at the top of the grid.
• Drag and drop tests from the test plan tree displayed on the right.
39. Page 39Classification: Restricted
Create a Test Set
If a manual test has parameters, the ‘Parameters of Test’ window opens
while you drag the test into the set. Parameter values are supplied when
the test is executed, so you can simply close the window without
entering any values at this point.
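QC marks parameters in design steps with <<<name>>>-style placeholders that are filled in at run time. A minimal sketch of that substitution (the function and sample step are illustrative, not QC code):

```python
# Sketch: test parameters are placeholders in the step text, filled in
# at run time rather than when the test is added to a test set.
# The helper and sample step are illustrative, not Quality Center code.
def render_step(description: str, params: dict) -> str:
    """Substitute <<<param>>>-style placeholders with run-time values."""
    for name, value in params.items():
        description = description.replace(f"<<<{name}>>>", value)
    return description

step = "Enter <<<username>>> and <<<password>>>, then click Login."
print(render_step(step, {"username": "qa_user", "password": "secret"}))
```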
42. Page 42Classification: Restricted
Create a Test Set
Specify the Execution Flow:
• The Execution Flow tab shows the order and flow of execution of the tests.
• You can schedule a test to run on a specific date and time, or based on
a condition.
• A condition specifies, for example, that a test run starts only after
another test has passed or finished.
43. Page 43Classification: Restricted
Create a Test Set
• To specify a condition, double-click a test and
select the Execution Condition tab in the ‘Run
Schedule’ window.
• Click the ‘New’ button to create a condition.
44. Page 44Classification: Restricted
Create a Test Set
• Select the ‘Test’ and the ‘Condition’, then click ‘OK’.
• Now we can observe that the flow of the tests has changed.
• A notification is sent to the assigned tester to start
testing the specified test on the scheduled day and time.
• The second test cannot be executed until the previous test
finishes, because of the condition set in the previous steps.
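The run-only-after-predecessor rule described above can be sketched as follows (field names and statuses are illustrative, not QC internals):

```python
# Sketch of the execution-flow condition: a test may run only after the
# test it depends on has reached the required status ("Passed", or any
# finished status). Purely illustrative -- not Quality Center internals.
def can_run(test, statuses, condition="Finished"):
    """Return True if every predecessor of `test` satisfies the condition."""
    for dep in test.get("depends_on", []):
        status = statuses.get(dep, "No Run")
        if condition == "Passed" and status != "Passed":
            return False
        if condition == "Finished" and status in ("No Run", "Running"):
            return False
    return True

statuses = {"Yahoo_Login": "Passed"}
search_test = {"name": "Yahoo_Search", "depends_on": ["Yahoo_Login"]}
print(can_run(search_test, statuses, condition="Passed"))  # True
```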
45. Page 45Classification: Restricted
Running The Tests
• Tests can be run in two ways:
• Manual Run
• Automatic
• Manual Run: to execute a manual test.
• Automatic: to execute automation script(s).
47. Page 47Classification: Restricted
Running The Tests
• Select the tester who is executing the current test. By default
it is the current QC username.
• If you want, you can rename the run in the ‘Run Name’ field.
• Click the ‘Begin Run’ icon.
• When the run begins, you are prompted for parameter values if you
defined any parameters while creating the test in Test Plan.
• Enter the parameter values and click the ‘OK’ button.
48. Page 48Classification: Restricted
Running The Tests
• Once execution begins, the steps are shown with the default status ‘No
Run’.
• Click the ‘Compact View’ icon to see the description and expected result,
and to add actual results.
• You can view the expected result, but you cannot modify it.
49. Page 49Classification: Restricted
Running The Tests
• Execute all the steps and enter actual results for each step.
• Click the Compact View icon again to return to the steps grid.
52. Page 52Classification: Restricted
Running The Tests
• Executing Automated Scripts:
• Automation scripts can be executed as a set or
individually.
• To execute them as a set, click the ‘Run Test Set’ icon.
53. Page 53Classification: Restricted
Running The Tests
• To execute a script on a remote machine, enter the remote
machine name in the ‘Run on Host’ column.
• To execute locally, select the option ‘Run All Tests Locally’.
• To execute all the tests one by one, click ‘Run All’.
• To execute an individual test, select it and click ‘Run’.
• QC launches the tool and executes the script.
54. Page 54Classification: Restricted
Running The Tests
• Once execution is completed, an email is sent to the specified
user, provided notification was selected in the Test Set Properties
window.
55. Page 55Classification: Restricted
Viewing Results
• Double-click a test in the test set to open the Test Instance
Properties window.
• Select the run name and click the ‘Launch Report’ icon.
• This launches the QuickTest report for that particular run.
57. Page 57Classification: Restricted
Linking Defects
• To create and link a new defect, click the ‘Add and Link Defect’ icon.
• The Defects module opens; create a defect and save it. The created defect
is linked automatically to the test instance.
58. Page 58Classification: Restricted
Linking Defects
• To link an existing defect, click the ‘Link Existing Defect’ icon.
• Linking can be done in two ways:
• By defect ID.
• By selecting the defect from the Defects module.
• The default is by defect ID.
• Enter the defect ID and click the ‘Link’ icon.
59. Page 59Classification: Restricted
Adding Parameters
• The Configuration tab lets you enter the parameters for manual and
automated tests.
• It also lets you set how many iterations a test should run if it
fails.
60. Page 60Classification: Restricted
Defect Management using QC
• Locating and repairing defects is an essential phase in testing.
Analyzing defects and issues is what helps managers make the “go/no-
go” decision about application deployment. Quality Center helps you
track application defects, enabling you to monitor them
closely from initial detection until resolution.
• The Defects module gives a snapshot of the application under test and
tells you exactly how many defects you currently have, along with their
status, severity, priority, age, etc.
61. Page 61Classification: Restricted
Defect Management using QC
The following things can be done in the defects module of
Quality Center:
• Tracking defects (stages)
• Adding Defects
• Reviewing Defects
• Matching Defects
• Updating Defects
• Mailing Defects
• Linking Defects
• Filtering/Sorting Defects
• Creating/Viewing Favorite views
62. Page 62Classification: Restricted
Tracking defects
• When you submit a defect to a Quality
Center project, it is tracked through
these stages: New, Open, Fixed and
Closed. A defect may also be Rejected
or it may be Reopened after it is fixed.
• When you initially report a defect,
its status is New by default.
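The lifecycle above can be modeled as a small state machine. The transition table below is only a sketch, since real defect workflows are project-configurable in QC:

```python
# Sketch of the defect lifecycle described above: New -> Open -> Fixed
# -> Closed, with Rejected and Reopened as side transitions. The exact
# transition table is illustrative; QC workflows are configurable.
TRANSITIONS = {
    "New":      {"Open", "Rejected"},
    "Open":     {"Fixed", "Rejected"},
    "Fixed":    {"Closed", "Reopened"},
    "Reopened": {"Fixed"},
    "Rejected": {"Reopened"},
    "Closed":   set(),
}

def move(status: str, new_status: str) -> str:
    """Apply one lifecycle transition, rejecting illegal ones."""
    if new_status not in TRANSITIONS[status]:
        raise ValueError(f"Illegal transition {status} -> {new_status}")
    return new_status

s = "New"                      # defects start as New by default
for nxt in ("Open", "Fixed", "Closed"):
    s = move(s, nxt)
print(s)  # Closed
```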
65. Page 65Classification: Restricted
New Defect entry
• Clicking the “New Defect”
button in the Defects module
creates a new bug. All
fields marked by (*) or
in red are required.
• Description should have
steps to recreate and
test data.
• Attachments and
screenshots can be
added.
• Defect is submitted for
tracking.
66. Page 66Classification: Restricted
Reviewing open defects
• There are various ways to
search defects in
Quality Center
(using columns,
search, or
favorites).
• Double-click a
defect to
review it in detail,
change its status,
or add
comments.
67. Page 67Classification: Restricted
Matching Defects
• Matching defects enables you to
eliminate duplicate or similar defects in
your project. Each time you add a new
defect, QC stores lists of keywords from
the Summary and Description fields.
When you search for similar defects,
keywords in these fields are matched
against other defects.
• This filter can be set on the defects by
using the "Find similar defects" button.
• The results are stored in the similar
defects dialog box, sorted by the
percentage of detected similarity.
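The keyword-matching idea can be approximated with a simple set-overlap score. QC's actual matching algorithm is not published, so this Jaccard-style sketch is purely illustrative:

```python
# Sketch of defect matching as described above: keywords from the new
# defect's Summary/Description are compared against existing defects
# and results are ranked by percent similarity. QC's real algorithm is
# not public; this Jaccard-style overlap is only an approximation.
def keywords(text: str) -> set:
    stop = {"the", "a", "an", "is", "on", "in", "when", "and", "not", "no"}
    return {w for w in text.lower().split() if w not in stop}

def similarity(a: str, b: str) -> float:
    """Percent keyword overlap between two defect descriptions."""
    ka, kb = keywords(a), keywords(b)
    return 100.0 * len(ka & kb) / len(ka | kb) if ka | kb else 0.0

new = "Login button not working on Yahoo page"
existing = ["Login button broken on Yahoo page",
            "Search returns no results"]
ranked = sorted(existing, key=lambda d: similarity(new, d), reverse=True)
print(ranked[0])
```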
68. Page 68Classification: Restricted
Updating Defect in Quality Center
• When a defect needs to be
updated, go to the Defect
Details page.
• Change the appropriate fields.
• Add comments.
• Save by clicking OK.
69. Page 69Classification: Restricted
Mailing Defects
• On the Defect Details page, click
the Send Email button.
• The Send Email dialog opens. Enter
a valid To address, add comments,
and click the Send button to send
the email.
• You can also include the
attachments and history of that
particular defect.
70. Page 70Classification: Restricted
Linking Defects
• A defect can be linked directly or indirectly to an entity.
• When you add a defect from a test step, QC adds a direct link to the
step and indirect links to its run, its test instance, and the
requirement, if the test is covered by that requirement.
71. Page 71Classification: Restricted
Filter / Sort Defects
• In the Defects module you can set a filter to view defects
matching a condition, for example, defects detected by a particular user.
• Click the Set Filter/Sort button.
• The Filter dialog opens. Select the Detected By field and
click the browse button.
72. Page 72Classification: Restricted
Filter / Sort Defects
The Filter Condition dialog opens with a list of all users in QC. Select the
username and click OK to apply the filter condition. Similarly, you can set
Status to “Not closed”. The defects grid then displays defects detected by the
selected user whose status is anything other than Closed.
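The combined filter (detected by a given user, status not Closed) behaves like the list comprehension below; the defect field names are hypothetical, not QC's database schema:

```python
# Sketch of the filter described above: defects detected by a given
# user whose status is anything other than Closed. Field names are
# hypothetical, chosen only to mirror the UI labels.
defects = [
    {"id": 1, "detected_by": "alice", "status": "Open"},
    {"id": 2, "detected_by": "alice", "status": "Closed"},
    {"id": 3, "detected_by": "bob",   "status": "New"},
]

def my_open_defects(defects, user):
    """Apply the Detected By + 'Not closed' filter from the slides."""
    return [d for d in defects
            if d["detected_by"] == user and d["status"] != "Closed"]

print(my_open_defects(defects, "alice"))  # [{'id': 1, ...}]
```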
73. Page 73Classification: Restricted
Creating Favorite views for defects
• In the Defects module, select “Add to
Favorites” from the Favorites menu
(available in the header links).
• In the Name field type “My detected
defects” (for the filter created above).
• A favorite can be added to a public or
a private folder. Views in the public folder are
accessible to all users; views in the private folder
can be accessed only by the person who
created them.
• Select Private for your defects list and
click OK to add the view name to the
Favorites list.
74. Page 74Classification: Restricted
Viewing Favorite Views for Defects
In the Defects module, select the saved list from the Favorites
dropdown. The defects detected by you with a status other
than Closed are displayed.
75. Page 75Classification: Restricted
Reports in Quality Center
• Introduction
• Available Reports and Sub Reports
• Generating Reports
• Customizing Reports
• Document generator
• Excel Reports
76. Page 76Classification: Restricted
Generating Reports
• Quality Center reports can be
generated from each
Quality Center module.
• Report generation is done
through the “Analysis” menu.
77. Page 77Classification: Restricted
Generating Reports
About Generating Reports
• You can generate reports at any time during the testing process.
• Reports can be generated from the Requirements, Test Plan, Test Lab,
and Defects modules. You can display reports using their default
settings, or you can customize them.
• You can save the settings of your reports as favorite views and reload
them as needed. You can also save your reports as text files or HTML
documents. In addition, you can export report data to Microsoft Excel.
• You can further customize the report by adding sub-reports.
78. Page 78Classification: Restricted
Requirement module reports
The following reports are available in the Requirements module:
• Standard Requirements: Lists the requirements
that appear in the requirements tree.
• Tabular: Displays the requirements that appear in the requirements
tree in a grid format.
• Requirements with Coverage Tests: Lists the requirements that
appear in the requirements tree with their test coverage
information.
79. Page 79Classification: Restricted
Requirement module reports
• Requirements with Coverage Tests and Steps: Lists the requirements
that appear in the requirements tree with their test coverage
information. It also displays the test steps for each covering test.
• Requirements with Linked Defects: Lists the requirements that
appear in the requirements tree with their linked defects.
• Requirements with Traceability: Lists the requirements that appear in
the requirements tree with their associated traced-to and traced-from
requirements.
80. Page 80Classification: Restricted
Test Plan Module Reports
Test Plan module reports:
• Standard Test Planning: Lists the tests in the test plan tree.
• Subject Tree: Lists the tests in the test plan tree by subject.
• Tests with Design Steps: Lists the tests that appear in the test plan tree,
including their design steps.
• Tests with Covered Requirements: Lists the tests that appear in the test
plan tree with their requirements coverage information.
• Tests with Linked Defects: Lists the tests that appear in the test plan
tree with their linked defects.
81. Page 81Classification: Restricted
Test Lab Module Reports
• Current Test Set: Lists the tests that appear in the current test set.
• Cross Test Set: Lists the test sets that appear in the Test Sets list, without
listing their tests.
• Test Set Hierarchy with Tests: Lists the test sets hierarchically, as well as the
status of each test set.
• Cross Test Set with Tests: Lists the test sets that appear in the Test Sets list,
including their tests.
82. Page 82Classification: Restricted
Test Lab Module Reports
• Current Test Set with Failed Test Runs: Lists tests from the current test set
with "Failed" test run status.
• Cross Test Set with Failed Test Runs: Lists tests from all test sets with
"Failed" test run status.
• Execution Notification: Lists the tests that are displayed in the current test
set with the results of their last test run.
83. Page 83Classification: Restricted
Defects Module Reports
• Standard Defects: Lists the defects that appear in the project.
• Tabular Defects: Displays the defects that appear in the project in a grid
format.
• Defects with Linked Tests and Runs: Lists the defects with their linked tests
and test run results.
• Fixed or Rejected Defects: Lists defects with "Fixed" or "Rejected" status.
84. Page 84Classification: Restricted
Defects Module Reports
• Fixed or Rejected Defects Detected by Current User: Lists defects with
"Fixed" or "Rejected" status that were detected by the current user.
• Opened Defects Assigned to Current User: Lists defects with "Open" status
that are assigned to the current user.
85. Page 85Classification: Restricted
Available Sub Reports
• Each report can contain sub-reports. In addition, sub-reports
themselves might contain other sub-reports. The sub-reports
available depend on the type of the parent report.
The following sub-reports are available:
• Contained Tests: Lists the tests in a test set.
• Coverage Requirements: Lists information for
requirements that cover a test.
• Design Steps: Lists the design steps for a test.
• Linked Defects: Lists the defects that are linked to a record.
• Linked Entities: Lists all entities that are linked to a defect.
• Parent Test: Lists the parent test of a test.
86. Page 86Classification: Restricted
Available Sub Reports
• Related Defects: Lists related defects for each subject in a test plan
tree.
• Related Requirements: Lists the requirements that are linked to a
defect.
87. Page 87Classification: Restricted
Available Sub Reports
• Requirements Coverage: Lists the tests that cover a requirement.
• Run Steps: Lists the run steps for a test run.
• Runs: Lists all runs of a test.
88. Page 88Classification: Restricted
Creating Reports
You can create a report from the Requirements, Test Plan, Test Lab, and
Defects modules. Depending on the current module, you have different
report options. You can use the default report or customize it to meet
your needs.
89. Page 89Classification: Restricted
Creating Reports
To create a report:
• Select the Quality Center
module from which you
want to create a report.
• Choose Analysis > Reports,
and select the type of report
you want to create.
90. Page 90Classification: Restricted
Creating Reports
• You can click the First Page button to display the first page of the report,
or the Previous Page button to display the preceding page
• You can click the Next Page button to display the subsequent page of the
report, or the Last Page button to display the final page.
• To customize your report, click the Configure Report and Sub-Reports
button.
• To regenerate the report so that it displays the most up-to-date data, click
the Generate report button.
• To print your report, click the Print arrow and choose Current Page or All
Pages. The Print dialog box opens. Change the printer settings if
necessary. Click Print.
91. Page 91Classification: Restricted
Creating Reports
• To save your report, click the Save arrow and choose Current Page or All
Pages. The Save Web Page dialog box opens. Change the file name if
necessary. To save the report in its original format, select "Web Page,
complete" in the Save as type list. To save it as a text file, select Text File.
Click Save.
• To export the report data to Microsoft Excel, right-click the report and
choose Export to Microsoft Excel. Excel must be installed on your machine
to export report data to Excel.
92. Page 92Classification: Restricted
Creating Reports
• To save the settings of your report as a favorite view, click the Add to
Favorites button. For more information, see Chapter 6, “Working with
Favorite Views.”
• Click Close to close the report and return to the current Quality Center
module.
93. Page 93Classification: Restricted
Creating Quick Reports
• You can create a quick report for
specific records. In addition, in the
Requirements module you can
create a quick report for a
requirement and its children.
• Note: You cannot view a quick
report for multiple nodes in the
test plan tree.
94. Page 94Classification: Restricted
Creating Quick Reports
To create a quick report:
• Select the requirements, tests, or
defects for which you want to
create a report. To create a report
for more than one record, press
the Ctrl key and select the records
for which you want to create a
report.
95. Page 95Classification: Restricted
Creating Quick Reports
Create the report using one of the following options:
• To create a report for the selected records, choose Analysis > Report
Selected. Alternatively, right-click the records and choose Report Selected.
The report opens with data for the selected records displayed.
• In the Requirements module, to create a quick report for a requirement
and its children, choose Analysis > Report Selected with Children.
Alternatively, right-click the requirement and choose Report Selected with
Children.
96. Page 96Classification: Restricted
Customizing Reports
To customize a report:
• Select the Quality Center module
from which you want to generate a
report.
• Choose Analysis > Reports and select
the report you want to customize. The
report opens with default data
displayed.
• Click the Configure Report and
Sub-Reports button to customize
your report. The Report Configuration
page opens with the default options
displayed.
97. Page 97Classification: Restricted
Customizing Reports
• In the Reports list, select a main report or a sub-report. The Report
Configuration pane displays the available options.
• Under Page, you can set the number of items per display page (available for
the main report):
• To limit the number of items per page, select Limit items per page to and
specify the number of items per page. To display all items in one page,
select All items in one page.
• Under Template, you can use the Quality Center default report template or
your own template. (This option is available for the main report only.)
98. Page 98Classification: Restricted
Customizing Reports
• Under Filter, you can define or clear filters and sorting priorities:
• Click the Set Filter/Sort button to filter and sort your data according to
criteria you choose.
• Click the Clear Filter/Sort button to clear all the filters and sorting priorities.
• Select All Fields (auto-layout) to display all fields in the report.
• Select Custom Fields (layout), and click the Select Fields button to choose
the fields and set their order.
• You can also select the following options. Note that not all options are
available in all modules.
99. Page 99Classification: Restricted
Customizing Reports
• Grid View: Displays the report as a grid.
• Attachments: Displays a list of associated attachments.
• History: Displays a list of all the changes made to a requirement, test, or
defect.
• Keep Parent-Child Order: Displays the requirement topic with the child
requirements below it. Selecting this option disables your defined filters and
sorting priorities.
100. Page 100Classification: Restricted
Customizing Reports
• Show Paragraph Number: Displays the hierarchical number assigned
to each requirement in the tree. Note that these numbers are not related
to the unique Req ID assigned to each requirement.
• Rich Text: Includes rich text for the requirements in the report.
• Show Full Coverage: Displays the test coverage for each requirement.
• To add a sub-report, click the Add Sub-Report button. In the
Type list, select a sub-report type and click OK. The sub-report
is added to the Reports list.
• To delete a sub-report, select the sub-report and click the
Delete Sub-Report button.
• Click the Apply button to generate a new report.
101. Page 101Classification: Restricted
Document Generator
The Quality Center Document
Generator enables you to create
a Microsoft Word document
containing a project's requirements,
planning, test list, test set folders,
and defect tracking data.
Note: The Document Generator can
only be run if Microsoft Word has been
enabled to run macros.
You can create the document by
performing the following tasks:
# Set document format.
# Specify document content.
# Generate and edit the document.
102. Page 102Classification: Restricted
Document Generator
Document Settings:
Select a check box in the
Document Generator tree.
The following information
can be provided (optional):
• Title Name
• Author
• Mail
• Description
107. Page 107Classification: Restricted
Excel Reports
• Excel reports enable users to export QC data to Microsoft Excel.
• Data is exported to Excel by defining SQL queries on the Quality Center
project database. After the data has been exported, you can also run a
Visual Basic script on the data within Excel to process and analyze it.
This feature gives you increased flexibility when analyzing Quality
Center data.
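The kind of query the Query tab runs can be demonstrated on an in-memory SQLite copy of a tiny BUG table. The table and its BG_* column names follow QC's defects-table convention, but treat them as assumptions and verify against your project schema:

```python
# Sketch of an Excel-report style SQL query, demonstrated on an
# in-memory SQLite stand-in for the QC project database. The BUG
# table and BG_* column names follow QC's convention but are
# assumptions here -- verify against your project schema.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE BUG (BG_BUG_ID INT, BG_STATUS TEXT, BG_DETECTED_BY TEXT)")
con.executemany(
    "INSERT INTO BUG VALUES (?, ?, ?)",
    [(1, "Open", "alice"), (2, "Closed", "bob"), (3, "New", "alice")])

# Defects detected by a given user whose status is not Closed,
# mirroring the filter used elsewhere in this deck.
rows = con.execute(
    "SELECT BG_BUG_ID, BG_STATUS FROM BUG "
    "WHERE BG_DETECTED_BY = ? AND BG_STATUS <> 'Closed' "
    "ORDER BY BG_BUG_ID", ("alice",)
).fetchall()
print(rows)  # [(1, 'Open'), (3, 'New')]
```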
108. Page 108Classification: Restricted
Excel Reports
• Query tab: Enables you to define and test the SQL queries that extract data
from the Quality Center project database to Excel.
• Post-processing tab: Enables you to define a Visual Basic script to run in
Excel after the report data has been exported.
• Generation Settings tab: Enables you to define settings for generating a
report.
• Public: Reports in this folder are available to all users of the project.
• Private: Reports in this folder are available only to the user who created
them.
109. Page 109Classification: Restricted
Excel Reports
Creating Excel Reports:
• Add the report to the Excel Reports tree.
• Define which data to include in the report
through a SQL query.
• Generate the report.
Adding Reports:
• Click the Tools button at the upper-right
of the Quality Center window, and select
Excel Report Generator. The Excel Report
Generator opens.
• In the Excel Reports tree, select the
required public or private folder.