In the context of a European ERDF project, researchers from UMONS and FUNDP in Belgium carried out a survey on the use of software quality practices in software producing and maintaining companies in Wallonia.
Contact: tom.mens@umons.ac.be
Agile Software Development (ASD) is becoming more popular in all fields of industry. For an agile transformation, organizations need to continuously improve their established approaches to Requirements Engineering (RE) as well as their approaches to software development. This brings with it a number of challenges for agile RE. The main objective of this paper is to identify the most important challenges in agile RE that industry has to face today. To that end, we conducted an iterative expert judgement process with 26 experts in the field of ASD, comprising three complementary rounds. In total, we identified 20 challenges across the three rounds, six of which are defined as key challenges. Based on the results, we provide options for dealing with those key challenges by means of agile techniques and tools. The results show that the identified challenges are often not limited to ASD but rather apply to software development in general. We therefore conclude that organizations still struggle with the agile transition and with understanding agile values, in particular in terms of stakeholder and user involvement.
Using CMMI Process Management Practices to Build and Maintain a QMS (PECB)
This webinar discussed some of the management tools which help in developing and maintaining the QMS. Moreover, the session provided insights about the importance that the designing and building of the QMS has for the future of the company.
Main points covered:
• How CMMI practices in the OPF and OPD process areas can benefit from the QMS development and maintenance
• The process as a product – the importance of designing your QMS
• Building a QMS for now and for the future
Presenter:
Michael West has spent 25 years in process and performance improvement. As a principal and founder of Natural Systems Process Improvement, Inc. he has helped many client organizations in numerous industry sectors achieve their process and performance goals. Mr. West is a CMMI Institute-Certified Lead Appraiser and an AS9100 Auditor.
Link to the recorded YouTube video: https://youtu.be/VkEEmYZlgr4
Process Improvement for better Software Technical Quality under Global Crisis... (Optimyth Software)
Software development failure rates are higher than in any other human endeavor, and lack of quality is frequently the most significant underlying reason. Agile methodologies offer a framework that tries to support change, covering scope and time/budget while keeping quality at an adequate level. But the gap between current software complexity and Software Quality Assurance (SQA) knowledge, techniques and tools has grown, even for organizations at a high maturity level correctly using Agile methodologies.
Experience tells us that simply adopting Agile is not sufficient for producing software of above-average quality. The most effective techniques and trends from the SQA arsenal, adapted to Agile methodologies, will be discussed. Their cost/benefit ratios will be analyzed, and a process improvement roadmap will be presented as a practical way to make software deliverables both more agile and of higher technical quality, under the constraints dictated by the Global Economic Crisis.
The presentation ends with a case study in process improvement for quality in a TSP + Agile scenario, and final recommendations that any organization using Agile methodologies could implement for short-term benefits.
Engineering DevOps Right the First Time (Marc Hornbeek)
Marc Hornbeek is a DevOps consultant with over 39 years of experience in IT architecture, development, and management. He discusses engineering DevOps right from the start through a top-down/middle-out approach focusing on leadership alignment, gap assessment, and process re-engineering to optimize agility, efficiency, quality and stability. Key aspects include modeling the DevOps pipeline, analyzing elements such as tools and metrics, and controlling technology and process evolution over time.
This document discusses process improvement. It explains that process improvement aims to introduce changes to achieve organizational objectives like quality improvement, cost reduction, and schedule acceleration. Most improvements so far have focused on defect reduction. The stages of process improvement are described as process analysis, improvement identification, change introduction, change training, and change tuning. Process and product quality are closely related, with process usually determining product quality. The Capability Maturity Model (CMM) developed by the Software Engineering Institute aims to improve software processes. It defines five levels of process maturity from initial to optimizing.
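The five CMM maturity levels mentioned above can be sketched as a small lookup table. This is a hedged illustration; the one-line descriptions are paraphrased, not the official CMM wording, and `maturity_name` is an invented helper.

```python
# Illustrative sketch of the five CMM maturity levels; descriptions are
# paraphrased for brevity, not quoted from the model itself.
CMM_LEVELS = {
    1: "Initial - ad hoc, success depends on individuals",
    2: "Repeatable - basic project management processes established",
    3: "Defined - processes documented and standardized",
    4: "Managed - processes measured quantitatively",
    5: "Optimizing - continuous process improvement",
}

def maturity_name(level: int) -> str:
    """Return just the level name, e.g. 'Initial' for level 1."""
    return CMM_LEVELS[level].split(" - ")[0]

print(maturity_name(1))  # Initial
print(maturity_name(5))  # Optimizing
```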
Three original implementations of the quality assurance role in two different companies, showing how creative management can solve the problem of making QA both a career path and a positive influence on process improvement.
Test Process Improvement with TPI NEXT - what the model does not tell you but... (SQALab)
The document discusses the Test Process Improvement (TPI) NEXT model. It describes how the model divides testing into 16 key areas and 4 maturity levels, and provides 157 checkpoints and improvement suggestions. The results of an assessment using the model are presented visually in a testing maturity matrix. The document emphasizes that properly defining and implementing improvements are critical steps, and discusses challenges organizations may face in the improvement process and how to address them.
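The roll-up of checkpoint results into a maturity view, as described above, can be sketched roughly as follows. This is a hypothetical illustration: the key-area names, checkpoint counts, and roll-up rule are invented for the example and are not taken from the TPI NEXT model itself.

```python
# Hypothetical sketch: rolling checkpoint results up into a per-key-area
# maturity view. Key areas, checkpoints, and the rule are illustrative only.
checkpoints = {
    # key area: list of (maturity level, passed?) per checkpoint
    "Test strategy":    [(1, True), (1, True), (2, True), (2, False)],
    "Test environment": [(1, True), (1, False), (2, False), (3, False)],
}

def reached_level(results):
    """A level counts as reached only if it and all lower levels fully pass."""
    level = 0
    for lvl in sorted({l for l, _ in results}):
        if all(ok for l, ok in results if l == lvl):
            level = lvl
        else:
            break
    return level

matrix = {area: reached_level(res) for area, res in checkpoints.items()}
print(matrix)  # {'Test strategy': 1, 'Test environment': 0}
```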
The 1010 guide is essentially a flowchart that helps software managers choose an efficient software project management methodology based on the metrics they have. It first differentiates between critical and non-critical projects, then has the team choose from pre-defined metrics. Finally, a table maps the chosen options to a straightforward selection of the appropriate SDLC for the values given.
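A decision-table lookup in the spirit of the guide described above might look like the following sketch. The table entries, criteria, and threshold are invented for illustration and are not taken from the 1010 guide itself.

```python
# Hypothetical decision table: criticality plus a team metric map to a
# suggested SDLC. All entries and the size threshold are invented.
TABLE = {
    ("critical", "large"):     "V-model",
    ("critical", "small"):     "Spiral",
    ("non-critical", "large"): "Scrum (scaled)",
    ("non-critical", "small"): "Scrum",
}

def choose_sdlc(critical: bool, team_size: int) -> str:
    key = ("critical" if critical else "non-critical",
           "large" if team_size > 9 else "small")
    return TABLE[key]

print(choose_sdlc(critical=True, team_size=4))  # Spiral
```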
This document provides an overview of quality management in software engineering. It discusses software quality, standards, reviews and inspections, as well as software measurement and metrics. The key points covered include establishing an organizational framework for quality management, applying specific quality processes and standards at the project level, and conducting independent reviews to ensure compliance. Software metrics can help quantify attributes and identify anomalous components, but meaningful relationships between internal metrics and external quality attributes can be difficult to establish.
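The last point above, using metrics to flag anomalous components, can be sketched as a simple outlier check. The component names, metric values, and the 1.5-standard-deviation threshold are invented for illustration.

```python
# Hedged sketch: flag components whose metric values lie unusually far
# from the mean. Names, values, and threshold are illustrative only.
import statistics

complexity = {"parser": 12, "ui": 15, "auth": 14, "legacy_report": 48}

mean = statistics.mean(complexity.values())
stdev = statistics.pstdev(complexity.values())

anomalous = [name for name, v in complexity.items()
             if abs(v - mean) > 1.5 * stdev]
print(anomalous)  # ['legacy_report']
```

Note the caveat in the summary: a component flagged this way is merely unusual on an internal metric; whether that correlates with an external quality attribute is exactly what is hard to establish.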
This document discusses agile test planning and compares it to traditional test planning methods. It proposes a new template for agile test planning that combines elements of the IEEE 829 test plan standard and James Bach's heuristic test strategy model. The document reviews literature on agile principles, quality assurance, and test planning. It analyzes the components of IEEE 829 and identifies which could be adopted for agile test planning while still adhering to agile values. A research methodology using multiple case studies is presented to analyze the effectiveness of the proposed new agile test planning template.
The document discusses software quality and quality assurance. It defines software as computer programs, procedures, and documentation pertaining to computer system operation. Software errors are faults made by programmers, while failures occur when faults are activated. Nine common causes of software errors are identified, including faulty requirements, communication failures, and testing shortcomings. Software quality is defined as meeting requirements or customer needs/expectations. Software quality assurance is a planned, systematic set of actions to ensure software meets technical and managerial requirements with adequate confidence. It differs from quality control by focusing on preventing errors throughout development.
One of the really serious problems with modern software development is the fact that 99% of software development organizations equate Software Quality Assurance with Testing. This is a very costly misunderstanding. Quality Assurance is supposed to help reduce costs and help produce better software. Testing can only add to costs and it can only be done once something is built. So when testing starts, all the errors are already made and all the bad things that SQA is supposed to prevent have already occurred. Testing is pretty much the exact opposite of Quality Assurance.
The first report on Obstacle Driven Development (ODD), intended to cover the basics of ODD comprehensively while remaining concise.
ODD is used for software, hardware and embedded systems, and is the product of combining various engineering and software techniques.
Other presentations cover how ODD extends and combines Test Driven Development, requirements analysis, V-models and Agile.
A further paper is to follow with full referencing.
Gekkobrain provides tools for running a reliable SAP landscape in your enterprise. The tools include HANA automated assessment and custom code remediation, performance improvement, and SAP DevOps.
Innovation and facts over opinions and assumptions; follow a plan that responds to change.
Project Control is the application of control theory to the development of solutions. By creating tests first, we create a negative feedback loop that helps guide development.
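The test-first negative-feedback idea above can be sketched in miniature: the failing test is the error signal, and implementation iterates until the signal disappears. Everything here (the `spec` check, the candidate implementations) is invented for illustration.

```python
# Minimal sketch of test-first as negative feedback: the failing test is
# the error signal; candidates are tried until the signal is zero.
def spec(add):
    """The test, written first: the desired behaviour of `add`."""
    return add(2, 3) == 5

attempts = [
    lambda a, b: a * b,   # first try: wrong, the failing test feeds back
    lambda a, b: a + b,   # corrected implementation after the feedback
]

for candidate in attempts:
    if spec(candidate):
        add = candidate
        break

print(add(2, 3))  # 5
```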
Obstacle Driven Development combines the latest engineering methods and software development. ODD helps identify, correct and prevent errors as early and efficiently as practical.
This presentation is the most comprehensive so far and demonstrates how ODD extends and combines ISO compatible V-models, Test Driven Development, requirements analysis, extended specifications and Agile.
Please see the series for further details.
This presentation explains how testing activities constitute the main bottleneck to flow in most continuous delivery pipelines. Continuous Testing strategies are designed to reduce testing bottlenecks, and accelerate time-to-quality.
A blueprint is presented for Continuous Testing. Specific strategies are presented, including Continuous Test Tenets, Leadership and Culture practices, Test Strategies and Plans, Test Management, Test Automation, Test Tools, Test Environment Management and Test Results Analysis. A Continuous Testing Assessment approach is described to help assess the current state of continuous testing. A phased implementation approach is explained.
How Do You Measure The KM Maturity Of Your Organization Final Ver. (Art Schlussel)
This paper explores practical ways to measure the KM state of an organization, examines accepted KM initiatives used throughout the KM community, and identifies useful metrics for those KM initiatives. Included is the Army Knowledge Management Maturity Indicator tool. The tool will assist you in assessing your organization's overall level of KM maturity.
The document summarizes a research paper presented at the 19th International Conference on Production Research about improving software development processes using Six Sigma. It presents the SW-DMAIC method, which was structured based on a multiple case study of software companies. The SW-DMAIC method adapts Six Sigma to address limitations of applying the DMAIC method to software development, such as inaccurate problem identification and changing requirements. It was observed that Six Sigma processes should be present not just in improvement projects but also when introducing Six Sigma and new projects. The benefits can be achieved without executing all DMAIC activities or requiring high maturity.
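The DMAIC phase sequence underlying the method above can be sketched as a pipeline over shared project state. Only the phase names (Define, Measure, Analyze, Improve, Control) come from the method; the per-phase behaviour and numbers here are invented, and this does not depict the SW-DMAIC adaptations themselves.

```python
# Illustrative DMAIC pipeline over shared state; phase contents are invented.
def define(state):  state["problem"] = "high defect escape rate"; return state
def measure(state): state["baseline"] = 12; return state   # defects/release
def analyze(state): state["root_cause"] = "missing reviews"; return state
def improve(state): state["after"] = 4; return state       # defects/release
def control(state): state["improved"] = state["after"] < state["baseline"]; return state

state = {}
for phase in (define, measure, analyze, improve, control):
    state = phase(state)

print(state["improved"])  # True
```

The paper's observation that benefits can come without executing all DMAIC activities corresponds to running only a subset of these phases.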
Using the test process improvement models. Case study based on TPI Next model... (Sigma Software)
The document discusses using the TPI Next test process improvement model. It provides an overview of the TPI Next model, which evaluates test processes across 16 key areas and 4 maturity levels. It then presents a case study example of implementing TPI Next on a project. The case study involves evaluating the current test process maturity, identifying improvement priorities, creating a test process improvement plan, implementing improvements, and planning the next improvement cycle. While most improvements were successfully implemented, one faced resistance from management.
IDG MarketPulse: Virtual Graphics Processing Unit (vGPU) (jmariani14)
Presented on behalf of Dell, VMware & NVIDIA.
The purpose of this survey is to better understand user experiences and challenges with demanding graphics applications in use today, in addition to deployment plans for virtualized graphics solutions. The survey explores the expected and experienced benefits of virtualized graphics solutions, as well as primary barriers to deployment.
Systems Engineering, Project Management and Bespoke Training for Industry Professionals in Switzerland and Europe.
Systems Engineering and Project Management are core engineering disciplines used to enable the delivery of complex projects within schedule and cost expectations.
Delivering complex projects demands cross-functional engineering disciplines such as Systems Engineering, Project Management, Safety Engineering, Product Development and Design Thinking. SE-Training was founded to offer specifically tailored training courses that support the drive, ambition and success involved in providing innovative and high-quality products and services.
A large number of engineering organisations are based across Europe, each with diverse needs; SE-Training addresses these unique needs through structured and bespoke courses delivered by expert engineering professionals and academics.
This document discusses the use of seven basic quality tools (7QC tools) in continuous improvement processes. It describes the 7QC tools as flow charts, Pareto diagrams, check sheets, control charts, histograms, scatter plots, and cause-and-effect diagrams. These tools can be used in all phases of product development and production management. The document also discusses how the 7QC tools can be applied in the PDCA cycle of continuous improvement, the DMAIC methodology of Six Sigma, and the DMADV approach of Design for Six Sigma. Using these tools systematically can help teams identify and analyze problems and develop and evaluate solutions.
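One of the 7QC tools named above, the Pareto diagram, can be sketched numerically: rank defect causes and find the "vital few" that account for roughly 80% of defects. The cause names and counts here are invented for illustration.

```python
# Sketch of a Pareto analysis: find the causes covering 80% of defects.
# Cause names and counts are invented example data.
defect_counts = {
    "requirements": 45, "coding": 30, "testing": 15,
    "documentation": 6, "deployment": 4,
}

total = sum(defect_counts.values())
ranked = sorted(defect_counts.items(), key=lambda kv: kv[1], reverse=True)

cumulative, vital_few = 0, []
for cause, count in ranked:
    cumulative += count
    vital_few.append(cause)
    if cumulative / total >= 0.8:
        break

print(vital_few)  # ['requirements', 'coding', 'testing']
```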
Maintaining a full QA process is vital for product development. Following QA standards leads to well-defined, user-oriented quality products. A good software process provides a framework for managing activities to ensure quality. The crucial role of a QA process is to produce superior products on time with better viability.
Quality Engineering and Testing with TMAP in DevOps IT delivery (Rik Marselis)
To continuously deliver IT systems at speed with a focus on business value, cross-functional DevOps IT delivery teams integrate quality engineering in their way of working.
Quality engineering is the new concept in achieving the right quality of IT systems. Testing an application only after the digital product has been fully developed has long been a thing of the past. But more is needed to guarantee the quality of applications that are delivered faster and more frequently in today’s high-performance IT delivery models. The road to quality engineering means changes in terms of starting points, skills, organization, automation and quality measures.
The TMAP body of knowledge introduces the VOICE model which guides teams to align their activities with the business value that is pursued, and by measuring indicators, teams give the right information to stakeholders to establish their confidence that the IT delivery will actually result in business value for the customers.
TMAP's topics are a useful grouping of all activities relevant to quality engineering. The organizing topics are relevant to align activities between teams and the performing topics have a focus on the operational activities within a team.
Also, to be able to deliver quality at speed, for DevOps teams it is crucial to benefit from automating activities, for example in a CI/CD pipeline, whereby people must remember that automation is not the goal but just a way to increase quality and speed.
In this webinar the audience will learn why a broad view on quality engineering is important and how quality engineering can be implemented to achieve the right quality of IT products, the IT delivery process and the people involved. Also we will introduce the new TMAP training courses for quality engineering and testing.
This webinar and the training courses are based on the TMAP book "Quality for DevOps teams" (ISBN 978-90-75414-89-9) which supports high-performance cross-functional teams in implementing quality in their DevOps culture, with practical examples, useful knowledge and some theoretical background. The TMAP body of knowledge is found on www.TMAP.net.
Key takeaways:
• Quality engineering is the new concept aiming to deliver quality at speed
• By measuring the right indicators, the team supports confidence in achieving the pursued value
• By applying the proper quality measures and tools, the team focuses on relevant activities
• The TMAP certification scheme (with exams provided by iSQI) has 3 practical courses for DevOps people
• In case IT delivery is done with multiple teams, TMAP aligns with the Scaled Agile Framework to achieve quality at scale
This document provides an overview of the Rational Unified Process (RUP). It discusses the phases of RUP which include inception, elaboration, construction, and transition. It also discusses the core workflows, best practices, and tools used in RUP. The document outlines the dynamic and static structures of RUP, describing the phases, milestones, roles, activities, artifacts, and workflows. It provides details on the objectives and goals of each phase in the RUP lifecycle.
Software Engineering: The Multiview Approach And WISDM (guestc990b6)
The document provides an overview of web information system development methodology. It discusses key components of information systems and why structured methodologies are important for information system projects. It then describes various software development models including waterfall, iterative, evolutionary, spiral and V-model. Finally, it discusses special considerations for web-based information systems and proposes a socio-technical web information system development methodology called WISDM that takes organizational, technical and human factors into account.
1010 guide is essentially a flowchart to help software managers choose an efficient software project management methodology based on the metrics they have. Differentiation between critical and non-critical projects which is followed by choosing the pre-defined metrics for the team. Finally a table corresponds to the options chosen and results in a straight forward selection of the appropriate SDLC based on the values given.
This document provides an overview of quality management in software engineering. It discusses software quality, standards, reviews and inspections, as well as software measurement and metrics. The key points covered include establishing an organizational framework for quality management, applying specific quality processes and standards at the project level, and conducting independent reviews to ensure compliance. Software metrics can help quantify attributes and identify anomalous components, but meaningful relationships between internal metrics and external quality attributes can be difficult to establish.
This document discusses agile test planning and compares it to traditional test planning methods. It proposes a new template for agile test planning that combines elements of the IEEE 829 test plan standard and James Bach's heuristic test strategy model. The document reviews literature on agile principles, quality assurance, and test planning. It analyzes the components of IEEE 829 and identifies which could be adopted for agile test planning while still adhering to agile values. A research methodology using multiple case studies is presented to analyze the effectiveness of the proposed new agile test planning template.
The document discusses software quality and quality assurance. It defines software as computer programs, procedures, and documentation pertaining to computer system operation. Software errors are faults made by programmers, while failures occur when faults are activated. Nine common causes of software errors are identified, including faulty requirements, communication failures, and testing shortcomings. Software quality is defined as meeting requirements or customer needs/expectations. Software quality assurance is a planned, systematic set of actions to ensure software meets technical and managerial requirements with adequate confidence. It differs from quality control by focusing on preventing errors throughout development.
One of the really serious problems with modern software development is the fact that 99% of software development organizations equate Software Quality Assurance with Testing. This is a very costly misunderstanding. Quality Assurance is supposed to help reduce costs and help produce better software. Testing can only add to costs and it can only be done once something is built. So when testing starts, all the errors are already made and all the bad things that SQA is supposed to prevent have already occurred. Testing is pretty much the exact opposite of Quality Assurance.
The first report for Obstacle Driven Development which has been released and is intended to comprehensively cover the basics of ODD while being concise.
ODD is used for software, hardware and embedded and is the product of combining various engineering and software techniques.
Other presentations cover how ODD extends and combines Test Driven Development, requirements analysis, V-models and Agile.
A further paper is to follow with full referencing.
Gekkobrain provides tools for you to run a reliable SAP in your enterprise. The tools include HANA automated assessment and custom code remediation, performance improvement, and SAP DevOps
Innovation and facts over opinions and assumptions; follow a plan that responds to change.
Project Control is application of control theory to development of solutions. Through creating tests first we create a negative feedback to to help develppe
Obstacle Driven Development combines the latest engineering methods and software development. ODD helps identify, correct and prevent errors as early and efficiently as practical.
This presentation is the most comprehensive so far and demonstrates how ODD extends and combines ISO compatible V-models, Test Driven Development, requirements analysis, extended specifications and Agile.
Please see the series for further details.
This presentation explains how testing activities constitute the main bottleneck to flow in most continuous delivery pipelines. Continuous Testing strategies are designed to reduce testing bottlenecks, and accelerate time-to-quality.
A blueprint is presented for Continuous Testing. Specific strategies are presented including Continuous Test Tenets, Leadership and Culture practices, Test strategies and Plans, Test Management, Test Automation, Test Tools, Test Environment Management and Test Results Analysis. A Continuous Testing Assessment approach is described to help assess current state of of continuous testing. A phased implementation approach is explained.
How Do You Measure The KM Maturity Of Your Organization Final Ver.Art Schlussel
This paper explores practical ways to measure the KM state of an organization, examines accepted KM initiatives used throughout the KM community, and identifies useful metrics for those KM initiatives. Included is the Army Knowledge Management Maturity Indicator tool. The tool will assist you in assessing your organization\'s overall level of KM maturity.
The document summarizes a research paper presented at the 19th International Conference on Production Research about improving software development processes using Six Sigma. It presents the SW-DMAIC method, which was structured based on a multiple case study of software companies. The SW-DMAIC method adapts Six Sigma to address limitations of applying the DMAIC method to software development, such as inaccurate problem identification and changing requirements. It was observed that Six Sigma processes should be present not just in improvement projects but also when introducing Six Sigma and new projects. The benefits can be achieved without executing all DMAIC activities or requiring high maturity.
Using the test process improvement models. Case study based on TPI Next model...Sigma Software
The document discusses using the TPI Next test process improvement model. It provides an overview of the TPI Next model, which evaluates test processes across 16 key areas and 4 maturity levels. It then presents a case study example of implementing TPI Next on a project. The case study involves evaluating the current test process maturity, identifying improvement priorities, creating a test process improvement plan, implementing improvements, and planning the next improvement cycle. While most improvements were successfully implemented, one faced resistance from management.
IDG MarketPulse: Virtual Graphics Processing Unit (vGPU)jmariani14
Presented on behalf of Dell, VMware & NVIDIA.
The purpose of this survey is to better understand user experiences and challenges with demanding graphics applications in use today, in addition to deployment plans for virtualized graphics solutions. The survey explores the expected and experienced benefits of virtualized graphics solutions, as well as primary barriers to deployment.
Systems Engineering, Project Management and Bespoke Training for Industry Professionals in Switzerland and Europe.
Systems Engineering and Project Management are core engineering disciplines used to enable the delivery of complex projects within schedule and cost expectations.
Delivering complex projects demands cross-functional engineering disciplines such as Systems Engineering, Project Management, Safety Engineering, Product Development and Design Thinking. SE-Training has been founded to offer specifically tailored training courses that support the drive, ambition and success in providing innovate and high quality products and services.
A large number of engineering organisations based across Europe have diverse needs; SE-Training addresses these unique needs through structured and bespoke courses delivered by expert engineering professionals and academics.
This document discusses the use of seven basic quality tools (7QC tools) in continuous improvement processes. It describes the 7QC tools as flow charts, Pareto diagrams, check sheets, control charts, histograms, scatter plots, and cause-and-effect diagrams. These tools can be used in all phases of product development and production management. The document also discusses how the 7QC tools can be applied in the PDCA cycle of continuous improvement, the DMAIC methodology of Six Sigma, and the DMADV approach of Design for Six Sigma. Using these tools systematically can help teams identify and analyze problems and develop and evaluate solutions.
Maintaining a full QA process is vital for product development. Following QA standards leads to well-defined, user-oriented quality products. A good software process provides a framework for managing activities to ensure quality. The crucial role of a QA process is to produce superior products on time and ensure their long-term viability.
Quality Engineering and Testing with TMAP in DevOps IT delivery (Rik Marselis)
To continuously deliver IT systems at speed with a focus on business value, cross-functional DevOps IT delivery teams integrate quality engineering in their way of working.
Quality engineering is the new concept in achieving the right quality of IT systems. Testing an application only after the digital product has been fully developed has long been a thing of the past. But more is needed to guarantee the quality of applications that are delivered faster and more frequently in today’s high-performance IT delivery models. The road to quality engineering means changes in terms of starting points, skills, organization, automation and quality measures.
The TMAP body of knowledge introduces the VOICE model which guides teams to align their activities with the business value that is pursued, and by measuring indicators, teams give the right information to stakeholders to establish their confidence that the IT delivery will actually result in business value for the customers.
TMAP's topics are a useful grouping of all activities relevant to quality engineering. The organizing topics are relevant to align activities between teams and the performing topics have a focus on the operational activities within a team.
Also, to deliver quality at speed, it is crucial for DevOps teams to automate activities, for example in a CI/CD pipeline, while remembering that automation is not the goal itself but a means to increase quality and speed.
In this webinar the audience will learn why a broad view on quality engineering is important and how quality engineering can be implemented to achieve the right quality of IT products, the IT delivery process and the people involved. Also we will introduce the new TMAP training courses for quality engineering and testing.
This webinar and the training courses are based on the TMAP book "Quality for DevOps teams" (ISBN 978-90-75414-89-9) which supports high-performance cross-functional teams in implementing quality in their DevOps culture, with practical examples, useful knowledge and some theoretical background. The TMAP body of knowledge is found on www.TMAP.net.
Key takeaways:
Quality engineering is the new concept aiming to deliver quality at speed
By measuring the right indicators the team supports confidence in achieving pursued value
By applying the proper quality measures and tools the team focuses on relevant activities
The TMAP certification scheme (with exams provided by iSQI) has 3 practical courses for DevOps people
In case IT delivery is done with multiple teams TMAP aligns with the Scaled Agile Framework to achieve quality at scale
This document provides an overview of the Rational Unified Process (RUP). It discusses the phases of RUP which include inception, elaboration, construction, and transition. It also discusses the core workflows, best practices, and tools used in RUP. The document outlines the dynamic and static structures of RUP, describing the phases, milestones, roles, activities, artifacts, and workflows. It provides details on the objectives and goals of each phase in the RUP lifecycle.
Software Engineering: The Multiview Approach and WISDM (guestc990b6)
The document provides an overview of web information system development methodology. It discusses key components of information systems and why structured methodologies are important for information system projects. It then describes various software development models including waterfall, iterative, evolutionary, spiral and V-model. Finally, it discusses special considerations for web-based information systems and proposes a socio-technical web information system development methodology called WISDM that takes organizational, technical and human factors into account.
The document summarizes the results of a survey of stakeholders in accessibility. It found that while awareness of standards like WCAG is growing, there is still a need for more education and tools to help developers evaluate accessibility. Developers expressed interest in accessibility simulation and validation tools integrated into development environments. Service providers were knowledgeable about standards but wanted improved mobile accessibility and simulation tools. Public bodies focused on evaluation tools but lacked internal expertise. Assessors relied heavily on evaluation and simulation tools. End users faced significant barriers to access but still used technology frequently despite difficulties. Training for all groups fell short of expectations.
The document discusses principles of software management and development practices. It covers:
1. Establishing iterative lifecycle processes that identify risks early through multiple iterations of problem understanding, solution design, and planning.
2. Transitioning design methods to emphasize component-based development using pre-existing code to reduce custom development.
3. Enhancing change freedom through automated tools that support round-trip engineering and synchronization across different formats and stages of the iterative development process.
DSDM is a software development methodology based on RAD that emphasizes iterative development, user involvement, and adaptability. It aims to deliver working software frequently within budget and schedule while allowing changing requirements. DSDM was developed in the 1990s in the UK and uses principles like active user involvement, empowered decision-making teams, and reversible changes to facilitate iterative and collaborative development.
This document provides an overview of software development life cycle (SDLC) models and their comparison. It discusses several SDLC models including waterfall, V-shaped, iterative, prototyping, RAD, spiral and agile. Each model is described in terms of its phases, advantages and disadvantages. The document also presents related work from other scholars and notes a reported case in which, although the team did not fully adopt extreme programming, applying Scrum principles yielded a return on investment and lower costs. It proposes future work to identify knowledge-sharing procedures and user-centered SDLC models that overcome limitations of existing approaches.
The document discusses Integrated Product and Process Development (IPPD), which is a Department of Defense management technique that simultaneously integrates all essential activities through multidisciplinary teams. IPPD provides a systematic approach to product development that achieves timely collaboration throughout the product life cycle to satisfy customer needs. The document then outlines the tenets of IPPD and how it relates to the CMMI process areas at different maturity levels. It also discusses the Rational Unified Process (RUP) framework and its iterative development cycle.
The document provides an overview of the Rational Unified Process (RUP), a software development process originally developed by Rational Software. It describes RUP as an iterative process with four phases (inception, elaboration, construction, transition) and six disciplines (business modeling, requirements, analysis and design, implementation, test, deployment). The document outlines some advantages of RUP like regular feedback, efficient use of resources, and improved risk management compared to traditional waterfall approaches. It also notes some potential disadvantages like complexity and needing expertise to fully adopt RUP.
This presentation was part of my session in "Agile in Business" seminar in Chennai on July 27th, 2013, organized by Unicom. This addresses the different aspects to be considered when a test team is transformed in an Agile set-up performing Agile Testing.
This document provides an overview of Lean Manufacturing (also known as Just-in-Time or JIT) concepts and implementation process. It discusses that Lean aims to eliminate waste from manufacturing operations to improve efficiency. The seven types of waste are identified as overproduction, inventory, waiting, transportation, overprocessing, motion, and defects. A five step process is outlined for implementing Lean: 1) analysis to identify non-value added activities, 2) choosing appropriate Lean solutions, 3) implementation, 4) verification of impact, and 5) standardization. Critical success factors are also discussed such as aligning with product costing methods and quality practices.
The document discusses software project management. It defines what a project and project management are, and describes the key characteristics of a software project. It outlines several software development lifecycles and methodologies including waterfall, prototype, spiral, agile, Scrum, extreme programming (XP), and rapid application development (RAD). It also discusses software project roles, risk management, project monitoring, defining a lifecycle model, software team organization structures, communication and coordination practices, and factors to consider when selecting a lifecycle model.
The document discusses software process maturity and assessment. It describes several software maturity frameworks including the Capability Maturity Model (CMM) which defines five levels of software process maturity from initial to optimizing. It also discusses the principles of software process change including the need for continuous improvement and investment. Software process assessment identifies problems and priorities for improvement through structured interviews to enroll opinion leaders in change.
Introduction of TMAP to representatives of ISTQB boards in the GA week in Mar... (Rik Marselis)
TMAP is the body of knowledge for quality engineering and testing.
I presented this to representatives of ISTQB boards that were present in Marrakech where the ISTQB GA was held, and where the 20th anniversary of ISTQB was celebrated.
Also ISTQB president Olivier Denoo handed me the International Software Testing Excellence Award 2022, for which I'm very honoured and grateful.
Key points:
• Focus on quality engineering in broad perspective (and testing is part of this)
• Focus on all members of cross-functional teams
• Many hands-on templates on the website
• Fully aligned with DevOps (but also applicable to Agile, Scrum, SAFe®)
• Almost 30 years of history and innovation
• Aligns very well with ISTQB
• 3rd edition of TMAP book “Quality for DevOps teams” was just released
If you were not able to attend, here is the presentation. If you have any questions please don't hesitate to contact me, my email address is mentioned at the end of the presentation.
Good luck with applying the www.TMAP.net body of knowledge in your daily quality & testing practice!!
Model-Based Software Engineering: A Multiple-Case Study on Challenges and Dev... (Rodi Jolak)
A recurring theme in discussions about the adoption of Model-Based Engineering (MBE) is its effectiveness. This is because there is a lack of empirical assessment of the processes and (tool-)use of MBE in practice. We conducted a multiple-case study by observing 2 two-month MBE projects from which software for a Mars rover were developed. We focused on assessing the distribution of the total software development effort over different development activities. Moreover, we observed and collected challenges reported by the developers during the execution of projects. We found that the majority of the effort is spent on the collaboration and communication activities. Furthermore, our inquiry into challenges showed that tool-related challenges are the most encountered.
A Method for Evaluating End-User Development Technologies (Claudia Melo)
Presentation at Americas Conference on Information Systems, 2017. Paper abstract:
End-user development (EUD) is a strategy that can reduce a considerable amount of business demand on IT departments. Empowering the end-user in the context of software development is only possible through technologies that allow them to manipulate data and information without the need for deep programming knowledge. The successful selection of appropriate tools and technologies is highly dependent on the context in which the end-user is embedded. End-users should be a central piece in any software package evaluation, being key in the evaluation process in the end-user development context. However, little research has empirically examined software package evaluation criteria and techniques in general, and in the end-user development context in particular. This paper aims to provide a method for technology evaluation in the context of end-user development and to present the evaluation of two platforms. We conclude our study by proposing a set of suggestions for future research.
Learn Key Insights from The State of Web Application Testing Research Report (Sencha)
In a recent study by Dimensional Research of 1,011 development and QA professionals, almost every survey respondent cited that application quality is important, with 84% believing it is very or critically important. Despite this, findings revealed that 94% of teams still face challenges when it comes to conducting adequate QA. View the presentation to learn why organizations must prioritize automated testing and QA practices to deliver high-quality applications and increase customer satisfaction.
The document discusses systems analysis and design. It provides an overview of the software development life cycle (SDLC) and various methodologies including Rational Unified Process (RUP), Agile, Scrum, eXtreme Programming (XP), and others. It describes the phases of RUP including inception, elaboration, construction, and transition. It also discusses key aspects of RUP like risk-driven development, use case driven development, and architecture-centric design.
This document discusses software engineering processes and quality. It states that for a quality software product, both the quality of the product itself and the quality of the software process are important. It also notes that fixing problems later in the development process costs significantly more than earlier phases, so more attention should be paid early on. The document then summarizes Boehm's "Industrial Software Metrics Top 10 List" and discusses Pareto analysis and principles of software maintenance and processes.
Rudy Katchow and Andy Rooswinkel: Software Product Manager: A Mechanism to Manage Software Products in Small and Medium ISVs
Many tools have been introduced in the market to manage software products, yet many small and medium software companies do not apply a systematic approach to managing their products and tend to use general-purpose text processing and spreadsheet solutions instead. In this research, we present SP Manager as an innovative tool for managing software products in small and medium independent software vendors (ISVs). This tool incorporates several concepts, such as situational method engineering and integration with defect management, that make it easy to adopt and deploy in different situations. If you are curious about the concepts that should be included in software product management tools, and about our easy-to-adopt-and-deploy tool, then please attend this session.
Similar to A survey on software quality practice - Pilot study in the Walloon region
Keynote talk targeted to PhD students, during the BENEVOL 2023 research seminar (focused on software evolution) in Nijmegen, 27 November 2023, by Tom Mens (full professor in software engineering at University of Mons, Belgium). The keynote aims to provide tips, tricks and practical advice on how to become successful as a PhD student.
Recognising bot activity in collaborative software development (Tom Mens)
Presentation by Natarajan Chidambaram during the International ICSE Workshop on Bots in Software Engineering (BotSE 2023) in Australia. Joint work with Mehdi Golzadeh, Tom Mens, Alexandre Decan of the Software Engineering Lab of the University of Mons and with Eleni Constantinou.
A Dataset of Bot and Human Activities in GitHub (Tom Mens)
Presentation at the IEEE International Conference on Mining Software Repositories (MSR 2023) by Natarajan Chidambaram (Software Engineering Lab, University of Mons, Belgium) of a dataset of bot and human activities extracted from GitHub
This document discusses the rise of GitHub Actions (GHA) as a dominant continuous integration (CI) service based on a longitudinal study of 91,810 GitHub repositories. The study analyzed the evolution and usage of seven popular CI services over nine years, focusing on their co-usage and migration patterns. The study provides statistical evidence that GHA became the most used CI service within 18 months of its introduction, coinciding with a decrease in Travis usage likely due to policy changes and migrations to GHA. Interviews with software practitioners revealed competition between services and reasons for co-using or migrating between alternatives.
Nurturing the Software Ecosystems of the Future (Tom Mens)
In January 2018, four Software Engineering research groups located in different Belgian Universities launched a five year research project to nurture the software ecosystems of the future. We assembled a diverse team of about a dozen researchers and embarked on an exciting journey leading to a rich and diverse suite of papers, tools and datasets. Halfway into the project the corona pandemic intervened, but despite several months of lockdown, we succeeded in increasing inter-university collaboration. In this paper we share our achievements so that the BENEVOL community may benefit from our experience.
How to program a robot in 30 minutes? (Tom Mens)
How can you learn to program a robot in 30 minutes? A workshop organized by Tom Mens (in collaboration with Pierre Zielinski, Gauvain Devillez and Sebastien Bonte) during the Math-Sciences Days of the Printemps des Sciences 2022 at the University of Mons.
On the rise and fall of CI services in GitHub (Tom Mens)
Presentation of SANER 2022 conference article "On the rise and fall of CI services in GitHub" by Mehdi Golzadeh (co-authored with Alexandre Decan and Tom Mens).
On backporting practices in package dependency networks (Tom Mens)
Presentation at the FOSDEM 2022 Composition and Dependency Management DevRoom of empirical research on backporting practices in package dependency networks, published in the IEEE Transactions on Software Engineering in 2021 (https://doi.org/10.1109/TSE.2021.3112204)
Joint work by Alexandre Decan, Tom Mens, Ahmed Zerouali and Coen De Roover as part of the Belgian Excellence of Science research project SECOASSIST (https://secoassist.github.io)
Comparing semantic versioning practices in Cargo, npm, Packagist and Rubygems (Tom Mens)
Presentation by Tom Mens at PackagingCon 2021 on Wednesday 10 November 2021.
Abstract: Semantic versioning (semver) is a commonly accepted open source practice, used by many package management systems to inform whether new package releases introduce possibly backward incompatible changes. Maintainers depending on such packages can use this practice to reduce the risk of breaking changes in their own packages by specifying version constraints on their dependencies. Depending on the amount of control a package maintainer desires to assert over her package dependencies, these constraints can range from very permissive to very restrictive. We empirically compared the evolution of semver compliance in four package management systems: Cargo, npm, Packagist and Rubygems. We discuss to what extent ecosystem-specific characteristics influence the degree of semver compliance, and we suggest to develop tools adopting the wisdom of the crowds to help package maintainers decide which type of version constraints they should impose on their dependencies.
We also studied to which extent the packages distributed by these package managers are still using a 0.y.z release, suggesting less stable and immature packages. We explore the effect of such "major zero" packages on semantic versioning adoption.
Our findings shed insight in some important differences between package managers with respect to package versioning policies.
Our empirical results have been published in two peer-reviewed academic journals: the IEEE Transactions on Software Engineering (https://doi.org/10.1109/TSE.2019.2918315) and Elsevier Science of Computer Programming (https://doi.org/10.1016/j.scico.2021.102656).
Acknowledgments: Research conducted in the context of the SECOASSIST "Excellence of Science" Research Project.
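The version constraints discussed in this abstract can be illustrated with a small sketch (the helper names are assumptions for illustration, not the authors' tooling): classify a dependency update as a major, minor or patch bump, and check whether an npm-style caret constraint would accept it. Note how the caret pins the leftmost non-zero component, which is why constraints on "major zero" (0.y.z) packages behave so restrictively.

```python
def parse(version):
    """Split a 'major.minor.patch' string into a tuple of ints."""
    return tuple(int(part) for part in version.split("."))

def bump_type(old, new):
    """Classify the update from `old` to `new` as major, minor or patch."""
    o, n = parse(old), parse(new)
    for label, index in (("major", 0), ("minor", 1), ("patch", 2)):
        if n[index] != o[index]:
            return label
    return "none"

def caret_accepts(constraint, candidate):
    """Does a caret constraint like '^1.2.3' accept `candidate`?
    Caret ranges keep the leftmost non-zero component fixed, so
    '^1.2.3' allows up to (not including) 2.0.0, while '^0.2.3'
    only allows 0.2.x patch releases."""
    base = parse(constraint.lstrip("^"))
    cand = parse(candidate)
    pivot = next((i for i, part in enumerate(base) if part != 0), 2)
    return cand >= base and cand[:pivot + 1] == base[:pivot + 1]
```

For example, `caret_accepts("^1.2.3", "1.9.0")` holds while `caret_accepts("^0.2.3", "0.3.0")` does not, which mirrors the permissive-versus-restrictive trade-off maintainers face when choosing constraints.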
Presentation by Tom Mens at FOSDEM21 (Free Open Source Developers Meeting, February 2021). Published in Science of Computer Programming, August 2021.
https://doi.org/10.1016/j.scico.2021.102656
Abstract: When developing open source software end-user applications or reusable software packages, developers depend on software packages distributed through package managers such as npm, Packagist, Cargo, RubyGems. In addition to this, empirical evidence has shown that these package managers adhere to a large extent to semantic versioning principles. Packages that are still in major version zero are considered unstable according to semantic versioning, as some developers consider such packages as immature, still being under initial development.
This presentation reports on large-scale empirical evidence on the use of dependencies towards 0.y.z versions in four different software package distributions: Cargo, npm, Packagist and RubyGems. We study to which extent packages get stuck in the zero version space, never crossing the psychological barrier of major version zero. We compare the effect of the policies and practices of package managers on this phenomenon. We do not reveal the results of our findings in this abstract yet, as it would spoil the fun of the presentation.
Evaluating a bot detection model on git commit messages (Tom Mens)
Detecting the presence of bots in distributed software development activity is very important in order to prevent bias in socio-technical empirical studies. In previous work, we proposed a classification model to detect bots in GitHub repositories based on the pull request and issue comments of GitHub accounts. The current study generalises the approach to git contributors based on their commit messages. We train and evaluate the classification model on a large dataset of 6,922 git contributors. The original model based on pull request and issue comments obtained a precision of 0.77 on this dataset, whereas retraining the classification model on git commit messages increased the precision to 0.80. As a proof-of-concept, we implemented this model in BoDeGiC, an open source command-line tool to detect bots in git repositories.
Is my software ecosystem healthy? It depends! (Tom Mens)
QUATIC 2020 keynote presentation by Tom Mens (University of Mons) on dependency-related health issues in software ecosystems and research advances to address such health issues. Part of the presented research has been conducted as part of the Belgian SECO-ASSIST Excellence of Science Research Project.
Bot or not? Detecting bots in GitHub pull request activity based on comment s... (Tom Mens)
Presentation by Mehdi Golzadeh (Software Engineering Lab, University of Mons) of an article published at the 2nd International ICSE Workshop on Bots In Software Engineering (BotSE). See https://doi.org/10.1145/3387940.3391503
Abstract: Many empirical studies focus on socio-technical activity in social coding platforms such as GitHub, for example to study the onboarding, abandonment, productivity and collaboration among team members. Such studies face the difficulty that GitHub activity can also be generated automatically by bots of a different nature. It therefore becomes imperative to distinguish such bots from human users. We propose an automated approach to detect bots in GitHub pull request activity. Relying on the assumption that bots contain repetitive message patterns in their pull request comments, we analyse the similarity between multiple messages from the same GitHub identity, using a clustering method that combines the Jaccard and Levenshtein distance. We empirically evaluate our approach by analysing 20,090 comments of 250 users and 42 bots in 1,262 GitHub repositories. Our results show that the method is able to clearly separate bots from human users.
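The similarity computation described in this abstract can be sketched as follows (a simplified illustration with assumed function names and thresholds, not the authors' implementation): combine a Jaccard distance over word sets with a length-normalised Levenshtein distance, and flag an account whose comments are near-repetitions of one another.

```python
def jaccard_distance(a, b):
    """1 minus |intersection| / |union| over the sets of words."""
    wa, wb = set(a.split()), set(b.split())
    if not wa and not wb:
        return 0.0
    return 1.0 - len(wa & wb) / len(wa | wb)

def levenshtein(a, b):
    """Classic dynamic-programming edit distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1, curr[j - 1] + 1, prev[j - 1] + cost))
        prev = curr
    return prev[-1]

def combined_distance(a, b):
    """Average of the Jaccard distance and the normalised Levenshtein."""
    norm = levenshtein(a, b) / max(len(a), len(b), 1)
    return (jaccard_distance(a, b) + norm) / 2

def looks_like_bot(comments, threshold=0.3):
    """Flag an identity as bot-like when the average pairwise distance
    between its comments is low, i.e. the messages are near-repetitions."""
    pairs = [(a, b) for i, a in enumerate(comments) for b in comments[i + 1:]]
    if not pairs:
        return False
    mean = sum(combined_distance(a, b) for a, b in pairs) / len(pairs)
    return mean < threshold
```

Templated CI-style messages such as "Build passed for commit abc" differ only in a few characters, so their pairwise distances stay low, while genuinely human comments rarely overlap in either wording or word sets.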
How magic is zero? An Empirical Analysis of Initial Development Releases in S... (Tom Mens)
1. 0.y.z packages are highly prevalent, contributing to 90% of packages in some distributions even though documentation states they are for initial development.
2. It generally takes a few months for packages to reach ≥1.0.0 but 20% take over a year, suggesting packages get stuck in 0.y.z.
3. 0.y.z packages are updated slightly more frequently but the difference is negligible, and there is little practical difference in how 0.y.z and ≥1.0.0 packages are used.
Comparing dependency issues across software package distributions (FOSDEM 2020) (Tom Mens)
This talk reports on our findings based on multiple empirical studies that we have conducted to understand different aspects of dependency management and their practical implications. This includes:
* the outdatedness of package dependencies, the transitive impact of such "technical lag", and its relation to the presence of bugs and security vulnerabilities.
* the impact of using either more permissive or more restrictive version constraints on dependencies.
* the virtues and limitations of being compliant to semantic versioning, a common policy to inform dependents whether new releases of software packages introduce possibly backward incompatible changes.
* the impact of specific characteristics, policies and tools used by the packaging ecosystem and its supporting community on all of the above.
The contents of the talk is primarily based on the following peer-reviewed scientific articles:
* What do package dependencies tell us about semantic versioning? Alexandre Decan, Tom Mens. IEEE Transactions on Software Engineering, 2019. https://doi.org/10.1109/TSE.2019.2918315
* An empirical comparison of dependency network evolution in seven software packaging ecosystems. Alexandre Decan, Tom Mens, Philippe Grosjean. Empirical Software Engineering 24(1):381-416, 2019. https://doi.org/10.1007/s10664-017-9589-y
* A formal framework for measuring technical lag in component repositories and its application to npm. Ahmed Zerouali, Tom Mens, Jesus Gonzalez‐Barahona, Alexandre Decan, Eleni Constantinou, Gregorio Robles. Journal of Software: Evolution and Process 31(8), 2019. https://doi.org/10.1002/smr.2157
* On the Impact of Security Vulnerabilities in the npm Package Dependency Network. Alexandre Decan, Tom Mens, Eleni Constantinou. International Conference on Mining Software Repositories, 2018. https://doi.org/10.1145/3196398.3196401
* On the Evolution of Technical Lag in the npm Package Dependency Network. Alexandre Decan, Tom Mens, Eleni Constantinou. International Conference on Software Maintenance and Evolution, 2018. https://doi.org/10.1109/ICSME.2018.00050
Measuring Technical Lag in Software Deployments (CHAOSScon 2020) (Tom Mens)
Presentation at CHAOSSCon Europe 2020 about the generic technical lag software measurement framework. Technical lag measures the increasing difference between deployed software components and the ideal upstream software components.
For more information, see https://doi.org/10.1002/smr.2157
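A minimal sketch of the technical-lag idea presented above (a deliberate simplification, not the framework's actual measurement model): lag compares the version of a component a project has deployed with the latest release available upstream, here expressed as the number of missed releases plus the level (major, minor or patch) of the version gap.

```python
def parse(version):
    """Split a 'major.minor.patch' string into a tuple of ints."""
    return tuple(int(part) for part in version.split("."))

def technical_lag(deployed, releases):
    """Return (missed_releases, gap_level) for a deployed component,
    where gap_level is the highest version component on which the
    latest upstream release is ahead of the deployment."""
    dep = parse(deployed)
    latest = max(parse(r) for r in releases)
    missed = sum(1 for r in releases if parse(r) > dep)
    if latest[0] > dep[0]:
        level = "major"
    elif latest[1] > dep[1]:
        level = "minor"
    elif latest[2] > dep[2]:
        level = "patch"
    else:
        level = "none"
    return missed, level
```

For example, with upstream releases 1.0.0, 1.1.0, 1.2.0 and 2.0.0, a deployment pinned at 1.1.0 has missed two releases and lags at the major level.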
This presentation reports on the research results achieved in the context of the interuniversity interdisciplinary research project SECOHealth "Vers une méthodologie et analyse socio-technique interdisciplinaire de la santé des écosystèmes logiciels" co-financed by FRS-FNRS Belgium and FRQ (FRSC - FRNT, Québec) with principal investigators Tom Mens (UMONS), Bram Adams (Polytechnique Montréal) and Josianne Marsan (Université Laval).
Introduction to the research seminar on empirical analysis of open source software ecosystems, organised by the SECO-ASSIST "excellence of science" research project, on September 4th, 2019 at the University of Mons, Belgium. With invited presentations by Alexander Serebrenik, Jesus Gonzalez-Barahona, Dario Di Nucci and Henrique Nucci. The seminar concludes with the public PhD defense of Ahmed Zerouali (supervised by Tom Mens) on the topic of "A Measurement Framework for Analyzing Technical Lag in Open-Source Software Ecosystems"
Empirically Analysing the Socio-Technical Health of Software Package Managers (Tom Mens)
Invited presentation at Concordia University (Montreal, Canada) by Eleni Constantinou and Tom Mens on recent research about the socio-technical health issues in software package management ecosystems.
Abstract: The large majority of today's software relies on open source software components. Such components are typically distributed through package managers for a wide variety of programming languages, and developed and maintained through online distributed software development services like GitHub. Software component repositories are perceived as software ecosystems that constitute complex and evolving socio-technical software dependency networks. Because of their complexity and evolution, these ecosystems tend to suffer from a wide variety of software health issues that can be either technical or social in nature. Examples of such issues include ecosystem fragility due to exponential growth and transitive dependencies; the abundance of outdated, unmaintained or obsolete software components; the prolonged presence of unfixed bugs and security vulnerabilities; and the abandonment or high turnover of key contributors, suboptimal collaboration between contributors, and many more. This presentation will report on our past and ongoing empirical research that studies such health factors within and across different software packaging ecosystems (such as npm, RubyGems, Cargo, CRAN, CPAN). We provide empirical evidence of some of the health problems, compare their presence across different ecosystems, and suggest ways to reduce their potential impact by providing concrete guidelines and tools. The presented research is being conducted by researchers of the Software Engineering Lab at the University of Mons in the context of two ongoing projects, SECOHealth and SECO-ASSIST, aiming to analyse and improve the health of software ecosystems.
Unlocking Productivity: Leveraging the Potential of Copilot in Microsoft 365, a presentation by Christoforos Vlachos, Senior Solutions Manager – Modern Workplace, Uni Systems
Have you ever been confused by the myriad of choices offered by AWS for hosting a website or an API?
Lambda, Elastic Beanstalk, Lightsail, Amplify, S3 (and more!) can each host websites + APIs. But which one should we choose?
Which one is cheapest? Which one is fastest? Which one will scale to meet our needs?
Join me in this session as we dive into each AWS hosting service to determine which one is best for your scenario and explain why!
OpenID AuthZEN Interop Read Out - Authorization — David Brossard
During Identiverse 2024 and EIC 2024, members of the OpenID AuthZEN WG got together and demoed their authorization endpoints conforming to the AuthZEN API.
CAKE: Sharing Slices of Confidential Data on Blockchain — Claudio Di Ciccio
Presented at the CAiSE 2024 Forum, Intelligent Information Systems, June 6th, Limassol, Cyprus.
Synopsis: Cooperative information systems typically involve various entities in a collaborative process within a distributed environment. Blockchain technology offers a mechanism for automating such processes, even when only partial trust exists among participants. The data stored on the blockchain is replicated across all nodes in the network, ensuring accessibility to all participants. While this aspect facilitates traceability, integrity, and persistence, it poses challenges for adopting public blockchains in enterprise settings due to confidentiality issues. In this paper, we present a software tool named Control Access via Key Encryption (CAKE), designed to ensure data confidentiality in scenarios involving public blockchains. After outlining its core components and functionalities, we showcase the application of CAKE in the context of a real-world cyber-security project within the logistics domain.
Paper: https://doi.org/10.1007/978-3-031-61000-4_16
“An Outlook of the Ongoing and Future Relationship between Blockchain Technologies and Process-aware Information Systems.” Invited talk at the joint workshop on Blockchain for Information Systems (BC4IS) and Blockchain for Trusted Data Sharing (B4TDS), co-located with the 36th International Conference on Advanced Information Systems Engineering (CAiSE), 3 June 2024, Limassol, Cyprus.
Ocean Lotus Threat Actors Project — John Sitima, 2024
Ocean Lotus cyber threat actors represent a sophisticated, persistent, and politically motivated group that poses a significant risk to organizations and individuals in the Southeast Asian region. Their continuous evolution and adaptability underscore the need for robust cybersecurity measures and international cooperation to identify and mitigate the threats posed by such advanced persistent threat groups.
Threats to mobile devices are increasingly prevalent and growing in scope and complexity. Users of mobile devices want to take full advantage of the features available on those devices, but many of those features trade security for convenience and capability. This best practices guide outlines steps users can take to better protect their personal devices and information.
In the rapidly evolving landscape of technologies, XML continues to play a vital role in structuring, storing, and transporting data across diverse systems. The recent advancements in artificial intelligence (AI) present new methodologies for enhancing XML development workflows, introducing efficiency, automation, and intelligent capabilities. This presentation will outline the scope and perspective of utilizing AI in XML development. The potential benefits and the possible pitfalls will be highlighted, providing a balanced view of the subject.
We will explore the capabilities of AI in understanding XML markup languages and autonomously creating structured XML content. Additionally, we will examine the capacity of AI to enrich plain text with appropriate XML markup. Practical examples and methodological guidelines will be provided to elucidate how AI can be effectively prompted to interpret and generate accurate XML markup.
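As an illustration of what enriching plain text with XML markup can look like in practice, the following hand-written Python sketch tags ISO dates in a plain string (the element names `para` and `date` are illustrative choices, not part of any standard vocabulary, and the sketch stands in for output an AI assistant might be prompted to produce):

```python
import re
import xml.etree.ElementTree as ET

def enrich(text: str) -> str:
    """Wrap text in a <para> element and tag ISO dates as <date> children."""
    para = ET.Element("para")
    pos = 0
    last = para  # element that receives the text following it
    for m in re.finditer(r"\d{4}-\d{2}-\d{2}", text):
        chunk = text[pos:m.start()]
        if last is para:
            para.text = chunk      # text before the first tagged date
        else:
            last.tail = chunk      # text between tagged dates
        date = ET.SubElement(para, "date")
        date.text = m.group()
        last = date
        pos = m.end()
    if last is para:
        para.text = text[pos:]
    else:
        last.tail = text[pos:]     # trailing text after the last date
    return ET.tostring(para, encoding="unicode")

markup = enrich("Released on 2024-06-06 in Limassol.")
# → <para>Released on <date>2024-06-06</date> in Limassol.</para>
```

The same enrichment task, handed to an AI assistant as a prompt, is the kind of workflow the session examines.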
Further emphasis will be placed on the role of AI in developing XSLT, as well as schemas such as XSD and Schematron. We will address the techniques and strategies adopted to create prompts for generating, explaining, or refactoring code, and the results achieved.
The discussion will extend to how AI can be used to transform XML content. In particular, the focus will be on the use of AI XPath extension functions in XSLT, Schematron, Schematron Quick Fixes, or for XML content refactoring.
The presentation aims to deliver a comprehensive overview of AI usage in XML development, providing attendees with the necessary knowledge to make informed decisions. Whether you’re at the early stages of adopting AI or considering integrating it in advanced XML development, this presentation will cover all levels of expertise.
By highlighting the potential advantages and challenges of integrating AI with XML development tools and languages, the presentation seeks to inspire thoughtful conversation around the future of XML development. We’ll not only delve into the technical aspects of AI-powered XML development but also discuss practical implications and possible future directions.
What do a Lego brick and the XZ backdoor have in common? — Speck&Tech
ABSTRACT: At first glance, what a Lego brick and the XZ backdoor have in common might seem to be merely that both are building blocks, or dependencies, of creative and software projects. In reality, a Lego brick and the XZ backdoor case have much more in common than that.
Join the presentation to dive into a story of interoperability, standards and open formats, and then discuss the important role contributors play in a sustainable open source community.
BIO: An advocate of free software and of standard, open formats. She has been an active member of the Fedora and openSUSE projects and co-founded the LibreItalia Association, where she was involved in several LibreOffice-related events, migrations and training activities. She previously worked on LibreOffice migrations and training courses for several public administrations and private organizations. Since January 2020 she has been working at SUSE as a Software Release Engineer for Uyuni and SUSE Manager, and when she is not pursuing her passion for computers and for Geeko, she cultivates her curiosity about astronomy (which is where her nickname deneb_alpha comes from).
How to Get CNIC Information System with Paksim Ga — danishmna97
Pakdata Cf is a groundbreaking system designed to streamline and facilitate access to CNIC information. This innovative platform leverages advanced technology to provide users with efficient and secure access to their CNIC details.
Ivanti’s Patch Tuesday breakdown goes beyond patching your applications and brings you the intelligence and guidance needed to prioritize where to focus your attention first. Catch early analysis on our Ivanti blog, then join industry expert Chris Goettl for the Patch Tuesday Webinar Event. There we’ll do a deep dive into each of the bulletins and give guidance on the risks associated with the newly-identified vulnerabilities.
Infrastructure Challenges in Scaling RAG with Custom AI Models — Zilliz
Building Retrieval-Augmented Generation (RAG) systems with open-source and custom AI models is a complex task. This talk explores the challenges in productionizing RAG systems, including retrieval performance, response synthesis, and evaluation. We’ll discuss how to leverage open-source models like text embeddings, language models, and custom fine-tuned models to enhance RAG performance. Additionally, we’ll cover how BentoML can help orchestrate and scale these AI components efficiently, ensuring seamless deployment and management of RAG systems in the cloud.
Removing Uninteresting Bytes in Software Fuzzing — Aftab Hussain
Imagine a world where software fuzzing, the process of mutating bytes in test seeds to uncover hidden and erroneous program behaviors, becomes faster and more effective. A lot depends on the initial seeds, which can significantly dictate the trajectory of a fuzzing campaign, particularly how long it takes to uncover interesting behaviour in your code. We introduce DIAR, a technique designed to speed up fuzzing campaigns by pinpointing and eliminating uninteresting bytes in the seeds. Picture this: instead of wasting valuable resources on meaningless mutations in large, bloated seeds, DIAR removes the unnecessary bytes, streamlining the entire process.
In this work, we equipped AFL, a popular fuzzer, with DIAR and examined two critical Linux tools: Libxml's xmllint, a tool for parsing XML documents, and Binutils' readelf, an essential debugging and security analysis command-line tool used to display detailed information about ELF (Executable and Linkable Format) files. Our preliminary results show that AFL+DIAR not only discovers new paths more quickly but also achieves higher coverage overall. This work thus showcases how starting with lean and optimized seeds can lead to faster, more comprehensive fuzzing campaigns -- and DIAR helps you find such seeds.
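The core idea of trimming uninteresting seed bytes can be sketched as follows. This is a simplified illustration, not the actual DIAR algorithm: it greedily drops any byte whose removal leaves an observed coverage signal unchanged, with a toy `coverage` function standing in for real fuzzer instrumentation feedback:

```python
def trim_seed(seed: bytes, coverage) -> bytes:
    """Greedily drop bytes whose removal leaves observed coverage unchanged."""
    baseline = coverage(seed)
    trimmed = bytearray(seed)
    i = 0
    while i < len(trimmed):
        candidate = trimmed[:i] + trimmed[i + 1:]
        if coverage(bytes(candidate)) == baseline:
            trimmed = candidate  # byte was uninteresting: drop it
        else:
            i += 1               # byte influences coverage: keep it
    return bytes(trimmed)

# Toy stand-in for instrumentation: which "tokens" a target program's
# parser would recognise in the input.
def coverage(data: bytes) -> frozenset:
    return frozenset(tok for tok in (b"<a>", b"<b>") if tok in data)

seed = b"xxxx<a>junk<b>padding"
lean = trim_seed(seed, coverage)
# → b"<a><b>" : only the bytes the coverage signal depends on survive
```

In a real campaign the coverage oracle would be the fuzzer's edge-coverage bitmap rather than a token check, and a per-byte pass like this costs one target execution per byte, which is exactly the overhead that motivates doing it once, up front, on the seeds.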
- These are slides of the talk given at IEEE International Conference on Software Testing Verification and Validation Workshop, ICSTW 2022.
Generating Privacy-Protected Synthetic Data Using Secludy and Milvus — Zilliz
During this demo, the founders of Secludy will demonstrate how their system utilizes Milvus to store and manipulate embeddings for generating privacy-protected synthetic data. Their approach not only maintains the confidentiality of the original data but also enhances the utility and scalability of LLMs under privacy constraints. Attendees, including machine learning engineers, data scientists, and data managers, will witness first-hand how Secludy's integration with Milvus empowers organizations to harness the power of LLMs securely and efficiently.
Things to Consider When Choosing a Website Developer for your Website | FODUU
Choosing the right website developer is crucial for your business. This article covers the essential factors to consider, including experience, portfolio, technical skills, communication, pricing and budget, reputation and reviews, and post-launch support. Make an informed decision to ensure your website meets your business goals.
Monitoring and Managing Anomaly Detection on OpenShift — Tosin Akinosho
Overview
Dive into the world of anomaly detection on edge devices with our comprehensive hands-on tutorial. This SlideShare presentation will guide you through the entire process, from data collection and model training to edge deployment and real-time monitoring. Perfect for those looking to implement robust anomaly detection systems on resource-constrained IoT/edge devices.
Key Topics Covered
1. Introduction to Anomaly Detection
- Understand the fundamentals of anomaly detection and its importance in identifying unusual behavior or failures in systems.
2. Understanding Edge (IoT)
- Learn about edge computing and IoT, and how they enable real-time data processing and decision-making at the source.
3. What is ArgoCD?
- Discover ArgoCD, a declarative, GitOps continuous delivery tool for Kubernetes, and its role in deploying applications on edge devices.
4. Deployment Using ArgoCD for Edge Devices
- Step-by-step guide on deploying anomaly detection models on edge devices using ArgoCD.
5. Introduction to Apache Kafka and S3
- Explore Apache Kafka for real-time data streaming and Amazon S3 for scalable storage solutions.
6. Viewing Kafka Messages in the Data Lake
- Learn how to view and analyze Kafka messages stored in a data lake for better insights.
7. What is Prometheus?
- Get to know Prometheus, an open-source monitoring and alerting toolkit, and its application in monitoring edge devices.
8. Monitoring Application Metrics with Prometheus
- Detailed instructions on setting up Prometheus to monitor the performance and health of your anomaly detection system.
9. What is Camel K?
- Introduction to Camel K, a lightweight integration framework built on Apache Camel, designed for Kubernetes.
10. Configuring Camel K Integrations for Data Pipelines
- Learn how to configure Camel K for seamless data pipeline integrations in your anomaly detection workflow.
11. What is a Jupyter Notebook?
- Overview of Jupyter Notebooks, an open-source web application for creating and sharing documents with live code, equations, visualizations, and narrative text.
12. Jupyter Notebooks with Code Examples
- Hands-on examples and code snippets in Jupyter Notebooks to help you implement and test anomaly detection models.
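As a minimal sketch of the anomaly-detection fundamentals from topic 1, the following stdlib-only Python example flags readings whose z-score exceeds a threshold; the sensor data and the threshold value are purely illustrative, and production models (as covered in the Jupyter Notebook examples) would be far more sophisticated:

```python
import statistics

def zscore_anomalies(values, threshold=3.0):
    """Return indices of points more than `threshold` std deviations from the mean."""
    mean = statistics.fmean(values)
    std = statistics.pstdev(values)
    if std == 0:
        return []  # constant signal: nothing can be anomalous
    return [i for i, v in enumerate(values) if abs(v - mean) / std > threshold]

# Hypothetical temperature readings from an edge sensor; 35.0 is the outlier.
readings = [20.1, 20.3, 19.9, 20.2, 35.0, 20.0, 20.1]
anomalies = zscore_anomalies(readings, threshold=1.5)
# → [4]
```

On an edge device, a detector like this would run against the live stream (e.g. the Kafka messages from topic 5), with flagged indices exported as metrics for Prometheus to scrape (topic 8).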
For the full video of this presentation, please visit: https://www.edge-ai-vision.com/2024/06/building-and-scaling-ai-applications-with-the-nx-ai-manager-a-presentation-from-network-optix/
Robin van Emden, Senior Director of Data Science at Network Optix, presents the “Building and Scaling AI Applications with the Nx AI Manager,” tutorial at the May 2024 Embedded Vision Summit.
In this presentation, van Emden covers the basics of scaling edge AI solutions using the Nx tool kit. He emphasizes the process of developing AI models and deploying them globally. He also showcases the conversion of AI models and the creation of effective edge AI pipelines, with a focus on pre-processing, model conversion, selecting the appropriate inference engine for the target hardware and post-processing.
van Emden shows how Nx can simplify the developer’s life and facilitate a rapid transition from concept to production-ready applications. He provides valuable insights into developing scalable and efficient edge AI solutions, with a strong focus on practical implementation.
A survey on software quality practice - Pilot study in the Walloon region
1. A survey on software quality practice -
Pilot study in the Walloon region
Javier Perez, Tom Mens, Service de Génie Logiciel, Université de Mons
Flora Kamseu, Naji Habra, PRECISE lab, FUNDP
Presented at SATToSE seminar
University of Koblenz, August 2012
SATTOSE 2012
2. Context
• Portefeuille TIC
– ERDF project led by CETIC (2007-2013)
• CEIQS: Center of expertise in engineering and
quality of systems
– aimed at developing a portfolio of innovative techniques
allowing local companies to master the diversity,
complexity, quality and rapid evolution of information
systems
• workpackage QUALGEN
– collaboration between FUNDP and UMONS since 2010
– Supported by Wallonia
SATTOSE 2012 2
3. Objectives
• Explore how quality-related software development
practice is being performed in industry
• Compare this across different regions and
countries
• Relate this to what is being taught in academia
– Is there a gap between teaching and industry needs?
– What good methodologies, practices, tools are not
being used and why?
SATTOSE 2012 3
4. About the survey
• Online survey carried out in Walloon region
– Using LimeSurvey, from 29/5 until 30/6/2012
– Companies involved in software development or
software maintenance
• Addressed topics
– Use of processes during software development and
maintenance
– Use of software quality measurement and improvement
– Use of quality models and quality standards
– Use of testing
– Organisational support of development teams
SATTOSE 2012 4
5. Structure of the questionnaire
• Introductory questions (6)
– Details of respondent and company
• General development questions (5)
– Perspective on dev. practices carried out by company
• Structural software quality (5)
• Software testing and maintenance (5)
• Quality models and quality standards (5)
• Organisational support of development teams (5)
SATTOSE 2012 5
6. Respondents – Number
– Initial mailing sent out to 145 companies
– 71 responses out of 188 contacted respondents
• Response rate 37,8%
– Responses from 47 different companies
• Multiple responses from same companies were
aggregated into a single one
– Incomplete responses were ignored
– 44 fully completed questionnaires kept for analysis
• Corresponds to 62% of received responses
SATTOSE 2012 6
8. Respondents – Company size
Good balance across company sizes (number of employees) of respondents
SATTOSE 2012 8
9. Respondents – Training Level
Many at master+engineer level (18+17) and bachelor (20) level
Few or none with PhD (10+26), other degree (10+16) or no degree (3+27)
SATTOSE 2012 9
10. Process – Dev. process support
• Use of a well-defined and well-documented
development process? 45,5%
– No: 19
– Yes: 20 (agile or scrum, Prince 2, RUP, ISO
certification, proprietary)
– Don’t know: 5
• Use of agile practices or methods? 63,6%
– No: 12
– Yes: 28 (17 mention SCRUM, 2 mention Prince2)
– Don’t know: 4
SATTOSE 2012 10
11. Process - change or configuration
management process
Use of change or configuration management process is highly popular
73,8% (31/42)
SATTOSE 2012 11
12. Process - perceived importance factors
for software project success
• Rated from (1) not important to (5) essential
Average ratings: between 4 and 4,2
SATTOSE 2012 12
14. Process - Creation and modification of
software artefacts
Are arch. descr. (36,4%) and design models (39,5%) being evolved?
SATTOSE 2012 14
15. Tools - Use of integrated platforms
Version control (97,6%) and bug tracking (92,7%) well established
Platforms for continuous integration (57,1%), configuration (55,6%)
and testing (60,5%) a bit less
SATTOSE 2012 15
16. Tools - Programming languages used
OO languages most popular (Java, C#, C++), followed by scripting languages.
SQL and Cobol legacy also remain important.
SATTOSE 2012 16
17. Tools - Development
environments used
Dichotomy between Java and .Net visible at IDE level.
Many others but much less frequently used.
SATTOSE 2012 17
18. Tools - support for design models,
documentation and code synchro
• Very basic
– Visio (10), Word (8), Sparx Enterprise Architect
(5), Doxygen (3), StarUML (2), Confluence (2)
and many others
– Little use of UML modeling tools
• Little or no support for model-code
synchronisation
SATTOSE 2012 18
19. Quality - Use of design patterns
Use of design patterns is highly popular 72,7% (32/44)
SATTOSE 2012 19
20. Quality - Use of quality
improvement techniques
Only moderately popular (35,7% < x < 43%), except for refactoring,
which is used often or continuously by 58,5%
Dynamic analysis tools: 36,6% (15/41)
Design improvement: 42,9% (18/42)
Metrics and visualisation tools: 35,7% (15/42)
Refactoring: 58,5% (24/41)
SATTOSE 2012 20
21. Use of quality support continued
Poor support for quality (no quality tools, processes or models)
Bad quality detection tools: 36,8% (14/38)
Quality support or improvement process: 32,4% (12/37)
Quality model: 19,4% (7/36)
SATTOSE 2012 21
22. Popularity of quality
improvement techniques
• Most popular
Version control (97,6%) and bug tracking (92,7%) platforms
Change and configuration management: 73,8%
Design patterns: 72,7%
Refactoring: 58,5%
• Less popular
Design improvement (e.g. code smell reduction): 42,9%
Bad quality detection tools: 36,8%
Metrics and visualisation tools: 35,7%
Dynamic analysis tools (profiling etc.): 36,6%
Quality support or improvement process: 32,4%
• Unpopular
Use of a quality model: 19,4%
SATTOSE 2012 22
23. Reuse of libraries, components
and platforms
Reuse is highly successful
86% from own company
75% from open source
60% from other companies
(excluding “don’t know” results)
SATTOSE 2012 23
24. Testing - Use of test process
Testing is done by nearly all respondents (97,7%), but
test process used by only 46,5% of the respondents
SATTOSE 2012 24
26. Testing - tool support
• Very varied
– Mostly unit testing frameworks (8)
– Others: Mantis (4), HP Quality Center (4),
Selenium (3), Hudson (2), Jenkins (2), Quick
Test Pro (2), and many more
SATTOSE 2012 26
27. Preliminary conclusions
• Wide range of respondents, from very small to big
companies, using many different programming
languages and development environments
• Strong points
– All respondents believe that quality assurance and
testing are very important for project success
– Wide use of testing (97,7%), agile practices (63,6%),
design patterns (72,7%) and refactoring (58,5%)
– High level of reuse of components/libraries/platforms
(60% to 86% depending on source of reuse)
SATTOSE 2012 27
28. Preliminary conclusions
• Weak points
– Mixed success of processes
• development processes (45,5%), test processes (46,5%),
quality support/improvement process (32,4%), quality
models (19,4%)
• Exception: change management process (73,8%)
SATTOSE 2012 28
29. Preliminary conclusions
• Weak points
– Mixed success of static and dynamic code analysis
tools for detecting quality issues, visualisation,
computing metrics, profiling, etc. (popularity
between 35% and 43%)
– Does not reflect the high
perceived importance of
quality assurance for
software project success
SATTOSE 2012 29
30. What’s next?
• Analyse results in more detail
– Correlate results to company size and training level
• Are bigger companies more process-driven and smaller
ones more agile? Does training level play a role?
• Report on the results
• Repeat the study in other countries
– Identify regional or national trends
– Compare differences and commonalities
– We need your help here!
SATTOSE 2012 30
31. Collaborative study
• Research 2.0
– Carry out follow-up / more narrow studies
– Carry out this study as a collaborative
community effort?
• Store / share / reuse our data with others
– Which format/platform/… to use?
• Have a working session on this topic during
SATTOSE/SOTESOLA?
SATTOSE 2012 31
Editor's Notes
188 potential respondents were identified and invited by mail to respond to survey
Note that this does not necessarily represent the real distribution of number of companies per size in the considered region. This would mean that, in principle, all charts and results shown hereafter should be “weighted” to reflect the real distribution of sizes in the considered region.
Where are PhDs working? SMEs? Bigger companies? Agile-driven companies?
Correlate this to the company size !
Multiple answers were possible. Responses were very varied. Singletons: ASP; Flex; HTML5; PERL; Eiffel; Pascal; Objective C; Delphi; Mumps; Groovy; Scala; Matlab; JCL; CL; CICS; DDS; RPG; …